Carter McNamara offers a document providing guidance on planning and implementing an evaluation process for for-profit or nonprofit programs. Many kinds of evaluation can be applied to programs, for example goals-based, process-based, and outcomes-based. Sections of this topic include:
- Program Evaluation: carefully getting information to make decisions about programs
- Where Program Evaluation is Helpful
- Basic Ingredients (you need an organization and program(s))
- Planning Program Evaluation (what do you want to learn about, what info is needed)
- Major Types of Program Evaluation (evaluating program processes, goals, outcomes, etc.)
- Overview of Methods to Collect Information (questionnaires, interviews, focus groups, etc.)
- Selecting Which Methods to Use (which methods work best to get needed info from audiences)
- Analyzing and Interpreting Information
- Reporting Evaluation Results
- Who Should Carry Out the Evaluation?
- Contents of an Evaluation Plan
- Pitfalls to Avoid
This handbook from the W.K. Kellogg Foundation, written in 1998 and updated in 2004, provides a framework for thinking about evaluation as a relevant and useful program tool. It was written primarily for project directors who have direct responsibility for the ongoing evaluation of W.K. Kellogg Foundation-funded projects.
These publications are practical, easy-to-use guides designed to help Extension faculty better plan and implement credible and useful evaluations. They may also be useful to agencies or funders seeking assistance with realistic evaluation strategies.
G*Power 3 covers statistical power analyses for many different statistical tests of the F, t, χ², and z test families, as well as some exact tests. G*Power 3 offers five different types of statistical power analysis:
* A priori (sample size N is computed as a function of power level 1−β, significance level α, and the to-be-detected population effect size)
* Compromise (both α and 1−β are computed as functions of effect size, N, and an error probability ratio q = β/α)
* Criterion (α and the associated decision criterion are computed as a function of 1−β, the effect size, and N)
* Post-hoc (1−β is computed as a function of α, the population effect size, and N)
* Sensitivity (population effect size is computed as a function of α, 1−β, and N)
G*Power 3 provides improved effect size calculators and graphics options. It supports both a distribution-based and a design-based input mode. G*Power 3 is available for Mac OS X 10.4 and Windows XP/Vista, and is free.
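To make the a priori and post-hoc analysis types above concrete, here is a minimal Python sketch using the large-sample normal approximation for a two-sided, two-sample t test. This is an illustration only, not G*Power's implementation: G*Power uses the exact noncentral t distribution, which yields a slightly larger N, and the function names here are invented for the example.

```python
# Sketch of "a priori" and "post-hoc" power analyses for a two-sample
# t test (two-sided), using the large-sample normal approximation.
# Illustrative only: G*Power itself uses the exact noncentral t
# distribution, and these function names are not G*Power's API.
from math import ceil
from statistics import NormalDist

def a_priori_n(effect_size, alpha=0.05, power=0.80):
    """Per-group sample size N to detect Cohen's d = effect_size."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for significance level
    z_beta = z(power)            # quantile for desired power 1 - beta
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

def post_hoc_power(effect_size, n_per_group, alpha=0.05):
    """Achieved power 1 - beta for a given per-group N."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    ncp = effect_size * (n_per_group / 2) ** 0.5  # approx. noncentrality
    return 1 - NormalDist().cdf(z_alpha - ncp)

# Medium effect d = 0.5, alpha = .05, power = .80
print(a_priori_n(0.5))                     # ~63 per group (approx.)
print(round(post_hoc_power(0.5, 64), 2))   # achieved power near .80
```

Note how the two analyses invert the same relationship: a priori solves for N given α, 1−β, and effect size, while post-hoc solves for 1−β given the other three, exactly as in the list above.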
Harvard Family Research Project's evaluation periodical, The Evaluation Exchange, addresses current issues facing program evaluators of all levels, with articles written by the most prominent evaluators in the field. Designed as an ongoing discussion among evaluators, program practitioners, funders, and policymakers, The Evaluation Exchange highlights innovative methods and approaches to evaluation, emerging trends in evaluation practice, and practical applications of evaluation theory. It is distributed to subscribers free of charge four times per year.
The DAC Working Party on Aid Evaluation (WP-EV) has developed this glossary of key terms in evaluation and results-based management because of the need to clarify concepts and to reduce the terminological confusion frequently encountered in these areas. (The same introduction is also provided in French and Spanish.)
Evaluation of the progress and effectiveness of projects funded by the National Science Foundation's (NSF) Directorate for Education and Human Resources (EHR) has become increasingly important. Project staff, participants, local stakeholders, and decisionmakers need to know how funded projects are contributing to knowledge and understanding of mathematics, science, and technology. To do so, some simple but critical questions must be addressed:
* What are we finding out about teaching and learning?
* How can we apply our new knowledge?
* Where are the dead ends?
* What are the next steps?
Although there are many excellent textbooks, manuals, and guides dealing with evaluation, few are geared to the needs of the EHR grantee, who may be an experienced researcher but a novice evaluator. One of the ways that EHR seeks to fill this gap is by publishing what have been called "user-friendly" handbooks for project evaluation. The first publication, User-Friendly Handbook for Project Evaluation: Science, Mathematics, Engineering and Technology Education, issued in 1993, describes the types of evaluations principal investigators/project directors (PIs/PDs) may be called upon to perform over the lifetime of a project. It also describes in some detail the evaluation process, which includes the development of evaluation questions and the collection and analysis of appropriate data to answer those questions. Although this first handbook discussed both qualitative and quantitative methods, it covered techniques that produce numbers (quantitative data) in greater detail. This approach was chosen because decisionmakers usually demand quantitative (statistically documented) evidence of results. Indicators often selected to document outcomes include the percentage of targeted populations participating in mathematics and science courses, test scores, and the percentage of targeted populations selecting careers in the mathematics and science fields.
The current handbook, User-Friendly Guide to Mixed Method Evaluations, was published in August 1997 and builds on the first but introduces a broader perspective. It was initiated out of the recognition that, by focusing primarily on quantitative techniques, evaluators may miss important parts of a story. Experienced evaluators have found that the best results are most often achieved through mixed method evaluations, which combine quantitative and qualitative techniques. Because the earlier handbook did not include an in-depth discussion of the collection and analysis of qualitative data, this handbook provides more information on qualitative techniques and discusses how they can be combined effectively with quantitative measures. Like the earlier publication, it is aimed at users who need practical rather than technically sophisticated advice about evaluation methodology. The main objective is to make PIs and PDs "evaluation smart" and to provide the knowledge needed for planning and managing useful evaluations.
The UNESCO Evaluation handbook has been prepared to further understanding among UNESCO staff and key stakeholders on what evaluation is, why it is important and who is responsible for what in the evaluation process.
The International Journal of Qualitative Methods is a peer-reviewed, web-based journal published quarterly by the International Institute for Qualitative Methodology at the University of Alberta, Canada, and its international affiliates. It is a multi-disciplinary, multi-lingual journal, free to the public. Our goals are to advance the development of qualitative methods and to disseminate methodological knowledge to the broadest possible community of academics, students, and professionals who undertake qualitative research. By keeping the journal free of charge, we hope to reach an audience who, for whatever reason, does not read traditional, subscription-based journals. The IJQM is indexed in the following: EBSCO Academic Search and Science + Technology collections, International Bibliography of the Social Sciences, and Sociological Abstracts. For authors interested in submitting an article, the instructions and guidelines are available on the Web site. The journal welcomes articles in all areas of study. ISSN: 1609-4069