PowerPoint used for the RealWorld Evaluation workshop in Minneapolis, October 24, 2012.
Note: this version includes a "bonus section" on unanticipated consequences of development interventions and a few thoughts on evaluating complex development interventions.
Workshop Description: How can you conduct adequately valid impact evaluations under real-world circumstances? Influential individuals and institutions continue to promote experimental research designs and methods as the “gold standard.” However, attempting to implement a randomized controlled trial (RCT) on development programs in society is often inappropriate or, at the least, infeasible. Very frequently, projects are started without a baseline study that is comparable with a pre-planned endline evaluation. Even more frequently, it would be impractical or even unethical to randomly assign individuals or other units of analysis to ‘treatment’ and ‘control’ groups.
Welcome to the real world! Through participatory processes we will explore techniques that help evaluators and clients ensure the best quality evaluation under real-life constraints like those described above and others. You will learn about the approaches in the 2nd edition of the RealWorld Evaluation book, and from the extensive international experiences of the authors.
You will learn:
The seven steps of the RealWorld Evaluation approach for addressing common issues and constraints;
Practical and adequate designs for situations where randomized controlled trials are neither feasible nor appropriate, and how to identify and assess your options;
Ways to reconstruct baseline data when the evaluation does not begin until the project is well advanced or completed;
Alternative strategies for determining what would have happened without the project — that is, for constructing counterfactuals with the credibility your project needs;
Strategies for evaluating complex programs where conventional project-level impact evaluation designs cannot be applied;
Ways to identify and address threats to the validity or adequacy of quantitative, qualitative and mixed methods designs.
Jim Rugh has more than 48 years of experience in international development, 32 of them as a professional evaluator, mainly of international NGOs. Michael Bamberger spent a decade working with NGOs in Latin America and subsequently worked on evaluations with the World Bank. They, along with Linda Mabry, co-authored RealWorld Evaluation: Working Under Budget, Time, Data and Political Constraints in 2006; the 2nd edition came out last year and will be the backbone of this workshop.