Library Entry
Conference Session: Evaluating Impact: Use of Yin's Partial Comparisons Case Study Approach

The three documents included here for use at the 2009 AEA conference are: 1) a description of our paper, 2) PowerPoint slides for our presentation, and 3) tables demonstrating use of the Partial Comparisons Approach in evaluating two STEM projects in Washington State. #EvaluationUse #Prek...

3 attachments

Presentation overview (LeBeau, jlebeau@wsu.edu, Washington State University):

- Introduction: Randomized Controlled Trials (RCTs) are given priority for impact evaluation, but RCTs are inappropriate for evaluating the effectiveness of some programs: programs in the early stages of development with emerging or less defined outcomes, programs with multi-dimensional outcomes, and programs whose outcomes will not occur in the near term. Many STEM projects funded by NSF are difficult to evaluate, so the introduction of a new strategy is imperative: the Partial Comparisons Approach (PCA).
- Partial Comparisons Approach (Yin, 1995; 2000): developed by Yin (1995) and seldom used to evaluate the impact of educational programs, but it shows promise. Its aim is "multiple partial comparisons instead of imposing a singular research design in carrying out an evaluation" (p. 29), and it is possibly less expensive and more generalizable than RCTs. Suggested partial comparisons (Yin, 1995): outcomes-only comparisons, process-only comparisons, causal interpretation, rival interpretations, and policy analyses.
- Rival Explanations: used to strengthen the impact claim. Yin (2000) describes two categories of rivals, including real-life rivals (pp. 250-258): direct (practice or policy) rivals, commingled (practice or policy) rivals, implementation rivals, rival theories, super rivals, and societal rivals.
- Example One: a Math-Science Partnership, Building Science Teaching Capacity, with need based on analysis of science test score data and five components evaluated using the PCA and rival explanations. Sample rival explanation for Component One: participants came prepared to work on strategic plans and, thus, were more productive and got more out of the SPI. (A hypothetical sketch of this kind of bookkeeping follows this overview.)
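As a purely hypothetical illustration, not taken from the slides, the PCA's bookkeeping could be represented as a small data structure: which of Yin's five suggested partial comparisons an evaluation has evidence for, and which rival explanations remain to be ruled out. All names below are invented for the sketch.

    # Hypothetical sketch only: tracking Yin's (1995) five suggested partial
    # comparisons and the rival explanations an evaluator still needs to rule
    # out. Names and the rival wording are invented for illustration.
    from dataclasses import dataclass, field

    PARTIAL_COMPARISONS = {
        "outcomes-only comparison",
        "process-only comparison",
        "causal interpretation",
        "rival interpretations",
        "policy analysis",
    }

    @dataclass
    class ImpactClaim:
        project: str
        evidence: set = field(default_factory=set)        # comparisons with supporting evidence
        open_rivals: list = field(default_factory=list)   # rivals not yet ruled out

        def record(self, comparison: str) -> None:
            # Only Yin's five suggested comparison types are accepted.
            if comparison not in PARTIAL_COMPARISONS:
                raise ValueError(f"unknown comparison type: {comparison!r}")
            self.evidence.add(comparison)

        def rule_out(self, rival: str) -> None:
            self.open_rivals.remove(rival)

    # Usage, with the Component One rival mentioned in the slides:
    claim = ImpactClaim(
        project="Building Science Teaching Capacity",
        open_rivals=["participants came prepared, so gains predate the SPI"],
    )
    claim.record("outcomes-only comparison")
    claim.rule_out("participants came prepared, so gains predate the SPI")
    print(claim)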



Library Entry
Program Evaluation Tipsheet 68: How Do I Evaluate Impact When Audience Make-Up Is Not Consistent?

How Do I Evaluate Impact When the Make-Up of the Audience Is Not Consistent? ... This document focuses on how to evaluate impact when the make-up of the audience is not consistent.

Tipsheet 68.pdf


Library Entry
Eval13: Panel Session 687 - The Challenges of Shocks, Variability, and Resilience in Evaluating Impact in Adaptation Projects

Presentation by Marc Shapiro, Ph.D., Project Leader, Global Climate Change Monitoring and Evaluation Project, Development and Training Services (dTS), at the 2013 AEA Conference in Washington, DC. #ClimateChange #2013Conference #AdaptationProjects #GlobalClimateChange #EvaluationImpact

GCCME AEA Adaptation Shocking 2013 10 18.ppt



Library Entry
Validated Student, Teacher, and Principal Survey Instruments for STEM Education Programs

The paper describes how these surveys have been used in three kindergarten-through-12th-grade STEM education evaluations: evaluation and capacity-building for data-driven decision-making in North Carolina State University's STEM outreach programs; a 14-grant cluster evaluation impacting over 200 elementary, middle, and high schools; and evaluation and capacity-building for district-wide STEM schools.

UnfriedFaberTownsendCorn_AEA2014_STEMSurveysPaper.pdf



Library Entry
Program Evaluation Tipsheet 75: Baseline Data for Your Program?

Citation: Kiernan, Nancy Ellen (2006). Baseline Data for Your Program? Tipsheet #75, University Park, PA: Penn State Cooperative Extension. This document is part of a series of Tipsheets that contain practical evaluation illustrations based on current research and developed by Nancy Ellen...

Tipsheet 75.pdf


Library Entry
Program Evaluation Tipsheet 74: Increasing Your Survey Response Rate

Citation: Kiernan, Nancy Ellen (2005). Increasing Your Survey Response Rate. Tipsheet #74, University Park, PA: Penn State Cooperative Extension. This document is part of a series of Tipsheets that contain practical evaluation illustrations based on current research and developed by Nancy...

Tipsheet 74.pdf
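The tipsheet itself is the attachment above. As a minimal, hedged illustration of the quantity the tipsheet aims to improve, here is a sketch assuming the standard definition of response rate (completed responses divided by eligible sample); the function name and the numbers are invented, not drawn from the tipsheet.

    # Hypothetical sketch: survey response rate under the standard definition
    # (completed responses / eligible sample). Not taken from Tipsheet 74.
    def response_rate(completed: int, eligible: int) -> float:
        if eligible <= 0:
            raise ValueError("eligible sample must be positive")
        return completed / eligible

    # e.g., 180 completed surveys out of 400 eligible recipients -> 45%
    print(f"{response_rate(180, 400):.0%}")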


Library Entry
Bibliometrics: A Key Performance Indicator in Assessing the Influence of Biomedical Research

Bibliometrics, a quantitative evaluation of publication and citation data, is one type of indicator of productivity, influence, collaboration, and reach of scientific programs. Using research publications from programs funded by the National Institutes of Health (NIH) Common Fund, this...

AEA 2016 Bibliometrics Poster.pdf
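For a concrete sense of the kind of indicator this entry describes, here is a hedged sketch of one widely used bibliometric, the h-index: the largest h such that h publications each have at least h citations. Whether the NIH Common Fund poster uses this particular indicator is an assumption not confirmed by the entry, and the citation counts below are made up for illustration.

    # Sketch of the h-index, one common bibliometric indicator: the largest h
    # such that h publications each have at least h citations. Example data
    # is invented; this is not presented as the poster's actual method.
    def h_index(citations: list) -> int:
        counts = sorted(citations, reverse=True)
        h = 0
        # Advance h while the (h+1)-th most-cited paper has >= h+1 citations.
        while h < len(counts) and counts[h] >= h + 1:
            h += 1
        return h

    # e.g., a program whose publications were cited [10, 8, 5, 4, 3, 0] times
    print(h_index([10, 8, 5, 4, 3, 0]))  # -> 4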