
Library Entry
NSF Investment in Advancing Evaluation

This poster, presented at AEA 2014, displays the results of an exploratory study in which we examined grant awards made by the National Science Foundation's Directorate for Education and Human Resources that were intended to advance evaluation knowledge and practice among STEM educators and evaluators.


Library Entry
How the Funder Can Structure a Useful Evaluation

Slide show from an expert lecture presented Saturday morning at Evaluation 2009, on how the funder of an evaluation can facilitate better cooperation among program and evaluation staff.

funder slide show.ppt

Library Entry
Eval13: Panel Session 444 - Evaluation Rubrics: What, Why, and How

Presentation at the AEA 2013 conference as part of the panel titled "Evaluation rubrics: What, why, and how," with Jane Davidson, Monica Pinto, and Michael Scriven. The panel was designed to help participants understand what evaluation rubrics methodology is, why it is so important, and some tips for applying it successfully. This presentation, by Thomaz Chianca, demonstrated the main steps required to develop high-quality rubrics, using the example of a recent evaluation of an educational program run by a major foundation in Brazil, the Futura Channel/Roberto Marinho Foundation.

AEA'13 - Apres Thomaz 2013-10-18.pdf

Library Entry
Eval16 - Are We There Yet? Applying Rapid Cycle Learning Methods to Evaluation within a Foundation’s Program Design

In this panel presentation we discussed: 1) how RWJF shifted its evaluation focus from accountability to learning and prioritized rapid cycle learning (RCL); 2) how the evaluation team worked with RWJF to plan for and integrate RCL tools; and 3) how integrating RCL within the design phase positioned RWJF for learning during program implementation.

AEA 2016_RCL panel slides 11.20.2016_FINAL.pdf

Library Entry
Do Health Professions Pipeline Programs Make a Difference? Findings from the RWJF SMDEP Impact Study

This presentation provides an overview of the final evaluation report, which describes program implementation across sites and measures the effect of the program on students' outcomes using a quasi-experimental design. Among other key findings, the evaluation found that the majority of SMDEP participants stay on the path to a potential career in health, but that program effectiveness varies by the type of program offerings and by important implementation characteristics such as program leadership approach and faculty stability.


Library Entry
Eval12 Session 653: Common Core Indicators for Describing Successful Alliances

This paper explores the creation of a set of common indicators for measuring National Science Foundation (NSF) alliance programs. It draws from indicator frameworks developed for Broadening Participation and for Informal Science Education and initiates a conversation on reporting similar...

Evaluating Alliance Programs.pdf