
Library Entry
Meaningful Evaluation in the Context of Schools

Three presentations from the AEA 2015 session "Meaningful Evaluation in the Context of Schools," held Fri, Nov 13, 2015 (03:30 PM - 04:15 PM). 1. Holistically Evaluating Teacher Professional Development 2. Rapid Cycles of Evaluation for School Improvement 3. School...

3 attachments

Library Entry
NSF Investment in Advancing Evaluation

This poster, presented at AEA 2014, displays the results of an exploratory study in which we examined the grant awards made by the National Science Foundation’s Directorate for Education and Human Resources that were intended to advance evaluation knowledge and practice among STEM educators and...


Library Entry
A Sign of the Times - More is Better: Exploring the Additive Effect of Professional Development on Student Science Exam Outcomes

The purpose of this paper is to examine longitudinal quantitative and qualitative data to uncover how an additive effect of professional development influences student outcomes. The program evaluation of a large science grant initiative outlined the delivery of a series of PD opportunities to a...

2 attachments

Library Entry
Using the Quality Implementation Tool as a Framework for Process Evaluation: Going Beyond Fidelity

Slides for the talk "Using the Quality Implementation Tool as a Framework for Process Evaluation: Going Beyond Fidelity," presented in the session "Quality Implementation and Evaluation of Evidence-based Interventions." #QualityImplementationTool #ImplementationScience #GettingToOutcomes

final AEA QIT for Process Eval 10-15-13.pdf

Library Entry
Eval12 Session 653: Common Core Indicators for Describing Successful Alliances

This paper explores the creation of a set of common indicators to measure a set of National Science Foundation (NSF) alliance programs. It draws from indicator frameworks developed for Broadening Participation and for Informal Science Education and initiates a conversation on reporting similar...

Evaluating Alliance Programs.pdf

Library Entry
Eval12 Session 911: Team Science Evaluation: Developing Methods to Measure Convergence of Fields

A central problem in the evaluation of team science is how to measure team formation. For team-based projects whose goal is to bring together researchers from multiple disciplines to catalyze new research approaches, one metric is the evolution and use of terms that combine aspects of multiple disciplines...

AEA2012_Slides_Final Unni Jensen.pptx

Library Entry
Eval12 Session 755: Mixed Methods Evaluation of Team Science Education

Abstract: Novel educational offerings aimed at training translational researchers in the skills necessary to pursue collaborative, team-based research have increased with the introduction of the Clinical and Translational Science Awards (CTSA). Yet evaluating team training curriculum and its...

Rainwater & Henderson Team Science AEA 2012.pdf

Library Entry
Eval12 Session 663: Framework for Evaluating Individual Learning Outcomes of Citizen Science

Developed by the DEVISE (Developing, Validating, and Implementing Situated Evaluations) team as part of a 3-year NSF grant to build evaluation capacity in citizen science programs. Presentation Abstract: Projects that engage the public in scientific research (often referred to as “citizen...


Library Entry
Eval11 Session 779: Tracking for Translation: Novel Tools for Evaluating Translational Research Education Programs.

The Clinical and Translational Science Awards (CTSA) incorporate innovative translational research training programs aimed at producing a diverse cadre of scientists who work collaboratively to rapidly translate biomedical research into clinical applications. Evaluation of these programs that...

AEA 2011 Session 779 Rainwater Griffin and Henderson - Tracking for Translation.pdf