
Library Entry
EVAL 2016: Evaluating Across a Complex System of Arts Education Programs

This panel features three papers that discuss strategies for assessing student engagement, evaluating teacher professional development, and evaluating equitable access across networks of arts partnerships.


Library Entry
A New Model for Engaging Under-Represented High School Students in STEM Using Popular Media and Technology: Lessons from the PAC-Involved Evaluation

Poster for AEA 2015. To engage "Internet Generation" (born post-2000) students in STEM, we need new pedagogical models. For the PAC (Physics, Astronomy, and Cosmology)-Involved pilot project, we developed and tested an innovative learning model to engage under-represented high school students in...

poster new model.pdf

Library Entry
Newsletter to share PAC-Involved evaluation findings with students

A newsletter by Meaningful Evidence, LLC to share findings of the Howard University PAC-Involved project evaluation with student participants. AEA 2015 conference. #STEMEducationandTraining #EvaluationUse #STEM #studentengagement #evaluationreporting

ME newsletter Final Edits Nov 7.pdf

Library Entry
Handout: The PAC-Involved Evaluation

One-page handout about Meaningful Evidence's evaluation of the Howard University PAC-Involved project, a STEM Education Topical Interest Group Exemplar Evaluation for the American Evaluation Association 2015 Conference (Chicago, November 2015). #SystemsinEvaluation #ProgramDesign ...

PAC-Involved Evaluation handout AEA2015.pdf

Library Entry
AEA 2019 Conference: Crafting Strong Measures for Indicators of Performance (2M Research)

This workshop focused on how to develop strong goals, objectives, and performance measures within an evaluation in order to demonstrate evidence of a successful intervention. The workshop provided definitions of goals, objectives, and performance measures and the criteria that should be...

Crafting Strong Measures_2M Research_AEA 2019.pdf

Library Entry
Eval11 Session 939: Lessons learned from evaluating a complex, multi-partner, multi-year, multi-site K-12 education intervention

In the aftermath of Hurricane Katrina, a multi-national technology corporation partnered with eight districts in Mississippi and Louisiana to transform them into 21st Century Learning Systems. Working closely with the funder, district leaders, and various partnering organizations, EDC's...

AEA Presenation.pptx

Library Entry
Eval11 Session 293: Does what we value make a difference in our assessment of implementation fidelity?

Evaluators are continually tasked with making value decisions in the course of study design. In our decisions about implementation fidelity, we place value on specific observations (e.g., self-report, trained observer ratings) and measurement indicators (e.g., dosage, environment, observed use)....


Library Entry
Multiple Methods for Examining Outcome Data; Eval 2009 #224

Multiple Methods for Examining Outcome Data with Implementation Data: Alternatives for Determining Key Factors of Effectiveness; W. Wolfersteig, A. Valdivia, & A. Kopak; Arizona State University. Evaluation 2009, session #224. Presentation discussed different ways the authors examined test...

Eval 2009 #224 Wolfersteig.pdf