Here, we apply it to map findings from the evaluation of the PAC-Involved pilot, an innovative learning model to increase high school students' engagement in STEM. This approach provides a transparent, collaborative way to assess the breadth and interconnectedness of our theory, track how it is improving, show which parts of our theory are evidence-supported, and identify options and leverage points for action and questions for future research.
poster theory visualization.pdf
This panel features three papers that discuss strategies for assessing student engagement, evaluating teacher professional development, and evaluating equitable access across networks of arts partnerships.
AEA_EVAL2016_KC_presentation.pdf
Poster for AEA 2015. To engage "Internet Generation" (born post-2000) students in STEM, we need new pedagogical models. For the PAC (Physics, Astronomy, and Cosmology)-Involved pilot project, we developed and tested an innovative learning model to engage under-represented high school students in...
poster new model.pdf
A newsletter by Meaningful Evidence, LLC to share findings of the Howard University PAC-Involved project evaluation with student participants. AEA 2015 conference. #STEMEducationandTraining #EvaluationUse #STEM #studentengagement #evaluationreporting
ME newsletter Final Edits Nov 7.pdf
One-page handout about Meaningful Evidence's evaluation of the Howard University PAC-Involved project, a STEM Education Topical Interest Group Exemplar Evaluation for the American Evaluation Association 2015 Conference (Chicago, November 2015). #SystemsinEvaluation #ProgramDesign ...
PAC-Involved Evaluation handout AEA2015.pdf
This workshop focused on how to develop strong goals, objectives, and performance measures within an evaluation to demonstrate evidence of a successful intervention. The workshop provided definitions of goals, objectives, and performance measures and the criteria that should be...
Crafting Strong Measures_2M Research_AEA 2019.pdf
In the aftermath of Hurricane Katrina, a multi-national technology corporation partnered with eight districts in Mississippi and Louisiana to transform them into 21st Century Learning Systems. Working closely with the funder, district leaders, and various partnering organizations, EDC's...
AEA Presenation.pptx
Evaluators are continually tasked with making value decisions in the course of study design. In our decisions about implementation fidelity, we place value on specific observations (e.g., self-report, trained observer ratings) and measurement indicators (e.g., dosage, environment, observed use)...
AEA_Styers_11-3-11.pdf
Examples of real-world use of different kinds of evaluation rubrics in non-profit contexts. #2013Conference #use #Rubrics #EvaluationUse #nonprofit
131018 Rubrics preso AEA McKegg vxx km.pptx
Multiple Methods for Examining Outcome Data with Implementation Data: Alternatives for Determining Key Factors of Effectiveness; W. Wolfersteig, A. Valdivia, & A. Kopak; Arizona State University. Evaluation 2009, session #224. Presentation discussed different ways the authors examined test...
Eval 2009 #224 Wolfersteig.pdf