The more simply a rubric captures complex ideas, the more deep thinking and careful design lie underneath. With awesome rubric design and dataviz, we can achieve breathtaking clarity without boiling rich outcomes down to something overly simplistic.
2016-10 AEA Hot Tips for Rubric Design.pdf
#Evaluation2009 #QuantitativeMethods-TheoryandDesign #2009Conference
An Empirical Test of the Regression Discontinuity Design.ppt
#RubricsMethodology #EvaluationandProgramDesign #2016Conference #Rubrics
Apresentacao_Geracao_Movimento_ALTERACOES MM..pdf
Evaluation design and metrics of success were determined through stakeholder involvement approaches; outcome indicators are assessed by triangulating primary and secondary quantitative and qualitative data sources either identified or created by the collaborative members and evaluators. A quasi-experimental evaluation design is used to compare intervention group outcomes to comparison group outcomes; the comparison group comprises residents of a Louisville neighborhood with comparable levels of crime and other stress-inducing conditions.
Trinidad Jackson 2016 AEA poster.pdf
The presentation addressed four specific areas: facilitating education of federal staff (internal evaluators and program staff) through training; addressing resource issues by expanding federal capacity to design and conduct evaluations; helping ensure continuous use of evaluation methods and results; and developing increased evaluation capacity by expanding the skill sets and methodologies available to support federal evaluation and accountability.
2016 Bernstein Hamilton Evaluation Contracting Final.pptx
This presentation addresses considerations for making evaluation design and program delivery culturally responsive.
See matching library entry files - Norma: CDC’s framework for program evaluation in...
Session 1414, Evaluation 2015: Design Thinking for Exemplary Evaluation: Three Examples of Design for Evaluation and Organizational Learning While the other presenters in the panel will describe the application of design thinking to program evaluation, this presentation will focus on a distinct but related topic: the use of design thinking for organizational learning in a foundation setting. Design thinking is a process for developing solutions to abstract, ill-defined, and complex problems. In addition, design thinking’s user-centered, participatory, and iterative nature makes it a useful process for engaging stakeholders. As such, the California HealthCare Foundation turned to design thinking when it began to explore how staff could learn more effectively
RTran_DT_SummaryAndSlides.pdf
Paper accompanying the presentation "Three Keys to Fusing STEAM Education: Digital Multimedia, Design Tools, and Computer-based Adaptive Multimedia" by Daniel Tillman, Song An, Meilan Zhang, and Rachel Boren.
AEA2014 Paper.docx
In addition, the panel illustrates that the inextricable elements of program design, evaluation design, and information design are best embodied by a fully integrated approach to these tools.
Session 1921 Evaluation into Lean.pdf
In this panel presentation, we discussed: 1) how RWJF shifted its evaluation focus from accountability to learning and prioritized rapid cycle learning (RCL); 2) how the evaluation team worked with RWJF to plan for and integrate RCL tools; and 3) how integrating RCL within the design phase positioned RWJF for learning during program implementation.
AEA 2016_RCL panel slides 11.20.2016_FINAL.pdf