
Library Entry
Eval2015 Session 1442: Principles in Practice—Stakeholder Engagement in Multisite Evaluations

Presentation 4: Building the Capacity of the Capacity-Builders—Lessons from the Internal Evaluation of a Multistate Technical Assistance Program: The technical assistance program being evaluated operates using a capacity-building framework to provide high-quality, relevant, and useful capacity...

AEA Presentation 2015 - Horwood.pdf

Library Entry
Engaging Various Levels of Stakeholders in Grant-Mandated Evaluation of Community-Based Organizations

These materials are from the Evaluation 2015 session. How can I build stakeholder interest in funder-mandated evaluations of a complex, multi-site program? During this session, participants will explore this question using the Community HealthCorps AmeriCorps program as an example. Community...

2 attachments

Library Entry
Culturally Specific Approaches to Systems and Policy Change to Reduce Health Disparities: Evaluating the EHDI Initiative

In 2001 the Minnesota Legislature established the Eliminating Health Disparities Initiative mandating the allocation of competitive grants to local programs to close the gap in the health status of African Americans/Africans, American Indians, Asian Americans, and Hispanic/Latinos in Minnesota...

2 attachments

Library Entry
Embracing Complexity and Collaboration in Multi-Site Evaluation: Insights From a Randomized Control Trial on Youth Mentoring

This session explored the first two years of the evaluation of the Office of Juvenile Justice and Delinquency Prevention (OJJDP)'s Mentoring Enhancement Demonstration Program (MEDP). MEDP funds 10 collaborative grantees encompassing 32 mentoring agencies nationwide. The American Institutes for...

AEA Presentation - MEDP_20141016.pdf

Library Entry
Data Collection from Afar Think Tank Notes

Remote work: What are best practices and great ideas to manage quality data collection from afar? Think Tank Session led by the Improve Group at AEA 2014 #EvaluationManagersandSupervisors #Cluster,Multi-SiteandMulti-LevelEval #Collaborative,ParticipatoryandEmpowermentEval #2014Conference...

data collection from afar group notes.docx

Library Entry
A Tale of Two Cities: Afterschool Evaluation in Oakland and San Francisco

From the 2014 AEA Annual Conference: A high-level look at aggregate, multi-year data patterns in San Francisco Bay Area afterschool evaluation. By Public Profit. #2014Conference #MixedMethodsEvaluation #Cluster,Multi-SiteandMulti-LevelEval #YouthFocusedEvaluation


Library Entry
Evaluating a Program Designed to Improve Attitudes between Police and Youth

This poster presents the results of an evaluation of a program which aimed to foster positive relationships between police officers and youth in Connecticut. Eleven programs were funded in Year One, with an equal number in Year Two. Funded communities designed programs that included local police...

Evaluating A Police and Youth Progam.pdf

Library Entry
Assessing Program Fidelity Across Multiple Contexts: The Fidelity Index, Part II

From AEA 2014, this demo session followed a step-by-step process for working with stakeholders to compute a fidelity index, or an overall summative score that assesses the extent to which the program in reality aligns to the program in theory. #2014Conference #Cluster,Multi-SiteandMulti...

Fidelity Index AEA2014_FINAL.pptx

Library Entry
Eval14: Multisite evaluation of school improvement initiatives

Serving Multiple Masters: Program Evaluation for School, District, State, and Federal Audiences TIG: Cluster, Multi-site and Multi-level Evaluation #Prek-12EducationalEvaluation #Communications #2014Conference #Cluster,Multi-SiteandMulti-LevelEval

Serving Multiple MastersPP_2014_10_16.pdf