What does it take to establish and maintain an evaluation community of practice?
Communities of Practice with pics.pptx
Slides from Session 1929 of the 2014 AEA Conference panel, "Adding 'Value' to Evaluation in Education Settings: Opportunities for Evolving Roles of Evaluators in an Education Research and Development Paradigm," presented by Tania Jarosewich, Kirk Knestis, Jeni Corn, and Rita O'Sullivan. #collaborativeevaluation #Dissemination #STEMEducationandTraining #2014Conference #CommonGuidelinesforResearch #scaleup
Adding Value to Evaluation in Education Settings.pdf
Evaluation design and metrics of success were determined using stakeholder involvement approaches; outcome indicators are assessed by triangulating primary and secondary quantitative and qualitative data sources either identified or created by the collaborative members and evaluators. A quasi-experimental evaluation design is used to compare intervention group outcomes to those of a comparison group composed of residents from a Louisville neighborhood with comparable levels of crime and other stress-inducing conditions.
Trinidad Jackson 2016 AEA poster.pdf
Presenter(s): Gina Svarovsky, Science Museum of Minnesota, gsvarovsky@smm.org; Amy Grack Nelson, Science Museum of Minnesota, agnelson@smm.org. Abstract: Evaluation projects involving evaluators distributed across the country can offer several benefits but also present a range of challenges. The Science Museum of Minnesota will share its experiences and lessons learned from two distributed evaluation projects: one exploring the communication structures within the Nanoscale Informal Science Education Network (nisenet.org), and the other a collaborative synthesis of evaluation reports posted on the website informalscience.org. Both projects involve a team of evaluators from multiple organizations working together to create a shared vision for the study, find the best online tools to facilitate collaborative work, cooperatively define and refine coding categories, coordinate multiple coders within the same NVivo project, and orchestrate reporting to various stakeholders. Session participants will gain insight and share their own ideas about how to conduct these types of distributed evaluation projects, and will address ways to overcome challenges that naturally arise during this type of collaborative work.
2 attachments
AEA 2013, Multipaper Session 751. Session Title: Customizing Evaluation to Make It Work for You: Performance Metrics, Mixed Methods, and Evaluation Systems. This paper presents a mixed-method, multi-disciplinary study focused on student behavior, school culture, and climate work in grades 3-9, piloted by a large, education-focused non-profit.
AEA slides_BCCI_FINAL (AEA Library Post).pptx
Working with vulnerable populations raises questions of reciprocity: as evaluators, we acquire information from communities, but what are we giving back?
Eval2015-Reciprocity in Research.pptx
There will be an in-depth discussion evaluating the process of partnership development with local mental health authorities (Core Service Agencies), clinics, providers, advocacy organizations, and consumers throughout Maryland.
5 attachments
An Evaluation of an Innovative, Collaborative Approach to Interfacing Research Systems with the Mental Health Community. Abstract: The mission of the Practice Research Network is to build an infrastructure linking investigators at The University of Maryland, Baltimore, Department of Psychiatry with the public mental health system through an innovative approach to nurturing the development of activities that reflect the value of a collaborative and participatory approach to research.
Moving Forward:
Lesson Learned: Consumers, family members, and advocates have a better understanding of research and indicate an interest in participating in studies. Next Steps: Increase utilization of the Network of Care Behavioral Health Research Website through training; address stigmatizing attitudes toward research; increase involvement in all stages of the research process.
Lesson Learned: Providers who had contact with the PRN have found it to be helpful and beneficial to their organization. Next Steps: Arrange for investigators to provide educational presentations, including continuing education credits; keep providers informed about the outcome of research conducted at their agency; enable providers to share ideas about potential research topics.
Lesson Learned: This was not an effective way to evaluate our relationship with investigators and study teams. Next Steps: Incorporate investigator feedback into the PRN process; obtain feedback from organizations where data were collected; better define roles among the PRN, the investigator, and the study team.
For additional information, contact the Practice Research Network Liaisons: Deb Piez, Dpiez@psych.umaryland.edu, 410-646-0355; Dan Nieberding, Dnieberd@psych.umaryland.edu, 410-646-1302; or Sandra Sundeen, Project Manager, Ssundeen@psych.umaryland.edu, 410-646-3253.
Through a strong collaboration between an external evaluator and the DAM's Education department leaders, rich evaluation data were collected to better understand the interests of Latino audiences in culture-specific programs and their experiences during a general museum visit. This collaboration was critical to ensuring that the results of the evaluation were useful to the DAM.
Listening to Latino Visitors - Enriquez-Salazar-Nielsen Oct 18-2014.pdf
A community organization delivering a sexual abuse prevention program to at-risk students in a large, urban school district worked collaboratively with an external evaluator to build evaluation capacity and develop student outcome measures.
No Secrets Program Evaluation.pdf
AEA 2010 Session 812 presentation #participatory #Stakeholder #involvement #2010Conference #stakeholder-based #collaborative #ResearchonEvaluation
Fukunaga&Brandon_AEA_2010_presentation_2010-11-13.ppsx