Library Entry
Adding "Value" to Evaluation in Education Settings: Opportunities for Evolving Roles of Evaluators in an Education Research and Development Paradigm

Slides from Session 1929 of the 2014 AEA Conference, a panel presented by Tania Jarosewich, Kirk Knestis, Jeni Corn, and Rita O'Sullivan.
#collaborativeevaluation #Dissemination #STEMEducationandTraining #2014Conference #CommonGuidelinesforResearch #scaleup

Adding Value to Evaluation in Education Settings.pdf


Library Entry
Pivot to Peace: A Hospital-Based Violence Intervention Collaborative Evaluation Design

The evaluation design and metrics of success were determined through stakeholder involvement approaches; outcome indicators are assessed by triangulating primary and secondary quantitative and qualitative data sources, either identified or created by the collaborative members and evaluators. A quasi-experimental evaluation design compares intervention group outcomes to those of a comparison group composed of residents of a Louisville neighborhood with comparable levels of crime and other stress-inducing conditions.
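The comparison-group logic of a design like this can be sketched in a few lines. The following is a minimal illustration only: the outcome measure, the group data, and the choice of Welch's t statistic are all assumptions for exposition, not details taken from the poster.

```python
# Hypothetical sketch of the comparison a quasi-experimental design implies:
# intervention-group outcomes vs. outcomes for matched-neighborhood residents.
# All values and names here are invented for illustration.
from statistics import mean, stdev
import math

def welch_t(x, y):
    """Welch's t statistic for the difference between two group means."""
    se = math.sqrt(stdev(x) ** 2 / len(x) + stdev(y) ** 2 / len(y))
    return (mean(x) - mean(y)) / se

intervention = [0, 1, 1, 0, 2, 1, 0]  # e.g., adverse events per participant
comparison = [2, 3, 1, 2, 4, 2, 3]    # residents of the matched neighborhood
print(f"Welch t = {welch_t(intervention, comparison):.2f}")
```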

Trinidad Jackson 2016 AEA poster.pdf


Library Entry
Eval12 Session 191: Distributed Evaluation - Benefits and Challenges of Cross-Organization Collaborations

Presenter(s): Gina Svarovsky, Science Museum of Minnesota, gsvarovsky@smm.org; Amy Grack Nelson, Science Museum of Minnesota, agnelson@smm.org

Abstract: Evaluation projects involving evaluators distributed across the country can have several benefits, but they also present a range of challenges. The Science Museum of Minnesota will share its experiences and lessons learned from two distributed evaluation projects: one exploring the communication structures within the Nanoscale Informal Science Education Network (nisenet.org), the other a collaborative synthesis of evaluation reports posted on the website informalscience.org. Both projects involve a team of evaluators from multiple organizations working together to create a shared vision for the study, find the best online tools to facilitate collaborative work, cooperatively define and refine coding categories, coordinate multiple coders within the same NVivo project, and orchestrate reporting to various stakeholders. Session participants will gain insight and share their own ideas about how to conduct these types of distributed evaluation projects and how to overcome the challenges that naturally arise during this kind of collaborative work.
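One recurring task the abstract names, coordinating multiple coders against shared coding categories, is commonly checked with an inter-coder agreement statistic. Below is a minimal sketch of Cohen's kappa; the codes, items, and the use of kappa itself are illustrative assumptions, not details from the session.

```python
# Hypothetical sketch: measuring agreement between two coders who applied a
# shared set of coding categories, a common check before merging coding work
# in a multi-coder NVivo project. Codes and items are invented for illustration.
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

coder_1 = ["barrier", "benefit", "benefit", "barrier", "neutral", "benefit"]
coder_2 = ["barrier", "benefit", "neutral", "barrier", "neutral", "benefit"]
print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")
```

A low value on a check like this would prompt the kind of cooperative refinement of coding categories the abstract describes.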

2 attachments


Library Entry
A Collaborative Mixed-Method Evaluation of a Multi-Site Student Behavior, School Culture, and Climate Program

AEA 2013, Multipaper Session 751. Session Title: Customizing Evaluation to Make it Work for You: Performance Metrics, Mixed Methods, and Evaluation Systems. This paper presents a mixed-method, multi-disciplinary study focused on student behavior, school culture, and climate work in grades 3-9, piloted by a large, education-focused non-profit.

AEA slides_BCCI_FINAL (AEA Library Post).pptx



Library Entry
Eval11 Session 220: An Evaluation of an Innovative, Collaborative Approach to Interfacing Research Systems with the Mental Health Community

The session offers an in-depth discussion evaluating the process of partnership development with local mental health authorities (Core Service Agencies), clinics, providers, advocacy organizations, and consumers throughout Maryland.

5 attachments

Focus Search:

Abstract: The mission of the Practice Research Network is to build an infrastructure linking investigators at The University of Maryland, Baltimore, Department of Psychiatry with the public mental health system through an innovative approach to nurturing the development of activities that reflect the value of a collaborative and participatory approach to research.

MOVING FORWARD

Lesson Learned: Consumers/Family Members/Advocates have a better understanding of research and indicate an interest in participating in studies. Next Steps: Increase utilization of the Network of Care Behavioral Health Research Website through training; address stigmatizing attitudes towards research; increase involvement in all stages of the research process.

Lesson Learned: Providers who had contact with the PRN have found it to be helpful and beneficial to their organization. Next Steps: Arrange for investigators to provide educational presentations, including continuing education credits; keep providers informed about the outcome of research that has been conducted at their agency; enable providers to share ideas about potential research topics.

Lesson Learned: This was not an effective way to evaluate our relationship with Investigators and study teams. Next Steps: Incorporate Investigator feedback into the PRN process; obtain feedback from organizations where data was collected; better define roles among the PRN, the Investigator, and the study team.

For additional information, please contact the Practice Research Network Liaisons:
Deb Piez, Dpiez@psych.umaryland.edu, 410-646-0355
Dan Nieberding, Dnieberd@psych.umaryland.edu, 410-646-1302
Sandra Sundeen, Project Manager, Ssundeen@psych.umaryland.edu, 410-646-3253