The mission of the Practice Research Network is to build an infrastructure linking investigators in The University of Maryland, Baltimore, Department of Psychiatry with the public mental health system through an innovative approach to nurturing activities that reflect the value of collaborative and participatory research. There will be an in-depth discussion evaluating the process of partnership development with local mental health authorities (Core Service Agencies), clinics, providers, advocacy organizations, and consumers throughout Maryland. A description of the initial and evolving structure of the network will underscore the importance of involving multiple stakeholders to increase access to studies. Data will be provided to illustrate the number of study referrals generated by the network. There will also be a discussion of the modifications needed to this approach, based on client feedback and evaluation gathered in the two years since the network's inception.
Session Description: In an environment of increasing social participation and transparency, communities of practice are one means to unite a variety of partners to address common issues, share resources, and learn new information. When asked to design an evaluation of this type of complex social initiative, evaluators increasingly turn to system-level evaluation. One means of framing a system-level evaluation is the use of social and behavioral science theory. Both implicit and explicit uses of theory will be covered, including theory of change as an explicit approach that links activities, outcomes, and contexts in a way that maximizes the attribution of interventions to outcomes. This workshop will use lecture, exercises, and discussion to improve attendees' ability to understand the application of a systems-level evaluation to communities of practice, as well as how to design evaluations of such complex social initiatives.
This session will provide the opportunity to discuss the development of a university-wide alumni outcomes assessment. Internal academic review at Nova Southeastern University (NSU) requires that all programs provide data on student learning outcomes from current students and alumni. With 14 different academic colleges awarding undergraduate, graduate, and professional degrees, collaboration is essential to developing a single, university-wide assessment instrument. A team of university representatives, including faculty members, researchers, administrators, and an alumni development officer, worked with an external consultant to design a uniform, annual assessment grounded in the university's core values. The assessment provides a mixed-methods approach to evaluating alumni feedback with both closed and open-ended questions.
AEA.2009.alumni.ppt
Working with vulnerable populations raises questions of reciprocity: as evaluators, we acquire information from communities, but what are we giving back? In this session, the presenter will demonstrate how a community consultation was structured in a way that not only collected data from participants (former political prisoners in Burma) but also trained them on the methods being used to collect their data, allowing them to use these methods with their communities in the future. Topics of this demonstration include how the evaluation team: collaborated with local partners on evaluation design; trained participants on methods they deemed useful; incorporated participatory methods from start (co-creating consent forms) to finish (reporting and use); and used interactive methods, including story circles and storyboards, to allow participants to interpret data and document their own recommendations. Throughout the demonstration, the positive and negative implications of adopting such an approach with a vulnerable population will also be discussed. Note: All photos of program participants used during the presentation were replaced with generic photos prior to being posted here.
Eval2015-Reciprocity in Research.pptx
Each one will generate a list of strengths and limitations of each approach. They will report back to the group with a summary of their insights. The panel of experts will comment on the lists and engage the group in a dialogue about the insights shared at this session.
Strengths & Limitations of Collaborative Participatory and Empowerment Evaluation Approaches.pdf
PowerPoint presentation for "Examples From the Field: Applying Theories of Collaboration and Communities of Practice," Saturday (11/14), 11:50-12:35 session.
AEA - Outcome-based Approach to Evaluate Collaborative Efforts 11-4-09.ppt
The Swiss Innovation Agency CTI has administered its collaborative research funding scheme since the early 1980s.
Title: Using Developmental Evaluation to Address Uncertainty from a Systems Perspective. Abstract 3: This presentation will discuss how three developmental evaluations addressed issues of uncertainty. The studies took place in different settings: (1) a university-based public/private partnership seeking to transform health professions education and clinical practice nationwide; (2) six organizations collaborating to address youth homelessness in a major urban center; and (3) a collaboration between a foundation and an urban school district.
Pivot to Peace is a hospital-based violence intervention program that serves patients ages 18-34 who are treated for gunshot or stab wounds at University of Louisville Hospital. Participants reside in one of nine high-violence, high-crime West Louisville neighborhoods. An emergency room community health worker recruits participants at the point of service, and thereafter PeaceEd (a community-based organization) facilitates connections to social services and conflict resolution training. Evaluation design and metrics of success were determined using stakeholder involvement approaches; outcome indicators are assessed by triangulating primary and secondary quantitative and qualitative data sources either identified or created by the collaborative members and evaluators. A quasi-experimental evaluation design is used to compare intervention group outcomes to comparison group outcomes; the comparison group is comprised of residents from a Louisville neighborhood with comparable levels of crime and other stress-inducing conditions.
Trinidad Jackson 2016 AEA poster.pdf
This entry presents an example of collaboration across systems based on shared principles and an exemplar of a principles-based developmental evaluation.