AEA Conference TIG Sessions 2023
Internal Evaluation Poster Viewing
- Day/Time: Wednesday 5:30 PM – 7:00 PM ET
- Room: Griffin Hall
10 years of Evaluating the Food Safety Program (Poster 87)
- Presenter(s): Debra Dekker
For the past ten years, the U.S. Food and Drug Administration (FDA) has partnered with the National Association of County and City Health Officials (NACCHO) to fund the Voluntary National Retail Food Regulatory Program Standards Mentorship Program. This program provides state, local, territorial, and tribal (SLTT) agencies with funding to participate in a peer mentorship program aimed at improving conformance with one or more of the standards. The overall goals of the Mentorship Program are to increase enrollment in the training program, provide participating agencies with resources and tools to progress toward conformance, and provide opportunities for discussion and capacity building among participants. The objectives of the final 10-year evaluation were to assess the effectiveness of the Mentorship Program in achieving its goals and to determine whether the program improved conformance with the standards. Findings showed that the Mentorship Program laid the groundwork for aspects of the standards that mentees described as complex.
Evaluation of Alternative Preservice Teachers Practicum During a Pandemic (Poster 88)
- Presenter(s): Allen M. Mathende
This presentation discusses the critical evaluation components affected during the pandemic, such as developing measurable outcomes under severe time pressure and navigating challenges to external and internal validity and to ethical norms. The presentation tells the story that emerges as different evaluators work to resolve a situation on a tight timeline. Educator Preparation Programs integrated virtual reality to supplement or provide alternative field experiences for preservice teachers who had missed their practicums. From their stories, we develop methods for dealing with unforeseen situations that demand immediate solutions. Questions to be answered include: How do we validate our outcomes? How do we develop quick, measurable outcomes? How can these evaluations be shared during a pandemic?
A story of partner engagement using data synthesis and social network analysis as evaluation approaches (Ignite Session)
- Day/Time: Thursday 10:15 AM – 11:15 AM ET
- Presenter(s): Bilquis Khan
- Room: 209
The Centers for Disease Control and Prevention (CDC) Division of Workforce Development Office of Policy, Partnerships, and Recruitment (OPPR) works with different partners to educate policymakers, implement public health workforce programs, and develop strategies to strengthen the public health workforce. These partners support national efforts to recruit, train, and build the workforce. In this session, the presenter shares approaches used and explored to understand partner engagements and their contributions to the public health workforce. Attendees may be able to leverage these approaches and methods in their own settings and programs. Attendees will learn how OPPR: (1) harnesses different data sources to delineate and understand partnership data to create their partner engagement story, and (2) explores social network analysis to organize and visualize metrics to understand the nature of partner connections.
Children Formerly Associated with Armed Forces/Groups (CAAFAG) in South Sudan who received vocational training open a carpentry business, prompting an evaluation that shows widespread success (Ignite Session)
- Day/Time: Thursday 10:15 AM – 11:15 AM ET
- Presenter(s): Julia King
- Room: 209
In 2021, CMMB, supported by UNICEF, provided vocational training for Children Formerly Associated with Armed Forces/Groups (CAAFAG), more commonly known as former child soldiers, in Western Equatoria State, South Sudan. In November 2022, CMMB learned of the success of the CAAFAG who own the carpentry business and of their ability to pass on their skills. This prompted CMMB to conduct an internal evaluation in January 2023. The evaluation goal was to assess the impact of Technical Vocational Education and Training skills on the CAAFAG group enrolled in Tindoka Vocational Centre. The evaluation revealed that vocational training, from carpentry to tailoring, was successful: trainees now own their own businesses, CAAFAG are teaching their skills to those who did not attend the vocational classes, and overall psychosocial support has begun to help heal PTSD.
Internal Evaluation of Three Unique Projects Funded through the American Rescue Plan Act (Ignite Session)
- Day/Time: Thursday 3:45 PM – 4:45 PM ET
- Presenter(s): Natalie Wilson, Edis Osmanovic
- Room: 209
The West Virginia Department of Health and Human Resources (DHHR) outlined four American Rescue Plan Act-funded initiatives for the West Virginia University (WVU) Health Affairs Institute (Health Affairs) to implement with DHHR's collaboration and sponsorship, including one initiative tasked with evaluating the other three. On the surface, the initiatives had very different aims. It was the job of the evaluation team to tell the story of how they all worked toward a common goal of strengthening Home and Community-Based Services for Medicaid beneficiaries. The evaluation team used the RE-AIM framework, taking advantage of its adaptability to pivot toward the needs of the internal teams, the sponsors, and other stakeholders. This work was supported under contract with the West Virginia Department of Health and Human Resources.
Working with Program Staff to Create a Compelling Narrative of the Program's Impact Through Meaning-Making
- Day/Time: Thursday 3:45 PM – 4:45 PM ET
- Presenter(s): Lenka Berkowitz, Elena Kuo
- Room: White River Ballroom A
The success of an evaluation hinges on an evaluator's ability to help others make sense of the findings, which is crucial for developing a strong program narrative. In this skills-building workshop, we will explore an innovative approach to engaging program staff in reflecting on evaluation findings to co-create a compelling program story. We will use the example of work conducted by the Alliance for a Healthier Generation (Healthier Generation) in partnership with Kaiser Permanente to advance equitable whole-child health. Through this partnership, Healthier Generation works with school districts in eight regions across the U.S., helping to make policy, practice, and environmental changes. Drawing on our recent evaluation work, we will demonstrate how evaluators can harness program staff expertise to collaboratively create a narrative that accurately and authentically captures the program's impact. The workshop will spotlight “meaning-making sessions,” a guided process that facilitates group reflection to uncover new insights from data and generate actionable next steps for program improvement. The theoretical grounding combines a utilization-focused approach to evaluation with feedback loops and reflective practice. Attendees will have the opportunity to learn from our recent experiences with meaning-making sessions conducted as part of the program evaluation and will leave the session with practical steps for implementing this approach in their own evaluation work.
Stories of Possibilities
- Day/Time: Friday 11:30 AM – 12:30 PM ET
- Presenter(s): Norma Kok
- Room: White River Ballroom B
In monitoring and evaluation (M&E), we rely on quantitative and qualitative metrics to draft M&E reports, which are usually technical and theoretical. We often do not hear the voices of our participants and communities about their own experiences, successes, and challenges on various projects. Citizen Leader Lab, a non-profit organisation based in South Africa, uses storytelling to provide insights into programme processes, outcomes, and impact.
Rewriting the Narrative - Sharing Stories Behind the Statistics (Birds of a Feather Session)
- Day/Time: Friday 1:00 PM – 2:00 PM ET
- Presenter(s): Lexi Jones
- Room: White River Ballroom E
This session will be an opportunity for evaluators to discuss how their organization is gathering, incorporating, and responding to client voice in their evaluations. The speaker will moderate a discussion around the role of “anecdata” in internal evaluation, co-designing evaluation processes with clients, and using principles of storytelling and data visualization to bring data to life.
Storytelling from within: Parallel initiatives to build evaluation capacity
- Day/Time: Friday 2:30 PM – 3:30 PM ET
- Presenter(s): Linda Lee, Desiree Greenhouse, Marc Salazar
- Room: Grand Ballroom 9
This session will present three parallel activities that an internal evaluation team at a nonprofit organization is implementing to build evaluation capacity. These activities focus on three groups: the organization’s leadership team, staff, and the internal evaluation team itself. For the organizational leadership team, the internal evaluation team provides training and support to develop and use theories of change, which helps capture and clearly communicate the story of change that each program is expecting and experiencing. At the staff level, the evaluation team implements a program to develop staff champions for learning and evaluation; the champions develop and implement staff-driven learning agendas, which help the organization look deeper into the programs, the contexts, and the experiences of community members. Internally, the evaluation team is focusing its capacity building on equity in evaluations, with emphasis on understanding and incorporating the unique contexts and stories of evaluation stakeholders. Discussion in this session will focus on the role that internal evaluators can play in building organizational capacity for telling stories from data.
Creating New Stories and Processes to Help Foster Change (Multi-paper Session)
- Day/Time: Saturday 8:00 AM – 9:00 AM ET
- Room: Grand Ballroom 3
Lessons Learned from Introducing Structural Change to a Change-Averse, Complex Environment
- Presenter(s): Scarlett Kingsley
This paper presentation will chronicle the process and lessons learned from implementing phase one of the first-ever collective impact evaluation system within Oklahoma State University’s (OSU) Cooperative Extension.
Let the Team Tell Its Own Story: An Internal Evaluation of Agile Processes for Test Development
- Presenter(s): David Anderson
Evaluation orthodoxy holds that an external evaluator is needed to tell a truthful, objective story. This presentation describes an instance where an internal team evaluated its own processes using traditional tools and the participatory writers’ room approach. Eight test developers were tasked with using Agile methods to create practice tests. The team answered the questions:
- Main Question: How effective were Agile methods for test development?
- To what extent did Agile team members use methods that were different from standard test development methods?
- To what extent were Agile methods more effective than standard test development methods?
- What were the benefits and challenges of using Agile methods?
Engaging Your Audience: Mapping the Plot of Your Evaluation Story (Presidential Strand)
- Day/Time: Saturday 9:15 AM – 10:15 AM ET
- Presenter(s): Benjamin Williamson
- Facilitator: Alicia Kiremire
- Room: Grand Ballroom 7
Just as a successful story pulls readers into the lives of its characters, an effective evaluation can transport its audience into the lived reality of the individuals who serve as its focus. Yet neither can come together without proper planning and development. In this session, attendees will explore how the steps for developing an effective evaluation plan can be artfully aligned with an author’s process for crafting a successful story. When developing an evaluation plan for a program, project, or intervention, there are many elements that evaluators must consider. Similarly, authors have many practical considerations to attend to when developing a creative project. “Who comprises the cast of characters? What is the setting of the story? What do the characters experience that transforms them?” And, of course, “How will I get published?” and “Who’s going to read this anyway?” Although the creative process of individual authors may differ, each storyteller aims to craft a narrative that engages their audience from beginning to end. Successful stories don’t simply emerge from the author’s mind in their final form! Rather, they are accomplished through an iterative process of planning and development. It is an author’s hope that, through this planning process, a well-designed story structure emerges that can guide their own creative process behind the scenes and propel their future audience through the story. In much the same way, a thoughtfully designed evaluation plan can guide the work we seek to accomplish in the field of evaluation, not only serving as a roadmap for the evaluation work itself but also as a framework for telling the story of the evaluation project. Just as an author must consider their future readers, we as evaluators must intentionally seek the best methods for engaging our audience of stakeholders—whether they are community members, organizational leaders, project funders, legislators, or otherwise.
Join us for this interdisciplinary session as we explore how world-building and evaluation capacity-building align! This skill-building workshop will provide evaluation planning resources for attendees and invite them to apply the skill to their own relevant projects in real time. We hope you’ll leave with a new conceptual tool for your next evaluation… or perhaps an idea for your first novel! No writing or evaluation experience is necessary to attend. Come discover the storyteller in you! This session will be led by Benjamin Williamson, who works as an internal evaluator for the K-12 STEM education non-profit CYBER.ORG. In his spare time, Benjamin enjoys many creative endeavors and is an award-winning screenwriter and filmmaker.