
Internal Evaluation TIG

Purpose Statement
The purpose of the Internal Evaluation TIG is to provide a forum for networking, community building, learning, and professional development for those interested in internal evaluation in a wide variety of settings – organizations and partnerships, nonprofits and for-profits, governmental agencies and NGOs, national and international milieus.
The Internal Evaluation TIG strives to:
  • Develop and sponsor AEA conference sessions related to internal evaluation
  • Promote effective practices of internal evaluation in organizations
  • Promote the sharing of professional information related to internal evaluation
  • Contribute to the development of theoretical frameworks and successful practices for internal evaluation



Internal Evaluation Presentations at AEA 2016

A Picture is Worth a Thousand Words … But Will You Use It?
Presenters: Suzanne Markoe Hayes, Evaluation and Research Director, Volunteers of America, Los Angeles; Ana Flores, Research Analyst, VOALA; Joshua Paul, Research Analyst, VOALA
When the job is to help and empower the most vulnerable, most human service providers are not interested in the numbers; for them, it is about increasing quality of life. Given this mindset, the Internal Evaluation department at Volunteers of America, Los Angeles works to connect the numbers of individuals being served to key messages through quick and meaningful visuals. Visuals must be engaging and easy to digest in order for the user to interpret and utilize the information, and key messages from the reports need to be used to make real-time decisions.

Bridging the Gap Between Internal and External Evaluation
Presenters: Abby Laib, Evaluator, Colorado Dept. of Public Health & Environment; Shannon Lawrence, Evaluator, Colorado Dept. of Public Health & Environment
In larger organizations, "internal" evaluators may be centralized into a single evaluation unit as a way to pool talent and increase efficiency. Because evaluators in this role are not embedded in individual programs, and may in fact evaluate multiple programs, they often need to negotiate the needs and demands of different stakeholders. As a result, they can find themselves "caught in the middle" between internal and external evaluation.

Challenges of internal evaluators: A LinkedIn post that struck a chord! (Birds of a Feather Gathering)
In October 2015, a LinkedIn message I posted to an AEA group resulted in a flow of personal emails, lively online discussion, and even a paragraph in someone's doctoral dissertation. My question, "What is your #1 challenge as an internal evaluator?", must have struck a chord with many evaluators. This gathering focused on questions such as: Given the emergent nature of program design, evaluators and program designers need to work effectively with each other, but does anyone actually know how to do this? Is saying "we should involve evaluation early" enough?

Deepen Your Understanding: Using Evaluative Rubrics as a Tool for Evaluation Capacity Building
Presenter: Sherry Marlow Ormsby, University of Tennessee
Frequently, internal evaluators are spread thin and unable to dedicate the necessary time and resources to smaller programs. Evaluative rubrics serve as one tool for organizations to learn and implement evaluation strategies, providing a platform to discuss what matters with clients and stakeholders in a transparent way. By teaching program leaders to effectively utilize evaluative rubrics for continuous improvement, an organization can increase its overall evaluation capacity.

Designing Internal Evaluations to Promote Leadership Success
Presenter: Nicole Kozma, Manager, Advocacy and Outreach, St. Louis Children's Hospital
Evaluators from an urban pediatric hospital sought to understand what makes evaluation successful for leadership and funders, and what could be improved to match evaluation to strategic planning needs. Results show that evaluation gives leaders confidence in decision-making and is successful when it gives them the data they need to advocate for staff and programs that are making an impact. Leaders and funders need to be educated about the evaluation process and its limitations.

Development of Performance Indicators to Evaluate Diverse Clinical Research Programs
Presenter: Susanna Weiss, National Institute of Allergy and Infectious Diseases
This discussion focused on how one division at the National Institutes of Health in Maryland is developing performance measures to provide data-driven progress reports for programs and employees engaged in the facilitation and administration of clinical research in diverse domestic and global settings. The discussion also emphasized the importance of participatory methods of building performance metrics and goal-setting benchmarks to ensure that they adequately and fairly reflect each worker's productivity.

Incorporating Evaluative Thinking into Project Design and Implementation: Internal Emergent Learning, Developmental Evaluation, and Adaptive Management in Climate Change Programming
Presenter: Rees Warne, ClimateWorks Foundation
A project well designed for one context may not work well when that context changes. Climate change mitigation is a relatively new field, and a change in the political party in power, oil price fluctuations, new laws, or new technologies can substantively alter a project's operating environment. An internal Emergent Learning approach to design and implementation can build learning into the project's fabric, facilitating the flexibility to adjust to new information and changing conditions, and can be supplemented by developmental evaluation.

Internal Evaluation of Project Intervention Creates Room for Improvement
Presenter: Obialunamma Onoh
Routine evaluation of a project's interventions is not always the norm in resource-limited environments. This talk reported the outcomes of a cross-sectional study, which informed the decision to develop a retention calendar, expand adherence services, track missed appointments daily, and evaluate treatment outcomes monthly in order to improve the retention rate.

Interventions for Diverse Populations
Presenters: Kathleen Drucker, Associate Vice President, Research, Evaluation & Impact, NYC Leadership Academy; Elana Habib, Southcentral Foundation Learning Institute; Danelle Marable, Massachusetts General Hospital; Renata Peralta, Associate Director of Research, Evaluation, & Impact, NYC Leadership Academy; Barbara Sappah, Senior Improvement Advisor, Organizational Development, Southcentral Foundation
This session included two papers: "Using evaluation to inform the design of a blended learning aspiring principals program" and "Integrating datasets: Exploring Southcentral Foundation's relationship-based training through Kirkpatrick's four-level training evaluation model."

It's Just Me: Navigating the Waters as the Sole Internal Evaluator
Presenter: Dr. Pamela Bishop, Director, National Institute for STEM Evaluation and Research, University of Tennessee
Often, internal evaluators find themselves balancing the multiplicity of roles associated with their position without the benefit of colleagues with whom they can brainstorm and reflect on their evaluation practice. The presenter shared lessons learned and created a forum for other "island evaluators" to discuss how they address challenges and balance workflow, and to network with like-minded colleagues in a community of practice.

Maximizing the Benefit of Evaluation Advisory Groups
Presenters: Kimberly Leonard, Senior Evaluation Officer, The Oregon Community Foundation; Sonia Worcel, Vice President of Strategy and Research, The Oregon Community Foundation
Evaluation Advisory Groups (EAGs) have great potential to enrich the design and implementation of evaluations, as well as the use of evaluation results. The Oregon Community Foundation is using EAGs for three varied internal evaluations: a developmental evaluation, a policy/systems change evaluation, and a more traditional process and outcomes evaluation. This session shared promising approaches to engaging EAGs, pitfalls to avoid, and lessons from the field about the value of these groups.

Multiple perspectives and examples of responsive internal evaluation design
Presenters: Eric Barela, Salesforce.org; Jenn Bejaka; Dominic Combs, University of Illinois; Dana McCurdy, Program Evaluation Director, Partners in School Innovation
This panel highlighted the multiple perspectives within a team of internal evaluators composed of a lead evaluator and two evaluation interns. From the start, the lead evaluator chose to design a responsive evaluation in close collaboration with program designers and implementers. With a combination of guidance and simultaneous learning from the lead evaluator, the evaluation interns quickly learned what the role of a responsive internal evaluator entails.

RAD idea: A student-centered, collaborative, mixed-methods tool to assess change in arts-integrated classrooms
Presenters: Lina Cherfas, Program Evaluator, Urban Arts Partnership; James Miles, Urban Arts Partnership
Working toward closing the education achievement gap, Urban Arts Partnership places teaching artists in arts integration and afterschool instruction residencies in Title I schools. Each residency is documented through a Research and Action Dialogue (RAD), an ongoing performance-based assessment tool and project portfolio. Twice each year, teaching artists submit RAD Reports that document students' artistic, academic, and social-emotional growth. This session explored the RAD Report findings and how its design serves multiple needs: as a data collection method, a reflection tool, and a way to document progress toward creating change in struggling schools.

Research Priorities for Injury Prevention
Presenter: Karin Teske, ORISE Fellow, National Center for Injury Prevention and Control
In 2015, the National Center for Injury Prevention and Control (NCIPC) published research priorities intended to address gaps in injury and violence prevention research over the next 3-5 years. An evaluation is needed to determine whether the research priorities are being successfully addressed. The evaluation has two aims: 1. Determine whether NCIPC is doing research that is in line with the research priorities; 2. Determine the impact of research outcomes and findings on preventing or reducing the public health burden and consequences of injuries and violence.

Techniques for Overcoming Resistance to Program Evaluation
Presenter: Stanley Capela, HeartShare Human Services of New York
One major dilemma often confronted by an internal evaluator is resistance to program evaluation. The presenter has been an internal evaluator in the nonprofit sector for 38 years, where he has learned various techniques for overcoming resistance within an organizational setting. Participants learned techniques that can be applied in a range of organizational settings to overcome resistance to program evaluation and to ensure that senior management has the information needed to strengthen program performance.

Using Evaluation to Inform the Design of a Blended Learning Aspiring Principals Program
Presenters: Kathleen Drucker, Associate Vice President, Research, Evaluation & Impact, NYC Leadership Academy; Renata Peralta, Associate Director of Research, Evaluation, and Impact, NYC Leadership Academy
Since 2003, the NYC Leadership Academy (NYCLA) has been training school leaders through its Aspiring Principals Program. Recently, NYCLA tailored the program to accommodate school leaders who are geographically dispersed. The result is a "blended" approach to principal preparation that relies on both face-to-face and online learning experiences. This paper presented an overview of the program, results from the internal evaluation of this innovation, and how these results will be used to inform future program design.

Working Together – Evaluating Coordination in a State Health Department
Presenter: Anna von Gohren
In order to respond to shifting resource allocations by the Montana state legislature for the hiring of personnel, as well as to the complex interaction between chronic conditions and the growing importance of the Triple Aim in health care reform, the Chronic Disease Prevention and Health Promotion Bureau of the Montana Department of Public Health and Human Services began evaluating a coordination process that started three years ago, and as a result has made significant changes to how staff work and responsibilities are organized.


Questions or comments regarding the IE TIG website? Please contact the webmaster.