Coffee Break

AEA Coffee Break Webinars (CBD)

We're pleased to offer a series of short, coffee-break-length demonstrations, 20 minutes each, that introduce tools of use to evaluators.

Via this site, you can view and preregister for upcoming webinars (below), or sign in and go to the Webinars eLibrary to download archived recordings of previous Coffee Break Demonstrations.


If you are not a member and would like to participate in the Coffee Break webinars, we offer multiple membership options that allow you to register for any Coffee Break webinar you wish for one year from the month of payment. The regular e-membership is $85 and gives you free access to each Coffee Break webinar; this is discounted to $30 for full-time students. These e-memberships include electronic-only access to four journals. To upgrade and get one-year access to all of AEA's Coffee Breaks, please click here to join, or contact Zachary Grays in the AEA office.

Upcoming Coffee Break Demonstrations

Thursday, June 1, 2017 2-2:20 PM 

Excessive Evaluation Anxiety: What are the signs and how can you address it? - Patrick Cortez, Heather Codd and Emily Connors

Many evaluators have encountered situations in which stakeholders demonstrate anxiety during a program evaluation. While some anxiety can be good, high levels of anxiety can impact and potentially derail the evaluation process, leaving evaluators feeling confused, disappointed, and searching for solutions.

Excessive evaluation anxiety is a term used to describe extreme levels of anxiety among stakeholders in response to a program evaluation. Stakeholders who are likely to develop excessive evaluation anxiety include program staff and participants impacted by the program. Some argue that excessive evaluation anxiety is linked to negative outcomes, such as the lack of evaluation use, undermining of evaluation activities, and damage to evaluator credibility.

Scholars have identified six signs of excessive evaluation anxiety: conflict, withdrawal, resistance, shame, anger, and a sense of loss of control. Join us for our upcoming Coffee Break to learn more about these signs and some strategies to help mitigate them!

Learning Objectives:

  •  Deepen attendees’ understanding of excessive evaluation anxiety, including its signs and sources;
  •  Develop awareness of potential strategies to mitigate excessive evaluation anxiety in evaluation contexts;
  •  Provide new insights based on a recent study on excessive evaluation anxiety.

Patrick Cortez is the Research Associate for the Center for the Future of Organization at the Peter F. Drucker and Masatoshi Ito Graduate School of Management at Claremont Graduate University. His current work focuses on survey development, data collection, and data visualization for the OSML index and its results. His OSML-related interests include streamlining the reporting process and finding novel ways to evaluate social media initiatives within and across organizations. Prior to this position, he worked across several different evaluation projects and contexts, ranging from educational programs at universities across the U.S. to assessing the effectiveness of leadership development initiatives such as employee onboarding and leadership coaching.


Heather Codd is a doctoral student in Evaluation and Applied Research Methods at Claremont Graduate University (CGU). Heather’s research focuses on finding opportunities to enhance organizational performance through improved evaluation practice. Her current research interests include organizational learning and unlearning, systems thinking, evaluation capacity building, and evaluative thinking. Since coming to CGU she has worked on a number of evaluation projects, both international and domestic, in the areas of education and training, health, and professional development. Prior to beginning her doctoral studies, Heather worked for the Government of Saskatchewan in the areas of post-secondary education policy and finance, and intergovernmental relations.



Emily Connors, M.S., is the Director of Evaluation and Research Analytics for the Clinical and Translational Science Institute at the Medical College of Wisconsin. Ms. Connors has a strong background in social science research, translational research, and multi-site evaluation research. She has been conducting research and evaluation inside the nonprofit sector for over six years. She brings a collaborative and participatory approach to her work and has experience working with diverse sets of stakeholders both inside and outside of the nonprofit sector. Ms. Connors is a founding Board Member of ¡Milwaukee Evaluation!, a state-based affiliate of the American Evaluation Association (AEA).


Join this presentation by registering here.

Tuesday, June 6, 2017 2-2:20 PM 

Emerging Trends in Evaluation - Beverly Peters

American University’s Measurement and Evaluation program has identified several key trends in evaluation pertinent to its students and evaluators alike. In this second of two Coffee Break webinars, Beverly Peters discusses these trends, which will be elaborated upon in a series of free webinars organized by American University in 2017. Topics include:

  • Evaluating Performance in Disaster Management: Measuring performance in disaster management can involve evaluating several aspects, including prevention, preparedness, deterrence, and response. A case study from the 2015 Amtrak disaster shows the metrics the city of Philadelphia used to measure its performance in these aspects. (Samantha C. Phillips, MA, MPH, Evaluating Performance in Disaster Management)
  • Complexity-Aware Monitoring: How an evaluation is designed depends on a number of factors, including the complexity of the project itself. Using a Cynefin framework is one way to flesh out potential complexities and evaluation designs toward developing an appropriate evaluation approach for a project. (Kirsten Bording Collins, Founder and Director of AdaptivePurpose)
  • Evaluating Programs in Complex Environments: The Case of Afghanistan. Evaluating programs in complex environments presents a unique set of challenges. The case of Afghanistan highlights these challenges as they relate to human resources, security, data, methodology, and reporting. (Mitch Teberg, M&E Specialist)

In AU’s series of free webinars, experts will discuss these emerging trends, relating them to case studies in their fields. Webpage with webinar registration links:

Beverly Peters is an Assistant Professor with the School of Professional and Extended Studies at American University, where she teaches in the unit's online Measurement and Evaluation programs. She has more than twenty years of experience teaching, conducting research, and managing community development and governance programming in Africa. She writes an online blog series on Qualitative Methods in Monitoring and Evaluation for American University.



Join this presentation by registering here.


Thursday, June 8, 2017 2-2:20 PM 

Strategies and Tools for Democracy, Rule of Law, and Governance Research - Cat Kelly 

In this Coffee Break, Cat Kelly will discuss the strengths and weaknesses of several qualitative methods for conducting applied democracy, rule of law, and governance research: comparative case studies, focus group discussions, semi-structured interviews, and surveys with qualitative components. The session begins with an overview of how both research and evaluation can use social science theories about development to ground an organization’s choice of particular methods and specific tools for analysis, continues by touching on the strengths and weaknesses of various methods and tools, and ends with tips for deploying rapid versions of these methods and tools in practical, policy-oriented (as opposed to academic) settings. The presentation draws on several examples of tools and methods used at the American Bar Association – Rule of Law Initiative to research, monitor, and evaluate programs.

Learning Objectives:

  • Articulate why having a clear theory about any aspect of democracy, rule of law, and governance that one is researching is essential for making the right choices about which tools and methods to use for analysis;
  • Evaluate the strengths and weaknesses of various qualitative methods and their associated tools for conducting program- and policy-relevant democracy, rule of law, and governance research;
  • Identify ways that practitioners can adapt these tools and methods, which academics often use in longer form for multi-year research initiatives, for the much more rapid use required in democracy, rule of law, and governance programming.

Dr. Catherine Lena Kelly is a Mellon / American Council of Learned Societies Public Fellow, working in the Research, Evaluation, and Learning Division of the American Bar Association - Rule of Law Initiative. She brings years of research and evaluation experience in sub-Saharan Africa to her practice, as well as doctoral-level training in qualitative and quantitative methods for research on democracy, rule of law, and governance. Fluent in French and proficient in Wolof, she holds a Ph.D. in government from Harvard University and has consulted or worked for organizations including Freedom House, the University of Cape Town’s African Legislatures Project, the International Budget Partnership, and the U.S. Department of State. She has conducted nearly two years of field research for academic projects in Senegal, Ghana, and Burkina Faso and has been a Fulbright Scholar at the Free University of Brussels, where she published research about foreign intervention in the Democratic Republic of Congo. Her work has been published by ABA ROLI, by academic outlets like Comparative Politics and Journal of Democracy, and by blogs at The Washington Post, Council on Foreign Relations, and Social Science Research Council.

Join this presentation by registering here.



Tuesday, August 15, 2017 2-2:20 PM

Intro to Non-Technical Project Management - Mary Fairchild

While we may not call it a project, we are all doing project management in some form or another. Come learn about the steps in the project management cycle, some common terms, and a few tips to keep work moving and on task.

Learning Objectives:  

  • Identify what we consider "projects";
  • Understand common project management terminology;
  • Learn strategies to stay on task and keep the work moving.


Mary Fairchild is a certified professional project manager who has spent her career applying these techniques within the human resources profession. She has worked in the tech industry for large corporations such as Microsoft and holds a master's in Organizational Developmental Psychology. Mary uses program evaluation in the corporate setting to measure the return on human resource programs such as mentoring, coaching, training, and more. She is currently a consultant in this capacity.


Join this presentation by registering here.