Coffee Break

AEA Coffee Break Demonstrations (CBD)

We're pleased to offer a series of short, coffee-break-length demonstrations, 20 minutes each, that introduce tools of use to evaluators.

Via this site, you can view and preregister for upcoming webinars (below), or sign in and go to the Webinars eLibrary to download archived recordings of previous Coffee Break Demonstrations.

If you are not a member and would like to participate in the Coffee Break webinars, we offer multiple membership options that allow you to register for any Coffee Break webinar you wish for one year from the month of payment. The regular e-membership is $99 and gives you free access to each Coffee Break webinar; this is discounted to $33 for full-time students. These e-memberships include electronic-only access to four journals. To upgrade and get one-year access to all of AEA's Coffee Breaks, please click here to join or contact Zachary Grays in the AEA office.

Upcoming Coffee Break Demonstrations


Applying Complexity to Make Practical Decisions About Evaluation: A Three-session Series

Thursday, April 12, 2018 2-2:20 PM EST

Thursday, April 26, 2018 2-2:20 PM EST

Thursday, May 3, 2018 2-2:20 PM EST

This three-session coffee break series will show evaluators how to draw on complexity to make practical decisions about doing evaluation. The key is to shift attention from “complex systems” to particular themes and behaviors of complex systems. “Systems” is too ambiguous to guide operational decisions about program theory, data collection, data interpretation, or stakeholder engagement. Specific complex behaviors can provide insight about those decisions. Each presentation will start with a short common introduction, and proceed to discuss one specific complex behavior that could be useful to evaluators. Each session will stand on its own. The collective impact will familiarize evaluators with the field of complexity, and instill a sense that the field of complexity can make a contribution to evaluation practice.

Learning objectives

  • Raise evaluators’ comfort level with drawing on complexity as a source of knowledge.
  • Explain and demonstrate an observable and measurable complex behavior that can be applied when doing evaluation.



Jonathan A. Morell, PhD (Jonny), believes that evaluation requires a systems approach because interventions produce complex outcomes, but that evaluations should be as simple and straightforward as possible. He has evaluated safety programs, conducted multi-attribute needs assessments in environmental planning, and studied technology diffusion. His tool development work involves outcome monitoring for R&D and a novel approach to using project schedules as evaluation logic models. His research on complexity includes: 1) integration of agent-based modeling and traditional evaluation, 2) methodologies for evaluating unintended consequences, 3) interactions among models and empirical data, and 4) how to conduct dialogue between funders and evaluators concerning complex behavior. More on his work can be found at his website, blog, and YouTube channel.


Register Here

Qualitative Methods in Evaluation

Coffee Break Webinar Series

Qualitative data gives evaluators unique insight into project performance and evaluation. Qualitative methods shed light on how projects are working, or not working, and why. In this webinar series on Qualitative Methods in Evaluation, Professor Beverly Peters will draw upon her twenty years of experience in community development in numerous countries in Africa to discuss several qualitative methods that evaluators use to monitor and evaluate projects. Together, this set of five Coffee Break Webinars will give participants an appreciation for the importance and use of qualitative data collection techniques in evaluation today.


Tuesday, May 1, 2018 2-2:20 PM EST

Using Qualitative Methods Effectively: The Why and the How

What unique insight do qualitative methods give us as program managers and evaluators? How can we use qualitative methods effectively? In this Coffee Break Webinar, Beverly Peters will introduce this series in Qualitative Methods in Evaluation. She will draw on her experiences in southern Africa to discuss why she uses ethnographic methods to gain an emic, or insider, understanding of a project population when evaluating programming.

Learning Objectives 

At the end of this session you will be able to:

  • Understand the nature and rationale of qualitative inquiry.
  • Discriminate between the emic and etic perspectives.
  • Appreciate the use of qualitative methods for monitoring and evaluation.


Register Here

Thursday, May 17, 2018 2-2:20 PM EST

Observation and Participant Observation: What Should we be Observing and How Can We Do It?

Observation and participant observation are both important data collection tools that evaluators use, oftentimes without even knowing it! These tools provide unique insight into program operations that is especially useful for performance evaluation. What is the difference between observation and participant observation? What is the relevance of these traditionally ethnographic data collection tools to our work as evaluators?


Learning Objectives 

At the end of this session you will be able to:

  • Discriminate between observation and participant observation.
  • Compare and contrast the kinds of data that can be collected via observation and participant observation.
  • Discuss how to carry out observation and participant observation.
  • Understand how to record and manage data from observation and participant observation.
  • Appreciate the use of observation and participant observation for monitoring and evaluation.


Register Here

Tuesday, May 29, 2018 2-2:20 PM EST

Interviewing 101: The Why and the How

Evaluators use several different kinds of interviews to collect data to support their evaluations. These interviews can have more or less structure, depending on data collection needs.

Learning Objectives 

At the end of this session you will be able to:

  • Discriminate between unstructured, semi-structured, and structured interviews.
  • Compare and contrast the kinds of data that can be collected via unstructured, semi-structured, and structured interviews.
  • Understand how to record and manage data from interviews.
  • Appreciate the use of interviews for monitoring and evaluation.

Register Here
