
AEA Coffee Break Webinars (CBD)

We're pleased to offer a series of short, coffee-break-length demonstrations, 20 minutes each, that introduce tools of use to evaluators.

Via this site, you can view and preregister for upcoming webinars (below), or sign in and go to the Webinars eLibrary to download archived recordings of previous Coffee Break Demonstrations.

If you are not a member and would like to participate in the Coffee Break webinars, we offer multiple membership options that allow you to register for any Coffee Break webinar you wish for one year from the month of payment. The regular e-membership is $85 and gives you free access to each Coffee Break webinar; full-time students pay a discounted rate of $30. These e-memberships include electronic-only access to four journals. To join and get one-year access to all of AEA's coffee breaks, please click here or contact Zachary Grays in the AEA office at zgrays@eval.org.

Upcoming Coffee Break Demonstrations

Thursday, March 30, 2017 2-2:20 PM 

Designing Data Visualizations with Empathy - Amanda Buenz Makulec

The best visualizations are rooted in knowing who our audience is and designing with them in mind from the start. Too often, though, "identifying our audience" ends up as a box to check: a laundry list of stakeholder groups, the funder, the partners, relevant technical experts, all identified by organizational name and title alone.

What if instead we thought of our audience through a more human lens? By borrowing techniques from human centered design, like persona development and journey mapping, we can better understand the interests, wants, needs, and frustrations of our audiences, and what would motivate them to take action based on what we present. Then, we can design with those considerations in mind. 

During this coffee break, you'll learn: 
• What it means to "design with empathy" 
• Two design techniques for understanding your audience in more detail 
• How these techniques can be used in planning and workshop settings when you're designing your next dashboard, report or presentation 

Amanda Makulec, MPH, works at the intersection of data analysis, visualization, and storytelling in public health. As a Visual Analytics Advisor at John Snow Inc., Amanda has worked on information design projects and conducted workshops for clients across the US and in more than a dozen countries. She is one of the founders of the Data Viz Hub, a community designed to share examples of great visualizations, resources, tools, and ideas for making information accessible within the international development community. She is passionate about working with teams to find creative ways to promote the use of data for social good.


Join this presentation by registering here.


Tuesday, April 11, 2017 2-2:20 PM

EVIRATER: A Systematic Approach to Rating the Strength of Evidence from the Full Range of Study Designs - Barbara Goodson & Cris Price

This presentation will discuss EVIRATER, a new system for differentiating the strength of evidence produced by evaluations across the full spectrum of study designs. EVIRATER allows for review of studies representing a continuum of evidence, including both the experimental and quasi-experimental studies typically reviewable within existing federally endorsed rating systems, as well as evaluations using interrupted time series and weaker quasi-experimental or one-group pre-post designs. EVIRATER provides a systematic approach for distinguishing between different types of study designs, as well as between stronger and weaker applications of those designs. Using an example from Project LAUNCH, a large multi-site grant program from the Substance Abuse and Mental Health Services Administration, we demonstrate how EVIRATER combines information on outcome measures, design features, and methods of data analysis to categorize the results of separate impact estimates into one of five "strength of evidence" categories.

Barbara Dillon Goodson is a nationally recognized expert in education evaluation, with over 40 years of experience in designing and conducting implementation and outcome studies. She is currently the Principal Investigator on the National Evaluation Technical Assistance contract for the Investing in Innovation (i3) grants, where she leads the evaluation technical assistance team supporting local evaluators on more than 150 grants testing the impact and implementation of a range of pre-K through grade 12 educational interventions.

Cristofer Price is a Principal Scientist at Abt Associates with over 29 years of experience in behavioral and educational research. He is one of Abt Associates' leading resources for study design and analysis and is skilled at conveying complex statistical concepts in accessible terms. He has presented results not only to project officers and other evaluators but also to non-technical audiences, including practitioners and other key stakeholders. In his 20 years at Abt, he has been the technical lead on a wide variety of projects spanning many types of evaluation designs, data elements, units of analysis, and analytical methods.

Join this presentation by registering here.


Thursday, May 11, 2017 2-2:20 PM

“I Didn’t Know What I Didn’t Know”: Retrospective Pretest/Posttest Design in Teen Programs - Jill Young

Evaluators need more design options to meet the challenges they face when assessing the effectiveness of a program. Researchers have offered the retrospective pretest/posttest design as a remedy to curb response-shift bias and better estimate program effects, but few studies have used this approach with youth. Response-shift bias occurs when survey respondents overestimate or underestimate themselves at pretest because they do not yet have an adequate understanding of the construct on which they are evaluating themselves.

After School Matters, a Chicago nonprofit that provides afterschool programs to teens, tested the retrospective pretest/posttest design using a mixed methods approach to determine whether response-shift bias existed for teens in the program, and if so, why. The study also examined the cognitive processes teens used to complete retrospective pretest questions. This presentation provides an overview of the study, including findings and practical recommendations for internal evaluators.

Learning Objectives:
Attendees will learn how to:
• Describe the cognitive processes youth use to complete surveys
• Identify self-report biases and how they affect program impact estimates
• Apply the retrospective pretest/posttest design in youth programs

Jill Young has over 10 years of experience in nonprofit research and evaluation. She is currently the Senior Director of Research and Evaluation at After School Matters in Chicago, Illinois, where she leads research and evaluation efforts for 25,000 after-school and summer program opportunities for teens each year. Before joining After School Matters, Ms. Young worked as a statistical analyst at the University of Chicago and as a research manager at Northwestern University. She graduated with honors from Drake University, earning her BA in journalism and mass communication, and earned her MA and PhD in research methodology from Loyola University Chicago.

Join this presentation by registering here.
