This entry contains slides from Ning Rui and Kathryn Henderson's demonstration presentation, Using Meta-analysis in Evaluation Research: A Demonstration, given at the 2013 AEA Meeting. #metaanalysis
AEA 2013 Rui Henderson 10_17_13.pdf
Evaluation or assessment of scientific work in universities and other research organizations has traditionally been organized around the peer review system, with its almost jury-like functionality and a history of more than 300 years. The classic tradition looked only at the output or the product...
AEA paper 2009 Peer reviuew and dialogue.doc
A field experiment was conducted to test the effectiveness of norm-based persuasive messages in the context of evaluation research. Participants in an interdisciplinary conference were invited to complete two successive post-conference surveys and were randomly assigned to one of two groups at each time point.
AEA 2010-Synopsis and List of Resources-2.docx
Scan of the 1986 articles of consolidation, the document recording the merger of the Evaluation Research Society and the Evaluation Network to form the American Evaluation Association.
aea.articles.consolidation.pdf
This "Navigational Map" has four major quadrants to differentiate: 1) research, 2) evaluation research, 3) assessment (within education settings) and 4) program evaluation from each other
JHSingh_2.9.2012_Navigational Map.pdf
This Ignite presentation highlighted how I built relationships with two AEA affiliates, the Eastern Evaluation Research Society and the Washington Evaluators, by volunteering over the past two years.
David Urias of the Evaluation Research Network and Olga Pierrakos of James Madison University will explain the basics of photo journaling, detailing the process and providing examples of the product.
2 attachments
Issues associated with this process include the following: packaging the program for potential users; marketing/disseminating the program to potential users; building capacity to select and use effective interventions; understanding the decisions involved in adopting a new program; maintaining quality control over multiple implementation sites; facilitating and measuring implementation; considering the balance between fidelity and adaptation; and scaling up efforts to achieve public health impact. Each of these issues is a possible focus for evaluation research questions.
si10.Emshoff.1.7.pdf
Poster abstract: International development evaluators, researchers, and donors are increasingly using systematic reviews to expand the evidence base used to guide and improve the design of development interventions.
AEA Poster_Iceberg on Designing non academic systematic review.jpg