
Library Entry
Hot Tips for Rubric Design

The more simply a rubric captures complex ideas, the more deep thinking and careful design lies underneath. In this presentation, Jane shared some of the lesser-known tips for designing rubrics that are conceptually and methodologically sound, as well as great data visualization ideas for validly presenting evaluative conclusions drawn using rubric methodology. With awesome rubric design and dataviz, we can achieve breathtaking clarity without boiling rich outcomes down to something overly simplistic. #evaluationrubrics #2016Conference #MixedMethodsEvaluation #Rubrics #JaneDavidson

2016-10 AEA Hot Tips for Rubric Design.pdf

Library Entry
Eval13: Panel Session 444 - Evaluation Rubrics: What, Why, and How

Presentation at the AEA 2013 conference as part of the panel titled "Evaluation rubrics: What, why, and how" with Jane Davidson, Monica Pinto and Michael Scriven. The panel was designed to help participants understand what evaluation rubrics methodology is, why it is so important, and some tips for applying it successfully. This presentation, by Thomaz Chianca, demonstrated the main steps required to develop high-quality rubrics, using the example of a recent evaluation of an educational program by a major foundation in Brazil: the Futura Channel/Roberto Marinho Foundation.

AEA'13 - Apres Thomaz 2013-10-18.pdf

Library Entry
Eval10 Session 738: Using Rubrics Methodology in Impact Evaluations of Complex Social Programs: The Case of the Maria Cecilia Souto Vidigal Foundation’s Early Childhood Development Program

Multipaper Session 738: Impact Evaluation and Beyond: Methodological and Cultural Considerations in Measuring What Works. ABSTRACT: Rubrics are important tools for describing evaluatively how well an evaluand is doing in terms of performance or quality related to specific dimensions, components and/or indicators.

TChianca AEA'10 FMCSV PDI 2010-11-05.pdf

Library Entry
Using Peer Feedback for Assessment Capacity-building

Noting several valuable uses for the system and its rubric-structured feedback, the authors go on to provide aggregate data they are using to guide more focused assessment capacity-building efforts.

Using Peer Feedback for Assessment Capacity-building.pdf

Library Entry
Navigational Map: Four Quadrants

It’s not unusual for an investigation of an intervention to be a hybrid of two quadrants, especially within educational settings—for example, an experimental or quasi-experimental design that incorporates an “assessment tool” (e.g., a rubric) to measure learning gains attributed to an instructional strategy (e.g., mentoring techniques, e-portfolios, problem-based learning, etc.).

JHSingh_2.9.2012_Navigational Map.pdf

Library Entry
Eval12 Session 266: An Example of Court-ordered Education Evaluation to Test Constitutionality and Drive Policy: The Relationships, Relevance, and Responsibilities in the Meaningful Exposure Project

The Meaningful Exposure to Non-Tested Curriculum in Alaska Project was a result of a court order issued to the State of Alaska Department of Education and Early Development (EED) in the Moore v State of Alaska case. The purpose of the Meaningful Exposure project was to conduct an overall...

2 attachments

Results of the statistical analysis of the revised Curriculum Exposure rubric are also included in an Appendix.