AEA Public Library

Evaluation rubrics: how to ensure transparent and clear assessment that respects diverse lines of evidence 

04-17-2013 02:50

This report, written by Judy Oakden, provides a detailed description of an evaluation and was produced as part of the first BetterEvaluation writeshop process, led by Irene Guijt. Peer reviewers for the report were Carolyn Kabore and Irene Guijt. Excerpt: Independent external evaluators generally have to work within a range of constraints, often with less than ideal availability of time, money, or data. This article presents an example of how a team of external evaluators worked around these constraints in an evaluation in the education sector. The evaluation process used a logic model to identify boundaries and rubrics to make evaluative judgements; the rubrics supported robust data collection and framed the analysis and reporting. A mixed-methods approach, combining qualitative and quantitative survey data with existing project data, helped build a rich evidential picture. Furthermore, an indigenous Māori perspective was present throughout the evaluation, ensuring Māori views were heard, respected, and actioned within this mainstream project.

#UnderstandCauses #LogicModel #DetermineWhatSuccessLooksLike #Survey #Example #SynthesizeDataFromASingleEvaluation #SupportUse #DevelopProgramTheory #Interviews #UtilizationFocusedEvaluation #Rubrics #NewZealand #DevelopReportingMedia
