AEA Public Library

Eval10 Session 307: Grappling With Uncertainty in Innovative and Complex Settings: Weaving Quality in Developmental Evaluation 

11-23-2010 13:17

Sponsored by the Systems in Evaluation TIG and the Indigenous Peoples in Evaluation TIG

Chair(s):
Syd King, New Zealand Qualifications Authority, syd.king@nzqa.govt.nz

Discussant(s):
Michael Quinn Patton, Utilization-Focused Evaluation, mqpatton@prodigy.net

Abstract: They say of life only three things are certain: birth, death, and taxes. We say of Developmental Evaluation that the only certainty is uncertainty. In this session we present the challenges associated with ensuring evaluation quality in innovative and complex situations. Through a Developmental Evaluation lens we explore the process of moving from a conceptual vision to on-the-ground practice of evaluation in emergent and dynamic contexts. We reflect on what it takes to systematically weave quality into the engagement processes, data collection, and evaluative thinking in settings where uncertainty reigns.

Navigating Uncertainty: The Cross-site Evaluation of the Supporting Evidence-based Home Visiting Grantee Cluster

Margaret Hargreaves, Mathematica Policy Research, mhargreaves@mathematica-mpr.com
Diane Paulsell, Mathematica Policy Research, dpaulsell@mathematica-mpr.com
Kimberly Boller, Mathematica Policy Research, kboller@mathematica-mpr.com
Deborah Daro, Chapin Hall, ddaro@chapinhall.org
Debra Strong, Mathematica Policy Research, dstrong@mathematica-mpr.com
Heather Zaveri, Mathematica Policy Research, hzaveri@mathematica-mpr.com
Heather Koball, Mathematica Policy Research, hkoball@mathematica-mpr.com
Patricia Del Grosso, Mathematica Policy Research, pdelgrosso@mathematica-mpr.com
Russell Cole, Mathematica Policy Research, rcole@mathematica-mpr.com

In 2008, the Children’s Bureau (CB) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services funded 17 cooperative agreements to support the infrastructure needed for high-quality implementation of existing evidence-based home visiting (EBHV) programs to prevent child maltreatment. The cross-site evaluation encompassed four domains: systems change, fidelity, child and family outcomes, and cost. Recognizing that grantees were operating in complex, dynamic, and unpredictable environments, the systems change domain used a Developmental Evaluation design that was responsive to changes in the initiatives and in their environments, including recession-related budget cuts and an unexpected, drastic reduction in grant and evaluation funding in the initiative's second year of operation. This paper reviews the evaluation’s developmental design and findings, including lessons learned by grantees about how to continue building infrastructure capacities and partnerships to support home visiting in tumultuous times.

Drawing on Deep Values to Ensure Evaluation Quality in Emergent and Uncertain Contexts

Kate McKegg, The Knowledge Institute, kate.mckegg@xtra.co.nz

As a visionary project has moved from the initial conceptual vision to on-the-ground implementation and development, the evaluators have been seriously challenged to ‘keep up’ and stay responsive to changing program needs. One of the emergent learnings from the developmental evaluation trenches has been that the quality and credibility of the evaluation process have been critically dependent on the evaluators’ ability to draw on the values, needs, strengths, and aspirations of the indigenous communities with whom we work to define what is meant by ‘good program content/design,’ ‘high-quality implementation/delivery,’ and ‘valuable outcomes.’ This paper will discuss how the evaluators are learning to systematically build communities’ own definitions of ‘quality’ and ‘value’ into the evaluative process.

Talking Past Each Other: The Language of the Developmental Evaluator in Indigenous Contexts and Its Link to Quality

Nan Wehipeihana, Research Evaluation Consultancy Limited, nanw@clear.net.nz

“It’s the damn English” (language) is a phrase one of my colleagues uses when she can’t find the English phrasing or terminology to explain a cultural concept, practice, or idea. Drawing on a developmental evaluation being conducted with tribal and community-based sport and recreation providers, this paper focuses on the language of evaluation as an essential precursor to quality in evaluation. Why language matters in evaluation is never more obvious than when we are ‘talking past each other’ (Metge & Laing, 1984), when vague looks come back at us, and when questions from the floor or in emails make the lack of connection apparent. The paper provides examples of evaluation language that worked and didn’t work in an indigenous developmental evaluation context, and reflects on the impact of language on engagement in the evaluation, shared understanding, evaluation practice, and ultimately evaluation quality.

What Does Quality Look Like in Developmental Evaluation in Indigenous Contexts?

Kataraina Pipi, Independent Consultant, kpipi@xtra.co.nz

Six months into a Developmental Evaluation with indigenous and non-indigenous providers of sport and recreation services, a key reflection has been the need and opportunity to use, within the evaluation, examples that emanate from a Maori world view and are grounded in the lived experience of what it means to be Maori. Within this context, ‘quality’ is beginning to be conceived of as an ‘as Maori’ process, guided by cultural principles, values, and practices. In this paper, we share our initial exploration of what this means for what we do, how we do it, and what we prioritize in a Developmental Evaluation in indigenous contexts.



#IndigenousPeoplesinEvaluation #SystemsinEvaluation #Developmental #2010Conference


Attachment(s)
ppt file
Grappling with Uncertainty in Innovative and Complex Sett....ppt   4.07 MB   1 version
Uploaded - 11-23-2010