Tools and Processes

Contact emails are provided for one-to-one contact only and may not be used for mass emailing or group solicitations.

Session Title: Tools and Processes for Monitoring and Evaluating Systems Change
Multipaper Session 705 to be held in Panzacola Section F4 on Saturday, Nov 14, 9:15 AM to 10:45 AM
Sponsored by the Systems in Evaluation TIG
Chair(s):
Bob Williams, Independent Consultant, bobwill@actrix.co.nz
Keeping Track Under Complex Conditions: The Process Monitoring of Impacts Approach
Presenter(s):
Richard Hummelbrunner, OEAR Regional Development Consultants, hummelbrunner@oear.at
Abstract: This approach systematically observes the processes that are expected to lead to the results or impacts of an intervention. It builds on the assumption that inputs (as well as outputs) must be used by someone to produce desired effects. A set of hypotheses is identified on the intended use of inputs or outputs by various actors (e.g. partners, project owners, target groups), which are considered decisive for achieving effects. These hypotheses are incorporated into logic models as statements of 'intended use', and the assumptions are monitored during implementation to determine whether they remain valid and actually take place, or should be amended (e.g. to capture new developments or unintended effects). The paper describes the approach as well as the experience gained in Austria, where it has been applied to monitor EU Structural Fund programs and to provide an adequate understanding of program performance under complex and dynamic implementing conditions.
Addressing the Challenges of Systems Change Evaluation: Tools for Assessment
Presenter(s):
Anna F Lobosco, Developmental Disabilities Planning Council of New York State, alobosco@ddpc.state.ny.us
Dianna L Newman, University at Albany - State University of New York, eval@csc.albany.edu
Susan L Rogers, University at Albany - State University of New York, bottlecap22@hotmail.com
Abstract: Existing definitions and models of systems change are reviewed, and practical challenges of systems change evaluation are discussed. Meta-evaluation information is used to identify the kinds of changes that occur when systems change efforts have been successful. Based on this, a set of performance indicators is introduced for use in evaluating systemic change efforts. Finally, assessment efforts within the context of ongoing systems change evaluation projects are discussed, and the use of a new assessment tool is introduced. Examples are drawn from the fields of developmental disabilities, mental health, and education.
Evaluating Systems Change at the Massachusetts Department of Developmental Services Using the Build Initiative Framework
Presenter(s):
Jennifer Sulewski, University of Massachusetts Boston, jennifer.sulewski@umb.edu
Abstract: Since 2006, the Massachusetts Department of Developmental Services has been working to make its system of day services and supports more focused on competitive employment outcomes. This effort, undertaken with assistance from the Massachusetts Medicaid Infrastructure and Comprehensive Employment Opportunities (MI-CEO) grant, has involved intervention at multiple levels, including: 1) changes to agency-wide and regional policies and practices, 2) technical assistance to local service providers, and 3) training for direct support staff. Using the systems initiative evaluation framework developed by the Build Initiative, MI-CEO researchers developed a theory-of-change model incorporating the major strategies of the systems change effort. The model provides an organizing framework for a multi-method, multi-level evaluation that includes both system-wide evaluation strategies and evaluation of individual project components. For example, bellwether interviews assess high-level policy changes, case studies highlight successes and challenges at the local and individual levels, and participant surveys evaluate training outcomes.