Session Description: In an environment of increasing social participation and transparency, communities of practice are one means of uniting a variety of partners to address common issues, share resources, and learn new information. When asked to design an evaluation of this type of complex social initiative, evaluators increasingly turn to system-level evaluation. One means of framing a system-level evaluation is the use of social and behavioral science theory.
The panelists presented (1) the history, development, and context of the network surrounding the evaluation capacity building initiative; (2) the evaluation capacity building initiative and its implementation within the complex system of the NISE Net; and (3) a summary of the results of the initiative, using case studies from two different levels of the network. The discussants then addressed (1) the relationships between the network system and evaluation capacity building and (2) the implications of the findings for the field of evaluation. The panel ended with an opportunity for attendees to raise questions or offer their own observations on evaluation capacity building in complex systems. #Systems #ECB #evaluationcapacitybuilding #ResearchonEvaluation #complexadaptivesystems #2015Conference
2015 AEA CASNET combined slides.pptx
This panel addressed strategies for promoting evaluation use in a variety of nonprofit settings. Panelists shared their perspectives on the multiple roles evaluators inhabit; working with stakeholders to promote use throughout the evaluation process; developing a theory of action with appropriate goals, metrics, and indicators; establishing systems to build internal capacity to use evaluation data; and working in and around an organization's political and cultural structures.
AEA 2012 Eval Use-Full Slide Deck v2.pptx
Presentation and related publications on scaling up a reproductive health innovation using a systems-oriented approach, based on a five-year prospective study in five countries.
See matching library entry files - THEORY AND PRACTICE and Monitoring...
The entry presents an example of collaboration across systems based on shared principles and an exemplar of a principles-based developmental evaluation. The collaboration consists of a family foundation convening six nonprofit organizations that serve homeless youth in a metropolitan area. The collaborative approach was based on the assumptions that youths' different needs are met by different agencies; that what constitutes effective programming varies depending on the age, history, and situation of the youth; and that the providers are operating within a complex adaptive system. A multiple case study method was selected to investigate how these principles were employed and experienced by multiple stakeholders within the system. A systems inquiry framework was applied to the design and analysis.
Developmental evaluation (DE) is an early 21st century development, so it is altogether fitting that its state of development be reviewed at the AEA annual conference. DE applies systems thinking and complexity concepts to the evaluation of social innovations. Evaluation in complex dynamic systems requires nimble adaptation, timely feedback, and special attention to phenomena such as emergence, nonlinearity, and dynamical change.
State of Developmental Eval AEA 2013.pdf
Existing definitions and models of systems change are reviewed, and practical challenges of systems change evaluation are discussed. Meta-evaluation information is used to identify the kinds of changes that occur when systems change efforts have been successful. Finally, efforts to conduct such assessment within the context of ongoing systems change evaluation projects are discussed, and the use of a new assessment tool is introduced. Examples are drawn from the fields of developmental disabilities, education, and substance abuse. #ResearchonEvaluation #Systems #2009Conference #SystemsinEvaluation #Evaluation2009 #TheoriesofEvaluation #Change #efforts #HowTo #evaluating
AEA systems change 11 08 09.doc
Is evaluation used in the public sector? Realization of vision-and-values-driven social innovation (Patton, 2010) typifies the ideal of the public sector - the democratic governments of today. Public sector work in the 21st century means adaptation and innovation as the norm in providing services amid the changing realities of our complex ecology. Evaluators involved in public sector evaluations will share experiences and insights about how governments value, use, do, and manage evaluation, including infusing evaluative thinking to facilitate evaluation use through evaluation capacity building of both individuals and the system by engaging stakeholders. We focus on the example of Ontario's education sector, where Developmental Evaluation is positioned as an executive leadership responsibility focused on decision-oriented use (Patton, 2012) fundamental to professional capacity building. Leaders' valuing of evaluation and modelling of evaluative thinking are essential to building the sector's capacity to use evidence to inform decisions and implementation for student achievement and engagement.
AEA 2013 Oct 18 Panel Session 442 The role of evaluators in infusing evaluative thinking .pdf
At the UW Institute for Clinical and Translational Research, embedded evaluators use nested logic models to summarize complex layers of intent, assist with program improvement, encourage an evaluative perspective, communicate program achievements, and identify evaluation tasks and metrics.
Hogle AEA 2011 session 871.pptx
Evaluators entering the world of Clinical and Translational Science Award (CTSA) institutes are often struck by the complexity of evaluating these institutions. Each CTSA encompasses several cores, or research service providers (e.g., drug discovery and development, biomedical informatics, or community engagement), with drastically different focuses and, therefore, different indicators of progress. Annually, evaluators are challenged with developing integrated reports for the National Institutes of Health using data collected across the diverse CTSA cores. This poster presentation will first display how two new-to-CTSA evaluators tackled the task of understanding CTSAs.
FINAL Poster_10_4_13.pdf