The Systems in Evaluation Topical Interest Group (SETIG) is a community created within the American Evaluation Association (AEA) to provide a forum for ongoing conversation about the use of systems thinking and systems theory in evaluation.

SETIG developed from a series of sessions on systems theory and systems thinking held during the 2002 AEA Annual Meeting in Washington, DC. The TIG held its first planning meeting in Atlanta in 2004 (attendance: 4) and sponsored its first sessions at the 2005 AEA conference in Toronto.

SETIG focuses on the use of systems thinking and systems theory as a framework for evaluation planning, design, implementation, analysis, and reporting. Members work across a wide range of content areas, using a diverse array of evaluation approaches, research methods, and data collection and analysis tools.

Areas of interest to TIG members include:

  • Evaluation approaches that use systems theory to plan, design, and implement evaluation
  • Discussion about how to ground evaluation methods and approaches in systems thinking and theory
  • The contributions of diverse perspectives in understanding issues related to the use of systems thinking and theory in evaluation

TIG leadership places a priority on welcoming members at all levels of experience, comfort, and familiarity with systems thinking. We have a diverse membership of professionals from 35+ countries spanning a wide range of experience in systems thinking and in evaluation. Several members have published books focusing on systems thinking and evaluation, and a number of TIG members have served on the AEA Board, including 2014 AEA President Beverly Parsons.

A priority for the TIG has been to craft offerings and activities that are both interesting and accessible to all systems thinkers, whether novice or experienced. As a result, we’ve seen the focus of our conference sessions evolve along the following lines:

  • defining this thing called systems thinking
  • how we’re using systems thinking in our work
  • how we can support others in the use of systems thinking in evaluation practice
  • developing guiding principles for the use of systems thinking in program evaluation

The SETIG Leadership

The SETIG Leadership Team consists of five positions: two co-chairs, two program co-chairs, and a communications lead. Each member of the leadership team holds a three-year term, with two previously elected members overlapping with two newly elected members to allow a smooth transition. Each role includes the following responsibilities, among others that individuals bring to the role:

  • Co-chair: lead the TIG by organizing the TIG business meeting at AEA conferences, hosting leadership meetings during the year, and representing the TIG in AEA meetings
  • Program co-chair: coordinate the peer review process for AEA conference sessions and work with the co-chairs to organize virtual networking and learning sessions between conferences
  • Communications lead: keep the website updated, send announcements to members, and use social media to connect TIG members

Elections are usually held by online ballot, with the newly elected leadership being announced at the following SETIG Business Meeting.

You may contact the leadership team through their individual emails, through systemsinevaluationtig@gmail.com, or through this form.

2021 SETIG Leadership

Emily Gates, TIG Co-Chair (2019-2021)

Assistant Professor, Boston College, MA

As an assistant professor of evaluation at Boston College, I teach graduate-level evaluation & mixed methods courses; conduct evaluations; and juggle several research and writing projects. I’d say my systems interests really took off during graduate school, when I did my dissertation research on the implications of systems thinking and complexity science for evaluation. I mostly dug into critical systems heuristics and the systems thinking in practice work of the Open University, but I keep expanding to try out other systems approaches. Current projects include evaluating a system change initiative to personalize learning in K-12 school districts and a case study of how systems thinking and modeling inform evaluative learning in a philanthropic health initiative.

Email: emily.gates@bc.edu

Jeneen Garcia, Program Co-Chair (2019-2021)

Evaluation Officer, Global Environment Facility, Washington DC

With an academic background in environmental science and management, I was drawn to the complex systems field through my non-profit work in coupled social and ecological systems (which are basically everywhere on Earth!). My current work as an evaluator at the Global Environment Facility (a multilateral trust fund working with international development banks and UN agencies) is a natural progression. I enjoy designing interdisciplinary approaches and combining quant and qual methods to assess complex interventions in complex settings.

Email: jgarcia2@thegef.org

Kimberly Norris, TIG Co-Chair (2020-2022)
Senior Evaluation Specialist, EnCompass LLC, MD

With a dual doctoral focus in behavioral ecology and systems modeling, and having designed and applied adaptive management models with C.S. Holling along the journey (back in the ‘80s – yikes!), I appreciate the evolution of evaluative thinking and how we are moving toward more connected and supportive modes and models – both inherent to systems-based evaluation. I love how I can apply this orientation in my current role as a Senior Evaluation Specialist for EnCompass, a small, woman-owned international evaluation, learning, and leadership strengthening firm based in the DC area.

Email: knorris@encompassworld.com

Clara Shim, Program Co-Chair (2020-2022)
PhD candidate, Boston College, MA

I am currently a doctoral student at Boston College, with training in program evaluation theory and methods, along with data collection and analysis. I am working on an evaluation for the Boston College Online Master of Healthcare Administration program, centered on effective teaching and learning practices in the online learning space. My research interests also align with my current work: I have recently developed a comprehensive framework for evaluating quality indicators of online graduate programs, and I hope to build upon the framework and study how such indicators can best be measured.

Email: suhcl@bc.edu