The purpose of the evaluation was to learn to what extent Statistics for Action, a set of materials, resources, and training developed by math educators with input from environmental organizers, could increase numeracy among environmental organizers and the community members they serve. For this generally unpopular content delivered in an unusual context, the author describes project and evaluation design choices that worked, and those that did not.
Presentation Abstract: Evaluators of informal educational interventions may enter semi-public, semi-private settings such as living rooms and church community rooms. Under pressure to quantify the academic learning taking place, evaluation priorities can conflict with participants' expectations. To gauge changes without pre- and post-assessments of math skills, Statistics for Action (SfA) evaluators designed instruments that supported the project's core principles of group work and communication, developing focus group protocols and survey questions on group practices and gains. Negotiating conflicting expectations, evaluators adjusted their design to address the competing interests of the target audience (community members involved in local environmental issues, including math avoiders) and the client (gung-ho math educators determined to build quantitative literacy in the context of environmental organizing). Project staff share approaches better avoided, methods that elicited evidence of change, and project design choices that enabled evaluators to deliver the data.

#EnvironmentalProgramEvaluation #MixedMethodsEvaluation #2012Conference