Evaluation 2020

Illuminating Educator Practices Through Evaluating Out-of-School Time STEM Learning 

10-21-2020 22:05

Abstract
Out-of-school time (OST) programs serve significant populations of youth underrepresented in science and engineering fields. Effective OST STEM programming is important to broaden participation and improve STEM engagement. Systematic evaluation of these efforts is critical to understand which efforts are effective and which outcomes are achieved. This poster will highlight a research and evaluation study of OST STEM programming and share experiences and lessons learned. Planetary Learning that Advances the Nexus of Engineering, Technology and Science (PLANETS) is a NASA-funded project developed to engage youth in planetary science and engineering through three curricular units and educator supports for use in OST settings. Systematic study of this project presented unique challenges that differ from those in formal education environments, owing to different programmatic goals, transience among participating youth, and the participation of a broad spectrum of OST educators. We provide insights and opportunities for improving evaluation methods and practices in OST STEM learning.


Summary
This poster highlights issues of relevance for evaluators and researchers in out-of-school time (OST) spaces. By “shining a light” on evaluation and research in OST STEM environments, we provide insights for improving evaluation methods and practices in this setting. 

Out-of-school time (OST) programs serve significant populations of youth underrepresented in science and engineering fields. Increasing STEM programming in OST settings has the potential to broaden participation in STEM and improve interest in and attitudes toward STEM. OST programs are well suited to STEM learning, as they offer more flexibility than formal schooling and more time for the exploration and decision-making that are critical to high-quality STEM experiences.

Planetary Learning that Advances the Nexus of Engineering, Technology and Science, or PLANETS, is a NASA-funded project of Northern Arizona University, the Museum of Science, Boston, and USGS Planetary Science that developed three curricular units and educator supports to engage youth in planetary science and engineering. As STEM experiences increasingly become a focus in OST environments, systematic evaluation of these experiences is critical to understand which efforts are effective and which outcomes are achieved. The project used a robust, systematic, and complementary evaluation and research approach both to inform the project for decision-making purposes and to generate new knowledge about implementation of STEM curricula in different OST contexts.

Evaluation of OST learning experiences brings unique challenges that differ from evaluation in formal education environments, due to different programmatic goals, transience among participating youth, and varying levels of commitment among OST educators. As Allen and Peterman (2019) note, ecological validity is important to consider when aligning methodology with the informal OST space.

Evaluators conducted field tests of the units to understand how educators used the materials, their perceptions of the materials, and effects on their teaching and content knowledge. The study also sought to understand how the materials influenced student attitudes toward engineering. The field test involved 11 OST educators and more than 200 middle school-aged youth at seven sites across the U.S. Data collection included an educator implementation survey, an educator knowledge survey, educator interviews and a youth engineering attitude survey. 

Researchers focused on how four OST educators in different contexts implemented the curricula and what influenced their decision making during implementation. The study used a mixed-methods design, including more than 100 hours of activity observations, educator implementation surveys, and educator interviews.

Investigators faced several issues conducting the research and evaluation in OST settings that were markedly different from evaluation in formal education settings.

  1. Although funders may want evidence of content learning, such outcomes may not be appropriate in OST programs, which typically have different goals for youth, such as increasing interest and engagement in the topic area. Constructs such as youth attitudes toward engineering and youth use of engineering habits of mind were more important in this OST space and so became the focus of youth data collection. A number of instrument suites and repositories are available to OST STEM evaluators for measuring affective constructs.
  2. Educators and site staff lacked familiarity with the purpose of evaluation studies. To address this, investigators conducted study orientations with opportunities for questions, which also helped establish relationships between the evaluators/researchers and the participants. Because families may have less contact with the educator and site staff than in a formal classroom environment, communicating with these groups was important, so parent meetings were held to discuss the purpose of the studies and the data collection activities. Informed consent from parents was obtained during these meetings. Using everyday language and minimizing jargon when describing the study was important for earning participation and trust from educators and families.
  3. There was low participant responsiveness to data collection requests. Concerns also emerged about collecting student outcome data, because programs are sensitive to ensuring that OST experiences feel fun for youth and different from formal classroom settings. Because educators had difficulty completing weekly logs, evaluators reduced the data collection burden by having them complete a single implementation survey at the end of the study instead. For the same reason, youth data collection used a retrospective pretest, in which youth reflect back on their attitudes before and after participating, rather than two separate pre- and post-surveys. Substantial data were collected through activity observations.
  4. Concerns about obtaining an accurate representation of youth experiences emerged during project observations, due to irregular attendance in the voluntary OST program. There was also concern about accurately representing the attitudes of youth who participated irregularly. Tracking attendance and setting a benchmark for sufficient participation provided a basis for excluding youth with too little participation from the analytic sample.
  5. Educators in OST have a range of backgrounds, from college students and volunteers to formally trained educators, which can influence how effectively they are able to implement a program. Collecting data across a range of sites helps capture the varied pedagogical and content backgrounds of these educators and shows how background influences implementation.
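The attendance-benchmark and retrospective-pretest strategies above can be sketched in code. This is a minimal illustration only: the 75% threshold, session count, function names, and ratings are all hypothetical, not drawn from the PLANETS study.

```python
# Hypothetical sketch: apply an attendance benchmark before analyzing
# retrospective pre/post attitude ratings. All names and values are
# illustrative, not from the actual study.

ATTENDANCE_BENCHMARK = 0.75  # assumed: include youth attending >= 75% of sessions
TOTAL_SESSIONS = 8           # assumed number of program sessions

# Each record: (youth_id, sessions_attended, retrospective_pre, post)
records = [
    ("Y01", 8, 2.0, 4.0),
    ("Y02", 3, 3.0, 3.5),   # below benchmark -> excluded from analytic sample
    ("Y03", 7, 2.5, 3.5),
    ("Y04", 6, 3.0, 4.5),
]

def analytic_sample(records, benchmark=ATTENDANCE_BENCHMARK, total=TOTAL_SESSIONS):
    """Keep only youth whose attendance meets the inclusion benchmark."""
    return [r for r in records if r[1] / total >= benchmark]

def mean_attitude_change(sample):
    """Average (post - retrospective pre) rating across the analytic sample."""
    changes = [post - pre for (_, _, pre, post) in sample]
    return sum(changes) / len(changes)

sample = analytic_sample(records)
print(len(sample))                          # prints 3: one youth excluded
print(round(mean_attitude_change(sample), 2))  # prints 1.5
```

The design choice this mirrors: rather than administering two separate surveys, a single retrospective instrument yields both ratings at once, and the benchmark filter keeps low-attendance youth from diluting the analytic sample.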

Dr. Nena Bloom & Tina Zecher, Northern Arizona University
Dr. Carol Haden & Beth Peery, Magnolia Consulting, LLC



Reference

Allen, S., & Peterman, K. (2019). Evaluating informal STEM education: Issues and challenges in context. In A. C. Fu, A. Kannan, & R. J. Shavelson (Eds.), Evaluation in Informal Science, Technology, Engineering, and Mathematics Education. New Directions for Evaluation, 161, 17–33.

