EVAL 2015: Multipaper Session 1379, Strengthening the Research Evaluation Infrastructure—Perspectives from Practitioners

09-22-2016 10:01

Session Number: 1379
Track: Research, Technology & Development Evaluation
Session Type: Multipaper
Tags: strengthening evaluation infrastructure
Session Chair: Robin Marian Wagner [Centers for Disease Control and Prevention]
Presenter 1: Sarah Naylor
Presenter 2: Robin Marian Wagner [Centers for Disease Control and Prevention]
Presenter 3: Kathryn Graham [Alberta Innovates Health Solutions]
Time: Nov 14, 2015 (07:00 AM - 07:45 AM)
Room: Skyway 283

Abstract 1 Title: Program Evaluation—The Next “Alternative” Career for PhDs

Presentation Abstract 1: How can we move program evaluation onto the list of “alternative” careers for those with scientific training? The Science & Technology Policy Fellowship program sponsored by the American Association for the Advancement of Science provides an opportunity for scientists and engineers to contribute to the federal policymaking process while learning about the intersection of science and policy. Many placements ask fellows to perform evaluations of federal research funding. Offices have found that these PhD scientists are well suited to performing advanced data analysis and designing valid evaluation plans based on the available data, even without formal evaluation training. This presentation will discuss the author’s experience working as a fellow on evaluation projects, approaches to cultivating awareness of and interest in research evaluation, and potential strategies for engaging more PhD scientists in program evaluation.

Abstract 2 Title: How to Build a Research Evaluation Team in the Era of Big Data

Presentation Abstract 2: The evaluation of research programs poses a number of well-known methodological challenges. In recent years, a new challenge has emerged: huge numbers of electronically available data sources, such as publications and citations, require staff with the expertise and skills needed to evaluate these “big data.” Individuals with the necessary sophisticated data management, quantitative, analytical, and communication skills are relatively rare and in high demand. This talk will draw on 10 years of experience at two U.S. federal government agencies to describe strategies used to identify, recruit, train, and retain highly talented staff with the skills to evaluate big (and smaller) data in the 21st century. It will also address the importance of demonstrating to agency leadership how big data can be used for evaluation, in order to build support for such activities.

Abstract 3 Title: Building an Evaluation Platform for Local and International Needs

Presentation Abstract 3: Alberta Innovates Health Solutions (AIHS) is a Canadian provincial health research and innovation (R&I) funder. R&I is considered essential for a healthy population as well as for social and economic prosperity. However, evaluating R&I brings unique challenges, including time lags, attribution and contribution issues, and a lack of both standardized data and a skilled workforce. In response to these issues, AIHS founded an International School on Research Impact Assessment with its partners. The goal of this platform is not only to develop capacity and advance knowledge but also to build a community of practice through regional workshops. Furthermore, to encourage the use of common tools, AIHS is exploring a reporting platform that multiple funding organizations can use to capture standardized data. This presentation will discuss how a collaborative approach to building such a platform can meet local and international needs while creating shared value for the R&I ecosystem.

Audience Level: Intermediate

Session Abstract: The expert application of theory and methods is essential for producing robust and accurate research evaluations. Success also hinges, however, on the more practical consideration of having a solid foundation to support evaluation. There must be an adequate supply of appropriately trained evaluators at the program, agency, and country levels. Effective use of emerging technologies can also significantly promote and strengthen evaluation collaborations within and between domestic and international research agencies. Three papers on these topics will be presented by practitioners in the field, with the goal of prompting session attendees to discuss and share the plans and actions they have found most and least helpful in strengthening the research evaluation infrastructure in their own organizations.
#2015Conference
#Research,Technology,andDevelopmentEval
#Presentation
#AEApresentation
#Eval2015

Attachment(s)
pptx file: How to Build a Research Evaluation Team in the Era of Big... (4.33MB, 1 version, uploaded 09-22-2016)
pdf file: How to Build a Research Evaluation Team in the Era of Big Data Wagner Richards (5.50MB, 1 version, uploaded 11-15-2017)
