AEA 2014

RTD Evaluation 2014 Sessions

Thursday, October 16, 2014

Session 1250: Development of Tools to Answer Common Questions Regarding Biomedical Research Portfolios (co-sponsored by the Health Evaluation TIG)

Questions regarding biomedical research portfolios often arise from a variety of audiences, both internal and external. With increasing amounts of administrative, output, and outcome data linked to funded projects, the development of tools that analyze large data aggregations in a standardized manner for analysis and evaluation purposes has become critical. Traditional sources of data on funded projects, such as yearly progress reports, can be limited in timeliness and scope and difficult to summarize across years, and thus do not adequately address the common needs of those seeking to evaluate biomedical research portfolios. The presentations from the National Cancer Institute (NCI), the National Institute of General Medical Sciences (NIGMS), and the National Institutes of Health (NIH) will address the full scope of developing analysis tools for biomedical research portfolios, from the initial needs assessment that determines the questions of interest through the deployment of enterprise systems available to the public, including lessons learned throughout.

Session Chair: Elizabeth Hsu, National Institutes of Health

Presentations:
Introduction
Acronym Guide
Elizabeth Hsu, National Institutes of Health

NCI-Viz: Developing an agile tool for monitoring and visualizing funding outputs and understanding research portfolios
Duane Williams, Thomson Reuters; Elizabeth Hsu, National Institutes of Health; Joshua Schnell, Thomson Reuters; Danielle Daee, National Institutes of Health; Larry Solomon, National Institutes of Health; James Corrigan, National Institutes of Health

Portfolio Analysis for Basic Biomedical Research using NIHMaps: Lessons Learned and Future Possibilities 
Edmund M. Talley, National Institutes of Health; Lisa Dunbar, National Institutes of Health; Catherine Lewis, National Institutes of Health

Data Analysis Tools and Systems in NIH's Office of Extramural Research
James Onken, National Institutes of Health; Brian Haugen, National Institutes of Health

Session 1629: Effective Research Program Evaluation to Connect Research, Development and Innovation in Japan

Research and development (R&D) is challenged to help preserve the global environment, build a sustainable society, and sustain economic growth, and the constant creation of innovation is necessary to meet these demands. The 4th Science and Technology Basic Plan (2011-2015) in Japan emphasized the integrated development of science, technology and innovation (STI). Distinctive strategies and evaluation of R&D that induces innovation are therefore necessary in universities, public organizations, and industry alike. Research projects should be promoted under a strategic research program, and evaluation should address the full series of goals, scenarios, roadmaps, plans, research inputs and outputs, research management, and the process of inducing innovation. Ideally, such evaluation methods would reveal the factors that effectively connect R&D to innovation. In this session, we introduce recent relevant evaluation methods from a university (Waseda University) and public research organizations (NISTEP and AIST).

Session Chair: Naoto Kobayashi, Waseda University
Discussant: Osamu Nakamura, AIST

Presentations:

Evaluation System of Research Initiatives in Waseda University

Naoto Kobayashi, Waseda University; Takashi Ichinose, Waseda University

Development of a simple visualization application for abductive reasoning toward an evidence-based innovation policy: Reflections on Japan's public-funded research portfolio and strategy under the Second and Third Basic Plan

Nobuyuki Shirakawa, NISTEP and NEDO; Takao Furukawa, National Institute of Science and Technology Policy (NISTEP); Kazuhiro Hayashi, National Institute of Science and Technology Policy (NISTEP); Masatoshi Tamamura, Keio University and National Institute of Science and Technology Policy (NISTEP)

 

The Present Conditions of Research Unit Evaluation at AIST (National Institute of Advanced Industrial Science and Technology): Comment Analysis (Expectations of AIST) from the Evaluation Committee
Masafumi Yamaguchi, AIST; Hitoshi Akimichi, National Institute of Advanced Industrial Science and Technology (AIST)

Strategic Research and Development in AIST to Induce Innovation
Osamu Nakamura, AIST; Naoto Kobayashi, Waseda University and AIST

Session 1379: Evaluating Research Impact: Lessons Learned Over Eight Years of Process and Impact Evaluations of the Clinical and Translational Science Awards (co-sponsored by the Health Evaluation TIG)

This session highlights the opportunities and challenges involved in evaluating the Clinical and Translational Science Awards (CTSA), a consortium of 62 NIH-funded institutions housed in academic medical centers across the United States. With a broad mandate to speed the translation of biomedical science from "bench to bedside" and ultimately improve human health, each CTSA has established an evaluation core. Over eight years, this consortium of evaluators has acquired a wealth of expertise and experience in evaluating research process and impact. With an eye on lessons learned, as well as on tightening federal budgets, the methods presented represent a unique mix of traditional and innovative approaches along an evaluation continuum, including innovative approaches to calculating return on investment (ROI), outcomes assessment in a complex environment, and data management systems that allow for actionable reporting. Panelists will address the need to integrate and leverage these methodologies in evaluating the efficiency and efficacy of CTSAs.

Session Chair: Adrienne L Zell, Oregon Health and Science University

Presentations:
Defining the Stages of Translational Science in Health Research

Janice Hogle, University of Wisconsin; Kate Westaby, UW-Madison Institute for Clinical and Translational Research; Trisha Adamus, UW-Madison Ebling Library for the Health Sciences; D. Paul Moberg, UW-Madison Institute for Clinical and Translational Research and UW-Madison Population Health Institute; Alexis Tavano, Marshfield Clinic Research Foundation and Institute for Clinical and Translational Research

Stages of Community Engagement: Tracking and Evaluation in a Community Research Consult Service within the Clinical & Translational Science Collaborative (CTSC)
 
Clara Pelfrey, Case Western Reserve University; Mary Ellen Lawless, Case Western Reserve University; Katrice Cain, Case Western Reserve University; Ashwini Sehgal, Case Western Reserve University

Data Nexus: A Model for Creating an Integrated Data Ecosystem to Enable a More Effective Academic Research Organization

Melanie Funes, University of Southern California (USC); Katja Reuter, University of Southern California (USC); Praveen Angyan, University of Southern California (USC); Eun Evans, University of Southern California (USC); Annie Hong, University of Southern California (USC)

Return on Investment (ROI) of CTSA Pilot Projects -- A Collaborative Study to Identify a Common ROI Method

Adrienne L Zell, Oregon Health and Science University; Angela Alexander, University of California, San Diego; Melanie Funes, University of Southern California (USC); Deborah Diaz-Granados, Virginia Commonwealth University; Margaret Schneider, University of California, Irvine


Session 335: The NCI Innovative Molecular Analysis Technologies Program: Attributing Program Results

Program evaluations are critical for managers of biomedical R&D portfolios, but they are often limited to demonstrating that programs are associated with positive outcomes rather than establishing the programs' unique contribution. This is largely because assessing that contribution is difficult and often prohibitively expensive. Further, programs are often not allowed sufficient time to yield major outcomes before an evaluation is requested, or they closed long enough ago that there is limited interest in making the investment required to truly assess their utility. The NCI's Innovative Molecular Analysis Technologies program provides a unique opportunity to skirt some of these hurdles. Three mixed-method evaluations have been conducted on the program, but all fall short of showing that the program was a uniquely significant intervention. This discussion will briefly review these evaluation efforts and explore what would be required to robustly assess the unique value of the program.

Presentation:
The Innovative Molecular Analysis Technologies Program
Tony Dickherber, National Cancer Institute; Lawrence Solomon, National Cancer Institute


Session RTDE1: Novel Methods for Classifying Research Portfolios (co-sponsored by the Government Evaluation TIG)


Session Chair: Robin Wagner, National Institutes of Health

Presentations:
How to change collaborative research culture: learning from a unique funding experiment
 
Michelle Picard-Aitken, Science-Metrix Inc.; Andréa Ventimiglia, Science-Metrix Inc.

Approaches to Defining Topic Areas
Danielle Daee, National Cancer Institute; James Corrigan, National Cancer Institute (NCI); Elizabeth Hsu, National Institutes of Health


Session RTDE2: Benefits of Big Data Systems for Research Evaluations (co-sponsored by the Government Evaluation TIG)


Session Chair: Liudmila K. Mikhailova, CRDF Global (formerly the U.S. Civilian R&D Foundation)


Presentations:
The Benefits of a Federated Approach for Evaluation

Gavin Reddick, Medical Research Council

Using Grants Administrative Data to Glean Insights about the Research Enterprise: A Case Study from the U.S. National Institutes of Health (NIH)

Robin Wagner, National Institutes of Health; Luci Roberts, National Institutes of Health; Della Hann, National Institutes of Health

Session 1672: Effective Research Program Evaluation to Connect Research, Development and Innovation in Japan

The constant creation of innovation is essential for solving global problems involving the environment, natural resources, renewable energy, health care, and economic and social improvement. Industry's efforts to connect R&D to innovation are especially important, and it is therefore necessary to evaluate strategically how R&D and innovation are effectively combined in industrial activities. In Japan, NEDO has supported projects in the fields of energy, environmental, and industrial technologies for over 30 years, and the knowledge of how innovation is induced that NEDO has integrated through its series of evaluations, including follow-up evaluations of funded programs, is highly instructive. This session offers a discussion of effective research program evaluation through NEDO's recent activities.

Session Chair: Naoto Kobayashi, Waseda University
Discussant: Mitsuru Takeshita, New Energy and Industrial Technology Development Organization (NEDO)

Presentations:
Analysis of 'NEDO inside Products' Survey 2014

Mitsuru Takeshita, New Energy and Industrial Technology Development Organization (NEDO); Masaru Yamashita, New Energy and Industrial Technology Development Organization (NEDO); Noriko Kimura, New Energy and Industrial Technology Development Organization (NEDO); Tomonaga Yoshida, New Energy and Industrial Technology Development Organization (NEDO); Toshiyuki Isshiki, New Energy and Industrial Technology Development Organization (NEDO)

Numerical Analysis for NEDO Projects Follow-up Monitoring

Toshiyuki Isshiki, New Energy and Industrial Technology Development Organization (NEDO); Masaru Yamashita, New Energy and Industrial Technology Development Organization (NEDO); Tomonaga Yoshida, New Energy and Industrial Technology Development Organization (NEDO); Noriko Kimura, New Energy and Industrial Technology Development Organization (NEDO); Mitsuru Takeshita, New Energy and Industrial Technology Development Organization (NEDO)

Success Factor Analysis of NEDO Projects by Interview Survey

Tomonaga Yoshida, New Energy and Industrial Technology Development Organization (NEDO); Masaru Yamashita, New Energy and Industrial Technology Development Organization (NEDO); Toshiyuki Isshiki, New Energy and Industrial Technology Development Organization (NEDO); Noriko Kimura, New Energy and Industrial Technology Development Organization (NEDO); Mitsuru Takeshita, New Energy and Industrial Technology Development Organization (NEDO)

TIG Business Meeting

Guest Speaker: Cheryl Oros, Consultant to the AEA Evaluation Policy Task Force

Title: “Update on Federal, State, and International Evaluation Policy”


 

Friday, October 17, 2014

Session 1306: Promoting Combinatorial Science, Technology and Innovation (STI) Programmatic Impacts: An Evaluation of the NSF SBIR/STTR Supplement for Membership in Industry/University Cooperative Research Centers (IUCRC) (co-sponsored by the Business, Leadership, and Performance TIG)

Government programs, including STI programs, often provide supplemental funding opportunities that involve financial support through another program. However, these supplemental initiatives are rarely viewed as efficacious program mechanisms in their own right and, as a consequence, are rarely subjected to serious evaluative scrutiny. The development and evaluation of the NSF SBIR/STTR Membership Supplement in IUCRCs attempts to break new ground in both areas. The supplements, which provide Phase II SBIR/STTR firms with a subsidized membership in an IUCRC of their choice, attempt to create a synergistic innovation-related relationship by combining elements of two highly regarded and well-evaluated STI programs. Session papers, which highlight the findings of a recently completed study of the program's effects, address three evaluation objectives: the supplement's impact on small business participation in IUCRCs; the benefits and costs for IUCRCs and their stakeholders; and the benefits and costs for participating SBIR/STTR firms.

Session Chair: Denis Gray, North Carolina State University
Discussant: Shashank Priya, National Science Foundation

Presentations:
Promoting Synergistic Program Impacts: Background of the National Science Foundation SBIR/STTR Supplement for Membership in Industry/University Cooperative Research Centers (IUCRCs)
Denis Gray, North Carolina State University

Outcomes related to combinatorial innovation in government programs: Impact on Member Composition
Lindsey McGowen, North Carolina State University; Lena Leonchuk, North Carolina State University; Tim Michaelis, North Carolina State University

Outcomes related to combinatorial innovation in government programs: Feedback from cooperative research center directors
Drew Rivers, Chronicle Research LLC; Lena Leonchuk, North Carolina State University; Tim Michaelis, North Carolina State University

SBIR Membership Supplements in IUCRCs: Feedback on processes and outcomes from SBIR member firms
Denis Gray, North Carolina State University; Lena Leonchuk, North Carolina State University; Tim Michaelis, North Carolina State University

 

Session 616: Evaluating Research Capacity Development: The National Science Foundation's Experimental Program to Stimulate Competitive Research (EPSCoR) (co-sponsored by the Quantitative Methods TIG)

The NSF Act of 1950 stated that "it shall be an objective of the Foundation to strengthen research and education in the sciences and engineering, including independent research by individuals, throughout the United States, and to avoid undue concentration of such research and education." Since 1979, NSF has conducted a program intended to stimulate research activity in those parts of the country that have been less able to compete for NSF funds. In 2011, NSF asked the IDA Science and Technology Policy Institute (STPI) to conduct an evaluation of NSF EPSCoR. The objective of this evaluation was to perform an in-depth, life-of-program assessment of NSF EPSCoR activities and their outputs and outcomes, and to provide recommendations for better targeting available funding to those jurisdictions for which the EPSCoR investment could yield the largest incremental benefit to their research capacity. This panel presents the methodology used to conduct the evaluation.

Session Chair: Brian Zuckerman, IDA Science and Technology Policy Institute
Discussant: Gretchen B Jordan, 360 Innovation LLC

Presentations:
Session Overview: EPSCoR Origins and Evaluation Methods

Brian Zuckerman, IDA Science and Technology Policy Institute

Increasing Competitiveness of Investigators: Quantitative Approaches

Thomas W Jones, IDA Science and Technology Policy Institute; Brian Rieksts, IDA Science and Technology Policy Institute

Concentration Modeling

Brian Zuckerman, IDA Science and Technology Policy Institute

Qualitative Approaches to Identifying Increased Research Competitiveness and Improved Science and Engineering Research Bases

Rachel A. Parker, U.S. Agency for International Development


Session 1506: Evolution of a Systems View of Research and Innovation Impacts, Relationships, and Contexts (co-sponsored by the Systems in Evaluation TIG)

The thinking, evaluation approaches, and methods used to evaluate research, technology and development (RTD) and innovation programs have evolved immensely in the US, Canada, and internationally over the past 25 years. For example, progress has been made in modeling archetypical RTD programs and in understanding the importance of reach, relationships, and context. This panel will present a brief review of the evolution of RTD and innovation evaluation and then focus on the key lessons for U.S., Canadian, and international practice. There has been a strong co-evolution of approaches, featuring a departure from what was at one time a set of abstract metrics used in the mistaken view that one can 'sum the accounts' in this field, and a move towards an approach that recognizes the unique value of RTD and innovation initiatives on a case-by-case (issue-by-issue), social science and complex systems basis.

Session Chair: Steve Montague, Performance Management Network Inc

Presentations:
Evolution of a Systems View of Research and Innovation Impacts, Relationships, and Contexts
Gretchen B Jordan, 360 Innovation LLC

Session 989: Evaluating Relationships in the NIH Research Community (co-sponsored by the Social Network Analysis TIG)

Each Institute and Office within the U.S. National Institutes of Health (NIH) is increasingly evaluating its research portfolio achievements to ensure that investments in research are meeting their intended purpose. NIH is specifically interested in understanding the nature of relationships within the biomedical research enterprise because relationships can be a key component in the progression of science. In this session, three related papers examine research network collaborations using social network analysis. Topics include encouraging growth in the field of HIV vaccine research, assessing relationships of principal investigators to fields of science represented by different peer review study sections, and increasing collaboration among researchers participating in a consortium. The panel chair will identify common challenges among the evaluation activities and implications of the findings for NIH's research portfolio.

Session Chair: Elizabeth Ruben, National Institutes of Health

Presentations:
Using Social Network Analysis to Understand Growth and Collaboration in the B-cell mediated HIV Vaccine Field

Dolan Ghosh-Das, National Institute of Allergy and Infectious Diseases, NIH; Kevin Callahan, National Institute of Allergy and Infectious Diseases; Marie Parker, National Institute of Allergy and Infectious Diseases

Network Analysis of National Institutes of Health R01 Grant Applicants to Peer Review Panels to Assess Relationships between Fields of Science in Research
 
Katherine Catevenis, National Institutes of Health; Lindsey Scott, National Institutes of Health; Matthew Eblen, National Institutes of Health; Robin Wagner, National Institutes of Health

Using Social Network Analysis to Understand Research Consortium Impacts

Elizabeth Ruben, National Institutes of Health

Session 1081: External Feedback in Evaluating Education Research -- Innovative approaches for evaluating different types of educational research (co-sponsored by the STEM Education and Training TIG)

The Common Guidelines for Educational Research and Development (IES/NSF, 2013) suggest that external feedback is appropriate for evaluating each of the six different research types: foundational research and development; early-stage/exploratory; design and development; efficacy studies; effectiveness studies; and scale-up studies. Unfortunately, there is little elaboration regarding which type of external feedback works best for a particular type of research, or how an external feedback model might be utilized for particular research types. This think tank session will ask participants to engage in dialogue in facilitated small groups to generate insights in response to the following question: What might the external feedback approach look like for each research type? Groups will be organized by research type, and each group will respond to the same set of questions. A general share-out will allow participants to deepen their own understanding of various external feedback models and their utility for evaluating research.

Presenters: John Sutton, RMC Research Corporation; Tamara M. Walser, University of North Carolina Wilmington; Catherine Callow-Heusser, Utah State University

Saturday, October 18, 2014


Session RTDE4: Perspectives from International R&D Agencies: Evaluation of Industrial Sector, Individual Firms and Researchers (co-sponsored by the Government Evaluation TIG)

Session Chair: Kathryn Graham, Alberta Innovates Health Solutions

Presentations:
Technology Agency of the Czech Republic: from Thoughts to Innovations
 
Jan Hajic, Member of the Research Board, Technology Agency of the Czech Republic; Rut Bizkova, Technology Agency of the Czech Republic; Miroslav Janecek, Technology Agency of the Czech Republic; Petr Matolin, Technology Agency of the Czech Republic

Evaluation of Start-Up Company Programmes in Tekes, Finland
 
Jari Hyvarinen, Tekes - Finnish Funding Agency

A Case Study on the Evaluation of Researchers in China Based on the Curriculum Vitae Analysis
Jianzhong Zhou, Chinese Academy of Sciences; Fang Xu, Chinese Academy of Sciences; Wenbin Liu, University of Kent; Xiaoxuan Li, Chinese Academy of Sciences


Session 1277: Strategies and Tools for Assessing Impacts of Biomedical Research at the National Institutes of Health (co-sponsored by the Health Evaluation TIG)

Increasingly, science managers and researchers are called upon to show how investments in research lead to improvements in health or other benefits to society. Understanding such impacts helps foster greater accountability for federal research dollars and facilitates data-driven planning processes. In this session, three related papers will examine strategies for assessing research impacts at the National Institutes of Health (NIH). The first paper will summarize findings and recommendations from an NIH working group that was formed to assess the quality and accessibility of approaches at the NIH for identifying, analyzing, and reporting on research impacts. The second paper will focus on approaches used to assess the contribution of the National Cancer Institute to a range of outcomes, such as the development of clinical guidelines, biomarkers, and drug therapies. The third paper describes an automated approach for assessing impacts using novel bibliometric indicators, developed at the National Institute of Environmental Health Sciences. Our discussant will comment on themes across the three papers, and the audience will be invited to offer their views on the findings and approaches presented.

Session Chair: Christie Drew, National Institute of Environmental Health Sciences (NIEHS)
Discussant: Marina Volkov, National Institutes of Health

Presentations:
Better Data and Tools for Analyzing the Outcomes of NIH Research Spending

Kevin Callahan, National Institutes of Health; Jane Lockmuller, National Institutes of Health; Kevin Wright, National Institutes of Health; Steve Zoha, National Institutes of Health; Elizabeth Hsu, National Institutes of Health

Assessing the Impact of Biomedical Research: Examples of Approaches from the National Cancer Institute (NCI)
 
James Corrigan, National Cancer Institute (NCI), NIH; Joshua Schnell, Thomson Reuters; Duane Williams, Thomson Reuters; Brian Zuckerman, IDA Science and Technology Policy Institute

A Novel Bibliometric Approach for Automated Research Impact Assessment
Christie Drew, National Institute of Environmental Health Sciences; F.O. Finch, III, Open Intelligence; Doug Giles, Open Intelligence; Paul Jordan, National Institutes of Health; Kristi Pettibone, National Institute of Environmental Health Sciences

Session RTDE3: Societal, Cultural and Policy Influences on Research Programs and Their Outcomes

Session Chair: Denis Gray, North Carolina State University

Presentations:
Methodological Triangulation and its Framework for Social Impact Assessment in Science, Technology and Innovation Policy: Quantifying and Visualizing Social Needs for Science and Technology Projects in Japan
Nobuyuki Shirakawa, Keio University, NEDO, NISTEP and IFENG; Masatoshi Tamamura, Keio University

Top-Down vs. Bottom-Up Portfolio Analyses: Pros and Cons
Asha Balakrishnan, IDA Science and Technology Policy Institute



RTD Evaluation 2014 Posters


Poster 818: Using Evaluation to Facilitate Delaware's Clinical and Translational Research
Yueyue Fan, Michelle A. Mattera, and Cheryl M. Ackerman, University of Delaware, Newark, DE

Poster 990:  Factors Associated with the Diffusion of Scientific Findings from the Early Years of the NHLBI-Supported Proteomics Centers Program
Mona Puggal, Richard Fabsitz, and Cheryl Howard, National Institutes of Health

Poster 1649:  Development of Effective Field Technology Evaluation Guidelines
Nancy Merritt, National Institute of Justice, U.S. Department of Justice

Poster 1144:  Diversity Training Grants Analysis at the National Institute of Allergy and Infectious Diseases
Liberty Walton, Dione Washington, and Krystal Tomlin, National Institutes of Health