AEA 2005-2012

 

RTD Presentations from AEA 2012

RTD Presentations from AEA 2011
RTD Presentations from AEA 2010
RTD Presentations from AEA 2009
RTD Presentations from AEA 2008
RTD Presentations from AEA 2007

RTD Presentations from AEA 2006
RTD Presentations from AEA 2005


RTD Presentations from AEA 2012



Wednesday, October 24

Session: Strategic Research Program Evaluation to Realize Innovation: Toward the New Phase of Evolution in RT&D
Chair: Naoto Kobayashi, Waseda University
Discussant: Osamu Nakamura, National Institute of Advanced Industrial Science and Technology

Evaluation of Research and Development at the Ministry of Economy, Trade, and Industry
Yoshiaki Tamanoue and Shigeki Okamoto, Ministry of Economy, Trade, and Industry

Evaluation of R&D at AIST
Chieko Kifune et al, National Institute of Advanced Industrial Science and Technology

Evaluation of AIST as an Open Innovation Hub to Encourage the Local Industry and Economy
Osamu Nakamura and Shinichi Matsui, National Institute of Advanced Industrial Science and Technology
Ryo Sasaki, International Development Center of Japan

Analysis of 'NEDO Inside Products' Survey 2012
Sayaka Shishido et al, New Energy and Industrial Technology Development Organization

Success Factors Analysis of the New Energy and Industrial Technology Development Organization (NEDO) Project by Follow-up Monitoring
Tomonaga Yoshida et al, New Energy and Industrial Technology Development Organization

Thursday, October 25

Session: Developing Methods to Measure Research, Technology, and Development Collaboration and Networks
Chair: George Teather, Performance Management Network Inc. 

The Review of Transfer and Transformation of Science and Technology Achievements Network in China
Sheng-yan Sun et al, Chinese Academy of Sciences

Design and Evaluation of Science/Engineering/Math (STEM) Training Programs for Hispanic and Native American College Students: The SACNAS Experience
Jack Mills, Independent Consultant
Yvonne Rodriguez, Society for Advancement of Chicanos and Native Americans in Science

Complex Adaptive Systems Framework to Evaluate Virtual Research Collaborations
Arsev Aydinoglu, NASA Astrobiology Institute

Session: New Direction of Research and Development (R&D) Evaluation Systems for Qualitative Excellence in Korea
Chair: Sang Youb Lee, Korea Institute of S&T Evaluation and Planning

The Performance Management System and Standard Performance Indicators on Korea National R&D Program
Munsang Kang et al, Korea Institute of S&T Evaluation and Planning

New Approach of National R&D Evaluation System in Korea: Open Evaluation to Public R&D Programs
Ji Hyun Park et al, Korea Institute of S&T Evaluation and Planning

New Perspectives on SME Cooperation in Korea: Evaluation Issue and Strategy
Jae-Ho Shin et al, Korea Institute of S&T Evaluation and Planning

Evaluation System of Government-Funded Research Institutes (GRIs) in Korea
Woo Chul Choi and Woong Yong Han, Korea Institute of S&T Evaluation and Planning

Session: Evaluation of Biomedical Research Training & Career Development Programs: Examples from the National Institutes of Health
Chair: James Corrigan, National Institutes of Health
Discussant: Jennifer Sutton, National Institutes of Health


Outcome Evaluation of the National Cancer Institute (NCI) Career Development (K) Awards Program
Julie Mason and Jonathan Wiest, National Institutes of Health
Joshua Schnell, Thomson Reuters

Evaluating Diversity-Focused Training Programs of the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD)
Sarah Glavin et al, National Institutes of Health


Session: Strategy and Indicators for Evaluating Complex Research Portfolios: People, Projects, and Institutes
Chair: Joshua Schnell, Thomson Reuters


Research on Peer Review in CAS Institute Evaluation
Jiang-zhong Zhou et al, Chinese Academy of Sciences 

Modeling the Dissemination and Uptake of Clinical Trial Results: How Long Does Uptake Take?
Jeffrey Schouten, Fred Hutchinson Cancer Research Center
Scott Rosas, Concept Systems Inc
Jonathan Kagan, National Institutes of Health
Marie Cope, Concept Systems Inc

The Canadian Institutes of Health Research Centers for Research Development: Contributions to Sustainable Innovation Systems
Erica DiRuggiero, Canadian Institutes of Health Research
Natalie Kishchuk, Independent Consultant
Sarah Viehbeck, Canadian Institutes of Health Research


Session: Understanding Knowledge Production Systems: Ecology, Context, and Complexity in the National Laboratories
Chair: Gretchen Jordan, 360 Innovation LLC


Organizational Learning in Large Research Environments
Aleia Clark, University of Maryland

Teams vs Organizations: The Balance of Interests in Large-scale Science
Jonathon Mote, Southern Illinois University

Good Jobs in Science: Work Organization and Work Satisfaction in a Large Research Laboratory
Bill Hadden, University of Maryland


Session: Research, Technology, and Development Evaluation in Transition in Asia
Chair: Alan Porter, Georgia Tech & Search Technology Inc


Evidence-based Governmental Decision Making Process: The Example of Agricultural S&T Priority Issues Selection Process in Taiwan
Ling Chu Lee et al, Science & Technology Policy Research and Information Center

Research Evaluation in the Complex Evaluation Ecology of Emerging Countries
Michael Braun et al, Vietnam Science & Technology Evaluation Center

The Transformation of Research Evaluation in China: From Native Mode to International Mode
Tao Dai et al, Chinese Academy of Sciences

Session: Research Programme Evaluation in China and the European Union: Recent Experiences and Future Challenges
Chair: Peter Fisch, European Commission


Research Program Evaluation in China: Recent Experiences and Future Challenges
Huaibin Xing and Xiaoyong Shi, National Center for Science and Technology Evaluation of China

Evaluation of European Union Research Programmes - Recent Experiences and Future Challenges
Peter Fisch, European Commission


Friday, October 26


Session: Assessment of Research in Emerging Technologies: Tools for Early Measurement of Impact
Chair: Juan Rogers, Georgia Tech


Nanotechnology in Building Construction: An Industry Study
Sanjay Arora, Georgia Tech

Assessing the Predictive Power of Indicators for Technical Emergence
Stephanie Bolan and Stephen Carley, Georgia Tech

The Use of Citation Speed to Understand the Effects of a Multi-Institutional Science Center
Jan Youtie, Georgia Tech

The Effect of Human Resource Concentration in Nanotechnology Centers
Juan Rogers, Georgia Tech

Session: Client-evaluator Interaction: Learning from Each Other by Doing with Each Other
Chair: George Teather, The Performance Management Network


The Client's Perspective: The Alexander von Humboldt Foundation's Approach to Evaluation
Christina Schuh, Alexander von Humboldt Foundation

The Evaluator's Perspective: Lessons from Three Evaluations with Varying Approaches to Get a Better Understanding of Individual Funding and Networking Support
Katharina Warta, Technopolis Group Austria

Session: Extracting Topics and Contributors from Abstract Records for Research Assessment
Presenter: Alan Porter, Georgia Tech and Search Technology Inc

Session: Evaluating National Institutes of Health (NIH) Research: Emerging Strategies and Findings from Evaluation of Research Portfolios 
Chair: Kristi Pettibone, National Institute of Environmental Health Sciences


Growing a Program: Using Evaluation to Understand NIEHS' Neurodegeneration Research Portfolio
Kristi Pettibone, National Institute of Environmental Health Sciences

Factors Predicting Resubmission of R01 Research Grant Applications to the National Institutes of Health (NIH)
Robin Wagner, National Institutes of Health

Session: Evaluation Frameworks for Assessing Private Sector Outcomes of Research, Technology, and Development (RTD) Public Investments
Chair: Henry Doan, U.S. Department of Agriculture


Beyond Surveys: The Research Frontier Moves to the Use of Administrative Data to Evaluate Research and Development (R&D) Grants
Oliver Herrmann, New Zealand Ministry of Business, Innovation, and Employment
Michele Morris, New Zealand Ministry for the Environment

Early Market Impact Evaluation Framework for Public-Private Research Collaborations
Gretchen Jordan, 360 Innovation LLC

Longitudinal Evaluation of Pharmaceutical Programmes
Jari Hyvarinen, Tekes

The Mid-term Evaluation on Development Program of Industrialization for Agricultural Biotechnology in Taiwan
Shan Shan Li et al, Science & Technology Policy Research and Information Center

Session: Enriching Research Assessment on Interdisciplinarity 
Chair:  Alan Porter, Georgia Tech and Search Technology Inc


Evaluating the Outcomes of Government Funded Research Programs: Measuring Interdisciplinarity through Bibliometric Analysis of the CMG Program
Jon Gardner, Search Technology Inc
Alan Porter, Georgia Tech and Search Technology Inc

Evaluating the Outcomes of Government Funded Research Programs: Measuring Interdisciplinarity through Text Analysis of Abstracts of Award-Derived
Christina Freyman and John Chase, SRI International
 
Enriching Educational Research Assessment: Inclusion of Books
David Schoeneck, Search Technology Inc
Gregg Solomon and James Dietz, National Science Foundation

Is Transformative Research Necessarily Interdisciplinary?
Vanessa Pena and Bhavya Lal, Science and Technology Policy Institute

Session: The Preferred Approach to Evaluating Collaborations: What's in Your Evaluation Research and Development (R&D) Tool Box?
Chair: Kathryn Graham, Alberta Innovates Health Solutions
Discussants: Gretchen Jordan, 360 Innovation LLC; Heidi Chorzempa and Andrew Lejeune, Alberta Innovates Health Solutions


Saturday, October 27

Session: Organizational Funding Portfolios and Beyond: Assessing the Full Research Landscape
Chair: Elizabeth Hsu, National Institutes of Health


Assessing the Alignment of Current Research Landscape with a Strategic Plan for Advancing Autism Research
Duane Williams and Joshua Schnell, Thomson Reuters
Sara Dodson et al, National Institutes of Health

Informing Initiative Development Through Portfolio Analysis: Assessment of the Provocative Questions
Samantha Finstad et al, National Institutes of Health
Duane Williams et al, Thomson Reuters

Assessment of Proposed Research: National Cancer Institute Provocative Questions Initiative Case Study
Elizabeth Hsu et al, National Institutes of Health
Leo DiJoseph et al, Thomson Reuters

Session: Lessons Learned From the Use of Economic Impact Analysis and Partial Benefit-Cost Analysis in Evaluations of Science and Technology Research Grant Programs
Chair: George Teather, The Performance Management Network
Discussant: George Teather, The Performance Management Network


Economic Impact Analysis of the Collaborative Research Development Grants Program
Michael Goodyear and Susan Morris, Natural Sciences and Engineering Research Council of Canada

Partial Benefit-Cost Analysis of the Strategic Project Grants Program
Anna Engman and Susan Morris, Natural Sciences and Engineering Research Council of Canada

Session: Synthesis of Benefit-Cost Impact Evaluations of R&D Programs
Chair: Alan O'Connor, RTI International


Synthesizing Benefit-Cost Studies of R&D: Lessons and Challenges from Recent Evaluation Series
Alan O'Connor, RTI International

Database Analytical Tool for Aggregating Across R&D Benefit-Cost Studies
Rosalie Ruegg, TIA Consulting

Session: Developing Outcome Indicators for Program Evaluation: Training, Team Science, and Translation of Basic Research
Chair: Joshua Schnell, Thomson Reuters


Training Program Evaluation: Creating a Composite Indicator to Measure Career Outcomes
Yvette Seger and Leo DiJoseph, Thomson Reuters

Team Science Evaluation: Developing Methods to Measure Convergence of Fields
Unni Jensen and Jodi Basner, Thomson Reuters

Measuring Longer-Term Outcomes: Testing the Feasibility of Linking Research Grant Funding to Downstream Drug Development
Duane Williams and Joshua Schnell, Thomson Reuters

Session: From Outputs to Impacts: Emerging Approaches to Track Scientific Research Impacts
Chair: Christie Drew, National Institute of Environmental Health Sciences

Greatest "HITS": A New Tool for Tracking Impacts at the National Institute of Environmental Health Sciences 

Christie Drew and Kristi Pettibone, National Institute of Environmental Health Sciences

Understanding Innovation is Nurturing It

Stefano Bertuzzi and Liza Bundesen, National Institutes of Health

The Becker Model: A Framework for Quantifying Research Impact
Kristi Holmes and Cathy Sarli, Washington University in St Louis

RTD Presentations from AEA 2011


Wednesday, November 2

Session: Exploring Values Alignment in Five Evaluations of Science and Learning Centers
Chair: Brenda Turnbull, Policy Studies Associates
Discussant: Gretchen Jordan, Sandia National Laboratories

It's an Evolution: Changing Roles and Approaches in the Evaluation of the Pittsburgh Science of Learning Center
Brian Zuckerman, Science and Technology Policy Institute

Evaluating the Temporal Dynamics of Learning Center: Addressing Multiple Stakeholders' Information Needs
Brenda Turnbull, Policy Studies Associates 

Organization Consultant, Critical Friend, and Evaluator: The Value and Challenge of Flexible Roles
Kristine Chadwick and Jessica Gore, Edvantia

Coming from Behind: Developing a Logic Model and an Evaluation Strategy Late in the Center's Life Cycle
Judah Leblang, Lesley University

Thursday, November 3

Session: Approaches to Biomedical Research and Development Portfolio Analysis: Examples from the National Institutes of Health
Chair: James Corrigan, National Institutes of Health

Identifying the Role of Funded Research in Pivotal Cancer Research Advances
Brian Zuckerman, Science and Technology Policy Institute; James Corrigan, National Institutes of Health; Seth Jonas, Science and Technology Policy Institute; Lawrence Soloman, National Institutes of Health

Estimating the Impact of Hypothetical Portfolio Reductions on Production of Major Discoveries Funded by the National Institute of Allergy and Infectious Diseases
Kevin Wright and Brandie Taylor, National Institutes of Health; Jamie Mihoko Doyle and Brian Zuckerman, Science and Technology Policy Institute

Using Multiple Methods and Data Sources to Analyze Complex Cancer Research Portfolios
Joshua Schnell, Thomson Reuters; Elizabeth Hsu and James Corrigan, National Institutes of Health; Sandeep Patel and Lauren Taffe, Thomson Reuters

Session: New Direction of Research and Development Performance Evaluation System in Korea
Chair: Donghoon Oh, Korea Institute of Science & Technology Evaluation and Planning

New Direction of Research and Development Evaluation System in Korea: From Quantity/Efficiency to Quality/Effectiveness
Seung Jun Yoo, Korea Institute of Science & Technology Evaluation and Planning; ChangWhan Ma, National Science and Technology Commission

Evaluation System of Government-funded Research Institutes (GRIs) in Korea
Woo Chul Chai, Korea Institute of Science & Technology Evaluation and Planning

Conjoint Analysis for Contract Strategy for Culture Technology Enhancement Program
Yun Jong Kim, Korea Institute of Science & Technology Evaluation and Planning; Uk Jung, Dongguk University

Performance Factors in Industry-University Collaboration
Youngsoo Ryu and Hongbum Kim, Korea Institute of Science & Technology Evaluation and Planning

Session: Evaluating Basic Research in China: Two Very Different Models
Chair: Laurel Haak, Discovery Logic

Estimate Returns to Scale of the Basic Research Institutes in Chinese Academy of Sciences
Guoliang Yang, Chinese Academy of Sciences; Wenbin Liu, University of Kent; Xiaoxuan Li, Chinese Academy of Sciences

Evaluation of the National Science Foundation of China
Erik Arnold, Technopolis and University of Twente

Session: Do Existing Logic Models for Science and Technology Development Programs Build a Theory of Change?
Presenter: Gretchen Jordan

Session: Evaluators as Partners in Technology Program Design
Chair: Mary Beth Hughes, Science and Technology Policy Institute

The Logical Framework Approach to Drafting Proposals for Government Technology Programs: The Case for Taiwan
Ling-Chu Lee and Shan Shan Li, Science & Technology Policy Research and Information Center

The Role of Evaluation within Nanoscale Science and Engineering Education Research: Differential Use, Application, and Benefits of Evaluation
Jennifer Nielsen, Andrew Herrmann, Amara Okoroafor, Taimur Amjad, Shezad Habib, Manhattan Strategy Group

Session: Assessing Additionality of Public Support of Industrial Research and Development
Chair: Cheryl Oros, Oros Consulting, LLC

Evaluating Effectiveness of Public Support to Industrial R&D in Turkey Through Input and Output Additionality
Sinan Tandogan, The Scientific and Technological Research Council of Turkey
M Teoman Pamukcu, Middle East Technical University

Evaluating the Additionality and Certification Effects of Research and Innovation Policy on Small Business Start-Ups: An Inflow-Sampling and Counterfactual Approach
Reynold Galope, Georgia State University

Session: Using Bibliometrics for Research Evaluation of Countries, Institutions, and Researchers: A Review of Statistics, Visualization, and Guidelines
Presenter: Ann Kushmerick, Thomson Reuters

Session: Using Logical Framework to Identify Outcomes and Performance Indicators in Science & Technology Program Proposals
Presenters: Shan Shan Li and Ling-Chu Lee, Science & Technology Policy Research and Information Center

Session: A Panel on the Value of a New Policy Model for Evaluation of Science and Technology
Chair: Jerald Hage, University of Maryland
Discussant: George Teather, Independent Consultant

Critique of the Idea of Measuring Obstacles and Remedies
Cheryl Oros, Oros Consulting, LLC

Critique of the Appropriateness of This Model for Federal Policy Problems
Brian Zuckerman, Science & Technology Policy Institute

Critique of the Relationship Between the Remedies and Social Science Research
Juan Rogers, Georgia Institute of Technology

Session: Multiple Methods for Assessing Scientific and Environmental Impacts of Research
Chair: George Teather, Performance Management Network

Evaluation Tools of Environmental and Welfare Effects in Tekes Funding
Jari Hyvarinen, Research Institute of the Finnish Economy

The Contribution of Research to Socioeconomic Outcomes: A Case Study
George Teather, Performance Management Network; Beth MacNeil and Ajoy Bista, Canadian Forest Service

The Relationship Between Environmental and Scientific Performance of Nations: Lessons Learned from a Macro-Level Evaluation Using Scientometric Indicators and an Environmental Performance Index
Frederic Bertrand et al, Science Metrix

Session: Research, Technology, and Development Evaluation TIG Business Meeting

Friday, November 4

Session: New Approaches to Assessing National Institutes of Health (NIH) Research Programs
Chair: Robin Wagner, National Institutes of Health

Expert Opinion as a Performance Measure in R&D Evaluation
Kevin Wright, National Institutes of Health

Text Mining for Visualization of Temporal Trends in NIH-Funded Research
L Samantha Ryan, Carl W McCabe, Allan J Medwick, National Institutes of Health

Assessing Grant Portfolios Using Text-Mining and Visualization Methods
Elizabeth Ruben, Kristianna Pettibone, Jerry Phelps, Christina Drew, National Institutes of Health

Network Analysis of Collaboration Among National Heart, Lung, and Blood Institute (NHLBI) Funded Researchers
Carl W McCabe et al, National Institutes of Health

Session: Evaluation of Research Systems in Fast Developing China and Vietnam
Chair: Gretchen Jordan, Sandia National Laboratories

Research Evaluation in a Country in Transition: Experiences and Lessons From Vietnam
Michael Braun, Doan Trinh Ta, Thi Thu Oanh Nguyen, Vietnam Science & Technology Evaluation Center

Self-evaluation Model and Methodology of Chinese Academy of Sciences in Knowledge Innovation Program
Xiaoxuan Li et al, Chinese Academy of Sciences


Session: Improving Peer Review for High Risk and Center-Based Research
Chair: Brian Zuckerman, Science & Technology Policy Institute

Strategies and Lessons Learned from Implementing External Peer Review Panels Online: A Case Example from a National Research Center
Daniela Schroeter, Kelly Robertson, Chris Coryn, Richard Zinser, Western Michigan University

Can Traditional Research and Development Evaluation Methods Be Used for Evaluating High-Risk, High-Reward Research Programs?
Mary Beth Hughes and Elizabeth Lee, Science & Technology Policy Institute

Session: Progress Reporting for US Federal Grant Awards: Templates, Guides, and Data Standards to Support Effective Program Evaluation
Chair: Laurel Haak, Discovery Logic
Discussant: Laurel Haak, Discovery Logic

Using the Logic Model Process to Guide Data Collection on Outcomes and Metrics
Helena Davis, National Institutes of Health

Evaluating Collaboration and Team Science in the National Cancer Institute's Physical Sciences-Oncology Consortium
Larry Nagahara, National Institutes of Health

Creating a Shared Core of Reporting Elements
David Baker, Consortia Advancing Standards in Research Administration Information

Session: Methods and Tools for Evaluating Clinical and Translational Science
Chair: Arthur Blank, Albert Einstein College of Medicine
Discussant: Paul Mazmanian, Virginia Commonwealth University

Linking Strategic Goals to Evaluation Using Microsoft Project 2010
Lisle Hites, Susan Lyons, Molly Wasko, University of Alabama, Birmingham

Using Survey-Based Social Network Analysis to Establish an Evaluation Baseline and Detect Short-term Outcomes of a Clinical and Translational Science Center
Megan Haller and Eric Welch, University of Illinois, Chicago

Tracking for Translational: Novel Tools for Evaluating Translational Research Education Programs
Julie Rainwater, Erin Griffin, Stuart Henderson, University of California, Davis

Integrating the Logic Model and Tyler Matrix Approaches in Evaluating Translational Science
Babbi Winegarden and Angela Alexander, University of California, San Diego

Reframing Analysis: Return on Investment Protocols for Clinical and Translational Science Programs
Kyle Grazier, University of Michigan; William Trochim, Cornell University

Session: Evaluation for Encouragement and Evolution to Innovation: Toward the New Progress Phase of RT&D
Chair: Naoto Kobayashi, Waseda University
Discussant: Osamu Nakamura, National Institute of Advanced Industrial Science and Technology (AIST)

Strategy and Evaluation of Research Initiatives in Waseda University
Naoto Kobayashi, Waseda University

An Improved Approach of the Research Unit Evaluation at the Beginning of the Third Research Program Term of AIST
Takashi Yoshimura et al, National Institute of Advanced Industrial Science and Technology

Strategic Collaboration Network to Develop the Low Carbon Society by the Innovative Renewable Energy
Osamu Nakamura, Shinichi Matsui, Yoshiyuki Sasaki, National Institute of Advanced Industrial Science and Technology

Research on the Derivative Effect Created by NEDO Projects
Sayaka Shishido, Kazuo Fukui, Masaru Yamashita, Mituru Takeshita, New Energy and Industrial Technology Development Organization (NEDO)

Study to Evaluate the Cost-Effectiveness of NEDO Projects: Analysis of 'NEDO inside Products' Survey
Masaru Yamashita, Kazuo Fukui, Sayaka Shishido, Mituru Takeshita, New Energy and Industrial Technology Development Organization (NEDO)


Saturday, November 5


Session: Evaluating Health Research Impact: Implementation of the Canadian Academy of Health Sciences (CAHS) Model
Chair: Gretchen Jordan, Sandia National Laboratories

Building a Complex Health Research Logic Model: Making Pathways to Impacts Clear
Gretchen Jordan, Sandia National Laboratories

The Call to Action: Building a Network that Links Evaluation to Social Benefit
Inez Jabalpurwala-Graham, Graham Boeckh Foundation

Implementation of an Evaluation Model for Evaluating Complex Health Research Outcomes
Kathryn Graham, Heidi Chorzempa, Daniel Zhang, Alberta Innovates Health Solutions

Session: International Views on Assessing Knowledge and Technology Transfer
Chair: George Teather, Performance Management Network

Results Based Evaluation on the Appraisal Process of a Technology Transfer Program
Yukio Kemmochi, Japan Science and Technology Agency

Assessing the Effects of a Collaborative Research Funding Scheme: An Approach Combining Meta-Evaluation and Evaluation Synthesis
Barbara Good, Technopolis Group

Best Practices in the Transformation of Research Knowledge to Application: A Case Study
George Teather and Suzanne Lafortune, Performance Management Network

Translating New Knowledge from Technology Based Research Projects: A Randomized Control Study of an End-of-Grant Intervention
Vathsala Stone and Machiko Tomita, University at Buffalo

Session: Tests of Two Frameworks for Evaluating the Impact of Health Research
Chair: Mary Beth Hughes, Science and Technology Policy Institute

Evaluating the Impacts of Health Research: Revisiting the Canadian Institutes of Health Research Impact Assessment Framework
Nicola Lauzon, Marc Turcotte, Laura McAuley, Canadian Institutes of Health Research

Evaluating the Commercialization of Technologies Using the Canadian Institutes of Health Research Impact Assessment Framework
Marc Turcotte, Laura McAuley, Nicola Lauzon, Canadian Institutes of Health Research

Putting a Value on Biomedical Research Center Programs: Adapting the Research Payback Framework for Application in the United States
Jack E Scott, Margaret Blasinsky, Mary C Dufour, The Madrillon Group, Incorporated; Rachel Mandel and Stephane Philogene, National Institutes of Health

Session: Evaluation as a Methodology for Understanding and Enabling Interdisciplinary Team Science
Presenters: Deana Pennington, University of Texas at El Paso; Allison Titcomb, ALTA Consulting LLC; Marcia Nation, Arizona State University


Session: Measuring Research Interdisciplinarity and Knowledge Diffusion
Chair: Alan Porter, Georgia Tech and Search Technology Inc
Discussant: Jan Youtie, Georgia Tech

A New Measure of Knowledge Diffusion
Stephen Carley, Georgia Tech; Alan Porter, Georgia Tech and Search Technology Inc

Analyzing the Effect of Interdisciplinary Research on Patent Evaluation: Case Studies in NBs and DSSCs
Wenping Wang, Beijing Institute of Technology; Alan Porter, Georgia Tech and Search Technology Inc; Ismael Rafols, University of Sussex; Nils Newman, Intelligent Information Services; Yun Liu, Beijing Institute of Technology

Measuring Interdisciplinarity: A Unique Comparison Between the Researcher and Research Proposal
Asha Balakrishnan, Vanessa Pena, Bhavya Lal, IDA Science & Technology Policy Institute

Session: Long-term Impacts: Evaluating Long-term Research Impacts and Valorising Evaluation Long-term
Chair: Neville Reeve, European Commission

Long-term Impacts of the Framework Programs
Neville Reeve, European Commission

Long-term Impacts of the FP7 Evaluation Results
Peter Fisch, European Commission

Indicators for Measuring Outcomes and Impacts
Carlos Oliveira, European Commission

Session: Evaluation of a Nano Science and Technology Centers Program: Mixed Methods Approach to Assessing its Realization of Policy Objectives
Chair: Juan Rogers, Georgia Institute of Technology

Program Level Assessment of Outcomes and Impacts of Research Performance of Centers
Juan Rogers, Georgia Institute of Technology

Aggregate Patterns of Linkage of Nanotechnology Centers with Industry: Program Outcomes
Luciano Kay, Georgia Institute of Technology

Societal Dimensions of Nano Science and Technology Centers Program
Jan Youtie, Georgia Institute of Technology


RTD Presentations from AEA 2010


** Note: the RTD TIG Webmaster does not have the presentations from the 2010 conference. Please help us document the products of the RTD TIG community by submitting your presentation, either through the e-library or by emailing the webmaster through the 'Contact Us' button at the bottom of this page. Your assistance is much appreciated! 

Thursday, November 11

Session: Analysis and Evaluation of Research Portfolios Using Quantitative Science Metrics: Practice
Chair: Laurel Haak, Discovery Logic

Practical Applications of Bibliometrics: What Makes Sense in Different Contexts?
Frederic Bertrand and David Campbell, Science-Metrix Corp

Applying Metrics to Evaluate the Continuum of Research Outputs: Near- to Long-term Impact
Joshua Schnell, Beth Masimore, Laurel Haak, Matt Probus, and Michael Pollard, Discovery Logic

Roundtable Rotation I: Choices of Research and Development (R&D) Evaluation Approaches in Chinese Academy of Sciences (CAS) Institutes: We Reap What We Sow?
Xiaoxi Xiao, Changhai Zhou, and Tao Dai, Chinese Academy of Sciences

Roundtable Rotation II: Producing Evidence of Effectiveness Data in the Real World of Early Childhood Education
Cindy Lin and Marijata Daniel-Echols, HighScope Educational Research Foundation

Session: Analysis and Evaluation of Research Portfolios Using Quantitative Science Metrics: Theory
Chair: Israel Lederhendler, National Institutes of Health
Discussant: Gretchen Jordan, Sandia National Laboratories

How Can Portfolio Analysis Assist Government Research Agencies to Make Wise Research Investments?
Robin Wagner and Matthew Eblen, National Institutes of Health

Limits of Portfolio Analysis to Address Evaluation Questions
Brian Zuckerman, Science and Technology Policy Institute

Reinventing Portfolio Analysis at the National Institutes of Health: Explorations in the Structure and Evolution of Biomedical Research
Israel Lederhendler, Kirk Baker, Archna Bhandari, and Carole Christian, National Institutes of Health

Intersections Among Scientometrics, Science Portfolio Analysis, and Research Evaluation: Does Complex Systems Science Offer Workable Theory?
Caroline Wagner, Science-Metrix Corp.

Session: Third Generation Research Knowledge Tracking: Citation Analyses
Presenters: Alan Porter and Stephen Carley, Georgia Institute of Technology

Roundtable: Methodological Issues in Evaluating Potentially Transformative Research
Mary Beth Hughes, Bhavya Lal, and Asha Balakrishnan, Science and Technology Policy Institute

Session: Evaluating the Science of Discovery in Complex Health Systems: Challenges and Opportunities
Chair: Alison Buchan, University of British Columbia


An Evaluation Framework for Advancing the Science of Evaluating Team Science: The Research on Academic Research Initiative (RoAR)
Cameron Norman, University of Toronto; Timothy Huerta, Texas Tech University; Sharon Mortimer and Allan Best, Michael Smith Foundation for Health Research; Alison Buchan, University of British Columbia

Advancing the Science of Evaluating Team Science: Psychosocial Factors and Related Outcomes from the RoAR Initiative
Cameron Norman, University of Toronto; Timothy Huerta, Texas Tech University; Sharon Mortimer, Michael Smith Foundation for Health Research; Alison Buchan, University of British Columbia

Advancing the Science of Evaluating Team Science: Social Network Outcomes from the RoAR Initiative
Timothy Huerta, Texas Tech University; Cameron Norman, University of Toronto; Sharon Mortimer, Michael Smith Foundation for Health Research; Alison Buchan, University of British Columbia

Advancing the Science of Evaluating Team Science: Scientometric-Related Outcomes from the RoAR Initiative
Sharon Mortimer, Michael Smith Foundation for Health Research; Timothy Huerta, Texas Tech University; Bianca Cervantes and Alison Buchan, University of British Columbia

Session: Linking Professional Associations to Advance the Study of Science and Innovation Policy
Chair: Gretchen Jordan, Sandia National Laboratories


View from the Atlanta Science and Technology Policy Conference and Others
Stephanie Shipp, Science and Technology Policy Institute

View from the American Evaluation Association's Research, Technology, and Development Evaluation TIG
Gretchen Jordan, Sandia National Laboratories

View from the Association for Public Policy Analysis and Management
Julia Melkers, Georgia Institute of Technology

View from the Academy of Management
Gordon Kingsley, Georgia Institute of Technology

Friday, November 12

Roundtable Rotation I: Big Money, More Scrutiny: How to Forge Evaluator-Early Childhood Education Program Partnerships in order to Produce Clear, Relevant, and Useful Data to Inform Policy and Practice
Marijata Daniel-Echols, HighScope Educational Research Foundation

Roundtable Rotation II: A Study on the Indicator of High Quality Papers: The Case of the Chinese Academy of Sciences
Haijin Zheng, Zhongchen Guan, Haiyang Hu, and Bing Shi, Chinese Academy of Sciences

Session: Challenges and Best Practices in Benefit Cost Studies of Research and Technology Programs
Chair: Rosalie Ruegg, TIA Consulting Inc.


A Case Study: A Benefit-Cost Analysis of Department of Energy's Investment in Solar Photovoltaic Energy Systems
Alan O'Connor, Ross J Loomis, and Fern M Braun, RTI International

Views from the Studies Done by the Advanced Technology Program
Rosalie Ruegg, TIA Consulting

Perspectives from Europe
Isabelle Collins, Technopolis

View of the Chairman of the Expert Review Panel
Irwin Feller, Pennsylvania State University

Session: Recent Developments in Research and Development Evaluation: The Academic Side
Chair/Discussant: Juan Rogers, School of Public Policy, Georgia Institute of Technology


Science Overlay Maps: A New Research Evaluation Tool
Alan Porter, Georgia Institute of Technology; Ismael Rafols, University of Sussex

Tracking Knowledge and Collaborative Developments in Research and Development: Combining Social Network and Bibliometric Analysis in the Evaluation of Research Centers
Julia Melkers and Jan Youtie, Georgia Institute of Technology

Bibliometric Analysis as a Methodology for Research Evaluation and Mapping of Science
Anthony van Raan and Thed van Leeuwen, University of Leiden

Session: From Research to Commercialization: Impact Evaluation of Portfolios of Research
Chair: Israel Lederhendler, National Institutes of Health

Tracing from Applied Research Programs to Downstream Applications: Approach and Findings
Rosalie Ruegg, TIA Consulting

Evaluating Ohio's Portfolio of Technology Programs
David Cheney, Jennifer Ozawa, and Chris Ordowich, SRI International

Session: Evaluating Government Research and Technology Policies: Traditional and Emerging Methods
Chair: Cheryl Oros, Oros Consulting LLC


Quality Evaluations of Government Policies in Research and Technology Sector
Yelena Thomas, Ministry of Research and Technology, New Zealand

Applications of Agent-Based Simulations in Evaluating Science and Technology Policies
Branco Ponomariov, University of Texas, San Antonio; Craig Boardman, Ohio State University

Session: Report on a Test of a General Method for Quick Evaluation of Medical Research by Morbidity
Chair: Jerald Hage, University of Maryland


A Method for Quick Evaluation of Medical Research
Jerald Hage, University of Maryland

A Quick Evaluation of Alzheimer's Disease Research
Amber Nelson, University of Maryland

A Quick Evaluation of Breast Cancer Research
J Alice Nixon, University of Maryland

A Quick Evaluation of Colorectal Cancer Research
Joseph Waggle, University of Maryland

Demonstration: Assessing Faculty Productivity and Institutional Research Performance: Using Publication and Citation Key Performance Indicators
Ann Kushmerick, Thomson Reuters

Think Tank: Your Input, Please: Research, Technology, and Development (RTD) Topical Interest Group (TIG) Draft User's Guide to Conducting Research and Development Evaluation
Gretchen Jordan, Sandia National Laboratories
Discussants: Brian Zuckerman, Science and Technology Policy Institute; Cheryl Oros, Oros Consulting LLC; Juan Rogers, School of Public Policy, Georgia Institute of Technology; George Teather, George Teather and Associates

Saturday, November 13

Session: Interim Evaluation: The Quality of Research and the Quality of Evaluation - Case Study of the FP7 Interim Evaluation

Case Study of the FP7 Interim Evaluation: Notions of Quality in Evaluation Design
Neville Reeve, European Commission

Evaluation Quality and Long-term Impact
Peter Fisch, European Commission

The Use of Independent Experts to Verify Quality and Strengthen Evaluation Process
Wolfgang Polt, Joanneum Research

The Role of Stakeholders on the Quality of Evaluation Design, Implementation and Impact
Iain Begg, London School of Economics

Session: Metrics for the National Institute of Environmental Health Sciences (NIEHS): Measuring Outcomes to Advance Partnerships for Environmental Public Health
Chair: Christie Drew, National Institute of Environmental Health Sciences
Discussant: Gretchen Jordan, Sandia National Laboratories


Metrics and Examples for Evaluating Education and Training Evaluation in Environmental Public Health Programs
Helena L Davis, Beth Anderson, Caroline Dilworth, Christie Drew, and Liam O'Fallon, National Institute of Environmental Health Sciences; Ashley Brenner, Cara O'Donnell, Sarah Ryker, Stephanie Shipp and Susannah Howieson, Science and Technology Policy Institute

Partnerships and Training: The United Steelworkers Health and Safety Department Worker Training and Education Program
Thomas McQuiston, United Steelworkers Health and Safety Department

Environmental Health and the Navajo Nation: Products, Dissemination, and Partnerships
Johnnye Lewis, University of New Mexico
 
Session: Systems of Evaluation for Diverse National Portfolios of Research: Lessons From Russia and Finland
Chair: Yelena Thomas, Ministry of Research, Science and Technology


Tekes Impact Goals, Logic Models and Evaluation of Socio-economic Effects
Jari Hyvarinen, Tekes - Finnish Funding Agency for Technology and Innovation

Verifiable Evaluation System for Research Programme
Igor Zatsman, Russian Academy of Sciences

Session: Evaluating Contributions to Knowledge Translation for New Technologies or Medical Treatments
Chair: John Reed, Innovologie LLC


Translating New Knowledge from Technology Based Research Projects: An Intervention Evaluation Study
Vathsala Stone, State University of New York at Buffalo

Evolving a High-Quality Evaluation System for the National Institutes of Health's (NIH) HIV/AIDS Clinical Trials Research Network
Scott Rosas, Concept Systems Inc; Jonathan Kagan, National Institutes of Health; Jeffrey Schouten, Fred Hutchinson Cancer Research Center; and William Trochim, Cornell University

Session: Evaluation of the National Research and Development (R&D) Programs as a Tool for Increasing Efficiency of Public Finance
Chair: Seung Jun Yoo, Korea Institute of Science & Technology Evaluation and Planning (KISTEP)


Trends in the Performance Evaluation System of the National R&D Program in Korea
Chi Yong Kim and Min Ki Kim, Korea Institute of Science & Technology Evaluation and Planning (KISTEP)

Ex-ante Evaluation of the Validity and Economic Impact of Nanotechnology Public R&D Program in Korea
Yun Jong Kim, Korea Institute of Science & Technology Evaluation and Planning (KISTEP); Dae Hyun Jeong, Korea Zinc Inc; and Jiho Hwang, Korea Institute of Science & Technology Evaluation and Planning (KISTEP)

Evaluation of Effectiveness for the Public Environmental R&D Program in Korea
Heekweon Lee, Jungseok Hong and Hyunjung Cho, Korea Institute of Science & Technology Evaluation and Planning (KISTEP); Suho Bae, Sungkyunkwan University

Evaluation of Horizontal Public R&D Programs: A Case Study in Korea
Il Hwan Lee, Seung Jun Yoo, and Boojong Gill, Korea Institute of Science & Technology Evaluation and Planning (KISTEP)

Session: Informing Portfolio Management Using Tracking Systems and Bibliometrics
Chair: Juan Rogers, School of Public Policy, Georgia Institute of Technology


The Impact of the United States-China Collaboration on China's Research Performance: Evidence from Nanotechnology Publication
Li Tang, Georgia Institute of Technology

The Benefits and Challenges of Participatory Tracking Systems for Monitoring Institutional Change
Marc Brodersen, Kathryn Nearing, Susan Connors, and Bonnie Walters, University of Colorado, Denver

Session: Improving Proposals and Programs by Improving Peer and Stakeholder Review 
Chair: George Teather, George Teather and Associates


Using Fishbone Analysis to Improve the Quality of Science and Technology Program Proposals
Shan Shan Li and Ling-Chu Lee, National Applied Research Laboratories, Taiwan

Interactive Heuristic Reviewing Mechanism: A New Method of Assessing Exploratory Pioneering Research Projects for the National Natural Science Foundation of China (NSFC)
Yue Wang, Xiaoxuan Li, and Jianzhong Zhou, Chinese Academy of Sciences; Yonghe Zheng, National Natural Science Foundation of China; and Guoxiang Xiong, Chinese Academy of Sciences

Session: Evaluation System of Research, Technology, and Development (RT&D) to Induce Innovation: Strategy, Process, and Reflection
Chair: Naoto Kobayashi, Waseda University, Japan


On the Feedback of Evaluation Results to the Management of Advanced Industrial Science and Technology (AIST)
Yoshiaki Tamanoue, Hidenori Endo, Shigeko Togashi, Shuichi Oka, Hiroyuki Suda, Kenta Ooi, Kanji Ueda, Mariko Kamo and Susumu Yasuda, National Institute of Advanced Industrial Science and Technology

Strategic RT&D in Nagasaki to Induce Local and Global Innovation
Osamu Nakamura and Nariatsu Inada, Nagasaki Prefectural Government; Naoto Kobayashi, Waseda University, Japan


RTD Presentations from AEA 2009


Wednesday, November 11

The Identification of Cluster Growth Factors Based on an Analysis of the Literature
Alan Porter, Georgia Institute of Technology

Thursday, November 12

Session: The Evaluation of Technology Clusters and Cluster Initiatives, Strategies and Policies: Grounding Factors and Models
Chair: Shannon Townsend, National Research Council Canada

The Identification of Cluster Growth Factors Based on an Analysis of the Literature
Marc Gagne, National Research Council Canada

The Development of a Cluster Leverage Model: Strategies for Measuring Incremental Cluster Inputs
Shazmin Dosani, Centre for Public Management; Shannon Townsend, National Research Council Canada

The Use of Community Discussion Groups to Gauge Cluster Cohesion and Action
Shannon Townsend, National Research Council Canada

Session: Evaluations to Improve Research Investment and Performance 
Chair: Dale Pahl,  United States Environmental Protection Agency


Meta-evaluation of the Performance Evaluation System of Public Research Institutes in Korea
Chan Goo Yi, Pukyong National University; Jang Jae Lee, Korea Institute of Science and Technology Evaluation and Planning; Yong Soo Hwang, Science and Technology Policy Institute

Integrating Evaluation With Business Process Modeling for Increased Efficiency and Faster Results in HIV/AIDS Clinical Trials Research
Jonathan Kagan, National Institutes of Health; William Trochim, Cornell University

Using Cross-case Study Analysis to Maximize the Use of an Evaluation of a Research Funding Program
Natalie Froese, RA Malatest & Associates Ltd; Nicole Michaud, Social Sciences and Humanities Research Council of Canada; Rob Malatest, RA Malatest & Associates Ltd; Suzanne Bélanger, RA Malatest & Associates Ltd

Session: Improving on Peer Review, Publication, and Patent Analysis: Views From Denmark, Taiwan and Korea  
Chair: David Bartle,  Ministry of Economic Development

The Peer Review and Dialogue: Including Research Organization in an Evaluation Process (paper)
Finn Hansson, Copenhagen Business School

Analyses and Evaluations of the Academic Activities in Humanities and Social Sciences in Taiwan
Chia-Hao Hsu, Yu-Ling Luo, Kai-Lin Chi, Science & Technology Policy Research and Information Center

The Study on the Patent Performance of National Research and Development (R&D) Programs in Korea Using Survival Analysis
Seongjin Kim, Seung Jun Yoo, Korea Institute of Science and Technology Evaluation and Planning

Session: Multi-Level and Multi-Factor Approach to Research Program Evaluation: Designing and Implementing the Evaluation of the European Union's Framework Programs for Research and Technology Development
Chair: Neville Reeve, European Commission

Objectives and Constraints With Strategy for Research Program Evaluation: Experience of Developing the European Union's (EU) Approach to Evaluation of the Framework Programs
Peter Fisch, European Commission

Evaluating the Rationale, Implementation and Achievements of a €20 Bn Research Program? Meta Evaluation of Evidence for the Evaluation of the Sixth Framework Programs
Erik Arnold, Technopolis

Evaluating Research Policy: Understanding and Assessing Policy To Develop The European Research Area
Philippe Laredo, Manchester Business School

Monitoring and Assessing the Progress of Research Programs: Implementation of Performance Monitoring and Interim Evaluation of the EU's 7th Framework Programs
Neville Reeve, Costas Paleologos, Martin Ubelhor, European Commission

Session: Assessment of Emergent Technologies: The Case of Nanotechnology  
Chair: Juan Rogers, Georgia Institute of Technology

Assessing Nanotechnology: Research Metrics and Maps
Alan Porter, Georgia Institute of Technology

Who is Funding Nanotechnology and How? Evidence From Publication Analysis
Jue Wang, Florida International University

Assessing the Development Path of "Active Nanotechnology"
Vrishali Subramanian, Georgia Institute of Technology

Publication and Citation Distributions in the Development of Nanotechnology
Juan Rogers, Georgia Institute of Technology

Session: Teams, Collaborative Projects and Networks: Evaluating Them and Their Impact  
Chair: Cheryl Oros,  Independent Consultant


Networking for Results: Assessing Collaborative Projects on Science and Technology
Ruslan Rakhmatullin, TCD, School of Business Studies; Louis Brennan, Trinity College

Review of Research Teams in Chinese Academy of Sciences: Identify the Effective and Efficient Team Building  
Jiaofeng Pan, Qiang Li, Bing Shi, Chinese Academy of Sciences

Applying Network Analysis to Science and Technology Development: Examples From the European Commission (EC Evaluation Part 2)
Francis Cunningham, European Commission; Franco Malerba, University of Bocconi; Caroline Wagner, SRI International; Marko Grobelnik, Josef Stefan Institute; Nicholas Vonortas, George Washington University

Session: Knowledge Translation for Technology Adoption  
Chair: Robin Wagner,  National Institutes of Health

The Impact of Farmers' Characteristics on Technology Adoption: A Meta Evaluation
Guy Blaise Nkamleu, African Development Bank

Achieving Knowledge Translation for Technology Transfer: Implications for Evaluation
Vathsala Stone, University at Buffalo - State University of New York
  

Session: Moving Toward Quantitative Evidence-based Science Policy: Science of Science Policy Developmental Efforts In Theory, Evaluation Methods, and Data Infrastructure
Chair: Kei Koizumi, United States Office of Science & Technology Policy
Discussant: Kei Koizumi, United States Office of Science & Technology Policy

Science of Science Policy: Overview and Strategic Directions
Bill Valdez, United States Department of Energy

Stimulating Research on Science of Science and Innovation
Julia Lane, National Science Foundation

A Data Infrastructure to Enable Research about Science
Israel Lederhendler, National Institutes of Health

Science of Science Policy: Methodological Development, Logic Model, and Need for Involvement of American Evaluation Association Community
Cheryl Oros, Independent Consultant

Friday, November 13

Session: New Evaluation Techniques For Estimating the Impacts of Science and Technology Innovations
Chair: Jerald Hage, University of Maryland

A Credible Approach to Benefit-Cost Evaluation for Federal Energy Technology Programs
Gretchen Jordan, Sandia National Laboratories; Rosalie Ruegg, TIA Consulting

Techniques for Evaluating Potential Benefits of New Scientific Instruments  
Jonathan Mote, Aleia Clark, Jerald Hage, University of Maryland

A New Evaluation Strategy for Measuring the Returns on Investments in Medical Research: The Meso Level of the Treatment Sector
Jerald Hage, University of Maryland
Gretchen Jordan, Sandia National Laboratories

Session: Contextualizing the Evaluand: Planning and Implementing an Evaluation of the Injury Control Research Center (ICRC) Program  
Chair: Sue Lin Yee, Centers for Disease Control and Prevention
Discussant: Thomas Bartenfeld, Centers for Disease Control and Prevention


Negotiating Diverse Contexts and Expectations in Stakeholder Engagement
Sue Lin Yee, Centers for Disease Control and Prevention

Clarifying the Evaluation Focus in a Complex Program Context
Howard Kress, Centers for Disease Control and Prevention

Considering Context in Data Collection and Analysis
Jamie Weinstein, MayaTech Corporation

Contextual Influences and Constraints on Communicating Findings
Kristianna Pettibone, MayaTech Corporation

Session: Evaluating High-Risk, High-Reward Research Programs: Challenges and Approaches 
Chair: Stephanie Philogene, National Institutes of Health
Discussant: Gretchen Jordan, Sandia National Laboratories

Evaluating the National Institute of Standards and Technology's Technology Innovation Program
Stephen Campbell, National Institute of Standards and Technology

Evaluating the National Institutes of Health's Director's Pioneer Award Program
Mary Beth Hughes, Science and Technology Policy Institute

Evaluating the Department of Defense's Defense Advanced Research Projects Agency
Richard Van Atta, Science and Technology Policy Institute

Evaluating the National Science Foundation's Small Grants for Exploratory Research Program
Caroline Wagner, SRI International

Session: New Direction of Government Research and Development Program Evaluation in Korea  
Chair: June Seung Lee, Korea Institute of Science and Technology Evaluation and Planning

Reformation of National Research and Development (R&D) Program Evaluation System
Dong-Jae Lee, Ministry of Strategy and Finance, Korea

In-depth Evaluation of Research and Development (R&D) Program in Korea
Seung Jun Yoo, Boo-jong Kil, Woo Chul Chai, Korea Institute of Science and Technology Evaluation and Planning

Evaluation of National Science and Technology Innovation Capacity
Seung Ryong Lee, Chi Yong Kim, Dong Hoon Oh, Korea Institute of Science and Technology Evaluation and Planning

The Effect of Institutional Evaluation System of GRIs on the Receptivity of Evaluation Result and the Performance of GRIs: Focusing on Economic and Human Society Research Council
Byung Yong Hwang, Soon Cheon Byeon, Korea Institute of Science and Technology Evaluation and Planning; Byung Tae Yoo, Hanyang University

Session: Evaluation for Research Centers, Institutes, and Multi-level Multi-context Programs  
Chair: Brian Zuckerman,  Science and Technology Policy Institute

A Study of the New Model and Methodology for Institute Evaluation in the Chinese Academy of Sciences (CAS)
Xiaoxuan Li, Bing Shi, Jianzhong Zhou, Chinese Academy of Sciences

The Status Analysis of Chinese Academy of Sciences Among National Research Institutes in the World
Guoliang Yang, Zhiyuan Liu, Wenbin Liu, Chinese Academy of Sciences

Designing Evaluation for Complex Multiple-component Funded Research Programs
Genevieve deAlmeida-Morris, National Institutes of Health

Research, Technology, and Development (RT&D) in Context: Evaluating the Indirect Effects of Self-Sustaining Cooperative Research Centers
Lindsey McGowen, Denis Gray, North Carolina State University

Session: Prospective Evaluation and Technology: New Developments and 21st Century Challenges  
Chair: Valerie J Caracelli, United States Government Accountability Office
Discussants: Bhavya Lal, Science and Technology Policy Institute; George F Grob, Center for Public Program Evaluation

What Might a Theory-Based Roadmap for Developing Innovation Policy Look Like?
Gretchen Jordan, Sandia National Laboratories

Evaluating a Federal Effort to Incentivize the Development of Technologies for Helping Drivers Avoid Highway Crashes
Nancy Donovan, United States Government Accountability Office

Evaluating An Agency's Response To A "High Clockspeed" Technology Trend With Unintended Consequences
Judith Droitcour, United States Government Accountability Office

Saturday, November 14

Session: Results Frameworks: Improving Evaluation of Socio-economic Outcomes of Research  
Chair: Steve Montague,  Performance Management Network

Stakeholder Involvement and the Choice of Science and Technology Policy Outcome Evaluation Methods
Shan-Shan Li, Ling-Chu Lee, Wen-Chi Hung, Kai-Lin Chi, Chia-Hao Hsu, Science & Technology Policy Research and Information Center

Socioeconomic Effects and Meta Evaluation of Tekes Programs
Jari Hyvarinen, Tekes, Finnish Funding Agency for Technology and Innovation

Evaluation's Contribution to Research, Technology and Development (RT&D): From Prescription to Menu to Guidelines
Steve Montague, Performance Management Network; Rudy Valentin, Christine Williams, Anne Vezina, Canadian Cancer Society

Session: Building Data Systems to Support Evaluation in a Biomedical Research and Development (R&D) Environment  
Chair: James Corrigan, National Institutes of Health
Discussant: Robin Wagner, National Institutes of Health

Data Infrastructure and Access: The National Institutes of Health Research Portfolio Online Reporting Tool (RePORT)
James Onken, National Institutes of Health

The Electronic Scientific Portfolio Assistant (e-SPA): A Tool for Analysis of Research Project Portfolios
Kevin Wright, National Institutes of Health

CareerTrac: An Aid to Effective Evaluation
Linda Kupfer, Ramkripa Raghavan, National Institutes of Health

Session: Measuring Additionality of Government Research: Views From New Zealand and Taiwan
Chair: Isabelle Collins,  Technopolis

Adding Additionality to Programs Evaluation
Chin-Wen Chang, Science & Technology Policy Research and Information Center; Che-Hao Liu, The Legislative Yuan, Republic of China

Evaluating the Additionality of Economic Development Policies in New Zealand
David Bartle, Cavan O'Connor-Close, Ministry of Economic Development

Session: Professional Development for Researchers and Potential Researchers  
Chair: Juan Rogers,  Georgia Institute of Technology

Evaluation of Research Experiences for Undergraduates: Lessons Learned
Teresa Chavez, Thomas Lang, Melinda Hess, University of South Florida

Developing Future Research Leaders: Challenges for Evaluation of a Collaborative University Program
Zita Unger, Evaluation Solutions

Session: Evaluation of Research, Technology, and Development (RT&D) to Illuminate Innovation  
Chair: Shigeko Togashi, National Institute of Advanced Industrial Science and Technology

Strategy and Evaluation in Research, Technology and Development (RT&D) for Innovation
Naoto Kobayashi, National Institute of Advanced Industrial Science and Technology; Osamu Nakamura, Science and Technology Promotion Bureau, Nagasaki Prefectural Government; Kenta Ooi, National Institute of Advanced Industrial Science and Technology

Systematic Evaluation to Recognize Outcomes of the National Institute of Advanced Industrial Science and Technology (AIST) in Society
Kenta Ooi, National Institute of Advanced Industrial Science and Technology; Osamu Nakamura, Science and Technology Promotion Bureau, Nagasaki Prefectural Government; Shuichi Oka, Koji Masuda, Shigeko Togashi, Naomasa Nakajima, National Institute of Advanced Industrial Science and Technology

Utilization of Follow-up Monitoring Results of Projects for Accountability and Research, Technology and Development (RT&D) Management at New Energy and Industrial Technology Development Organization (NEDO)
Jun-ichi Yoshida, Tsutomu Kitagawa, Takahisa Yano, New Energy and Industrial Technology Development Organization


Session: Improving Government Performance Management  
Chair: Henry Doan, United States Department of Agriculture

Government Performance Management Improvement: The Project
Kathryn E Newcomer, George Washington University
Government Performance Management Improvement Project: The Process
Mara Patermaster, National Academy of Public Administration
Government Performance Improvement: Reviewing Best Practices
Henry Doan, United States Department of Agriculture

Session: Out of Control? Selecting Comparison Groups for Analyzing National Institutes of Health Grants and Grant Portfolios
Chair: Christie Drew, National Institutes of Health

Establishing a Comparison Set for Evaluating Unsolicited P01s at the National Institute of Environmental Health Sciences
Christie Drew, Jerry Phelps, Martha Barnes, National Institutes of Health

It's A Small World After All: Describing and Assessing National Institutes of Health (NIH)-Funded Research in the Context of A Scientific Field
Sarah Glavin, Jamelle Banks, Paul Johnson, National Institutes of Health
NIH Loan Repayment Program: Applying Regression Discontinuity to Assess Program Effect
Milton Hernandez, National Institutes of Health
Laure Haak, Rajan Munshi, Matt Probus, Discovery Logic
The Use of Propensity Scores in a Longitudinal Science Study of Minority Biomedical Research Support From the National Institute of General Medical Sciences
Mica Estrada-Hollenbeck, California State University San Marcos
Anna Woodcock, Purdue University
David Merolla, Kent University
P Wesley Schultz, California State University San Marcos

Session: Outcome and Impact Evaluations: Brazil, Korea, and European Union  
Chair: Neville Reeve,  DG Research - European Commission

Meta-Analysis of Economic Studies of Public Research and Development (R&D) Programs
Jiyoung Park, Korea Institute of Science and Technology Evaluation and Planning
Multidimensional Ex Post Evaluation of Research and Development (R&D) Programs: The Case of an Oil Refining R&D Program in Brazil
André Furtado, Adalberto Azevedo, André Rauen, Edilaine Camillo, Department of Science and Technology Policy, Brazil

Trends and Evolution of the Information and Communication Technology (ICT) Research and Deployment Landscape in Europe
Nicholas Vonortas, George Washington University
Franco Malerba, Nicoletta Corrocher, Luigi Bocconi University


RTD Presentations from AEA 2008

(back to list)

Wednesday,  November 5

Session: Evaluating Public Health Research Centers: Assessing the Value-Added
Discussants: Sue Lin Yee, Demia Wright, Michele Hoover, Centers for Disease Control and Prevention

Evaluating Public Health Research Centers: Assessing the Value-Added
Thomas Bartenfeld, Howard Kress, Centers for Disease Control and Prevention

Thursday, November 6
Session: Moving Beyond Bibliometric Analysis: Emerging Evaluation Approaches at the National Institute of Environmental Health Sciences
Chair: Edward Liebow, Battelle
Discussants: Christie Drew, National Institute of Environmental Health Sciences; Howard Fishbein, Battelle

Evaluation of Research Impacts: The National Institute of Environmental Health Sciences (NIEHS) Research Portfolio
Jerry Phelps, National Institute of Environmental Health Sciences

Conceptual Model of Comprehensive Research Metrics for Improved Human Health and Environment
Jill Engel Cox, Battelle

Scientific and Public Health Impacts of the NIEHS Extramural Asthma Research Program From Existing Data
Shyanika Rose, Battelle

Scientific and Public Health Impacts of the NIEHS Extramural Asthma Research Program From New Primary Data
Carlyn Orians, Battelle

Session: Bridging Gaps Between Basic and Applied Research and Development and the 'Valley of Death'
Chair: Yongsuk Jang,  Science and Technology Policy Institute

A Valley of Death in the Innovation Sequence: An Economic Investigation
George S Ford, Thomas M Koutsky, Lawrence J Spiwak, Phoenix Center for Advanced Legal and Economic Public Policy Studies
Organizational Barrier in Bridging the 'Valley of Death' in Korea
Yongsuk Jang,  Science and Technology Policy Institute
'Full Research' to Overcome 'Valley of Death' in National Institute of Advanced Industrial Science and Technology (AIST)
Osamu Nakamura,  National Institute of Advanced Industrial Science and Technology
From S&T to Innovation: A Challenge to Mexico
Enrique Campos Lopez, Center of Investigation and Assistance in Technology and Design of the State of Jalisco

Session: Evaluation Use for Strategic Management in Environmental and Public Health Organizations
Chair: Dale Pahl, United States Environmental Protection Agency

Performance Measurement for Research Improvement
Phillip Juengst, United States Environmental Protection Agency
A Conceptual Model for the Capability, Effectiveness, and Efficiency of Laboratories at the United States Environmental Protection Agency
Andrea Cherepy, Michael Kenyon,  United States Environmental Protection Agency
Evaluating a Research Program Through Independent Expert Review
Lorelei Kowalski, United States Environmental Protection Agency
NCI Corporate Evaluation: Converging Approaches to Maximize Utility
James Corrigan, Kevin D Wright, Lawrence S Solomon, National Institutes of Health

Session: Evaluation Policy: Current Issues, Potential Impacts, and Future Directions
Presenter: Deborah Duran, National Institutes of Health 

Session: Discussing Approaches to Evaluating a Multifocused Suite of Energy Programs at Natural Resources Canada
Presenter: Gavin Lemieux, Natural Resources Canada
Discussants: Olive Kamanyana, Ann Cooper, Natural Resources Canada

Session: Research and Development Challenges to Meeting Government Performance Requirements
Chair: Kathryn Law,  National Institutes of Health
Discussant: Deborah Duran, National Institutes of Health

High Risk, High Reward Research at the National Institutes of Health (NIH)
Goutham Reddy, National Institutes of Health
Evaluation Policy and Evaluation Practice for Large Scientific Research Initiatives
William Trochim, Cornell University
What's the Use of Studying Science: Case - Using Profiling Analysis to Inform Science Management Decision Making
Ken Ambrose, National Institutes of Health

Session: The New Era of Performance Evaluation in the Korean Evaluation System
Chair: Young-Wha Cho,  Korea Institute of Science & Technology Evaluation and Planning

Reformation of Research and Development Program Performance Evaluation System
Jong Sung Im,  Ministry of Strategy and Finance
Use of Performance Evaluation Information of Research and Development Program in Budgeting Process in Korea
Boo-jong Kil, Su-dong Park, In-ja Kim, Korea Institute of Science and Technology Evaluation and Planning
Performance Evaluation of National Research and Development in the Field of Environmental Technology With Respect to Science and Technology Classification
Noeon Park, HyunJung Cho, Seung Jun Yoo, Yule Shin, Moon Jung Choi, Sang-youb Lee, Korea Institute of Science and Technology Evaluation and Planning

Evaluation of Technology Level Using a Dynamic Method on the Critical Technologies in the 2nd Science and Technology Basic Plan of Korea
Soon Cheon Byeon, Jiyeon Ryu, Moon Jung Choi, Pyengmu Bark, Hyuck Jai Lee, Korea Institute of Science and Technology Evaluation and Planning

Friday, November 7

Session: Innovation, Complex Research Teams and Problems of Integration: The Missing Link
Chair: Jerald Hage, University of Maryland


Kinds of Complex Research Teams
Gretchen Jordan, Sandia National Laboratories

Examples of Integrated and Non-integrated Research Teams in a Highly Innovative Research Organization: the Institut Pasteur
Jerald Hage, University of Maryland

Critique of Current Network Studies: Not Measuring Complex Nodes, Project Integration and Gaps in the Idea Innovation Network
Jon Mote,  University of Maryland

Session: Alternative Approaches to Strategic Management and Ex Ante Assessment of Portfolios of Research Programs
Chair: Cheryl J Oros, United States Department of Veterans Affairs

Evaluation of Tekes Funding for Research Institutes and Universities
Jari Hyvarinen,  Tekes
Cost-Benefit Analysis for Research and Development Program Evaluation
Jiyoung Park, Seung Kyu Yi, Korea Institute of Science and Technology Evaluation and Planning

Content and Values in the Strategic Management of Research and Development Portfolios
Juan Rogers, Georgia Institute of Technology

Session: What's Common? Impact of Research on Renewable Energy Technology and on Poverty
Chair: George Teather, Performance Management Network Inc

Evaluation of a Portfolio of Technologies: Wind Energy
Rosalie Ruegg,  TIA Consulting Inc
Rethinking Impact Evaluation: Lessons from International Agricultural Research and Development
Jamie Watts, Bioversity International
Nina Lilja, Consultative Group on International Agricultural Research
Patti Kristjanson, International Livestock Research Institute
Douglas Horton,  Independent Consultant

Session: Enhancing the Understanding of the Technology Development and Innovation Process in Firms: Creation of a Data Enclave for Business Dataset
Chair: Brian Zuckerman,  Science and Technology Policy Institute

Fostering the Data Enclave
Stephanie Shipp,  Science and Technology Policy Institute
Developing the Data Enclave
Julia Lane,  National Science Foundation

Session: Peer Review: From Evaluating Science to Evaluating Science Policy
Chair: Isabelle Collins,  Technopolis Ltd

Papers, Projects, Programs and Portfolios: Peer Review as a Public Health Research Evaluation Tool
Robin Wagner et al, Centers for Disease Control and Prevention
Peer Review and the Open Method of Co-Ordination: Reviewing National Research and Development Policy Mixes
Patries Boekholt,  Technopolis Ltd
Peer Review as a Policy Learning Tool
Isabelle Collins et al, Technopolis Ltd

Session: Effectiveness of Evaluation From the Perspective of Outcomes on Research Units and Research-supportive and Administrative Departments in National Institute of Advanced Industrial Science and Technology (AIST)
Chair: Osamu Nakamura, National Institute of Advanced Industrial Science and Technology
Discussant: Katsuhisa Kudo, National Institute of Advanced Industrial Science and Technology

Effects of Evaluation from the Perspective of Outcomes on Research Units in National Institute of Advanced Industrial Science and Technology (AIST)
Osamu Nakamura et al, National Institute of Advanced Industrial Science and Technology 

Effects of Evaluation on Research-supportive and Administrative Departments in National Institute of Advanced Industrial Science and Technology (AIST)
Tomoko Mano et al, National Institute of Advanced Industrial Science and Technology

Session: Improving Evaluation Design and Methods: Examples for Research Programs
Chair: Stephanie Shipp,  Science and Technology Policy Institute

The Role of Case Studies in Evaluation of Research Technology and Development Programs
George Teather, Steve Montague, Performance Management Network Inc
Building Evaluation into Program Design: A Generic Evaluation Logic Model for Biomedical Research Programs at the National Cancer Institute
P Craig Boardman et al, Science and Technology Policy Institute

Session: Meeting Challenges of Evaluating and Sustaining Research Centers and Institutes
Chair: Brian Zuckerman,  Science and Technology Policy Institute

A "Meta-Evaluation" of Collaborative Research Programs: Definitions, Program Designs, and Methods
Christina Viola Srivastava, P Craig Boardman, Brian Zuckerman, Science and Technology Policy Institute

Predictors of Cooperative Research Center Post-Graduation Survival and Success: An Update
Lindsey McGowen, Denis Gray, North Carolina State University
From Evaluation Framework to Results: Innovative Approaches piloted with the Interim Evaluation of the Regional Centers of Excellence for Biodefense and Emerging Infectious Diseases Research (RCE) Program
Kathleen M Quinlan,  Concept Systems Inc

Saturday, November 8

Session: Engaging Stakeholders in the Scientific Enterprise: Using Concept Mapping for Research Priority Setting and Participatory Evaluation
Chair: Scott Rosas, Concept Systems Inc
Discussant: Scott Rosas, Concept Systems Inc

Defining Success for the National Institute of Allergy and Infectious Diseases' Regional Centers of Excellence in Biodefense and Emerging Infectious Diseases Research Program: A Co-Authored Evaluation Framework and Plan
Mary Kane, Kathleen M Quinlan, Concept Systems Inc
Identifying Research Priorities for the National Cancer Institute's Cancer Research Network: Developing a Collaboratively Authored Conceptual Framework
Kathleen M Quinlan et al, Concept Systems Inc

Setting the Research Agenda with Communities
Mary Kane,  Concept Systems Inc

Session: From Identification of Major Challenges to Evaluation to Proactively Influencing Evaluation Policies: A Compilation of the Findings from a Series of Think Tanks
Presenters: Rosalie Ruegg, TIA Consulting Inc; Connie Chang, Ocean Tomo Federal Services

Session: Impact Evaluation for Public Research and Development Program in Japan
Chair: Yasukuni Okubo,  Ministry of Economy Trade and Industry

Effective Methods to Evaluate Impacts of METI Projects: New Approaches for Logic Model Development to Find Paths Between Research and Development Outputs and Objectives
Kazuki Ogasahara et al, Ministry of Economy Trade and Industry 

Case Study and Analysis for Economic and Social Impact on National Research and Development Project Based on the Results of Follow-Up Monitoring and Evaluation at NEDO
Takahisa Yano et al, New Energy and Industrial Technology Development Organization
Measurement of Economic Impact on National Research and Development Program and Cost Benefit Analysis at NEDO
Kazuaki Komoto et al, New Energy and Industrial Technology Development Organization

Session: Methods and Techniques for Analyzing, Measuring, and Valuing the Impact of Intellectual Property Assets: A Focus on Patents Derived From Federal Research Funding
Chair: Connie Chang, Ocean Tomo Federal Services

Setting the Stage: Introduction to the Panel and General Overview
Connie Chang,  Ocean Tomo Federal Services
Evaluating the Impact of the United States Advanced Technology Program: What Can Patents Tell Us?
Ted Allen,  National Institute of Standards and Technology
Identifying Emerging, High-Impact Technological Clusters: An Overview of a Report Prepared for the Technology Administration, United States Department of Commerce
Tony Breitzman,  1790 Analytics
Soup to Nuts: How NASA Technologies Got Transferred to the Marketplace via a Live Intellectual Property Auction
Darryl Mitchell,  National Aeronautics and Space Administration

Session: Science of Science Management: Development of Assessment Methodologies to Determine Research and Development Progress and Impact
Presenter: Deborah Duran, National Institutes of Health

RTD Presentations from AEA 2007

(back to list)

Wednesday,  November 7

Session: Closing the Loop: Mapping Value to Inform Research Management

Chair: Neville Reeve, European Commission


A Framework for Evaluating Large Scale AIDS Clinical Research Networks
Brendan Cole and Jonathan Kagan, National Institutes of Health
Melissa Burns, Mary Kane, and Kathleen M. Quinlan, Concept Systems Inc.
William Trochim, Cornell University
Daniel Montoya, Hill and Knowlton

Analysis of Follow-up Evaluation Results of Research and Development (R&D) Projects Applying Logic Model to Elucidate the Process of Innovation
Kazuki Ogasahara, Ministry of Economy, Trade and Industry
Osamu Nakamura, National Institute of Advanced Industrial Science and Technology
Kazuyuki Inahashi, Chikahiro Miyokawa, Yoshitaka Kimura, Ministry of Economy, Trade and Industry

Contribution of Evaluation to Management of Research and Development (R&D) in the Process of Technology Transfer: A Knowledge Value Mapping Approach
Juan Rogers, Georgia Institute of Technology

 

Thursday,  November 8

Session: Hard Cases: Measuring and Facilitating Interdisciplinarity and Inter-Organizational Interactions

Chair: Isabelle Collins, Technopolis


University-Industry Collaboration: An Issue for Ireland as an Economy With a High Dependence on Academic Research
James Ryan, CIRCA Group Europe Ltd

Measuring the Interdisciplinarity of a Body of Research
David Roessner, SRI International
Anne Heberger, Alex Cohen, Marty Perreault, The National Academies
Alan Porter, Georgia Institute of Technology

Wikis in Evaluation: Evaluating Wikis for Theory Development in a Multi-disciplinary Center
P Craig Boardman, Nathaniel Deshmukh Towery, Brian Zuckerman, Science and Technology Policy Institute


Session: Research Evaluation of the Upcoming European Union's Framework Programme

Chair: Peter Fisch, European Commission


Networks of Innovation in Information Technology: Technology Development and Deployment in Europe
Nicholas Vonortas, George Washington University
Franco Malerba, Nicoletta Corrocher, Lorenzo Cassi, Luigi Bocconi University

A New System for Research Evaluation Under the European Union's Seventh Framework Programme
Neville Reeve, European Commission

The European Union's Seventh Framework Programme and the Role of Evaluation
Peter Fisch, European Commission


Session: Looking Inside the Research Center Black Box: Using Evaluation Research to Promote Organizational Effectiveness of Scientific Research Centers
Chair: Denis Gray, North Carolina State University
Discussant: Gretchen Jordan, Sandia National Laboratories

Evaluating Leadership Development in a Research and Development (R&D) Context: Assessing Alpha, Beta, and Gamma Change
Bart Craig, North Carolina State University


A Multi-variate Study of Graduate Student Satisfaction and Other Outcomes Within Cooperative Research Centers
Jennifer Schneider, North Carolina State University

 

Enhancing Collaboration Between Historically Black Colleges and Universities (HBCU) and Research Extensive Universities
Andrea Lloyd, North Carolina State University

Predictors of Cooperative Research Center Post-Graduation Survival and Success
Lindsey McGowen, North Carolina State University



Session: Tools for Useful Performance Assessment of Science and Technology Programs: An Example

Chair: Jerald Hage, University of Maryland

Discussant: Alfred Powell, National Oceanic and Atmospheric Administration

 

Prologue
Jerald Hage, University of Maryland

A Strategic Balanced Scorecard for Publicly Funded Science and Technology Programs
Gretchen Jordan, Sandia National Laboratories

Perceptions of the Research Environment: Kinds of Networks, Research, and Projects
Jerald Hage, University of Maryland

Measuring the Immeasurable: Innovation and the Economic Benefits of Satellite Data
Jonathon Mote, University of Maryland

 

Think Tank: Identifying Challenges to Using Evaluation to Inform Program Management and Public Policy

Chair: Connie Chang, United States Department of Commerce
Discussants: Connie Chang, United States Department of Commerce and Rosalie Ruegg, TIA Consulting Inc


 

Business Meeting: Research, Technology, and Development Evaluation TIG Business Meeting

2006 TIG Leaders: Gretchen Jordan, Sandia National Laboratories, George Teather, Independent Consultant, Brian Zuckerman, Science and Technology Policy Institute

 

Friday, November 9

 

Session: Evaluation of Community-based Participatory Research and Community Mobilization Strategies to Prevent Chronic Disease and Youth Violence: Advances and Lessons Learned by Two Research Center Programs at the Centers for Disease Control and Prevention

Chair: Alicia Norris, Centers for Disease Control and Prevention


Using Document Review and Data Abstraction to Inform Management of a Federal Research Program: Lessons, Benefits, and Challenges Found by the Centers for Disease Control and Prevention's Prevention Research Centers Program

Demia S Wright, Centers for Disease Control and Prevention

 

An Evaluation of Community Based Participatory Research and Community Mobilization: Formative Research Results From the National Academic Centers of Excellence on Youth Violence Prevention

Nancy Stroupe, Centers for Disease Control and Prevention

 

Session: Strategic Evaluation in a Public Research Institute to Contribute to Innovation

Chair: Osamu Nakamura, National Institute of Advanced Industrial Science and Technology

Discussant: Naoto Kobayashi, National Institute of Advanced Industrial Science and Technology

 

Strategic Evaluation of Research Units Towards Innovation in a Public Research Institute
Osamu Nakamura, Shin Kosaka, Michiko Takagi Sawada, Isao Matsunaga, Masao Koyanagi, Koichi Mizuno, Naoto Kobayashi, National Institute of Advanced Industrial Science and Technology

Evaluation System with PDCA Cycle in the Management of National Institute of Advanced Industrial Science and Technology
Sunao Kunimatsu, Osamu Nakamura, Yoshikazu Arai, Hiroshi Sato, Shinichi Kikuchi, Suzuko Nakatsu, Naoto Kobayashi, National Institute of Advanced Industrial Science and Technology

 

Session: Peer Review and Learning: New Uses

Chair: David Roessner, SRI International

 

Peer Review of Transformative Research: Strategies and Challenges in Identifying Innovation in Ex Ante Evaluation
Elmer Yglesias, Science and Technology Policy Institute
David Kaplan, Case Western Reserve University

Peer Reviews or Peers Reviewing? Peer Review as Policy Learning in Innovation, Research and Education
Erik Arnold, Isabelle Collins, Technopolis, UK

 

Session: Ex Ante Evaluation: Methods for Estimating Innovation and Other Research Outcomes

Chair: George Teather, Independent Consultant

 

Ex Ante Portfolio Analysis of Public R&D Programs for Industrial Technologies in Korea: Practices at the Korea Institute of Industrial Technology Evaluation and Planning
Yongsuk Jang, George Washington University
Jongman Park, Korea Institute of Industrial Technology Evaluation and Planning

Impact Evaluation in Preliminary Feasibility Analysis of National R&D Programs
Jiyoung Park, Korea Institute of Industrial Technology Evaluation and Planning
 

Session: A Directory of Evaluation Methods for Managers of Public Research and Technology Programs

Rosalie Ruegg, TIA Consulting Inc

 

Saturday, November 10

 

Session: Ex Ante Evaluation: Evaluation as an Agent of Program Change: An Example From Austria

Chair: Klaus Zinoecker, Vienna Science and Technology Fund

 

The Evaluation of Genome Research Austria (GEN-AU): Overview of the Study's Aims, Structure, Methods, Results, Implications, and Impacts
Klaus Zinoecker, Vienna Science and Technology Fund
Alfred Radauer, Iris Fischl, Roald Steiner, Austrian Institute for SME Research
Brigitte Tempelmaier, Austrian Economic Service
Rosalie Ruegg, TIA Consulting Inc

Developing a Plan for Future Monitoring and Impact Analysis of Genome Research Austria
Rosalie Ruegg, TIA Consulting Inc

 

Session: A Roadmap for Developing a Public Health Research Portfolio Evaluation Program

Chair: Robin Wagner, Centers for Disease Control and Prevention

 

Overview of Methodology Used to Develop a Research Evaluation Program
Jerald O'Hara, John Araujo, Mona Choi, Catherine Pepper, Robin Wagner, Guijing Wang, Trevor Woollery, Centers for Disease Control and Prevention

MEDLINE Search Strategies vs. Relevant Retrieval: How Closely do They Match for a Research Evaluation Topic?
Catherine Pepper, Jerald O'Hara, John Araujo, Mona Choi, Robin Wagner, Guijing Wang, Trevor Woollery, Centers for Disease Control and Prevention

Evaluating Public Research Investment: A Literature Review
Jerald O'Hara, John Araujo, Mona Choi, Catherine Pepper, Robin Wagner, Guijing Wang, Trevor Woollery, Centers for Disease Control and Prevention

Extending the Pay-Back Model to Incorporate Costs as Well as Benefits to Measure the Net Impacts of Organizational Expenditures on Public Health Research
Guijing Wang, Jerald O'Hara, John Araujo, Mona Choi, Catherine Pepper, Robin Wagner, Trevor Woollery, Centers for Disease Control and Prevention

A Bibliometric Methodology to Inform a Logic Model for Evaluating a Public Health Research Portfolio
John Araujo, Jerald O'Hara, Mona Choi, Catherine Pepper, Robin Wagner, Guijing Wang, Trevor Woollery, Centers for Disease Control and Prevention

 

 

Session: Using Logic Models to Evaluate Research and Technology Diffusion Results: Two Cases

Chair: Jeff Dowd, United States Department of Energy

 

Cutting Edge Logic Models for Research and Technology Programs
Gretchen Jordan, Sandia National Laboratories

Linking Projects to Program Outcomes in Metrics for Technology Development Programs
John Mortensen, Energetics, Inc.

The Logic of Indirect Programs to Diffuse Technologies: The Example of Training
John Reed, Innovologie

Getting From Training to Credible Energy Savings: An Evaluation Template
Harley Barnes, LM Business Process Solutions

 

Session: The Follow-up Monitoring and Outcome Survey for National Research and Development Projects in New Energy and Industrial Technology Development Organization (NEDO)

Chair: Takahisa Yano, New Energy and Industrial Technology Development Organization

 

Study of the Correlation Between Ex-post Evaluation and Follow-up Monitoring of National Research and Development (Part I)
Hiroyuki Usuda and Momoko Okada, New Energy and Industrial Technology Development Organization

Study for the Important Management Factors Based on Follow-up Monitoring Data (Part II)
Setsuko Wakabayashi, Tsutomu Kitagawa, Takahisa Yano, Kazuaki Komoto, New Energy and Industrial Technology Development Organization

Approach for the Understanding of Outcomes Derived from National Research and Development of Energy Conservation Project (Part III)
Kazuaki Komoto, Tsutomu Kitagawa, Takahisa Yano, Setsuko Wakabayashi, New Energy and Industrial Technology Development Organization

 

Session: Forging a Strong Link Between Research and Science Policy for Air Quality Decisions

Chair: Dale Pahl, United States Environmental Protection Agency

 

An Overview of National Ambient Air Quality Standards
Ron Evans, United States Environmental Protection Agency

A Paradigm for Federal Particulate Matter Research
James Vickery, United States Environmental Protection Agency

Relationships Among Atmospheric Contaminants, Air Quality, Human Exposure, and Health
Rochelle Araujo, United States Environmental Protection Agency

Synthesis and Evaluation of New Scientific Knowledge
William Wilson, United States Environmental Protection Agency

Session: National Performance Evaluation System of Research and Development Programs in Korea: System and Applications

Chair: Sang-Youb Lee, Korea Institute of Science and Technology Evaluation and Planning

Discussant: Jiyoung Park, Korea Institute of Science and Technology Evaluation and Planning

 

Method and Application of the Survey and Analysis of National Research and Development (R&D) Programs for the Performance Evaluation in Korea
Keun-Ha Chung, Hyejung Joo, Herin Ahn, Korea Institute of Science and Technology Evaluation and Planning

The Performance Evaluation of Research and Development Programs in Korea
Seong-jin Kim, Soon Cheon Byeon, Korea Institute of Science and Technology Evaluation and Planning

Design of Metaevaluation Model for National Research and Development Programs in Korea
Young Soo Ryu and Soon Cheon Byeon, Korea Institute of Science and Technology Evaluation and Planning
Byung Dae Choi, Hanyang University

Development of the Evaluation Methodology for the Basic Research in Korea
Hyeyoung Yang and Sangki Jeong, Korea Institute of Science and Technology Evaluation and Planning

Performance Evaluation of Agriculture Research and Development Programs in Korea
Hoijong Jung, Korea Institute of Science and Technology Evaluation and Planning

 

Session: Centralized E-Tool for Organizational Performance Management: National Institutes of Health (NIH) Government Performance Result Acts (GPRA) & Program Assessment Rating Tool (PART) Assessments

Deborah Duran, National Institutes of Health


RTD Presentations from AEA 2006

(back to list)

Wednesday,  November 1

Session: Logical Frameworks for Multilevel Evaluation of Research, Technology and Development (RT&D)

Chair: Erik Arnold,  Technopolis Ltd


Framework for Monitoring and Impact Evaluation: Measuring Additionality and Systemic Impacts of Public Research and Development Funding - Case Tekes, Finnish Funding Agency for Technology and Innovation
Anna-Maija Rautiainen, Finnish Funding Agency for Technology and Innovation

Evaluation Framework and Impact Assessment in Tekes
Jari Hyvarinen, Tekes, Finnish Funding Agency for Technology and Innovation

A Framework for Evaluating Diverse Portfolios of Scientific Work at the National Systems Level

Gretchen Jordan, Sandia National Laboratories
Jerry Hage, University of Maryland
Jonathan Mote,  University of Maryland

 

Thursday,  November 2

Session: Critiques and New Ideas for Systems to Track and Evaluate Research Programs

Chair: N/A


A Comparative Analysis of International Research Evaluation Systems
Chris L S Coryn, Western Michigan University

An Effective Socio-Economic Impact Tracking System
A Dennis Rank, Dennis Rank and Associates

Implications for Evaluation of the Results of an Enterprise Risk Management Pilot at the National Research Council of Canada
Frances Isaacs, Shannon Townsend, National Research Council of Canada


Session:  Social Network Analysis in Evaluation of Research Programs

Chair: John H Reed,  Innovologie LLC


Evaluation of Multidepartment (Horizontal) Science & Technology (S&T) Programs 
Steve Montague, George Teather, Performance Management Network Inc
 

New Directions in the Use of Network Analysis in Research and Development Evaluation
Jonathon Mote, University of Maryland
Gretchen Jordan, Sandia National Laboratories
Jerald Hage, University of Maryland

 


Session: Evaluating Government Research and Development Programs with Economic and Other Quantitative Methods: What Works Best
Chair: Bruce McWilliams,  United States Department of Agriculture
Discussant: George Teather,  Performance Management Network Inc

Evaluating Government Research and Development Programs With Quantitative Methods: National Aeronautics and Space Administration (NASA)'s Erasmus Performance Management System
Julie Pollitt, National Aeronautics and Space Administration


Evaluating Federal Research and Development Programs With Quantitative Measures: An Outside-In Look at Translational Research at the National Institutes of Health (NIH)
Brian Zuckerman, Alexis Wilson, Science and Technology Policy Institute

 

Evaluating Government Research & Development Programs with Quantitative Methods: What Works Best
Bruce McWilliams, United States Department of Agriculture

 

Session: Evaluation of Technologies
Chair: Dundar Kocaoglu,  Portland State University

Technology Evaluation Strategies
Tugrul Daim, Dundar Kocaoglu, Jonathan Ho, Portland State University

Technology Evaluation for Policy Decisions
Audrey Alvear, Dundar Kocaoglu, Portland State University

Hierarchical Decision Models Sensitivity Analysis (HDM-SA) for Technology Evaluation
Hongyi Chen, Dundar Kocaoglu, Portland State University

Predicting the Impact of Changing New Product Development Targets on Release Dates
Tim Anderson, Robert Harmon, Lane Inman, Portland State University

 


Session: Integrated Approaches to Evaluation of Agency-Level Portfolios of Research

Chair: David Bartle,  New Zealand Ministry of Economic Development


An Integrated Approach to Performance Measurement, Evaluation and Risk Monitoring of Science & Technology (S&T) Programs

George Teather, Steve Montague, Performance Management Network Inc

 

Is Europe Making a Difference: Assessing Impact in Information Society Technologies Research From Methodology to Implementation

Isabelle Collins,  Technopolis Ltd


Challenges and Emerging Tools for Research and Technology Portfolio Evaluation

Christina Viola Srivastava, Bhavya Lal, Brian Zuckerman, Alexis Wilson, Nathaniel Deshmukh Towery, Science and Technology Policy Institute


Evaluation of a Portfolio of Research, Development and Demonstration Programs to Serve Multiple Stakeholders

Helen Kim, Larry Pakenas, New York State Energy Research and Development Authority
Rick Ridge, Heschong Mahone Group
Scott Albert, GDS Associates Inc

 

Session: Improving and Integrating Evaluation into Program Management

Chair: Henry Doan,  United States Department of Agriculture
Discussant: Cheryl Oros,  National Institutes of Health


The Portfolio Review Expert Panel (PREP) Process: Planning and Accountability Office, Cooperative State Research, Education, and Extension Service (CSREES) Use and Perspective
Djime Adoum, Henry Doan, United States Department of Agriculture

Improving Program Management using the Portfolio Review Expert Panel (PREP): Program Manager Perspective
Caroline Crocoll, Tom Tate, United States Department of Agriculture

Using a Portfolio Review Expert Panel (PREP)-like Process at the State Level: State Perspective on the Use of Portfolio Reviews
Steve Loring, New Mexico State University


Friday, November 3

 

Session: National Research and Development Program Evaluation in Korea: Various Applications and Utilizations

Chair: Sang-Youb Lee,  Korea Institute of Science and Technology Evaluation and Planning
Discussant: Byoungho Son,  Korea Institute of Science and Technology Evaluation and Planning


Ex Ante-evaluation Methodologies Used in Korea for Public Research Resource Allocation

Byeongwon Park, Seokho Son, Korea Institute of Science and Technology Evaluation and Planning

 

Methodologies and Practices of Preliminary Feasibility Studies on National Research and Development Programs

Jiyoung Park, Su-dong Park, Korea Institute of Science and Technology Evaluation and Planning

Developing Key Performance Indicators for Research & Development Programs Using Logic Model and "Matrix Map"

Sangki Jeong, Jihyun Park, Donghoon Oh, Korea Institute of Science and Technology Evaluation and Planning

Design of In-Depth Research & Development Program Evaluation System and its Application in Korea

Ji Ho Hwang, Jin Kyoung Moon, Korea Institute of Science and Technology Evaluation and Planning

 

 

Session: Evaluation of Research and Technological Development Programmes: A Performance Audit Perspective

Chair: François Colling,  European Court of Auditors

Discussants: Erik Arnold,  Technopolis Ltd; Martin Weber,  European Court of Auditors

 

Evaluation of Research and Technological Development Programmes: A Performance Audit Perspective
Gareth Roberts, European Court of Auditors

 

 

Session: Follow-Up Monitoring and Evaluation of National Research and Development in New Energy and Industrial Technology Development Organization (Part I and Part II)

Chair: Hideo Shindo,  New Energy and Industrial Technology Development Organization

 

Follow-Up Monitoring and Evaluation of National Research and Development in New Energy and Industrial Technology Development Organization (Part I): Results of Evaluation in the Second Year
Syuji Yumitori, Takahisa Yano, Setsuko Wakabayashi, Kazuaki Kohmoto, New Energy and Industrial Technology Development Organization

Study for the Successful Results of Post-Project Activities Based on the Follow-Up Monitoring Data (Part II)
Takahisa Yano, Syuji Yumitori, Setsuko Wakabayashi, Kazuaki Kohmoto, New Energy and Industrial Technology Development Organization

 

Session: Utilizations and Analyses of the Results of the National Research and Development Project Evaluations at the New Energy and Industrial Technology Development Organizations (NEDO)

Chair: Hideo Shindo,  New Energy and Industrial Technology Development Organization

 

Effective Evaluation System of Research and Development Projects and Utilization of its Achievement at New Energy and Industrial Technology Development Organization (NEDO)
Hiroyuki Usuda, New Energy and Industrial Technology Development Organization

The Analysis of the Evaluation Results of the Mid-and-Long Term Research and Development Projects From a New Perspective
Momoko Okada, New Energy and Industrial Technology Development Organization

 

Saturday, November 4

Session: Generic Theory-Based Logic Model for Creating Scientifically-Based Program Logical Models

Chair: Gretchen Jordan, Sandia National Laboratories

John H Reed,  Innovologie LLC

Session: Strategic Planning and Evaluation of Research and Development in Japan to Produce Outcomes for Industrial Innovation

Chair: Osamu Nakamura,  Ministry of Economy Trade and Industry

 

Strategic Planning and Evaluation of Research and Development Programs in Ministry of Economy, Trade and Industry (METI)
Osamu Nakamura, Kazuki Ogasahara, Shoichiro Ishikawa, Hiroya Hirose, Hiroaki Shibao, Ministry of Economy, Trade and Industry

Strategic Evaluation of Research: Viewpoint of Outcome in the Evaluation of Research Units in AIST (National Institute of Advanced Industrial Science and Technology)
Masakazu Date et al., National Institute of Advanced Industrial Science and Technology



Evaluation for the Research-Support and Administrative Departments in the National Institute of Advanced Industrial Science and Technology in Japan
Shigeru Suto et al., National Institute of Advanced Industrial Science and Technology

Activities for Structuring an Internal Project Management Guideline System at New Energy and Industrial Technology Development Organization (NEDO)
Kazuyuki Takada et al., New Energy and Industrial Technology Development Organization

 

Session: Evaluating Scientific Research Initiatives
Chair: William Trochim,  Cornell University
Discussants: Jonathan Kagan, National Institute of Allergy and Infectious Diseases; Valerie Caracelli, United States Government Accountability Office

 

Theory and Evidence in the Evaluation of the Long-Term Impact of Research Programs
Leslie Cooksy, University of California at Davis

Evaluation of Large Initiatives of Scientific Research at the National Institutes of Health
Mary Kane, Concept Systems Incorporated
William Trochim, Cornell University

Building Federal Research and Development (R&D) Evaluation Capacity Useful to Policy Makers
Cheryl Oros, National Institutes of Health

 

Session: Meeting an Evaluation Challenge: Identifying and Overcoming Measurement and Data Difficulties
Chair: Rosalie Ruegg,  TIA Consulting Inc

 

Meeting an Evaluation Challenge: Identifying and Overcoming Measurement and Data Difficulties
Rosalie Ruegg, TIA Consulting Inc
Connie Chang, United States Department of Commerce

 

Session: Improving the Utilization of Evaluation for Research Policy Decision Makers

Chair: Cheryl Oros,  National Institutes of Health
Discussant: Bruce Baskerville,  National Research Council of Canada

 

The Consequences of Research Program Evaluation: Program Improvement and Decision-Making
Dale Pahl, United States Environmental Protection Agency

Increasing Evaluation Relevance and Application to Government Policy: Insights from New Zealand Evaluation Experience in Research and Development and Economic Development Policies
David Bartle, Louise Hull, Katherine Meerman, Ministry of Economic Development

 

 

Session: Quantitative and Qualitative Evaluation of Research Networks: Examples from the Prevention Research Centers Program at the Centers for Disease Control and Prevention

Chair: Jo Anne Grunbaum,  Centers for Disease Control and Prevention

 

National Indicators and Qualitative Studies to Measure the Activities and Outcomes of the Centers for Disease Control and Prevention's Prevention Research Centers Program
Demia Sundra, Centers for Disease Control and Prevention

A Case Study Approach to Understanding the Value Added and Lessons Learned from the Healthy Aging Thematic Research Network
Doryn Chervin, Gayle-Marie Holmes Payne, ORC Macro

Evaluation of the Cancer Prevention and Control Research Network (CPCRN)
Katherine Wilson, Centers for Disease Control and Prevention
Cathy L Melvin, University of North Carolina at Chapel Hill

 

Session: Are We There Yet: A Review of the "Social Science of Science Policy"

Chair: Bhavya Lal, Science and Technology Policy Institute

 

RTD Presentations from AEA 2005

(back to list)

Wednesday, October 26

Session: Methods for Monitoring and Evaluating Research Outcomes

Chair: Isabelle Collins,  Technopolis Ltd


Using Multivariate Techniques for the Analysis of Performance Data: A Case Example
Isabelle Bourgeois,  University of Ottawa

The Institutionalization of Impact Assessment in Argentina: The National Institute for Agricultural Technology as a Study Case

Susana Beatriz Mirassou et al.,  Instituto Nacional de Tecnología Agropecuaria


Applying the Balanced Scorecard in Public Research and Development Programs 
Byeongwon Park et al.,  Korea Institute of Science and Technology Evaluation and Planning

The Use of Surveys in Determining Outcomes of Research, Technology and Development Programs

Suzanne Lafortune,  Performance Management Network


Session: Online Self-Instructional Tool for Managers of Research Programs

Chair: William J Valdez,  US Department of Energy


Thursday, October 27

Session: New Directions for Performance Monitoring and Evaluation: Experiences from International Agricultural Centers

Chair: Leslie J Cooksy,  University of California, Davis
Discussant: Patrick Grasso, The World Bank


Overview of Performance Monitoring and Evaluation in the International Agricultural Research Centers
Leslie J Cooksy, University of California, Davis
Sirkka Immonen, Science Council Secretariat

International Agricultural Research Center Impact Assessments: Demonstrated Benefits and Donor Demands
David A Raitzer, Science Council Secretariat
Hans Gregersen, CGIAR Science Council
Timothy Kelley, Science Council Secretariat

Learning-Oriented Evaluation in an Era of Performance Measurement and Impact Assessment
Douglas E Horton, Consultative Group on International Agricultural Research
Jamie Watts, International Plant Genetic Resources Institute



Session:  Evaluating Societal Impacts of Research and Technology Programs

Chair: Susan E Cozzens,  Georgia Institute of Technology


Feasibility of Using Project Outcomes Data to Evaluate Partnership for Innovation Program of National Science Foundation
David Roessner, Jongwon Park, James McCullough, SRI International

How to Assess the Societal Impacts of Public Research Organisations? Methods, Practices and Utilization
Pirjo Kutinlahti, Kirsi Hyytinen, VTT Technology Studies
Kaisa Lähteenmäki-Smith, Nordregio

Closing the Gap: Science and Technology, Evaluation, and Inequalities
Susan E Cozzens, Georgia Institute of Technology


Session: Performance Indicators and Assessment for Different Levels of Research and Technology Management
Chair: Jeanne W Powell,  National Institute of Standards and Technology

Assessing Efficacy of Assistive Technology Transfers: Validation of a Technology Transfer Model
Vathsala I Stone, Sajay Arthanat, Katie Beaver, State University of New York at Buffalo
Douglas J Usiak, Western New York Independent Living Center

Crossing Methodological Borders to Develop and Implement an Approach for Determining the Value of Energy Efficiency Research and Development Programs
Scott M Albert, GDS Associates
Helen Kim, New York State Energy Research and Development Authority
Rick S Ridge, Ridge & Associates
Gretchen B Jordan, Sandia National Laboratories

 

Different Levels of Performance Measures for Different Uses in Science and Technology
William G. Kennedy, U.S. Naval Research Laboratory

Developing Alternative Measures of Technical Innovation for Research Organizations
Jonathon Mote, University of Maryland


Session: Evaluating Research Dynamics and Programs Using Social Network Analysis
Chair: Caroline S Wagner,  George Washington University

 

Science Parks as Knowledge Organizations: The 'ba' in Action?
Finn Hansson, Copenhagen Business School

The Critical Link: Evaluating Research Dynamics Using Network Analysis
Caroline S Wagner, George Washington University
Frank Cunningham, European Commission

Social Network-Based Design of Collaborative Research Program Evaluation
Brian L Zuckerman, Center for Science and Technology Policy Studies
Linda E Kupfer, National Institutes of Health

Using Social Networks Methodology to Evaluate Research and Development Programs
Nicholas S Vonortas, George Washington University
Franco Malerba, L Bocconi University


Friday, October 28

 

Session: Introduction of Government-Funded Research and Development Project Evaluations and Utilization of the Evaluations in Japan

Introduction of New Energy and Industrial Technology Development Organization's Research and Development Project Evaluation System
Mutsumi Fujita, New Energy and Industrial Technology Development Organization

Essential Factors for Effective Project Management: From Analysis of New Energy and Industrial Technology Development Organization's Research and Development Project Evaluation Results
Momoko Okada, New Energy and Industrial Technology Development Organization

 

Session: Follow-Up Monitoring and Evaluation of National Research and Development in New Energy and Industrial Technology Development Organization: Results of the Full Scale Operation in First Year
Chair: Hiroshi Sano,  New Energy and Industrial Technology Development Organization

 

Results Obtained From Follow-up Monitoring
Shuji Yumitori, Hiroshi Sano, New Energy and Industrial Technology Development Organization

Results of Evaluation in the First Year
Hiroshi Sano, Shuji Yumitori, New Energy and Industrial Technology Development Organization

 

Session: Innovation Systems: Theories and Practice
Chair: Steve Montague,  Performance Management Network

 

Examination of National Innovation Systems
George Teather, Independent Consultant

Cross-Border Research and Design Evaluation: The Øresund Contracts
Isabelle Collins, Technopolis Ltd

 

Session: Cooperative Efforts in International Science and Technology Evaluation
Chair: William J Valdez,  US Department of Energy
Discussant: William J Valdez,  US Department of Energy

 

An Emerging Framework for International Research and Development Evaluation Cooperation
Peter Johnston, European Commission

World Research Evaluation Network's Critical Science and Technology Assessment of Japan Evaluation and Planning Programs
Koh Harada, Ministry of Economy, Trade and Industry

2005 South Korean/World Research Evaluation Network Workshop Results
Jae Young Lee, Korea Institute of S&T Evaluation and Planning

Unified Science and Technology Evaluation Framework
George Teather, Independent Consultant

Creating Uniform Standards for Research and Development Evaluation
Cheryl J Oros, US Department of Agriculture


Session: Let's Assess the Work Environment for Research, Not Just Outcomes

Chair: Gretchen B Jordan,  Sandia National Laboratories

 

Lessons From Seven Years of Study of the Research Environment
Gretchen B Jordan, Sandia National Laboratories

Crossing Boundaries to Assess the Academic Research Environment at New Mexico State University
Laura Haas, New Mexico State University

Assessing Research Management at Canada's National Research Council
Flavia Leung, National Research Council Canada


Session: Operating in the Pasteur's Quadrant: Evaluating Use-Inspired Basic Research Programs
Chair: Bhavya Lal,  C-STPS LLC


Session: Logic Models to Describe Diffusion of Innovations

Chair: Gretchen B Jordan,  Sandia National Laboratories
Discussant: Janice Noga,  University of Cincinnati

 

Innovation Systems, Program Theory, and Sustainability Research
Diana Bauer et al., US Environmental Protection Agency

Generic Logic Models for Federal Program Delivery and Diffusion of Innovation
John H Reed, Innovologie
John C Mortensen, Energetics
Gretchen B Jordan, Sandia National Laboratories


Session: National Research Council of Canada's Integrated Approach to Planning and Performance Management
Chair: Jennifer A Birta, National Research Council Canada

 

National Research Council of Canada's Planning and Performance Management Solution
N Bruce Baskerville, National Research Council of Canada

A Piece of the Puzzle: Results From a Recent Evaluation
Alexandra Dagger, Jennifer A. Birta, National Research Council Canada

A Piece of the Puzzle: Management Self-Assessments
Flavia Leung, National Research Council Canada

A Piece of the Puzzle: The Risk Management Picture
Frances E Isaacs, National Research Council Canada


Session: Experiences From the Use of an Innovative Quantifiable Portfolio Review Process That Blends and Employs Traditional Tools (for Example, Logic Models and Expert Reviews) to Quantify the Benefits of Research Programs
Chair: Cheryl J Oros,  US Department of Agriculture
Discussant: Cheryl J Oros,  US Department of Agriculture

 

Quantifying Expected Outcomes for Research Via the Use of Logic Models
Henry M Doan, US Department of Agriculture

Developing an Integrated Approach to Data Collection, Analysis, and Reporting for Research Evaluation
Djimé D Adoum, Bart Hewitt, US Department of Agriculture

 


Saturday, October 29

 

Session: Leveraging Competing and Complementary Roles for Success in Research and Development
Chair: Sheila A Arens,  Mid-continent Research for Education and Learning

 

Recognizing and Addressing the Realities for Partners in the Research and Development Endeavor
Zoe A Barley, Mid-continent Research for Education and Learning

Creating a Coherence With Language of a Research and Development Continuum
LeAnn M Gamache, Mid-continent Research for Education and Learning

Validity Concerns Reconciled? Clients Versus Evaluator Evidentiary Expectations
Sheila A Arens, Mid-continent Research for Education and Learning

Geniuses, Bottom Liners, and Chameleons: Complementary and Varying Roles in Educational Research and Development
Helen S Apthorp, Mid-continent Research for Education and Learning

 


Session: Evaluating the Economic Impact of Research and Technology Programs
Chair: George Teather,  Independent Consultant

 

Toward a Standard Benefit-Cost Methodology for Publicly-Funded Science and Technology Projects
Jeanne W Powell, National Institute of Standards and Technology

Approaches to Measuring the Economic Impacts of Research and Development
A Dennis Rank, Frederick Kijek, BearingPoint

The Economic Impact on Georgia of Georgia Tech's Microsystems Packaging Research Center
Sushanta K Mohapatra, Quindi Franco, David Roessner, SRI International

 


Session: Evaluating Research and Technology Programs at the National and International Level

Chair: Patries Boekholt,  Technopolis Ltd

 

Evaluation of Science Foundation Ireland: A Case Study of a Programme Designed to Enhance National Research Quality
Jim Ryan, CIRCA Group Europe
Michael Fitzgibbon, Forfás

The Role of Bibliometrics in the Performance Evaluation of National Research and Development Programs in Korea
Soon Cheon Byeon, Korea Institute of Science and Technology Evaluation and Planning

Advantages and Limitations of Trans-national Benchmarking as Element in Evaluation
Patries Boekholt, Technopolis Ltd

Scoreboarding and the Art of Biotechnology Measurement
Éric JA Archambault, Etienne Vignola-Gagné, Science-Metrix

 

Session: Meeting an Evaluation Challenge: Identifying and Overcoming Methodological Problems
Presenters: Rosalie Ruegg, TIA Consulting; Connie Chang, National Institute of Standards and Technology

  

Session: Extending Organization Capability to Evaluate Federal Environmental Research Programs
Chair: Howard Cantor,  US Environmental Protection Agency

 

Extending Organization Capability to Evaluate Federal Environmental Research Programs: Conceptual Framework
Dale Alan Pahl et al., US Environmental Protection Agency

Extending Organization Capability to Evaluate Federal Environmental Research Programs: Initial Evaluation Experience
Lorelei Kowalski, Dale Alan Pahl, US Environmental Protection Agency

Extending Organization Capability to Evaluate Federal Environmental Research Programs: Strengthening the Evaluation Culture Within and Among Participating Organizations
Emma Norland, Dale Alan Pahl, US Environmental Protection Agency

Research Contributions to Environmental Outcomes and Accountability
Rona Birnbaum et al., US Environmental Protection Agency

Session: Evaluation of Health Related Technologies and Information Systems

Chair: Sandra C Chatterton,  Health Canada

 

The Development and Piloting of a Measurement Instrument to Evaluate Staff Satisfaction With Implementation of a Clinical Information System
Brian Gugerty, Michael Maranda, University of Maryland
Dona Rook, Pepin Heart Hospital and Research Institute

The Impact of a Federally Developed Framework on the Field of Evaluation
Sandra C Chatterton, Robert Hanson, Health Canada

Potential Roles for Health Technology Assessment Agencies: Opportunities and Challenges for an Effective Health Technology Assessment Practice at the Meso Level
Chantale Lessard, Anaïs Tanon, Université de Montréal


Session: Evaluation of Biomedical Research Training & Career Development Programs: Examples from the National Institutes of Health
Chair: James Corrigan, National Institutes of Health
Discussant: Jennifer Sutton, National Institutes of Health

Outcome Evaluation of the National Cancer Institute (NCI) Career Development (K) Awards Program
Julie Mason and Jonathan Wiest, National Institutes of Health
Joshua Schnell, Thomson Reuters

Evaluating Diversity-Focused Training Programs of the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD)
Sarah Glavin et al, National Institutes of Health


Session: Strategy and Indicators for Evaluating Complex Research Portfolios: People, Projects, and Institutes
Chair: Joshua Schnell, Thomson Reuters


Research on Peer Review in CAS Institute Evaluation
Jiang-zhong Zhou et al, Chinese Academy of Sciences 

Modeling the Dissemination and Uptake of Clinical Trial Results: How Long Does Uptake Take?
Jeffrey Schouten, Fred Hutchinson Cancer Research Institute
Scott Rosas, Concept Systems Inc
Jonathan Kagan, National Institutes of Health
Marie Cope, Concept Systems Inc

The Canadian Institutes of Health Research Centers for Research Development: Contributions to Sustainable Innovation Systems
Erica DiRuggiero, Canadian Institutes of Health Research
Natalie Kishchuk, Independent Consultant
Sarah Viehbeck, Canadian Institutes of Health Research

Session: Understanding Knowledge Production Systems: Ecology, Context, and Complexity in the National Laboratories
Chair: Gretchen Jordan, 360 Innovation LLC


Organizational Learning in Large Research Environments
Aleia Clark, University of Maryland

Teams vs Organizations: The Balance of Interests in Large-scale Science
Jonathon Mote, Southern Illinois University

Good Jobs in Science: Work Organization and Work Satisfaction in a Large Research Laboratory
Bill Hadden, University of Maryland

Session: Research, Technology, and Development Evaluation in Transition in Asia
Chair: Alan Porter, Georgia Tech & Search Technology Inc


Evidence-based Governmental Decision Making Process: The Example of Agricultural S&T Priority Issues Selection Process in Taiwan
Ling Chu Lee et al, Science & Technology Policy Research and Information Center

Research Evaluation in the Complex Evaluation Ecology of Emerging Countries
Michael Braun et al, Vietnam Science & Technology Evaluation Center

The Transformation of Research Evaluation in China: From Native Mode to International Mode
Tao Dai et al, Chinese Academy of Sciences


Session: Research Programme Evaluation in China and the European Union: Recent Experiences and Future Challenges
Chair: Peter Fisch, European Commission


Research Program Evaluation in China: Recent Experiences and Future Challenges
Huaibin Xing and Xiaoyong Shi, National Center for Science and Technology Evaluation of China

Evaluation of European Union Research Programmes - Recent Experiences and Future Challenges
Peter Fisch, European Commission


Friday, October 26


Session: Assessment of Research in Emerging Technologies: Tools for Early Measurement of Impact
Chair: Juan Rogers, Georgia Tech


Nanotechnology in Building Construction: An Industry Study
Sanjay Arora, Georgia Tech

Assessing the Predictive Power of Indicators for Technical Emergence
Stephanie Bolan and Stephen Carley, Georgia Tech

The Use of Citation Speed to Understand the Effects of a Multi-Institutional Science Center
Jan Youtie, Georgia Tech

The Effect of Human Resource Concentration in Nanotechnology Centers
Juan Rogers, Georgia Tech

Session: Client-evaluator Interaction: Learning from Each Other by Doing with Each Other
Chair: George Teather, The Performance Management Network


The Client's Perspective: The Alexander von Humboldt Foundation's Approach to Evaluation
Christina Schuh, Alexander von Humboldt Foundation

The Evaluator's Perspective: Lessons from Three Evaluations with Varying Approaches to Get a Better Understanding of Individual Funding and Networking Support
Katharina Warta, Technopolis Group Austria

Session: Extracting Topics and Contributors from Abstract Records for Research Assessment
Presenter: Alan Porter, Georgia Tech and Search Technology Inc

Session: Evaluating National Institutes of Health (NIH) Research: Emerging Strategies and Findings from Evaluation of Research Portfolios 
Chair: Kristi Pettibone, National Institute of Environmental Health Sciences


Growing a Program: Using Evaluation to Understand NIEHS' Neurodegeneration Research Portfolio
Kristi Pettibone, National Institute of Environmental Health Sciences

Factors Predicting Resubmission of R01 Research Grant Applications to the National Institutes of Health (NIH)
Robin Wagner, National Institutes of Health

Session: Evaluation Frameworks for Assessing Private Sector Outcomes of Research, Technology, and Development (RTD) Public Investments
Chair: Henry Doan, U.S. Department of Agriculture


Beyond Surveys: The Research Frontier Moves to the Use of Administrative Data to Evaluate Research and Development (R&D) Grants
Oliver Herrmann, New Zealand Ministry of Business, Innovation, and Employment
Michele Morris, New Zealand Ministry for the Environment

Early Market Impact Evaluation Framework for Public-Private Research Collaborations
Gretchen Jordan, 360 Innovation LLC

Longitudinal Evaluation of Pharmaceutical Programmes
Jari Hyvarinen, Tekes

The Mid-term Evaluation on Development Program of Industrialization for Agricultural Biotechnology in Taiwan
Shan Shan Li et al, Science & Technology Policy Research and Information Center

Session: Enriching Research Assessment on Interdisciplinarity
Chair: Alan Porter, Georgia Tech and Search Technology Inc


Evaluating the Outcomes of Government Funded Research Programs: Measuring Interdisciplinarity through Bibliometric Analysis of the CMG Program
Jon Gardner, Search Technology Inc
Alan Porter, Georgia Tech and Search Technology Inc

Evaluating the Outcomes of Government Funded Research Programs: Measuring Interdisciplinarity through Text Analysis of Abstracts of Award-Derived
Christina Freyman and John Chase, SRI International
 
Enriching Educational Research Assessment: Inclusion of Books
David Schoeneck, Search Technology Inc
Gregg Solomon and James Dietz, National Science Foundation

Is Transformative Research Necessarily Interdisciplinary?
Vanessa Pena and Bhavya Lal, Science and Technology Policy Institute

Session: The Preferred Approach to Evaluating Collaborations: What's in Your Evaluation Research and Development (R&D) Tool Box?
Chair: Kathryn Graham, Alberta Innovates Health Solutions
Discussants: Gretchen Jordan, 360 Innovation LLC; Heidi Chorzempa and Andrew Lejeune, Alberta Innovates Health Solutions


Saturday, October 27

Session: Organizational Funding Portfolios and Beyond: Assessing the Full Research Landscape
Chair: Elizabeth Hsu, National Institutes of Health


Assessing the Alignment of Current Research Landscape with a Strategic Plan for Advancing Autism Research
Duane Williams and Joshua Schnell, Thomson Reuters
Sara Dodson et al, National Institutes of Health

Informing Initiative Development Through Portfolio Analysis: Assessment of the Provocative Questions
Samantha Finstad et al, National Institutes of Health
Duane Williams et al, Thomson Reuters

Assessment of Proposed Research: National Cancer Institute Provocative Questions Initiative Case Study
Elizabeth Hsu et al, National Institutes of Health
Leo DiJoseph et al, Thomson Reuters

Session: Lessons Learned From the Use of Economic Impact Analysis and Partial Benefit-Cost Analysis in Evaluations of Science and Technology Research Grant Programs
Chair: George Teather, The Performance Management Network
Discussant: George Teather, The Performance Management Network


Economic Impact Analysis of the Collaborative Research Development Grants Program
Michael Goodyear and Susan Morris, Natural Sciences and Engineering Research Council of Canada

Partial Benefit-Cost Analysis of the Strategic Project Grants Program
Anna Engman and Susan Morris, Natural Sciences and Engineering Research Council of Canada

Session: Synthesis of Benefit-Cost Impact Evaluations of R&D Programs
Chair: Alan O'Connor, RTI International


Synthesizing Benefit-Cost Studies of R&D: Lessons and Challenges from Recent Evaluation Series
Alan O'Connor, RTI International

Database Analytical Tool for Aggregating Across R&D Benefit-Cost Studies
Rosalie Ruegg, TIA Consulting

Session: Developing Outcome Indicators for Program Evaluation: Training, Team Science, and Translation of Basic Research
Chair: Joshua Schnell, Thomson Reuters


Training Program Evaluation: Creating a Composite Indicator to Measure Career Outcomes
Yvette Seger and Leo DiJoseph, Thomson Reuters

Team Science Evaluation: Developing Methods to Measure Convergence of Fields
Unni Jensen and Jodi Basner, Thomson Reuters

Measuring Longer-Term Outcomes: Testing the Feasibility of Linking Research Grant Funding to Downstream Drug Development
Duane Williams and Joshua Schnell, Thomson Reuters

Session: From Outputs to Impacts: Emerging Approaches to Track Scientific Research Impacts
Chair: Christie Drew, National Institute of Environmental Health Sciences

Greatest "HITS": A New Tool for Tracking Impacts at the National Institute of Environmental Health Sciences
Christie Drew and Kristi Pettibone, National Institute of Environmental Health Sciences

Understanding Innovation is Nurturing It
Stefano Bertuzzi and Liza Bundesen, National Institutes of Health

The Becker Model: A Framework for Quantifying Research Impact
Kristi Holmes and Cathy Sarli, Washington University in St Louis