
Library Entry
Session 161 AEA 2013 - Harnessing Big Data in Higher Education - Evaluators as Data Scientists

This handout summarizes the key points presented in this session along with references and links to documentation. #highereducation #bigdata


Library Entry
EVAL 2015: Multipaper Session 1379, Strengthening the Research Evaluation Infrastructure—Perspectives from Practitioners

Abstract 2 Title: How to Build a Research Evaluation Team in the Era of Big Data

Presentation Abstract 2: The evaluation of research programs has a number of well-known methodological challenges. In recent years, a new challenge has emerged: huge numbers of electronically available data sources, such as publications and citations, require staff with the expertise and skills needed to evaluate these “big data.”

2 attachments

Focus Search - How to Build a Research Evaluation Team in the Era of Big Data

Library Entry
Eval13 Session 304: The Evaluator and the Right Brain

In his bestselling book, A Whole New Mind, Daniel Pink argues that left-brain analytical skills are no longer sufficient to thrive professionally (in the developed world). Instead, we need more from our right-brain, our creative, empathetic, big picture side. Why? Abundance, automation, and...

The Evaluator and the Right Brain.pdf

Library Entry
2016 AEA New Approaches to the Design and Evaluation of Global Programs to End Modern Slavery: Establishing an Evidence Base and Understanding What Works

Three presentations to AEA 2016 Conference (Session ID: 2280) - New Approaches to the Design and Evaluation of Global Programs to End Modern Slavery: Establishing an Evidence Base and Understanding What Works #SocialImpact #Eval2016 #HumanRights #AEA2016Conference #FreetheSlaves ...

3 attachments

Focus Search - A look at the role of 'big data' in determining where and how an anti-slavery initiative would be most successful

Library Entry
AEA 2016 RTD Session 1770: Using the Law of Unintended Consequences to Promote Desired Behavior: How Can Evaluation Metrics Influence Research Data Sharing?

As evaluators, we often fear changing the behavior of participants in the program we are evaluating by putting metrics around their activities. An example is the use of publications in measuring the "success" of an investigator, influencing tenure decisions and the skyrocketing number of...

5 attachments

Focus Search - Panel Session 1770, American Evaluation Association, Evaluation 2016, October 29, 2016

Presenters:
- Danielle Rutherford – When Data Sharing is the Primary Goal: the Case of UNAVCO and Geodetic Data
- Danielle Daee – Understanding the Value of Data Sharing Within Epidemiology Programs
- Sharon Williams – Preliminary Overview of Data Sharing Practices Across NCI-Funded Cancer Epidemiology Cohorts
- Elizabeth Hsu – Breaking Down the Silos: #IAmAResearchParasite

US Federal Government Initiatives Driving Data Sharing:
- February 2013 – White House initiative (“Holdren memo”) to increase access to publications and results of federally funded research
- May 2013 – OMB memo (“M-13-13”) on open data policy and managing information as an asset
- July 2011/September 2015 – Proposed Common Rule revisions support broad consent to maximize the utility of bio-specimens and data
- NIH Big Data to Knowledge (BD2K) Initiative supports the broad use of digital assets to enhance the utility of biomedical data and accelerate discovery
- January 2015 – Precision Medicine Initiative: patient-powered research and care that accounts for individual differences
- January 2016 – Vice President’s National Cancer Moonshot specifically calls for enhanced data sharing

Motivation:
- Evaluators often attempt to remain separate so as not to influence behavior
- The act of measurement ultimately results in behavior change
- Evaluators must be prudent when deciding to deploy new metrics
- Cases may exist where it is desirable to influence the behavior of participants
- There is increasing desire from biomedical research funders to ensure that funding recipients are sharing research data

Library Entry
Eval12 Session 560: Crafting Powerful Reports and Presentations - Strategies for Improving Communication in Evaluation

Presentation Abstract: In evaluation, reporting is the mechanism through which evaluators translate complex data into understandable information. It sets the stage for the utilization of evaluation findings, and therefore is a critical skill for all evaluators. In recent years, there has been a...

2 attachments

Focus Search - Reports which relay all the data but don’t ever get to answers are like a big data dump