Posters

Attachment(s)
pdf file
1504: Using Modelling to Identify Matched Groups in Evalu...   1.11MB   1 version
Uploaded - 01-05-2017
The poster will describe a novel approach that incorporates Bayesian modelling into matching algorithms to create matched groups for the evaluation of a National Institutes of Health biomedical research program. Evaluators are often interested in comparing two or more groups that differ only in terms of an intervention, and they employ matching algorithms when resource or methodological constraints preclude random assignment. For example, assessing the impact of research funded through a mechanism targeting innovative proposals, compared to proposals funded through traditional mechanisms, requires matching on variables such as area of science. Matching on area of science may require analysis of the text of the grant application, and traditional matching algorithms cannot incorporate textual data. A set of research topics was derived using Latent Dirichlet Allocation (LDA) topic modelling and incorporated into a matching algorithm to create a matched set of participants. An introduction to this approach is provided, along with lessons learned.
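The abstract does not specify the matching algorithm used; as a minimal sketch of the general idea, the snippet below assumes each grant has already been reduced to a topic-proportion vector (e.g., by an LDA model over application text) and pairs each treated unit with its nearest unmatched control by Euclidean distance. The data values and the greedy matching rule are illustrative assumptions, not the poster's method.

```python
import math

def topic_distance(a, b):
    # Euclidean distance between two topic-proportion vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def greedy_match(treated, controls):
    """Greedily pair each treated unit with its nearest unmatched control.

    treated, controls: dicts mapping unit id -> topic-proportion vector.
    Returns a dict {treated_id: control_id}; matching is without replacement.
    """
    available = dict(controls)
    matches = {}
    for t_id, t_vec in treated.items():
        c_id = min(available, key=lambda c: topic_distance(t_vec, available[c]))
        matches[t_id] = c_id
        del available[c_id]
    return matches

# Hypothetical topic proportions for illustration only
treated = {"T1": [0.7, 0.2, 0.1], "T2": [0.1, 0.1, 0.8]}
controls = {"C1": [0.65, 0.25, 0.10], "C2": [0.2, 0.1, 0.7], "C3": [0.3, 0.4, 0.3]}

print(greedy_match(treated, controls))  # -> {'T1': 'C1', 'T2': 'C2'}
```

In practice the topic proportions would typically be combined with conventional covariates (budget, institution type, career stage) inside a single distance or propensity-score model rather than matched on alone.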
pdf file
1658: A clue of how the citizens could join in R&D projec...   2.17MB   1 version
Uploaded - 01-05-2017
Qualitative evaluation methods for R&D projects in Korea are still under development. In particular, strengthening the usefulness of social problem-solving R&D projects requires quality assessment with the participation of stakeholders, including citizens as the final consumers of the technology. Currently, however, citizens have no way to take part in R&D project evaluation. Technology assessment (TA) has been conducted in Korea since 2003, and a 'Civic Forum' within the TA has run since 2006; citizens take part as panels assessing a technology's effects in various dimensions (economy, society, environment, etc.). In the 2014 TA, a survey examined what influences citizens' views, and a comparative analysis investigated how those views differ from those of experts and the media. The results suggest how citizens could participate in R&D project evaluation and thereby improve quality assessment methods.
pdf file
2736: Dimensions for Funders: Software to aid in the use ...   1.76MB   1 version
Uploaded - 01-05-2017
In this paper we present a new tool, Dimensions for Funders, that facilitates real-time display of research investments based on roughly $1 trillion of research grants from hundreds of funding bodies globally. Dimensions for Funders supports analysis by existing categorization schemes, such as the National Institutes of Health Research, Condition, and Disease Categorization (RCDC) and the Australian Bureau of Statistics Fields of Research (FOR), as well as the creation and sharing of custom categorization schemes, allowing greater transparency and reproducibility when reporting the level of research investment in a given area.
pptx file
2000: Evaluation of the NCI Research Resources Tool   495K   1 version
Uploaded - 01-05-2017
The National Cancer Institute's (NCI) Research Resources is a web-based centralized directory providing access to numerous NCI-supported research tools. Its purpose is to expedite cancer research by providing investigators with access to cost-effective resources (e.g., reagents, mice). Recently, NCI evaluated the use of Research Resources to determine strengths, weaknesses, and opportunities for improvement, in order to develop an evidence base in support of decisions about future investments. This evaluation addressed six key questions using both usability testing and heuristic evaluation. Usability testing involved collecting data through in-person interviews, while heuristic assessment employed a two-pronged approach: performing an environmental scan of the research-directory landscape and using Nielsen's heuristic criteria to evaluate Research Resources independently and in comparison to similar web directories. Key findings from these evaluation methods will be summarized, lessons learned will be noted, and the status and future plans for Research Resources will be discussed.
pptx file
1386: Forecasting a Country-Dependent Technology Growth b...   2.41MB   1 version
Uploaded - 01-05-2017
To establish a proper S&T strategy, it is important to understand the current technological level accurately and to predict future change. In this study, country-dependent technology growth curves are proposed and verified by investigating several types of technology growth curves measured by the Korean government in 2008 and 2010. These curves showed three representative features. First, both curves show the same growth pattern. Second, the curve measured in 2010 is systematically shifted from the curve measured in 2008. Third, the 2010 curve shows faster growth than the 2008 curve. These differences were explained by introducing a country-dependent technology growth curve. The technology growth curves of Korea, the U.S.A., and China show very different technology development speeds. Using this model, more accurate predictions of future technological change and a more appropriate S&T development strategy are expected.
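The abstract does not give the functional form of the growth curves; a common choice for technology maturity is a logistic (S-shaped) curve, where a country-dependent shift of the inflection year and a country-dependent rate reproduce the "same pattern, shifted and faster" behavior described above. The sketch below uses that assumption with made-up parameters, purely for illustration.

```python
import math

def logistic_level(t, t0, k, L=100.0):
    """Logistic technology-growth curve: maturity level (percent) at year t.

    t0 : inflection year (country-dependent shift)
    k  : growth rate (country-dependent speed)
    L  : saturation level (100 = fully mature technology)
    """
    return L / (1.0 + math.exp(-k * (t - t0)))

# Hypothetical (t0, k) per country -- illustrative values, not the study's fits
countries = {"U.S.A.": (2000, 0.15), "Korea": (2008, 0.20), "China": (2014, 0.25)}

for name, (t0, k) in countries.items():
    levels = [round(logistic_level(year, t0, k), 1) for year in (2008, 2010)]
    print(f"{name}: 2008 = {levels[0]}%, 2010 = {levels[1]}%")
```

Under this form, comparing the same technology across two measurement years amounts to moving along one country's curve, while comparing countries amounts to comparing their (t0, k) pairs.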
pdf file
2332: Bibliometrics: a Key Performance Indicator in Asses...   1.28MB   1 version
Uploaded - 01-05-2017
Bibliometrics, the quantitative evaluation of publication and citation data, is one type of indicator of the productivity, influence, collaboration, and reach of scientific programs. Using research publications from programs funded by the National Institutes of Health (NIH) Common Fund, this presentation will focus on (1) the feasibility and utility of bibliometrics as a performance indicator for research programs, and (2) how bibliometrics integrates with other methods used to evaluate biomedical research programs. Challenges and lessons learned from using bibliometrics as a performance indicator for NIH Common Fund programs will be discussed. Selected results, including bibliometric data generated by Web of Science and a new measure, the relative citation ratio, will be explored, along with the implications of these results for scientific productivity and influence. Evaluators who assess research and technology programs will benefit from the experiences of the NIH Common Fund in using bibliometrics as one of several program assessment tools.
