Relevant metrics for measuring research impact
Christina Locke, PhD
Project Coordinator, Beyond the Academy
Cambridge Workshop, 29 May 2019
Overview
Section I: Overview of academic metrics & status of current literature, followed by group discussion
Section II: Example frameworks, followed by group discussion
Section I: Overview of metrics used in academia and current status of literature
Image: The University of Salford Blog
We are defining impact as scholarship or leadership that advances sustainability in a diverse and changing world.
Why measure research impact?
Adam et al. 2018. ISRIA statement: ten-point guidelines. Health Res Policy Sy.
Sustainability depends on evaluation. How do we know if sustainability solutions are working if we don’t assess them?
Each metric is a proxy measure:
• Publication counts → productivity
• Citation indices (e.g. h-index) → scholarly influence (positive or negative)
• Journal impact factors → prestige
• Grant income → research environment
• Graduate student degrees awarded → research environment
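The h-index mentioned above is defined as the largest h such that a researcher has h papers each cited at least h times. A minimal sketch of the computation:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # this paper still clears the threshold
            h = rank
        else:
            break
    return h

# Five papers with these citation counts: four of them have >= 4 citations
print(h_index([10, 8, 5, 4, 3]))  # 4
```

The sort-and-scan approach makes the definition visible: walk down the ranked list until a paper's citation count falls below its rank.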
Metrics measure… what exactly?
Measuring real-world impact requires better tools.
Attempts at measuring external influence
• Web-based metrics
• “Altmetrics” – social media focused
• Weaknesses: can be easily gamed; measure attention rather than influence
• Number and type of “occasions of influence”
• Track communications with businesses, government agencies, NGOs
• Potential to connect these encounters with eventual policy change
What about qualitative assessment of research impact?
blog.optimizely.com
Expert panel review
Researchers are asked to submit narratives or case studies of their work and draw connections to outcomes and impacts.
Examples:
• Manuscript review
• Grant review
• Tenure review process
• Research Excellence Framework (REF) national assessment
• “Rule of 5” impact statements
Flickr/AJ Cann, CC BY-SA
Expert panel review - strengths
• Nuanced. Social/public impact can be assessed holistically.
• Flexible. Foundational knowledge differs by discipline, and an organization can decide to emphasize real-world impact.
• Can capture delayed impacts.
Expert panel review - weaknesses
• Slow and expensive (though most costs are hidden)
• Subject to biases of reviewers
• Lack of transparency
• Incentivizes researchers to “reach” for causation
“[Expert] review is not perfect, but it is the least worst form of academic governance we have, and should continue to be the primary basis for assessing research papers, proposals and individuals, and for national assessments like the REF.”
“However, carefully selected and applied quantitative indicators can be a useful complement to other forms of decision making.”
“Metrics should support, not supplant, expert judgement.”
The Metric Tide (Wilsdon 2015)
“Baskets” of indicators (qualitative and quantitative) provide the best way forward.
Increasing impact
[Figure: attention → uptake → influence, in order of increasing impact, with metrics arrayed along this spectrum.]
San Francisco Declaration on Research Assessment (DORA) – sfdora.org
Recommendations for publishers, institutions, funding orgs, and researchers. Primary themes:
• Eliminate the use of journal-based metrics
• Assess research on its own merits rather than on the basis of the journal
• Capitalize on the opportunities provided by online publication
1) Quantitative evaluation should support qualitative, expert assessment.
2) Measure performance against the research missions of the institution, group or researcher.
3) Protect excellence in locally relevant research.
4) Keep data collection and analytical processes open, transparent and simple.
5) Allow those evaluated to verify data and analysis.
6) Account for variation by field in publication and citation practices.
7) Base assessment of individual researchers on a qualitative judgement of their portfolio.
8) Avoid misplaced concreteness and false precision.
9) Recognize the systemic effects of assessment and indicators.
10) Scrutinize indicators regularly and update them.
Hicks et al. 2015
[Figure: ISRIA wheel of ten guideline themes – purpose; context; stakeholders’ needs; stakeholder engagement; conceptual frameworks; methods & data sources; indicators & metrics; ethics & conflicts of interest; communication; community of practice.]
International School on Research Impact Assessment (ISRIA) guidelines, 2018
Section I take-home messages
• It remains difficult to establish cause and effect of research impact, but qualitative research narratives can be compelling
• Journal impact factors should not be used to evaluate research
• Other quantitative measures can be useful, but should support qualitative, expert assessment
• Plenty of guidelines exist for the responsible use of metrics:
• Declaration on Research Assessment (DORA) – sfdora.org
• The Leiden Manifesto – leidenmanifesto.org
• ISRIA guidelines – www.theinternationalschoolonria.com
• The Metric Tide – responsiblemetrics.org
*GROUP DISCUSSION*
Section II: Example frameworks
1. Revisit ISRIA guidelines for research impact assessment
2. Impact Survey
3. Impact Compass
Guidelines can apply to tenure review processes, grant and manuscript review, assessment of research case studies.
What is the logic model behind the research? How does the research lead to impacts?
“Impact Framework” used by the Commonwealth Scientific & Industrial Research Organisation (Australia)
https://www.csiro.au/en/About/Our-impact/Our-impact-model
Similar logic-model frameworks have been used to measure UK health impacts
Responsible use of indicators and metrics to illustrate impact
Examples…
Impact survey (London School of Economics)
• Researchers keep an “impact file” to document occasions of influence
• Can be reported to departments or standardized across departments
• Can track impact trends over time
https://blogs.lse.ac.uk/impactofsocialsciences/the-handbook/chapter-9-expanding-external-research-impacts/
Image: Sno-Isle Libraries
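An “impact file” of this kind is essentially a structured log of occasions of influence that can be tallied over time. A hypothetical sketch (the field names and audience categories here are assumptions for illustration, not the LSE handbook’s schema):

```python
from dataclasses import dataclass, field
from datetime import date
from collections import Counter

@dataclass
class Occasion:
    when: date
    audience: str   # e.g. "government agency", "NGO", "business"
    channel: str    # e.g. "policy briefing", "media interview", "consultation"

@dataclass
class ImpactFile:
    occasions: list = field(default_factory=list)

    def record(self, occasion):
        """Append one documented occasion of influence."""
        self.occasions.append(occasion)

    def by_audience(self):
        """Tally occasions of influence by audience type."""
        return Counter(o.audience for o in self.occasions)

f = ImpactFile()
f.record(Occasion(date(2019, 3, 1), "government agency", "policy briefing"))
f.record(Occasion(date(2019, 4, 12), "NGO", "consultation"))
f.record(Occasion(date(2019, 5, 2), "government agency", "media interview"))
print(f.by_audience())  # Counter({'government agency': 2, 'NGO': 1})
```

Keeping records in a common shape like this is what makes the file reportable to departments and comparable across them.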
Impact Compass (Stanford Business)
Designed for business students deciding among companies to work for
https://www.gsb.stanford.edu/faculty-research/centers-initiatives/csi/impact-compass
Can be adapted to assess research impact, or researcher, department, or institution performance
Three Big No’s:
• Negative social outcome
• Unethical behavior
• Proven failure
Impact Compass
• 6 dimensions, each scaled 1-3
• Points are multiplied to calculate the final score (min score = 1, max score = 729)
• “A lower impact potential score doesn’t necessarily indicate a less worthy opportunity.”
• Flexible – can be adapted to assess individuals, departments, institutions.
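The multiplicative scoring works out as follows: six dimensions scored 1-3 and multiplied together give a minimum of 1 and a maximum of 3**6 = 729. A minimal sketch (the function name and dimension ordering are illustrative, not Stanford’s implementation):

```python
from math import prod

DIMENSIONS = ("value to society", "efficacy", "impact magnitude",
              "environmental & social governance", "scalability",
              "mission alignment")

def compass_score(scores):
    """Multiply six 1-3 dimension scores into a single 1-729 total."""
    if len(scores) != len(DIMENSIONS):
        raise ValueError("expected one score per dimension")
    if any(s not in (1, 2, 3) for s in scores):
        raise ValueError("each dimension is scored 1, 2, or 3")
    return prod(scores)

print(compass_score([1, 1, 1, 1, 1, 1]))  # 1   (minimum)
print(compass_score([3, 3, 3, 3, 3, 3]))  # 729 (maximum)
print(compass_score([3, 2, 1, 2, 3, 2]))  # 72
```

Note that multiplication (rather than addition) makes a single low dimension drag the whole score down sharply, which is why a low score need not mean a less worthy opportunity.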
[Figure: Impact Compass wheel – six dimensions (Value to Society, Efficacy, Impact Magnitude, Environmental & Social Governance, Scalability, Mission Alignment), each scored 1-3 against anchor labels such as “No harm done”, “Positive contribution”, “Advances social progress”, “Unknown but promising”, “Some progress”, “Solutions that work”, “Problem solved”, “Local”, “Global”, “Strong public stewardship”, “Impact considered”, and “Impact prioritized”.]
Section II take-home messages
• Literature on research impact assessment is maturing.
• Conceptual frameworks / logic models for tying impacts to research are in use by many organizations across the world.
• Appropriate indicators and metrics should be chosen to suit the context and purpose of research.
• Useful metric frameworks from non-academic sectors can be adapted to research contexts.
*GROUP DISCUSSION*
References
• Adam, Paula et al. (2018). ISRIA statement: ten-point guidelines for an effective process of research impact assessment. Health Research Policy and Systems 16:8. DOI 10.1186/s12961-018-0281-5
• Bishop, Dorothy. What are metrics good for? Reflections on the Research Excellence Framework (REF) and Teaching Excellence Framework (TEF) https://www.slideshare.net/deevybishop/what-are-metrics-good-for-reflections-on-ref-and-tef
• Bornmann, L. & Haunschild, R. Scientometrics (2016) 107: 1405. https://doi.org/10.1007/s11192-016-1893-6
• Côté IM, and Darling ES. (2018). Scientists on Twitter: Preaching to the choir or singing from the rooftops?. FACETS 3: 682–694. https://doi.org/10.1139/facets-2018-0002
• Hicks, D. et al. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature 520: 429–431.
• Liang, X. et al. (2014). Building Buzz: (Scientists) Communicating Science in New Media Environments. Journalism & Mass Communication Quarterly, 91(4), 772–791. https://doi.org/10.1177/1077699014550092
• London School of Economics Impact Blog. https://blogs.lse.ac.uk/impactofsocialsciences/the-handbook/chapter-9-expanding-external-research-impacts/
• Priem, J., Piwowar, H., and Hemminger, B. (2012). Altmetrics in the wild: Using social media to explore scholarly impact. Retrieved from http://arXiv.org/html/1203.4745v1
• Thelwall, M., Haustein, S., Larivière, V., and Sugimoto, C. (2013). Do altmetrics work? Twitter and ten other candidates. PLOS ONE. 8(5), e64841.
• Ravenscroft et al. (2017). Measuring scientific impact beyond academia: An assessment of existing impact metrics and proposed improvements. PLOS ONE.
• Wilsdon, James. Time for a stern hard look at the REF. https://wonkhe.com/blogs/time-for-a-stern-hard-look-at-the-ref/
• Wilsdon, James et al. (2015). The metric tide: Report of the independent review of the role of metrics in research assessment and management.