

Michael Habib, MSLS Sr. Product Manager, Scopus habib@elsevier.com twitter.com/habib orcid.org/0000-0002-8860-7565

9 Principles of (alt)Metrics

Special Libraries Association – Vancouver, BC – June 10, 2014


PRINCIPLE 1: Metrics should be used alongside, not as a replacement for, peer review and expert opinion


PRINCIPLE 2: Be clear on the question that is being asked


• Top-down evaluation?
  - A publisher assessing a journal's performance
  - An editor seeking authors for a special issue

• Bottom-up showcase?
  - An author determining which journal to publish in
  - A researcher determining which articles to read


PRINCIPLE 3: Multiple metrics tell the most complete story


1. There is no single perfect metric to answer any question

2. Optimal decision-making requires input from at least two metrics

3. Alternative Metrics are not a replacement for citations

4. Alternative Metrics are not a replacement for usage metrics

5. Alternative Metrics are not one thing



6 classes of alternative metrics

1. Scholarly activity – counts of activity in scholarly platforms, e.g., Mendeley, CiteULike, Zotero (in use).

2. Scholarly commentary – counts of (and potentially links to) places where scholars have annotated, discussed, and reviewed articles, e.g., recognized scholarly blogs, F1000Prime reviews, Publons, PubPeer, Mendeley annotations. Potentially qualitative.

3. News/mass media – counts of and links to scholarly works found in the mass media, e.g., the NY Times, BBC, HuffPo.

4. Social activity – counts of links to research material found on generic social networks, e.g., Twitter, Facebook, Pinterest, Delicious.

5. Re-use – indicators for the re-use and citation of scholarly artifacts, primarily data.

6. Legal, commerce, and governance – indicators for the use and citation of scholarly output by government and governance, e.g., policy documents, law, patents.


Mendeley Readership Statistics – beta launched March 2014

http://blog.scopus.com/posts/mendeley-readership-statistics-available-in-scopus


Scholarly activity – indirect measurement of activity by people using scholarly platforms, e.g., Mendeley, Zotero, CiteULike.


Altmetric for Scopus – released June 2012

http://support.altmetric.com/knowledgebase/articles/83246-altmetric-for-scopus


Scholarly activity, social activity, scholarly commentary, mass media coverage

http://www.scopus.com/record/display.url?eid=2-s2.0-84866094818&origin=resultslist


Scholarly impact, scholarly activity, social activity, scholarly commentary, mass media coverage


PRINCIPLE 4: Metrics should be as simple as possible


Composite metrics should be avoided!


PRINCIPLE 5: No methodological black boxes


All calculation methods must be freely available, not proprietary.


PRINCIPLE 6: Metrics should be agnostic


PRINCIPLE 7: Disciplinary differences and other factors must be taken into account


• Normalize for differences in research behavior between fields
  - Source-Normalized Impact per Paper (SNIP) for altmetrics?
  - Scopus books expansion project
  - Zotero vs. Mendeley?

• Account for differences in size and age of the entities being measured
  - Scopus to add cited references for pre-1996 content

• Distinct disciplines will balance the contributions of metric and peer review input differently

• Cultural differences need to be accounted for
  - Twitter vs. Facebook vs. Sina Weibo
  - http://www.researchtrends.com/issue-37-june-2014/science-without-borders/ – interview with Juan Pablo Alperin (@juancommander)
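Field normalization of the kind SNIP applies to citations can, in principle, be carried over to altmetrics: divide an item's raw count by the average count for items in its field, so fields with very different baseline activity become comparable. A minimal sketch in Python (the fields and counts below are hypothetical, purely for illustration):

```python
# Field-normalized altmetric score: raw count divided by the average
# count for items in the same field. A score of 1.0 means "typical for
# the field"; values above 1.0 mean above-average attention.

def field_normalized(items):
    """items: list of (field, raw_count) pairs -> list of normalized scores."""
    # Accumulate per-field item counts and raw-count sums
    totals = {}
    for field, count in items:
        n, s = totals.get(field, (0, 0))
        totals[field] = (n + 1, s + count)
    averages = {f: s / n for f, (n, s) in totals.items()}
    # Divide each item's raw count by its field's average
    return [count / averages[field] for field, count in items]

# Hypothetical Mendeley reader counts for two fields with very
# different baseline activity levels:
items = [("physics", 10), ("physics", 30), ("biology", 100), ("biology", 300)]
print(field_normalized(items))  # [0.5, 1.5, 0.5, 1.5]
```

Note how the biology items, despite raw counts ten times higher, receive the same normalized scores as the physics items – which is the point of the exercise.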


PRINCIPLE 8: Narratives help to interpret metrics (case studies, etc.)


PRINCIPLE 9: Ownership and acceptance of metrics by communities


Snowball Metrics – the essentials

• Global standards for institutional benchmarking

• Endorsed by a group of distinguished UK universities to support their strategic decision-making

• Tried and tested, non-proprietary methodologies that are available free of charge to the higher education sector

• Absolutely clear, unambiguous definitions enable apples-to-apples comparisons, so universities can benchmark themselves against their peers to judge the excellence of their performance

Snowball Metrics are unique because:
• Universities drive this bottom-up
• Academia–industry collaboration


Snowball Metrics approach

Vision: Snowball Metrics enable benchmarking by driving quality and efficiency across higher education's research and enterprise activities, regardless of system and supplier.
• Universities endorse metrics to generate a strategic dashboard
• Draw on all data: university, proprietary, and public
• Ensure that the metrics are system-agnostic – that they can be calculated regardless of systems and data structures

Snowball Metrics Project Partners


Snowball Metrics are feasible

Note: this pilot was built for the UK project partners, and is not available more widely.


Metrics can be size-normalized


Metrics can be “sliced and diced”



Scholarly Activity


Scholarly Activity per FTE


Scholarly Activity per publication


Scholarly Commentary per publication


Social Activity


Mass Media per publication
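The per-FTE and per-publication views above are simple size normalizations: an institution's raw activity count divided by its headcount or output, so that large and small institutions can be compared fairly. A minimal sketch (all figures are made up for illustration):

```python
# Size-normalizing a raw activity count so institutions of different
# sizes can be compared. All figures below are hypothetical.

def per_capita(raw_count, denominator):
    """Raw metric divided by a size denominator (FTE or publication count)."""
    return raw_count / denominator

# Institution A is larger in absolute terms but less active per FTE:
inst_a = {"scholarly_activity": 12000, "fte": 3000, "publications": 4000}
inst_b = {"scholarly_activity": 6000, "fte": 1000, "publications": 1500}

print(per_capita(inst_a["scholarly_activity"], inst_a["fte"]))  # 4.0
print(per_capita(inst_b["scholarly_activity"], inst_b["fte"]))  # 6.0
```

The choice of denominator matters: per-FTE and per-publication rankings of the same institutions can differ, which is one reason the deck shows both views.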


Altmetrics

Focus on "societal-economic" impact – Snowball Metrics endorse alternative metrics, to be shared in Recipe Book v2.

[Research lifecycle diagram: "Inputs: Enabling Research" – "Processes: Doing Research" – "Outputs & Outcomes: Sharing Research". Activities shown include: search, discover, read, review; synthesize/analyze; experiment; recruit/evaluate researchers; secure and manage funding; manage facilities; publish and disseminate; manage data; promote and showcase (esteem); commercialize; collaborate and network; establish partnerships; develop strategy; benefit – leading to societal-economic impact.]

Scholarly Activity: number of times that an institution's output has been posted in online tools that are typically used by academic scholars, e.g., Mendeley, CiteULike, Academia.edu, ResearchGate

Scholarly Commentary: number of times that an institution's output has been commented on in online tools that are typically used by academic scholars, e.g., science blogs, video posts, peer reviews, post-publication comments, Wikipedia

Mass Media: number of times that an institution's output has been referred to by press clippings and news websites

Social Activity: number of times that an institution's output has stimulated social media posts, e.g., Facebook, Twitter, Reddit, LinkedIn


Euan Adie, Founder of Altmetric.com

"We're very supportive of Snowball Metrics and of the worthy principles behind it, so I'm very happy to see the project partners take the initiative and put forward practical standards for altmetrics in the context of benchmarking. The approach taken is a smart one – structuring the way you look at altmetrics like this is a good way of maximizing the insights that the data can provide. We're working towards including the recipe in our own reporting tools, to make it as easy as possible to put into practice."


PRINCIPLE 9: Ownership and acceptance of metrics by communities
1. That will both use them to assess performance
2. Whose performance will be assessed


9 Principles of (alt)Metrics

1. Metrics should be used alongside, not as a replacement for, peer review and expert opinion
2. Be clear on the question that is being asked
3. Multiple metrics tell the most complete story
4. Metrics should be as simple as possible
5. No methodological black boxes
6. Metrics should be agnostic
7. Disciplinary differences and other factors must be taken into account
8. Narratives help to interpret metrics (case studies, etc.)
9. Ownership and acceptance of metrics by communities


www.elsevier.com/research-intelligence

Thank you!

Michael Habib, MSLS Sr. Product Manager, Scopus habib@elsevier.com twitter.com/habib orcid.org/0000-0002-8860-7565

www.researchtrends.com/category/issue-37-june-2014/ – Research Trends, Issue 37, June 2014