Evaluating the Big Deal: Usage Statistics for Decision Making


Presentation delivered at the UKSG Usage Statistics for Decision Making workshop, held at the Institute of Materials, Minerals and Mining, London, 2 February 2012.


Evaluating the Big Deal: What metrics matter?

Usage Statistics for Decision Making, London, 2 February 2012

Selena Killick, Library Quality Officer

Introduction

• Institutional, financial and strategic context

• Evaluating the Big Deals

• Quantitative Reporting

• Qualitative Measures

• Using the Results

• Conclusions

Cranfield University

• The UK's only wholly postgraduate university focused on science, technology, engineering and management

• One of the UK's top five research-intensive universities

• Annual turnover £150m

• 40% of our students study whilst in employment

• We deliver the UK Ministry of Defence's largest educational contract

Key Drivers

• Financial realities

• Demonstrating value for money

• Strategic alignment

• Research Excellence Framework (REF)

• Income

• Reputation

Mission critical

Expenditure on Journals

[Chart: journal spend by year, 2006-07 to 2009-10]

Information Expenditure by Format 2010-11

• Books 4%

• eBooks 4%

• Journals 68%

• Databases 24%

Information Expenditure by Format 2010-11

• Books 4%

• eBooks 4%

• Journals 31%

• Big Deals 37%

• Databases 24%

Evaluating the 'Big Deals'

Previous Techniques Used:

Annual journals review using the following data:

• Circulation figures – issues and renewals

• “Sweep survey” to capture in-house use

• Journal contents page requests

• Download figures

• Journal prices vs the cost of ILL requests

More recent focus on “cost per download”

New Approach

Quantitative:

• Size

• Usage

• Coverage

• Value for Money

Qualitative:

• Academic Liaison

• Reading Lists Review

• REF Preferred

Requirements

• Systematic

• Sustainable

• Internal benchmarking

• Elevator pitch

• So what?

• Enable informed decision making

• Demonstrate smart procurement

Quantitative Reporting

Brought to you by the letters…

&

Our Approach

• What has everyone else done?

• Analysing Publisher Deals Project

• Storage centre

• Excel training

• Template design

Basic Metrics

• Number of titles within a package

• Total annual full-text downloads

• Cost:

• Core titles

• e-Access Fee

• Total costs

Value Metrics

• Average number of requests per title

• Average cost per title

• Total cost as % of information provision expenditure

• Cost per full-text download

• Average download per FTE student/staff/total

• Average cost per FTE student/staff/total
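The value metrics above are simple ratios over a package's headline figures. A minimal sketch, with invented numbers (not Cranfield data), of how they might be computed:

```python
# Illustrative calculation of package-level value metrics.
# All figures below are invented for the example.

def package_metrics(total_cost, titles, downloads, fte_total):
    """Return the headline value metrics for one Big Deal package."""
    return {
        "avg_downloads_per_title": downloads / titles,
        "avg_cost_per_title": total_cost / titles,
        "cost_per_download": total_cost / downloads,
        "avg_downloads_per_fte": downloads / fte_total,
        "avg_cost_per_fte": total_cost / fte_total,
    }

m = package_metrics(total_cost=120_000, titles=2_000,
                    downloads=80_000, fte_total=4_000)
print(f"Cost per download: £{m['cost_per_download']:.2f}")  # £1.50
```

The same function can be run over each deal in turn, which is what makes internal benchmarking between packages straightforward.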

The Long Tail

[Charts: downloads plotted against titles for three usage distributions: Long Tail, Short Tail, No Tail]

Subscribed Titles

• Reviewing performance of core collection

• REF Preferred?

• Popular?

• Three year trends in cost / downloads / CPD

• Cost / Downloads / CPD categorised as:

• Zero

• Low

• Medium

• High

• Cancel?
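The Zero/Low/Medium/High banding above can be sketched as a simple lookup. The thresholds here are invented for illustration; as the Considerations slide notes, what constitutes Low/Medium/High is an open question each library must settle for itself:

```python
# Band subscribed titles by annual full-text downloads.
# The low/high thresholds are assumptions, not values from the talk.

def usage_band(downloads, low=50, high=500):
    """Classify a title's annual downloads as Zero/Low/Medium/High."""
    if downloads == 0:
        return "Zero"
    if downloads < low:
        return "Low"
    if downloads < high:
        return "Medium"
    return "High"

# Invented ISSNs and counts for four titles:
titles = {"0007-1234": 0, "0020-5678": 12, "0264-9999": 310, "1460-0001": 1250}
bands = {issn: usage_band(n) for issn, n in titles.items()}
print(bands)
```

The same banding can be applied to cost and cost-per-download, giving the three categorised trend lines the slide describes.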

Popular Titles

• Which titles are the most popular?

• Top 30 titles in the package

• Three year trends in downloads

• REF Preferred?

• Subscribed title?

Considerations

• When to measure from/to? Calendar, financial/academic, or contract year?

• Which titles make up our core collection?

• Do we have access to all of the 'zero use' titles?

• What constitutes Low/Medium/High?

• What about the aggregator usage statistics?

• Do we trust the usage statistics?

• What is the size of the target population?

Capturing the data

Downloading Statistics

• Get organised

• Gather your usernames and passwords

• Create local files to save and store usage reports

• Software now on the market to manage this for you

• Joint Usage Statistics Portal

Introductory Workshop: 18 April, Birmingham

Reporting Period

• Calendar, financial/academic, or contract year?

• COUNTER Reports = Calendar year

• Converted using VLOOKUP on ISSNs

• Manual

• Problematic

• Automatically converted using JUSP
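The calendar-to-contract-year conversion above amounts to summing part of one COUNTER report and part of the next. A minimal sketch, with invented monthly figures, assuming an August-to-July academic year:

```python
# Re-base calendar-year COUNTER JR1 monthly counts onto an academic
# year (Aug-Jul) -- the manual conversion that JUSP now automates.
# The monthly figures below are invented.

def academic_year_total(jr1_by_year, start_year):
    """Sum Aug-Dec of start_year plus Jan-Jul of the next calendar year.

    jr1_by_year maps calendar year -> list of 12 monthly full-text
    download counts (index 0 = January) for one title.
    """
    aug_dec = sum(jr1_by_year[start_year][7:])      # Aug..Dec
    jan_jul = sum(jr1_by_year[start_year + 1][:7])  # Jan..Jul
    return aug_dec + jan_jul

jr1 = {
    2010: [10] * 12,  # calendar-year report for 2010
    2011: [20] * 12,  # calendar-year report for 2011
}
print(academic_year_total(jr1, 2010))  # 5*10 + 7*20 = 190
```

Doing this by hand across thousands of ISSNs is exactly why the slides call the VLOOKUP route manual and problematic.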

Aggregator Usage Statistics

• Combining usage from publishers and aggregators at a title level

• Combined using Pivot Tables

• Manual

• Problematic

• Where possible combined using JUSP
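Combining publisher and aggregator counts at title level is a sum keyed on ISSN, which is what the pivot tables do. A small sketch with invented ISSNs:

```python
# Merge full-text downloads from a publisher report and an aggregator
# report at title level, keyed on ISSN. Data is invented.
from collections import Counter

publisher = {"0007-1234": 150, "0020-5678": 40}
aggregator = {"0020-5678": 25, "1460-0001": 60}

# Counter addition sums counts for shared ISSNs and keeps the rest.
combined = Counter(publisher) + Counter(aggregator)
print(dict(combined))
```

Titles appearing only via the aggregator survive the merge, which matters when judging whether a "zero use" package title is genuinely unused.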

Analysing the data

Excel Template

• Two main data sources:

• COUNTER JR1

• Subscription agent financial report

• Automated as much as possible

• Match formulas working with ISSNs to link title price to usage/holdings

• All calculations are completed automatically when the data sources are added
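The template's ISSN matching can be sketched as a dictionary lookup: the Python analogue of the MATCH/VLOOKUP formulas joining the agent's price list to JR1 usage. All prices, ISSNs, and counts below are invented:

```python
# Link subscription-agent prices to JR1 usage on ISSN and derive
# cost-per-download per title. All data is invented for illustration.

prices = {"0007-1234": 420.00, "0020-5678": 385.00}           # agent report
usage = {"0007-1234": 150, "0020-5678": 0, "1460-0001": 60}   # JR1 downloads

report = []
for issn, downloads in usage.items():
    price = prices.get(issn)  # None -> not an individually subscribed title
    cpd = price / downloads if price and downloads else None
    report.append((issn, price, downloads, cpd))

for row in report:
    print(row)
```

Rows with a price but no cost-per-download flag zero-use subscribed titles; rows with no price are package-only titles, the two cases the Subscribed Titles and Popular Titles slides examine.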

Quantitative Reporting

• Systematic

• Sustainable

• Internal benchmarking

• Elevator pitch

• So what?

• Enable informed decision making

• Demonstrate smart procurement

Qualitative Measures

Academic Liaison

• Who's using it?

• Why?

• How?

• What will be the impact if we cancel?

• Teaching?

• Research?

• How valuable is it?

Quantitative on the Qualitative:

Analysis of the five REF Preferred Recommended Journals Lists:

• Overlapping titles

• Unsubscribed titles

• Financial shortfall

• Current recommended subscribed titles

• Usage data

Reading List Review

Qualitative analysis on course reading lists:

• What are our academics recommending?

• Where is it published?

• How often is it recommended?

• Are there alternatives?

Quantitative & Qualitative Reporting

• Systematic

• Sustainable

• Internal benchmarking

• Elevator pitch

• So what?

• Enable informed decision making

• Demonstrate smart procurement

Using the results

What they can do:

• Both qualitative and quantitative measures tell the story of the resource

• Aid decision making

• Justify procurement

• Safeguard budgets…?

What they can’t do:

Conclusions

Closing thoughts

• Is it worth investing in this?

• Qualitative & Quantitative

• Danger of relying on cost-per-download

Looking Ahead

• Review of all budgets

• All Resources

• Systems

• Staff

• Services

• Demonstrating Value and Impact

• Resources

• Services

Thank You

Selena Killick
Cranfield University
s.a.killick@cranfield.ac.uk
Tel: 01793 785561