Trials by Juries: Suggested Practices for Database Trials


Presentation at the ER&L Conference, 2012.


Annis Lee Adams, Golden Gate University
Jon Ritterbush, University of Nebraska - Kearney
Christine E. Ryan, Clemson University

Tuesday, April 3, 2012

This panel discussion topic began with an innocent question on the ERIL-L listserv in September 2011:

What tools or techniques have worked for you in gathering feedback on database trials, whether from librarians or library users?

Annis Lee Adams, Golden Gate University

Does the content directly support our programs?

Does the platform allow for IP and proxy server authentication?

Our FY budget must be completed by March, so we try to trial things in the fall and winter, but will trial all year long.

Collection Development Librarians

Faculty, as appropriate. Note: we only involve faculty if we know we can afford the product.

Email announcement to CD librarians, who can forward to their faculty. Discussion at CD meetings (which are held twice monthly).

Content

Ease of use/functionality

Cost

Did a faculty member request it?

Internal Blog on WordPress

Information about how to access trial

Cost information

Description of product(s)

Evaluators comment on why or why not to acquire

After trial ends, change blog entry to “draft” to keep for historical records

Spreadsheet to track all product investigations, decisions made and why.

Jon Ritterbush, University of Nebraska-Kearney

A. What criteria apply to selecting product trials?

B. How are trials scheduled?

C. How is trial feedback solicited and recorded?

A. What criteria apply to selecting product trials?

1. Would this product meet the standards of our E-Resource Collection Development Policy?

IP access?

Support the curriculum?

2. Can we realistically afford this product in the near future?

Yes – continue with consideration of trial

No – may postpone an official trial, or proceed while being fully honest with the vendor about our budget


3. Does this trial have a librarian requestor/sponsor?

Trials requested by students or faculty must be sponsored by the liaison librarian for that department or subject area.

If requestor/sponsor is not the ER Librarian, the other librarian should also evaluate the trial product and solicit feedback from other faculty in their liaison area.

Requiring a librarian sponsor may help deflect vendor-initiated trials.

B. How are trials scheduled?

1. Scheduled between Sept.-Nov. and Feb.-Apr. to capture maximum participation by faculty.

2. No more than three trials scheduled simultaneously to avoid “trial fatigue.”

3. Only extended trials (>3 months) are advertised on the library’s blog and website. Most trials (<1 month) are kept internal or shared with faculty by email.

C. How is trial feedback solicited and recorded?

1. Email was a mediocre solution –

Responses from non-librarians were often less descriptive about how the trial database might be used.

Time-consuming to compile into reports.

2. Short web forms have worked better

Use a mix of Likert scale and free-text questions

UNK uses Qualtrics. LibGuides survey boxes, SurveyMonkey, PollDaddy and Google Forms were other suggested web survey tools shared on ERIL-L


3. Advertising the trial

Post trial information to an internal library blog, with a sample “press release” that could be copied/pasted into emails to specific faculty.

Arrange for a vendor’s webinar or live demonstration of the product for librarians and faculty.

For extended trials, could post info to faculty and/or student listservs, library’s website and public blog.


4. Sharing responses publicly?

PRO: Allows participants to see what others think before/after their trial experience (and perhaps respond).

CON: May discourage participants from sharing “frank” comments.

At UNK, we’re erring on the side of privacy and keeping survey responses internal, with no identifiers required.

Sample survey at bit.ly/y33F7D

Trial Survey Questions

How relevant is this database's content to your research and/or instruction at UNK?* (Highly Relevant / Somewhat Relevant / Limited Relevance / Not Relevant)

In which courses -- or areas of research or teaching -- would this database help you?* (Free-text)

How easy was it to navigate this database and access results on your search topic?* (Very Easy / Somewhat Easy / Somewhat Difficult / Very Difficult)

Should the UNK Library acquire this database, in your opinion?* (Yes / No / Not sure)

Additional comments? (Free-text)

OPTIONAL: Would you like to be notified regarding the library's decision on this database trial? If yes, please enter your email address below:

If the respondent enters an email address, the survey also asks:

May the library contact you with follow-up questions regarding your responses to this database trial? (Yes / No)


5. Record keeping of trial feedback

CSV and PDF files of survey results are saved and shared with other librarians.

Maintain a basic spreadsheet about past/current trials and decisions.

Christine E. Ryan, Clemson University

WHY TRIAL?

• We need new content

• New content packages or technology

• Platform changes

• Consortium changes

• User requests

REQUESTS ORIGINATE EVERYWHERE, BUT…

WHO DECIDES TO PROCEED WITH A TRIAL?

• Committee

• Subject liaison

• Other?

SO YOU WANT TO TRIAL, NOW WHAT?

WHO COORDINATES THE TRIAL?


• Beginning: what, why

• End: yes/no

• Everything in between

• When to trial

• Who to trial with

• How to…

HOW TO ACCESS

• Standing trial web page

• Highly visible

• Moderately visible

• Hidden

• Subject or LibGuide

ADVERTISING/PROMOTING THE TRIAL

• General campus email (faculty/staff)

• General student email

• Student organizations

• Subject liaison contacts

• Emails to departments

• Attend department meetings

• Embedded librarians

• Course-specific announcements (Blackboard, library instruction)

• Web page, printed announcements

GATHERING/ANALYZING FEEDBACK

• Gathering Feedback

• Surveys

• How to access?

• Solicitation during conversations, by email, or in focus groups and training sessions

• Usage stats

• Analyzing

• Analyzing Likert Scale responses

• Compiling free text responses
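Tallying Likert-scale responses from a survey export can be done in a few lines. This is an illustrative sketch, not any of the presenters' actual workflows; the file name and column header are hypothetical stand-ins for a Qualtrics- or SurveyMonkey-style CSV export:

```python
import csv
from collections import Counter

def tally_likert(csv_path, question_col):
    """Count the responses given for one Likert-scale question
    in a survey CSV export (one row per respondent)."""
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            answer = (row.get(question_col) or "").strip()
            if answer:  # skip respondents who left the question blank
                counts[answer] += 1
    return counts

# Hypothetical usage with an exported results file:
# tally_likert("trial_responses.csv", "How relevant is this database's content?")
```

Free-text answers can be collected the same way (appending to a list instead of counting), which makes compiling them into a report for the deciding committee much faster than copying from email threads.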

WHO DECIDES?

• Committee

• Subject liaison

CLOSING THE LOOP

• Participants

• Vendors

Fischer, Christine. 2007. "Group Therapy—Database Trials." Against the Grain 19, no. 6: 65-6.

Street, Chrissy. 2010. "Getting the Most from a Database Trial." Legal Information Management 10, no. 2: 147-8.

Annis Lee Adams, Golden Gate University
ladams@ggu.edu

Jon Ritterbush, University of Nebraska – Kearney
ritterbushjr@unk.edu
Twitter: @loperlibrarian

Christine E. Ryan, Clemson University
ryanchr@clemson.edu