Focus on the user environment: increase your library's usability (Suzanne Lewis)
Prepared by Suzanne Lewis, Area Libraries, IM&T
November 2007
Focus on the User Environment: Enhancing Library Usability
Usability
Easy to learn
Useful
Easy to use
Pleasant to use
Gould, J.D. and Lewis, C. Designing for usability: key principles and what designers think. Communications of the ACM, 28, 3 (March 1985): 300-311.
Ease of Use
Defined as “how quickly we can use a product to complete tasks”.
Library patrons want ease of use
Usefulness
Defined as whether the product does “what it is supposed to do … Does it work?” What are the end results?
Librarians want usefulness.
ease of use + usefulness = usability
Dicks, R. Stanley. Mis-usability: on the uses and misuses of usability testing. Proceedings of the 20th Annual International Conference on Computer Documentation (October 20-23, 2002, Toronto, Ontario, Canada), 26-30.
Usability Quick Fixes
Signage
Display
Weeding
Opening hours
Food and drink – coffee
Woodward, J. 2005. Creating the customer-driven library: building on the bookstore model. American Library Association: Chicago.
Daily Checklist Completed (tick √)
Out of date material removed from counter
Brochure holders full of relevant leaflets
Velcro, Blu-Tack and sticky tape removed
Time on clock is correct
Windows, doors, floors clean & all litter removed
All staff wearing name badges
All clutter removed from counter
All displays re-stocked
All lights are working
All signage is relevant to today
All faded, ripped signs are removed
Stanley, J & L. 2004. Think for your customer. Lizardpublishing.biz: Kalamunda, WA.
What Makes a Library Usable?
Accessibility – physical and online
Relevance – right information, right place, right time
Responsiveness to patrons – able to change, adapt, respond quickly
Attitude of library staff
Three Types of Librarianship
“Lollipop librarianship”
“Broccoli librarianship”
Evidence based librarianship
Lollipop Librarianship
Give them what they want
Choose services and resources that are easy to learn and use
Fast results but not always the best or most useful
Lollipop Librarianship
Hangwi Tang & Jennifer Hwee Kwoon Ng. 2006. Googling for a diagnosis – use of Google as a diagnostic aid: internet based study. British Medical Journal 333:1143-1145, 10 November.
“… In difficult diagnostic cases, it is often useful to ‘google for a diagnosis’. Web based search engines such as Google are becoming the latest tools in clinical medicine, and doctors in training need to become proficient in their use.”
In this study, using 26 case reports from the New England Journal of Medicine, Google searches found the correct diagnosis in 58% of cases.
58% !!!!!!!
Referrals from Search Engines to Web Sites of 844 Journals Hosted by HighWire Press (June 2005)
Steinbrook, R. Searching for the right search – reaching the medical literature. New England Journal of Medicine 2006; 354:4-7.
Broccoli Librarianship
Telling patrons what they should know and how they should use services and resources because “it is good for them”.
Vaughn, D. & Burton, C. 2003. Broccoli librarianship and Google-bred patrons, or what’s wrong with usability testing? College & Undergraduate Libraries 10 (2), 1-18.
Information Literacy
“only librarians like to search; everyone else likes to find”.
(Roy Tennant quoted in Wilder, S. 2005. “Information literacy makes all the wrong assumptions”. The Chronicle Review, 51, 18. http://chronicle.com/weekly/v51/i18/18b01301.htm)
“The OPAC meets librarians’ needs, not the end-users’ needs. Change the OPAC rather than doing more information literacy training. Put our content into Google”.
(Abram, Stephen. The Top 10 Strategies for Library Success – An Expert Forum with Stephen Abram. 29 August 2007, Sydney.)
Evidence Based Librarianship
Evidence Based Library and Information Practice (EBLIP) seeks to improve library and information services and practice by bringing together the best available evidence and insights derived from working experience, moderated by user needs and preferences. … It thus attempts to integrate user-reported, practitioner-observed and research-derived evidence as an explicit basis for decision-making.
Booth, A. (2006). Counting what counts: performance measurement and evidence-based practice. Performance Measurement and Metrics, 7(2), 63-74.
All Too Familiar?
How often have you seen this sort of message on a listserv:
‘Does anyone out there know how to deal with problem x or y?’
And the reply comes back:
‘Yes, here at Diddly-Squat Library we had the same problem and we fixed it by doing yabba-dabba-doo.’
And more often than not, the response is:
‘Great – we’ll try the same thing and hope it works for us. Thanks so much.’
Well, isn’t that careful, reflective and insightful professional practice!
Gorman, G. E. (2004, April). Evidence-based information practice comes of age. Retrieved 23 August, 2005.
Stages of EBLIP
Formulate the question
Find the evidence
Critically appraise the evidence
Apply the evidence
Evaluate impact and performance
Report findings
Formulate a Question
S – Setting: Library Services Intranet site
P – Perspective: Staff and students of the organisation
I – Intervention: Site improvements
C – Comparison: Original site
E – Evaluation: Usability (as a determiner of effectiveness)
Focus the Question
“What improvements to the current Library intranet site should be made to improve usability for the staff and students of the organisation?”
Cotter, L., Harije, L., Lewis, S. & Tonnison, I. 2006. Adding SPICE to a library intranet site: a recipe to enhance usability. Evidence Based Library and Information Practice 1, (1): 3-25.
Finding the Evidence
User-reported – brief online survey
Librarian-observed – usability testing
Research-derived – literature search
Appraise the Evidence
We appraised
– Quality of article
– Level of evidence
– Contextual relevance
We asked
– Is this a study we can use/adapt?
– Is the study valid/reliable/applicable?
Apply the Evidence
DIRECTLY – Raward’s Usability Analysis Tool for library websites
DERIVATION – Usability testing
CONDITIONALLY – Research-Based Web Design & Usability Guidelines
ENLIGHTENMENT – Theoretical discussion, commentaries; examination of other sites
Evaluate Impact
Evaluating
– Our performance applying the EBL process
– Impact of changes made
Disseminating
– Conference proceedings
– Publication of project report
– EBLIP journal from 2006
Libraries Using Evidence
What does EBLIP have to do with Usability?
“Lollipop librarianship” tends to result in services and resources that are easy to use but not always useful
“Broccoli librarianship” tends to result in services and resources that are useful but not always easy to use
Evidence based librarianship helps you achieve resources and services that are highly usable
Usability and Web 2.0/Library 2.0
Ann Arbor District Library Catalogue
http://www.aadl.org/catalog
OPAC 2.0
tag clouds – subject headings and/or user-generated
“best bets”
“recently arrived”
“most popular” – based on circulation data
community reviews and ratings
federated searching across catalogue and databases
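Tag clouds like those in the Ann Arbor catalogue typically scale each tag's display size by how often it occurs across records (subject headings, user tags, or circulation counts). As a rough illustration only, a minimal sketch of that weighting step in Python, with the tag counts and size scale entirely invented for the example:

```python
# Hypothetical sketch of tag-cloud weighting for an OPAC 2.0 display.
# The tag frequencies and the 1-5 size scale are invented for illustration;
# real systems (e.g. the Ann Arbor District Library catalogue) may differ.
from collections import Counter

def tag_cloud_sizes(tag_counts, min_size=1, max_size=5):
    """Map raw tag frequencies to discrete display sizes (min_size = smallest)."""
    if not tag_counts:
        return {}
    lo, hi = min(tag_counts.values()), max(tag_counts.values())
    span = hi - lo or 1  # avoid division by zero when all counts are equal
    return {
        tag: min_size + round((count - lo) / span * (max_size - min_size))
        for tag, count in tag_counts.items()
    }

# Example: counts of user-generated tags across catalogue records
counts = Counter({"nursing": 40, "cardiology": 12, "ethics": 5, "triage": 5})
sizes = tag_cloud_sizes(counts)  # most frequent tag gets the largest size
```

The same linear-scaling idea would apply to a "most popular" list driven by circulation data: rank or size items by loan counts rather than tag counts.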