
Making Online Instruction Count: Statistical Reporting of Web-Based Library Instruction Activities

Tim Bottorff and Andrew Todd

Tim Bottorff is Head Librarian in the Universal Orlando Foundation Library at the University of Central Florida (Rosen College campus); e-mail: [email protected]. Andrew Todd is Reference Librarian, Regional Campus Libraries at the University of Central Florida (BCC/UCF Joint Use Library); e-mail: [email protected]. © Tim Bottorff and Andrew Todd

Statistical reporting of library instruction (LI) activities has historically focused on measures relevant to face-to-face (F2F) settings. However, newer forms of LI conducted in the online realm may be difficult to count in traditional ways, leading to inaccurate reporting to both internal and external stakeholders. A thorough literature review is combined with the results of an investigative survey to reveal the current status of reporting such activities. The results reveal considerable confusion about the reporting of Web-based LI activities, even though a number of librarians are devoting significant amounts of time to this important and growing area of librarianship.

In a higher education environment where assessment and accountability are frequent watchwords, accuracy in the reporting of library statistics is an important and timely goal. Service and usage statistics are frequently collected and reported to explain the library’s work to campus administrators and accrediting agencies, to compare with other institutions, and to inform individual work assignments, departmental resource allocations, and funding and advocacy efforts. To this end, libraries measure everything from the most basic indicators of use (such as gate count, circulation transactions, study room reservations) to much more complex activities (such as reference questions, electronic serial holdings, user service preferences).

In the realm of library statistics, the counting and reporting of library instruction has thus far received relatively little attention. Since library instruction is one of the areas where the academic library most directly interfaces with students and faculty members, this issue deserves greater attention.

Until recent years, most academic library instruction was conducted in face-to-face (F2F) environments. Statistical reporting of LI activities has tended, therefore, to focus on measures relevant to F2F settings: for example, librarians dutifully report the number of classes and number of students who participate in “one-shot” LI sessions. However, newer forms of LI conducted in the online realm, particularly if the instruction is asynchronous, may be difficult to count in traditional ways. Many librarians now provide LI through a myriad of online delivery mechanisms, from those “embedded” in classes through courseware (such as WebCT or Moodle), to those providing instruction through tutorials and other online instructional tools, to those teaching online for-credit library research courses, and so on. In many such forms of LI, the librarian may have multiple (virtual) interactions with students over the course of a semester—or no direct interaction at all, in the case of a tutorial—making it difficult, if not impossible, to count “classes” and “students” in the traditional manner.

The inability to accurately account for these types of instructional activities may have significant implications, since the way librarians quantify and report their activities can affect both internal and external sources of administration and funding.

The present study reviews the literature related to the reporting of library instruction statistics, as well as some of the wider literature on LI assessment and effectiveness, and then reveals the results of an investigative study on the topic. It is hoped that this study may lead to the development of standards that will help academic librarians to more accurately account for LI activities conducted in the online realm.

Literature Review

While online LI activities are increasingly becoming key components of many academic libraries’ overall instruction plans, comparatively little has been written about the accounting or reporting of such activities. Therefore, the following review of the literature casts a wide net, examining some related areas of LI research (such as assessment or effectiveness) as well as professional guidelines related to statistical reporting.

Academic librarians have been offering forms of online LI for at least the better part of two decades. Vishwanatham, Wilkins, and Jevec, for example, described LI efforts in the early to mid-1990s using e-mail, FTP, gopher, and the nascent World Wide Web.1 And by the late 1990s, several authors were offering tips on creating and using Web-based tutorials, on providing instruction through courseware, and on developing online for-credit research courses.2 Dewald briefly mentioned the difficulty of tracking usage of online tutorials in a 1999 piece, but most of the early articles were practical primers focused on the pedagogy and mechanics of setting up online instructional tools.3 In the intervening years, the use of various forms of online LI has become more commonplace, but recent reviews have noted that there has still been very little written about the tracking or assessment of online tutorials or other forms of online LI.4

As early as 2000, Kyrillidou noted that a decline in LI statistics among Association of Research Libraries (ARL) member institutions was “possibly… a function of the introduction of distance learning technologies in the delivery of library instruction.”5 However, in the years since, the professional literature has not suggested a good means for tracking these new technologies or accounting for the resulting statistical differences. Other authors have since mentioned problems with counting instructional activity delivered through tutorials or through blogs.6 Several researchers writing about online embedded LI have touted the ability of most courseware products to track hits, time spent on pages, and other metrics, but actual examples of how the resulting data could be used have not been offered.7
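Courseware products differ in what they log, and the literature reviewed here offers no worked example, so the following is only a minimal Python sketch, assuming an invented log-export format and page names, of how tracked hits and time-on-page data of the kind described might be rolled up into reportable figures:

    # Hypothetical sketch: aggregating courseware activity logs into simple
    # usage metrics. The record layout, page names, and student IDs are
    # illustrative assumptions, not any specific courseware product's export.
    from collections import defaultdict
    from datetime import datetime

    # Assumed export format: (student_id, page, entered_at, left_at)
    log = [
        ("s001", "library-module/searching", "2011-09-12 10:00", "2011-09-12 10:14"),
        ("s002", "library-module/searching", "2011-09-12 11:05", "2011-09-12 11:07"),
        ("s001", "library-module/citing",    "2011-09-19 09:30", "2011-09-19 09:48"),
    ]

    fmt = "%Y-%m-%d %H:%M"
    hits = defaultdict(int)       # page -> total views
    minutes = defaultdict(float)  # page -> total minutes spent
    students = defaultdict(set)   # page -> distinct students reached

    for student, page, start, end in log:
        hits[page] += 1
        delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
        minutes[page] += delta.total_seconds() / 60
        students[page].add(student)

    for page in hits:
        print(f"{page}: {hits[page]} hits, {minutes[page]:.0f} min, "
              f"{len(students[page])} students")

Even so, none of these derived numbers maps cleanly onto the “sessions” and “participants” categories discussed below, which is precisely the reporting gap at issue.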

Scherrer and Jacobson suggested a different approach in advocating for the creation of new measures to account for librarians’ changing roles in the twenty-first century.8 Although Scherrer and Jacobson’s focus was not on LI alone, but rather on the broader professional duties of academic health sciences librarians, their scheme did include mention of various LI activities. Noting that “while librarians have continually redefined and changed their roles, the measures by which librarians report and evaluate their activities have not sufficiently changed to reflect these new realities,” they proposed developing new measures in categories such as Consultation, Outreach, and Web Authoring.9 The Consultation category, for example, would include in-depth research appointments, since “point-of-need instruction is currently an unreported figure, because it does not accurately fit in either the ‘reference’ or the ‘instruction’ category” of library statistics.10 And the Web Authoring category might include activities such as “designing Web pages, creating tutorials, developing pathfinders, and participating in the development of new products.”11 In a model such as Scherrer and Jacobson propose, various online LI activities might be accounted for in some fashion, albeit in categories that differ from traditional ones.

In recent years, library research on LI activities has tended to focus more on measuring service quality from the user’s point of view, or on user outcomes and the effectiveness of various forms of instruction. Researchers have written extensively on LibQUAL+ and other tools for measuring user satisfaction, for example.12 Such large-scale instruments tend not to focus on LI in detail, since they are geared more toward measuring overall levels of satisfaction with library services. Another large body of literature deals with assessment of LI at a programmatic level, focusing on quantifiable student outcomes and information literacy skill acquisition, usually through the use of testing instruments such as the Standardized Assessment of Information Literacy Skills (SAILS), the Educational Testing Service’s iSkills test, or James Madison University’s Information Literacy Test (ILT).13 These instruments can measure information literacy skill acquisition broadly, but so far few, if any, studies have compared student achievement on such tests based on multiple methods of delivery for LI. A few studies have examined whether online or F2F instruction is generally more effective, as well as the preferences of users with regard to instructional delivery methods. Most researchers have concluded that method of delivery (F2F vs. online vs. mixed) appears not to greatly impact the effectiveness of or satisfaction with LI, indicating that properly planned and executed online LI activities can be suitable adjuncts or substitutes for F2F LI.14

Research in all of the above areas is vital to the continued success of academic libraries, especially in a higher education environment where accountability and assessment are increasingly important.15 There can be no doubt that academic librarians need to continue to explore issues of user satisfaction, the measurement and benchmarking of information literacy skills, and the effectiveness and perception of various forms of LI. Yet the related need for libraries to maintain basic statistics that accurately reflect the nature of their daily duties and interactions (including in the online realm) is an area that appears to have received comparatively little attention in the professional library literature.

Guidelines or standards published by library organizations comprise another important source of information about the statistical reporting of LI activities. For example, the Association of Research Libraries’ (ARL) guidelines on instruction statistics stipulate counting only the number of “sessions” (classes) conducted and the number of “participants” (students) taught.16 Therefore, in the longstanding ARL statistical model, the typical one-shot F2F LI situation is counted as one “session,” and the number of students in the course is counted to determine the number of “participants.” It is not clear how these guidelines could be used to account for asynchronous forms of online LI, such as the case where a librarian is embedded in a class through courseware and may interact with the students multiple times and in multiple ways through a semester. The ARL guidelines do address how to count instructional situations in which a librarian meets F2F with the same group of students multiple times over the course of a semester: each new class meeting (“session”) is counted separately, but the number of students (“participants”) is counted only once.17 Such a methodology still may not fit many forms of online LI, including LI delivered through sequenced tutorials. In any case, the ARL standards never explicitly mention online LI; indeed, the wording of the guidelines appears to be geared only toward traditional F2F LI activities.
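To make that tallying rule concrete, here is a minimal Python sketch of the ARL-style count as described above; the input data structure is an assumption for illustration and is not part of the guidelines themselves:

    # Minimal sketch of the ARL rule described above: every class meeting
    # counts as a "session," but each group of students is counted as
    # "participants" only once per course.

    def arl_tally(courses):
        """courses: list of dicts with 'students' and 'meetings' counts."""
        sessions = sum(c["meetings"] for c in courses)
        participants = sum(c["students"] for c in courses)  # once per course
        return sessions, participants

    # A one-shot class of 25 students, plus a course that met three times
    # with the same 30 students:
    print(arl_tally([
        {"students": 25, "meetings": 1},
        {"students": 30, "meetings": 3},
    ]))  # -> (4, 55)

An embedded librarian who interacts with a class asynchronously all semester has no natural value for "meetings," which is exactly why such activity resists this tally.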

The ARL statistics first included numbers on LI activities in a supplemental survey in 1991 and then as part of the full survey in 1994.18 Since then, the ARL metrics related to LI statistics have not changed, despite significant changes in technologies and methodologies in the intervening years. Recent editions of the statistics have pointed out that “a simple count where each reference question gets a single ‘tally’ cannot capture the varying dimensions and growing complexities of reference services” (emphasis added), though no solutions or new approaches have been offered, nor has any mention been made of similar complexities with regard to instruction activities.19 A recent ARL communication stated that the “ARL Statistics and Assessment Committee is currently engaged in developing new quantitative and qualitative indicators and indices,” but updated methods for counting and describing LI activities were not mentioned as part of the initiative.20

The traditional ARL statistical guidelines have been widely used, even in many non-ARL institutions, almost to the point of being the only existing standards in the field. For example, the statistical program of the Association of College and Research Libraries (ACRL), which offers data from more than 1,500 academic libraries of all sizes in the United States, also uses the ARL questionnaire for the collection of its data.21 Therefore, the ACRL similarly reports only the number of “sessions” and number of “participants” for LI, making no provision for forms of online LI that do not fit neatly into those categories. Reporting libraries are allowed to provide footnotes, offering additional explanation about statistics when necessary, and a very small number of libraries over the years have used the footnotes to mention the inclusion or exclusion of online LI activities in their reported statistics.22 For the most part, however, the reader of ACRL statistics can only assume that the LI numbers reported represent mostly traditional, F2F LI activities.

Another major source for cross-comparisons of data among academic libraries is the Library Statistics Program (LSP) of the National Center for Education Statistics (NCES), part of the Integrated Postsecondary Education Data System (IPEDS).23 The LSP statistics, similar to the ARL and ACRL statistics, collect information only on the number of “presentations” to groups and on “total attendance” at these presentations. The LSP instructions for submitting LI numbers are similar to those used for the ARL and ACRL surveys, although the LSP instructions do add that both “self-paced tutorials” and “web-based presentations” may be included in the data; however, neither of these types of presentations is further defined, and it is not clear how many libraries actually report such activities in their figures.24 In fact, comparisons between the LSP data and ARL or ACRL data suggest that most libraries simply report the same LI statistics (and most other statistics) to all three sources.

Taken together, the existing literature and guidelines on LI statistics provide little insight into the problem of accounting for and reporting online LI activities. A timely and thoughtful local taskforce on statistical reporting at the authors’ institution similarly revealed more questions than answers. Therefore, in an effort to go beyond the existing literature, opinion, and experience in the realm of statistical reporting of online LI activities, the authors conducted an investigative survey on the topic.

Methodology

The authors developed a fourteen-item, voluntary, anonymous survey designed to understand how academic librarians at a wide variety of academic libraries are reporting online LI activities in practice. A small number of demographic questions about respondents’ institutions (funding status, highest degree granted, and number of students) were included to ensure that the survey reached a representative cross-section of academic librarians from different types and sizes of schools.

The survey questions were designed to gather data on how academic librarians report the following activities: 1) LI delivered through online courseware; 2) LI delivered through online tutorials; and 3) LI delivered through online for-credit research courses in which a librarian is the instructor of record. The first two activities (LI through courseware and through tutorials) had been identified by librarians at the authors’ institution as being the two most prevalent types of online LI that were not being measured fully by existing reporting standards; the third activity (online for-credit LI) was identified by early reviewers of the survey instrument as another area of possible inconsistency with current reporting standards.

Background information and definitions for key terminology (such as “embedded” or “courseware”) were also provided to respondents. Since this was the first survey to examine these topics in depth, ample opportunity was also given for respondents to provide comments and feedback. A link to the full survey instrument is provided in the notes.25

After the survey questions were developed and tested, the study was submitted to the authors’ Institutional Review Board (IRB) for approval, since the research involved surveying human subjects. Once IRB approval was secured, the survey was mounted on a Web site and the results were set up to be collected in a database. In an effort to gather responses from a wide swath of the academic library community, the survey invitation was sent to a variety of electronic mailing lists devoted to public service librarianship. Lists targeting both large and small institutions were included, since the authors wished to examine results from a broad range of academic institutions. The professional lists targeted were:

• ACRL-CLS (the list of the College Libraries section of ACRL)

• ACRL-ULS (the list of the University Libraries section of ACRL)

• ILI-L (the Information Literacy Instruction Listserv)

• OFFCAMP (the list of the Off-Campus Library Services group)

• RCL-DG (the Regional Campus Libraries Discussion Group)

• RUSA-L (the list of the Reference and User Services Association)

The authors sent invitations to the above lists, with instructions for the survey to be completed within a three-week window. The invitation solicited participation from academic librarians who teach library instruction classes, develop instructional tutorials or other instructional tools, or are involved in the collection or reporting of library instruction statistics. A second message was sent approximately one week before the deadline to remind potential participants of the final deadline. After the survey deadline passed, the authors closed the survey and retrieved the results from the database for analysis.

Results

The survey garnered 310 responses, 307 of which were associated with academic libraries and therefore met the criteria for inclusion in the results. The total number of respondents to each question varies slightly because, per IRB guidelines, respondents were not required to answer every question.

The respondents represented a cross-section of academic librarianship, including a good mix of librarians from public and private institutions, small and large institutions, and institutions granting different levels of degrees, as shown in table 1.
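As an illustrative sketch only (not the authors’ analysis code), percentage breakdowns like those in table 1 can be derived from raw survey records; the field names and sample values here are assumptions:

    # Hypothetical sketch: deriving a demographic percentage breakdown from
    # raw survey records. Records and field names are invented for illustration.
    from collections import Counter

    responses = [
        {"funding": "Public", "degree": "Doctorate"},
        {"funding": "Public", "degree": "Master's"},
        {"funding": "Private", "degree": "Baccalaureate"},
    ]

    def breakdown(records, field):
        counts = Counter(r[field] for r in records if field in r)
        n = sum(counts.values())  # n varies per question (skipped answers)
        return {value: round(100 * count / n) for value, count in counts.items()}

    print(breakdown(responses, "funding"))  # e.g. {'Public': 67, 'Private': 33}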

Significantly, as depicted in table 2, 93 percent of the 307 respondents still teach LI classes in F2F formats. But almost 50 percent (145) also teach LI online or as an embedded librarian, and more than two-thirds (218) are involved in developing online tutorials or other materials designed to be used in online LI. In addition, a sizable number of the respondents are involved in the collection/compilation (143) or reporting (224) of LI statistics at their institution.

The majority of the questions on the survey were designed to determine how librarians account for different types of online LI activities. The results reveal a great deal of inconsistency in how different library systems count and report such activities.

For example, respondents were asked to report how they count “an activity where a librarian is embedded in a course through courseware,” and the results indicated a wide variety of approaches. Roughly 32 percent (97) of the aggregate (n=305) admitted that their library does not currently provide any embedded library instruction through courseware. Of the remainder (n=208), responses included counting this activity in a wide variety of ways, or not counting it as instruction at all, as depicted in figure 1.

Some respondent comments indicated that the various accounting methods may be partially due to varying degrees of embedding: for some courses, librarians merely monitor a few discussion boards, while in others, librarians create modules, grade assignments, or otherwise take a much more active role. However, many comments also reflected a great deal of general confusion or uncertainty about counting these activities, as evidenced by comments such as these: “we haven’t yet standardized ways to record statistics on this activity,” or “there is no formal policy on this yet,” or “we’re still wrestling with this question.”

TABLE 1
Demographics of Respondents

Primary Funding Status of Parent Institution (n = 305)
  Public: 64%   Private: 34%   Other: 2%

Highest Degree Granted by Parent Institution (n = 307)
  Associate: 15%   Baccalaureate: 12%   Master’s: 21%   Doctorate: 50%   Other: 2%

Number of Students at Institution (n = 305)
  0–999: 6%   1,000–2,999: 18%   3,000–9,999: 28%   10,000–20,000: 21%   20,000+: 27%

TABLE 2
Level of Involvement in Library Instruction (n = 307)
Activity (multiple responses allowed)

  Teach F2F library instruction (LI) sessions: 93%
  Teach library instruction sessions online and/or serve as an embedded librarian: 47%
  Coordinate/schedule library instruction sessions at my institution: 59%
  Develop online tutorials or other materials designed to be used in online instruction: 71%
  Report statistics related to library instruction at my institution: 73%
  Collect/compile statistics related to library instruction at my institution and report them to others: 47%
  Other: 4%


And many respondents commented on how the mere activity of taking the survey has raised their awareness of the issue: “I better look into this more since I am responsible for the statistics here,” or “I never thought about this particular metric before! Thank you for reminding me that it does not currently ‘get counted’ within the current realm of statistical collection in my library.”

Similarly varied results were received on the question dealing with counting library instruction delivered through online tutorials. Of the 303 respondents who answered this question, 39 (13%) indicated that their library does not currently offer library instruction through online tutorials. The remaining 264 respondents indicated a wide variety of possible approaches to counting this activity, as shown in figure 2.

Various comments reflected the same uncertainty related to counting tutorials as those concerning embedded instruction. One respondent stated: “I would typically count any session done with Synchronous sessions (via Learnlinc) as one session equivalent to an on campus face to face session. This the library agrees to. However, I am also building tutorials (both Flash and via web page on Moodle). Neither have been estimated to count towards instruction.” Others alluded to the difficulty of getting accurate statistics on the use of online tutorials. According to one respondent, “I create tutorials and add links to the tutorials on the library web page but we have no way to determine how many users have ever viewed or completed a tutorial.” Another comment mentioned a dual reporting system: “Because we are an ARL library and online instruction does not meet the definition of instruction used in the ARL Statistics, we maintain two counts of the number of sessions and participants—ARL and non-ARL.”
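One way to close the gap these respondents describe, hinted at later in the discussion, is to give each tutorial an explicit endpoint whose completions can be logged and tallied. The following Python sketch assumes an invented event-log format and tutorial names; it is an illustration, not a description of any respondent’s actual system:

    # Hypothetical sketch: tallying tutorial usage from start/completion
    # events (for example, emitted when a student opens a tutorial and when
    # a final quiz is submitted). The log lines below are invented.
    from collections import Counter

    event_log = [
        "2011-10-03 tutorial=plagiarism event=start",
        "2011-10-03 tutorial=plagiarism event=complete",
        "2011-10-04 tutorial=database-searching event=start",
    ]

    starts, completions = Counter(), Counter()
    for line in event_log:
        fields = dict(f.split("=") for f in line.split()[1:])
        counter = completions if fields["event"] == "complete" else starts
        counter[fields["tutorial"]] += 1

    for name in starts | completions:
        print(f"{name}: {starts[name]} starts, {completions[name]} completions")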

FIGURE 1
Counting Online Library Instruction (n = 208)
“Which of the following most closely describes how your library counts an activity where a librarian is embedded in a course through courseware?”

  Counted as a one-shot F2F LI session: 22%
  Counted as multiple F2F LI sessions: 10%
  May or may not be counted as instruction: 24%
  Counted differently from F2F: 4%
  Not counted as instruction: 15%
  Not sure: 18%
  Other: 7%


Respondents were also surveyed about their reporting methods related to online for-credit library research courses. Less than a third (98) of the 304 respondents who answered the question are currently offering this service. However, this small cohort reported a wide variety of methods of accounting for them, as depicted in figure 3. Comments related to counting LI delivered through online for-credit library research courses included “We count this separately, but it is not part of our general library instruction statistics” and “This is new and we haven’t worked out reportage yet.”

FIGURE 2
Counting Online Library Tutorials (n = 264)
“How does your library count activity in the case where library instruction is delivered through an online tutorial?”

  Counted as a one-shot F2F LI session: 8%
  Counted as multiple F2F LI sessions: 2%
  Counted differently from F2F: 23%
  Not counted as instruction: 50%
  Not sure: 11%
  Other: 5%

FIGURE 3
Counting Online For-Credit Library Courses (n = 98)
“How does your library count activity in the case where a librarian teaches a for-credit library course using courseware or some other online delivery mechanism?”

  Counted as a one-shot F2F LI session: 15%
  Counted as multiple F2F LI sessions: 15%
  Not counted as instruction: 18%
  Not sure: 20%
  Other: 31%


Another major trend emerged from the survey. A majority of respondents reported that online LI activities—whether in the form of embedded instruction, online tutorials, or online for-credit research courses—tend to require a greater time commitment than traditional, F2F instruction. For example, one question asked, “As compared to an average course in which you provide face-to-face library instruction (including preparation), how much time do you dedicate to an average course in which you are embedded through courseware?” Fifty-one percent (151) of the 295 respondents who answered this question indicated that they cannot compare the two activities—for example, because they do not do one or the other (or both) activities. Of the remaining respondents (n=144) who teach F2F LI classes and participate in online embedded LI, 58 percent (84) indicated that being embedded in an online course requires “slightly more” or “significantly more” time than a typical F2F session, and only 19 percent (27) indicated that online embedded LI requires “slightly less” or “significantly less” time, as illustrated in figure 4.
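As a quick arithmetic check (reader verification, not survey code), the reported shares follow directly from the raw counts above:

    # Reproducing the figure 4 percentages from the counts given in the text.
    n = 144    # respondents who could compare both activities
    more = 84  # answered "slightly more" or "significantly more"
    less = 27  # answered "slightly less" or "significantly less"
    print(round(100 * more / n), round(100 * less / n))  # -> 58 19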

Comments related to the perception of time involved with online embedded library instruction activities compared to F2F LI sessions included the following: “compared to one-shot face-to-face instruction sessions, I spend more time preparing for classes in which I am embedded” and “[being embedded] is more work than face-to-face sessions…[and] more time-consuming.” One respondent said that “the activities that you describe are consuming so much of our time… that we are hiring an additional instruction librarian.”

FIGURE 4
Perceived Time Devoted to Online Embedded Instruction (n = 144)
“As compared to an average course in which you provide face-to-face library instruction (including preparation), how much time do you dedicate to an average course in which you are embedded through courseware?”

  Significantly less: 10%
  Slightly less: 9%
  About the same: 23%
  Slightly more: 19%
  Significantly more: 39%


Respondents were also asked how much time they dedicated to an average online tutorial as compared to an average face-to-face library instruction session (including preparation).

Responses to this question were similar to those related to the perception of time involved with embedded instruction, indicating that the majority of respondents felt that they invested more time in an average online tutorial than in an average in-person library instruction class (see figure 5). According to one respondent, “There is a lot of time put into the [initial] development of the tutorials.”

On the other hand, many respondents noted that the time and work involved in online LI activities is often at the point of creation, after which the time devoted to them can sometimes be less than for F2F activities: for example, “In the beginning [being embedded] is time consuming but once the course is up and working… the time spent maintaining it falls off” and “it takes significantly more time to create a tutorial than it does to prepare and lead one face-to-face library instruction session; of course, once the tutorial is created, it requires significantly *less* time to deliver it to students than it does to deliver face-to-face instruction.”

As stated previously, however, the majority of librarians who could compare both activities said that online LI, as compared to its F2F counterpart, takes more time. Another illustration of this point is the response to this question: “As compared to an average face-to-face for-credit library course (including preparation), how much time do you dedicate to an average for-credit library course that utilizes courseware or some other online delivery mechanism?” Figure 6 shows that more than two-thirds (41) of the 60 respondents who indicated that they teach online for-credit library courses feel that they spend more time on an average for-credit library course than an average F2F for-credit library course.

Some comments cited additional time-consuming features associated with online LI, beyond those involved in the initial creation, such as grading assignments or interacting with discussion boards throughout a semester.

FIGURE 5
Perceived Time Devoted to Online Tutorials (n = 203)
“As compared to an average face-to-face library instruction session (including preparation), how much time do you dedicate to an average online tutorial?”

  Significantly less: 11%
  Slightly less: 6%
  About the same: 14%
  Slightly more: 22%
  Significantly more: 46%


FIGURE 6
Perceived Time Devoted to For-Credit Library Courses (n = 60)
“As compared to an average face-to-face for-credit library course (including preparation), how much time do you dedicate to an average for-credit library course that utilizes courseware or some other online delivery mechanism?”

  Significantly less: 2%
  Slightly less: 3%
  About the same: 27%
  Slightly more: 20%
  Significantly more: 48%

Discussion

The survey results indicate considerable variance and confusion about the statistical counting and reporting of online LI activities in academic libraries. The common activity where a librarian participates in a class through courseware, for example, is counted as a one-shot LI session at some libraries and as multiple LI sessions at others; it is not counted as instruction at all at some libraries, and it “may or may not” be counted at others. Similar variance was observed with the reporting of online, for-credit research courses. The variability was less pronounced in the case of LI delivered through online tutorials, perhaps because many (though certainly not all) online tutorials are designed to be “just in case” tools that are not tied to particular courses or groups of students; in addition, gathering reliable statistics on online tutorials is often problematic, unless a quiz or other endpoint provides a reliable mechanism for counting usage.

Taken together, the wide degree of variance in the reporting of online LI activities may be enough to cast doubt on wide-scale comparisons of statistics across institutions, such as those compiled by ACRL, ARL, or NCES. Without clearer reporting guidelines that address some of the complexities associated with online forms of LI, there will likely continue to be little consistency in how libraries report these statistics.

A sizable number of respondents also reported that online LI activities require a significant time commitment, often equal to or in excess of that required for F2F LI activities.


Few could doubt that providing library services in the online environment has dramatically changed modern librarianship in myriad ways, and undoubtedly most library administrators evaluate and recognize librarians based on these new realities. Nonetheless, standards or measures that more accurately account for typical librarian activities could be useful for both individual librarians and administrators, not to mention external stakeholders who may be less familiar with the realities of modern academic librarianship.

Many respondents did indicate that awareness of issues related to the reporting of instructional statistics is heightening. A few respondents mentioned institutional-level committees or guidelines designed to address the tricky reporting issues associated with online LI activities. Others noted that the act of taking the survey itself prompted them to begin thinking about these issues and to begin conversations about LI statistical reporting at their institutions.

In the future, more research may be needed on various forms of synchronous online LI and the reporting challenges associated with these activities. For example, several respondents mentioned teaching online LI through Wimba, Adobe Connect, or other real-time instructional products. More research might also be needed on the various types of tutorials being produced by librarians (“point of need” vs. “just in case,” standalone vs. course-integrated, and so on) and on the difficulties associated with standardized reporting of such activities. Similarly, for-credit courses may present unique challenges and deserve further study. It remains a point of debate, for example, whether such courses should be considered “library instruction” or placed in an entirely different category, as many respondents to the present survey indicated.

Conclusion

Statistics help libraries to tell their “stories”—to explain and justify what they do. Quantifiable measures such as LI statistics can impact individual work assignments, the allocation of resources to departments, communication with internal and external stakeholders, and efforts related to advocacy and funding. Online LI statistics may be particularly important in this regard, since they are trending upward rapidly at many libraries, while several traditional measures—such as circulation and reference statistics—may be growing more modestly or even declining.

At this time, however, a review of the literature suggests that very little attention has thus far been given to the accounting and reporting of online LI activities. And the results of the present survey suggest a substantial amount of confusion and variance regarding the counting of these activities. In addition, the survey results indicated that a number of librarians are spending a substantial amount of time attending to various forms of online instruction, often equal to or in excess of that required for F2F LI activities.

A recent Sloan Consortium survey reported that “nearly thirty percent of higher education students now take at least one course online” and that “the twenty-one percent growth rate for online enrollments far exceeds the less than two percent growth of the overall higher education student population.”26 With such remarkable growth trends in online learning, involvement in online LI activities is likely to continue to increase, and librarians therefore need a reliable structure for reporting them. Such a structure should ideally be developed by a large national organization representing academic library interests, such as ACRL, ARL, or NCES. In short, all signs point to the need for work on standards or approaches that will help librarians to more accurately and consistently account for LI activities conducted in the online realm.


Notes

1. Rama Vishwanatham, Walter Wilkins, and Thomas Jevec, “The Internet as a Medium for Online Instruction,” College & Research Libraries 58, no. 5 (Sep. 1997): 433–44.

2. Susan B. Ardis, “Creating Internet-based Tutorials,” Information Outlook 2, no. 10 (Oct. 1998): 17–20; Nancy H. Dewald, “Web-based Library Instruction: What Is Good Pedagogy?” Information Technology and Libraries 18, no. 1 (Mar. 1999): 26–31; Nancy K. Getty, Barbara Burd, Sarah K. Burns, and Linda Piele, “Using Courseware to Deliver Library Instruction Via the Web: Four Examples,” Reference Services Review 28, no. 4 (2000): 349–59; David Gray, “Online at Your Own Pace: Web-based Tutorials in Community College Libraries,” Virginia Libraries 45, no. 1 (Jan./Feb./Mar. 1999): 9–10; Barbara Wittkopf, “Recreating the Credit Course in an Online Environment,” Reference & User Services Quarterly 43, no. 1 (Fall 2003): 18–25.

3. Nancy H. Dewald, “Transporting Good Library Instruction Practices into the Web Environment: An Analysis of Online Tutorials,” Journal of Academic Librarianship 25, no. 1 (Jan. 1999): 26–31.

4. Lauren Miranda Gilbert, Mengxiong Liu, Toby Matoush, and Jo Bell Whitlatch, “Assessing Digital Reference and Online Instructional Services in an Integrated Public/University Library,” Reference Librarian 46, no. 95 (2006): 149–72; Samantha Schmehl Hines, “How It’s Done: Examining Distance Education Library Instruction and Assessment,” Journal of Library Administration 48, no. 3/4 (2008): 467–78; Rachel G. Viggiano, “Online Tutorials as Instruction for Distance Students,” Internet Reference Services Quarterly 9, no. 1/2 (2004): 37–54.

5. Martha Kyrillidou, “Research Library Trends: ARL Statistics,” Journal of Academic Librarianship 26, no. 6 (Nov. 2000): 427–36.

6. Priscilla Coulter and Lani Draper, “Blogging It into Them: Weblogs in Information Literacy Instruction,” Journal of Library Administration 45, no. 1 (2006): 101–15; Brenda Faye Green, Lin Wu, and Richard Nollan, “Web Tutorials: Bibliographic Instruction in a New Medium,” Medical Reference Services Quarterly 25, no. 1 (Spring 2006): 83–91; Melissa H. Koenig and Martin J. Brennan, “All Aboard the eTrain: Developing and Designing Online Library Instruction Modules,” Journal of Library Administration 37, no. 3 (2002): 425–35.

7. Elizabeth W. Kraemer, “Developing the Online Learning Environment: The Pros and Cons of Using WebCT for Library Instruction,” Information Technology and Libraries 22, no. 2 (June 2003): 87–92; Virginia L. Stone, Rachel Bongiorno, Patricia G. Hinegardner, and Mary Ann Williams, “Delivery of Web-based Instruction Using Blackboard: A Collaborative Project,” Journal of the Medical Library Association 92, no. 3 (July 2004): 375–77.

8. Carol S. Scherrer and Susan Jacobson, “New Measures for New Roles: Defining and Measuring the Current Practices of Health Science Librarians,” Journal of the Medical Library Association 90, no. 2 (Apr. 2002): 164–72.

9. Ibid., 164, 170.
10. Ibid., 170.
11. Ibid., 170.
12. Eric Ackermann, “Program Assessment in Academic Libraries: An Introduction for Assessment Practitioners,” Research & Practice in Assessment 1, no. 2 (June 2007): 1–9; Tina E. Chrzastowski, “Assessment 101 for Librarians: A Guidebook,” Science & Technology Libraries 28, no. 1 (2008): 155–76; Libraries Act on Their LibQUAL+ Findings: From Data to Action, eds. Fred M. Heath, Martha Kyrillidou, and Consuella A. Askew (New York: Haworth Information Press, 2004); Amy E. Hoseth, “We Did LibQUAL+: Now What? Practical Suggestions for Maximizing Your Survey Results,” College & Undergraduate Libraries 14, no. 3 (2007): 75–84; Martha Kyrillidou, “The Evolution of Measurement and Evaluation of Libraries: A Perspective from the Association of Research Libraries,” Library Trends 56, no. 4 (Spring 2008): 888–909; Stewart E. Saunders, “The LibQUAL+ Phenomenon: Who Judges Quality?” Reference & User Services Quarterly 47, no. 1 (Fall 2007): 21–24.

13. Lynn Cameron, Steven L. Wise, and Susan M. Lottridge, “The Development and Validation of the Information Literacy Test,” College & Research Libraries 68, no. 3 (May 2007): 229–36; Irvin R. Katz, “Testing Information Literacy in Digital Environments: ETS’s iSkills Assessment,” Information Technology and Libraries 26, no. 3 (Sept. 2007): 3–12; Brian Lym, Hal Grossman, Lauren Yannotta, and Makram Talih, “Assessing the Assessment: How Institutions Administered, Interpreted, and Used SAILS,” Reference Services Review 38, no. 1 (2010): 168–86; Mary M. Somerville, Lynn D. Lampert, Katherine S. Dabbour, Sallie Harlan, and Barbara Schader, “Toward Large Scale Assessment of Information and Communication Technology Literacy,” Reference Services Review 35, no. 1 (2007): 8–20.

14. Rozalynd P. Anderson and Steven P. Wilson, “Quantifying the Effectiveness of Interactive Tutorials in Medical Library Instruction,” Medical Reference Services Quarterly 28, no. 1 (2009): 10–21; Penny M. Beile and David N. Boote, “Does the Medium Matter?: A Comparison of a Web-Based Tutorial with Face-to-Face Library Instruction on Education Students’ Self-efficacy Levels and Learning Outcomes,” Research Strategies 20 (2005): 57–68; Elizabeth Blakesley Lindsay, Lara Cummings, Corey M. Johnson, and B. Jane Scales, “If You Build It, Will They Learn? Assessing Online Information Literacy Tutorials,” College & Research Libraries 67, no. 5 (Sept. 2006): 429–45; James Nichols, Barbara Shaffer, and Karen Shockey, “Changing the Face of Instruction: Is Online or In-class More Effective?” College & Research Libraries 64, no. 5 (Sept. 2003): 378–88; Laura M. Schimming, “Measuring Medical Student Preference: A Comparison of Classroom Versus Online Instruction for Teaching PubMed,” Journal of the Medical Library Association 96, no. 3 (July 2008): 217–22; Li Zhang, Erin M. Watson, and Laura Banfield, “The Efficacy of Computer-Assisted Instruction Versus Face-to-Face Instruction in Academic Libraries: A Systematic Review,” Journal of Academic Librarianship 33, no. 4 (July 2007): 478–84.

15. ACRL identified “demands for accountability and assessment” as one of its “top ten trends” impacting academic libraries in the near future: ACRL Research Planning and Review Committee, “2010 Top Ten Trends in Academic Libraries: A Review of the Current Literature,” College & Research Libraries News 71, no. 6 (June 2010): 286–94. ACRL’s recent report Value of Academic Libraries: A Comprehensive Research Review and Report (Sept. 2010), available online at www.acrl.ala.org/value/ [accessed 10 January 2011], similarly pinpoints accountability and assessment as among the most important current issues for academic libraries.

16. Association of Research Libraries, “ARL Statistics Questionnaire, 2008–09: Instructions for Completing the Questionnaire” (2009), available online at www.arl.org/bm~doc/09instruct.pdf [accessed 10 January 2011].

17. Ibid.
18. Association of Research Libraries, “ARL Statistics 1994–95” (1995), available online at www.arl.org/bm~doc/1994-95.pdf [accessed 10 January 2011].
19. Association of Research Libraries, “ARL Statistics 2004–05” (2005), available online at www.arl.org/bm~doc/arlstat05.pdf [accessed 10 January 2011]. Interestingly, the topic of standards and consistency in reference statistics has received much more attention than instruction statistics. See, for example: Harry C. Meserve, Sandra E. Belanger, Joan Bowlby, and Lisa Rosenblum, “Developing a Model for Reference Research Statistics: Applying the ‘Warner’ Model of Reference Question Classification to Streamline Research Services,” Reference & User Services Quarterly 48, no. 3 (2009): 247–58; Sarah M. Philips, “The Search for Accuracy in Reference Desk Statistics,” Community & Junior College Libraries 12, no. 3 (2004): 49–60; Beth Thomsett-Scott and Patricia E. Reese, “Changes in Library Technology and Reference Desk Statistics: Is There a Relationship?” Public Services Quarterly 2, no. 2/3 (2006): 143–65; Debra G. Warner, “A New Classification for Reference Statistics,” Reference & User Services Quarterly 41, no. 1 (Fall 2001): 51–55; Jeanie M. Welch, “Click and Be Counted: A New Standard for Reference Statistics,” Reference Librarian 47, no. 1 (2007): 95–104.

20. Martha Kyrillidou, “Reshaping ARL Statistics to Capture the New Environment,” ARL: A Bimonthly Report, no. 256 (Feb. 2008): 9–11, available online at www.arl.org/bm~doc/arl-br-256-stats.pdf [accessed 10 January 2011].

21. Virgil E. Varvel, Mary Jane Petrowski, and Association of College and Research Libraries, 2007 Academic Library Trends and Statistics for Carnegie Classification: Doctorate Granting Institutions (Chicago: Association of College and Research Libraries, 2009). A companion volume covers baccalaureate and master’s institutions.

22. Ibid.
23. National Center for Education Statistics, “IPEDS Data Center,” available online at http://nces.ed.gov/ipeds/datacenter/ [accessed 10 January 2011]. The Library Statistics Program is specifically hosted at http://nces.ed.gov/surveys/SurveyGroups.asp?group=5 [accessed 10 January 2011].

24. National Center for Education Statistics, “Instructions for the Academic Libraries Survey: FY 2008” (2008), available online at http://nces.ed.gov/surveys/libraries/compare/PDF/2008_Questionnaire.pdf [accessed 10 January 2011].

25. An archived version of the survey instrument is available at: Tim Bottorff and Andrew Todd, “Online Library Instruction Statistics Survey” (2010), available online at http://library.ucf.edu/rosen/online_li_stats_survey.asp [accessed 10 January 2011].

26. I. Elaine Allen and Jeff Seaman, “Class Differences: Online Education in the United States, 2010,” report published by the Babson Survey Research Group and the College Board, funded by the Alfred P. Sloan Foundation and distributed by the Sloan Consortium (Nov. 2010), available online at http://sloanconsortium.org/sites/default/files/class_differences.pdf [accessed 10 January 2011].