SOCIAL PRESENCE: WHAT IS IT? HOW DO WE MEASURE IT?

by

Patrick Ryan Lowenthal

B.A., Georgia State University, 1997

M.A., University of Colorado Boulder, 1999

M.A., University of Colorado Denver, 2003

A thesis submitted to the

Faculty of the Graduate School of the

University of Colorado Denver in partial fulfillment

of the requirements for the degree of

Doctor of Philosophy

Educational Leadership and Innovation

2012


This thesis for the Doctor of Philosophy degree by

Patrick Ryan Lowenthal

has been approved for the

Educational Leadership and Innovation

by

Joanna C. Dunlap, Chair

Joanna C. Dunlap, Advisor

Rodney Muth

Ellen Stevens

Patti Shank

Date


Lowenthal, Patrick Ryan (Ph.D., Educational Leadership and Innovation)

Social Presence: What is it? How do we measure it?

Thesis directed by Associate Professor Joanna C. Dunlap

Social presence theory is a central concept in online learning. Hundreds of studies have

investigated social presence and online learning. However, despite the continued interest

in social presence and online learning, many questions remain about the nature and

development of social presence. Part of this might be due to the fact that the majority of

past research has focused on students' perceptions of social presence rather than on how

students actually establish their social presence in online learning environments. Using

the Community of Inquiry Framework, this study explores how social presence manifests

in a fully asynchronous online course in order to help instructional designers and faculty

understand how to intentionally design opportunities for students to establish and

maintain their social presence. This study employs a mixed-methods approach using

word count, content analysis, and constant-comparison analysis to examine threaded

discussions in a totally online graduate education course. The results of this study suggest

that social presence is more complicated than previously imagined and that situational

variables such as group size, instructional task, and previous relationships might

influence how social presence is established and maintained in threaded discussions in a

fully online course.

The form and content of this abstract are approved. I recommend its publication.

Approved: Joanna C. Dunlap


DEDICATION

I dedicate this thesis to the ladies of my life. First, I dedicate this to my mother. I

would not be the person I am today if it was not for her. Second, I dedicate this to my

wife, Alison, for (among other things) her unfaltering support and patience while I was

avoiding completing this thesis. I could not have completed this without her love and

support. Third, I dedicate this to my daughters, Jordan and Ashlyn. I hope they

understand one day why Daddy spent so much time on the computer. And over time I

hope they see me spend less time on the computer and more time with them. Last but not

least, I dedicate this to the two greatest dogs in the world, Beezer and Nikita. They both

supported me in their own way throughout this process over the years, and I miss them

dearly now that they are gone.



ACKNOWLEDGEMENT

I want to thank my advisor, Joanna C. Dunlap, for her guidance, support, and

patience over the years. Joni taught me how to be a scholar and has been a great

colleague and friend. I look forward to continuing our relationship for years to come. I

also want to thank Ellen Stevens for never giving up on me and always asking those

tough questions over the years. I want to thank Rodney Muth for his unending support. I

took my first EDLI course with Rod, I published my first article with Rod, and I finished

my dissertation with Rod. I would also like to thank Marcia Muth for teaching me to be a

writer when that was the last thing I thought I would ever become. And finally I would

like to thank Patti Shank for her continued professional support over the years.


TABLE OF CONTENTS

FIGURES ............................................................................................................... xi

TABLES .............................................................................................................. xiii

CHAPTER

1. INTRODUCTION ...............................................................................................1

Background ..................................................................................................3

Social Presence Theory ....................................................................3

The Evolution of Social Presence Theory .......................................5

Limitations of Previous Studies ....................................................................6

Statement of the Problem .............................................................................9

Conceptual Framework ..............................................................................10

Goal of the Study .......................................................................................14

Overview of Methods ................................................................................16

Sample ............................................................................................16

Data Analysis .................................................................................16

Reliability and Validity ..................................................................18

Significance of the Study .............................................................................18

Limitations .................................................................................................19

Chapter Summary ......................................................................................19

2. LITERATURE REVIEW ..................................................................................21


A Brief History of Social Presence Theory ...............................................21

Theoretical Foundations of Social Presence Theory .....................21

Intimacy .............................................................................22

Immediacy ..........................................................................22

Influential and Related Research on Social Presence ....................23

Competing Theories of Social Presence Theory ........................................26

Cuelessness ....................................................................................26

Media Richness ..............................................................................27

Social Information Processing .......................................................28

Defining Social Presence ...........................................................................31

Measuring Social Presence ........................................................................33

Gunawardena’s Social Presence Scale ...........................................34

Rourke et al.’s Social Presence Indicators .....................................35

Tu and The Social Presence and Privacy Questionnaire ...............37

Research on Social Presence ......................................................................40

Social Presence and Student Satisfaction ......................................40

Social Presence and Interaction .....................................................44

Social Presence and Student Learning ...........................................47

Establishing and Maintaining Social Presence ..........................................53

Some Gaps in the Literature ......................................................................57

Chapter Summary ......................................................................................60


3. METHOD .........................................................................................................61

Research Question .....................................................................................61

Research Design .........................................................................................61

Sample ........................................................................................................62

Sampling Scheme ...........................................................................62

Sampling Design ............................................................................65

Data Collection ..........................................................................................67

Data Analysis .............................................................................................67

Word Count ....................................................................................68

Content Analysis ............................................................................69

Constant Comparison Analysis ......................................................77

Reliability and Validity ..............................................................................79

Reliability .......................................................................................79

Validity ..........................................................................................80

Chapter Summary ......................................................................................81

4. RESULTS .........................................................................................................82

Word Count ................................................................................................82

Content Analysis ........................................................................................87

Stage One: Social Presence Categories and Indicators Across All

Threaded Discussions ....................................................................89


Stage Two: Social Presence Categories and Indicators by Threaded

Discussion ......................................................................................94

Stage Three: Social Presence Categories and Indicators by

Students ........................................................................................101

Constant Comparison Analysis ................................................................106

Chapter Summary ....................................................................................111

5. DISCUSSION ..................................................................................................112

Key Findings ............................................................................................112

Group Size ...................................................................................114

Instructional Task .........................................................................116

Past Relationships ........................................................................119

One Size Does Not Fit All ...........................................................120

Limitations of Studying Social Presence .................................................121

Situational Variables of CMC ......................................................122

Unit of Analysis ...........................................................................126

Problems with the Social Presence Indicators and Treating Them

Equally .........................................................................................128

Problems with Measuring the Community of Inquiry .................130

Limitations of the Study...........................................................................132

Concluding Thoughts and Implications ...................................................133


APPENDIX

A. APPENDIX A .................................................................................................136

B. APPENDIX B .................................................................................................142

C. APPENDIX C .................................................................................................146

REFERENCES ................................................................................................................150


LIST OF FIGURES

Figure

1.1 Community of Inquiry Framework ........................................................................11

1.2 Visual Depiction of Initial Conceptual Framework of Social Presence Developed

by Rourke et al., 2001a ..........................................................................................14

2.1 Communication Media and Information Richness Diagram .................................28

2.2 Timeline of Competing Theories of Social Presence Preceding the Development

of the Community of Inquiry Framework ..............................................................30

2.3 Continuum of Definitions of Social Presence ........................................................33

3.1 Steps Followed to Complete Constant Comparison Analysis of Online

Discussions ............................................................................................................78

4.1 Word Cloud of Word Count Results Without the Discussion Headings .............84

4.2 Frequency of Possible Social Presence Indicators Across the Three Major

and Most Frequented Threaded Discussions .........................................................85

4.3 Stages of Disaggregation of Content Analysis Used to Explore Use of Social

Presence Indicators in a Fully Online Asynchronous Course ................................88

4.4 A Visual Depiction of the Frequency of Each of the Three Social Presence

Categories ..............................................................................................................90

4.5 Frequency of Social Presence Indicators Across All Threaded

Discussions ............................................................................................................92

4.6 Social Presence Indicators Separated by Category ................................................93

4.7 Visual Depiction of the Average Social Presence Indicators Grouped by Category in

Closed Threaded Discussions ................................................................................97


4.8 Ranking of Social Presence Indicators Used By the Three Students with the

Highest Overall Social Presence Per Post Average .............................................104

4.9 Disaggregation of Three Students with Highest Social Presence per Post

Average ................................................................................................................106


LIST OF TABLES

Table

1.1 Categories and indicators of social presence ...............................................................12

1.2 Alignment of research questions to data analysis ........................................................18

2.1 Phases of social presence research ...............................................................................30

2.2 Example of social presence indicators ...................................................................36

2.3 Social presence dimension of the Community of Inquiry Questionnaire ..............39

2.4 Strategies to establish and develop social presence ...............................................53

2.5 Strategies to establish and maintain social presence ..............................................55

3.1 Online descriptions ................................................................................................64

3.2 Threaded discussions raw data ...............................................................................66

3.3 Overview of data analysis ......................................................................................68

3.4 Original social presence categories and example indicators ..................................70

3.5 Rourke et al.’s categories and indicators of social presence ..................................71

3.6 Evolution of the indicators of social presence .......................................................72

3.7 Swan and Hughes et al. combined list of categories and indicators of

social presence .......................................................................................................73

3.8 Coding sheet used for content analysis ..................................................................75

4.1 Top 20 words used across all threaded discussions ...............................................83

4.2 Top 20 words across project groups ......................................................................86

4.3 Top 20 words across pairs ......................................................................................86

4.4 Top 20 words across reading groups .....................................................................87

4.5 Social presence frequency across all forums .........................................................91


4.6 Social presence indicators ranking from highest to lowest frequency ...................92

4.7 Open vs. closed threaded discussions ....................................................................95

4.8 Average social presence indicators per post across open and closed threaded

discussions .............................................................................................................96

4.9 Average social presence indicators across closed threaded discussions ................97

4.10 Ranking of average social presence indicators across closed threaded discussions ......98

4.11 Average social presence indicator per threaded discussion .................................100

4.12 Student’s use of social presence categories .........................................................102

4.13 Groups of codes resulting from the constant comparison analysis of reading

Group E ................................................................................................................108

4.14 Groups of codes resulting from the constant comparison analysis of

Pair 9 ....................................................................................................................110

5.1 Teaching presence categories and indicators .......................................................113

5.2 Instructor vs. student postings in small discussions.............................................117

5.3 Measuring social presence in a Community of Inquiry .......................................131


CHAPTER 1

INTRODUCTION

I can remember when I started teaching online. I was a full believer in online

education. I had been teaching face-to-face courses and even taken a few courses online

myself. I was excited to teach online. At the same time, I was scared. I was scared that

somehow my personality, my classroom presence, my empathy, my ability to connect

with my students—all things that I attributed to my success teaching face-to-face—would

not translate to an online environment.

I regularly meet faculty now who have similar fears. They fear that what they do

in the classroom cannot translate to an online environment. Fears like these, though, are

not restricted to faculty. I meet people all the time who make claims like, “I just can’t

learn that way” or “I need to talk to people face-to-face” or “online learning is just not for

me.” For some time, people have had the choice to avoid learning online if it was not

their preferred way to learn. But the growth of online education (see Allen & Seaman,

2006, 2010), legislative trends that require students to learn online (Walters, 2011;

Watson, 2006), and the blurring of boundaries between fully online and traditional face-

to-face courses (Woo, McNeill, Preston, Green, & Phillips, 2008), suggest that in the near

future faculty and students will no longer have the choice to avoid online education.

Based on my research and experience, I contend that one’s success learning

online—specifically in formal online education settings—begins and ends with one’s

ability to communicate effectively online. In my experience, students who struggle

communicating online (whether within a Learning Management System or using email)

struggle learning online in formal online educational settings. Communicating online is


simply different from communicating face-to-face (Suler, 2004). I am interested in these

differences and how people—specifically faculty and students—take advantage of these

differences in formal education settings. In other words, I am interested in how faculty

and students leverage the strengths and minimize the limitations of a computer-mediated

communication (CMC) medium when teaching and learning online.

A supposed limitation of CMC and online education in general is that it is

difficult to establish one’s presence as a “real” person and “connect” with others—

generally called social presence (Kear, 2010). One reason people struggle learning online,

I posit, is related to this concept of social presence or the lack thereof. For instance,

isolation and loneliness—which are in part due to a lack of presence—are often cited as

reasons why students do not persist online (Ali & Leeds, 2010; Ludwig-Hardman &

Dunlap, 2003).

I have set forth to investigate the big question of how people establish their

presence online by examining how people present themselves as real people in formal

online education environments (which predominantly rely on asynchronous CMC).

Ultimately, my hope is that my research can help others learn how to establish their social

presence in formal online education environments. In the following pages of this chapter,

I provide a formal rationale for and overview of this study by beginning with some

background literature on social presence, addressing limitations of previous research,

presenting my conceptual framework, and finally providing an overview of the

methodology used for this study.


Background

In the late 1980s and early 1990s, researchers began to study the effects of

computer-mediated communication (CMC) (Daft & Lengel, 1984, 1986; Rutter, 1984,

1987; Walther, 1996). Some concluded that CMC was inherently antisocial and

impersonal (Walther, 1996; Walther, Anderson, & Park, 1994). While Hiltz and Turoff

(1993), two early key researchers of CMC, acknowledged that interpersonal relationships

might be fostered through CMC, early research suggested—and convinced others—that

CMC was better at task-oriented communication than interpersonal communication

(Walther & Parks, 2002). To make sense of findings like these, CMC researchers turned

to theories like Cuelessness Theory (Rutter, 1984, 1987), Media Richness Theory (Daft

& Lengel, 1984, 1986; Daft, Lengel, & Trevino, 1987), Social Information Processing

Theory (Walther, 1996; Walther & Parks, 2002) and Social Presence Theory (Short,

Williams, & Christie, 1976). Over time, social presence theory appealed to more

researchers of online learning (as is evidenced in the growing body of research on social

presence and online learning). And today, social presence theory is the most often

referenced theory explaining the social nature of CMC in online educational

environments (Lowenthal, 2010).

Social Presence Theory

Short, Williams, and Christie (1976) originally developed the theory of social

presence to explain the effect telecommunications media have on communication. They

defined social presence as the degree of salience (i.e., quality or state of being there)

between two communicators using a communication medium. They posited that

communication media differ in their degree of social presence and that these differences


play an important role in how people interact. They conceptualized social presence

primarily as a quality of a communication medium that can determine the way that people

interact and communicate. From their perspective, people perceive some media as having

a higher degree of social presence (e.g., video) and other media as having a lower degree

of social presence (e.g., audio), and still other media as having an even lower degree of social

presence (e.g., text). More importantly, Short et al. believed that a medium with a high

degree of social presence is seen as being sociable, warm, and personal, whereas a

medium with a low degree of social presence is seen as less personal. While people might

want a less intimate or immediate communication medium from time to time (see

Williams, 1975), formal education is a very social process that involves high

interpersonal involvement. Past research, for example, has specifically stressed the

importance of contact and cooperation between faculty and students (Chickering &

Gamson, 1987). Thus, early on social presence theory appeared to have direct

implications for educators in online environments.

In the late eighties and early nineties, relying on this theory, researchers began

concluding that CMC was inherently impersonal because the nonverbal and relational

cues (common in face-to-face communication) are filtered out of CMC (Walther & Parks,

2002). Later though in the mid-nineties, researchers began to notice, even though CMC

lacks nonverbal and relational cues, that it can still be very social and interpersonal

(Gunawardena, 1995; Gunawardena & Zittle, 1997) and at times even hyperpersonal

(Walther, 1996). Further, as researchers (Gunawardena, 1995; Tu, 2000) began

examining the sociability of online education, they started questioning the degree to

which the attributes of a communication medium—in this case the cues filtered out of


CMC systems—determine how people socially interact (Danchak, Walther, & Swan,

2001; Gunawardena, 1995; Gunawardena & Zittle, 1997; Richardson & Swan, 2003; Tu,

2000).

The Evolution of Social Presence Theory

Researchers of online learning (e.g., Gunawardena, 1995; Gunawardena & Zittle,

1997; Tu, 2000) began questioning the theory of social presence developed by Short et al.

(1976). These researchers argued, based on their experience and research, that

participants in online asynchronous discussions, using text alone, are able to project their

personalities into online discussions and create social presence. They found that online

learners are able to present themselves as being “real” as well as “connect” with others

when communicating in online learning environments by doing such things as using

emoticons, telling stories, and even using humor (Rourke et al., 2001a; Swan, 2003).

Thus, a user’s personal perceptions of social presence—which are influenced over time

and with experience using a communication medium—and the behaviors one learns to

use to make up for the cues that are filtered out matter just as much, if not more, than a

medium’s supposed capabilities. This new line of research sparked a renewed interest in

the sociability of online learning, social presence, and CMC as evidenced in the increased

amount of literature focused on social presence.

Given the research stream, social presence is now a central concept in online

learning. For instance, social presence has been listed as a key component in theoretical

frameworks for distance education (Akyol & Garrison, 2009; Benbunan-Fich, Hiltz, &

Harasim, 2005; Vrasidas & Glass, 2002). Researchers have shown—to varying degrees—

a relationship between social presence and student satisfaction (Gunawardena, 1995;


Gunawardena & Zittle, 1997; Hostetter & Busch, 2006; Richardson & Swan, 2003; So &

Brush, 2008), social presence and the development of a community of learners (Rourke,

Anderson, Garrison, & Archer, 2001a; Rovai, 2002; Ryman, Hardham, Richardson, &

Ross, 2009), and social presence and perceived learning (Caspi & Blau, 2008;

Richardson & Swan, 2003). Just as earlier researchers of CMC (Kiesler, 1986; Kiesler,

Siegel, & McGuire, 1984) used social presence theory to explain why CMC was inherently

impersonal, later researchers (Gunawardena, 1995; Tu, 2000) reconceptualized social

presence theory—focusing less on the medium and more on how people adapted to the

medium—to explain how CMC in online learning environments can be very personal and

social.

Limitations of Previous Studies

Despite the intuitive appeal and overall popularity of social presence theory,

research on social presence still suffers from a few problems. Early studies of social

presence and CMC had contradictory findings (see Walther et al., 1994). For instance,

studies conducted in laboratory settings tend to support cues-filtered-out perspectives that

suggested that CMC was inherently anti-social (Connolly, Jessup, & Valacich, 1990;

Hiemstra, 1982), whereas studies conducted in the field often did not (Walther, 1992;

Walther et al., 1994; Weedman, 1991). Walther et al. (1994) explain that contradictory

findings like these are likely due to the abbreviated time periods and unrealistic

experimental settings researchers used to study CMC.

In much the same way, later research on the sociability of online learning, social

presence, and CMC suffers from a number of limitations. First, researchers of social

presence cannot agree upon a single definition of social presence (Biocca & Harms,


2002; Biocca, Harms, & Burgoon, 2003; Rettie, 2003; Lane, 2011; Tu, 2002b). Instead,

researchers continue to redefine social presence (Lowenthal, 2010; Picciano, 2002).

Second, the majority of research conducted on social presence has various

conceptual or methodological limitations. For example, Gunawardena (1995;

Gunawardena & Zittle, 1997), one of the foundational and most often cited researchers

on social presence, primarily investigated learners’ feelings toward CMC as a medium of

communication (e.g., asking students the degree to the which they agree to statements

like “CMC is an excellent medium for social interaction”) rather than specifically asking

about how people adapted the medium for social purposes. Other researchers studied

social presence in hybrid courses (e.g., Hughes et al., 2007; Shea & Bidjerano, 2010; So

& Brush, 2008), online courses that had face-to-face meetings at the beginning of the

course (e.g., Tu, 2001; Wise et al., 2004), or non-traditional learning environments (e.g.,

6-week, self-paced, faculty-directed courses consisting of a single student) (e.g., Wise,

Chang, Duffy, & Del Valle, 2004). Each of these contexts would inevitably influence

how one establishes his or her own social presence as well as how one perceived the

social presence of others, but researchers (e.g., Richardson & Swan, 2003; Swan & Shih,

2005) have not explicitly acknowledged how these differences influence social presence.

In addition, most researchers studying social presence (e.g., Arbaugh &

Benbunan-Fich, 2006; Garrison, Cleveland-Innes, & Fung, 2010; Gunawardena, 1995; Tu,

2002a; Richardson & Swan, 2003) have used similar data-analysis techniques. The

majority of research has relied either on content analysis or on self-report data (obtained

through a questionnaire). Relying solely on one type of analysis can lead researchers to

make interpretive errors about the underlying phenomenon they are studying (Leech &


Onwuegbuzie, 2007). Studies of social presence might benefit from employing multiple

or mixed methods (see Lowenthal & Leech, 2009).

Third, foundational research on social presence is dated (Gunawardena, 1995;

Gunawardena & Zittle, 1997; Rourke et al., 2001a; Tu, 2001, 2002a, 2002b). The

majority of the foundational research on social presence is over five to ten years old, and

during the past five years alone CMC and online learning have grown exponentially.

CMC is no longer a fringe activity used by a select group of users (Smith, 2010); rather,

CMC, issues of the digital divide aside, is commonplace. As people use the Internet and

email to communicate with others more each day, it is logical to assume that they become

more adept at communicating and more literate with this medium. This is not simply a

case of supposed “digital natives” (i.e., those who have grown up with technology) using

CMC differently than “digital immigrants” (i.e., those who are new to technology)

(Brown, 2002; Prensky, 2001). Rather, it is an issue of how people learn to use any

communication medium better over time: The cell phone is a perfect example with

millions of users worldwide, from the slums of India to the penthouses of New York

City—nearly everybody seems to have a cell phone these days.

The increased amount of time spent online has led online users of all ages and all

generations to adjust their perceptions, expectations, and day-to-day use of CMC. Just as

research in the early 1990s (e.g., Gunawardena, 1995; Walther, 1992, 1994, 1996) began

to call into question CMC research in the 1980s (i.e., Kiesler, 1986; Kiesler, Siegel,

& McGuire, 1984; Rutter, 1984, 1987), additional research on social presence might begin

to question research conducted in the late 1990s and early 2000s (e.g., Gunawardena &

Zittle, 1997; Rourke et al., 2001a; Tu, 2000). Researchers need to continue to study social


presence, and at times even replicate previous studies (unfortunately rarely done), in

order to ensure that current assumptions about social presence are still correct across

various contexts.

Finally, and most important, some research on social presence contradicts other

research (see Lowenthal, 2010). For instance, some researchers have found that social-

presence behaviors used by online learners decrease over time (Rourke, Anderson,

Garrison, & Archer, 2001a), while others have found that social presence behaviors do

not decrease over time (Stacey, 2002). In addition, Picciano (2002) found a relationship

between social presence and student learning, while Wise et al. (2004) did not. For all of

these reasons, additional research on social presence in online learning environments is

needed—and especially in asynchronous learning environments, the dominant form of

online education (National Center for Education Statistics, 2008)—to help clarify what

social presence is and its role in online learning.

Statement of the Problem

Despite the continued interest in social presence and CMC, many questions

remain about the nature and development of social presence (Lowenthal & Dunlap, 2011;

Swan & Shih, 2005; Rourke & Kanuka, 2009). In addition, some of what researchers and

practitioners think they do know is questionable due to the limitations of past research.

The majority of research on social presence (e.g., Gunawardena, 1995; Na Ubon &

Kimble, 2003; Picciano, 2002; Richardson & Swan, 2003; Rourke & Anderson, 2002b;

Russo & Campbell, 2004; Tu, 2002b; Wheeler, 2005; So & Brush, 2008) has focused on

faculty and students' perceptions of social presence. Fewer studies by comparison (e.g.,

Hughes, Ventura, & Dando, 2007; Lomicka & Lord, 2007; Rourke et al., 2001a; Swan,


2002, 2003a) have actually studied observable indicators of social presence in online

discussions.

While it is important to understand perceptions of social presence, it is also

important to study what students do and say online (Kramer, Oh, & Fussell, 2006).

However, not enough studies do just this, and the few studies that have done so have

failed to describe adequately how social presence manifests itself in asynchronous online

courses. Researchers (e.g., Hughes et al., 2007; Rourke et al., 2001a) have typically

sampled only one part of a course and analyzed it with only one type of analysis,

typically content analysis. As a result, I posit that both researchers and practitioners may

have a very limited understanding of social presence.

Given these reasons, I set forth to conduct a mixed methods exploratory study of

social presence. I chose to do this in hopes of learning more about the observable

indicators of social presence in online course discussions.

Conceptual Framework

Many researchers (Arbaugh, 2007; Delfino & Manca, 2007; Lomicka & Lord,

2007; Nippard & Murphy, 2007; Rourke & Anderson, 2002a, 2002b; Swan et al., 2008)

have argued for some time that the community of inquiry (CoI) framework is the most

popular framework to study social presence. The CoI framework is a comprehensive

guide (Garrison, Anderson, & Archer, 2000) for research on the practice of online

learning (Garrison & Arbaugh, 2007). Garrison et al. (2000) argued that meaningful

learning takes place in a CoI, comprised of teachers and students, through the interaction

of three core elements: cognitive presence, social presence, and teaching presence (see

Figure 1.1).


Cognitive presence, the first element in the model, is “the extent to which the

participants in . . . a community of inquiry are able to construct meaning through

sustained communication” (Garrison et al., 2000, p. 89). Social presence, the second

element in the model, is the “ability of participants in a community of inquiry to project

their personal characteristics into the community, thereby presenting themselves to other

participants as ‘real people’” (p. 89). Finally, teaching presence, the third element in the

model, is the ability of a teacher or teachers to support and enhance social and cognitive

presence through instructional management, building understanding, and direct

instruction.

Figure 1.1. Community of inquiry framework. [Figure: the three presences (cognitive, social, and teaching) overlapping to form the educational experience.]

Garrison et al. (2000) initially developed three categories of social presence (i.e.,

Emotional Expression, Open Communication, and Group Cohesion). They later

developed specific indicators of social presence (e.g., use of humor, continuing a thread,

or the use of vocatives) (Rourke et al., 2001a) to help identify observable instances of

social presence in CMC (see Table 1.1). They later renamed these categories (e.g.,


Emotional Expression was renamed Affective Responses) and tested the validity of the

categories and indicators of social presence (Rourke et al., 2001a). Swan (2003)

expanded the indicators even further, and then Hughes et al. (2007) later (though

apparently unaware of Swan’s work) made some changes to Rourke et al.’s indicators as

well. Despite the renaming of the categories and some minor changes to the social

presence indicators (which are discussed in more detail in Chapters 2 and 3), Garrison et

al.’s (2000) original categories and the later complete list of indicators (Rourke et al.,

2001a) of social presence have—for the most part—remained unchanged (see Table 1.1).

Table 1.1 Categories and Indicators of Social Presence

Category: Affective Responses (originally "Emotional Expression")
  Expression of emotions: conventional or unconventional expressions of emotion, including repetitious punctuation, conspicuous capitalization, and emoticons
  Use of humor: teasing, cajoling, irony, understatements, sarcasm
  Self-disclosure: presents details of life outside of class, or expresses vulnerability

Category: Interactive Responses (originally "Open Communication")
  Continuing a thread: using the reply feature of the software, rather than starting a new thread
  Quoting from other messages: using software features to quote others' entire messages or cutting and pasting sections of others' messages
  Referring explicitly to other messages: direct references to the contents of others' posts
  Asking questions: students ask questions of other students or the moderator
  Complimenting, expressing appreciation: complimenting others or the contents of others' messages
  Expressing agreement: expressing agreement with others or the content of others' messages

Category: Cohesive Responses (originally "Group Cohesion")
  Vocatives: addressing or referring to participants by name
  Addresses or refers to the group using inclusive pronouns: addresses the group as we, us, our, group
  Phatics / salutations: communication that serves a purely social function; greetings, closures

Note. From "Assessing Social Presence in Asynchronous Text-based Computer Conferencing," by L. Rourke, T. Anderson, D. R. Garrison, and W. Archer, 2001a, Journal of Distance Education, 14.

Garrison, though, pointed out in 2008 that these indicators have not been revisited since

their initial development and that they might need to be revised (Arbaugh et al., 2008)—

which in many ways is a possible outcome of this study.

Rourke et al. (2001a) were the first to test and validate the indicators of social

presence. However, Garrison et al. (2000) and later Rourke et al. (2001a) did not clearly

identify the relationship between the indicators of social presence. In other words, they

left researchers wondering whether certain categories or indicators of social presence are

better examples than others. When faced with the need to calculate a social presence

score—from the frequency of indicators found in the coded transcripts of CMC—they

decided to treat all indicators equally and simply sum the frequencies of all 12 indicators

(Rourke et al., 2001a). This appeared to have been more of a pragmatic decision rather

than a theoretical or empirical decision to find a way to create a social presence score

from the indicators in order to quantify and compare transcripts of CMC. Rourke et al.

(2001a), though, openly admitted their uncertainty about weighting all 12 indicators equally.

Despite this admitted uncertainty, researchers have followed the same process in


developing a social presence score, though Hughes et al. (2007) was openly critical of

this practice.

Following the work of researchers like Rourke et al. (2001a), I conceptualize

social presence as an additive process in which all categories and indicators of social

presence are of equal importance (see Figure 1.2). However, like Hughes et al. (2007), I

am skeptical of this conceptualization and hope that among other things my research will

(by using multiple forms of analysis) help support or challenge the assumed additive

nature of Rourke et al.’s conceptualization of social presence.
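
To make this additive conceptualization concrete, the sketch below (my own illustration, not part of the study's coding procedure) computes a social presence score by summing the frequencies of all 12 indicators with equal weight, as Rourke et al. (2001a) did. The indicator labels are shorthand I introduce here for readability.

```python
from collections import Counter

# Shorthand labels for the 12 indicators in Table 1.1, grouped by category.
INDICATORS = {
    "affective": ["expression_of_emotions", "use_of_humor", "self_disclosure"],
    "interactive": ["continuing_a_thread", "quoting_from_messages",
                    "referring_to_messages", "asking_questions",
                    "complimenting", "expressing_agreement"],
    "cohesive": ["vocatives", "inclusive_pronouns", "phatics_salutations"],
}

def social_presence_score(coded_posts):
    """Sum the frequencies of all 12 indicators, weighting each equally.

    coded_posts is assumed to be a list of dicts, one per post, mapping an
    indicator label to how many times a coder marked it in that post.
    """
    totals = Counter()
    for post in coded_posts:
        totals.update(post)
    return sum(totals[label] for labels in INDICATORS.values() for label in labels)

# Hypothetical coded posts (illustrative data only, not from the actual course).
posts = [
    {"vocatives": 1, "asking_questions": 2, "use_of_humor": 1},
    {"expressing_agreement": 1, "phatics_salutations": 2},
]
print(social_presence_score(posts))  # 7
```

Weighting every indicator equally is exactly the assumption Hughes et al. (2007) criticized; a weighted variant would only require multiplying each count by a per-indicator weight before summing.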

Figure 1.2. Visual depiction of initial conceptual framework of social presence developed by Rourke et al. (2001a). [Figure: Affective Responses + Interactive Responses + Cohesive Responses = Social Presence, with the Table 1.1 indicators listed under each category.]

Goal of the Study

The goal of this study is to understand better how social presence manifests in

threaded discussions in asynchronous online courses. However, not all CMC is the same

(Herring, 2007). While researchers can generalize about CMC at some level, they should


recognize the situated and changing nature of social presence. Given this and to

accomplish the goal of this study, I study social presence in an intentional, socially

situated, specific context. Thus, the goal of this study is to explore the phenomenon

known as social presence by investigating how it manifests during online discourse in an

asynchronous online graduate education course.

The following research question guides this exploratory study: How does social

presence manifest in an asynchronous, online graduate-education course? This specific

question was chosen because the majority of research on social presence has either relied

solely on self-report data of faculty and student perceptions of social presence or has been

confined to a monomethod approach—usually using content analysis—to analyze a few

weeks of online threaded discussions. Both of these approaches fail to explore and

describe how social presence manifests in threaded discussions over the length of a

course. In other words, what are faculty and students actually doing to establish their

social presence? The focus of this study, given this research question, is on developing a

rich description of social presence by using multiple types of data analysis in order to

help faculty and students have better experiences in online courses and to enable course

designers to develop better online courses.

Overview of Methods

In the following paragraphs, I briefly describe the methods used for this study. I

specifically focus on the sample, data analysis, reliability, and validity. Each of these

topics is addressed in greater detail in Chapter 3.


Sample

A single, completely online graduate course in education was purposefully and

conveniently sampled for this study. Thus, a non-random (non-probability) criterion

sampling scheme was used in this study (Onwuegbuzie & Collins, 2007). A section of

EDLI 7210 Educational Policy Making in a Democratic Society—which was taught

online in the spring of 2007—was identified as an appropriate sample for this study. The

course was a graduate-level online course in the School of Education and Human

Development at the University of Colorado Denver delivered via eCollege. All of the

threaded discussions in the eCollege course shell for this course were used for this study.

The population of the course primarily consisted of graduate students completing

coursework for an Educational Specialist (EdS) degree or a PhD. Many of the EdS

students were also seeking their principal license. Nineteen graduate students were

enrolled in the course.

Data Analysis

The majority of research on social presence has relied primarily on self-report

survey data (e.g., Gunawardena, 1995; Richardson & Swan, 2003). While self-report

survey measures are useful and have their place in educational research, as Kramer, Oh,

and Fussell (2006) point out, they “are retroactive and insensitive to changes in presence

over the course of an interaction [or semester]” (p. 1). In this study, rather than focus on

students’ perceptions of presence (which I have done in other studies such as Lowenthal

& Dunlap, 2011; Lowenthal, Lowenthal, & White, 2009), I focused instead on what was

“said” in the online threaded discussions.


I used a mixed-methods exploratory methodology (Miles & Huberman, 1994;

Onwuegbuzie & Leech, 2005b) that employed both quantitative and qualitative methods

to conduct this study. In order to explore social presence in a specific situated

asynchronous learning environment in great detail, I analyzed the online threaded

discussions (now archived in the discussion forums) using word count, content analysis,

and constant comparison analysis (Leech & Onwuegbuzie, 2007).

More specifically, multiple forms of data analysis were used to address the

research question—How does social presence manifest in a graduate education

asynchronous online course? (see Table 1.2 for an illustration). First, I

analyzed all of the discussions with word count (in conjunction with basic descriptive

statistics of each forum) to identify which threaded discussion had a higher frequency of

words and posts as well as which ones had a higher number of social presence indicators

(types of words). Second, I used content analysis to analyze every threaded discussion,

using a modified version of the social presence indicators developed by Garrison et al.

(2000) and later modified by Swan (2003) and Hughes et al. (2007). Based on the results

of the word count and content analysis, I then selected two discussion threads—one with

a high number of social presence indicators and one with a low number of social presence

indicators—to analyze in more depth with a grounded theory constant comparison

analysis technique.
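
As a rough sketch of the first analysis step only (word counts and simple descriptive statistics per forum), and assuming the archived discussions can be exported as plain-text posts grouped by forum, something like the following could be used; the forum names and posts shown are invented for illustration.

```python
import re
from collections import Counter

def forum_statistics(forums):
    """For each threaded discussion, count posts and words and list the
    most frequent words (a crude stand-in for the word-count analysis)."""
    stats = {}
    for name, posts in forums.items():
        words = [w.lower() for post in posts
                 for w in re.findall(r"[A-Za-z']+", post)]
        stats[name] = {
            "posts": len(posts),
            "words": len(words),
            "top_words": Counter(words).most_common(20),
        }
    return stats

# Invented example data (not from the actual course).
example = {
    "Reading Group E": ["Great point!", "I agree with your reading of the policy."],
    "Pair 9": ["Let's divide up the sections.", "Sounds good to me."],
}
for forum, s in forum_statistics(example).items():
    print(forum, "-", s["posts"], "posts,", s["words"], "words")
```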


Table 1.2 Alignment of Research Questions to Data Analysis

Research Question: How does social presence manifest in a graduate education asynchronous online course?

  Data Analysis                                   Type of Data
  Word Count (Quantitative)                       All course discussions
  Content Analysis (Quantitative)                 All course discussions
  Constant Comparative Analysis (Qualitative)     One discussion thread with high social presence and one with low social presence

Reliability and Validity

Reliability and validity are key considerations for any researcher. The most

common method used to calculate interrater reliability is a percent agreement statistic

(Rourke et al., 2001b). Two researchers (me and another researcher) coded the threaded

discussions using content analysis. A percent agreement statistic was calculated using

Holsti’s (1969) coefficient of reliability. A large component of establishing validity—

which is often described as trustworthiness in qualitative literature—is developing a

sound theoretical framework (Garrison, Cleveland-Innes, Koole, & Kappelman, 2006). I

have established the validity of this study by working from Garrison et al.’s CoI

framework. Further, the coding schemes I used for this study also came directly from the

literature (Hughes et al., 2007; Rourke et al., 2001a; Swan, 2003).
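
For reference, the percent agreement statistic mentioned above is, in Holsti's (1969) formulation, CR = 2M / (N1 + N2), where M is the number of coding decisions on which the two coders agree and N1 and N2 are the total coding decisions made by each coder. A minimal sketch with hypothetical numbers (not the study's actual counts):

```python
def holsti_coefficient(agreements, decisions_coder1, decisions_coder2):
    """Holsti's (1969) coefficient of reliability: CR = 2M / (N1 + N2)."""
    return (2 * agreements) / (decisions_coder1 + decisions_coder2)

# Hypothetical example: each coder made 120 coding decisions and they agreed on 102.
print(holsti_coefficient(102, 120, 120))  # 0.85
```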

Significance of the Study

Learning is a very human and social activity (Dunlap & Lowenthal, 2009b).

Online learning environments, though, can feel isolating and impersonal. Given this,

educators must find ways to make formal online learning environments more personal


and less isolating not only to help students persist but also to increase engagement and

satisfaction. To accomplish this, educators have focused on establishing social presence

in online courses (Dunlap & Lowenthal, 2009b).

The significance or educational value of this research lies in its ability to help

researchers better identify and study instances of social presence as well as to help faculty

who teach online better understand how they can identify and establish social presence by

using specific indicators of social presence. Further, the results of this study can help

instructional designers design and develop online courses that utilize specific

instructional approaches to help students establish their social presence online.

Limitations

All studies suffer from some type of limitation. Perhaps the most obvious

limitation is the time that has passed between when the course was offered and when I

analyzed the data. Related to this limitation is my inability to check with students

(whether through specific interviews or member checking) to verify whether or not what

I found in the course discussions is actually what they intended. However, one of the

main reasons to focus on the language students use is because students rarely clarify what

they mean by a posting; rather, other students simply do their best to make sense of what

they read. In other words, in my experience very little member checking occurs in a

typical online discussion, so this limitation might actually end up being a very realistic

component of this study.

Chapter Summary

Researchers have been studying social presence in online learning environments

for a number of years now (Lowenthal, 2009). However, research on social presence to


date suffers from a host of problems—ranging from inconsistent and contradictory

findings to strange sampling decisions. Part of the problem might be the methodological

decisions made by researchers. Instead of using a monomethod approach like the

majority of past research, I employed a mixed-methods approach to studying social

presence, utilizing both quantitative and qualitative methods to investigate the complex

nature of social presence. In addition, this study specifically focused on how social

presence manifests during threaded discussions in asynchronous online courses.

In Chapter 2, I present a review of the literature. In Chapter 3, I go over the

methods used for this study. In Chapters 4 and 5 I present the results, discuss the

findings, and provide recommendations for faculty and instructional designers as well as

for future research on social presence.


CHAPTER 2

LITERATURE REVIEW

In the following chapter, I synthesize past research on social presence in general

and specifically research on the community of inquiry (CoI) framework to provide a

foundation and some background for my study. I begin by addressing the history of

social presence theory. After that, I address some early competing theories of social

presence and some differences in how researchers define and measure social presence. I

then conclude this chapter by synthesizing some of the research conducted on the

community of inquiry in general and social presence in particular and addressing some

gaps in the literature.

A Brief History of Social Presence Theory

As mentioned in Chapter 1, Short, Williams, and Christie (1976) developed the

initial theory of social presence in their book, The Social Psychology of

Telecommunications. While this book often serves as the foundational text to understand

the initial theory of social presence, it is important to look at the foundations of this

theory as well as later research conducted by Short et al. to understand how the theory of

social presence has evolved over the years.

Theoretical Foundations of Social Presence Theory

The collective work of Short et al. (1976) presented in The Social Psychology of Telecommunications, as well as the work Short, Williams, and Christie conducted individually or with other colleagues before and after their seminal text (e.g., Short, 1974; Christie & Kingan, 1977; Williams, 1975; Wilson & Williams, 1977), was influenced by the social psychology concepts of intimacy and immediacy. Short et al.

openly acknowledge that their concept of social presence is related to these two concepts.

Thus, each of these concepts is discussed in more detail in the following paragraphs.

Intimacy. Argyle and Dean (1965) were the first to use the concept of intimacy to

explain communication behavior. They developed a theory of intimacy and equilibrium

to explain how people communicating with each other will adjust their behavior to

maintain a sense of equilibrium. They explain that

aspects of intimacy are governed by both approach and avoidance forces, and are

kept in a condition of equilibrium for any two people…if this equilibrium is

disturbed along one of its constituent dimensions, e.g., by increasing physical

proximity, there will be compensatory changes along the other dimensions. (p.

304)

According to Argyle (1969), people establish intimacy in a number of ways when

communicating, such as proximity, eye contact, smiling, and personal topics of

conversation. Short et al. (1976) argue that the social presence of a communication

medium also affects intimacy and therefore should be added to this list of ways that

people establish intimacy.

Immediacy. Wiener and Mehrabian (1968) developed the concept of immediacy.

They conceptualized immediacy as the psychological distance people put between

themselves and others when communicating. While Wiener and Mehrabian (1968) were

initially focused on speech communication, Mehrabian (1972) later distinguished

between three types of immediacy: verbal, nonverbal, and technological immediacy.

Verbal immediacy describes how people use their choice of words to reduce or increase

psychological distance between them and others. For example, the use of the words “let

us” or “we” can create more immediacy between two people than simply using “you” or

“I.”

People also convey immediacy nonverbally through their dress, facial

expressions, or physical proximity (Mehrabian, 1972). Finally, technological immediacy

suggests that a medium of communication can convey immediacy. According to

Mehrabian (1972), communicating face-to-face is more immediate than communicating

with video; further, communicating with a video is more immediate than communicating

by phone.

While immediacy in general, and technological immediacy in particular, is similar

to social presence, Short et al. (1976) argue that important differences exist. For instance,

Short et al. argue that “for any given medium of communication (e.g., telephone) and

situation (e.g., long-distance call), immediacy may vary even when social presence does

not” (p. 73).  

While Short et al. (1976) claim that important differences are found between

immediacy and social presence, the distinction is not very clear. Further, they spend only

a few paragraphs addressing the similarities and differences between social presence,

intimacy, and immediacy. Not surprisingly, subsequent researchers often fail to

differentiate clearly between intimacy, immediacy and social presence; in fact,

researchers often appear to use the terms immediacy and social presence synonymously

(e.g., Gunawardena, 1995).

Influential and Related Research on Social Presence

Short et al. (1976) were all part of the Communications Studies Group at

University College London. The Communications Studies Group consisted of an

estimated 30 people who conducted a number of experiments in the early 1970s on

communication media (Pye & Williams, 1978). Interestingly, The Social Psychology of

Telecommunications appears to be the only joint publication by these three researchers.

However, each of them published, as individuals or with other colleagues, a number of

other studies on the effects of communication media (e.g., Short, 1974; Christie &

Holloway, 1975; Christie & Kingan, 1977; Williams, 1975; Williams, 1977; Wilson &

Williams, 1977). The majority of this research focused on comparing people’s attitudes

toward different communication media (e.g., face-to-face, audio, video). The following

paragraphs briefly summarize a few key findings from this early research that later

influenced the development of social presence theory and people's understanding of it.

The majority of this early research focused on the assumed importance of the

visual channel of communication. Given the importance placed on the visual channel in

previous literature, Short et al. (1976) and colleagues not surprisingly found that the

visual channel of communication was an important advantage of a communication medium (Christie, 1974; Short, 1974; Williams, 1975).

reports from one study that

visual media were judged more useful for complex group discussions, private

conversations and non-private dyadic conversations. Thus, the presence of visual

channel appears to be perceived as an important advantage of a communications

medium. (p. 367)

Additional research (Christie, 1974; Christie & Kingan, 1977; Williams, 1975),

though, began to show that the importance of a communication medium depended largely

on the task at hand. In fact, according to Christie (1974), “it is clearly misleading to

conceptualize different media as lying along a single dimension of acceptability or

usefulness. Their perceived usefulness varies according to the application considered” (p.

368). Williams (1975) argued that people might want a less intimate or immediate

communication medium for certain tasks. For instance, Williams (1975) suggests “that

with tasks of very high intimacy—perhaps very embarrassing, personal or conflictual

ones—the least immediate medium, the telephone, would lead to more favorable

evaluations than either of the more immediate media” (p. 128). Further, their research

showed that tasks that are low on interpersonal involvement but still cooperative in nature

can easily be accomplished by audio or video conferencing (Williams, 1978a); however,

tasks that require more interpersonal involvement “are sensitive to the substitution of

telecommunications for face-to-face interaction” (p. 127).

Other than the suggestions made by Williams (1978a), very little was written in

these early articles about the role of the visual channel for instructional tasks. However,

Williams (1978a) argued that “tele-education seems especially promising since

educational activities are primarily for cooperative problem-solving and the transmission

of information—activities which have been shown to be almost unaffected by the

medium of communication used” (p. 129). Williams (1978a) went on to point out that our

knowledge about the role of mediated communication is far from complete—as was our

understanding of how people learned in the late 1970s.

Later research conducted by Christie and Kingan (1977) showed, among other

things, that while visual cues are helpful, they are not necessary for people to

communicate effectively. In fact, physical presence (i.e., being close to someone

physically) may be even more important for two people communicating than visual cues

(i.e., seeing another person) (Williams, 1978b). Results like these began to call for a

more complex explanation for the role of visual cues in the communication process.

Williams (1978b) suggested that the answers might be found in the theory of social

presence.

Competing Theories of Social Presence

The theory of social presence developed by Short et al. was only one of a number

of theories used to explain the influence a communication medium can have on

communication. The three most popular competing theories of social presence—

especially during the 1980s—were Cuelessness Theory developed by Rutter (1984,

1987), Media Richness Theory developed by Daft and Lengel (1984, 1986; Daft, Lengel,

& Trevino, 1987), and Social Information Processing Theory developed by Walther

(1996; Walther & Parks, 2002). The first two theories (like Social Presence Theory) have

been described as deficit models because they focus on the cues that are filtered out and

idealize face-to-face communication as the gold standard (Thurlow, Lengel, & Tomic,

2004), whereas the third theory focuses not only on what is filtered out but what is gained

through CMC. Each of these theories are addressed briefly in the following sections to

illustrate the zeitgeist of the 1980s and early 1990s when researchers of online learning

reinvented the theory of social presence developed by Short et al.

Cuelessness

Working from a similar theoretical framework, Rutter (1984, 1987; Rutter,

Pennington, Dewey, & Swain, 1984; Kemp & Rutter, 1986) developed what he called the

Cuelessness Model. Rutter was concerned with the overemphasis placed on the importance of eye contact when two people were communicating. As a result, he and his colleagues (1984) set out to challenge the intimacy model developed by Argyle and

Dean (1965) and later Argyle and Cook (1976). Rutter and his colleagues argued that

previous research had focused too much on looking and eye-gaze and not enough on the

mutual gazing back and forth. Like Williams before, Rutter et al. (1986) found that what

mattered was visual access to the entire person rather than simply access to another’s

eyes. They argued that it was the combined social cues—from vision and other senses—

that mattered.

The Cuelessness Model essentially claims that the fewer social cues, the greater

the psychological distance between two communicators (Rutter et al., 1986). Further, the

greater the psychological distance, the more communication turns to task-oriented

depersonalized content (Kemp & Rutter, 1986; Rutter, 1984; Rutter et al., 1986). In fact,

Rutter and colleagues (Rutter, 1989) found that the number of social cues (i.e., both

visual and physical presence cues) decreased when comparing how people communicated

in certain situations (e.g., closed-circuit television, curtain, and audio).

Media Richness

Another competing theory that emerged during the 1980s is the theory of Media

Richness. Media Richness Theory was developed by Daft and Lengel (1984, 1986).

Whereas Rutter and colleagues were aware of the work of Short et al., Daft and Lengel

never seem to explicitly acknowledge the work of Short et al. Daft and Lengel (1984)

were focused primarily on the information processing behaviors in organizations.

Therefore, they were interested in a concept called information richness:

Richness is defined as the potential information-carrying capacity of data. If the

communication of an item of data, such as a wink, provides substantial new

understanding, it would be considered rich. If the datum provides little

understanding, it would be low in richness. (p. 196)

They posited that a communication medium can determine the richness of information

(Daft & Lengel, 1986). They argued that face-to-face communication had the highest

richness and numeric communication (e.g., spreadsheet with numbers) the lowest; see

Figure 2.1 for a complete list of media richness by media.

Information Medium                          Information Richness
Face-to-Face                                Highest
Telephone                                   High
Written, Personal (bulletins, documents)    Moderate
Written, Formal (bulletins, documents)      Low
Numeric Formal (computer output)            Lowest

Figure 2.1. Communication media and information richness diagram

Note. From “Information Richness: A New Approach to Managerial Behavior and

Organizational Design,” by R. L. Daft and R. H. Lengel, 1984, in L. L. Cummings & B.

M. Staw (Eds.), Research in Organizational Behavior (191-233). Homewood, IL: JAI.

Social Information Processing

The last of the three competing models is the Social Information Processing

model developed by Walther (1992, 1994, 1996). Walther developed his model in

response to the previous so-called “deficit” theories. Whereas previous researchers were

interested in media effects across various communication media, Walther focused

primarily on CMC. He criticized previous research, like that addressed earlier in this

chapter, for a number of reasons. First, the majority of the early research was conducted

in experimental settings that did not mirror how people communicate with different

media in real life (Walther, 1992). Second, these early studies and researchers assumed that the

absence of visual cues led to an absence of sociability. Third, they assumed that task-

oriented communication lacked relational and social communication. Finally, they failed

to acknowledge that just as cues are filtered out, other cues are filtered into CMC and

therefore CMC has some affordances that face-to-face communication does not (Walther,

1996; Walther & Parks, 2002).

Walther (1992) argued that Humans’ social nature is the same in CMC and face-

to-face environments. Given enough time, he believed that people will find ways to

compensate for any cues that are filtered out in CMC. The social information processing

model essentially posits that given enough time, CMC can be very personal and even

hyperpersonal (Walther, 1992, 1996). Previous research tended to put time restrictions on

how people communicated that Walther believed diminished the possibility of

interpersonal and relational communication. Walther’s research on the other hand

suggested that

• Previous interaction between communicators influenced how people

communicated online;

• The possibility of future interaction influenced the degree to which people

socially interacted online;

• The way users used emoticons influenced interpersonal communication

online.

These competing theories help illustrate the way that thinking about a medium’s

effect on communication—especially interpersonal and social communication—changed

over time. The research that began with the work of Gunawardena (1995; Gunawardena

& Zittle, 1997)—which I refer to as the third phase of social presence research (see Table

2.1 and Figure 2.2)—was influenced by previous research and theories, especially that of

Walther. Rather than conceptualizing social presence as Short et al. did, Gunawardena

and those that followed her (like Garrison et al., 2000, whose work serves as the

conceptual framework for this study) began reconceptualizing social presence theory—

focusing more on how people appropriate technology rather than simply on what a

technology allows us to do. In fact, the work of Garrison et al. and the CoI really

represent a fourth phase of research on social presence (see Table 2.1 and Figure 2.2).

Table 2.1 Phases of Social Presence Research

Phase     Period               Key Figures                       Focus of Research
Phase 1   1970s                Short et al.                      Telecommunications
Phase 2   1980s-early 1990s    Rutter; Daft & Lengel;            CMC
                               Kiesler; Walther
Phase 3   1990-1999            Gunawardena; Rourke et al.; Tu    Online learning
Phase 4   2000s-present        Garrison et al.; Karen Swan;      Social presence's role in
                               Peter Shea                        establishing a community of
                                                                 inquiry in online learning

Figure 2.2. Timeline of competing theories of social presence preceding the

development of the community of inquiry framework.

Defining Social Presence

Given the evolution of social presence theory, it is probably not surprising that

there is not a clear, agreed-upon definition of social presence (Rettie, 2003; Tu, 2002b).

In fact, nearly everyone who writes about social presence seems to define it just a little

differently.

Presence is a key theoretical construct used in a variety of disciplines besides

communication and online learning—most notably virtual reality (see Biocca, 1997). In

fact, Lombard and Ditton (1997) identified six interrelated but distinct ways that people

understand “presence”: (a) presence as social richness, (b) presence as realism, (c)

presence as transportation, (d) presence as immersion, (e) presence as social actor within

medium, and (f) presence as medium as social actor. They even attempted to create one

all-encompassing definition of presence. According to Lombard and Ditton, the following

definition takes into consideration all six ways presence is understood; presence is “the

perceptual illusion of nonmediation” (presence explicated section). To date, though, their

all encompassing definition has not been widely adopted by others. Biocca, Harms, and

Burgoon (2003) also recognized the different ways researchers across different fields

define presence. They attempted to create an all-encompassing definition of social

presence as well; they defined social presence as simply a “‘sense of being with another’”

(p. 456) whether that other is human or artificial.

Despite attempts by Lombard and Ditton (1997) and Biocca et al. (2003) to

develop some conceptual clarity when it comes to discussions of presence in general or

social presence in particular, researchers of social presence and CMC in educational

environments continue to redefine social presence (Picciano, 2002). Gunawardena (1995)

defined social presence as “the degree to which a person is perceived as a ‘real person’ in

mediated communication” (p. 151). Garrison et al. (2000), on the other hand, originally

defined social presence “as the ability of participants in a community of inquiry to project

themselves socially and emotionally, as ‘real’ people (i.e., their full personality), through

the medium of communication being used” (p. 94). Tu and McIsaac (2002) define social

presence as “the degree of feeling, perception, and reaction of being connected by CMC

to another intellectual entity through a text-based encounter” (p. 140). Finally, Picciano

(2002) defines social presence as “a student’s sense of being in and belonging in a course

and the ability to interact with other students and an instructor” (p. 22).

The differences in how researchers define social presence might seem minor but

they are important (see Ice, Gibson, Boston, & Becher, 2011). For instance, Rourke et al.

(2001) focus on students' (or instructors') ability to project themselves as "real" whereas

Picciano focuses more on students’ sense of belonging to a community. Issues of

definition are important because the way researchers define social presence influences

how they measure social presence and the conclusions they draw.

Definitions of social presence, at least for researchers of social presence and

online learning, tend to fall on a continuum (see Figure 2.3). At one end of the

continuum, researchers tend to conceptualize social presence as the degree to which a

person is perceived as being “real” and being “there.” These definitions tend to focus on

whether someone is able to project himself or herself as being “real” in an online

environment and whether others perceived this person as being there and being real. In

fact, Williams (1978a) defined social presence in this way when he defined social

presence as “the feeling of contact obtained. . .” across various communication media (p.

127).

At the other end of the continuum, researchers tend to go beyond whether

someone is perceived as being “present”—that is, simply “there” or “real”—instead

focusing on whether there is an interpersonal emotional connection between

communicators. It is important to note, though, that on this end of the continuum, there

tends to be an assumption that the interpersonal and emotional connection that

communicators establish when there is social presence is a positive connection. Finally,

as with most continua, the majority of researchers find themselves somewhere in the middle—placing some emphasis on an emotional connection—rather than at either end of the continuum.

Figure 2.3. Continuum of definitions of social presence, ranging from a sense that someone is real and there (present) with no focus on emotion, to a sense that someone is real and there (present) along with an emotional connection

Measuring Social Presence

After all the theorizing, researchers need to be able to identify, measure, and test

their theories about social presence. As researchers began to conceptualize social

presence differently, rather than use techniques developed and utilized by past

researchers—perhaps because of Walther’s critique of these techniques—they began to

look for new ways to study social presence. Gunawardena (1995), Rourke et al. (2001),

and Tu (2002b) have been three foundational researchers in developing ways to study

social presence. But just like in the mid-1970s—when researchers either studied social

presence by observing user behavior or examining users' attitudes (Christie, 1974)—

researchers in the third and fourth wave of social presence research have tended to focus

either on studying users’ attitudes or their behaviors online. For instance, Gunawardena

and Tu focused primarily on studying users’ attitudes whereas Rourke et al. focused on

studying users’ behaviors (though it is important to note that while Garrison early on

focused on studying users’ behaviors with his colleagues Rourke et al., he later turned to

studying students' attitudes). Regardless of their focus, the work of each of these

researchers has heavily influenced most of the studies on social presence and CMC

during the past ten years. In the following paragraphs, I will address how each of these

researchers studied social presence.

Gunawardena’s Social Presence Scales

Gunawardena (1995; Gunawardena & Zittle, 1997) conducted some of the earliest

studies on social presence and CMC in an education setting. In her first article,

Gunawardena (1995) reported on two different studies she conducted in the early 1990s.

In the first study, she measured users’ perceptions of CMC using a survey. She had

students rate 17 bi-polar scales on a 5-point Likert-type scale (from negative to positive). For instance, she asked students whether CMC was more sociable or unsociable, or more warm or cold (see Table A1 in Appendix A for the complete list). The bi-polar

scales she used focus on users’ perceptions of the medium more than the degree to which

users perceive others as “real” or “there.”

In the same article, Gunawardena (1995) reports on a second study in which she qualitatively analyzed some data; however, she does not elaborate on what data she analyzed or how she analyzed them.

In a later article, Gunawardena and Zittle (1997) reported on additional data

collected from an earlier sample. However, with this study, Gunawardena and Zittle

created an instrument they called the Social Presence Scale (see Appendix A). The Social

Presence Scale was similar to the previous scale used by Gunawardena, but instead of

responding to bi-polar scales, students were asked to rate 14 items on a scale of 1 to 5. For instance, one item asked students to rate, on a scale of 1 to 5, the degree to which they agreed or disagreed that CMC is an excellent medium for social interaction. The Social

Presence Scale was tested for internal consistency (Alpha = .88) and appears to

investigate the construct of social presence more directly than the previous scale.

Rourke et al.’s Social Presence Indicators

Unlike Gunawardena who measured social presence through a self-report

questionnaire, Rourke et al. (2001) sought to measure social presence through analyzing

online discussions. As touched on in Chapter 1, Rourke et al. identified three different

categories of social presence: affective responses, interactive responses, and cohesive

responses. They then developed twelve indicators that researchers could use to analyze

transcripts of CMC (primarily through content analysis). An example of these indicators

can be seen in Table 2.2 (see Appendix A for the complete list of indicators). Rourke et

al. developed these categories and indicators based on their previous work (Garrison,

Anderson, & Archer, 2000; Rourke, et al., 2001a), other literature in the field, and finally

their experience reading online transcripts.

Table 2.2 Example of Social Presence Indicators

Category              Indicators                Definition of Indicators
Affective Responses   Expression of emotions    Conventional expressions of emotion, or
                                                unconventional expressions of emotion;
                                                includes repetitious punctuation,
                                                conspicuous capitalization, emoticons
                      Use of humor              Teasing, cajoling, irony, understatements,
                                                sarcasm
                      Self-disclosure           Presents details of life outside of class, or
                                                expresses vulnerability

Rourke et al. (2001a) tested and measured the “efficacy and reliability” of their

categories and indicators by using them with participants in two graduate education

online courses. A single week from each course was identified, and all of the

discussion postings for those two weeks were analyzed. The first course had more than

twice the number of postings and words as the second course; as a result, in order to

compare the two, Rourke et al. (2001a) summed the raw number of instances and divided

by the total number of words and then multiplied it by 1000 to come up with a social

presence density score. They had high interrater reliability.
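
To make the arithmetic concrete, the following is a minimal sketch (in Python) of the density calculation; the function name and the tallies are hypothetical illustrations, not data from Rourke et al. (2001a) or from this study.

def social_presence_density(indicator_counts, total_words):
    """Instances of social presence indicators per 1,000 words of transcript."""
    total_instances = sum(indicator_counts.values())
    return (total_instances / total_words) * 1000

# Hypothetical tallies of coded indicators for one week of discussion postings
course_a = {"affective": 42, "interactive": 57, "cohesive": 31}   # 21,500 words posted
course_b = {"affective": 18, "interactive": 26, "cohesive": 12}   #  9,800 words posted

print(round(social_presence_density(course_a, 21500), 2))  # 6.05 instances per 1,000 words
print(round(social_presence_density(course_b, 9800), 2))   # 5.71 instances per 1,000 words

Dividing by total words in this way is what allows discussions of very different lengths, such as the two courses Rourke et al. compared, to be placed on a common scale.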

Rourke et al. (2001a), though, cautioned readers about generalizing their results

because their main purpose was to “develop and test the efficacy of a tool for analyzing

the social presence component of educational computer conferences” (Discussion

section) rather than to draw conclusions specifically about the samples in question. They

also acknowledged that they were still unclear whether all 12 indicators should be

weighted equally, as well as whether or not there was an optimal level of social presence.

In fact, Garrison mentioned in a round table presentation at the 2008 annual meeting of

the American Educational Research Association (AERA) that these indicators might need

to be revisited and possibly revised (Arbaugh et al., 2008).

Tu and The Social Presence and Privacy Questionnaire

Tu (2002b) criticized early research on social presence (e.g., Short et al., 1976,

and even Gunawardena’s 1995 study) in which researchers adopted the same semantic

differential technique that simply had people respond to a bi-polar scale. Tu argued that

this technique is not adequate to measure one’s perception of social presence when it

comes to CMC. He also argued that the questionnaire used by Gunawardena and Zittle

(1997) failed to take into consideration different variables cited in the research (e.g.,

recipients, topics, privacy, task, social relationships, communication styles). As a result,

Tu (2002b) developed “The Social Presence and Privacy Questionnaire (SPPQ).”1 Tu

developed the SPPQ by using parts of Steinfield's (1986) CMC attitude instrument and Witmer's (1997) work on privacy.

1 In a different article, Tu (2002a) refers to the SPPQ as the CMC Questionnaire; however, he tends to refer to it more often as the SPPQ and therefore SPPQ will be used to refer to this instrument.

Tu used a panel of five qualified content experts to test the content validity of the

instrument. However, he did not elaborate on what made these content experts

“qualified.” He then used 310 inservice and preservice teachers to test the construct

validity. Five factors emerged from the factor analysis: social context, online

communication, interactivity, system privacy, and feelings of privacy; these five factors

accounted for 82.33% of the variance with Cronbach’s alpha values ranging from .74 to

.85. Tu acknowledged that online privacy had a weak correlation and therefore might not

need to be included as a dimension of social presence. However, he continued to use online privacy as a dimension of social presence in later studies (Tu & Corry, 2004; Tu &

McIsaac, 2002). Despite the strengths of his survey, Tu and McIsaac (2002) later determined, as the result of a mixed-methods study using the SPPQ and a dramaturgical participant-observation qualitative approach, that "there were more variables that contribute to social presence" (p. 140) than previously thought. Therefore, Tu and McIsaac concluded

that social presence was more complicated than past research suggested. Appendix A

outlines the new variables identified by Tu and McIsaac. Specifically, they found that the

social context played a larger role than previously thought.

Among other things, the preceding literature illustrates what other researchers

have pointed out—that there is still little agreement on how to measure social presence

(Lin, 2004; Stein & Wanstreet, 2003). Just as Tu criticized how Gunawardena measured

social presence, others have criticized and modified Tu’s work (Henninger &

Viswanathan, 2004). Also, while social presence has been presented as a perceptual

construct, Hostetter and Busch (2006) point out that relying solely on questionnaires (i.e.,

self-report data) can cause problems because “respondents may be providing socially

desirable answers” (p. 9). Further, Kramer, Oh, and Fussell (2006) point out that self-

report data “are retroactive and insensitive to changes in presence over the course of an

interaction [or semester]” (p. 1). But at the same time, even the scale created by Rourke

et al. (2001a) has been modified by Swan (2003) and later by Hughes, Ventura, and

Dando (2007) for use in their own research.

During the past few years, researchers have focused less on studying social

presence by itself—opting instead to study social presence as one aspect of a CoI. As a

result and likely due to the difficulty of coding large samples, these researchers have

focused predominantly on studying students' attitudes toward the CoI as a whole

and each of the components of the CoI (i.e., social presence, teaching presence, and

cognitive presence). In 2008, a group of researchers came together to develop an

instrument to study the community of inquiry, called the Community of Inquiry

Questionnaire (see Arbaugh et al., 2008; Swan et al., 2008). Table 2.3 lists the part of the

Community of Inquiry Questionnaire used to assess students’ perceptions of social

presence in a CoI (see Appendix A for the entire instrument).

Table 2.3 Social Presence Dimension of the Community of Inquiry Questionnaire

Affective expression

14. Getting to know other course participants gave me a sense of belonging in the

course.

15. I was able to form distinct impressions of some course participants.

16. Online or web-based communication is an excellent medium for social

interaction.

Open communication

17. I felt comfortable conversing through the online medium.

18. I felt comfortable participating in the course discussions.

19. I felt comfortable interacting with other course participants.

Group cohesion

20. I felt comfortable disagreeing with other course participants while still

maintaining a sense of trust.

21. I felt that my point of view was acknowledged by other course participants.

22. Online discussions help me to develop a sense of collaboration

Over five years ago and before the work of Arbaugh et al., Russo and Benson

(2005) argued that researchers need “a multifaceted presence instrument, one that

examines presence more than single items and addresses the construct more by evaluating

specific behaviors rather than a global effect” (p. 60). And while Arbaugh et al. (2008)

hope that the Community of Inquiry Questionnaire is a step in that direction, their survey and the research in which it is used largely focus on the CoI as a whole rather than on its parts (e.g., social presence).

In the end, though, the instrument that researchers use largely influences what

they find. Therefore, any study of social presence should at least acknowledge how its

methodology has been influenced by these early pioneers. Despite the varied

methodologies employed and some contradictions, some trends emerge when looking at

the research on social presence. The following section focuses first on the results of

some research on social presence and then on some recent research focused on social

presence in a CoI.

Research on Social Presence

Despite the differences previously noted, researchers have identified a number of

pedagogical implications—in most cases, benefits—of social presence. In the following

sections, the literature on social presence is summarized and synthesized around three

main themes: (a) social presence and student satisfaction, (b) social presence and

interaction, and (c) social presence and student learning.

Social Presence and Student Satisfaction

Over the years, a number of researchers have shown that there is a consistent

relationship between social presence and student satisfaction (Gunawardena, 1995;

Gunawardena & Zittle, 1997; Hostetter & Busch, 2006; Richardson & Swan, 2003; So &

Brush, 2008). While their conceptualization and methodology differ at times, most

researchers agree that social presence is a predictor of student satisfaction in CMC

environments, which in turn is a key component of online learning. More specifically, in

online learning environments student satisfaction has been connected to student

persistence (Levy, 2007; Willging & Johnson, 2004). Levy (2007) has shown that

student satisfaction “is a major factor in students’ decision to complete or drop” online

courses (p. 198). Therefore, given the importance of student satisfaction, the following

section highlights a few of the main studies on social presence and student satisfaction.

In the 1980s and early 1990s, a number of researchers began investigating social

presence and computer-mediated communication (CMC) (e.g., Walther, 2002, 2004).

However, Gunawardena (1995; Gunawardena & Zittle, 1997) is perhaps the earliest, most

frequently cited, and foundational researcher of social presence and learning

environments using CMC. Gunawardena conducted two studies with Globaled

conference participants (Gunawardena, 1995; Gunawardena & Zittle, 1997). The studies

consisted of graduate students from different universities who attended the Spring 1992

and Fall 1993 Globaled computer conferences via a listserv.2 The participants in the

studies filled out questionnaires after they completed the conferences.

2 It is important to highlight that the majority of the students in these studies completed the online learning experience (i.e., the Globaled conference) as a component of a face-to-face course; further, they took part in the conference via a listserv rather than a course management system like Blackboard, WebCT, or eCollege.

Gunawardena (1995) reported that, contrary to popular opinion, CMC could be

perceived as a social medium and that social presence could be cultivated. Further, she

stated that,

although CMC is described as a medium that is low in nonverbal cues and social

context cues, participants in conferences create social presence by projecting their

identities and building online communities. In order to encourage interaction and

collaborative learning, it is important that moderators of computer conferences

promote the creation of conducive learning environments. (p. 163)

Gunawardena and Zittle (1997), working from data collected from participants in the Fall

1993 conference, later reported that social presence was a strong predictor of student

satisfaction with computer conferences. They also found that students who felt a stronger

sense of social presence enhanced their socio-emotional expression (e.g., through the use

of emoticons) whereas those with a low sense of social presence did not. Gunawardena

and Zittle concluded that social presence (and as a result student satisfaction) depends on

what instructors and students do rather than simply on the characteristics of a CMC medium.

Despite shortcomings of their research (e.g., small sample size, sample selection, course

format) as well as the fact that they caution readers not to generalize their results, the

work of Gunawardena (1995) and Gunawardena and Zittle (1997) is regularly cited—and

generalized—as foundational research on social presence and CMC.

Research conducted by Richardson and Swan (2003) is arguably less foundational

than the work of Gunawardena and Zittle but methodologically more sound. Richardson

and Swan (2003) conducted a study to investigate the relationship between students’

perception of social presence, perceived learning, and satisfaction with their instruction.

Their study consisted of 97 participants taking online courses at Empire State College, a

site purposefully chosen because of its nontraditional online program. Almost half of the

students in the sample stated that it was their first online course. Richardson and Swan

developed a survey based on Gunawardena and Zittle’s (1997) survey and used a

multiple regression to analyze the data collected from the survey.

Richardson and Swan (2003) found three things from their study. First, they found

that students with higher perceived social presence scores perceived they learned more

than students with lower scores, thus indicating that there is “a relationship between

students’ perceived social presence and students’ perceived learning” (p. 77). Second,

they found a link between student satisfaction with their instructor and perceived

learning—which researchers have been finding in face-to-face settings for years. Third,

they found that students with high social presence scores “were highly satisfied with their

instructor” (p. 73). However, it is important to note that they did not find a relationship

between age or amount of college experience and social presence. Further, they

concluded that online learners found the social presence of faculty and students to be an

integral aspect of an online course.

Other researchers (Hostetter & Busch, 2006; Russo & Benson, 2005; So & Brush,

2008) have found a relationship between social presence and student satisfaction in

online learning environments as well. In fact, student satisfaction is the most consistent

finding across all studies of social presence and CMC.

However, like most findings on social presence, there always seems to be at least

one study that contradicts the findings of others. For instance, Joo, Lim, & Kim (2011)

recently sought to investigate the structural relationships between perceived level of

presence, perceived usefulness and ease of online tools, and learner satisfaction and

persistence at a South Korean online university (p. 1654). They administered two

different surveys resulting in 709 responses. While they found teaching presence had a

significant effect on both social presence and cognitive presence (which suggests how a

course is designed and facilitated affects social presence), they also found that, contrary to previous studies, social presence was not a significant predictor of satisfaction. Further research, though, is needed to see whether this study is an outlier or whether students' perceptions of social presence and its relationship to satisfaction are changing.

While student satisfaction does not equal student learning, it is a necessary

component of a successful learning environment. Further, online learning has a history of

having a higher dropout rate than face-to-face courses (Levy, 2007) as well as being

characterized as involving more work than traditional face-to-face courses. Thus, it is

imperative for online instructors to recognize the important role student satisfaction can

play in online learning environments. If students are not satisfied, they will presumably

not log on to their online course and therefore will not successfully complete their online course or learn the required material. Another important component of student learning

online is interaction. The following section will address the relationship between

interaction and social presence.

Social Presence and Interaction

Interaction is a key component of any learning environment (Dunlap, Sobel, &

Sands, 2007). Interaction in online learning specifically has received a great deal of

attention over the years (Anderson, 2006; Anderson & Garrison, 1998; McIsaac, Blocher,

Mahes, & Vrasidas, 1999; Moore, 1989; Moore & Kearsley, 2005; Vrasidas & Glass,

2002; Wagner, 1994). Interaction has been defined in a number of ways. According to

Wagner (1994), interaction is simply “reciprocal events that require at least two objects

and two actions. Interactions occur when these objects and events mutually influence one

another” (p. 8). Interaction—that is, reciprocal events—can occur in many different

forms in an online environment.

Moore (1989) was the first to identify three main types of interaction in distance

education: (a) learner-to-content interaction, (b) learner-to-instructor interaction, and (c)

learner-to-learner interaction. Later researchers identified additional types of interaction

found in online learning environments: (a) teacher-to-teacher, (b) teacher-to-content, (c)

content-to-content, (d) learner-to-technology, and (e) teacher-to-technology (Anderson,

2003, 2006; Anderson & Kuskis, 2007; Shank, 2004). Each of these is an important

component of any online learning environment. Furthermore, each of these types of

interaction can influence social presence. However, learner-to-instructor and learner-to-

learner interaction are the most germane; of the two, learner-to-learner interaction has

received the most attention.

Researchers have shown that learner-to-learner interaction is a critical component

in online learning (Richardson & Swan, 2003). Learner-to-learner interaction is

motivating and stimulating for students (Moore & Kearsley, 2005). Further, social

presence is directly related to learner-to-learner interaction (Tu, 2000). Students are

perceived as being there as a result of their online interactions with their peers; if they do

not interact with their peers and instructors, they are not perceived as being there or

connecting with their peers or instructors. Therefore, in this section, I will summarize a

few key studies about social presence and interaction.

Tu and McIsaac (2002) conducted a mixed methods study with 43 graduate

students in an online course. They found that social presence influences online

interaction. More specifically, they found that social presence is necessary for social

interaction. However, they also found that the quantity or frequency of participation

online did not directly relate to social presence. That is, interacting more did not

necessarily increase one’s social presence. Finally, they found that group size in

synchronous discussions influenced how much students interacted with others online.

As a result of their study, Tu and McIsaac (2002) argued that students need to

establish trust “before attaining a higher level of social presence” (p. 142). They also

found that informality helped increase social presence. But most importantly, they

concluded that it is not the quantity but the quality of interactions online that make the

difference.

Like Tu and McIsaac, Swan and Shih (2005) also discovered some interesting

relationships between social presence and interaction. The participants in their study

came from four online graduate educational technology courses; 51 students completed

an online questionnaire [based on Richardson and Swan’s (2003) previous survey that

was based on the instrument developed by Gunawardena and Zittle]. After the survey was

completed, the 5 students with the highest and the 5 students with the lowest social

presence scores were identified, their postings were analyzed, and then they were

interviewed. Content analysis was used to explore the discussion postings using the

indicators developed by Rourke et al. (2001a).

Like Tu and McIsaac (2002), Swan and Shih found that students who interacted

more in the discussion forums were not necessarily perceived as having the most social

presence; rather, students who were more socially oriented, even if they interacted less

than others, were perceived as having greater social presence. Swan and Shih argued that

this supports the idea that perceived presence is not directly linked to how much one

participates online. Further, they found that students perceiving the most social presence

of others were also the ones who successfully projected their own presence into the

discussions. Swan and Shih concluded that students projected their presence online “by

sharing something of themselves with their classmates, by viewing their class as a

community, and by acknowledging and building on the responses of others” (p. 124),

rather than simply posting more than others. Therefore, this research suggests that the

quality of online postings matters more than the quantity when it comes to social

presence. Finally, unlike earlier research (e.g., Richardson & Swan, 2003), Swan and

Shih found that the age of participants did matter. More specifically, they found that

younger students were more comfortable with online discussions than older students and

that students over the age of 45 did not bond well with other (younger) students (p. 123).

Even so, these differences could be due to the different samples; further, findings such as

these are likely to diminish over time, as more and more people of all ages spend time

online.

It is very possible that the occurrence of contradictory findings, like these about

age and social presence, arises because researchers have studied social presence in very

different contexts, with very different course formats, and very different groups of

students and instructors. Tu and McIsaac (2002) have already shown the important role

social context plays in social presence. Therefore, while age might diminish as a

significant variable over time, course format (e.g., hybrid vs. predominantly

asynchronous vs. predominantly synchronous), length of term, subject of study, and

students’ experience with online courses might continue to influence how social presence

is perceived, maintained, and enhanced in online learning environments.

Perhaps the most important research conducted on social presence has focused on

student learning. But just like previous research, researchers studying social presence

and student learning have found mixed and contradictory results.

Social Presence and Student Learning

Very few researchers have investigated the relationship between social presence and student learning. That is, of the more than 100 studies conducted on social presence, fewer than a handful have focused on student learning (Picciano, 2002; Wise et al., 2004), and most of those have focused only on perceived learning (e.g., Richardson & Swan,

2003). This is not, though, simply a trend restricted to research on social presence and

online learning. Tallent-Runnels et al. (2006) have shown that generally speaking, “few

studies actually focus on instruction and learning online” (p. 116). This is most likely due

to the difficulty of measuring student learning (Fryshman, 2008). Because students are

complex creatures with vastly different backgrounds and experiences, it is often difficult

to determine what was learned specifically as the result of what happened in a given

course. As a result, most researchers who do focus on student learning actually

operationalize it as student performance on course assessments. This is true of Picciano

(2002) and Wise et al. (2004), the two researchers who have studied social presence and

student learning.

Picciano (2002) was one of the first to investigate social presence and student

learning. He was interested specifically in the relationship between social presence,

student interaction and performance, and student perceptions of social presence and

actual participation. Participants in Picciano’s study came from one completely online

asynchronous graduate education course. The 23 participants in the study were teachers

seeking a school administrator certification for the state of New York; they all had at

least five years' teaching experience and an MA (Picciano, 2002). Only eight of the 23

students had taken an online course before. It is also important to note that the World

Trade Center was attacked during week two of this course; the attack on the World Trade

Center had a great impact on people across the world—but especially in New York.

Therefore, it is likely that the emotional distress felt by many Americans during this time

could have impacted the results of this study and the social presence of the participants

online.

Picciano (2002) collected three different types of data. First, he calculated how

often students participated in the course discussions. However, interestingly, he chose

not to count “one line ‘me too’ postings and social messages” (p. 29); instead he wanted

to focus on “substantive comments.” Given the importance “me too” postings and

“social comments” can play in social presence, this decision likely skewed the results.

He also administered a student satisfaction survey—questions coming from both the

Inventory of Presence Questionnaire developed by the Presence Research Working

Group and Tu’s Social Presence and Privacy Questionnaire (SPPQ). Then, finally, he

collected scores from an exam and a written assignment. The exam was a multiple-choice

exam designed to explore 13 issues the course focused on. The written assignment was a

case study. Picciano created three groups—low, moderate, and high—of student

interaction, and then three groups of social presence (based on the survey). He then

calculated correlations between student interaction and scores on the exam and the written assignment, as well as between social presence and performance on the exam and the written

assignment. Due to sample size, Picciano only used basic descriptive statistics (i.e.,

means and correlations) to analyze his data.
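
For illustration only, the following is a minimal sketch (in Python 3.10+) of the kind of descriptive analysis described above, using means and Pearson correlations; all names and values are hypothetical and are not Picciano's (2002) data.

from statistics import mean, correlation  # correlation() requires Python 3.10+

# Hypothetical per-student values
social_presence = [3.1, 4.2, 2.8, 4.6, 3.9, 2.5, 4.0, 3.4]  # survey-based scores (1-5)
exam_scores     = [78, 74, 81, 69, 72, 80, 75, 77]          # multiple-choice exam
essay_scores    = [60, 88, 59, 90, 79, 55, 84, 70]          # written assignment (case study)

print(round(mean(social_presence), 2))                       # mean social presence score
print(round(correlation(social_presence, exam_scores), 3))   # r(social presence, exam)
print(round(correlation(social_presence, essay_scores), 3))  # r(social presence, essay)

With a sample of only 23 students, as in Picciano's study, such correlations describe the sample but do not support inferential claims, which is why he restricted himself to descriptive statistics.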

Regarding student perceptions of interaction and learning, Picciano (2002) found

that “there is a strong, positive relationship between student perceptions of their

interaction in the course and their perceptions of the quality and quantity of their

learning” (p. 28). Further, there was a positive relationship between student perceptions

of interaction, social presence, and their performance. However, he found that perceived

performance and actual performance differed. Regarding actual student interaction and performance, he found that interaction had no relationship to performance on the exam but did have a relationship to the written assignment for the highly interactive group. Finally, regarding social presence and performance, he found a positive relationship between social presence and scores on the written assignment (.5467), but the relationship between social presence and performance on the exam was not positive (-.3570).

Wise et al. (2004) also investigated student learning and social presence. Unlike

Picciano (2002), Wise et al. used an experimental research design that looked at how

social presence is related to learning in self-paced, one-to-one mentoring-supported,

online courses. That is, rather than using one online course that relies heavily on

student-to-student discussions, Wise et al. focused on self-paced online courses designed

for teacher professional development. These were self-paced courses typically completed

in a 12-week time period. However, participants in this study had to complete them in 6

weeks because it was an assignment for another course they were taking.

Twenty students taking a graduate course called Elementary and Secondary

School Curriculum took part in the study; half of the participants had teaching experience

and half did not. The students with teaching experience were randomly assigned to either

a high or low social presence condition; the students without teaching experience were

evenly distributed among the conditions (Wise et al., 2004). For each condition, two instructors

were randomly assigned to five students (p. 255). The instructors had been trained on

social presence cues—which included the indicators developed by Rourke et al. (2001);

they varied the social presence cues they used based on the condition they were assigned

to.

The researchers physically went to students’ courses to solicit their participation.

Students then completed a pre-assessment survey which focused on demographics and

learning intentions. Then after the course was over, they had students complete a survey.

They did not include the survey in their article, and they never explicitly said how the

survey was constructed. They calculated message length as well as level of student social

presence by analyzing student messages with a type of content analysis. Then the final

assignment of the course—which was a lesson plan on integrating technology—was

assessed by two raters using a rubric.

They found that students in the high social presence condition replied with

messages twice as long as those of students in the low social presence condition. Further,

students in the high social presence condition tended to show a higher degree of social

presence in the content of their messages to the instructors. However, Wise et al. did not

find a statistically significant relationship between how students performed on the final

assignment of the course and the social presence condition. Further, unlike previous

studies, they did not find a significant effect between the condition and student

satisfaction or the condition and perceived learning. These findings led Wise et al.

(2004) to conclude the following:

Social presence impacts the atmosphere of the course as indexed by the

perceptions of the instructor and the nature of the interaction, but there is no

identifiable effect on the overall impact of the course as indexed by learning or

perceived learning, engagement, or satisfaction. (p. 262)

While Wise et al. try to make sense of their results, they never acknowledge how certain issues—such as the unique course format, the limited time frame, and the lack of other students, to name just a few—might have skewed their results.

While Picciano (2002) and Wise et al. (2004) directly investigated social presence

and student learning (which they operationalize as student performance on specific

assignments), other researchers have investigated social presence and student perceived

learning. For instance, as mentioned earlier, Richardson and Swan (2003) conducted a

study in which they found that students who were identified with higher social presence

perceived they learned more than those with lower social presence. They also found a

relationship between student satisfaction with their instructor and perceived learning.

Russo and Benson (2005) also found a statistically significant relationship between

student perceptions of their own presence and the points they earned in a class. However,

Hostetter and Busch (2006) did not find a relationship between students’ perception of

presence and learning outcomes. Inconsistencies such as these, as well as those between

Picciano and Wise et al., suggest that the findings on social presence and student learning

are inconclusive.

Despite previous researchers’ efforts like those just described, researchers such as

Rourke and Kanuka (2009) have been critical of research on social presence and the CoI

as a whole for not spending enough time showing whether or not teaching presence and

social presence actually influence student learning. But despite some of the

inconsistencies reported so far, research suggests social presence plays an important role

in online courses.

Establishing and Maintaining Social Presence

Despite some mixed results, social presence appears to be related to student

satisfaction, student interaction online, students’ sense of community, as well as possibly

student learning. Therefore, as the number of students taking courses online increases

each year (Allen & Seaman, 2010), it is critical to better understand how social presence

is established and maintained in online learning environments.

The categories and indicators developed by Rourke et al. (2001a) and later

expanded upon by Swan (2003) can be seen as guidelines for establishing social

presence. For instance, if the expression of emotions, the use of humor, or self-disclosure

are indicators of social presence, then it is reasonable to expect that if one expresses his

or her emotions more, uses humor, and self-discloses information then he or she will be

able to establish social presence. Further, research suggests (e.g., Wise et al., 2004) that

social presence behaviors engender more social presence. Thus, indicators like those listed in Table 2.4 can serve as guidelines for establishing social presence.

Table 2.4 Strategies to establish and develop social presence

Affective category
  Emotions: Express emotions
  Humor: Use humor
  Self-disclosure: Self-disclose personal information
  Value: Express personal values
  Paralanguage: Use features of text, like emoticons, to express emotion

Cohesive category
  Greetings & salutations: Greet other students
  Vocatives: Address students by name
  Group reference: Use inclusive pronouns like "we" and "us"
  Social sharing: Share personal information
  Self-reflection: Reflect on the course openly

Interactive category
  Acknowledgement: Refer directly to others' postings
  Disagreement: Agree or disagree with others' postings
  Approval: Express approval
  Invitation: Ask questions
  Personal advice: Offer advice to peers

Note. From "Developing Social Presence in Online Course Discussions," by K. Swan, 2003, in S. Naidu (Ed.), Learning and Teaching with Technology: Principles and Practices (pp. 147-164). London: Kogan Page.

Synthesizing past literature (which included the work of Rourke et al.), Aragon

(2003) identified a similar list of strategies on how to establish social presence. Aragon

identified three components to establishing social presence online: (a) the role of course

design, (b) the role of the instructor, and (c) the role of the participants (see Table 2.5).

Aragon’s three components to establishing social presence are similar to the three types

of presence—social presence, teaching presence, and cognitive presence—of the CoI

framework developed by Garrison et al. (2000). Aragon’s strategies for instructors

though are not simply what Garrison et al. would call “teaching presence” strategies in

the CoI framework.

Table 2.5 Strategies to Establish and Maintain Social Presence

Course Design: Develop welcome messages; include student profiles; incorporate audio; limit class size; structure collaborative learning activities.

Instructors: Contribute to discussion boards; promptly answer email; provide frequent feedback; strike up a conversation; share personal stories and experiences; use humor; use emoticons; address students by name; allow students options for addressing the instructor.

Participants: Contribute to discussion boards; promptly answer email; strike up a conversation; share personal stories and experiences; use humor; use emoticons; use appropriate titles.

Note. From "Creating Social Presence in Online Environments," by S. R. Aragon, 2003, in New Directions for Adult and Continuing Education, 100, pp. 57-68.

Certain strategies, such as sharing personal experiences and stories, are social presence strategies used specifically by instructors. This is important because the CoI traditionally does not differentiate between how a student and how an instructor establishes his or her social presence. Swan and Shih (2005), though, pointed out that there are some differences between the two. And Nippard and Murphy (2007), who investigated how teachers and students manifest social presence in synchronous web-based secondary classrooms, found that students and instructors did in fact engage in different social presence behaviors.

Some question whether there may be an optimal level of social presence (Garrison & Anderson, 2003; Wise et al., 2004). For instance, Garrison and Anderson (2003) posit that too little social presence can prevent a learning community from forming, but too much social presence may reduce student learning by discouraging critical discourse. Conjectures such as these might suggest that while it might be

important to focus on establishing social presence early in a course, it might be less

important later on in the course. Dunlap and Lowenthal (2010), though, have argued

based on their experience that it is important to continue to maintain social presence

throughout the duration of an online course.

Mixed results arise, though, on how social presence is maintained throughout the

duration of a course. For instance, the research of Rourke et al. (2001a) suggests that

purely social communication decreases over time. However, Stacey (2002) later found in

her study that purely social communication did not decrease but actually increased over

time; as a result, she concluded “that social relationships require continuation of social

presence factors through a much longer period, as one semester is only the beginning of

group formation online” (p. 150). But then to confuse matters even more, Swan (2003)

later found that “although the use of affective indicators mirrored the general flow of the

course discussions as the course progressed, cohesive indicators declined in importance,

while the importance of interactive indicators increased” (pp. 161-162). This possibly

suggests that different types of social presence behaviors serve different functions and

vary across time. For instance, Lomicka and Lord (2007)—who studied how foreign

language graduate students established and maintained social presence through weekly

journal activities—found that the type of task influenced the type of presence used by

students.

Some Gaps in the Literature

Despite the popularity of social presence research, a number of gaps in the

literature remain. As researchers and practitioners continue to rely on theories and

research on social presence to influence their practice, it is imperative to address some of

these gaps. In the following section, I will briefly describe a few of these gaps that future

research needs to address.

First, as indicated throughout this chapter, previous researchers have found mixed

and contradictory results. For instance, some studies have found a strong relationship

between student satisfaction and social presence (Gunawardena, 1995; Gunawardena &

Zittle, 1997; Richardson & Swan, 2003), but other studies have not (Joo, Lim, & Kim,

2011; Wise et al., 2004). Some studies have found a relationship between social presence

and student performance (whether perceived or actual) (Picciano, 2002; Richardson &

Swan, 2003; Russo & Benson, 2005) while others have not (Hostetter & Busch, 2006;

Wise et al., 2004). Finally, some studies have found that social presence changes over

time (Rourke et al., 2001a; Swan, 2003) while others have not (Lomicka & Lord, 2007;

Stacey, 2002). Recently, Akyol, Vaughan, and Garrison (2011) sought to investigate how time affects the development of a CoI. They studied the CoI in a 6-week and a 13-

week online course. They reported that an “independent samples t-test revealed

statistically significant differences between the short and long-term courses on affective

communication . . . and group cohesion” (Akyol et al., p. 235). More specifically and to

their surprise—because conventional wisdom suggests that more time is needed to

develop group cohesion—they found that students’ messages in the 6 week course

included more group cohesion indicators than the students’ messages in the 13 week

course. At the same time, there were more affective indicators in the 13 week course than

the 6 week course.

Second, conclusions are drawn and assumptions made about the nature of social presence from research conducted on vastly different types of online courses. Researchers have often failed to recognize how the socio-cultural context and course format affect social presence. For instance, researchers (e.g., Tu, 2001; Wise et al., 2004) have conducted a number of studies on social presence in which the teacher had

face-to-face meetings with students; meetings like these likely influenced the development and perception of social presence. This is not to suggest that meeting face-

to-face with online students is a poor decision, but rather that it can influence students’

perceptions and therefore should only be compared to other instances where faculty meet

face-to-face with their students.

In other cases, researchers have studied social presence in non-traditional courses.

For instance, Gunawardena (1995; Gunawardena & Zittle, 1997) studied social presence

in a computer conference administered through a listserv, Tu (2001; 2002a) studied social

presence in face-to-face courses using CMC as well as televised courses, and Wise et al.

(2004) studied social presence in six week self-paced courses. While it is important to

understand how social presence is developed and maintained in these nontraditional types

of online courses, it is even more important to recognize how social presence might

change across a variety of different online learning formats and contexts. For instance,

Lowenthal, Wilson, and Parrish (2009) have argued that context plays an important role in

online learning. And both Arbaugh, Bangert, and Cleveland-Innes (2010) and Gorsky,

Caspi, Antonovsky, Blau, and Mansur (2010) have recently found some interesting

differences in students’ perceptions of the CoI across academic disciplines.

Third, the only relationship found between social presence and actual student

learning (as opposed to perceived student learning) comes from a study that was

conducted in New York City during the attack on the World Trade Center (Picciano,

2002); there is reason to believe that a tragic event such as this could have skewed the

results. Further, Picciano (2002) only found a statistical relationship with one of his

measures of student learning and social presence. However, on the other hand—as

already mentioned—a number of researchers (Picciano, 2002; Richardson & Swan, 2003;

Russo & Benson, 2005) have found a relationship between social presence and

perceived learning. Differences and issues like this illustrate that questions remain about

the relationship of student learning—both actual and perceived—and social presence.

Fourth, and finally, the majority of past research on social presence has heavily

relied on survey data to study social presence. Studying students’ perceptions of social

presence is important. However, relying only on self-report data can be problematic

because students might be simply providing socially desirable answers (Hostetter &

Busch, 2006, p. 9). Further, self-report data tend to be collected at one period during a

semester and therefore cannot show change over time (Kramer, Oh, & Fussell, 2006).

Unfortunately, though, compared to studies using only self-report data, few researchers

(e.g., Delfino & Manca, 2007; Hughes et al., 2007; Rourke et al., 2001a; Swan, 2003)

have actually analyzed online discussions when studying social presence. But if

researchers want to understand better how social presence develops, is maintained, and

changes over a course, they must begin to look at what is “said” and done in threaded

discussions—the primary avenue for interaction.

Chapter Summary

Researchers and practitioners alike seem fascinated by the concept of social

presence. I have found over 100 articles on the subject. However, like most research on online learning (Bernard et al., 2004; Tallent-Runnels, 2006), research on social presence

and online learning is of mixed quality. Even though initial research suggests that social

presence is related to student satisfaction, student interaction, and student learning, many

questions remain. In the next chapter, Chapter 3, I outline the methods used for this study. Then in Chapter 4, I present the results of the study, and in Chapter 5 I conclude with a discussion of those results.

CHAPTER 3

METHOD

The purpose of this study is to explore the phenomenon of social presence in an online graduate-level education course at the University of Colorado Denver. To accomplish this, I utilized a mixed-methods research approach that employed both quantitative and qualitative methods to understand better social presence. In this chapter, I elaborate on the methods used for this study.

Research Questions

Research questions help narrow the focus of a study by providing a framework,

setting boundaries, and giving rise to the type of data that will be collected (Creswell &

Plano Clark, 2007). The following research question guided this exploratory study: How

does social presence manifest in an asynchronous, online graduate-education course?

Research Design

Mixed methods research has become popular over the past few years (Leech &

Onwuegbuzie, 2006). Around the same time, researchers of CMC (e.g., Goldman,

Crosby, Swan, & Shea, 2005; Gunawardena, Lowe, & Anderson, 1997; Hiltz & Arbaugh,

2003) began arguing for the importance of using multiple methods when studying the complexity of asynchronous learning environments. However, to date the majority of research on social presence has utilized either a quantitative or a qualitative approach, which conceivably limits researchers' interpretation of the data and understanding of the

phenomenon. I utilized a mixed methods research design in this study. The purpose of

using a mixed methods approach was to facilitate the richness of data and expand the

interpretation of the findings (Collins, Onwuegbuzie, & Sutton, 2006; Onwuegbuzie &

Leech, 2004), as well as to answer the research question that guided this study.

Sample

Researchers differentiate between sampling schemes and sampling designs

(Onwuegbuzie & Collins, 2007; Onwuegbuzie & Leech, 2007b). Thus, I elaborate below

on the sampling scheme and the sampling design used in this study.

Sampling Scheme

The goal of this study was to gain insights into a phenomenon (i.e., social

presence) rather than to generalize findings to a population. In situations like these,

Onwuegbuzie and Collins (2007) argue that a purposeful sample should be used.

Therefore, a non-random (non-probability) criterion sampling scheme was used in this

study.

A section of EDLI 7210 Educational Policy Making for a Democratic Society—

which was taught in the spring of 2007 at the University of Colorado Denver—was

identified as an appropriate sample for this study. This course was selected for a number

of reasons. First, this course was a fully online course. While it is helpful to study social

presence in hybrid or televised courses, the focus of this study is on the nature of social

presence in fully online courses. Second, this course was an asynchronous, instructor-

facilitated, online course—the most popular type of online course in higher education.

And third, this was an education course. Recognizing that CMC is always socially

situated (Herring, 2004), the goal of this study was to examine social presence in an education setting—like the majority of previous research on social presence (Lowenthal,

Lowenthal, & White, 2009).

Nineteen graduate students were enrolled in the course. The following course

description describes the basic focus of the course:

This course examines the role and impact of policy and policy processes in

educational organizations. Models will be developed to analyze the nature of

policy, how policy processes work, conceptualizations of and research on these,

and their implications for improving educational practices to benefit student

learning and other organizational behaviors and outcomes. The study of policy

and policy processes will be facilitated by several activities that will familiarize

you with various perspectives on, models of, and research about the initiation of

policy issues, the processes of implementation and evaluation, and their outcomes

and effects. Collaboration, group work, and research are emphasized.

The learning objectives of the course are:

1. To read critically a variety of works on policy-making processes and outcomes

2. To develop skills as a policy analyst and advocate

3. To develop appreciation for and use of various policy models, policy research,

and policy effects

The following are the main assignments of the course (see Table 3.1 for a description of

each assignment):

• Reading logs

• A policy critique

• An observation

• A book review

• A personal-professional task

• A small-group project

• Online interactivity and quality of work.

Table 3.1 Assignment Descriptions

Individual Assignments

Policy Critique (8.47% 11.8% of course grade): Write a five-page (or less) double-spaced paper assessing the goals (intended), trends, conditions, projections, and alternatives relative to a local, regional, or state educational policy.

Observation (8.47% 11.8% of course grade): First observe and analyze a policy process (school board meeting, city council, state legislature, county commission). Then write a 2-3 page analysis of the observation.

Book Review (8.47% 11.8% of course grade): Select a book related to policy studies and write a 3-5 page review.

Online Interactivity and Quality of Work (16.95% of course grade): Login regularly, take part in threaded discussions, and produce quality work.

Group Assignments

Reading Logs (15.25% of course grade): Discuss readings in small groups. Then write nine reading logs summarizing the readings and posing questions for the instructor.

Personal-Professional Task (12.7% of course grade): In a group of two, take some risk and discuss deeply important individual personal-professional goals. Analyze trends that facilitate or impede each other's goal achievement. Then write a 3-5 page paper summarizing each partner's self-analysis and plan.

Small-Group Project (23.73% of course grade): Working in small groups of 4-5 students, study a policy at the local, state, or national level. Then write a 10-15 page double-spaced group paper employing a qualitative approach to collecting data that informs the group's critical analysis of the policy.

In addition to the assignments, a number of different types of discussions were

conducted in the course. The most active discussions were the "General Discussion Forum," "Reading Groups," "Reading Log Discussion Forum," "Pairs," and "Project Groups" (all of which, except the "General Discussion Forum," were directly tied to key assignments).

Sampling Design

Social presence researchers who study online course discussions historically only

analyze a small sample of course discussions. For instance, Rourke et al. (2001a) only

analyzed one week of discussions in two different courses in their foundational study,

which consisted of 134 messages or 30,392 words. Swan (2003) analyzed 235 posts with

an average number of words per posting at 82.4 (which she explained was 10% of the

entire courses discussions). And then Hughes et al. (2007) analyzed three different groups

of students with a total of 974 messages or 63,655 words.

For this study, I chose to analyze every threaded discussion in the course using

content analysis, which consisted of 1,822 posts or 160,091 words (see Table 3.2). Then

based on the results of the content analysis, two specific threaded discussions (which

span multiple weeks of the class) were identified—one with the highest social presence

indicators (which was Pair 9) and one with the lowest social presence indicators (which

was Reading Group E)—and analyzed using constant comparison analysis in an effort to

explore better the phenomenon of social presence. It is important to note that for the

purpose of this study, all discussion forums in the course are referred to as “threaded

discussions”—regardless of their purpose, the level of dialogue, or the amount of

interaction between the instructor and the students.

Table 3.2 Threaded Discussions Raw Data*

Discussion Name                                     # of Participants   # of Posts   # of Words
Virtual Office                                              7               44           2560
General—Syllabus                                           14               48           3294
General—Groups                                              6               14            639
General—Independent Work                                    3                3            155
General—Individual Work                                     2                2             84
Adult Learning Discussion Forum—Your Learning               7               12            456
Adult Learning Discussion Forum—Questionnaire #1            3                3            221
A: Reading Group A                                          4              125           7828
B: Reading Group B                                          5              132          11677
C: Reading Group C                                          4               95           8452
D: Reading Group D                                          4              109          12562
E: Reading Group E                                          5               40           5235
F: Reading Group F                                          4              106          10916
G: Reading Group G                                          5              103           8116
Pair 1                                                      3               32           2028
Pair 2                                                      3               40           6222
Pair 3                                                      3               45           3000
Pair 4+                                                     4                6            248
Pair 5                                                      3               30           2232
Pair 6                                                      3               28           1453
Pair 7                                                      3               26           2687
Pair 8                                                      3               21           3658
Pair 9                                                      3               15           2909
Pair 10                                                     2               22           2129
Plus Delta Week2                                            8               13            866
Plus Delta Week 3                                           8               22           2375
Plus Delta Week 4                                           2                2            299
Plus Delta Week 5                                           2                2            109
Plus Delta Week 6                                           3                3            234
Project Group 1                                             5              109          12673
Project Group 2                                             5              180          15322
Project Group 3                                             5              138           8404
Project Group 4                                             5              113           6791
Project Group 5                                             4              126          12380
Reading Log 1                                               5               12           1364
Reading Log 3                                               1                1            513
Total                                                     156             1822        160,091

*Note. The discussion names were copied exactly as they were worded in the online course. If a discussion did not have any posts (e.g., Reading Log 2), it was not listed.

Data Collection

The data for this study were collected from the asynchronous threaded discussions

from an online course taught in eCollege—a learning management system used at the

University of Colorado Denver. While the course was taught in 2007, an archived copy

of the course is stored in eCollege. Archived copies of course discussions, like these,

“provide readily accessible records of the evolution of social relationships in online

classes” (Goldman, Crosby, Swan, & Shea, 2005, p. 109). The course discussions were

copied from eCollege to Microsoft Word; each discussion was saved as its own file.

Student names were replaced with pseudonyms, and the files were imported into NVivo

8.
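To make this preparation step concrete, the following minimal Python sketch shows how such pseudonym substitution could be scripted before the files are imported into NVivo; it is illustrative only, and the folder names and the name-to-pseudonym mapping are hypothetical rather than drawn from the study.

import re
from pathlib import Path

# Hypothetical mapping of real names to pseudonyms (not the actual roster).
pseudonyms = {"Jane Doe": "Tamara", "John Smith": "Bob"}

def pseudonymize(text, mapping):
    # Replace each real name with its pseudonym, longest names first so that
    # full names are replaced before any shorter, overlapping name.
    for name in sorted(mapping, key=len, reverse=True):
        text = re.sub(re.escape(name), mapping[name], text, flags=re.IGNORECASE)
    return text

# One plain-text file per threaded discussion (hypothetical folder names).
Path("anonymized").mkdir(exist_ok=True)
for path in Path("discussions").glob("*.txt"):
    cleaned = pseudonymize(path.read_text(encoding="utf-8"), pseudonyms)
    (Path("anonymized") / path.name).write_text(cleaned, encoding="utf-8")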

Data Analysis

Initially when researchers began studying online discussions, they focused on the

frequency of participation (Henri, 1992). In fact, researchers have only relatively recently

begun to move beyond the basics (e.g., frequency of student participation, the level of

interaction, and message length) to focus instead on studying the content of messages

online (Pena-Shaff & Nicholls, 2004). When researchers began focusing on analyzing

the content of messages, they turned to content analysis (De Wever, Schellens, Valcke, &

Van Keer, 2006). In this study, though, I used three types of analysis to analyze the data: (a)

word count, (b) content analysis, and (c) constant comparison analysis (Leech &

Onwuegbuzie, 2007; see Table 3.3).

Table 3.3 Overview of Data Analysis

Research Question: How does social presence manifest in a graduate education asynchronous online course?

Word Count (Quantitative)
  Type of Data: All course discussions
  Purpose of Results: Explore the frequency of top words used

Content Analysis (Quantitative)
  Type of Data: All course discussions
  Purpose of Results: Explore the presence and frequency of categories and indicators of social presence

Constant Comparative Analysis (Qualitative)
  Type of Data: One discussion thread with high social presence and one with low social presence
  Purpose of Results: Identify codes, groups, and themes in the data missed by content analysis

Word Count

Traditionally, word count involves identifying deductively a word or words from

the literature on a subject or inductively identifying from the data specific words that

seem out of place or hold special meaning and then counting the frequency of these

words. For the purpose of this study, word count was used solely as a way to initially

explore the data primarily by looking for the frequency of and more importantly the type

of words used in the online discussions. NVivo 8 can quickly and efficiently calculate

word counts.

Word count is an effective initial way to analyze data by exploring the occurrence

of words in a data set. The assumption with word counts, according to Leech and

Onwuegbuzie (2007), “is that more important and significant words for the person will be

used more often” (p. 568). However, it is important to acknowledge that word count has

some limitations (e.g., it can decontextualize a word and its meaning) (Leech &

Onwuegbuzie). Therefore, word count should not be used as the only method to analyze

data. And for this study, it was used solely as an initial method to identify if certain types

of words (e.g., student names or greetings and salutations which are indicators of social

presence) were used more than others.
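As an illustration of what this kind of initial exploration involves, the following minimal Python sketch produces a simple word frequency report similar in spirit to the NVivo 8 report used here; it is not NVivo's procedure, and the folder of plain-text discussion exports it assumes is hypothetical.

import re
from collections import Counter
from pathlib import Path

def word_frequencies(folder):
    # Count every word across all plain-text discussion files in the folder.
    counts = Counter()
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8").lower()
        counts.update(re.findall(r"[a-z']+", text))
    return counts

counts = word_frequencies("discussions")   # hypothetical folder of exports
total = sum(counts.values())
for rank, (word, n) in enumerate(counts.most_common(20), start=1):
    print(f"{rank:>2}  {word:<12} {n:>6}  {100 * n / total:5.2f}%")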

Content Analysis

In the social sciences, content analysis has been the leading method used by

researchers to analyze text (Carley, 1993). Content analysis is understood and defined in

a number of different ways. For instance, Berelson (1952) defined it as “a research

technique for the objective, systematic, quantitative description of the manifest content of

communication” (p. 519). But then Holsti (1969) defined it as “any technique for making

inferences by objectively and systematically identifying specified characteristics of

messages” (p. 14). Finally, Carley (1993) explains that “content analysis focuses on the

frequency with which words or concepts occur in texts or across texts” (p. 81).

Regardless of how one defines it, the purpose of content analysis “is to reveal

information that is not situated at the surface of the transcripts” (De Wever, Schellens,

Valcke, & Van Keer, 2006, p. 7).

For this study, I followed the five steps identified by Herring (2004):

1. The researcher formulates a research question and/or hypotheses

2. The researcher selects a sample

3. Categories are defined for coding

4. Coders are trained, code the content, and the reliability of their coding is

checked

5. The data collected during the coding process are analyzed and interpreted.

At step three of this process, I turned to the literature to identify categories for coding. I first

looked at the broad categories of the CoI framework developed by Garrison et al. (2000).

Garrison et al. identified three categories of social presence—namely, emotional

expression, open communication, and group cohesion. At that time, they only identified

some possible examples of indicators for each category (see Table 3.4). I then referred to

the work of Rourke et al. (2001a)3 in which Garrison and his colleagues more fully

developed the categories and indicators of social presence.

Table 3.4 Original Social Presence Categories and Example Indicators

Element: Social Presence
  Category: Emotional Expression; Example Indicator: Emotions
  Category: Open Communication; Example Indicator: Risk-free expression
  Category: Group Cohesion; Example Indicator: Encouraging collaboration

Rourke et al. changed the names of the categories from Emotional Expression to

Affective Responses, Open Communication to Interactive Responses, and Group

Cohesion to Cohesive Responses. They also identified specific indicators for each

category of social presence as well as definitions of each indicator (see Table 3.5).

3 Please note that some uncertainty exists regarding the original date of Rourke et al.’s

article entitled “Assessing Social Presence in Asynchronous Text-based Computer

Conferencing.” I personally have a hard copy in which the publication is listed as 1999

and another as 2001. Researchers tend to cite it both ways. I contacted Liam Rourke to

get some clarification but he simply replied that he was not sure but that he thought 2001

might be correct. I reference it as 2001 for the purpose of this study.

Table 3.5 Rourke et al.'s Categories and Indicators of Social Presence

Affective Responses (originally "Emotional Expression")
  Expression of emotions: Conventional or unconventional expressions of emotion; includes repetitious punctuation, conspicuous capitalization, emoticons
  Use of humor: Teasing, cajoling, irony, understatements, sarcasm
  Self-disclosure: Presents details of life outside of class, or expresses vulnerability

Interactive Responses (originally "Open Communication")
  Continuing a thread: Using the reply feature of the software, rather than starting a new thread
  Quoting from others' messages: Using software features to quote others' entire messages or cutting and pasting sections of others' messages
  Referring explicitly to other messages: Direct references to the contents of others' posts
  Asking questions: Students ask questions of other students or the moderator
  Complimenting, expressing appreciation: Complimenting others or the contents of others' messages
  Expressing agreement: Expressing agreement with others or the content of others' messages

Cohesive Responses (originally "Group Cohesion")
  Vocatives: Addressing or referring to participants by name
  Addresses or refers to the group using inclusive pronouns: Addresses the group as we, us, our, group
  Phatics/salutations: Communication that serves a purely social function; greetings, closures

Note. From "Assessing Social Presence in Asynchronous Text-based Computer Conferencing," by L. Rourke, D. R. Garrison, and W. Archer, 2001a, in Journal of Distance Education, 14.

Swan (2003), however, later made some changes to the list of indicators—namely, Swan simplified the interactive indicators but elaborated on the affective indicators (see Table 3.6). Finally, Hughes et al. (2007) replicated Rourke et al.'s (2001a) work but apparently were unaware of Swan's previous study. They too made some changes to the indicators of social presence originally developed by Rourke et al., though some of their changes differed from Swan's. Faced with this evolution of social presence indicators (see Table 3.6), I decided to integrate the changes both Swan and Hughes et al. made to the social presence indicators (see Table 3.7). I used this initial combined list of indicators during the first training session with one of two coders. During the training sessions, it became apparent that the list of indicators needed to be amended.

Table 3.6 Evolution of the Indicators of Social Presence

Affective category
  Rourke et al. (2001a), Affective Responses: Expression of emotions; Use of humor; Self-disclosure
  Swan (2003), Affective Responses: Paralanguage; Emotion; Value; Humor; Self-disclosure
  Hughes et al. (2007), Affective: Expression of emotion; Use of humor; Self-disclosure

Interactive category
  Rourke et al. (2001a), Interactive Responses: Continuing a thread; Quoting from other messages; Referring explicitly to other messages; Asking questions; Complimenting, expressing appreciation; Expressing agreement
  Swan (2003), Interactive Responses: Acknowledgement; Disagreement; Approval; Invitation; Personal advice
  Hughes et al. (2007), Interactive: Referring to others' messages; Asking questions; Complimenting, expressing appreciation; Expressing agreement

Cohesive category
  Rourke et al. (2001a), Cohesive Responses: Vocatives; Addresses or refers to the group using inclusive pronouns; Phatics/salutations
  Swan (2003), Cohesive Responses: Greetings & salutations; Vocatives; Group reference; Social sharing; Self-reflection
  Hughes et al. (2007), Cohesive: Vocatives; Expresses group inclusivity; Phatics/salutations; Embracing the group

Table 3.7 Swan & Hughes et al. Combined List of Categories and Indicators of Social Presence

Affective Responses
  Paralanguage
    Definition (Swan): Features of text outside formal syntax used to convey emotion (i.e., emoticons, exaggerated punctuation or spelling)
  Emotion
    Definition (Swan): Use of descriptive words that indicate feelings (i.e., love, sad, hate, silly); conventional or unconventional expression of emotions
    Criteria (Hughes): Refers directly to an emotion or an emoticon. Use of capitalization only if obviously intended
  Value
    Definition (Swan): Expressing personal values, beliefs, and attitudes
  Humor
    Definition (Swan): Use of humor—joking, teasing, cajoling, irony, sarcasm, understatement
    Criteria (Hughes): Only code if a clear indication that this is meant to be funny, e.g., extra punctuation or an emoticon
  Self-Disclosure
    Definition (Swan): Sharing personal information, expressing vulnerability or feelings
    Criteria (Hughes): An expression that may indicate an emotional state but does not directly refer to it; uncertainty, non-comprehension

Interactive Responses
  Acknowledgement
    Definition (Swan): Referring directly to the contents of others' messages; quoting from others' messages
    Criteria (Hughes): Explicit or implicit recognition that another message has been the motivation for this message
  Agreement / Disagreement
    Definition (Swan): Expressing agreement or disagreement with others' messages
    Criteria (Hughes): Expressing agreement with each other or contents of messages
  Approval
    Definition (Swan): Expressing approval, offering praise, encouragement
  Invitation
    Definition (Swan): Asking questions or otherwise inviting response. Students ask questions of each other or the moderator
  Personal Advice
    Definition (Swan): Offering specific advice to classmates
  Complimenting, expressing appreciation
    Definition (Swan): Complimenting, expressing appreciation
    Criteria (Hughes): Complimenting or showing appreciation of each other or contents of messages

Cohesive Responses
  Greetings & Salutations / Phatics
    Definition (Swan): Greetings, closures. Communication that serves a purely social function
  Vocatives
    Definition (Swan): Addressing or referring to classmates by name
  Group Reference / Inclusivity
    Definition (Swan): Referring to the group as "we," "us," "our." Addresses the group as a possessed or as a whole
    Criteria (Hughes): Any reference to the group with a possessive pronoun
  Social Sharing
    Definition (Swan): Sharing information unrelated to the course
    Criteria (Hughes): Not really about yourself but more of a social response
  Self-reflection
    Definition (Swan): Reflection on the course itself, a kind of self-awareness of the group
  Embracing the Group
    Definition (Swan): Revealing life outside the group
    Criteria (Hughes): Any expression that lets the group know about the circumstance of the author but does not make the author vulnerable

For instance, under the affective category, the indicator of value was eliminated because it was nearly impossible for two coders to reliably identify value. Further, given the content of the course, nearly every other discussion posting appeared to have an "I think" or "I feel . . ." statement (which were the examples originally provided by Swan). Under the interactive responses, the indicators of approval and personal advice were eliminated: approval was eliminated because of its overlap with complimenting/expressing appreciation, and personal advice was difficult to identify reliably. Finally, under the category of cohesive responses, social sharing and self-reflection were eliminated, because social sharing overlapped with embracing the group and self-reflection was difficult to identify. The coding sheet in Table 3.8 was used for the content analysis.

Table 3.8 Coding Sheet Used for Content Analysis

Affective Responses
  Paralanguage (PL)
    Definition (Swan): Features of text outside formal syntax used to convey emotion (i.e., emoticons, exaggerated punctuation or spelling)
    Examples: "Someday……"; "How awful for you"; "Mathcad is definitely NOT stand along software"; "Absolutely!!!!!"
  Emotion (EM)
    Definition (Swan): Use of descriptive words that indicate feelings (i.e., love, sad, hate, silly); conventional or unconventional expression of emotions
    Criteria: Refers directly to an emotion or an emoticon. Use of capitalization only if obviously intended
    Examples: "When I make a spelling mistake, I look and feel stupid"; "I get chills when I think of . . ."; "I am scared"; "This is fun"; "Sorry this is such a lame email"; "Hope you are OK"; "I am pleased that . . ."
  Humor (H)
    Definition (Swan): Use of humor—joking, teasing, cajoling, irony, sarcasm, understatement
    Criteria: Only code if a clear indication that this is meant to be funny, e.g., extra punctuation or an emoticon
    Examples: "God forbid leaving your house to go to the library"; "I'm useless at computers but will this make me a bad nurse??? Ha Ha"; "LOL"
  Self-Disclosure (SD)
    Definition (Swan): Sharing personal information, expressing vulnerability or feelings
    Criteria: An expression that may indicate an emotional state but does not directly refer to it; uncertainty, non-comprehension
    Examples: "I sound like an old lady"; "I am a closet writer"; "We had a similar problem."; "I'm not quite sure how to . . ."; "This is strange"; "I don't understand how"; "I don't know what that means"; "As usual I am uncertain"; "It's all too much . . ."; "Website??? Help!!!!"

Interactive Responses
  Acknowledgement (AK)
    Definition (Swan): Referring directly to the contents of others' messages; quoting from others' messages; reference to others' posts
    Criteria: Explicit or implicit recognition that another message has been the motivation for this message
    Examples: "Those 'old machines' sure were something"; "we won by a landslide" – "landslide" (next response); "So what you're saying is . . ."; "I thought that too . . ."; "For me the question meant . . ."
  Agreement / Disagreement (AG)
    Definition (Swan): Expressing agreement or disagreement with others' messages
    Criteria: Expressing agreement with each other or contents of messages
    Examples: "I'm with you on that"; "I agree"; "I think what you are saying is right."; "I think that would be a good plan"; "I think your suggestion is good"
  Invitation (I)
    Definition (Swan): Asking questions or otherwise inviting response. Students ask questions of each other or the moderator
    Examples: "Any suggestions?"; "Would you describe that for me, I am unfamiliar with the term."; "Does anybody know . . .?"
  Expressing Appreciation (EA)
    Definition (Swan): Showing appreciation of each other
    Criteria: Showing appreciation or approval of each other or contents of messages, or complimenting
    Examples: "You make a good point"; "Right on"; "Good luck as you continue to learn"; "I like your briefing paper . . ."; "It was really good"

Cohesive Responses
  Greetings & Salutations / Phatics (GS)
    Definition (Swan): Greetings, closures. Communication that serves a purely social function
    Examples: "Hi Mary"; "That's it for now, Tom"; "Hi"; "Hey"; "Bye for now"
  Vocatives (V)
    Definition (Swan): Addressing or referring to classmates by name
    Examples: "You know, Tamara, . . ."; "I totally agree with you Katherine"; "Sally said that . . ."
  Group Reference / Inclusivity (GR)
    Definition (Swan): Referring to the group as "we," "us," "our." Addresses the group as a possessed or as a whole
    Criteria: Any reference to the group with a possessive pronoun
    Examples: "We need to be educated"; "Our use of the Internet may not be free."; "We need some ground rules"; "The task asks us to . . ."
  Embracing the Group (EG)
    Definition (Swan): Revealing life outside the group that is not emotional, does not express vulnerability or feelings, and is not related to the course
    Criteria: Any expression that lets the group know about the circumstance of the author
    Examples: "The kids are asleep now"; "I'm a physiotherapist"; "It's raining again"; "It's 4am—I'm off to bed"
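To illustrate the bookkeeping that follows from this coding sheet, the minimal Python sketch below tallies indicator codes by discussion and by category once posts have been hand-coded with the abbreviations in Table 3.8; the coded posts shown are hypothetical, and the sketch represents only one way such counts could be assembled, not the procedure NVivo performs.

from collections import Counter

# Indicator abbreviations grouped by category, following Table 3.8.
CATEGORIES = {
    "Affective": {"PL", "EM", "H", "SD"},
    "Interactive": {"AK", "AG", "I", "EA"},
    "Cohesive": {"GS", "V", "GR", "EG"},
}

# Hypothetical hand-coded data: (discussion name, codes assigned to one post).
coded_posts = [
    ("Pair 9", ["GS", "V", "EM", "AK"]),
    ("Pair 9", ["AG", "EA"]),
    ("Reading Group E", ["AK"]),
]

# Tally indicator codes per discussion, then roll them up by category.
per_discussion = {}
for discussion, codes in coded_posts:
    per_discussion.setdefault(discussion, Counter()).update(codes)

for discussion, tally in per_discussion.items():
    by_category = {
        category: sum(count for code, count in tally.items() if code in members)
        for category, members in CATEGORIES.items()
    }
    print(discussion, dict(tally), by_category)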

Constant Comparison Analysis

Constant comparison analysis was the final type of analysis conducted on the

threaded discussions. Constant comparison analysis—a specific type of comparative

analysis—is a general method used in social science research that traces back to the work of Glaser and Strauss (1967) and their development of grounded theory. While

researchers like Krathwohl (2004) and Creswell (1998, 2008) approach constant

comparison analysis from only a grounded theory perspective, it is not restricted to a

grounded theory or inductive approach (Leech & Onwuegbuzie, 2007). Leech and

Onwuegbuzie (2007) explain that constant comparison analysis can be conducted

inductively, deductively, or abductively.

Constant comparison analysis is useful when trying to explore and understand the

big picture of a phenomenon (e.g., social presence). In fact, it is one of the most

commonly used types of qualitative analysis (Leech & Onwuegbuzie, 2007). However,

researchers of CMC rarely use it to analyze online course discussions, most likely due to

the time involved to conduct this type of analysis. It was used in this study to dig more

deeply into the threaded discussions to understand better the nature of social presence. To

conduct constant comparison analysis, I read the entire threaded discussion and partitioned each post into meaningful units (chunks). I then labeled each chunk with a code while constantly comparing new codes with previous ones. I then grouped the codes together. Once I grouped the codes together, I identified themes that emerged from the

data. Figure 3.1 outlines the steps I took with examples from a previous study I

conducted.

Step 1. Read the discussion post

  Hello everyone! I love the educational environments you have created this week. Educators and students should always be the ones who create our schools. It is inspirational to see so many of you create from the schools you have been in or are currently in. Thanks for your creativity! Dr. Deb C.

Step 2. Chunk the discussion post into meaningful units

  [Hello everyone!] [I love the educational environments you have created this week.] [Educators and students should always be the ones who create our schools.] [It is inspirational to see so many of you create from the schools you have been in or are currently in.] [Thanks for your creativity!] [Dr. Deb C]

Step 3. Code each meaningful unit while constantly comparing new codes with previous codes

  [Hello everyone!] GREETING [I love the educational environments you have created this week.] POSITIVE FEEDBACK [Educators and students should always be the ones who create our schools.] ELABORATION / CLARIFICATION [It is inspirational to see so many of you create from the schools you have been in or are currently in.] POSITIVE FEEDBACK [Thanks for your creativity!] POSITIVE FEEDBACK [Dr. Deb C] CLOSING REMARK

Step 4. Make a list of the codes and group the codes

  Codes: Closing remark; Directions; Positive feedback; Greeting; Questioning; Answering question; Elaboration / clarification; Writing style; Resource; Number of students; Inclusive language; Teacher request; Colorado law; Faculty seeking feedback; Empathy; Welcoming; Negotiation; Accommodation; Contact information

  Grouping of codes:
    Course logistics: Directions; Writing style; Number of students; Teacher request; Colorado law
    Greetings and Salutations: Welcoming; Greeting; Closing remark
    Teaching / Facilitation: Questioning; Answering questions; Elaboration / clarification; Positive feedback; Resource

Step 5. Identify themes that emerge from the data (include specific language from the groups, codes, or data when appropriate)

  While RTEOF have to deal with day-to-day course logistics, such as directions on how to complete assignments and course expectations, they play more the role of a facilitator, through the use of questioning, elaborating/clarifying, and giving positive feedback, than that of an instructor or giver of knowledge.

Figure 3.1. Steps followed to complete constant comparison analysis of online discussions
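Although the coding itself depends on researcher judgment, the record keeping behind Steps 2 through 4 can be represented simply. The short Python sketch below reuses the example chunks, codes, and groupings from Figure 3.1; it is illustrative only and does not automate the analysis.

# Example chunks and codes taken from Figure 3.1 (Steps 2-3).
chunked_post = [
    ("Hello everyone!", "GREETING"),
    ("I love the educational environments you have created this week.", "POSITIVE FEEDBACK"),
    ("Educators and students should always be the ones who create our schools.", "ELABORATION / CLARIFICATION"),
    ("It is inspirational to see so many of you create from the schools you have been in or are currently in.", "POSITIVE FEEDBACK"),
    ("Thanks for your creativity!", "POSITIVE FEEDBACK"),
    ("Dr. Deb C", "CLOSING REMARK"),
]

# Groupings of codes from Step 4 of Figure 3.1 (partial).
code_groups = {
    "Greetings and Salutations": {"WELCOMING", "GREETING", "CLOSING REMARK"},
    "Teaching / Facilitation": {"QUESTIONING", "ANSWERING QUESTIONS",
                                "ELABORATION / CLARIFICATION",
                                "POSITIVE FEEDBACK", "RESOURCE"},
}

# Compare each coded chunk against the groups built so far.
for chunk, code in chunked_post:
    group = next((g for g, codes in code_groups.items() if code in codes), "ungrouped")
    print(f"[{group}] {code}: {chunk}")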

Reliability and Validity

Reliability and validity are key considerations for any researcher. These two

concepts are intricately connected (Creswell, 2008). Issues of reliability and validity are

addressed in the following pages.

Reliability

Reliability is essentially the consistency of scores researchers obtain from a

measure (Goodwin, 2001). More specifically, according to Goodwin, “interrater

agreement and reliability is the extent to which scores obtained from two or more raters

(scorers, judges, observers) are consistent” (p. 15). The most common method used to

calculate interrater reliability is a percent agreement statistic (Rourke et al., 2001b).

I selected ten percent of the discussions to assess interrater reliability. Two coders (another researcher and I) chunked and coded the discussions using content analysis. One challenge with interrater reliability, though, is that there is no consistent, agreed-upon level that must be achieved (Rourke, Anderson, Garrison, & Archer, 2001b). Past research on social presence (e.g., Rourke et al., 2001a; Swan, 2003) was used as a guide for where interrater reliability should lie.

Following Rourke et al. and Swan, the entire discussion posting was used as the

unit of analysis. As a result, 100% agreement was found between the two coders when

identifying the chunks to code because the learning management system clearly identified

an entire post. After some initial training, the two raters then coded 10% of the

discussions to establish reliability of coding. A percent agreement statistic was calculated

using Holsti’s (1969) coefficient of reliability for each of the threaded discussions:

• Reading Group G: 80%

• Pair 6: 78%

• Project Group 3: 77%

The overall percent agreement across all of the discussions was 78%, which is an acceptable

level of agreement given past research (Garrison, Anderson, & Archer, 2001; Hughes et

al., 2007).
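For reference, Holsti's (1969) coefficient of reliability is calculated as C.R. = 2m / (n1 + n2), where m is the number of coding decisions on which the two coders agree and n1 and n2 are the total numbers of decisions made by each coder; when both coders code the same set of units, this reduces to simple percent agreement. The minimal Python sketch below shows the calculation with hypothetical codes, not the study's actual coding data.

def holsti(coder1, coder2):
    # C.R. = 2m / (n1 + n2); with identical units coded, this equals percent agreement.
    m = sum(a == b for a, b in zip(coder1, coder2))
    return 2 * m / (len(coder1) + len(coder2))

# Hypothetical codes assigned to the same four posts by two coders.
print(holsti(["AK", "GS", "EM", "V"], ["AK", "GS", "SD", "V"]))  # prints 0.75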

Validity

Validity is a complex concept. Validity has been defined as the “trustworthiness

of inferences drawn from data” (Eisenhart & Howe, 1992, p. 644). However, over the

years, researchers’ understanding of validity—and therefore definitions and standards—

has evolved (Dellinger & Leech, 2007; Goodwin & Leech, 2003). Further, quantitative

and qualitative researchers tend to understand and deal with validity differently

(Creswell & Plano Clark, 2007; Dellinger & Leech, 2007).

Historically, quantitative researchers separate validity into content, criterion-

related, and construct validity. Qualitative researchers, on the other hand, historically

describe validity as trustworthiness. A large component of establishing trustworthiness is

developing a sound theoretical framework (Garrison, Cleveland-Innes, Koole, &

Kappelman, 2006, p. 2), as I have tried to do throughout this study. Further, the coding

schemes that were used for this study are based directly in the literature.

Chapter Summary

Researchers have studied social presence in online learning environments for a

number of years now. However, to date, research on social presence suffers from a host

of problems—ranging from inconsistent and contradictory findings to strange sampling

decisions. Further, researchers have not been able to demonstrate a consistent

relationship between student learning and social presence. Part of the problem might be

the methodological decisions that researchers have made. Rather than employ a mono-

method approach like the majority of past research, this study employed a mixed methods

approach to studying social presence—utilizing both quantitative and qualitative methods

to understand the complex nature of social presence. In addition, in this study I

specifically focused on how students establish and maintain social presence in a text-

based environment by focusing on what is “said” in the threaded discussions.

CHAPTER 4

RESULTS

As described in Chapter 3, I used a mixed methods research approach to explore

how social presence manifests in an online graduate level education course at the

University of Colorado Denver. More specifically, I was interested in finding out how

users established their social presence through text alone in asynchronous threaded

discussions. In this chapter, I share the results from the word count, content analysis, and

constant comparison analysis I conducted to explore how online learners establish and

maintain their presence in one specific fully online course.

Word Count

I conducted a word count of the threaded discussions to initially explore the data.

I was curious whether certain types of words appeared more frequently than others across

all of the threaded discussions as well as within certain types of threaded discussions as

opposed to others. For example, did certain words appear more in threaded discussions

with a pair of students vs. threaded discussions with small groups of students? Using

NVivo 8, I specified the parameters for a word count frequency report.

I first looked at the top 50 words used across all of the threaded discussions (see

Appendix B for a complete list of word count frequencies for each word count

conducted). While I set forth to investigate the top 50 words used across all threaded

discussions, I found that the top 20 words were sufficient to get a basic understanding of

the data. Thus, I only report on the top 20 words in this section (though I have included

the top 50 results in Appendix B).

Table 4.1 lists the top 20 words used across all of the threaded discussions (see

Figure 4.1 for a visual representation). The word "I" was used most frequently (4,858 times, which represents 4.13% of all the words used), followed by the word "you" (2,186 times; 1.86% of all the words used). The frequency of these words is not that surprising, but the fact that "we" (which is often used as a sign of group reference and therefore an indicator of social presence) was used 1,367 times (or 1.16% of all words used) and ranks fourth overall is noteworthy. Also of interest, "your," which can often be an example of "acknowledgement" (i.e., another indicator of social presence), was used 810 times (eighth overall). And finally, the word "policy"—which is the focus of the course—was used 600 times (10th overall), whereas the instructor's pseudonym, "Bob," was used 566 times (14th overall).

Table 4.1 Top 20 Words Used Across All Threaded Discussions

Rank Word Count Percentage (%)

1 I 4858 4.13

2 you 2186 1.86

3 have 1428 1.21

4 we 1367 1.16

5 my 1001 0.85

6 what 948 0.81

7 do 814 0.69

8 your 810 0.69

9 can 730 0.62

10 policy 600 0.51

11 me 595 0.51

12 all 592 0.50

13 about 574 0.49

14 bob 566 0.48

15 so 565 0.48

16 instructor 564 0.48

17 think 553 0.47

18 our 538 0.46

19 work 494 0.42

20 would 482 0.41

Figure 4.1. Word cloud of word count results without the discussion headings.

After looking at the frequency of the top 50 words used across all threaded

discussions, I then ran a word count report for each of the main threaded discussions:

Project Groups (see Table 4.2), Pairs (see Table 4.3), and Reading Groups (see Table

4.4). While “I” and “you” were still the first and second most used words in each of the

main threaded discussions, Figure 4.2 illustrates that "we" and "your" (two possible

social presence indicators) show up in the top 20 across all three of these threaded

discussions and “our” (which is also a possible social presence indicator) shows up across

two of the threaded discussions—namely, the Project Groups and the Pairs threaded

discussions. Each of these words according to the coding sheet (which was discussed in

Chapter 3) and the literature in general (see Chapter 2) are possible indicators of group

reference and acknowledgement and therefore considered to be indicators of social

presence. I mention “possible” because in this case word count does not take into

consideration the context in which a given word is used; for instance, “we” could be

referring to “we Americans” or “we the class.”

However, I found it interesting, though not necessarily surprising, that the word "we" and, to a smaller degree, "our" (i.e., group reference) show up more in small-group discussions whose purpose is collaborating on a class project than in reading groups (which are also small-group threaded discussions but with a different purpose). This suggests that the purpose of a

threaded discussion might influence the degree to which its participants employ certain

types of behaviors (i.e., the things referred to later in this chapter as indicators of social

presence) to establish and maintain their social presence. Once I had a basic feel for the

data, I then conducted a content analysis, which I elaborate on in the next section.

Figure 4.2. Frequency of possible social presence indicators across the three major and most frequented threaded discussions. [The values shown in this figure duplicate the top 20 word counts for the Project Groups, Pairs, and Reading Groups discussions reported in Tables 4.2, 4.3, and 4.4 below.]


Table 4.2 Top 20 Words across Project Groups

Rank Word Count Percentage (%)

1 I 1674 4.08

2 you 729 1.78

3 we 678 1.65

4 have 497 1.21

5 what 387 0.94

6 do 267 0.65

7 can 261 0.64

8 your 258 0.63

9 our 251 0.61

10 all 241 0.59

11 think 221 0.54

12 my 215 0.52

13 so 214 0.52

14 data 202 0.49

15 policy 193 0.47

16 would 184 0.45

17 some 181 0.44

18 need 176 0.43

19 me 173 0.42

20 about 171 0.42

Table 4.3 Top 20 Words across Pairs

Rank Word Count Percentage (%)

1 I 960 4.87

2 you 438 2.22

3 my 339 1.72

4 have 291 1.48

5 we 209 1.06

6 your 189 0.96

7 me 148 0.75

8 what 145 0.74

9 do 130 0.66

10 work 123 0.62

11 about 119 0.60

12 goals 116 0.59

13 can 110 0.56

14 our 102 0.52

15 how 100 0.51

16 school 97 0.49

17 teachers 85 0.43

18 goal 84 0.43

19 some 82 0.42

20 would 82 0.42


Table 4.4 Top 20 Words across Reading Groups

Rank Word Count Percentage (%)

1 I 1784 3.79

2 you 802 1.70

3 have 532 1.13

4 we 416 0.88

5 what 358 0.76

6 do 348 0.74

7 policy 344 0.73

8 my 328 0.70

9 can 297 0.63

10 reading 293 0.62

11 your 283 0.60

12 one 255 0.54

13 about 242 0.51

14 all 231 0.49

15 think 227 0.48

16 me 226 0.48

17 so 225 0.48

18 instructor 222 0.47

19 bob 221 0.47

20 summary 196 0.42

Content Analysis

After conducting word count, I used an amended version (see Chapter 3) of the

social presence indicators developed by Rourke et al. (2001a) to conduct content analysis

on all of the threaded discussions in the course in order to identify what types of social

presence indicators were present in each threaded discussion. Because this was an exploratory study, I was interested in seeing how the students and the instructor in this sample established and maintained their social presence. More specifically, though,

I was curious about the overall occurrence of all of the social presence indicators (taken

as a whole) across all of the threaded discussions, as well as the degree to which each

category (i.e., groups of specific types of social presence indicators) and specifically each


individual indicator were used in this sample. But at the same time, based on the CoI

framework coupled with the word count results, I was also interested in the degree to

which all of the social presence indicators, categories of social presence indicators, and

specifically each individual social presence indicator occurred in specific types of

threaded discussion. Finally, and based in part on the results of my own research

(Lowenthal & Dunlap, 2011), I was curious how individual students might employ

certain types of social presence behaviors differently than others.

In summary, in order to explore how social presence manifests in threaded

discussions (i.e., the research question guiding this study), I was interested in the

occurrence and the frequency of the social presence indicators across all of the threaded

discussions, as well as their occurrence and frequency within specific threaded

discussions, and finally their relationship to each student (i.e., how often each student

used specific social presence indicators). Figure 4.3 visually illustrates the stages of

disaggregation I went through and report on in the following sections.
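As a rough illustration of the kind of tallying this involved, the sketch below counts indicator codes per category; the data structure is a hypothetical stand-in for the coded transcript, and the code abbreviations follow those listed in Table 4.5.

    from collections import Counter

    # Indicator codes (abbreviated as in Table 4.5) mapped to their categories.
    CATEGORY = {
        "PL": "Affective", "EM": "Affective", "H": "Affective", "SD": "Affective",
        "AK": "Interactive", "AG": "Interactive", "I": "Interactive", "EA": "Interactive",
        "GS": "Cohesive", "V": "Cohesive", "GR": "Cohesive", "EG": "Cohesive",
    }

    def tally(coded_posts):
        # coded_posts: one list of indicator codes per post (illustrative only).
        indicator_counts = Counter(code for post in coded_posts for code in post)
        category_counts = Counter()
        for code, count in indicator_counts.items():
            category_counts[CATEGORY[code]] += count
        return indicator_counts, category_counts

    # e.g., a post with a greeting, a vocative, and a question, plus a post with emotion:
    print(tally([["GS", "V", "I"], ["EM"]]))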

Figure 4.3. Stages of disaggregation of content analysis used to explore use of social

presence indicators in a fully online asynchronous online course.


Stage One: Social Presence Categories and Indicators across All Threaded

Discussions

Past research on social presence, or at least research focused on identifying

indicators of social presence in online discussions, has focused primarily on reporting the

results in terms of the three categories or types of social presence indicators. Thus, I was

first interested in identifying which category of social presence indicators (i.e., Affective,

Interactive, and Cohesive) was identified the most and which was identified the least

across all of the threaded discussions. In other words, as a class, were “Affective,”

“Cohesive,” or “Interactive” indicators used the most?

Content analysis revealed that of the three different categories (or types) of social

presence, “Interactive” indicators were present the most with 2,581 instances, “Cohesive”

indicators were present the second most with 2,454 instances, and “Affective” indicators

were present the least with 1,373 instances (see Figure 4.4 and Table 4.5). The

differences between “Interactive” indicators and “Cohesive” indicators across all of the

threaded discussions are minor. But there is an observable difference between these two

categories and the “Affective” category of social presence indicators (see Figure 4.4). In

other words, in this sample, students used “Affective” indicators the least. This is

interesting in part because while Hughes et al. (2007) found a similar result in their

sample, Swan (2003) found that “Affective” indicators were actually used the most in her

sample.


Figure 4.4. A visual depiction of the frequency of each of the three social presence

categories.

After examining the category level, I drilled down further to identify the

frequency at which participants in this sample used each of the individual social presence

indicators across all of the threaded discussions. The top three indicators used across all

of the threaded discussions were “acknowledgement” (i.e., recognizing and openly

acknowledging a previous post by a person) which was used the most at 1,137 instances,

followed by “vocatives” (i.e., addressing someone directly by first name), which was used 748 times, and then by “invitation” (e.g., asking a question), which was used 747 times (see Table 4.5 and Figure 4.5).

It is difficult to compare these results with those of other researchers because, as mentioned earlier, the majority of those who analyze online discussions for social presence indicators do not report their results at the indicator level. Swan (2003),

however, is one exception. But Swan only reports her findings at the indicator level

through a series of bar graphs that lack exact numerical values (but still enable a reader to

compare the frequency of each indicator). Acknowledgement was the only top-three


indicator shared between my sample and Swan’s sample; paralanguage, which was used infrequently in this sample, was actually the most frequently occurring social presence indicator in Swan’s study.

Table 4.5 Social Presence Frequency across All Forums

Category & Indicator Frequency

Total Affective Responses 1373

Paralanguage (PL) 270

Emotion (EM) 526

Humor (H) 53

Self-Disclosure (SD) 524

Total Interactive Responses 2581

Acknowledgement (AK) 1137

Agreement / Disagreement (AG) 192

Invitation (I) 747

Expressing Appreciation (EA) 505

Total Cohesive Responses 2454

Greetings & Salutations / Phatics (GS) 714

Vocatives (V) 748

Group Reference / inclusivity (GR) 638

Embracing the Group (EG) 354

Total 6408

The least frequently used indicator of social presence was “humor,” with only 53 instances (humor was also the least used indicator in Swan’s sample), followed by “agreement/disagreement,” which was used 192 times, and then by “paralanguage,” which was used 270 times (see Table 4.6 for a complete ranking of the social presence indicators across all of the threaded discussions).


Figure 4.5. Frequency of social presence indicators across all threaded discussions

Table 4.6 Social Presence Indicators Ranking from Highest to Lowest Frequency

Social Presence Indicators Frequency

Acknowledgement 1137

Vocatives 748

Invitation 747

Greetings & Salutations / Phatics 714

Group Reference / Inclusivity 638

Emotion 526

Self-Disclosure 524

Expressing Appreciation 505

Embracing the Group 354

Paralanguage 270

Agreement / Disagreement 192

Humor 53

While it is useful to compare how the individual social presence indicators

manifest across all three categories of social presence, it is also helpful to drill down to

see how they compare to other indicators within the same category, because within a given category certain indicators of social presence may be used much more frequently than others. For instance, in the “Affective” category



“emotion” and “self-disclosure” appeared the most frequently and at almost the same frequency (see Figure 4.6). In the “Interactive” category, however, signs of “acknowledgement” were by far the most frequently used social presence indicator (see Figure 4.6). Finally, in the “Cohesion” category, “greetings / salutations / phatics,” “vocatives,” and “group reference” all appeared at about the same frequency, but “embracing the group” was used the least (see Figure 4.6).

Figure 4.6. Social presence indicators separated by category


Stage Two: Social Presence Categories and Indicators By Discussion Forum

As helpful as it is to look at the frequency of social presence indicators across all of the threaded discussions, treating them essentially as one case, it is perhaps more insightful to drill down and look at the occurrence of

social presence indicators across and within types of threaded discussions. At this stage, I

first looked at the occurrence of social presence indicators across specific types of

threaded discussions.

For the ease of reporting, I separated full-class threaded discussions (i.e.,

discussions that are “open” to the entire class) from small-group threaded discussions

(i.e., discussions that are “closed” to a small select group of students assigned with a

specific task like discussing the reading or collaborating on a course project). See Table

4.7 for the list of “open” vs. “closed” threaded discussions. But because each threaded

discussion differs in total number of posts and words, I needed a way to calculate the

social presence density of each discussion.

Following the lead of Rourke et al. (2001a), I calculated the social presence

density for each indicator in each threaded discussion. But because the unit of analysis

for this study was the entire post, I calculated the social presence density as the average number of social presence indicators per post (as opposed to per word, as Rourke et al., 2001a, did) to facilitate comparison across open and closed threaded discussions.
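A minimal sketch of this per-post density calculation appears below; it assumes the content analysis has already produced per-post counts for each category, and the data structure and names are hypothetical rather than the actual coding output.

    def social_presence_density(posts):
        # posts: one dict of category counts per coded post, e.g.
        # {"affective": 1, "cohesive": 2, "interactive": 1}
        totals = {"affective": 0, "cohesive": 0, "interactive": 0}
        for post in posts:
            for category in totals:
                totals[category] += post.get(category, 0)
        n = len(posts) or 1  # guard against an empty discussion
        averages = {category: round(count / n, 2) for category, count in totals.items()}
        averages["total"] = round(sum(totals.values()) / n, 2)
        return averages

    # A closed discussion with two posts would yield per-post averages like:
    print(social_presence_density([
        {"affective": 1, "cohesive": 2, "interactive": 1},
        {"affective": 0, "cohesive": 1, "interactive": 2},
    ]))  # {'affective': 0.5, 'cohesive': 1.5, 'interactive': 1.5, 'total': 3.5}

Dividing by the number of posts rather than the number of words is what allows discussions of very different lengths to be compared on the same footing.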


Table 4.7 Open vs. Closed Threaded Discussions

Open to Entire Class

Virtual Office

--Virtual Office

General Discussions

--General Syllabus

--General Groups

--General Independent Work

--General Individual Work

Adult Learning Discussions

--Adult Learning Discussion Forum:

Your Learning

--Adult Learning Discussion Forum:

Questionnaire #1

Plus Delta Discussions

--Plus Delta Week 2

--Plus Delta Week 3

--Plus Delta Week 4

--Plus Delta Week 5

--Plus Delta Week 6

Reading Log Discussions

--Reading Log1

--Reading Log 3

Small Group (limited to 2-5)

Reading Groups

--Reading Group A

--Reading Group B

--Reading Group C

--Reading Group D

--Reading Group E

--Reading Group F

--Reading Group G

Pairs

--Pair 1

--Pair 2

--Pair 3

--Pair 4

--Pair 5

--Pair 6

--Pair 7

--Pair 8

--Pair 9

--Pair 10

Project Groups

--Project Group 1

--Project Group 2

--Project Group 3

--Project Group 4

--Project Group 5

When comparing the average number of social presence indicators per post in open versus closed threaded discussions, I found that a higher density of social presence occurred in closed threaded discussions than in open threaded discussions (see Table 4.8). For instance, the average number of “Affective” indicators per post is 0.78 in closed discussions (meaning there is an average of 0.78 affective indicators per post) compared to 0.56 in open discussions; the average number of “Cohesive” indicators per post is 1.37 in closed discussions


as compared to 1.17 in open discussions; and the average number of “Interactive” indicators per post is 1.45

in closed discussions vs. 1.09 in open discussions.

Table 4.8 Average Social Presence Indicators Per Post across Open and Closed

Threaded Discussions

Category   Open Discussions (Total, Average)   Closed Discussions (Total, Average)

Affective 101 0.56 1272 0.78

Cohesive 211 1.17 2243 1.37

Interactive 197 1.09 2382 1.45

Total 509 2.81 5897 3.59

I then decided to look deeper within the closed discussions to explore any observable differences between the different types of closed discussions used, because while all three were “closed” discussions, each one had a distinct purpose that could have influenced how students posted in each threaded discussion. When comparing all

three of the different types of “closed” discussions (see Table 4.9 and Figure 4.7), “Pairs”

had the highest total social presence average per post with 4.20 social presence indicators

per post. “Project Groups” was next with an average of 3.76 social presence indicators per post, and “Reading Groups” had the lowest average at 3.21 social presence indicators per post. These differences are likely due to a combination of the group size and the

purpose of each of these threaded discussions. For instance, the “Pairs” and the “Project

Groups” had very specific tasks that required interaction, cohesion, and collaboration

whereas the “Reading Groups” (while also a small group) had less prescriptive tasks (see

Table 3.1 in Chapter 3).


Table 4.9 Average Social Presence Indicators across Closed Threaded Discussions

Category   Reading Groups (Total, Average)   Pairs (Total, Average)   Project Groups (Total, Average)

Affective 549 0.77 253 0.95 470 0.71

Cohesive 776 1.09 467 1.76 1000 1.50

Interactive 956 1.35 394 1.49 1032 1.55

Total 2281 3.21 1114 4.20 2502 3.76

Figure 4.7. Visual depiction of the average social presence indicators grouped by

category in closed threaded discussions.

But when I began to compare each category and later each indicator, the picture changed. Overall, the “Pairs” threaded discussions have the highest average number of social presence indicators per post when all of the categories and indicators are combined.


But when I disaggregated these results, I found that the “Pairs” threaded discussions did

not have the highest social presence density across all three categories of social presence.

For the interactive category of indicators, the “Pairs” group actually had a lower per post

average than the “Project Groups.” At the same time, while the “Reading Groups” had

the lowest total social presence average per post overall, these threaded discussions

actually had a higher average of affective indicators than “Project Groups” (see Table

4.10). This could suggest that certain types of tasks in certain group sizes could elicit

more social presence behaviors per participant than others. That said, the differences are minor, and more research would need to be conducted to support this theory.

Table 4.10 Ranking of Average Social Presence Indicators Across Closed Threaded

Discussions

Social Presence Category & Closed Threaded Discussion Average Per Post

Affective Indicators

Pairs 0.95

Reading Groups 0.77

Project Groups 0.71

Cohesive Indicators

Pairs 1.76

Project Groups 1.50

Reading Groups 1.09

Interactive Indicators

Project Groups 1.55

Pairs 1.49

Reading Groups 1.35

Each threaded discussion, and specifically each closed threaded discussion, consists of different students; even though the tasks might be the same, it is possible that individual students and their natural or learned communication skills influence the


frequency and therefore overall social presence density in a given threaded discussion

(which is in part why I looked at each student’s social presence behaviors during Stage 3

of the content analysis). Therefore, I dug deeper to look at the social presence density

across all closed threaded discussions (see Table 4.11).

One of the Pairs threaded discussions, specifically Pair 9, had the highest overall average of social presence indicators per post of any of the closed threaded discussions, as well as the highest per post average in each of the three categories of social presence indicators. Reading Group E and Reading Group G ended up with the lowest social presence per post averages among the individual threaded discussions. These results follow the general trend identified earlier (see Figure 4.7) with the Pairs threaded discussions having the overall highest

density of social presence per post and the Reading Groups threaded discussions having

the lowest overall density of social presence per post. This could suggest that the size and purpose of a specific discussion strongly influence the number of social presence indicators used by students in any given discussion: For instance, the Pairs discussions

involved two students taking part in personal discussions versus the Reading Groups

which involved small groups of 4-5 students talking about the weekly readings in the

course. As one might imagine, two students discussing personal matters might engender

more affective, cohesive, and interactive indicators than a larger group discussing course

readings.


Table 4.11 Average Social Presence Indicator per Threaded Discussion

Discussion Forum   Total Posts   Affective (Avg. Per Post)   Cohesive (Avg. Per Post)   Interactive (Avg. Per Post)   Social Presence (Avg. Per Post)

Open Discussions

Virtual Office 44 16 (0.36) 59 (1.34) 44 (1.00) 119 (2.7)

General: Syllabus 48 12 (0.25) 44 (0.92) 34 (0.71) 90 (1.88)

General: Groups 14 8 (0.57) 12 (0.86) 16 (1.14) 36 (2.57)

General: Independent Work 3 3 (1.00) 5 (1.67) 3 (1.00) 11 (3.67)

General: Individual Work 2 0 (0.00) 3 (1.50) 2 (1.00) 5 (2.5)

Adult Learning Discussion Forum – Your Learning 12 4 (0.33) 13 (1.08) 12 (1.00) 29 (2.42)

Adult Learning Discussion Forum – Questionnaire #1 3 4 (1.33) 2 (0.67) 5 (1.67) 11 (3.67)

Plus Delta Week 2 13 15 (1.15) 24 (1.85) 15 (1.15) 54 (4.15)

Plus Delta Week 3 22 19 (0.86) 30 (1.36) 36 (1.64) 85 (3.86)

Plus Delta Week 4 2 3 (1.50) 0 (0.00) 3 (1.50) 6 (3.00)

Plus Delta Week 5 2 3 (1.50) 5 (2.50) 2 (1.00) 10 (5.00)

Plus Delta Week 6 3 7 (2.33) 4 (1.33) 4 (1.33) 15 (5.00)

Reading Log 1 12 7 (0.58) 10 (0.83) 20 (1.67) 37 (3.08)

Reading Log 3 1 0 (0.00) 0 (0.00) 1 (1.00) 1 (1.00)

Closed Discussions

Reading Group A 125 110 (0.88) 128 (1.02) 192 (1.54) 430 (3.44)

Reading Group B 132 88 (0.67) 124 (0.94) 203 (1.54) 415 (3.14)

Reading Group C 95 104 (1.09) 129 (1.36) 95 (1.00) 328 (3.45)

Reading Group D 109 120 (1.10) 153 (1.40) 186 (1.71) 459 (4.21)

Reading Group E 40 23 (0.58) 29 (0.73) 41 (1.03) 93 (2.33)

Reading Group F 106 59 (0.56) 84 (0.79) 126 (1.19) 269 (2.54)

Reading Group G 103 45 (0.44) 129 (1.25) 113 (1.10) 287 (2.79)

Pair 1 32 18 (0.56) 46 (1.44) 51 (1.59) 115 (3.59)

Pair 2 40 41 (1.03) 71 (1.78) 59 (1.48) 171 (4.28)

Pair 3 45 41 (0.91) 84 (1.87) 78 (1.73) 203 (4.51)

Pair 4+ 6 5 (0.83) 5 (0.83) 5 (0.83) 15 (2.50)

Pair 5 30 38 (1.27) 65 (2.17) 38 (1.27) 141 (4.70)

Pair 6 28 14 (0.50) 38 (1.36) 30 (1.07) 82 (2.93)

Pair 7 26 23 (0.88) 41 (1.58) 40 (1.54) 104 (4.00)

Pair 8 21 33 (1.57) 48 (2.29) 33 (1.57) 114 (5.43)

Pair 9 15 25 (1.67) 38 (2.53) 30 (2.00) 93 (6.20)

Pair 10 22 15 (0.68) 31 (1.41) 30 (1.36) 76 (3.45)


Project Group 1 109 72 (0.66) 160 (1.47) 167 (1.53) 399 (3.66)

Project Group 2 180 96 (0.53) 276 (1.53) 292 (1.62) 664 (3.69)

Project Group 3 138 111 (0.80) 168 (1.22) 189 (1.37) 468 (3.39)

Project Group 4 113 79 (0.70) 136 (1.20) 141 (1.25) 356 (3.15)

Project Group 5 126 112 (0.89) 260 (2.06) 243 (1.93) 615 (4.88)

Stage Three: Social Presence Categories and Indicators By Students

While conducting the content analysis, I began to get a sense that certain students

used certain social presence indicators (e.g., paralanguage and vocatives) more than

others. Therefore, I decided to investigate the frequency at which each student used social

presence indicators. I reasoned that even though a certain threaded discussion (which consisted of a group of students) might have a high social presence density, that density could be the result of one group member who was extremely active and proficient at employing affective, interactive, and cohesive means of communication in threaded discussions.

Accordingly, I first looked at each participant’s use of all three of these categories of social presence as a whole; however, I excluded five students who did not post more than ten times over the course of the semester. Of those who posted more than ten times,

Cathy had the highest average with 5.43 instances of social presence per post, followed

next by Diana with 4.87 per post, and Mary with 4.64 per post. This becomes more

striking when these results are compared to participants with the lowest use of social

presence indicators per post. The three participants with the lowest number of social

presence indicators per post were Instructor Bob who had the lowest average at 2.24

instances per post, followed by Sam with 2.42 per post, and then Monica at 2.89 per post.
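A minimal sketch of this per-participant disaggregation appears below, assuming the coded posts are available as (author, indicator count) pairs; the data structure and names are illustrative assumptions, not the actual coding output.

    from collections import defaultdict

    def per_student_density(coded_posts, min_posts=11):
        # coded_posts: one (author, total indicators in that post) pair per post.
        posts_by_author = defaultdict(list)
        for author, indicator_count in coded_posts:
            posts_by_author[author].append(indicator_count)

        averages = {}
        for author, counts in posts_by_author.items():
            if len(counts) >= min_posts:  # drop participants with ten or fewer posts
                averages[author] = round(sum(counts) / len(counts), 2)
        # rank participants from the highest to the lowest average per post
        return dict(sorted(averages.items(), key=lambda item: item[1], reverse=True))

    # e.g., per_student_density([("Cathy", 6), ("Cathy", 5), ...]) would return
    # each remaining participant's average number of indicators per post.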


But when I dug a little deeper I found that a high or low social presence rating

(i.e., the average social presence indicators used per post) does not necessarily mean that

the participant in question scores the same on all three categories of indicators or even on

a given set of indicators within a category. In other words, one could feel competent and

comfortable with interactive types of communication but not with affective or cohesive.

For instance, while Cathy had a high overall social presence average per post (when

taking into consideration all three categories of social presence), she had one of the three

lowest interactive averages per post. In other words, while her use of affective and

cohesive indicators was high compared to her peers, her use of interactive indicators was

low compared to her peers. Similarly, while Instructor Bob had an overall low total social

presence score, he in fact had the highest interactive score (see Table 4.12) thus

suggesting that he may be more proficient at interactive types of communication than

cohesive or affective.

Table 4.12 Student’s Use of Social Presence Categories

Student   Total Posts   Social Presence Total (Avg. Per Post)   Affective Total (Avg. Per Post)   Cohesive Total (Avg. Per Post)   Interactive Total (Avg. Per Post)

Adam 76 254 (3.34) 56 (0.22) 109 (0.43) 89 (0.35)

Cathy 77 418 (5.43) 122 (0.29) 175 (0.42) 121 (0.29)

Christine 107 362 (3.38) 86 (0.24) 115 (0.32) 161 (0.44)

Daphne 73 253 (3.47) 42 (0.17) 112 (0.44) 99 (0.39)

Dawn 121 360 (2.98) 69 (0.19) 123 (0.34) 168 (0.47)

Denise 103 393 (3.82) 61 (0.16) 178 (0.45) 154 (0.39)

Diana 94 458 (4.87) 156 (0.34) 151 (0.33) 151 (0.33)

Erica 66 221 (3.35) 53 (0.24) 101 (0.46) 67 (0.30)

Gabriela 55 173 (3.15) 34 (0.20) 66 (0.38) 73 (0.42)

Instructor Bob 328 736 (2.24) 115 (0.16) 204 (0.28) 417 (0.57)


Kate 99 354 (3.58) 52 (0.15) 157 (0.44) 145 (0.41)

Kyleigh 85 274 (3.22) 75 (0.27) 99 (0.36) 100 (0.36)

Laura 39 172 (4.41) 44 (0.26) 73 (0.42) 55 (0.32)

Mary 117 543 (4.64) 91 (0.17) 231 (0.43) 221 (0.41)

Micky 93 423 (4.55) 96 (0.23) 174 (0.41) 153 (0.36)

Monica 53 153 (2.89) 32 (0.21) 61(0.40) 60 (0.39)

Richard 31 130 (4.19) 23 (0.18) 61 (0.47) 46 (0.35)

Sam 78 189 (2.42) 50 (0.26) 55 (0.29) 84 (0.44)

Sara 50 229 (4.58) 54 (0.24) 88 (0.38) 87 (0.38)

Vicky 64 234 (3.66) 47 (0.20) 82 (0.35) 105 (0.45)

But even treating each social presence indicator within a given category equally could hide certain trends. It could be that certain people are strong with some indicators in a given category but not others (for instance, someone might have a high Affective score simply because he or she is proficient at disclosing personal information and sharing emotion, not because he or she uses paralanguage and humor).

So, I decided to take a look at the students with the highest overall social presence

average per post (see Figure 4.8 and Figure 4.9). While Cathy had the highest social

presence per post average at 5.43 instances per post, Cathy’s (like Mary’s) strength

appears to be “greetings and salutations.” Diana, on the other hand, uses “paralanguage” more frequently than “greetings and salutations.” Diana, though, was one of the two students in the Pair 9 threaded discussion, which had the highest per post average of social presence indicators; it is important to note that she was paired with Sara, who had the fourth highest overall average of social presence indicators per post.


These results likely suggest two things. First, just because someone is proficient at employing a certain type or category of social presence behaviors (i.e., affective, interactive, and/or cohesive) does not mean that this same person is proficient at or comfortable with every indicator within that category. In other words, while someone might use a lot of affective types of

communication, he or she might never use paralanguage and vocatives, opting instead for

the use of greetings and salutations, acknowledgement of others, and the use of emotion.

Second, these findings might point to the fact that people—especially in small groups—

might begin to mirror the communication behaviors of their peers. For example, if a peer

(in a small group) has strong social presence behaviors and heavily uses paralanguage

then other students in the group might begin to use paralanguage more frequently than

before simply from mimicking their peer.

Cathy | Diana | Mary

Greetings & salutations 0.84 | Paralanguage 0.64 | Greetings & salutations 0.85

Emotion 0.6 | Acknowledgement 0.63 | Acknowledgement 0.79

Acknowledgement 0.6 | Group Reference 0.62 | Invitation 0.5

Vocatives 0.56 | Invitation 0.59 | Group Reference 0.5

Paralanguage 0.52 | Emotion 0.5 | Vocatives 0.44

Group Reference 0.51 | Self Disclosure 0.49 | Expressing Appreciation 0.44

Invitation 0.45 | Greetings & salutations 0.43 | Emotion 0.37

Expressing Appreciation 0.42 | Vocatives 0.31 | Self Disclosure 0.31

Embracing the Group 0.36 | Expressing Appreciation 0.31 | Embracing the Group 0.18

Self Disclosure 0.35 | Embracing the Group 0.26 | Agreement 0.17

Humor 0.12 | Agreement 0.09 | Paralanguage 0.1

Agreement 0.1 | Humor 0.03 | Humor 0

Figure 4.8. Ranking of social presence indicators used by the three students with the

highest overall social presence per post average.


[Figure 4.9 comprises three charts: Mary’s Individual Use of Social Presence Indicators, Diana’s Individual Use of Social Presence Indicators, and Cathy’s Individual Use of Social Presence Indicators. The per-indicator values shown are the same as those listed in Figure 4.8.]


Figure 4.9. Disaggregation of three students with highest social presence per post

average.

Constant Comparison Analysis

While social science has a long tradition of using content analysis alone to

analyze the content of online discussions, I decided at the beginning of this study to use

multiple types of analysis in an effort to better explore how social presence manifests in

threaded discussions in a completely online course. I turned to constant comparison

analysis in hopes that it would reveal things that were missed by content analysis.

At the beginning of this study, I decided to analyze the threaded discussion with

the highest average social presence density per discussion post and the threaded

discussion with the lowest in hopes of identifying different ways that social presence

manifests in threaded discussions. After conducting content analysis, I selected the Pair 9

threaded discussion as having the highest social presence density at 6.20 per post and the

Reading Group E as having the lowest social presence density at 2.33 per post. I then

used constant comparison analysis to code these two threaded discussions in an effort to



see if themes might emerge that tell a similar or different story than the content analysis

results.

Due to the different nature of each threaded discussion, I conducted constant

comparison analysis on each discussion separately. I first analyzed Reading Group E

(which had the lowest social presence density). As touched on in Chapter 3, the Reading

Group discussions consisted of small groups of 4-5 students that were tasked with

discussing the course readings and jointly writing nine different reading logs about the

course readings. The reading logs were supposed to not only summarize the readings but

also bring up any questions the group members had so that the instructor could then

answer them in the Reading Group threaded discussions. Students had two incentives to

take part in the Reading Group threaded discussions: First, students were graded on each of the nine reading logs, which together accounted for 15.25% of the course grade; second, students were graded on their online interactivity and quality of work, which accounted for 16.95% of the course grade.

As mentioned in Chapter 3, the first step of conducting constant comparison

analysis involved reading the entire threaded discussion. After reading all of the posts in

the threaded discussion, I then chunked the text into meaningful units. I then coded each

meaningful unit while constantly comparing new codes with previous codes. During the

coding process, I focused on the way people were communicating while still trying not to

limit myself or be confined in any way to the social presence indicators used for content

analysis. After coding all of the meaningful units, I then listed the codes and grouped

similar codes.


The analysis resulted in 89 unique codes (see Appendix C). Those codes were gathered

into eight separate groups (see Table 4.13).

Table 4.13 Groups of Codes Resulting from the Constant Comparison Analysis

Reading Group E

Grouping of Codes

Course logistics & facilitation

Emotion

Greetings and Salutations

Sharing Life Details

Gracious/Gratitude

Self Disclosing Personal Matters

Playing Nice with Others

Policy Related Class Discussions

In the fifth and final step, I identified themes from the data—while including

specific language from the groups, codes, or data when appropriate. The following two

themes emerged from the data from Reading Group E. (I have italicized any text that

came straight from the threaded discussions.)

• Policy is complex and multifaceted; it is something that many students and

teachers have no idea about; while the readings varied in complexity and

required a little more time than texts in past classes, with the help of the

instructor the students came to find the study of policy interesting and

relevant.

• Students began the threaded discussion (which spanned two months) with chit

chatting and telling personal stories but quickly changed their focus to the task

at hand of discussing public policy in general and the readings in particular;


over time the focus of the discussion was solely on the readings and public

policy—by this point the discussion largely consisted of students posting

questions and the instructor answering the questions.

After analyzing the Reading Group E threaded discussion, I analyzed the Pair 9

threaded discussion (which had the highest social presence density) following the same

steps as above. The Pair 9 threaded discussion had a different purpose than the reading

group. According to the course syllabus, the Pairs group is a place where group members

work on a personal-professional development activity that requires each student to take a

bit of risk and develop some trust with each other while discussing individual personal-

professional goals that are deeply important to one another. Similar to the Reading Group threaded discussions, students had two incentives to take part in the Pairs threaded discussions: First, students were graded on the 3-5 page paper that resulted from their work in their assigned pairs group, which accounted for 12.7% of the course grade; second, students were graded on their online interactivity and quality of work, which accounted for 16.95% of the course grade.

Likely due in part to this different purpose, the Pairs threaded discussions had a higher social presence density than the other threaded discussions, and the Pair 9 group had the highest density among all of the Pairs and among all of the threaded discussions in general. Analysis of the Pair 9 group resulted in 63 codes (see Appendix C), which I then grouped into nine groups (see Table 4.14).


Table 4.14 Groups of Codes Resulting from the Constant Comparison Analysis

Pair 9

Grouping of Codes

Course logistics & facilitation

Collaboration

Emotion

Sharing Life Details

Playing Nice with Others

Policy Related Class Discussions

Greetings and Salutations

Self Disclosing Personal Matters

Gracious/Gratitude

Three themes emerged from this data as well. Like before, I have italicized any text that

came straight from the threaded discussions.

• Students who have a past relationship and spend time with each other either

professionally (e.g., we are fortunate enough to work together) or personally

outside of class can have an easier time collaborating with each other because of

their past relationship, shared experiences, and geographic closeness which others

might not have. These benefits can help them NOT to be alone, give them

opportunities to chat a lot, provide a strong and safe foundation to openly share

how they are struggling personally and professionally, and to regularly meet face-

to-face.

• Instructors can only react to what they see in a threaded discussion. It is difficult

to assess and to support students when they collaborate offline.

• When asked to take a risk, trust a peer, and self-disclose personal details, it helps

when two people already know each other, have some trust already built, have

shared experiences, and finally have the ability to talk and meet offline.


While the results of the constant comparison analysis did not necessarily

contradict any of the findings from the word count or content analysis, they did begin to

fill in some details about what students were talking about in each threaded discussion

and how the type and purpose of a threaded discussion could influence how people

communicate with one another.

Chapter Summary

I utilized word count, content analysis, and constant comparison analysis to

explore how social presence manifests in threaded discussions in a fully online course. Results illustrate that participants’ use of social presence behaviors (e.g., indicators of social presence) varies across the course. The results also reveal that looking at the total social presence

indicators or even simply the frequency at which each category of social presence is used

(e.g., affective, cohesive, and interactive) might be misleading and miss important details

about how and when people use certain social presence behaviors. These results will be

discussed at greater length in Chapter 5.


CHAPTER 5

DISCUSSION

I set out to explore how social presence manifests in a fully online asynchronous

course. In Chapter 1, I laid out an argument for why additional research needs to be

conducted on social presence. Then in Chapter 2, I reviewed the literature on social

presence and the community of inquiry (CoI). After reviewing the literature, I then

explained in Chapter 3 the methods that were used for this study. Finally in Chapter 4, I

reported on the results of the study. Now in Chapter 5, I will discuss the significance of

these results, the limitations of this study, and the practical implications for the results—

specifically for course designers and faculty.

Key Findings

A deep and meaningful educational experience involves teaching presence, social

presence, and cognitive presence (Garrison et al., 2000). The CoI framework posits that

social presence is developed as the result of teaching presence. More specifically,

educators develop social presence through instructional design and organization,

facilitating discourse, and direct instruction (the three components of teaching presence)

(Anderson, Rourke, Garrison, & Archer, 2001). This does not mean that social presence

cannot naturally occur. Walther (1992) argued almost 20 years ago that people are social

creatures and that given enough time people will find ways to use any communication

medium for social purposes. Online educators, however, typically do not want to wait and hope that their students’ natural social tendencies kick in. Instead, they often strive to find ways to encourage the development of social presence in online courses (which is

what the CoI refers to as teaching presence).


The CoI framework (as well as the CoI literature as a whole), though, does not

provide much guidance on how to design courses, facilitate discourse, and provide direct

instruction to facilitate the development of social presence (Garrison & Arbaugh, 2007).

For instance, how many threaded discussions should be in a course? Should the threaded

discussions be full class discussions or small groups? Should they have specific

instructional tasks? Educators can make some inferences from the indicators of teaching

presence developed by Anderson et al. (2001) (see Table 5.1), but even these indicators

lack sufficient detail.

Table 5.1 Teaching Presence Categories and Indicators

Teaching Presence Categories and Indicators

Instructional Design and Organization

Setting Curriculum

Designing Methods

Establishing Time Parameters

Utilizing Medium Effectively

Establishing Netiquette

Facilitating discourse

Identifying areas of agreement/disagreement

Seeking to reach consensus/understanding

Encouraging, acknowledging, or reinforcing student contributions

Setting climate for learning

Drawing in participants, prompting discussion

Assess the efficacy of the process

Direct Instruction

Present content/questions

Focus the discussion on specific issues

Summarize the discussion

Confirm understanding through assessment and explanatory feedback

Diagnose misconceptions

Inject knowledge from diverse sources e.g., textbook, articles, internet, personal

experiences (includes pointers to resources)

Responding to technical concerns


Some of the results presented in Chapter 4 might begin filling this void. That is,

the results provide a couple of possible guidelines for how educators can design and

develop online courses to increase social presence. However, because this was an exploratory study using a small sample, the findings should not be generalized to all populations. In fact, any and all findings should be confirmed with additional research.

With that in mind, I will address some key findings in the following paragraphs.

Group Size

One of the first things that stood out initially with the word count results and then

with the content analysis results was that the social presence density—that is, the average

social presence indicator per discussion post—differed across types of threaded

discussions, specifically open vs. closed discussions. In other words, a higher social

presence density existed for small-group discussions than for large-group discussions.

This suggests that students projected themselves as “real” and “there” in the threaded

discussions through specific social presence behaviors (e.g., self disclosing information,

addressing people by first name, using emoticons) more frequently in small discussions

than in large discussions.

While very little research has been conducted on group size and social presence,

Tu and McIsaac (2002) claimed that “appropriate communication group size” can

influence social interaction and thus social presence. They concluded based on the

qualitative data in their study that “the size of the discussion group exerted a major

impact on students’ interaction, particularly in real-time discussions” (p. 145). And while

they recommend that two to three participants are an ideal group size for real-time


discussions, they unfortunately do not offer any suggestions for an ideal group size for

asynchronous discussions.

Rourke and Anderson (2002a) conducted a study on using peer teams to lead

discussions. They found that students preferred small-group peer-led threaded discussions over full-class instructor-led discussions. They concluded that this preference was possibly due to the fact that the small-group discussions consisted of four students and were led by their peers rather than the instructor. But the students’ preference for small-group

discussions could have been due to a combination of the group size, the instructional task,

and the instructor’s reduced role rather than simply the fact that the discussions were peer

led.

This finding about large- and small-group discussions, however, does not suggest

that social presence cannot develop in large group discussions. In fact, Nagel and Kotze

(2009) found high levels of social presence in a “super-sized” course of 100+ students.

This finding about group size might simply confirm what Kreijns, Kirschner, and

Jochems (2003) argued about group size—namely, that anonymity and non-participation

increase as groups get larger (p. 340). In other words, as the group size (or class size)

increases, it is easier for students to hide and sit back and lurk (or not participate at all).

Lurking is not necessarily a bad thing (see Dennen, 2008). However, students need to

actually interact with their peers in order to project themselves as “real” and “there” in

threaded discussions. And this type of interaction might simply be easier for students in

smaller groups—especially those who might have a tendency to lurk in large threaded

discussions.


My personal experience teaching in face-to-face environments has shown me that

as the class size gets larger, fewer and fewer students ask questions on their own. In my

experience, small groups can force even the shyest and most reluctant students to talk to their peers. Given this, it might make sense for faculty to utilize small-group threaded

discussions more at the beginning of a course to help students begin to establish their

social presence early on in a given course or program of study. Small groups likely place

an additional amount of peer pressure on individual students. Individual students are no

longer simply held accountable for their actions by their instructor but also by their peers.

In my experience, peers are much more likely to send other peers an email for

nonparticipation in small groups—especially those that involve group work—than they

would in large group discussions. Further, these results could support the need for the

development of a number of small learning communities rather than the typical approach,

which too often focuses on developing one all-encompassing learning community with

every student in the course. More research though needs to be conducted across other

samples to confirm that group size can in fact influence the development of social

presence.

Instructional Task

In this study, though, group size alone did not guarantee a high level of social

presence. For instance, project groups and pairs had a higher social presence density than

reading groups even though reading groups were also small groups. This difference in the

social presence density likely could be due to the instructional task of each threaded

discussion. In my experience, students quickly identify which discussions they need to take part in and which ones they do not, both in terms of the relevance of the threaded


discussions to the course and to their personal and professional goals, as well as any points the discussion is worth toward the final grade.

Students’ participation in both of these threaded discussions was graded and both

of these threaded discussions were tied to specific assignments that were graded as well.

However, as mentioned in Chapter 4, the reading groups involved identifying questions

that resulted from the course readings and then having the instructor answer the

questions. As a result, the dynamic of the discussions appeared to be less goal specific (or at least less clearly defined) than in the other two types of small threaded discussions. Reading

Groups had less peer accountability at least in comparison to the Pairs threaded

discussion. Also, more student-to-instructor and instructor-to-student rather than student-

to-student interaction occurred in these threaded discussions. In fact, when looking at the

number of posts and the number of words in each post in these threaded discussions, the

instructor’s role in the reading groups is more prominent than in the Pairs or Project

groups (see Table 5.2). This does not necessarily mean that instructors should say less or

avoid direct instruction. In fact, the CoI argues for the use of direct instruction as one way

to establish social presence. Rather it might simply suggest that the purpose of a

discussion likely influences how and what a student posts—and therefore the amount of

social presence behaviors used by both faculty and students.

Table 5.2 Instructor vs. Student Postings in Small Discussions

Reading Groups Pairs Project Groups

Posts Words Posts Words Posts Words

Student 543 (77%) 42,176 (71%) 219 (83%) 22,087 (89%) 630 (94%) 47,758 (91%)

Instructor 165 (23%) 16,860 (29%) 46 (17%) 2,629 (11%) 42 (6%) 4,992 (9%)


The pairs discussion groups had the highest overall density of social presence.

While this is likely due in part to the fact that the pairs groups consisted of only two

students, it is perhaps equally due to the fact that the pairs groups were tasked with

sharing personal things with one another. In fact, the pairs had the highest frequency of

affective indicators per post, which is likely largely due to the instructional task. To date, though, no research has specifically examined how particular instructional tasks in threaded discussions affect the social presence behaviors used in those discussions.

Researchers for years have questioned how best to structure threaded discussions

(Gilbert & Dabbagh, 2005). And they have shown that the structure of a threaded

discussion as well as how an instructor posts—thus modeling and setting the tone—can

influence how students post (see Dennen, 2005). While Lowenthal and Dunlap (2011) investigated how specific instructional tasks influence students’ perceptions of social presence, to date there is a lack of research on how small working

groups (working on specific assignments—whether group assignments or not) can help

build social presence.

The pairs groups’ higher social presence density, though, could also be due in part to the instructor’s role in these discussions. An, Shin, and Lim (2009) found

that “when the instructor’s intervention was minimal, students tended to more freely

express their thoughts and opinions, with a large number of cues for social presence” (p.

749). Thus, while course designers like myself as well as faculty in general seem to

prefer clear-cut guidelines, it is possible that there are not any clear-cut guidelines. These

results seem to suggest that it could be a combination of small group size, instructional

tasks that engender interpersonal dialogue, and low instructor involvement that helps


build social presence. But at this point, while this is reasonable, it is simply speculation.

Additional variables such as one’s personal communication style, how discussions are

graded, and the relevance of the instructional tasks to name a few, need to be investigated

to see how they too influence how social presence manifests. Further research needs to

be conducted to verify how instructional tasks (including not only what students are

asked to do but also how they are graded as well as the personal and professional

relevance of the assignments), group size, and instructor involvement can impact the

development of social presence.

Past Relationships

Constant comparison analysis of the threaded discussions with the highest and the

lowest density of social presence revealed that the pairs with the highest social presence

density worked together and even carpooled together. Practitioners have argued for years

that online courses—whenever possible—should start with face-to-face meetings to

establish social presence. This finding, though, might suggest something more. It could

suggest that people who have a strong relationship outside of class might have an easier

time with interactive, cohesive, and affective types of communication than people who do

not have a relationship outside of class. This finding is supported by the work of

Lowenthal and Dunlap (2011). Lowenthal and Dunlap found that having a positive group

project experience with a student helps increase a student’s perceptions of social presence

between the students in question and helps them maintain future relationships with one

another—even in the absence of ever meeting face-to-face.

Both of these findings suggest that having a past relationship with someone is

helpful when establishing social presence in online courses. It could be that a cohort


model that gives students multiple opportunities to build relationships with other

students semester after semester is more valuable (at least when it comes to building

social presence) than beginning a course or a program with face-to-face meetings.

Walther (1994) argued years ago that the possibility of future interaction can influence

the degree to which people socially interact online, thus lending further support to cohort

models or other types of models that enable students to take multiple courses with the

same students and/or with the same instructor. Further research though is needed to

confirm this because while the students’ past relationship emerged in the data in this one

group, it was difficult to ascertain whether or not other students had past relationships

with their peers and if so to what degree.

One Size Doesn’t Fit All

But perhaps the number one finding of this study from a design perspective, as

disheartening as it is, is that one size does not fit all. In other words, the results show that

while there are trends (e.g., that closed threaded discussions had a higher social presence

density than open threaded discussions), there is not always a clear reason as to why some

students use specific social presence behaviors (e.g., paralanguage) and others do not.

While some students might use (or some threaded discussions might elicit) high levels of

social presence overall, each of the indicators or at least the categories (i.e., types of

social presence) differed across students and types of threaded discussions.

This finding supports what Lowenthal and Dunlap (2011) found. They found that

each student appears to have her or his own threshold for social presence. In other words,

different people have their own social presence needs. What works for one student might

not work for another and what is comfortable or ideal for one student might not be


comfortable or ideal for another. Along these lines, it is possible that each person, perhaps based in part on his or her own social presence needs, has developed his or her own level of proficiency at utilizing social presence behaviors in threaded discussions. That is,

each person has developed different levels of literacy at computer-mediated discourse.

However, a stylistic element appears to affect how people communicate in online

learning environments as well. For instance, some students appear to almost habitually

use emoticons (like Diana) whereas others do not appear to use them at all (like Kate,

Denise, Dawn, or Laura). It is possible that just as people have different communication

styles in face-to-face environments, they also have different communication styles in

online environments. Further research though is needed to find out why some people use

certain types of communication behaviors (e.g., the use of vocatives or paralanguage) and

others do not.

Limitations of Studying Social Presence

Every research study has some limitations. I address the limitations of this study

later in this chapter. For now, though, I want to address some insights that resulted from

studying social presence behaviors in threaded discussions in this study. These insights

are possible limitations of social presence theory in general (or at least how social

presence is conceptualized within the CoI) as well as possible limitations with identifying

and quantifying social presence behaviors in particular. While practitioners will likely

find little use for these insights, researchers of social presence, on the other hand, might

find this section the most useful contribution of this study.


Situational Variables of CMC

As mentioned earlier, social presence theory dates back to the work of Short et al.

(1976). Short et al. developed their theory of social presence based on their research on

how telecommunications affects the way two people communicate. In other words, Short

et al. and their theory of social presence originally focused on one-to-one communication.

While instances of one-to-one CMC (e.g., email) occur, more often than not CMC in

online courses takes place in threaded discussions that involve three or more

communicators. Instances of one-to-one communication are found in threaded

discussions. But this one-to-one communication is often done “in front” of others. The

“publicness” of CMC in threaded discussions is likely to influence what, when, and how

a person communicates in online courses—which is perhaps why Tu focused so much on

privacy in his early work (see Chapter 2 and Tu, 2000, 2001, 2002a, 2002b) and perhaps

why students feel more comfortable or more pressured to present themselves as “real”

and “there” in small personal groups as opposed to larger impersonal groups.

More often than not, though, communication in threaded discussions is a one-to-

many model—thus changing the dynamic and making it more like public speaking. Or

when it is one-to-one, it is like talking to another person on the phone but while on a

speakerphone (where others are listening). I contend that these changes in the social

context in which one communicates—more than any limitation of the technology—likely

change how people communicate and establish themselves as “there” and “real.” This

becomes important when one starts to think about the indicators of social presence

developed by Rourke et al. (2001a).


Rourke et al. (2001a) claim that they developed these categories and indicators

based on their previous work (Garrison, Anderson, & Archer, 2000), other literature in

the field, and finally their experience reading online transcripts. However, I posit that we

have reason to believe that, as the technology of online threaded discussion forums has

improved over the years, as bandwidth has increased, as people’s experience using CMC

has improved (e.g., the growth of email use and Facebook’s millions of users illustrate how people’s comfort and ability with CMC have improved), and finally as the pedagogies used in online courses have matured, the study of online transcripts has, or at least should have, changed over the years. In other

words, many of these indicators of social presence might no longer be relevant, might

lack sufficient specificity, or simply might be based too much on old assumptions about so-

called “proper” ways to communicate with CMC (which were likely influenced by the

older one-to-one model of CMC).

For instance, while addressing someone by his or her first name might help build

a sense of closeness and presence, the genre of CMC that takes place in online courses—

especially in large threaded discussions—often makes it difficult to use certain types of

social presence behaviors like addressing someone by her or his first name. For example,

when an instructor is addressing the class as a whole, it does not make sense to begin a

post by mentioning everyone’s name. Further, while it might make sense to begin an

initial reply to someone’s post by stating her or his first name (to build cohesion), as a

thread continues and the posts go back and forth, an insistence on beginning each post

with someone’s first name could influence the ebb and flow of a conversation and

possibly hurt cohesion by making the conversation feel overly formal. Another example


of the issues that arise when beginning a post with someone’s first name occurs when an instructor is responding to one student but wants to invite the entire class

into the discussion. If an instructor begins the reply with the student’s name, then it may

send a message to the other students that the post is only for that person and not the rest

of the class.

Another situational variable that is given very little attention in social presence

theory in general and specifically in the CoI framework (see Garrison et al., 2000) is how

one’s role or status can influence not only how but what one communicates and how one

is perceived as being “there” or being “real.” For instance, it is reasonable to assume that

students in an online course—even in a so-called “learner-centered” course—are more interested in what their instructor has to say than in what their peers have to say (if only because the

instructor will be assigning their grade at the end of the semester). In fact, eCollege—a

Learning Management System used at the University of Colorado Denver—recently

started highlighting instructors’ posts with a different color to differentiate them from the

rest—thus suggesting that instructors’ posts are different from students’ posts.

While the CoI framework has an element called “teaching presence,” as

mentioned earlier, it focuses on how instructors design and organize a course, facilitate

discourse, and provide direct instruction. Teaching presence, though, does not

specifically address how an instructor establishes his or her own social presence,

especially given the added tasks of providing direct instruction and facilitating discourse.

I have argued elsewhere (Lowenthal & Lowenthal, 2010), in part building on the

work of Swan and Shih (2005) and their differentiation between students’ and instructors’

social presence, that one problem with the CoI framework is that it does not differentiate


(or really even acknowledge) how an instructor might establish his or her social presence

differently than students. In my experience, instructors often talk differently than their

students—this happens both in face-to-face classrooms and online. Further, each

instructor has her or his own style and level of comfort in the classroom. While some

instructors share parts of their personality and will engage in affective types of

communication, others will not. Further, while instructors might build opportunities to

establish social presence in their own online courses, in my experience they often will

not engage in these activities with students.

The bottom line is that when instructors talk (i.e., post), students tend to listen

(i.e., read). This is not always the case when other students talk. Students are not always

as interested in what their peers say as in what their instructor says. I often think about

what instructors do to establish their own social presence and how the little things they do

(because of their status) can carry even more weight than if a fellow student did the exact

same thing. For instance, I posit that, when an instructor engages in affective

communication (e.g., sharing emotion or self-disclosing), it carries more weight than

when a student does the same thing. Further, and because of the difference in roles and

status, students tend to talk to an instructor differently than to their peers (i.e., code

switch; see White & Lowenthal, 2011). But none of these dynamics are considered when

researchers study social presence.

Further, as a result of this study, I have begun thinking about ways that

instructors and students can actually thwart social presence. For instance, what happens

when a student posts a question that nobody acknowledges or responds to? While this

likely happens in most courses at least once a semester—if only because some students


post at the last minute in a given week—as a result of this study, I have begun thinking

about how detrimental it can be if a student self-discloses personal information or shares

something emotional and nobody responds or acknowledges it. Occurrences like this could

possibly result in students feeling alienated and not acknowledged as being “there” and

“real.” But so much of the literature on social presence focuses on what people do to

establish social presence rather than on things people can do to thwart social presence.

Short et al. (1976) originally studied types of communication that were not only

one-to-one but also synchronous (i.e., occurring in the moment). I contend that asynchronous threaded

discussions in online courses that take place over time, involve a many-to-many model,

likely involve students who have past relationships with each other (e.g., from past

courses) and likely future relationships (e.g., future courses), and consist of individuals

who are most likely paying money to be involved in the threaded discussions (and

therefore have some extra motivation to effectively communicate with one another and

their instructor) are a bit more complicated than Short et al. and possibly even Rourke et

al. (2001a) could have originally imagined. I also contend that situational variables like

these need to be considered when studying social presence. For instance, while content

analysis is a useful technique to study online discussions, quantitative measures or counts

of social presence behaviors might have limited value—especially when they do not take

into consideration the context in which social behaviors are used.

Unit of Analysis

Among other things, the unit of analysis one uses when conducting content

analysis influences the frequency of social presence indicators. For instance, following

past researchers’ lead (e.g., Rourke et al., 2001a; Swan, 2003), for this study I used


the entire discussion post as the unit of analysis. While I do not regret this decision, I now

recognize that the unit of analysis one chooses can largely determine what one sees and

what one does not see in her or his findings.

I assert that when researchers approach analyzing online threaded discussions

from a purely quantitative content-analysis perspective, frequency counts are everything. If researchers only count a specific indicator of social presence (e.g., use of emotion) once in a post because the post is the unit of analysis, they are likely to miss

some details. For instance, you can imagine how many times students might use the word

“we” as a group reference within a single post in small-group discussions focused on a

group project. But if the unit of analysis is simply the entire post, the high frequency of

the use of the word “we” may be lost in the totality of the words. I posit that the

frequency of this group reference—the word “we”—would be captured more accurately

if the unit of analysis was smaller than the entire post (e.g., each meaningful unit). For

example, if a discussion post has the group reference “we” five times in the post, this

indicator of social presence would only be counted once if the unit of analysis is the

entire post but might be counted up to five times if the unit of analysis was a meaningful

unit (which is often, but not always, the sentence level).
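
To make this counting difference concrete, the short Python sketch below (purely illustrative; the sample post is hypothetical and this is not the procedure used in this study) contrasts counting the group reference “we” once per post with counting it once per sentence-level meaningful unit. Either count is defensible; the point is simply that the choice of unit changes the frequencies a researcher reports.

    import re

    # Hypothetical discussion post used only to illustrate the counting difference.
    post = ("We finished the draft. I think we should revise the introduction. "
            "Can we meet Friday so we can divide up the rest of the work?")

    def count_post_level(text, indicator="we"):
        # Post as the unit of analysis: the indicator is counted at most once per post.
        words = re.findall(r"[a-z']+", text.lower())
        return 1 if indicator in words else 0

    def count_unit_level(text, indicator="we"):
        # Sentence-level meaningful units: each unit containing the indicator counts once.
        units = re.split(r"[.?!]+\s*", text)
        return sum(1 for unit in units
                   if indicator in re.findall(r"[a-z']+", unit.lower()))

    print(count_post_level(post))  # 1 -> the whole post is the unit
    print(count_unit_level(post))  # 3 -> three sentences contain "we"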

Researchers have written much about the ideal unit of analysis when using

content analysis to code online discussions (De Wever, Schellens, Valcke, & Van Keer,

2006; Rourke & Anderson, 2004; Rourke, Anderson, Garrison, & Archer, 2001b).

Unfortunately, very little consensus exists on which approach is best because, while one might gain granularity by using a smaller unit of analysis, interrater

reliability decreases and workload increases. I finally decided to stick with using the


entire discussion post as my unit of analysis after hearing Wise (Wise & Chiu, 2011)

justify, at AERA, her decision to use the entire post as her unit of analysis. She argued that students read, and therefore interact with and make meaning from, each post in threaded discussions, not each paragraph or word. Future research must investigate how the unit of analysis influences the results of content analyses of threaded discussions.

Problems with the Social Presence Indicators and Treating Them Equally

To truly understand social presence, researchers ideally should look at both

students’ attitudes toward social presence and students’ behaviors online. In other

words, researchers need to get a better idea of what specific behaviors elicit perceptions

of “closeness” and “realness” in others. The indicators of social presence are a great start

but they have limitations (as touched on earlier). For instance, when using them to

conduct content analysis, a researcher is supposed to identify each instance

of each indicator. Let’s take greetings and salutations as an example. The problem with

this is that greetings and salutations, while similar, are two different things. For instance,

one could argue that someone who continually uses a salutation more than a greeting is

focusing more on themselves than on acknowledging others in a given threaded

discussion. Further, a greeting with a vocative (e.g., “Hi John”) is arguably better at

developing a sense of presence and projecting oneself as “real” and “there” than either

“Hi” or ending a post with one’s first name.

Similarly, the current coding sheet lists paralanguage as a type of affective

communication that establishes social presence. The problem, though, with using

paralanguage as an indicator of social presence is twofold: First, not all uses of paralanguage are necessarily equal; second, students use paralanguage differently. Regarding the


first point, some students appear to be chronic users of ellipses and seem to almost use

them as a period or a pause rather than in an emotive sense. This use of paralanguage is

different from intentional uses of emoticons and should arguably be treated differently. Second, paralanguage—especially the use of anything more than

emoticons—seems to be a learned behavior that only certain types of students use. In

other words, if a student is likely to express her or himself in ALL CAPS or with !!!!,

then he or she is likely to do it again, whereas other students never seem to use this type

of communication in threaded discussions.
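
To illustrate why lumping these together can obscure such differences, the following Python sketch (again purely illustrative; the patterns and sample posts are my own assumptions, not the coding scheme used in this study) separates a few surface markers of paralanguage instead of counting them as a single indicator:

    import re

    # Hypothetical posts illustrating different surface forms of paralanguage.
    posts = [
        "I finally finished the draft!!!! SO relieved :)",
        "I guess that works... we can talk more later...",
    ]

    # Separate patterns for markers a coding sheet might otherwise treat as one indicator.
    patterns = {
        "emoticon": re.compile(r"[:;]-?[)(DP]"),
        "repeated_punctuation": re.compile(r"[!?]{2,}"),
        "conspicuous_caps": re.compile(r"\b[A-Z]{2,}\b"),
        "ellipsis": re.compile(r"\.{3,}"),
    }

    for post in posts:
        counts = {name: len(pattern.findall(post)) for name, pattern in patterns.items()}
        print(counts)
    # The first post shows an emoticon, repeated punctuation, and conspicuous capitalization;
    # the second shows only ellipses, which may mark a pause rather than emotion.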

Treating all uses of emotion equally raises other issues. For instance, students

may use the word “hope” without appearing to use it in an emotional sense.

Similarly, other students use the word “thanks” as a habitual salutation rather than as a

sign of appreciation.

Given this, it might be more useful for researchers to identify levels of each

indicator. For instance, researchers could identify instances of emotive text and then determine whether each is a strong, medium, or soft use. One way to address some of this

is to be able to interact with the participants as one codes the threaded discussions. In

other words, member checking might be an essential component when identifying social

presence behaviors because reading the text alone might not be enough. Or even better, a

researcher should be able to check with both the original poster (about intent) and with the faculty and students who read the post (about how they perceived the so-called social presence behavior), because analyzing online discussion behaviors without knowing the intent behind them, or how the language used in the postings is perceived, is limiting from both a design and a research perspective.


Another problem—which I mentioned in Chapter 1—focuses on researchers’

tendency to treat all three categories and subsequent indicators of social presence equally.

As I mentioned in Chapter 2, some researchers tend to define social presence as not only

presenting oneself as “real” and “there” but also establishing a positive emotional

connection with others. In this case, it makes sense that while interactive and cohesive

types of communication are important and possibly necessary building blocks for

affective communication, affective communication is the best way to build an emotional

connection with others. In other words, simply ending a discussion posting with a

salutation is not nearly as powerful as disclosing personal information. Further research, though, is needed to test this theory.

Problems with Measuring the Community of Inquiry

One final observation involves a conflict between the Community of Inquiry

Questionnaire—which was developed relatively recently by a team of CoI researchers

(Arbaugh et al., 2008; Swan et al., 2008)—and the indicators of social presence

developed by Rourke et al. (2001a). To illustrate my point, Table 5.3 lists the Community

of Inquiry Questionnaire questions for social presence next to Rourke et al.’s original

indicators.


Table 5.3 Measuring Social Presence in a Community of Inquiry

Affective expression

Community of Inquiry Questionnaire:
14. Getting to know other course participants gave me a sense of belonging in the course.
15. I was able to form distinct impressions of some course participants.
16. Online or web-based communication is an excellent medium for social interaction.

Indicators of Social Presence:
--Paralanguage
--Emotion
--Humor
--Self Disclosure

Open communication

Community of Inquiry Questionnaire:
17. I felt comfortable conversing through the online medium.
18. I felt comfortable participating in the course discussions.
19. I felt comfortable interacting with other course participants.

Indicators of Social Presence:
--Acknowledgement
--Agreement / Disagreement
--Invitation
--Expressing Appreciation

Group cohesion

Community of Inquiry Questionnaire:
20. I felt comfortable disagreeing with other course participants while still maintaining a sense of trust.
21. I felt that my point of view was acknowledged by other course participants.
22. Online discussions help me to develop a sense of collaboration.

Indicators of Social Presence:
--Greetings & Salutations / Phatics
--Vocatives
--Group Reference / Inclusivity
--Embracing the Group

While keeping in mind that the Community of Inquiry Questionnaire is meant to

measure students’ attitudes and perceptions and the indicators of social presence are

meant to identify what students do and say, one would still expect to see more overlap

between the two instruments. But when I look at the questions for affective expression

and then the indicators for affective expression, I see very little overlap. First, the


Community of Inquiry Questionnaire has some strange questions that seem to focus more

on the medium than on what a participant does. For instance, question 16 states “online

or web-based communication is an excellent medium for social interaction.” Students are

asked the degree to which they agree with this statement. But is it not possible that

students might agree that CMC can be an excellent medium for social interaction but

disagree that CMC has been well used in a specific course? Second, question 14 appears to be asking about a student’s sense of belonging, but the social presence indicators do not address

that. In fact, group cohesion and specifically group reference seem like better indicators

of students’ feeling a sense of belonging to a group. The problems continue when one

looks at open communication and group cohesion. My point, or rather my insight, is that researchers who use the Community of Inquiry Questionnaire to study social presence appear to be studying different things from those who use the indicators of social presence.

Limitations of the Study

As I mentioned earlier and in Chapter 1, every study suffers from some type of

limitation. Perhaps the first limitation of this study is the small sample size. While I

intentionally chose this small sample as a starting point for my line of research, I

recognize that multiple samples might have provided a useful point of comparison.

Threaded discussions are rich and full of data for researchers to mine. But I have

come to the conclusion that relying only on threaded discussions is limiting. A researcher

misses the things that might be said in emails, over the phone, or even in assignments

turned in to a drop-box. Recently, researchers (Archer, 2010; Shea et al., 2010; Shea et

al., 2009; Shea, Vickers, et al., 2009; Shea & Vickers, 2010) have argued for the need

to look at an entire course—rather than just threaded discussions or survey data—when


studying the CoI. Like these researchers, I have found that another limitation of this

study is that it only focused on what was posted in the threaded discussions.

Last but not least, conducting content analysis without being able to check with

students about the meaning behind their postings as well as how other students interpret

their postings is also problematic and perhaps the biggest limitation of studies like this.

Concluding Thoughts and Implications

Despite the aforementioned limitations, the results of this study can be useful for

researchers and practitioners alike. From a research perspective, the study suggests that

social presence is much more complicated than previously conceptualized. While it is

helpful to investigate how students establish and maintain social presence in online

courses—which in this study was restricted to investigating postings in threaded

discussions—the list of indicators of social presence originally developed by Rourke et

al. (2001a) needs to be revised. Further, multiple and mixed methods should be employed

whenever possible to investigate not only what students do and say but also how these

behaviors are perceived by others. Finally, and perhaps most importantly, researchers

need to spend more time focusing on how situational variables, such as the size of the

group, the instructional task, and the instructor’s role, in combination with personal

preferences, influence how social presence is established and maintained.

Research, though, should inform practice. The results of this study, despite

limitations, have a number of pedagogical implications that instructional designers and

faculty alike can apply. For instance, the results of this study point to the importance of

the intentional use of different types of group activities and threaded discussions. Much

like in large lecture classrooms, students taking part in large threaded discussions might


feel that their voice is lost, that it is simply too hard to reach out and be heard, and that it

is too difficult to project their personality. Further, rather than having several small

weekly threaded discussions, it might make more sense—at least from a social presence

perspective—to include some longer project-based discussions with smaller groups that

take place over multiple weeks.

The results also argue for the importance of “positive” past relationships with

students. Developing a traditional cohort model where students complete their entire

program of study with the same group of students is one way to help leverage the power

of past relationships to build social presence. This research, though, simply suggests that

it might be advantageous, in terms of social presence, to have students take together a few back-to-back courses that build upon one another, and to involve the students in well-orchestrated group work. This research also seems to suggest that having the same

instructor teach more than one course (and possibly back to back) could be powerful in

terms of leveraging past relationships between instructors and students—this, though,

assumes the past relationships are positive.

However, when it is not possible to have students complete their program of study

in a cohort or take back-to-back courses, designers and faculty could focus on having

both getting-to-know-you activities up front and reconnecting activities throughout

a given course (see Dunlap & Lowenthal, 2011, in press, for more on “getting to know

you” strategies that can be used in the beginning of a course as well as throughout a

course).

Finally, it is likely that a magic social presence formula does not exist. Each

student might have her or his own sensitivity to and proficiency at projecting her- or


himself as “real” and being “there” and specifically establishing connections with others.

Therefore, instructors and designers must try to find multiple and continual ways for

students and instructors to present themselves as “real” and “there” in threaded

discussions as well as in other parts of online courses. For instance, some of the strategies I

have used and written about with colleagues to establish and maintain social presence

involve using digital stories (see Lowenthal & Dunlap, 2007, 2010), using social media

(see Dunlap & Lowenthal, 2009a, 2009b), using digital music (see Dunlap & Lowenthal,

2010b), giving feedback publicly (see Lowenthal & Thomas, 2010), and making one-on-

one phone calls to students (see Dunlap & Lowenthal, 2010a).

Given all of this, and in conclusion, designers and faculty should consider the

following elements the next time that they design or teach a course:

• Set up a variety of small discussion groups;

• Provide well-structured small-group assignments that take time and

collaboration to complete;

• Balance instructor involvement;

• Establish incentives for students to take part in threaded discussions;

• Use a variety of instructional tasks and discussion prompts—some of which

ask students to share personal and emotional details (when appropriate).


APPENDIX A

Social Presence Measures

Table A1 Feelings about CMC

1. Stimulating-dull

2. Personal-impersonal

3. Sociable-unsociable

4. Sensitive-insensitive

5. Warm-cold

6. Colorful-colorless

7. Interesting-boring

8. Appealing-not appealing

9. Interactive-noninteractive

10. Active-passive

11. Reliable-unreliable

12. Humanizing-dehumanizing

13. Immediate-non-immediate

14. Easy-difficult

15. Efficient-inefficient

16. Unthreatening-threatening

17. Helpful-hindering

Note. From “Social Presence Theory and Implications for Interaction and Collaborative

Learning in Computer Conferences,” by C. N. Gunawardena, 1995, in International

Journal of Educational Telecommunications, 1(2/3), 147-166.


Table A2 Social Presence Scale

1. Messages in GlobalEd were impersonal.

2. CMC is an excellent medium for social interaction.

3. I felt comfortable conversing through this text-based medium.

4. I felt comfortable introducing myself on GlobalEd.

5. The introduction enabled me to form a sense of online community.

6. I felt comfortable participating in GlobalEd discussions.

7. The moderators created a feeling of online community.

8. The moderators facilitated discussions in the GlobalEd conference.

9. Discussions using the medium of CMC tend to be more impersonal than face-

to-face discussion.

10. CMC discussions are more impersonal than audio conference discussions.

11. CMC discussions are more impersonal than video teleconference discussions.

12. I felt comfortable interacting with other participants in the conference.

13. I felt that my point of view was acknowledged by other participants in

GlobalEd.

14. I was able to form distinct individual impressions of some GlobalEd

participants even though we communicated only via a text-based medium.

Note. From “Social Presence as a Predictor of Satisfaction Within a Computer-mediated

Conferencing Environment,” by C. N. Gunawardena and F. J. Zittle, 1997, in The

American Journal of Distance Education, 11(3), 8-26.


Table A3 Model and Template for Assessment of Social Presence

Category / Indicators / Definition of Indicators

Affective Responses
  Expression of emotions: Conventional expressions of emotion, or unconventional expressions of emotion; includes repetitious punctuation, conspicuous capitalization, emoticons
  Use of humor: Teasing, cajoling, irony, understatements, sarcasm
  Self-disclosure: Presents details of life outside of class, or expresses vulnerability

Interactive Responses
  Continuing a thread: Using reply feature of software, rather than starting a new thread
  Quoting from other messages: Using software features to quote others’ entire message or cutting and pasting sections of others’ messages
  Referring explicitly to other messages: Direct references to contents of others’ posts
  Asking questions: Students ask questions of other students or the moderator
  Complimenting, expressing appreciation: Complimenting others or contents of others’ messages
  Expressing agreement: Expressing agreement with others or content of others’ messages

Cohesive Responses
  Vocatives: Addressing or referring to participants by name
  Addresses or refers to the group using inclusive pronouns: Addresses the group as we, us, our, group
  Phatics / Salutations: Communication that serves a purely social function; greetings, closures

Note. From “Assessing Social Presence in Asynchronous Text-based Computer

Conferencing,” by L. Rourke, T. Anderson, D. R. Garrison, and W. Archer, 2001a, in Journal of

Distance Education, 14.


Table A4 Additional Social Presence Variables

Dimensions

I. Social Context
  Familiarity with recipients
  Assertive / acquiescent
  Informal/formal relationship
  Trust relationships
  Social relationships (love and information)
  Psychological attitude toward technology
  Access and location
  User’s characteristics

II. Online Communication
  Keyboarding and accuracy skills
  Use of emoticons and paralanguage
  Characteristics of real-time discussion
  Characteristics of discussion boards
  Language skills (reading, writing)

III. Interactivity
  Timely response
  Communication styles
  Length of messages
  Formal/informal
  Type of tasks (planning, creativity, social tasks)
  Size of groups
  Communication strategies

IV. Privacy
  Formats of CMC
  Access and location
  Patterns of CMC

Note. From “The Relationship of Social Presence and Interaction in Online Classes,” by

C.-H. Tu and M. McIsaac, 2002, in The American Journal of Distance Education, 16(3),

131-150.


Table A5 Community of Inquiry Survey Instrument

5 point Likert scale 1=strongly disagree, 2=disagree, 3=neutral, 4=agree, 5=strongly agree

Teaching Presence

Design & Organization 1. The instructor clearly communicated important course topics.

2. The instructor clearly communicated important course goals.

3. The instructor provided clear instructions on how to participate in course learning activities.

4. The instructor clearly communicated important due dates/time frames for learning activities.

Facilitation 5. The instructor was helpful in identifying areas of agreement and disagreement on course topics

that helped me to learn.

6. The instructor was helpful in guiding the class towards understanding course topics in a way

that helped me clarify my thinking.

7. The instructor helped to keep course participants engaged and participating in productive

dialogue.

8. The instructor helped keep the course participants on task in a way that helped me to learn.

9. The instructor encouraged course participants to explore new concepts in this course.

10. Instructor actions reinforced the development of a sense of community among course

participants.

Direct Instruction 11. The instructor helped to focus discussion on relevant issues in a way that helped me to learn.

12. The instructor provided feedback that helped me understand my strengths and weaknesses.

13. The instructor provided feedback in a timely fashion.

Social Presence

Affective expression 14. Getting to know other course participants gave me a sense of belonging in the course.

15. I was able to form distinct impressions of some course participants.

16. Online or web-based communication is an excellent medium for social interaction.

Open communication 17. I felt comfortable conversing through the online medium.

18. I felt comfortable participating in the course discussions.

19. I felt comfortable interacting with other course participants.

Group cohesion 20. I felt comfortable disagreeing with other course participants while still maintaining a sense of

trust.

21. I felt that my point of view was acknowledged by other course participants.

22. Online discussions help me to develop a sense of collaboration.


Table A5 (con’t.)

Cognitive Presence

Triggering event 23. Problems posed increased my interest in course issues.

24. Course activities piqued my curiosity.

25. I felt motivated to explore content related questions.

Exploration 26. I utilized a variety of information sources to explore problems posed in this course.

27. Brainstorming and finding relevant information helped me resolve content related questions.

28. Online discussions were valuable in helping me appreciate different perspectives.

Integration 29. Combining new information helped me answer questions raised in course activities.

30. Learning activities helped me construct explanations/solutions.

31. Reflection on course content and discussions helped me understand fundamental

concepts in this class.

Resolution 32. I can describe ways to test and apply the knowledge created in this course.

33. I have developed solutions to course problems that can be applied in practice.

34. I can apply the knowledge created in this course to my work or other non-class related

activities.

Note. From “Validating a Measurement Tool of Presence in Online Communities of

Inquiry,” by K. Swan, P. Shea, J. Richardson, P. Ice, D. R. Garrison, M. Cleveland-Innes,

and J. B. Arbaugh, 2008, in E-Mentor, 2(24), 1-12.


APPENDIX B

Word Count Results

Table B1 Word Count Results across All Forums

Rank Word Count Percentage (%)

1 I 4858 4.13

2 you 2186 1.86

3 have 1428 1.21

4 we 1367 1.16

5 my 1001 0.85

6 what 948 0.81

7 do 814 0.69

8 your 810 0.69

9 can 730 0.62

10 policy 600 0.51

11 me 595 0.51

12 all 592 0.50

13 about 574 0.49

14 bob 566 0.48

15 so 565 0.48

16 instructor 564 0.48

17 think 553 0.47

18 our 538 0.46

19 work 494 0.42

20 would 482 0.41

21 one 456 0.39

22 how 454 0.39

23 reading 428 0.36

24 week 421 0.36

25 from 414 0.35

26 some 414 0.35

27 more 407 0.35

28 just 392 0.33

29 know 390 0.33

30 need 383 0.33

31 get 381 0.32

32 group 348 0.30

33 doc 346 0.29

34 also 345 0.29

35 an 341 0.29

36 well 334 0.28

37 out 328 0.28

38 school 324 0.28

39 like 320 0.27

40 good 318 0.27

41 thanks 313 0.27

42 here 302 0.26

43 which 288 0.24

44 other 281 0.24

45 should 275 0.23

46 data 264 0.22

47 i’m 264 0.22

48 paper 261 0.22

49 could 254 0.22

50 research 251 0.21


Table B2 Word Count Results across Project Groups

Rank Word Count Percentage (%)

1 I 1674 4.08

2 you 729 1.78

3 we 678 1.65

4 have 497 1.21

5 what 387 0.94

6 do 267 0.65

7 can 261 0.64

8 your 258 0.63

9 may 255 0.62

10 our 251 0.61

11 all 241 0.59

12 think 221 0.54

13 my 215 0.52

14 so 214 0.52

15 data 202 0.49

16 policy 193 0.47

17 would 184 0.45

18 some 181 0.44

19 need 176 0.43

20 me 173 0.42

21 about 171 0.42

22 from 169 0.41

23 paper 169 0.41

24 work 166 0.40

25 how 161 0.39

26 draft 155 0.38

27 just 150 0.37

28 know 150 0.37

29 also 144 0.35

30 bob 142 0.35

31 instructor 142 0.35

32 more 139 0.34

33 out 136 0.33

34 should 133 0.32

35 thanks 132 0.32

36 get 130 0.32

37 schools 127 0.31

38 mary 125 0.30

39 here 122 0.30

40 good 120 0.29

41 doc 118 0.29

42 like 114 0.28

43 could 112 0.27

44 group 112 0.27

45 make 111 0.27

46 project 111 0.27

47 week 111 0.27

48 school 109 0.27

49 programs 108 0.26

50 well 107 0.26


Table B3 Word Count Results across Pairs

Rank Word Count Percentage (%)

1 I 960 4.87

2 you 438 2.22

3 my 339 1.72

4 have 291 1.48

5 we 209 1.06

6 your 189 0.96

7 me 148 0.75

8 what 145 0.74

9 do 130 0.66

10 work 123 0.62

11 about 119 0.60

12 goals 116 0.59

13 can 110 0.56

14 our 102 0.52

15 how 100 0.51

16 school 97 0.49

17 time 88 0.45

18 teachers 85 0.43

19 goal 84 0.43

20 some 82 0.42

21 would 82 0.42

22 think 79 0.40

23 bob 76 0.39

24 instructor 76 0.39

25 need 75 0.38

26 know 73 0.37

27 so 72 0.37

28 an 71 0.36

29 week 66 0.34

30 well 65 0.33

31 all 63 0.32

32 out 62 0.31

33 been 61 0.31

34 like 61 0.31

35 more 60 0.30

36 one 60 0.30

37 also 59 0.30

38 from 58 0.29

39 just 57 0.29

40 get 55 0.28

41 see 55 0.28

42 good 54 0.27

43 when 54 0.27

44 where 54 0.27

45 i’m 53 0.27

46 each 52 0.26

47 help 52 0.26

48 students 52 0.26

49 other 51 0.26

50 plan 50 0.25


Table B4 Word Count Results across Reading Groups

Rank Word Count Percentage (%)

1 I 1784 3.79

2 you 802 1.70

3 have 532 1.13

4 we 416 0.88

5 what 358 0.76

6 do 348 0.74

7 policy 344 0.73

8 my 328 0.70

9 can 297 0.63

10 reading 293 0.62

11 your 283 0.60

12 one 255 0.54

13 about 242 0.51

14 all 231 0.49

15 think 227 0.48

16 me 226 0.48

17 so 225 0.48

18 instructor 222 0.47

19 bob 221 0.47

20 week 202 0.43

21 summary 196 0.42

22 doc 190 0.40

23 would 185 0.39

24 more 175 0.37

25 how 171 0.36

26 our 165 0.35

27 get 164 0.35

28 from 162 0.34

29 questions 161 0.34

30 work 156 0.33

31 just 155 0.33

32 log 151 0.32

33 group 141 0.30

34 which 141 0.30

35 an 139 0.29

36 know 138 0.29

37 research 138 0.29

38 some 138 0.29

39 well 131 0.28

40 democratic 130 0.28

41 good 128 0.27

42 Lasswell 128 0.27

43 also 126 0.27

44 chapter 122 0.26

45 here 122 0.26

46 has 117 0.25

47 like 115 0.24

48 thanks 114 0.24

49 heck 113 0.24

50 any 112 0.24


APPENDIX C

Constant Comparison Analysis Results

Table C.1 Reading Group E Codes

Codes Generated

Acknowledging lack of

knowledge

Greeting Reflection

Addressing question Happiness Reflection about course

material

Advice Heading Relating

Agreement Hope Relating to Others

Answer Hoping for help Research discussion

Answering questions Humor Resource

Anticipation Justification of example Resource recommendation

Apology Likes course reading Response

Appreciation Note Reveal life outside of class

Assignment discussion Opinion Reveal life outside of class as

relates to class

Belief Paralanguage Reveal problems

Bias Personal course interests Reveal struggling

Clarification Personal example Reveals lack of knowledge

Commitment to more

discussion

Personal interest in course stuff Salutation

Complimenting texts Personal interest in reading Shares thinking about thinking

Contextualizing Point Personal Life Details Shares thoughts about reading

Critique Personal story Sharing thoughts

Critique of writing Personal study details Sharing values

Discussing Reading Personalization of Material Showing relevance

Discussing policy Philosophical Discussion Story example

Doubt Plan Thinking about reading

Empathy Plea Thinking about the course

Enjoyment Policy answer Thinking about thinking

Example Positive feedback Thought

Example of criticism Positive thinking Thoughts about policy

Excitement Question Thoughts about reading

Explaining struggles Quotation Vocative

Explanation Reading discussion Wonder

General Policy Discussion Recommendation Worth mentioning

Grade details Recommending other sources


Table C.2 Reading Group E Groups

Grouping of Codes

Course logistics & facilitation

Addressing question

Answering questions

Critique

Critique of writing

Grade details

Heading

Question

Advice

Recommendation

Recommending other sources

Policy Related Class Discussions

Reflection

Reflection about course material

Thinking about reading

Thinking about the course

Thinking about thinking

General Policy Discussion

Discussing reading

Discussing policy

Thoughts about policy

Thoughts about reading

Policy answer

Shares thinking about thinking

Shares thoughts about reading

Sharing thoughts

Research discussion

Showing relevance

Belief

Bias

Thought

Wonder

Worth mentioning

Response

Resource

Resource recommendation

Personal interest in course stuff

Personal interest in reading

Plan

Personalization of Material

Philosophical Discussion

Complimenting texts

Contextualizing Point

Justification of example

Likes course reading

Note

Opinion

Assignment discussion

Quotation

Reading discussion

Example

Example of criticism

Clarification

Commitment to more discussion

Personal course interests

Answer

Explanation

Emotion

Anticipation

Paralanguage

Apology

Doubt

Empathy

Enjoyment

Excitement

Happiness

Hope

Hoping for help

Plea

Humor

Greetings and Salutations

Greeting

Salutation

Vocative

Sharing Life Details

Personal example

Personal story

Story example

Reveal life outside of class

Reveal life outside of class as relates to class

Personal Life Details

Personal study details

Gracious / Gratitude

Appreciation

Positive feedback

Positive thinking

Self Disclosing Personal Matters

Explaining struggles

Acknowledging lack of knowledge

Reveal problems

Reveal struggling

Reveals lack of knowledge

Sharing values Playing Nice with Others

Agreement

Relating

Relating to Others


Table C.3 Pair9 Codes

Codes Generated

Acknowledgement Positive feedback

Agreement Positive self assessment

Answer to question Question

Anticipation Question about meeting

Appreciation Reassurance

Assignment details Recommendation

Assignment discussion Reference education literature

Best wishes Reflection

Brainstorming Relating

Commitment Relating to others

Concern Request

Confidence in peers Reveal uncertainty

Course planning Revealing concerns

Doubt Revealing life in other courses

Emotion Revealing struggles

Enjoyment Revealing thinking

Explanation Revealing unawareness

Explanation for struggles Salutation

Feeling overwhelmed Self assessment

Greeting Self disclosure

Happiness Sharing course plans

Heading Sharing plans

Hope Sharing successes

Inquiring about life outside of course Thanks

Introspection Thinking about policy

invitation Thinking out loud

Likes idea Thoughts on assignments

Paralanguage Thoughts on instructor

Personal sharing Thoughts on leadership

Plan to collaborate Understanding task

Plans Vocative

Plans to meet


Table C.4 Pair9 Groups

Course logistics & facilitation Collaboration

Recommendation

Question

Request

invitation

Understanding task

Assignment details

Assignment discussion

Heading

Question about meeting

Course planning

Plan to collaborate

Plans

Plans to meet

Sharing course plans

Sharing plans

Emotion Sharing Life Details

Concern

Anticipation

Happiness

Doubt

Emotion

Enjoyment

Feeling overwhelmed

Hope

Paralanguage

Introspection

Personal sharing

Reflection

Self assessment

Revealing life in other courses

Positive self assessment

Sharing successes

Inquiring about life outside of course

Playing Nice with Others Policy Related Class Discussions

Relating

Relating to others

Acknowledgement

Agreement

Answer to question

Likes idea

Confidence in peers

Commitment

Reassurance

Reference education literature

Brainstorming

Thinking about policy

Thinking out loud

Thoughts on assignments

Thoughts on instructor

Thoughts on leadership

Revealing thinking

Explanation

Self Disclosing Personal Matters Gracious / Gratitude

Revealing unawareness

Revealing struggles

Self disclosure

Reveal uncertainty

Revealing concerns

Explanation for struggles

Positive feedback

Best wishes

Appreciation

Thanks

Greetings and Salutations

Greeting

Vocative

Salutation


REFERENCES

Akyol, Z., & Garrison, D. R. (2009). Community of inquiry in adult online learning:

Collaborative-constructivist approaches. In T. T. Kidd (Ed.), Adult learning in the

digital age: Perspectives on online technologies and outcomes (pp. 55-66).

Hershey, PA: IGI Global.

Akyol, Z., Vaughan, N., & Garrison, D.R. (2011). The impact of course duration on

the development of a community of inquiry. Interactive Learning Environments,

19(3), 231-246.

Ali, R., & Leeds, E. M. (2010). The impact of face-to-face orientation on online

retention: A pilot study. Online Journal of Distance Learning Administration,

13(4). Retrieved from http://www.westga.edu/~distance/ojdla/winter124

/ali124.html

Allen, I. E., & Seaman, J. (2006). Making the grade: Online education in the United

States, 2006. Needham, MA: Sloan-C.

Allen, I. E., & Seaman, J. (2010). Learning on demand: Online education in the United

States, 2009. Needham, MA: Sloan-C.

An, H., Shin, S., & Lim, K. (2009). The effects of different instructor facilitation

approaches on students’ interactions during asynchronous online discussions.

Computers & Education, 53(3), 749-760.

Anderson, T. (2003a). Getting the mix right again: An updated and theoretical

rationale for interaction. The International Review of Research in Open and

Distance Learning, 4(2). Retrieved from http://www.irrodl.org/index.php/

irrodl/article/view/149/230


Anderson, T. (2003b). Modes of interaction in distance education: Recent developments

and research questions. In M. Moore (Ed.), Handbook of distance education (pp.

129-144). Mahwah, NJ: Lawrence Erlbaum.

Anderson, T., & Garrison, D. R. (1998). Learning in a networked world: New roles and

responsibilities. In C. Gibson (Ed.), Distance learners in higher education (pp.

97-112). Madison, WI: Atwood.

Anderson, T., & Kuskis, A. (2007). Modes of interaction. In M. G. Moore (Ed.),

Handbook of distance education (pp. 295-309). Mahwah, NJ: Lawrence Erlbaum.

Aragon, S. R. (2003). Creating social presence in online environments. New Directions

for Adult and Continuing Education, 100, 57-68.

Arbaugh, J.B. (2007). An empirical verification of the community of inquiry

framework. Journal of Asynchronous Learning Network, 11(1), 73-85.

Arbaugh, B., Bangert, A., & Cleveland-Innes, M. (2010). Subject matter effects and the

community of inquiry framework. The Internet and Higher Education, 13(1-2),

37-44.

Arbaugh, J. B., & Benbunan-Fich, R. (2006). An investigation of epistemological and

social dimensions of teaching in online learning environments. Academy of

Management Learning & Education, 5(4), 435-447.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P.,

Richardson, J. C., Shea, P., & Swan, K. P. (2008). The community of inquiry

framework: Development, validation, and directions for further research. Paper

presented at the annual meeting of the American Education Research Association,

New York, NY.


Argyle, M. (1969). Social interaction. New York: Atherton Press.

Argyle, M., & Cook, M. (1976). Gaze and mutual gaze. London: Cambridge University.

Argyle, M., & Dean, J. (1965). Eye contact, distance and affiliation. Sociometry, 28, 289-

304.

Benbunan-Fich, R., Hiltz, S. R., & Harasim, L. (2005). The online interaction learning

model: An integrated theoretical framework for learning networks. In S. R. Hiltz

& R. Goldman (Eds.), Learning together online: Research on asynchronous

learning networks (pp. 19-37). Mahwah, NJ: Lawrence Erlbaum.

Berelson, B. (1952). Content analysis in communication research. Glencoe, IL: Free

Press.

Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., et

al. (2004). How does distance education compare with classroom instruction? A

meta-analysis of the empirical literature. Review of Educational Research, 74(3),

379-439.

Biocca, F. (1997). The cyborg's dilemma: Progressive embodiment in virtual

environments. Journal of Computer-Mediated Communication, 3(2). Retrieved

from http://www.ascusc.org/jcmc/vol3/issue2/biocca2.html

Biocca, F., & Harms, C. (2002). What is social presence? In F. Gouveia & F. Biocca,

Presence 2002 Proceedings. Porto, Portugal: University of Fernando Pessoa

Press.

Biocca, F., Harms, C., & Burgoon, J. K. (2003). Toward a more robust theory and

measure of social presence: Review and suggested criteria. Presence:

Teleoperators & Virtual Environments, 12(5), 456-480.


Brown, J. S. (2002). Growing up digital: How the web changes work, education, and the

ways people learn. USDLA Journal, 16(2). Retrieved

from http://www.usdla.org/html/journal/FEB02_Issue/article01.html

Carley, K. (1993). Coding choices for textual analysis: A comparison of content analysis

and map analysis. Sociological Methodology, 23, 75-126.

Caspi, A., & Blau, I. (2008). Social presence in online discussion groups: Testing

three conceptions and their relations to perceived learning. Social Psychology

of Education, 11(3), 323-346.

Chickering, A.W., & Gamson, Z.F. (1987). Seven principles for good practice in

undergraduate education. AAHE Bulletin, 39(7), 3-7.

Christie, B. (1974). Perceived usefulness of person-person telecommunications media

as a function of the intended application. European Journal of Social Psychology,

4(3), 366-368.

Christie, B., & Holloway, S. (1975). Factors affecting the use of telecommunications

by management. Journal of Occupational Psychology, 48, 3-9.

Christie, B., & Kingan, S. (1977). Electronic alternatives to the business meeting:

Managers’ choices. Journal of Occupational Psychology, 50, 265-273.

Collins, K. M. T., Onwuegbuzie, A. J., & Sutton, I. L. (2006). A model incorporating

the rationale and purpose for conducting mixed methods research in special

education and beyond. Learning Disabilities: A Contemporary Journal, 4, 67-

100.

Connolly, T., Jessup, L. M., & Valacich, J. S. (1990). Effects of anonymity and


evaluative tone on idea generation in computer-mediated groups. Management

Science, 36, 97-120.

Creswell, J. W. (1998). Qualitative inquiry and research design: Choosing among

five traditions. Thousand Oak, CA: Sage.

Creswell, J. W. (2008). Educational research: Planning, conducting, and evaluating

quantitative and qualitative research (3rd ed.). Upper Saddle River, NJ: Pearson.

Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods

research. Thousand Oaks, CA: Sage.

Daft, R. L., & Lengel, R. H. (1984). Information richness: A new approach to

managerial behavior and organizational design. In L. L. Cummings & B. M.

Staw (Eds.), Research in organizational behavior (pp. 191-233). Homewood, IL:

JAI Press.

Daft, R. L., & Lengel, R. H. (1986). Organizational information requirements, media

richness and structural design. Management Science, 32(5), 554-571.

Daft, R. L., Lengel, R. H., & Trevino, L. K. (1987). Message equivocality, media

selection, and manager performance: Implications for information systems. MIS

Quarterly, 11(3), 355-366.

Danchak, M. M., Walther, J. B., & Swan, K. P. (2001, November). Presence in mediated

instruction: Bandwidth, behavior, and expectancy violations. Paper presented at

the annual meeting of Asynchronous Learning Networks, Orlando, FL.

De Wever, B., Schellens, T., Valcke, M., & Van Keer, H. (2006). Content analysis

schemes to analyze transcripts of online asynchronous discussion groups: A

review. Computers & Education, 46, 6-28.


Delfino, M., & Manca, S. (2007). The expression of social presence through the use of figurative language in a web-based learning environment. Computers in Human Behavior, 23(5), 2190-2211.

Dellinger, A. B., & Leech, N. L. (2007). Toward a unified validation framework in mixed

methods research. Journal of Mixed Methods Research, 1(4), 309-332.

Dennen, V. P. (2005). From message posting to learning dialogues: Factors affecting

learner participation in asynchronous discussion. Distance Education, 26(1), 127-

148.

Dennen, V. P. (2008). Pedagogical lurking: Student engagement in non-posting

discussion behavior. Computers in Human Behavior, 24(4), 1624-1633.

Dunlap, J. C., & Lowenthal, P. R. (2009a). Horton hears a tweet. EDUCAUSE

Quarterly, 32(4).

Dunlap, J. C., & Lowenthal, P. R. (2009b). Tweeting the night away: Using Twitter to

enhance social presence. Journal of Information Systems Education, 20(2), 129-

136.

Dunlap, J. C., & Lowenthal, P. R. (2010a). Defeating the Kobayashi Maru: Supporting

student retention by balancing the needs of the many and the one. EDUCAUSE

Quarterly, 33(4). Retrieved from http://www.educause.edu/

EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/DefeatingtheKo

bayashiMaruSuppo/219103

Dunlap, J. C., & Lowenthal, P. R. (2010b). Hot for teacher: Using digital music to

enhance student’s experience in online courses. TechTrends, 54(4), 58-73. doi:

10.1007/s11528-010-0421-4


Dunlap, J. C., & Lowenthal, P. R. (2011). Who are you? Alternative online meet-

and-greet tactics. In P. Shank (Ed.), The online learning idea book: Proven ways

to enhance technology-based and blended learning (vol. 2; pp. 149-152). San

Francisco: Pfeiffer.

Dunlap, J. C., & Lowenthal, P. R. (in press). Getting to know you: The first week of

class and beyond. In P. R. Lowenthal, D. Thomas, B. Yuhnke, A. Thai, M.

Edwards, & C. Gasell (Eds.), The CU Online Handbook, 2011. Raleigh, NC: Lulu

Enterprises.

Dunlap, J. C., Sobel, D., & Sands, D. I. (2007). Supporting students’ cognitive processing

in online courses: Designing for deep and meaningful student-to-content

interactions. TechTrends, 51(4), 20-31.

Eisenhart, M., & Howe, K. (1992). Validity in educational research. In M. LeCompte,

W. Millroy, & J. Preissle (Eds.), The handbook of qualitative research in

education (pp. 642-680). San Diego, CA: Academic Press.

Fryshman, B. (2008). Do we assess learning? Pull up a chair… Inside Higher Ed.

Retrieved from http://www.insidehighered.com/views/2008/08/07/fryshman

Garrison, D. R., & Anderson, T. (2003). E-learning in the 21st century: A framework for

research and practice. New York: RoutledgeFalmer.

Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10(3), 157-172.


Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based

environment: Computer conferencing in higher education. The Internet and

Higher Education, 2(2-3), 87-105.

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive

presence, and computer conferencing in distance education. American Journal of

Distance Education, 15(1), 7-23.

Garrison, D. R., Cleveland-Innes, M., Koole, M., & Kappelman, J. (2006). Revisiting

methodological issues in transcript analysis: Negotiated coding and reliability.

Internet and Higher Education, 9(1), 1-8.

Garrison, D. R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring causal

relationships among teaching, cognitive, and social presence: Student perceptions

of the community of inquiry framework. The Internet and Higher Education,

13(1-2), 31-36.

Gilbert, P. K., & Dabbagh, N. (2005). How to structure online discussions for

meaningful discourse: A case study. British Journal of Educational Technology,

36(1), 5-18.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for

qualitative research. New Brunswick, NJ: AldineTransaction.

Goldman, R., Crosby, M., Swan, K., & Shea, P. (2005). Qualitative and quisitive research

methods for describing online learning. In S. R. Hiltz & R. Goldman, Learning

together online (pp. 103-120). Mahwah, NJ: Lawrence Erlbaum.

Goodwin, L. D. (2001). Interrater agreement and reliability. Measurement in Physical

Education and Exercise Science, 5(1), 13-34.


Goodwin, L. D., & Leech, N. L. (2003). The meaning of validity in the new standards for

educational and psychological testing: Implications for measurement courses.

Measurement and Evaluation in Counseling and Development, 36(3), 181-191.

Gorsky, P., Caspi, A., Antonovsky, A., Blau, I., & Mansur, A. (2010). The relationship

between academic discipline and dialogic behavior in open university course

forums. The International Review of Research in Open and Distance Learning,

11(2). Retrieved from http://www.irrodl.org/index.php/

irrodl/article/view/820/1546

Gunawardena, C. N. (1995). Social presence theory and implications for interaction and

collaborative learning in computer conferences. International Journal of

Educational Telecommunications, 1(2/3), 147-166.

Gunawardena, C. N., & Zittle, F. J. (1997). Social presence as a predictor of satisfaction

within a computer-mediated conferencing environment. The American Journal of

Distance Education, 11(3), 8-26.

Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Analysis of a global online

debate and the development of an interaction analysis model for examining social

construction of knowledge in computer conferencing. Journal of Educational

Computing Research, 17(4), 397-431.

Henninger, M., & Viswanathan, V. (2004). Social presence in online tutoring: What we

know and what we should know. In P. Gerjets, P. A. Kirschner, J. Elen, & R.

Joiner (Eds.), Proceedings of the first joint meeting of the EARLI SIGs

Instructional Design and Learning and Instruction with Computers (CD-ROM)

(pp. 365-376). Tuebingen: Knowledge Media Research Center.


Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye (Ed.),

Collaborative learning through computer conferencing: The Najaden papers (pp.

117-136). London: Springer-Verlag.

Herring, S. C. (2004). Content analysis for new media: Rethinking the paradigm. In New

research for new media: Innovative research methodologies symposium working

papers and readings (pp. 47-66). Minneapolis, MN: University of Minnesota

School of Journalism and Mass Communication.

Herring, S. C. (2007). A faceted classification scheme for computer-mediated

discourse. Language@Internet, 4(1). Retrieved from http://www.

languageatinternet.de/articles/2007/761/Faceted_Classification_Scheme_for_CM

D.pdf

Hiemstra, G. (1982). Teleconferencing, concern for face, and organizational culture.

In M. Burgoon (Ed.), Communication yearbook 6 (pp. 874-904). Newbury Park,

CA: Sage.

Hiltz, S. R., & Arbaugh, J. B. (2003). Improving quantitative research methods in studies

of asynchronous learning networks (ALN). In J. R. Bourne & J. C. Moore (Eds.),

Elements of quality online education: Practice and direction (pp. 59-72).

Needham, MA: Sloan-C.

Hiltz, S. R., & Turoff, M. (1993). The network nation. Cambridge, MA: MIT Press.

Holsti, O. R. (1969). Content analysis for the social sciences and humanities. Reading,

MA: Addison-Wesley.


Hostetter, C., & Busch, M. (2006). Measuring up online: The relationship between social

presence and student learning satisfaction. Journal of the Scholarship of Teaching

and Learning, 6(2), 1-12.

Hughes, M., Ventura, S., & Dando, M. (2007). Assessing social presence in online

discussion groups: A replication study. Innovations in Education and Teaching

International, 44(1), 17-29.

Ice, P., Gibson, A. M., Boston, W., & Becher, D. (2011). An exploration of

differences between community of inquiry indicators in low and high disenrollment

online courses. Journal of Asynchronous Learning Networks, 15(2), 44-70.

Johansen, R. (1977). Social evaluations of teleconferencing. Telecommunications

Policy, 1(5), 395-419.

Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students’ satisfaction

and persistence: Examining perceived level of presence, usefulness and ease of

use as predictors in a structural model. Computers & Education, 57, 1654–1664.

Kear, K. (2010). Social presence in online learning communities. In L. Dirckinck-

Holmfeld, V. Hodgson, C. Jones, M. de Laat, D. McConnell, & T. Ryberg

(Eds.), Proceedings of the 7th International Conference on Networked Learning

(pp. 541-548). Lancaster: University of Lancaster. Retrieved from

http://www.lancs.ac.uk/fss/organisations/netlc/past/nlc2010/abstracts/PDFs/Kear.

pdf

Kemp, N. J., & Rutter, D. R. (1986). Social interaction in blind people: An

experimental analysis. Human Relations, 39(3), 195-210.

Kiesler, S. (1986). The hidden messages in computer networks. Harvard Business


Review, 64(3), 46-54, 58-60.

Kiesler, S., Siegel, J., & McGuire, T. W. (1984). Social psychological aspects of

computer-mediated communication. American Psychologist, 39(10), 1123-

1134.

Kramer, A. D. I., Oh, L. M., & Fussell, S. R. (2006). Using linguistic features to

measure presence in computer-mediated communication. In Proceedings of the

SIGCHI conference on Human Factors in Computing Systems (pp. 913-916). New

York: ACM Press.

Krathwohl, D. R. (2004). Methods of educational and social science research: An

integrated approach (2nd ed.). Long Grove, IL: Waveland Press.

Kreijns, K., Kirschner, P. A., & Jochems, W. (2003). Identifying the pitfalls of social

interaction in computer-supported collaborative learning environments: A review

of the research. Computers in Human Behavior, 19(3), 335-353.

Lane, C.-A. (2011, April). Social presence impacting cognitive learning of adults in

distance education (Master’s thesis, Athabasca University). Retrieved from

https://dt.athabascau.ca/jspui/bitstream/10791/10/1/MasterThesisLane-v36-

Apr282011.pdf

Leech, N. L., & Onwuegbuzie, A. J. (2006). A typology of mixed research designs.

Quality & Quantity. doi: 10.1007/s11135-007-9105-3

Leech, N. L., & Onwuegbuzie, A. J. (2007). An array of qualitative data analysis tools: A

call for data analysis triangulation. School Psychology Quarterly, 22(4), 557-584.


Levy, Y. (2007). Comparing dropouts and persistence in e-learning courses. Computers

& Education, 48(2), 185-204.

Lin, G.-Y. (2004, October). Social presence questionnaire of online collaborative

learning: Development and validity. Paper presented at the annual meeting of the

Association for Educational Communications and Technology, Chicago, IL.

Lombard, M., & Ditton, T. (1997). At the heart of it all: The concept of presence. Journal

of Computer-Mediated Communication, 3(2). Retrieved from

http://jcmc.indiana.edu/vol3/issue2/lombard.html

Lomicka, L., & Lord, G. (2007). Social presence in virtual communities of foreign

language (FL) teachers. System, 35, 208-228.

Lowenthal, P. R. (2009). Social presence. In P. Rogers, G. Berg, J. Boettcher, C.

Howard, L. Justice, & K. Schenk (Eds.), Encyclopedia of distance and online

learning (2nd ed., pp. 1900-1906). Hershey, PA: IGI Global.

Lowenthal, P. R. (2010). The evolution and influence of social presence theory on online

learning. In S. Dasgupta (Ed.), Social computing: Concepts, methodologies, tools,

and applications (pp. 113-128). Hershey, PA: IGI Global.

Lowenthal, P., & Dunlap, J. (2007). Digital stories. In P. Shank (Ed.), The online

learning idea book: 95 proven ways to enhance technology-based and blended

learning (pp. 110-111). San Francisco: Pfeiffer.

Lowenthal, P. R., & Dunlap, J. (2010). From pixel on a screen to real person in your

students’ lives: Establishing social presence using digital storytelling. The Internet

and Higher Education, 13(1-2), 70-72. doi:10.1016/j.iheduc.2009.10.004

Lowenthal, P. R., & Dunlap, J. C. (2011, April). Investigating students’ perceptions


of various instructional strategies to establish social presence. Paper presented at

the annual meeting of the American Educational Research Association, New

Orleans, LA.

Lowenthal, P. R., & Leech, N. (2009). Mixed research and online learning: Strategies

for improvement. In T. T. Kidd (Ed.), Online education and adult learning: New

frontiers for teaching practices (pp. 202-211). Hershey, PA: IGI Global.

Lowenthal, D. A., & Lowenthal, P. R. (2010, April). A mixed methods examination

of instructor social presence in accelerated online courses. Paper presented at the

annual meeting of the American Educational Research Association, Denver, CO.

Lowenthal, P. R., Lowenthal, D. A., & White, J. W. (2009). The changing nature of

online communities of inquiry: An analysis of how discourse and time shapes

students' perceptions of presence. In M. Simonson (Ed.), 32nd Annual

proceedings: Selected research and development papers presented at the annual

convention of the Association for Educational Communications and Technology.

Washington, DC: Association for Educational Communications and Technology.

Lowenthal, P. R., & Thomas, D. (2010). Death to the Digital Dropbox: Rethinking

student privacy and public performance. EDUCAUSE Quarterly, 33(3). Retrieved

from http://www.educause.edu/EDUCAUSE+Quarterly/

EDUCAUSEQuarterlyMagazineVolum/DeathtotheDigitalDropboxRethin/213672

Lowenthal, P. R., Wilson, B., & Parrish, P. (2009). Context matters: A description

and typology of the online learning landscape. In M. Simonson (Ed.), 32nd

Annual proceedings: Selected research and development papers presented at

the annual convention of the Association for Educational Communications


and Technology. Washington, DC: Association for Educational

Communications and Technology.

Ludwig-Hardman, S., & Dunlap, J. C. (2003). Learner support services for online

students: Scaffolding for success. The International Review of Research in Open

and Distance Learning, 4(1). Retrieved from http://www.irrodl.org

/index.php/irrodl/article/view/131/211

McIsaac, M. S., Blocher, J. M., Mahes, V., & Vrasidas, C. (1999). Student and teacher

perceptions of interaction in online computer-mediated communication.

Educational Media International, 36(2), 121-131.

Mehrabian, A. (1972). Nonverbal communication. New Brunswick, NJ:

AldineTransaction.

Miles, M., & Huberman, A. M. (1994). Qualitative data analysis: An expanded

sourcebook (2nd ed.). Thousand Oaks, CA: Sage.

Moore, M. G. (1989). Three types of interaction. The American Journal of Distance

Education, 3(2), 1-6.

Moore, M. G., & Kearsley, G. (2005). Distance education: A systems view (2nd ed.).

New York: Wadsworth.

Na Ubon, A., & Kimble, C. (2003). Supporting the creation of social presence in online

learning communities using asynchronous text-based CMC. In The 3rd

International Conference on Technology in Teaching and Learning in Higher

Education (pp. 295-300). Heidelberg, Germany.


National Center for Education Statistics. (2008). Distance education at degree-granting

postsecondary institutions: 2006-07. Washington, DC: U.S. Department of

Education. Retrieved from http://nces.ed.gov/pubs2009/

2009044.pdf

Nippard, E., & Murphy, E. (2007). Social presence in the web-based synchronous

secondary classroom. Canadian Journal of Learning and Technology, 33(1).

Onwuegbuzie, A. J., & Collins, K. M. T. (2007). A typology of mixed methods sampling

designs in social science research. The Qualitative Report, 12(2), 281-316.

Onwuegbuzie, A. J., & Leech, N. L. (2004). Enhancing the interpretation of “significant”

findings: The role of mixed methods research. The Qualitative Report, 9(4), 770-

792.

Onwuegbuzie, A. J., & Leech, N. L. (2005b). Taking the “Q” out of research: Teaching

research methodology courses without the divide between quantitative and

qualitative paradigms. Quality & Quantity, 39, 267-296.

Onwuegbuzie, A. J., & Leech, N. L. (2007). Sampling designs in qualitative research:

Making the sampling process more public. The Qualitative Report, 12(2), 238-

254.

Pena-Shaff, J. B., & Nicholls, C. (2004). Analyzing student interactions and meaning

construction in computer bulletin board discussions. Computers & Education, 42,

243-265.

Picciano, A. (2002). Beyond student perceptions: Issues of interaction, presence, and

performance in an online course. Journal of Asynchronous Learning Networks,

6(1), 21-40.


Prensky, M. (2001, September/October). Digital natives, digital immigrants. On the

Horizon, 9(5), 1-6.

Pye, R., & Williams, E. (1978). Teleconferencing: Is video valuable or is audio

adequate? Telecommunications Policy, 1(3), 230-241.

Rettie, R. (2003). Connectedness, awareness, and social presence. Paper presented at the

6th International Presence Workshop, Aalborg, Denmark.

Richardson, J. C., & Swan, K. (2003). Examining social presence in online courses in

relation to students' perceived learning and satisfaction. Journal of Asynchronous

Learning Networks, 7(1), 68-88.

Rourke, L., & Anderson, T. (2002a). Exploring social communication in computer

conferencing. Journal of Interactive Learning Research, 13(3), 259-275.

Rourke, L., & Anderson, T. (2002b). Using peer teams to lead online discussions. Journal

of Interactive Media in Education, 1, 1-21.

Rourke, L., & Anderson, T. (2004). Validity in qualitative content analysis. Educational

Technology Research and Development, 52(1), 5-18.

Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (2001a). Assessing social

presence in asynchronous text-based computer conferencing. Journal of Distance

Education, 14. Retrieved from http://cade.athabascau.ca/vol14.2/

rourke_et_al.html

Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (2001b). Methodological issues

in the content analysis of computer mediated transcripts. International Journal of

Artificial Intelligence in Education, 12, 8-22.

Rourke, L., & Kanuka, H. (2009). Learning in communities of inquiry: A review of


the literature. Journal of Distance Education, 23(1), 19-48.

Rovai, A. P. (2002). Building a sense of community at a distance. International Review of

Research in Open and Distance Learning, 3(1). Retrieved from

http://www.irrodl.org/index.php/irrodl/article/view/79/153

Russo, T., & Benson, S. (2005). Learning with invisible others: Perceptions of online

presence and their relationship to cognitive and affective learning. Educational

Technology & Society, 8(1), 54-62.

Russo, T. C., & Campbell, S. W. (2004). Perceptions of mediated presence in an

asynchronous online course: Interplay of communication behaviors and medium.

Distance Education, 25(2), 216-232.

Rutter, D. R. (1984). Looking and seeing: The role of visual communication in social

interaction. London: John Wiley.

Rutter, D. R. (1987). Communicating by telephone. Oxford: Pergamon Press.

Rutter, D. R. (1989). The role of cuelessness in social interaction: An examination

of teaching by telephone. In D. Roger & P. Bull (Eds.), Conversation (pp.

294-312). Philadelphia, PA: Multilingual Matters.

Rutter, D. R., Pennington, D. C., Dewey, M. E., & Swain, J. (1984). Eye-contact as a

chance product of individual looking: Implications for the intimacy model of

Argyle and Dean. Journal of Nonverbal Behavior, 8(4), 250-258.

Ryman, S., Hardham, G., Richardson, B., & Ross, J. (2009). Creating and sustaining

online learning communities: Designed for transformative learning. International

Journal of Pedagogies & Learning, 5(3), 32-45.

Shank, P. (2004). New social interaction tools for online instruction. ITFORUM


paper #81. Retrieved from http://it.coe.uga.edu/itforum/paper81/paper81.html

Shea, P., & Bidjerano, T. (2010). Learning presence: Towards a theory of self-

efficacy, self-regulation, and the development of a communities of inquiry in

online and blended learning environments. Computers & Education, 55(4), 1721-

1731.

Short, J. A. (1974). Effects of medium of communication on experimental negotiation.

Human Relations, 27(3), 225-234.

Short, J., Williams, E., & Christie, B. (1976). The social psychology of

telecommunications. London: John Wiley & Sons.

Smith, A. (2010, August 11). Home broadband 2010. Washington, DC: Pew

Research Center’s Internet & American Life Project. Retrieved from

http://pewinternet.org/~/media//Files/Reports/2010/Home%20broadband%20201

0.pdf

So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social

presence, and satisfaction in a blended learning environment: Relationships and

critical factors. Computers & Education, 51(1), 318-336.

Stacey, E. (2002). Quality online participation: Establishing social presence. Paper

presented at the Research in Distance Education Conference, Deakin University,

Geelong.

Stein, D. S., & Wanstreet, C. E. (2003). Role of social presence, choice of online or face-

to-face group format, and satisfaction with perceived knowledge gained in a distance

learning environment. Paper presented at the Midwest Research to Practice

Conference in Adult, Continuing, and Community Education, Columbus, OH.


Steinfield, C. W. (1986). Computer-mediated communication in an organization

setting: Explaining task-related and socioemotional uses. In M. L.

McLaughlin (Ed.), Communication Yearbook 9 (pp. 777-804). Newbury Park,

CA: Sage.

Suler, J. (2004). The online disinhibition effect. CyberPsychology & Behavior, 7(3),

321-326.

Swan, K. (2002). Immediacy, social presence, and asynchronous discussion. In J. Bourne

& J. C. Moore (Eds.), Elements of quality online education (Vol. 3, pp. 157-172).

Needham, MA: Sloan Center for Online Education.

Swan, K. (2003). Developing social presence in online course discussions. In S. Naidu

(Ed.), Learning and teaching with technology: Principles and practices (pp. 147-

164). London: Kogan Page.

Swan, K., Shea, P., Richardson, J., Ice, P., Garrison, D. R., Cleveland-Innes, M., &

Arbaugh, J. B. (2008). Validating a measurement tool of presence in online

communities of inquiry. E-Mentor, 2(24), 1-12. Retrieved from http://www.e-

mentor.edu.pl/e_index.php?numer=24&all=1

Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in

online course discussions. Journal of Asynchronous Learning Networks, 9(3),

115-136.


Tallent-Runnels, M. K., Thomas, J. A., Lan, W. Y., Cooper, S., Ahern, T. C., Shaw, S.

M., et al. (2006). Teaching courses online: A review of the research. Review of

Educational Research, 76(1), 93-135.

Thurlow, C., Lengel, L., & Tomic, A. (2004). Computer mediated communication: Social

interaction and the Internet. Thousand Oaks, CA: Sage.

Tu, C.-H. (2000). On-line learning migration: From social learning theory to social

presence theory in a CMC environment. Journal of Network and Computer

Applications, 2, 27-37.

Tu, C.-H. (2001). How Chinese perceive social presence: An examination of interaction

in online learning environment. Educational Media International, 38(1), 45-60.

Tu, C.-H. (2002a). The impacts of text-based CMC on online social presence. The

Journal of Interactive Online Learning, 1(2). Retrieved from http://www.

ncolr.org/jiol/issues/PDF/1.2.6.pdf

Tu, C.-H. (2002b). The measurement of social presence in an online learning

environment. International Journal on E-Learning, 1(2), 34-45.

Tu, C.-H., & Corry, M. (2004). Online discussion durations impact online social

presence. In C. Crawford et al. (Eds.), Proceedings of Society for Information

Technology and Teacher Education International Conference 2004 (pp. 3073-

3077). Chesapeake, VA: AACE.

Tu, C.-H., & McIsaac, M. (2002). The relationship of social presence and interaction in

online classes. The American Journal of Distance Education, 16(3), 131-150.


Vrasidas, C., & Glass, G. V. (2002). A conceptual framework for studying distance

education. In C. Vrasidas & G. V. Glass (Eds.), Distance education and

distributed learning (pp. 31-55). Greenwich, CT: Information Age Publishing.

Wagner, E. D. (1994). In support of a functional definition of interaction. The American

Journal of Distance Education, 8(2), 6-29.

Walters, D. (2011, February). The rewrite: Who’d have thought Idaho would be a haven

for radical education reform? The Pacific Northwest Inlander. Retrieved from

http://www.inlander.com/spokane/article-16203-the-rewrite.html

Walther, J. B. (1992). Interpersonal effects in computer-mediated interaction: A

relational perspective. Communication Research, 19, 52-90.

Walther, J. B. (1994). Anticipated ongoing interaction versus channel effects on

relational communication in computer-mediated interaction. Human

Communication Research, 20, 473-501.

Walther, J. B. (1996). Computer-mediated communication: Impersonal, interpersonal,

and hyperpersonal interaction. Communication Research, 23(1), 3-43.

Walther, J. B., Anderson, J. F., & Park, D. W. (1994). Interpersonal effects in computer-

mediated interaction: A meta-analysis of social and antisocial communication.

Communication Research, 21(4), 460-487.

Walther, J. B., & Parks, M. R. (2002). Cues filtered out, cues filtered in. In M. L. Knapp

& J. A. Daly (Eds.), Handbook of interpersonal communication (pp. 529-563).

Thousand Oaks, CA: Sage.


Watson, H. (2006, April). Governor signs bill establishing rigorous high school

curriculum. Retrieved from http://www.michigan.gov/som/0,1607,7-192-29939-

141369--,00.html

Weedman, J. (1991). Task and non-task functions of a computer conference used in

professional education: A measure of flexibility. International Journal of

Man-Machine Studies, 34, 303-318.

Wheeler, S. (2005). Creating social presence in digital learning environments:

A presence of mind? Paper presented at TAFE Conference (November 11th),

Queensland, Australia.

White, J. W., & Lowenthal, P.R. (2011). Minority college students and tacit "codes of

power": Developing academic discourses and identities. Review of Higher

Education, 34(2), 283-318.

Wiener, M., & Mehrabian, A. (1968). Language within language: Immediacy, a channel

in verbal communication. New York: Appleton-Century-Crofts.

Willging, P. A., & Johnson, S. D. (2004). Factors that influence students’ decision to

drop out of online courses. Journal of Asynchronous Learning Networks, 8(4),

105-118.

Williams, E. (1975). Medium or message: Communications medium as a determinant of

interpersonal evaluation. Sociometry, 38(1), 119-130.

Williams, E. (1977). Experimental comparisons of face-to-face and mediated

communication: A review. Psychological Bulletin, 84(5), 963-976.

Williams, E. (1978a). Teleconferencing: Social and psychological factors. Journal of

Communication, 84, 125-131.


Williams, E. (1978b). Visual interaction and speech patterns: An extension of previous

results. British Journal of Social and Clinical Psychology, 17, 101-102.

Wilson, C., & Williams, E. (1977). Watergate worlds: A naturalistic study of media and

communication. Communication Research, 4(2), 169-178.

Wise, A., Chang, J., Duffy, T., & Del Valle, R. (2004). The effects of teacher social

presence on student satisfaction, engagement, and learning. Journal of

Educational Computing Research, 31(3), 247-271.

Wise, A. F., & Chiu, M. M. (2011, April). The power of a synthesizer role in online

discussion forums: Encouraging midway summaries drives the knowledge

construction process. Paper presented at the annual meeting of the American

Educational Research Association, New Orleans, LA.

Witmer, D. F. (1997). Risky business: Why people feel safe in sexually explicit on-line

communication. Journal of Computer-Mediated Communication, 2(4).

Woo, K., Gosper, M., McNeill, M., Preston, G., Green, D., & Phillips, R. (2008).

Web-based lecture technologies: Blurring the boundaries between face-to-face

and distance learning. Research in Learning Technology, 16(2), 81-93.