Patterns and Particularities in Consumer Health Informatics:
Examining the Search Behavior of Consumers Seeking Online Health Information
Abstract
As the influence of the Internet in modern society continues to grow, consumers
are increasingly utilizing online sources of health information. To investigate this
growing trend, we looked at the search behavior of consumers seeking online health
information in three ways. First, we had our participants fill out a self-reported
questionnaire on their Internet usage. Then, we recorded the participants as they
completed two online search tasks using Google. Last, we interviewed the participants
after they had completed their tasks and asked a few follow-up questions. In examining
the data we collected, we discovered that there were many commonalities in sources of
online information chosen by our participants, that very few participants used advanced
search options, and that assessments of credibility and utility were largely based on the
individual participant. We also discovered that query reformulation does not always
follow the pattern of general to specific and that online sources were not the only source
of health information used regularly by our participants. Search behavior is difficult to
categorize uniformly and appears to be specific to the individual searcher.
Introduction
Consumers are increasingly turning to the Internet as a primary source of
information about health-related issues and concerns. There has been limited research
thus far into online search behavior in the context of consumer health information in
order to understand how and why consumers make the information-seeking choices that
they do. Insights gained in this area can help health care providers, health advocates, and
health resource designers offer online content more effectively, resulting in quality health
information that is easily retrievable by a greater number of people. In addition to
benefiting the individual consumer and his or her family and friends, increased consumer
access to reliable sources of health information is beneficial to society as a whole.
Healthier individuals and communities can result in significant cost-savings at a time
when health care costs are substantial and continually on the rise.
Brief Literature Review
There have been several studies in the area of consumer search behavior for
online health information. The selection of search results has been examined in depth. One
aspect users consider in determining the validity of a website is its design. In their study,
Sillence, Briggs, Fishwick, and Harris (2004) found that a site’s design was weighted
considerably more heavily than its actual content in participants’ judgments of
trustworthiness, a surprising finding.
Query formulation and reformulation are also briefly touched on in these studies.
Eysenbach and Köhler (2002) examined the effectiveness of their participants’ searching
methods. They found that only 35% of their participants included more than one word in
their searches, leading to broader and less helpful lists of results. Specificities of
language and phrasing have been explored as well. Toms and Latter (2007) found that
one participant included additional search terms to make the search process more “robust.”
They also found that participants included terms to geographically restrict their results,
and used more advanced search techniques like including plus signs or adding quotation
marks.
Several of the studies articulate a need for further research on these topics to be
done, and we hope to contribute towards this goal.
Research Questions
In our study, we examined the search behavior of consumers looking for online
health information via Google. We were specifically interested in:
● how and why consumers reformulate their queries,
● how consumers select resources from the results list, and
● how consumers subsequently analyze the web pages to appraise the credibility
and utility of the site.
Data Collection Methods
To best answer these questions, we used three methods of collecting data. First,
we asked our participants to self-report certain data about their typical Internet usage and,
more specifically, their use of the Internet to search for online health information
(Appendix A). Second, we asked participants to complete two online health information
searching tasks, one prescribed and the other more participant-directed (Appendix B). We
asked the participants to think aloud as they were completing these search tasks. We
recorded their search sessions using the Morae software in the Information eXperience
laboratory at the iSchool. The Morae software captured video and audio of the
participants as they were thinking aloud and searching, along with onscreen activity,
including mouse movement. Third, we interviewed our participants
after they had completed their tasks and recorded them with a hand-held audio recorder.
We asked them a few follow-up questions that were designed to investigate the
participant’s belief in the credibility and utility of the information they found online in
the course of completing the tasks (Appendix C).
Data Analysis
We collected data from seven participants within a limited range of demographics
and backgrounds (Appendix D). All seven of our participants were iSchool students. Six
were female, and one was male. Participants’ ages ranged from 23 to 38. All of our
participants have been using the Internet for at least a decade, and most have used it for
much longer, with several beginning use in childhood. A majority of participants
currently use the Internet daily for a variety of purposes, including work, school,
communication, general information seeking, and entertainment.
Participants’ self-reported use of the Internet to search online for health
information ranged from “never” to “five times per year” to “once per week.” Most
participants reported that they use Google to search for health information on the Internet.
Two participants specifically mentioned the Mayo Clinic website, and one mentioned
WebMD. In describing how they choose sources of online health information,
participants considered the following measures: apparent credibility based on aesthetic
features (such as style and layout), authoritativeness (such as materials prepared by a
health professional), recency (such as sources updated in the last year), recommendations
(such as suggestions from family members or links from other sources), Google’s ranking
order, and forums discussing the everyday experiences of other people.
We focused our data analysis on the following elements: the nature of participants’
query formulations and reformulations, including the relative broadness or specificity of
their search term revisions; the extent to which participants explored the results returned
by the search engine, including how far down the results list they looked and how many
sites they considered and visited; the types of sites that participants chose to visit; the
measures of credibility that participants used in evaluating the sites; the role that
participants reported that family members, friends, and medical professionals play in
their health information seeking choices; and participants’ reliance on alternative sources
of health information other than the Internet.
Results
For the first task, the number of websites that participants visited ranged from one
to five (Appendix E), and the number of query reformulations that participants performed
ranged from zero to two (Appendix F). For the second task, the number of websites that
participants visited ranged from two to five, and the number of query reformulations that
participants performed ranged again from zero to two. The two most commonly visited
sources were WebMD and the CDC at five visits each, followed by Wikipedia at four
visits and the Mayo Clinic at three visits (Appendix G).
Discussion
Our results showed both similarities to and differences from the studies we reviewed. Similar
to the findings of Sillence, Briggs, Fishwick, and Harris (2004), we noted that a few of
our participants considered sites to be credible based on their design. One participant
deemed a site credible because the design “... [was] obviously professional.” Overall,
design appears to be an important factor in how users evaluate health information systems.
Unlike Toms and Latter (2007), we did not find that our participants used
advanced searching techniques very often. We had one participant limit her search using
a Google function that restricted results by recency. Another participant indicated in the
self-reporting survey that she preferred more recent results (“updated in the last year”),
but did not narrow her search as such while completing the tasks. Additionally, most of
our participants did not regularly use special search symbols such as plus signs or
quotation marks, though a few did.
Our research also yielded interesting results in regard to query reformulation.
Most notably, query reformulation was less prevalent than we had anticipated prior to
conducting our research. Also, rather than exclusively showing a
trend from general to more specific language, our participants’ revised queries often
showed expansion from narrow to more general. One participant initially searched for
“meningitis vaccine” and then later reformulated the query to simply “meningitis.”
Oftentimes participants would move laterally through their results; that is, they would not
search again using different terms but rather would scroll up and down the Google results
page looking at the same results and seeing different things in the link descriptions. This
behavior may correspond to the graph in Appendix E. On the whole, our participants did
not visit many different sites in the course of completing their tasks, largely because they
did not have many query reformulations. On a related note, the average number of
different sites visited per task rose noticeably, from about 2.5 in Task 1 to 3.2 in Task
Task 2. The open-ended task was perhaps more interesting for the participants, which
may have motivated them to look for more results.
Participants selected sites from the search results list predominantly based on
position in the search results and name recognition. Unsurprisingly, none of the
participants went beyond the first page of results. In fact, most participants stayed within
the top few sites and rarely looked at sites towards the bottom of the page. Many
participants commented that they had heard of certain sites before and were subsequently
more inclined to look at them (Appendix H). Beyond being a qualifier for selecting a
page, name recognition was an important component in participants’ determination of a
website’s credibility. Many participants noted that they had heard of WebMD before and
subsequently deemed the information on the site credible.
Credibility assessment was based largely on the individual. We initially
hypothesized that our participants would be likely to select a “.gov” website based simply
on the principle that it contains government regulated information that was therefore
credible. However, participants selected what they considered to be credible information
from sites with a variety of top-level domains, including “.edu,” “.gov,” “.com,” and
“.org.” Suffixes did play a role in participants’ selection of sources; for example, one
participant selected a site based on the fact that the URL ended in “.edu,” even though
she was unfamiliar with the school. Though we saw a range of sites used by participants,
four were visited most commonly: Wikipedia, WebMD, the CDC, and the Mayo Clinic.
Wikipedia, the Mayo Clinic, and WebMD all end in “.com,” challenging our initial notion
that “.gov” is the primary credibility identifier for information seekers. Rather, it seems
that name recognition mattered more.
Related to participants’ conceptions of credibility, we were also
interested in looking at participants’ confidence in their results. The follow-up questions
in our post-task interview were designed to examine this idea (Appendix C). When we
asked our participants if they would feel comfortable sharing their results with a friend or
family member, we got a fairly consistent answer: most would feel comfortable doing so,
but would also tell the other person that they should check with a doctor. This suggests
that although participants deemed their findings credible when completing the tasks, as
evidenced by their comments, participants did not consider the Internet to be the absolute
authority on health information.
When we asked participants if they would feel comfortable sharing their results
with a doctor, we got an unexpected range of responses that provided interesting insight
into participants’ perspectives on doctors. This spectrum had three major clusters:
participants who believed doctors to be the only correct source of health information,
those who liked to have a “baseline” of information about a health related topic gathered
from other sources (including online sources), and those who considered doctors to be
somewhat limited in their capacity to give health knowledge (Appendix I). This spectrum
suggests that while participants did not consider the Internet to be the ultimate authority
on health information, they also did not unanimously consider doctors to be the authority
either.
Conclusions
The main takeaway from our research is that search behavior is largely specific to
the individual. Although we were able to draw some parallels between our participants,
the ways in which they searched were unique. For example, in looking at two different
participants’ comments on doctors, one said that she was mistrustful of doctors, while the
other said that she would immediately call her father, a doctor, with medical questions.
Participants also tended to visit the same websites, such as the CDC and WebMD, and
there was some commentary on the consideration of website suffixes in relation to
credibility, which at least partially confirmed our notion that users are drawn to more
“official” top level domains like “.gov.”
In choosing their self-selected tasks, many participants chose medical conditions
directly related to them. For example, one participant researched melanoma because she
has a family history of it. Another participant, having recently experienced swimmer’s
ear, chose to conduct research on that condition. Additionally, the age range of
participants may have shaped their need to research health information. We had a
relatively young sample; older participants might have greater need to seek health
information, and therefore more experience with and insight into the process. Future
studies could encompass a wider range of ages among participants. These background
experiences, the sum total of an individual’s life, have considerable bearing on his or her
online search behavior in the realm of health information.
Acknowledgments
We would like to thank all of the participants in our study. We would also like to
thank Yan Zhang for all of her excellent advice and guidance.
References
Eysenbach, G., & Köhler, C. (2002). How do consumers search for and appraise health information on the world wide web? Qualitative study using focus groups, usability tests, and in-depth interviews. BMJ, 324, 573-577.
Sillence, E., Briggs, P., Fishwick, L., & Harris, P. (2004). Trust and mistrust of online health sites. Proceedings of CHI 2004, 663-670.
Toms, E., & Latter, C. (2007). How consumers search for health information. Health Informatics Journal, 13(3), 223-235.
Appendix A: Demographic and Background Information Questionnaire
1. Age
2. Gender
3. Occupation
4. Please provide a brief description of your Internet experience (how long you have been using the Internet, how often you use it, why you use it, etc.).
5. How often do you search for health information on the Internet?
6. How do you go about searching for health information online?
7. How do you choose which sources of information to use?

Appendix B: Tasks
1. Using Google, determine who needs to get the meningitis vaccine and why.
2. Think of a health-related topic that interests you. Find two suggested treatments or therapies.

Appendix C: Follow-up Interview Questions
1. Where do you get health information from? Where else might you look for health information?
2. How comfortable do you feel with the results you came up with? Would you share these results with a doctor or other medical professional? Would you share them with a friend or family member?

Appendix D: Demographics and Background Information Results
Appendix G: Common Sources Used by Participants
Appendix H: Participants’ Stated Credibility Rationales