Security Practice: Design, Adoption, and Use of Technology for Collaboratively Managing Sensitive Personal Information

Laurian Vega, Department of Computer Science, Virginia Tech, [email protected]

ABSTRACT

There is a need in socio-technical systems, and more specifically in medical informatics, to study how trust and privacy affect the ad hoc negotiation of security rules and how those rules are managed in practice. For my interdisciplinary dissertation I propose to study how people collaboratively work to manage private information in the domains of childcare centers and physicians' offices. These are locations where private information is collected for multiple purposes (e.g., tracking care, receiving payments). However, little work has examined how these needs are reflected or balanced in everyday privacy practice.

Building on preliminary observations and interviews, my approach to this topic is the use of mirrored ethnographies in each setting. The goal is to collect data on observed breakdowns in the policies of these groups. The data will be analyzed using a phenomenological approach to understand the experience of security breakdowns: what caused them, how they were responded to (e.g., bending policies, creating new policies), who was involved, and how privacy and security were maintained, or not maintained. Breakdowns may be as small as a file missing a piece of information, or as large as a missing child, as observed in the preliminary studies. The outcomes of this work will be thick descriptions and categorizations of security system breakdowns, along with near-future scenarios depicting aspects of future innovations. A goal of this work is to build on the literature in medical informatics and usable security in order to design prototypes for childcares and physicians' offices.

The preliminary analysis has produced three loci where implicit and explicit norms influence security and privacy: (1) human-mediated access management explores how the locations of a filing cabinet and a computer were physically situated and mediated by one person within each center, thereby negotiating access; (2) information duplication explores the role and use of having information in multiple forms and across distributed locations; and (3) community of trust explores how personnel coordinate and co-construct knowledge of the context to assess nuances in policies. These findings represent novel, practical evaluations of security in context, examining the factors related to security and privacy that affect technological adoption.

INTRODUCTION

Health records intrinsically contain sensitive personal information. With the growing use of electronic health records and other electronic communication methods, there are parallel and growing patient concerns regarding the security and privacy of their sensitive personal information, concerns that extend beyond HIPAA. In 2009, thirty-nine percent of physicians were communicating with patients over email, secure messaging services, or instant messaging, and ninety-nine percent of doctors were using the internet [2]. What is not being considered with this increase in technical appropriation, though, are patients' concerns regarding their privacy. According to the California HealthCare Foundation, one in fourteen users is using a personal health record. However, privacy remains an enduring concern, having stayed relatively high since a 2005 survey on the adoption and use of these records. Seventy-five percent of participants who said they did not want to use electronic health records reported that it was because of privacy concerns, particularly their sensitive personal information being stored online [1]. This need is echoed in a 2008 review by Kaelber et al. of 100 papers on Personal Health Records (PHRs). They found that privacy and security was one of the seven factors that not only affects adoption but is also an emerging and important research area [12]. All of this work points to a need to understand the security and privacy of health records.

In parallel with the work in health informatics is a growing body of work at the intersection of Human-Computer Interaction (HCI) and security, appropriately termed Usable Security. Usable Security is a movement within the security sub-domain of computer science that recognizes that security has two parts: computers and humans. Usable Security's proposal is that if security is hard to manage, it is not because the users are at fault; it is because the security mechanisms are incongruent with the users' primary task [4]. Understanding the users' task involves understanding the users' needs, practices, and values. Additionally, as computing systems become increasingly ubiquitous, they will become more involved in and integrated into everyday work; computers are no longer solitary machines under the desk. They are the tablets that physicians carry, or the Bluetooth-enabled glucose meters that patients with diabetes use. Making these systems work in a way that respects the users' needs while also protecting clients' privacy in a secure way is a primary goal of usable security.

Unfortunately, a good amount of work in the realm of security focuses on the creation of rules or policies. The problem with security policies is that they are often only secure in principle; they are seldom secure in practice [7]. Practice is what happens in the moment; it is the activity; it is what is actually done. In this space there is a tension between work practice and security. A plethora of research has demonstrated that when security policies or mechanisms are not designed to support work practice, security breaks down (e.g., through work-arounds such as writing passwords on post-it notes or, as was observed in the pilot studies, shouting passwords) [4, 9]. When a breakdown occurs in a social system, though, workers do not stop doing work. They create special cases and methods, bending the formal policies, that allow them to continue. In this sense, social systems are intrinsically flexible. When we start to think about electronic systems, the reverse is true: electronic systems work according to pre-encoded, deterministic rules.
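To make the contrast concrete, the following is a minimal sketch, with entirely hypothetical policy and role names, of the kind of pre-encoded, deterministic rule an electronic system enforces; unlike the social systems observed in practice, the check either passes or fails and cannot be bent for an unanticipated but legitimate need.

```python
# A minimal sketch with hypothetical policy and role names: a pre-encoded,
# deterministic access rule. The check either passes or fails; there is no way
# for the system itself to "bend" the policy when an unplanned need arises.

# Each record type maps to the set of roles explicitly allowed to read it.
POLICY = {
    "child_enrollment_file": {"director", "lead_teacher"},
    "patient_chart": {"physician", "nurse"},
}

def can_read(role: str, record_type: str) -> bool:
    """Return True only if the role is explicitly granted access."""
    return role in POLICY.get(record_type, set())

# A substitute teacher covering a classroom has a practical need to see an
# enrollment file, but the encoded rule simply denies the request.
print(can_read("substitute_teacher", "child_enrollment_file"))  # False
print(can_read("director", "child_enrollment_file"))            # True
```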

My dissertation merges the theoretical frameworks of HCI, the standpoint of usable security, and the domains of healthcare and childcare. The domains of healthcare and childcare are two instantiations of a similar socio-technical system. The type of socio-technical system I have been and will be studying is one in which groups coordinate and manage their clients' private information through physical and technological mechanisms. In childcare centers, workers care for and manage the enrolled children and also the enrolled children's personal information. In medical centers, workers manage patients' health along with patients' health information. The use of two areas allows for generalization across similar work environments while also exploring different dimensions (e.g., routines, legislation).

For my study it was critical to select areas where privacy was being managed in practice. The settings of childcares and medical practices were selected because they are similar in their goal to manage and respect sensitive personal information. Childcares were selected because they are a setting where access is more easily granted, regulations are less pervasive, and sensitive information is ubiquitous to the caretaking job. Privacy concerns may not be as prevalent as in physicians' offices, given that the information being stored is not life-critical. Medical practices were selected because of the daily use of information, the highly regulated nature of its use, and the life-critical aspect of having correct information. By studying both areas, my aim is to better understand the broader problem of managing private information. Areas that were considered but not used were employee files, criminal files, student files, and client files.

While it is not surprising that there is a difference between expected by-the-book processes and actual practice, this issue is of particular importance with regard to security. Security, as a theoretical construct and as a focus of technological development, is an area where failure carries high risk. Building systems, both social and technical, that are correct and dependable is an increasingly difficult problem given the growing reliance on rigid security systems [10]. Additionally, studying these systems is becoming ever more necessary as we strive to design secure and usable systems for childcares and medical practices that are increasingly adopting digital documentation systems [11]. It is for this reason that there exists a need to study and understand how socio-technical systems manage security practice.


Responding to the need for research in this area, my research question is: how do socio-technical systems that use sensitive personal information manage work-practice breakdowns surrounding the implicit and explicit rules of process? I have further broken this down into three sub-questions:

1. What are the implicit and explicit rules surrounding how medical practices and childcares handle sensitive personal information?
2. What breakdowns happen when the explicit and implicit rules are not followed?
3. How are breakdowns accounted for, negotiated, and managed in socio-technical systems where sensitive personal information exists?

By understanding the breakdowns encountered in the use of both social and technical systems, new systems can be designed that are flexible, have socially negotiated policies, and result in more reliable and secure day-to-day practice. Specifically, by understanding what causes security workarounds and how users respond to them, technological design can begin to create systems that produce less end-user frustration and fewer discrepancies between what users need for their practice and what is necessary for security and privacy.

METHODOLOGY

With recently granted approval from the Virginia Tech Institutional Review Board, active-participant observations are being employed. For these, data will be gathered by observing work at both childcare and medical centers. The pilot data and the resulting observational data will form the basis for the overall findings of this study. Daily observation logs will be kept, along with appropriate pictures of representative artifacts. Audio-recorded interviews will be transcribed verbatim. Breakdowns will then be analyzed using a phenomenological approach to produce an emergent understanding of how the socio-technical system employs security in practice [16]. Observations and interviews with key stakeholders will be used to triangulate and discern practices.

Qualitative methods, such as interviews and observations, are critical as investigative mechanisms because they account for the reasons and motivations that may go unreported on surveys. These methods will allow me to see when and where technology fails to maintain security and to propose more ecologically valid solutions to address these breakdowns. The second reason for using qualitative studies in this space is the lack of knowledge regarding actual security and privacy practice. Prior work has examined and asked about what happens when security policies are not adequate for the situation; little work has looked at what actually happens. Qualitative methods that look at the specific in order to abstract to the general are appropriate for the research goal of understanding security breakdowns.

Given the intrinsically sensitive nature of the data I am collecting, strict privacy protocols have been and will continue to be adhered to. There are national and state regulations protecting child and patient information (e.g., HIPAA, FERPA). For my study, identifiers of the participants are stripped from all documentation except for the informed consent and a document listing identifiers, names, and contact information. This includes secondary participants such as patients, children, and caregivers. All data is stored on password-protected computers, on external hard drives locked in cabinets, and on data print-outs that have had identifying information removed but are still locked in cabinets. All original documentation apart from the informed consents has been shredded. It is not our intention to collect information about particular children, caregivers, or patients. However, when names are encountered, they are given unique identifiers in the data to protect identity and anonymity.
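As an illustration only, the following is a minimal sketch, with hypothetical class and variable names that are not part of the study protocol, of the de-identification step just described: each name encountered in the data is consistently replaced with a unique identifier, while the name-to-identifier mapping is kept separate from the de-identified material.

```python
import itertools

class Pseudonymizer:
    """Assigns stable unique identifiers to names encountered in the data."""

    def __init__(self, prefix: str = "P"):
        self._prefix = prefix
        self._counter = itertools.count(1)
        self._mapping = {}  # name -> identifier; stored apart from the transcripts

    def identifier_for(self, name: str) -> str:
        # The same name always maps to the same identifier.
        if name not in self._mapping:
            self._mapping[name] = f"{self._prefix}{next(self._counter):03d}"
        return self._mapping[name]

    def redact(self, text: str, names: list[str]) -> str:
        # Replace each known name in a transcript excerpt with its identifier.
        for name in names:
            text = text.replace(name, self.identifier_for(name))
        return text

pseudo = Pseudonymizer()
print(pseudo.redact("Dr. Smith handed the chart to Ms. Jones.", ["Smith", "Jones"]))
# -> "Dr. P001 handed the chart to Ms. P002."
```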

RESULTS FROM PRELIMINARY STUDIES

Four pilot studies were conducted to explore security issues involved in the practice of collaborative sensitive information management: 12 interviews with childcare directors, 16 interviews with physician office directors, follow-up interviews with 4 childcare directors, and two to three observations in 4 childcares. All interviews and observations were transcribed. All participants were from southwest Virginia. The directors were recruited through a comprehensive list of all area businesses; the response rates were 55% for childcares and 26% for physician offices. Roughly 1,500 pages of data have been collected: approximately 750 pages of transcripts, 200 pages of observation journals, 123 forms, and 125 pictures.

The preliminary analysis has produced three loci where implicit and explicit norms influence security and privacy. The first, human-mediated access management, explores how the locations of a filing cabinet and a computer were physically situated and mediated by one person within each center, thereby negotiating access. The second, information duplication, explores the role and use of having information in multiple forms and across distributed locations. The last, community of trust, explores how personnel coordinate and co-construct knowledge of the context to assess nuances in policies. More information about these can be found in [15].

RELATED WORK & RESEARCH CONTRIBUTION

There has been a dearth of research examining the day-to-day practices of childcares and their relation to information security. However, the work of Kientz et al. has explored how to design a technological solution for information that is stored and managed about children [13]. One important finding from that study was that doctors were the most trusted source of information about a child's development. This finding speaks to the conceptions that parents have about authoritative information, which in turn affects what is shared and documented about their child. While this work embodies some of the same user needs as my study (e.g., large amounts of information, data recording), it does not focus on the security and privacy of practice. Additionally, Kientz's work focuses on how parents manage the documentation, while I am focusing on how childcares co-manage documentation with parents and other secondary caregivers.

In contrast, there has been extensive research on security and privacy, with some focus on medical settings. Prior work has examined the security of private information [3], how documentation and articulation work support collaboration in a medical setting [8], how to manage the mobility of medical collaboration [5], and how the use and creation of multiple surfaces can support collaboration and management, with a specific focus on supporting work practice [6]. The work of Reddy and Dourish [14] is representative of how practice and context can affect information dissemination. In their paper, temporal rhythms are proposed to explain the community patterns that healthcare workers follow in seeking, providing, and managing information. My work, instead, focuses on the inextricable relationship between the social and technical mechanisms used in medical work practice to negotiate ad hoc security and privacy needs.

It is the combination of these two areas of focus that will provide novel insight into the day-to-day security practice of managing sensitive personal information. To date, no one has examined this area of research to see how people manage policy breakdowns and ad hoc responses. This research will make two kinds of contributions: theoretical and technical. First, this work will benefit the health informatics, usable security, and HCI communities by detailing a deep exploration of how communities manage explicit and implicit policies. The results of this body of work will be a set of properties that help the design community create technology and tools to support secure work practice. Additionally, there has been a dearth of research studying how groups manage and coordinate security and put these constructs into practice. This work will add to that body of literature and understanding.

The product of this research will be a set of near-future scenarios depicting the positive and negative design of technologies in response to the categorization and analysis of the breakdowns. From these scenarios, other master's and Ph.D. students in my lab are starting, and will continue, to design prototypes that further explore the design of flexible and negotiated security policies. For example, we are currently considering how to design technologies that recognize a situation in which security policies should be mediated. It is easy to imagine a scenario in which a patient enters an emergency room, the doctors need immediate access to all medical information, and, as such, all privacy restrictions are lifted from an electronic health record. At a basic level, similar scenarios suggest technological solutions, such as temporary access or decaying access rights to information, that could be implemented to allow for flexible and negotiated privacy management. Outcomes such as these will further the contribution of my research.
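As a rough illustration of what such a mechanism might look like, the sketch below grants a clinician emergency access to a record and lets that right decay automatically rather than persist indefinitely. All names and the four-hour expiry window are assumptions made for the example, not features of the proposed prototypes.

```python
# A minimal sketch of the "temporary or decaying access" idea: an emergency
# grant opens a record to a clinician immediately, but the right expires on
# its own instead of remaining in force indefinitely.

from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class EmergencyGrant:
    user: str
    record_id: str
    granted_at: datetime
    lifetime: timedelta = timedelta(hours=4)  # assumed expiry window

    def is_active(self, now: datetime) -> bool:
        """The grant decays: it is valid only within its lifetime."""
        return now - self.granted_at < self.lifetime

# An ER physician is granted access when the patient arrives...
grant = EmergencyGrant("dr_lee", "patient_42_record", granted_at=datetime.now())
print(grant.is_active(datetime.now()))                       # True: immediate access
print(grant.is_active(datetime.now() + timedelta(hours=6)))  # False: the right has decayed
```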
