PRIVACY IN PERVASIVE COMPUTING
Marc Langheinrich
University of Lugano (USI), Switzerland
Approaches to Ubicomp Privacy
Disappearing Computer Troubadour Project (10/2002 - 05/2003)
Promote Absence of Protection as User Empowerment
"It's maybe about letting them find their own ways of cheating"
Make it Someone Else's Problem
"For [my colleague] it is more appropriate to think about [security and privacy] issues. It's not really the case in my case"
Insist that "Good Security" will Fix It
"All you need is really good firewalls"
Conclude it is Incompatible with Ubiquitous Computing
"I think you can't think of privacy... it's impossible, because if I do it, I have troubles with finding [a] Ubicomp future"
Marc Langheinrich: The DC‐Privacy Troubadour – Assessing Privacy Implications of DC‐Projects. Designing for Privacy Workshop. DC Tales Conference, Santorini, Greece, June 2003.
4
Today‘s Menu
Understanding Privacy
Definitions
1. Public policy
2. Laws and regulations
3. Interpersonal aspects
Technical Approaches
Challenges
1. Location privacy
2. RFID privacy
3. Smart environments
5
UNDERSTANDING PRIVACY
Privacy in Pervasive Computing
What Is Privacy?
"The right to be let alone."
Warren and Brandeis, 1890 (Harvard Law Review)
“Numerous mechanical devices threaten to make good the prediction that ’what is whispered in the closet shall be proclaimed from the housetops’“ Louis D. Brandeis, 1856 ‐ 1941
7
Technological Revolution, 1888
8
Information Privacy
"The desire of people to choose freely under what circumstances and to what extent they will expose themselves, their attitude and their behavior to others."
Alan Westin, 1967
Privacy and Freedom, Atheneum
Dr. Alan F. Westin
9
1. PRIVACY AS PUBLIC POLICY
Privacy in Pervasive Computing
Why Privacy?
"A free and democratic society requires respect for the autonomy of individuals, and limits on the power of both state and private organizations to intrude on that autonomy... privacy is a key value which underpins human dignity and other key values such as freedom of association and freedom of speech..."
Preamble to the Australian Privacy Charter, 1994
"All this secrecy is making life harder, more expensive, dangerous and less serendipitous"
Peter Cochrane, Former Head of BT Research
"You have no privacy anyway, get over it"
Scott McNealy, CEO Sun Microsystems, 1999
11
The NTHNTF Argument
"If you've got nothing to hide, you've got nothing to fear"
UK Gov't Campaign Slogan for CCTV (1994)
Assumption
Privacy is (mostly) about hiding (evil/bad/unethical) secrets
Implications
Privacy protects wrongdoers (terrorists, child molesters, ...)
No danger for law‐abiding citizens
Society overall better off without it!
12
Informational Self-Determination
"Informationelle Selbstbestimmung"
"If one cannot with sufficient surety be aware of the personal information about him that is known in certain parts of his social environment, ... [he] can be seriously inhibited in his freedom of self-determined planning and deciding. A society in which the individual citizen would not be able to find out who knows what when about them, would not be reconcilable with the right of self-determination over personal data. Those who are unsure if differing attitudes and actions are ubiquitously noted and permanently stored, processed, or distributed, will try not to stand out with their behavior. ... This would not only limit the chances for individual development, but also affect public welfare, since self-determination is an essential requirement for a democratic society that is built on the participatory powers of its citizens."
German Federal Constitutional Court (Census Decision ’83)
13
Informational Self-Determination
"Informationelle Selbstbestimmung"
“The problem is the possibility of technology taking on a life of its own, so that the actuality and inevitability of technology creates a dictatorship. Not a dictatorship of people over people with the help of technology, but a dictatorship of technology over people.“
Ernst Benda (1983)
Federal Constitutional Court Chief Justice
Ernst Benda, *1925
Chief Justice 1971-1983
15
Issue: Profiles
Allow Inferences About You
May or may not be true (cf. AOLStalker)!
May Categorize You
High spender, music aficionado, credit risk
May Offer Or Deny Services
Rebates, different prices, privileged access
"Social Sorting" (Lyon, 2003)
Opaque decisions "channel" life choices
Image Sources: http://www.jimmyjanesays.com/sketchblog/paperdollmask_large.jpg and http://www.queensjournal.ca/story/2008‐03‐14/supplement/keeping‐tabs‐personal‐data/
Not Orwell, But Kafka!
17
2. PRIVACY LAW PRIMERPrivacy in Pervasive Computing
Privacy Law History
Justices of the Peace Act (England, 1361)
Sentences for Eavesdropping and Peeping Toms
"The poorest man may in his cottage bid defiance to all the force of the crown. It may be frail; its roof may shake; ... – but the king of England cannot enter; all his forces dare not cross the threshold of the ruined tenement"
William Pitt the Elder (1708‐1778)
First Modern Privacy Law in the German State Hesse, 1970
19
Fair Information Principles (FIP)
Drawn up by the OECD, 1980
"Organisation for Economic Co-operation and Development"
Voluntary guidelines for member states
Goal: Ease transborder flow of goods (and information!)
Five Principles (simplified)
Core principles of modern privacy laws world‐wide
1. Openness
2. Data access and control
3. Data security
4. Collection Limitation
5. Data subject’s consent
20
Laws and Regulations
Privacy laws and regulations vary widely throughout the world
US has mostly sector-specific laws, with relatively minimal protections
Self-Regulation favored over comprehensive Privacy Laws
Fear that regulation hinders e-commerce
Europe has long favored strong privacy laws
Often single framework for both public & private sector
Privacy commissions in each country (some countries have national and state commissions)
21
EU Privacy Law
EU Data Protection Directive 95/46/EC
Sets a benchmark for national law for processing personal information in electronic and manual files
Expands on OECD Fair Information Practices: no automated adverse decisions, minimality, retention, sensitive data, checks, ...
Facilitates data flow between Member States and restricts export of personal data to "unsafe" non-EU countries
"E-Privacy" Directive 2002/58/EC ("amends" 95/46/EC)
Provisions for "public electronic communications services"
Data Retention Directive 2006/24/EC
Orders storage of "traffic data" for law enforcement
22
US-EU: Safe Harbor
How to Make US a "Safe" Country (in terms of the Directive)
US companies self-certify adherence to requirements
Dept. of Commerce maintains list (1790 as of 04/09): http://www.export.gov/safeharbor/
Signatories must provide
notice of data collected, purposes, and recipients
choice of opt-out of 3rd-party transfers, opt-in for sensitive data
access rights to delete or edit inaccurate information
security for storage of collected data
enforcement mechanisms for individual complaints
Approved July 26, 2000 by EU (w/ right to renegotiate)
So far, not a single dispute!
23
APEC Privacy Framework 2004
APEC – Asia-Pacific Economic Cooperation
21 Member States, e.g., Japan, South Korea, PR China, Hong Kong, Philippines, Australia, New Zealand, Macau, U.S., Canada
APEC "agreements" non-binding, only public commitment
Defines Nine "APEC Privacy Principles"
Typically less strict than EU and even OECD principles, e.g., no purpose specification, no prior notice, use of "harm principle"
No details or checks on national implementation
No attempt at EU Data Directive 95/46/EC compliance
No consideration of existing privacy laws in the region
See also: (Kennedy et al., 2009), (Greenleaf, 2009)
24
3. INTERPERSONAL PRIVACY
Privacy in Pervasive Computing
Privacy Invasions
When Do We Feel that Our Privacy Has Been Violated?
Perceived privacy violations due to crossing of "privacy borders"
Privacy Boundaries
1. Natural
2. Social
3. Spatial / temporal
4. Transitory
Gary T. Marx
MIT
27
Privacy Borders (Marx)
Natural
Physical limitations (doors, sealed letters)
Social
Group confidentiality (doctors, colleagues)
Spatial / Temporal
Family vs. work, adolescence vs. midlife
Transitory
Fleeting moments, unreflected utterances
28
Privacy Regulation Theory
Privacy as Accessibility Optimization: Inputs and Outputs
Spectrum: “Openness“/ “Closedness“
Contrasts with privacy as withdrawal (“to be let alone“)
Privacy not monotonic: “More“ is not always “better“
Dynamic Boundary Negotiation Process
Neither static nor rule-based
Requires fine‐grained coordination of action & disclosure
Focus on public spaces, mediated by spatial environment
Irwin Altman
University of Utah
29
Managing Privacy Boundaries
Use Altman's Theory for Networked Environments
Very different from real-world public spaces!
Disclosure Boundary: Private and Public
We sometimes use publicity to limit accessibility
Identity Boundary: Self and Other
Acting according to status, group, affiliation
Disclosure according to recipient's identity & role
Disclosure as means to differentiate or associate
Temporality Boundary: Past, Present, and Future
Effects of temporal sequence of disclosures
Leysia Palen
Univ. of Colorado
Paul Dourish
UC Irvine
L. Palen, P. Dourish: "Unpacking 'privacy' for a networked world." Proceedings of CHI 2003, pp. 129-136.
30
Today‘s Menu
Understanding Privacy
Definitions
1. Public policy
2. Laws and regulations
3. Interpersonal aspects
Technical Approaches
Challenges
1. Location privacy
2. RFID privacy
3. Smart environments
31
TECHNICAL APPROACHES
Privacy in Pervasive Computing
Ubicomp Privacy Implications
Data Collection ("more transactions")
Scale (everywhere, anytime)
Manner (inconspicuous, invisible)
Motivation (context!)
Data Types ("not without computers")
Observational instead of factual data
Data Access ("more easily accessible")
"The Internet of Things"
34
FIP Challenges in Ubicomp
How to inform subjects about data collections?
Unintrusive but noticeable
How to provide access to stored data?
Who has it? How much of this is "my data"?
How to ensure confidentiality and authenticity?
Without alienating the user!
How to minimize data collection?
What part of the "context" is relevant?
How to obtain consent from data subjects?
Missing UIs? Do people understand implications?
35
Border Crossings in Ubicomp
Smart appliances (natural borders)
"Spy" on you in your own home
Family intercom (social borders)
Grandma knows when you're home
Consumer profiles (temporal borders)
Span time & space
“Memory amplifier“ (transitory borders)
Records careless utterances
36
1. LOCATION PRIVACY
Privacy in Pervasive Computing
Location Privacy
"... the ability to prevent other parties from learning one's current or past location." (Beresford and Stajano, 2003)
Why Share Your Location?
By-product of positioning technology (e.g., cell towers)
Required to use service (recommendations, toll roads, ...)
Let others (friends, family) know where I am
Why NOT to Share Your Location?
Location profiles reveal/imply activities, interests, identity
Useful Definition?! Think Altman!
38
Location Privacy Technology
Many Proposals
Laws/regulations and audits (enterprise privacy)
Anonymization ("k-anonymity"; sketched after this slide)
Obfuscation
Rule-based access control
Privacy Model?
Assumption: Less location disclosure means more privacy
(Krumm, 2008) Provides Overview of State-of-the-Art
John Krumm
Microsoft Research
39
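To make the k-anonymity item from the list above concrete, here is a minimal sketch of spatial "cloaking": instead of an exact position, the service receives a region that contains at least k users, so the requester is hidden among them. The function name, grid step, and coordinates are invented for illustration; this is not any particular system's algorithm.

```python
# Minimal sketch of spatial k-anonymity ("cloaking"):
# grow a bounding box around the querying user until it contains
# at least k users, then report only the box instead of the point.
# All names and data are illustrative, not from any real system.

def cloak(user_pos, all_positions, k, step=0.001):
    """Return a bounding box around user_pos containing >= k positions."""
    half = step
    while True:
        box = (user_pos[0] - half, user_pos[1] - half,
               user_pos[0] + half, user_pos[1] + half)
        inside = [p for p in all_positions
                  if box[0] <= p[0] <= box[2] and box[1] <= p[1] <= box[3]]
        if len(inside) >= k:
            return box          # report the region, not the exact point
        half += step            # grow the region until k users are covered

# Example: cloak a user among four nearby users with k = 3
others = [(46.003, 8.951), (46.005, 8.953), (46.010, 8.960), (46.004, 8.950)]
print(cloak((46.004, 8.952), others + [(46.004, 8.952)], k=3))
```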
Location Obfuscation
Adding noise, perturbation, or dummy traffic to location data
Protects against attackers, but degrades service use
(Krumm, 2007) showed that LOTS of obfuscation is needed
Typically combined with rules to selectively adjust accuracy (see the sketch below)
Image Source: Krumm, J.: Inference Attacks on Location Tracks. Fifth International Conference on Pervasive Computing (Pervasive 2007), Toronto, Ontario, Canada, pp. 127-143.
40
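The two obfuscation flavours mentioned above, adding noise and coarsening precision, can be sketched in a few lines. The noise level and grid size below are arbitrary example values; as the slide notes, (Krumm, 2007) found that very large amounts of obfuscation are needed before inference attacks fail.

```python
import random

# Sketch of two common location-obfuscation steps:
# (1) add random noise, (2) snap to a coarse grid.
# Parameter values are arbitrary examples, not recommendations.

def add_noise(lat, lon, sigma_deg=0.01):
    """Perturb a position with Gaussian noise (~1 km at sigma = 0.01 deg)."""
    return lat + random.gauss(0, sigma_deg), lon + random.gauss(0, sigma_deg)

def snap_to_grid(lat, lon, cell_deg=0.05):
    """Reduce precision by reporting only the grid cell center."""
    return (round(lat / cell_deg) * cell_deg,
            round(lon / cell_deg) * cell_deg)

exact = (46.0037, 8.9511)          # e.g., somewhere in Lugano
print(add_noise(*exact))           # noisy position
print(snap_to_grid(*exact))        # coarse position
```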
Location Mix Zones
Frequently Change Pseudonyms to Prevent Tracking
Change often trivial to detect
Idea: Designate “Mix Zones“ With No Tracking / LBS Active
Change pseudonyms only within mix zone
(Beresford and Stajano, 2003) offer probabilistic model for unlinkability in mix zones
Alastair Beresford
Cambridge Univ.
Frank Stajano
Cambridge Univ.
41
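A toy model of the mix-zone idea: inside the zone the client stays silent, and it leaves under a fresh pseudonym, so an observer cannot trivially link the two trajectories. The zone geometry and pseudonym scheme below are invented for illustration, and the sketch omits the probabilistic unlinkability analysis of (Beresford and Stajano, 2003).

```python
import uuid

# Toy model of a mix zone: while inside the zone, no location updates
# are sent; on exit, the user continues under a fresh pseudonym.
# Zone coordinates and pseudonym format are illustrative only.

class MixZoneClient:
    def __init__(self, zone):                 # zone = (xmin, ymin, xmax, ymax)
        self.zone = zone
        self.pseudonym = uuid.uuid4().hex[:8]
        self.inside = False

    def report(self, x, y):
        in_zone = (self.zone[0] <= x <= self.zone[2] and
                   self.zone[1] <= y <= self.zone[3])
        if in_zone:
            self.inside = True
            return None                        # stay silent inside the mix zone
        if self.inside:                        # just left the zone:
            self.pseudonym = uuid.uuid4().hex[:8]   # switch pseudonym
            self.inside = False
        return (self.pseudonym, x, y)          # normal LBS update outside

client = MixZoneClient(zone=(10, 10, 20, 20))
for pos in [(5, 5), (15, 15), (25, 25)]:
    print(client.report(*pos))
```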
2. RFID PRIVACY
Privacy in Pervasive Computing
RFID Privacy Concerns
43
Why RFID Privacy?
Embarrassment
Wig? Underwear? Medicine?
Criminal Acts
Theft, assault, murder, terror
Tag contents shown in the "RFID-Man" artwork:
Wig (Model #2342, Material: Polyester)
Wallet (Cash: 370 Euro, Student ID: #2845/ETH)
Tiger Tanga (Maker: Aldi (Suisse), Last washed: 5 days ago)
Viagra (Maker: Pfizer, Size: Maxi, 60 pills)
Passport (Name: John Doe, Nationality: USA, Visa for: Israel)
Original "RFID-Man" Artwork (c) 2006 Ari Juels, RSA Laboratories
44
Why RFID Privacy?
Embarrassment
Wig? Underwear? Medicine?
Criminal Acts
Theft, assault, murder, terror
Indirect Control
Subtle influence through consumer profiles
Direct Control
"Technology Paternalism", government surveillance
Spiekermann, Pallas: Technology Paternalism – Wider Implications of Ubiquitous Computing. Poiesis and Praxis: International Journal of Technology Assessment and Ethics of Science. Springer, Jan 2006, pp. 1–13.
45
RFID Privacy Approaches
Tag Deactivation
Fry, cut, or silence (software)
Prevents further use
Tag Encryption (Lots! One example sketched after this slide)
More expensive tags
Password management!
Readout Interference ("Blocker-Tag", "Guardian")
Reliability? Feasibility? Legal?
Burdens user (conscious use, configuration)
(Juels, 2006) Provides Overview of State-of-the-Art
See also (Langheinrich, 2008) or (Spiekermann, 2008)
Ari Juels
RSA Laboratories
Kill-Station
METRO Future Store
46
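As one example of the "tag encryption" family flagged above, the sketch below illustrates a hash-lock-style scheme: the tag emits only a meta-ID (a hash of a key) until a reader presents the matching key. The class and variable names are invented for illustration; the sketch also shows why the slide highlights password management, since someone still has to store and distribute the per-tag keys.

```python
import hashlib, os

# Sketch of the "hash lock" idea for RFID tag access control:
# a tag stores metaID = hash(key) and stays locked until a reader
# presents the matching key. Illustrative only; real low-cost tags
# cannot run Python, and key distribution remains the hard problem.

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

class HashLockTag:
    def __init__(self, tag_id: str, key: bytes):
        self.tag_id = tag_id
        self.meta_id = h(key)          # the only value a locked tag emits

    def query(self):
        return self.meta_id            # still trackable via the fixed metaID

    def unlock(self, key: bytes):
        return self.tag_id if h(key) == self.meta_id else None

key = os.urandom(16)                   # the owner/back-end must manage this key
tag = HashLockTag("EPC:2845/ETH", key)
print(tag.unlock(b"wrong key"))        # -> None
print(tag.unlock(key))                 # -> "EPC:2845/ETH"
```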
Shamir Tags: "Keyless" Encryption
Idea: Encrypted Tag Carries Its Own Key
No need to manage keys!
Prevent Skimming: Key Readout Takes Long Time
Bitwise release, short range (e.g., one bit/sec)
Intermediate results meaningless, since encrypted
Prevent Tracking: Reply With Random Bits
Decryption requires all bits being read
Allow Known Tags to be Directly Identified
Allows owner to use tags without apparent restrictions
Initial bit-release enough for instant identification from known set
Source: Langheinrich, Marti: Practical Minimalist Cryptography for RFID Privacy. IEEE Systems Journal, Vol. 1, No. 2, 2007
Remo Marti
Ergon Informatik
(This Speaker)
47
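The cryptographic building block behind Shamir tags is Shamir's secret sharing: the tag's ID is the secret, and the data the tag emits corresponds to shares that are useless until (almost) all bits are known. Below is a generic sketch of the sharing and reconstruction step over a prime field; the field, share count, and encoding are illustrative and do not reproduce the bit-level format of (Langheinrich and Marti, 2007). The slow, bit-by-bit release on the next slide works because reconstruction only succeeds once every share is complete.

```python
import random

# Sketch of Shamir secret sharing, the building block behind "Shamir tags":
# the 96-bit EPC code is the secret; it is split into shares that are useless
# individually and only reveal the ID once all of them are known.
# Parameters are illustrative, not the paper's exact encoding.

P = 2**127 - 1                                  # prime field larger than 2^96

def make_shares(secret: int, n: int, k: int):
    """Split `secret` into n shares, any k of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):                                    # evaluate the polynomial at x
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

epc = random.getrandbits(96)                    # stand-in for a 96-bit EPC code
shares = make_shares(epc, n=3, k=3)             # three shares, all needed
assert reconstruct(shares) == epc
```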
Slide diagram (bit disclosure over time): a secret s (011010111...1101) is split into Shamir shares h_i.
The 96-bit EPC code becomes a 106-bit Shamir share (10-bit x-value, 96-bit y-value); three such shares form a 318-bit Shamir tag.
The tag answers with a 16-bit initial reply and then releases one additional bit at a time (+1 bit per step).
Unknown tags will eventually be identified; instant identification of known items.
48
3. SMART ENVIRONMENTS
Privacy in Pervasive Computing
Smart Environments
Privacy Middleware
Machine-readable privacy policies control data collection, processing, access
Personal device (e.g., mobile phone) to monitor and configure environment
Optional: Built-in data obfuscation
Example Projects
PawS/P3P (Langheinrich, 2003)
Confab toolkit (Hong and Landay, 2004)
James Landay
Univ. of Washington
Jason Hong
CMU
Aware Home
Georgia Tech
50
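The central mechanism of such privacy middleware is a check of the environment's machine-readable policy against the user's preferences before any data is released. The sketch below uses a deliberately simplified, invented policy format; PawS relies on P3P-based policies, which are far richer.

```python
# Minimal sketch of the policy check at the heart of privacy middleware:
# compare a service's declared collection practices with the user's
# preferences before releasing data. The dictionary format is made up
# for illustration; PawS used P3P-style policies instead.

service_policy = {
    "data": {"location", "identity"},
    "purpose": {"navigation", "advertising"},
    "retention": 90,                      # days
}

user_preferences = {
    "allowed_data": {"location"},
    "allowed_purposes": {"navigation"},
    "max_retention": 30,
}

def acceptable(policy, prefs):
    """Release data only if every declared practice is covered by the prefs."""
    return (policy["data"] <= prefs["allowed_data"]
            and policy["purpose"] <= prefs["allowed_purposes"]
            and policy["retention"] <= prefs["max_retention"])

print(acceptable(service_policy, user_preferences))   # -> False: deny or obfuscate
```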
Presence Technology
Providing Control and Awareness to Users
Who is seeing what information about me?
CSCW / Telecommuting
(Bellotti and Sellen, 1993) – EuroPARC's RAVE media space
(Neustaedter, Greenberg, and Boyle, 2006) – Blurring?
Location Disclosure
(Hong and Landay, 2004) – Lemming: Location-enhanced IM
(Consolvo et al., 2005) – Social relations and loc. disclosure
Image Source: (Neustaedter, Greenberg, and Boyle, 2006)
51
Related Issues
Privacy and Usability
CUPS group @ CMU
Hippocratic Databases
Privacy-compliant processing
Statistical Databases
Anonymization in databases ("k-anonymity")
Economics of Privacy
When do people share data?
Rakesh Agrawal
Microsoft Research
Lorrie F. Cranor
CMU
Latanya Sweeney
CMU
Alessandro Acquisti
CMU
52
SUMMARY & OUTLOOK
Privacy in Pervasive Computing
Take Home Message
Privacy is Not Just Secrecy and Seclusion!
Privacy is a process, not a state
Solution requires good understanding of social, legal, and policy issues involved
Pervasive Computing Offers New Challenges
Invisible, comprehensive, sensor-based, ...
Ubicomp (Privacy) Challenges
User interface (notice, choice, consent)
Protocols (anonymity, security, access)
Social compatibility (privacy boundaries)
54
Some Techno Fallacies
The Objectivity Of Numbers
Data Means Knowledge
More Data Means More Knowledge
If It Is In The Computer, It Must Be Right
If You Have Nothing To Hide, There’s No Danger
Less Data Means More Privacy
Technology Is Neither Good Nor Bad; Nor Is It Neutral
Melvin C. Kranzberg
See, e.g., Gary Marx: Rocky Bottoms and Some Information Age Techno‐Fallacies. Intl. Political Sociology, Vol. 1, No. 1. March 2007, pp. 83‐110.
Irwin Altman
University of Utah
Melvin C. Kranzberg
Georgia Tech (1917-1995)
Gary T. Marx
MIT
55
Thank You For Your Attention
Understanding Privacy
Definitions
1. Public policy
2. Laws and regulations
3. Interpersonal aspects
Technical Approaches
Challenges
1. Location privacy
2. RFID privacy
3. Smart environments
56
General Reading
David Brin: The Transparent Society. Perseus Publishing, 1999
Simson Garfinkel: Database Nation – The Death of Privacy in the 21st Century. O'Reilly, 2001
Lawrence Lessig: Code and Other Laws of Cyberspace. Basic Books, 2006. http://codev2.cc/
57
Privacy Law
Rotenberg: The Privacy Law Sourcebook 2004. EPIC, 2004
Privacy & Human Rights 2006. EPIC
Solove, Schwartz: Information Privacy Law. 3rd edition, Aspen, 2009
58
Privacy and Technology
Deborah Estrin (ed.): Embedded, Everywhere: A Research Agenda for Networked Systems of Embedded Computers. National Academies Press, 2001. http://www.nap.edu/openbook.php?isbn=0309075688
Waldo, Lin, Millett (eds.): Engaging Privacy and Information Technology in a Digital Age. National Academies Press, 2007.
Wright, Gutwirth, Friedewald, et al.: Safeguards in a World of Ambient Intelligence. Springer, 2008
59