Transcript of Biometric Security and Privacy Module 2.1, by Bon Sy (editor), Queens College/CUNY, Computer Science

Page 1

Biometric Security and Privacy
Module 2.1

By Bon Sy (editor)
Queens College/CUNY, Computer Science

Note: 1. This slide set is ongoing, a collaborative outcome of an exercise on collective and connective intelligence toward exploring the meaning and concept of usable privacy, leading to privacy technology.

Page 2

Privacy, Usable privacy, and Privacy technology

What is privacy? Biometrics and privacy: a survey of many different views on privacy, with the objective of identifying the characteristics behind the concept of privacy.

A collective and connective intelligence approach toward a taxonomy for usable privacy.

A framework for developing usable privacy technology: linking usable privacy technology to secure information processing, with information assurance and secure computation.

Page 3

What is privacy? Biometrics and Privacy…
By Arun Prakash Kumara Krishnan

Kodak entered the handheld camera market in 1931 with the Retina 35mm camera. The number of sensationalistic tabloids rife with gossip being published quadrupled during the next few years.

A few scientists at Kodak thus seem to have helped shape a significant part of our generation’s privacy concerns and beliefs.

What does this tell us about the potential impact of the mere existence of technology on privacy?

How fast can new technology come to affect our privacy concerns, and even shape the privacy beliefs of this and future generations?

How crucial is it for biometric technology developers and concerned persons to understand privacy concerns and how biometric technology might impact these concerns? How can this influence the design phase of a biometric system? How would this influence biometric research efforts? Should researchers in the field of biometrics not pursue certain paths because those paths might lead to significant privacy concerns?

Page 4

A few potential concerns for a biometric technology developer or a concerned citizen:

What are the situations in which biometric data can be misused? Do the privacy risks introduced by this technology outweigh any benefits? How compliant is this technology with prevailing privacy norms? Can any of these concerns be mitigated by certain software/hardware restrictions or other technology such as encryption? Is this technology privacy neutral, privacy intrusive, or privacy protective?
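The mitigation question above can be made concrete. Below is a minimal, illustrative Python sketch (not from the slides; all names are invented) of one software safeguard: storing only a keyed transform of a biometric template instead of the raw template, so that a database leak does not directly expose the biometric, and the stored reference can be revoked by issuing a new key (the idea behind "cancelable" biometrics).

```python
import hashlib
import hmac
import secrets

def enroll(template: bytes, user_key: bytes) -> bytes:
    """Store only a keyed transform of the template, not the raw biometric.
    If the database leaks, the raw template is not directly exposed, and the
    stored reference can be revoked by changing the key."""
    return hmac.new(user_key, template, hashlib.sha256).digest()

def verify(candidate: bytes, user_key: bytes, stored: bytes) -> bool:
    """Recompute the transform and compare in constant time to avoid
    timing side channels."""
    recomputed = hmac.new(user_key, candidate, hashlib.sha256).digest()
    return hmac.compare_digest(recomputed, stored)

key = secrets.token_bytes(32)            # per-user secret, kept apart from the database
ref = enroll(b"feature-vector-bytes", key)
assert verify(b"feature-vector-bytes", key, ref)
assert not verify(b"different-template", key, ref)
```

Note the limitation: an exact-match transform like this ignores the natural variability of biometric samples between captures; deployed schemes use error-tolerant constructions such as fuzzy commitments or secure sketches.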

Can you think of any other privacy related concerns a biometric technology developer might/must take into consideration?

Understanding how certain biometric technologies might affect privacy is a crucial necessity for the biometric industry and concerned citizens; however, this first entails understanding what privacy means, the nature of the various privacy concerns, and what a society considers “reasonable”.

In this entire slide set, discussions about biometric technology can usually be taken to apply to technology in general as well.

Do you agree? Why does one need to understand Privacy to assess how it is affected by Biometric Technology?

Page 5

What is this thing we call “Privacy”?

Can you try to define what privacy is?

Here is how Webster’s dictionary defines privacy:

1a : the quality or state of being apart from company or observation : seclusion
1b : freedom from unauthorized intrusion <one's right to privacy>
2 archaic : a place of seclusion
3a : secrecy
3b : a private matter : secret

How helpful are these definitions in trying to determine whether a certain biometric technology is privacy neutral, privacy intrusive, or privacy protective? That is, in general, how helpful are they in answering the questions posed in the previous slide with any specificity?

Page 6

The Problem Of Ambiguity…

“Privacy seems to encompass everything, and therefore it appears to be nothing in itself”

- Daniel Solove, Professor of Law at the George Washington University

“Privacy is a value so complex, so entangled in competing and contradictory dimensions, so engorged with various and distinct meanings, that I sometimes despair whether it can be usefully addressed at all.”

- Philosopher Robert C. Post

“Perhaps the most striking thing about the right to privacy is that nobody seems to have any very clear idea what it is.”

- Philosopher Judith Jarvis Thomson

When people claim that privacy should be protected, it is unclear precisely what they mean. This lack of clarity creates a difficulty when making policy or resolving a case because lawmakers and judges cannot easily articulate the privacy harm.

- Daniel Solove, Professor of Law at George Washington University

Page 7

The Problem Of Ambiguity…

Various laws and privacy policies purport to protect one’s privacy. The 9/11 Commission in its report, for example, recommended that the various government agencies share information, but in a manner that “safeguards the privacy of individuals about whom information is shared”. But what does “safeguards the privacy” mean in this case?

A firm enough conception of privacy, one that is not too ambiguous, is required for there to be effective respect for and safeguarding of the privacy of individuals.

The essential problem here is: how can one conceive of privacy such that it is not too vague, which would render it ineffective in practice (e.g., for evaluating the impact of biometric technologies), and yet not so narrow that it fails to encompass major aspects of what we mean by privacy?

Page 8

What is privacy? Many different thoughts!

By Arun Prakash Kumara Krishnan
Discussion moderator: Bon Sy

The various conceptions of privacy can be derived from the following core ideas. The following slides will explore each of these conceptions in detail, including their merits and shortcomings:

1. The right to be left alone.

2. Limited access to the self – The ability to shield oneself from unwanted access by others.

3. Secrecy – The concealment of certain matters from others.

4. Control over (access/use of) personal information.

5. Personhood – The protection of one’s personality, individuality and dignity.

6. Intimacy – Control over, or limited access to one’s intimate relationships.

Page 9

The right to be left alone.

Samuel Warren and Louis Brandeis published their seminal work “The Right to Privacy” (1890), which had four major impacts on shaping the idea of privacy in tort (common) law in the United States.

(1) On the use of cameras by the tabloid industry, Warren and Brandeis describe the situation: “The press is overstepping in each direction the obvious bounds of propriety and decency.” “Gossip is no longer the resource of the idle and of the vicious, but has become a trade.”

Page 10

The right to be left alone.

(2) On the defamation laws related to injury to reputations, Warren and Brandeis defined the right to privacy as “the right of determining, ordinarily, to what extent one’s thoughts, sentiments and emotions shall be communicated to others”.

The United States Supreme Court adopted Warren and Brandeis’s view of privacy in Katz v. United States (1967):

Extending the Fourth Amendment’s protection against unreasonable search and seizure to all areas where there is a “reasonable expectation of privacy”; i.e., a warrant is required for legal intrusion into spaces where one has a “reasonable expectation of privacy”.

Overturning the long-standing ruling in Olmstead v. United States (1928), which had allowed warrantless wiretapping.

Page 11

The right to be left alone.

• (3) Supreme Court Justice Abe Fortas’s opinion in Katz v. United States: “to live one’s life as one chooses, free from assault, intrusion or invasion, except as they can be justified by the clear needs of community living under a government of law”

• (4) Justice William O. Douglas agreed by directly quoting Brandeis “The right of privacy was called by Mr. Justice Brandeis “the right to be let alone”. That right includes the privilege of an individual to plan his own affairs, for outside areas of “plainly harmful conduct, every American is left to shape his own life as he thinks best, do what he pleases, go where he pleases.””

Page 12

The right to be left alone.

• The issue with the concept of privacy as a “right to be left alone”: on what matters should one be let alone?

• Reformulation of the issue by law professor Anita Allen:

“Punching someone would violate the ‘right to be let alone,’ but punching is not considered a violation of privacy, while peeping into a bedroom is considered a violation of privacy.”

• Main aspect of the issue: punching is not an intrusion into a private matter, but peeping is.

• The “right to be left alone” is one of the privileges that one should enjoy in private matters, but it is not a conception of what one means by “private matters”.

• Even as a legal right, where the “right to be left alone” refers to non-interference by the state, legal scholar Ruth Gavison argues, “Most privacy claims are not for non-interference by the state, but rather for interference by the state in resolving privacy infractions by other individuals.”

Page 13

The right to be left alone.

• Reflection on privacy as the “right to be left alone”:

• Privacy as the “right to be left alone” seems to be predicated on the notion of interference/disturbance. In order to shield oneself from external interference/disturbance, we need to have a filtering mechanism in place. But then we need to think about what criteria to use for filtering, and how to stipulate them.

-Thoughts by Bon Sy.
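The filtering question raised in the reflection above can be sketched concretely. This is a hypothetical Python toy (all criteria and names are invented, not from the slides): a filter admits an incoming contact only if it satisfies every stipulated criterion, which makes plain that the hard problem is choosing and stipulating the criteria, not building the filter.

```python
from dataclasses import dataclass

@dataclass
class Contact:
    sender: str
    purpose: str   # e.g. "personal", "marketing", "emergency"

# The criteria are the open question the slide points to:
# who stipulates these sets, and on what grounds?
ALLOWED_PURPOSES = {"personal", "emergency"}
BLOCKED_SENDERS = {"tabloid-press"}

def admit(c: Contact) -> bool:
    """Admit a contact only if it passes every stipulated criterion."""
    return c.purpose in ALLOWED_PURPOSES and c.sender not in BLOCKED_SENDERS

assert admit(Contact("friend", "personal"))
assert not admit(Contact("ad-network", "marketing"))
```

The mechanism itself is trivial; the privacy question lives entirely in the contents of `ALLOWED_PURPOSES` and `BLOCKED_SENDERS`.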

Page 14

Privacy As Limited access to the self. – The ability to shield oneself from unwanted access by others.

• The right of every man to keep his affairs to himself, and to decide for himself to what extent they shall be the subject of public observation and discussion.

• Limited access to the self is not merely solitude. Solitude is a form of seclusion, of withdrawal from other individuals, of being alone. Limited access includes solitude but goes further, embracing freedom from press surveillance, interference from the government, etc.

“Nothing is better worthy of legal protection than private life, or, in other words, the right of every man to keep his affairs to himself, and to decide for himself to what extent they shall be the subject of public observation and discussion.”

-E.L. Godkin

Page 15

Privacy As Limited access to the self. – The ability to shield oneself from unwanted access by others.

Privacy is the exclusive access of a person to a realm of his own. The right to privacy entitles one to exclude others from watching, utilizing, or invading his private realm.

- Ernest van den Haag

• “Limited access to the self” stems from the original notion of the “right to be let alone,” extending the idea to incorporate into the notion of privacy the will to segregate aspects of one’s self.

• Certainly not all access to the self violates privacy; only access to certain areas of the self involves privacy concerns. As law professor Daniel Solove claims:

“The theory provides no understanding as to the degree of access necessary to constitute a privacy violation. In the continuum between absolutely no access to the self and total access, the important question is where the lines should be drawn – that is, what degree of access should we recognize as reasonable?”

Page 16

Privacy As Limited access to the self. – The ability to shield oneself from unwanted access by others.

• Issues related to the privacy conception based on “limited access”:

• It is ambiguous and broad.

• The concept of limited access depends on classifying matters into private or non-private. Yet whether a matter is private or not may depend on the situation or the relationship with the other party; e.g., a health condition is typically considered private by an individual, but not from the health insurance company.

• To address this ambiguity, legal theorists explain what constitutes limited access. Gavison defines access by restricting privacy matters to withdrawal and concealment.

• Privacy concerns also arise from the government’s involvement in private decisions regarding one’s body, health, etc.

Page 17

Privacy As Limited access to the self. – The ability to shield oneself from unwanted access by others.

• Reflection on privacy as limited access to the self:

• Privacy as limited access to the self suggests that an individual take a proactive role in building a firewall around oneself. We will need to conceptualize the mechanisms and constraints to impose for allowing limited access; what can be accessed, and in what degree of limitation, is another important aspect that will need to be addressed for this notion of privacy to be useful.

-Thoughts by Bon Sy.
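The "degree of limitation" question can be made concrete as a graded access policy. Below is a minimal, hypothetical Python sketch (the levels and relationships are invented for illustration): each relationship gets a ceiling on a coarse continuum from no access to full access, echoing the health-insurer example from the previous slide.

```python
from enum import IntEnum

class Access(IntEnum):
    """A coarse continuum from no access to total access to the self."""
    NONE = 0
    PUBLIC_PROFILE = 1
    CONTACT_INFO = 2
    HEALTH_RECORDS = 3

# Per-relationship ceilings: this table is exactly the "line drawing"
# that Solove notes the limited-access theory leaves open.
CEILING = {
    "stranger": Access.PUBLIC_PROFILE,
    "employer": Access.CONTACT_INFO,
    "insurer":  Access.HEALTH_RECORDS,
}

def may_access(relationship: str, requested: Access) -> bool:
    """Grant access only up to the ceiling set for this relationship."""
    return requested <= CEILING.get(relationship, Access.NONE)

assert may_access("insurer", Access.HEALTH_RECORDS)
assert not may_access("employer", Access.HEALTH_RECORDS)
```

The code makes the theory's gap visible: the enforcement logic is one line, while everything contested lives in the `CEILING` table, which someone must stipulate.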

Page 18

Privacy as Secrecy – the concealment of certain matters from others.

• Secrecy is perhaps the most prevalent understanding of the term privacy. A person’s privacy is considered violated when certain previously concealed information is released publicly.

• Secrecy specifically deals with information; i.e., by its nature information may exist external to the self. In this sense this conception of privacy is different from the conception of “being left alone”.

The word ‘privacy’ seems to embrace at least two distinct interests. One is the interest in being left alone—the interest that is invaded by the unwanted telephone solicitation, the noisy sound truck, the music in elevators, being jostled in the street, or even an obscene theater billboard or shouted obscenity. The other privacy interest, concealment of information, is invaded whenever private information is obtained against the wishes of the person to whom the information pertains.

- Judge Richard Posner

Page 19

Privacy as Secrecy – the concealment of certain matters from others.

• Privacy as secrecy may be seen as a subset of “limited access to the self,” where access is restricted specifically to private information about oneself.

• As Posner further claims, “People want to manipulate the world around them by selective disclosure of facts about themselves.”

• Secrecy has been interpreted by the courts rather narrowly. No matter the extent of disclosure, with any public disclosure one loses his right to privacy in that matter; i.e., there can be no “reasonable expectation of privacy” once certain information is exposed to the public, no matter how unlikely its discovery is.

• Public exposure leading to no “reasonable expectation of privacy” could be problematic. While it is widely held as a consensus that we are entitled to privacy within our property, can our “reasonable expectation of privacy” be taken away or challenged when remote sensing technology enables surveillance, e.g., taking satellite photos of our property from public airspace? According to court precedent, this seems to be the case!

Page 20

Privacy as Secrecy – the concealment of certain matters from others.

• Privacy as secrecy is challenging in practically any social interaction setup.

• Exchanging complaints among co-workers about their superior is a situation where privacy is expected when sharing information with a select group of trusted people. But can privacy really be expected?

• Can an individual in the group share the “secret” with individuals of other groups, when there is a privacy (secrecy) expectation about disclosing the “secret” beyond the group?

We become what we are not only by establishing boundaries around ourselves but also by periodic opening of these boundaries to nourishment, to learning, and to intimacy. But the opening of a boundary of the self may require a boundary farther out, a boundary around the group to which we are opening ourselves.

-Sociologist Arnold Simmel

Page 21

Privacy as Secrecy – the concealment of certain matters from others.

• Another shortfall of “privacy as secrecy”:

• Many things we do every day in public (e.g., the books we buy, the products we purchase at the supermarket) are considered to be private matters. This is not because these activities are carried out in secrecy, but because, as Stanley Benn states, they “are matters that it would be inappropriate for others to try to find out, much less report on, without consent.”

• Under the notion of secrecy, especially as interpreted by the courts, “privacy as secrecy” is likely to be inadequate in many situations involving group interactions.

• Reflection on privacy as secrecy:

• The notion and scope of privacy as secrecy is likely to be too limited and narrow. While keeping matters secret or confidential is one aspect of privacy, it falls far short in circumstances where sharing private information is necessary but the privacy concern is about the use of that information.

-Thoughts by Bon Sy.

Page 22

Privacy as Control Over Personal Information.

• “Privacy is not simply an absence of information about us in the minds of others; rather it is the control we have over information about ourselves.”

- Charles Fried

• “An individual’s claim to control the terms under which personal information – information identifiable to the individual – is acquired, disclosed and used.”

- President Bill Clinton’s Information Infrastructure Task Force

• Control over personal information is a leading theory of privacy. Alan Westin, one of its foremost proponents, put forth the following meaning for “privacy and freedom”:

• “Privacy is the claim of individuals, groups or institutions, to determine for themselves when, how and to what extent information about them is communicated to others”

• Privacy is not an absolute

• Privacy is not a constant state but rather a process:

“Each individual is continually engaged in a personal adjustment process in which (s)he balances the desire for privacy with the desire for disclosure and communication …”

Page 23

Privacy as Control Over Personal Information.

Westin’s four states of privacy:

Solitude – the individual is separated from the group and freed from the observation of other persons.

Intimacy – the individual is part of a small unit.

Anonymity – the individual is in public but still seeks and finds freedom from identification and surveillance.

Reserve – the creation of a psychological barrier against unwanted intrusion; holding back communication.

Page 24

Privacy as Control Over Personal Information.

Difference between limited access to information vs. control over information:

Privacy as limited access to information about oneself – the extent to which we are known to others and the extent to which others have access to information about us.

Privacy as control over (the use of) information by oneself – not simply limiting what others know about you, but controlling how the information could be used. This assumes individual autonomy: that you can control information in a meaningful way.

Page 25

Privacy as Control Over Personal Information.

• What information can be controlled, and by whom?

• The federal appellate case of Haynes v. Alfred A. Knopf, Inc. (Seventh Circuit):

• The case involved the publication of a book which chronicled the life of an abused African American wife. Ruby Lee Daniels suffered gravely from her husband’s alcoholism and misconduct. Luther Haynes, Ruby’s husband, sued her publisher under the public-disclosure-of-private-facts tort, claiming that he had turned his life around and that the ugly episodes of his life were entitled to privacy protection. Judge Posner in his majority opinion concluded that there could be no liability for invasion of privacy because

“A person does not have a legally protected right to a reputation based on the concealment of truth, and because the book narrated a story not only of legitimate but of transcendent public interest.”

Page 26

Privacy as Control Over Personal Information.

• Challenging issues of “privacy as control over information”:

• We live in a system with a complex societal architecture of information regulation, where millions of actors share, manipulate, and withhold information as they go about their projects of self-creation. Privacy is nowhere close to being just a matter of the exercise of individual control.

• In an era where information about a person can be obtained instantaneously from hundreds of sources, exerting control over --- even just the use of --- information could be extremely challenging, if not impossible.

• How does this conception hold up when information is created by a group of people in cooperation? Like a husband and wife creating an intimate video, or multiple users on Facebook collaborating to share experiences through the “Wall”?

• Reflection on privacy as control over information:

• Privacy as control over personal information suggests the notion of ownership of information, and the existence of mechanisms to dictate and enforce the terms under which information flow could occur (e.g., specific parties, time, location, and other situational conditions).

-Thoughts by Bon Sy.
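The terms under which information flow may occur, as described in the reflection above, can be made machine-checkable. Here is a minimal, hypothetical Python sketch (all parties, purposes, and dates are invented): a toy "sticky policy" attached to a piece of personal information, in which the owner stipulates which parties may receive it, for what purpose, and until when.

```python
from datetime import datetime, timezone

# A toy policy the information owner attaches to one piece of personal data.
policy = {
    "allowed_parties": {"family-doctor", "insurer"},
    "allowed_purposes": {"treatment"},
    "expires": datetime(2030, 1, 1, tzinfo=timezone.utc),
}

def flow_permitted(policy: dict, party: str, purpose: str, when: datetime) -> bool:
    """Information may flow only when every stipulated term is satisfied."""
    return (party in policy["allowed_parties"]
            and purpose in policy["allowed_purposes"]
            and when < policy["expires"])

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
assert flow_permitted(policy, "family-doctor", "treatment", now)
assert not flow_permitted(policy, "employer", "treatment", now)   # wrong party
assert not flow_permitted(policy, "insurer", "marketing", now)    # wrong purpose
```

Real "control over use" also requires enforcement: the recipient must actually honor the policy after the data leaves the owner's hands, which is exactly where the skepticism voiced in this slide applies.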

Page 27

Privacy as Control Over Personal Information.

• If privacy is about control over information, then much like security control, privacy protection could perhaps be approached through (1) technical safeguards, (2) procedural safeguards, and (3) administrative safeguards against information leakage.

• The notion of safeguarding against information leakage suggests the possibility of approaching privacy protection

• by design, and

• by drawing from the wisdom of the crowd to identify the factors, and their inter-relationships, in formulating a design framework useful for the development of privacy safeguards.

• Recently the notion of co-privacy, grounded in mutual benefits among multiple parties who voluntarily regulate information sharing, was proposed. This may be considered privacy protection based on control over information, except at the group level as opposed to the individual level.

-Thoughts by Bon Sy.

Page 28

Privacy as Personhood – The protection of one’s personality, individuality and dignity.

• The conception of privacy as personhood usually complements other conceptions of privacy and by itself is usually not viewed as an atomic conception.

• Personhood conception of privacy:

• “Privacy amounts to respect for individuals as choosers: respect for someone as a person, as a chooser, implies respect for him as one engaged in a kind of self-creative enterprise, which could be disrupted, distorted or frustrated even by so limited an intrusion as watching.” - Philosopher Stanley Benn

• Privacy in this conception is the essence of freedom, for the mere gaze of the other restricts certain possibilities of self-creation.

Page 29

Privacy as Personhood – The protection of one’s personality, individuality and dignity.

• Sartre’s example of the peeping Tom: a peeping Tom looks through a keyhole. While he is “peeping,” his consciousness of himself does not change if he is not discovered. Once someone discovers him, their mere gaze changes the peeping Tom’s conception of himself.

• Government surveillance, the threat of surveillance, or regulation restricts the possibilities of a person’s project of self-creation; one may no longer be a pure subject but now also an object.

• If one were watched by cameras all day long, as in London for example, one’s project of self-creation might be totally different from if he/she were in New Delhi with no cameras. Public spaces would be viewed in a totally different manner, with New Delhi offering more fertile ground for self-creation.

Page 30

Privacy as Personhood – The protection of one’s personality, individuality and dignity.

• A seminal Supreme Court case that embraces the personhood conception of privacy is Roe v. Wade, as reaffirmed in Planned Parenthood v. Casey.

• In these cases abortion is regarded as part of a woman’s right to privacy, and the Justices in their Casey majority opinion explained what the constitutional right to privacy encompasses:

“These matters, involving the most intimate and personal choices a person may make in a lifetime, choices central to personal dignity and autonomy, are central to the liberty protected by the Fourteenth Amendment. At the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life. Beliefs about these matters could not define the attributes of personhood were they formed under compulsion of the State.”

• Here one aspect of privacy is conceptualized as the non-interference of the state in matters that are crucial to defining one’s personhood.

Page 31

Privacy as Personhood – The protection of one’s personality, individuality and dignity.

• Problem of the privacy notion as personhood:

• The right to personhood has two contradictory sides; i.e., the state needs to prevent disruptive characters from infringing other persons’ realizations of personhood. However, by doing so, the state can (un)intentionally impose a sense of what is more important and hence prevent an unlimited realization of personhood, which is exactly why personhood was conceived as an aspect of privacy in the first place.

• Example: In Sept 2010 NYC expanded the NYPD surveillance camera network to cover not just the downtown area but also midtown. Mayor Bloomberg argued in a TV news interview that the expanded surveillance network reduces potential terrorist activities, thereby enhancing privacy by increasing the opportunity for the “right to be left alone” in an unharmed way.

Page 32

Privacy as Personhood – The protection of one’s personality, individuality and dignity.

• Problem of the privacy notion as personhood (continued):

• “By conceiving of the conduct that it purports to protect as ‘essential to the individual’s identity,’ personhood inadvertently reintroduces into privacy analysis the very premise of the invidious uses of state power it seeks to overcome. When the state endeavors to protect personhood, it must adopt and enforce its own conception of individual identity, impinging upon the freedom of individuals to define for themselves what is central to their identities.” - Philosopher Jed Rubenfeld

• Reflection on privacy as personhood:

• Personhood manifests a very broad concept of what privacy is. In conjunction with other theories, privacy as personhood could help us develop a useful framework for a universal conception of privacy.

- Thoughts by Bon Sy

Page 33

Privacy as Intimacy – Control over, limited access to one’s intimate relationships.• Conception of privacy in terms of intimacy recognizes that privacy is

essential not merely for self-creation, but for human relationships.

• This conception retains a form of limited access, while locating the value of privacy in the development of personal relationships.

• "We form relationships with differing degrees of intimacy and self-revelation, and we value privacy so that we can maintain the desired levels of intimacy for each of our varied relationships. Intimate relationships simply could not exist if we did not continue to insist on privacy for them. By focusing on the relationship-oriented value of privacy, the theory of privacy as intimacy attempts to define what aspects of life we should be able to restrict access to, or what information we should be able to control or keep secret."

- Political Scientist Robert Gerstein

Page 34:

• Problem of the notion of privacy as intimacy:
• It ignores serious breaches of privacy. As political scientist Priscilla Regan describes:

"Computer databases pose a significant threat to privacy but do not primarily affect relationships of friendship, love, and trust. Instead, these threats come from private and governmental organizations: the police, welfare agencies, credit agencies, banks and employers."

• Reflection on privacy as intimacy:
• Intimacy is considered by Alan Westin to be a state of privacy rather than the totality of privacy. In either case, what is perhaps missing is documentation of use cases: the contextual situations under which a particular notion of privacy is appropriate, and the associated limitations.

- Thoughts by Bon Sy

Page 35:

A Usable Definition of Privacy for Assessing Biometric Technology…

Many of the above-mentioned conceptions of privacy are insightful. Individually, however, they seem to create more confusion and raise more questions than they solve. The spirit of Usable Privacy is to provide a mechanism concrete enough to protect privacy without getting tangled in ambiguity, for even with widespread agreement on privacy's importance for democracy, freedom, etc., ambiguity in its conception has led to inaction.

• Privacy is not merely a positive right; it is also a negative right, for the concept of privacy does not exist in the vacuum of one person's mind. Like language, privacy consists of inter-personal rules and norms that one needs to follow in interactions between persons. The negative-right aspect of privacy requires that individuals refrain from engaging in certain activities. Without an effective conception of privacy, these negative rights are more often ignored.

Page 36:

Collective Intelligence…

"Naturalis Historia" is considered to be one of the oldest encyclopedias. Pliny the Elder, a Roman statesman of the first century AD, is credited with compiling it. Encyclopedias like this one, and others over the centuries, essentially contained the judgments and opinions of a particular group of "experts" on various matters.

A new type of encyclopedia has emerged over the last few years: Wikipedia, a "collective" encyclopedia of sorts, where minds from all over the world collaborate to achieve common descriptions, definitions, explanations, etc.

In Wikipedia, persons explicitly contribute to this "collective knowledgebase". Other technologies, such as Google Trends, map our implicit tendencies, patterns, and choices to extract other forms of this "collective knowledge" that we possess.

Page 37:

Many have come to call these stores of knowledge contained in Wikipedia, Google Trends, Freebase, etc. "Collective Intelligence".

Initial suspicion aside, Collective Intelligence projects such as Wikipedia have become rather successful. Explicit collective-knowledge projects such as Wikipedia seem to have created vast sources of continuously refined knowledge through the contributions of millions of authors; and implicit knowledge sources like Google Trends seem to be discovering vast areas of collective knowledge by mapping the desires, decisions, etc. of billions of people.

How effectively, in your opinion, can we use these new tools to create the Dynamic Taxonomy of Usable Privacy mentioned earlier?

Page 38:

Specifically, Collective Intelligence is a term used to describe the knowledge collectively embedded within cultures, societies, or even the entirety of the human race. It is knowledge that emerges when persons interact and share ideas; it resides within no one person, but emerges from the interaction of ideas among many persons.

Collective Intelligence is sometimes divided into explicit and implicit types. Explicit Collective Intelligence emerges from the active sharing of ideas among persons, as in Wikipedia: knowledge is gathered through the interaction of many. Implicit Collective Intelligence results from the information generated by mapping the activities, desires, tendencies, etc. of many persons over time. By analyzing their patterns, correlations, and relationships, increasingly accurate predictions of one's preferences and behaviors are being made.

Netflix, for example, tracks its subscribers' movie rentals, looks for patterns, and recommends movies to them. Google's PageRank analyzes the billions of links that web authors create and ranks webpages based on the "Collective Intelligence" embedded in this web of links.
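The PageRank idea mentioned above can be sketched in a few lines. This is an illustrative toy, not Google's production system: the graph, the damping factor of 0.85, and the fixed iteration count are all arbitrary choices made for the example.

```python
# Minimal PageRank sketch: rank pages by the link structure of a toy web graph.
# Each link acts as a "vote"; rank flows along links until it stabilizes.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start from a uniform rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                    # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share   # pass rank along each link
        rank = new_rank
    return rank

toy_web = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
    "orphan": ["home"],
}
ranks = pagerank(toy_web)
# "home" receives the most inbound votes, so it ranks highest in this toy graph.
```

In the same spirit, a recommender like Netflix's looks for co-occurrence patterns across many users' histories; both are implicit Collective Intelligence extracted from aggregate behavior.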

Page 39:

Towards using Collective Intelligence to create a Dynamic Taxonomy of Usable Privacy…

How do you think one can use the notion of Collective Intelligence to create a Dynamic Taxonomy of Usable Privacy?

Certainly both an Explicit and an Implicit Collective Intelligence approach are required.

Formulating an Explicit Collective Intelligence approach seems straightforward: an open forum for the discussion of privacy concerns, like Wikipedia, where multitudes of people contribute and try to lay out what "Reasonable Expectations" are.

Page 40:

Of crucial importance in an Implicit Collective Intelligence approach to Usable Privacy, however, is defining which aspects of the multitude of human interactions and actions one should focus on. Defining the methods of analysis is also crucial to achieving applicable results.

The essential question is: which aspects of a person's particular, specific experiences need focus to allow an abstraction in the conception of privacy that transcends particular contexts and gives wide applicability? In other words, at what level should the focus be within particular experiences for the analysis to lead to a Collective Intelligence conception of "Usable Privacy"?

One could focus on collecting individual preferences; however, individuals may well have unreasonable demands. Moreover, individuals might express demands that they would not adhere to when a particular situation presents itself, i.e., the problem of determining how people actually behave versus what they say they desire. It is also often the case that they are inconsistent: further exploring a particular position shows that it contradicts another position they hold.

Page 41:

The crucial fact here is that the goal of "Usable Privacy" is not to survey the variety of beliefs about privacy that individuals might have, but rather to capture what a society considers a "Reasonable Expectation of Privacy." It is not a catalogue of various individual preferences, but rather a chart of the web of privacy concerns that arise within persons engaged in the business of living.

In particular everyday contexts, the experiences that should be catalogued and analyzed are the specific privacy problems that come up from day to day in people's lives, i.e., specific concrete situations where people feel the need for privacy.

Privacy concerns and the desire for protection do not arise in a vacuum; they come up because specific situations trigger the need for them. These needs are a cornerstone of every functioning society. Cataloguing and analyzing the particular events in which one feels the "desire" for privacy will probably lead to the most comprehensive formation of implicit Collective Knowledge in the realm of usable privacy.

Page 42:

Collective and connective intelligence approach towards a taxonomy for usable privacy

by Bon Sy

Privacy as the "right to be left alone"
• Predicated on the notion of interference/disturbance.
• On what matters are we entitled to the right to be left alone?
• What constitutes an external interference or disturbance? E.g., when there is loud music from the block party, are we entitled to the right to be left alone?
• Does our demand to be left alone itself constitute an interference with others?

Answers to the questions above are likely to depend on group dynamics and interaction. The group consensus on the nature of intrusion and decisional interference must first be determined and stipulated in order to devise a filtering mechanism, and the criteria for applying it.

Page 43:

Collective and connective intelligence approach towards a taxonomy for usable privacy

Privacy as limited access to the self
• Limited access to the self suggests that an individual takes a proactive role in building a firewall around oneself.
• What can be accessed, and with what degree of limitation, must be settled for this notion of privacy to be useful.
• How do we conceptualize the mechanisms and constraints to impose in order to allow limited access?
• Should biometrics be an aspect to be protected with limited access? Should biometrics be a means to enable limited access?
• What are the real-world use cases for each of the above two?
• As a group or community, how would the utility function f(acceptance, privacy, security) look in regard to settling the answers to the above two questions?
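As one hedged illustration of how the utility function f(acceptance, privacy, security) asked about above might look: a weighted sum averaged over group members. The weights, the 0-to-1 score scale, and the sample member scores are all assumptions invented for this sketch; a real group would have to negotiate every one of them.

```python
# One possible shape for f(acceptance, privacy, security): a weighted sum.
# Weights and scores are illustrative assumptions, not empirical values.

def utility(acceptance, privacy, security,
            w_acceptance=0.3, w_privacy=0.4, w_security=0.3):
    """All scores in [0, 1]; weights sum to 1 so utility stays in [0, 1]."""
    return (w_acceptance * acceptance +
            w_privacy * privacy +
            w_security * security)

def group_utility(member_scores, **weights):
    """Average utility over each member's (acceptance, privacy, security)."""
    return sum(utility(*s, **weights) for s in member_scores) / len(member_scores)

# Hypothetical comparison of the slide's two options:
# biometrics as data to be protected vs. biometrics as an access key.
protected = [(0.6, 0.9, 0.7), (0.5, 0.8, 0.6)]
access_key = [(0.9, 0.5, 0.8), (0.8, 0.4, 0.9)]
```

Changing the weights shifts which option the group prefers, which is exactly the negotiation the slide's question is pointing at.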

Page 44:

Collective and connective intelligence approach towards a taxonomy for usable privacy

Privacy as secrecy
• The notion and scope of privacy as secrecy is too limited and narrow for many privacy situations.
• While keeping information secret or confidential is one aspect of privacy, it falls far short in circumstances where sharing private information is necessary but the privacy concern is about the use of that information.
• As a group or community, identify the matters/information/events that are:
  - commonly agreed by all to be private under the notion of secrecy/confidentiality;
  - commonly agreed by all to be private but that cannot be kept completely secret, e.g., medical records disclosed to an insurance company;
  - considered private but not under the notion of privacy as secrecy, e.g., sharing family photos among friends.
• As a group or community, identify privacy technologies that are usable for the above three cases, and scenarios/situations of privacy concern with respect to each case.

Page 45:

Collective and connective intelligence approach towards a taxonomy for usable privacy

Privacy as control over information
• This notion of privacy is probably the most relevant and concrete with regard to the development of privacy technology, because it is restricted to one specific aspect, namely privacy of information.
• While in the broadest sense the ability to "control information" should include the ability to keep secrets and allow limited access, this is typically not the context.
• The context of "control over information" is often oriented towards the mechanisms that dictate and enforce the terms under which information flow may occur (e.g., specific parties, time, location, and other situational conditions).
• While "control over information" suggests the notion of ownership and regulation of information, the control mechanism may involve multiple parties in different locations; e.g., the concept of co-privacy.
• Secure Multi-party Computation (SMC) offers privacy protection by defining a protocol for information exchange. SMC is attractive because it guarantees provable privacy protection for the function that SMC computes. What practical real-world scenarios would find the notion of privacy as control over information essential? Can these scenarios be characterized by functions that SMC can efficiently compute? What are these functions?
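The SMC idea can be illustrated with a minimal additive secret-sharing sketch: several parties learn the sum of their private inputs without any party revealing its own input. Real SMC protocols, and functions beyond addition, are far more involved; this toy only shows the flavor of the privacy guarantee. The salary figures are hypothetical.

```python
# Toy SMC via additive secret sharing over a prime field: parties learn only
# the SUM of their private values. Each share, and each published partial sum,
# looks uniformly random on its own.
import secrets

P = 2**61 - 1          # prime modulus; all arithmetic is done mod P

def share(value, n_parties):
    """Split value into n random shares that sum to value (mod P)."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def secure_sum(private_values):
    n = len(private_values)
    all_shares = [share(v, n) for v in private_values]
    # Party i holds one share from every party and publishes only the total
    # of its shares; no single published total reveals any input.
    partial_sums = [sum(all_shares[j][i] for j in range(n)) % P
                    for i in range(n)]
    return sum(partial_sums) % P   # partial sums recombine to the true sum

salaries = [62_000, 71_000, 58_000]    # hypothetical private inputs
total = secure_sum(salaries)
```

This is one concrete answer to the slide's question: functions such as sums and averages (e.g., an average salary survey) are among those SMC can compute efficiently.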

Page 46:

Collective and connective intelligence approach towards a taxonomy for usable privacy

• Personhood manifests a very broad conception of what privacy is. The main essence of the personhood perspective is freedom and choice.
• The intertwined issue of this notion of privacy is that the freedom and choice of one party may impede the freedom and choice of others; for example, pro-life advocacy champions the freedom to live, but impedes women's freedom to choose on the abortion issue.
• In conjunction with other theories, privacy as personhood could help us develop a useful framework for a universal conception of privacy.
• As a group or community, what matters, events, or information should be entitled to a "freedom act" that will not cause the dilemma just mentioned?
• With respect to these matters, events, and information, what could be the role of technology in privacy protection?

Page 47:

Collective and connective intelligence approach towards a taxonomy for usable privacy

Two different views on privacy as intimacy:
• Intimacy itself is the totality of privacy; i.e., an individual only shares "private matters" with those in the circle of friends with whom an intimate relationship has been developed.
• Intimacy is considered by Alan Westin to be a state of privacy, not its totality.

There is a need to document use cases: the contextual situations under which a particular notion of privacy is appropriate, and the associated limitations.

PGP may be considered an encryption scheme utilizing the concept of intimacy (i.e., a circle of friends). For privacy protection, the challenge is information leakage by individuals who belong to different circles of friends. What technical safeguard could prevent this kind of information leakage?
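The PGP structure alluded to above can be sketched in miniature: content is encrypted once with a session key, and that session key is wrapped separately for each circle member. In this toy, XOR with a per-member secret key stands in for real public-key wrapping (PGP uses RSA or ECC); it is not a secure construction, only an illustration of the structure and of why membership in two circles enables leakage.

```python
# Toy sketch of PGP-style "circle of friends" sharing. XOR with one-time
# random keys stands in for real public-key session-key wrapping; do NOT
# use this construction for actual secrecy.
import secrets

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def share_with_circle(message, member_keys):
    session_key = secrets.token_bytes(len(message))
    ciphertext = xor(message, session_key)          # encrypt content once
    wrapped = {name: xor(session_key, key)          # wrap key per member
               for name, key in member_keys.items()}
    return ciphertext, wrapped

def read(ciphertext, wrapped_key, member_key):
    return xor(ciphertext, xor(wrapped_key, member_key))  # unwrap, decrypt

msg = b"family photo"
circle = {name: secrets.token_bytes(len(msg)) for name in ("ann", "bob")}
ct, wrapped = share_with_circle(msg, circle)
recovered = read(ct, wrapped["bob"], circle["bob"])
# Any member recovers the plaintext; a member who also belongs to another
# circle can re-share it there -- the leakage problem raised on this slide.
```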

Page 48:

Collective and connective intelligence approach towards a taxonomy for usable privacy

What is "Usable Privacy"?
• A conception of privacy that aims to maximize usability, i.e., it can be used to provide concrete guidance, for example to engineers developing products.
• It aims to chart what society is prepared to recognize as "reasonable", even though individuals may have arbitrary, unreasonable demands for privacy; e.g., one might believe that photographing one's hand in public is a privacy violation.

The crucial fact here is that the goal of "Usable Privacy" is not to survey the variety of beliefs about privacy that individuals might have, but rather what a society of individuals together considers a "Reasonable Expectation of Privacy".

Do you think that mapping what "society is prepared to recognize as reasonable" is a fruitful endeavor? Do the majority of us even share something in common in our individual conceptions of privacy?

Page 49:

Collective and connective intelligence approach towards a taxonomy for usable privacy

Usable privacy aims at mapping various notions of privacy to the related activities of privacy concern. As a use case, we can use Professor Solove's activity list below as a starting point for constructing a taxonomy:

Activity of privacy concern → Related privacy notion

1. Information collection (Surveillance, Interrogation)
   → Privacy as limited access to self; privacy as personhood; being left alone

2. Information processing (Aggregation, Identification, Insecurity, Secondary use, Exclusion)
   → Privacy as control over information

3. Information dissemination (Breach of confidentiality, Disclosure, Exposure, Increased accessibility, Blackmail, Appropriation, Distortion)
   → Privacy as secrecy; being left alone; privacy as limited access

4. Invasion (Intrusion, Decisional interference)
   → Privacy as being left alone; privacy as personhood
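Solove's activity-to-notion mapping above can be encoded as a small data structure, one possible starting point for a machine-readable taxonomy of usable privacy. The dictionary keys and the lookup helper are illustrative choices, not part of Solove's framework itself.

```python
# The slide's activity-to-notion mapping as a data structure: a starting
# point for a machine-readable taxonomy of usable privacy.
taxonomy = {
    "information collection": {
        "activities": ["surveillance", "interrogation"],
        "notions": ["limited access to self", "personhood", "being left alone"],
    },
    "information processing": {
        "activities": ["aggregation", "identification", "insecurity",
                       "secondary use", "exclusion"],
        "notions": ["control over information"],
    },
    "information dissemination": {
        "activities": ["breach of confidentiality", "disclosure", "exposure",
                       "increased accessibility", "blackmail", "appropriation",
                       "distortion"],
        "notions": ["secrecy", "being left alone", "limited access"],
    },
    "invasion": {
        "activities": ["intrusion", "decisional interference"],
        "notions": ["being left alone", "personhood"],
    },
}

def notions_for(activity):
    """Look up which privacy notions a concrete activity falls under."""
    return [group["notions"] for group in taxonomy.values()
            if activity in group["activities"]]
```

A dynamic taxonomy, in the spirit of the earlier slides, would let a community revise these lists over time rather than fix them once.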

Page 50:

Framework for developing usable privacy technology

Activity of privacy concern → Related privacy notion → Technical requirements

1. Information collection (Surveillance, Interrogation)

2. Information processing (Aggregation, Identification, Insecurity, Secondary use, Exclusion)
   Limited access → SIPPA
   Control over information → SMC

3. Information dissemination (Breach of confidentiality, Disclosure, Exposure, Increased accessibility, Blackmail, Appropriation, Distortion)
   Secrecy → Cryptography
   Being left alone

4. Invasion (Intrusion, Decisional interference)
   Personhood

Page 51:

Framework for developing usable privacy technology

The IBG BioPrivacy Initiative has come up with the following system for assessing any specific biometric technology:

Verification/Identification: Technologies capable of robust identification are rated higher; technologies that are only capable of verification are rated lower.

Overt/Covert: Technologies capable of operating without user knowledge or consent are rated higher; technologies that only operate with user consent are rated lower.

Behavioral/Physiological: Technologies based on unchanging physiological characteristics are rated higher; technologies based on variable behavioral characteristics are rated lower.

Availability of Searchable Databases: Technologies for which searchable databases exist (or are likely to exist in the near future) are more likely to be used in a privacy-invasive fashion than those for which no databases exist (or are likely to exist).

Page 52:

Technologies are rated Low, Medium, or High in each of these categories.

Low: Little privacy risk. The basic functionality of the technology ensures that there are few, if any, privacy issues.

Medium: Potential privacy risk. The technology could be used in a privacy-invasive fashion, but the range of potential misuse is limited.

High: Moderate privacy risk. For certain types of deployments, proper protections should be in place to ensure that the technology is not misused.

How effective do you think this method is to evaluate the privacy concerns of particular biometric technologies? What specific problems do you notice?

Do all forms of data collection entail privacy risks? Does this method take this into consideration, and lay out the distinction?

How effective is this method at shedding light on the trade-off between how much one can sacrifice in terms of privacy and the other "goods" gained, like convenience, prevention of crime, accumulation of knowledge, etc.?

Is this method flexible enough to accommodate the various conceptions of privacy among varied societies?

Page 53:

Here are a few examples of how the BioPrivacy Initiative classifies certain biometric technologies.

Fingerprint
  Positive privacy aspects: can provide different fingers for different systems; large variety of vendors with different templates and algorithms.
  Negative privacy aspects: storage of images in public-sector applications; use in forensic applications; strong identification capabilities.
  Ratings: Identification: H, Covert: M, Physiological: H, Databases: H. Risk rating: H.

Facial Recognition
  Positive privacy aspects: changes in hairstyle, facial hair, position, and lighting reduce the technology's ability to match without user compliance.
  Negative privacy aspects: easily captured without consent or knowledge; large number of existing images can be used for comparison.
  Ratings: Identification: H, Covert: H, Physiological: M, Databases: H. Risk rating: H.

Iris Recognition
  Positive privacy aspects: current technology requires a high degree of user cooperation, so it is difficult to acquire an image without consent; iris images are not used in forensic applications.
  Negative privacy aspects: very strong identification capabilities; development of the technology may lead to covert acquisition capability; most iris templates can be compared against each other (no vendor heterogeneity).
  Ratings: Identification: H, Covert: L, Physiological: H, Databases: L. Risk rating: H.

Retina-scan
  Positive privacy aspects: requires a high degree of user cooperation; the image cannot be captured without consent.
  Negative privacy aspects: very strong identification capabilities.
  Ratings: Identification: H, Covert: L, Physiological: H, Databases: L. Risk rating: M.

Page 54:

Voice-scan
  Positive privacy aspects: voice is text-dependent (the user must speak the enrollment password to be verified); not capable of identification usage.
  Negative privacy aspects: can be captured without consent or knowledge.
  Ratings: Identification: L, Covert: H, Physiological: L, Databases: L. Risk rating: M.

Dynamic Signature Verification
  Positive privacy aspects: signing is largely behavioral and can be modified at will.
  Negative privacy aspects: signature images can be used to commit fraud.
  Ratings: Identification: L, Covert: L, Physiological: L, Databases: L. Risk rating: L.

Keystroke Dynamics
  Positive privacy aspects: a highly behavioral characteristic, subject to significant changes.
  Negative privacy aspects: can be captured without knowledge/consent.
  Ratings: Identification: L, Covert: M, Physiological: L, Databases: L. Risk rating: L.

Hand Geometry
  Positive privacy aspects: a physiological biometric, but not capable of identification; not a palm-scanner, but a measure of hand structure; requires a proprietary device.
  Negative privacy aspects: none.
  Ratings: Identification: L, Covert: L, Physiological: M, Databases: L. Risk rating: L.
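The BioPrivacy tables give per-category ratings plus an overall risk rating, but no explicit aggregation formula, and notably no simple formula can exist: iris recognition and retina-scan share the same category ratings (H, L, H, L) yet receive different overall ratings, so the published ratings evidently involve expert judgment. The naive average below, with scores and thresholds chosen purely for illustration, makes that gap concrete.

```python
# Naive aggregation of the BioPrivacy category ratings: score L/M/H as 1/2/3,
# average, and threshold. Scores and thresholds are illustrative assumptions;
# the published ratings are NOT produced by any such formula.
SCORE = {"L": 1, "M": 2, "H": 3}

def naive_risk(identification, covert, physiological, databases):
    avg = sum(SCORE[r] for r in (identification, covert,
                                 physiological, databases)) / 4
    if avg >= 2.5:
        return "H"
    if avg >= 1.75:
        return "M"
    return "L"

# Matches the published tables for some technologies...
fingerprint = naive_risk("H", "M", "H", "H")     # published rating: H
hand_geometry = naive_risk("L", "L", "M", "L")   # published rating: L
# ...but cannot match both iris (published H) and retina (published M),
# since they share identical category ratings:
iris = retina = naive_risk("H", "L", "H", "L")
```

That irreproducibility is itself a partial answer to the questions on the previous slide about the method's effectiveness and flexibility.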

Page 55:

Conclusion

Understanding privacy through a taxonomic framework provides a concrete method of classification that is neither too narrow nor too vague. It is a real tool that can be used to make reasonable judgments as to whether a certain activity might breach privacy, and it is thus usable in real-world scenarios to solve real-world problems.

Given the dynamism of what we mean by "privacy" across various cultures and various times in history, how effective is a rigid taxonomy?

Can you think of ways in which one can build a taxonomy that is dynamic yet concrete, one that more accurately reflects our sensibilities about privacy, thus laying an evolving framework that provides effective tools for evaluating how a piece of technology may raise privacy concerns?