introduction to sociotechnical computer ethics

Transcript of introduction to sociotechnical computer ethics

Page 1: introduction to sociotechnical computer ethics

Burak Galip ASLAN, PhD

Page 2: introduction to sociotechnical computer ethics

ethics: morality

computer: information and communication technology

Page 3: introduction to sociotechnical computer ethics

“Guns don’t kill people, people kill people.”

Page 4: introduction to sociotechnical computer ethics

the traditionalist account

Pages 5-6: introduction to sociotechnical computer ethics

Does IT create new ethical issues that never existed before, or just new versions of old ethical issues?

1. technology -> complex ethical issues (nuclear weapons, genetically modified organisms (GMOs), cloning, genetic disease research, mind-altering pharmacology – the science of subjective experience, etc.) … this will lead to the standard account

2. specifically IT -> more than automobiles, electricity, and bridges; more to discuss at the end of the course, for now a preliminary start … this will lead to the sociotechnical systems approach

Pages 7-9: introduction to sociotechnical computer ethics

the standard account

James Moor, 1985, “What Is Computer Ethics?”

computers create new possibilities and opportunities (community groups, global companies, closing geographical gaps) (no prior possibility – data mining, proprietary software, child abuse on the web) (even more – Second Life, robots caring for the elderly, especially in Japan, artificial robo-medics, intelligent warfare)

good & bad come together…

virtual reality -> modeling, simulation, training + escaping the real world

robo-care for the elderly -> better care + less human contact

economic view, political view, etc. -> important (descriptive approach – how do people behave?)

ethical perspective -> very important (normative approach – how should people behave?)

Pages 10-12: introduction to sociotechnical computer ethics

the standard account

James Moor, 1985, “What Is Computer Ethics?”

new possibility -> vacuum of policies

computer ethics – filling policy vacuums (files -> no privacy concern -> remote access -> a policy vacuum)

filling policy vacuums -> sorting out conceptual muddles (copyright and patent laws; the concept of a “computer program”)

Page 13: introduction to sociotechnical computer ethics

an update to the standard account

change in technology -> science and technology studies (transplantation -> donation vs. selling?; urine tests at work -> intrusion on home privacy, sexual preference, race, or religion?; genetically modified food?)

policy vacuum – conceptual muddle -> new-technology ethics

Pages 14-17: introduction to sociotechnical computer ethics

an update to the standard account

1- new, at first: policy vacuums may be filled or left unfilled; if filled, with bad policies or good policies -> never forget that IT shapes moral practices as well

2- it isn’t just “new” anymore – e-ntegration (my call)

3- newness has its impact at introduction -> missing that there is a social context in it, technology <-> human (Steve Jobs, Steve Wozniak, Apple, HP, California, garage, electricity, electronics engineer, mouse, on-screen icons, hobby store)

4- newness – the only possibility? the best possibility? (Helen Nissenbaum – TrackMeNot, value sensitive design) do NOT get blinded, please! (see the sketch below)
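An aside on how TrackMeNot embodies value sensitive design: the extension periodically sends randomized decoy queries to search engines so that a user’s genuine searches are buried in noise. Below is a minimal Python sketch of that decoy-query idea; the word list, endpoint, and timing are illustrative assumptions, not TrackMeNot’s actual implementation (TrackMeNot itself runs inside the browser and builds queries dynamically).

```python
import random
import time
import urllib.parse
import urllib.request

# Hypothetical decoy vocabulary and endpoint -- illustrative assumptions,
# not TrackMeNot's actual word source or target.
DECOY_TERMS = ["weather", "recipes", "history", "football", "gardening",
               "astronomy", "poetry", "hiking", "jazz", "economics"]
SEARCH_URL = "https://search.example/search?q="

def send_decoy_query() -> str:
    """Issue one randomized decoy search so real queries blend into noise."""
    query = " ".join(random.sample(DECOY_TERMS, k=2))
    try:
        # Fire-and-forget: the response does not matter, only the fact
        # that the tracker observes yet another plausible-looking query.
        urllib.request.urlopen(SEARCH_URL + urllib.parse.quote(query), timeout=5)
    except OSError:
        pass  # decoys are best-effort; a failed request costs nothing
    return query

if __name__ == "__main__":
    for _ in range(3):
        print("decoy sent:", send_decoy_query())
        time.sleep(random.uniform(2, 10))  # jittered timing, like real browsing
```

The design point: privacy here is protected not by blocking the technology but by building a counter-technology whose values differ from the tracker’s – newness was not the only possible design, nor the best one.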

Page 18: introduction to sociotechnical computer ethics

an update to the standard account

The standard account is not specific to “computers” or “IT”; it applies to “new technology” in general.

Page 19: introduction to sociotechnical computer ethics

the sociotechnical systems perspective

Science and Technology Studies (STS)

SocioTechnical Systems (STS) perspective

“Avoid 3 mistakes while thinking about technology!”

Pages 20-24: introduction to sociotechnical computer ethics

the sociotechnical systems perspective

1- reject technological determinism -> think co-shaping

tenets of technological determinism:

a- technological development is independent of society ✗ (decisions of government agencies, social incidents, war/terrorist attacks – security, National Science Foundation (NSF) investment in nanotechnology)

b- technology determines the character of society ✗ (automobile safety and fuel efficiency -> automobile design)

“Nature cannot be made to do just anything humans want it to do. Nevertheless, nature does not entirely determine the technologies we get.”

both (a) and (b) can be tied to the 3rd update to the standard account (my call)

Pages 25-28: introduction to sociotechnical computer ethics

the sociotechnical systems perspective

2- reject technology as material objects -> think sociotechnical systems

artifacts <- intentional human activity

technology -> “social product”

an artifact is only a part of a social system; it doesn’t do anything by itself (employee monitoring system – see the sketch below)

sociotechnical systems (Hughes, 1994) – not only social, not only technical (a departure from traditional ethics)

you cannot understand a technology just by focusing on the artifact (monitoring tool, data-mining tool, word processor) (recall the 3rd and 4th updates to the standard account)
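To make the “artifact does nothing by itself” point concrete, here is a minimal Python sketch of the core artifact of an employee monitoring system; the file name and event fields are hypothetical. The code only records events – who is monitored, who reads the log, and what follows from it are decided by the surrounding social system, not by the artifact.

```python
import datetime
import json

LOG_PATH = "activity_log.jsonl"  # hypothetical log location

def record_event(user: str, action: str) -> None:
    """Append one observed event -- the artifact merely records."""
    event = {
        "time": datetime.datetime.now().isoformat(),
        "user": user,
        "action": action,
    }
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(event) + "\n")

# Everything consequential lives outside this code: company policy decides
# which actions count as "monitored", labor law constrains what may be
# collected, and managers decide whether a log entry leads to discipline,
# coaching, or nothing at all. The artifact plus those arrangements -- not
# the artifact alone -- is the sociotechnical system.
if __name__ == "__main__":
    record_event("alice", "opened payroll spreadsheet")
```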

Pages 29-31: introduction to sociotechnical computer ethics

the sociotechnical systems perspective

3- reject technology as neutral -> think technology infused with values

“value neutral”

Langdon Winner, 1986, “Do Artifacts Have Politics?”

Particular technologies cannot exist or function without particular kinds of social arrangements.

adoption of a technology -> adoption of a social order (nuclear power vs. windmills, trains vs. bicycles) (social vs. individual)

Pages 32-34: introduction to sociotechnical computer ethics

the sociotechnical systems perspective

enforcing social biases (the bridges of Long Island, 1930s, New York, Robert Moses)

a race-biased arrangement?

could it be technological determinism as well?

• height of the bridges

• size of buses

• previous social arrangements

• ideas about race and social hierarchy

Page 35: introduction to sociotechnical computer ethics

sociotechnical computer ethics

just as an example, a sociotechnical system here:

Facebook (1- Mark Zuckerberg, Harvard – it didn’t come out of nowhere; 2- a social networking site built on users’ contributions and users’ enforcement; 3- not neutral – user values vs. Facebook’s values as a for-profit organization)

Pages 36-38: introduction to sociotechnical computer ethics

micro and macro level analysis

micro level -> individuals (an RFID tag; an RFID tag implanted in an elderly person?)

macro level -> groups, organizations, countries (hospital policies; should software be proprietary?; should employers monitor employee email?)

both levels affect each other

hacking -> individual action (micro level); how to deal with it? (macro level – laws)

interaction of micro and macro levels:

“The hacker was wrong to gain unauthorized access because it is illegal.”

Pages 39-41: introduction to sociotechnical computer ethics

return to the “why computer ethics?” question

1- Why a book on technology and ethics?

(technology is a part of human activity; it shapes, and is shaped by, morality – decisions and choices)

2- Why a book specifically on “computers” or “IT” and ethics?

Different technologies have different effects.

And not just “ethics” instead of “IT ethics”, because the STS perspective notes that all social activities are, at least in part, shaped by technology.

[diagram: ethics / IT ethics]

Page 42: introduction to sociotechnical computer ethics

references