Security: Whose Responsibility


Transcript of Security: Whose Responsibility

Page 1: Security: Whose Responsibility


October 22, 2012


CAST seminar, Copenhagen, Denmark

David Koepsell, TU Delft, TPM Faculty, Philosophy Section

Security, whose responsibility?: The ontological foundations for a scientific commons in the age of Dual-Use concerns.

Page 2: Security: Whose Responsibility

October 13, 2012

The Ethical Context

The rapid rate of technological progress and the increasing availability of cheaper tools for scientific and technological applications make it harder to ensure public safety.

It is becoming easier to create catastrophic technologies without detection.

Page 3: Security: Whose Responsibility


The Ethical Context

How can we help ensure a safer world? What roles do governments have, and what roles do scientists and technologists have?

Who is morally responsible for dangerous research and development?

What can governments legitimately inhibit?

Page 4: Security: Whose Responsibility


Aims

To provide an argument for a distinction between the realm of science and the realm of technology as technologies converge.

To argue for unfettered inquiry into scientific truths

To establish where government might legitimately regulate technology

Page 5: Security: Whose Responsibility


Science and Ethics

Traditionally, assigning responsibility for the deployment of dangerous technology to individuals has divorced scientists from the consequences of their work.

Precepts: a) science should inquire into everything

b) politicians and maybe engineers are responsible for deployment

Page 6: Security: Whose Responsibility


Science and Ethics

These precepts lead to a sort of “scientific firewall” against moral responsibility. Scientists cannot be morally responsible because their duty is the unfettered exploration of everything, regardless of potential consequences.

Is there an ontological basis for the distinction? If so, what responsibility do scientists have as compared to engineers and politicians?

Page 7: Security: Whose Responsibility


Science and Ethics

Q: Do scientists ever have a positive moral duty to refrain? Let’s consider a graphic example…

Page 8: Security: Whose Responsibility


Smallpox Science

Smallpox was eliminated from the environment in 1977. It could have been eliminated altogether, and all stores of the virus destroyed. But as late as 2001, scientists in the US decided to conduct experiments to create a monkey-model of variola infection…

Page 9: Security: Whose Responsibility


The Australian Mousepox “Trick”

UPI: “CANBERRA, Australia, Jan. 11 (UPI) --

Scientists working for the Australian government have created a genetically engineered mousepox virus more deadly to mice than the original virus. Even when vaccinated with a normally effective vaccine, half the mice died after infection with the new virus.

Biological warfare experts are worried that the current international Biological and Toxin Weapons Convention, abbreviated BTWC, may not be strong enough to cope with the misuse of the genetic engineering techniques. Governments from all over the world have been meeting in Geneva for six years to address the BTWC shortcomings, but have failed to reach final agreement.

Dr. Ian Ramshaw, a viral engineer and the immunologist on the mousepox experiment, told United Press International that inserting genetic material has hazards. His team will publish their research in the February issue of the Journal of Virology.

"It is a potentially vile weapon," Renshaw said.”

Page 10: Security: Whose Responsibility


The Australian Mousepox “Trick”

The gene splice involved with the Mousepox Trick may easily be applied to smallpox, making a nearly unstoppable weapon.

So why shouldn’t scientists now take the next step and see if this is true?

Critical inquiry: is it scientifically necessary? Is it morally permissible?

Page 11: Security: Whose Responsibility


Smallpox Ethics

The dual-use argument is ultimately unhelpful: even a nuclear bomb has a dual use (e.g., Project Orion). Dual use was used to justify smallpox research (a catch-22 argument).

Are there or should there be moral limits to some research? Is some research morally prohibited because of its nature?

Is there a model for shaping researchers’ behaviours?

Ontology provides some guidance…

Page 12: Security: Whose Responsibility


Examples

Science doesn’t kill people; people with technologies kill people …

Page 13: Security: Whose Responsibility


Examples

But even the most ardent gun-rights proponent will not support free ownership of tactical nuclear weapons, and international law prohibits research and development of such weapons.

Page 14: Security: Whose Responsibility


Examples

I contend that the bulwark against regulation must stand between the realms of science and technology.

Science demands free and unfettered investigation into nature.

Technology may be ethically regulated, however…

Page 15: Security: Whose Responsibility


Converging Technologies

Converging technologies (synthetic biology, nanotechnology) pose a theoretical conundrum for previously clear distinctions between the natural and the man-made…

Where components of new technologies are molecular, at what level is it possible to regulate without infringing on the right of inquiry? Is it morally right to restrict or track precursors?

Page 16: Security: Whose Responsibility

A Defense of Basic Science

Regardless of the scale, the distinction between nature and artifice is always the border between what may and may not be ethically regulated.

E.g., research into fission cannot be legitimately curtailed, even to the point of producing nuclear chain reactions, when a) science demands it (something remains unknown) and b) the intent is to further human knowledge.

Page 17: Security: Whose Responsibility

A Defense of Basic Science

Freedom of conscience and expression demand that free, unfettered exploration into nature continue, which sometimes requires testing of hypotheses through experiment or proof of concept.

The first successful nuclear test could have been morally defensible if a) it aimed to test hypotheses as part of exploration into nature, and b) the science gained were then made open and public.

Page 18: Security: Whose Responsibility

A Defense of Basic Science

Failing to disclose the basic science undermines its role in inquiry, and impedes the scientific commons. Only by disclosure can hypotheses be properly tested.

Scientific truths (laws of nature) are a “commons-by-necessity” and cannot be justly monopolized by scientists (as opposed to their applications through technology)


Page 19: Security: Whose Responsibility

A Defense of Basic Science

The dividing lines:

Nature: free inquiry

Experiment: free inquiry (as necessary to test hypotheses)

Technology: may be limited (significant harm, least restrictive means)

Page 20: Security: Whose Responsibility

A Defense of Basic Science

How to distinguish nature from artifacts:

Nature: no human intention or design. This is a commons-by (logical/material) necessity, and may be freely explored by all

Artifice (artifacts and man-made processes): human intention and design. Inhibiting impedes rights to expression, but does not impede the scientific commons-by-necessity


Page 21: Security: Whose Responsibility

Regulation of Artifice

Artifice is legitimately regulated, but regulation must be recognized as curtailing free expression; thus the burden is to show:

a) substantial harm without regulation, and a compelling state interest in preventing the harm

b) least restrictive means and amount of censorship


Page 22: Security: Whose Responsibility

Regulation of Artifice

Thus, e.g., the “mousepox trick”:

Basic science, including proof of concept in mice, should be unfettered. It must also be published, since truths of nature are part of the scientific commons. Discoveries must be open and free to fulfill the aims and methods of basic science.

BUT: smallpox testing poses significant harm and is arguably unnecessary. The mousepox model is sufficient, and not harmful.


Page 23: Security: Whose Responsibility

Regulation of Artifice

H5N1 research:

Basic science, including proof of concept in ferrets, should be unfettered. It must also be published, since truths of nature are part of the scientific commons. Discoveries must be open and free to fulfill the aims and methods of basic science.

BUT: further testing poses significant harm and is arguably unnecessary.


Page 24: Security: Whose Responsibility

Conclusions

1) Basic science must not be regulated. Free inquiry is necessary, including experiments, when:

• Necessary to delve into a truth of nature (nature is a commons)

• Results are published freely and openly (without this, science cannot proceed; hypotheses cannot be tested or challenged)


Page 25: Security: Whose Responsibility

Conclusions

2) Artifice can be regulated, just as certain other expressions may be regulated, when:

• A significant harm could result, and

• Least restrictive means are used to regulate


Page 26: Security: Whose Responsibility

Conclusions

3) The distinction between nature and artifice marks a dividing line beyond which scientists and others must impose greater self-restraint and reflection:

• Nature (no human intention or design): may be freely inquired into

• Artifice (human intention and design): may be regulated to a degree


Page 27: Security: Whose Responsibility

Conclusions

4) Scientists have a primary moral responsibility to consider the potential harms of their research, and to anticipate and avoid them as best they can. Moral education should be part of scientific training. The scientific firewall is a myth.


Page 28: Security: Whose Responsibility

Conclusions

The ontology informs the ethics.


Page 29: Security: Whose Responsibility


Thank you

Atlas R. M. and Dando M. (2006). The dual-use dilemma for the life sciences: perspectives, conundrums, and global solutions, Biosecurity and Bioterrorism: Biodefense Strategy, Practice, and Science, Vol. 4, No. 3, pp. 276-286.

Childress, J., Meslin, E., & Shapiro, H., Eds. (2005). Belmont revisited: Ethical principles for research with human subjects. Washington, DC: Georgetown University Press.

Cohen H.W., Gould R.M., Sidel V.W. (2004), The pitfalls of bioterrorism preparedness: the anthrax and smallpox experiences, American Journal of Public Health, Vol. 94, No. 10, pp. 1667-1671.

Corneliussen F. (2006). Adequate regulation, a stop-gap measure, or part of a package? EMBO Reports, Vol. 7, pp. s50-s54.

Ehni, H-J. (2008). Dual use and the ethical responsibility of scientists. Arch. Immunol. Ther. Exp., Vol. 56, pp. 147-152.

Jones N.L. (2007). A code of ethics for the life sciences, Science, Engineering Ethics, Vol. 13, pp. 25-43.

Kelley M. (2006). Infectious disease research and dual-use risk, Virtual Mentor: Ethics Journal of the American Medical Association, Vol. 8, No. 4, pp. 230-234.

Koepsell, D. (2009). On genies and bottles: Scientists' moral responsibility and dangerous technology R&D, Science and Engineering Ethics, Vol. 16, No. 1, pp. 119-133.

Miller S and Selgelid M.J. (2008). Chap. 3: The Ethics of dual-use research, in Ethical and Philosophical Consideration of the Dual-Use Dilemma in the Biological Sciences (Miller ed.), Springer Sciences, NV.

Musil, R. K. (1980). There must be more to love than death: A conversation with Kurt Vonnegut, The Nation, Vol. 231, No. 4, pp. 128-132.

Nixdorff K. and Bender W. (2002). Ethics of university research, biotechnology and potential military spin-off, Minerva Vol. 40, pp. 15-35.

Preston, R. (2003). The Demon in the Freezer (Fawcett).

Somerville, M.A. and Atlas, R.M. (2005). Ethics: a weapon to counter bioterrorism, Science, Policy Forum, Mar. 25, p. 1881.