The Human Importance of the Intelligence Explosion
Eliezer Yudkowsky, Singularity Institute for Artificial Intelligence (singinst.org)

Transcript of the slides from the Singularity Summit 2006.

Page 1

The Human Importance of the Intelligence Explosion

Eliezer Yudkowsky, Singularity Institute for Artificial Intelligence

singinst.org

Page 2

"Intelligence explosion:"

• Concept invented by I. J. Good (famous name in Bayesian statistics) in 1965.

• Hypothesis: The smarter you are, the more creativity you can apply to the task of making yourself even smarter.

• Prediction: Positive feedback cycle rapidly leading to superintelligence.


(Good, I. J. 1965. Speculations Concerning the First Ultraintelligent Machine. Pp. 31-88 in Advances in Computers, 6, F. L. Alt and M. Rubinoff, eds. New York: Academic Press.)
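To make the feedback concrete, here is a toy numerical sketch (my own illustration, not from the slides): assume each round of self-improvement adds capability in proportion to the capability already present, so the growth curve bends upward instead of leveling off.

```python
# Toy model of the intelligence-explosion feedback loop (illustrative assumption:
# the size of each self-improvement is proportional to current capability).

def self_improvement_curve(capability=1.0, gain=0.1, steps=20):
    """Return the capability trajectory when a system re-invests its own
    intelligence into making itself smarter each step."""
    trajectory = [capability]
    for _ in range(steps):
        capability += gain * capability  # smarter system -> bigger improvement
        trajectory.append(capability)
    return trajectory

curve = self_improvement_curve()
print(curve[:5])   # slow at first: 1.0, 1.1, ~1.21, ~1.33, ~1.46
print(curve[-1])   # then compounding takes over: ~6.7 after 20 steps
```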

Page 3

The intelligence explosion hypothesis neither implies nor requires:

• More change occurred from 1970 to 2000 than from 1940 to 1970.

• Technological progress follows a predictable curve.

• Does not even require that "Real AI" is possible! (An intelligence explosion could happen with augmented humans.)

Page 4

"Book smarts" vs. cognition:

"Book smarts" evokes images of:

• Calculus

• Chess

• Good recall of facts

Other stuff that happens in the brain:

• Social persuasion

• Enthusiasm

• Reading faces

• Rationality

• Strategic ability

Page 5

The scale of intelligent minds: a parochial view.

[Figure: a scale running from "Village idiot" to "Einstein"]

Page 6

The scale of intelligent minds: a parochial view.

[Figure: a scale running from "Village idiot" to "Einstein"]

A more cosmopolitan view:

[Figure: a much longer scale: "Mouse", "Chimp", "Village idiot", "Einstein"]

Page 7

The power of intelligence:

• Fire

• Language

• Nuclear weapons

• Skyscrapers

• Spaceships

• Money

• Science

Page 8

One of these things is not like the other...

• Space travel

• Extended lifespans

• Artificial Intelligence

• Nanofactories

Page 9

Intelligence:

• The most powerful force in the known universe: we see its effects every day

• The most confusing question in today's science: ask ten scientists, get ten answers

• Not a complete mystery: there is a huge library of knowledge about mind, brain, and cognition, but it is scattered across dozens of different fields!

Page 10

If I am ignorant about a phenomenon,

this is a fact about my state of mind,

not a fact about the phenomenon.

Confusion exists in the mind, not in reality.

There are mysterious questions. Never mysterious answers.

(Inspired by Jaynes, E.T. 2003. Probability Theory: The Logic of Science. Cambridge: Cambridge University Press.)

Page 11

For more about intelligence:

• Go to http://singinst.org/

(Or google "Singularity Institute")

• Click on "Summit Notes"

• Lecture video, book chapters

Page 12

The brain's biological bottleneck:

• Neurons run at 100 Hz

• No read access

• No write access

• No new neurons

• Existing code not human-readable

Page 13

Relative difficulty:

• Build a Boeing 747 from scratch.

versus:

• Starting with a bird,

• Modify the design to create a 747-sized bird,

• That actually flies,

• As fast as a 747,

• Then migrate the actual living bird to the new design,

• Without killing the bird or making it very unhappy.

Page 14

The AI Advantage (for self-improvement)

• Total read/write access to own state

• Absorb more hardware (possibly orders of magnitude more!)

• Understandable code

• Modular design

• Clean internal environment

Page 15

Biological bottleneck (for serial speed)

• Lightspeed is >10^6 times faster than signals in axons and dendrites.

• A synaptic spike dissipates >10^6 times the thermodynamic minimum heat (though transistors do worse).

• Transistor clock speeds are >>10^6 times faster than neuron spiking frequencies.
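For context, here are the rough back-of-envelope numbers usually behind these ratios; the specific constants below are my own order-of-magnitude approximations, not figures from the slide.

```python
# Order-of-magnitude comparison of biological vs. electronic signaling
# (all biological figures are rough, textbook-level approximations).

SPEED_OF_LIGHT = 3.0e8           # m/s
AXON_CONDUCTION = 100.0          # m/s, fast myelinated axons

TRANSISTOR_CLOCK = 2.0e9         # Hz, a ~2 GHz processor
NEURON_SPIKE_RATE = 100.0        # Hz, typical peak firing rate

LANDAUER_LIMIT = 3.0e-21         # J, kT*ln(2) near room temperature
SYNAPTIC_SPIKE_ENERGY = 5.0e-15  # J, rough estimate per synaptic event

print(f"lightspeed / axon speed:       {SPEED_OF_LIGHT / AXON_CONDUCTION:.0e}")       # ~3e+06
print(f"clock rate / spike rate:       {TRANSISTOR_CLOCK / NEURON_SPIKE_RATE:.0e}")   # ~2e+07
print(f"spike energy / Landauer limit: {SYNAPTIC_SPIKE_ENERGY / LANDAUER_LIMIT:.0e}") # ~2e+06
```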

Page 16

• Physically possible to build a brain at least 1,000,000 times as fast as the human brain

• Even without shrinking the brain, lowering its temperature, quantum computing, etc.

• Drexler's Nanosystems says a sensorimotor speedup of >>10^6 is also possible

• 1 year → 31 seconds
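The last bullet is just unit conversion, assuming exactly the millionfold serial speedup described above:

```python
# One subjective year for a 10^6-times-faster mind, measured in wall-clock time.
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.16e7 s
SPEEDUP = 1e6

print(SECONDS_PER_YEAR / SPEEDUP)       # ~31.6 seconds of real time per subjective year
```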

[Figure: the scale of minds again: "Mouse", "Chimp", "Village idiot", "Einstein"]

Page 17

10,000 years to nanotech? (for superintelligence)

• Solve a chosen special case of protein folding

• Order custom proteins from online labs with 72-hour turnaround time

• Proteins self-assemble into a primitive device that accepts acoustic instructions

• Use it to build 2nd-stage nanotech, 3rd-stage nanotech, etc.

• Total time: 10,000 years ~ 4 days
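One way to read the last bullet, on the same assumed 10^6 speedup as the previous slide: ten thousand subjective years of work fit into a few days of real time.

```python
# 10,000 subjective years at a 10^6 serial speedup, in days of real time.
DAYS_PER_YEAR = 365.25
SPEEDUP = 1e6

print(10_000 * DAYS_PER_YEAR / SPEEDUP)   # ~3.65 days, i.e. roughly the "4 days" on the slide
```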

Page 18

Respect the power of creativity, and be careful what you call "impossible".

Page 19

[Figure: a timeline of accelerating change: hunter-gatherers, agriculture, Ancient Greeks, Renaissance, the Industrial Revolution, the Electrical Revolution, the nuclear, space, computer, biotech, and Internet revolutions, molecular nanotechnology]

Page 20

vs.

[Figure: the same timeline of revolutions, contrasted with a second scale labeled: bees, chimps, hunter-gatherers, Internet]

Page 21

Can an intelligence explosion be avoided?

• Self-amplifying once it starts to tip over

• Very difficult to avoid in the long run

• But many possible short-term delays

• Argument: A human-level civilization occupies an unstable state; it will eventually wander into either a superintelligent region or an extinct region.
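A toy way to picture the "unstable state" argument (my own sketch, not part of the talk): model civilization's trajectory as a random walk between two absorbing boundaries. The walk can stall for a long time, but it is eventually absorbed at one boundary or the other.

```python
import random

def wander_until_absorbed(level=50, extinct=0, superintelligent=100, seed=None):
    """Symmetric random walk from a middle 'human-level' state; returns which
    absorbing region was reached and how many steps it took."""
    rng = random.Random(seed)
    steps = 0
    while extinct < level < superintelligent:
        level += rng.choice((-1, 1))   # short-term ups and downs (delays, setbacks)
        steps += 1
    outcome = "superintelligent" if level >= superintelligent else "extinct"
    return outcome, steps

print(wander_until_absorbed(seed=2006))
# The walk always ends in one of the two absorbing regions;
# only the length of the delay varies from run to run.
```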

Page 22

Fallacy of the Giant Cheesecake

• Major premise: A superintelligence could create a mile-high cheesecake.

• Minor premise: Someone will create a recursively self-improving AI.

• Conclusion: The future will be full of giant cheesecakes.

Power does not imply motive.

Page 24

Spot the missing premise:

• A sufficiently powerful AI could wipe out humanity.

• Therefore we should not build AI.

• A sufficiently powerful AI could develop new medical technologies and save millions of lives.

• Therefore, build AI.

Page 25

Spot the missing premise:

• A sufficiently powerful AI could wipe out humanity.

• [And the AI would decide to do so.]

• Therefore we should not build AI.

• A sufficiently powerful AI could develop new medical technologies and save millions of lives.

• [And the AI would decide to do so.]

• Therefore, build AI.

Page 26

Design space of minds-in-general

[Figure: a map of the space, with a small region labeled "All human minds" among regions labeled "Bipping AIs", "Gloopy AIs", "Freepy AIs"]

Page 27

AI isn't a prediction problem, it's an engineering problem.

We have to reach into mind design space, and pull out a mind such that we're glad we created it...

Page 28

AI isn't a prediction problem, it's an engineering problem.

We have to reach into mind design space, and pull out a mind such that we're glad we created it...

Challenge is difficult and technical!

Page 29

"Do not propose solutions until the problem has been discussed as thoroughly as possible without suggesting any."

-- Norman R. F. Maier

"I have often used this edict with groups I have led - particularly when they face a very tough problem, which is when group members are most apt to propose solutions immediately."

-- Robyn Dawes

(Dawes, R. M. 1988. Rational Choice in an Uncertain World. San Diego, CA: Harcourt Brace Jovanovich.)

Page 30

What kind of AI do we want to see?

Much easier to describe AIs we don't want to see...

Page 31

"Friendly AI"...

(the challenge of creating an AI that, e.g., cures cancer, rather than wiping out humanity)

...looks possible but very difficult.

Page 32

The intelligence explosion: enough power to... make the world a better place?

Page 33

Someday, the human species has to grow up. Why not sooner rather than later?

Page 34

In a hundred million years, no one's going to care who won the World Series, but they'll remember the first AI.

Page 35

For more information, please visit the Singularity Institute at http://singinst.org/