Church, Kolmogorov and von Neumann: Their Legacy Lives in Complexity


Lance Fortnow

NEC Laboratories America

1903 – A Year to Remember


Kolmogorov Church von Neumann

Andrey Nikolaevich Kolmogorov

Born: April 25, 1903, Tambov, Russia

Died: Oct. 20, 1987

Alonzo Church

Born: June 14, 1903, Washington, DC

Died: August 11, 1995

John von Neumann

Born: Dec. 28, 1903, Budapest, Hungary

Died: Feb. 8, 1957

Frank Plumpton Ramsey

Born: Feb. 22, 1903, Cambridge, England

Died: January 19, 1930

Founder of Ramsey Theory

Ramsey Theory

Ramsey's Theorem: for every k there is an n such that every 2-coloring of the edges of the complete graph on n vertices contains a monochromatic complete subgraph on k vertices.

Applications of Ramsey Theory

Logic, Concrete Complexity, Complexity Classes, Parallelism, Algorithms, Computational Geometry

John von Neumann

Quantum Logic, Game Theory, Ergodic Theory, Hydrodynamics, Cellular Automata, Computers

The Minimax Theorem (1928)

Every finite zero-sum two-person game has optimal mixed strategies.

Let A be the payoff matrix for a player.

max_X min_Y X^T A Y = min_Y max_X X^T A Y
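As a quick illustration of the theorem, here is a minimal Python sketch on the classic matching-pennies game (the matrix `A`, the helper `expected`, and the grid resolution are all choices made for this example, not part of the talk):

```python
# Matching pennies: payoff matrix A for the row player.
A = [[1, -1], [-1, 1]]

def expected(p, q):
    """Expected row payoff when row plays action 0 w.p. p and column w.p. q."""
    return (p * q * A[0][0] + p * (1 - q) * A[0][1]
            + (1 - p) * q * A[1][0] + (1 - p) * (1 - q) * A[1][1])

grid = [i / 100 for i in range(101)]  # mixed strategies in steps of 0.01
maximin = max(min(expected(p, q) for q in grid) for p in grid)
minimax = min(max(expected(p, q) for p in grid) for q in grid)
# The minimax theorem says these coincide; for matching pennies the value is 0,
# achieved by both players mixing 50/50.
```

The grid search is crude but enough to see both optimizations meet at the game's value.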

The Yao Principle (Yao, 1977)

The worst-case expected running time of the best randomized algorithm equals the expected running time of the best deterministic algorithm on the worst-case distribution of inputs.

Invaluable for proving limitations of probabilistic algorithms.

Making a Biased Coin Unbiased

Given a coin with an unknown bias p, how do we get an unbiased coin flip?

Flip the coin twice:

HT (probability p(1-p)): output HEADS.

TH (probability (1-p)p): output TAILS.

HH or TT: flip again.

Since p(1-p) = (1-p)p, HEADS and TAILS occur with equal probability.
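The trick can be sketched in a few lines of Python; `biased_bit` here is a hypothetical stand-in for the unknown coin:

```python
import random

def von_neumann_unbiased(biased_bit):
    """Turn a coin with any fixed unknown bias 0 < p < 1 into a fair bit.

    Flip twice: HT -> 1, TH -> 0; on HH or TT, try again.
    """
    while True:
        a, b = biased_bit(), biased_bit()
        if a != b:
            return a  # P(a=1, b=0) = p(1-p) = (1-p)p = P(a=0, b=1)

# Usage: even a heavily biased coin yields roughly fair output bits.
random.seed(0)
coin = lambda: 1 if random.random() < 0.9 else 0
flips = [von_neumann_unbiased(coin) for _ in range(10000)]
frac_ones = sum(flips) / len(flips)
```

Each pair of flips succeeds with probability 2p(1-p), so the expected number of biased flips per fair bit is 1/(p(1-p)).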

Weak Random Sources

Von Neumann’s coin flipping trick (1951) was the first to get true randomness from a weak random source.

Much research in TCS in the 1980s and '90s went into handling weaker, dependent sources.

Led to development of extractors and connections to pseudorandom generators.

Alonzo Church

Lambda Calculus

Church's Theorem: there is no decision procedure for arithmetic.

Church-Turing Thesis: everything that is computable is computable by the lambda calculus.

The Lambda Calculus

Alonzo Church, 1930s. A simple way to define and manipulate functions. Has full computational power. Basis of functional programming languages like Lisp, Haskell, ML.
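As a small illustration of that computational power, Church numerals (the standard lambda-calculus encoding of the natural numbers) can be written directly with Python lambdas; the names `zero`, `succ`, `plus`, and `to_int` are conveniences for this sketch:

```python
# Church numerals: the number n is the function that applies f to x n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
plus = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

# Decode a Church numeral back to a Python int by counting applications.
to_int = lambda n: n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
five = plus(two)(three)
```

Addition here is pure function composition: `plus(m)(n)` applies `f` m times on top of n applications, with no arithmetic anywhere.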

Lambda Terms

x

xy

λx.xx (function mapping x to xx)

λxy.yx (really λx.(λy.(yx)))

λxyz.yzx (λuv.vu)

Basic Rules

α-conversion: λx.xx is equivalent to λy.yy

β-reduction: (λx.xx)(z) is equivalent to zz

Some rules for appropriate restrictions on name clashes: (λx.(λy.yx))y should not be the same as λy.yy

Normal Forms

A λ-expression is in normal form if one cannot apply any β-reductions.

Church-Rosser Theorem (1936): If a λ-expression M reduces to both A and B, then there must be a C such that A reduces to C and B reduces to C.

Hence if M reduces to A and B with A and B in normal form, then A = B.

Power of the λ-Calculus

Church (1936) showed that it is impossible in the λ-calculus to decide whether a term M has a normal form.

Church's Thesis, Expressed as a Definition: An effectively calculable function of the positive integers is a λ-definable function of the positive integers.

Computational Power

Kleene-Church (1936): Computing normal forms is equivalent in power to the recursive functions and to Turing machines.

Church-Turing Thesis: Everything computable is computable by a Turing machine.

Andrei Nikolaevich Kolmogorov

Measure Theory, Probability, Analysis, Intuitionistic Logic, Cohomology, Dynamical Systems, Hydrodynamics

Kolmogorov Complexity

A way to measure the amount of information in a string by the size of the smallest program generating that string.

K(x) = min { |p| : U(p) = x }
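K(x) itself is uncomputable, but any compressor gives an upper bound on description length, which makes the idea concrete. A minimal sketch using `zlib` (the specific test strings are arbitrary choices for this example):

```python
import zlib

def compressed_length(s: bytes) -> int:
    """Upper bound on description length via zlib; K(x) itself is uncomputable."""
    return len(zlib.compress(s, 9))

len_simple = compressed_length(b"0" * 64)  # highly regular string
len_random = compressed_length(bytes((i * 97 + 13) % 251 for i in range(64)))
# The regular string compresses far below its length; the irregular one does not.
```

The gap between the two lengths is a rough proxy for the information-content gap the definition captures.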

Incompressibility Method

For all n there is an x with |x| = n and K(x) ≥ n; such x are called random.

Used to prove lower bounds on various combinatorial and computational objects: assume there is no lower bound, choose a random x, and get a contradiction by giving a short program for x.
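The counting argument behind "random strings exist" is simple enough to check directly; here is a sketch for one value of n:

```python
# There are 2**n binary strings of length n, but only 2**n - 1 candidate
# programs (binary strings) of length strictly less than n.
n = 16
num_strings = 2 ** n
num_short_programs = sum(2 ** k for k in range(n))  # lengths 0 .. n-1
# Since num_short_programs < num_strings, some string x of length n has
# no program shorter than n, i.e. K(x) >= n.
```

The geometric sum always comes out to exactly one less than 2**n, so at least one length-n string is incompressible.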

Incompressibility Method

Ramsey Theory/Combinatorics, Oracles, Turing Machine Complexity, Number Theory, Circuit Complexity, Communication Complexity, Average-Case Lower Bounds

Complexity Uses of K-Complexity

Li-Vitanyi '92: for universal distributions, average case = worst case.

Instance complexity, universal search, time-bounded universal distributions, Kolmogorov characterizations of computational complexity classes.

Rest of This Talk

Measuring sizes of sets using Kolmogorov Complexity

Computational Depth to measure the amount of useful information in a string.

Measuring Sizes of Sets

How can we use Kolmogorov complexity to measure the size of a set?

[Diagram: A^n, the set of strings of length n belonging to A, drawn inside the space of all strings of length n.]

The string in A^n of highest Kolmogorov complexity tells us about |A^n|.

Measuring Sizes of Sets

There must be a string x in A^n such that K(x) ≥ log |A^n|.

Simple counting argument: otherwise there are not enough programs for all the elements of A^n.

Measuring Sizes of Sets

If A is computable, or even computably enumerable, then every string x in A^n has K(x) ≤ log |A^n| + O(log n).

Describe x by A, n, and the index of x in an enumeration of the strings of A^n.

Measuring Sizes of Sets

If A is computably enumerable then

max_{x in A^n} K(x | A) = log |A^n| + O(log n)

Measuring Sizes of Sets in P

What if A is efficiently computable?

Do we have a clean way to characterize the size of A using time-bounded Kolmogorov complexity?


Time-Bounded Complexity

Idea: A short description is only useful if we can reconstruct the string in a reasonable amount of time.

K^t(x) = min { |p| : U(p) = x in at most t(|x|) steps }

Measuring Sizes of Sets in P

It is still the case that some element x in A^n has K^poly(x) ≥ log |A^n|.

It is very possible that there are small A containing an x with K^poly(x) quite large.

Measuring Sizes of Sets in P

Might be easier to recognize elements in A than generate them.


Distinguishing Complexity

Instead of generating the string, we just need to distinguish it from other strings.

KD^t(x) = min { |p| : U(p, x) = 1, U(p, y) = 0 for all y ≠ x, and U(p, y) uses at most t(|x| + |y|) time }

Measuring Sizes of Sets in P

Ideally we would like

max_{x in A^n} KD^poly(x) = log |A^n| + O(log n)

This is true if P = NP. Problem: we need to distinguish all pairs of elements in A^n.

Measuring Sizes of Sets in P

Intuitively we need

max_{x in A^n} KD^poly(x) ≥ 2 log |A^n| - O(log n)

Buhrman-Laplante-Miltersen (2000) prove this lower bound in a black-box model.

Measuring Sizes of Sets in P

Buhrman-Fortnow-Laplante (2002) show

max_{x in A^n} KD^poly(x) ≤ 2 log |A^n| + O(log n)

So we have a rough approximation of the size:

log |A^n| ≤ max_{x in A^n} KD^poly(x) ≤ 2 log |A^n| + O(log n)

Measuring Sizes of Sets in P

Sipser 1983: allowing randomness gives a cleaner connection. For a random string r,

max_{x in A^n} KD^poly(x | r) = log |A^n| + O(log n)

Sipser used this and similar results to show how to simulate randomness by alternation.

Useful Information

Simple strings convey a small amount of information: 00000000000000000000000000000000

Random strings have lots of information: 00100011100010001010101011100010

Random strings are not that useful, because we can generate random strings easily.

Logical Depth

Chaitin '87/Bennett '97: roughly the amount of time needed to produce a string x from a program p whose length is close to the length of the shortest program for x.

Computational Depth

Antunes, Fortnow, Variyam and van Melkebeek 2001

Use the difference of two Kolmogorov measures.

Depth^t(x) = K^t(x) - K(x)

Closely related to the "randomness deficiency" notion of Levin (1984).

Applications

Shallow sets: generalizes random and sparse sets with similar computational power.

L is "easy on average" iff the time required is exponential in the depth.

Can easily find a satisfying assignment if many such assignments have low depth.

1903 – A Year of Geniuses

Several great men who helped create the fundamentals of computer science and set the stage for computational complexity.

2012 - The Next Celebration

Alan Turing

Born: June 23, 1912, London, England

Died: June 7, 1954