

Non Deterministic Space

Avi Ben Ari

Lior Friedman

Adapted from Dr. Eli Porat's Lectures

Bar-Ilan University

Agenda

• Definition of NSPACE, NSPACEOn, NSPACEOff, NL, NL-Complete, the CONN problem.

• We will prove that: NSPACEOn(O(S)) ⊆ NTIME(2^O(S))

• We will prove that: NSPACEOn(O(S)) ⊆ NSPACEOff(O(log(S)))

• We will prove that: NSPACEOff(O(S)) ⊆ NSPACEOn(2^O(S))

Agenda (cont)

• We will prove that: DSPACE(O(S)) ⊆ NSPACE(O(S))

• We will prove that: NSPACE(O(S)) ⊆ DSPACE(O(S²))

• We will prove that: CONN is NL-Complete

NSPACE Definition

• For any function S: ℕ → ℕ:

– NSPACE(S) = { L ⊆ Σ* | there exists an NDTM M such that for every input x: x ∈ L ⟺ M(x) accepts, and M uses at most S(|x|) space }.

Where an NDTM is a Turing machine with a non-deterministic transition function, having a work tape, a read-only input tape and a unidirectional write-only output tape. The machine is said to accept input x if there exists a computation ending in an accepting state.

NSPACE Explanation

• We use the term NSPACE(S) to measure the space complexity of a language.

• We talk about all the languages that have an NDTM that accepts the language, and does so without using too much space.

• The amount of space allowed is determined by S and depends, of course, on the length of the specific input word.

NSPACEOff Definition

• Offline TM - a Turing machine with a work tape, a read-only input tape, a two-way read-only guess tape and a unidirectional write-only output tape. The machine is said to accept input x if there exist contents for the guess tape (a guess string y) such that, when the machine starts working with x on the input tape and y on the guess tape, it eventually enters an accepting state.

• For S: ℕ → ℕ:

NSPACEOff(S) = { L ⊆ Σ* | there exists an offline TM M such that for every x ∈ Σ*: x ∈ L ⟺ ∃y ∈ Σ* M(x,y) accepts, and M(x,y) uses at most S(|x|) space }.

NSPACEOff Explanation

• This definition is based on the "guess" approach to non-deterministic Turing machines: we replace the non-deterministic transition function with a "guess", and we use this guess tape in the definition of NSPACEOff.

• Since the space the machine uses is what we measure, we limit the access to the guess tape to read-only, and by that we don't give "extra" space to the machine.

NSPACEOn Definition

• NSPACEOn(S):

• Online TM - a Turing machine with a work tape, a read-only input tape, a unidirectional read-only guess tape and a unidirectional write-only output tape. Again, the machine is said to accept input x if there exists a guess y such that the machine, working on (x,y), eventually enters an accepting state.

• For S: ℕ → ℕ:

NSPACEOn(S) = { L ⊆ Σ* | there exists an online TM M such that for every x ∈ Σ*: x ∈ L ⟺ ∃y ∈ Σ* M(x,y) accepts, and M(x,y) uses at most S(|x|) space }.

NSPACEOn Explanation

• By limiting the guess tape to be unidirectional, we get a process in which at each step the machine has to choose which way to go (similar to online decisions), and the only way to "remember" past decisions is by recording them on the work tape.

• We will show that the two definitions of NSPACEOn and NSPACEOff are different.

The relation between NSPACEOn and NSPACE

• Claim (1): there exists an NDTM M that identifies L in O(T) time and O(S) space ⟺ there exists an online TM M' that identifies L in O(T) time and O(S) space.

• Where:
– M is an NDTM.
– M' is an online TM.

Claim (1) - Proof

• Given M we can easily transform it to the online machine M' in the following way:
– M' simulates M; whenever M has several options in the transition function, M' decides which option to take according to the guess tape (and then moves to the following cell on the guess tape).

• We can limit the guess-tape alphabet to {0,1} only, since every multi-way decision can be composed of several two-way decisions (as sketched below).
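To make the last remark concrete, here is a minimal Python sketch (not from the slides) of how an online machine can resolve each non-deterministic choice of M by consuming bits from a unidirectional guess tape. The helpers step_options, is_accepting and the configuration representation are hypothetical stand-ins for M's transition function.

# Sketch only: resolving non-deterministic choices with a binary guess tape.
# `step_options(config)` returns the possible next configurations of M and
# `is_accepting(config)` tests for an accepting configuration (both assumed).

def online_simulation(initial_config, guess_bits, step_options, is_accepting,
                      max_steps=10**6):
    config = initial_config
    bits = iter(guess_bits)             # 0/1 values, each read exactly once
    for _ in range(max_steps):          # bound the run for the sketch
        if is_accepting(config):
            return True
        options = step_options(config)  # the non-deterministic successors of M
        if not options:
            return False                # M is stuck: this guess fails
        # Binary guesses suffice: each bit halves the remaining set of options.
        while len(options) > 1:
            half = (len(options) + 1) // 2
            options = options[:half] if next(bits, 0) == 0 else options[half:]
        config = options[0]
    return False

M accepts x exactly when some bit string drives this simulation to an accepting configuration, which is the sense in which the guess tape replaces the non-deterministic transition function.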

Claim (1) – Proof (Cont)

• Given the online M’ we transform it to M :• M states = ,

– S(M’) are the states of M’.

• the transition function of M:– M is in state (s,a) and reads b,c on the work,input tapestranslate to

M’ is in state (s), reading a,b,c on the guess,work and input tapes.

• In this case we know the new state (s’) of M’ and :– If the guess head doesn’t move M new state is (s’,a).– If the guess head moves M new state is chosen non

deterministically from (s’,e) where e can be every symbol in the alphabet

xMS )'(

Explanation

• The conclusion from Claim (1) is that the families NSPACE and NSPACEOn are actually the same,

• since for any problem in NSPACE we can build an online machine that uses the same space (and time), and vice versa.

• This fact is very important, and we will use it to prove the following theorem.

Theorem 1

• Theorem (1): NSPACEOn(S) ⊆ NTIME(2^O(S)), for S(|x|) ≥ log(|x|).

• Definition - configuration:
– A configuration is a full description of a state in the computation of a TM on a given input.
– On an NDTM it is defined as:

• The content of the work tape.
• The position of the head on the work tape.
• The position of the head on the input tape.
• The state of the machine.

Theorem 1(Cont)

• Next we want to bound the number of different configurations, #Conf(M,x).

• We know that:

#Conf(M,x) ≤ |Γ|^S(|x|) · S(|x|) · |x| · S(M)

– where:
• |Γ|^S(|x|) is the number of possibilities for the work-tape content.
• S(|x|) is the maximal length of the work tape (positions of the work head).
• |x| is the length of the input tape (positions of the input head).
• S(M) is the number of states M has.

• Therefore #Conf(M,x) ≤ 2^O(S(|x|)) · |x|, and since S(|x|) ≥ log(|x|) we conclude that #Conf(M,x) ≤ 2^O(S(|x|)).

Theorem 1(Cont)

• The last thing we need to show is the following Claim (2):
– If there exists an accepting computation of an NDTM M on input x, then there exists such a computation in which no configuration appears more than once.

• Because #Conf(M,x) ≤ 2^O(S(|x|)),

• and in each step we move to a different configuration,

• the number of steps (and hence the running time) is at most 2^O(S(|x|)) ■

Claim (2) - Proof

• Let c0,c1,…,cn be a description of an accepting computation.

• Suppose ck = cl for some 0 ≤ k < l ≤ n. Then c0,c1,…,ck,c(l+1),…,cn is also an accepting computation.

• To prove that we will show:

1. The first configuration c0 is an initial state of M.

2. Every configuration ci is followed by a configuration ci+1 that is possible, in the sense that M may move in one step from ci to ci+1.

3. The last configuration cn is in an accepting state of M.

Claim (2) – Proof(Cont)

• Clearly properties 1 and 3 do not change by the removal of the specified configurations.

• Property 2 still holds, since cl+1 is possible after cl = ck and is therefore possible after ck.

• We can iterate through this process to achieve a computation with no identical configurations.
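The iteration described in the last bullet can be pictured with the following Python sketch (an illustration, not part of the slides); configurations are modelled as arbitrary hashable values.

def remove_repetitions(computation):
    """Return an accepting computation in which no configuration repeats."""
    seen = {}        # configuration -> index of its first occurrence so far
    result = []
    for conf in computation:
        if conf in seen:
            # ck == cl: keep c0..ck once and continue from c_{l+1}.
            result = result[: seen[conf] + 1]
            seen = {c: i for i, c in enumerate(result)}
        else:
            seen[conf] = len(result)
            result.append(conf)
    return result

# Example: a computation that revisits configuration 'B'.
print(remove_repetitions(["A", "B", "C", "B", "D"]))   # ['A', 'B', 'D']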

Theorem 1 - Explanation

• To conclude the proof of Theorem (1), we combine Claim (2) with the fact that the number of possible different configurations is bounded, and therefore the running time of machine M is bounded.

• By using claim (1) we know that each NDTM M has an equivalent online machine M’ that uses the same space and therefore we have proven the theorem.

The relation between NSPACEOn and NSPACEOff

• Theorem (2): NSPACEOn(S) ⊆ NSPACEOff(O(log(S))), where S: ℕ → ℕ is at least logarithmic.

• Proof:
– Let Mon be an online TM that accepts L using S space.
– We define Moff to do the following:

Theorem (2) Proof

1. Moff will 'guess' the computation of Mon on the given input x, written on the guess tape as a sequence of configuration blocks.

2. Moff verifies the guessed computation.

• Verification of a computation:

– As in Claim (2), we show that properties 1-3 hold.

– To verify properties 1 and 3 we only need to look at the first and last blocks and make sure that they stand for an initial and an accepting state of Mon. This can be done in O(1) space.

Theorem (2) Proof (Cont)

• To verify property 2 we need to look at two consecutive blocks and check:
– The content of the work tape is the same, except perhaps the cell on which Mon's head was.

– The positions of the heads of both the input tape and the work tape have not moved by more than one cell.

– The new state and the positions of the heads of Mon are possible under the transition function of Mon.

• This can be done using a fixed number of counters.
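A minimal sketch of this property-2 check for one pair of guessed configurations, assuming (this is not from the slides) that a configuration is modelled as (state, work_tape, work_pos, input_pos) and that legal_transition is a hypothetical predicate encoding Mon's transition table. For clarity the sketch compares whole configurations held in memory; the offline machine performs the same comparisons with a fixed number of counters while scanning the guess tape.

def consistent(c1, c2, legal_transition):
    """Can Mon move from configuration c1 to configuration c2 in one step?"""
    state1, tape1, wpos1, ipos1 = c1
    state2, tape2, wpos2, ipos2 = c2
    # Each head may move by at most one cell.
    if abs(wpos2 - wpos1) > 1 or abs(ipos2 - ipos1) > 1:
        return False
    # The work tape is unchanged except, perhaps, under the old head position.
    if any(a != b for i, (a, b) in enumerate(zip(tape1, tape2)) if i != wpos1):
        return False
    # The new state, the written symbol and the head moves must be allowed
    # by Mon's transition function (abstracted into legal_transition).
    return legal_transition(state1, tape1[wpos1], state2, tape2[wpos1],
                            wpos2 - wpos1, ipos2 - ipos1)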

Theorem (2) Proof (Cont)

• A counter counting up to Z needs log(Z) space.

• The counters we use count up to Max(|x|, |configuration|).

• A configuration consists of:
– The content of the work tape – O(S(|x|)).
– The location of the work head – log(O(S(|x|))).
– The state of the machine – O(1).
– The location of the input head – O(log(|x|)).

• Since S is at least log(|x|),

• |configuration| = O(S(|x|)),

• so for a single counter we need only log(S(|x|)) space ■

Theorem (2) Proof-Explanation

• We showed that we can build an offline machine that guesses and verifies the computation of an NDTM using only log(O(S(|x|))) space.

• The only thing we omitted is that the guess length is bounded by the number of different configurations the machine has.

• This is true, as we showed in Claim (2): if M accepts x, there is an accepting computation composed of distinct configurations.

• And we also showed a bound on this number.

The relation between NSPACEOff and NSPACEOn

• Theorem (3): NSPACEOff(S) ⊆ NSPACEOn(2^O(S)), where S: ℕ → ℕ is at least logarithmic.

• Claim (3):
– The number of times during an accepting computation of Moff in which the guess-tape head visits a specified cell is at most #Conf(M,x) ≤ 2^O(S).

Claim (3) - Proof

• Definition: a CWG is a configuration of an offline machine without the guess-tape content and the guess-head location.

• Again, the combinatorial analysis shows that #CWG ≤ #Conf(M,x) ≤ 2^O(S).

• If Moff visits the same guess cell twice with the same CWG, the entire computational state is the same.

• Since Moff's transition function is deterministic, Moff is then in an infinite loop ■

Theorem 3 (cont)

• Claim (4):
– If there exists a y such that Moff(x,y) stops (accepts),
– then there exists such a y with |y| ≤ #Conf(M,x)^#Conf(M,x) ≤ 2^(2^O(S(|x|))).

• Proof:
– Let us look at the guess-tape cells c0,c1,…,cn and their contents g0,g1,…,gn, where n = |y|.
– A cell ci may have been visited more than once in the computation on x, but each time with a different CWG (by Claim (3)).

Claim (4) - Proof

• We will call the sequence of these CWGs the visiting sequence of ci.

• Suppose that for some k < l, gk = gl and ck, cl have the same visiting sequence.

• Then g0,g1,…,gk,g(l+1),…,gn is also a guess which will cause Moff to accept x.

• To see that, we just need to follow the computation.

• Every time the computation would move from ck to ck+1, we instead 'jump' to cl and continue from there.

• By repeating this we can get a shorter guess tape with no two such cells (see the sketch below).
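The repeated cutting can be pictured with the following Python sketch (an illustration only, not from the slides), under the simplifying assumption that the visiting sequence of each guess cell in the original accepting computation is given as a hashable value; the sketch shows just the bookkeeping of removing the span between two matching cells.

def shorten_guess(guess, visiting_sequences):
    """Cut out spans between guess cells with equal symbol and visiting sequence."""
    seen = {}                  # (symbol, visiting sequence) -> index kept so far
    short_g, short_vs = [], []
    for g, vs in zip(guess, visiting_sequences):
        key = (g, vs)
        if key in seen:
            # gk == gl with the same visiting sequence:
            # keep g0..gk and continue from g_{l+1}.
            cut = seen[key] + 1
            short_g, short_vs = short_g[:cut], short_vs[:cut]
            seen = {(a, b): i for i, (a, b) in enumerate(zip(short_g, short_vs))}
        else:
            seen[key] = len(short_g)
            short_g.append(g)
            short_vs.append(vs)
    return "".join(short_g)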

Claim (4) – Proof (Cont)

• There are #Conf(M,x) CWGs.

• There are #Conf(M,x)^n visiting sequences of n CWGs.

• Each visiting sequence is at most of length #Conf(M,x).

• So over all we have at most

Σ_{i=1}^{#Conf(M,x)} #Conf(M,x)^i ≤ #Conf(M,x)^(#Conf(M,x)+1) ≤ 2^(2^O(S(|x|)))

different visiting sequences, which bounds the length of a guess with no two such cells ■

Theorem 3 - Explanation

• As we can see, the bound of Claim (4) is not by itself enough to prove Theorem (3).

• We prove the theorem as we proved Theorem (2).

• The online machine will 'guess' the computation of the offline machine.

• Instead of representing the computation using configurations, we will use visiting sequences.

• As in Theorem (2), we only need to check whether these 'guessed' sequences form an accepting computation.

Theorem 3 – Proof(Cont)

• Definitions –

– Given a computation of Moff and a guess cell ci, we define the Directed Visiting Sequence (DVS) of ci as:

• The content of the guess cell ci.

• The visiting sequence of ci.

• For every CWG in the visiting sequence, the direction from which the guess head arrived at the cell (R, L or S for Right, Left or Stay).

Theorem 3 – Proof(Cont)

• We also define:

• A DVS has the reasonable returning direction property if: whenever, according to the CWG and the cell content, the guess head should move right, then the direction associated with the next CWG is Left (and vice versa).

• An ordered pair of DVSs is called locally consistent if they appear as if they may be consecutive in a computation (according to Moff's transition function). In addition, both DVSs must be entered from the left and have the reasonable returning property.

Theorem 3 – Proof(Cont)

• To verify that a string is a legal computation we only need to make sure that:

– The CWG of the first DVS describes an initial configuration of Moff.

– Every two consecutive DVSs are locally consistent.

– In some DVS, the last CWG describes an accepting configuration.

– In the last (rightmost) DVS there is no CWG according to which the guess head should move to the right.

Theorem 3 – Proof

• To prove the theorem we build Mon to:
1. Guess a sequence of DVSs.
2. Verify the guess.

• To do that, Mon only needs to copy to its work tape two DVSs at a time.

• As seen in Claim (4), the length of a visiting sequence is bounded by #Conf(M,x) CWGs, each taking O(S(|x|)) space.

• We conclude that the space needed for a single DVS is bounded by 2^O(S) ■

The relation between DSPACE and NSPACE

• Lemma 4: DSPACE(O(S)) ⊆ NSPACE(O(S)).

• Proof: deterministic machines are in particular degenerate non-deterministic machines ■

Lemma 4 - Explanation

• Suppose there exists a deterministic Turing machine that accepts L in space O(S). The same machine can be viewed as a degenerate non-deterministic Turing machine, and thus it accepts L within the same space as well.

Theorem 5 (Savitch's theorem)

• For every space-constructible function S which is at least logarithmic: NSPACE(O(S)) ⊆ DSPACE(O(S²)).

• We will show that for any non-deterministic machine Mn that accepts L in space S, there is a deterministic machine M that accepts L in space S².

• First, we define M's configuration graph over x: given a machine M which works in space S and an input string x, M's configuration graph over x (Gmx) is the directed graph in which the set of vertices is all the possible configurations of M (with input x), and there exists a directed edge from s1 to s2 iff it is possible for M, being in configuration s1, to change its configuration to s2 in one step.

Proof 5 (Savitch’s theorem)

• We make a few observations:

• It is clear that M is deterministic iff the out-degree of every vertex of the above graph is one.

• The question of whether there exists a computation of M that accepts x can be phrased in graph terminology as "is there a directed path from startm to acceptm?" (startm is the node that represents the initial configuration; acceptm is the node that represents the accepting configuration).

Theorem 5 – Proof(Cont)

• If there exists a path between two nodes in the graph, then there exists a simple one (a path that revisits a configuration corresponds to a computation with a loop, which can be cut out).

• If there exists a path from startm to acceptm, then there is one of length at most |VM| (= the number of vertices).

Lemma 6

• Given a graph G = (V, E), two vertices s, t ∈ V and a number m, such that the question of whether there exists an edge between two given vertices can be answered in O(S) space, the question "is there a path of length at most m from s to t?" can be answered in space O(S · log(m)).

• Proof idea: if there is a path from s to t of length at most m, then either there is an edge from s to t, or there is a vertex u s.t. there is a path from s to u of length at most ⌈m/2⌉ and a path from u to t of length at most ⌈m/2⌉. It is easy to implement a recursive procedure PATH(a, b, m) that answers the question (see next slide).

Lemma 6 - Proof

boolean PATH(a, b, m)
1.  if a = b or there is an edge from a to b then return TRUE
2.  if m ≤ 1 then return FALSE
3.  for every vertex v ∈ V
4.      if PATH(a, v, ⌈m/2⌉) and PATH(v, b, ⌈m/2⌉) then return TRUE
5.  return FALSE
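Here is a runnable Python version of the PATH procedure (an illustration, not from the slides). The edge test is abstracted into has_edge(a, b), which the lemma assumes can be decided in O(S) space; the tiny graph at the bottom is only a usage example.

def path(a, b, m, vertices, has_edge):
    """Is there a directed path of length at most m from a to b?"""
    if a == b or has_edge(a, b):
        return True
    if m <= 1:
        return False
    # A path of length <= m exists iff some midpoint u splits it into
    # two halves, each of length <= ceil(m/2).
    half = (m + 1) // 2
    return any(path(a, u, half, vertices, has_edge) and
               path(u, b, half, vertices, has_edge)
               for u in vertices)

# Usage example on a 4-vertex directed graph.
edges = {(0, 1), (1, 2), (2, 3)}
has_edge = lambda a, b: (a, b) in edges
print(path(0, 3, 4, range(4), has_edge))   # True
print(path(3, 0, 4, range(4), has_edge))   # False

The recursion depth is O(log m) and each level only needs to remember its own a, b, m and loop variable, which is exactly the O(S · log(m)) space analysis that follows.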

Lemma 6 - Proof (Cont)

• How much space does PATH(a, b, m) use?

• We denote by W(m) the space used by PATH when invoked with parameter m.

• In each invocation with parameter m, PATH uses O(S) space to store a, b and m, to check whether there is an edge from a to b, and to handle the for-loop control variable.

• Recursively, PATH is invoked twice with parameter ⌈m/2⌉.

• But we can reuse the same space for both recursive calls.

Lemma 6 - Proof (Cont)

• We get the recursive formula:
– W(m) = O(S) + W(⌈m/2⌉)
– W(1) = O(S)

• The solution is W(m) = O(S · log(m)).

Theorem 5 – Proof(Cont)

• Now we return to Savitch's theorem:

• We apply Lemma 6 by asking "is there a path from startm to acceptm in Gmx?"

• The deterministic machine M gets x as input and builds Gmx on the fly, i.e., it builds and keeps in memory only the parts it needs for the operation it performs.

Theorem 5 – Proof(Cont)

• This can be done since the vertices of Gmx are configurations of Mn, and there is an edge from v to u iff it is possible for Mn, being in configuration v, to change to configuration u. That can easily be checked by looking at the transition function of Mn.

• Therefore, if Mn works in O(S) space, then in Gmx we need O(S) space to store a vertex (i.e., a configuration) and log(O(S)) space to check whether there is an edge between two stored vertices.

• All that is left is to apply Lemma 6 ■


NL Definition

• The complexity class Non-Deterministic Logarithmic space (NL) is defined as NSPACE(O(log(n))).

• More formally: a language L ∈ NL if there is a non-deterministic Turing machine M that accepts L and a function f(n) = O(log(n)) such that for every input x and for every computation of M, at most f(|x|) different work-tape cells are used.

NL-Complete Definition

• L is NL-Complete if:
1) L ∈ NL; and
2) ∀ L' ∈ NL, L' is log-space reducible to L.

• The problem graph connectivity (conn) is defined to be the problem where the input is G = (V, E) (a directed graph) and v, u ∈ V,

• and the answer is whether there is a directed path from v to u in G.

The Complete problem: graph connectivity

• Theorem 7: conn is NL-Complete.

• Proof:
1) First we prove the first condition, i.e., conn ∈ NL. We present the algorithm itself (see next slide).

1.  x ← v (given in the input)
2.  counter ← |V|
3.  repeat
4.      counter ← counter − 1
5.      guess a node y ∈ V s.t. (x, y) ∈ E
6.      if y ≠ u then x ← y
7.  until y = u or counter = 0
8.  if y = u then Accept, else Reject
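The sketch below (not from the slides) simulates one possible run of this algorithm in Python, replacing the non-deterministic "guess" with a random choice among candidate successors. The machine itself stores only x, y and counter, which is logarithmic space; the successor list here is built just for convenience, and a bounded number of random runs may of course miss an existing path, whereas the non-deterministic machine accepts iff some run does.

import random

def conn_one_run(V, E, v, u):
    """Simulate a single run of the non-deterministic conn algorithm."""
    x = v
    counter = len(V)
    while counter > 0 and x != u:
        counter -= 1
        successors = [y for y in V if (x, y) in E]   # candidates for the guessed y
        if not successors:
            return False                             # no legal guess: reject
        x = random.choice(successors)                # the guessed node y
    return x == u

# Usage example on a small directed graph.
V = [0, 1, 2, 3]
E = {(0, 1), (1, 2), (2, 3)}
print(any(conn_one_run(V, E, 0, 3) for _ in range(100)))   # True: a path exists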

Theorem 7 – Proof(Cont)


• A machine that implements the previous algorithm requires only logarithmic work-tape space (in the size of the input) for its variables (x, counter, y); i.e., conn ∈ NL.

2) Now we need to show that every language L ∈ NL is log-space reducible to conn:

• Let L be a language in NL.

• There is a non-deterministic log-space machine M that decides L.

• For every input x of machine M we can build, in log space, an input for conn (a function of M and x) such that there is a path from v to u in the constructed graph G iff M accepts input x.

Theorem 7 – Proof(Cont)

• The graph is constructed as follows:

– Input: an input string x (the machine M is fixed).

– Output: a graph G = (V, E) and two nodes u, v such that there is a path from v to u in G iff x is accepted by M.

Theorem 7 – Proof(Cont)

1.  compute n, the number of different configurations of M while computing on input x
2.  for i = 1 to n
3.      for j = 1 to n
4.          if there is a transition from configuration number i to configuration number j,
            output 1; otherwise output 0
5.  output 1 and n
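A minimal Python sketch of this output stage (an illustration, not from the slides). The helpers configurations(x) and has_transition(c1, c2) are hypothetical stand-ins for enumerating M's configurations on x and for checking a single move of M; for readability the sketch materialises the configuration list, whereas the log-space reduction would enumerate configurations by their index instead of storing them.

def output_configuration_graph(x, configurations, has_transition):
    """Print the adjacency matrix of M's configuration graph on input x."""
    confs = configurations(x)        # configurations numbered 1..n
    n = len(confs)
    for i in range(n):
        for j in range(n):
            # Entry (i, j) is 1 iff M can move from configuration i to j.
            print(1 if has_transition(confs[i], confs[j]) else 0, end="")
        print()
    print(1, n)                      # finally, the two designated vertices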
