8/13/2019 Introduction to AI and the perceptron
CS344: Introduction to Artificial Intelligence (associated lab: CS386)
Pushpak Bhattacharyya, CSE Dept., IIT Bombay
From 29 Jan, 2014
(Lecture 7 was by Vinita and Rahul on Sentiment and Deep Learning)
AI Perspective (post-web)
[Figure: Search, Reasoning and Learning at the core, surrounded by the application areas Planning, Computer Vision, NLP, Expert Systems, Robotics and IR]
Search: Everywhere
Planning: (a) which block to pick, (b) which to stack, (c) which to unstack, (d) whether to stack a block, or (e) whether to unstack an already stacked block. These options have to be searched in order to arrive at the right sequence of actions.
[Figure: blocks-world instance with blocks A, B, C on a table]
Vision: a search needs to be carried out to find which point in the image of L corresponds to which point in R. Naively carried out, this can become an O(n^2) process, where n is the number of points in the retinal images.
[Figure: two-eye system viewing the world through left (L) and right (R) retinal images]
Robot Path Planning: searching among the options of moving Left, Right, Up or Down. Additionally, each movement has an associated cost representing the relative difficulty of that movement. The search then has to find the optimal, i.e., the least-cost, path.
[Figure: robot navigating a path around obstacles O1 and O2]
Natural Language Processing: search among many combinations of parts of speech on the way to deciphering the meaning. This applies to every level of processing: syntax, semantics, pragmatics and discourse.
The man would like to play.
(POS ambiguity: "man" can be Noun or Verb, "like" can be Verb or Preposition, "play" can be Noun or Verb.)
Expert Systems: search among rules, many of which can apply to a situation:
IF the infection is primary-bacteremia
AND the site of the culture is one of the sterile sites
AND the suspected portal of entry is the gastrointestinal tract
THEN there is suggestive evidence (0.7) that the infection is bacteroid.
(from MYCIN)
Search building blocks
State space: graph of states (expresses the constraints and parameters of the problem)
Operators: transformations applied to the states
Start state: S0 (search starts from here)
Goal state: {G} (search terminates here)
Cost: effort involved in using an operator
Optimal path: least-cost path
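The building blocks above can be sketched as a small Python interface. This is an illustrative sketch, not from the lecture: the class and method names (`SearchProblem`, `successors`, the `NumberLine` example) are my own.

```python
# A minimal sketch of the search building blocks as a Python interface.
# Class and method names are illustrative, not from the lecture.
class SearchProblem:
    def start_state(self):
        raise NotImplementedError
    def is_goal(self, state):
        raise NotImplementedError
    def successors(self, state):
        """Yield (operator, next_state, cost) triples."""
        raise NotImplementedError

# Toy instance: walk from 0 to a goal integer on a number line, step cost 1.
class NumberLine(SearchProblem):
    def __init__(self, goal):
        self.goal = goal
    def start_state(self):
        return 0
    def is_goal(self, state):
        return state == self.goal
    def successors(self, state):
        yield ("R", state + 1, 1)   # operator, resulting state, cost
        yield ("L", state - 1, 1)
```

Every problem in the following slides (8-puzzle, Missionaries and Cannibals, the B/W tiles) can be cast into this shape: a start state, a goal test, and cost-labelled operators.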
Examples
Problem 1: 8-puzzle
[Figure: a start configuration S and the goal configuration G of the 8-puzzle]
Tile movement is represented as the movement of the blank space.
Operators:
L: blank moves left
R: blank moves right
U: blank moves up
D: blank moves down
C(L) = C(R) = C(U) = C(D) = 1
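The four unit-cost operators can be sketched as a successor generator. This is a hypothetical sketch (not from the slides): it assumes a 3x3 board stored as a tuple of 9 entries with 0 standing for the blank.

```python
# Hypothetical sketch of the blank-move operators L, R, U, D, each with
# unit cost. A board is a tuple of 9 entries in row-major order; 0 is
# the blank.
MOVES = {"L": -1, "R": +1, "U": -3, "D": +3}

def blank_moves(board):
    """Yield (operator, new_board, cost) for each legal blank move."""
    b = board.index(0)
    row, col = divmod(b, 3)
    for op, delta in MOVES.items():
        # Reject moves that would take the blank off the board.
        if op == "L" and col == 0: continue
        if op == "R" and col == 2: continue
        if op == "U" and row == 0: continue
        if op == "D" and row == 2: continue
        t = b + delta
        new = list(board)
        new[b], new[t] = new[t], new[b]   # swap blank with neighbouring tile
        yield op, tuple(new), 1           # C(L) = C(R) = C(U) = C(D) = 1
```

With the blank in a corner only two operators apply; in the centre all four do.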
Problem 2: Missionaries and Cannibals
Constraints:
- The boat can carry at most 2 people.
- On no bank should the cannibals outnumber the missionaries.
[Figure: missionaries and cannibals on the left bank L of a river, to be ferried by boat to the right bank R]
State: <#M, #C, P>
#M = number of missionaries on bank L
#C = number of cannibals on bank L
P = position of the boat
S0 = <3, 3, L>
G = <0, 0, R>
Operators:
M2 = two missionaries take the boat
M1 = one missionary takes the boat
C2 = two cannibals take the boat
C1 = one cannibal takes the boat
MC = one missionary and one cannibal take the boat
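The state test and operators can be sketched directly from the slide. This is a sketch under the stated assumptions (3 missionaries, 3 cannibals; states are `(#M_left, #C_left, boat_side)`); the function names `is_safe` and `apply_op` are my own.

```python
# Sketch of the Missionaries-and-Cannibals operators from the slide.
# A state is (#M on left bank, #C on left bank, boat side "L"/"R").
OPS = {"M2": (2, 0), "M1": (1, 0), "C2": (0, 2), "C1": (0, 1), "MC": (1, 1)}

def is_safe(m, c):
    # On no bank should the cannibals outnumber the missionaries
    # (a bank with no missionaries is safe by convention).
    left_ok = m == 0 or m >= c
    right_ok = (3 - m) == 0 or (3 - m) >= (3 - c)
    return 0 <= m <= 3 and 0 <= c <= 3 and left_ok and right_ok

def apply_op(state, op):
    """Return the successor state, or None if the move is illegal."""
    m, c, boat = state
    dm, dc = OPS[op]
    if boat == "L":                       # boat carries people L -> R
        m2, c2, boat2 = m - dm, c - dc, "R"
    else:                                 # boat returns R -> L
        m2, c2, boat2 = m + dm, c + dc, "L"
    return (m2, c2, boat2) if is_safe(m2, c2) else None
```

For example, `C2` from S0 = (3, 3, "L") is legal, while `M1` is not (it would leave 2 missionaries with 3 cannibals on the left bank).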
[Figure: partial search tree from S0, showing the C2 and MC moves]
Problem 3
Start: B B W W W B (tiles on a strip with one blank)
G: states where no B is to the left of any W
Operators:
1) A tile jumps over another tile into the blank space, with cost 2
2) A tile translates into an adjacent blank space, with cost 1
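The two operators can be sketched as a successor generator. This is a hypothetical sketch (names and string representation are my own): a configuration is a string of 'B', 'W' and one '_' for the blank, a slide costs 1 and a jump over one tile costs 2.

```python
# Hypothetical sketch of Problem 3's operators: a slide into the blank
# costs 1, a jump over one tile into the blank costs 2.
def successors(config):
    """Yield (new_config, cost) pairs; config is a string like 'BBWWWB_'."""
    b = config.index("_")
    for src, cost in ((b - 1, 1), (b + 1, 1), (b - 2, 2), (b + 2, 2)):
        if 0 <= src < len(config):
            new = list(config)
            new[b], new[src] = new[src], new[b]   # move tile into the blank
            yield "".join(new), cost

def is_goal(config):
    # Goal: no B to the left of any W. If some B precedes some W, an
    # adjacent "BW" pair must exist once the blank is removed.
    return "BW" not in config.replace("_", "")
```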
Algorithmics of Search
General Graph Search Algorithm
Graph G = (V, E)
[Figure: weighted graph with start S; successors A, B, C of S; D under A; E under B; F and G under D. Edge costs, recoverable from the trace: S-A = 1, S-B = 3, S-C = 10, A-D = 5, B-E = 4, D-F = 2, D-G = 3]
1) OL: S(nil, 0)                      CL: empty
2) OL: A(S,1), B(S,3), C(S,10)        CL: S
3) OL: B(S,3), C(S,10), D(A,6)        CL: S, A
4) OL: C(S,10), D(A,6), E(B,7)        CL: S, A, B
5) OL: D(A,6), E(B,7)                 CL: S, A, B, C
6) OL: E(B,7), F(D,8), G(D,9)         CL: S, A, B, C, D
7) OL: F(D,8), G(D,9)                 CL: S, A, B, C, D, E
8) OL: G(D,9)                         CL: S, A, B, C, D, E, F
9) OL: empty                          CL: S, A, B, C, D, E, F, G
Steps of GGS (Principles of AI, Nilsson)
1. Create a search graph G consisting solely of the start node S; put S on a list called OPEN.
2. Create a list called CLOSED that is initially empty.
3. LOOP: if OPEN is empty, exit with failure.
4. Select the first node on OPEN, remove it from OPEN and put it on CLOSED; call this node n.
5. If n is the goal node, exit with the solution obtained by tracing a path along the pointers from n to S in G. (Pointers are established in step 7.)
6. Expand node n, generating the set M of its successors that are not ancestors of n. Install these members of M as successors of n in G.
GGS steps (contd.)
7. Establish a pointer to n from those members of M that were not already in G (i.e., not already on either OPEN or CLOSED). Add these members of M to OPEN. For each member of M that was already on OPEN or CLOSED, decide whether or not to redirect its pointer to n. For each member of M already on CLOSED, decide for each of its descendants in G whether or not to redirect its pointer.
8. Reorder the list OPEN using some strategy.
9. Go to LOOP.
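The nine steps can be condensed into a Python sketch. This is a simplified sketch, not Nilsson's full algorithm: the recursive pointer redirection for CLOSED nodes in step 7 is omitted, and step 8's "some strategy" is supplied by the caller as a function `f(g_value, node)`. The graph is assumed to be a dict mapping each node to a `{successor: cost}` dict.

```python
# Condensed sketch of GGS steps 1-9 (pointer redirection for CLOSED
# nodes is omitted for brevity). graph: node -> {successor: cost}.
def ggs(graph, start, goal, f):
    open_list, closed = [start], set()        # steps 1-2
    g = {start: 0}
    parent = {start: None}
    while open_list:                          # step 3
        open_list.sort(key=lambda n: f(g[n], n))   # step 8 (reorder OPEN)
        n = open_list.pop(0)                  # step 4
        closed.add(n)
        if n == goal:                         # step 5: trace pointers back
            path = []
            while n is not None:
                path.append(n)
                n = parent[n]
            return path[::-1], g[goal]
        for m, cost in graph.get(n, {}).items():   # step 6
            if g[n] + cost < g.get(m, float("inf")):   # step 7 (pointer)
                g[m] = g[n] + cost
                parent[m] = n
                if m not in closed and m not in open_list:
                    open_list.append(m)
    return None, float("inf")
```

On the graph of the trace above, with `f = lambda gv, n: gv` (OPEN ordered by g, i.e., uniform cost), the least-cost path to G is S-A-D-G with cost 9, matching the hand trace.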
GGS is a general umbrella
[Figure: path S - n1 - n2 - g with edge cost C(n1, n2) and heuristic estimates h(n1), h(n2)]
h(n1) ≤ C(n1, n2) + h(n2)
- OL is a queue: BFS
- OL is a stack: DFS
- OL is accessed using a function f = g + h: Algorithm A
Algorithm A
- A function f is maintained with each node: f(n) = g(n) + h(n), where n is a node in the open list.
- The node chosen for expansion is the one with the least f value.
- For BFS: h = 0, g = number of edges in the path from S.
- For DFS: h = 0, g = -(number of edges in the path from S), so the deepest node has the least f value.
Algorithm A*: one of the most important advances in AI
- g(n) = cost of the least-cost path to n from S found so far
- h(n) = an estimate of the cost from n to the goal that never overestimates, i.e., h(n) ≤ h*(n)
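A* as described here can be sketched compactly with a priority queue. This is a minimal illustrative sketch (function name and graph representation are my own assumptions): the graph is a dict `node -> {successor: cost}` and `h` is a function from node to heuristic estimate.

```python
# Minimal standalone A* sketch. graph: node -> {successor: cost};
# h: node -> heuristic estimate with h(n) <= h*(n).
import heapq

def astar(graph, start, goal, h):
    # Heap entries are (f, node, g, parent); stale entries are skipped.
    open_heap = [(h(start), start, 0, None)]
    parent, g_best = {}, {}
    while open_heap:
        f, n, gn, par = heapq.heappop(open_heap)
        if n in g_best and g_best[n] <= gn:
            continue                      # already expanded more cheaply
        g_best[n], parent[n] = gn, par
        if n == goal:                     # reconstruct path via parents
            path = [n]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1], gn
        for m, c in graph.get(n, {}).items():
            heapq.heappush(open_heap, (gn + c + h(m), m, gn + c, n))
    return None, float("inf")
```

With h = 0 this degenerates to uniform-cost search; on the trace graph from the GGS slide it returns the path S-A-D-G with cost 9.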
A* Algorithm: Definition and Properties
- f(n) = g(n) + h(n); the node with the least value of f is chosen from the OL.
- f*(n) = g*(n) + h*(n), where
  g*(n) = actual cost of the optimal path (S, n)
  h*(n) = actual cost of the optimal path (n, g)
- g(n) ≥ g*(n)
- By definition, h(n) ≤ h*(n)
[Figure: state-space graph G with start S, node n and the goal; g(n) is measured from S to n, h(n) from n to the goal]
8-puzzle: heuristics
Example: 8-puzzle

s:       n:       g:
2 1 4    1 6 7    1 2 3
7 8 3    4 3 2    4 5 6
5 6 _    5 8 _    7 8 _

h*(n) = actual number of moves to transform n to g
1. h1(n) = number of tiles displaced from their destined position
2. h2(n) = sum of Manhattan distances of tiles from their destined position

h1(n) ≤ h*(n) and h2(n) ≤ h*(n)
Comparison: h1(n) ≤ h2(n) ≤ h*(n)
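The two heuristics can be sketched directly. This sketch assumes the board representation introduced earlier (tuple of 9 entries, 0 for the blank) and the standard goal 1..8 with the blank last; the function names `h1`/`h2` follow the slide.

```python
# The two 8-puzzle heuristics from the slide, on boards stored as
# tuples of 9 entries with 0 for the blank.
GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)

def h1(board):
    """Number of tiles displaced from their destined position."""
    return sum(1 for i, t in enumerate(board) if t != 0 and t != GOAL[i])

def h2(board):
    """Sum of Manhattan distances of tiles from their destined position."""
    total = 0
    for i, t in enumerate(board):
        if t == 0:
            continue
        gi = GOAL.index(t)
        total += abs(i // 3 - gi // 3) + abs(i % 3 - gi % 3)
    return total
```

On the s board of the slide, every one of the 8 tiles is displaced, so h1 = 8; h2 is at least as large, consistent with h1(n) ≤ h2(n) ≤ h*(n).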
A* critical points
Goal
1. Do we know the goal?
2. Is the distance to the goal known?
3. Is there a path (known?) to the goal?
A* critical points
About the path: any time before A* terminates, there exists on the OL a node from the optimal path all of whose ancestors in the optimal path are in the CL.
This means the OL always contains a node n s.t. g(n) = g*(n).
Key point about A* search
Statement: let S - n1 - n2 - ... - ni - ... - nk-1 - nk (= G) be an optimal path. At any time during the search:
1. There is a node ni from the optimal path in the OL.
2. For that ni, all its ancestors S, n1, n2, ..., ni-1 are in the CL.
3. g(ni) = g*(ni)
[Figure: the optimal path drawn vertically from S through n1, n2, ..., ni, ..., nk-1 to nk = G]
Proof of the statement
Proof by induction on the iteration number j.
Basis: j = 0. S is on the OL, and S satisfies the statement.
Hypothesis: let the statement be true for j = p (the p-th iteration), and let ni be the node satisfying the statement.
Proof (continued)
Induction: iteration j = p + 1.
Case 1: ni is expanded and moved to the closed list. Then ni+1 from the optimal path comes to the OL, and ni+1 satisfies the statement. (Note: if ni+1 is already in the CL, then ni+2 satisfies the property.)
Case 2: a node x ≠ ni is expanded. Here ni still satisfies the statement.
Admissibility: an algorithm is called admissible if it always terminates and terminates with an optimal path.
Theorem: A* is admissible.
Lemma: any time before A* terminates, there exists on the OL a node n such that f(n) ≤ f*(S).
A* Properties (contd.)
f*(ni) = f*(S) for every node ni on the optimal path.
The following equations show the above equality:
f*(ni) = g*(ni) + h*(ni)
f*(ni+1) = g*(ni+1) + h*(ni+1)
g*(ni+1) = g*(ni) + c(ni, ni+1)
h*(ni+1) = h*(ni) - c(ni, ni+1)
The above equations hold since the path is optimal.
Lemma
Any time before A* terminates there exists in the open list a node n' such that f(n') ≤ f*(S).
If A* does not terminate
Let e be the least cost of all arcs in the search graph. Then g(n) ≥ e·l(n), where l(n) = number of arcs in the path from S to n found so far. If A* does not terminate, g(n), and hence f(n) = g(n) + h(n) (with h(n) ≥ 0), will become unbounded.
This is not consistent with the lemma, so A* has to terminate.
2nd part of admissibility of A*
The path formed by A* is optimal when it has terminated.
Proof: suppose the path formed is not optimal, i.e., let G be expanded via a non-optimal path. At the point of expansion of G,
f(G) = g(G) + h(G)
     = g(G) + 0
     > g*(G) = g*(S) + h*(S)
     = f*(S)   [f*(S) = cost of the optimal path]
But by the lemma, the OL contains a node n' with f(n') ≤ f*(S) < f(G), which A* would have chosen in preference to G. This is a contradiction, so the path found must be optimal.
Summary on Admissibility
1. The A* algorithm halts.
2. The A* algorithm finds an optimal path.
3. If f(n) < f*(S), then node n has to be expanded before termination.
4. If A* does not expand a node n before termination, then f(n) ≥ f*(S).
Exercise-1
Prove that if the distance of every node from the goal node is known, then no search is necessary.
Ans: for every node n, h(n) = h*(n). The algorithm is A*. By the lemma proved earlier, any time before A* terminates there is a node m in the OL with f(m) ≤ f*(S); with h = h*, the node of least f at every step lies on an optimal path, so A* proceeds straight to the goal without expanding any off-path node.
Exercise-2
If the h value for every node over-estimates the h* value of the corresponding node by a constant c, then the path found need not be costlier than the optimal path by more than that constant. Prove this.
Ans: under the condition of the problem, h(n) ≤ h*(n) + c for every n. Repeating the admissibility argument with this bound shows that the cost of the path found is at most f*(S) + c.
Better Heuristic Performs Better
Theorem
A version A2* of A* that has a better heuristic than another version A1* performs at least as well as A1*.
Meaning of "better": h2(n) > h1(n) for all n.
Meaning of "as well as": A1* expands at least all the nodes that A2* expands.
h1(n) < h2(n) ≤ h*(n) for all nodes n except the goal node.
Proof: by induction on the depth k of the search tree of A2*. (A* on termination carves a tree out of G.)
Claim: A1*, before termination, expands all the nodes of depth k in the search tree of A2*.
k = 0: true, since the start node S is expanded by both.
Suppose A1* terminates without expanding a node n at depth (k + 1) of the A2* search tree. Since A1* has seen all the parents of n seen by A2*,
g1(n) ≤ g2(n) ... (1)
Proof for g1(nk+1) ≤ g2(nk+1)
[Figure: A2* search tree rooted at S with goal G; node n at depth k + 1]
Since A1* has terminated without expanding n,
f1(n) ≥ f*(S) ... (2)
(any node whose f value is strictly less than f*(S) has to be expanded).
Since A2* has expanded n,
f2(n) ≤ f*(S) ... (3)
From g1(n) ≤ g2(n) together with (2) and (3): g2(n) + h1(n) ≥ g1(n) + h1(n) = f1(n) ≥ f*(S) ≥ f2(n) = g2(n) + h2(n), hence h1(n) ≥ h2(n), which contradicts h2(n) > h1(n). Therefore, A1* has to expand all nodes that A2* has expanded.
Exercise
If "better" means h2(n) > h1(n) for some n and h2(n) = h1(n) for the others, can you prove the result?
Monotonicity
Illustration of CL parent-pointer redirection (recursive)
[Figure: graph with nodes 1-6 below S; nodes in the CL and OL marked, with parent pointers redirected recursively when a cheaper path to a closed node is found]
Definition of monotonicity
A heuristic h is said to satisfy the monotone restriction if, for every node p and every child q of p,
h(p) ≤ h(q) + c(p, q).
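The definition can be checked mechanically over an explicit graph. This is an illustrative sketch (the function name and the dict representations are my own assumptions): `graph` maps each node to a `{child: cost}` dict and `h` maps each node to its heuristic value.

```python
# Sketch of checking the monotone restriction h(p) <= h(q) + c(p, q)
# over every edge of an explicit graph. graph: node -> {child: cost};
# h: node -> heuristic value.
def is_monotone(graph, h):
    return all(h[p] <= h[q] + c
               for p, succs in graph.items()
               for q, c in succs.items())
```

A heuristic that drops by more than an edge's cost along that edge fails the check.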
Grounding the Monotone Restriction

n:       n':      G:
7 3 _    7 3 4    1 2 3
1 2 4    1 2 _    4 5 6
8 5 6    8 5 6    7 8 _

h(n) = number of displaced tiles
Is h(n) monotone?
h(n) = 8, h(n') = 8, c(n, n') = 1
h(n) ≤ h(n') + c(n, n'), hence monotone.
Monotonicity of the #-of-Displaced-Tiles Heuristic
h(n) ≤ h(n') + c(n, n')
Any move changes h(n) by at most 1, and c = 1.
Hence h(parent) ≤ h(child) + 1.
If the empty cell is also included in the count, then h need not be monotone (try!).
Monotonicity of the Manhattan Distance Heuristic (1/2)
Manhattan distance = X-dist + Y-dist from the target position.
Referring to the boards on the "Grounding the Monotone Restriction" slide:
hmn(n) = 1 + 1 + 1 + 2 + 1 + 1 + 2 + 1 = 10
hmn(n') = 1 + 1 + 1 + 3 + 1 + 1 + 2 + 1 = 11
Cost = 1. Again, h(n) ≤ h(n') + c(n, n').
Monotonicity of the Manhattan Distance Heuristic (2/2)
Any move can either increase the h value or decrease it by at most 1, and the cost is again 1. Hence this heuristic also satisfies the monotone restriction.
If the empty cell is also included in the cost, then Manhattan distance does not satisfy the monotone restriction (try!).
Exercise: apply this heuristic to the Missionaries and Cannibals problem.
Relationship between Monotonicity and Admissibility
Observation: Monotone Restriction ⇒ Admissibility, but not vice versa.
Statement: if h(ni) ≤ h(nj) + c(ni, nj) for every node ni and each of its children nj, then h(n) ≤ h*(n) for every n.
Proof of MR leading to optimal path
Let S - N1 - N2 - ... - Nm - ... - Nk be an optimal path from S to Nk (all of which might or might not have been explored). Let Nm be the last node on this path which is on the open list, i.e., all the ancestors from S up to Nm-1 are in the closed list.
For every node Np on the optimal path, the monotone restriction gives
g*(Np) + h(Np) ≤ g*(Np) + c(Np, Np+1) + h(Np+1) = g*(Np+1) + h(Np+1).
Now if Nk is chosen in preference to Nm, then f(Nk) ≤ f(Nm) = g*(Nm) + h(Nm) ≤ g*(Nk) + h(Nk) (chaining the inequality above along the path from Nm to Nk). Since f(Nk) = g(Nk) + h(Nk), this gives g(Nk) ≤ g*(Nk), i.e., the path found to Nk is optimal.
Monotonicity of f(.) values
Statement: the f values of nodes expanded by A* increase monotonically if h is monotone.
Proof: suppose ni and nj are expanded with temporal sequentiality, i.e., nj is expanded after ni.
Proof (1/3)
nj is expanded after ni in one of the following ways:
- ni and nj co-exist on the open list when ni is expanded;
- nj comes to the open list as a result of expanding ni and is expanded immediately;
- nj's parent pointer changes to ni, after which nj is expanded.
Proof (2/3)
All the previous cases reduce to the following two cases (think!).
Case 1: nj was on the open list when ni was expanded. Hence f(ni) ≤ f(nj), since A* picks the node with the least f.
Proof (3/3)
Case 2: nj comes to the open list as a child of ni.
f(ni) = g(ni) + h(ni)   (definition of f)
f(nj) = g(nj) + h(nj)   (definition of f)
f(ni) = g(ni) + h(ni) = g*(ni) + h(ni) --- Eq 1
(since ni is picked for expansion, ni is on an optimal path, so g(ni) = g*(ni))
With the similar argument for nj we can write the following:
f(nj) = g(nj) + h(nj) = g*(nj) + h(nj) --- Eq 2
Also, h(ni) ≤ h(nj) + c(ni, nj) --- Eq 3 (parent-child relation, monotone restriction)
g*(nj) = g*(ni) + c(ni, nj) --- Eq 4 (both nodes on the optimal path)
From Eq 1, 2, 3 and 4: f(ni) ≤ f(nj).
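The claim can also be observed experimentally. The following is a small sketch of my own (not from the slides): it runs a best-first expansion with a monotone h on a toy graph, records f at each expansion, and checks that the sequence never decreases.

```python
# Experiment: with a monotone h, the f values of expanded nodes form a
# nondecreasing sequence. graph: node -> {child: cost}; h: dict of
# heuristic values.
import heapq

def expansion_f_values(graph, h, start):
    heap, best_g, fs = [(h[start], 0, start)], {}, []
    while heap:
        f, g, n = heapq.heappop(heap)
        if n in best_g:
            continue                  # already expanded
        best_g[n] = g
        fs.append(f)                  # record f at the moment of expansion
        for m, c in graph.get(n, {}).items():
            if m not in best_g:
                heapq.heappush(heap, (g + c + h[m], g + c, m))
    return fs

graph = {"S": {"A": 1, "B": 3}, "A": {"G": 4}, "B": {"G": 1}}
h = {"S": 3, "A": 3, "B": 1, "G": 0}   # monotone: h(p) <= h(q) + c(p, q)
fs = expansion_f_values(graph, h, "S")
assert fs == sorted(fs)   # f values increase monotonically
```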
Better way to understand monotonicity of f()
Let f(n1), f(n2), f(n3), f(n4), ..., f(nk-1), f(nk) be the f values of k expanded nodes. The relationship between two consecutive expansions f(ni) and f(ni+1) always remains the same, i.e., f(ni) ≤ f(ni+1).
Monotonicity of f()
f(n1), f(n2), f(n3), ..., f(ni), f(ni+1), ..., f(nk): f values in the order of expansion of n1, n2, n3, ..., ni, ..., nk.
Claim: the f values increase monotonically, where f(n) = g(n) + h(n).
Consider two successive expansions ni and ni+1.
Case 1: ni and ni+1 co-exist in the OL, and ni precedes ni+1. By the definition of A*, f(ni) ≤ f(ni+1).
Monotonicity of f()
Case 2: ni+1 came to the OL because of expanding ni, and ni+1 is then expanded.
f(ni+1) = g(ni+1) + h(ni+1) = g(ni) + c(ni, ni+1) + h(ni+1) ≥ g(ni) + h(ni) = f(ni), by the monotone restriction. Hence f(ni) ≤ f(ni+1).
A list of AI Search Algorithms
- A*
- AO*
- IDA* (Iterative Deepening A*)
- Minimax Search on Game Trees
- Viterbi Search on Probabilistic FSAs
- Hill Climbing
- Simulated Annealing
- Gradient Descent
- Stack-Based Search
- Genetic Algorithms
- Memetic Algorithms