Neural Networks, Chapter 2: Simple Neural Nets for Pattern Classification
Transcript of "Using Neural Networks for Pattern Classification Problems" (source: skatz/nn_proj/percept_intro_nnproj_bw.pdf)
Using Neural Networks for Pattern Classification Problems

Converting an Image
• Camera captures an image
• Image needs to be converted to a form that can be processed by the Neural Network
Converting an Image
• Image consists of pixels
• Values can be assigned to the color of each pixel
• A vector can represent the pixel values in an image

Converting an Image
• If we let +1 represent black and 0 represent white:
• p = [0 1 0 1 0 1 0 1 0 0 0 1 0 0 0 1 0 …
Neural Network Pattern Classification Problem

Tank image = [0 1 0 0 1 1 0 …]
House image = [1 1 0 0 0 1 0 …]

Neural Network → Tank or house?
Types of Neural Networks
• Perceptron
• Hebbian
• Adaline
• Multilayer with Backpropagation
• Radial Basis Function Network
2-Input Single Neuron Perceptron: Architecture

A single neuron perceptron: the inputs p1 and p2 are weighted by w1,1 and w1,2, summed with the bias b to form the net input n, and passed through a symmetrical hard limiter to produce the output a.

Output (symmetrical hard limiter):

a = hardlims(Wp + b) = hardlims([w1,1 w1,2][p1; p2] + b) = hardlims(w1,1 p1 + w1,2 p2 + b)
  = -1, if w1,1 p1 + w1,2 p2 + b < 0
  = +1, if w1,1 p1 + w1,2 p2 + b ≥ 0
2-Input Single Neuron Perceptron: Example

a = hardlims(w1,1 p1 + w1,2 p2 + b)
  = -1, if w1,1 p1 + w1,2 p2 + b < 0
  = +1, if w1,1 p1 + w1,2 p2 + b ≥ 0

Example: w1,1 = -1, w1,2 = 1, b = -1

a = -1, if -p1 + p2 - 1 < 0 (or -p1 + p2 < 1)
  = +1, if -p1 + p2 - 1 ≥ 0 (or -p1 + p2 ≥ 1)

This separates the inputs p = [p1, p2]T into two categories separated by the boundary: -p1 + p2 = 1
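The example above can be checked numerically. A minimal Python sketch (the course itself uses MATLAB; the function and variable names here are illustrative):

```python
def hardlims(n):
    """Symmetrical hard limiter: +1 for n >= 0, -1 otherwise."""
    return 1 if n >= 0 else -1

def neuron(p, w, b):
    """Single-neuron perceptron: a = hardlims(Wp + b)."""
    return hardlims(sum(wi * pi for wi, pi in zip(w, p)) + b)

w, b = [-1, 1], -1             # w1,1 = -1, w1,2 = 1, b = -1
print(neuron([-2, 1], w, b))   # -(-2) + 1 - 1 = 2 >= 0, so +1
print(neuron([2, -1], w, b))   # -(2) + (-1) - 1 = -4 < 0, so -1
```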
2-Input Single Neuron Perceptron: Decision Boundary

[Figure: the decision boundary -p1 + p2 = 1 in the (p1, p2) plane, crossing the p2-axis at 1 and the p1-axis at -1.]

a = -1, if -p1 + p2 < 1
  = +1, if -p1 + p2 ≥ 1

Inputs in the region containing (-2, 1) have an output of +1; inputs in the region containing (2, -1) have an output of -1.
2-Input Single Neuron Perceptron: Weight Vector

[Figure: the same network and decision boundary -p1 + p2 = 1, with W = [-1, 1].]

• The weight vector, W, is orthogonal to the decision boundary
2-Input Single Neuron Perceptron: Weight Vector

• W points towards the class with an output of +1

[Figure: the decision boundary -p1 + p2 = 1 with W drawn orthogonal to it, pointing into the +1 region containing (-2, 1); the point (2, -1) lies in the -1 region.]
Simple Perceptron Design

• The design of a simple perceptron is based upon:
– A single neuron divides inputs into two classifications or categories
– The weight vector, W, is orthogonal to the decision boundary
– The weight vector, W, points towards the classification corresponding to the "+1" output
Orthogonal Vectors

• For any hyperplane of the form:
a1p1 + a2p2 + a3p3 + . . . + anpn = b
the vector c[a1, a2, …, an] is orthogonal to the hyperplane (where c is a nonzero constant).
• Example: -p1 + p2 = (-1)p1 + (1)p2 = 1, so W = [-1, 1]
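The orthogonality claim can be checked numerically: any vector lying along the boundary has zero dot product with W. An illustrative Python sketch, with two boundary points chosen by hand:

```python
w = (-1, 1)                    # weight vector from the example
p_a, p_b = (0, 1), (1, 2)      # two points satisfying -p1 + p2 = 1
# Direction along the boundary is the difference of the two points.
direction = (p_b[0] - p_a[0], p_b[1] - p_a[1])
dot = w[0] * direction[0] + w[1] * direction[1]
print(dot)                     # 0: W is orthogonal to the boundary
```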
AND Gate: Description

• A perceptron can be used to implement most logic functions
• Example: Logical AND. Truth table:

p1  p2  Output
0   0   0
0   1   0
1   0   0
1   1   1
AND Gate: Architecture

"hardlim" is used here to provide outputs of 0 and 1.

[Two-input AND network: inputs p1, p2, weights w1,1, w1,2, bias b, output a = hardlim(Wp + b).]

Input/Target pairs:

{p1 = [0; 0], t1 = 0}   {p2 = [0; 1], t2 = 0}   {p3 = [1; 0], t3 = 0}   {p4 = [1; 1], t4 = 1}
AND Gate: Graphical Description

• Graphically, the four input points are plotted in the (p1, p2) plane: (0, 0), (0, 1), and (1, 0) give a zero output, and (1, 1) gives a one output.
• Where do we place the decision boundary?
AND Gate: Decision Boundary

• There are an infinite number of solutions
• One possible decision boundary separates (1, 1) from the other three points, crossing both axes at 1.5
• What is the corresponding value of W?

AND Gate: Weight Vector

• W must be orthogonal to the decision boundary
• W must point towards the class with an output of 1
• One possible value is W = [2 2]
• Output:

a = hardlim([2 2][p1; p2] + b) = hardlim(2p1 + 2p2 + b)
AND Gate: Bias

• Decision boundary: 2p1 + 2p2 + b = 0, which passes through (1.5, 0)
• At (1.5, 0): 2(1.5) + 2(0) + b = 0, so b = -3
AND Gate: Final Design

• Final design: a = hardlim([2 2][p1; p2] - 3)

[Network: weights 2 and 2, bias -3.]

• Test:

a = hardlim([2 2][0; 0] - 3) = hardlim(-3) = 0
a = hardlim([2 2][0; 1] - 3) = hardlim(-1) = 0
a = hardlim([2 2][1; 0] - 3) = hardlim(-1) = 0
a = hardlim([2 2][1; 1] - 3) = hardlim(1) = 1
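The four test cases above can be verified with a few lines of code. A Python sketch (the course uses MATLAB; names here are illustrative):

```python
def hardlim(n):
    """Hard limiter: 1 for n >= 0, 0 otherwise."""
    return 1 if n >= 0 else 0

w, b = [2, 2], -3   # final AND-gate design
for p in ([0, 0], [0, 1], [1, 0], [1, 1]):
    a = hardlim(w[0] * p[0] + w[1] * p[1] + b)
    print(p, a)     # a is 1 only for [1, 1]
```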
Perceptron Learning Rule

• Most real problems involve input vectors, p, that have length greater than three
• Images are described by vectors with 1000s of elements
• The graphical approach is not feasible in dimensions higher than three
• An iterative approach known as the Perceptron Learning Rule is used instead
Character Recognition Problem

• Given: A network has two possible inputs, "x" and "o". These two characters are described by the 25-pixel (5 x 5) patterns shown below.
• Problem: Design a neural network using the perceptron learning rule to correctly identify these input characters.

[5 x 5 pixel patterns for "x" and "o"]
Character Recognition Problem: Input Description

• The inputs must be described as column vectors
• Pixel representation: 0 = white, 1 = black
• The "x" is represented as: [1 0 0 0 1 0 1 0 1 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 1]T
• The "o" is represented as: [0 1 1 1 0 1 0 0 0 1 1 0 0 0 1 1 0 0 0 1 0 1 1 1 0]T

Character Recognition Problem: Output Description

• The output will indicate whether an "x" or an "o" was received
• Let: 0 = "o" received, 1 = "x" received. A hard limiter will be used.
• The inputs are divided into two classes, requiring a single neuron
• Training set:
p1 = [1 0 0 0 1 0 1 0 1 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 1]T, t1 = 1
p2 = [0 1 1 1 0 1 0 0 0 1 1 0 0 0 1 1 0 0 0 1 0 1 1 1 0]T, t2 = 0
Character Recognition Problem: Network Architecture

• The input, p, has 25 components p1, p2, …, p25, weighted by w1,1, w1,2, …, w1,25, with bias b
• "hardlim" is used to provide an output of "0" or "1"
• a = hardlim(Wp + b)
Perceptron Learning Rule: Summary

• Step 1: Initialize W and b (if nonzero) to small random numbers.
• Step 2: Apply the first input vector to the network and find the output, a.
• Step 3: Update W and b based on:
Wnew = Wold + (t - a)pT
bnew = bold + (t - a)
• Repeat Steps 2 and 3 for all input vectors, cycling through them repeatedly, until the targets are achieved for all inputs
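The three steps translate directly into code. A minimal Python sketch (illustrative, not from the course; zeros replace random initialization, and a fixed epoch cap stands in for "repeat until the targets are achieved"):

```python
def hardlim(n):
    return 1 if n >= 0 else 0

def train_perceptron(pairs, n_inputs, max_epochs=100):
    """Perceptron learning rule: W <- W + (t - a) p, b <- b + (t - a)."""
    w, b = [0] * n_inputs, 0        # Step 1 (zeros instead of random)
    for _ in range(max_epochs):
        errors = 0
        for p, t in pairs:          # Step 2: apply each input
            a = hardlim(sum(wi * pi for wi, pi in zip(w, p)) + b)
            e = t - a               # Step 3: update only on error
            if e != 0:
                errors += 1
                w = [wi + e * pi for wi, pi in zip(w, p)]
                b += e
        if errors == 0:             # all targets achieved
            break
    return w, b

# Sanity check on the AND-gate data from the earlier slides:
pairs = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(pairs, 2)
print(all(hardlim(sum(wi * pi for wi, pi in zip(w, p)) + b) == t
          for p, t in pairs))   # True
```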
Character Recognition Problem: Perceptron Learning Rule

• Step 1: Initialize W and b. Assume W = [0 0 . . . 0] (length 25) and b = 0.
• Step 2: Apply the first input vector to the network:
– p1 = [1 0 0 0 1 0 1 0 1 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 1]T, t1 = 1
– a = hardlim(W(0)p1 + b(0)) = hardlim(0) = 1
• Step 3: Update W and b:
Wnew = Wold + (t - a)p1T = Wold + (1 - 1)p1T = [0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]
bnew = bold + (t - a) = bold + (1 - 1) = 0
Character Recognition Problem: Perceptron Learning Rule

• Step 2 (repeated): Apply the second input vector to the network:
– p2 = [0 1 1 1 0 1 0 0 0 1 1 0 0 0 1 1 0 0 0 1 0 1 1 1 0]T, t2 = 0
– a = hardlim(W(1)p2 + b(1)) = hardlim(0) = 1
• Step 3 (repeated): Update W and b:
Wnew = Wold + (t - a)p2T = Wold + (0 - 1)p2T
= [0 -1 -1 -1 0 -1 0 0 0 -1 -1 0 0 0 -1 -1 0 0 0 -1 0 -1 -1 -1 0]
bnew = bold + (t - a) = bold + (0 - 1) = -1
Character Recognition Problem: Perceptron Learning Rule

Iteration history (W and b shown before each update; e = t - a):

W = [0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0], b = 0, input p1, t = 1, a = 1, e = 0
W = [0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0], b = 0, input p2, t = 0, a = 1, e = -1
W = [0 -1 -1 -1 0 -1 0 0 0 -1 -1 0 0 0 -1 -1 0 0 0 -1 0 -1 -1 -1 0], b = -1, input p1, t = 1, a = 0, e = 1
W = [1 -1 -1 -1 1 -1 1 0 1 -1 -1 0 1 0 -1 -1 1 0 1 -1 1 -1 -1 -1 1], b = 0, input p2, t = 0, a = 0, e = 0
W = [1 -1 -1 -1 1 -1 1 0 1 -1 -1 0 1 0 -1 -1 1 0 1 -1 1 -1 -1 -1 1], b = 0, input p1, t = 1, a = 1, e = 0
Character Recognition Problem: Results

• After three epochs, W and b converge to:
– W = [1 -1 -1 -1 1 -1 1 0 1 -1 -1 0 1 0 -1 -1 1 0 1 -1 1 -1 -1 -1 1]
– b = 0
• This is one possible solution, based on the initial condition selected. Other solutions are obtained when the initial values of W and b are changed.
• Check the solution: a = hardlim(Wp + b) for both inputs
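The convergence above can be reproduced with a short script (a Python sketch standing in for the MATLAB used in the course; the input vectors are the p1 and p2 defined earlier):

```python
def hardlim(n):
    return 1 if n >= 0 else 0

p1 = [1,0,0,0,1, 0,1,0,1,0, 0,0,1,0,0, 0,1,0,1,0, 1,0,0,0,1]   # "x", t = 1
p2 = [0,1,1,1,0, 1,0,0,0,1, 1,0,0,0,1, 1,0,0,0,1, 0,1,1,1,0]   # "o", t = 0

w, b = [0] * 25, 0
for _ in range(100):                     # epoch cap for safety
    errors = 0
    for p, t in ((p1, 1), (p2, 0)):
        a = hardlim(sum(wi * pi for wi, pi in zip(w, p)) + b)
        e = t - a
        if e != 0:
            errors += 1
            w = [wi + e * pi for wi, pi in zip(w, p)]
            b += e
    if errors == 0:                      # both targets achieved
        break

print(w)   # [1, -1, -1, -1, 1, -1, 1, 0, 1, -1, -1, 0, 1, 0, -1, -1, 1, 0, 1, -1, 1, -1, -1, -1, 1]
print(b)   # 0
```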
Character Recognition Problem: Results

• How does this network perform in the presence of noise?
• For the "x" with noise:
a = hardlim{W[1 0 0 0 1 0 1 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1] + 0} = 1
• For the "o" with noise:
a = hardlim{W[0 1 1 1 0 1 0 0 0 1 1 1 0 0 0 1 0 1 0 1 0 1 1 1 0] + 0} = 0
• The network recognizes both the noisy x and o.

[Figure: x and o with three pixel errors in each]
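The two noisy checks above can be reproduced directly (Python sketch; W and the noisy patterns are copied from the slides):

```python
W = [1,-1,-1,-1,1, -1,1,0,1,-1, -1,0,1,0,-1, -1,1,0,1,-1, 1,-1,-1,-1,1]
b = 0

def classify(p):
    """hardlim(Wp + b): 1 = "x", 0 = "o"."""
    return 1 if sum(wi * pi for wi, pi in zip(W, p)) + b >= 0 else 0

noisy_x = [1,0,0,0,1, 0,1,0,0,0, 1,0,1,0,1, 0,1,0,1,0, 1,0,1,0,1]   # noisy "x" from the slide
noisy_o = [0,1,1,1,0, 1,0,0,0,1, 1,1,0,0,0, 1,0,1,0,1, 0,1,1,1,0]   # noisy "o" from the slide
print(classify(noisy_x))   # 1: still recognized as "x"
print(classify(noisy_o))   # 0: still recognized as "o"
```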
Character Recognition Problem: Simulation

• Use MATLAB to perform the following simulation:
– Apply noisy inputs to the network with pixel errors ranging from 1 to 25 per character and find the network output.
– Each type of error (number of pixels) was repeated 1000 times for each character, with the incorrect pixels being selected at random.
– The network output was compared to the target in each case.
– The number of detection errors was tabulated.
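The same Monte Carlo simulation can be sketched outside MATLAB (illustrative Python; trial counts and seeding are assumptions, not the course's script):

```python
import random

W = [1,-1,-1,-1,1, -1,1,0,1,-1, -1,0,1,0,-1, -1,1,0,1,-1, 1,-1,-1,-1,1]
b = 0
x = [1,0,0,0,1, 0,1,0,1,0, 0,0,1,0,0, 0,1,0,1,0, 1,0,0,0,1]   # target 1
o = [0,1,1,1,0, 1,0,0,0,1, 1,0,0,0,1, 1,0,0,0,1, 0,1,1,1,0]   # target 0

def classify(p):
    return 1 if sum(wi * pi for wi, pi in zip(W, p)) + b >= 0 else 0

def error_count(char, target, n_flips, trials=1000):
    """Flip n_flips randomly chosen pixels and count misclassifications."""
    errors = 0
    for _ in range(trials):
        noisy = list(char)
        for i in random.sample(range(25), n_flips):
            noisy[i] = 1 - noisy[i]       # invert the chosen pixel
        if classify(noisy) != target:
            errors += 1
    return errors

random.seed(0)
for n in (1, 9, 10, 25):
    print(n, error_count(x, 1, n), error_count(o, 0, n))
```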
Character Recognition Problem: Performance Results

No. of Pixel Errors | Character Errors (x) | Character Errors (o) | Prob. of Error (x) | Prob. of Error (o)
1 - 9   | 0    | 0    | 0   | 0
10      | 96   | 0    | .10 | 0
11      | 399  | 0    | .40 | 0
12      | 759  | 58   | .76 | .06
13      | 948  | 276  | .95 | .28
14      | 1000 | 616  | 1   | .62
15      | 1000 | 885  | 1   | .89
16 - 25 | 1000 | 1000 | 1   | 1

(Error counts are out of 1000 trials per character.)

[Figure: an "o" with 11 pixel errors.]
Perceptrons: Limitations

• Perceptrons only work for inputs that are linearly separable

[Figures: a linearly separable set of x's and o's, divided by a single line, and a set that is not linearly separable, where no single line can divide the two classes.]
Other Neural Networks

• How do the other types of neural networks differ from the perceptron?
– Topology
– Function
– Learning Rule

Perceptron Problem: Part 1

• Design a neural network that can identify a tank and a house.
– Find W and b by hand as illustrated with the x-o example.
– Use the Neural Network Toolbox to find W and b.

Tank (t = 1)   House (t = 0)
Perceptron Problem: Part 2

• Design a neural network that can find a tank among houses and trees.
– Repeat the previous problem but now with a tree included.
– Both the house and the tree have targets of zero.

Tank (t = 1)   House (t = 0)   Tree (t = 0)

Perceptron Problem: Part 3

• Design a neural network that can find a tank among houses, trees and other items.
– Create other images on the 9 x 9 grid.
– Everything other than a tank will have a target of zero.
– How many items can you introduce before the perceptron learning rule no longer converges?

Tank (t = 1)   House (t = 0)   Tree (t = 0)   + ????
MATLAB: Neural Network Toolbox

• >> nntool

MATLAB: Neural Network Toolbox

• Go to MATLAB Help and review the documentation on the Neural Network Toolbox.
• Use the GUI interface (>> nntool) to reproduce the results you obtained for the perceptron (tank vs. house, tree, etc.).
• Data can be imported/exported from the workspace to the NN Tool.