Neural Network Computing
Lecture no. 1
All rights reserved L. Manevitz

McCulloch-Pitts Neuron
The activation of a McCulloch-Pitts neuron is binary.
McCulloch-Pitts neurons are connected by directed, weighted paths.
Each neuron has a fixed threshold.
Architecture

[Figure: a single unit with inputs x1, …, xn on weighted connections w1, …, wn feeding the output f.]

f(x1, …, xn) = 1 if Σ_{i=1}^{n} wi·xi ≥ θ, else 0
Theorem
Any function or phenomenon that can be represented as a logic function can be modeled by a network of McCulloch-Pitts neurons.
First, we show that a single neuron can compute simple logic functions such as AND, OR and NOT.
Second, we use these simple neurons as building blocks.
(Recall the representability of logic functions in DNF form.)
AND
With weights w1 = w2 = 1 and threshold θ = 1.5:

x1 x2 | w1·x1 + w2·x2 | vs. θ   | AND
 0  0 | 0·1 + 0·1 = 0 | 0 < 1.5 |  0
 0  1 | 0·1 + 1·1 = 1 | 1 < 1.5 |  0
 1  0 | 1·1 + 0·1 = 1 | 1 < 1.5 |  0
 1  1 | 1·1 + 1·1 = 2 | 2 > 1.5 |  1
OR
With weights w1 = w2 = 1 and threshold θ = 0.9:

x1 x2 | w1·x1 + w2·x2 | vs. θ   | OR
 0  0 | 0·1 + 0·1 = 0 | 0 < 0.9 |  0
 0  1 | 0·1 + 1·1 = 1 | 1 > 0.9 |  1
 1  0 | 1·1 + 0·1 = 1 | 1 > 0.9 |  1
 1  1 | 1·1 + 1·1 = 2 | 2 > 0.9 |  1
NOT
With weight w = -1 and threshold θ = -0.5:

x | w·x          | vs. θ     | NOT
0 | 0·(-1) =  0  |  0 > -0.5 |  1
1 | 1·(-1) = -1  | -1 < -0.5 |  0
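The three gates above can be checked in a few lines of Python, a minimal sketch of a McCulloch-Pitts unit using exactly the weights and thresholds from the slides:

```python
# A McCulloch-Pitts unit: output 1 if the weighted input sum reaches
# the threshold, else 0. Weights and thresholds are those from the
# AND, OR and NOT slides.

def mp_neuron(weights, theta, inputs):
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= theta else 0

def AND(x1, x2): return mp_neuron([1, 1], 1.5, [x1, x2])
def OR(x1, x2):  return mp_neuron([1, 1], 0.9, [x1, x2])
def NOT(x):      return mp_neuron([-1], -0.5, [x])

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "AND:", AND(x1, x2), "OR:", OR(x1, x2))
print("NOT:", NOT(0), NOT(1))  # → NOT: 1 0
```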
DNF
DNF form: P1 OR P2 OR … OR Pn, where each Pi is an AND of (possibly negated) inputs.
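As a building-block illustration of the theorem: XOR is not computable by a single unit, but its DNF, (x1 AND NOT x2) OR (NOT x1 AND x2), is computable by a two-layer network of McCulloch-Pitts units. The specific weights below are one possible choice, not from the slides:

```python
# XOR via its DNF, built from McCulloch-Pitts units:
# first layer computes the two AND terms, second layer ORs them.

def unit(weights, theta, inputs):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

def XOR(x1, x2):
    p1 = unit([1, -1], 0.5, [x1, x2])   # x1 AND NOT x2
    p2 = unit([-1, 1], 0.5, [x1, x2])   # NOT x1 AND x2
    return unit([1, 1], 0.5, [p1, p2])  # p1 OR p2

print([XOR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # → [0, 1, 1, 0]
```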
Biases and Thresholds
We can replace the threshold with a bias.
A bias acts exactly as a weight on a connection from a unit whose activation is always 1.

Σ_{i=1}^{n} wi·xi ≥ θ  ⇔  -θ + Σ_{i=1}^{n} wi·xi ≥ 0  ⇔  Σ_{i=0}^{n} wi·xi ≥ 0,  with w0 = -θ and x0 = 1.
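A quick sketch verifying the equivalence, using the AND unit (w = (1, 1), θ = 1.5) as the example:

```python
# Check: a threshold theta is equivalent to a bias weight w0 = -theta
# on an always-on input x0 = 1.

def with_threshold(x1, x2, theta=1.5):
    return 1 if 1 * x1 + 1 * x2 >= theta else 0

def with_bias(x1, x2, theta=1.5):
    w0, x0 = -theta, 1                  # bias replaces the threshold
    return 1 if w0 * x0 + 1 * x1 + 1 * x2 >= 0 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        assert with_threshold(x1, x2) == with_bias(x1, x2)
print("equivalent")
```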
Perceptron
Loop: Take an example and apply it to the network. If the answer is correct, return to Loop. If incorrect, go to Fix.
Fix: Adjust the network weights using the input example. Go to Loop.
Perceptron Algorithm
Start: let v be arbitrary.
Choose: choose an example x ∈ F.
Test: If x ∈ F+ and v·x > 0, go to Choose.
      If x ∈ F+ and v·x ≤ 0, go to Fix plus.
      If x ∈ F- and v·x < 0, go to Choose.
      If x ∈ F- and v·x ≥ 0, go to Fix minus.
Fix plus: v := v + x; go to Choose.
Fix minus: v := v - x; go to Choose.
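The algorithm can be sketched as follows. One assumption beyond the slide: instead of a nondeterministic Choose, the sketch simply sweeps over the examples until a full pass triggers no fix. The sample data (AND, with an always-1 bias input appended) is illustrative:

```python
# Perceptron algorithm: F+ holds positive examples, F- negative ones;
# a wrong answer triggers Fix plus (v := v + x) or Fix minus (v := v - x).

def perceptron(f_plus, f_minus, v=None):
    n = len(f_plus[0])
    v = list(v) if v is not None else [0.0] * n       # Start: v arbitrary
    changed = True
    while changed:                                    # sweep until a clean pass
        changed = False
        for x in f_plus:                              # Test positives
            if sum(vi * xi for vi, xi in zip(v, x)) <= 0:
                v = [vi + xi for vi, xi in zip(v, x)]  # Fix plus
                changed = True
        for x in f_minus:                             # Test negatives
            if sum(vi * xi for vi, xi in zip(v, x)) >= 0:
                v = [vi - xi for vi, xi in zip(v, x)]  # Fix minus
                changed = True
    return v

# Illustrative separable data: AND, last coordinate is the bias input.
v = perceptron(f_plus=[(1, 1, 1)], f_minus=[(0, 0, 1), (0, 1, 1), (1, 0, 1)])
print(v)
```

Termination is guaranteed here because the data is linearly separable, which is exactly what the convergence conditions on the next slide require.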
Perceptron Algorithm
Conditions for the algorithm's convergence:
Condition no. 1: there exists v* such that v*·x > 0 for every x ∈ F.
Condition no. 2: there exists δ > 0 such that v*·x > δ for every x ∈ F.
We choose F to be a set of unit vectors.
Geometric viewpoint
[Figure: a fix on a misclassified x moves the weight vector from wⁿ to wⁿ⁺¹, rotating it toward the separator v*.]
Perceptron Algorithm
Under these conditions, the number of times we enter the Loop is finite.
Proof:
[Figure: the examples world, split into positive examples and negative examples by the separator A*.]
Perceptron Algorithm - Proof
We replace the threshold with a bias, and we assume F is a set of unit vectors: ‖x‖ = 1 for every x ∈ F.
Perceptron Algorithm - Proof
We simplify what must be proved by eliminating the negative examples: each negative example y is replaced by its negation -y among the positive examples. This is harmless, since A·y < 0 ⇔ A·(-y) > 0, so afterwards a single condition, A*·x > 0 for every x ∈ F, covers both cases.
Perceptron Algorithm - Proof
Define G(A) = (A*·A) / ‖A‖. Since A* is a unit vector, G(A) is the cosine of the angle between A and A*, so G(A) ≤ 1.

The numerator: at each fix, A_{t+1} = A_t + x, hence
A*·A_{t+1} = A*·(A_t + x) = A*·A_t + A*·x ≥ A*·A_t + δ.
After n changes (starting from A_0 = 0): A*·A_n ≥ n·δ.
Perceptron Algorithm - Proof
The denominator: at each fix,
‖A_{t+1}‖² = ‖A_t + x‖² = ‖A_t‖² + 2·A_t·x + ‖x‖² ≤ ‖A_t‖² + 1,
since a fix happens only when A_t·x ≤ 0, and ‖x‖ = 1.
After n changes: ‖A_n‖² ≤ n.
Perceptron Algorithm - Proof
From the numerator: A*·A_n ≥ n·δ.
From the denominator: ‖A_n‖ ≤ √n.
Therefore 1 ≥ G(A_n) = (A*·A_n) / ‖A_n‖ ≥ n·δ / √n = √n·δ, which gives n ≤ 1/δ².
Hence n is finite: the algorithm enters Fix at most 1/δ² times.
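The bound can be checked numerically. This is a sketch with made-up data: all examples are unit vectors already on the positive side of a known unit separator v* (here (1, 0), an arbitrary choice), as in the simplified setting of the proof:

```python
# Sanity-check n <= 1/delta^2: count the fixes the algorithm makes on
# unit-vector positives and compare against the bound from the proof.
import math

def normalize(x):
    n = math.sqrt(sum(c * c for c in x))
    return tuple(c / n for c in x)

# All examples already positive (negatives negated), unit length.
F = [normalize(x) for x in [(1.0, 0.3), (0.8, -0.2), (0.9, 0.5), (1.0, -0.4)]]
v_star = (1.0, 0.0)                       # a unit separator: v*.x > 0 on F
delta = min(sum(a * b for a, b in zip(v_star, x)) for x in F)

v, fixes, changed = [0.0, 0.0], 0, True
while changed:
    changed = False
    for x in F:
        if sum(a * b for a, b in zip(v, x)) <= 0:   # wrong answer
            v = [a + b for a, b in zip(v, x)]       # Fix plus
            fixes += 1
            changed = True

print(fixes, "fixes; bound 1/delta^2 =", 1 / delta**2)
```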
Example - AND
Training examples (x1, x2, bias = 1) with target AND(x1, x2):

x1 x2 bias | AND
 0  0  1   |  0
 0  1  1   |  0
 1  0  1   |  0
 1  1  1   |  1

Starting from w = (0, 0, 0), each example is tested in turn; whenever the network's answer is wrong, the weights are fixed (w := w + x for a missed positive, w := w - x for a missed negative) and the pass over the examples repeats, etc…
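The training trace can be reproduced with a short script. A sketch under the slide's setup (the exact sequence of fixes, and hence the final weights, depends on the example ordering and may differ from the slide's numbers):

```python
# Perceptron learning AND: 0/1 examples with an always-1 bias input,
# starting from w = (0, 0, 0); each wrong answer fixes the weights.

examples = [((0, 0, 1), 0), ((0, 1, 1), 0), ((1, 0, 1), 0), ((1, 1, 1), 1)]
w = [0, 0, 0]
changed = True
while changed:
    changed = False
    for x, target in examples:
        out = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0
        if out != target:
            sign = 1 if target == 1 else -1          # add or subtract x
            w = [wi + sign * xi for wi, xi in zip(w, x)]
            print("wrong on", x, "-> FIX, w =", w)
            changed = True

print("final w =", w)
```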
AND - Bipolar solution
With bipolar encoding, inputs and targets are in {-1, +1}; the examples (x1, x2, bias = 1) are:

x1 x2 bias | AND
-1 -1  1   | -1
 1 -1  1   | -1
-1  1  1   | -1
 1  1  1   | +1

The same fixing procedure is traced; after a few wrong answers (fixed with + or -) the weights reach (1, 1, -1), which classifies all four examples correctly - success.
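A quick check that the final weights (1, 1, -1) from the slide do classify all four bipolar examples:

```python
# Verify the slide's bipolar solution w = (1, 1, -1) on AND,
# with inputs and targets in {-1, +1} and an always-1 bias input.

examples = [((-1, -1, 1), -1), ((1, -1, 1), -1), ((-1, 1, 1), -1), ((1, 1, 1), 1)]
w = (1, 1, -1)                      # weights reached on the slide
for x, target in examples:
    s = sum(wi * xi for wi, xi in zip(w, x))
    out = 1 if s >= 0 else -1
    print(x, "->", out)             # matches the target in every case
```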
Problem
Split each weighted sum into the contributions of the local sensors that see only the left-hand side (LHS), the middle (MHS) and the right-hand side (RHS) of pattern i:
LHS_i + MHS_i + RHS_i, for patterns i = 1, 2, 3.
Suppose the accepted patterns satisfy LHS_i + MHS_i + RHS_i ≥ 0.
The middle sensors see the same thing in every pattern, so MHS_1 = MHS_2 = MHS_3; MHS should be small enough so that acceptance reduces to conditions on LHS_i + RHS_i alone.
But the local sensors cannot tell the patterns' ends apart: LHS_1 = LHS_2 and RHS_2 = RHS_3.
A pattern built from these very same ends must be rejected, which would require both LHS_2 + RHS_2 ≥ 0 and LHS_2 + RHS_2 < 0 - contradiction!!!
Linear Separation
Every perceptron determines a classification of its input vectors by a hyperplane.
Two-dimensional examples (add algebra): OR and AND are separable; XOR is not possible.
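A brute-force sketch of this claim: search a grid of weights and thresholds (the grid and step size are arbitrary choices) for a linear threshold unit realizing each truth table. AND and OR are found; XOR never is, since no line separates its positive points from its negative ones:

```python
# Which 2-input Boolean functions can one linear threshold unit realize?
# Brute-force search over a small weight/threshold grid.
import itertools

def separable(targets):
    """True if some (w1, w2, theta) on the grid realizes the truth table,
    given in input order (0,0), (0,1), (1,0), (1,1)."""
    inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
    grid = [k / 2 for k in range(-6, 7)]          # -3.0 … 3.0, step 0.5
    for w1, w2, theta in itertools.product(grid, repeat=3):
        outs = [1 if w1 * x1 + w2 * x2 >= theta else 0 for x1, x2 in inputs]
        if outs == targets:
            return True
    return False

print("AND:", separable([0, 0, 0, 1]))   # → AND: True
print("OR: ", separable([0, 1, 1, 1]))   # → OR:  True
print("XOR:", separable([0, 1, 1, 0]))   # → XOR: False
```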
Linear Separation in Higher Dimensions
In higher dimensions the classification is still a linear separation, but it is hard to visualize.
Example: connectedness and convexity - convexity can be handled by a perceptron with local sensors; connectedness cannot.
Note: define local sensors.