Neural Network - AI
Transcript of Neural Network - AI

8/9/2019 Neural Network - AI
November 11, 2004
AI: Chapter 20.5: Neural Networks

Artificial Intelligence
Chapter 20.5: Neural Networks

Michael Scherger
Department of Computer Science
Kent State University
Contents

• Introduction
• Simple Neural Networks for Pattern Classification
• Pattern Association
• Neural Networks Based on Competition
• Backpropagation Neural Network
Introduction

• Much of these notes come from Fundamentals of Neural Networks: Architectures, Algorithms, and Applications by Laurene Fausett, Prentice Hall, Englewood Cliffs, NJ, 1994.
What are Neural Networks?

• Neural Networks (NNs) are networks of neurons, for example, as found in real (i.e. biological) brains.
• Artificial Neurons are crude approximations of the neurons found in brains. They may be physical devices, or purely mathematical constructs.
• Artificial Neural Networks (ANNs) are networks of Artificial Neurons, and hence constitute crude approximations to parts of real brains. They may be physical devices, or simulated on conventional computers.
• From a practical point of view, an ANN remains useful no matter how simplified our ANNs are compared to real brains.
Why Study Artificial Neural Networks?

• They are extremely powerful computational devices (Turing equivalent, universal computers)
• Massive parallelism makes them very efficient
• They can learn and generalize from training data – so there is no need for enormous feats of programming
• They are particularly fault tolerant – this is equivalent to the "graceful degradation" found in biological systems
• They are very noise tolerant – so they can cope with situations where normal symbolic systems would have difficulty
• In principle, they can do anything a symbolic/logic system can do, and more. (In practice, getting them to do it can be rather difficult)
What are Artificial Neural Networks Used for?

• As with the field of AI in general, there are two basic goals for neural network research:
  – Brain modeling: The scientific goal of building models of how real brains work
    • This can potentially help us understand the nature of human intelligence, formulate better teaching strategies, or better remedial actions for brain damaged patients.
  – Artificial System Building: The engineering goal of building efficient systems for real world applications.
    • This may make machines more powerful, relieve humans of tedious tasks, and may even improve upon human performance.
What are Artificial Neural Networks Used for?

• Brain modeling
  – Models of human development – help children with developmental problems
  – Simulations of adult performance – aid our understanding of how the brain works
  – Neuropsychological models – suggest remedial actions for brain damaged patients
• Real world applications
  – Financial modeling – predicting stocks, shares, currency exchange rates
  – Other time series prediction – climate, weather, airline marketing tactician
  – Computer games – intelligent agents, backgammon, first person shooters
  – Control systems – autonomous adaptable robots, microwave controllers
  – Pattern recognition – speech recognition, hand-writing recognition, sonar signals
  – Data analysis – data compression, data mining
  – Noise reduction – function approximation, ECG noise reduction
  – Bioinformatics – protein secondary structure, DNA sequencing
Learning in Neural Networks

• There are many forms of neural networks. Most operate by passing neural "activations" through a network of connected neurons.
• One of the most powerful features of neural networks is their ability to learn and generalize from a set of training data. They adapt the strengths/weights of the connections between neurons so that the final output activations are correct.
Learning in Neural Networks

• There are three broad types of learning:
  1. Supervised learning (i.e. learning with a teacher)
  2. Reinforcement learning (i.e. learning with limited feedback)
  3. Unsupervised learning (i.e. learning with no help)
A Brief History

• 1943 McCulloch and Pitts proposed the McCulloch-Pitts neuron model
• 1949 Hebb published his book The Organization of Behavior, in which the Hebbian learning rule was proposed.
• 1958 Rosenblatt introduced the simple single layer networks now called Perceptrons.
• 1969 Minsky and Papert's book Perceptrons demonstrated the limitation of single layer perceptrons, and almost the whole field went into hibernation.
• 1982 Hopfield published a series of papers on Hopfield networks.
• 1982 Kohonen developed the Self-Organizing Maps that now bear his name.
• 1986 The Back-Propagation learning algorithm for Multi-Layer Perceptrons was re-discovered and the whole field took off again.
• 1990s The sub-field of Radial Basis Function Networks was developed.
• 2000s The power of Ensembles of Neural Networks and Support Vector Machines becomes apparent.
Overview

• Artificial Neural Networks are powerful computational systems consisting of many simple processing elements connected together to perform tasks analogously to biological brains.
• They are massively parallel, which makes them efficient, robust, fault tolerant and noise tolerant.
• They can learn from training data and generalize to new situations.
• They are useful for brain modeling and real world applications involving pattern recognition, function approximation, prediction, …
Levels of Brain Organization

• The brain contains both large scale and small scale anatomical structures, and different functions take place at higher and lower levels. There is a hierarchy of interwoven levels of organization:
  1. Molecules and ions
  2. Synapses
  3. Neuronal microcircuits
  4. Dendritic trees
  5. Neurons
  6. Local circuits
  7. Inter-regional circuits
  8. Central nervous system
• The ANNs we study in this module are crude approximations to levels 5 and 6.
Brains vs. Computers

• There are approximately 10 billion neurons in the human cortex, compared with tens of thousands of processors in the most powerful parallel computers.
• Each biological neuron is connected to several thousands of other neurons, similar to the connectivity in powerful parallel computers.
• Lack of processing units can be compensated by speed. The typical operating speeds of biological neurons is measured in milliseconds (10^-3 s), while a silicon chip can operate in nanoseconds (10^-9 s).
• The human brain is extremely energy efficient, using approximately 10^-16 joules per operation per second.
Structure of a Human Brain
Slice Through a Real Brain
Biological Neural Networks

• The ma…
The McCulloch-Pitts Neuron

• This vastly simplified model of real neurons is also known as a Threshold Logic Unit:
  – A set of synapses (i.e. connections) brings in activations from other neurons.
  – A processing unit sums the inputs, and then applies a non-linear activation function (i.e. squashing/transfer/threshold function).
  – An output line transmits the result to other neurons.
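The three components above can be sketched in a few lines of Python (a minimal illustration, not part of the original slides):

```python
def mp_neuron(inputs, weights, theta):
    """McCulloch-Pitts neuron (Threshold Logic Unit):
    sum the weighted inputs, then apply a binary step
    activation with threshold theta."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= theta else 0

# A unit with weights (1, 1) and threshold 2 fires only
# when both inputs are active (the AND gate seen later).
print(mp_neuron([1, 1], [1, 1], theta=2))  # 1
print(mp_neuron([1, 0], [1, 1], theta=2))  # 0
```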
Networks of McCulloch-Pitts Neurons

• Artificial neurons have the same basic components as biological neurons. The simplest ANNs consist of a set of McCulloch-Pitts neurons labeled by indices k, i, j, and activation flows between them via synapses with strengths wki, wij.
Some Useful Notation

• We often need to talk about ordered sets of related numbers – we call them vectors, e.g. x = (x1, x2, x3, …, xn), y = (y1, y2, y3, …, ym)
• The components xi can be added up to give a scalar (number), e.g.

    s = x1 + x2 + x3 + … + xn = SUM(i=1..n, xi)

• Two vectors of the same length may be added to give another vector, e.g.

    z = x + y = (x1 + y1, x2 + y2, …, xn + yn)

• Two vectors of the same length may be multiplied to give a scalar, e.g.

    p = x · y = x1 y1 + x2 y2 + … + xn yn = SUM(i=1..n, xi yi)
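In plain Python these vector operations look like the following (a sketch; the slides themselves use no particular programming language):

```python
x = [1, 2, 3]
y = [4, 5, 6]

s = sum(x)                                # scalar sum of components: 6
z = [xi + yi for xi, yi in zip(x, y)]     # vector addition: [5, 7, 9]
p = sum(xi * yi for xi, yi in zip(x, y))  # dot product: 4 + 10 + 18 = 32

print(s, z, p)  # 6 [5, 7, 9] 32
```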
Some Useful Functions

• Common activation functions
  – Identity function
    • f(x) = x for all x
  – Binary step function with threshold θ (aka Heaviside function or threshold function):

        f(x) = 1  if x ≥ θ
        f(x) = 0  if x < θ
Some Useful Functions

• Binary sigmoid

    f(x) = 1 / (1 + e^(-σx))

• Bipolar sigmoid

    g(x) = 2 f(x) - 1 = (1 - e^(-σx)) / (1 + e^(-σx))
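Both sigmoids can be written directly from the formulas above; σ is the steepness parameter (σ = 1 here is an illustrative default):

```python
import math

def binary_sigmoid(x, sigma=1.0):
    """f(x) = 1 / (1 + e^(-sigma * x)), output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-sigma * x))

def bipolar_sigmoid(x, sigma=1.0):
    """g(x) = 2 f(x) - 1, output in (-1, 1)."""
    return 2.0 * binary_sigmoid(x, sigma) - 1.0

print(binary_sigmoid(0.0))   # 0.5 (midpoint of the binary sigmoid)
print(bipolar_sigmoid(0.0))  # 0.0 (midpoint of the bipolar sigmoid)
```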
The McCulloch-Pitts Neuron Equation

• Using the above notation, we can now write down a simple equation for the output out of a McCulloch-Pitts neuron as a function of its n inputs ini:

    out = step( SUM(i=1..n, wi ini) - θ )

where θ is the neuron's threshold and step is the binary step function defined earlier.
Review

• Biological neurons, consisting of a cell body, axons, dendrites and synapses, are able to process and transmit neural activation
• The McCulloch-Pitts neuron model (Threshold Logic Unit) is a crude approximation to real neurons that performs a simple summation and thresholding function on activation levels
• Appropriate mathematical notation facilitates the specification and programming of artificial neurons and networks of artificial neurons.
Networks of McCulloch-Pitts Neurons

• One neuron can't do much on its own. Usually we will have many neurons labeled by indices k, i, j, and activation flows between them via synapses with strengths wki, wij.
The Perceptron

• We can connect any number of McCulloch-Pitts neurons together in any way we like.
• An arrangement of one input layer of McCulloch-Pitts neurons feeding forward to one output layer of McCulloch-Pitts neurons is known as a Perceptron.
Logic Gates with MP Neurons

• We can use McCulloch-Pitts neurons to implement the basic logic gates.
• All we need to do is find the appropriate connection weights and neuron thresholds to produce the right outputs for each set of inputs.
• We shall see explicitly how one can construct simple networks that perform NOT, AND, and OR.
• It is then a well known result from logic that we can construct any logical function from these three operations.
• The resulting networks, however, will usually have a much more complex architecture than a simple Perceptron.
• We generally want to avoid decomposing complex problems into simple logic gates, by finding the weights and thresholds that work directly in a Perceptron architecture.
Implementation of Logical NOT, AND, and OR

• Logical OR

    x1  x2 | y
     0   0 | 0
     0   1 | 1
     1   0 | 1
     1   1 | 1

  Network: inputs x1 and x2 connect to output y with weights 2 and 2, threshold θ = 2.
Implementation of Logical NOT, AND, and OR

• Logical AND

    x1  x2 | y
     0   0 | 0
     0   1 | 0
     1   0 | 0
     1   1 | 1

  Network: inputs x1 and x2 connect to output y with weights 1 and 1, threshold θ = 2.
Implementation of Logical NOT, AND, and OR

• Logical NOT

    x1 | y
     0 | 1
     1 | 0

  Network: input x1 connects to output y with weight -1; a bias input (fixed at 1) connects with weight 2; threshold θ = 2.
Implementation of Logical NOT, AND, and OR

• Logical AND NOT

    x1  x2 | y
     0   0 | 0
     0   1 | 0
     1   0 | 1
     1   1 | 0

  Network: inputs x1 and x2 connect to output y with weights 2 and -1, threshold θ = 2.
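The four gates above share a single unit type; only the weights and thresholds differ. A sketch using the weight/threshold values from these slides:

```python
def tlu(inputs, weights, theta):
    """Threshold logic unit: fire (1) iff the weighted sum reaches theta."""
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= theta else 0

def OR(x1, x2):      return tlu([x1, x2], [2, 2], theta=2)
def AND(x1, x2):     return tlu([x1, x2], [1, 1], theta=2)
def NOT(x):          return tlu([x, 1], [-1, 2], theta=2)   # second input is the fixed bias of 1
def AND_NOT(x1, x2): return tlu([x1, x2], [2, -1], theta=2) # x1 AND (NOT x2)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, OR(a, b), AND(a, b), AND_NOT(a, b))
```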
Logical XOR

• Logical XOR

    x1  x2 | y
     0   0 | 0
     0   1 | 1
     1   0 | 1
     1   1 | 0

  Network: inputs x1 and x2 connect to output y with unknown weights (?, ?).
Logical XOR

• How long do we keep looking for a solution? We need to be able to calculate appropriate parameters rather than looking for solutions by trial and error.
• Each training pattern produces a linear inequality for the output in terms of the inputs and the network parameters. These can be used to compute the weights and thresholds.
Finding the Weights Analytically

• We have two weights w1 and w2 and the threshold θ, and for each training pattern we need to satisfy

    step( w1 in1 + w2 in2 - θ ) = targ
Finding the Weights Analytically

• For the XOR network
  – Clearly the second and third inequalities are incompatible with the fourth, so there is in fact no solution. We need more complex networks, e.g. that combine together many simple networks, or use different activation/thresholding/transfer functions.
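The inequalities the slide refers to did not survive transcription; for a threshold unit with output step(w1 x1 + w2 x2 - θ), the four XOR training patterns give (a reconstruction):

```latex
\begin{align*}
(0,0) \mapsto 0 &:\quad 0 < \theta \\
(0,1) \mapsto 1 &:\quad w_2 \ge \theta \\
(1,0) \mapsto 1 &:\quad w_1 \ge \theta \\
(1,1) \mapsto 0 &:\quad w_1 + w_2 < \theta
\end{align*}
```

Adding the second and third gives \(w_1 + w_2 \ge 2\theta > \theta\) (since \(\theta > 0\) by the first), which contradicts the fourth.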
ANN Topologies

• Mathematically, ANNs can be represented as weighted directed graphs. For our purposes, we can simply think in terms of activation flowing between processing units via one-way connections
  – Single-Layer Feed-Forward NNs: One input layer and one output layer of processing units. No feed-back connections. (For example, a simple Perceptron.)
  – Multi-Layer Feed-Forward NNs: One input layer, one output layer, and one or more hidden layers of processing units. No feed-back connections. The hidden layers sit in between the input and output layers, and are thus hidden from the outside world. (For example, a Multi-Layer Perceptron.)
  – Recurrent NNs: Any network with at least one feed-back connection. It may, or may not, have hidden units. (For example, a Simple Recurrent Network.)
Detecting Hot and Cold

• The desired response of the system is that "cold is perceived if a cold stimulus is applied for two time steps":
  – y2(t) = x2(t-2) AND x2(t-1)
• It is also required that "heat be perceived if either a hot stimulus is applied, or a cold stimulus is applied briefly (for one time step) and then removed":
  – y1(t) = [x1(t-1)] OR [x2(t-3) AND NOT x2(t-2)]
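These two temporal rules can be checked directly in code. The sketch below implements the logic itself (not the network weights, which appear on the next slide) over a stimulus sequence; x1 is the heat input and x2 the cold input:

```python
def perceive(x1, x2, t):
    """Evaluate the two perception rules at time step t.
    Indices before the start of the sequence are treated as 0 (no stimulus)."""
    def at(seq, i):
        return seq[i] if i >= 0 else 0
    y2 = at(x2, t - 2) and at(x2, t - 1)                         # cold: cold stimulus for two steps
    y1 = at(x1, t - 1) or (at(x2, t - 3) and not at(x2, t - 2))  # heat: hot stimulus, or brief cold then removed
    return int(y1), int(y2)

x1 = [0, 0, 0, 0, 0]  # no heat applied
x2 = [1, 0, 0, 0, 0]  # cold applied for one step, then removed

for t in range(5):
    print(t, perceive(x1, x2, t))  # heat is perceived at t = 3
```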
Detecting Heat and Cold

[Network diagram: heat input x1 and cold input x2 feed hidden units z1 and z2, which feed the outputs y1 (perceive heat) and y2 (perceive cold), with connection weights of 2, 1 and -1 implementing the logic above.]
Detecting Heat and Cold

[Diagram: apply cold – heat input 0, cold input 1.]
Detecting Heat and Cold

[Diagram: remove cold – heat input 0, cold input 0.]
Detecting Heat and Cold

[Diagram: the earlier cold activation continues to propagate through the hidden units.]
Detecting Heat and Cold

[Diagram: perceive heat – output y1 fires.]
Detecting Heat and Cold

[Diagram: apply cold – heat input 0, cold input 1.]
Detecting Heat and Cold

[Diagram: perceive cold – output y2 fires.]
Example: Classification

• Consider the example of classifying airplanes given their masses and speeds
• How do we construct a neural network that can classify any type of bomber or fighter?
A General Procedure for Building ANNs

• 1. Understand and specify your problem in terms of inputs and required outputs, e.g. for classification the outputs are the classes, usually represented as binary vectors.
• 2. Take the simplest form of network you think might be able to solve your problem, e.g. a simple Perceptron.
• 3. Try to find appropriate connection weights (including neuron thresholds) so that the network produces the right outputs for each input in its training data.
• 4. Make sure that the network works on its training data, and test its generalization by checking its performance on new testing data.
• 5. If the network doesn't perform well enough, go back to stage 3 and try harder.
• 6. If the network still doesn't perform well enough, go back to stage 2 and try harder.
• 7. If the network still doesn't perform well enough, go back to stage 1 and try harder.
• 8. Problem solved – move on to the next problem.
Building a NN for Our Example

• For our airplane classifier example, our inputs can be direct encodings of the masses and speeds
• Generally we would have one output unit for each class, with activation 1 for 'yes' and 0 for 'no'
• With …
Building a NN for Our Example

[Plots/diagrams for the example omitted in transcript.]
Decision Boundaries in Two Dimensions

• For simple logic gate problems, it is easy to visualize what the neural network is doing. It is forming decision boundaries between classes. Remember, the network output is:

    out = step( w1 in1 + w2 in2 - θ )

• The decision boundary (between out = 0 and out = 1) is at

    w1 in1 + w2 in2 - θ = 0
Decision Boundaries in Two Dimensions

• In two dimensions the decision boundaries are always straight lines.
Decision Boundaries for AND and OR
Decision Boundaries for XOR

• There are two obvious remedies:
  – either change the transfer function so that it has more than one decision boundary
  – use a more complex network that is able to generate more complex decision boundaries
Logical XOR (Again)

• z1 = x1 AND NOT x2
• z2 = x2 AND NOT x1
• y  = z1 OR z2

[Network diagram: x1 and x2 feed z1 with weights 2 and -1, and z2 with weights -1 and 2; z1 and z2 feed y with weights 2 and 2.]
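Using the same threshold units as the logic-gate slides, the two-layer XOR network above can be sketched as:

```python
def tlu(inputs, weights, theta=2):
    """Threshold logic unit with threshold 2, as in the gate slides."""
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= theta else 0

def xor(x1, x2):
    z1 = tlu([x1, x2], [2, -1])   # z1 = x1 AND NOT x2
    z2 = tlu([x1, x2], [-1, 2])   # z2 = x2 AND NOT x1
    return tlu([z1, z2], [2, 2])  # y  = z1 OR z2

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))  # outputs 0, 1, 1, 0
```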
Decision Hyperplanes and Linear Separability

• If we have two inputs, then the weights define a decision boundary that is a one dimensional straight line in the two dimensional input space of possible input values
• If we have n inputs, the weights define a decision boundary that is an n-1 dimensional hyperplane in the n dimensional input space:

    w1 in1 + w2 in2 + … + wn inn - θ = 0
Decision Hyperplanes and Linear Separability

• This hyperplane is clearly still linear (i.e. straight/flat) and can still only divide the space into two regions. We still need more complex transfer functions, or more complex networks, to deal with XOR type problems
• Problems with input patterns which can be classified using a single hyperplane are said to be linearly separable. Problems (such as XOR) which cannot be classified in this way are said to be non-linearly separable.
General Decision Boundaries

• Generally, we will want to deal with input patterns that are not binary, and expect our neural networks to form complex decision boundaries
• We may also wish to classify inputs into many classes (such as the three shown here)
Learning and Generalization

• A network will also produce outputs for input patterns that it was not originally set up to classify (shown with question marks), though those classifications may be incorrect
• There are two important aspects of the network's operation to consider:
  – Learning: The network must learn decision surfaces from a set of training patterns so that these training patterns are classified correctly
  – Generalization: After training, the network must also be able to generalize, i.e. correctly classify test patterns it has never seen before
• Usually we want our neural networks to learn well, and also to generalize well.
Learning and Generalization

• Sometimes, the training data may contain errors (e.g. noise in the experimental determination of the input values, or incorrect classifications)
• In this case, learning the training data perfectly may make the generalization worse
• There is an important trade-off between learning and generalization that arises quite generally
Training a Neural Network

• Whether our neural network is a simple Perceptron, or a much more complicated multilayer network with special activation functions, we need to develop a systematic procedure for determining appropriate connection weights.
• The general procedure is to have the network learn the appropriate weights from a representative set of training data
• In all but the simplest cases, however, direct computation of the weights is intractable
Training a Neural Network

• Instead, we usually start off with random initial weights and adjust them in small steps until the required outputs are produced
Perceptron Learning

• For simple Perceptrons performing classification, we have seen that the decision boundaries are hyperplanes, and we can think of learning as the process of shifting around the hyperplanes until each training pattern is classified correctly
• Somehow, we need to formalize that process of "shifting around" into a systematic algorithm that can easily be implemented on a computer
• The "shifting around" can conveniently be split up into a number of small steps.
Perceptron Learning

• If the network weights at time t are wij(t), then the shifting process corresponds to moving them by an amount Δwij(t) so that at time t+1 we have weights

    wij(t+1) = wij(t) + Δwij(t)

• It is convenient to treat the thresholds as weights, as discussed previously, so we don't need separate equations for them
Formulating the Weight Changes

• Suppose the target output of unit j is targj and the actual output is outj = sgn( Σi ini wij ), where ini are the activations of the previous layer of neurons (e.g. the network inputs)
• Then we can use the discrepancy between targj and outj to determine each weight change Δwij
Perceptron Algorithm

• Step 0: Initialize weights and bias
  – For simplicity, set weights and bias to zero
  – Set learning rate α (0 < α ≤ 1) (also written η)
• Step 1: While stopping condition is false, do steps 2–6
• Step 2: For each training pair s:t, do steps 3–5
• Step 3: Set activations of input units: xi = si
Perceptron Algorithm

• Step 4: Compute response of output unit:

    y_in = b + Σi xi wi

    y =  1   if y_in > θ
    y =  0   if -θ ≤ y_in ≤ θ
    y = -1   if y_in < -θ
Perceptron Algorithm

• Step 5: Update weights and bias if an error occurred for this pattern (y ≠ t):

    wi(new) = wi(old) + α t xi
    b(new)  = b(old) + α t

  else:

    wi(new) = wi(old)
    b(new)  = b(old)

• Step 6: Test stopping condition: if no weights changed in Step 2, stop; else continue
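Steps 0–6 can be collected into a short program. The sketch below trains a single unit on the AND function with binary inputs and bipolar targets (a setup used in Fausett's examples; the learning rate α = 1 and threshold θ = 0.2 are illustrative choices):

```python
def perceptron_train(pairs, alpha=1.0, theta=0.2, max_epochs=100):
    """Perceptron training for one output unit.
    pairs: list of (input vector s, target t) with t in {-1, 1}."""
    n = len(pairs[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(max_epochs):
        changed = False
        for s, t in pairs:
            y_in = b + sum(xi * wi for xi, wi in zip(s, w))
            y = 1 if y_in > theta else (-1 if y_in < -theta else 0)
            if y != t:  # update only when an error occurred
                w = [wi + alpha * t * xi for xi, wi in zip(s, w)]
                b += alpha * t
                changed = True
        if not changed:  # stopping condition: a full epoch with no updates
            break
    return w, b

# AND with binary inputs and bipolar targets
data = [([1, 1], 1), ([1, 0], -1), ([0, 1], -1), ([0, 0], -1)]
w, b = perceptron_train(data)
print(w, b)  # [2.0, 3.0] -4.0 after 10 epochs
```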
Convergence of Perceptron Learning

• The weight changes Δwij need to be applied repeatedly – for each weight wij in the network, and for each training pattern in the training set. One pass through all the weights for the whole training set is called one epoch of training
• Eventually, usually after many epochs, when all the network outputs match the targets for all the training patterns, all the Δwij will be zero and the process of training will cease. We then say that the training process has converged to a solution
Convergence of Perceptron Learning

• It can be shown that if there does exist a possible set of weights for a Perceptron which solves the given problem correctly, then the Perceptron Learning Rule will find them in a finite number of iterations
• Moreover, it can be shown that if a problem is linearly separable, then the Perceptron Learning Rule will find a set of weights in a finite number of iterations that solves the problem correctly
Overview and Review

• Neural network classifiers learn decision boundaries from training data
• Simple Perceptrons can only cope with linearly separable problems
• Trained networks are expected to generalize, i.e. deal appropriately with input data they were not trained on
• One can train networks by iteratively updating their weights
• The Perceptron Learning Rule will find weights for linearly separable problems in a finite number of iterations.
Hebbian Learning

• In 1949 neuropsychologist Donald Hebb postulated how biological neurons learn:
  – "When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
• In other words:
  – 1. If two neurons on either side of a synapse (connection) are activated simultaneously (i.e. synchronously), then the strength of that synapse is selectively increased.
• This rule is often supplemented by:
  – 2. If two neurons on either side of a synapse are activated asynchronously, then that synapse is selectively weakened or eliminated.
• so that chance coincidences do not build up connection strengths.
Hebbian Learning Algorithm

• Step 0: Initialize all weights
  – For simplicity, set weights and bias to zero
• Step 1: For each input training vector and target pair s:t, do steps 2–4
• Step 2: Set activations of input units: xi = si
• Step 3: Set the activation for the output unit: y = t
• Step 4: Adjust the weights and bias:

    wi(new) = wi(old) + xi y
    b(new)  = b(old) + y
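The algorithm is short enough to run by hand. The sketch below trains a Hebb net on the AND function with bipolar inputs and targets (the representation Fausett uses; with binary 0/1 inputs a 0 component would produce no weight change at all):

```python
def hebb_train(pairs):
    """Hebb rule: w_i += x_i * y and b += y for each training pair."""
    n = len(pairs[0][0])
    w, b = [0] * n, 0
    for x, t in pairs:
        y = t  # step 3: the output activation is set to the target
        w = [wi + xi * y for xi, wi in zip(x, w)]
        b += y
    return w, b

# Bipolar AND: inputs and targets in {-1, 1}
data = [([1, 1], 1), ([1, -1], -1), ([-1, 1], -1), ([-1, -1], -1)]
w, b = hebb_train(data)
print(w, b)  # [2, 2] -2

# The learned weights classify all four patterns correctly:
for x, t in data:
    y_in = b + sum(xi * wi for xi, wi in zip(x, w))
    assert (1 if y_in > 0 else -1) == t
```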
Hebbian vs Perceptron Learning

• In the notation used for Perceptrons, the Hebbian learning weight update rule is:

    wij(new) = wij(old) + outj · ini

• There is strong physiological evidence that this type of learning does take place in the region of the brain known as the hippocampus.
• Recall that the Perceptron learning weight update rule was:

    wij(new) = wij(old) + η · targj · ini   (applied only when the output is in error)

• There is some similarity, but it is clear that Hebbian learning is not going to get our Perceptron to learn a given set of training data.
Adaline

• Adaline (Adaptive Linear Network) was developed by Widrow and Hoff in 1960.
  – Uses bipolar activations (-1 and 1) for its input signals and target values
  – Weight connections are adjustable
Adaline Training Algorithm

• Step 0: Initialize weights and bias
  – For simplicity, set weights to small random values
  – Set learning rate α (0 < α ≤ 1) (also written η)
• Step 1: While stopping condition is false, do steps 2–6
• Step 2: For each training pair s:t, do steps 3–5
• Step 3: Set activations of input units: xi = si
Adaline Training Algorithm

• Step 4: Compute net input to output unit:

    y_in = b + Σi xi wi

• Step 5: Update bias and weights:

    wi(new) = wi(old) + α (t - y_in) xi
    b(new)  = b(old) + α (t - y_in)

• Step 6: Test for stopping condition
Autoassociative Net

• The feed forward autoassociative net has the following diagram
• Useful for determining if something is a part of the test pattern or not
• Weight matrix diagonal is usually zero – improves generalization
• Hebbian learning if mutually orthogonal vectors are used

[Network diagram: input units x1 … xi … xn fully connected to output units y1 … yj … ym.]
BAM Net

• Bidirectional Associative Net