Function Approx2009


Description: A neural network viewed as a tool for function approximation.

Transcript of Function Approx2009

Page 1: Function Approx2009

Function Approximation and Pattern Recognition

Imthias Ahamed T. P.

Dept. of Electrical Engineering,

T.K.M. College of Engineering,

Kollam – 691005,

[email protected]

Page 2: Function Approx2009

Function Approximation Problem

x = [0 1 2 3 4 5 6 7 8 9 10];
d = [0 1 2 3 4 3 2 1 2 3 4];

Find $f$ such that $f(x_i) \approx d_i$.

Page 3: Function Approx2009

A MATLAB program

clf
clear
x = [0 1 2 3 4 5 6 7 8 9 10];
d = [0 1 2 3 4 3 2 1 2 3 4];
plot(x,d,'x')
pause

Page 4: Function Approx2009

Learning Problem

Minimize the cost function $\mathcal{C}(w) = \frac{1}{2}\sum_{i=1}^{n} e_i^2$ with respect to the weight vector $w$,

where $e(i) = d(i) - f(X(i))$.

Page 5: Function Approx2009

Optimization Technique: Steepest Descent

Minimize $\mathcal{C}(w) = \frac{1}{2}\sum_{i=1}^{n} e_i^2$.

Gradient: $g = \dfrac{\partial \mathcal{C}}{\partial w}$

The steepest descent algorithm: $w(n+1) = w(n) - \eta\, g(n)$, where $\eta$ is the learning-rate parameter.
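
For illustration, a minimal MATLAB sketch (not from the slides) applying this update to fit a single weight $w$ in the linear model $f(x) = wx$ on the data from page 2; the learning rate is an assumed value:

% Steepest descent on C(w) = 0.5*sum(e_i^2), e_i = d_i - w*x_i
x = [0 1 2 3 4 5 6 7 8 9 10];
d = [0 1 2 3 4 3 2 1 2 3 4];
w = 0;                    % initial weight
eta = 0.001;              % learning-rate parameter (assumed)
for n = 1:200
    e = d - w*x;          % errors over the whole training set
    g = -sum(e.*x);       % gradient g = dC/dw
    w = w - eta*g;        % w(n+1) = w(n) - eta*g(n)
end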

Page 6: Function Approx2009

Least-Mean-Square (LMS) Algorithm

$\mathcal{C} = \frac{1}{2} e^2(n)$, where $e(n)$ is the error signal measured at time $n$.

$\dfrac{\partial \mathcal{C}}{\partial w} = e(n)\, \dfrac{\partial e(n)}{\partial w}$

$\hat{w}(n+1) = \hat{w}(n) - \eta\, e(n)\, \dfrac{\partial e(n)}{\partial w}$
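
A minimal sketch of the corresponding LMS iteration for the same linear model, updating $w$ one example at a time (the learning rate is again an assumed value):

% LMS: for f(x) = w*x, de/dw = -x(n), so the update becomes
% w(n+1) = w(n) + eta*e(n)*x(n)
x = [0 1 2 3 4 5 6 7 8 9 10];
d = [0 1 2 3 4 3 2 1 2 3 4];
w = 0;
eta = 0.01;                  % assumed learning rate
for epoch = 1:50
    for n = 1:length(x)
        e = d(n) - w*x(n);   % error signal at time n
        w = w + eta*e*x(n);  % LMS weight update
    end
end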

Page 7: Function Approx2009

Model of a Simple Perceptron

[Figure: input signals $x_1, x_2, \ldots, x_m$ enter through synaptic weights $w_{k1}, w_{k2}, \ldots, w_{km}$; a summing junction with bias $b_k$ produces $v_k$, and an activation function $\varphi(\cdot)$ produces the output $y_k$.]

$v_k = \sum_{j=0}^{m} w_{kj}\, x_j$ and $y_k = \varphi(v_k)$, where we let $b_k = w_{k0}$ and $x_0 = +1$.
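
As a concrete (made-up) example, one forward pass through this neuron in MATLAB:

% Single-neuron forward pass; x0 = +1 absorbs the bias b_k = w_k0.
x = [1; 0.5; -0.2; 0.8];     % x0, x1, x2, x3 (hypothetical inputs)
w = [0.1; 0.4; -0.3; 0.2];   % w_k0 (= bias), w_k1, w_k2, w_k3
v = w' * x;                  % induced local field v_k
y = 1/(1 + exp(-v))          % sigmoid activation gives output y_k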

Page 8: Function Approx2009

Activation Functions

[Figure: plots of $\varphi(v)$ versus $v$ for the threshold function (left) and for the sigmoid function with increasing slope $a$ (right).]

Threshold function:
$\varphi(v) = \begin{cases} 1 & \text{if } v \ge 0 \\ 0 & \text{if } v < 0 \end{cases}$

Sigmoid function:
$\varphi(v) = \dfrac{1}{1 + \exp(-a v)}$, where $a$ is the slope parameter.
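
A short MATLAB sketch that reproduces both plots (the slope values for $a$ are assumptions):

v = -8:0.05:8;
subplot(1,2,1)                     % threshold function
plot(v, double(v >= 0))
xlabel('v'), ylabel('\phi(v)'), title('Threshold')
subplot(1,2,2), hold on            % sigmoid, increasing slope a
for a = [0.5 1 2]
    plot(v, 1./(1 + exp(-a*v)))
end
xlabel('v'), ylabel('\phi(v)'), title('Sigmoid')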

Page 9: Function Approx2009

Multi Layer Perceptron

A Multi Layer Perceptron (or feedforward network) consists of an input layer, one or more hidden layers, and an output layer.

Page 10: Function Approx2009

Multi Layer Perceptron

Page 11: Function Approx2009

A MATLAB program

clf
clear
x = [0 1 2 3 4 5 6 7 8 9 10];
d = [0 1 2 3 4 3 2 1 2 3 4];
plot(x,d,'x')
pause

Page 12: Function Approx2009

net = newff([0 10],[5 1],{'tansig' 'purelin'});
ybeforetrain = sim(net,x)
plot(x,d,'x',x,ybeforetrain,'o')
legend('desired','actual')
pause
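
In this (older) Neural Network Toolbox interface, [0 10] is the range of the input, [5 1] asks for a hidden layer of 5 neurons and an output layer of 1 neuron, and {'tansig' 'purelin'} assigns a tan-sigmoid activation to the hidden layer and a linear activation to the output. Since the network is still untrained, sim returns a poor fit here.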

Page 13: Function Approx2009

net.trainParam.epochs = 50;
net = train(net,x,d);
Y = sim(net,x);
plot(x,d,'x',x,Y,'o')
legend('desired','actual')
pause

Page 14: Function Approx2009

xtest = 0:.5:10;
ytest = sim(net,xtest);
plot(x,d,'x',xtest,ytest,'o')
legend('desired','actual')
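
Simulating the trained network on the denser grid xtest, which contains points not seen during training, shows how the network interpolates between the training examples; this is the testing step.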

Page 15: Function Approx2009

Pattern Recognition Problem

?

Page 16: Function Approx2009

An Example

x = [-0.5 -0.5 0.3 0.1 0.6 0.7;
     -0.5  0.5 -0.5 1.0 0.8 1.0]
Y = [1 1 0 0 0 0]
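
A possible way to train a classifier on this data with the same toolbox functions used earlier; the layer sizes, output activation and epoch count are assumptions of this sketch:

% Sketch: small feedforward classifier for the example data.
x = [-0.5 -0.5 0.3 0.1 0.6 0.7;
     -0.5  0.5 -0.5 1.0 0.8 1.0];
Y = [1 1 0 0 0 0];
net = newff(minmax(x), [4 1], {'tansig' 'logsig'});
net.trainParam.epochs = 100;
net = train(net, x, Y);
Yhat = sim(net, x) > 0.5       % threshold the outputs to get labels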

Page 17: Function Approx2009
Page 18: Function Approx2009
Page 19: Function Approx2009

Linearly Non-Separable Data

Page 20: Function Approx2009

Summary

Perceptron
Weights
Activation function
Error minimization
Gradient descent
Learning rule

Page 21: Function Approx2009

Training data
Testing data
Linearly separable
Linearly non-separable

Page 22: Function Approx2009

Back-propagation algorithm. Notation: $i$, $j$ and $k$ refer to different neurons; with signals propagating through the network from left to right, neuron $j$ lies in a layer to the right of neuron $i$.

$w_{ji}(n)$: the synaptic weight connecting the output of neuron $i$ to the input of neuron $j$ at iteration $n$.

Page 23: Function Approx2009

Back-propagation Algorithm

Error at output neuron $j$: $e_j(n) = d_j(n) - y_j(n)$

Cost: $\mathcal{C}(n) = \frac{1}{2}\sum_{j \in C} e_j^2(n)$, where $C$ is the set of output neurons, so $\dfrac{\partial \mathcal{C}(n)}{\partial e_j(n)} = e_j(n)$.

Induced local field: $v_j(n) = \sum_{i=0}^{m} w_{ji}(n)\, y_i(n)$

Output: $y_j(n) = \varphi_j(v_j(n))$

By the chain rule,
$\dfrac{\partial \mathcal{C}(n)}{\partial w_{ji}(n)} = \dfrac{\partial \mathcal{C}(n)}{\partial e_j(n)}\, \dfrac{\partial e_j(n)}{\partial y_j(n)}\, \dfrac{\partial y_j(n)}{\partial v_j(n)}\, \dfrac{\partial v_j(n)}{\partial w_{ji}(n)}$

Page 24: Function Approx2009

Back-propagation Algorithm (contd.): Local Gradient

$\dfrac{\partial \mathcal{C}(n)}{\partial w_{ji}(n)} = \dfrac{\partial \mathcal{C}(n)}{\partial e_j(n)}\, \dfrac{\partial e_j(n)}{\partial y_j(n)}\, \dfrac{\partial y_j(n)}{\partial v_j(n)}\, \dfrac{\partial v_j(n)}{\partial w_{ji}(n)}$

Define the local gradient
$\delta_j(n) = -\dfrac{\partial \mathcal{C}(n)}{\partial v_j(n)} = -\dfrac{\partial \mathcal{C}(n)}{\partial e_j(n)}\, \dfrac{\partial e_j(n)}{\partial y_j(n)}\, \dfrac{\partial y_j(n)}{\partial v_j(n)}$

Since $v_j(n) = \sum_{i} w_{ji}(n)\, y_i(n)$, we have $\dfrac{\partial v_j(n)}{\partial w_{ji}(n)} = y_i(n)$, and therefore
$\dfrac{\partial \mathcal{C}(n)}{\partial w_{ji}(n)} = -\delta_j(n)\, y_i(n)$.

Weight correction:
$\Delta w_{ji}(n) = -\eta\, \dfrac{\partial \mathcal{C}(n)}{\partial w_{ji}(n)} = \eta\, \delta_j(n)\, y_i(n)$

Page 25: Function Approx2009

Case 1: Neuron j is an Output Node

Since $e_j(n) = d_j(n) - y_j(n)$: $\dfrac{\partial e_j(n)}{\partial y_j(n)} = -1$

$\dfrac{\partial y_j(n)}{\partial v_j(n)} = \varphi_j'(v_j(n))$

Hence the local gradient is
$\delta_j(n) = e_j(n)\, \varphi_j'(v_j(n))$

and the weight correction is $\Delta w_{ji}(n) = \eta\, \delta_j(n)\, y_i(n)$.

Page 26: Function Approx2009

Case 2: Neuron j is a Hidden Node

$\delta_j(n) = -\dfrac{\partial \mathcal{C}(n)}{\partial y_j(n)}\, \dfrac{\partial y_j(n)}{\partial v_j(n)} = -\dfrac{\partial \mathcal{C}(n)}{\partial y_j(n)}\, \varphi_j'(v_j(n))$

With $\mathcal{C}(n) = \frac{1}{2}\sum_{k \in C} e_k^2(n)$ (the sum running over the output neurons $k$):

$\dfrac{\partial \mathcal{C}(n)}{\partial y_j(n)} = \sum_{k} e_k(n)\, \dfrac{\partial e_k(n)}{\partial y_j(n)} = \sum_{k} e_k(n)\, \dfrac{\partial e_k(n)}{\partial v_k(n)}\, \dfrac{\partial v_k(n)}{\partial y_j(n)}$

Since $e_k(n) = d_k(n) - \varphi_k(v_k(n))$: $\dfrac{\partial e_k(n)}{\partial v_k(n)} = -\varphi_k'(v_k(n))$

Since $v_k(n) = \sum_{j=0}^{m} w_{kj}(n)\, y_j(n)$: $\dfrac{\partial v_k(n)}{\partial y_j(n)} = w_{kj}(n)$

Page 27: Function Approx2009

Case 2: Neuron j is a Hidden Node (contd.)

$\dfrac{\partial \mathcal{C}(n)}{\partial y_j(n)} = -\sum_{k} e_k(n)\, \varphi_k'(v_k(n))\, w_{kj}(n) = -\sum_{k} \delta_k(n)\, w_{kj}(n)$

Therefore
$\delta_j(n) = \varphi_j'(v_j(n)) \sum_{k} \delta_k(n)\, w_{kj}(n)$

Page 28: Function Approx2009

Delta Rule

(Weight correction $\Delta w_{ji}(n)$) = (learning-rate parameter $\eta$) $\times$ (local gradient $\delta_j(n)$) $\times$ (input signal $y_i(n)$)

If neuron $j$ is an output node:
$\delta_j(n) = e_j(n)\, \varphi_j'(v_j(n))$

If neuron $j$ is a hidden node:
$\delta_j(n) = \varphi_j'(v_j(n)) \sum_{k} \delta_k(n)\, w_{kj}(n)$

Page 29: Function Approx2009

Back-propagation Algorithm: Summary

1. Initialization: pick all of the $w_{ji}$ from a uniform distribution.
2. Presentation of training examples.
3. Forward computation.
4. Backward computation.
5. Iteration.

Page 30: Function Approx2009

Back-propagation Algorithm: Summary

1. Initialize the weights $w_{ji}$.

2. Forward computation: compute
$v_j^{(1)}(n),\, y_j^{(1)}(n),\, v_j^{(2)}(n),\, y_j^{(2)}(n),\, \ldots,\, v_j^{(L)}(n),\, y_j^{(L)}(n)$ and the errors $e_j(n)$.

3. Backward computation:
$\delta_j^{(L)}(n) = e_j(n)\, \varphi_j'(v_j^{(L)}(n))$ for all output nodes $j$.
For all hidden layers $l$:
$\delta_j^{(l)}(n) = \varphi_j'(v_j^{(l)}(n)) \sum_{k} \delta_k^{(l+1)}(n)\, w_{kj}^{(l+1)}(n)$ for all hidden nodes $j$ in layer $l$.

Page 31: Function Approx2009

4. Update weights:

$w_{ji}^{(l)}(n+1) = w_{ji}^{(l)}(n) + \eta\, \delta_j^{(l)}(n)\, y_i^{(l-1)}(n)$
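
To make steps 1–4 concrete, here is a from-scratch MATLAB sketch for a single hidden layer, run on the function-approximation data from the earlier slides; the sigmoid units, the scaling of the data to [0, 1], the hidden-layer size and the learning rate are all assumptions of this illustration:

x = [0 1 2 3 4 5 6 7 8 9 10]/10;  % inputs scaled to [0,1]
d = [0 1 2 3 4 3 2 1 2 3 4]/4;    % targets scaled to [0,1]
nh = 5; eta = 0.5;                % hidden size, learning rate (assumed)
W1 = rand(nh,1) - 0.5; b1 = rand(nh,1) - 0.5;  % 1. initialize
W2 = rand(1,nh) - 0.5; b2 = rand - 0.5;
phi = @(v) 1./(1 + exp(-v));      % sigmoid; phi'(v) = y.*(1-y)
for epoch = 1:5000
    for n = 1:length(x)
        % 2. forward computation
        y1 = phi(W1*x(n) + b1);
        y2 = phi(W2*y1 + b2);
        e  = d(n) - y2;
        % 3. backward computation (local gradients)
        delta2 = e * y2*(1-y2);                % output node
        delta1 = (W2' * delta2) .* y1.*(1-y1); % hidden nodes
        % 4. update weights
        W2 = W2 + eta*delta2*y1';  b2 = b2 + eta*delta2;
        W1 = W1 + eta*delta1*x(n); b1 = b1 + eta*delta1;
    end
end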

Page 32: Function Approx2009


Learning with a Teacher (Supervised Learning)

Page 33: Function Approx2009


Learning without a Teacher: Reinforcement Learning

Page 34: Function Approx2009


Learning Tasks: Function Approximation

$d = f(x)$

$x$: input vector; $d$: output vector; $f(\cdot)$ is assumed to be unknown.

Given a set of labeled examples $\{(x_i, d_i)\}_{i=1}^{N}$.

Requirement: design a neural network realizing a mapping $F(\cdot)$ that approximates the unknown $f(\cdot)$ in the sense that
$\|F(x) - f(x)\| < \epsilon$ for all $x$, where $\epsilon$ is a small positive number.

Page 35: Function Approx2009

Learning Tasks: Pattern Recognition

Def: a received pattern/signal is assigned to one of a prescribed number of classes.

[Figure: input pattern $x$ → unsupervised network for feature extraction → feature vector $y$ → supervised network for classification → one of the classes $1, 2, \ldots, r$.]

Page 36: Function Approx2009

Thank You