Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


Transcript of Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004

Page 1: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


Artificial Neural Networks 0909.560.01/0909.454.01

Fall 2004

Shreekanth Mandayam
ECE Department

Rowan University

http://engineering.rowan.edu/~shreek/spring04/ann/

Lecture 6
October 18, 2004

Page 2: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


Plan

• Radial Basis Function Networks
  • RBF Formulation
  • Network Implementation
  • Matlab Implementation
• Design Issues
  • Center Selection: K-means Clustering Algorithm
  • Input data processing
    • Selection of training and test data: cross-validation
    • Pre-processing: Feature Extraction

• Lab Project 3

Page 3: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


RBF Principle

Non-linearly separable classes are transformed to a "higher"-dimensional vector space in which they become linearly separable.

Page 4: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


Example: X-OR Problem

x1  x2  y
0   0   0
0   1   1
1   0   1
1   1   0

φ1(x)  φ2(x)  y'
0.13   1      0
0.36   0.36   1
0.36   0.36   1
1      0.13   0

[Figure: the four patterns plotted in the (φ1(x), φ2(x)) plane, where a straight decision boundary separates y' = 0 from y' = 1.]
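The values in the second table can be reproduced with a short script; a minimal sketch, assuming Gaussian basis functions with centers c1 = [1, 1] and c2 = [0, 0] (the centers are not stated in the transcript and are inferred from the tabulated values):

% X-OR inputs passed through two Gaussian basis functions
X  = [0 0; 0 1; 1 0; 1 1];   % the four X-OR input patterns
c1 = [1 1];                  % assumed center of phi1 (inferred, not from the slide)
c2 = [0 0];                  % assumed center of phi2 (inferred, not from the slide)
phi = zeros(4, 2);
for k = 1:4
    phi(k, 1) = exp(-norm(X(k, :) - c1)^2);   % phi1(x)
    phi(k, 2) = exp(-norm(X(k, :) - c2)^2);   % phi2(x)
end
disp(phi)   % 0.1353 1.0000; 0.3679 0.3679; 0.3679 0.3679; 1.0000 0.1353

In the (φ1, φ2) plane the two patterns with y = 1 map onto the same point, and a single straight line separates the two classes, which is exactly the transform described on the previous slide.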

Page 5: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


RBF Formulation

Problem Statement
• Given a set of N distinct real data vectors (xj; j = 1, 2, …, N) and a set of N real numbers (dj; j = 1, 2, …, N), find a function F that satisfies the interpolating condition

F(xj) = dj;  j = 1, 2, …, N
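The slide leaves the form of F open; in the standard RBF construction (textbook background, not part of the slide) F is expanded over basis functions centered on the data points, and the interpolating condition becomes a linear system in the weights:

F(\mathbf{x}) = \sum_{i=1}^{N} w_i \, \varphi\left( \lVert \mathbf{x} - \mathbf{x}_i \rVert \right)

F(\mathbf{x}_j) = d_j \;\Rightarrow\; \sum_{i=1}^{N} w_i \, \varphi\left( \lVert \mathbf{x}_j - \mathbf{x}_i \rVert \right) = d_j, \quad j = 1, 2, \dots, N

\text{i.e.}\quad \boldsymbol{\Phi}\mathbf{w} = \mathbf{d}, \qquad \Phi_{ji} = \varphi\left( \lVert \mathbf{x}_j - \mathbf{x}_i \rVert \right)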

Page 6: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


RBF Network

[Figure: RBF network architecture. Inputs x1, x2, x3 (Input Layer) feed a layer of radial basis units (Hidden Layer), which connect through weights wij to the outputs y1, y2 (Output Layer).]

Hidden-unit activation (Gaussian radial basis function):

φj(x) = exp( -||x - cj||² / (2σj²) )

[Figure: plot of a Gaussian basis function φ(t) for t from -5 to 5, rising from 0 to its peak of 1 at the center.]
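A minimal sketch of how such a network produces its outputs, with the output weights fit by least squares; the variable names, sizes, and the least-squares step are illustrative choices, not taken from the slide:

% Gaussian hidden layer followed by a linear output layer
X = rand(20, 3);                 % 20 example inputs (x1, x2, x3)
D = rand(20, 2);                 % matching targets for the two outputs (y1, y2)
C = X(1:5, :);                   % 5 centers cj (center selection: later slides)
sigma = 1;                       % common width for all Gaussian units
N = size(X, 1);  K = size(C, 1);
Phi = zeros(N, K);
for j = 1:K
    d2 = sum((X - repmat(C(j, :), N, 1)).^2, 2);   % ||x - cj||^2 for every input
    Phi(:, j) = exp(-d2 / (2*sigma^2));            % phi_j(x)
end
W = Phi \ D;                     % least-squares estimate of the weights wij
Y = Phi * W;                     % network outputs for every input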

Page 7: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


Matlab Implementation

%Radial Basis Function Network
%S. Mandayam/ECE Dept./Rowan University
%Neural Nets/Fall 04
clear; close all;

%generate training data (input and target)
p = [0:0.25:4];
t = sin(p*pi);

%Define and train RBF Network
net = newrb(p,t);
plot(p,t,'*r');
hold;

%generate test data
p1 = [0:0.1:4];

%test network
y = sim(net,p1);
plot(p1,y,'ob');

legend('Training','Test');
xlabel('input, p');
ylabel('target, t')
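One design note not visible in this listing: newrb also takes optional error-goal and spread arguments, and the spread sets the width of the Gaussian units (the values below are illustrative):

net = newrb(p, t, 0.01, 1);   % error goal = 0.01, spread = 1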

Matlab Demos

» demorb1
» demorb3
» demorb4

Page 8: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


RBF - Center Selection

[Figure: data points scattered in the (x1, x2) plane, with a smaller number of selected centers marked among them.]

Page 9: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


K-means Clustering Algorithm

• N data points, xi; i = 1, 2, …, N
• At time-index n, define K clusters with cluster centers cj(n); j = 1, 2, …, K
• Initialization: at n = 0, let cj(0) = xj; j = 1, 2, …, K (i.e. choose the first K data points as cluster centers)
• Compute the Euclidean distance of each data point from each cluster center, dij = d(xi, cj(n)); i = 1, 2, …, N, j = 1, 2, …, K
• Assign xi to the cluster j for which dij = minj {dij}
• For each cluster j = 1, 2, …, K, update the cluster center: cj(n+1) = mean of the xi currently assigned to cluster j
• Repeat until ||cj(n+1) - cj(n)|| < ε, a small convergence threshold (see the Matlab sketch below)
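A minimal Matlab sketch of the steps above (assumed names: X is the N-by-d data matrix, K the number of clusters, tol the threshold ε):

function C = kmeans_sketch(X, K, tol)
% k-means clustering following the steps on this slide (illustrative sketch)
N = size(X, 1);
C = X(1:K, :);                        % initialization: first K points as centers
while true
    D = zeros(N, K);
    for j = 1:K                       % Euclidean distance to every center
        D(:, j) = sqrt(sum((X - repmat(C(j, :), N, 1)).^2, 2));
    end
    [dmin, idx] = min(D, [], 2);      % assign each point to its nearest center
    Cnew = C;
    for j = 1:K                       % update each center as its cluster mean
        if any(idx == j)
            Cnew(j, :) = mean(X(idx == j, :), 1);
        end
    end
    done = max(sqrt(sum((Cnew - C).^2, 2))) < tol;   % ||cj(n+1) - cj(n)|| < tol
    C = Cnew;
    if done, break; end
end
end

The returned centers can then be used as the cj of the Gaussian hidden units on the RBF Network slide.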

Page 10: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


Selection of Training and Test Data: Method of Cross-Validation

Trial 1:  Train  Train  Train  Test
Trial 2:  Train  Train  Test   Train
Trial 3:  Train  Test   Train  Train
Trial 4:  Test   Train  Train  Train

• Vary the network parameters until the total mean squared error over all trials is minimized
• Select the network with the least mean squared output error (a Matlab sketch of the 4-fold split follows below)
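A minimal sketch of the 4-fold scheme in the table above, reusing the data and the newrb/sim calls from the Matlab Implementation slide (the interleaved fold assignment is an illustrative choice):

% 4-fold cross-validation over the sin() data from the Matlab slide
p = [0:0.25:4];  t = sin(p*pi);
K = 4;  N = length(p);  idx = 1:N;
mse_trial = zeros(1, K);
for trial = 1:K
    testIdx  = idx(mod(idx - 1, K) == trial - 1);   % every 4th sample held out
    trainIdx = setdiff(idx, testIdx);
    net  = newrb(p(trainIdx), t(trainIdx));         % train on the remaining folds
    yhat = sim(net, p(testIdx));                    % evaluate on the held-out fold
    mse_trial(trial) = mean((yhat - t(testIdx)).^2);
end
mse_total = mean(mse_trial)   % compare this figure across candidate networks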

Page 11: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


Feature Extraction

Objective:
• Increase information content
• Decrease vector length
• Parametric invariance

• Invariance by structure

• Invariance by training

• Invariance by transformation

Page 12: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


Lab Project 3: Radial Basis Function Neural Networks

http://engineering.rowan.edu/~shreek/fall04/ann/lab3.html

Page 13: Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


Summary