Transcript of Matlab File


Experiment No 1

Aim: To study the introduction to the Neural Network Toolbox.

Tool Used: MATLAB 7

Theory:

The term neural network was traditionally used to refer to a network or circuit of biological neurons. The modern usage of the term often refers to artificial neural networks (ANNs), which are composed of artificial neurons or nodes. Thus the term has two distinct usages:

1. Biological neural networks are made up of real biological neurons that are connected or functionally related in a nervous system. In neuroscience, they are often identified as groups of neurons that perform a specific physiological function in laboratory analysis.

2. ANNs are composed of interconnected artificial neurons (programming constructs that mimic the properties of biological neurons). Artificial neural networks may be used either to gain an understanding of biological neural networks or to solve artificial intelligence problems without necessarily creating a model of a real biological system. The real, biological nervous system is highly complex: artificial neural network algorithms attempt to abstract this complexity and focus on what may hypothetically matter most from an information-processing point of view. Good performance (e.g., as measured by good predictive ability and low generalization error), or performance mimicking animal or human error patterns, can then be used as one source of evidence supporting the hypothesis that the abstraction has really captured something important about information processing in the brain. Another incentive for these abstractions is to reduce the amount of computation required to simulate artificial neural networks, so as to allow one to experiment with larger networks and train them on larger data sets.


Neural Network Applications:

Neural networks have been applied in many fields. A list of some applications mentioned in the literature follows.

Aerospace- High-performance aircraft autopilot, flight path simulation, aircraft control systems, autopilot enhancements, aircraft component simulation, aircraft component fault detection

Automotive- Automobile automatic guidance system, warranty activity analysis

Banking- Check and other document reading, credit application evaluation

Defense- Weapon steering, target tracking, object discrimination, facial recognition, new kinds of sensors, sonar, radar and image signal processing including data compression, feature extraction and noise suppression, signal/image identification

Electronics- Code sequence prediction, integrated circuit chip layout, process control, chip failure analysis, machine vision, voice synthesis, nonlinear modeling

Financial- Real estate appraisal, loan advisor, mortgage screening, corporate bond rating, credit-line use analysis, portfolio trading program, corporate financial analysis, currency price prediction

Industrial- Neural networks are being trained to predict the output gasses of furnaces and other industrial processes. They then replace complex and costly equipment used for this purpose in the past.

Medical- Breast cancer cell analysis, EEG and ECG analysis, prosthesis design, optimization of transplant times, hospital expense reduction, hospital quality improvement, emergency-room test advisement

Robotics- Trajectory control, forklift robot, manipulator controllers, vision systems

Telecommunications- Image and data compression, automated information services, real-time translation of spoken language, customer payment processing systems

Speech- Speech recognition, speech compression, vowel classification, text-to-speech synthesis

Securities- Market analysis, automatic bond rating, stock trading advisory systems

Transfer Functions:

Three of the most commonly used functions are shown below.

1. Hard-Limit Transfer Function (hardlim)


The hard-limit transfer function limits the output of the neuron to either 0, if the net input argument n is less than 0, or 1, if n is greater than or equal to 0.

2. Linear Transfer Function (purelin)

The linear transfer function produces an output equal to its input. Neurons of this type are used as linear approximators in linear filters.

3. Log-Sigmoid Transfer Function (logsig)

The log-sigmoid transfer function takes the input, which may have any value between plus and minus infinity, and squashes the output into the range 0 to 1. This transfer function is commonly used in backpropagation networks, in part because it is differentiable.
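To get a feel for all three functions at once, the following minimal sketch (assuming the Neural Network Toolbox is available, as in the experiments below) evaluates and plots them over a range of net inputs:

% Compare the three common transfer functions
n = -5:0.1:5;      % range of net input values
a1 = hardlim(n);   % 0 for n < 0, 1 for n >= 0
a2 = purelin(n);   % output equals input
a3 = logsig(n);    % squashes n into the range (0,1)
plot(n,a1,n,a2,n,a3)
legend('hardlim','purelin','logsig')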

Architecture:

The basic architecture consists of three types of neuron layers: input, hidden, and output. In feed-forward networks, the signal flow is from input to output units, strictly in a feed-forward direction. The data processing can extend over multiple layers of units, but no feedback connections are present. Recurrent networks contain feedback connections. Contrary to feed-forward networks, the dynamical properties of the network are important. In some cases, the activation values of the units undergo a relaxation process such that the network evolves to a stable state in which these activations no longer change.

In other applications, the changes in the activation values of the output neurons are significant, such that the dynamical behavior constitutes the output of the network. Other neural network architectures include ART maps and competitive networks.
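As a small illustration of the feed-forward case, the sketch below creates a two-layer network with the newff function used in the later experiments (the input range and layer sizes here are assumed for illustration):

% Feed-forward network: 5 hidden tansig neurons, 1 purelin output neuron
net = newff([0 1],[5 1],{'tansig' 'purelin'});
y = sim(net,0.5)   % the signal flows strictly from input to output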

Conclusion: We have studied the introduction to the Neural Network Toolbox in MATLAB.


Experiment No 2

Aim: To write a program to demonstrate a neural network function through MATLAB.

Tool Used: MATLAB 7

Theory:

1. hardlim - Hard-limit transfer function

Graph-

Syntax-

A = hardlim(N,FP)
dA_dN = hardlim('dn',N,A,FP)
info = hardlim('code')

Description-

hardlim is a neural transfer function. Transfer functions calculate a layer's output from its net input.
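A minimal usage sketch (the input values are assumed for illustration):

n = [-2 0 3];    % sample net inputs
a = hardlim(n)   % returns [0 1 1]: 0 for n < 0, 1 for n >= 0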

2. purelin - Linear transfer function

Graph-


Syntax-

A = purelin(N,FP)
dA_dN = purelin('dn',N,A,FP)
info = purelin('code')

Description-

purelin is a neural transfer function. Transfer functions calculate a layer's output from its net input.
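A minimal usage sketch (the input values are assumed for illustration):

n = [-2 0 3];    % sample net inputs
a = purelin(n)   % returns [-2 0 3]: the output equals the net input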

Program-

p = [8 7 6 5 4 3 2 1 0];

t = [0 0.80 0.95 0.15 -0.56 -0.96 -0.30 0.66 0.89];
plot(p,t,'o')

net = newff([0 8],[10 1],{'hardlims' 'purelin'},'trainlm');
y1 = sim(net,p)
plot(p,t,'o',p,y1,'x')

net.trainParam.epochs = 50;
net.trainParam.goal = 0.01;
net = train(net,p,t);

y2 = sim(net,p)
plot(p,t,'o',p,y1,'x',p,y2,'*')

output-

Graph 1: Training targets (o) plotted against p together with the network outputs before (x) and after (*) training.

Graph 2: Best Training Performance is 0.42474 at epoch 2 (Mean Squared Error vs. 2 Epochs; Train, Best, Goal).

Graph 3: Training state at epoch 2: Gradient = 5.2755e-011, Mu = 1e-005, Validation Checks = 0.

Graph 4: Regression plot of network output against target: Output ~= 3.9e-018*Target + 0.18, Training R = -3.8519e-034 (Data, Fit, Y = T).

Conclusion- The required output is obtained using the hardlim and purelin functions.


Experiment No 3

Aim: To write a program to demonstrate the neural network functions tansig and newff through MATLAB.

Tool Used: MATLAB 7

Theory:

Explanation and syntax-

1. CREATING A PERCEPTRON (newp)

A perceptron can be created with the newp function:

net = newp(P,T)

where input arguments are as follows:

P is an R-by-Q matrix of Q input vectors of R elements each.

T is an S-by-Q matrix of Q target vectors of S elements each.

Commonly, the hardlim transfer function is used in perceptrons.
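As a minimal sketch of this syntax (the AND-gate data below is an assumed example, not part of the experiment):

% Perceptron trained on AND-gate data (assumed example data)
P = [0 0 1 1; 0 1 0 1];   % four 2-element input vectors, one per column
T = [0 0 0 1];            % target category for each input vector
net = newp(P,T);          % perceptron layer with hardlim neurons
net = train(net,P,T);     % adjust weights with the perceptron rule
Y = sim(net,P)            % after training, Y should match T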

2. TANSIG- Hyperbolic tangent sigmoid transfer function

SYNTAX

A = tansig(N,FP)

dA_dN = tansig('dn',N,A,FP)

DESCRIPTION

tansig is a neural transfer function. Transfer functions calculate a layer's output from its net input.
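It computes a = 2/(1+exp(-2*n)) - 1, squashing any real input into the range -1 to 1. A minimal usage sketch (the input values are assumed for illustration):

n = [-1 0 1];    % sample net inputs
a = tansig(n)    % approximately [-0.7616 0 0.7616]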

PROGRAM-

p = [0 1 2 3 4 5 6 7 8];
t = [0 0.84 0.91 0.14 -0.77 -0.96 -0.28 0.66 0.99];
plot(p,t)

net = newff([0 8],[10 1],{'tansig' 'purelin'},'trainlm');
y1 = sim(net,p)
plot(p,t,p,y1)

net.trainParam.epochs = 50;
net.trainParam.goal = 0.01;
net = train(net,p,t);

y2 = sim(net,p)
plot(p,t,p,y1,p,y2)

OUTPUT-

Graph 1: Training data t plotted against input p.


Graph 2: Best Training Performance is 9.3375e-006 at epoch 2 (Mean Squared Error vs. 2 Epochs; Train, Best, Goal).

Graph 3: Training state at epoch 2: Gradient = 0.0070003, Mu = 1e-005, Validation Checks = 0.

CONCLUSION- The required output is obtained using the NEWFF and TANSIG functions.


Experiment No 4

Aim: To write a program to demonstrate a neural network function through MATLAB.

Tool Used: MATLAB 7

Theory:

CLASSIFICATION WITH A 2-INPUT PERCEPTRON

SIMUP - Simulates a perceptron layer.
TRAINP - Trains a perceptron layer with the perceptron rule.

Using the above functions, a 2-input hard-limit neuron is trained to classify 4 input vectors into two categories.

DEFINING A CLASSIFICATION PROBLEM

The matrix P defines four 2-element input vectors (one per column): P = [-0.5 -0.5 +0.3 +0.0; -0.5 +0.5 -0.5 +1.0];

The row vector T defines the vectors' target categories: T = [1 1 0 0];

PLOTTING THE VECTORS TO CLASSIFY

We can plot these vectors with PLOTPV: plotpv(P,T);

The perceptron must properly classify the 4 input vectors in P into the two categories defined by T.
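Putting these steps together (a runnable sketch using only the values given above):

% Define and plot the 4-vector classification problem
P = [-0.5 -0.5 +0.3 +0.0; -0.5 +0.5 -0.5 +1.0];   % input vectors (columns)
T = [1 1 0 0];                                    % target categories
plotpv(P,T);   % mark the two categories in the input plane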

DEFINE THE PERCEPTRON

Perceptrons have HARDLIM neurons. These neurons are capable of separating an input space with a straight line into two categories (0 and 1).

INITP generates initial weights and biases for our neuron: [W,b] = initp(P,T)

INITIAL PERCEPTRON CLASSIFICATION

The input vectors can be replotted with plotpv(P,T), together with the neuron's initial attempt at classification.

INITP - Initializes a perceptron layer. [W,B] = INITP(P,T)

P - RxQ matrix of input vectors.


T - SxQ matrix of target outputs. Returns weights and biases.

PROGRAM-

P = [+0.1 +0.2 +0.3 +0.4 ; +0.5 +0.6 +0.3 +0.5 ];
T = [0.6 0.8 0.6 0.9 ];
plot(P,T);

[W,b] = initp(P,T)

figure; plot(P,T);
figure; plotpc(W,b);

[W,b,epochs,errors] = trainp(W,b,P,T,1);
figure; ploterr(errors)

display(' see below format to get output from trained network ')
display('p = [-0.5; 0.5];')
display('a = simup(p,W,b)')
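As the display statements at the end of the program suggest, the trained layer can then be queried with SIMUP (a sketch using the test vector printed by the program):

p = [-0.5; 0.5];    % a new 2-element input vector
a = simup(p,W,b)    % classify it with the trained weights W and bias b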

OUTPUT-

Graph: Plot of T against P.


Experiment No 5

Aim: To write a program to demonstrate a neural network function through MATLAB.

Tool Used: MATLAB 7

THEORY-

logsig - Log-sigmoid transfer function

Graph and Symbol

Syntax-

A = logsig(N,FP)
dA_dN = logsig('dn',N,A,FP)
info = logsig('code')

Description-

logsig is a transfer function. Transfer functions calculate a layer's output from its net input.

A = logsig(N,FP) takes N and optional function parameters,

N S-by-Q matrix of net input (column) vectors

FP Struct of function parameters (ignored)

and returns A, the S-by-Q matrix of N's elements squashed into [0, 1].
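It computes a = 1/(1+exp(-n)). A minimal usage sketch (the input values are assumed for illustration):

n = [-1 0 1];    % sample net inputs
a = logsig(n)    % approximately [0.2689 0.5 0.7311]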



Program-

p = [0 1 2 3 4 5 6 7 8];
t = [0 0.84 0.91 0.14 -0.77 -0.96 -0.28 0.66 0.99];
plot(p,t)

net = newff([0 8],[10 1],{'logsig' 'purelin'},'trainlm');
y1 = sim(net,p)
plot(p,t,p,y1)

net.trainParam.epochs = 50;
net.trainParam.goal = 0.01;
net = train(net,p,t);

y2 = sim(net,p)
plot(p,t,p,y1,p,y2)

output-

Graph 1: Training data t plotted against input p.


Graph 2: Best Training Performance is 0.0017424 at epoch 1 (Mean Squared Error vs. 1 Epoch; Train, Best, Goal).

Graph 3: Training state at epoch 1: Gradient = 0.063164, Mu = 0.0001, Validation Checks = 0.

CONCLUSION- The logsig and purelin functions were studied and the desired output was obtained.