
Transcript of NEURAL NETWORKS, LECTURE 1, dr Zoran Ševarac, FON, 2015.

Page 1:

NEURAL NETWORKS

LECTURE 1

dr Zoran Ševarac, [email protected]

FON, 2015.

Page 2:

AIMS

• Brief introduction to neural networks concepts and architectures

• Capabilities and possible applications

• Java Neural Network Framework Neuroph

• Recognize typical neural network problems and implement neural network based solutions

• Neuroph framework development

Page 3:

BRIEF OVERVIEW

Lectures
1. Basic concepts, architecture and components
2. Multi Layer Perceptron and Backpropagation
3. Problem solving using neural networks
4. Backpropagation improvements

Labs
1. and 2. Neural network framework Neuroph and basic usage for image recognition and OCR
3. and 4. Neuroph framework architecture and extensions

Page 4:

WHAT IS A NEURAL NETWORK

• Brain-inspired mathematical models

• Biological and artificial NN

Page 5:

BIOLOGICAL NEURON

Page 6:

Biological Neural Network

Page 7:

BRAIN AND ANN

• Brain:
– 10^10 neurons
– 10^13 connections
– millisecond signal speed
– fully parallel

• ANN:
– 20 000 neurons
– nanosecond signal speed
– simulated parallel computation

Page 8:

Biological and artificial neuron

• Basic parts: body (soma), dendrites (inputs), axons (outputs), synapses (connections)

Page 9:

Artificial Neuron

$output = f(w_1 in_1 + \dots + w_n in_n)$

Page 10:

Basic parts of artificial neuron

• Input summing function
• Transfer function
• Weighted inputs
• Output
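To make these parts concrete, here is a minimal Java sketch (class and method names are illustrative, not Neuroph's API): the weighted inputs feed a summing function, and the sum is passed through a transfer function to produce the output.

```java
import java.util.function.DoubleUnaryOperator;

// Minimal artificial neuron: weighted inputs -> summing function -> transfer function -> output.
// Illustrative sketch only; names are hypothetical, not Neuroph's API.
public class Neuron {
    private final double[] weights;             // one weight per input
    private final DoubleUnaryOperator transfer; // transfer function f

    public Neuron(double[] weights, DoubleUnaryOperator transfer) {
        this.weights = weights;
        this.transfer = transfer;
    }

    // output = f(w1*in1 + ... + wn*inn)
    public double output(double[] inputs) {
        double sum = 0.0;                       // input summing function
        for (int i = 0; i < weights.length; i++) {
            sum += weights[i] * inputs[i];      // weighted input
        }
        return transfer.applyAsDouble(sum);     // transfer function gives the output
    }
}
```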

Page 11:

TRANSFER FUNCTION

Linear

Step

Sigmoid
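As Java lambdas, the three functions might look as follows (a sketch; placing the step threshold at zero is our choice here):

```java
import java.util.function.DoubleUnaryOperator;

DoubleUnaryOperator linear  = x -> x;                          // identity: output equals net input
DoubleUnaryOperator step    = x -> x >= 0 ? 1.0 : 0.0;         // hard threshold at zero
DoubleUnaryOperator sigmoid = x -> 1.0 / (1.0 + Math.exp(-x)); // smooth, output in (0, 1)
```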

Page 12:

McCulloch-Pitts Neuron (Threshold Logic Unit)

$y = \mathrm{STEP}(w_1 u_1 + \dots + w_n u_n)$
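For example, using the Neuron sketch above with both weights set to 1 and a step threshold of 1.5 (the threshold value is our choice for illustration, not from the slide), the unit computes logical AND:

```java
// McCulloch-Pitts unit wired as an AND gate: fires only when both inputs are 1.
DoubleUnaryOperator stepAt15 = net -> net >= 1.5 ? 1.0 : 0.0;
Neuron andGate = new Neuron(new double[] {1.0, 1.0}, stepAt15);

System.out.println(andGate.output(new double[] {1, 1})); // 1.0
System.out.println(andGate.output(new double[] {0, 1})); // 0.0
```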

Page 13:

NEURAL NETWORK ARCHITECTURES

Page 14:

Neural network training/learning

• Learning procedure: adjusting connection weights until the network exhibits the desired behaviour

• Supervised Learning
• Unsupervised Learning

Page 15:

SUPERVISED LEARNING

Basic principle: iterative error minimization

$w_{ji}(k+1) = w_{ji}(k) + \Delta w_{ji}(k) = w_{ji}(k) + \mu\, E(y_j(k), d_j(k))$

[Diagram: supervised learning loop. Input $x$ feeds the neural network; the actual output $y(x)$ is compared with the desired output $d(x)$ to form the error $e(x) = d(x) - y(x)$, which drives the weight adjustment.]

Page 16:

ADALINE

• Linear transfer function

• Linear combination of inputs: $y = w_1 u_1 + w_2 u_2 + \dots + w_n u_n$

• Learning algorithm: Least Mean Squares (LMS)

[Diagram: ADALINE neuron with inputs $u_1, u_2, u_3$, weights $w_1, w_2, w_3$, and linear output $y$.]

Page 17:

LMS LEARNING

LMS rule equations:

(1) Output neuron error for pattern $p$:

$\varepsilon_p = d_p - y_p$

(2) Weight change according to the error:

$w_{ji}(k+1) = w_{ji}(k) + \mu\, \varepsilon(k)\, u_{ji}(k)$

(3) Total network error over all patterns in the training set (training stops when this error falls below a predefined value):

$E = \frac{1}{2n} \sum_{p=1}^{n} \varepsilon_p^2$
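Putting equations (1)-(3) together for the ADALINE from the previous slide gives a training loop like this sketch (the learning rate and stopping threshold are illustrative choices):

```java
// LMS training for a linear (ADALINE) neuron: y = w·u, w += mu * eps * u.
public static double[] trainLMS(double[][] patterns, double[] desired,
                                double mu, double maxError, int maxIterations) {
    double[] w = new double[patterns[0].length];   // weights start at zero
    for (int iter = 0; iter < maxIterations; iter++) {
        double totalError = 0.0;
        for (int p = 0; p < patterns.length; p++) {
            double y = 0.0;                        // linear output y = w1*u1 + ... + wn*un
            for (int i = 0; i < w.length; i++) y += w[i] * patterns[p][i];
            double eps = desired[p] - y;           // (1) error for pattern p
            for (int i = 0; i < w.length; i++)
                w[i] += mu * eps * patterns[p][i]; // (2) weight change
            totalError += eps * eps;
        }
        totalError /= 2.0 * patterns.length;       // (3) E = (1/2n) * sum of eps^2
        if (totalError < maxError) break;          // stop when the error is small enough
    }
    return w;
}
```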

Page 18:

PERCEPTRON

• Step transfer function

• Perceptron learning: the first algorithm for learning nonlinear systems

• Works only for linearly separable problems

[Diagram: single-layer perceptron with inputs $u_1, u_2$ and output neurons $y_1, y_2, y_3$.]
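Perceptron learning applies the same error-correction idea through the step transfer function; a sketch for a single output neuron (the bias handling and learning rate are illustrative choices):

```java
// Perceptron learning rule for one output neuron: w += mu * (d - y) * u.
// Converges only when the classes are linearly separable, as noted above.
public static double[] trainPerceptron(double[][] u, double[] d,
                                       double mu, int epochs) {
    double[] w = new double[u[0].length + 1];      // last entry is the bias weight
    for (int e = 0; e < epochs; e++) {
        for (int p = 0; p < u.length; p++) {
            double net = w[w.length - 1];          // bias input fixed at 1
            for (int i = 0; i < u[p].length; i++) net += w[i] * u[p][i];
            double y = net >= 0 ? 1.0 : 0.0;       // step transfer function
            double err = d[p] - y;                 // error-correction term
            for (int i = 0; i < u[p].length; i++) w[i] += mu * err * u[p][i];
            w[w.length - 1] += mu * err;
        }
    }
    return w;
}
```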

Page 19:

Multi Layer Perceptron

• Basic perceptron extended with one or more layers of hidden neurons between the input and output layers

• Differentiable neuron transfer functions (tanh, sigmoid)

• Uses the Backpropagation learning algorithm, which is based on the LMS algorithm

• Able to solve complex problems
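In the labs this is done with Neuroph; here is a sketch of training an MLP on XOR, assuming the Neuroph 2.x API (the 2-3-1 layer sizes and the XOR data are illustrative choices):

```java
import org.neuroph.core.data.DataSet;
import org.neuroph.core.data.DataSetRow;
import org.neuroph.nnet.MultiLayerPerceptron;
import org.neuroph.util.TransferFunctionType;

// XOR training set: 2 inputs, 1 output
DataSet xor = new DataSet(2, 1);
xor.addRow(new DataSetRow(new double[] {0, 0}, new double[] {0}));
xor.addRow(new DataSetRow(new double[] {0, 1}, new double[] {1}));
xor.addRow(new DataSetRow(new double[] {1, 0}, new double[] {1}));
xor.addRow(new DataSetRow(new double[] {1, 1}, new double[] {0}));

// MLP with sigmoid transfer functions: 2 input, 3 hidden, 1 output neurons
MultiLayerPerceptron mlp =
        new MultiLayerPerceptron(TransferFunctionType.SIGMOID, 2, 3, 1);
mlp.learn(xor); // trains with the network's backpropagation learning rule
```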

Page 20:

Backpropagation algorithm

• Used for Multi Layer Perceptron training; able to adjust weights in hidden layers

• Supervised learning algorithm based on the LMS algorithm

• A Multi Layer Perceptron trained with Backpropagation is a universal approximator

Page 21:

Backpropagation algorithm

• Formula for adjusting hidden neuron weights (reconstructed below)
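The formula itself appears as an image in the original slide; the standard rule it refers to, written consistently with the LMS notation above, is:

```latex
% Hidden neuron j's error term: downstream deltas, weighted by the connections
% w_{kj} and scaled by the derivative of j's transfer function.
\delta_j = f'(net_j) \sum_k \delta_k \, w_{kj},
\qquad
\Delta w_{ji} = \mu \, \delta_j \, u_i
```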

Page 22:

Backpropagation settings

• Max error
• Max iterations
• Learning rate
• Momentum
• Batch mode
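These settings map onto learning-rule properties; a configuration sketch assuming the Neuroph 2.x API (the values are illustrative):

```java
import org.neuroph.nnet.learning.MomentumBackpropagation;

MomentumBackpropagation bp = new MomentumBackpropagation();
bp.setMaxError(0.01);       // max error: stop once the total error drops below this
bp.setMaxIterations(10000); // max iterations: hard cap on training epochs
bp.setLearningRate(0.2);    // learning rate
bp.setMomentum(0.7);        // momentum
bp.setBatchMode(true);      // batch mode: apply accumulated weight changes per epoch

mlp.setLearningRule(bp);    // mlp and xor as in the earlier XOR sketch
mlp.learn(xor);
```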

Page 23:

Backpropagation algorithm

• Local minimum problem

• Stopping conditions:
– Max iterations
– Max error
– Error stall
– MSE on a specific test set

Page 24:

LINKS AND BOOKS

• http://neuroph.sourceforge.net
• http://ai.fon.bg.ac.rs/osnovne/inteligentni-sistemi/

• Neural Networks - A Systematic Introduction, free online book

• Introduction to Neural Computation