



Parallel Artificial Neural Networks

Ian Wesley-Smith
Frameworks Division
Center for Computation and Technology
Louisiana State University
http://cct.lsu.edu/~iwsmith


Basics of ANNs

• Vague model of a biological neural network
• No authoritative definition for ANNs
  – A group of small computational components (neurons) networked together


Strengths of ANNs

• Inherently Non-Linear
• Learn
  – Supervised
  – Input-Output Mapping
• Simple enough to implement in hardware
• Good at:
  – Optimization problems (traveling salesman)
  – Pattern classification (handwriting analysis)


Examples of ANNs

• Digital Signal Processing (DSP)
• Optical Character Recognition (OCR)
• Sales Forecasting
• Industrial Process Control
• SONAR/RADAR
• Medical Assessment
• Games


Examples of ANNs

• Robot Army


Components of Neurons

• Inputs– Vector

• Weights– Matrix [inputs x outputs]

• Output (activation) Function– Threshold– Sigmoid
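
To make these components concrete, here is a minimal sketch in C (added for illustration; the layer sizes, names, and example values are assumptions, not code from the slides): a weight matrix of [inputs x outputs], with an output (activation) function applied to each neuron's weighted sum.

    #include <math.h>
    #include <stdio.h>

    #define N_IN  3                          /* illustrative number of inputs  */
    #define N_OUT 2                          /* illustrative number of outputs */

    /* A layer of neurons: weight matrix [inputs x outputs] plus an
       output (activation) function shared by every neuron.          */
    typedef struct {
        double weights[N_IN][N_OUT];
        double (*activation)(double);
    } Layer;

    /* Sigmoid output function, one of the choices listed above. */
    static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

    /* Forward pass: weighted sum of the inputs for each output neuron,
       then the activation function.                                    */
    static void forward(const Layer *l, const double in[N_IN], double out[N_OUT]) {
        for (int j = 0; j < N_OUT; ++j) {
            double sum = 0.0;
            for (int i = 0; i < N_IN; ++i)
                sum += in[i] * l->weights[i][j];
            out[j] = l->activation(sum);
        }
    }

    int main(void) {
        Layer layer = { .weights    = { {0.1, -0.2}, {0.4, 0.3}, {-0.5, 0.2} },
                        .activation = sigmoid };
        double in[N_IN] = { 1.0, 0.5, -1.0 };
        double out[N_OUT];
        forward(&layer, in, out);
        printf("outputs: %g %g\n", out[0], out[1]);
        return 0;
    }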


Example Neuron

[Figure: a single neuron, with inputs x1, x2, ..., xn, their weights, and the output function f(x)]


Perceptrons

• Simplest ANN
• Single Layer
• Single Neuron
  – Can be more
• Simple pattern classifiers
  – Can only classify linearly separable sets
• Learning with the delta-rule
• Output function is a threshold


Sample Data


Perceptrons

• Computation

[Figure: the single-neuron diagram again, with inputs x1, x2, ..., xn, weights, and output function f(x)]

w = weight column vector, i = input vector; the perceptron computes f(i · w)
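
A minimal sketch of this computation in C (the input count and the example values are made up for illustration): the perceptron output is the threshold output function applied to the dot product of the input vector i and the weight column vector w.

    #include <stdio.h>

    #define N 3                                   /* illustrative number of inputs */

    /* Threshold output function: fires (1) when the weighted sum is >= 0. */
    static double threshold(double x) { return x >= 0.0 ? 1.0 : 0.0; }

    /* Perceptron computation: f(i . w). */
    static double perceptron(const double i[N], const double w[N]) {
        double sum = 0.0;
        for (int k = 0; k < N; ++k)
            sum += i[k] * w[k];                   /* dot product i . w */
        return threshold(sum);
    }

    int main(void) {
        double w[N] = { 0.5, -0.25, 1.0 };        /* example weight column vector */
        double i[N] = { 1.0,  2.0,  0.0 };        /* example input vector         */
        printf("output = %g\n", perceptron(i, w));
        return 0;
    }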


Perceptrons

• Delta-Learning

n = index, w = weight column vector, i = input vector, z = learning constant, e = error
error = desired response − actual response

w_{n+1} = w_n + z · e · i_n
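
As a rough illustration of this update rule, the following C sketch trains a single threshold perceptron on a logical-AND data set; the data set, the learning constant z = 0.1, and the bias weight (not shown in the slide's formula, but needed for this particular data set) are assumptions added for the example.

    #include <stdio.h>

    #define N      2      /* inputs per pattern */
    #define EPOCHS 20     /* training passes    */

    /* Threshold output function. */
    static double threshold(double x) { return x >= 0.0 ? 1.0 : 0.0; }

    int main(void) {
        /* Linearly separable training set: logical AND. */
        double in[4][N]   = { {0,0}, {0,1}, {1,0}, {1,1} };
        double desired[4] = {  0,     0,     0,     1   };

        double w[N] = { 0.0, 0.0 };   /* weight column vector           */
        double bias = 0.0;            /* bias weight (assumed addition) */
        double z    = 0.1;            /* learning constant              */

        for (int epoch = 0; epoch < EPOCHS; ++epoch) {
            for (int n = 0; n < 4; ++n) {
                double sum = bias;
                for (int k = 0; k < N; ++k)
                    sum += in[n][k] * w[k];
                double e = desired[n] - threshold(sum);   /* error = desired - actual */
                for (int k = 0; k < N; ++k)
                    w[k] += z * e * in[n][k];             /* w_{n+1} = w_n + z*e*i_n  */
                bias += z * e;
            }
        }
        printf("learned weights: w = (%g, %g), bias = %g\n", w[0], w[1], bias);
        return 0;
    }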


Methodology

• Implicit parallelism
  – Neurons are independent of one another
  – Calculations are relatively simple
  – Process large data sets faster

• Implementation
  – Serial to parallel implementation
  – PETSc: Portable, Extensible Toolkit for Scientific Computation (see the sketch below)

• Run Details
  – AMD dual Opteron
  – 2-processor run
  – Varying-sized data sets (100 to 10 million)
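
The slides do not include the PETSc code itself, so the following is only a rough sketch, under the assumption that the input and weight vectors are distributed across processes as PETSc vectors; VecDot then performs the parallel dot product (the problem size and the placeholder values are illustrative).

    #include <petscvec.h>

    int main(int argc, char **argv) {
        Vec         in, w;             /* distributed input and weight vectors */
        PetscScalar sum;
        PetscInt    n = 1000000;       /* illustrative problem size            */

        PetscInitialize(&argc, &argv, NULL, NULL);

        /* Create the input vector, letting PETSc split it across processes. */
        VecCreate(PETSC_COMM_WORLD, &in);
        VecSetSizes(in, PETSC_DECIDE, n);
        VecSetFromOptions(in);
        VecDuplicate(in, &w);          /* weight vector with the same layout   */

        VecSet(in, 1.0);               /* placeholder input data               */
        VecSet(w, 0.5);                /* placeholder weights                  */

        /* Parallel dot product i . w: each rank computes its local part and
           PETSc reduces the partial sums over MPI.                            */
        VecDot(in, w, &sum);
        PetscPrintf(PETSC_COMM_WORLD, "weighted sum = %g\n", (double)PetscRealPart(sum));

        VecDestroy(&in);
        VecDestroy(&w);
        PetscFinalize();
        return 0;
    }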


Results

• Parallel ANN was functional
• Parallel implementation performs slower than serial
  – This is expected
• Possible Reasons
  – Single Neuron Problem
  – PETSc/MPI Overhead


Future Work

• Implement more advanced (recurrent) networks

• Hand-code MPI instead of relying on PETSc (a rough sketch follows below)

• Test in larger environments
  – 32 processors minimum
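
For comparison (not from the slides), a hand-coded MPI version of the same distributed weighted sum might look roughly like this: each rank computes a partial dot product over its local slice of the data, and MPI_Allreduce combines the partial sums.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, size;
        const int n_local = 250000;             /* illustrative slice size per rank */
        double local_sum = 0.0, global_sum = 0.0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Each rank holds its own slice of the input and weight vectors;
           placeholder values stand in for real data.                      */
        for (int k = 0; k < n_local; ++k) {
            double input = 1.0, weight = 0.5;
            local_sum += input * weight;
        }

        /* Combine the partial sums from all ranks into the full dot product. */
        MPI_Allreduce(&local_sum, &global_sum, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

        if (rank == 0)
            printf("weighted sum = %g (over %d ranks)\n", global_sum, size);

        MPI_Finalize();
        return 0;
    }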


Acknowledgments

• Yaakoub Y. El-Khamra
• Dr. Gabrielle Allen
• Dr. Ed Seidel
• Kathy Traxler
• Louisiana State University
  – Center for Computation and Technology
  – Computer Science Department