Adaptive Hopfield Network
Dr. Gürsel Serpen, Associate Professor
Electrical Engineering and Computer Science Department, University of Toledo, Toledo, Ohio, USA
Presentation Topics
• Motivation for research
• Classical Hopfield network (HN)
• Adaptation – gradient descent
• Adaptive Hopfield network (AHN)
• Static optimization with AHN
• Results and conclusions

FOR MORE INFO...
Serpen et al., upcoming journal article (Inshallah!)
http://www.eecs.utoledo.edu/~serpen
Motivation
• The classical Hopfield neural network (HN) has been shown to have the potential to address a very large spectrum of static optimization problems.
• The classical HN is NOT trainable, which implies that it canNOT learn from prior search attempts.
• A hardware realization of the Hopfield network is very attractive for real-time, embedded computing environments.
• Is there a way (e.g., training or adaptation) to incorporate the experience gained through prior search attempts into the network dynamics (weights), so that the network can focus on promising regions of the overall search space?
Research Goals
• Propose gradient-descent based procedures to "adapt" the weights and constraint weighting coefficients of the HN.
• Develop an indirect procedure to define "pseudo" values for desired neuron outputs (much like the way desired output values are defined for hidden-layer neurons in an MLP).
• Develop space-efficient schemes to store the symmetric weight matrix (upper/lower triangular) for large-scale problem instances.
• Apply (through simulation) the adaptive HN algorithm to large-scale static optimization problems.
Classical Hopfield Net Dynamics

Neuron dynamics (K neurons, sigmoid activation function f):

\frac{du_i(t)}{dt} = -u_i(t) + \sum_{j=1}^{K} w_{ij} z_j(t) + b_i, \qquad z_i = f(u_i), \qquad i = 1, 2, \ldots, K,

with symmetric interconnection weights, w_{ij} = w_{ji}.

[Figure: the k-th node combines the outputs z_1, z_2, ..., z_K through the weights w_k1, w_k2, ..., w_kK and produces the output z_k.]
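The neuron dynamics can be simulated with a simple Euler integration. A minimal NumPy sketch, assuming a logistic sigmoid for f; the step size, iteration limit, and tolerance are illustrative choices, not values from the presentation:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def relax_hopfield(W, b, z0, dt=0.05, steps=10000, tol=1e-7):
    """Euler-integrate du_i/dt = -u_i + sum_j w_ij z_j + b_i with z_i = f(u_i)
    until the outputs stop changing (a fixed point of the network)."""
    u = np.log(z0 / (1.0 - z0))      # pre-activations consistent with z0
    z = z0.copy()
    for _ in range(steps):
        u += dt * (-u + W @ z + b)   # the neuron dynamics above
        z_new = sigmoid(u)
        if np.max(np.abs(z_new - z)) < tol:
            return z_new
        z = z_new
    return z
```

At a fixed point the outputs satisfy z = f(Wz + b), which gives a direct numerical check of convergence.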
Weights (Interconnection) – Redefined

Liapunov function (generic):

E(\mathbf{z}) = -\frac{1}{2}\sum_{i=1}^{K}\sum_{j=1}^{K} w_{ij} z_i z_j + \sum_{i=1}^{K}\int_{0}^{z_i} f^{-1}(z)\,dz - \sum_{i=1}^{K} b_i z_i

Liapunov function (decomposed over S constraint terms with weighting coefficients g_s):

E(\mathbf{z}) = -\frac{1}{2}\sum_{s=1}^{S} g_s \sum_{i=1}^{K}\sum_{j=1}^{K} d_{ij}^{s} z_i z_j - \sum_{s=1}^{S} g_s \sum_{i=1}^{K} b_i^{s} z_i

Weights defined:

w_{ij} = \sum_{s=1}^{S} g_s d_{ij}^{s}, \qquad b_i = \sum_{s=1}^{S} g_s b_i^{s}
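The decomposition of the weights over constraint terms translates directly into code. A small NumPy sketch with a hypothetical instance (the random d^s and b^s values stand in for coefficients that the actual problem mapping would supply):

```python
import numpy as np

# Hypothetical instance: S = 2 constraint terms, K = 3 neurons.
K, S = 3, 2
rng = np.random.default_rng(0)

# d[s] and c[s] play the roles of d_ij^s and b_i^s; symmetrize each d[s]
# so that the assembled weight matrix satisfies w_ij = w_ji.
d = []
for _ in range(S):
    m = rng.standard_normal((K, K))
    d.append((m + m.T) / 2.0)
c = [rng.standard_normal(K) for _ in range(S)]

g = np.array([1.0, 0.5])   # constraint weighting coefficients g_s

# w_ij = sum_s g_s d_ij^s   and   b_i = sum_s g_s b_i^s
W = sum(g[s] * d[s] for s in range(S))
b = sum(g[s] * c[s] for s in range(S))
```

Because each d^s is symmetric, any choice of the coefficients g_s preserves the symmetry the Hopfield dynamics require.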
Adaptive Hopfield Net – Block Diagram

[Block diagram: initial neuron outputs z(t_0), initial weight values W(t_0), and initial constraint weighting coefficient values g(t_0) feed the (classical) Hopfield Network. Its converged outputs z(t_k) drive two blocks: the Adjoint Hopfield Network, whose outputs z*(t_k) feed the "Adapt Weights" block to produce W(t_{k+1}), and the "Adapt Constraint Weighting Coefficients" block, which updates g(t_k).]
Adaptive Hopfield Net – PseudoCode

Initialization
• Initialize network constraint weighting coefficients.
• Initialize weights.
• Initialize Hopfield net neuron outputs (randomly).

Adaptive Search
Relaxation
• Relax Hopfield dynamics until convergence to a fixed point.
Adaptation
• Relax Adjoint network until convergence to a fixed point.
• Update weights.
• Update constraint weighting coefficients.

Termination Criteria
• If not satisfied, continue with Adaptive Search.
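The pseudocode's outer loop can be sketched as a short NumPy program. This is a simplified illustration, not the authors' implementation: the adjoint-network weight adaptation is elided, the constraint energies E_s are taken as simple quadratic forms, and all rates and iteration counts are assumed values:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def relax(W, b, z, dt=0.05, steps=2000):
    """Relaxation phase: Euler-integrate the Hopfield dynamics."""
    u = np.log(z / (1.0 - z))            # pre-activations consistent with z
    for _ in range(steps):
        u += dt * (-u + W @ z + b)
        z = sigmoid(u)
    return z

def adaptive_hopfield(d, c, g, z, eta=0.01, epochs=5):
    """Adaptive search loop: assemble weights from coefficients, relax,
    then adapt the coefficients by gradient descent."""
    g = np.asarray(g, dtype=float)
    for _ in range(epochs):
        W = sum(gs * ds for gs, ds in zip(g, d))   # w_ij = sum_s g_s d_ij^s
        b = sum(gs * cs for gs, cs in zip(g, c))
        z = relax(W, b, z)                         # Relaxation
        # Adaptation of coefficients: dE/dg_s = E_s(z), here with the
        # quadratic constraint energies E_s = -0.5 z'd^s z - c^s . z
        E = np.array([-0.5 * z @ ds @ z - cs @ z for ds, cs in zip(d, c)])
        g = g - eta * E
    return g, z
```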
Hopfield Network Relaxation

[Figure: the Hopfield network during relaxation; the k-th node combines z_1, z_2, ..., z_K through the weights w_k1, w_k2, ..., w_kK and produces z_k.]
Adaptation of Weights – Adjoint Hopfield Network

Adjoint network dynamics:

\frac{dz_i^*(t)}{dt} = -z_i^*(t) + \sum_{j=1}^{K} f'(u_j)\, w_{ji}\, z_j^*(t) + e_i, \qquad i = 1, 2, \ldots, K,

where e_i = z_i^d - z_i is the error between the "pseudo" desired output z_i^d and the actual output z_i.

[Figure: the k-th adjoint node combines z_1^*, z_2^*, ..., z_K^* through the gains f'(u_1)w_1k, f'(u_2)w_2k, ..., f'(u_K)w_Kk, together with the error input e_k, and produces z_k^*.]
Adaptation of Weights – Recurrent BackProp

Weight update (recurrent backprop): gradient descent on the error, \Delta \mathbf{W} \propto -\partial E / \partial \mathbf{W}, which evaluates to

\Delta w_{ij} = \eta\, f'(u_i)\, z_i^*\, z_j, \qquad w_{ij} = w_{ji}, \qquad i, j = 1, 2, \ldots, K.
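The adjoint relaxation and the recurrent-backprop weight update can be sketched together. A minimal NumPy illustration, assuming a logistic sigmoid; the learning rate, step size, and iteration count are assumed values, and the update is explicitly symmetrized to preserve w_ij = w_ji:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def adjoint_relax(W, u, e, dt=0.05, steps=4000):
    """Euler-integrate the adjoint dynamics
    dz*_i/dt = -z*_i + sum_j f'(u_j) w_ji z*_j + e_i to a fixed point."""
    fp = sigmoid(u) * (1.0 - sigmoid(u))       # f'(u) for the logistic sigmoid
    zs = np.zeros_like(u)                      # z* initialized at zero
    for _ in range(steps):
        zs += dt * (-zs + W.T @ (fp * zs) + e)
    return zs

def weight_update(W, u, z, zs, eta=0.01):
    """Recurrent-backprop update  delta w_ij = eta * f'(u_i) z*_i z_j,
    symmetrized so the weight matrix stays symmetric."""
    fp = sigmoid(u) * (1.0 - sigmoid(u))
    dW = eta * np.outer(fp * zs, z)
    return W + (dW + dW.T) / 2.0
```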
Adaptation – Constraint Weighting Coefficients

Error function – problem specific and redefined:

E(\mathbf{z}, t_k) = g_1 E_1(\mathbf{z}, t_k) + g_2 E_2(\mathbf{z}, t_k) + \cdots + g_S E_S(\mathbf{z}, t_k)

Gradient descent adaptation rule:

g_s(t_{k+1}) = g_s(t_k) - \eta\, \frac{\partial E(\mathbf{z}, t_k)}{\partial g_s(t_k)}

Partial derivative – readily computable:

\frac{\partial E(\mathbf{z}, t_k)}{\partial g_s(t_k)} = E_s(\mathbf{z}, t_k)

Final form of the coefficient update rule:

g_s(t_{k+1}) = g_s(t_k) - \eta\, E_s(\mathbf{z}, t_k)
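Because the error function is linear in the coefficients, the update is a one-liner; a short sketch with an illustrative worked example (the numbers are hypothetical, not from the study):

```python
import numpy as np

def update_coefficients(g, E_terms, eta):
    """Coefficient update g_s(t_{k+1}) = g_s(t_k) - eta * E_s(z, t_k);
    valid because dE/dg_s = E_s(z, t_k) when E = sum_s g_s E_s."""
    return np.asarray(g, dtype=float) - eta * np.asarray(E_terms, dtype=float)

# Worked example with illustrative numbers: eta = 0.1,
# current coefficients g = [1.0, 0.5], constraint energies E_s = [2.0, -1.0].
g_next = update_coefficients([1.0, 0.5], [2.0, -1.0], eta=0.1)
# g_next = [1.0 - 0.1 * 2.0, 0.5 - 0.1 * (-1.0)] = [0.8, 0.6]
```

Note that a constraint term with a large residual energy E_s has its coefficient reduced, shifting the search balance among the competing constraints.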
Mapping a Static Optimization Problem

Example constraint term – a row constraint over an N × N array of neuron outputs z_qr (as in assignment-type mappings such as the traveling salesman problem):

E_{row}(\mathbf{z}) = \frac{g_{row}}{2} \sum_{q=1}^{N} \left( \sum_{n=1}^{N} z_{qn} - 1 \right)^2

Error function over the N × N neurons:

E = \frac{1}{2} \sum_{i=1}^{N \times N} e_i^2

Generic partial:

\frac{\partial E}{\partial w_{kl}} = \sum_{i=1}^{N \times N} e_i\, \frac{\partial z_i}{\partial w_{kl}} = \sum_{q=1}^{N} \sum_{r=1}^{N} e_{qr}\, \frac{\partial z_{qr}}{\partial w_{kl}}

Problem-specific partial:

\frac{\partial E_{row}}{\partial w_{kl}} = g_{row} \sum_{q=1}^{N} \sum_{r=1}^{N} \left( \sum_{n=1}^{N} z_{qn} - 1 \right) \frac{\partial z_{qr}}{\partial w_{kl}},

so the problem-specific error term is e_{qr} = g_{row} \left( \sum_{n=1}^{N} z_{qn} - 1 \right).
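The row-constraint energy and its error terms are easy to check numerically. A small NumPy sketch (function names are illustrative, and the constraint form is the row constraint penalizing rows whose outputs do not sum to one):

```python
import numpy as np

def row_constraint_energy(Z, g_row):
    """E_row = (g_row / 2) * sum_q (sum_n z_qn - 1)^2 for an N x N
    output array Z."""
    return 0.5 * g_row * np.sum((Z.sum(axis=1) - 1.0) ** 2)

def row_error_terms(Z, g_row):
    """e_qr = g_row * (sum_n z_qn - 1): the problem-specific error terms,
    constant across each row, that feed the generic partial dE/dw_kl."""
    row_residual = Z.sum(axis=1, keepdims=True) - 1.0   # shape (N, 1)
    return g_row * row_residual * np.ones_like(Z)       # broadcast to (N, N)
```

A valid permutation matrix (each row summing to one) yields zero energy and zero error terms, so this constraint contributes no gradient at feasible solutions.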
Simulation Study
• Traveling Salesman Problem; a preliminary work at this time.
• Instances of up to 100 cities simulated.
• Computing resources – Ohio Supercomputing Center.
• Preliminary findings suggest that the theoretical framework is sound and the projections are valid.
• Computational cost (weight matrix size) poses a significant challenge for simulation purposes – an ongoing research effort.
• Currently in progress.
Conclusions
An adaptation mechanism, which modifies constraint weighting coefficient parameter values and weights of the classical Hopfield network, was proposed.
A mathematical characterization of the adaptive Hopfield network was presented.
Preliminary simulation results suggest that the proposed adaptation mechanism is effective in guiding the Hopfield network towards high-quality feasible solutions of large-scale static optimization problems.
We are also exploring the incorporation of a computationally viable stochastic search mechanism to further improve the quality of solutions computed by the adaptive Hopfield network while preserving its parallel computation capability.
Thank You!
Questions?

We gratefully acknowledge the computing resources grant provided by the Ohio Supercomputing Center (USA) in facilitating the simulation study.
We appreciate the support provided by the Kohler Internationalization Awards Program at the University of Toledo to facilitate this conference presentation.