The Hebbian-LMS Algorithm
Transcript of a talk by Bernard Widrow, with Youngsik Kim and Dookun Park (Stanford) · 2015. 1. 15.

  • The Hebbian-LMS Algorithm

    Bernard Widrow, Professor Emeritus, Stanford
    Youngsik Kim, Ph.D. candidate, Stanford
    Dookun Park, Ph.D. candidate, Stanford

  • Figure 1: Adaline, an adaptive linear neuron.

    [Figure: the input pattern vector X_k (components x_{1k} ... x_{nk}) is weighted by w_{1k} ... w_{nk} and summed to give y_k = (SUM)_k. A signum device quantizes the sum to the binary output q_k. The error e_k is the desired response d_k minus the sum.]

    LMS algorithm:
    W_{k+1} = W_k + 2\mu e_k X_k
    e_k = d_k - X_k^T W_k = d_k - (SUM)_k
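
    Not from the talk: a minimal Python sketch of one LMS update for the Adaline of Figure 1 (the function name, the step size mu, and the use of NumPy arrays are illustrative assumptions):

        import numpy as np

        def lms_step(w, x, d, mu=0.01):
            """One LMS update for an Adaline."""
            y = w @ x                  # (SUM)_k, the linear output
            e = d - y                  # error: desired response minus the sum
            w = w + 2 * mu * e * x     # W_{k+1} = W_k + 2*mu*e_k*X_k
            q = np.sign(y)             # signum output q_k
            return w, e, q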

  • Figure 2: The knobby Adaline, the original hardware Adaline whose weights were set with hand-turned knobs.

  • Figure 3: Adaline with bootstrap learning.

    [Figure: the same Adaline as in Figure 1, but with no external desired response. The quantized output is fed back as the desired response, d_k = q_k = sgn((SUM)_k), and the error is e_k = q_k - (SUM)_k.]
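
    Not from the talk: the same LMS step with the desired response bootstrapped from the quantized output, as in Figure 3 (w and x are NumPy arrays; names and step size are illustrative):

        def bootstrap_lms_step(w, x, mu=0.01):
            """Bootstrap learning: the Adaline supplies its own desired response, d_k = q_k."""
            s = w @ x                        # (SUM)_k
            q = 1.0 if s >= 0 else -1.0      # signum output, reused as the desired response
            e = q - s                        # error drives (SUM)_k toward +1 or -1
            w = w + 2 * mu * e * x
            return w, e, q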

  • Figure 4-(a): Bootstrap learning. The quantized output, the sum, and the error versus (SUM).

    [Plot: the quantized output q_k = Signum(SUM), the sum (a line of slope 1), and the error, all plotted against (SUM). The error is the vertical distance between q_k and the slope-1 line; it vanishes at (SUM) = +1 and (SUM) = -1, the positive and negative stable equilibrium points, while (SUM) = 0 is an unstable equilibrium point.]

  • Figure 4-(b): Bootstrap learning. The error versus (SUM).

    [Plot: the error alone, Signum(SUM) - (SUM), showing the positive and negative stable equilibrium points at (SUM) = +1 and -1 and the unstable equilibrium point at (SUM) = 0.]

  • Figure 5: Decision-directed learning for channel equalization.

    [Figure: a binary data stream from the transmitter is sent as data pulses through a telephone channel. The received data stream enters a tapped delay line (unit delays z^{-1}); the taps are weighted by w_{1k} ... w_{nk} and summed. A signum device gives the decision q_k, which is strobed through a gate and fed back as the desired response, d_k = q_k. The error e_k = d_k - y_k adapts the weights.]
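
    Not from the talk: a sketch of the decision-directed equalizer of Figure 5, assuming binary ±1 signalling; the tap count, step size, and initialization are illustrative:

        import numpy as np

        def decision_directed_equalizer(received, n_taps=11, mu=0.005):
            """Adapt a tapped-delay-line equalizer using its own decisions as the desired response."""
            w = np.zeros(n_taps)
            w[n_taps // 2] = 1.0                     # start near a pass-through equalizer
            decisions = []
            for k in range(n_taps, len(received)):
                x = received[k - n_taps:k][::-1]     # tapped delay line contents
                y = w @ x                            # equalizer output, (SUM)_k
                q = 1.0 if y >= 0 else -1.0          # decision q_k (signum)
                e = q - y                            # decision-directed error, d_k = q_k
                w = w + 2 * mu * e * x               # LMS update of the tap weights
                decisions.append(q)
            return w, decisions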

  • Figure 6: Eye patterns produced by overlaying cycles of the received waveform.

    (a) Before equalization. (b) After equalization.


  • Figure 7: A sigmoidal neuron trained with bootstrap learning.

    [Figure: the input vector X_k is weighted and summed to (Sum)_k, which passes through a SIGMOID to give (Out)_k. The error ε_k is formed as the difference between (Out)_k and a scaled version of (Sum)_k (see Figure 8).]

  • Figure 8-(a): The error of a sigmoidal neuron with bootstrap learning.

    [Plot: the sigmoid (OUT) = SGM(SUM), saturating at +1 and -1, together with a straight line of slope γ through the origin, plotted against (SUM). The error is the difference between the sigmoid and the line; it is zero at the origin, an unstable equilibrium point, and at the two crossings of sigmoid and line, the positive and negative stable equilibrium points.]

  • Figure 8-(b): The error function.

    [Plot: the error ε = SGM(SUM) - γ(SUM) versus (SUM), with the unstable equilibrium point at the origin and the positive and negative stable equilibrium points at its other zero crossings.]
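
    Not from the talk: the sigmoidal bootstrap update implied by Figures 7 and 8, assuming a tanh-shaped sigmoid and writing the slope of the straight line as gamma (both assumptions on my part):

        import numpy as np

        def sigmoid_bootstrap_step(w, x, mu=0.01, gamma=0.5):
            """Bootstrap update for a sigmoidal neuron: error = SGM(SUM) - gamma*(SUM)."""
            s = w @ x                   # (SUM)_k
            out = np.tanh(s)            # (OUT)_k, a sigmoid saturating at +/-1
            e = out - gamma * s         # zero at the origin (unstable) and at two stable crossings
            w = w + 2 * mu * e * x
            return w, out, e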

  • Figure 9: A post-synaptic neuron with excitatory and inhibitory inputs and all positive weights, trained with LMS bootstrap learning. All outputs are positive after rectification.

    [Figure: all inputs and all weights are positive. Excitatory inputs add to (SUM) and inhibitory inputs subtract from it. The error is formed from the SIGMOID of (SUM), while the neuron's OUTPUT is taken from a HALF SIGMOID, the sigmoid rectified to be non-negative.]

  • Figure 10-(a): The error of the sigmoidal neuron with rectified output, trained with bootstrap learning.

    [Plot: (OUT) is the full sigmoid, saturating at +1 and -1; the rectified output satisfies (OUT') = 0 for (SUM) < 0 and (OUT) = (OUT') for (SUM) > 0 (the half sigmoid). The error is again the difference between the full sigmoid and a line of slope γ through the origin, giving an unstable equilibrium point at the origin and positive and negative stable equilibrium points at the crossings.]

  • Figure 10-(b): The error function.

    [Plot: the error versus (SUM), with the unstable equilibrium point at the origin and the positive and negative stable equilibrium points at its other zero crossings.]
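
    Not from the talk: a sketch of the rectified, all-positive neuron of Figures 9 and 10. The sign vector marking excitatory/inhibitory synapses, the clipping of weights at zero, the tanh sigmoid, and gamma are assumptions used to fill in what the figures only show graphically:

        import numpy as np

        def hebbian_lms_step(w, x, signs, mu=0.01, gamma=0.5):
            """Hebbian-LMS update for a neuron with all-positive inputs and weights.

            w, x  : positive weights and positive inputs
            signs : +1 for excitatory synapses, -1 for inhibitory synapses
            """
            s = np.sum(signs * w * x)          # (SUM): excitatory minus inhibitory contributions
            full = np.tanh(s)                  # full sigmoid, used to form the error
            out = max(full, 0.0)               # half sigmoid: the rectified OUTPUT
            e = full - gamma * s               # the error function of Figure 10-(b)
            w = np.maximum(w + 2 * mu * e * signs * x, 0.0)   # update, keeping weights positive
            return w, out, e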

  • Figure 11-(a): Hebbian-LMS learning process. Initial condition.

    [Plot of (Out) versus (Sum).]

  • Figure 11-(b): Hebbian-LMS learning process. After 100 iterations.

    [Plot of (Out) versus (Sum).]

  • Figure 11-(c): Hebbian-LMS learning process. After 2000 iterations.

    [Plot of (Out) versus (Sum).]

  • Figure 11-(d): Hebbian-LMS learning process. After 5000 iterations.

    [Plot of (Out) versus (Sum).]

  • Figure 11-(e): Hebbian-LMS learning curve.

    [Plot: MSE versus iteration, over 0 to 5000 iterations; the MSE axis runs from 0 to 0.2.]
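
    Not from the talk: a learning curve of this kind can be reproduced in outline by driving the hebbian_lms_step sketch above with a fixed set of patterns and recording the squared error; the pattern count, dimensions, and random data below are illustrative only:

        import numpy as np

        rng = np.random.default_rng(0)
        n_inputs, n_patterns, n_iters = 50, 50, 5000
        X = rng.random((n_patterns, n_inputs))                              # all-positive input patterns
        signs = np.where(np.arange(n_inputs) < n_inputs // 2, 1.0, -1.0)    # half excitatory, half inhibitory
        w = 0.1 * rng.random(n_inputs)                                      # small positive initial weights

        sq_error = []
        for it in range(n_iters):
            k = it % n_patterns                                             # present the patterns cyclically
            w, out, e = hebbian_lms_step(w, X[k], signs)
            sq_error.append(e ** 2)                                         # smoothing this gives an MSE learning curve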

  • Figure 12: An example of a layered neural network.

    [Figure: input vectors feed the first-layer neurons through connections (synapses); the first-layer output feeds the second-layer neurons through a second set of synapses, the second-layer output feeds the third-layer neurons through a third set, and the third-layer output forms the network outputs.]
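
    Not from the talk, and only one reading of how Figure 12 connects to the algorithm: if every neuron in such a layered network adapts itself with the Hebbian-LMS step sketched earlier, a whole layer can be advanced without any external desired response (layer sizes and the shared sign vector are assumptions):

        import numpy as np

        def layer_forward_and_adapt(W, signs, x, mu=0.01, gamma=0.5):
            """One unsupervised pass through a layer of Hebbian-LMS neurons.

            W     : (n_neurons, n_inputs) matrix of positive weights
            signs : +1/-1 vector marking excitatory/inhibitory inputs, shared by the layer
            x     : all-positive input vector (for example, the previous layer's outputs)
            """
            outputs = np.empty(W.shape[0])
            for i in range(W.shape[0]):                      # each neuron adapts independently
                W[i], outputs[i], _ = hebbian_lms_step(W[i], x, signs, mu, gamma)
            return W, outputs                                # outputs feed the next layer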

  • Figure 13: A general form of Hebbian-LMS.

    [Figure: the same all-positive-input, all-positive-weight neuron as in Figure 9, with excitatory and inhibitory synapses and a HALF SIGMOID output, but with the error produced by a general ERROR FUNCTION, ε = f(SUM).]
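
    Not from the talk: the general form of Figure 13 fixes the structure and leaves the error function open; a sketch with the error function passed in as a callable (the default below is the sigmoid-minus-line error of Figures 8 and 10, one possibility among many):

        import numpy as np

        def general_hebbian_lms_step(w, x, signs, f=None, mu=0.01, gamma=0.5):
            """General Hebbian-LMS: the error is epsilon = f(SUM) for a chosen error function f."""
            if f is None:
                f = lambda s: np.tanh(s) - gamma * s         # the error used in Figures 8 and 10
            s = np.sum(signs * w * x)                        # (SUM)
            e = f(s)                                         # epsilon = f(SUM)
            out = max(np.tanh(s), 0.0)                       # half-sigmoid OUTPUT
            w = np.maximum(w + 2 * mu * e * signs * x, 0.0)  # keep the weights positive
            return w, out, e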

  • Figure 14: A synapse corresponding to a variable weight.

    [Figure: (a) a synapse, with neurotransmitter molecules crossing the synaptic cleft from the presynaptic neuron to receptors on the postsynaptic neuron; (b) its circuit analog, a variable weight between the synaptic input signal and the synaptic output signal.]

  • Figure 15: A neuron, dendrite, and synapse.

    [Figure: synapses deliver neurotransmitter to the dendrites across the membrane; the soma (cell body and nucleus) forms the (SUM), which drives a threshold (T) and pulse generator (PG) whose output travels down the axon.]

  • Figure 16: Postulates of synaptic plasticity. (A compact restatement in code form follows this list.)

    • When the pre-synaptic neuron is not firing, there will be no neurotransmitter in the gap and no weight change. This applies to both excitatory and inhibitory synapses.

    • When the pre-synaptic neuron is firing and the post-synaptic neuron is also firing, there will be neurotransmitter in the gap and the post-synaptic membrane voltage will be positive, since the (SUM) is positive; the number of neuroreceptors will gradually increase, increasing the weight. This applies to excitatory synapses.

    • When the pre-synaptic neuron is firing and the post-synaptic neuron is not firing, there will be neurotransmitter in the gap and the post-synaptic membrane voltage will be negative, since the (SUM) is negative; the number of neuroreceptors will gradually decrease, decreasing the weight. This applies to excitatory synapses.

    • The opposite of these rules applies to inhibitory synapses.
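
    Not from the talk: a purely qualitative restatement of the postulates, returning only the sign of the weight change:

        def weight_change_sign(pre_firing, post_sum_positive, excitatory):
            """Sign of the weight change implied by the postulates of Figure 16."""
            if not pre_firing:
                return 0                                  # no neurotransmitter in the gap: no change
            change = +1 if post_sum_positive else -1      # rule for an excitatory synapse
            return change if excitatory else -change      # inhibitory synapses follow the opposite rule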

  • Figure 17: A linear error function.

    [Plot: the error ε as a linear function of y_k = (SUM).]

  • Figure 18: Hebbian-LMS with a linear error function.

    [Figure: the same all-positive-input, all-positive-weight neuron with excitatory and inhibitory synapses and a HALF SIGMOID output as in Figure 13, with the error taken as the linear function of (SUM) shown in Figure 17.]