Neural Networks - Lecture 03.ppt


Neural Networks by Muhammad Amjad


  • TIME DELAY NEURAL NETWORKS: Introduction

    Time delay neural networks (TDNN) have the ability to represent relationships between events in time

    The neuron is fed not only the current input but also a number of previous inputs

    In this way, a TDNN neuron can relate and compare the current input to the past history of events
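The delay-line input described above can be sketched as follows. This is a minimal illustration, not from the original slides; the function name `delay_line` and the array shapes are my own choices for the example.

```python
import numpy as np

def delay_line(x, n_delays):
    """Stack each time step with its n_delays predecessors.

    x: array of shape (T, d) -- T time steps, d features per step.
    Returns array of shape (T - n_delays, (n_delays + 1) * d): each row
    holds the current input vector followed by the previous ones, which
    is what a TDNN neuron sees at each step.
    """
    T = len(x)
    windows = [np.concatenate(x[t - n_delays:t + 1][::-1], axis=0)
               for t in range(n_delays, T)]
    return np.stack(windows)

# Example: 6 time steps of 2-dimensional inputs, 2 delays per neuron.
x = np.arange(12, dtype=float).reshape(6, 2)
print(delay_line(x, 2).shape)  # (4, 6)
```

Each output row feeds one time-shifted copy of the neuron, so the same weights can be applied to every window.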

  • TIME DELAY NEURAL NETWORKS: Architecture

    (Figure slides: TDNN architecture diagrams, not reproduced in this transcript)

  • Training of Weights

    Since the time-shifted copies of the TDNN neurons are mere duplicates

    and are supposed to look for the same event in the input pattern,

    their weights are constrained to be the same

  • Training of Weights

    The forward and backward passes of back-propagation are done normally, treating each time-shifted copy as a separate unit

    Rather than changing the weights on the time-shifted connections separately, the individual updates are averaged, and the same averaged value is applied to each connection

    In this way, the network is forced to discover useful features in the input, regardless of when in time they actually occurred
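The averaged update rule above can be sketched as a single step of plain gradient descent on the shared weights. This is an illustrative sketch only; the function name `shared_weight_update` and the learning rate are my own, and the per-copy gradients are assumed to come from an ordinary back-propagation pass.

```python
import numpy as np

def shared_weight_update(weights, grads_per_copy, lr=0.1):
    """Average the gradients of all time-shifted copies and apply one update.

    weights: the shared weight vector used by every copy.
    grads_per_copy: list of gradient vectors, one per time-shifted copy,
    each computed by back-propagation as if the copies were independent.
    Every copy receives the same update, so the weights stay tied.
    """
    avg_grad = np.mean(grads_per_copy, axis=0)
    return weights - lr * avg_grad

w = np.array([0.5, -0.2])
grads = [np.array([0.1, 0.3]), np.array([0.3, 0.1])]  # two shifted copies
print(shared_weight_update(w, grads))  # [0.48, -0.22]
```

Because the update uses the mean gradient, a feature detected at any time position contributes equally to the shared weights.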

  • Utilization

    The network is presented with the complete time-dependent pattern, and it calculates its output

    TDNNs are not shift-sensitive, i.e. misaligned input patterns do not result in poor performance

  • Utilization

    TDNNs require patterns of a fixed time length

    If the length of the pattern is longer, we can truncate it

    If the length is shorter, we can duplicate the last input vector to complete the specified length

    This is safe only if the pattern does not contain useful information at its end
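The truncate-or-duplicate procedure above can be sketched in a few lines. The function name `fit_length` is my own; the padding strategy (repeating the last input vector) follows the slides, with the stated caveat that the tail must carry no useful information.

```python
import numpy as np

def fit_length(pattern, target_len):
    """Force a time-dependent pattern to a fixed number of time steps.

    pattern: array of shape (T, d).
    Longer patterns are truncated; shorter ones are padded by repeating
    the last input vector until target_len steps are reached.
    """
    T = len(pattern)
    if T >= target_len:
        return pattern[:target_len]
    pad = np.repeat(pattern[-1:], target_len - T, axis=0)
    return np.concatenate([pattern, pad], axis=0)

p = np.arange(10, dtype=float).reshape(5, 2)  # 5 steps, 2 features
print(fit_length(p, 3).shape)  # (3, 2) -- truncated
print(fit_length(p, 7).shape)  # (7, 2) -- last vector duplicated twice
```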

  • Utilization

    Recurrent nets are shown only the current input. Previous inputs are implicitly present in the network

    A TDNN is shown the complete pattern explicitly

  • Utilization

    This weight-sharing concept has been extended to spatial dimensions and has been used in optical character recognition systems, where the location of an image in the input space is not precisely known
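The spatial analogue of TDNN weight sharing can be sketched as sliding one shared kernel over an image, as convolutional networks do. This is a minimal sketch of the idea, not code from the lecture; the function name `shared_kernel_scan` is my own.

```python
import numpy as np

def shared_kernel_scan(image, kernel):
    """Apply one shared weight kernel at every image position (valid mode).

    The same weights are used at each location, so a feature is detected
    regardless of where it appears -- the spatial analogue of sharing
    TDNN weights across time-shifted copies.
    """
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A 2x2 kernel of ones over a 4x4 image of ones: every position sums to 4.
print(shared_kernel_scan(np.ones((4, 4)), np.ones((2, 2))))
```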

  • References

    Read Section 3.1.5 of Engelbrecht and page 642 (course folder)
