Post on 11-Feb-2016
Finding Structure in Time
Jeffrey L. Elman
Presented by: Kaushik Choudhary
Outline
• Introduction
• The Problem with Time
• Networks with Memory
• Experiments with Exclusive-OR
• Structure in Letter Sequences
• Discovering the Notion “Word”
• Simple Sentences
• Conclusion
Introduction
• How might one represent temporal events in PDP models?
• We utter words in a sequence and not all together!
• This paper discusses an approach to account for time by the “effect it has on processing”
The Problem with Time
• Possible approach: represent temporal events as elements in a pattern vector
• Problems with the approach:
• It would require an interface to buffer the input, and it would be impossible to determine when to examine the buffer
• Buffers would impose a limit on the input size and demand that it be fixed
• The vectors 011100000 and 000111000 occupy different locations in space, so their similarity (the same pattern, shifted in time) goes undetected by PDP models
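The last point can be checked numerically. The sketch below (illustrative, not from the paper) compares the two vectors mentioned above: the second is just the first shifted by two positions, yet standard vector-space measures see them as quite dissimilar.

```python
import numpy as np

# Two vectors encoding the same three-bit pattern at different time offsets.
a = np.array([0, 1, 1, 1, 0, 0, 0, 0, 0])
b = np.array([0, 0, 0, 1, 1, 1, 0, 0, 0])

# Geometrically they look quite different even though b is just a shifted a.
print(np.dot(a, b))           # overlap: only 1 of the 3 active bits match
print(np.linalg.norm(a - b))  # Euclidean distance: 2.0
```

A similarity measure based on position-wise comparison cannot recognize that the two inputs describe the same event at different times, which is exactly the problem Elman's approach avoids.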
Networks with Memory
• Jordan (1986) proposed a network with recurrent connections.
• In such networks the units can see the network’s previous output when computing the next one – this serves as the network’s memory.
• In this paper, Elman proposes a similar network with additional units at the input layer.
• These units are referred to as “Context Units” and are also hidden.
• The input and context units activate the hidden units which in turn activate the output units and feed back the context units.
Networks with Memory

[Figure: Elman’s proposed recurrent network – input and context units activate the hidden units, which activate the output units and are copied back into the context units.]
• In the above architecture, the context units give the network a memory of its prior internal state alongside the current input
• The hidden units develop a mapping to remember the temporal properties of the input
• This lends the network temporal sensitivity.
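The copy-back mechanism described above can be sketched in a few lines. This is a minimal forward pass only (no training); the layer sizes follow the letter-sequence experiment later in the talk, and the random weight initialization is illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_in, n_hidden, n_out = 6, 20, 6

# Weight matrices: input->hidden, context->hidden, hidden->output.
W_ih = rng.normal(0, 0.1, (n_hidden, n_in))
W_ch = rng.normal(0, 0.1, (n_hidden, n_hidden))
W_ho = rng.normal(0, 0.1, (n_out, n_hidden))

def srn_forward(inputs):
    """Run a sequence through the simple recurrent network."""
    context = np.full(n_hidden, 0.5)  # context starts at a neutral value
    outputs = []
    for x in inputs:
        # Hidden units see the current input AND the previous hidden state.
        hidden = sigmoid(W_ih @ x + W_ch @ context)
        outputs.append(sigmoid(W_ho @ hidden))
        context = hidden.copy()  # copy-back: context = previous hidden state
    return outputs

seq = [rng.integers(0, 2, n_in).astype(float) for _ in range(5)]
outs = srn_forward(seq)
```

The key design choice is that the context units are ordinary inputs as far as learning is concerned, so standard backpropagation applies without modification.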
Experiments with Exclusive-OR
• Sample input:  1 0 1 0 0 0 0 1 1 1 1 0
• Sample output: 0 1 0 0 0 0 1 1 1 1 0 ?
• Every third bit is the XOR of the first and second bits of its triple
• The objective of the network is to predict the next bit
• It is only possible to predict every third bit accurately
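A training stream of this kind is easy to generate. The sketch below builds such a sequence (the generator is my own illustration of the scheme, not code from the paper): each triple is two random bits followed by their XOR, so only positions 2, 5, 8, … are predictable in principle.

```python
import random

random.seed(0)

def xor_stream(n_triples):
    """Two random bits per triple, followed by their XOR."""
    bits = []
    for _ in range(n_triples):
        a, b = random.randint(0, 1), random.randint(0, 1)
        bits += [a, b, a ^ b]
    return bits

stream = xor_stream(4)
# The first two bits of each triple are random; only the third is
# determined, so a next-bit predictor can only succeed on every third bit.
print(stream)
```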
Structure in Letter Sequences
• Sample input: the consonants b, d, and g combined randomly, then replaced with b->ba, d->dii, and g->guuu.
• Each letter was assigned a unique 6-bit vector.
• Objective of the network was to predict the next letter in the input sequence.
• Network structure: 6 input units, 6 output units, 20 hidden units and 20 context units.
• The network was trained through 200 passes over the sequence diibaguuubadiidiiguuu…
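The construction of the training sequence can be sketched as follows (the generator function is my own illustration; the consonant-to-syllable mapping is from the paper, the 6-bit letter encodings are not reproduced here).

```python
import random

random.seed(0)

# Each randomly chosen consonant expands into a fixed syllable.
EXPAND = {"b": "ba", "d": "dii", "g": "guuu"}

def letter_stream(n_consonants):
    """Pick consonants at random, then expand each into its syllable."""
    return "".join(EXPAND[random.choice("bdg")] for _ in range(n_consonants))

s = letter_stream(6)
print(s)  # e.g. a stream like "diibaguuuba..."
```

The structure this creates is the point of the experiment: once a consonant appears, the vowels that follow it are fully determined, while the identity of the next consonant is random. The network's prediction error should therefore drop within a syllable and rise at syllable boundaries.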
Discovering the Notion “Word”
• Input to the network: 200 sentences with no breaks between them (1270 words, 4963 letters)
• Each letter was represented by a 5-bit vector
• Network structure: 5 input units, 5 output units, 20 hidden units, and 20 context units
• The objective of the network was to predict the next letter in the sequence
• Elman defends the ambiguity in the results, noting that the experiment only set out to show that word boundaries in the sequence are statistically predictable
• And that the recurrent network is able to extract this information!
Simple Sentences
• 10,000 random sentences were created
• Each word in a sentence was assigned a 31-bit vector, with each bit representing a different word
• No breaks between sentences, giving a stream of 27,534 words
• The network experienced six passes over this stream.
• The objective of the network was to predict the next word.
Simple Sentences
• The RMS error, calculated against the actual next word, was about 0.88.
• The RMS error, calculated against the probability of occurrence of each word in context, was about 0.053.
• Impressive!
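The gap between the two error figures can be illustrated with toy numbers (mine, not the paper's). When the next word is genuinely stochastic, a network that has learned the conditional probabilities still scores a high RMS error against individual one-hot targets, but a very low error against the probability vector itself.

```python
import numpy as np

# Toy 3-word vocabulary: after some context, the next word is word 0
# with probability 0.7 and word 1 with probability 0.3.
p_next = np.array([0.7, 0.3, 0.0])      # empirical next-word probabilities
y_hat  = np.array([0.68, 0.30, 0.02])   # network output activations

def rms(u, v):
    return np.sqrt(np.mean((u - v) ** 2))

# Error against a single one-hot outcome stays high even for a network
# that matches the probabilities almost perfectly...
one_hot = np.array([1.0, 0.0, 0.0])
print(rms(y_hat, one_hot))

# ...while error against the probability vector is near zero.
print(rms(y_hat, p_next))
```

This is why Elman's second evaluation is the fairer one: the task is unpredictable at the level of individual words, but highly predictable at the level of word-class distributions.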
Conclusion
• Problems change their nature when defined in terms of temporal events.
• RMS error calculated over time may be used to evaluate temporal structure.
• More sequential dependencies do not necessarily translate to worse performance.
• Representations of time, and hence memory, depend on the task at hand.
• Representations may be structured.
Thank you!