
Connectionist Sentence Comprehension and Production System

A model by Dr. Douglas Rohde, M.I.T

by Dave Cooke, Nov. 6, 2004

Overview

o Introduction
o A brief overview of Artificial Neural Networks
o The basic architecture

o Introduce Douglas Rohde's CSCP model
o Overview
o Penglish language
o Architecture
o Semantic System
o Comprehension, Prediction, and Production (CPP) system
o Training
o Testing
o Conclusions

Bibliography

A Brief Overview

o Basic definition of an Artificial Neural Network

o A network of interconnected “neurons” inspired by the biological nervous system.

o The function of an Artificial Neural Network is to produce an output pattern from a given input.

o First described by Warren McCulloch and Walter Pitts in 1943 in their seminal paper “A Logical Calculus of the Ideas Immanent in Nervous Activity”.

Architecture -- Neurons

Artificial neurons are modeled after biological neurons

A diagram of an Artificial Neuron

Architecture -- Neurons (cont.)

The Neuron

o Connected by directed links from the preceding layer

o Each link has a numeric value called a weight.

o Input function: the weighted sum of all the neuron's inputs.

o Types of activation functions used:
o Sigmoid
o Threshold
o Linear
o Piece-wise

o Activation functions output near +1 when the neuron is “active” and near -1 when it is inactive.

o Output function.
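The input function and activation function described above can be sketched in a few lines of Python. This is a toy illustration with arbitrary weights, not part of the CSCP model; tanh is used as the activation so the output lies near +1 when active and near -1 when inactive, matching the slide:

```python
import math

def neuron_output(inputs, weights, bias):
    """One artificial neuron: input function, then activation function."""
    # Input function: the weighted sum of all the neuron's inputs, plus a bias.
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Activation function: tanh is near +1 when active, near -1 when inactive.
    return math.tanh(net)

# Example: a neuron with two incoming links.
print(neuron_output([1.0, 0.0], [0.5, -0.3], 0.1))  # tanh(0.6), about 0.537
```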

Architecture -- Neurons (cont.)

Illustration of an Artificial Neuron

Architecture -- Structure

o Network structure
o Many types of neural network structures
o Ex. feedforward, recurrent, completely connected

o Feedforward
o Can be single-layered or multi-layered
o Inputs are propagated forward to the output layer

Architecture -- Recurrent NNs

o Recurrent Neural Networks

o Operate on an input space and an internal state space – they have memory.

o Primary types of recurrent neural networks:
o simple recurrent
o fully recurrent

o Below is an example of a simple recurrent network (SRN)
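A simple recurrent network can be sketched as a single update step: the hidden layer sees both the current input and a copy of its own previous state, which is the network's memory. The layer sizes and random weights below are arbitrary, for illustration only (this is not the CSCP network):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Elman-style SRN; sizes are illustrative.
n_in, n_hid, n_out = 4, 8, 4
W_ih = rng.normal(scale=0.1, size=(n_hid, n_in))   # input   -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # context -> hidden (the "memory")
W_ho = rng.normal(scale=0.1, size=(n_out, n_hid))  # hidden  -> output

def srn_step(x, context):
    """One tick: the hidden state depends on the input AND the previous hidden state."""
    hidden = np.tanh(W_ih @ x + W_hh @ context)
    output = np.tanh(W_ho @ hidden)
    return output, hidden  # the new hidden state becomes the next tick's context

context = np.zeros(n_hid)
for x in np.eye(n_in):          # feed a sequence of one-hot inputs
    y, context = srn_step(x, context)
print(y.shape)
```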

Architecture -- Learning

o Learning used in NN's

o Learning = change in connection weights

o Supervised networks: the network is told the correct answer
o ex. back propagation, back propagation through time, reinforcement learning

o Unsupervised networks: the network must find structure in the input on its own
o ex. competitive learning, self-organizing (Kohonen) maps
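The idea that learning is a change in connection weights driven by a teacher signal can be seen in the simplest supervised case: a single sigmoid neuron trained with the delta rule on logical OR. This is a toy stand-in for full backpropagation; the task, learning rate, and epoch count are chosen for illustration:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]  # logical OR
w, b, lr = [0.0, 0.0], 0.0, 0.5

for _ in range(2000):
    for x, target in data:
        y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        err = target - y                       # the "told the correct answer" signal
        # Learning = change in connection weights, proportional to the error.
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]
        b += lr * err

print([round(sigmoid(w[0]*x[0] + w[1]*x[1] + b)) for x, _ in data])  # [0, 1, 1, 1]
```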

Architecture -- Learning (BPTT)

o Backpropagation Through Time (BPTT) is used in the CSCP model and in SRNs

o In BPTT the network runs ALL of its forward passes, then performs ALL of the backward passes.

o Equivalent to unrolling the network backwards through time
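The forward-then-backward ordering of BPTT can be made concrete with a toy scalar recurrent unit, h_t = w * h_(t-1) + x_t, under squared error. This illustrates only the bookkeeping, not the CSCP network itself:

```python
# Toy BPTT: loss L = 0.5 * sum_t (h_t - y_t)^2. First ALL forward passes
# (storing every state), then ALL backward passes, from the last tick to the first.
def bptt_grad(w, xs, ys):
    # Forward: run every tick, keeping the full state history.
    hs = [0.0]
    for x in xs:
        hs.append(w * hs[-1] + x)
    # Backward: walk the unrolled network from the last tick to the first.
    grad, carry = 0.0, 0.0
    for t in range(len(xs), 0, -1):
        delta = (hs[t] - ys[t - 1]) + w * carry  # local error plus error flowing back
        grad += delta * hs[t - 1]                # this tick's contribution to dL/dw
        carry = delta
    return grad

print(bptt_grad(0.5, [1.0, 0.0, 1.0], [1.0, 0.5, 1.0]))  # 0.25
```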

The CSCP Model

o Connectionist Sentence Comprehension and Production model

o Current Goal: learn to comprehend and produce sentences in the Penglish (Pseudo English) language.

o Long-term Goal: to construct a model that will account for a wide range of human sentence processing behaviours.

The CSCP Model -- Basic Architecture

o A Simple Recurrent NN is used

o Penglish (Pseudo English) was used to train and test the model.

o Consists of two separate parts connected by a “message layer”:
o Semantic System (Encoding/Decoding System)
o CPP System

o Backpropagation Through Time (BPTT) is the learning algorithm.

o method for learning temporal tasks

The CSCP Model -- Penglish

o Goal: to produce only sentences that are reasonably valid in English

o Built around the framework of a stochastic context-free grammar.

o Given an SCFG it is easy to generate sentences, parse sentences, and perform optimal prediction

o Subset of English; some grammatical structures used are:
o 56 verb stems
o 45 noun stems
o adjectives, determiners, adverbs, subordinate clauses
o several types of logical ambiguity
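Generating from a stochastic context-free grammar really is straightforward: sample a production for each nonterminal according to its probability and recurse. A minimal sketch, with a grammar invented for illustration (far smaller than Penglish):

```python
import random

# A tiny SCFG: each nonterminal maps to (probability, right-hand side) rules.
GRAMMAR = {
    "S":  [(1.0, ["NP", "VP"])],
    "NP": [(0.6, ["the", "N"]), (0.4, ["a", "N"])],
    "VP": [(0.5, ["V", "NP"]), (0.5, ["V"])],
    "N":  [(0.5, ["dog"]), (0.5, ["teacher"])],
    "V":  [(0.5, ["ran"]), (0.5, ["saw"])],
}

def generate(symbol="S"):
    """Expand a symbol by sampling one of its rules according to its probability."""
    if symbol not in GRAMMAR:
        return [symbol]                      # terminal word
    r, acc = random.random(), 0.0
    for p, rhs in GRAMMAR[symbol]:
        acc += p
        if r <= acc:
            return [w for s in rhs for w in generate(s)]
    # Floating-point edge case: fall back to the last rule.
    return [w for s in GRAMMAR[symbol][-1][1] for w in generate(s)]

print(" ".join(generate()))  # e.g. "the teacher saw a dog"
```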

The CSCP Model -- Penglish (cont.)

o Penglish sentences do not always sound entirely natural even though constraints to avoid semantic violations were implemented

o Example sentences are:

o (1) We had played a trumpet for you

o (2) A answer involves a nice school.

o (3) The new teacher gave me a new book of baseball.

o (4) Houses have had something the mother has forgotten

The CSCP Model

[Diagram: the Semantic System and the CPP System, connected by a message layer; a memory stores all propositions seen for the current sentence.]

The CSCP Model -- Semantic System

[Diagrams: propositions are loaded sequentially and stored in memory; an error measure is taken.]

The CSCP Model -- Training (SS)

o Backpropagation
o Trained separately from, and prior to, the rest of the model.

o The decoder uses standard single-step backpropagation

o The encoder is trained using BPTT.

o The majority of the running time is in the decoding stage.

The CSCP Model -- Training (SS) (cont.)

[Diagram: where the error is assessed.]

The CSCP Model -- CPP System

[Diagram of the CPP System, showing the error measure and the phonologically encoded word.]

The CSCP Model -- CPP System (cont.)

[Diagram callouts: production starts at the prediction layer, which tries to predict the next word in the sentence; the goal is to produce the next word and pass it to the word input layer.]


o Production Algorithm (prediction layer -> word input layer)

o The model uses “free production”
o Starts with a fully “clamped” message in the message layer

    currWord = Start_Symbol
    while (currWord != period && wordCount < WORD_MAX)
        // start at the word input layer
        // convert word from localist phonological encoding to distributed
        currWord = convert(currWord)
        // insert word into the word input layer
        insert(currWord)
        // get the next predicted word from the prediction layer
        currWord = getNextWord()
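The free-production loop can be turned into runnable Python. The network itself is replaced here by a hypothetical `predict_next` callable (an assumption for illustration), since the point is only how each produced word is fed back into the word input layer:

```python
WORD_MAX = 20

def free_production(predict_next, start_symbol="<start>"):
    """Feed each predicted word back in as the next input until a period."""
    sentence = []
    curr_word = start_symbol
    while curr_word != "." and len(sentence) < WORD_MAX:
        curr_word = predict_next(curr_word)   # prediction layer -> word input layer
        sentence.append(curr_word)
    return sentence

# Toy stand-in "network" that produces a fixed sentence.
CANNED = iter(["we", "had", "played", "a", "trumpet", "."])
print(free_production(lambda w: next(CANNED)))  # ['we', 'had', 'played', 'a', 'trumpet', '.']
```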


The CSCP Model -- CPP System (cont.)

o Training the CPP System
o The CPP system uses backpropagation through time.

o Backpropagation starts with the message layer on the last time step.

o Error derivatives are backpropagated to the word input layer.

o The activations in the network are reset to their state on the next-to-last time step, and the previously recorded output errors are injected at the prediction and message layers.

o Backpropagation then runs through all the layers, from prediction to word input.

o This repeats, running backwards in time until the first tick is reached.

The CSCP Model -- Training

o 16 Penglish training sets

o Set = 250,000 sentences, total = 4 million sentences

o 50,000 weight updates per set = 1 epoch

o Total of 16 epochs.

o The learning rate started at 0.2 for the first epoch and was gradually reduced over the course of learning.

o After the Semantic System, the CPP system was similarly trained

o Training began with limited complexity sentences and complexity increased gradually.

o Training a single network took about 2 days on a 500 MHz Alpha; total training time took about two months.

o Overall 3 networks were trained

The CSCP Model -- Testing

o 50,000 sentences

o 33.8% of testing sentences also appeared in one of the training sets.

o Nearly all of the sentences had 1 or 2 propositions.

o Measuring comprehension: 3 forms of measurement were used

o Used a multiple choice criterion

o Reading time measure

o Grammaticality rating measure

The CSCP Model -- Testing (Multiple Choice)

o Example: “When the owner let go, the dog ran after the mailman.”
o Expressed as [ran after, theme, ?]

o Possible answers:
o mailman (correct answer)
o owner, dog, girls, cats (distractors)

o Error measure: [formula not preserved in the transcript]

o When applying four distractors, the chance performance is 20% correct.
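One plausible sketch of the multiple-choice decision (the nearest-vector rule and the vectors below are assumptions for illustration, not Rohde's exact error measure): compare the network's answer vector against each candidate filler's representation and pick the closest. With one correct answer and four distractors, guessing yields 20%.

```python
import math

def choose(answer_vec, candidates):
    """Pick the candidate whose vector is closest to the network's answer vector."""
    # candidates: list of (name, vector) pairs; the distance rule is an assumption.
    return min(candidates, key=lambda nv: math.dist(answer_vec, nv[1]))[0]

# Hypothetical 2-d "semantic" vectors for the example's five choices.
candidates = [
    ("mailman", [0.9, 0.1]), ("owner", [0.2, 0.8]), ("dog", [0.1, 0.9]),
    ("girls", [0.5, 0.5]), ("cats", [0.3, 0.7]),
]
print(choose([0.85, 0.15], candidates))  # mailman
```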

The CSCP Model -- Testing (Reading Time)

o Also known as comprehension difficulty
o 4 components

o The first two “measure the degree to which the current word was expected”

o The prediction layer has two parts – stem and ending.
o These first two components reflect the prediction error of these two parts on the previous time step.

o 3rd “The change in the message that occurred when the current word was read”

o Mean squared difference between the previous activation and the current activation of each unit in the message layer.

o 4th “The average level of activation in the message layer”
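Components 3 and 4 of the measure reduce to two short computations over the message layer's activations. The activation vectors below are made-up illustrative values:

```python
# Component 3: the mean squared difference between the message layer's
# activations before and after reading the current word.
def message_change(prev_acts, curr_acts):
    return sum((c - p) ** 2 for p, c in zip(prev_acts, curr_acts)) / len(prev_acts)

# Component 4: the average level of activation in the message layer.
def message_activation(acts):
    return sum(acts) / len(acts)

prev = [0.1, 0.9, 0.2]
curr = [0.3, 0.5, 0.2]
print(message_change(prev, curr))    # about 0.0667
print(message_activation(curr))      # about 0.333
```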

The CSCP Model -- Testing (Grammaticality)

o The Grammaticality Method

o (1) prediction accuracy (PE)
o Indicator of syntactic complexity
o Involves the point in the sentence at which the worst two consecutive predictions occur.

o (2) comprehension performance (CE)
o Average strict-criterion comprehension error rate on the sentence.
o Intended to reflect the degree to which the sentence makes sense.

o Simulated ungrammaticality rating (SUR)
o SUR = (PE – 8) x (CE + 0.5)
o Combines the two components into a single measure of ungrammaticality
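The SUR formula on the slide is a one-liner; here it is with made-up PE and CE values (only the constants 8 and 0.5 come from the slide):

```python
def sur(pe, ce):
    """Simulated ungrammaticality rating: combines the PE and CE components
    into a single number, per the slide's formula SUR = (PE - 8) * (CE + 0.5)."""
    return (pe - 8) * (ce + 0.5)

# Hypothetical values: a sentence with PE = 10 and CE = 0.1.
print(sur(10.0, 0.1))  # (10 - 8) * (0.1 + 0.5) = 1.2
```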

The CSCP Model -- Conclusions

o General Comprehension Results
o The final networks are able to provide complete, accurate answers
o Given NO choices: 77%
o Given 5 choices: 92%

o Sentential Complement Ambiguity
o Strict criterion error rate: 13.5%
o Multiple choice: 2%

o Subordinate Clause Ambiguity
o Ex. “Although the teacher saw a book was taken in the school.”
o Intransitive, weak bad, weak good, strong bad, and strong good conditions all had under a 20% error rate on multiple choice questions.

o Relative Clauses

o Average around 40% error rate.

Bibliography

1. Luger, G.F., Artificial Intelligence, 4th ed., Addison Wesley, 2002

2. Russell, S. & Norvig, P., Artificial Intelligence, 2nd ed., Prentice Hall, 2003

3. Picton, P., Neural Networks, 2nd ed., Palgrave, 2000

4. Rohde, D., A Connectionist Model of Sentence Comprehension and Production, MIT, March 2, 2002

5. Elman, J.L., Finding Structure in Time, Cognitive Science, 14, 179-211, 1990

6. Fausett, L., Fundamentals of Neural Networks, Pearson, 1994