
COSC451: Artificial Intelligence
Lecture 17: How infants learn syntax

Alistair Knott

Dept. of Computer Science, University of Otago

Alistair Knott (Otago) COSC451 Lecture 17 1 / 32


Recap

Last lecture, we looked at how infants learn single words.
Cross-situational learning, and the role of the phonological loop
The role of joint attention and intention recognition

Recall from Lecture 15: word-meaning mappings and syntactic processing appear to involve different brain areas.

Word meanings: ‘Wernicke’s area’ and associated temporal areas
Syntax: Broca’s area.

Today: a model of what’s happening in Broca’s area.

Alistair Knott (Otago) COSC451 Lecture 17 2 / 32


Outline of today’s lecture

1 Learning syntax: early developmental stages

2 The nativist-empiricist debate

3 Empiricist models of syntax

4 A new model of syntactic processing

Alistair Knott (Otago) COSC451 Lecture 17 3 / 32


Learning syntax: early developmental stages

Outline of today’s lecture

1 Learning syntax: early developmental stages

2 The nativist-empiricist debate

3 Empiricist models of syntax

4 A new model of syntactic processing

Alistair Knott (Otago) COSC451 Lecture 17 4 / 32


Learning syntax: early developmental stages

Learning syntax: early developmental stages

Early syntactic development has some fairly well agreed stages.

1. Single-word utterances (holophrases).

Utterances in service of specific goals.
The goal can be ‘declarative’ (e.g. car!)
The goal can be ‘imperative’ (e.g. doggy! more!)

It’s only when children have learned the mapping between meanings and words that such utterances become effective.

But children still have to learn that (in some situations) ‘entering verbal mode’ can be a means to achieving their goals.

Alistair Knott (Otago) COSC451 Lecture 17 5 / 32


Learning syntax: early developmental stages

Learning syntax: early developmental stages

2. Simple two-word utterances.

Word combinations: unstructured collections of words.
E.g. my . . . cup! cup . . . my!

Pivot schemas: two-word units structured around a single word.
E.g. my cup! my cake! [my X]

Tomasello: pivot schemas support some generalisation, but are mainly based on surface word-ordering conventions.

Alistair Knott (Otago) COSC451 Lecture 17 6 / 32


Learning syntax: early developmental stages

Learning syntax: early developmental stages

3. Item-based syntactic constructions

At 18 months, children begin to understand simple transitive sentences.

Around 24 months: the earliest ‘syntactic constructions’.
Children begin to produce transitive sentences.
Children begin to use syntactic function words (e.g. the, of) and inflections (e.g. likes).

The interesting thing about early constructions is that they tend to be tied to specific words.

Open it with this.
He hit me this.

Alistair Knott (Otago) COSC451 Lecture 17 7 / 32


Learning syntax: early developmental stages

Learning syntax: early developmental stages

4. Progressively more complex syntactic constructions.

At this point, utterances are complex enough that you need a proper syntactic theory to chart development.

That’s where things start to get contentious.

Alistair Knott (Otago) COSC451 Lecture 17 8 / 32


The nativist-empiricist debate

Outline of today’s lecture

1 Learning syntax: early developmental stages

2 The nativist-empiricist debate

3 Empiricist models of syntax

4 A new model of syntactic processing

Alistair Knott (Otago) COSC451 Lecture 17 9 / 32


The nativist-empiricist debate

The nativist-empiricist debate

There’s a huge debate between nativists and empiricists in developmental linguistics.

Nativists believe that infants are born with ‘knowledge’ of the universal properties of language.
All they have to learn from their environment are the parameter settings which define their particular language.
Empiricists believe that infants use general-purpose learning mechanisms to acquire language.
They learn language ‘from scratch’.

Alistair Knott (Otago) COSC451 Lecture 17 10 / 32


The nativist-empiricist debate

An example of a nativist model: Minimalism

Recall:
The Minimalist model of ‘The man grabbed a cup’ holds that the same LF structure underlies this sentence in every language.
This LF structure contains multiple positions for the agent, patient and inflected verb. (Because these items ‘move’ during derivation.)
Children are born knowing how to derive the LF representation. What they have to learn is the mapping from LF to PF.
I.e. whether to ‘read out’ items before or after movement.

Alistair Knott (Otago) COSC451 Lecture 17 11 / 32

The nativist-empiricist debate

An example of a nativist model: Minimalism
Here’s the LF derivation of our example sentence.

[Figure: the LF tree for ‘The man grabbed a cup’, built up step by step: IP, AgrP and VP projections, each with a Spec position; ‘the man’, ‘grabbed’ and ‘a cup’ each appear in more than one position as they ‘move’ during the derivation.]

Alistair Knott (Otago) COSC451 Lecture 17 12 / 32

The nativist-empiricist debate

An example of a nativist model: Minimalism
Both agent and patient can be read out before or after movement.
The inflected verb can be read out in three positions.
All the child has to learn is ‘when to read out each item’. (A toy encoding of this idea is sketched after the language examples below.)

[Figure: the LF tree with both positions shown for ‘the man’ and ‘a cup’, and all three positions shown for ‘grabbed’.]

Alistair Knott (Otago) COSC451 Lecture 17 12 / 32

The nativist-empiricist debate

An example of a nativist model: Minimalism
In English, we read out as follows:
[Figure: the LF tree with the positions read out in English highlighted.]

In Maori, we read out as follows:
[Figure: the LF tree with the positions read out in Maori highlighted.]

In French/Italian, we read out as follows:
[Figure: the LF tree with the positions read out in French/Italian highlighted.]

Alistair Knott (Otago) COSC451 Lecture 17 12 / 32
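One way to picture the ‘parameters’ the child has to set is as a small table saying which copy of each item is pronounced. The Python sketch below is an illustration of that idea only: the position settings assumed for Maori and French/Italian (verb raised to I, and for Maori the subject left in Spec-VP) follow standard analyses and are not taken from the lecture’s figures.

```python
# Positions available to each item in the LF tree, from highest to lowest.
POSITIONS = {
    "agent":   ["Spec-IP", "Spec-VP"],
    "verb":    ["I", "Agr", "V"],
    "patient": ["Spec-AgrP", "DP-obj"],
}

# Left-to-right order of those positions at PF.
LINEAR_ORDER = ["Spec-IP", "I", "Spec-AgrP", "Agr", "Spec-VP", "V", "DP-obj"]

# Which copy each language pronounces. These settings are assumptions for
# illustration (English leaves the verb in V; Maori and French/Italian raise it
# to I; Maori leaves the subject in Spec-VP), not figures from the lecture.
READOUT = {
    "English":        {"agent": "Spec-IP", "verb": "V", "patient": "DP-obj"},
    "Maori":          {"agent": "Spec-VP", "verb": "I", "patient": "DP-obj"},
    "French/Italian": {"agent": "Spec-IP", "verb": "I", "patient": "DP-obj"},
}

WORDS = {"agent": "the man", "verb": "grabbed", "patient": "a cup"}

def spell_out(language):
    """Pronounce the chosen copy of each item, ordered by its linear position."""
    chosen = READOUT[language]
    for item, position in chosen.items():
        assert position in POSITIONS[item], f"{position} is not a position for {item}"
    items = sorted(WORDS, key=lambda item: LINEAR_ORDER.index(chosen[item]))
    return " ".join(WORDS[item] for item in items)

for language in READOUT:
    print(language, "->", spell_out(language))
# English        -> the man grabbed a cup
# Maori          -> grabbed the man a cup
# French/Italian -> the man grabbed a cup  (differences from English show up
#                                           elsewhere, e.g. with adverbs)
```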


The nativist-empiricist debate

Some arguments for a nativist position

1. ‘Poverty of the stimulus’ arguments (Chomsky, 1980).
‘There’s not enough information in language exposure data to learn a language.’
‘Language is just too complex to be able to learn from data.’

2. Arguments from pidgins and creoles (Bickerton, 1981).
Pidgins are languages which are ‘invented’ when two language communities meet and need to communicate.
They are not true natural languages.
Children who grow up in communities speaking a pidgin language develop a creole.
Creoles have all of the syntactic complexity of ‘established’ natural languages.

Alistair Knott (Otago) COSC451 Lecture 17 13 / 32


Empiricist models of syntax

Outline of today’s lecture

1 Learning syntax: early developmental stages

2 The nativist-empiricist debate

3 Empiricist models of syntax

4 A new model of syntactic processing

Alistair Knott (Otago) COSC451 Lecture 17 14 / 32


Empiricist models of syntax

Empiricist models of syntax

Empiricist linguists argue that children have very powerful general-purpose learning mechanisms.

These are sufficient to acquire a language without (much) innate language-specific machinery.

The training data: utterances occurring in communicative contexts.
There are regularities within utterances.
There are regularities linking utterances and contexts.

Children have pattern-finding mechanisms which pick up these regularities.

Alistair Knott (Otago) COSC451 Lecture 17 15 / 32


Empiricist models of syntax

What pattern-finding mechanisms are involved?

1. A mechanism which finds regularities in sequential data.
Consider the following sequence: John went to the. . .
What word comes next?
We’ve already seen that infants can pick up regularities in a stream of phonemes (Saffran et al.):
ga bi ro to ba di ga bi ro to ba di
(See the sketch after this slide.)

2. A mechanism which finds mappings between pairs of complex patterns.

We’ve already hypothesised such a mechanism in our accounts of the mirror system.
It’s also attested in our ability to perform analogical reasoning.

Alistair Knott (Otago) COSC451 Lecture 17 16 / 32
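To make the first mechanism concrete, here is a minimal Python sketch of transitional-probability learning in the spirit of Saffran et al. It treats the slide’s stream as built from three two-syllable ‘words’ (ga-bi, ro-to, ba-di) concatenated in random order; that segmentation, and the code itself, are illustrative assumptions rather than part of the lecture.

```python
import random
from collections import Counter

# Build a syllable stream in the style of Saffran et al.: three two-syllable
# 'words' (ga-bi, ro-to, ba-di), taken from the slide's stream, concatenated in
# random order. (Treating the stream as made of these words is an assumption.)
random.seed(0)
words = [["ga", "bi"], ["ro", "to"], ["ba", "di"]]
stream = [syllable for _ in range(200) for syllable in random.choice(words)]

syllable_counts = Counter(stream[:-1])           # how often each syllable has a successor
pair_counts = Counter(zip(stream, stream[1:]))   # how often each adjacent pair occurs

def transitional_probability(a, b):
    """P(next syllable is b, given that the current syllable is a)."""
    return pair_counts[(a, b)] / syllable_counts[a]

print(transitional_probability("ga", "bi"))  # within a 'word': 1.0
print(transitional_probability("bi", "ro"))  # across a 'word' boundary: about 1/3
```

High transitional probabilities mark syllable pairs that belong together; low ones mark likely ‘word’ boundaries, which is the cue the Saffran studies suggest infants exploit.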



Empiricist models of syntax

What sorts of pattern are found?

1. Patterns are statistical tendencies, rather than universal rules.

A traditional grammar divides sentences discretely into ‘well-formed’ and ‘ill-formed’.
Empiricist language models often assign probabilities to sentences (see the sketch below):

John went to the pub
John went to the ?? cup

Traditional grammar works with ‘cleaned-up’ sentences, with pauses, false starts, repetitions etc. removed.

Chomsky distinguished between syntactic competence and performance. He saw grammar as modelling competence.

Empiricist grammars tend to be trained on ‘real’ language data.

Alistair Knott (Otago) COSC451 Lecture 17 17 / 32
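To make the graded notion of well-formedness concrete, here is a minimal sketch of a toy bigram model that scores the slide’s two sentences. The corpus and the add-alpha smoothing are invented for illustration; this is not a model presented in the lecture.

```python
from collections import Counter

# A toy corpus standing in for 'real' language exposure data (invented for illustration).
corpus = [
    "john went to the pub",
    "mary went to the pub",
    "john went to the shop",
    "mary drank from her cup",
]
tokens = [w for sentence in corpus for w in ("<s> " + sentence + " </s>").split()]
unigrams = Counter(tokens)
bigrams = Counter(zip(tokens, tokens[1:]))

def sentence_probability(sentence, alpha=0.1):
    """Bigram probability with add-alpha smoothing: a graded score, not a yes/no judgement."""
    words = ("<s> " + sentence + " </s>").split()
    vocab_size = len(unigrams)
    p = 1.0
    for prev, word in zip(words, words[1:]):
        p *= (bigrams[(prev, word)] + alpha) / (unigrams[prev] + alpha * vocab_size)
    return p

print(sentence_probability("john went to the pub"))  # relatively high
print(sentence_probability("john went to the cup"))  # much lower, but not zero
```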



Empiricist models of syntax

What sorts of pattern are found?

2. Patterns are often patterns in surface language.

In generative grammar, most of the rules are about deriving LF.
There are no rules about the ‘surface form’ of a sentence.

However, in language, there appear to be lots of regularities which can only be expressed as surface regularities.

The classic example is idioms.

Alistair Knott (Otago) COSC451 Lecture 17 18 / 32


Empiricist models of syntax

Idioms

An idiom is an arbitrary sequence of words which collectively have an arbitrary semantic interpretation.

E.g. by and large (meaning ‘typically’).
The meaning of this phrase doesn’t come from the meanings of its individual words.
It doesn’t conform to any general syntactic rules.

Idioms often have ‘slots’, which can be filled by syntactically regular constituents (see the sketch after this slide).

Far be it from NP to VP.

Idioms are often syntactically regular, even though their meaning is not compositional.

NP kicked the bucket.

Alistair Knott (Otago) COSC451 Lecture 17 19 / 32
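As a concrete illustration of idioms with slots (my sketch, not the lecture’s), an idiom can be represented as a template mixing literal words and category slots, matched against a chunked sentence. The chunking scheme and labels below are invented for illustration.

```python
# An idiom as a template: a sequence of literal words and category slots
# (upper-case), paired with a label or rough gloss.
IDIOMS = {
    ("far", "be", "it", "from", "NP", "to", "VP"): "far be it from NP to VP",
    ("NP", "kicked", "the", "bucket"): "NP died",
}

def match(template, chunks):
    """chunks: list of (category, text) pairs, e.g. ('NP', 'the old man') or ('W', 'kicked')."""
    if len(template) != len(chunks):
        return None
    fillers = {}
    for slot, (category, text) in zip(template, chunks):
        if slot.isupper():              # a category slot: the chunk must have that category
            if category != slot:
                return None
            fillers[slot] = text
        elif text != slot:              # a literal word must match exactly
            return None
    return fillers

sentence = [("NP", "the old man"), ("W", "kicked"), ("W", "the"), ("W", "bucket")]
for template, gloss in IDIOMS.items():
    fillers = match(template, sentence)
    if fillers is not None:
        print(gloss, fillers)           # prints: NP died {'NP': 'the old man'}
```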


Empiricist models of syntax

Idioms in the nativist-empiricist debate

Empiricist linguists argue that idioms are very common in language.

They argue that there’s a continuum of idiomaticity.
At one end, there are ‘pure’ idioms (e.g. by and large).
In the middle there are idioms containing ‘slots’, and grammatically regular idioms.
At the other end there are statistical tendencies.
E.g. went to the pub, give up, pull over. . .

Empiricist models are well-suited for capturing idioms.
Idioms are statistical regularities in surface language, mapped to arbitrary semantic/pragmatic patterns.

Minimalist models have real difficulties with idioms.

Alistair Knott (Otago) COSC451 Lecture 17 20 / 32


Empiricist models of syntax

Idioms in a Minimalist model

If idioms are continuous, they can simply be treated as multi-word lexical items.

E.g. Winnie-the-Pooh, by-and-large. . .

The difficulty is with non-continuous idioms, and with idioms which retain some degree of syntactic regularity.

Take NP to task (= criticise NP)
NP let the cat out of the bag
The cat was let out of the bag by NP

There’s nothing in Minimalism which can explain these constructions.

Alistair Knott (Otago) COSC451 Lecture 17 21 / 32


Empiricist models of syntax

Empiricist models of language acquisition

Empiricist models of language acquisition have an easier time explaining the different stages of syntactic development.

Infants begin by detecting simple statistical regularities in surface language, and map these to semantic representations.
Then they identify progressively more complex regularities.

Minimalism is an account of ‘mature’ language competence; it’s not clear how this emerges during development.

Alistair Knott (Otago) COSC451 Lecture 17 22 / 32


Empiricist models of syntax

Simple recurrent networks

Obviously, empiricists need to propose models of the learning architectures which infants are using to learn patterns in language.

One of the key models is the simple recurrent network (SRN; Elman, 1990).

An SRN takes as input a sequence of words (one word at a time).
It is trained to predict the next word in the sequence.

Alistair Knott (Otago) COSC451 Lecture 17 23 / 32


Empiricist models of syntax

Simple recurrent networks

[Figure: SRN architecture: the current word and a context layer feed a hidden layer, which produces a predicted next word; this is compared with the actual next word to give an error signal.]

An SRN maintains a context representation, which is a copy of its hidden layer at the previous timestep.

The context rep holds a history of recent inputs.
After training, the context units can be interpreted as holding a representation of the most common sequences in the training data.
(A minimal SRN sketch in code follows this slide.)

Alistair Knott (Otago) COSC451 Lecture 17 24 / 32
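Here is a minimal numpy sketch of an Elman-style SRN: the hidden layer is copied into the context units and re-presented as input at the next timestep, and the weights are trained to predict the next word (with one-step gradients, i.e. without backpropagating through the copy). The vocabulary, layer sizes and toy training data are illustrative, not Elman’s original setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny corpus of word-index sequences; in Elman (1990) this was a stream of simple sentences.
vocab = ["man", "woman", "cat", "sees", "chases", "sleeps"]
V, H = len(vocab), 12                      # vocabulary size, hidden/context size

W_in = rng.normal(0, 0.1, (H, V))          # current word -> hidden
W_ctx = rng.normal(0, 0.1, (H, H))         # context (previous hidden) -> hidden
W_out = rng.normal(0, 0.1, (V, H))         # hidden -> predicted next word

def one_hot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def train_on_sequence(seq, lr=0.1):
    """One pass over a word-index sequence, predicting each next word (1-step gradients)."""
    global W_in, W_ctx, W_out
    context = np.zeros(H)
    for cur, nxt in zip(seq, seq[1:]):
        x = one_hot(cur)
        hidden = np.tanh(W_in @ x + W_ctx @ context)
        pred = softmax(W_out @ hidden)
        err = pred - one_hot(nxt)                   # cross-entropy gradient at the output
        # Backprop one step; the copied context is treated as a constant input.
        d_hidden = (W_out.T @ err) * (1 - hidden ** 2)
        W_out -= lr * np.outer(err, hidden)
        W_in -= lr * np.outer(d_hidden, x)
        W_ctx -= lr * np.outer(d_hidden, context)
        context = hidden                            # copy the hidden layer into the context units

# e.g. "man sees cat", "cat chases man", "woman sees cat", "cat sleeps"
data = [[0, 3, 2], [2, 4, 0], [1, 3, 2], [2, 5]]
for epoch in range(200):
    for seq in data:
        train_on_sequence(seq)

# After training, the network assigns high probability to plausible next words.
print(softmax(W_out @ np.tanh(W_in @ one_hot(0) + W_ctx @ np.zeros(H))))
```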


Empiricist models of syntax

Simple recurrent networks

[Figure: the same SRN architecture as above.]

A trained SRN can’t (normally) predict exactly which word will come next.

It can distinguish between those words which are likely to come next, and those which are unlikely.
It’s basically a model of syntax.

Alistair Knott (Otago) COSC451 Lecture 17 24 / 32


Empiricist models of syntax

Simple recurrent networks

Interestingly, after training, words from the same syntactic (and even semantic) categories generate similar patterns of activation in the hidden layer of an SRN.

This is because words from the same syntactic/semantic categories tend to occur in the same (surface) contexts.

Overleaf is a diagram showing how the hidden-unit word representations cluster after training.

Alistair Knott (Otago) COSC451 Lecture 17 25 / 32


Empiricist models of syntax

Simple recurrent networks

[Figure: cluster diagram of the hidden-unit word representations after training (Elman, 1990).]

Alistair Knott (Otago) COSC451 Lecture 17 26 / 32


Empiricist models of syntax

Adding semantics to simple recurrent networks

As just described, SRNs can learn two things (from scratch):
A (very surface-y) model of syntactic structure.
A taxonomy of syntactic / semantic word categories.

That’s very useful. . .
However, if we want to model the mapping between syntax and semantics, we need to extend the model.

Alistair Knott (Otago) COSC451 Lecture 17 27 / 32


A new model of syntactic processing

Outline of today’s lecture

1 Learning syntax: early developmental stages

2 The nativist-empiricist debate

3 Empiricist models of syntax

4 A new model of syntactic processing

Alistair Knott (Otago) COSC451 Lecture 17 28 / 32


A new model of syntactic processing

A new model of syntactic processing

We already have an account of what semantic representations look like. (For concrete sentences.)

When we ‘entertain the meaning’ of a concrete sentence, we internally rehearse an SM sequence stored in WM.

I need to give a model of how children learn to generate surface word sequences from these SM replay operations.
I.e. of how children learn the mapping from LF to PF.

The model should ideally include an account of learned surface patterns in language.
It should also support an account of the different stages in syntactic development.

Alistair Knott (Otago) COSC451 Lecture 17 29 / 32


A new model of syntactic processing

The basic idea

In our account, the semantics (LF) of a sentence takes the form of a rehearsed SM sequence.

We need a network which learns to map an SM sequence onto a sequence of words.
The network needs to learn which SM representations in the sequence need to be ‘pronounced’, and which need to be skipped.

I’ll introduce the network bit by bit.

Alistair Knott (Otago) COSC451 Lecture 17 30 / 32

A new model of syntactic processing

The basic idea
I’m thinking of the LF representation as a sequence of contexts, with two SM representations evoked in each context.

[Figure: the LF tree stepped through as a sequence of four contexts. In C1, Spec-IP (‘the man’) and I (‘grabbed’) are evoked; in C2, Spec-AgrP (‘a cup’) and Agr (‘grabbed’); in C3, Spec-VP (‘the man’) and V (‘grabbed’); in C4, V (‘grabbed’) and the object DP (‘a cup’).]

Alistair Knott (Otago) COSC451 Lecture 17 31 / 32


A new model of syntactic processing

A network for syntactic processing
Here’s the network for learning word meanings from last lecture.

[Figure: the word-semantics network from last lecture: semantic representations (object reps, action reps, ...) are mapped to an associated word, with an error signal computed against the word in the phonological buffer.]

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32


A new model of syntactic processing

A network for syntactic processing
We can be more precise about where the semantic reps come from:

They’re generated when WM episodes are rehearsed.

[Figure: the word-semantics network, now fed by an episode rehearsal system supplying an object rep, an episode plan and a context signal, and producing an associated word.]

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32

A new model of syntactic processing

A network for syntactic processing
We can be more precise about where the semantic reps come from:

Rehearsal evokes a sustained episode plan, and transient object reps. (With an updating SM context.)

[Figure: four rehearsal steps, C1 to C4. The episode plan (attend-agent, attend-cup, grasp) is sustained throughout; the transiently attended object is the agent in C1, the cup in C2, the agent again in C3, and the cup again in C4.]

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32


A new model of syntactic processing

A network for syntactic processing
Currently the word-semantics network generates pairs of words.

It really needs to deliver words one at a time.
Idea: a pattern generator can alternate between semantic signals.

[Figure: the word-semantics network fed by the episode rehearsal system (object rep, episode plan, context), producing an associated word.]

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32

A new model of syntactic processing

A network for syntactic processing
The pattern generator alternately sends the object representation and the episode plan to the word-sem network.

[Figure: within one context, the generator first routes the object rep to the word-semantics network (producing an object word), then the episode plan (producing an episode-plan word); an ‘advance’ signal is also shown.]

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32

A new model of syntactic processing

A network for syntactic processing
Assume that the pattern generator also ‘advances’ a rehearsed SM sequence (at the end of every cycle).

Now we get a unique word at each time step.

[Figure: stepping through the rehearsed sequence (attend-man, attend-cup, grasp): in C1 the network produces “MAN” then “GRABBED”; in C2, “CUP” then “GRABBED”; in C3, “MAN” then “GRABBED”; in C4, “CUP”.]

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32
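Here is a toy, table-lookup version of the generator just described. The real proposal is a learned network; the lookup table and the representation names below (‘man-rep’, ‘grasp-plan’, etc.) are illustrative stand-ins, and the generator is assumed to run both phases in every context.

```python
# The word-semantics network, reduced to a lookup table for this illustration.
WORD_SEMANTICS = {
    "man-rep": "MAN",
    "cup-rep": "CUP",
    "grasp-plan": "GRABBED",
}

# Rehearsed SM sequence for 'the man grabbed a cup': one transient object rep per
# context, with the episode plan sustained throughout (as on the slides above).
CONTEXTS = ["C1", "C2", "C3", "C4"]
OBJECT_REPS = {"C1": "man-rep", "C2": "cup-rep", "C3": "man-rep", "C4": "cup-rep"}
EPISODE_PLAN = "grasp-plan"

def generate_raw_words():
    """Yield (context, phase, word): object word first, then episode-plan word,
    then 'advance' to the next context."""
    for context in CONTEXTS:
        for phase, rep in (("1", OBJECT_REPS[context]), ("2", EPISODE_PLAN)):
            yield context, phase, WORD_SEMANTICS[rep]
        # 'advance' signal: rehearsal moves on to the next context here.

for context, phase, word in generate_raw_words():
    print(f"{context}/{phase}: {word}")
# C1/1: MAN      C1/2: GRABBED
# C2/1: CUP      C2/2: GRABBED
# C3/1: MAN      C3/2: GRABBED
# C4/1: CUP      C4/2: GRABBED   (the slides trace the sequence up to C4/1)
```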


A new model of syntactic processing

A network for syntactic processing
Now we can envisage a filter network which learns when to ‘produce’ words generated by the word-semantics network.

[Figure: the word-semantics network’s predicted next word feeds a filter network, whose other inputs include the context; the filter’s predicted next word is compared with the actual next word to give an error signal.]

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32

A new model of syntactic processing

A network for syntactic processing
Now we can envisage a filter network which learns when to ‘produce’ words generated by the word-semantics network.

The network is trained in joint attention situations.
The agent observes an episode, and hears an utterance.
The agent replays the episode and the utterance in synch.
The network is trained to predict the next word in the utterance.
Its input is the context (C1. . . C4) and the oscillator.
If word-sem network output matches the next word, it’s passed on.
If not, it’s blocked.

[Figure: the filter network architecture, as above.]

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32

A new model of syntactic processing

A network for syntactic processing
Now we can envisage a filter network which learns when to ‘produce’ words generated by the word-semantics network.

Example: training in English (man, grabbed, cup).

C1/1: output MAN, actual next word MAN. Output matches: train C1/1 → PASS.
C1/2: output GRABBED, actual GRABBED. Output matches: train C1/2 → PASS.
C2/1: output CUP, actual CUP. Output matches: train C2/1 → PASS.
C2/2: output GRABBED, actual EOS. Output doesn’t match: train C2/2 → BLOCK.
C3/1: output MAN, actual EOS. Output doesn’t match: train C3/1 → BLOCK.
C3/2: output GRABBED, actual EOS. Output doesn’t match: train C3/2 → BLOCK.
C4/1: output CUP, actual EOS. Output doesn’t match: train C4/1 → BLOCK.

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32

A new model of syntactic processing

A network for syntactic processing
Now we can envisage a filter network which learns when to ‘produce’ words generated by the word-semantics network.

Example: training in a VSO language (grabbed, man, cup).

C1/1: output MAN, actual GRABBED. Output doesn’t match: train C1/1 → BLOCK.
C1/2: output GRABBED, actual GRABBED. Output matches: train C1/2 → PASS.
C2/1: output CUP, actual MAN. Output doesn’t match: train C2/1 → BLOCK.
C2/2: output GRABBED, actual MAN. Output doesn’t match: train C2/2 → BLOCK.
C3/1: output MAN, actual MAN. Output matches: train C3/1 → PASS.

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32

Page 102: COSC451: Artificial Intelligence · and inflected verb. (Because these items ‘move’ during derivation.) Children are born knowing how to derive the LF representation. What they

A new model of syntactic processing

A network for syntactic processingNow we can envisage a filter network which learns when to ‘produce’words generated by the word-semantics network.

Output doesn’t match: train C3/2→ BLOCK.

Word−semantics

network

Err

’advance’

Filter

network

Episode rehearsal system

Actual next word

Output word

rep

Man C3attend−manattend−cupgrasp

GRABBED

CUP

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32

Page 103: COSC451: Artificial Intelligence · and inflected verb. (Because these items ‘move’ during derivation.) Children are born knowing how to derive the LF representation. What they

A new model of syntactic processing

A network for syntactic processingNow we can envisage a filter network which learns when to ‘produce’words generated by the word-semantics network.

Output matches: train C4/1→ PASS.

Word−semantics

network

Err

’advance’

Filter

network

attend−man

Episode rehearsal system

attend−cupgrasp

Actual next word

Output word

C4Cup

rep

CUP

CUP

CUP

Alistair Knott (Otago) COSC451 Lecture 17 32 / 32
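
To make the training regime concrete, here is a minimal sketch of the PASS/BLOCK learning rule (my own illustration, not code from the lecture). It assumes the filter reduces to a table of pass weights indexed by control state (C1/1, C1/2, ...); the rehearsal outputs and the VSO corpus mirror the walkthrough above, and everything else is an illustrative assumption.

# Minimal sketch of the filter network's PASS/BLOCK training signal,
# assuming the filter reduces to one pass weight per control state.
# The rehearsal outputs and word order mirror the VSO example above.

from collections import defaultdict

# Word-semantics output at each control state while rehearsing the
# episode (agent = man, action = grab, patient = cup).
REHEARSAL_OUTPUT = {
    ("C1", 1): "MAN", ("C1", 2): "GRABBED",
    ("C2", 1): "CUP", ("C2", 2): "GRABBED",
    ("C3", 1): "MAN", ("C3", 2): "GRABBED",
    ("C4", 1): "CUP",
}

def train_filter(sentences, epochs=10, lr=0.5):
    """Raise a state's pass weight when its output matches the actual
    next word (PASS); lower it when it does not (BLOCK)."""
    weights = defaultdict(float)              # control state -> pass weight
    for _ in range(epochs):
        for sentence in sentences:
            pos = 0                           # position in the surface string
            for state, word in REHEARSAL_OUTPUT.items():
                actual = sentence[pos] if pos < len(sentence) else "EOS"
                if word == actual:            # output matches: train PASS
                    weights[state] += lr * (1.0 - weights[state])
                    pos += 1                  # the word is produced, so advance
                else:                         # output doesn't match: train BLOCK
                    weights[state] -= lr * weights[state]
    return weights

vso_corpus = [["GRABBED", "MAN", "CUP"]]      # verb, agent, patient
for state, w in train_filter(vso_corpus).items():
    print(state, "PASS" if w > 0.5 else "BLOCK", round(w, 2))

After training, only C1/2 (GRABBED), C3/1 (MAN) and C4/1 (CUP) end up as PASS, matching the walkthrough above.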

A network for syntactic processing

This network generates sentences 'compositionally': it uses mappings from semantic reps to words, and a mechanism for rehearsing episode plans.

[Diagram: filter-network architecture, as above.]

A network for syntactic processing

But remember that there's probably another mechanism for generating word sequences, which doesn't rely on word semantics.

[Diagram: filter-network architecture, as above.]

A network for syntactic processing

Proposal: an Elman-style word-sequencing network operates in parallel with the word-semantics/filter network.

[Diagram: the current word and the current surface context feed the word-sequencing network, which produces a predicted next word (wd seq), compared with the actual next word (Err).]
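
For readers who haven't met one, here is a minimal sketch of an Elman-style simple recurrent network (standard SRN machinery rather than the lecture's implementation): the current word plus a copy of the previous hidden state (the 'surface context') predict the next word. The vocabulary, layer sizes and random weights are illustrative, and training (backpropagation) is omitted.

# Minimal Elman-style simple recurrent network (SRN) sketch.
# The recurrent 'context' carries the current surface context from
# word to word; weights are random and untrained here.

import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["winnie", "the", "pooh", "grabbed", "cup", "EOS"]
V, H = len(VOCAB), 20

W_in  = rng.normal(0, 0.1, (H, V))    # current word -> hidden
W_ctx = rng.normal(0, 0.1, (H, H))    # previous hidden (context) -> hidden
W_out = rng.normal(0, 0.1, (V, H))    # hidden -> next-word scores

def one_hot(word):
    v = np.zeros(V)
    v[VOCAB.index(word)] = 1.0
    return v

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def step(word, context):
    """One SRN step: return the next-word distribution and the new context."""
    hidden = np.tanh(W_in @ one_hot(word) + W_ctx @ context)
    return softmax(W_out @ hidden), hidden

context = np.zeros(H)
for w in ["winnie", "the", "pooh", "grabbed"]:
    p_next, context = step(w, context)
    print(w, "-> predicted next:", VOCAB[int(p_next.argmax())])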

A network for syntactic processing

Both networks generate predictions about the next word in an utterance.

[Diagram: the episode rehearsal system (episode plan, object rep, 'advance') and word-semantics network produce a predicted next word (wd sem); the current word and current surface context drive the word-sequencing network, which produces a predicted next word (wd seq); each prediction is compared with the actual next word (Err), and the filter network, which also receives the sequencing network's entropy (Ent), selects the predicted next word that is output.]

A network for syntactic processing

For 'compositional' language, the episode-rehearsal/word-meaning network's predictions are more accurate.

[Diagram: combined architecture, as above.]

A network for syntactic processing

But for idiomatic language, the word-sequencing network's predictions are more accurate.

[Diagram: combined architecture, as above.]

A network for syntactic processing

The filter network has to learn when to rely on which predictor.

[Diagram: combined architecture, as above.]

A network for syntactic processing

It can use the entropy of the word-sequencing network to decide this.

Entropy measures how confident the word-sequencing network is in its prediction: low entropy means a peaked, confident distribution over the next word; high entropy means an uncertain one.

[Diagram: combined architecture, as above.]
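
Here is a minimal sketch of how that entropy-based decision could work (my own illustration, with an arbitrary threshold): if the word-sequencing network's next-word distribution has (near-)zero entropy, take its word; otherwise fall back on the compositional (word-semantics) prediction.

# Sketch of entropy-based gating between the two next-word predictors.
# p_seq is the word-sequencing network's next-word distribution;
# word_sem is the word-semantics network's candidate word.
# The threshold value is an illustrative assumption.

import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a next-word distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def choose_next_word(p_seq, vocab, word_sem, threshold=0.1):
    """Low entropy: the sequencing network is confident (e.g. inside an
    idiom), so take its word. Otherwise use the compositional word."""
    if entropy(p_seq) <= threshold:
        return vocab[int(np.argmax(p_seq))]
    return word_sem

vocab = ["winnie", "the", "pooh", "grabbed", "cup"]
print(choose_next_word([0.0, 1.0, 0.0, 0.0, 0.0], vocab, "GRABBED"))  # -> the
print(choose_next_word([0.3, 0.2, 0.2, 0.2, 0.1], vocab, "GRABBED"))  # -> GRABBED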

A network for syntactic processing

Here's an example of a sentence containing an idiom:

    Winnie the pooh grabbed cup

[Diagram: combined architecture, as above.]

A network for syntactic processing

For the first word, the compositional network is more accurate.

[Diagram: at the start of the sentence the word-sequencing network's prediction is uncertain ('???', entropy > 0), while the word-semantics route (context C1, Pooh rep) outputs WINNIE; WINNIE is selected as the output word.]

A network for syntactic processing

For the second word (part of an idiom), the word-sequencing network is more accurate. So its output is chosen.

[Diagram: with surface context Start/winnie the word-sequencing network predicts THE with entropy 0, while the word-semantics route (context C1, Pooh rep) outputs GRABBED; THE is selected.]

A network for syntactic processing

Likewise for the third word (still part of the idiom).

[Diagram: with surface context winnie/the the word-sequencing network predicts POOH with entropy 0, while the word-semantics route outputs GRABBED; POOH is selected.]

A network for syntactic processing

For the fourth word (no longer part of the idiom), the compositional network is again more accurate.

[Diagram: with surface context the/pooh the word-sequencing network is again uncertain ('???', entropy > 0), while the word-semantics route (context C1, Pooh rep) outputs GRABBED; GRABBED is selected.]
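
Putting the pieces together, here is a sketch of the word-by-word generation loop for 'Winnie the pooh grabbed cup' (an illustration under the same assumptions as the snippets above, not the lecture's implementation). A lookup table of confident idiom continuations stands in for the trained word-sequencing network, and a queue of episode words stands in for the rehearsal/word-semantics route.

# Sketch of the generation loop for "winnie the pooh grabbed cup".
# Surface bigram contexts where the sequencing network would be
# confident (entropy ~0) are listed explicitly; everywhere else the
# compositional (episode rehearsal) word is used. Both stand-ins are
# illustrative assumptions.

IDIOM_CONTINUATIONS = {
    ("<s>", "winnie"): "the",
    ("winnie", "the"): "pooh",
}

episode_words = ["winnie", "grabbed", "cup"]   # agent, action, patient

def generate():
    output, context = [], ("<s>", "<s>")
    compositional = iter(episode_words)
    next_compositional = next(compositional)
    while next_compositional is not None:
        if context in IDIOM_CONTINUATIONS:      # low entropy: sequencing route
            word = IDIOM_CONTINUATIONS[context]
        else:                                   # high entropy: compositional route
            word = next_compositional
            next_compositional = next(compositional, None)
        output.append(word)
        context = (context[1], word)
    return output

print(" ".join(generate()))   # winnie the pooh grabbed cup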

A network for syntactic processing

Discontinuous idioms are a bit more tricky.

E.g. John takes Bill to task.

[Diagram: combined architecture, as above.]

A network for syntactic processing

One way of dealing with these is to allow word semantics to contribute to the context representation of the word-sequencing network.

[Diagram: combined architecture, with word semantics also feeding the word-sequencing network's context.]
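
In SRN terms, that suggestion could amount to concatenating a semantic vector onto the input that shapes the context units, so that the sequencing network can stay confident about '... to task' even when other material ('Bill') intervenes. A minimal sketch under that assumption (it extends the SRN sketch above; the sizes and the 'criticise plan' vector are illustrative):

# Sketch: word semantics contributing to the word-sequencing network's
# context. Each step sees the current word AND a semantic representation
# (e.g. the 'criticise' plan behind "takes ... to task").

import numpy as np

rng = np.random.default_rng(1)
V, S, H = 10, 8, 20            # vocab size, semantic-rep size, hidden size

W_in  = rng.normal(0, 0.1, (H, V + S))   # word one-hot + semantic rep -> hidden
W_ctx = rng.normal(0, 0.1, (H, H))       # previous hidden (context) -> hidden
W_out = rng.normal(0, 0.1, (V, H))       # hidden -> next-word scores

def step(word_onehot, semantic_rep, context):
    """One step of the extended SRN: the semantic rep is part of what
    drives the new context, so it can bridge the gap in the idiom."""
    x = np.concatenate([word_onehot, semantic_rep])
    hidden = np.tanh(W_in @ x + W_ctx @ context)
    return W_out @ hidden, hidden        # unnormalised next-word scores

context = np.zeros(H)
takes = np.zeros(V); takes[3] = 1.0       # one-hot for "takes" (index arbitrary)
criticise_plan = rng.normal(0, 1.0, S)    # semantic rep of the idiom's plan
scores, context = step(takes, criticise_plan, context)
print("best next-word index:", int(scores.argmax()))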

A network for syntactic processing

The first word of the idiom is produced compositionally.

[Diagram: after JOHN, the word-semantics route (criticise plan, Bill rep) outputs TAKES; the word-sequencing network is uncertain ('???', entropy > 0); TAKES is selected.]

A network for syntactic processing

Then another compositional expression is produced. And the word-sequencing network's context is updated.

[Diagram: with surface context John/takes the word-sequencing network is still uncertain ('???', entropy > 0); the word-semantics route (criticise plan, Bill rep) outputs BILL; BILL is selected.]

A network for syntactic processing

For the second part of the idiom, the word-sequencing network is again confident.

[Diagram: with the context takes/NP plus the 'criticise' semantics, the word-sequencing network predicts TO with entropy 0; TO is selected over the word-semantics output.]

A network for syntactic processing

How does this network map onto the brain?

[Diagram: combined architecture, as above.]

A network for syntactic processing

The filter network is part of Broca's area.

Language is about learning control/gating strategies. (C.f. Earl Miller's model of PFC.)

[Diagram: combined architecture, as above.]

A network for syntactic processing

The word-sequencing network is also part of Broca's area.

Language is about sequencing abilities. (C.f. Elman and many others.)

[Diagram: combined architecture, as above.]

A network for syntactic processing

The episode-rehearsal system is also part of Broca's area.

Language is about working-memory episode representations. (C.f. Baddeley and many others.)

[Diagram: combined architecture, as above.]

A network for syntactic processing

The word-semantics network is part of 'Wernicke's area' (and associated temporal regions).

[Diagram: combined architecture, as above.]

A network for syntactic processing

What can the network say about the developmental stages of syntax in childhood?

[Diagram: combined architecture, as above.]

A network for syntactic processing

The single-word / holophrase stage:

Children represent their own intentions in the 'episode plan' medium. And they attend to what they want...

[Diagram: combined architecture, as above.]

A network for syntactic processing

The pivot-schemas / item-based constructions stage:

Perhaps there can be direct connections from semantic representations to the word-sequencing network.

[Diagram: combined architecture, as above.]
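
One way to picture that suggestion is as an item-based frame whose slot is filled directly from the semantics of the currently attended object. The sketch below is purely illustrative (the frames and the WORD_FOR mapping are assumptions, not the lecture's proposal spelled out in code).

# Sketch of a pivot schema ("my X", "more X") as an item-based sequence
# whose slot is filled directly from an attended object's semantic rep,
# i.e. a direct connection from semantics into the sequencing route.

WORD_FOR = {"cup-rep": "cup", "cake-rep": "cake", "doggy-rep": "doggy"}

PIVOT_FRAMES = {
    "my":   ["my", "<OBJ>"],
    "more": ["more", "<OBJ>"],
}

def produce(pivot, attended_object_rep):
    """Fill the pivot frame's slot from the attended object's semantics."""
    return [WORD_FOR[attended_object_rep] if w == "<OBJ>" else w
            for w in PIVOT_FRAMES[pivot]]

print(" ".join(produce("my", "cup-rep")))      # my cup
print(" ".join(produce("more", "cake-rep")))   # more cake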