June 6: General Introduction and “Framing Event Variables”

Post on 24-Feb-2016


June 6: General Introduction and “Framing Event Variables”

June 13: “I-Languages, T-Sentences, and Liars”

June 20: “Words, Concepts, and Conjoinability”

June 27: “Meanings as Concept Assembly Instructions”

SLIDES POSTED BEFORE EACH TALK terpconnect.umd.edu/~pietro

(OR GOOGLE ‘pietroski’ AND FOLLOW THE LINK) pietro@umd.edu

Meanings First: Context and Content Lectures, Institut Jean Nicod

Reminders of last two weeks...

Human Language: a language that human children can naturally acquire

(D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language

(C) each human language is an i-language: a biologically implementable procedure that generates expressions that connect meanings with articulations

(B) each human language is an i-language for which there is a theory of truth that is also the core of an adequate theory of meaning for that i-language

(D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language

Good Ideas vs. Bad Companion Ideas

Good: “e-positions” allow for conjunction reductions
Bad companion: “e-positions” are Tarskian variables that have mind-independent values

Alvin moved to Venice happily. Alvin moved to Venice.

∃e∃e′∃e″[AL(e′) & MOVED(e, e′) & TO(e, e″) & VENICE(e″) & HAPPILY(e)]
∃e∃e′∃e″[AL(e′) & MOVED(e, e′) & TO(e, e″) & VENICE(e″)]

(D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language

Good Ideas vs. Bad Companion Ideas

Good: “e-positions” allow for conjunction reductions
Bad companion: “e-positions” are Tarskian variables that have mind-independent values

Alvin moved to Venice happily. Alvin moved to Venice.
Alvin chased Theodore happily. Theodore chased Alvin unhappily.
Alvin moved Torcello to Venice. Alvin chased Pegasus.

(D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language

Good Ideas vs. Bad Companion Ideas

Good:
“e-positions” allow for conjunction reductions
as Foster’s Problem reveals, humans compute meanings via specific operations
Liar Sentences don’t preclude meaning theories for human i-languages

Bad companion:
“e-positions” are Tarskian variables that have mind-independent values
the meanings computed are truth-theoretic properties of human i-language expressions
Liar T-sentences are true (‘The first sentence is true.’ iff the first sentence is true.)

(D) for each human language, there is a theory of truth that is also the core of an adequate theory of meaning for that language

Good Ideas vs. Bad Companion Ideas

“e-positions” allow for conjunction reductions: yields good analyses of specific constructions

characterizing meaning in truth-theoretic terms: such characterization also helps address foundational issues concerning how human linguistic expressions could exhibit meanings at all

as Foster’s Problem reveals, humans compute meanings via specific operations

Liar Sentences don’t preclude meaning theories for human i-languages

Weeks 3 and 4: Short Form

• In acquiring words, kids use available concepts to introduce new ones

Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'

• Meanings are instructions for how to access and combine i-concepts

--lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address

--introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand

'ride fast' RIDE( )^FAST( )
'fast horse' FAST( )^HORSE( )
'horses' HORSE( )^PLURAL( )

PLURAL( ) => COUNTABLE(_)

Weeks 3 and 4: Short Form

• In acquiring words, kids use available concepts to introduce new ones.

Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'

• Meanings are instructions for how to access and combine i-concepts

--lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address

--introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand

'fast horses' FAST( )^HORSES( )

'ride horses' RIDE( )^[Θ( , _)^HORSES(_)]

Weeks 3 and 4: Short Form

• In acquiring words, kids use available concepts to introduce new ones.

Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'

• Meanings are instructions for how to access and combine i-concepts

--lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address

--introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand

'fast horses' FAST( )^HORSES( )

'ride horses' RIDE( )^[Θ( , _)^HORSES(_)]

Meaning('fast horses') = JOIN{Meaning('fast'), Meaning('horses')}
Meaning('ride horses') = JOIN{Meaning('ride'), Θ[Meaning('horses')]}
= JOIN{fetch@'ride', Θ[Meaning('horses')]}
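The JOIN/Θ composition above can be modeled in a few lines of Python. This is an illustrative toy, not the lectures' formalism: the dict encoding of entities and events, and the names join, theta, FAST, HORSES, and RIDE, are all assumptions. Monadic concepts are one-place predicates, JOIN conjoins them, and Θ turns an entity concept into an event concept via the theme role.

```python
# Toy model: monadic concepts are one-place predicates; JOIN conjoins them.
# Events are dicts with a 'theme' participant; Θ maps an entity concept to an
# event concept (true of e iff e's theme satisfies the entity concept).
# All names and encodings here are illustrative assumptions.

def join(c1, c2):
    """JOIN: conjoin two monadic concepts into a complex monadic concept."""
    return lambda x: c1(x) and c2(x)

def theta(entity_concept):
    """Θ: true of an event e iff e's theme satisfies the entity concept."""
    return lambda e: entity_concept(e["theme"])

FAST = lambda x: x.get("fast", False)
HORSES = lambda x: x.get("kind") == "horse" and x.get("plural", False)
RIDE = lambda e: e.get("kind") == "ride"

# Meaning('fast horses') = JOIN{FAST, HORSES}
fast_horses = join(FAST, HORSES)
# Meaning('ride horses') = JOIN{RIDE, Θ[HORSES]}
ride_horses = join(RIDE, theta(HORSES))

some_horses = {"kind": "horse", "plural": True, "fast": True}
a_ride = {"kind": "ride", "theme": some_horses}
print(fast_horses(some_horses))  # True
print(ride_horses(a_ride))       # True
```

Note that both results are again one-place predicates: composition never leaves the monadic mold, which is the point of the slide.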

Weeks 3 and 4: Short Form

• In acquiring words, kids use available concepts to introduce new ones

Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'

• Meanings are instructions for how to access and combine i-concepts

--lexicalizing RIDE(_, _) puts RIDE(_) at an accessible address

--introduced concepts can be conjoined via simple operations that require neither Tarskian variables nor a Tarskian ampersand

'ride horses' RIDE( )^[Θ( , _)^HORSES(_)]

'ride fast horses' RIDE( )^[Θ( , _)^FAST(_)^HORSES(_)]

'ride horses fast' RIDE( )^[Θ( , _)^HORSES(_)]^FAST( )

Weeks 3 and 4: Very Short Form

• In acquiring words, kids use available concepts to introduce i-concepts, which can be “joined” to form conjunctive monadic concepts,

which may or may not have Tarskian satisfiers.

'fast horses' FAST( )^HORSES( )
'ride horses' RIDE( )^[Θ( , _)^HORSES(_)]
'ride fast horses' RIDE( )^[Θ( , _)^FAST(_)^HORSES(_)]
'ride fast horses fast' RIDE( )^[Θ( , _)^FAST(_)^HORSES(_)]^FAST( )

• Some Implications

Verbs do not fetch genuinely relational concepts

Verbs are not saturated by grammatical arguments

The number of arguments that a verb can/must combine with is not determined by the concept that the verb fetches

Words, Concepts, and Conjoinability

What makes humans linguistically special?

(i) Lexicalization: capacity to acquire words

(ii) Combination: capacity to combine words

(iii) Lexicalization and Combination

(iv) Distinctive concepts that get paired with signals

(v) Something else entirely

FACT: human children are the world’s best lexicalizers

SUGGESTION: focus on lexicalization is independently plausible

Constrained Homophony Again

• A doctor rode a horse from Texas

• A doctor rode a horse, and

(i) the horse was from Texas

(ii) the ride was from Texas

why not…

(iii) the doctor was from Texas

Leading Idea (to be explained and defended)

• In acquiring words, we use available concepts to introduce new ones

Sound(’ride’) + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + ’ride’

• The new concepts can be systematically conjoined in limited ways

'rode a horse from Texas'

RODE(_) & [Θ(_, _) & HORSE(_) & FROM(_, TEXAS)]

RIDE(_) & PAST(_) & [Θ(_, _) & HORSE(_) & [FROM(_, _) & TEXAS(_)]]

RODE(_) & [Θ(_, _) & HORSE(_)] & FROM(_, TEXAS)

∃y[RODE(x, y) & HORSE(y)] & FROM(x, TEXAS)

A doctor rode a horse from Texas

A doctor rode a horse that was from Texas
∃x{Doctor(x) & ∃y[Rode(x, y) & Horse(y) & From(y, Texas)]}

A doctor rode a horse from Texas

A doctor rode a horse and the ride was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(e, Texas)]}

A doctor rode a horse from Texas

A doctor rode a horse that was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(y, Texas)]}

A doctor rode a horse from Texas

A doctor rode a horse and the ride was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(e, Texas)]}

A doctor rode a horse from Texas

A doctor rode a horse and the ride was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(e, Texas)]}

But why doesn’t the structure below support a different meaning: A doctor both rode a horse and was from Texas
∃e∃x{Doctor(x) & ∃y[Rode(e, x, y) & Horse(y) & From(x, Texas)]}

Why can’t we hear the verb phrase as a predicate that is satisfied by x iff x rode a horse & x is from Texas?

• In acquiring words, we use available concepts to introduce new ones

Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'

• The new concepts can be systematically conjoined in limited ways

'rode a horse from Texas'

RODE(_) & [Θ(_, _) & HORSE(_) & FROM(_, TEXAS)]
RODE(_) & [Θ(_, _) & HORSE(_)] & FROM(_, TEXAS)
∃y[RODE(e, x, y) & HORSE(y)] & FROM(x, TEXAS)

if 'rode' has a rider-variable, why can’t it be targeted by 'from Texas’?

Verbs don’t fetch genuinely relational concepts. A phrasal meaning leaves no choice about which variable to target.

• In acquiring words, we use available concepts to introduce new ones

Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'

• The new concepts can be systematically conjoined in limited ways

'rode a horse from Texas'

RODE(_)^[Θ(_, _)^HORSE(_)^FROM(_, TEXAS)]
RODE(_)^[Θ(_, _)^HORSE(_)]^FROM(_, TEXAS)
∃y[RODE(e, x, y) & HORSE(y)] & FROM(x, TEXAS)

Composition is simple and constrained, but unbounded.

Phrasal meanings are generable, but always monadic. Lexicalization introduces concepts that can be systematically combined in simple ways.
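A toy Python model can make the constrained homophony vivid; the event/participant encoding below is an assumption for illustration, not a claim from the slides. 'From Texas' can conjoin inside the object (targeting the horse) or at the verb-phrase level (targeting the event), but monadic composition never exposes the rider variable.

```python
# Toy event-semantic model of 'A doctor rode a horse from Texas' (illustrative;
# the dict representation of events and entities is an assumption).

def join(c1, c2):
    """Conjoin two monadic concepts."""
    return lambda x: c1(x) and c2(x)

def theta(entity_concept):
    """Θ: true of an event e iff e's theme satisfies the entity concept."""
    return lambda e: entity_concept(e["theme"])

HORSE = lambda x: x["kind"] == "horse"
RODE = lambda e: e["kind"] == "ride"
FROM_TEXAS = lambda x: x.get("from") == "Texas"

texas_horse = {"kind": "horse", "from": "Texas"}
doctor = {"kind": "doctor", "from": "Rome"}
ride = {"kind": "ride", "agent": doctor, "theme": texas_horse, "from": "Rome"}

# Reading (i): 'from Texas' conjoined inside the object: the horse was from Texas
reading_horse = join(RODE, theta(join(HORSE, FROM_TEXAS)))
# Reading (ii): 'from Texas' conjoined at the VP level: the ride was from Texas
reading_ride = join(join(RODE, theta(HORSE)), FROM_TEXAS)

print(reading_horse(ride))  # True: the horse is from Texas
print(reading_ride(ride))   # False: the ride itself is from Rome
# There is no third construction: every conjunct is a predicate of the event
# or (via Θ) its theme, so the agent variable is never available to FROM_TEXAS.
```

The absent reading ('the doctor was from Texas') corresponds to no well-formed combination of join and theta, mirroring the constraint the slide describes.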

• In acquiring words, we use available concepts to introduce new ones

Sound('ride') + RIDE(_, _) ==> RIDE(_) + RIDE(_, _) + 'ride'

• DISTINGUISH

Lexicalized concepts, L-concepts

RIDE(_, _)   GIVE(_, _, _)   ALVIN   HORSE(_)   RIDE(_, _, ...)   MORTAL(_, _)

Introduced concepts, I-concepts

RIDE(_)   GIVE(_)   CALLED(_, Sound('Alvin'))   MORTAL(_)   HORSE(_)

hypothesis: I-concepts exhibit less typology than L-concepts
special case: I-concepts exhibit fewer adicities than L-concepts

Conceptual Adicity

Two Common Metaphors

• Jigsaw Puzzles

• 7th Grade Chemistry: H–O–H (valences: H +1, O −2, H +1)

Jigsaw Metaphor

A THOUGHT

Jigsaw Metaphor

Unsaturated + Saturater
Doubly Unsaturated + 1st saturater + 2nd saturater

one Monadic Concept (adicity: -1)
“filled by” one Saturater (adicity: +1)
yields a complete Thought

Brutus + Sang( )

one Dyadic Concept (adicity: -2)
“filled by” two Saturaters (adicity: +1 each)
yields a complete Thought

Brutus + Caesar + KICK(_, _)

7th Grade Chemistry Metaphor

a molecule of water: H–O–H
valences: each H is +1, O is −2; so H(+1) plus (OH)(−1) forms a stable (net 0) molecule

a single atom with valence −2 can combine with two atoms of valence +1 to form a stable molecule

7th Grade Chemistry Metaphor

Brutus(+1) (Kick(−2) Caesar(+1))(−1)

7th Grade Chemistry Metaphor

Na(+1) Cl(−1)

an atom with valence −1 can combine with an atom of valence +1 to form a stable molecule

Brutus(+1) Sang(−1)

Extending the Metaphor

Aggie (+1)   Brown( ) (−1)              Aggie is brown
Aggie (+1)   Cow( ) (−1)                Aggie is (a) cow
Aggie (+1)   Brown( ) & Cow( ) (−1)     Aggie is (a) brown cow

Extending the Metaphor

Aggie (+1)   Brown( ) (−1)
Aggie (+1)   Cow( ) (−1)

Conjoining two monadic (−1) concepts can yield a complex monadic (−1) concept:

Aggie (+1)   Brown( ) & Cow( ) (−1)
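The valence bookkeeping behind the metaphor can be sketched as follows; this is a toy model, and the names saturate and conjoin_monadic are hypothetical. Saturation sums adicities toward 0 (a complete Thought), while conjunction of monadic (-1) concepts yields another -1 concept rather than summing valences.

```python
# Toy adicity bookkeeping for the jigsaw/chemistry metaphor (illustrative only).
# Saturaters have valence +1; a complete Thought has net valence 0.
# Conjoining monadic (-1) concepts does NOT sum valences: the result is still -1.

def saturate(concept_adicity, n_saturaters):
    """Each saturater (+1) fills one slot of a concept with negative adicity."""
    return concept_adicity + n_saturaters

def conjoin_monadic(*adicities):
    """Conjunction of monadic concepts yields a monadic concept."""
    assert all(a == -1 for a in adicities)
    return -1

# Sang(-1) + Brutus(+1) -> complete Thought (0)
print(saturate(-1, 1))                        # 0
# KICK(-2) + Brutus(+1) + Caesar(+1) -> complete Thought (0)
print(saturate(-2, 2))                        # 0
# Brown(-1) ^ Cow(-1) -> still monadic (-1), unlike valence summation
print(conjoin_monadic(-1, -1))                # -1
# Aggie(+1) + [Brown & Cow](-1) -> complete Thought (0)
print(saturate(conjoin_monadic(-1, -1), 1))   # 0
```

The asymmetry between the two operations is the slide's point: saturation consumes slots, while conjunction preserves monadicity.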

Conceptual Adicity

Two Common Metaphors: Jigsaw Puzzles; 7th Grade Chemistry

DISTINGUISH
Lexicalized concepts, L-concepts: RIDE(_, _)   GIVE(_, _, _)   ALVIN
Introduced concepts, I-concepts: RIDE(_)   GIVE(_)   CALLED(_, Sound(’Alvin’))

hypothesis: I-concepts exhibit less typology than L-concepts
special case: I-concepts exhibit fewer adicities than L-concepts

A Different (and older) Hypothesis

(1) concepts predate words

(2) words label concepts

• Acquiring words is basically a process of pairing pre-existing concepts with perceptible signals

• Lexicalization is a conceptually passive operation

• Word combination mirrors concept combination

• Sentence structure mirrors thought structure

Bloom: How Children Learn the Meanings of Words

• word meanings are, at least primarily, concepts that kids have prior to lexicalization

• learning word meanings is, at least primarily, a process of figuring out which

existing concepts are paired with which word-sized signals

• in this process, kids draw on many capacities—e.g., recognition of syntactic cues and speaker intentions—but no capacities specific to acquiring word meanings

Lidz, Gleitman, and Gleitman

“Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and therefore sneeze is intransitive, but it takes two for a kicking act (kicker and kickee), and hence kick is transitive.

Of course there are quirks and provisos to these systematic form-to-meaning-correspondences…”


Why Not...

Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is not a function of the number of participants logically implied by the verb meaning. A paradigmatic act of kicking has exactly two participants (kicker and kickee), and yet kick need not be transitive.

Brutus kicked Caesar the ball
Caesar was kicked
Brutus kicked
Brutus gave Caesar a swift kick

Of course there are quirks and provisos. Some verbs do require a certain number of noun phrases in active voice sentences.

*Brutus put the ball
*Brutus put
*Brutus sneezed Caesar

Concept of adicity n ==> Concept of adicity n + Perceptible Signal + quirky information for lexical items like ‘kick’

Concept of adicity −1 ==> Concept of adicity −1 + Perceptible Signal + quirky information for lexical items like ‘put’

Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is a function of the number of participants logically implied by the verb meaning.

It takes only one to sneeze, and therefore sneeze is intransitive, but it takes two for a kicking act (kicker and kickee), and hence kick is transitive.

Of course there are quirks and provisos to these systematic form-to-meaning-correspondences.

Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence isn’t a function of the number of participants logically implied by the verb meaning.

It takes only one to sneeze, and usually sneeze is intransitive. But it usually takes two to have a kicking; and yet kick can be intransitive.

Of course there are quirks and provisos. Some verbs do require a certain number of noun phrases in active voice sentences.

Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is a function of the number of participants logically implied by the verb meaning.

It takes only one to sneeze, and therefore sneeze is intransitive, but it takes two for a kicking act (kicker and kickee), and hence kick is transitive.

Of course there are quirks and provisos to these systematic form-to-meaning-correspondences.

Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence isn’t a function of the number of participants logically implied by the verb meaning.

It takes only one to sneeze, and sneeze is typically used intransitively; but a paradigmatic kicking has exactly two participants, and yet kick can be used intransitively or ditransitively.

Of course there are quirks and provisos. Some verbs do require a certain number of noun phrases in active voice sentences.

Quirks and Provisos, or Normal Cases?

KICK(x1, x2) The baby kicked

RIDE(x1, x2) Can you give me a ride?

BETWEEN(x1, x2, x3) I am between him and her

why not: I between him her

BIGGER(x1, x2) This is bigger than that

why not: This bigs that

MORTAL(…?...) Socrates is mortal

A mortal wound is fatal

FATHER(…?...) Fathers father

Fathers father future fathers

EAT/DINE/GRAZE(…?...)

Lexicalization as Concept-Introduction (not mere labeling)

Concept of type T ==> Concept of type T + Concept of type T* + Perceptible Signal

Lexicalization as Concept-Introduction (not mere labeling)

Number(_), type <e, t> ==> Number(_), type <e, t> + NumberOf[_, Φ(_)], type <<e, t>, <n, t>> + Perceptible Signal

Lexicalization as Concept-Introduction (not mere labeling)

Concept of type T ==> Concept of type T + Concept of type T* + Perceptible Signal

Concept of adicity -1 ==> Concept of adicity -1 + Concept of adicity -2 + Perceptible Signal

ARRIVE(x) ==> ARRIVE(e, x)

One Possible (Davidsonian) Application: Increase Adicity

Concept of adicity -2 ==> Concept of adicity -2 + Concept of adicity -3 + Perceptible Signal

KICK(x1, x2) ==> KICK(e, x1, x2)

One Possible (Davidsonian) Application: Increase Adicity

Concept of adicity n ==> Concept of adicity n + Concept of adicity -1 + Perceptible Signal

KICK(x1, x2), KICK(e, x1, x2) ==> KICK(e)

Lexicalization as Concept-Introduction: Make Monads

Picture 1: Concept of adicity n ==> Concept of adicity n (or n−1) + Perceptible Signal + further lexical information (regarding flexibilities)

Picture 2: Concept of adicity n ==> Concept of adicity −1 + Perceptible Signal + further lexical information (regarding inflexibilities)

Two Pictures of Lexicalization

Experience and Growth take the Language Acquisition Device from its Initial State to a Mature State (an I-Language): GRAMMAR and LEXICON

Phonological Instructions connect to Articulation and Perception of Signals
Semantic Instructions connect to Lexicalizable concepts, Lexicalized concepts, and Introduced concepts

Picture 1: Concept of adicity n ==> Concept of adicity n (or n−1) + Perceptible Signal + further lexical information (regarding flexibilities)

Picture 2: Concept of adicity n ==> Concept of adicity −1 + Perceptible Signal + further lexical information (regarding inflexibilities)

Two Pictures of Lexicalization

Subcategorization

A verb can access a monadic concept and impose further (idiosyncratic) restrictions on complex expressions

• Semantic Composition Adicity Number (SCAN)
(instructions to fetch) singular concepts   +1   singular   <e>
(instructions to fetch) monadic concepts    -1   monadic    <e, t>
(instructions to fetch) dyadic concepts     -2   dyadic     <e, <e, t>>

• Property of Smallest Sentential Entourage (POSSE): zero NPs, one NP, two NPs, …

the SCAN of every verb can be -1, while POSSEs vary: zero, one, two, …

POSSE facts may reflect

...the adicities of the original concepts lexicalized

...statistics about how verbs are used (e.g., in active voice)

...prototypicality effects

...other agrammatical factors

• ‘put’ may have a (lexically represented) POSSE of three in part because

--the concept lexicalized was PUT(_, _, _)
--the frequency of locatives (as in ‘put the cup on the table’) is salient

• and note: *I put the cup the table   ?I placed the cup

On any view: Two Kinds of Facts to Accommodate

• Flexibilities
– Brutus kicked Caesar
– Caesar was kicked
– The baby kicked
– I get a kick out of you
– Brutus kicked Caesar the ball

• Inflexibilities
– Brutus put the ball on the table
– *Brutus put the ball
– *Brutus put on the table

On any view: Two Kinds of Facts to Accommodate

• Flexibilities
– The coin melted
– The jeweler melted the coin
– The fire melted the coin
– The coin vanished
– The magician vanished the coin

• Inflexibilities
– Brutus arrived
– *Brutus arrived Caesar

Concept of adicity n ==> Concept of adicity n + Concept of adicity −1 + Perceptible Signal + further POSSE information, as for ‘put’

Two Pictures of Lexicalization

Word: SCAN -1

Last Task for Today (which will carry over to next time):

offer some reminders of the reasons for adopting the second picture

Absent Word Meanings

Striking absence of certain (open-class) lexical meanings that would be permitted if Human I-Languages permitted nonmonadic semantic types

<e,<e,<e,<e, t>>>> (instructions to fetch) tetradic concepts
<e,<e,<e, t>>> (instructions to fetch) triadic concepts
<e,<e, t>> (instructions to fetch) dyadic concepts

<e> (instructions to fetch) singular concepts

Proper Nouns

• even English tells against the idea that lexical proper nouns label singular concepts (of type <e>)

• Every Tyler I saw was a philosopher
Every philosopher I saw was a Tyler
There were three Tylers at the party
That Tyler stayed late, and so did this one
Philosophers have wheels, and Tylers have stripes
The Tylers are coming to dinner
I spotted Tyler Burge

I spotted that nice Professor Burge who we met before

• proper nouns seem to fetch monadic concepts, even if they lexicalize singular concepts

Concept of adicity n ==> Concept of adicity n + Concept of adicity -1 + Perceptible Signal

TYLER ==> TYLER(x), i.e. CALLED[x, SOUND(‘Tyler’)]

Lexicalization as Concept-Introduction: Make Monads

Concept of adicity n ==> Concept of adicity n + Concept of adicity -1 + Perceptible Signal

KICK(x1, x2), KICK(e, x1, x2) ==> KICK(e)

Lexicalization as Concept-Introduction: Make Monads

Concept of adicity n ==> Concept of adicity n + Concept of adicity -1 + Perceptible Signal

TYLER ==> TYLER(x), i.e. CALLED[x, SOUND(‘Tyler’)]

Lexicalization as Concept-Introduction: Make Monads

Absent Word Meanings

Striking absence of certain (open-class) lexical meanings that would be permitted if I-Languages permit nonmonadic semantic types

<e,<e,<e,<e, t>>>> (instructions to fetch) tetradic concepts
<e,<e,<e, t>>> (instructions to fetch) triadic concepts
<e,<e, t>> (instructions to fetch) dyadic concepts

<e> (instructions to fetch) singular concepts

Absent Word Meanings

Brutus sald a car Caesar a dollar

sald SOLD(x, $, z, y)

[sald [a car]] SOLD(x, $, z, a car)

[[sald [a car]] Caesar] SOLD(x, $, Caesar, a car)

[[[sald [a car]] Caesar] a dollar] SOLD(x, a dollar, Caesar, a car)
_________________________________________________

Caesar bought a car

bought a car from Brutus for a dollar

bought Antony a car from Brutus for a dollar

x sold y to z (in exchange) for $

Absent Word Meanings

Brutus tweens Caesar Antony

tweens BETWEEN(x, z, y)

[tweens Caesar] BETWEEN(x, z, Caesar)

[[tweens Caesar] Antony] BETWEEN(x, Antony, Caesar)

_______________________________________________________

Brutus sold Caesar a car

Brutus gave Caesar a car *Brutus donated a charity a car

Brutus gave a car away Brutus donated a car

Brutus gave at the office Brutus donated anonymously

Absent Word Meanings

Alexander jimmed the lock a knife

jimmed JIMMIED(x, z, y)

[jimmed [the lock]] JIMMIED(x, z, the lock)

[[jimmed [the lock]] [a knife]] JIMMIED(x, a knife, the lock)

_________________________________________________

Brutus froms Rome

froms COMES-FROM(x, y)

[froms Rome] COMES-FROM(x, Rome)

Absent Word Meanings

Brutus talls Caesar

talls IS-TALLER-THAN(x, y)

[talls Caesar] IS-TALLER-THAN(x, Caesar)

_________________________________________

*Julius Caesar

Julius JULIUS

Caesar CAESAR

Absent Word Meanings

Striking absence of certain (open-class) lexical meanings that would be permitted if I-Languages permit nonmonadic semantic types

<e,<e,<e,<e, t>>>> (instructions to fetch) tetradic concepts
<e,<e,<e, t>>> (instructions to fetch) triadic concepts
<e,<e, t>> (instructions to fetch) dyadic concepts

<e> (instructions to fetch) singular concepts

I’ll come back to this next week

What makes humans linguistically special?

(i) Lexicalization: capacity to acquire words

(ii) Combination: capacity to combine words

(iii) Lexicalization and Combination

(iv) Distinctive concepts that get paired with signals

(v) Something else entirely

FACT: human children are the world’s best lexicalizers

One of Aristotle’s Observations

Some animals are born early, and take time to grow into their “second nature”


Experience and Growth take the Language Acquisition Device from its Initial State to a Mature State (an I-Language): GRAMMAR and LEXICON

Phonological Instructions connect to Articulation and Perception of Signals
Semantic Instructions connect to Lexicalizable concepts, Lexicalized concepts, and Introduced concepts

Weeks 3 and 4: Very Short Form

• In acquiring words, kids use available concepts to introduce i-concepts, which can be “joined” to form conjunctive monadic concepts,

which may or may not have Tarskian satisfiers.

'fast horses' FAST( )^HORSES( )
'ride horses' RIDE( )^[Θ( , _)^HORSES(_)]
'ride fast horses' RIDE( )^[Θ( , _)^FAST(_)^HORSES(_)]
'ride fast horses fast' RIDE( )^[Θ( , _)^FAST(_)^HORSES(_)]^FAST( )

• Some Implications

Verbs do not fetch genuinely relational concepts

Verbs are not saturated by grammatical arguments

The number of arguments that a verb can/must combine with is not determined by the concept that the verb fetches

Words, Concepts, and Conjoinability

THANKS!

On this view, meanings are neither extensions nor concepts. Familiar difficulties for the idea that lexical meanings are concepts:

polysemy 1 meaning, 1 cluster of concepts (in 1 mind)

intersubjectivity 1 meaning, 2 concepts (in 2 minds)

jabber(wocky) 1 meaning, 0 concepts (in 1 mind)

But a single instruction to fetch a concept from a certain address can be associated with more (or less) than one concept

Meaning constancy, at least for purposes of meaning composition

Lots of Conjoiners

• P & Q purely propositional

• Fx &M Gx purely monadic

• ??? ???

• Rx1x2 &DF Sx1x2 purely dyadic, with fixed order
Rx1x2 &DA Sx2x1 purely dyadic, any order

• Rx1x2 &PF Tx1x2x3x4 polyadic, with fixed order
Rx1x2 &PA Tx3x4x1x5 polyadic, any order
Rx1x2 &PA Tx3x4x5x6 (the number of variables in the conjunction can exceed the number in either conjunct)

NOT EXTENSIONALLY EQUIVALENT

Lots of Conjoiners, Semantics

• If π and π* are propositions, then TRUE(π & π*) iff TRUE(π) and TRUE(π*)

• If π and π* are monadic predicates, then for each entity x: APPLIES[(π &M π*), x] iff APPLIES[π, x] and APPLIES[π*, x]

• If π and π* are dyadic predicates, then for each ordered pair o: APPLIES[(π &DA π*), o] iff APPLIES[π, o] and APPLIES[π*, o]

• If π and π* are predicates, then for each sequence σ:
SATISFIES[σ, (π &PA π*)] iff SATISFIES[σ, π] and SATISFIES[σ, π*]
equivalently, APPLIES[(π &PA π*), σ] iff APPLIES[π, σ] and APPLIES[π*, σ]
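These application conditions can be sketched in Python; this is an illustrative model, and the encoding of pairs as tuples and sequences as 1-indexed lists is an assumption. Each conjoiner is just conjunction, but the objects it applies to differ: entities, ordered pairs, or whole variable assignments.

```python
# Toy implementations of the conjoiners' application conditions (illustrative).

def and_m(p, q):
    """&M: monadic conjunction; (p &M q) applies to x iff p and q apply to x."""
    return lambda x: p(x) and q(x)

def and_da(p, q):
    """&DA: dyadic conjunction over ordered pairs <x, y>."""
    return lambda pair: p(pair) and q(pair)

def and_pa(p, q):
    """&PA: polyadic conjunction; a sequence (assignment) satisfies the
    conjunction iff it satisfies both conjuncts. Since each conjunct reads
    its own variable positions, the conjunction can involve more variables
    than either conjunct alone."""
    return lambda seq: p(seq) and q(seq)

F = lambda x: x > 0
G = lambda x: x % 2 == 0
print(and_m(F, G)(4))        # True: 4 is positive and even

R = lambda pair: pair[0] < pair[1]
S = lambda pair: pair[1] - pair[0] == 1
print(and_da(R, S)((2, 3)))  # True

# Rx1x2 &PA Tx3x4x5x6: the conjuncts read different positions of one sequence
Rx1x2 = lambda seq: seq[1] < seq[2]
Tx3x4x5x6 = lambda seq: seq[3] + seq[4] == seq[5] + seq[6]
sigma = [None, 1, 2, 3, 4, 5, 2]  # 1-indexed assignment; seq[0] unused
print(and_pa(Rx1x2, Tx3x4x5x6)(sigma))  # True: 1 < 2 and 3 + 4 == 5 + 2
```

The three functions have identical bodies; what distinguishes the conjoiners, and makes them non-equivalent, is the kind of object they are evaluated against.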

Lots of Conjoiners

• P & Q purely propositional

• Fx &M Gx purely monadic

Fx^Gx ; Rex^Gx   a monad can “join” with a monad or a dyad (with order fixed)

• Rx1x2 &DF Sx1x2 purely dyadic, with fixed order
Rx1x2 &DA Sx2x1 purely dyadic, any order

• Rx1x2 &PF Tx1x2x3x4 polyadic, with fixed order
Rx1x2 &PA Tx3x4x1x5 polyadic, any order
Rx1x2 &PA Tx3x4x5x6 (the number of variables in the conjunction can exceed the number in either conjunct)

A Restricted Conjoiner and Closer, allowing for a smidgeon of dyadicity

• If M is a monadic predicate and D is a dyadic predicate,

then for each ordered pair <x, y>:

the junction D^M applies to <x, y> iff

D applies to <x, y> and M applies to y

• [D^M] applies to x iff for some y, D^M applies to <x, y>, i.e., for some y, D applies to <x, y> and M applies to y

A Restricted Conjoiner and Closer, allowing for a smidgeon of dyadicity

• If M is a monadic predicate and D is a dyadic predicate,

then for each ordered pair <x, y>:

the junction D^M applies to <x, y> iff

D applies to <x, y> and M applies to y

• [Into(_, _)^Barn(_)] applies to x iff for some y, Into(_, _)^Barn(_) applies to <x, y>, i.e., for some y, Into(_, _) applies to <x, y> and Barn(_) applies to y
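The restricted conjoiner and closer can be sketched in Python. The finite domain used to evaluate "for some y", and the dict encodings of entities, are assumptions of this toy model, not part of the lectures' proposal.

```python
# Toy model of the restricted conjoiner D^M and its existential closure
# (illustrative; the finite domain for 'some y' is an assumption).

def junction(D, M):
    """D^M applies to <x, y> iff D applies to <x, y> and M applies to y."""
    return lambda x, y: D(x, y) and M(y)

def close(DM, domain):
    """[D^M] applies to x iff for some y in the domain, D^M applies to <x, y>."""
    return lambda x: any(DM(x, y) for y in domain)

# Into(_, _)^Barn(_): x went into y, and y is a barn
barn = {"kind": "barn"}
silo = {"kind": "silo"}
Into = lambda x, y: y in x.get("went_into", [])
Barn = lambda y: y["kind"] == "barn"

horse = {"kind": "horse", "went_into": [barn]}
cow = {"kind": "cow", "went_into": [silo]}

into_barn = junction(Into, Barn)
went_into_a_barn = close(into_barn, [barn, silo])

print(went_into_a_barn(horse))  # True: some y is a barn the horse went into
print(went_into_a_barn(cow))    # False: the cow only went into a silo
```

Note that close returns a one-place predicate: the smidgeon of dyadicity is confined to the junction, and closure restores monadicity.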