
The Role of Animacy Information in Expectation-based Sentence Comprehension
Zhong Chen (Rochester Institute of Technology, USA)
[email protected]

1. Introduction
• Sentence processing is predictive. Each incoming word introduces information that helps us shape expectations about the rest of the sentence (Hale, 2001; Levy, 2008).

• STRUCTURAL INFORMATION, e.g. grammatical category, phrase structure hierarchy, or syntactic movement, often revises existing parses. In contrast, new words also convey NON-STRUCTURAL INFORMATION such as thematic relations or information structure.

• But how do structural and non-structural expectations interact during comprehension? This work investigates the role of noun phrase ANIMACY in parsing by modeling two eye-tracking experiments using ENTROPY REDUCTION.

2. Entropy Reduction
• Consider ENTROPY, the information-theoretic notion, in a language-processing setting. The random variable X ranges over derivations of a probabilistic grammar G.

H(X) = −∑_{x∈X} p(x) log₂ p(x)
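As a sanity check on this definition, a few lines of Python (with made-up probabilities, not the grammar weights used in this study) compute the entropy of a distribution over derivations:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum_x p(x) * log2 p(x)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 8 derivations carries 3 bits of uncertainty;
# a more peaked distribution carries less.
print(entropy([0.125] * 8))           # → 3.0
print(entropy([0.7, 0.1, 0.1, 0.1]))  # less than 2 bits
```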

• Extending the entropy notation to conditioning events: if w_1 … w_i is an initial substring of a sentence generated by G, the conditional entropy H_i is the uncertainty about derivations that have w_1 … w_i as a prefix.
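The conditioning step can be illustrated with a toy distribution over complete derivations (the strings and probabilities below are invented for illustration): a longer prefix filters the set of compatible derivations, and the surviving mass is renormalized.

```python
import math

# Toy distribution over complete derivations (illustrative only).
derivations = {
    "the dog barked": 0.4,
    "the dog slept":  0.3,
    "the cat slept":  0.2,
    "a cat slept":    0.1,
}

def conditional_entropy(prefix):
    """H_i: entropy (bits) over derivations whose yield starts with the prefix."""
    compat = {s: p for s, p in derivations.items() if s.startswith(prefix)}
    z = sum(compat.values())  # renormalize over the surviving derivations
    return -sum((p / z) * math.log2(p / z) for p in compat.values())

h1 = conditional_entropy("the")      # three derivations still in play
h2 = conditional_entropy("the dog")  # only two remain: uncertainty drops
```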

• ENTROPY REDUCTION (ER) is a complexity metric for sentence comprehension that quantifies the amount of information a word contributes toward reducing uncertainty (Hale, 2003, 2006; Frank et al., 2015).

ER_i = H_{i−1} − H_i if this difference > 0; otherwise ER_i = 0
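In code (a minimal sketch; the per-word entropies below are hypothetical, not taken from the modeled grammars), the metric truncates negative differences at zero:

```python
def entropy_reduction(h_prev, h_curr):
    """ER_i = H_{i-1} - H_i if the difference is positive, else 0."""
    return max(h_prev - h_curr, 0.0)

# Hypothetical entropies H_0..H_3 over the course of a sentence:
hs = [10.0, 7.5, 8.2, 3.1]
ers = [round(entropy_reduction(a, b), 2) for a, b in zip(hs, hs[1:])]
print(ers)  # → [2.5, 0.0, 5.1]  (the rise from 7.5 to 8.2 contributes nothing)
```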

• The uncertainty level depends on the weighted, predictive syntactic analyses that are "still in play" at a given point. This work takes one step further by considering readers' expectations about the non-structural factor of animacy.

Selected References
Frank, S., Otten, L., Galli, G., & Vigliocco, G. (2015). The ERP response to the amount of information conveyed by words in sentences. Brain and Language, 140, 1–11.
Hale, J. (2006). Uncertainty about the rest of the sentence. Cognitive Science, 30(4).
Linzen, T., & Jaeger, T. F. (in press). Uncertainty and expectation in sentence processing: Evidence from subcategorization distributions. Cognitive Science.
Lowder, M., & Gordon, P. (2012). The pistol that injured the cowboy: Difficulty with inanimate subject-verb integration is reduced by structural separation. Journal of Memory and Language, 66(4), 819–832.
Traxler, M., Morris, R., & Seely, R. (2002). Processing subject and object relative clauses: Evidence from eye movements. Journal of Memory and Language, 47, 69–90.
Traxler, M., Williams, R., Blozis, S., & Morris, R. (2005). Working memory, animacy, and verb class in the processing of relative clauses. Journal of Memory and Language, 53, 204–224.

3. Animacy Information
• This study adopts a binary classification of noun phrase animacy, namely +ANIM vs. −ANIM, similar to previous work (MacWhinney et al., 1984; Traxler et al., 2002).

• The frequency distributions of NP animacy are obtained from the annotated, hand-parsed Switchboard corpus of conversational American English (Zaenen et al., 2004).

4. Modeling Incremental Comprehension (Cornell CCPC)

• The modeling pipeline:
  Minimalist Grammar (Stabler, 1997)
  → Weighted Multiple Context-Free Grammar, with constructions weighted by corpus counts
  → 'Intersection' Grammar conditioned on prefixes (Nederhof & Satta, 2008)
  → weighted, predictive syntactic analysis

• Each prefix of the sentence is parsed; the uncertainty at a word is computed over the set of derivations still compatible with that prefix.

[Figure: pipeline diagram. The panel reproduces the Cornell CCPC poster "Uncertainty and Prediction in Relativized Structures across East Asian Languages" (Chen, Yun, Whitman & Hale, 25th Annual CUNY Conference on Human Sentence Processing, March 14–16, 2012), which applies Entropy Reduction to relative clauses and clefts in Korean, Chinese, and Japanese.]

5. Example 1: Traxler et al. (2002, 2005)
• Animacy facilitates the assignment of thematic roles and affects how ambiguities are resolved in relative clauses. The subject advantage (SUBJ ADV) of SRs is less prominent when inanimate heads are involved, especially among readers with high working memory capacity (WMC).

SUBJ ADV within the three-word region after the head (one value per SR/OR pair):

Type | Example                              | Head  | Embedded | Traxler et al. (2002) | Traxler et al. (2005), high WMC | This study (ER)
SR   | The director that watched the movie… | +anim | −anim    | 294 ms                | 239 ms                          | 1.01 bit
OR   | The director that the movie pleased… | +anim | −anim    |                       |                                 |
SR   | The movie that pleased the director… | −anim | +anim    | 39 ms                 | −49 ms                          | −0.73 bit
OR   | The movie that the director watched… | −anim | +anim    |                       |                                 |

• ER predictions mirror the reversed SUBJ ADV observed within the high-WMC group, confirming a recent finding that ER is a stronger predictor of processing difficulty when a larger amount of syntactic lookahead is considered (Linzen & Jaeger, in press).

• SRs with inanimate heads are in fact harder than their OR counterparts because (1) they are less frequent and (2) the parser is highly uncertain about choosing SR or OR as the continuation given an inanimate head. More disambiguation effort is made later, within the RC region.

(a) Animate prefix
Prob    Remainder
0.106   Det AniN who Vt Det InaniN Vi
0.078   Det AniN who Vt Pronoun Vi
0.056   Det AniN who Vt Det AniN Vi
0.055   Det AniN who Vi Vi
0.026   Det AniN who Vt Det InaniN Vt Det InaniN
0.020   Det AniN who Vt Pronoun Vt Det InaniN
0.019   Det AniN who Det AniN Vt Vi
0.018   Det AniN who Pronoun Vt Vi
0.017   Det AniN who Vt Det InaniN Vdi Det AniN to Det InaniN
0.015   Det AniN who Vt Det InaniN Vdi Det InaniN to Det InaniN
…       …
entropy = 7.929

(b) Inanimate prefix
Prob    Remainder
0.047   Det InaniN who Vt Det InaniN Vi
0.047   Det InaniN who Det AniN Vt Vi
0.046   Det InaniN who Pronoun Vt Vi
0.035   Det InaniN who Vt Pronoun Vi
0.034   Det InaniN who Vi Vi
0.025   Det InaniN who Vt Det AniN Vi
0.016   Det InaniN who Vt Det InaniN Vt Det InaniN
0.016   Det InaniN who Det AniN Vt Vt Det InaniN
0.016   Det InaniN who Pronoun Vt Vt Det InaniN
0.012   Det InaniN who Vt Pronoun Vt Det InaniN
…       …
entropy = 9.092
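The listed derivations are only the ten most probable in each case, so the full entropies (7.929 and 9.092 bits) cannot be recomputed from them, but even the listed mass shows why the inanimate prefix is more uncertain: its probability is spread more thinly across analyses. A quick check in Python:

```python
# Top-10 remainder probabilities from the two tables above.
animate   = [0.106, 0.078, 0.056, 0.055, 0.026, 0.020, 0.019, 0.018, 0.017, 0.015]
inanimate = [0.047, 0.047, 0.046, 0.035, 0.034, 0.025, 0.016, 0.016, 0.016, 0.012]

# More mass concentrated in the best analyses means lower entropy.
print(round(sum(animate), 3))    # → 0.41
print(round(sum(inanimate), 3))  # → 0.294
```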


6. Example 2: Lowder & Gordon (2012)
• Animacy also interacts with the depth of structural relations. The difficulty of integrating a subject with an action verb is significantly reduced when the subject is inanimate and the verb occurs across a clausal boundary in an embedded clause, e.g. in an SR.

Subject NP | Verb location | Example                               | Lowder & Gordon (2012) | This study (ER)
+anim      | matrix        | The cowboy concealed the pistol…      | 379 ms                 | 1.12 bit
+anim      | embedded      | The cowboy that concealed the pistol… | 343 ms                 | 1.04 bit
           |               | clausal-boundary effect at the verb:  | 36 ms                  | 0.08 bit
−anim      | matrix        | The pistol injured the cowboy…        | 415 ms                 | 2.32 bit
−anim      | embedded      | The pistol that injured the cowboy…   | 322 ms                 | 1.86 bit
           |               | clausal-boundary effect at the verb:  | 93 ms                  | 0.45 bit

• ER predictions are similar in that the magnitude of the clausal-boundary effect is amplified with an inanimate subject. This is because the inanimate subject allows a variety of continuations, while more disambiguation is done at the main verb than at the embedded verb.

(a) Before the matrix verb (prefix "Det InaniN")
Prob    Remainder
0.304   Det InaniN Vi
0.105   Det InaniN Vt Det InaniN
0.067   Det InaniN Vdi Det AniN to Det InaniN
0.061   Det InaniN Vdi Det InaniN to Det InaniN
0.055   Det InaniN Vt Det AniN
0.054   Det InaniN Vi in Det InaniN
0.037   Det InaniN Vt Pronoun
0.019   Det InaniN Vi in Det AniN
0.012   Det InaniN Vdi Det AniN to Det AniN
0.011   Det InaniN Vdi Det InaniN to Det AniN
…       …
entropy = 5.374

(b) After the matrix verb (prefix "Det InaniN Vt")
Prob    Remainder
0.424   Det InaniN Vt Det InaniN
0.223   Det InaniN Vt Det AniN
0.148   Det InaniN Vt Pronoun
0.043   Det InaniN Vt Det InaniN in Det InaniN
0.023   Det InaniN Vt Det AniN in Det InaniN
0.015   Det InaniN Vt Det InaniN in Det AniN
0.015   Det InaniN Vt Pronoun in Det InaniN
0.008   Det InaniN Vt Det AniN in Det AniN
0.006   Det InaniN Vt Det InaniN who Vt Det InaniN
0.006   Det InaniN Vt Det InaniN who Det AniN Vt
…       …
entropy = 3.058

ER at the matrix verb = 5.374 − 3.058 ≈ 2.32 bits
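The ER value at the matrix verb follows directly from the two entropies above (simple arithmetic, reproduced here as a check):

```python
h_before = 5.374  # entropy given the inanimate-subject prefix (table a)
h_after  = 3.058  # entropy once the matrix verb is seen (table b)

er = max(h_before - h_after, 0.0)  # Entropy Reduction at the verb
print(round(er, 2))  # → 2.32
```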

7. Conclusions
• This work models the animacy effects observed in comprehension experiments and provides linguistically plausible interpretations.

• It visualizes the alternative derivations that are "still in play" by using probabilistic grammars, and therefore allows us to describe the interaction between structural and non-structural expectations.
