Statistical Structure in Natural Language
Transcript of slides: tsjackson/pdf/tsj_PACM_slides.pdf · 2014. 4. 14.

Page 1

Statistical Structure in Natural Language

Tom Jackson · PACM seminar, April 4th · Advisor: Bill Bialek

Page 2

Some Linguistic Questions

- Why does language work so well?
  - Unlimited capacity and flexibility
  - Requires little conscious computational effort
- Why does language work at all?
  - Wittgenstein's dilemma
  - Rather accurate transmission of meaning from one mind to another
- How do we acquire language so easily?
  - Poverty of the stimulus
  - Everyone learns 'the same' language

Page 3

Some Linguistic Answers

- Language as 'words and rules'
  - Semantic and syntactic components of language
- Hierarchical structure
  - Morphology (intra-word)
  - Syntax (intra-sentence)
  - Compositional rules and paragraph structure
- Chomsky: grammar must be 'universal'
- Nowak: evolution of syntactic communication
  - More efficient to express an increasing number of concepts using combinatorial rules

[Photo: Noam Chomsky]

Page 4

Computational Linguistics

- Language performance vs. language competence
  - The 'Chomskyan Revolution'
  - Linguistics as psychology
  - Corpus-based methods outside the mainstream
- Inherent drawbacks
  - No access to contextual meaning
  - Word meaning only defined through relationships to other words
  - Vive la différance!
- Relevant for applications
- What are appropriate metrics to use?

[Photo: Ferdinand de Saussure]

Page 5

Information Theory

- Entropy: information capacity

  H(X) = -\sum_{x} p_x \log p_x

- Mutual information: the information X provides about Y, and vice versa

  I(X;Y) = H(X) + H(Y) - H(X,Y)
         = \sum_{x,y} p_{xy} \log \frac{p_{xy}}{p_x\, p_y}

[Photo: Claude Shannon]
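The two definitions above can be computed directly from raw counts; a minimal plug-in sketch (function names are mine, not from the slides):

```python
from collections import Counter
from math import log

def entropy(counts):
    """Plug-in entropy in nats: H = -sum_x p_x log p_x, with p_x = c_x / N."""
    n = sum(counts)
    return -sum(c / n * log(c / n) for c in counts if c > 0)

def mutual_information(pair_counts):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from joint counts {(x, y): count}."""
    x_counts = Counter()
    y_counts = Counter()
    for (x, y), c in pair_counts.items():
        x_counts[x] += c
        y_counts[y] += c
    return (entropy(x_counts.values()) + entropy(y_counts.values())
            - entropy(pair_counts.values()))
```

Independent variables give I = 0; two perfectly correlated binary symbols give I = log 2 nats.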

Page 6

Naïve Entropy Estimation

- Pro: entirely straightforward
- Con: useless for the interesting, undersampled case N ≲ K
  - In the undersampled regime, just because we don't see an event doesn't mean it has zero probability
  - Analogous to overfitting a curve to data
  - More appropriate to consider a "smooth" class of distributions
- Problem: how to apply Occam's principle in an unbiased way?

\hat{H}(\{n_i\}) = -\sum_{i=1}^{K} \frac{n_i}{N} \log \frac{n_i}{N}
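The downward bias of the plug-in estimator is easy to exhibit: with N draws there are at most N occupied bins, so the estimate can never exceed log N. A sketch (names mine), assuming a uniform distribution over K bins:

```python
import random
from collections import Counter
from math import log

def naive_entropy(counts, n):
    """Plug-in estimator: H_hat = -sum_i (n_i/N) log(n_i/N)."""
    return -sum(c / n * log(c / n) for c in counts if c > 0)

random.seed(0)
K, N = 1000, 100                       # undersampled: N < K
samples = [random.randrange(K) for _ in range(N)]
h_hat = naive_entropy(Counter(samples).values(), N)
h_true = log(K)                        # uniform distribution: H = log K
# At most N bins are occupied, so h_hat <= log N < log K = h_true:
# the plug-in estimate is badly biased downward in this regime.
```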

Page 7

Bayesian Inference

- Bayes' Theorem:

  \Pr(\text{model} \mid \text{data}) = \frac{\Pr(\text{data} \mid \text{model})\,\Pr(\text{model})}{\Pr(\text{data})}

- Here our prior is on the space of distributions. Generalize the uniform prior to the Dirichlet prior parameterized by β:

  P(\{q_i\}) = \frac{1}{Z}\, \delta\!\left(1 - \sum_{i=1}^{K} q_i\right) \prod_{i=1}^{K} q_i^{\beta - 1}

- Posterior mean:

  \langle q_i \rangle_\beta = \frac{n_i + \beta}{N + K\beta}
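The posterior mean above amounts to "add-β" smoothing: every bin, seen or unseen, gets β pseudo-counts. A minimal sketch (function name is mine); β = 1 recovers Laplace's rule of succession:

```python
def dirichlet_posterior_mean(counts, K, beta):
    """Posterior mean <q_i> = (n_i + beta) / (N + K*beta) under a symmetric
    Dirichlet(beta) prior; unseen bins get mass beta / (N + K*beta)."""
    N = sum(counts.values())
    denom = N + K * beta
    return {i: (counts.get(i, 0) + beta) / denom for i in range(K)}
```

With counts {0: 3, 1: 1}, K = 4 and β = 1, the estimate is (4/8, 2/8, 1/8, 1/8): the two unseen bins retain nonzero probability.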

Page 8

Bias in the Dirichlet Prior

- Prior is highly biased for fixed values of β
- Obtain an unbiased prior by averaging over β

[Figure: a priori (all n_i = 0) entropy estimate as a function of β, for K = 10, K = 100, and K = 1000; 1σ error shaded]

Page 9

The Estimator

- Computational complexity still linear in N

\hat{S}_m = \frac{1}{Z(\{n_i\})} \int d\beta\, \frac{d\xi}{d\beta}\, \rho(\{n_i\}, \beta)\, \langle S^m \rangle_\beta

\rho(\{n_i\}, \beta) = \frac{\Gamma(K\beta)}{\Gamma(N + K\beta)} \prod_{i=1}^{K} \frac{\Gamma(n_i + \beta)}{\Gamma(\beta)}

\langle S \rangle_\beta = -\sum_{i=1}^{K} \frac{n_i + \beta}{N + K\beta} \left[ \Psi(n_i + \beta + 1) - \Psi(N + K\beta + 1) \right]
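The fixed-β factor ⟨S⟩_β inside the integral can be evaluated directly from the digamma formula above; a sketch (names mine), with a standard recurrence-plus-asymptotic-series approximation for Ψ since the Python standard library lacks one:

```python
from math import log

def digamma(x):
    """Psi(x) via the recurrence Psi(x+1) = Psi(x) + 1/x plus an
    asymptotic series; adequate accuracy for x > 0."""
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + log(x) - 0.5 / x - f * (1/12 - f * (1/120 - f / 252))

def mean_entropy(counts, K, beta):
    """Posterior mean entropy at fixed beta:
    <S>_beta = -sum_i (n_i+b)/(N+Kb) * (Psi(n_i+b+1) - Psi(N+Kb+1))."""
    N = sum(counts)
    counts = list(counts) + [0] * (K - len(counts))  # pad unseen bins
    T = N + K * beta
    return -sum((n + beta) / T * (digamma(n + beta + 1) - digamma(T + 1))
                for n in counts)
```

As a sanity check: with K = 2, β = 1 and no data, ⟨S⟩ = Ψ(3) − Ψ(2) = 1/2 nat, the known mean entropy of a binary distribution drawn uniformly from the simplex; as β → ∞ the prior concentrates on the uniform distribution and ⟨S⟩ → log K.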

Page 10

Getting to Work

- Summer '03 research project
- MI estimation algorithm implemented in ~5K lines of C code
- Texts used:
  - Project Gutenberg's Moby Dick
  - Project Gutenberg's War and Peace (K = 18004, N = 567657)
  - Reuters news articles corpus (K = 40696, N = 2478116)
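The measured quantity is the mutual information between words at a fixed separation d within the text. A plug-in version for illustration (the analysis on these slides used the Bayesian estimator instead; names here are mine):

```python
from collections import Counter
from math import log

def pairwise_mi(tokens, d):
    """Plug-in estimate of I(w_t ; w_{t+d}) over all positions t, in nats.
    Requires d >= 1 and len(tokens) > d."""
    pairs = Counter(zip(tokens, tokens[d:]))   # joint counts at separation d
    left = Counter(tokens[:-d])                # marginal of the first word
    right = Counter(tokens[d:])                # marginal of the second word
    n = sum(pairs.values())
    mi = 0.0
    for (x, y), c in pairs.items():
        pxy = c / n
        mi += pxy * log(pxy * n * n / (left[x] * right[y]))
    return mi
```

A strictly alternating text like "a b a b ..." has one bit (log 2 nats) of information between adjacent words, while a constant text has none.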

Page 11

Results 1

[Figure: War and Peace data; curves: original text, sentence order scrambled, words within sentence scrambled, all words scrambled]

Page 12

Results 2

[Figure: War and Peace data; curves: original text, sentence order scrambled, words within sentence scrambled, all words scrambled]

Page 13

Results 3

[Figure: War and Peace data (original text, sentence order scrambled, words within sentence scrambled, all words scrambled) and Reuters corpus]
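The three scrambling controls in the figures are simple permutations that preserve unigram statistics while destroying structure at different scales; a sketch (names mine), taking a text as a list of sentences, each a list of words:

```python
import random

def scramble_sentences(sents, rng):
    """Permute sentence order; word order inside each sentence is kept,
    so only long-range (inter-sentence) structure is destroyed."""
    out = list(sents)
    rng.shuffle(out)
    return out

def scramble_within(sents, rng):
    """Permute words inside each sentence; sentence order is kept,
    destroying short-range (syntactic) structure only."""
    out = []
    for s in sents:
        w = list(s)
        rng.shuffle(w)
        out.append(w)
    return out

def scramble_all(sents, rng):
    """Permute all words across the whole text, destroying all order
    and leaving only the unigram distribution."""
    words = [w for s in sents for w in s]
    rng.shuffle(words)
    return words
```

All three leave word frequencies, and hence the marginal entropy, unchanged; only the mutual information between positions differs.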

Page 14

Information Bottleneck

- Are we really observing a crossover between syntactic and semantic constraints?
  - What "carries" the information at a given separation?
- Information Bottleneck method: compress X into X' while maximizing I(X';Y)
  - Two algorithms to determine p(x'|x):
    - Simulated annealing (top-down clustering)
    - Agglomerative (bottom-up clustering)
- Difficult to implement in this case
  - Start with K ~ 10^5; expect relevant clusters at K' ~ 10
  - Considerable degeneracy from hapax legomena
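For reference, the compression step above is a variational problem in the standard Tishby–Pereira–Bialek formulation (not spelled out on the slide); the trade-off parameter is conventionally also written β, and is unrelated to the Dirichlet β used earlier:

```latex
\min_{p(x'|x)} \; \mathcal{L}\big[p(x'|x)\big] \;=\; I(X;X') \;-\; \beta_{\mathrm{IB}}\, I(X';Y)
```

The agglomerative algorithm approximates the hard-clustering limit: starting from K singleton clusters, it repeatedly merges the pair of clusters whose merge sacrifices the least relevant information I(X';Y).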

Page 15

Conclusions

- Robust estimation of functionals of probability distributions is possible even with undersampling
- Pairwise mutual information displays power-law behavior, with breaks at semantically significant length scales

Page 16

Possible Applications

- Semantic extraction
  - Automatic document summarization
  - Search engines
- Data compression
  - Shannon's source coding theorem
  - Existing algorithms work at finite range only

Page 17

For further reading

- Language
  - Words and Rules: The Ingredients of Language, by Steven Pinker
  - Nature 417:611
- Entropy estimation
  - Physical Review E 52:6841
  - xxx.lanl.gov/abs/physics/0108025

Page 18

End