Recognizing Contextual Polarity in Phrase-Level
Sentiment Analysis
Theresa Wilson, Janyce Wiebe, Paul Hoffmann
University of Pittsburgh
HLT-EMNLP 2005
Introduction
Sentiment analysis: the task of identifying positive and negative opinions, emotions, and evaluations.
How detailed? Depends on the application:
Flame detection, review classification: document-level analysis
Question answering, review mining: sentence- or phrase-level analysis
Question Answering Example
Q: What is the international reaction to the reelection of Robert Mugabe as President of Zimbabwe?
African observers generally approved of his victory while Western Governments denounced it.
Prior Polarity versus Contextual Polarity
Most approaches use a lexicon of positive and negative words.
Prior polarity: positive or negative out of context (beautiful: positive; horrid: negative).
A word may appear in a phrase that expresses a different polarity in context, its contextual polarity:
“Cheers to Timothy Whitfield for the wonderfully horrid visuals.”
Example
Philip Clap, President of the National Environment Trust, sums up well the general thrust of the reaction of environmental movements: there is no reason at all to believe that the polluters are suddenly going to become reasonable.
[Slide animation: the same sentence shown repeatedly, highlighting prior polarity versus contextual polarity annotations.]
Goal of Our Research
Automatically distinguish prior and contextual polarity
Approach
Use machine learning and a variety of features.
Achieve significant results for a large subset of sentiment expressions.
[Diagram: two-step classification. Step 1 uses the corpus and lexicon to label all instances as neutral or polar; Step 2 determines the contextual polarity of the polar instances.]
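The two-step flow in this diagram can be sketched as a small pipeline. The `step1` and `step2` classifiers below are hypothetical rule stubs standing in for the learned models, included only to show the control flow:

```python
def classify_contextual_polarity(instance, step1, step2):
    """Two-step classification: step1 labels an instance neutral or polar;
    only polar instances reach step2, which assigns the contextual
    polarity (positive, negative, both, or neutral)."""
    if step1(instance) == "neutral":
        return "neutral"
    return step2(instance)

# Hypothetical stand-ins for the learned classifiers.
step1 = lambda word: "polar" if word in {"horrid", "beautiful"} else "neutral"
step2 = lambda word: "negative" if word == "horrid" else "positive"
```

With these stubs, "horrid" comes out negative, while a word unknown to Step 1 short-circuits to neutral and never reaches Step 2.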
Outline
Introduction · Manual Annotations · Corpus · Prior-Polarity Subjectivity Lexicon · Experiments · Previous Work · Conclusions
Manual Annotations
Need: sentiment expressions with contextual polarity, i.e., positive and negative expressions of emotions, evaluations, and stances.
Had: subjective expression annotations in the MPQA Opinion Corpus (http://nrrc.mitre.org/NRRC/publications.htm): words/phrases expressing emotions, evaluations, stances, speculations, etc.
Sentiment expressions are a subset of subjective expressions.
Decision: annotate subjective expressions in the MPQA Corpus with their contextual polarity.
Annotation Scheme
Mark the polarity of subjective expressions as positive, negative, both, or neutral:
African observers generally approved [positive] of his victory while Western governments denounced [negative] it.
Besides, politicians refer to good and evil … [both]
Jerome says the hospital feels no different than a hospital in the states. [neutral]
Annotation Scheme
Judge the contextual polarity of the sentiment ultimately being conveyed:
They have not succeeded, and will never succeed, in breaking the will of this valiant people. [positive]
Agreement Study
10 documents with 447 subjective expressions; Kappa: 0.72 (82% agreement).
Removing uncertain cases, those at least one annotator marked uncertain (18%): Kappa: 0.84 (90% agreement).
(But all data were included in the experiments.)
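The agreement figures pair raw percent agreement with Cohen's Kappa, which corrects for chance agreement. A minimal computation on toy labels (not the study's data):

```python
from collections import Counter

def cohens_kappa(ann1, ann2):
    """Cohen's Kappa: chance-corrected agreement between two annotators
    who labeled the same items."""
    n = len(ann1)
    observed = sum(a == b for a, b in zip(ann1, ann2)) / n
    c1, c2 = Counter(ann1), Counter(ann2)
    # Expected agreement if each annotator labeled independently
    # according to their own marginal label distribution.
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)
```

On four toy items where the annotators agree on three, this yields a Kappa well below the 0.75 observed agreement, illustrating the chance correction.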
Corpus
425 documents from the MPQA Opinion Corpus; 15,991 subjective expressions in 8,984 sentences.
Divided into two sets:
Development set: 66 docs / 2,808 subjective expressions
Experiment set: 359 docs / 13,183 subjective expressions, divided into 10 folds for cross-validation
Prior-Polarity Subjectivity Lexicon
Over 8,000 words from a variety of sources, both manually and automatically identified.
Positive/negative words from the General Inquirer and Hatzivassiloglou and McKeown (1997).
All words in the lexicon are tagged with:
Prior polarity: positive, negative, both, or neutral
Reliability: strongly subjective (strongsubj) or weakly subjective (weaksubj)
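A lexicon entry pairs a prior polarity with a reliability tag; a toy representation (the entries are illustrative, not taken from the actual lexicon):

```python
# Illustrative entries only; the real lexicon has over 8,000 words.
PRIOR_LEXICON = {
    "beautiful": ("positive", "strongsubj"),
    "horrid": ("negative", "strongsubj"),
}

def lookup(word):
    """Return (prior polarity, reliability); unknown words get neutral."""
    return PRIOR_LEXICON.get(word.lower(), ("neutral", None))
```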
Experiments
Give each instance its own label
Both steps: BoosTexter AdaBoost.HM, 5,000 rounds of boosting, 10-fold cross-validation.
Definition of Gold Standard
Given an instance inst from the lexicon:
if inst is not in a subjective expression:
    goldclass(inst) = neutral
else if inst is in at least one positive and one negative subjective expression:
    goldclass(inst) = both
else if inst is in a mixture of negative and neutral:
    goldclass(inst) = negative
else if inst is in a mixture of positive and neutral:
    goldclass(inst) = positive
else:
    goldclass(inst) = contextual polarity of the subjective expression
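The rules above translate directly into a function. Here `polarities` is assumed, for illustration, to be the list of contextual polarities of the subjective expressions containing the instance (empty when it is in none):

```python
def gold_class(polarities):
    """Assign a gold-standard class to a lexicon instance from the
    contextual polarities of the subjective expressions containing it."""
    if not polarities:                     # not in any subjective expression
        return "neutral"
    kinds = set(polarities)
    if {"positive", "negative"} <= kinds:  # at least one of each
        return "both"
    if kinds == {"negative", "neutral"}:
        return "negative"
    if kinds == {"positive", "neutral"}:
        return "positive"
    return polarities[0]                   # single shared polarity
```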
Features
Many inspired by Polanyi & Zaenen (2004), Contextual Valence Shifters. Examples: little threat, little truth.
Others capture dependency relationships between words. Example: wonderfully (mod) modifying horrid (pos).
1. Word features
2. Modification features
3. Structure features
4. Sentence features
5. Document feature
1. Word features
Word token: terrifies
Word part-of-speech: VB
Context: that terrifies me
Prior polarity: negative
Reliability: strongsubj
2. Modification features
Binary features:
Preceded by an adjective; by an adverb (other than not); by an intensifier
Self intensifier
Modifies a strongsubj or weaksubj clue
Modified by a strongsubj or weaksubj clue
[Dependency parse tree: The human rights report poses a substantial challenge … with det, adj, mod, subj, and obj relations.]
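The "preceded by" features can be read off the part-of-speech tag of the previous token. A minimal sketch using Penn Treebank tags, with an assumed, purely illustrative intensifier list:

```python
# Illustrative intensifier list; the talk does not specify one.
INTENSIFIERS = {"very", "really", "extremely"}

def preceded_by(tokens, tags, i):
    """Binary modification features (sketch): is token i preceded by an
    adjective, an adverb other than 'not', or an intensifier?"""
    if i == 0:
        return {"prev_adj": False, "prev_adv": False, "prev_intens": False}
    word, tag = tokens[i - 1].lower(), tags[i - 1]
    return {
        "prev_adj": tag.startswith("JJ"),
        "prev_adv": tag.startswith("RB") and word != "not",
        "prev_intens": word in INTENSIFIERS,
    }
```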
3. Structure features
Binary features:
In subject: The human rights report poses …
In copular: I am confident …
In passive voice: … must be regarded …
4. Sentence features
Count of strongsubj clues in the previous, current, and next sentence
Count of weaksubj clues in the previous, current, and next sentence
Counts of various parts of speech
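The clue-count features can be sketched as counts over a three-sentence window; whitespace tokenization here is a simplification for illustration:

```python
def clue_counts(sentences, idx, clues):
    """Count lexicon clues of a given reliability class in the previous,
    current, and next sentence around sentence idx (sketch)."""
    counts = []
    for k in (idx - 1, idx, idx + 1):
        if 0 <= k < len(sentences):
            counts.append(sum(w.lower() in clues for w in sentences[k].split()))
        else:
            counts.append(0)  # no previous/next sentence at the boundary
    return counts
```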
5. Document feature
Document topic (15 topics): economics, health, Kyoto protocol, presidential election in Zimbabwe, …
Example: The disease can be contracted if a person is bitten by a certain tick or if a person comes into contact with the blood of a congo fever sufferer.
Results 1a (Step 1: neutral or polar)
[Bar chart: Accuracy, Polar F, and Neutral F for Word token, Word + Prior Polarity, and All Features. With all features: Accuracy 75.9, Polar F 63.4, Neutral F 82.1.]
Results 1b (Step 1)
[Bar chart: Polar recall and polar precision for Word token, Word + Prior Polarity, and All Features.]
Step 2: Polarity Classification
Classes: positive, negative, both, neutral
[Diagram: 19,506 instances enter Step 1; the 5,671 instances classified polar proceed to Step 2.]
Step 2 features: word token, word prior polarity, negated, negated subject, modifies polarity, modified by polarity, conjunction polarity, general polarity shifter, negative polarity shifter, positive polarity shifter
Word token: terrifies
Word prior polarity: negative
Negated (binary). For example: not good; does not look very good; not only good but amazing.
Negated subject (binary): No politically prudent Israeli could support either of them.
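A minimal sketch of the binary negated feature: scan a short window of preceding tokens for negation words, skipping "not only". The word list, window size, and exception handling here are illustrative simplifications, not the paper's exact rules:

```python
NEGATION_WORDS = {"not", "no", "never", "n't"}  # illustrative list

def is_negated(tokens, i, window=4):
    """Sketch: does a negation word appear within `window` tokens before
    token i, excluding 'not' when it is part of 'not only'?"""
    for j in range(max(0, i - window), i):
        word = tokens[j].lower()
        if word in NEGATION_WORDS:
            if word == "not" and j + 1 < len(tokens) and tokens[j + 1].lower() == "only":
                continue  # 'not only' intensifies rather than negates
            return True
    return False
```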
Modifies polarity (5 values: positive, negative, neutral, both, not mod). Example: substantial: negative, because substantial modifies challenge, whose prior polarity is negative.
Modified by polarity (5 values: positive, negative, neutral, both, not mod). Example: challenge: positive, because challenge is modified by substantial, whose prior polarity is positive.
Phrase: substantial (pos) challenge (neg)
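Both features read prior polarities off dependency edges. A sketch over a toy (head, dependent) edge list and a small assumed prior-polarity table:

```python
PRIOR = {"substantial": "positive", "challenge": "negative"}  # toy priors

def modification_polarity(word, edges):
    """Return (modifies_polarity, modified_by_polarity) for word, given
    dependency edges as (head, dependent) pairs. 'notmod' means the word
    takes part in no such relation."""
    modifies, modified_by = "notmod", "notmod"
    for head, dep in edges:
        if dep == word:                  # word modifies its head
            modifies = PRIOR.get(head, "neutral")
        if head == word:                 # word is modified by its dependent
            modified_by = PRIOR.get(dep, "neutral")
    return modifies, modified_by
```

On the slide's example, substantial gets modifies polarity negative and challenge gets modified-by polarity positive.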
Conjunction polarity (5 values: positive, negative, neutral, both, not mod). Example: good: negative, because good is conjoined with evil, whose prior polarity is negative.
Phrase: good (pos) and evil (neg)
General polarity shifter: pose little threat; contains little truth
Negative polarity shifter: lack of understanding
Positive polarity shifter: abate the damage
Results 2a (Step 2: polarity classification)
[Bar chart: Accuracy, Positive F, Negative F, and Neutral F for Word token, Word + Prior Polarity, and All Features. With all features: Accuracy 65.7, Pos F 65.1, Neg F 77.2, Neutral F 46.2.]
Results 2b (Step 2)
[Bar chart: Positive recall, positive precision, negative recall, and negative precision for Word token, Word + Prior Polarity, and All Features.]
Ablation experiments removing features:
1. Negated, negated subject
2. Modifies polarity, modified by polarity
3. Conjunction polarity
4. General, negative, and positive polarity shifters
Previous Work
Learning the prior polarity of words and phrases, e.g., Hatzivassiloglou & McKeown (1997), Turney (2002)
Sentence-level sentiment analysis, e.g., Yu & Hatzivassiloglou (2003), Kim & Hovy (2004)
Phrase-level contextual polarity classification, e.g., Yi et al. (2003)
At HLT/EMNLP 2005
Popescu & Etzioni: Extracting Product Features and Opinions from Reviews
Choi, Cardie, Riloff & Patwardhan: Identifying Sources of Opinions with Conditional Random Fields and Extraction Patterns
Alm, Roth & Sproat: Emotions from Text: Machine Learning for Text-based Emotion Prediction
Conclusions
Presented a two-step approach to phrase-level sentiment analysis:
1. Determine whether an expression is neutral or polar
2. Determine the contextual polarity of the expressions that are polar
The approach automatically identifies the contextual polarity of a large subset of sentiment expressions.
Thank you
Acknowledgments
This work was supported by the Advanced Research and Development Activity (ARDA) and the National Science Foundation.