Transcript of "Learning and Using Plans by Reading Narratives" (web.media.mit.edu/~dustin/dsmith-generals.pdf)
Dustin Arthur Smith
General Exam Presentation, October 2, 2009
Learning and Using Plans by Reading Narratives
Friday, October 2, 2009
The Problem
Joan grabs a cup.
She fills it with water.
What is she doing now? What will she do next?
She pours the water on her plant.
in(kitchen,Joan) at(sink,Joan) on(sink) under(cup,sink) into(water,sink,cup) ...
at(plant,Joan) has(Joan,cup) watered(plant) [pour in plant]
in(kitchen,Joan) holding(Joan,cup) possesses(Joan,cup) cabinet(open) unwatered(plant)
The world has lots of data. We can infer a coherent representation from sparse data. Our knowledge is predictive.
Observed Text ➔ Hidden Situations
Joan grabs a cup.
She fills it with water.
She pours the water on her plant.
in(kitchen,Joan) at(sink,Joan) on(sink) under(cup,sink) into(water,sink,cup) ...
at(plant,Joan) has(Joan,cup) watered(plant)
in(kitchen,Joan) holding(Joan,cup) possesses(Joan,cup) cabinet(open) unwatered(plant)
This is the problem -- going from left to right. Why narratives? They're accessible to many people, close to everyday experiences -- plans -- people performing actions in pursuit of goals -- and they can describe context.
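The mapping from observed sentences to hidden situations can be sketched concretely. A minimal illustration, assuming situations are represented as sets of ground predicates like those on the slide (the predicate names are illustrative, not a committed representation): diffing two successive situations exposes what an unstated action must have changed.

```python
# Sketch (not from the talk): hidden situations as sets of ground
# predicates; diffing successive states recovers the effects an
# unobserved action must have had.

def diff_states(before, after):
    """Return (added, removed) predicates between two situations."""
    return after - before, before - after

s1 = {"in(kitchen,Joan)", "holding(Joan,cup)", "unwatered(plant)"}
s2 = {"in(kitchen,Joan)", "holding(Joan,cup)",
      "at(plant,Joan)", "watered(plant)"}

added, removed = diff_states(s1, s2)
print(sorted(added))    # predicates the hidden action made true
print(sorted(removed))  # predicates the hidden action made false
```

Frame predicates such as in(kitchen,Joan) fall out of the diff automatically, which is exactly the sparse-data point: only the changes need explaining.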
Observed Text ➔ Hidden Situations
Joan grabs a cup.
She fills it with water.
She pours the water on her plant.
We won't make assumptions about the semantic representation; assume it's some graph structure. It does need to represent changes over time, and it does need to change!
What kinds of inferences?
Joan grabs a cup.
- Who is involved in this action?
- What objects are involved in the scene?
- What will happen next?
- What did Joan use to grab the cup?
- Where did she grab the cup from?
- Where is Joan?
- Why did Joan grab the cup?
- What kind of cup?
- Does Joan own the cup?
- What color is the cup?
- How old is Joan?
- Is Joan male or female?
- Is Joan doing something imitable/good?
Representations are an engineering problem. It's more important to ask: what kinds of information do we need to answer? ARPA ran 7 Message Understanding Conferences (MUC) from 1987-1997 to work on information-extraction problems in natural language, each with a fixed set of questions. Now "named entity" and "event extraction" work on extracting answers to a fixed set of questions depending on the text.
Reading is not just representing meaning
Joan found the best restaurant in Boston.
Joan was nearly shot when walking home.
Joan organized her iTunes library. Oh.
Which?
Really? Where!?
Key point: Inferences in narrative understanding depend on the reader’s goals. (Specialist problem-solvers: Panalogy)
The questions we ask (and, in turn, the answers we get) depend on the text!
The surface text is just an envelope to transmit some message. But once the message has arrived in the reader's head, what does the reader do with it?
Reading and Planning
Language understanding has two plan recognition problems:
1. Inferring the plans of the author (illocutionary intent).
2. Assimilating communicated knowledge (locution) into plans.
Evaluating the system involves two planning problems:
3. Using the transmitted knowledge to solve problems.
4. Communicating the learned knowledge.
A reasoning process describing a reasoning process. Locution/illocution/perlocution: the illocutionary force f(p) hypothesis (Austin) underlying speech act theory.
Your friend tells you "I'm hungry"; you infer they want to eat with you (1). Your friend tells you that "the new Italian place in Tech Sq is closed" and you patch your plan knowledge (2).
Reading and Planning
Narrative understanding simplifies this a bit:
- Illocutionary goals are fixed (assume the author wants to transmit his/her plans honestly).
1. Assimilating communicated knowledge (locution) into plans.
Evaluating the system involves two planning problems:
2. Using the transmitted knowledge to solve problems.
3. Communicating the learned knowledge.
Vision
If we are to make computers that can learn by reading, they must be able to: (a) assimilate knowledge into plans and fill in missing knowledge, drawing from a plan library; (b) map English statements to parts of existing plans; (c) extend their lexicon and plan library with experience.
Goal: Software that reads narratives by matching them with its plans, improving its lexical-semantic mapping and plans by proposing questions and organizing its knowledge to answer them.
Nearer-term Vision: Intelligent UIs
A good UI allows the user to easily map their actions to the anticipated behavior of the interface.
- Workflows involving many UIs (computer software) and multipurpose tools (cellphones) have too many options for the novice.
- Solution: recognize the user's goals, automating the tool or abstracting away irrelevant interface functions.
Steps toward Narrative Understanding
I. Acquire a corpus of commonsense plans and their lexical mappings. Use parallel scripts from OMICS to induce plans, lexical taxonomies, and lexical-semantic mapping
II. Use this to parse other narratives, predicting missing/future steps with narrative cloze (planning) and command execution
The narrative cloze is a “leave one out” form of testing, where you read a narrative with steps “s1 s2 s3 s4”, then you leave out step s2, and see if the system can infer the missing step.
We’ll start with step one: building a core plan knowledge base.
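The narrative cloze described above can be sketched as code. This is a minimal illustration under an assumed plan model -- a simple bigram count over adjacent steps -- not the actual system; any predictive model could be plugged into the scoring function.

```python
# Sketch of the narrative cloze ("leave one out") test: hold out one
# step of a narrative and see whether the model can recover it from
# its neighbors. The bigram-count model here is a placeholder.
from collections import Counter

def train_bigrams(narratives):
    counts = Counter()
    for steps in narratives:
        counts.update(zip(steps, steps[1:]))
    return counts

def cloze_predict(counts, steps, held_out_index, vocabulary):
    """Score each candidate by how well it bridges its two neighbors."""
    prev_step = steps[held_out_index - 1]
    next_step = steps[held_out_index + 1]
    def score(candidate):
        return counts[(prev_step, candidate)] + counts[(candidate, next_step)]
    return max(vocabulary, key=score)

narratives = [
    ["get cup", "fill cup", "pour on plant"],
    ["get cup", "fill cup", "drink water"],
]
counts = train_bigrams(narratives)
vocab = {step for n in narratives for step in n}
# Hold out the middle step of the first narrative and try to recover it.
print(cloze_predict(counts, ["get cup", "fill cup", "pour on plant"], 1, vocab))
```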
The source of procedural background knowledge
-- make_tea:40 --
locate kettle
pour water in kettle
turn on kettle to boil water
pour boiled water in mug
add tea bag for as long as desired

-- take_care_of_plants:74 --
water plants
place plants in sun

-- serve_a_drink:57 --
get a glass
go to fridge
pour water into glass
serve drink

-- water_indoor_plants:43 --
get a container
find a tap
fill water in container
find a plant
put some water in the pot

-- fill_water_in_container:8 --
find water container
place water container under water faucet
turn on faucet
fill container with water
turn off faucet
343 narratives are relevant to the actions in the previous script, from 5 different types of stories; 9,100 narratives come from the OMICS effort to collect scripts about indoor commonsense activities.
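The raw scripts above (a goal name, a contributor count, and a run of steps) might be held in a simple structure before any semantic processing. A sketch, assuming only the "-- goal:count --" header convention shown on the slide; the field names are illustrative:

```python
# Illustrative container for OMICS-style scripts: a goal name, a count,
# and an ordered list of step strings.
from dataclasses import dataclass

@dataclass
class Script:
    goal: str
    count: int
    steps: list

def parse_script(header, step_lines):
    # "-- make_tea:40 --" -> goal "make_tea", count 40
    name, _, count = header.strip("- ").partition(":")
    return Script(goal=name, count=int(count), steps=list(step_lines))

s = parse_script("-- make_tea:40 --",
                 ["locate kettle", "pour water in kettle",
                  "turn on kettle to boil water",
                  "pour boiled water in mug",
                  "add tea bag for as long as desired"])
print(s.goal, s.count, len(s.steps))
```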
What goals do we give our reader?
- OMICS has a parallel corpus of stories about ways to accomplish 174 different goals.
- OMICS database currently has 1,172,971 items from 3K users.
access the internet, act as a security guard, answer the doorbell, answer the phone, apply band aid, assist person standing up, assist someone in walking, boil the milk, buy from vending machine, call 911, calm an infant, change a baby diaper, change a bulb, change bed sheets, charge a cell phone, check for intruders, check for weather, check if a store is open, chop vegetables, clean a spill, clean the dishes, clean the floor, clean the shower, clean the table, clean up, clean up toys, close the blinds, close the curtains, cook fish, cook noodle, cook pasta, cook rice, dance with the children, do laundry, draw the curtains,
dry clothes, dust an object, empty the kitchen sink, empty the trash, entertain children, erase the whiteboard, feed a child, feed a pet cat, feed a pet dog, feed infant, feed the fish, fetch a cold drink, fetch a ladder, fetch an object, fill water in container, find a person, find an object, find out more information, find the time, fold clothes, follow someone around, gather all scattered toys, get food from refrigerator, get mail, get the newspaper, give a medicine, give a message, give a message on phone, go outside, greet a visitor, guard the house, handle toxic materials, hang clothes, heat food in microwave, heat food on kitchen gas,
help someone carry thing, iron clothes, keep the dog away, kick a ball, load the dishwasher, lock up the house, lock windows, mail a letter, make a bed, make a dinner reservation, make a flight reservation, make a list, make a presentation, make a shopping list, make a tossed salad, make baby sleep, make breakfast, make coffee, make fresh orange juice, make hot dog, make soup, make sure children fed, make tea, make toasted bread, making omelette, move furniture, mow the lawn, open a web page, open package, open the garage, open the mail, pack a mailing box, pack a suitcase, paint a wall, pay bills,
perform research on spec, photocopy a paper, pick up dishes, place ladder near wall, play a game on the comp, play a movie, play a song, play piano, plug an electric appliance in, plug battery into charger, pour beer into a glass, print document, push someone in a wheel, push something, put away groceries, put object away, put up a painting, raise the blinds, read a story to a child, recharge batteries, remove and replace garbage, replace a refrigerator filter, replace a water tap filter, replace batteries in the t, replace heater filter, retrieve a tool, secure all exits, secure all windows, secure the perimeter of, send a fax, send party invitations, serve a drink, serve a meal, set a wake up alarm, set the dining table
Background survey
Goal: from natural language (NL) to a plan representation, filling in missing knowledge. Three perspectives:
1. LINGUISTICS: lexical semantics of events
   - how events are described in English
   - the importance of the verb
2. AI: planning and plan recognition
   - how can we represent events/actions and use them to describe intended behavior?
   - how can we infer plans from observation?
   - how can we learn plans from observation?
3. AI: narrative understanding (machine reading, story understanding)
Linguistics: a Trilogy of Meaning
1. Lexical semantics: the meanings of the individual words/lexical units.
2. Formal or sentential semantics: how those meanings are put together to form sentences.
3. Discourse semantics: how those meanings combine with others to create text or discourse.
We know from trilogies, even if they start badly, they’re only getting worse.
- Before we can put the pieces (atomic representations in the lexicon) together, we need to know what those pieces are.
Verb Semantics: Event Structures
- Verbs are a natural starting point, as they organize constraints on components of sentences.
- The lexical-semantic properties of verbs name the event or state, specify the participants, and organize the sentence [Levin & Rappaport-Hovav 2005].
Verbs classify and index states and events:
pour
In fact, the verb's lexical semantics seems to carry some of the constraints on composition. Murder = the subject acted intentionally. Transmit = has very specific kinds of restrictions.
- Some events have a series of events built into their lexical semantics (waltz, shop, ...).
- Event conflation: become sick / the dog "barked" the neighbor awake (two events in one); only causally related events.
Which parser? CCG, HPSG, LFG, RRG, Link Grammar?
“the theory or formalism behind a lot of acronyms ending in G is less relevant than the selection among a variety of computational techniques. Following is a nonexhaustive sample:
- Top-down with backtracking or bottom-up, chart-style.
- Form of output, such as parse tree, dependency graph, feature structure, or some combination or variation.
- A basic context-free backbone (whether it is called context-free or whether it is written in a rule-like style is irrelevant) augmented with tests of various kinds, which may be called syntactic, semantic, pragmatic, or whatever.
- Tradeoffs between the number of grammar rules (CF or otherwise) and the number of patterns or constructs associated with the words -- i.e., is it a complex grammar with many pages of rules or a simple grammar with most structural information in the lexicon.
- Methods for keeping track of ambiguities and reducing or managing them by grouping, testing, marking, etc.
- Use of background knowledge, corpora, statistics, etc., and at what stage during the parse.
- And most importantly, is the parser written by a superprogrammer or by a newbie who just learned language X? And did the author spend many long hours working and reworking it on lots of texts?”
- John Sowa 2003 http://suo.ieee.org/email/msg11962.html
Conclusion: use whatever tools are needed to find the syntactic categories of a sentence, with as little commitment to the theory as possible.
The first problem is going from sentences to word classes.
Hit and Break Verbs: Case Study
Verbs seem similar:
 a. The boy broke the window with a ball.
 b. The boy hit the window with a ball.
But not in some argument realizations. Causative alternation:
 a. The boy broke the window. / The window broke.
 b. The boy hit the window. / *The window hit.
Availability of body-part possessor ascension:
 a. I broke his leg. / *I broke him on the leg.
 b. I hit his leg. / I hit him on the leg.
With/against alternation:
 a. Perry broke the fence with the stick. ≠ Perry broke the stick against the fence.
 b. Perry hit the fence with the stick. = Perry hit the stick against the fence.
Conclusion: represent the larger class of verbs.
 - Break verbs: bend, fold, shatter, crack (verbs of change of state)
 - Hit verbs: slap, strike, bump, stroke (verbs of surface contact)
Fillmore, C. "The Grammar of Hitting and Breaking." 1970.
Observation: most verbs can be substituted in a sentence.
Break/hit are both transitive and take instrument "with"-phrases. Break verbs: change of state. Hit verbs: contact, often forceful, but no change of state.
Verbs can be cross-classified: bake acts like both a change-of-state and a create-object verb.
Semantic Roles
- Fillmore proposed flat 'labels' for roles, akin to defining function types. But what happens in the function?
  break(agent a, instrument i, object o)
  hit(agent a, instrument i, place p)
- Argument generalization should occur on a verb class-by-class basis. No thematic hierarchies [L&RH05]
- Levin’s verb classes: 100 syntactic tests, diathesis (argument) alternations, e.g.
Causative alternation:
 a. The boy broke the window. / The window broke.
 b. The boy hit the window. / *The window hit.
Many different problems: 1) there is no general test for them; 2) if they're flat, there's no way to organize them hierarchically; 3) some seem very specific (e.g., contemplate(x, y=?)), others very general; 4) what happens when overloading roles, e.g., sold(agent=source, patient, destination)?
- Argument realization is a function of many influences: causality of events, aspect, notions of sentience/animacy/volition. Causal notions largely determine subject and object.
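The idea that argument generalization should happen class by class can be illustrated with flat role "signatures" looked up per verb class. The class memberships and role lists below are illustrative only, not taken from Levin's classes or VerbNet:

```python
# Sketch of Fillmore-style flat role signatures, generalized per verb
# class rather than per verb or via a global thematic hierarchy.
# Class membership and role lists here are illustrative assumptions.
ROLE_SIGNATURES = {
    "break_verbs": ["agent", "instrument", "object"],  # change of state
    "hit_verbs":   ["agent", "instrument", "place"],   # surface contact
}
VERB_CLASS = {"break": "break_verbs", "shatter": "break_verbs",
              "hit": "hit_verbs", "slap": "hit_verbs"}

def roles_for(verb):
    """Look up a verb's argument signature via its class."""
    return ROLE_SIGNATURES[VERB_CLASS[verb]]

print(roles_for("shatter"))  # inherits the break-class signature
```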
VerbNet v3
- After Levin's verb classes, [Kipper 2006] is the largest verb lexicon available for English: 274 classes, 23 thematic roles, 94 semantic predicates, 5,257 verb senses, 3,769 verb lemmas.
Each verb is annotated with thematic roles (arguments), selectional restrictions, and syntactic frames for the sentence constructions in which the verb and roles might appear.
Predicates
Adv, Pred, Prep, about, abstain, adjust, admit, adopt, agree, alive, allow, amount_changed, apart, appear, apply_heat, apply_material, approve, assess, attached, attempt

By Class Number:
9.1: put-9.1
9.2: put_spatial-9.2
9.3: funnel-9.3
9.4: put_direction-9.4
9.5: pour-9.5
9.6: coil-9.6
9.7: spray-9.7
9.8: fill-9.8
9.9: butter-9.9
9.10: pocket-9.10
10.1: remove-10.1
10.2: banish-10.2
10.3: clear-10.3
10.4.1: wipe_manner-10.4.1
10.4.2: wipe_instr-10.4.2
10.5: steal-10.5
10.6: cheat-10.6
10.7: pit-10.7
10.8: debone-10.8
10.9: mine-10.9
10.10: fire-10.10
10.11: resign-10.11
Put verbs into verb classes
- Most verbs aren’t interchangeable
-- water_indoor_plants: Step ~2 --
add water in a jug
fill a vessel with water
fill can with water
fill water container
fill water in container
fill water jug
fill watering can
fill container with tap
get water
pour water from kitchen tap into container

-- water_indoor_plants: Step 3 --
empty water onto plants
pour onto plant
pour water over the indoor plants
pour water into plant's pot
pour water into soil until moistened
pour water on plants
apply water to plants
water the plant
water indoor plants
Pour-9.5
Preparing-26.3-2
Substance_Emission-43.4
Weather-57
VerbNet class / WordNet token
AGENT V THEME {+PATH & -DEST_DIR} LOCATION
AGENT V THEME LOCATION <+ADV_LOC>
THEME V {+PATH & -DEST_DIR} LOCATION
AGENT V THEME {+SRC} SOURCE {+DEST_CONF} LOCATION
THEME V {+SRC} {+DEST_CONF} LOCATION
pour water on plants
VerbNet Subcategorization Frame:
AGENT = [+animate]
V = pour
THEME = water [+substance +concrete +plural]
PATH = on
LOCATION = plants [+location -region]
SOURCE = [+location -region]
dribble, drip, spew, trickle, pour, slop
barbeque, grill, broil, poach, pour, roast, fry, microwave, toast,
belch, foam, gush, bleed, jet, leak, ooze, drool, spout, squirt
dribble, drip, spew, trickle, slop
Different meanings: "water" could mean "to secrete", as in "their mouths watered at the sight of the cat cooking". Verbs in VerbNet are clustered by their syntactic patterns.
Synthesized View of Verb Meaning
Two key components [Levin 2009]:
Event Schemas [Levin & RH 1999]:
a. Simple event schema (single sub-event): [x ACT<manner>], e.g., jog, run, creak, whistle, ...
b. Complex event schema (sub-event causes result sub-event): [[x ACT] CAUSE [BECOME [y <result-state>]]], e.g., break, dry, harden, melt, open, ...
Root: a sound/meaning pairing representing the verb's core meaning:
a. result state (dry)
b. thing (saddle)
c. stuff (butter)
d. container/location (bottle)
e. manner (wipe)
- Manner and result roots are completely disjoint classes: a verb lexicalizes only one meaning component [L&RH 95].
- There is reuse in the action representation; a lot of it is parameterized against different kinds of scalar values (melt -- state, heat -- temperature).
- The root ontology determines the event type. The event, not the root, affects grammatical realization.
- Manner verbs are always simple: "We smeared mud on the wall" => *"Mud smeared on the wall." "We splashed mud on the wall." => "Mud splashed on the wall."
Lexical-Semantic Parsing
Semantic role labeling (SRL) is similar to filling in a function header with predicates and arguments.
Find and examine the best option for the OMICS corpus:
1. SRL tools to parse into VerbNet/FrameNet/Propbank [Toutanova et al 2005]
2. Unsupervised learning of a verb lexicon [Grenager, Manning 2006]
motion(during(E), container)
not(location(start(E), container, aloe))
location(end(E), container, aloe)
cause(Joan, E)
Conclusion: Let’s hook up predicate arguments with a plan representation.
This is where the linguistics gets really confusing, as it moves into planning territory. We already have a language for talking about goals and actions: plans!
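Hooking predicate arguments up to a plan representation might look like the sketch below, which turns SRL-style event predicates (like the motion/location/cause facts above) into a planning-style operator with preconditions and effects. The translation rules are assumptions for illustration, not any system's actual mapping:

```python
# Illustrative bridge from SRL-style event predicates to a
# planning-style operator. Rules are assumptions: facts about the start
# of the event become preconditions, facts about its end become effects.
def srl_to_operator(facts):
    op = {"pre": [], "add": []}
    for f in facts:
        if f.startswith("not("):
            continue  # negated facts: ignored in this tiny sketch
        if f.startswith("location(start"):
            op["pre"].append(f)
        elif f.startswith("location(end"):
            op["add"].append(f)
    return op

facts = ["motion(during(E), container)",
         "not(location(start(E), container, aloe))",
         "location(end(E), container, aloe)",
         "cause(Joan, E)"]
print(srl_to_operator(facts))
```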
Background survey
Goal: from natural language (NL) to a plan representation, filling in missing knowledge. Three perspectives:
1. LINGUISTICS: lexical semantics of events
   - how events are described in English
   - the importance of the verb
2. AI: planning and plan recognition
   - how can we represent events/actions and use them to describe intended behavior?
   - how can we infer plans from observation?
   - how can we learn plans from observation?
3. AI: narrative understanding (machine reading, story understanding)
The Zoo of Action Representations
An agent's problem-solving space is defined by the language in which it can represent the world. This language will bias its learning system.
Some assumptions [from Pasula, Zettlemoyer, Kaebling 2004]
- Frame assumption: after action, anything not changed by the action stays the same
- Object abstraction assumption: effects of action depends on the object’s attributes rather than their identities
- Action outcome assumption: each action only affects the world in a small number of distinct ways
Desiderata of a Plan Representation
- Hierarchical
  - Straightforward mapping to lexical-semantics.
  - Top-down planning eliminates a large number of bad plans early on (specialization operators grow the plan space).
- Frame and Action Assumptions
  - anything not changed by an action stays the same
  - each action only affects the world in a few ways
- Relational - Can describe classes of objects (typed variables) and Boolean relations between objects.
- Amenable to learning - Infrastructure for adding new actions, and comparing two alternative actions
- Invertible (Plan Recognition) - Can go from actions to hypotheses of several actions or goals. - “It’s not enough to have [plans], most stories involve combining multiple [plans] to understand a coherent story.” - Peter E. Clark
A couple I like: Relational Reinforcement Learning models (van Otterlo), Angelic Planning (Russell)
Several Kinds of Learning
- We can categorize human learning along many dimensions:
- Source of instruction:
- Agent itself: realizing it can’t remember Athena combination, gives up
- Environment: eating food from a truck and becoming ill.
- External agent: parent praises for cleaning room
- Society: imagined anonymous ridicule for being naked in public.
- Poltergeist: deity elicits shame for thinking a kind of thought
- Kinds of lessons:
- Association
- Learning the value of a state
- Learning that a structure has a new slot
- Creating a structural association/analogy
- Censoring a representation
- Creating a new goal/Abandoning a goal
- Type of reward/punishment
- Bodily, feeling satiated, pain, pleasurable
- Goal achievement
- Recognizing a new skill...
Several Kinds of Learning
- New step comes in:
1. Recognized plan and correct inference
2. Recognized plan and incorrect inference
3. Misrecognized plan (refine rule antecedents or construct new rule)
- Assume actions are decomposed into (if,do,then) tuples.
(IF, DO, THEN):
IF: person(A), plant(P), (dry(P); dying(P))
DO: water(A,P)
THEN: wet(P), healthy(P)
-- serve_a_drink:57 --
get a glass
go to fridge
pour water into glass
serve drink
Rule learning tool: probabilistic ILP. Background knowledge provides negative examples; induce the maximally general rule.
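The (IF, DO, THEN) decomposition above can be sketched directly. This is an illustrative encoding, not the ILP learner itself: predicates are plain strings in a state set, and the retracted literal in apply() is hardcoded for the example.

```python
from typing import FrozenSet, NamedTuple

class Rule(NamedTuple):
    """An action rule decomposed into (IF, DO, THEN)."""
    IF: FrozenSet[str]    # preconditions that must hold
    DO: str               # the action itself
    THEN: FrozenSet[str]  # effects asserted afterwards

water_rule = Rule(
    IF=frozenset({"person(A)", "plant(P)", "dry(P)"}),
    DO="water(A,P)",
    THEN=frozenset({"wet(P)", "healthy(P)"}),
)

def applicable(rule: Rule, state: FrozenSet[str]) -> bool:
    # A rule fires when all of its IF predicates hold in the state.
    return rule.IF <= state

def apply_rule(rule: Rule, state: FrozenSet[str]) -> FrozenSet[str]:
    # Frame assumption: everything not mentioned stays the same.
    # The literal retracted here is hardcoded for this illustration.
    return (state - {"dry(P)"}) | rule.THEN

s0 = frozenset({"person(A)", "plant(P)", "dry(P)"})
assert applicable(water_rule, s0)
s1 = apply_rule(water_rule, s0)
assert "healthy(P)" in s1 and "dry(P)" not in s1
```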
Step 2 example: the inference is not general enough, e.g. pour(water, mouth) != pour(water, plant)
Reading and Planning
Narrative understanding simplifies this a bit:
- Illocutionary goals are fixed (assume the author wants to transmit his/her plans honestly)
1. Assimilating communicated knowledge (locution) into plans.
Evaluating the system contains two planning problems:
2. Using the transmitted knowledge to solve problems.
3. Communicating the learned knowledge.
Is what we're learning true?
Another way to ask the question is, how do we know what we’re learning is correct.
Evaluating Learned Knowledge
1. Answering Questions: Given text, questions, and answers, learn a procedure to answer questions [Poon & Domingos 2009]
2. Executing Instructions: Given commands and a set of actions, learn the mapping by validating it in an environment [Branavan, Chen, Zettlemoyer, Barzilay 2009]
3. Prediction (narrative cloze): Predict and parse the next sentence, if you’re wrong, fix it! [Chambers & Jurafsky 2008, 2009]
4. Natural Language Generation: Communicate with the users. Chat bots
These could all be formulated as planning problems, and this is the part of the cycle where they test their plans. (2) works with a simulated environment; these indoor instructions are relevant to a domestic robot, so we have been working on an indoor simulator. (3) is planning where the feedback is only as good as the semantic parsing and the ability to detect discrepancies with it.
Steps: What's happened so far?
I. Acquire a corpus of commonsense plans and their lexical mappings. Use parallel scripts from OMICS to induce plans, lexical taxonomies, and lexical-semantic mappings.
1. Parse English sentences: clean, coreference resolution, parse, extract predicate-argument structures.
2. Find global alignment of stories: Read in stories one at a time, align sequences, use structure to either:
a) detect missing nodes / context (alignment)
b) detect disjunctions (abstraction / is-a)
c) detect nested sequences (composition / part-of)
3. Infer corresponding state descriptions: Construct corresponding situation models for each step.
II. Use this to parse other narratives, predicting missing/future steps with narrative cloze and command execution
Four example parsed narratives (of 36)
get the mail
Problems with the corpus
A1 go to mailbox.
A2 open mailbox.
A3 retrieve mail.
A4 close mailbox.

B1 walk to the mailbox.
B2 open the mailbox.
B3 take mail out of box.
B4 close mailbox.
B5 walk back home.

C1 open mailbox.
C2 put hand in mailbox.
C3 pull mail from mailbox.
C4 close mailbox.

D1 open outlook.
D2 open inbox.
D3 click get mail.
D4 open mail.
P1: Many ways to say the same things!
P2: Temporal abstraction (nesting events)
P3: Global Alignment (context)
P4: Causal Discontinuity
1. Parse English sentences: clean, coreference resolution, parse, extract predicate-argument structures.
2. Find global alignment of stories: Read in stories one at a time, align sequences, use structure to either:
a) detect missing nodes / context (alignment)
b) detect disjunctions (abstraction / is-a)
c) detect nested sequences (composition / part-of)
3. Infer corresponding state descriptions: Construct corresponding situation models for each step.
Steps for learning a core plan library
n=4, pre-
n=4, post-
JC similarity vs. text similarity
n=5 pre
Beyond string matching
- Match(Ai, Bj) ∈ {0, 1}: are these pairs matches (1) or not (0)?
get(mail) get(letter)
open(box) open(mailbox)
close(door) shut(door)
return(home) go(inside)
- Unrealized complements and adjuncts:
- Get the mail [from the mailbox]
- Open the door [with your hand]
- Joe sold his TV [to someone] [for a price].
Beyond string matching
[1] Jiang, J and Conrath, D: Semantic Similarity Based on Corpus Statistics and Lexical Taxonomy. 1997
- Taxonomy-based similarity. Problem: in WordNet 3,
d(plant, escape) = d(plant, houseplant) = 1
- Solution: weight by information content (IC), the Jiang & Conrath generalization metric [1]
IC(c) = -log( count(c) / count(·) )
Climbing up a tree is not enough: one hop in WordNet does not mean the same thing.
P(c), probability of c appearing in a large corpus.
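The IC weighting and the Jiang & Conrath distance can be sketched on a toy taxonomy. The counts, concepts, and parent links below are invented for illustration; a real system would use WordNet plus corpus frequencies.

```python
import math

# Toy counts from a hypothetical corpus; 'total' plays the role of count(·).
counts = {"entity": 1000, "plant": 40, "houseplant": 5, "escape": 2}
total = 1000

def ic(c):
    # IC(c) = -log(count(c) / count(·)): rarer concepts are more informative.
    return -math.log(counts[c] / total)

# Hypothetical is-a links, used to find the least common subsumer (lcs).
parent = {"houseplant": "plant", "plant": "entity", "escape": "entity", "entity": None}

def ancestors(c):
    out = []
    while c is not None:
        out.append(c)
        c = parent[c]
    return out

def lcs(c1, c2):
    a2 = set(ancestors(c2))
    return next(a for a in ancestors(c1) if a in a2)

def jc_distance(c1, c2):
    # Jiang & Conrath: d = IC(c1) + IC(c2) - 2 * IC(lcs(c1, c2))
    return ic(c1) + ic(c2) - 2 * ic(lcs(c1, c2))

# Both pairs are one taxonomy hop apart, but JC tells them apart:
assert jc_distance("plant", "houseplant") < jc_distance("plant", "escape")
```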
A more general description
L. De Raedt and J. Ramon. Deriving distance metrics from generality relations. 2008.
Inducing Taxonomies (from OpenMind)
- Given a set of objects O, predicates P, and whether or not they appear together in a corpus, induce a tree, ordering predicates by generality.
O × P → {0, 1}
L. Schmidt, Kemp C., Tenenbaum J.B. Nonsense and sensibility: Inferring unseen possibilities. 2006.
Python source: http://media.mit.edu/~dustin/ti.tgz
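One simple way to order predicates by generality from such a boolean matrix is extension inclusion: a predicate is more general if its set of objects contains another's. This is only an inclusion-based sketch (the data is made up), not the Bayesian model of Schmidt, Kemp & Tenenbaum.

```python
# predicate -> set of objects it applies to (a boolean O × P matrix, by rows)
matrix = {
    "exists":   {"mug", "tree", "bathtub", "idea"},
    "physical": {"mug", "tree", "bathtub"},
    "fillable": {"mug", "bathtub"},
}

def more_general(p, q):
    # p is at least as general as q if q's extension is contained in p's.
    return matrix[q] <= matrix[p]

def parents(p):
    # Candidate taxonomy parents of p: strictly more general predicates.
    return [q for q in matrix
            if q != p and more_general(q, p) and not more_general(p, q)]

assert "physical" in parents("fillable")
assert "exists" in parents("physical")
assert parents("exists") == []   # the root has no parent
```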
How can we get the taxonomies?
Taxonomy Induction with multiple inheritance?
[Diagram: a class taxonomy with multiple inheritance: object at the root with create()/destroy(); container with enter(), fill(), close(), ...; mug, tree, and bathtub as leaves]
The idea of having problem solvers spread over different realms
Moving on to another corpus
- 43things.com provides a corpus of 2.5 million goals, and 22,528 plans (as of 2008).
get a tattoo 183
stop smoking 159
fall in love 132
get a fun job 105
drink more water 100
Give blood 99
kiss in the rain 76
be vegetarian 69
get my license 62
stop biting my nails 61
be happy with myself 59
read more books 55
lose some weight 55
get married and stay married 52
become a mermaid 48
skydive 47
Buy a House 45
make new friends 45
Contributions
- I surveyed event understanding from two perspectives: linguistic event semantics and planning.
- I made cases for:
- using NL narratives as a test-bed for learning and evaluating plans and lexical knowledge
- using commonsense and lexical-semantic resources for providing default generalization structures.
- using verbs to index and classify action/event structures
- using goals as an internal knowledge organization and learning metric
Additional Slides
Symbols + Numbers = ☯
The "connectionist" vs. "symbolic" false dichotomy has got to end!
Linear/total ordering on set {-5.2, 0.1, 3.2, 8.983}
- antisymmetry: If a ≤ b and b ≤ a then a = b
- transitivity: If a ≤ b and b ≤ c then a ≤ c
- totality: a ≤ b or b ≤ a
-5.2 0.1 3.2 8.983
<(-5.2, 0.1)  <(0.1, 3.2)  <(3.2, 8.983)
Symbols + Numbers = ☯
dog panda subway caterpillar
<(dog, panda)  <(panda, subway)  <(subway, caterpillar)
Numbers are just symbols with ordering relations.
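That claim can be made concrete: given only pairwise <(a, b) facts over arbitrary symbols, the total order can be recovered by chaining the relations. A minimal sketch, using the chain from the slide:

```python
# Pairwise ordering facts <(a, b), as on the slide.
pairs = [("dog", "panda"), ("panda", "subway"), ("subway", "caterpillar")]

def total_order(pairs):
    # In a chain, the minimum is the symbol that never appears on the
    # right-hand side of any <(a, b) fact; then follow successors.
    succ = dict(pairs)
    items = {x for p in pairs for x in p}
    current = (items - set(succ.values())).pop()
    order = []
    while current is not None:
        order.append(current)
        current = succ.get(current)
    return order

assert total_order(pairs) == ["dog", "panda", "subway", "caterpillar"]
```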
Symbols + Numbers = ☯
dog panda subway caterpillar
Symbols + Numbers = ☯
dog panda subway caterpillar
mammal
animal
Symbols + Numbers = ☯
Or other kinds of graphical representations.
Object Abstraction
Massachusetts Institute of Technology
Universities
Machakos Institute of Technology
Most machine learning relates objects/instances to their classes. What happens when you start talking about relations between classes? Portals between nodes. Abstraction barriers are important not to cross: a person is part of society, and a toe is part of a person, but that does not make a toe part of society.
Many Ways To Learn
What's the difference between "curious", "ignorant", and "confused"? All refer to states of unknowing: learning is multifaceted.
Learning by adding new If-Do-Then rules, or by changing low-level connections.
Learning by making new subgoals for goals, or finding better search techniques.
Learning by changing or adding descriptions, statements, and narrative stories.
Learning by changing existing processes.
Learning to prevent mistakes by making Suppressors and Censors.
Learning to make better generalizations.
Learning new Selectors and Critics that provide us with new Ways to Think.
Learning new links among our fragments of knowledge.
Learning to make new kinds of analogies.
Statistical Relational Learning
- Combining expressiveness of relational logic (input and hypothesis language is not just propositional or attribute-value/matrices)
- Probabilistic Inductive Logic Programming [De Raedt, Kersting]
- Markov Logic [Domingos]
- Programming languages: PRISM, Alchemy
- Important point: Just having a number is a dead end: often you want to be able to backtrack for credit assignment [Minsky]
http://www.cs.kuleuven.be/~lucdr/ijcai09w.pdf
Model section
A brief tour of the Plan KR Zoo
STRIPS
NL: John went to the sink
STRIPS (Fikes & Nilsson 1971): each event is defined by its preconditions and effects.
Action: go_sink(A)
  Pre: -at(sink, A), canMove(A)
  Post: at(sink, A)
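The go_sink action above can be sketched as a minimal STRIPS encoding. The set-of-literals state representation and the split of preconditions into positive/negative sets are our illustration, not part of the original formalism's syntax.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Action:
    name: str
    pre_pos: frozenset   # literals that must hold
    pre_neg: frozenset   # literals that must NOT hold
    add: frozenset       # literals made true by the action
    delete: frozenset    # literals made false by the action

go_sink = Action(
    name="go_sink(john)",
    pre_pos=frozenset({"canMove(john)"}),
    pre_neg=frozenset({"at(sink,john)"}),
    add=frozenset({"at(sink,john)"}),
    delete=frozenset(),
)

def apply_action(a, state):
    # STRIPS semantics: check preconditions, then delete and add effects;
    # everything else persists (the frame assumption).
    assert a.pre_pos <= state and not (a.pre_neg & state)
    return (state - a.delete) | a.add

s1 = apply_action(go_sink, frozenset({"canMove(john)"}))
assert "at(sink,john)" in s1
```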
Markov Decision Processes
MDP:
- set of states, S
- set of actions, A
- real-valued reward function, R(s, a)
- transition matrix, T: S × A → P(S)
- goal: learn the optimal policy π: S → A
{john_infrontof_sink_sink_off_plants_unwatered,
 john_infrontof_sink_sink_on_plants_watered,
 john_infrontof_plants_sink_on_plants_unwatered,
 john_infrontof_plants_sink_off_plants_watered, ...}
{get_cup, go_to_sink, go_to_plants, pour_water,..}
R(john_infrontof_plants_sink_on_plants_unwatered,pour_water) = .8
These propositional representations of world states are terrible for scaling. Some tricks:
- factoring the state space
- creating partitions for ...
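A tiny value-iteration sketch over a two-state version of the watering MDP; the states, rewards, discount factor, and deterministic transitions here are all invented for illustration.

```python
S = ["plants_unwatered", "plants_watered"]
A = ["pour_water", "wait"]
# T[s][a] = next state (deterministic for brevity); R[s][a] = reward.
T = {"plants_unwatered": {"pour_water": "plants_watered", "wait": "plants_unwatered"},
     "plants_watered":   {"pour_water": "plants_watered", "wait": "plants_watered"}}
R = {"plants_unwatered": {"pour_water": 0.8, "wait": 0.0},
     "plants_watered":   {"pour_water": 0.0, "wait": 0.1}}
gamma = 0.9

# Value iteration: repeated Bellman backups until (approximate) convergence.
V = {s: 0.0 for s in S}
for _ in range(100):
    V = {s: max(R[s][a] + gamma * V[T[s][a]] for a in A) for s in S}

# Greedy policy with respect to the converged values.
policy = {s: max(A, key=lambda a: R[s][a] + gamma * V[T[s][a]]) for s in S}
assert policy["plants_unwatered"] == "pour_water"
```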
Coreference Resolution
Joan Rivers owned a rosemary plant.
She kept the plant in the kitchen.
Mrs. Rivers watered the plant daily.
c1 = {‘Joan Rivers’, ‘She’, ‘Mrs. Rivers’}
c2 = {‘rosemary plant’, ‘the plant’}
http://www.stanford.edu/class/cs224u/224u.07.lec5.6up.pdf
Find each coreference chain: set of referring expressions.
There are four kinds of coreferences.
Four kinds of coreference: indefinite noun phrases, which are new to the hearer (e.g. begin with 'a').
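The chains c1 and c2 from the slide can be represented as sets of referring expressions; a real resolver would build these automatically, here they are hand-built for illustration.

```python
# Each coreference chain is a set of referring expressions (from the slide).
chains = [
    {"Joan Rivers", "She", "Mrs. Rivers"},   # c1: the person
    {"rosemary plant", "the plant"},         # c2: the plant
]

def chain_of(mention):
    # Look up which coreference chain a referring expression belongs to.
    for i, chain in enumerate(chains, start=1):
        if mention in chain:
            return f"c{i}"
    return None   # not a mention of any tracked entity

assert chain_of("She") == "c1"
assert chain_of("the plant") == "c2"
assert chain_of("kitchen") is None
```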
Definite phrases: already known to the hearer because they were a) previously mentioned, b) are common knowledge, or c) unique by definition ("the biggest turtle in the world").
Pronouns (he, she, it..)
And names. Mrs. Rivers = Joan = Joan Rivers = The daughter of Rivers Phoenix.
Relational Sequence Alignment
[1] Much related work by Kristian Kersting and Luc De Raedt.
- Relational (in first order logic), instead of propositional, descriptions of states.
Predicates/arity: vi/2, cd/1, ls/0, pdfview/2
Ground atoms -- predicates with non-variable terms:
vi(ch2,tex)
Ground clauses:
Generalized Clauses (with variables):
cd(X), vi(Y, tex), latex(Y, tex)
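Generalized clauses arise from aligning ground atoms and turning differing terms into variables, in the spirit of Plotkin's least general generalization (lgg). A minimal sketch over (predicate, args) tuples; the variable-naming scheme is ours.

```python
def lgg(atom1, atom2, table=None):
    """Anti-unify two atoms (pred, *args); differing terms become variables."""
    pred1, *args1 = atom1
    pred2, *args2 = atom2
    if pred1 != pred2 or len(args1) != len(args2):
        return None                      # different predicates don't align
    table = table if table is not None else {}
    out = [pred1]
    for a, b in zip(args1, args2):
        if a == b:
            out.append(a)                # shared term survives unchanged
        else:
            # The same pair of differing terms always maps to one variable.
            var = table.setdefault((a, b), f"X{len(table)}")
            out.append(var)
    return tuple(out)

# vi(ch2, tex) and vi(ch3, tex) generalize to vi(X0, tex):
assert lgg(("vi", "ch2", "tex"), ("vi", "ch3", "tex")) == ("vi", "X0", "tex")
# Atoms with different predicates (or arities) do not generalize:
assert lgg(("vi", "ch2", "tex"), ("cd", "ch2")) is None
```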
Friday, October 2, 2009
![Page 80: Learning and Using Plans by Reading Narrativesweb.media.mit.edu/~dustin/dsmith-generals.pdf · ARPA ran 7 Message Understanding Conferences (MUC) to work on IR problems in NL from](https://reader034.fdocuments.in/reader034/viewer/2022050311/5f7354ac52f9ef1e931d2738/html5/thumbnails/80.jpg)
Sequential
Relational Sequence Alignment
[1] Much related work by Kristian Kersting and Luc De Raedt.
73
History
74
Q-A systems from 1963
[1] Green B., Wolf A., Chomsky C., and Laughery K. Baseball: An automatic question-answerer. 1963.
75
SHRDLU
76
- Executing instructions & answering questions in microworlds (Winograd 72)
Reading Childrenʼs Stories (Charniak 72)
- What is the relationship between Jack and Mother?
- Who will eat the cake?
- Where was Mother when she was baking the cake?
- Where was Jack?
- ...
77
Jack was having a birthday party. Mother baked a cake.
Early AI researchers like Charniak focused on children's stories, assuming that they contained less knowledge and were easier to parse than the 20-word-long sentences of the Wall St. Journal.
Even these two simple sentences require a lot of knowledge, though. Their subjects and objects only overlap with respect to background knowledge that connects "baking a cake" with "birthday party".
Eugene Charniak’s Ph.D. [1] “the chief concern motivating the model discussed here is relating a large body of knowledge to a particular story”
By preschool, children can represent and remember event sequences [2].
By the time they are reading, children have acquired a lot of world knowledge already!
78
[1] E. Charniak. Toward a Model of Children's Story Comprehension. 1972.
[2] J. Wenner and P.J. Bauer. Bringing order to the arbitrary. 1999.
Understanding Childrenʼs Stories
By the time children are reading, they have a lot of world knowledge they use.
Jean Mandler took this farther, arguing that when children first start speaking, their conceptual representations are very rich even though their lexicons and articulation abilities are not. The word "up" pragmatically means something like "I want you to pick me up".
79
“This paper describes a natural language system which improves its own performance through learning. The system processes short English narratives and is able to acquire, from a single narrative, a new schema for a stereotypical set of actions. During the understanding process, the system attempts to construct explanations for characters' actions in terms of the goals their actions were meant to achieve. When the system observes that a character has achieved an interesting goal in a novel way, it generalizes the set of actions they used to achieve this goal into a new schema.” [1]
Learning Plan Schema from Narratives
[1] Mooney R. and DeJong G. Learning Schemata for Natural Language Processing. 1985.
“A natural language system requires extensive knowledge about the world. Clearly, if a computer system is to summarize, translate, or answer questions about a text, it must have knowledge about the concepts expressed in the text. Imagine trying to process a narrative describing a bank robbery without knowledge of money and why people want it” [1]
The system was called GENESIS.
Key idea: generalize against background knowledge, using the system's own knowledge as generalization boundaries.
“... This approach still requires a large amount of existing knowledge that could be used to construct detailed explanations for simpler stories....” Then, “NLP research began to focus on building robust systems for simpler tasks” [2]
[2] Mooney R. Learning Semantic Parsers: An Important but Under-Studied Problem. 2005.
Big Question: Characterizing the problem
How can we acquire knowledge by reading, when reading itself requires knowledge?
80
- Bootstrap solution: gradually compose new (more complex) representations out of existing (simpler) representations. Reading involves learning.
So now we have a causal cycle: how can we get knowledge from reading, when doing so requires knowledge?
Possibly we can approach this as a bootstrapping problem: first build a lowest layer, sufficiently low-level that we will not need to answer questions about it in much more detail, then develop richer layers on top of it.
Knowledge Acquisition from Volunteers
http://commons.media.mit.edu
82
With a little help from OpenMind...
Jack was having a birthday party. Mother baked a cake.
Use the OpenMind Common Sense knowledge base [1].
[1] http://commons.media.mit.edu
OpenMind knows a lot about the terms in this children's story.
86
With a little help from OpenMind...
Jack was having a birthday party. Mother baked a cake.
Jack (40 assertions):
- is a game
- is a children's game
- is a nickname for John
- is a boy's name
- is a childhood game
- is a lifting device
- is kind of nickname
birthday party (17 assertions):
- bake a cake because you want to
- balloon used for
- likely to find at toy balloon
- likely to find at helium balloon
- buying presents is for
- are fun
Mother (260 assertions):
- can care for a child
- is a woman
- take care of their children
- loves her child
- is part of my family
baked a cake (19 assertions):
- you should have an oven
- a birthday may make you want to
- because you want to celebrate a birthday
- first thing add flour
Problem: Retrieve only the relevant knowledge
The problem here is that there is TOO MUCH knowledge!
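One simple, purely illustrative way to attack the retrieval problem (not the approach proposed here): score each assertion by lexical overlap with the story and keep only the top-scoring ones:

```python
# Hypothetical sketch: rank OpenMind-style assertions by word overlap
# with the story, so that only plausibly relevant facts are retrieved.
# The stopword list and scoring are illustrative placeholders.

STOPWORDS = {'a', 'an', 'the', 'was', 'is', 'to', 'for', 'you', 'want'}

def tokens(text):
    return {w.strip('.,').lower() for w in text.split()} - STOPWORDS

def retrieve(story, assertions, k=2):
    story_words = tokens(story)
    scored = [(len(tokens(a) & story_words), a) for a in assertions]
    scored.sort(key=lambda s: -s[0])
    return [a for score, a in scored[:k] if score > 0]

story = "Jack was having a birthday party. Mother baked a cake."
assertions = [
    "jack is a lifting device",
    "bake a cake because you want to celebrate a birthday",
    "the sun is yellow",
]
print(retrieve(story, assertions))
```

Note that even this toy version misses "baked"/"bake" without stemming, which hints at why real relevance retrieval is hard.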
87
Problem: Knowledge is cognitive-context sensitive
Plants are edible
Some types of plants are safe to eat and nutritious for humans, while other plants might be edible for other animals.
The sun is yellow
The Earth's sun emits radiation that is perceived to be a color English-speaking humans call "yellow".
Giant squid swim deep in the oceans
Giant squid, in their typical behavior, swim in areas of the oceans deeper than 20th-century humans typically explore.
The knowledge in OpenMind is missing its procedural counterpart; it is not declarative enough to be true irrespective of context. Knowledge is embodied in procedures: human knowledge is human procedures, used for solving problems. Knowledge from humans needs to be sensitive to the goals, taboos/censors, values, and shared experiences that people have.
88
MRL (Meaning Representation Language): Traditionally First-Order Logic
- FOL has a long history as the MRL for NL
- Acknowledged as "a good starting point", but not the way people represent knowledge (e.g., the quantifier "most" cannot be expressed in FOL). Most model-theoretic inferences are intractable, so semantic approaches are proof-theoretic.
- “In short, first-order logic offers an attractive compromise between the conflicting demands of expressivity and inferential effectiveness.” - (Bos & Blackburn 2008)
- Relational representations are essential, yet shackling language understanding to the awkwardness of deductive inference is problematic.
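The point about "most" can be made concrete. "Most" is a generalized quantifier: over a finite model it is just counting, as the hypothetical sketch below shows, yet no first-order formula captures it over arbitrary domains:

```python
# Hypothetical sketch: "most A are B" as a generalized quantifier over
# a finite model, i.e. |A ∩ B| > |A - B|. Counting handles finite
# domains, even though no first-order formula expresses this in general.

def most(A, B):
    A, B = set(A), set(B)
    return len(A & B) > len(A - B)

plants = {'rosemary', 'aloe', 'fern', 'cactus'}
watered = {'rosemary', 'aloe', 'fern'}
print(most(plants, watered))  # 3 watered vs 1 not: True
```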
Lexical-Semantic Resources
89
In verb learning, syntax matters.
‣ Representing a list
‣ a sublist
‣ definitions: "a xanthine is a product on the pathway of purine degradation"
‣ vague attributes: "their characteristic properties disappear altogether..."
‣ coreference: "Hydrogen chloride reacts. This reaction produces..."
[1] L. De Raedt and J. Ramon. Deriving distance metrics from generality relations. 2008.
90
Example Background Knowledge
91
Lexical-Semantic Parsing
Semantic role labeling is akin to filling in a function header with predicates and arguments. Let's hook this up with a plan representation.
Use lexical-semantic resources (VerbNet, WordNet, OpenMind) to extend predicate-argument structure. A possible representation:
motion(during(E), container)
not(location(start(E), container, aloe))
location(end(E), container, aloe)
cause(Joan, E)
Joan watered the aloe =
E = water[butter-9.9](agent a=Joan, destination d=aloe[plant], theme t=[?container])
if d in [plant, lawn, potted tree, tree, flower pot]:
    d.dry = False
    d.wet = True
if d in [houseplant]:
    d.dying = False
    d.healthy = True
This is where the linguists get really confused, as they move into planning territory. We already have a language for talking about goals and actions: plans!
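That last step can be sketched as code: the frame's effects, read as a STRIPS-style plan operator that can be applied to a world state (the names and predicates here are illustrative, not the system's actual representation):

```python
# Hypothetical sketch: turn the water-frame from the slide into a
# STRIPS-style operator with preconditions and effects, then apply it.

from dataclasses import dataclass, field

@dataclass
class Operator:
    name: str
    preconditions: set = field(default_factory=set)
    add_effects: set = field(default_factory=set)
    del_effects: set = field(default_factory=set)

    def apply(self, state):
        if not self.preconditions <= state:
            raise ValueError(f'{self.name}: preconditions unmet')
        return (state - self.del_effects) | self.add_effects

# water(agent=Joan, destination=aloe): the destination gets wet
water_aloe = Operator(
    name='water(Joan, aloe)',
    preconditions={'has(Joan, container)', 'dry(aloe)'},
    add_effects={'wet(aloe)'},
    del_effects={'dry(aloe)'},
)

state = {'has(Joan, container)', 'dry(aloe)', 'dying(aloe)'}
print(water_aloe.apply(state))
# dry(aloe) is deleted, wet(aloe) added; dying(aloe) persists by inertia
```

The frame-rule pairs on the slide (dry becomes wet, dying becomes healthy) map directly onto the operator's delete and add lists.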