
Transcript of Laurea in INFORMATICA MAGISTRALElacam.di.uniba.it/~ferilli/ufficiale/corsi/ai-eng/7 -...


Representation / 2

Semantic Nets - Schemes: Frames, Scripts

Laurea in INFORMATICA MAGISTRALE

Corso di ARTIFICIAL INTELLIGENCE

Stefano Ferilli

Categorization

● Procedure to organize objects into classes or categories
● Allows defining properties and making assumptions on the entire category rather than on single elements
● Crucial in the conceptualization and knowledge representation phase
  – The choice of categories determines what can be represented
● Common in technical-scientific fields
  – Leads to the definition of taxonomies
    ● Biology: living species
    ● Chemistry: elements
    ● Archives: subject fields
    ● ...
● Humans naturally organize concepts in 3 hierarchical levels
  ● Basic, Subordinate, Superordinate
    ● [Rosch & Mervis (1975): “Family resemblance: studies in the internal structures of categories”, Cognitive Psychology, 573-605]
  – Basic concepts: the natural way to categorize the objects and entities that make up our world
    ● The first categories learned by humans
    ● Other concepts acquired later as an evolution of basic categories
  – Superordinate concepts -> generalization
  – Subordinate concepts -> specialization
  – E.g.:
    ● “chair”: basic concept
    ● “furniture”: superordinate with respect to chair
    ● “rocking chair”: subordinate

Categorization

● Once formed, categories tend to have a structure that
  ● stresses the similarity among members of the same category
  ● maximizes the differences between members of different categories
● This allows reasoning with prototypes
● Prototype
  – A specimen of the category near the center of the feature space for that category

Categories and Objects

● Much reasoning happens at the category level, rather than on individuals
● If knowledge is organized into (sub-)categories, it is sufficient to classify an object, by its perceived properties, to infer the properties of the category/-ies it belongs to
● Simplifies the knowledge base, using INHERITANCE as a form of inference
  – Placing an attribute high in the taxonomy allows many instances to inherit that property

Inheritance

● Property Inheritance
● Objects are grouped into classes because they share some properties
  – => there exist inference mechanisms that work on the structure of the representation
● Characterization of a property A: given at the class level, but interpreted as a property of all instances of the class
  – ∀x ( is_a(x, Class) ⇒ A(x) )
● Implementation: GRAPH SEARCH
  – Starting from the given node, bottom-up search in the taxonomy for a node defining its properties
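The bottom-up search can be sketched as follows; the taxonomy and property tables are invented for illustration, and `lookup` climbs the is_a/instance_of chain until some level defines the requested attribute:

```python
# Sketch of property inheritance as bottom-up graph search.
# The 'taxonomy' and 'properties' tables are illustrative assumptions.

taxonomy = {            # child -> parent (instance_of / is_a links)
    "peter": "person",
    "person": "mammal",
    "mammal": "animal",
}
properties = {          # properties attached at some level of the taxonomy
    "mammal": {"breathes": "air"},
    "animal": {"alive": True},
}

def lookup(node, attribute):
    """Climb the taxonomy from `node` until a level defines `attribute`."""
    while node is not None:
        if attribute in properties.get(node, {}):
            return properties[node][attribute]
        node = taxonomy.get(node)  # move one level up
    return None  # attribute not defined anywhere on the path to the root

print(lookup("peter", "breathes"))  # -> air
```

With these tables, the search makes "Peter breathes air" derivable even though nothing is stored directly on the `peter` node.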


Inheritance

● Example
● “Peter breathes air” can be inferred from the fact that Peter is a Person, Person is a subclass of Mammal, and Mammals breathe Air

[Diagram: Peter -instance_of-> Person -subclass-> Mammal; Mammal -breathes-> Air]

Prototypes

● Reasoning with prototypes
  – What typically characterizes a concept
● Necessary conditions
  – Bird(x) ⇒ Vertebrate(x)
  – Bird(x) ⇒ Biped(x)
● Typically necessary conditions (defaults; ⇒Tip = “typically implies”)
  – Bird(x) ⇒Tip Flies(x)
  – Bird(x) ⇒Tip Feathered(x)
● Sufficient conditions (criterial)
  – Canary(x) ⇒ Bird(x)
  – Ostrich(x) ⇒ Bird(x)
● Typically sufficient conditions
  – Flies(x) ∧ Tweets(x) ⇒Tip Bird(x)
  – Feathered(x) ⇒Tip Bird(x)
  – Source of non-monotonicity

Knowledge Representation Schemes

● Semantic Nets
  ● A method to represent knowledge using graphs made up of nodes and arcs
    – Nodes = objects
    – Arcs = relationships between objects
● Frames
  ● A data structure to represent concepts, objects and properties through stereotypes
● Scripts
  ● Describe sequences of typical events

Graphs for Knowledge Representation

● Needs:
  ● Analyze knowledge in terms of low-level primitives
  ● Organize it in high-level structures
● Basic units to represent common-sense knowledge using graphs
  ● Concepts = nodes
  ● Relationships = arcs
    – Used to connect concepts to each other in the graph

Sowa’s Conceptual Graphs (GCs)

● A method to represent mental models
● Able to systematically and formally capture how what we think about a concept can be described in terms of its relationships with other kinds of concepts
● 2 kinds of nodes
  ● Concepts (concrete or abstract)
  ● Relationships (part of, object of, characteristic of, ...)

[Diagram: Concept 1 -> Relationship -> Concept 2; the Relationship node explains the dependence between Concept 1 and Concept 2]

Conceptual Graphs

● Most appropriate under the assumption that knowing something = the ability to build mental models that accurately represent both that thing and the actions that can be taken through and/or on it
● Power: able to formalize
  ● Both concrete and abstract concepts
  ● Hierarchies of concepts
    – The base of the hierarchy defines concrete concepts on which progressively more abstract concepts rely


Semantic Nets

● So called because initially used to represent the meaning of natural language utterances
  ● (Quillian, 1966, Semantic Memory)
● A more appropriate name would be Associative Nets
  – More neutral with respect to the application domain
● Peculiarity related to cognitive economy: the identification of fundamental structural relationships (system relationships) that are very frequent in reality
● Focus on the meaning to give to sentences expressed in natural language

Semantic Nets

● From Digraphs to Semantic Nets
  ● Based on the graphical representation of the relationships existing between elements in a given domain, Semantic Nets adopt the metaphor that
    – objects are nodes in a graph and
    – such nodes are related by arcs representing binary relationships
  ● Like all formalisms based on graphical methods, they can be easily and immediately understood

Semantic Nets

● From Digraphs to Semantic Nets
● A Semantic Net is a labeled directed graph used to describe relationships between objects, situations or actions
  – Powerful representation
  – Can be reduced to a tractable symbolic representation (Logic)
● Graph notation by itself has little advantage over logical notation
  – It allows one to represent relationships between objects and define inferences through links

Semantic Nets

● An example

[Diagram: three blocks A, B, C and a floor, connected by arcs labeled above, below, to_left, to_right, bigger, smaller]

Semantic Nets

● Representation of constraints on features
● How to capture semantics in the representation
● Note: without the constraint that the pillars “do not touch”, the digraph representations of “arch” and “not arch” are exactly the same

[Diagram: “Arch” vs “Not arch”. In both, an Architrave is supported by a Left Pillar and a Right Pillar (arcs: supported by, left_of, right_of); only the “not touch” arc between the two pillars distinguishes the arch]

Semantic Nets

● Systematically used when building knowledge-based systems, thanks to the possibility of distinguishing, among nodes that represent concepts, token-nodes and type-nodes
  ● Tokens = individual physical symbols
  ● Types = sets of symbols
● Properties of a token-node derive from the type-node to which it is linked

[Diagram: type-nodes Event, Action, Murder, Premeditated Murder, Unpremeditated Murder; token-node Pistorius’ Murder]


Semantic Nets

● Quillian’s initial work defined most of the Semantic Net formalism
  ● Labeled arcs and links, hierarchical inheritance, inferences along associative links, ...
● but the representation is unsuitable for complex domains
  ● Most links represent very general relationships, unable to capture the complex semantics of real-world problems

Semantic Nets

● Subsequent focus: definition of a richer set of labeled links useful to model the semantics of natural language
● Key: identification of semantic primitives for natural language (Schank, 1974)
  ● The semantic primitives of a language for Semantic Net processing correspond to elementary concepts that can be expressed through the language
    – An interpreter can handle them because it has been programmed from the beginning to understand them

Semantic Nets

● Basic idea
  ● Concept meaning is determined by relationships to other objects
● Information stored by interconnecting nodes (entities) through labeled edges (relationships)
  – Example: physical attributes of a person
● Translation into logic:
  – isa(person, mammal), instance(Mike-Hall, person), team(Mike-Hall, Cardiff)

Semantic Nets

● Typical relationships:
  – is_a
    ● supertype-type (superclass-class)
    ● type-subtype (class-subclass)
    ● subtype-instance (subclass-instance)
  – part_of
    ● supertype-type (superclass-class)
    ● type-subtype (class-subclass)
  – has
    ● object-property
  – value
    ● property-value
  – linguistic
    ● Examples: likes, owns, travel, made_of

Semantic Nets

● Example: consider the sentence
  ● John gave Mary the book.
    – Different aspects of one event

Semantic Nets

● Inference

● Basic mechanism: a form of spreading activation of the nodes in the net

● Already proposed by Quillian in his association nets devised for natural language processing

– Given as input an unknown proposition, the system should be able to find out a representation of its meaning based on the definitions that are available in the net

● More limited performance: the system would only be able to compare the meaning of two lexical entries

● The comparison is carried out by starting from the parent/ancestor nodes of the words to be compared and progressively visiting and activating neighbor (type or token) nodes through association arcs; this happens for the two propositions, looking for a common area of activated nodes


Semantic Nets

● Inference
● General mechanism: “follow the relationships between nodes”
● 1. Search for Intersection [proposed by Quillian]
  – Procedure: spreading activation from two starting nodes
    ● Start by labeling a set of source nodes (the concepts in the semantic net) with weights/tags (“activations”)
    ● Then iteratively propagate them from the source nodes towards the nodes linked to them
    ● Relationships among objects are found by expanding the activation from the two nodes and finding their intersection; this is obtained by assigning a special tag to each visited node
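A minimal sketch of the intersection search, on a toy net whose node names are invented for illustration: activation spreads breadth-first from the two source nodes, each visited node is tagged per source, and the search stops when the two activated sets meet:

```python
from collections import deque

# Sketch of Quillian's intersection search: breadth-first spreading
# activation from two source concepts until their activated sets meet.
# The tiny net below is an invented example.

neighbours = {
    "canary": ["bird"], "bird": ["animal", "feathers"],
    "shark": ["fish"], "fish": ["animal"],
    "animal": [], "feathers": [],
}

def intersection_search(a, b):
    activated = {a: {a}, b: {b}}           # per-source tags on visited nodes
    frontier = {a: deque([a]), b: deque([b])}
    while frontier[a] or frontier[b]:
        for src in (a, b):
            if not frontier[src]:
                continue
            node = frontier[src].popleft()
            for nxt in neighbours.get(node, []):
                if nxt not in activated[src]:
                    activated[src].add(nxt)
                    frontier[src].append(nxt)
            common = activated[a] & activated[b]
            if common:
                return common              # first meeting point(s)
    return set()

print(intersection_search("canary", "shark"))  # -> {'animal'}
```

Starting from "canary" and "shark", the activations meet at "animal", which is the relationship the net encodes between the two concepts.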

Semantic Nets

● Inference
● 2. (obviously) Inheritance
  – Based on isa and instance links
  – Leverages the transitivity of isa
  – Easily implemented as link traversal

[Diagram: Mother -IS-A-> Parent -IS-A-> Person; Parent -hasChild-> Person]

Semantic Nets

● Inference
● What about n-ary relationships (n > 2)?
  – Case structure representation technique
● Example
  – gives(John, Mary, Book)
  – John gives Mary a book

[Diagram: event node E1 -instance-> give-events, with arcs hasAgent -> John, hasObject -> book-4, hasRecipient -> Mary]
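The case-structure technique above can be sketched by reifying the ternary relation as an event node linked by binary arcs (the node and role names follow the slide’s example; the triple-list representation is an illustrative assumption):

```python
# Sketch: reifying the ternary gives(John, Mary, Book) as an event node
# with binary arcs, mirroring the case-structure technique.

triples = [
    ("E1", "instance", "give-events"),
    ("E1", "hasAgent", "John"),
    ("E1", "hasObject", "book-4"),
    ("E1", "hasRecipient", "Mary"),
]

def roles(event):
    """Collect the case roles attached to an event node."""
    return {r: t for s, r, t in triples if s == event and r != "instance"}

print(roles("E1"))
# -> {'hasAgent': 'John', 'hasObject': 'book-4', 'hasRecipient': 'Mary'}
```

Because every arc is binary, the same graph machinery (traversal, inheritance, intersection search) works unchanged on relations of any arity.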

Semantic Nets

● Inference
● Inheritance also provides a means to carry out default reasoning
  – Applicable unless there are exceptions
  – Example:
    ● Emus are birds
    ● Typically birds fly and have wings
    ● Emus run (??)

Semantic Nets

● Critique: lack of semantics!
  – Ambiguities and inconsistencies in the use of nodes and arcs
    ● Woods [75] and others
  – Semantics sometimes unclear, or discoverable only through the manipulation programs
  – Examples of confusion:
    ● isa used both for “belongs to” and “subset of”
    ● Canonical instance or class of objects?
    ● Different meanings of relationships (among classes, between classes and objects, among objects)
● What about logic?
  – Semantic nets are a comfortable notation for a part of FOL, and can nevertheless be cast as a logical formalism
    ● Well, not entirely...

Semantic Nets

● Translation into logic
● Note: classes in uppercase, individuals in lowercase
  – A -IS-A-> B : ∀x: A(x) ⇒ B(x)
  – a -INSTANCE-> B : B(a)
  – A -R-> b : ∀x ∈ A: R(x, b)
  – A -R-> B : ∀x ∈ A, ∃y ∈ B: R(x, y)


Semantic Nets

● Sample Translation
  – ∀x Mammal(x) ⇒ Animal(x)
  – ∀x Mammal(x) ⇒ HasNPaws(x, 4)
  – ∀x Elephant(x) ⇒ Mammal(x)
  – ∀x Elephant(x) ⇒ HasColor(x, gray)
  – Elephant(Clyde)
● It is possible to deduce:
  – Animal(Clyde)
  – Mammal(Clyde)
  – HasNPaws(Clyde, 4)
  – HasColor(Clyde, gray)

[Diagram: Clyde -INSTANCE-> Elephant -IS-A-> Mammal -IS-A-> Animal; Mammal -HasNPaws-> 4; Elephant -HasColor-> gray]

Semantic Nets

● Exceptions – Sample Translation
  ● ∀x Mammal(x) ⇒ HasNPaws(x, 4)
  ● ∀x Dolphin(x) ⇒ Mammal(x)
  ● ∀x Dolphin(x) ⇒ HasNPaws(x, ?)
  ● Dolphin(Flipper)
  – It is possible to deduce:
    ● HasNPaws(Flipper, 4)
    ...but also
    ● HasNPaws(Flipper, ?)
● Modeling default reasoning requires non-monotonic logics

[Diagram: Flipper -INSTANCE-> Dolphin -IS-A-> Mammal; Mammal -HasNPaws-> 4; Dolphin -HasNPaws-> ?]
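Procedurally, such exceptions are often handled by letting the most specific level that defines an attribute shadow the inherited default. A minimal sketch (the concrete value 0 for Dolphin stands in for the slide’s “?”, and the tables are illustrative assumptions):

```python
# Sketch of default inheritance with exceptions: the nearest (most
# specific) level of the taxonomy that defines an attribute wins.

taxonomy = {"Flipper": "Dolphin", "Dolphin": "Mammal"}
defaults = {
    "Mammal": {"HasNPaws": 4},
    "Dolphin": {"HasNPaws": 0},   # exception overrides the Mammal default
}

def value(node, attribute):
    while node is not None:
        if attribute in defaults.get(node, {}):
            return defaults[node][attribute]   # nearest definition wins
        node = taxonomy.get(node)
    return None

print(value("Flipper", "HasNPaws"))  # -> 0
```

Note the non-monotonicity: adding the Dolphin exception retracts a conclusion (HasNPaws(Flipper, 4)) that the knowledge base previously licensed, which is exactly what classical logic cannot model.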

From Semantic Nets to Frames

● 70s-80s: need for wider structures than simple “conceptual nodes”
  ● Schemes, Frames, Scripts
● Concept of “schema” rediscovered
  ● Proposed to explain some behavior of human memory [Bartlett, 1932]
    – Tendency to recall worse at each recall
    – Better recall of important propositions
    – Omissions, rationalization, search for a sense
● And also...
  – Theory of linguistic prototypes [Fillmore, 1968]
  – Frames in sociology [Goffman, 1974]
  – Conceptualizations of Natural Language [Schank, 1973]

Schemes, Frames, Scripts

● Common features [Schank & Abelson, Rumelhart, Bower]
  ● Structures used by humans to organize knowledge
  ● Concern objects/events/situations
  ● Useful for understanding
    – Create expectations/predictions
  ● General structures
  ● Affect the way in which we interpret and recall objects and events
    – By embedding information on instances or specific events

Frames

● Originally conceived in Psychology
  ● Knowledge structures to represent stereotyped situations and typical objects
● “framing” = an inevitable process of selective influence on the perception of the meanings that an individual attributes to words or sentences
● Used to define “interpretation schemes” allowing individuals or groups to “position, perceive, identify and classify” events and facts, thereby structuring meaning, organizing experiences, driving actions [Goffman]

Frames

● In semantics and computational linguistics
  ● A schematization of a situation, state or event
  ● E.g., the “trade” frame
    – Using lexical units that recall the situation or event
      ● E.g., the words “buy”, “sell”, “cost”, etc.
    – Using semantic roles
      ● E.g., “buyer”, “seller”, “money”, “good”, “transaction”, etc.
● Generally invoked by a verb in a sentence
● Allows one to (manually or automatically) annotate the sentence with the corresponding semantic roles


● The idea of invoking a known situation to give meaning to a sentence or situation was already used in natural language understanding
  ● Example: Sentences
    – A: 1. Tom went to the restaurant / 2. Asked the waiter for a steak / 3. Paid his bill and went away
    – B: 1. Tom went to the restaurant / 2. Asked the dwarf for a mouse / 3. Gave him a coin and went away
      ● similar in syntactic structure
      ● semantically consistent, since they use the same primitives
      ● but B has no meaning, while A is understandable because it refers to the Restaurant frame

Frames

● Example: frame for the concept “cinema”
  – Describes the most common stereotype of a cinema
    ● An entrance with a ticket counter, a waiting space, a projection hall
  – May describe, in the form of a script, the most common sequence of events that happen at a cinema
    ● Buying a ticket, waiting for the show to begin, commercials before the movie, then the actual movie, and finally exiting the place

Expectation

● Very important phenomenon
  – Many things can be explained by the hypothesis that the interaction of intelligent systems with the world is driven by a rich set of expectations of many kinds
● Each frame establishes a set of expectations (activated by the perceived signals) that can be confirmed or dismissed
● Aim
  – Representing in algorithmic form (so as to provide statements for a computer) the set of implicit knowledge and expectations allowing a human to understand (by making inferences) sequences of events that can be retraced from consistent narrations

Expectation

● Examples
  – Four metal tips appearing from under a napkin on a furnished table are immediately recognized as a fork
    ● Seeing the same tips appearing from a book in a library would be different!
  – While the concept of “paying the ticket” does not belong to the logical definition of “cinema”
    ● Not in the same way as the concept of “trunk” belongs to the definition of “tree”
    it is well known, based on experience, that going to the cinema usually involves paying for a ticket

Frames

● Typical features of human intelligence and understanding, to be considered when building programs aimed at simulating them:
  ● Quick recall of appropriate frames from long-term memory
  ● Ability to
    – Activate many frames and subframes to understand complex situations
    – Find plausible justifications for situations not corresponding to the expectations of the activated frames
  ● Possibility of integrating, modifying or replacing frames that do not fit current situations

Frames

● Theory of Frames in AI (1975)
  ● Proposed as a paradigm to represent knowledge from the real world so as to be exploitable by a computer
  ● Formalized by Minsky, inspired by proposals from research in cognitive psychology, sociology and linguistics
  ● Aim: allowing the construction of a database containing the huge amount of knowledge needed by a system aimed at reproducing human “common sense reasoning”


Frames

● Frame:
  ● An organization of data in memory, a set of inter-related knowledge, useful to represent complex situations or stereotypical events
    – E.g., a typical museum, a typical birthday party
  ● The result of a conceptualization activity
  ● Components:
    – Name
    – Set of slots (attribute-value pairs)
      ● Attribute = slot name
      ● Value = slot filler
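A minimal sketch of a frame as a name plus a set of slots, with an A-kind-of link for inheritance (the Furniture/Chair frames and slot names are illustrative assumptions, not from the slides):

```python
# Sketch of a frame: a named set of attribute-value slots, with an
# A-kind-of (ako) link through which missing values are inherited.

class Frame:
    def __init__(self, name, ako=None, **slots):
        self.name, self.ako, self.slots = name, ako, slots

    def get(self, slot):
        """Return the local filler, or inherit it along the ako chain."""
        if slot in self.slots:
            return self.slots[slot]
        return self.ako.get(slot) if self.ako else None

furniture = Frame("Furniture", material="wood")
chair = Frame("Chair", ako=furniture, n_legs=4, back_style="straight")

print(chair.get("n_legs"))    # -> 4
print(chair.get("material"))  # -> wood  (inherited from Furniture)
```

Filling a slot locally (e.g. a beanbag chair with `material="fabric"`) would shadow the inherited value, which is the same shadowing mechanism seen for semantic-net exceptions.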

Frames - Reasoning Mechanisms

● When facing a new situation, select a frame from memory
● A reference to be adapted so as to fit reality, changing its details as needed
● Each frame is associated with several types of information
  ● How to use the frame
  ● What one may expect to happen later
  ● What to do if such expectations are not confirmed

Frames - Reasoning Mechanisms

● Added value compared to other formalisms:
  ● Organization of knowledge similar to that used by humans to acquire and keep up to date knowledge based on their everyday experience
● Note: the data in a frame might be represented using other knowledge structures
  – E.g., semantic nets or production rules
  ● Actually, frame-based systems often use such formalisms

Frames

● Usually represented as graphs
  ● but Minsky never explicitly refers to semantic nets or is_a categories
● Frames & Semantic Nets
  – Name = node
  – Slot names = names of the arcs outgoing from the node
  – Slot fillers = nodes at the other end of such arcs

Frames

● A frame can be thought of as a net of nodes and relationships
  ● Top levels are fixed, and represent things that are always true about the supposed situation
  ● Lower levels have the slots as terminals
    – To be filled by assigning specific data values
  ● Each terminal may specify the conditions under which such assignments must take place
    – May be typed or otherwise constrained
    – Values may be known or unknown
    – May have default values

Frames

● Primitives for handling frames
  ● Invocation
    – Initial consideration of a frame
  ● Determination
    – Deciding whether there is enough evidence to infer an invoked frame
  ● Processing
    – Filling a slot of a given frame
  ● Termination
    – Inferring that a given frame is no longer relevant


Frames

● When a frame is chosen, an evaluation process takes place to determine whether it is appropriate
● Matching used as a form of inference
  – A frame ‘instance’ can be considered an ‘instance’ of another frame if the former matches the latter
● Example
  – John Smith
    ● Instance of frame Man
  – Is a Dog_owner
    – If there is a match with an instance of frame Dog_owner
      ● e.g., with Owner Name

Frames

● When a frame is not suitable for a situation, it can be
  ● ‘transformed’
    – Perturbation procedures for small changes
      ● May fix the description
  ● or ‘replaced’
    – Replacement procedures for significant changes
      ● Look for new frames whose terminal slots correspond to a sufficient number of terminal slots of the original frame

● Tasks in the mental process carried out to give meaning to sentences
  ● Recognition
    – Based on reference to situations that are stereotyped, known, or that can anyhow be reduced to everybody’s experience
    – Implies accessing the proper high-level structure
      ● Encodes the possible interpretations (or predictions)
  ● Interpretation
    – Implies a simple processing of that structure aimed at retrieving predictions
  ● Prediction

● A kind of loop [Schank]
  ● Similar to that of an Intelligent Agent
    – 1. John took out the coins
    – 2. Put them in the slot
      ● 3a. Dialed the number
      ● 3b. Started the game
      ● 3c. Picked up the cup of coffee
    – Several schemes and interpretations can be invoked based on the first two sentences
    – Misrecognition
      ● Selection of the stereotyped model of interpretation usually happens at the occurrence of the first event, so the selection of the schema may fail
    – Reinterpretation
      ● When new facts occur, other schemes are selected until the only compatible scheme remains

Frames

● Slot
  ● A way to represent knowledge
    – The relationship between an object and a value is explicitly named, rather than implied by position
  ● Example: Tom gave Mary the book
    – give(Tom, book, Mary, past)
    – give(subj=Tom, obj=book, obj2=Mary, time=past)
  ● Slots allow embedding relationships between different frames
    – This enables mechanisms for linking frames

Frames

● The Theory of Frames is more than just a proposal for a data structure
● Frame System
  ● A schema such that “if there is access to a frame, indexes are automatically created to select other potentially relevant frames” [Minsky]
    – Knowledge encoded in packets (Frames) organized in a networked structure (Frame System) to make retrieval easier, so that if a frame is invoked, links to other potentially connected frames are activated


Example of a Frame-based System

[Diagram (reconstructed): a small hierarchy of frames
  – truck: superclass: vehicle; slots: reg. number, producer, model, owner, tonnage, part of (-> basket)
  – car: class: vehicle; slots: reg. number, producer, model, owner, number of doors (default 4), horse-power
  – John’s car: class: car; reg. number: LV97, producer: BMW, model: 520, owner: John, number of doors: 2, horse-power: 150
  – basket: size: 2*3*1.5, material: tin
  – John: age: 22, length of driving: 2]

Frames

● Frame Systems
  ● In addition to providing default values, frames can
    – Include knowledge in the form of rules, and inherit it from other frames
    – Hook procedural information
      ● Computed through programs
  ● Effects of important actions are reflected by transformations between the frames of a system

Frames

● Frames allow representing in a single place information collected at different times and places and from different viewpoints
  ● Example: visual scene analysis
    – Used by Minsky (“A framework for representing knowledge”, AI Memo 306, MIT AI Lab, 1975) to better highlight the features of this representation
    – Different frames
      ● describe a scene from different perspectives
      ● may have the same terminal slots
    – Common slots are to be interpreted as the same physical feature, seen from different perspectives

[Diagram: visual frames of a cube seen from different viewpoints, with faces A, B, C, D, E related by Left/Right/Above arcs; transformations from one frame to another amount to moving from one viewpoint to another (spatial frames when moving towards the right)]


Frames

● Possible terminal slot filler expressions
  ● Frame name
  ● Frame relationships to other frames
  ● Slot actual value
    – Symbolic, numeric, boolean
  ● Slot default value
  ● Slot range of values (or restriction on values)
  ● Procedural information

Frames

● Possible terminal slot filler expressions
  ● Example: frame Chair
    – Specialization-of = Furniture
    – No. legs = an integer (DEFAULT = 4)
    – Back style = high, low, straight, padded, ...
    – No. arms = 0, 1, 2
    – Type = normal, wheel, beach, toy, electric, ...

Frames

● The specialization slot A-kind-of is used to create a property-inheritance hierarchy among frames
● The set of frames and slots provides the description of a situation
  ● When considering a specific situation,
    – a copy of the frame (an instantiation) is made
    – and added to the short-term memory,
    – filling the slots with the specifics of the particular situation

Frames

● Slots and property inheritance
  ● [ FRAME: Event
      IS-A: Thing
      SLOTS: (Time (A specific time – Default: 21st century))
             (Place (A place – Default: Italy)) ]
  ● [ FRAME: Action
      IS-A: Event
      SLOTS: (Actor (A person)) ]
  ● [ FRAME: Walk
      IS-A: Action
      SLOTS: (Start (A place)) (Destination (A place)) ]
  ● [ FRAME: Election
      IS-A: Event
      SLOTS: (Place (A (Country (Headed by (A President))))) ]
  – Action and Walk inherit the default values (21st century and Italy) from Event
  – This is impossible for Election, which requires a specific place of its own, with a type specification that might make the inherited default value illegal

Frames

● Procedural information in slots
  ● Procedures attached to slots may drive the reasoning process
    – “IF-ADDED (CHANGED) DEMONS” or “TRIGGERS”
      ● Activated every time the value of a slot is changed
    – “IF-NEEDED DEMONS”
      ● Activated on request, when the value of a slot “must” be determined
  ● Demon
    – A procedure that checks whether some condition becomes true and, in that case, activates an associated process
      ● IF-THEN structure
    – Sometimes, the associated process is also called a Demon

Frames

● Procedural information in slots
  ● Example: frame “Temperature sensor”
    – Name: Unknown
    – CriticalValue: Unknown
    – Value: Unknown
      ● Value If-Needed method: Get-Value(Self.Name)
      ● Value If-Added method: IF Self.Value > Self.CriticalValue THEN Self.Status = Alert
    – Status: Unknown
      ● Status If-Added method: IF Self.Status = Alert THEN Sound-Alert
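The demon machinery of the sensor example can be sketched as follows (the class names, the hard-coded reading, and the alert list are illustrative assumptions; procedures attached to slots fire on write for IF-ADDED and on read for IF-NEEDED):

```python
# Sketch of slot demons: IF-ADDED fires on write, IF-NEEDED on read.

class Slot:
    def __init__(self, value=None, if_added=None, if_needed=None):
        self.value, self.if_added, self.if_needed = value, if_added, if_needed

class SensorFrame:
    def __init__(self, name, critical):
        self.alerts = []
        self.slots = {
            "Name": Slot(name),
            "CriticalValue": Slot(critical),
            "Value": Slot(if_needed=self._get_value, if_added=self._check_value),
            "Status": Slot(if_added=self._check_status),
        }

    def get(self, slot_name):              # IF-NEEDED demon fires on read
        slot = self.slots[slot_name]
        if slot.value is None and slot.if_needed:
            self.set(slot_name, slot.if_needed())
        return slot.value

    def set(self, slot_name, value):       # IF-ADDED demon fires on write
        slot = self.slots[slot_name]
        slot.value = value
        if slot.if_added:
            slot.if_added()

    def _get_value(self):
        return 120                          # stand-in for Get-Value(Self.Name)

    def _check_value(self):
        if self.slots["Value"].value > self.slots["CriticalValue"].value:
            self.set("Status", "Alert")

    def _check_status(self):
        if self.slots["Status"].value == "Alert":
            self.alerts.append("Sound-Alert")  # stand-in for the alarm

sensor = SensorFrame("T1", critical=100)
print(sensor.get("Value"), sensor.slots["Status"].value, sensor.alerts)
# -> 120 Alert ['Sound-Alert']
```

Reading Value triggers its If-Needed demon, whose write triggers the If-Added check, which in turn sets Status and fires the alarm: a single read cascades through the attached procedures.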


Frames

● Slots in frames and predicate calculus
  ● Frame instance = an object in the domain
    – Frame = predicate
    – Slots = functions that create terms
  ● Reasoning
    – If a slot has an associated
      ● IF-ADDED demon -> forward inference rule
      ● IF-NEEDED demon -> backward inference rule

Frames

● Slots in frames and predicate calculus
  ● Example
    – FRAME: Family
      SLOTS: (Mother-of (A Person)) (Father-of (A Person)) (Child-of (A Person))
    – A specific instance of Family, denoted by the identifier Smith-Family, can be asserted using the expression
      ● (Family Smith-Family)
    – Slot identifiers can be applied as functions
      ● (= (Mother-of Smith-Family) Mary)
      ● (= (Father-of Smith-Family) Tom)
      ● (= (Child-of Smith-Family) John)

Frames

● Frame-based development environments often extend the slot-filler structure through Facets
● Facet
  ● A way to provide extended knowledge about a frame attribute
  ● Used to
    – Determine the value of an attribute
      ● VALUE FACETS
    – Check user queries
      ● PROMPT FACETS
    – Tell the inference engine how to process the attribute
      ● INFERENCE FACETS

Frames

● Tools– Frame Languages

● OWL (Scolovits – MIT 1977)● FRL (Roberts – Goldstein MIT 1977)● KRL (Bobrow – Winograd 1977)● NETL (Fahlman MIT 1979)

– Usually provide record structures with typed fields, embedded in Is-a hierarchies

– Hybrid systems– Frame systems are sometimes adapted to create rich

descriptions or definitions, rather than encoding assertions● KL-ONE (Brachman 1982)● KRYPTON (Brachman & Fikes 1983)● FRAIL (Charniak 1983)● KODIAK (Wilensky 1984)● UNIFRAME (Maida 1984)

  – Shells
    ● KEE (Knowledge Engineering Environment, IntelliCorp, 1983)

Frames

● FrameNet [Lowe, Baker, Fillmore]
  – A resource made up of collections of sentences, syntactically and semantically annotated, organized in frames
● Frame-based semantics
  – The meaning of words stems from the role they play in the conceptual structure of sentences

● Knowledge structured in 16 general domains
  – time, space, communications, cognition, health, …

● 6,000 lexical elements; 130,000 annotated sentences

● http://framenet.icsi.berkeley.edu/fndrupal/

Conceptual Dependency

● Conceptual Dependency [Schank, 1973]
  – A theory of how to represent the knowledge about events usually contained in natural-language sentences

● A mechanism to represent and reason about events

● Objective: representing knowledge so as to
  – Ease inference starting from sentences
  – Be independent of the original language
  – Unlike semantic nets, provide both a structure and a set of primitives with which to build representations


Conceptual Dependency

● Representations of actions built starting from a set of primitive acts

– ATRANS Abstract transfer relationship (give)

– PTRANS Transfer of an object’s physical position

– PROPEL Application of a force to an object

– MOVE Movement of a part of the body by its owner

– GRASP Act of grasping an object by an actor

– INGEST Ingestion of an object by an animate being

– EXPEL Expulsion of an object by an animate being

– MTRANS Mental information transfer

– MBUILD Building of new information

– SPEAK Production of sounds

– ATTEND Focus on sensorial stimuli

Conceptual Dependency

● Conceptualizations representing events may be modified in many ways to provide the information usually conveyed by the tense and mood of a verbal form
● Set of tense markers proposed by Schank

– p past

– f future

– t transition

– ts transition started

– tf transition finished

– k ongoing

– ? interrogative

– / negative

Conceptual Dependency

● Example: "I gave John a book"

– Arrows = dependence direction

– Double arrow = bidirectional link between agent and action

– p = tense

– ATRANS = primitive action

– o = relation object

– R = receiver

– Linearized diagram:
  ● I ⇔(p) ATRANS ←o– book ←R– (to: John, from: I)
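The conceptualization above can be sketched as a small data structure; the `Conceptualization` class and its field names are illustrative assumptions, not Schank's notation:

```python
from dataclasses import dataclass

# Illustrative sketch: a conceptualization as a primitive act plus its roles.
@dataclass
class Conceptualization:
    actor: str
    act: str                 # one of Schank's primitives (ATRANS, PTRANS, ...)
    tense: str               # Schank's markers: 'p' past, 'f' future, ...
    obj: str = None          # o: the relation object
    source: str = None       # R: from
    recipient: str = None    # R: to

# "I gave John a book"
gave = Conceptualization(actor="I", act="ATRANS", tense="p",
                         obj="book", source="I", recipient="John")
```

Because the structure is built from primitives, the Italian "Ho dato un libro a John" would map to exactly the same object, illustrating language independence.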

Scripts

● Script [Schank & Abelson, 1977]
  – A structure made up of a set of slots
  – Each slot may be associated with
    ● Information about the 'types' of values it may take
    ● Default values

Scripts

● Knowledge macrostructures: scripts and human memory
● Psychologists observed that

– Subjects remember details not present but compliant to the script (e.g., restaurant)

– The adopted script leads to focusing on different information (e.g., Thief vs Restaurant owner)

– Recall under the same or a different perspective: with a change of perspective, recall increases

– Adopted script affects understanding and recall

– Elements compliant with the script are more easily recalled
  ● E.g., PhD students recalling the objects in a professor's office: a tendency to infer the presence of unseen but script-compliant objects (e.g., books) and to forget little-relevant ones (e.g., an umbrella, the number of windows, the room orientation)

Scripts

● Knowledge macrostructures
● Scripts for social situations
  – E.g., going to a restaurant
    ● Sit down, look at the menu, order, eat, pay, exit
    ● (actions taken by 73% of subjects)

● What are scripts for?
  – Creating expectations, noticing deviations from the script
    ● E.g., he left a 100-Euro tip
  – Cognitive economy
    ● Not storing all new information


Scripts

● In reality, event sequences have a logic
  – Causal, temporal, etc.
  – Example of a causal chain:
    ● Entry Conditions cause Events taking place, which represent Conditions, which cause further Events, which in turn cause a Set of Results

● A causal chain among events may be represented through a script

Scripts

● Components
  – Entry conditions
    ● Conditions to be satisfied before the events described in the script can take place
  – Results
    ● Conditions that are true after the events described in the script took place
  – Props
    ● Objects involved in the events
  – Roles
    ● Subjects and individuals involved in the events
  – Trace
    ● A particular accepted variant of a more general scheme (script)
  – Scenes
    ● The actual sequences of events that take place

SCRIPT: Restaurant
TRACE: Trattoria
PROPS: Tables, Menu, F = Food, Check, Money
ROLES: S = Client, W = Waiter, C = Cook, M = Cashier, O = Owner

ENTRY CONDITIONS

S is hungry

S has money

RESULTS

S has less money

O has more money

S is not hungry

S is satisfied (optional)

What might be the primitive actions to represent a scene?

SCENE 1: Entering

S PTRANS S into restaurant

S ATTEND look tables

S MBUILD where to sit

S PTRANS S to table

S MOVE S sit

SCENE 2: Ordering

W PTRANS W to table

W ATRANS Menu to S

S PTRANS Menu to S

S MBUILD choice of F

S MTRANS signal to W

W PTRANS W to table

S MTRANS (want F) to W …

S MTRANS signals to W

SCENE 3: Eating

C ATRANS F to W

W ATRANS F to S

S INGEST F

SCENE 4: Exiting

W PTRANS W to S

W ATRANS bill to S

S ATRANS money to W

or . . .
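The Restaurant script above can be sketched as a plain data structure, which also shows how an active script answers questions about unobserved events (this encoding of events as tuples is an illustrative assumption):

```python
# Illustrative sketch: the Restaurant script as a data structure, used to fill in
# events that were not explicitly observed.
restaurant = {
    "entry_conditions": ["S is hungry", "S has money"],
    "scenes": {
        "Entering": [("S", "PTRANS", "S into restaurant"), ("S", "MOVE", "S sit")],
        "Ordering": [("S", "MTRANS", "(want F) to W")],
        "Eating":   [("W", "ATRANS", "F to S"), ("S", "INGEST", "F")],
        "Exiting":  [("W", "ATRANS", "bill to S"), ("S", "ATRANS", "money to W")],
    },
    "results": ["S has less money", "S is not hungry"],
}

def infer_events(script, observed):
    """Once the script is active, unobserved scene events are assumed to have occurred."""
    return [ev for events in script["scenes"].values()
               for ev in events if ev not in observed]

# "John ordered a steak ... asked for the bill and paid" -> "Did John eat?" -> yes:
observed = [("S", "MTRANS", "(want F) to W"), ("S", "ATRANS", "money to W")]
ate = ("S", "INGEST", "F") in infer_events(restaurant, observed)
```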

Scripts

● Two ways of activating a script
● 1. Temporary scripts
  – Mentioned as a reference, but not fundamental
    ● It is sufficient to store a pointer to the script, so as to be able to access it later if necessary
  – Example
    ● "While going to the museum, Susan passed by the restaurant. She liked the Picasso exhibition very much."
      – The script associated with the restaurant is not central: it might be activated later through the pointer


Scripts

● Two ways of activating a script
● 2. Non-temporary scripts
  – Central in the description, must be fully activated
    ● Useful to activate them completely, trying to fill the slots with observed instances
  – Script headers
    ● Preconditions, roles, events, places, props used to indicate that the script should be activated
    ● E.g., situations involving at least n elements of the script header
  – Example
    ● "Susan, while going to the museum, passed by a bar. She was hungry and went in. She saw a few tables and went there."
      – The presence of the trace Bar and of the precondition hungry is enough to activate the script Restaurant
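The header-based activation rule can be sketched as a simple threshold test; the header contents and the threshold n = 2 are illustrative assumptions:

```python
# Illustrative sketch: activating a non-temporary script when at least n
# elements of its header are observed.
header = {
    "trace":         {"restaurant", "bar", "trattoria"},
    "preconditions": {"hungry", "has money"},
    "props":         {"table", "menu"},
}

def should_activate(observed, header, n=2):
    # Fire the script if the observations mention at least n header elements
    elements = set().union(*header.values())
    return len(observed & elements) >= n

# Susan's story: trace 'bar' + precondition 'hungry' suffice
activated = should_activate({"bar", "hungry", "museum"}, header)
```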

Scripts

● Scripts allow predicting events not explicitly observed
● Example
  – (Suppose the script Restaurant has been activated)

– “Yesterday John went to the restaurant. He ordered a steak. Then he asked for the bill and paid. Finally he went home”.

– Question: “Did John eat yesterday?”

– Answer: "Yes"
  ● Even if the fact is not explicitly stated

Scripts

● Scripts provide a way of building a consistent interpretation of a set of observed facts
● Example

– “Yesterday John went out for lunch. After sitting in the place, he called the waiter. The waiter brought him a menu. John ordered his lunch”.

– Question: “Why did the waiter bring a menu?”

– The script provides two possible answers:● John asked for the menu● John had to decide what to eat

Scripts

● Scripts allow focusing on exceptional situations (unusual events), highlighting where the description of observed facts departs from the standard sequence of events
● When the script is 'broken', it can no longer be used to predict events
  – Example

● “John went to the restaurant, he sat at a table. He waited for the waiter for a long time. The waiter did not come. John stood up and went away.”

● We cannot say whether John consulted a menu or not

● Systems using scripts
  – SAM (Cullingford, 1981)

Further Readings

● N.J. Nilsson, "Artificial Intelligence"
  – Ch. 18
● E. Rich, K. Knight, "Artificial Intelligence", McGraw-Hill
  – Ch. 9-10
● Marvin Minsky, "A Framework for Representing Knowledge", MIT-AI Laboratory Memo 306, June 1974
  – Reprinted in P. Winston (Ed.), "The Psychology of Computer Vision", McGraw-Hill, 1975
  – Shorter versions in
    ● J. Haugeland (Ed.), "Mind Design", MIT Press, 1981
    ● Allan Collins and Edward E. Smith (Eds.), "Cognitive Science", Morgan-Kaufmann, 1992