
Query Relevance Feedback and Ontologies

How to Make Queries Better

Overview

• Ranked Retrieval

• Relevance Feedback

• The Semantic Web and Ontologies

Typical Web Retrieval Process

(Diagram: an information need is expressed as a keyword query; results are then explored by link following and “more like this” requests.)

Ranked Retrieval

How can we present the “best” item to the user first?

What are we trying to do in IR?

• Find the Document which is most similar to the query

• Ranking Interpretation:

– show the best (most similar) document first

– then the next best (most similar) document

– and so on

Bag of Words Model of Text

• Ignore the order of words in the document

• Just record whether a word appears in a document
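A minimal sketch (not from the slides) of the bag-of-words idea in Python: only the presence of each word is recorded, so two documents with the same words in a different order get the same representation. The function name is illustrative.

def bag_of_words(text):
    """Record which words appear in the text, ignoring order and frequency."""
    return set(text.lower().split())

doc_a = "the cat sat on the mat"
doc_b = "the mat sat on the cat"
print(bag_of_words(doc_a) == bag_of_words(doc_b))  # True: word order is ignored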

Similarity Measures

• Cosine Formula

• Measures how similar a document is to a query (or to another document)

See Kowalski Chapter 7
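The cosine formula itself is not reproduced on the slide; as a sketch, the cosine similarity of a query and a document is the dot product of their term-count vectors divided by the product of the vector lengths. The Python below is an illustrative implementation, not Kowalski's.

import math
from collections import Counter

def cosine_similarity(query, document):
    """Cosine of the angle between the term-count vectors of query and document."""
    q, d = Counter(query.lower().split()), Counter(document.lower().split())
    dot = sum(q[t] * d[t] for t in set(q) & set(d))
    norm = math.sqrt(sum(v * v for v in q.values())) * math.sqrt(sum(v * v for v in d.values()))
    return dot / norm if norm else 0.0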

Similarity as Ranking

• Use the Similarity Measure to rank the documents
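A small ranking sketch reusing the cosine_similarity function above; the document collection is invented for illustration.

documents = [
    "relevance feedback improves queries",
    "ontologies describe entities and relations",
    "ranked retrieval orders documents by similarity",
]
query = "ranked retrieval"

# Best (most similar) document first, then the next best, and so on.
for doc in sorted(documents, key=lambda d: cosine_similarity(query, d), reverse=True):
    print(round(cosine_similarity(query, doc), 3), doc)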

Relevance Feedback

“More like this”, done properly

Observation

• The user is probably in the best position to judge the relevance of a document

• Likewise the user is probably in the best position to judge which returned (highly ranked) documents are irrelevant

Retrieval Process

(Diagram: the information need is expressed as an analytic query; retrieved documents are then marked “more like this” or “no more like this” and the query is refined.)

Relevance Feedback in a Nutshell

• Perform an initial retrieval

• Ask the user to indicate which documents are relevant/irrelevant:

– add all terms from the relevant documents

– remove all terms from the irrelevant documents

– requery
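A hedged sketch of the add/remove step just described: this is the crude whole-term version from the slide rather than a weighted scheme such as Rocchio, and all names and example documents are invented.

def feedback_query(query, relevant_docs, irrelevant_docs):
    """Add all terms from relevant documents, remove all terms from irrelevant ones."""
    terms = set(query.lower().split())
    for doc in relevant_docs:
        terms |= set(doc.lower().split())   # add all terms from relevant documents
    for doc in irrelevant_docs:
        terms -= set(doc.lower().split())   # remove all terms from irrelevant documents
    return " ".join(sorted(terms))          # requery with the revised term set

new_query = feedback_query(
    "hot hatch",
    relevant_docs=["hot hatch performance car review"],
    irrelevant_docs=["hatchback boot capacity"],
)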

Variants

• Using ranking and weighting

• Pseudo-relevance feedback:

– use terms from all (highly ranked) retrieved documents

– assumes highly ranked documents are a homogeneous mass of relevant documents (Croft)

– very helpful if very few documents are retrieved

– but perpetuates errors/misunderstandings from the original query
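Under the same assumptions, pseudo-relevance feedback simply treats the top-k ranked documents as if the user had marked them relevant, with no negative set; a sketch:

def pseudo_relevance_feedback(query, ranked_docs, k=3):
    """Expand the query with terms from the top-k retrieved documents,
    assuming they are all relevant; no user judgement is involved."""
    terms = set(query.lower().split())
    for doc in ranked_docs[:k]:
        terms |= set(doc.lower().split())
    return " ".join(sorted(terms))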

Exercise

• What are advantages of positive feedback ?

• What are advantages of negative feedback ?

• Which is best ?

Relevance Feedback Conclusion

• Consistently proven an effective way to improve retrieval

• Biggest problem is getting users to engage in the interaction, especially if no highly relevant documents are in the initially retrieved set

Ontologies

The Semantic Web

• Introduced by Tim Berners-Lee and others in 2001

– http://www.sciam.com/article.cfm?articleID=00048144-10D2-1C70-84A9809EC588EF21

• Essentially about allowing computers and people to share the same world

• Central to the communication is the notion of an Ontology

Ontology Definition

• To standardize semantic terms, many areas use specific ontologies, which are hierarchical taxonomies of terms describing certain knowledge topics (Baeza-Yates & Ribeiro-Neto, 1999, p143).

• Thesauri: Ontologies for Information Retrieval.

• Entities, Relations.

Ontology Example

(Diagram: a small ontology centred on “Car”, also known as “Automobile”. “Drop head coupe” and “Hot hatch” are sorts of car; “Engine”, “Wheels” and “Seat” are its parts.)

Improving Recall and/or Precision

• If you get too few documents:

– use more general terms in the query, e.g. “automobile” instead of “drop head coupe”

– use an alternative term which is more common, e.g. “car” rather than “automobile”

• If you get too many documents (overall):

– use a more specific term, e.g. “hot hatch” rather than “car”
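As a minimal sketch of this idea, the car ontology above can be held as a toy thesaurus and used to broaden or narrow a query term; the dictionary and function below are invented for illustration.

# Toy thesaurus from the car example: broader/narrower terms per concept.
THESAURUS = {
    "drop head coupe": {"broader": "automobile"},
    "hot hatch": {"broader": "car"},
    "automobile": {"synonym": "car", "narrower": "drop head coupe"},
    "car": {"synonym": "automobile", "narrower": "hot hatch"},
}

def adjust_term(term, too_few_results):
    """Broaden the term if too few documents came back, narrow it if too many."""
    key = "broader" if too_few_results else "narrower"
    return THESAURUS.get(term, {}).get(key, term)

print(adjust_term("drop head coupe", too_few_results=True))   # -> automobile
print(adjust_term("car", too_few_results=False))              # -> hot hatch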

Issues

• How are thesauri different from ontologies?

– Are we representing the world or words?

– Is WordNet an ontology?

• Are ontologies meant to be:

– general

– universal

– for a specific purpose?

Thesauri

• Provide a map of a given field of knowledge: concepts, relations.

• Provide a standard vocabulary for consistent indexing.

• Assist users with locating terms for proper query formulation.

• Ensure only one term from a synonym set is used for indexing and searching: otherwise a searcher who uses one synonym and retrieves some useful documents may think the correct term has been used and the search has been exhaustive, without knowing that there are other useful documents under other synonyms.

• Provide classified hierarchies for broadening or narrowing a search if too many or too few documents are retrieved.

• Retrieval based on concepts rather than words (Baeza-Yates & Ribeiro-Neto, 1999).

WordNet Relations

• Examples are:

• Synonyms e.g. couch / sofa / lounge

• Antonyms e.g. love / hate

• Hypernyms (broader term), e.g. “cat” is a hypernym of “tabby”

• Hyponyms (narrower term), e.g. “cat” is a hyponym of “animal”

• Meronym (part-of) e.g. finger / hand

• Meronym (made-of) e.g. snowflake / snow
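These relations can be explored programmatically; the sketch below assumes NLTK with its WordNet corpus downloaded (nltk.download("wordnet")), and the exact synsets returned may differ.

from nltk.corpus import wordnet as wn

cat = wn.synsets("cat")[0]                  # first sense of "cat"
print([l.name() for l in cat.lemmas()])     # synonyms in this sense
print(cat.hypernyms())                      # broader concepts
print(cat.hyponyms())                       # narrower concepts
hand = wn.synsets("hand")[0]
print(hand.part_meronyms())                 # parts of a hand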

WordNet Demos

• See vancouver-webpages.com/wordnet

• See marimba.d.umn.edu/cgi-bin/similarity.cgi

Conclusions

• Ranked Retrieval – similarity matching

• Relevance Feedback – positive and negative feedback

• The Semantic Web and Ontologies