
Algorithmic Information Theory, Similarity Metrics and Google

Varun Rao

Algorithmic Information Theory

• Kolmogorov Complexity

• Information Distance

• Normalized Information Distance

• Normalized Compression Distance

• Normalized Google Distance

Kolmogorov Complexity

• The Kolmogorov complexity of a string x is the length, in bits, of the shortest computer program, for a fixed reference computing system, that produces x as output [1]

• First million bits of π vs. first million bits of your favourite song recording: the former has a very short generating program (see the sketch below), the latter almost certainly does not
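
To make the π half of that example concrete, here is a minimal Python sketch (not from the deck) of Gibbons' unbounded spigot algorithm: a program of a few hundred bits that emits as many digits of π as you like, which is exactly why the Kolmogorov complexity of the first million digits is tiny.

```python
def pi_digits(n):
    """Gibbons' unbounded spigot: return the first n decimal digits of pi.

    The whole generator is ~10 lines, so the shortest program for the
    first million digits is at most this long -- a song recording has
    no comparably short description.
    """
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    digits = []
    while len(digits) < n:
        if 4 * q + r - t < m * t:
            digits.append(m)  # next digit is settled; shift the state
            q, r, m = 10 * q, 10 * (r - m * t), (10 * (3 * q + r)) // t - 10 * m
        else:                 # otherwise consume another term of the series
            q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x, k + 1,
                                (q * (7 * k + 2) + r * x) // (t * x), x + 2)
    return digits

print(pi_digits(20))  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3, 2, 3, 8, 4]
```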

Information Distance

• Given two strings x & y, Information Distance is the length of the shortest binary program that computes output y from input x, and also output x from input y [1]

• ID minorizes every other computable distance metric: up to an additive term, no admissible computable distance is smaller
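
Written out, per [3] and [4], information distance is (up to an additive logarithmic term) the larger of the two conditional Kolmogorov complexities:

```latex
E(x, y) = \max\{\, K(x \mid y),\ K(y \mid x) \,\}
```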

Normalized Information Distance

• Roughly speaking, two objects are deemed close if we can significantly “compress” one given the information in the other, the idea being that if two pieces are more similar, then we can more succinctly describe one given the other [2]

• NID is characterized as the most informative metric

• Sadly, completely and utterly uncomputable, because Kolmogorov complexity itself is uncomputable
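
For reference, the formula from [4]: normalizing by the larger of the two plain complexities squeezes the distance into [0, 1], but it inherits the uncomputability of K:

```latex
\mathrm{NID}(x, y) = \frac{\max\{\, K(x \mid y),\ K(y \mid x) \,\}}{\max\{\, K(x),\ K(y) \,\}}
```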

Normalized Compression Distance

• But we do have real (lossless) compressors, which give computable upper bounds on Kolmogorov complexity

• If C is a compressor, and C(x) is the compressed length of a string x, then

NCD(x, y) = ( C(xy) - min{C(x), C(y)} ) / max{C(x), C(y)}

• NCD gets closer to NID as the compressor approximates the ultimate compression, Kolmogorov complexity

Normalized Compression Distance II

• Basic process to compute NCD for x & y:

– Use the compressor to compute C(x) and C(y)

– Concatenate x and y and compute C(xy)

– Plug the three lengths into the formula above

• Use relatively simple clustering methods, with NCD as the similarity metric, to group strings (see the sketch below)
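
A minimal sketch of this recipe in Python, using bzip2 as the compressor C (the helper names are mine, not from the deck):

```python
import bz2
import os

def C(data: bytes) -> int:
    """Compressed length in bytes: a computable stand-in for K."""
    return len(bz2.compress(data))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: ~0 for near-identical strings,
    ~1 for unrelated ones (real compressors can slightly exceed 1)."""
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog" * 50
b = b"the quick brown fox leaps over the lazy dog" * 50
noise = os.urandom(len(a))   # incompressible bytes, for contrast
print(ncd(a, b))             # small: b is cheap to describe given a
print(ncd(a, noise))         # near 1: nothing shared
```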

Normalized Compression Distance III

• Using Bzip2 on various types of files

Normalized Compression Distance IV

• The evolutionary tree built from complete mammalian mtDNA sequences of 24 species [2]

Normalized Compression Distance V

• Clustering of Native-American, Native-African, and Native-European languages (translations of The Universal Declaration of Human Rights) [2]

Normalized Compression Distance VI

• Optical Character Recognition using NCD. More complex clustering techniques achieved an 85% success rate, as opposed to the industry standard of 90%-95% [2]

What about Semantic Meaning?

• Or what about how different a horse is from a car, or a hawk from a handsaw for that matter?

• Compressors are semantically indifferent to their data

• To insert semantic relationships, turn to Google

Google

• Massive database, containing lots of information about semantic relationships

• The Quick Brown ___ ?

• Use simple page counts as indicators of closeness

• Use the relative number of hits as a measure of probability, creating a Google Distribution: p(x) = (hits for a search of x) / (total number of pages indexed)
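
Roughly, in the notation of [1] (the paper normalizes more carefully, since one page can contain many terms):

```latex
g(x) = \frac{f(x)}{N}, \qquad g(x, y) = \frac{f(x, y)}{N}
```

where f(x) is the number of pages containing the term x, f(x, y) the number containing both x and y, and N the total number of pages indexed.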

Google II

• Given that we can construct a distribution, we can construct a Google Shannon-Fano code (conceptually), because we can apply the Kraft inequality (after some normalization)

.... ???
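
The two standard facts being waved at: any prefix code with word lengths l(x) satisfies the Kraft inequality, and a Shannon-Fano code spends about log 1/p(x) bits on an outcome of probability p(x):

```latex
\sum_{x} 2^{-\ell(x)} \le 1, \qquad \ell(x) = \left\lceil \log_2 \frac{1}{p(x)} \right\rceil
```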

Normalized Google Distance

• After all that hand waving, we can create a distance-like metric, NGD, that has all kinds of nice properties
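
Concretely, the formula from [1], with f(x), f(y), f(x, y) the page counts from the previous slides and N the total number of pages indexed:

```latex
\mathrm{NGD}(x, y) = \frac{\max\{\log f(x),\ \log f(y)\} - \log f(x, y)}{\log N - \min\{\log f(x),\ \log f(y)\}}
```

A direct transcription in Python; the caller must supply the hit counts (e.g. from a search engine's reported page totals), since the slides do not fix an API:

```python
from math import log

def ngd(fx: float, fy: float, fxy: float, n: float) -> float:
    """NGD from raw page counts: fx = hits for x, fy = hits for y,
    fxy = hits for the conjunctive query "x y", n = total pages indexed."""
    return ((max(log(fx), log(fy)) - log(fxy)) /
            (log(n) - min(log(fx), log(fy))))

# Toy numbers, purely illustrative:
print(ngd(fx=1e6, fy=2e6, fxy=5e5, n=8e9))
```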

Applying NGD

• NGD as applied to 15 painting names by 3 Dutch artists

Applying NGD II

• Using SVM to learn the concept of primes [2]
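
A skeleton of how such an experiment is assembled (following the setup in [1]): represent each word by its vector of NGD values against a fixed list of anchor words, then train an SVM on labelled examples. The anchors, training words, and stubbed feature values below are illustrative placeholders, not real Google statistics:

```python
import zlib
import numpy as np
from sklearn.svm import SVC

anchors = ["composite", "number", "orange", "prime"]          # hypothetical anchors
train = ["2", "3", "5", "7", "11", "4", "6", "8", "9", "10"]  # hypothetical examples
labels = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]                       # 1 = prime

def ngd_features(word: str) -> np.ndarray:
    """Placeholder for [NGD(word, a) for a in anchors]. In the real
    experiment these come from Google page counts; here they are faked
    deterministically just so the pipeline runs end to end."""
    rng = np.random.default_rng(zlib.crc32(word.encode()))
    return rng.random(len(anchors))

X = np.array([ngd_features(w) for w in train])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict([ngd_features("13")]))  # output is meaningless with fake features
```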

Applying NGD III

• Using SVM to learn “electrical” terms [2]

Applying NGD IV

• Using SVM to learn “religious” terms [2]

Sources

1. R. Cilibrasi and P. Vitányi, “Automatic Meaning Discovery Using Google”. http://www.arxiv.org/pdf/cs.CL/0412098

2. R. Cilibrasi and P. Vitányi, “Clustering by Compression”, submitted to IEEE Trans. Information Theory. http://www.arxiv.org/abs/cs.CV/0312044

3. C.H. Bennett, P. Gács, M. Li, P.M.B. Vitányi, W. Zurek, “Information Distance”, IEEE Trans. Information Theory, 44:4(1998), 1407–1423.

4. M. Li, X. Chen, X. Li, B. Ma, P. Vitányi, “The Similarity Metric”, IEEE Trans. Information Theory, 50:12(2004), 3250–3264.

5. “Algorithmic Information Theory”, Wikipedia, accessed 25th January 2005. http://en.wikipedia.org/wiki/Algorithmic_information_theory

6. Greg Harfst, “Kolmogorov Complexity”, accessed 25th January 2005. http://nms.lcs.mit.edu/~gch/kolmogorov.html