Talk 2: Graph Mining Tools - SVD, ranking, proximity Christos Faloutsos CMU.


Talk 2: Graph Mining Tools - SVD, ranking, proximity

Christos Faloutsos

CMU

Lipari 2010 (C) 2010, C. Faloutsos 2

Outline
• Introduction – Motivation
• Task 1: Node importance
• Task 2: Recommendations
• Task 3: Connection sub-graphs
• Conclusions


Node importance - Motivation:

• Given a graph (e.g., the web pages containing the desired query word)

• Q: Which node is the most important?


Node importance - Motivation:

• Given a graph (e.g., the web pages containing the desired query word)

• Q: Which node is the most important?

• A1: HITS (SVD = Singular Value Decomposition)

• A2: eigenvector (PageRank)


Node importance - motivation

• SVD and eigenvector analysis: very closely related


SVD - Detailed outline

• Motivation
• Definition - properties
• Interpretation
• Complexity
• Case studies


SVD - Motivation

• problem #1: text - LSI: find ‘concepts’
• problem #2: compression / dim. reduction



SVD - Motivation

• Customer-product matrix, for a recommendation system (rows: customers; columns: products; vegetarians buy bread/lettuce/tomatoes, meat eaters buy beef/chicken):

bread lettuce tomatoes beef chicken
1 1 1 0 0
2 2 2 0 0
1 1 1 0 0
5 5 5 0 0
0 0 0 2 2
0 0 0 3 3
0 0 0 1 1

SVD - Motivation

• problem #2: compress / reduce dimensionality


Problem - specs

• ~10**6 rows; ~10**3 columns; no updates;

• random access to any cell(s) ; small error: OK



SVD - Detailed outline

• Motivation
• Definition - properties
• Interpretation
• Complexity
• Case studies
• Additional properties


SVD - Definition

(reminder: matrix multiplication)

    1 2        1      -1
    3 4   x   -1  =   -1
    5 6               -1

  (3 x 2)  (2 x 1)  (3 x 1)

SVD - Definition

A[n x m] = U[n x r] Λ[r x r] (V[m x r])T

• A: n x m matrix (e.g., n documents, m terms)
• U: n x r matrix (n documents, r concepts)
• Λ: r x r diagonal matrix (‘strength’ of each concept) (r: rank of the matrix)
• V: m x r matrix (m terms, r concepts)


SVD - Definition

• A = U Λ VT - example:


SVD - Properties

THEOREM [Press+92]: it is always possible to decompose a matrix A into A = U Λ VT, where

• U, V: unique (*)
• U, V: column-orthonormal (i.e., columns are unit vectors, orthogonal to each other)
  – UT U = I; VT V = I (I: identity matrix)
• Λ: diagonal, with the singular values positive and sorted in decreasing order


SVD - Example

• A = U Λ VT - example:

A (7 documents x 5 terms; terms: data, inf., retrieval, brain, lung; rows 1-4: CS documents, rows 5-7: MD documents):

1 1 1 0 0
2 2 2 0 0
1 1 1 0 0
5 5 5 0 0
0 0 0 2 2
0 0 0 3 3
0 0 0 1 1

U (CS rows load on column 1, MD rows on column 2):

0.18 0
0.36 0
0.18 0
0.90 0
0    0.53
0    0.80
0    0.27

Λ:

9.64 0
0    5.29

VT:

0.58 0.58 0.58 0    0
0    0    0    0.71 0.71

SVD - Example

• A = U Λ VT - example: the two concepts are the CS-concept and the MD-concept (first and second columns of U and of V).

SVD - Example

• A = U Λ VT - example: U is the doc-to-concept similarity matrix.

SVD - Example

• A = U Λ VT - example: 9.64 is the ‘strength’ of the CS-concept.

SVD - Example

• A = U Λ VT - example: V is the term-to-concept similarity matrix.


SVD - Detailed outline

• Motivation
• Definition - properties
• Interpretation
• Complexity
• Case studies
• Additional properties


SVD - Interpretation #1

‘documents’, ‘terms’ and ‘concepts’:
• U: document-to-concept similarity matrix
• V: term-to-concept sim. matrix
• Λ: its diagonal elements: ‘strength’ of each concept


SVD – Interpretation #1

‘documents’, ‘terms’ and ‘concepts’:

Q: if A is the document-to-term matrix, what is AT A?
A:
Q: A AT?
A:


SVD – Interpretation #1

‘documents’, ‘terms’ and ‘concepts’:

Q: if A is the document-to-term matrix, what is AT A?
A: term-to-term ([m x m]) similarity matrix
Q: A AT?
A: document-to-document ([n x n]) similarity matrix


SVD properties

• V: the eigenvectors of the covariance matrix AT A

• U: the eigenvectors of the Gram (inner-product) matrix A AT

Further reading:
1. Ian T. Jolliffe, Principal Component Analysis (2nd ed), Springer, 2002.
2. Gilbert Strang, Linear Algebra and Its Applications (4th ed), Brooks Cole, 2005.
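The two bullets above can be checked numerically. A minimal numpy sketch (the 5x5 matrix is a made-up toy example, not from the slides):

```python
import numpy as np

# Toy matrix (hypothetical values; rank 2 by construction)
A = np.array([[1., 1., 1., 0., 0.],
              [2., 2., 2., 0., 0.],
              [5., 5., 5., 0., 0.],
              [0., 0., 0., 2., 2.],
              [0., 0., 0., 3., 3.]])

# SVD: A = U diag(s) Vt, singular values sorted in decreasing order
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Eigen-decomposition of the Gram matrix A^T A (eigh returns ascending order)
evals, evecs = np.linalg.eigh(A.T @ A)
evals, evecs = evals[::-1], evecs[:, ::-1]

# singular values squared == eigenvalues of A^T A
assert np.allclose(s**2, evals, atol=1e-8)

# columns of V match the eigenvectors of A^T A (up to sign), for the
# two nonzero (non-degenerate) eigenvalues
for i in range(2):
    assert np.isclose(abs(Vt[i] @ evecs[:, i]), 1.0)
```

The same check with A A^T recovers the columns of U.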


SVD - Interpretation #2

• best axis to project on: (‘best’ = min sum of squares of projection errors)



SVD - interpretation #2

• minimum RMS error: SVD gives the best axis to project on
• v1: the first singular vector



SVD - Interpretation #2

• A = U Λ VT - example: the first row of VT is v1, the best projection axis.

SVD - Interpretation #2

• A = U Λ VT - example: λ1 = 9.64 gives the variance (‘spread’) on the v1 axis.

SVD - Interpretation #2

• A = U Λ VT - example:
  – U gives the coordinates of the points in the projection axis

SVD - Interpretation #2

• More details
• Q: how exactly is dim. reduction done?

SVD - Interpretation #2

• More details
• Q: how exactly is dim. reduction done?
• A: set the smallest singular values to zero:

SVD - Interpretation #2

Setting the smallest singular value λ2 = 5.29 to zero:

A ~ U x diag(9.64, 0) x VT


SVD - Interpretation #2

Dropping the zeroed concept entirely:

A ~
0.18
0.36
0.18
0.90
0
0
0
x 9.64 x [0.58 0.58 0.58 0 0]

SVD - Interpretation #2

The resulting rank-1 approximation:

A ~
1 1 1 0 0
2 2 2 0 0
1 1 1 0 0
5 5 5 0 0
0 0 0 0 0
0 0 0 0 0
0 0 0 0 0
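The truncation above can be reproduced in a few lines. A minimal numpy sketch of the 7x5 running example (variable names are mine):

```python
import numpy as np

# The 7x5 document-term matrix from the example (4 CS docs, 3 MD docs)
A = np.array([[1, 1, 1, 0, 0],
              [2, 2, 2, 0, 0],
              [1, 1, 1, 0, 0],
              [5, 5, 5, 0, 0],
              [0, 0, 0, 2, 2],
              [0, 0, 0, 3, 3],
              [0, 0, 0, 1, 1]], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(np.round(s[:2], 2))            # ~ [9.64 5.29], as on the slides

# set the smallest singular value to zero -> rank-1 approximation
A1 = s[0] * np.outer(U[:, 0], Vt[0])
print(np.round(A1, 1))               # the CS block survives, the MD block is wiped out
```

Because the CS and MD blocks are orthogonal here, the rank-1 term reproduces the CS block exactly.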


SVD - Interpretation #2

Exactly equivalent: the ‘spectral decomposition’ of the matrix A = U Λ VT (same example).

SVD - Interpretation #2

Exactly equivalent: the ‘spectral decomposition’ of the matrix:

A = [u1 u2] x diag(λ1, λ2) x [v1 v2]T

SVD - Interpretation #2

Exactly equivalent: the ‘spectral decomposition’ of the n x m matrix:

A = λ1 u1 v1T + λ2 u2 v2T + ...

SVD - Interpretation #2

Exactly equivalent: the ‘spectral decomposition’ of the matrix:

A = λ1 u1 v1T + λ2 u2 v2T + ... (r terms; each ui: n x 1, each viT: 1 x m)

SVD - Interpretation #2

approximation / dim. reduction: by keeping the first few terms (Q: how many?)

A ~ λ1 u1 v1T + λ2 u2 v2T + ...

assume: λ1 >= λ2 >= ...

SVD - Interpretation #2

A (heuristic - [Fukunaga]): keep 80-90% of the ‘energy’ (= sum of squares of the λi’s)

A ~ λ1 u1 v1T + λ2 u2 v2T + ...

assume: λ1 >= λ2 >= ...
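The energy heuristic is easy to state as code. A minimal numpy sketch (the function name and thresholds are my own choices):

```python
import numpy as np

def rank_for_energy(s, threshold):
    """Smallest k such that the first k singular values keep at least
    `threshold` of the 'energy' (sum of squared singular values)."""
    energy = np.cumsum(s**2) / np.sum(s**2)
    return int(np.searchsorted(energy, threshold)) + 1

s = np.array([9.64, 5.29])           # singular values of the slide example
# the first concept alone carries 9.64^2 / (9.64^2 + 5.29^2) ~ 77% of the energy
print(rank_for_energy(s, 0.75))      # -> 1
print(rank_for_energy(s, 0.90))      # -> 2
```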


SVD - Detailed outline

• Motivation
• Definition - properties
• Interpretation
  – #1: documents/terms/concepts
  – #2: dim. reduction
  – #3: picking non-zero, rectangular ‘blobs’
• Complexity
• Case studies
• Additional properties


SVD - Interpretation #3

• finds non-zero ‘blobs’ in a data matrix (here: the CS block, rows 1-4 x columns 1-3, and the MD block, rows 5-7 x columns 4-5)


SVD - Interpretation #3

• finds non-zero ‘blobs’ in a data matrix =
• ‘communities’ (bi-partite cores, here: rows 1-4 with columns 1-3, and rows 5-7 with columns 4-5)

SVD - Detailed outline

• Motivation
• Definition - properties
• Interpretation
• Complexity
• Case studies
• Additional properties


SVD - Complexity

• O(n * m * m) or O(n * n * m) (whichever is less)
• less work, if we just want the singular values
• or if we want the first k singular vectors
• or if the matrix is sparse [Berry]
• Implemented in any linear algebra package (LINPACK, matlab, Splus, mathematica, ...)


SVD - conclusions so far

• SVD: A = U Λ VT: unique (*)
• U: document-to-concept similarities
• V: term-to-concept similarities
• Λ: strength of each concept
• dim. reduction: keep the first few strongest singular values (80-90% of ‘energy’)
  – SVD: picks up linear correlations
• SVD: picks up non-zero ‘blobs’


SVD - Detailed outline

• Motivation

• Definition - properties

• Interpretation

• Complexity

• SVD properties

• Case studies

• Conclusions


SVD - Other properties - summary

• can produce orthogonal basis (obvious) (who cares?)

• can solve over- and under-determined linear problems (see C(1) property)

• can compute ‘fixed points’ (= ‘steady state prob. in Markov chains’) (see C(4) property)


SVD -outline of properties

• (A): obvious
• (B): less obvious
• (C): least obvious (and most powerful!)


Properties - by defn.:

A(0): A[n x m] = U[n x r] Λ[r x r] VT[r x m]

A(1): UT[r x n] U[n x r] = I[r x r] (identity matrix)

A(2): VT[r x m] V[m x r] = I[r x r]

A(3): Λ^k = diag(λ1^k, λ2^k, ..., λr^k) (k: ANY real number)

A(4): AT = V Λ UT


Less obvious properties

A(0): A[n x m] = U[n x r] Λ[r x r] VT[r x m]

B(1): A[n x m] (AT)[m x n] = ??


Less obvious properties

A(0): A[n x m] = U[n x r] Λ[r x r] VT[r x m]

B(1): A[n x m] (AT)[m x n] = U Λ^2 UT

symmetric; Intuition?


Less obvious properties

A(0): A[n x m] = U[n x r] Λ[r x r] VT[r x m]

B(1): A[n x m] (AT)[m x n] = U Λ^2 UT

symmetric; Intuition? ‘document-to-document’ similarity matrix

B(2): symmetrically, for ‘V’: (AT)[m x n] A[n x m] = V Λ^2 VT. Intuition?


Less obvious properties

A: term-to-term similarity matrix

B(3): ((AT)[m x n] A[n x m])^k = V Λ^(2k) VT

and

B(4): (AT A)^k ~ λ1^(2k) v1 v1T, for k >> 1

where
v1: [m x 1] first column (singular vector) of V
λ1: strongest singular value


Less obvious properties

B(4): (AT A)^k ~ λ1^(2k) v1 v1T, for k >> 1

B(5): (AT A)^k v’ ~ (constant) v1

i.e., for (almost) any v’, it converges to a vector parallel to v1

Thus, useful to compute the first singular vector/value (as well as the next ones, too...)
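Property B(5) is exactly the power method. A minimal numpy sketch (the iteration count and random starting vector are arbitrary choices of mine):

```python
import numpy as np

def first_singular_pair(A, iters=100, seed=0):
    """Repeatedly apply A^T A to a random v' (property B(5)); the iterate
    converges to v1, and ||A v1|| recovers the strongest singular value."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[1])
    for _ in range(iters):
        v = A.T @ (A @ v)            # one application of (A^T A)
        v /= np.linalg.norm(v)       # renormalize to avoid overflow
    return v, np.linalg.norm(A @ v)

# the 7x5 document-term matrix of the running example
A = np.array([[1, 1, 1, 0, 0], [2, 2, 2, 0, 0], [1, 1, 1, 0, 0],
              [5, 5, 5, 0, 0], [0, 0, 0, 2, 2], [0, 0, 0, 3, 3],
              [0, 0, 0, 1, 1]], dtype=float)

v1, sigma1 = first_singular_pair(A)
print(np.round(np.abs(v1), 2), round(sigma1, 2))   # ~ [0.58 0.58 0.58 0 0], 9.64
```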


Less obvious properties - repeated:

A(0): A[n x m] = U[n x r] Λ[r x r] VT[r x m]

B(1): A[n x m] (AT)[m x n] = U Λ^2 UT

B(2): (AT)[m x n] A[n x m] = V Λ^2 VT

B(3): ((AT)[m x n] A[n x m])^k = V Λ^(2k) VT

B(4): (AT A)^k ~ λ1^(2k) v1 v1T

B(5): (AT A)^k v’ ~ (constant) v1


Least obvious properties - cont’d

A(0): A[n x m] = U[n x r] Λ[r x r] VT[r x m]

C(2): A[n x m] v1[m x 1] = λ1 u1[n x 1]
where v1, u1: the first (column) vectors of V, U (v1 == right singular vector)

C(3): symmetrically: u1T A = λ1 v1T
u1 == left singular vector

Therefore:


Least obvious properties - cont’d

A(0): A[n x m] = U[n x r] Λ[r x r] VT[r x m]

C(4): AT A v1 = λ1^2 v1

(fixed point - the defn of eigenvector for a symmetric matrix)


Least obvious properties - altogether

A(0): A[n x m] = U[n x r] Λ[r x r] VT[r x m]

C(1): if A[n x m] x[m x 1] = b[n x 1]
then x0 = V Λ^(-1) UT b: shortest, actual or least-squares solution

C(2): A[n x m] v1[m x 1] = λ1 u1[n x 1]

C(3): u1T A = λ1 v1T

C(4): AT A v1 = λ1^2 v1
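Property C(1) is the SVD route to least-squares. A minimal numpy sketch (the 4x2 system is a made-up example):

```python
import numpy as np

# over-determined system: 4 equations, 2 unknowns (hypothetical data)
A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.],
              [1., 3.]])
b = np.array([1., 3., 5., 7.])             # consistent: b = A @ [1, 2]

U, s, Vt = np.linalg.svd(A, full_matrices=False)
x0 = Vt.T @ ((U.T @ b) / s)                # x0 = V Λ^(-1) U^T b, property C(1)

print(np.round(x0, 6))                     # -> [1. 2.]
```

With an inconsistent b, the same formula returns the least-squares solution; with a rank-deficient A, dropping the zero singular values gives the shortest solution.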


Properties - conclusions

A(0): A[n x m] = U[n x r] Λ[r x r] VT[r x m]

B(5): (AT A)^k v’ ~ (constant) v1

C(1): if A[n x m] x[m x 1] = b[n x 1]
then x0 = V Λ^(-1) UT b: shortest, actual or least-squares solution

C(4): AT A v1 = λ1^2 v1


SVD - detailed outline

• ...
• SVD properties
• case studies
  – Kleinberg’s algorithm
  – Google’s algorithm
• Conclusions


Kleinberg’s algo (HITS)

Kleinberg, Jon (1998). Authoritative sources in a hyperlinked environment. Proc. 9th ACM-SIAM Symposium on Discrete Algorithms.


Recall: problem dfn

• Given a graph (e.g., the web pages containing the desired query word)

• Q: Which node is the most important?


Kleinberg’s algorithm

• Problem dfn: given the web and a query
• find the most ‘authoritative’ web pages for this query

Step 0: find all pages containing the query terms
Step 1: expand by one move forward and backward


Kleinberg’s algorithm

• Step 1: expand by one move forward and backward


Kleinberg’s algorithm

• on the resulting graph, give a high score (= ‘authorities’) to nodes that many important nodes point to

• give a high importance score (‘hubs’) to nodes that point to good ‘authorities’

(figure: hubs on the left, pointing to authorities on the right)


Kleinberg’s algorithm

observations

• recursive definition!

• each node (say, ‘i’-th node) has both an authoritativeness score ai and a hubness score hi


Kleinberg’s algorithm

Let E be the set of edges and A be the adjacency matrix: the (i,j) entry is 1 if the edge from i to j exists.

Let h and a be [n x 1] vectors with the ‘hubness’ and ‘authoritativeness’ scores.

Then:


Kleinberg’s algorithm

Then:

ai = hk + hl + hm

(figure: nodes k, l, m point to node i)

that is:

ai = Sum (hj), over all j such that the edge (j,i) exists

or

a = AT h


Kleinberg’s algorithm

symmetrically, for the ‘hubness’:

hi = an + ap + aq

(figure: node i points to nodes n, p, q)

that is:

hi = Sum (aj), over all j such that the edge (i,j) exists

or

h = A a


Kleinberg’s algorithm

In conclusion, we want vectors h and a such that:

h = A a
a = AT h

Recall properties:
C(2): A[n x m] v1[m x 1] = λ1 u1[n x 1]
C(3): u1T A = λ1 v1T


Kleinberg’s algorithm

In short, the solutions to

h = A a
a = AT h

are the left- and right-singular vectors of the adjacency matrix A.

Starting from a random a’ and iterating, we’ll eventually converge

(Q: to which of all the singular vectors? why?)
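The iteration can be sketched directly. A minimal numpy HITS sketch (the 4-node graph is a made-up example: nodes 0 and 1 point to nodes 2 and 3):

```python
import numpy as np

def hits(A, iters=50):
    """Iterate h = A a, a = A^T h, normalizing each step; this converges to
    the strongest left/right singular vectors of the adjacency matrix A."""
    a = np.ones(A.shape[1]) / np.sqrt(A.shape[1])
    for _ in range(iters):
        h = A @ a
        h /= np.linalg.norm(h)
        a = A.T @ h
        a /= np.linalg.norm(a)
    return h, a

# hypothetical tiny web graph: 0 and 1 are hubs, 2 and 3 are authorities
A = np.array([[0, 0, 1, 1],
              [0, 0, 1, 1],
              [0, 0, 0, 0],
              [0, 0, 0, 0]], dtype=float)

h, a = hits(A)
print(np.round(h, 2))   # hubs        ~ [0.71 0.71 0 0]
print(np.round(a, 2))   # authorities ~ [0 0 0.71 0.71]
```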


Kleinberg’s algorithm

(Q: to which of all the singular vectors? why?)

A: to those of the strongest singular value, because of property B(5):

B(5): (AT A)^k v’ ~ (constant) v1


Kleinberg’s algorithm - results

Eg., for the query ‘java’:

0.328 www.gamelan.com

0.251 java.sun.com

0.190 www.digitalfocus.com (“the java developer”)


Kleinberg’s algorithm - discussion

• ‘authority’ score can be used to find ‘similar pages’ (how?)


SVD - detailed outline

• ...
• Complexity
• SVD properties
• Case studies
  – Kleinberg’s algorithm (HITS)
  – Google’s algorithm
• Conclusions


PageRank (google)

•Brin, Sergey and Lawrence Page (1998). Anatomy of a Large-Scale Hypertextual Web Search Engine. 7th Intl World Wide Web Conf.



Problem: PageRank

Given a directed graph, find its most interesting/central node

A node is important, if it is connected with important nodes (recursive, but OK!)


Problem: PageRank - solution

Given a directed graph, find its most interesting/central node

Proposed solution: random walk; spot the most ‘popular’ node (-> steady-state probability (ssp))

A node has a high ssp, if it is connected with high-ssp nodes (recursive, but OK!)


(Simplified) PageRank algorithm

• Let A be the adjacency matrix;

• let B be the transition matrix: transpose, column-normalized - then B p = p

(figure: a 5-node example graph, its ‘To-From’ transition matrix B with entries 1, 1/2, ..., and the ranking vector p = (p1, ..., p5))


(Simplified) PageRank algorithm

• B p = p



Definitions

A: adjacency matrix (from-to)

D: degree matrix = diag(d1, d2, ..., dn)

B: transition matrix (to-from, column-normalized): B = AT D^(-1)


(Simplified) PageRank algorithm

• B p = 1 * p

• thus, p is the eigenvector that corresponds to the highest eigenvalue (= 1, since the matrix is column-normalized)

• Why does such a p exist?
  – p exists if B is n x n, nonnegative, irreducible [Perron–Frobenius theorem]


(Simplified) PageRank algorithm

• In short: imagine a particle randomly moving along the edges

• compute its steady-state probabilities (ssp)

Full version of algo: with occasional random jumps

Why? To make the matrix irreducible


Full Algorithm

• With probability 1-c, fly out to a random node

• Then, we have:
  p = c B p + (1-c)/n 1  =>
  p = (1-c)/n [I - c B]^(-1) 1
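The closed form can be evaluated directly on a toy graph. A minimal numpy sketch (the 4-node graph and c = 0.85 are my own choices):

```python
import numpy as np

# hypothetical directed graph; A[i, j] = 1 iff edge i -> j
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

B = A.T / A.sum(axis=1)            # B = A^T D^(-1): to-from, column-normalized
c, n = 0.85, A.shape[0]

# p = (1-c)/n [I - c B]^(-1) 1
p = (1 - c) / n * np.linalg.solve(np.eye(n) - c * B, np.ones(n))

print(np.round(p, 3))              # node 2, pointed to by all others, wins
assert np.isclose(p.sum(), 1.0)    # steady-state probabilities sum to 1
```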


Alternative notation

M: modified transition matrix: M = c B + (1-c)/n 1 1T

Then: p = M p

That is: the steady-state probabilities = PageRank scores form the first eigenvector of the ‘modified transition matrix’


Parenthesis: intuition behind eigenvectors


Formal definition

If A is an (n x n) square matrix, (λ, x) is an eigenvalue/eigenvector pair of A if A x = λ x

CLOSELY related to singular values:


Property #1: Eigen- vs singular-values

if

B[n x m] = U[n x r] Λ[r x r] (V[m x r])T

then A = (BT B) is symmetric, and

C(4): BT B vi = λi^2 vi

i.e., v1, v2, ...: eigenvectors of A = (BT B)


Property #2

• If A[nxn] is a real, symmetric matrix

• Then it has n real eigenvalues

(if A is not symmetric, some eigenvalues may be complex)


Property #3

• If A[nxn] is a real, symmetric matrix

• Then it has n real eigenvalues

• And they agree with its n singular values, except possibly for the sign


Intuition

• A as vector transformation

A = [2 1; 1 3], x = [1; 0]:  x’ = A x = [2; 1]


Intuition

• By defn., eigenvectors remain parallel to themselves (‘fixed points’)

A v1 = λ1 v1, e.g.:

[2 1; 1 3] x [0.52; 0.85] = 3.62 * [0.52; 0.85]   (λ1 = 3.62)


Convergence

• Usually, fast:


Convergence

• Usually, fast:
• depends on the ratio λ1 : λ2


Kleinberg/google - conclusions

SVD helps in graph analysis:

hub/authority scores: strongest left- and right- singular-vectors of the adjacency matrix

random walk on a graph: steady state probabilities are given by the strongest eigenvector of the (modified) transition matrix


Conclusions

• SVD: a valuable tool

• given a document-term matrix, it finds ‘concepts’ (LSI)

• ... and can find fixed-points or steady-state probabilities (google/ Kleinberg/ Markov Chains)


Conclusions cont’d

(We didn’t discuss/elaborate, but SVD:

• ... can reduce dimensionality (KL)

• ... and can find rules (PCA; RatioRules)

• ... and can solve optimally over- and under-constrained linear systems (least squares / query feedbacks))


References

• Berry, Michael: http://www.cs.utk.edu/~lsi/

• Brin, S. and L. Page (1998). Anatomy of a Large-Scale Hypertextual Web Search Engine. 7th Intl World Wide Web Conf.


References

• Christos Faloutsos, Searching Multimedia Databases by Content, Springer, 1996. (App. D)

• Fukunaga, K. (1990). Introduction to Statistical Pattern Recognition, Academic Press.

• I.T. Jolliffe Principal Component Analysis Springer, 2002 (2nd ed.)


References cont’d

• Kleinberg, J. (1998). Authoritative sources in a hyperlinked environment. Proc. 9th ACM-SIAM Symposium on Discrete Algorithms.

• Press, W. H., S. A. Teukolsky, et al. (1992). Numerical Recipes in C, Cambridge University Press. www.nr.com


Outline
• Introduction – Motivation
• Task 1: Node importance
• Task 2: Recommendations & proximity
• Task 3: Connection sub-graphs
• Conclusions


Acknowledgement:

Most of the foils in ‘Task 2’ are by Hanghang Tong, www.cs.cmu.edu/~htong


Detailed outline

• Problem dfn and motivation

• Solution: Random walk with restarts

• Efficient computation

• Case study: image auto-captioning

• Extensions: bi-partite graphs; tracking

• Conclusions


Motivation: Link Prediction

Should we introduce Mr. A to Mr. B?

(figure: a social network, with a dashed ‘?’ link between nodes A and B)


Motivation - recommendations

customers - products / movies:

should we recommend ‘Terminator 2’ to ‘smith’??


Answer: proximity

• ‘yes’, if ‘A’ and ‘B’ are ‘close’
• ‘yes’, if ‘smith’ and ‘terminator 2’ are ‘close’

QUESTIONS in this part:
- How to measure ‘closeness’/proximity?
- How to do it quickly?
- What else can we do, given proximity scores?


How close is ‘A’ to ‘B’?

(figure: a small example graph with nodes A, B, D, E, F, G, H, I, J and unit edge weights)

a.k.a. Relevance, Closeness, ‘Similarity’, ...


Why is it useful?

• Recommendation

And many more:
• Image captioning [Pan+]
• Conn. / CenterPiece subgraphs [Faloutsos+], [Tong+], [Koren+]
• Link prediction [Liben-Nowell+], [Tong+]
• Ranking [Haveliwala], [Chakrabarti+]
• Email Management [Minkov+]
• Neighborhood Formulation [Sun+]
• Pattern matching [Tong+]
• Collaborative Filtering [Fouss+]
• ...


Automatic Image Captioning

(figure: a test image linked, through regions, to keywords: Sea, Sun, Sky, Wave, Cat, Forest, Tiger, Grass)

Q: How to assign keywords to the test image?
A: Proximity! [Pan+ 2004]


Center-Piece Subgraph (CePS)

(figure: input, the original graph with query nodes A, B, C; output, the CePS with its center-piece node)

Q: How to find a hub for the black nodes?
A: Proximity! [Tong+ KDD 2006]

Detailed outline

• Problem dfn and motivation

• Solution: Random walk with restarts

• Efficient computation

• Case study: image auto-captioning

• Extensions: bi-partite graphs; tracking

• Conclusions


How close is ‘A’ to ‘B’?

(same example graph) They should be close, if they have
- many,
- short,
- ‘heavy’ paths


Some ``bad’’ proximities

Why not shortest path?

A: the ‘pizza delivery guy’ problem

(figure: two graphs, both with shortest path A-D-B, but one with many additional paths through E, F, G)


Some ``bad’’ proximities

Why not max. netflow?

A: no penalty for long paths

(figure: two graphs, one with the direct path A-D-B, one with an extra long path through E)


What is a ``good’’ proximity?

• multiple connections
• quality of connection
• direct & indirect connections
• length, degree, weight, ...

(figure: the example graph with nodes A, B, D, E, F, G, H, I, J)


Random walk with restart [Haveliwala’02]

(figure: a 12-node example graph)


Random walk with restart, query node 4

Ranking vector r4 (‘more red, more relevant’; nearby nodes get higher scores):

Node:  1    2    3    4    5    6    7    8    9    10   11   12
Score: 0.13 0.10 0.13 0.22 0.13 0.05 0.05 0.08 0.04 0.03 0.04 0.02


Why is RWR a good score?

Q = (1 - c) [I - c W]^(-1);  Q(i,j) = the proximity score of node j w.r.t. query node i

c W:     all paths from i to j with length 1
c^2 W^2: all paths from i to j with length 2
c^3 W^3: all paths from i to j with length 3
...

W: (normalized) adjacency matrix; c: damping factor
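The Q matrix above can be formed explicitly for a small graph. A minimal numpy sketch (the 4-node undirected graph and c = 0.9 are my own choices):

```python
import numpy as np

# hypothetical undirected graph (adjacency matrix)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

W = A / A.sum(axis=0)                  # column-normalized transition matrix
c, n = 0.9, A.shape[0]

# Q = (1-c) [I - c W]^(-1); column i is the RWR ranking vector for query node i
Q = (1 - c) * np.linalg.inv(np.eye(n) - c * W)

r = Q[:, 0]                            # proximity of every node to node 0
print(np.round(r, 3))
assert np.isclose(r.sum(), 1.0)        # each ranking vector is a distribution
```

For large graphs one would iterate r <- c W r + (1-c) e_i instead of inverting the matrix.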

Detailed outline

• Problem dfn and motivation

• Solution: Random walk with restarts
  – variants

• Efficient computation

• Case study: image auto-captioning

• Extensions: bi-partite graphs; tracking

• Conclusions


Variant: escape probability

• Define a Random Walk (RW) on the graph
• Esc_Prob(CMU -> Paris)
  – Prob(starting at CMU, reaches Paris before returning to CMU)

(figure: CMU and Paris, connected through the remaining graph)

Esc_Prob = Pr(smile before cry)


Other Variants

• Other measures by RWs
  – Commute Time / Hitting Time [Fouss+]
  – SimRank [Jeh+]
• Equivalence of random walks
  – Electric networks: EC [Doyle+]; SAEC [Faloutsos+]; CFEC [Koren+]
  – Spring systems
• Katz [Katz], [Huang+], [Scholkopf+]
• Matrix-Forest-based Alg [Chobotarev+]



All are “related to” or “similar to” random walk with restart!


Map of proximity measurements

(figure: a map relating RWR, Esc_Prob + sink, Hitting Time / Commute Time, Effective Conductance, spring systems, harmonic functions, and Katz, via physical models (e.g., ‘voltage = position’) and mathematical tools (regularized unconstrained vs. constrained quadratic optimization))


Notice: Asymmetry (even in undirected graphs)

(figure: a small graph with nodes A, B, C, D, E)

C -> A: high
A -> C: low


Summary of Proximity Definitions

• Goal: summarize multiple relationships
• Solutions
  – Basic: Random Walk with Restarts [Haweliwala’02] [Pan+ 2004] [Sun+ 2006] [Tong+ 2006]
  – Properties: asymmetry [Koren+ 2006] [Tong+ 2007] [Tong+ 2008]
  – Variants: Esc_Prob and many others [Faloutsos+ 2004] [Koren+ 2006] [Tong+ 2007]

Detailed outline

• Problem dfn and motivation

• Solution: Random walk with restarts

• Efficient computation

• Case study: image auto-captioning

• Extensions: bi-partite graphs; tracking

• Conclusions


Reminder: PageRank

• With probability 1-c, fly out to a random node

• Then, we have:
  p = c B p + (1-c)/n 1  =>
  p = (1-c)/n [I - c B]^(-1) 1


The only difference from PageRank: restart to the query node, not to a random node:

ri = c W ri + (1-c) ei      (vs. p = c B p + (1-c)/n 1)

(ri: ranking vector; ei: starting vector; W: column-normalized adjacency matrix)


Computing RWR

  r_i = c W r_i + (1 − c) e_i     (W: n × n; r_i, e_i: n × 1)

[Example: 12-node graph with column-normalized adjacency matrix W (entries 1/2, 1/3, 1/4, …), c = 0.9, and ranking vector r ≈ (0.13, 0.10, 0.13, 0.22, 0.13, 0.05, 0.05, 0.08, 0.04, 0.03, 0.04, 0.02)]

Q: Given query i, how to solve it?

[Same 12-node example: the adjacency matrix and the starting vector (the query) are known, but the ranking vector appears on both sides of the equation — so we need a computation strategy]

On the fly:

Start from the query’s indicator vector (here query node 4 of the 12-node example) and iterate r ← c W r + (1−c) e_4 until convergence:

(0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0)
→ (0.3, 0, 0.3, 0.1, 0.3, 0, 0, 0, 0, 0, 0, 0)
→ (0.12, 0.18, 0.12, 0.35, 0.03, 0.07, 0.07, 0.07, 0, 0, 0, 0)
→ …
→ (0.13, 0.10, 0.13, 0.22, 0.13, 0.05, 0.05, 0.08, 0.04, 0.03, 0.04, 0.02)

+ No pre-computation / light storage
– Slow on-line response: O(mE), for m iterations over E edges
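The on-the-fly strategy above can be sketched in a few lines; each pass is one (sparse) matrix–vector product, hence O(E) per iteration. The 4-node graph below is a small stand-in for the slides' 12-node example:

```python
import numpy as np

def rwr_online(W, q, c=0.9, tol=1e-10, max_iter=1000):
    """On-the-fly RWR: iterate r <- c W r + (1-c) e_q until convergence."""
    n = W.shape[0]
    e = np.zeros(n)
    e[q] = 1.0
    r = e.copy()
    for _ in range(max_iter):
        r_new = c * W @ r + (1 - c) * e
        if np.abs(r_new - r).sum() < tol:  # L1 change below tolerance: done
            return r_new
        r = r_new
    return r

# Toy connected 4-node graph, column-normalized.
A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 1.],
              [1., 1., 0., 0.],
              [0., 1., 0., 0.]])
W = A / A.sum(axis=0)

r = rwr_online(W, q=3)
assert np.isclose(r.sum(), 1.0)  # converged ranking vector is a distribution
assert r.min() > 0               # every reachable node gets some score
```

No pre-computation or extra storage is needed, but every query pays the full iteration cost — exactly the trade-off the slide states.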

Pre-compute:

[Pre-computed 12 × 12 proximity matrix for the example graph: column i is the ranking vector r_i for query node i (r_1 … r_12); shown as R = c × Q]

Pre-compute:

[Full pre-computed proximity matrix; a query is answered by reading off one column, e.g. r_4 ≈ (0.13, 0.10, 0.13, 0.22, 0.13, 0.05, 0.05, 0.08, 0.04, 0.03, 0.04, 0.02)]

+ Fast on-line response
– Heavy pre-computation/storage cost: O(n³) time, O(n²) space
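The pre-compute strategy is one matrix inversion up front, then each query is a column lookup. A minimal sketch on the same assumed toy graph as before:

```python
import numpy as np

A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 1.],
              [1., 1., 0., 0.],
              [0., 1., 0., 0.]])
W = A / A.sum(axis=0)
c = 0.9

# Off-line: O(n^3) time and O(n^2) storage, done once.
Q = (1 - c) * np.linalg.inv(np.eye(4) - c * W)

# On-line: O(n) per query -- column q of Q is the ranking vector for query node q.
r = Q[:, 3]
e3 = np.array([0., 0., 0., 1.])
assert np.allclose(r, c * W @ r + (1 - c) * e3)  # it solves the RWR equation
```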

Q: How to Balance?

On-line Off-line

How to balance?

Idea (‘B-Lin’):

• Break the graph into communities

• Pre-compute everything within each community

• Adjust (with the Sherman–Morrison lemma) for ‘bridge’ edges

H. Tong, C. Faloutsos, & J.Y. Pan. Fast Random Walk with Restart and Its Applications. ICDM, 613-622, 2006.
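The mechanics of the idea can be sketched with the Woodbury generalization of Sherman–Morrison. Note this is only an illustration under assumptions: B-Lin proper approximates the cross-community edges with a low-rank SVD, while here a single bridge edge is factored exactly; the two triangles joined by a bridge are a made-up example.

```python
import numpy as np

c = 0.9
# Two 3-node triangles (communities) joined by one bridge edge (2, 3).
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
W = A / A.sum(axis=0)

# W1: within-community part only; C = W - W1 holds the bridge entries.
W1 = W.copy()
W1[2, 3] = W1[3, 2] = 0.0
C = W - W1

# M1^{-1} is cheap in general: M1 = I - c W1 is block-diagonal (one block per community).
M1_inv = np.linalg.inv(np.eye(6) - c * W1)

# Factor C = U V over its nonzero columns (here k = 2: columns 2 and 3).
cols = [2, 3]
U = C[:, cols]                                  # 6 x k
V = np.zeros((len(cols), 6))
V[0, 2] = V[1, 3] = 1.0                         # k x 6 selector

# Woodbury correction for the bridge:
# (M1 - cUV)^{-1} = M1^{-1} + c M1^{-1} U (I_k - c V M1^{-1} U)^{-1} V M1^{-1}
k = len(cols)
core = np.linalg.inv(np.eye(k) - c * V @ M1_inv @ U)
M_inv = M1_inv + c * M1_inv @ U @ core @ V @ M1_inv

assert np.allclose(M_inv, np.linalg.inv(np.eye(6) - c * W))
```

Only small per-community blocks and a k × k core matrix ever get inverted, which is the point of the balance.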

Detailed outline

• Problem dfn and motivation

• Solution: Random walk with restarts

• Efficient computation

• Case study: image auto-captioning

• Extensions: bi-partite graphs; tracking

• Conclusions

gCaP: Automatic Image Captioning

• Q: Given captioned images — {Sea, Sun, Sky, Wave}, {Cat, Forest, Grass, Tiger} — find the caption {?, ?, ?} of a new, uncaptioned image.

A: Proximity! [Pan+ KDD2004]

[Figure: three-layer graph linking the test image to the captioned images — layers: Image, Region, Keyword (Sea, Sun, Sky, Wave, Cat, Forest, Tiger, Grass)]

[Same three-layer Image–Region–Keyword graph; proximity from the test image ranks the keywords, yielding the caption {Grass, Forest, Cat, Tiger}]

C-DEM (screen-shot)

C-DEM: Multi-Modal Query System for Drosophila Embryo Databases [Fan+ VLDB 2008]

Detailed outline

• Problem dfn and motivation

• Solution: Random walk with restarts

• Efficient computation

• Case study: image auto-captioning

• Extensions: bi-partite graphs; tracking

• Conclusions

Problem: update

Bipartite graph of n authors × m conferences; E’ edges change, involving n’ authors and m’ conferences.

Solution:

• Use Sherman-Morrison Lemma to quickly update the inverse matrix
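A single changed edge perturbs M = I − cW by a rank-1 term, so the stored inverse can be patched without re-inverting. A minimal sketch on an assumed toy graph (perturbing one entry of W directly, for simplicity):

```python
import numpy as np

A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 1.],
              [1., 1., 0., 0.],
              [0., 1., 0., 0.]])
W = A / A.sum(axis=0)
c = 0.9
M_inv = np.linalg.inv(np.eye(4) - c * W)  # pre-computed once

# Suppose entry W[0, 3] grows by delta (one changed edge).
# Then M = I - cW changes by  -c * delta * e_0 e_3^T,  a rank-1 update u v^T.
delta = 0.1
u = np.zeros(4); u[0] = -c * delta
v = np.zeros(4); v[3] = 1.0

# Sherman-Morrison: (M + u v^T)^{-1} = M^{-1} - M^{-1} u v^T M^{-1} / (1 + v^T M^{-1} u)
Mu = M_inv @ u
vM = v @ M_inv
M_inv_new = M_inv - np.outer(Mu, vM) / (1.0 + v @ Mu)

# Matches a from-scratch inversion, at O(n^2) instead of O(n^3) cost:
W2 = W.copy()
W2[0, 3] += delta
assert np.allclose(M_inv_new, np.linalg.inv(np.eye(4) - c * W2))
```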

Fast-Single-Update

[Plot: log(time) in seconds per dataset — our method achieves 176× and 40× speedups over straight re-computation]

pTrack: Philip S. Yu’s top-5 conferences up to each year

1992: ICDE, ICDCS, SIGMETRICS, PDIS, VLDB     (databases, performance, distributed systems)
1997: CIKM, ICDCS, ICDE, SIGMETRICS, ICMCS
2002: KDD, SIGMOD, ICDM, CIKM, ICDCS
2007: ICDM, KDD, ICDE, SDM, VLDB              (databases, data mining)

DBLP (authors × conferences): ~400k authors, ~3.5k conferences, 20 years

KDD’s rank w.r.t. VLDB over the years

[Plot: proximity rank of KDD relative to VLDB, by year]

Data mining and databases are getting closer and closer.

cTrack: 10 most influential authors in the NIPS community up to each year

Author–paper bipartite graph from NIPS 1987–1999: 1740 papers, 2037 authors, spanning 13 years.

[Results highlight, e.g., T. Sejnowski and M. Jordan]

Conclusions – Take-home messages

• Proximity definitions
  – RWR, and a lot of variants
• Computation
  – Sherman–Morrison lemma
  – Fast incremental computation
• Applications
  – Recommendations; auto-captioning; tracking
  – Center-piece subgraphs (next)
  – E-mail management; anomaly detection, …

References

• L. Page, S. Brin, R. Motwani, & T. Winograd. (1998) The PageRank Citation Ranking: Bringing Order to the Web. Technical report, Stanford Library.

• T.H. Haveliwala. (2002) Topic-Sensitive PageRank. In WWW, 517-526, 2002.

• J.Y. Pan, H.J. Yang, C. Faloutsos & P. Duygulu. (2004) Automatic multimedia cross-modal correlation discovery. In KDD, 653-658, 2004.

References (cont’d)

• C. Faloutsos, K.S. McCurley & A. Tomkins. (2004) Fast discovery of connection subgraphs. In KDD, 118-127, 2004.

• J. Sun, H. Qu, D. Chakrabarti & C. Faloutsos. (2005) Neighborhood Formation and Anomaly Detection in Bipartite Graphs. In ICDM, 418-425, 2005.

• W. Cohen. (2007) Graph Walks and Graphical Models. Draft.

References (cont’d)

• P. Doyle & J. Snell. (1984) Random Walks and Electric Networks, volume 22. Mathematical Association of America, New York.

• Y. Koren, S.C. North & C. Volinsky. (2006) Measuring and extracting proximity in networks. In KDD, 245-255, 2006.

• A. Agarwal, S. Chakrabarti & S. Aggarwal. (2006) Learning to rank networked entities. In KDD, 14-23, 2006.

References (cont’d)

• S. Chakrabarti. (2007) Dynamic personalized PageRank in entity-relation graphs. In WWW, 571-580, 2007.

• F. Fouss, A. Pirotte, J.-M. Renders & M. Saerens. (2007) Random-Walk Computation of Similarities between Nodes of a Graph with Application to Collaborative Recommendation. IEEE Trans. Knowl. Data Eng. 19(3), 355-369, 2007.

References (cont’d)

• H. Tong & C. Faloutsos. (2006) Center-piece subgraphs: problem definition and fast solutions. In KDD, 404-413, 2006.

• H. Tong, C. Faloutsos & J.Y. Pan. (2006) Fast Random Walk with Restart and Its Applications. In ICDM, 613-622, 2006.

• H. Tong, Y. Koren & C. Faloutsos. (2007) Fast direction-aware proximity for graph mining. In KDD, 747-756, 2007.

References (cont’d)

• H. Tong, B. Gallagher, C. Faloutsos & T. Eliassi-Rad. (2007) Fast best-effort pattern matching in large attributed graphs. In KDD, 737-746, 2007.

• H. Tong, S. Papadimitriou, P.S. Yu & C. Faloutsos. (2008) Proximity Tracking on Time-Evolving Bipartite Graphs. In SDM, 2008.

References (cont’d)

• B. Gallagher, H. Tong, T. Eliassi-Rad & C. Faloutsos. (2008) Using Ghost Edges for Classification in Sparsely Labeled Networks. In KDD, 2008.

• H. Tong, Y. Sakurai, T. Eliassi-Rad & C. Faloutsos. (2008) Fast Mining of Complex Time-Stamped Events. In CIKM, 2008.

• H. Tong, H. Qu & H. Jamjoom. (2008) Measuring Proximity on Graphs with Side Information. In ICDM, 2008.

Resources

• www.cs.cmu.edu/~htong/soft.htm — software, papers, and ppt of presentations

• www.cs.cmu.edu/~htong/tut/cikm2008/cikm_tutorial.html — the CIKM’08 tutorial on graphs and proximity

Again, thanks to Hanghang TONG for permission to use his foils in this part.

Outline

• Introduction – Motivation
• Task 1: Node importance
• Task 2: Recommendations & proximity
• Task 3: Connection sub-graphs
• Conclusions

Detailed outline

• Problem definition

• Solution

• Results

H. Tong & C. Faloutsos. Center-piece subgraphs: problem definition and fast solutions. In KDD, 404-413, 2006.

Center-Piece Subgraph (CePS)

• Given Q query nodes
• Find the center-piece subgraph
• Input of CePS:
  – Q query nodes
  – budget b
  – k softAND number
• Applications:
  – social networks
  – law enforcement
  – gene networks
  – …

[Figure: query nodes A, B, C and the extracted center-piece subgraph of at most b nodes]

Challenges in Ceps

• Q1: How to measure importance?

• (Q2: How to extract connection subgraph?

• Q3: How to do it efficiently?)

Challenges in CePS

• Q1: How to measure importance?
  – A: “proximity” – but how to combine scores?
• (Q2: How to extract the connection subgraph?
• Q3: How to do it efficiently?)

AND: Combine Scores

• Q: How to combine scores?

AND: Combine Scores

• Q: How to combine scores?

• A: Multiply

• … = prob. that 3 random particles coincide on node j
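The multiply-to-AND rule is a one-liner on top of any RWR routine. A minimal sketch with two query nodes on a made-up 5-node graph (the slide speaks of 3 particles; the idea is identical for any number):

```python
import numpy as np

def rwr(W, q, c=0.9, iters=300):
    """Plain iterative RWR from query node q."""
    n = W.shape[0]
    e = np.zeros(n)
    e[q] = 1.0
    r = e.copy()
    for _ in range(iters):
        r = c * W @ r + (1 - c) * e
    return r

# Toy connected 5-node graph; query nodes {0, 3}.
A = np.array([[0., 1., 1., 0., 0.],
              [1., 0., 1., 1., 0.],
              [1., 1., 0., 0., 1.],
              [0., 1., 0., 0., 1.],
              [0., 0., 1., 1., 0.]])
W = A / A.sum(axis=0)

# AND semantics: multiply per-query scores -- the probability that independent
# restarting particles (one per query node) coincide on node j.
scores = rwr(W, 0) * rwr(W, 3)
center = int(np.argmax(scores))  # best "center-piece" candidate node
assert scores.min() > 0
```

Nodes close to ALL query nodes get a high product; a node far from even one query node is suppressed, which is exactly the AND behavior.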

Detailed outline

• Problem definition

• Solution

• Results

Case Study: AND query

[Figure: DBLP co-authorship center-piece subgraph for query nodes R. Agrawal, Jiawei Han, V. Vapnik, M. Jordan — intermediate nodes include H.V. Jagadish, Laks V.S. Lakshmanan, Heikki Mannila, Christos Faloutsos, Padhraic Smyth, Corinna Cortes, Daryl Pregibon; edge labels give co-authored-paper counts]

Conclusions

Proximity (e.g., with RWR) helps answer ‘AND’ and ‘k_softAND’ queries.

Overall conclusions

• SVD: a powerful tool
  – HITS / PageRank
  – (dimensionality reduction)
• Proximity: Random Walk with Restarts
  – recommendation systems
  – auto-captioning
  – center-piece subgraphs
