Lecture 1: Introduction to low-rank tensor representation/approximation

Alexander Litvinenko

Center for Uncertainty Quantification
http://sri-uq.kaust.edu.sa/

Transcript of Lecture 1: Introduction to low-rank tensor representation/approximation.



Outline

I Motivation and examples
I Post-processing in low-rank data format
I Lecture 2, higher order tensors: definitions of different tensor formats
I Lecture 3: Kriging, combining low-rank approximation with FFT techniques
I Lecture 4: definitions of different tensor formats; applications, computation of the characteristic function



Used Literature and Slides

I Book of W. Hackbusch (2012)
I Dissertations of I. Oseledets and M. Espig
I Articles of Tyrtyshnikov et al., De Lathauwer et al., L. Grasedyck, B. Khoromskij, M. Espig
I Lecture courses and presentations of Boris and Venera Khoromskij
I Software by T. Kolda et al.; M. Espig et al.; D. Kressner, C. Tobler; I. Oseledets et al.



Challenges of numerical computations

I modelling of multi-particle interactions in large molecular systems such as proteins and biomolecules,
I modelling of large atomic (metallic) clusters,
I stochastic and parametric equations,
I machine learning, data mining and information technologies,
I multidimensional dynamical systems,
I data compression,
I financial mathematics,
I analysis of multi-dimensional data.



Example 1: KLE and PCE in stochastic PDEs

Write first the Karhunen-Loeve expansion and then, for uncorrelated random variables, the polynomial chaos expansion:

u(x, \omega) = \sum_{i=1}^{K} \sqrt{\lambda_i}\, \phi_i(x)\, \xi_i(\omega) = \sum_{i=1}^{K} \sqrt{\lambda_i}\, \phi_i(x) \sum_{\alpha \in \mathcal{J}} \xi_i^{(\alpha)} H_\alpha(\theta(\omega)),   (1)

where \xi_i^{(\alpha)} is a tensor because \alpha = (\alpha_1, \alpha_2, ..., \alpha_M, ...) is a multi-index.

The PCE of u is

u(x, \omega) = \sum_{\alpha \in \mathcal{J}} u^{(\alpha)}(x) H_\alpha(\theta(\omega)),   (2)
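The truncated KLE in (1) can be sketched numerically: the pairs (lambda_i, phi_i) are the leading eigenpairs of the discretized covariance matrix, and the xi_i are uncorrelated random variables. A minimal NumPy sketch; the grid size, correlation length and exponential covariance model are illustrative assumptions, not taken from the slides:

```python
import numpy as np

n, K = 200, 10                      # grid points, KLE truncation rank
x = np.linspace(0.0, 1.0, n)
ell = 0.2                           # assumed correlation length
# Discretized exponential covariance matrix C(x, y) = exp(-|x - y| / ell).
C = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

lam, phi = np.linalg.eigh(C)        # eigenvalues in ascending order
lam, phi = lam[::-1][:K], phi[:, ::-1][:, :K]   # keep the K largest

rng = np.random.default_rng(0)
xi = rng.standard_normal(K)         # one sample of the random variables
u = phi @ (np.sqrt(lam) * xi)       # u(x, omega) = sum_i sqrt(lam_i) phi_i(x) xi_i
```

The fast decay of `lam` is what makes the truncation at K terms accurate.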



Tensor of order 2

Let M = U\Sigma V^T and M_k := U_k \Sigma_k V_k^T (truncated singular value decomposition). Denote A := U_k \Sigma_k and B := V_k; then M_k = AB^T. Storage of A and B^T is k(n + m), in contrast to nm for M.

[Figure: sketch of the factorization M \approx U \Sigma V^T]

The general aim is to reduce O(n^d) to O(n \log^r n), or even to O(\log^r n).
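A minimal NumPy sketch of the truncated SVD above; the matrix sizes and the synthetic test matrix with decaying singular values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 100, 80, 5
# A matrix with quickly decaying singular values (sum of scaled rank-1 terms).
M = sum(2.0**(-j) * np.outer(rng.standard_normal(n), rng.standard_normal(m))
        for j in range(20))

U, S, Vt = np.linalg.svd(M, full_matrices=False)
A = U[:, :k] * S[:k]        # A = U_k Sigma_k
B = Vt[:k].T                # B = V_k
Mk = A @ B.T                # rank-k approximation, stored as two factors

print(k * (n + m), "stored numbers instead of", n * m)
```

By the Eckart-Young theorem, the Frobenius error equals the tail of the singular values, which the test below checks.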



Example: Arithmetic operations

Let v \in R^Z. Suppose M_k = AB^T \in R^{n \times Z}, A \in R^{n \times k}, B \in R^{Z \times k} is given.

Property 1: M_k v = AB^T v = A(B^T v). Cost O(kZ + kn).

Suppose M' = CD^T, with C \in R^{n \times k} and D \in R^{Z \times k}.

Property 2: M_k + M' = A_{new} B_{new}^T, where A_{new} := [A\ C] \in R^{n \times 2k} and B_{new} := [B\ D] \in R^{Z \times 2k}. Re-truncation back to rank k costs O((n + Z)k^2 + k^3).
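Both properties can be sketched in NumPy. The re-truncation below uses the standard QR-plus-small-SVD approach, which is one common way to realize the stated O((n + Z)k^2 + k^3) cost (the sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, Z, k = 60, 50, 4
A, B = rng.standard_normal((n, k)), rng.standard_normal((Z, k))
C, D = rng.standard_normal((n, k)), rng.standard_normal((Z, k))
v = rng.standard_normal(Z)

# Property 1: matvec through the factors, cost O(kZ + kn) instead of O(nZ).
w = A @ (B.T @ v)

# Property 2: sum of two rank-k matrices in factored form (rank <= 2k).
A_new = np.hstack([A, C])
B_new = np.hstack([B, D])

# Re-truncation to rank k: QR of both factors, SVD of the small k x k core.
Qa, Ra = np.linalg.qr(A_new)
Qb, Rb = np.linalg.qr(B_new)
U, S, Vt = np.linalg.svd(Ra @ Rb.T)
A_trunc = Qa @ (U[:, :k] * S[:k])
B_trunc = Qb @ Vt[:k].T
```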



Aims of low-rank/sparse data approximation

1. Represent/approximate multidimensional operators and functions in a data-sparse format
2. Perform linear algebra algorithms in tensor format (truncation is an issue)
3. Extract the needed information from the data-sparse solution (e.g. mean, variance, quantiles, etc.)



Compute the mean and the variance in a low-rank data format

Let W := [v_1, v_2, ..., v_Z], where the v_i are snapshots. Given the truncated SVD W_k = AB^T \approx W \in R^{n \times Z}, with A := U_k S_k and B := V_k:

\bar{v} = \frac{1}{Z} \sum_{i=1}^{Z} v_i = \frac{1}{Z} \sum_{i=1}^{Z} A b_i = A \bar{b},   (3)

C = \frac{1}{Z-1} W_c W_c^T \approx \frac{1}{Z-1} A B^T B A^T = \frac{1}{Z-1} A A^T,   (4)

since B = V_k has orthonormal columns (B^T B = I_k). The diagonal of C can be computed with complexity O(k^2(Z + n)).
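A minimal NumPy sketch of computing the mean and the covariance diagonal directly from the factors A, B, never forming W_c W_c^T; the sizes are illustrative and the rank is taken full here so the identities are exact:

```python
import numpy as np

rng = np.random.default_rng(3)
n, Z, k = 100, 40, 40          # full rank, so factored results are exact
W = rng.standard_normal((n, Z))   # columns are snapshots v_i

U, S, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :k] * S[:k]           # A = U_k S_k
B = Vt[:k].T                   # B = V_k

# Mean, eq. (3): v_bar = A b_bar, with b_bar the mean of B's rows.
v_bar = A @ B.mean(axis=0)

# Centering W_c = W - v_bar 1^T acts on the small factor B only.
Bc = B - B.mean(axis=0)

# diag(C) = diag(A (Bc^T Bc) A^T) / (Z - 1), cost O(k^2 (Z + n)).
var = np.einsum('ik,kl,il->i', A, Bc.T @ Bc, A) / (Z - 1)
```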



Accuracy of the mean value

Let 1 = (1, ..., 1)^T \in R^Z.

Lemma. Let \|W - W_k\| \le \varepsilon, and let \bar{v}_k be the rank-k approximation of the mean \bar{v}. Then \|\bar{v} - \bar{v}_k\| \le \frac{1}{\sqrt{Z}} \varepsilon.

Proof: Since \bar{v} = \frac{1}{Z} W 1 and \bar{v}_k = \frac{1}{Z} W_k 1,

\|\bar{v} - \bar{v}_k\| = \frac{1}{Z} \|(W - W_k) 1\| \le \frac{1}{Z} \|W - W_k\| \cdot \|1\| \le \frac{1}{\sqrt{Z}} \varepsilon.



Accuracy of the variance value

Lemma. Let \|W - W_k\| \le \varepsilon; then \|W_c - (W_c)_k\| \le \varepsilon.

Proof:

\|W_c - (W_c)_k\| \le \|W - W_k\| \cdot \|I - \tfrac{1}{Z}\, 1 \cdot 1^T\| \le \|W - W_k\| \cdot 1 \le \varepsilon.

Lemma. Let \|W_c - (W_c)_k\| \le \varepsilon; then \|C - C_k\| \le \frac{1}{Z-1} \varepsilon^2.



Lecture 2: Higher order tensors



Definition of tensor of order d

A tensor of order d is a multidimensional array over a d-tuple index set I = I_1 \times \cdots \times I_d:

A = [a_{i_1 ... i_d} : i_\ell \in I_\ell] \in R^I, \quad I_\ell = \{1, ..., n_\ell\}, \quad \ell = 1, ..., d.

A tensor A is an element of the linear space

V_n = \bigotimes_{\ell=1}^{d} V_\ell, \quad V_\ell = R^{I_\ell},

equipped with the Euclidean scalar product \langle \cdot, \cdot \rangle : V_n \times V_n \to R, defined as

\langle A, B \rangle := \sum_{(i_1 ... i_d) \in I} a_{i_1 ... i_d} b_{i_1 ... i_d}, \quad \text{for } A, B \in V_n.



Rank-k representation

The tensor product of vectors u^{(\ell)} = \{u^{(\ell)}_{i_\ell}\}_{i_\ell=1}^{n_\ell} \in R^{I_\ell} forms the canonical rank-1 tensor

A^{(1)} \equiv [u_i]_{i \in I} = u^{(1)} \otimes \cdots \otimes u^{(d)},

with entries u_i = u^{(1)}_{i_1} \cdots u^{(d)}_{i_d}. The storage is dn, in contrast to n^d.



Tensor formats: CP, Tucker, TT

CP: A(i_1, i_2, i_3) \approx \sum_{\alpha=1}^{r} u_1(i_1, \alpha)\, u_2(i_2, \alpha)\, u_3(i_3, \alpha)

Tucker: A(i_1, i_2, i_3) \approx \sum_{\alpha_1, \alpha_2, \alpha_3} c(\alpha_1, \alpha_2, \alpha_3)\, u_1(i_1, \alpha_1)\, u_2(i_2, \alpha_2)\, u_3(i_3, \alpha_3)

TT: A(i_1, ..., i_d) \approx \sum_{\alpha_1, ..., \alpha_{d-1}} G_1(i_1, \alpha_1)\, G_2(\alpha_1, i_2, \alpha_2) \cdots G_d(\alpha_{d-1}, i_d)

In matrix form: G_k(i_k) is an r_{k-1} \times r_k matrix, with boundary ranks r_0 = r_d = 1.
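The TT format above makes single entries cheap to evaluate: A(i_1,...,i_d) is a chain of small matrix products. A minimal sketch with random cores (dimensions and ranks are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
d, n = 4, 5
ranks = [1, 3, 3, 3, 1]             # boundary ranks r_0 = r_d = 1
# Core G_k stored as an array of shape (r_{k-1}, n, r_k).
cores = [rng.standard_normal((ranks[k], n, ranks[k + 1])) for k in range(d)]

def tt_entry(cores, idx):
    """A(i1,...,id) = G1[i1] G2[i2] ... Gd[id]; cost O(d * r^2) per entry."""
    v = np.ones((1, 1))
    for G, i in zip(cores, idx):
        v = v @ G[:, i, :]
    return float(v[0, 0])

# Reconstruct the full tensor once to check (feasible only for tiny sizes).
T = cores[0]
for G in cores[1:]:
    T = np.tensordot(T, G, axes=([-1], [0]))
T = T.reshape((n,) * d)
```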



Canonical and Tucker tensor formats

(Taken from Boris Khoromskij)



Tensor and Matrices

Tensor (A_i)_{i \in I} \in R^I, where I = I_1 \times I_2 \times \cdots \times I_d, \#I_\mu = n_\mu, \#I = \prod_{\mu=1}^{d} n_\mu.

Rank-1 tensor:

A = u_1 \otimes u_2 \otimes \cdots \otimes u_d =: \bigotimes_{\mu=1}^{d} u_\mu, \quad A_{i_1,...,i_d} = (u_1)_{i_1} \cdots (u_d)_{i_d}.

For d = 2, the rank-1 tensor A = u \otimes v corresponds to the rank-1 matrix A = u v^T (or v u^T, depending on the ordering convention), u \in R^n, v \in R^m. A rank-k tensor A = \sum_{i=1}^{k} u_i \otimes v_i corresponds to the matrix A = \sum_{i=1}^{k} u_i v_i^T.

The Kronecker product A \otimes B \in R^{nm \times nm} of A \in R^{n \times n} and B \in R^{m \times m} is the block matrix whose (i, j)-th block is [A_{ij} B].
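The correspondence between rank-1 tensors, rank-1 matrices and the Kronecker product can be checked in a few lines of NumPy (the vectors and matrices are illustrative):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

# The order-2 tensor u ⊗ v has entries (u ⊗ v)_{ij} = u_i v_j,
# i.e. it is exactly the rank-1 matrix u v^T.
T2 = np.tensordot(u, v, axes=0)
M = np.outer(u, v)

# Kronecker product A ⊗ B: block matrix with (i, j)-th block A[i, j] * B.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.eye(2)
K = np.kron(A, B)
```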



Examples (B. Khoromskij’s lecture)

Rank 1: f = \exp(f_1(x_1) + \cdots + f_d(x_d)) = \prod_{j=1}^{d} \exp(f_j(x_j)).

Rank 2: f = \sin\big(\sum_{j=1}^{d} x_j\big), since 2i \cdot \sin\big(\sum_{j=1}^{d} x_j\big) = e^{i \sum_{j=1}^{d} x_j} - e^{-i \sum_{j=1}^{d} x_j}.

The rank-d function f(x_1, ..., x_d) = x_1 + x_2 + \cdots + x_d can be approximated by rank 2 with any prescribed accuracy:

f = \frac{\prod_{j=1}^{d} (1 + \varepsilon x_j) - \prod_{j=1}^{d} 1}{\varepsilon} + O(\varepsilon), \quad \text{as } \varepsilon \to 0.
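The rank-2 approximation of the sum function is easy to verify numerically; the error indeed shrinks like O(epsilon) (dimension and sample point are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
d = 6
x = rng.uniform(0.0, 1.0, d)
f_exact = x.sum()

# f ≈ ( prod_j (1 + eps*x_j) - 1 ) / eps, a difference of two rank-1 terms.
for eps in (1e-1, 1e-3, 1e-5):
    f_rank2 = (np.prod(1.0 + eps * x) - 1.0) / eps
    print(eps, abs(f_rank2 - f_exact))   # error decreases with eps
```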



Examples where you can apply low-rank tensor approx.

Helmholtz kernel:

\frac{e^{i \kappa \|x - y\|}}{\|x - y\|},   (5)

Yukawa potential:

f(x) = \int_{[-a,a]^3} \frac{e^{-\mu \|x - y\|}}{\|x - y\|}\, u(y)\, dy.   (6)

Classical Green kernels, x, y \in R^d:

\log(\|x - y\|), \quad \frac{1}{\|x - y\|}.   (7)

Other multivariate functions:

(a)\ \frac{1}{x_1^2 + \cdots + x_d^2}, \quad (b)\ \frac{1}{\sqrt{x_1^2 + \cdots + x_d^2}}, \quad (c)\ \frac{e^{-\lambda \|x\|}}{\|x\|}.   (8)

For (a), use (with \rho = x_1^2 + \cdots + x_d^2 \in [1, R], R > 1)

\frac{1}{\rho} = \int_{R_+} e^{-\rho t}\, dt.   (9)



Examples (B. Khoromskij’s lecture)

f(x_1, ..., x_d) = w_1(x_1) + w_2(x_2) + \cdots + w_d(x_d)

= (w_1(x_1),\ 1) \begin{pmatrix} 1 & 0 \\ w_2(x_2) & 1 \end{pmatrix} \cdots \begin{pmatrix} 1 & 0 \\ w_{d-1}(x_{d-1}) & 1 \end{pmatrix} \begin{pmatrix} 1 \\ w_d(x_d) \end{pmatrix}



Examples:

rank(f) = 2:

f = \sin(x_1 + x_2 + \cdots + x_d)

= (\sin x_1,\ \cos x_1) \begin{pmatrix} \cos x_2 & -\sin x_2 \\ \sin x_2 & \cos x_2 \end{pmatrix} \cdots \begin{pmatrix} \cos x_{d-1} & -\sin x_{d-1} \\ \sin x_{d-1} & \cos x_{d-1} \end{pmatrix} \begin{pmatrix} \cos x_d \\ \sin x_d \end{pmatrix}
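The identity above (a product of 2x2 rotation-like cores, by the angle-addition formula) can be verified numerically:

```python
import numpy as np

rng = np.random.default_rng(6)
d = 5
x = rng.uniform(0.0, 2 * np.pi, d)

def rot(t):
    # Middle TT core: a 2x2 rotation matrix.
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

row = np.array([np.sin(x[0]), np.cos(x[0])])          # first core, 1 x 2
for t in x[1:-1]:
    row = row @ rot(t)                                # carries (sin s, cos s)
val = row @ np.array([np.cos(x[-1]), np.sin(x[-1])])  # last core, 2 x 1
```

After processing x_1, ..., x_k the row vector equals (sin s_k, cos s_k) with s_k = x_1 + ... + x_k, so the final contraction gives sin(x_1 + ... + x_d).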



Examples: Tensor Train format (taken from Ivan Oseledets)

[Figure: accuracy of approximation with rank r_max]



Examples: (taken from Ivan Oseledets)

Multidimensional integration



Lecture 3: Kriging accelerated by orders of magnitude: combining low-rank covariance approximations with FFT-techniques



Kriging, combining low-rank approx. with FFT-techniques

Let m be the number of measurement points and n the number of estimation points, and let y \in R^m be the vector of measurement values.

Let s \in R^n be the (kriging) vector to be estimated, with mean \mu_s = 0 and covariance matrix Q_{ss} \in R^{n \times n}:

s = Q_{sy} \underbrace{Q_{yy}^{-1} y}_{\xi},   (10)

where Q_{sy} \in R^{n \times m} is the cross-covariance matrix and Q_{yy} \in R^{m \times m} the auto-covariance matrix.
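A minimal 1D simple-kriging sketch of eq. (10); the grid, the Gaussian covariance model, the measurement indices and the small regularization are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50                                  # estimation points
xs = np.linspace(0.0, 1.0, n)
meas = np.array([3, 12, 27, 41])        # indices of the m measurement points
xm = xs[meas]
y = rng.standard_normal(meas.size)      # measurement values

def cov(a, b, ell=0.15):
    # Gaussian (squared-exponential) covariance, unit variance.
    return np.exp(-((a[:, None] - b[None, :]) / ell) ** 2)

Q_sy = cov(xs, xm)                                  # n x m cross-covariance
Q_yy = cov(xm, xm) + 1e-10 * np.eye(meas.size)      # m x m, tiny jitter
xi = np.linalg.solve(Q_yy, y)                       # xi = Q_yy^{-1} y
s = Q_sy @ xi                                       # kriging estimate
```

Simple kriging interpolates: at the measurement locations the estimate reproduces the data (up to the jitter).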



Task 2: Estimation of the variance \sigma_s \in R^n

Let Q_{ss|y} be the conditional covariance matrix. Then

\sigma_s = \mathrm{diag}(Q_{ss|y}) = \mathrm{diag}\big(Q_{ss} - Q_{sy} Q_{yy}^{-1} Q_{ys}\big) = \mathrm{diag}(Q_{ss}) - \sum_{i=1}^{m} \big[(Q_{sy} Q_{yy}^{-1} e_i) \circ Q_{ys}^T(i)\big],

where Q_{ys}^T(i) is the transpose of the i-th row of Q_{ys}.



Toeplitz and circulant matrices

Q_{ss} = \begin{pmatrix} a_1 & a_2 & a_3 & a_4 \\ a_2 & a_1 & a_2 & a_3 \\ a_3 & a_2 & a_1 & a_2 \\ a_4 & a_3 & a_2 & a_1 \end{pmatrix}, \quad Q_{i,j} = Q_{i+1,j+1}.   (11)

The embedding of the Toeplitz matrix Q_{ss} into a circulant matrix \bar{Q} is

\bar{Q} = \begin{pmatrix} a_1 & a_2 & a_3 & a_4 & a_3 & a_2 \\ a_2 & a_1 & a_2 & a_3 & a_4 & a_3 \\ a_3 & a_2 & a_1 & a_2 & a_3 & a_4 \\ a_4 & a_3 & a_2 & a_1 & a_2 & a_3 \\ a_3 & a_4 & a_3 & a_2 & a_1 & a_2 \\ a_2 & a_3 & a_4 & a_3 & a_2 & a_1 \end{pmatrix}.   (12)



Properties of circulant matrices

I One row/column of a circulant matrix is enough to describe it.
I Any Toeplitz covariance matrix Q_{ss} \in R^{n \times n} (first column denoted q) can be embedded in a larger circulant matrix \bar{Q} (first column denoted \bar{q}).
I The eigenvectors of \bar{Q} are the columns of the DFT matrix (dftmtx in Matlab), and \bar{Q} admits the decomposition \bar{Q} = F^{-1} \Lambda F.
I Such matrices are usually obtained when discretizing cov(x, y) = cov(|x - y|) on a tensor grid.



Sampling and Injection:

Consider the m \times n sampling matrix H:

H_{i,j} = \begin{cases} 1 & \text{for } x_i = x_j, \\ 0 & \text{otherwise}, \end{cases}   (13)

where x_i are the coordinates of the i-th measurement location in y, and x_j are the coordinates of the j-th estimation point in s.

sampling: \xi_{m \times 1} = H u_{n \times 1},   (14)

injection: u_{n \times 1} = H^T \xi_{m \times 1}.   (15)



Connection between covariance matrices

Q_{ys} = H Q_{ss},   (16)

Q_{sy} = Q_{ss} H^T,   (17)

Q_{sy} \xi = Q_{ss} H^T \xi = Q_{ss} \underbrace{(H^T \xi)}_{u}.   (18)



Embedding and extraction

Let M be an \bar{n} \times n mapping matrix that transfers the entries of the finite domain onto the (larger) periodic embedding domain; M has one single unit entry per column. The embedded Toeplitz matrix Q_{ss} is extracted from the embedding circulant matrix \bar{Q} as follows:

Q_{ss} = M^T \bar{Q} M, \quad \bar{q} = M q,   (19)

(Q_{sy} \xi =)\ Q_{ss} u = M^T \bar{Q} M u = M^T F^{[-d]}\big( F^{[d]}(\bar{q}) \circ F^{[d]}(M u) \big).   (20)
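Eq. (20) can be sketched in 1D with NumPy's FFT: embed the symmetric Toeplitz Q_ss (first column q) into a circulant of size 2n-2, zero-pad u (this is M u), multiply in Fourier space, and extract the first n entries (this is M^T). The Gaussian-shaped first column is an illustrative assumption:

```python
import numpy as np

n = 6
x = np.arange(n, dtype=float)
q = np.exp(-0.5 * x**2)                  # first column of symmetric Toeplitz Q_ss
Q_ss = np.array([[q[abs(i - j)] for j in range(n)] for i in range(n)])

q_bar = np.concatenate([q, q[-2:0:-1]])  # circulant generator, length 2n - 2
u = np.random.default_rng(8).standard_normal(n)
u_pad = np.concatenate([u, np.zeros(n - 2)])     # M u: zero-padding

# Circulant matvec via FFT, then extraction of the leading n entries (M^T).
w = np.fft.ifft(np.fft.fft(q_bar) * np.fft.fft(u_pad)).real[:n]
```

This replaces the O(n^2) product Q_ss u by FFTs of length 2n-2.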



Tensor properties of FFT

Lemma. Let u = U(:) = \sum_{j=1}^{k_u} \bigotimes_{i=1}^{d} u_{ji}, where u \in R^n, u_{ji} \in R^{n_i} and n = \prod_{i=1}^{d} n_i. Then the d-dimensional Fourier transform \bar{u} = F^{[d]}(u) is given by

\bar{u} = F^{[d]}(u) = \Big( \bigotimes_{i=1}^{d} F_i \Big) \sum_{j=1}^{k_u} \Big( \bigotimes_{i=1}^{d} u_{ji} \Big) = \sum_{j=1}^{k_u} \bigotimes_{i=1}^{d} F_i(u_{ji}).   (21)



FFT, Hadamard and Kronecker products

Lemma. If

u \circ q = \Big( \sum_{j=1}^{k_u} \bigotimes_{i=1}^{d} u_{ji} \Big) \circ \Big( \sum_{\ell=1}^{k_q} \bigotimes_{i=1}^{d} q_{\ell i} \Big) = \sum_{j=1}^{k_u} \sum_{\ell=1}^{k_q} \bigotimes_{i=1}^{d} (u_{ji} \circ q_{\ell i}),

then

F^{[d]}(u \circ q) = \sum_{j=1}^{k_u} \sum_{\ell=1}^{k_q} \bigotimes_{i=1}^{d} F_i(u_{ji} \circ q_{\ell i}).



Accuracy

Lemma. If

\|q - q^{(k_q)}\|_F \le \varepsilon_q, \quad \frac{\|q - q^{(k_q)}\|_F}{\|q\|_F} \le \varepsilon_{rel,q},

then

\|Q_{ss} - Q_{ss}^{(k_q)}\|_F \le \sqrt{n}\, \varepsilon_q, \quad \frac{\|Q_{ss} - Q_{ss}^{(k_q)}\|_F}{\|Q_{ss}\|_F} \le \varepsilon_{rel,q}.



d-dimensional embedding/extraction

We have

M^{[d]} = \bigotimes_{i=1}^{d} M_i.

Then

\bar{u} = M^{[d]} u = \Big( \bigotimes_{i=1}^{d} M_i \Big) \cdot \sum_{j=1}^{k_u} \bigotimes_{i=1}^{d} u_{ji} = \sum_{j=1}^{k_u} \bigotimes_{i=1}^{d} M_i u_{ji} =: \sum_{j=1}^{k_u} \bigotimes_{i=1}^{d} \bar{u}_{ji}.



Lemma: Low-rank kriging estimate Qssu

Q_{ss} u \approx Q_{ss}^{(k_q)} u^{(k_u)} = \sum_{\ell=1}^{k_q} \sum_{j=1}^{k_u} \bigotimes_{i=1}^{d} M_i^T F_i^{-1}\big[ (F_i \bar{q}_{\ell i}) \circ (F_i \bar{u}_{ji}) \big],   (22)

with accuracy

\|Q_{ss} u - Q_{ss}^{(k_q)} u^{(k_u)}\|_F \le \sqrt{n}\, \varepsilon_q \|u\| + \|Q_{ss}\| \cdot \varepsilon_u.   (23)

The total cost is O(k_u k_q\, d\, L^* \log L^*) instead of O(n \log n), with L^* = \max_{i=1...d} n_i and n = \prod_{i=1}^{d} n_i.



Numerics: an example from geology

Domain: 20 m x 20 m x 20 m with 25,000 x 25,000 x 25,000 dofs; 4,000 measurements randomly distributed within the volume, with increasing data density towards the lower left back corner of the domain. The covariance model is anisotropic Gaussian with unit variance, with 32 correlation lengths fitting into the domain in the horizontal directions and 64 correlation lengths fitting into the vertical direction.



Example 6: Kriging

The top left figure shows the entire domain at a sampling rate of 1:64 per direction, followed by a series of zooms into the respective lower left back corner with zoom factors (sampling rates) of 4 (1:16), 16 (1:4) and 64 (1:1) for the top right, bottom left and bottom right plots, respectively. Color scale: the 95% confidence interval [\mu - 2\sigma, \mu + 2\sigma].


Lecture 4: Definitions, properties and applications



Definitions of CP

Let T := \bigotimes_{\mu=1}^{d} R^{n_\mu} be the tensor product space constructed from the vector spaces (R^{n_\mu}, \langle \cdot, \cdot \rangle_{R^{n_\mu}}) (d \ge 3). A tensor representation U is a multilinear map U : P \to T, where the parametric space is P = \times_{\nu=1}^{D} P_\nu (d \le D). Further, each P_\nu depends on some representation rank parameter r_\nu \in N. A standard example of a tensor representation is the canonical tensor format.

(!!!) We distinguish between a tensor v \in T and its tensor format representation p \in P, where v = U(p).



r-Terms, Tensor Rank, Canonical Tensor Format

The set R_r of tensors which can be represented in T with r terms is defined as

R_r(T) := R_r := \Big\{ \sum_{i=1}^{r} \bigotimes_{\mu=1}^{d} v_{i\mu} \in T : v_{i\mu} \in R^{n_\mu} \Big\}.   (24)

Let v \in T. The tensor rank of v in T is

rank(v) := \min \{ r \in N_0 : v \in R_r \}.   (25)

Example: the Laplace operator in 3D:

\Delta_3 = \Delta_1 \otimes I \otimes I + I \otimes \Delta_1 \otimes I + I \otimes I \otimes \Delta_1.
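The rank-3 Kronecker structure of the 3D Laplacian can be built and checked directly; the grid size and the tridiag{-1, 2, -1} stencil for Delta_1 are as on the later slide about the d-dimensional Laplacian:

```python
import numpy as np

N = 4
I = np.eye(N)
# 1D Laplacian stencil tridiag{-1, 2, -1}.
D1 = 2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)

# Delta_3 = D1 ⊗ I ⊗ I + I ⊗ D1 ⊗ I + I ⊗ I ⊗ D1: a canonical rank-3 sum.
D3 = (np.kron(np.kron(D1, I), I)
      + np.kron(np.kron(I, D1), I)
      + np.kron(np.kron(I, I), D1))
```

A useful consequence of the Kronecker-sum structure: the eigenvalues of D3 are all sums of triples of eigenvalues of D1, which the test verifies.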



Definitions of CP

The canonical tensor format is defined by the mapping

U_{cp} : \times_{\mu=1}^{d} R^{n_\mu \times r} \to R_r,   (26)

v := (v_{i\mu} : 1 \le i \le r,\ 1 \le \mu \le d) \mapsto U_{cp}(v) := \sum_{i=1}^{r} \bigotimes_{\mu=1}^{d} v_{i\mu}.



Properties of CP

Lemma. Let r_1, r_2 \in N, u \in R_{r_1} and v \in R_{r_2}. We have:

(i) \langle u, v \rangle_T = \sum_{j_1=1}^{r_1} \sum_{j_2=1}^{r_2} \prod_{\mu=1}^{d} \langle u_{j_1 \mu}, v_{j_2 \mu} \rangle_{R^{n_\mu}}. The computational cost of \langle u, v \rangle_T is O(r_1 r_2 \sum_{\mu=1}^{d} n_\mu).

(ii) u + v \in R_{r_1 + r_2}.

(iii) u \odot v \in R_{r_1 r_2}, where \odot denotes the pointwise Hadamard product. Further, u \odot v can be computed in the canonical tensor format with r_1 r_2 \sum_{\mu=1}^{d} n_\mu arithmetic operations.

Similarly for matrices: let R_1 = A_1 B_1^T and R_2 = A_2 B_2^T be rank-k matrices; then R_1 + R_2 = [A_1\ A_2][B_1\ B_2]^T is a rank-2k matrix. Rank truncation is required!



Properties of CP, Hadamard product and FT

Let u = \sum_{j=1}^{k} \bigotimes_{i=1}^{d} u_{ji}, u_{ji} \in R^n. Then

F^{[d]}(u) = \sum_{j=1}^{k} \bigotimes_{i=1}^{d} F_i(u_{ji}), \quad \text{where } F^{[d]} = \bigotimes_{i=1}^{d} F_i.   (27)

Let S = AB^T = \sum_{i=1}^{k_1} a_i b_i^T \in R^{n \times m} and T = CD^T = \sum_{j=1}^{k_2} c_j d_j^T \in R^{n \times m}, where a_i, c_j \in R^n, b_i, d_j \in R^m, k_1, k_2, n, m > 0. Then

F^{(2)}(S \circ T) = \sum_{i=1}^{k_1} \sum_{j=1}^{k_2} F(a_i \circ c_j)\, \big(F(b_i \circ d_j)\big)^T.



Tucker Tensor Format

A = \sum_{i_1=1}^{k_1} \cdots \sum_{i_d=1}^{k_d} c_{i_1,...,i_d} \cdot u^{1}_{i_1} \otimes \cdots \otimes u^{d}_{i_d}   (28)

Core tensor c \in R^{k_1 \times ... \times k_d}, rank (k_1, ..., k_d).

Nonlinear fixed-rank approximation problem:

X = \operatorname{argmin}_{rank(Y) \le (k_1, ..., k_d)} \|A - Y\|   (29)

I The problem is well-posed, but has no closed-form solution
I There are many local minima
I HOSVD (De Lathauwer et al.) yields a rank-(k_1, ..., k_d) tensor Y with \|A - Y\| \le \sqrt{d}\, \|A - X\|
I reliable arithmetic, but exponential scaling of the core (c \in R^{k_1 \times k_2 \times ... \times k_d})
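An HOSVD sketch for a 3rd-order tensor in NumPy: a truncated SVD of each mode unfolding gives the factor matrices, and the core is the projection of A onto them. The test tensor is built with exact multilinear rank (4, 4, 4), so the truncated HOSVD reproduces it exactly; sizes and ranks are illustrative:

```python
import numpy as np

rng = np.random.default_rng(9)
n, k = 10, 4
# A tensor with multilinear rank at most (k, k, k): a sum of k rank-1 terms.
A = np.einsum('ia,ja,ka->ijk',
              *(rng.standard_normal((n, k)) for _ in range(3)))

U = []
for mu in range(3):
    unf = np.moveaxis(A, mu, 0).reshape(n, -1)        # mode-mu unfolding
    U.append(np.linalg.svd(unf, full_matrices=False)[0][:, :k])

core = np.einsum('ijk,ia,jb,kc->abc', A, U[0], U[1], U[2])
A_hat = np.einsum('abc,ia,jb,kc->ijk', core, U[0], U[1], U[2])
```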



Example: Canonical rank d , whereas TT rank 2

Consider the d-dimensional Laplacian over a uniform tensor grid. It is known to have the Kronecker rank-d representation

\Delta_d = A \otimes I_N \otimes \cdots \otimes I_N + I_N \otimes A \otimes \cdots \otimes I_N + \cdots + I_N \otimes I_N \otimes \cdots \otimes A \in R^{N^d \times N^d},   (30)

with A = \Delta_1 = tridiag\{-1, 2, -1\} \in R^{N \times N} and I_N the N \times N identity. Notice that for the canonical rank we have rank_C(\Delta_d) = d, while the TT-rank of \Delta_d is equal to 2 for any dimension, due to the explicit representation

\Delta_d = (\Delta_1\ \ I) \times \begin{pmatrix} I & 0 \\ \Delta_1 & I \end{pmatrix} \times \cdots \times \begin{pmatrix} I & 0 \\ \Delta_1 & I \end{pmatrix} \times \begin{pmatrix} I \\ \Delta_1 \end{pmatrix},   (31)

where the rank-product operation "\times" is defined as a regular matrix product of the two corresponding core matrices, their blocks being multiplied by means of the tensor product. A similar bound is true for the Tucker rank: rank_{Tuck}(\Delta_d) = 2.



Advantages and disadvantages

Denote by k the rank, d the dimension, and n the number of dofs in 1D:

1. CP: ill-posed approximation problem, storage O(dnk), approximations hard to compute
2. Tucker: reliable arithmetic based on SVD, O(dnk + k^d)
3. Hierarchical Tucker: based on SVD, storage O(dnk + dk^3), truncation O(dnk^2 + dk^4)
4. TT: based on SVD, O(dnk^2) or O(dnk^3), stable
5. Quantics-TT: O(n^d) \to O(d \log_q n)



POSTPROCESSING

Postprocessing in the canonical tensor format.

Plan: we want to compute frequencies and level sets (needed for the probability density function). An intermediate step: compute the sign function and then the characteristic function \chi_I(u).



Characteristic, Sign

Definition. The characteristic \chi_I(u) \in T of u \in T in I \subset R is, for every multi-index i \in \mathcal{I}, pointwise defined as

(\chi_I(u))_i := \begin{cases} 1, & u_i \in I; \\ 0, & u_i \notin I. \end{cases}   (32)

Furthermore, sign(u) \in T is for all i \in \mathcal{I} pointwise defined by

(sign(u))_i := \begin{cases} 1, & u_i > 0; \\ -1, & u_i < 0; \\ 0, & u_i = 0. \end{cases}   (33)



How to compute the characteristic function ?

Lemma: Let u \in T, a, b \in R, and 1 = \bigotimes_{\mu=1}^{d} 1_\mu, where 1_\mu := (1, ..., 1)^T \in R^{n_\mu}.

(i) If I = R_{<b}, then \chi_I(u) = \frac{1}{2}\big(1 + sign(b\,1 - u)\big).
(ii) If I = R_{>a}, then \chi_I(u) = \frac{1}{2}\big(1 - sign(a\,1 - u)\big).
(iii) If I = (a, b), then \chi_I(u) = \frac{1}{2}\big(sign(b\,1 - u) - sign(a\,1 - u)\big).



Definition of Level Set and Frequency

Definition. Let I \subset R and u \in T. The level set L_I(u) \in T of u with respect to I is pointwise defined by

(L_I(u))_i := \begin{cases} u_i, & u_i \in I; \\ 0, & u_i \notin I, \end{cases}   (34)

for all i \in \mathcal{I}. The frequency F_I(u) \in N of u with respect to I is defined as

F_I(u) := \#\, supp\, \chi_I(u).   (35)



Computation of level sets and frequency

Proposition. Let I \subset R, u \in T, and \chi_I(u) its characteristic. We have

L_I(u) = \chi_I(u) \odot u,

and rank(L_I(u)) \le rank(\chi_I(u))\, rank(u). The frequency F_I(u) \in N of u with respect to I is

F_I(u) = \langle \chi_I(u), 1 \rangle,

where 1 = \bigotimes_{\mu=1}^{d} 1_\mu, 1_\mu := (1, ..., 1)^T \in R^{n_\mu}.



How to compute the mean value in CP format

Let u = \sum_{j=1}^{r} \bigotimes_{\mu=1}^{d} u_{j\mu} \in R_r; then the mean value \bar{u} can be computed as a scalar product:

\bar{u} = \Big\langle \sum_{j=1}^{r} \bigotimes_{\mu=1}^{d} u_{j\mu},\ \bigotimes_{\mu=1}^{d} \frac{1}{n_\mu} 1_\mu \Big\rangle = \sum_{j=1}^{r} \prod_{\mu=1}^{d} \frac{\langle u_{j\mu}, 1_\mu \rangle}{n_\mu} = \sum_{j=1}^{r} \prod_{\mu=1}^{d} \frac{1}{n_\mu} \Big( \sum_{k=1}^{n_\mu} (u_{j\mu})_k \Big),   (36, 37)

where 1_\mu := (1, ..., 1)^T \in R^{n_\mu}. The numerical cost is O\big(r \cdot \sum_{\mu=1}^{d} n_\mu\big).
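The mean of a CP tensor thus reduces to means of the factor vectors, without ever forming the full tensor. A minimal sketch (sizes and ranks are illustrative; the dense `full` tensor is built only to check the result):

```python
import numpy as np

rng = np.random.default_rng(11)
d, r = 3, 4
n = [5, 6, 7]
factors = [rng.standard_normal((n[mu], r)) for mu in range(d)]

# u_bar = sum_j prod_mu ( mean of factor vector u_{j, mu} ),
# cost O(r * sum(n_mu)).
u_bar = np.prod([f.mean(axis=0) for f in factors], axis=0).sum()
```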



How to compute the variance in CP format

Let u \in R_r and

\tilde{u} := u - \bar{u} \bigotimes_{\mu=1}^{d} \frac{1}{n_\mu} 1_\mu = \sum_{j=1}^{r+1} \bigotimes_{\mu=1}^{d} \tilde{u}_{j\mu} \in R_{r+1};   (38)

then the variance var(u) of u can be computed as follows:

var(u) = \frac{\langle \tilde{u}, \tilde{u} \rangle}{\prod_{\mu=1}^{d} n_\mu} = \frac{1}{\prod_{\mu=1}^{d} n_\mu} \Big\langle \sum_{i=1}^{r+1} \bigotimes_{\mu=1}^{d} \tilde{u}_{i\mu},\ \sum_{j=1}^{r+1} \bigotimes_{\nu=1}^{d} \tilde{u}_{j\nu} \Big\rangle = \sum_{i=1}^{r+1} \sum_{j=1}^{r+1} \prod_{\mu=1}^{d} \frac{1}{n_\mu} \langle \tilde{u}_{i\mu}, \tilde{u}_{j\mu} \rangle.

The numerical cost is O\big((r + 1)^2 \cdot \sum_{\mu=1}^{d} n_\mu\big).



Conclusion

Today we discussed:

I Many examples of why we need low-rank tensors
I Tensors of order two (matrices)
I Computation of the mean and the variance
I CP, Tucker and tensor train (TT) tensor formats
I Many classical kernels have (or can be approximated in) a low-rank tensor format
I An example from kriging



Tensor Software

I. Oseledets et al., Tensor Train Toolbox (Matlab), http://spring.inm.ras.ru/osel
D. Kressner, C. Tobler, Hierarchical Tucker Toolbox (Matlab), http://www.sam.math.ethz.ch/NLAgroup/htucker_toolbox.html
M. Espig et al., Tensor Calculus library (C): http://gitorious.org/tensorcalculus
