Perturbation analysis of matrix optimization · 2019-08-13


Transcript of “Perturbation analysis of matrix optimization”

Page 1

Perturbation analysis of matrix optimization

Chao Ding

Institute of Applied Mathematics

Academy of Mathematics and Systems Science, CAS

ICCOPT2019, Berlin

2019.08.06

Page 2

Acknowledgements

Based on joint work with Ying Cui at USC:

• Nonsmooth composite matrix optimizations: strong regularity, constraint nondegeneracy and beyond, arXiv:1907.13253 (July 2019).

Page 3

Nonsmooth Composite Matrix Optimization Problem

CMatOP:

  minimize_{x∈X}  Φ(x) := f(x) + φ ◦ λ(g(x))
  subject to  h(x) = 0,

• X and Y: two given finite-dimensional Euclidean spaces
• f : X → R, g : X → Sn and h : X → Y: twice continuously differentiable functions
• φ : Rn → (−∞,+∞]: a symmetric function, i.e., for any u ∈ Rn and any n × n permutation matrix P, φ(Pu) = φ(u)
• λ(·): the vector of eigenvalues of a symmetric matrix

⋆ We focus on the symmetric case just for simplicity;
⋆ The obtained results can be extended to non-symmetric cases;
⋆ This is a general model which includes many “non-polyhedral” optimization problems: SDP, eigenvalue optimization, etc.
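To make the composite structure concrete, here is a minimal numerical sketch of evaluating Φ(x) = f(x) + φ ◦ λ(g(x)) for a toy instance. All problem data below (the quadratic f, the affine map g into S², and the choice φ = max, which makes φ ◦ λ the largest-eigenvalue function) is illustrative, not from the talk.

```python
import numpy as np

def f(x):
    # smooth term: a simple quadratic (illustrative choice)
    return 0.5 * float(x @ x)

def g(x):
    # twice continuously differentiable (here: affine) map from R^2 into S^2
    return np.array([[x[0], x[1]],
                     [x[1], 1.0 - x[0]]])

def phi(u):
    # symmetric convex piecewise linear function phi(u) = max_i u_i:
    # invariant under any permutation of the entries of u
    return float(np.max(u))

def Phi(x):
    # composite objective Phi(x) = f(x) + phi(lambda(g(x)))
    lam = np.linalg.eigvalsh(g(x))  # eigenvalues of the symmetric matrix g(x)
    return f(x) + phi(lam)

x = np.array([0.5, 0.0])
# g(x) = diag(0.5, 0.5), so phi(lambda(g(x))) = 0.5 and f(x) = 0.125
print(Phi(x))  # 0.625
```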


Page 11

More applications

• Fastest mixing Markov chain problem (fast load balancing of parallel systems)
• Fastest distributed linear averaging problem
• Reduced-rank approximations of transition matrices
• Low-rank approximations of doubly stochastic matrices
• Low-rank approximation of matrices with linear structures
• Unsupervised learning
• ……

Page 12

Spectral functions

φ ◦ λ: a spectral function (Friedland, 1981)

• φ : Rn → (−∞,+∞] is a symmetric convex piecewise linear function
• a convex piecewise linear function: a polyhedral convex function (Rockafellar, 1970)


Page 15

Convex piecewise linear functions

Theorem (Rockafellar & Wets, 1998)
φ can be expressed in the form

  φ(x) = φ1(x) + φ2(x),  x ∈ Rn,

where φ1 : Rn → R and φ2 : Rn → (−∞,+∞] are defined by

  φ1(x) := max_{1≤i≤p} {〈ai, x〉 − ci}  and  φ2(x) := δ_{dom φ}(x),

• a1, …, ap ∈ Rn, c1, …, cp ∈ R for some positive integer p ≥ 1;
• dom φ is a polyhedral set:

  dom φ := {x ∈ Rn | max_{1≤i≤q} {〈bi, x〉 − di} ≤ 0}

• b1, …, bq ∈ Rn and d1, …, dq ∈ R for some positive integer q ≥ 1.
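The decomposition above is easy to evaluate numerically: φ1 is a max of affine forms, and φ2 is the indicator of the polyhedron dom φ. A minimal sketch, with illustrative data (rows of A are the vectors aᵢ, rows of B are the bᵢ):

```python
import numpy as np

def phi(x, A, c, B, d):
    """phi(x) = phi1(x) + phi2(x) as in the Rockafellar & Wets decomposition.

    A: p x n matrix of rows a_1..a_p;  c: length-p vector
    B: q x n matrix of rows b_1..b_q;  d: length-q vector
    """
    phi1 = np.max(A @ x - c)                 # max_i <a_i, x> - c_i
    in_dom = np.max(B @ x - d) <= 1e-12      # polyhedral domain test
    return phi1 if in_dom else np.inf        # phi2 = indicator of dom(phi)

# phi1(x) = max(x1, x2) with dom(phi) the box {x : |x_i| <= 1} in R^2
# (both pieces are permutation-invariant, so this phi is symmetric)
A = np.eye(2); c = np.zeros(2)
B = np.vstack([np.eye(2), -np.eye(2)]); d = np.ones(4)
print(phi(np.array([0.3, -0.2]), A, c, B, d))  # 0.3
print(phi(np.array([2.0, 0.0]), A, c, B, d))   # inf
```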

Page 16

Examples

SDP:

  Sn− = {X ∈ Sn | λmax(X) ≤ 0} = {X ∈ Sn | max_{1≤i≤n} {〈ei, λ(X)〉} ≤ 0}

• ei ∈ Rn: the canonical basis vectors of Rn

  g(x) ∈ Sn− ⇐⇒ φ2 ◦ λ(g(x)) = δ_{dom φ}(λ(g(x)))

Eigenvalue optimization:

  sk(X) = Σ_{i=1}^{k} λi(X) = max_{1≤i≤p} {〈ai, λ(X)〉}

• ai ∈ Rn: a vector containing k ones and n − k zeros
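The eigenvalue-optimization example, sk(X) = sum of the k largest eigenvalues, can be checked in a few lines (toy data, for illustration):

```python
import numpy as np

def s_k(X, k):
    # sum of the k largest eigenvalues of a symmetric matrix X,
    # i.e. the max of <a_i, lambda(X)> over all 0/1 vectors with k ones
    lam = np.linalg.eigvalsh(X)      # eigenvalues in ascending order
    return float(np.sum(lam[-k:]))   # take the k largest

X = np.diag([3.0, 1.0, -2.0])
print(s_k(X, 2))   # 4.0  (= 3 + 1)
print(s_k(X, 1))   # 3.0  (= lambda_max)
```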


Page 21

Perturbation analysis of CMatOPs

Canonically perturbed CMatOPs with parameters (a, b, c) ∈ X × Y × Sn:

  minimize_{x∈X}  f(x) − 〈a, x〉 + φ ◦ λ(g(x) + c)
  subject to  h(x) + b = 0

The Karush–Kuhn–Tucker (KKT) optimality conditions for the perturbed problem take the following form:

  a = ∇f(x) + h′(x)∗y + g′(x)∗Y + g′(x)∗Z
  b = −h(x)
  c ∈ −g(x) + ∂θ1∗(Y)
  c ∈ −g(x) + ∂θ2∗(Z)

where θ1 = φ1 ◦ λ and θ2 = φ2 ◦ λ are two spectral functions.

Strong regularity:
When is the solution mapping SKKT(a, b, c) Lipschitz continuous?


Page 24

Why does it matter?

• Perturbation theory
• Algorithms


Page 27

How?

Variational analysis, but in a slightly different way:

Variational analysis of spectral functions

• Tangent sets
• Critical cones
• Second-order tangent sets
• The “σ-term”: the key difference between NLPs (polyhedral) and CMatOPs (non-polyhedral)


Page 35

The “σ-term”: polyhedral =⇒ non-polyhedral


Page 36

The “σ-term”: polyhedral =⇒ non-polyhedral (cont’d)

Metric projection operator ΠK:

  A := ΠK(C) := argmin { (1/2)‖Y − C‖² | Y ∈ K }

If K is a polyhedral closed convex set,

• ΠK is directionally differentiable (Facchinei & Pang, 2003)¹:

  ΠK(C + H) − ΠK(C) = Π_{CK(C)}(H) =: Π′K(C; H)  ∀ H

• CK(C) is the critical cone of K at C

¹ F. Facchinei and J. S. Pang. Finite-Dimensional Variational Inequalities and Complementarity Problems: Volume I, Springer-Verlag, New York, 2003.
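The polyhedral identity ΠK(C + H) − ΠK(C) = Π_{CK(C)}(H) can be checked numerically for the simplest polyhedral cone, K = R^n_+, where everything is componentwise. A minimal sketch (the identity is verified here for a perturbation H small enough that no component of C changes sign; the data is illustrative):

```python
import numpy as np

# K = R^n_+ is polyhedral; Pi_K(C) = max(C, 0) componentwise.
def proj(C):
    return np.maximum(C, 0.0)

def proj_critical_cone(C, H):
    # Projection of H onto the critical cone C_K(C) of R^n_+ at C,
    # componentwise: free if C_i > 0, {0} if C_i < 0, R_+ if C_i = 0
    # (exact comparison with 0 is fine for this hand-picked data).
    D = np.where(C > 0, H, 0.0)
    D = np.where(C == 0, np.maximum(H, 0.0), D)
    return D

C = np.array([2.0, -1.0, 0.0])
H = np.array([0.3, 0.2, -0.5])   # small: no component of C changes sign
lhs = proj(C + H) - proj(C)
rhs = proj_critical_cone(C, H)
print(np.allclose(lhs, rhs))  # True
```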


Page 40

The “σ-term”: polyhedral =⇒ non-polyhedral (cont’d)

If K is a non-polyhedral closed convex set but C²-cone reducible,

• ΠK is directionally differentiable and Π′K(C; H) is the unique optimal solution of (Bonnans et al., 1998)²:

  min { ‖D − H‖² − σ(B, T²K(A, D)) | D ∈ CK(C) }

• B := C − A and σ(B, T²K(A, D)) is the “σ-term” of K

polyhedral:      min ‖D − H‖²                       s.t. D ∈ CK(C)
non-polyhedral:  min ‖D − H‖² − σ(B, T²K(A, D))     s.t. D ∈ CK(C)

² J.F. Bonnans, R. Cominetti and A. Shapiro. Sensitivity analysis of optimization problems under second order regular constraints. Mathematics of Operations Research 23 (1998) 806–831.
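The prototypical non-polyhedral C²-cone-reducible set here is the SDP cone Sn− from the examples. Its metric projection, and the decomposition C = A + B entering the σ-term, have a simple spectral form: clip the eigenvalues of C at zero from above. A sketch (the σ-term itself has a known closed form for this cone but is not reproduced here; the matrix below is illustrative):

```python
import numpy as np

def proj_Sn_minus(C):
    # Metric projection onto S^n_- = {X symmetric : lambda_max(X) <= 0}:
    # eigendecompose C and keep only the nonpositive part of the spectrum
    lam, U = np.linalg.eigh(C)
    return (U * np.minimum(lam, 0.0)) @ U.T

C = np.array([[1.0, 0.0],
              [0.0, -2.0]])
A = proj_Sn_minus(C)   # A = diag(0, -2), the projection Pi_K(C)
B = C - A              # B = diag(1, 0), the part entering sigma(B, .)
print(np.round(A, 6))
```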


Page 45

Convex piecewise linear + Symmetric

(Rockafellar & Wets, 1998): φ = φ1 + φ2 with φ2 = δ_{dom φ},

  φ1(x) = max_{1≤i≤p} {〈ai, x〉 − ci},  dom φ = {x ∈ Rn | max_{1≤i≤q} {〈bi, x〉 − di} ≤ 0}

Proposition
Let φ = φ1 + φ2 : Rn → (−∞,∞] be a given proper convex piecewise linear function. Then φ is symmetric over Rn if and only if the functions φ1 : Rn → R and φ2 : Rn → (−∞,∞] satisfy the following conditions: for any x ∈ Rn,

  φ1(x) = max_{1≤i≤p} { max_{Q∈Pn} {〈Qai, x〉 − ci} }  and  φ2(x) = δ_{dom φ}(x),

where dom φ = {x ∈ Rn | max_{1≤i≤q} { max_{Q∈Pn} {〈Qbi, x〉 − di} } ≤ 0}.
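The symmetry condition φ(Pu) = φ(u) for all permutation matrices P can be tested by brute force for small n. A sketch (the checker and sample functions are illustrative; for n = 3 there are only 3! = 6 permutations to enumerate):

```python
import itertools
import numpy as np

def is_symmetric_plf(phi, n, trials=100, seed=0):
    # Check phi(Pu) == phi(u) for every n x n permutation matrix P,
    # on randomly sampled points u (permuting entries of u directly)
    rng = np.random.default_rng(seed)
    for _ in range(trials):
        u = rng.standard_normal(n)
        vals = {round(phi(u[list(perm)]), 12)
                for perm in itertools.permutations(range(n))}
        if len(vals) > 1:   # some permutation changed the value
            return False
    return True

# phi(u) = max_i u_i is symmetric; phi(u) = u_1 is not
print(is_symmetric_plf(lambda u: float(np.max(u)), 3))   # True
print(is_symmetric_plf(lambda u: float(u[0]), 3))        # False
```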


Page 48

Convex piecewise linear + Symmetric (cont’d)

• For i = 1, …, p, define

  Di := {x ∈ dom φ | 〈aj, x〉 − cj ≤ 〈ai, x〉 − ci  ∀ j = 1, …, p},

  then dom φ = ⋃_{i=1,…,p} Di

• For any x ∈ dom φ, we have two active index sets:

  ι1(x) := {1 ≤ i ≤ p | x ∈ Di},  ι2(x) := {1 ≤ i ≤ q | 〈bi, x〉 − di = 0}.

Proposition
For any i ∈ ι1(x), j ∈ ι2(x) and Q ∈ Pn_x (i.e., Qx = x), there exist i′ ∈ ι1(x) and j′ ∈ ι2(x) such that ai′ = Qai and bj′ = Qbj, respectively.
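Computing the active index sets ι1(x) and ι2(x) is a direct translation of their definitions (with a numerical tolerance standing in for exact equality; the data below is illustrative):

```python
import numpy as np

def active_sets(x, A, c, B, d, tol=1e-10):
    # iota_1(x): indices attaining the max defining phi1 (i.e. x in D_i);
    # iota_2(x): active constraints of dom(phi).
    # Rows of A are the a_i, rows of B are the b_i.
    vals = A @ x - c
    iota1 = set(np.flatnonzero(vals >= vals.max() - tol))
    iota2 = set(np.flatnonzero(np.abs(B @ x - d) <= tol))
    return iota1, iota2

# phi1(x) = max(x1, x2), dom(phi) = {x : x1 + x2 <= 1}
A = np.eye(2); c = np.zeros(2)
B = np.array([[1.0, 1.0]]); d = np.array([1.0])
# at x = (0.5, 0.5): both affine pieces are active, and the domain
# constraint holds with equality
print(active_sets(np.array([0.5, 0.5]), A, c, B, d))  # ({0, 1}, {0})
```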


Page 50

Convex piecewise linear + Symmetric (cont’d)

Rockafellar & Wets, 1998; Mordukhovich & Sarabi, 2016:

• the subdifferentials:

  ∂φ1(x) = conv{ai, i ∈ ι1(x)},  ∂φ2(x) = N_{dom φ}(x) = cone{bi, i ∈ ι2(x)}

Since φ1(x) = max_{1≤i≤p} {〈ai, x〉 − ci} is finite everywhere,
• φ1 is directionally differentiable
• the directional derivative:

  φ1′(x; h) = max_{i∈ι1(x)} 〈ai, h〉,  h ∈ Rn.

Let ψ(x) := max_{1≤i≤q} {〈bi, x〉 − di}. Then dom φ = {x ∈ Rn | ψ(x) ≤ 0}
• ψ is directionally differentiable
• the directional derivative:

  ψ′(x; h) = max_{i∈ι2(x)} 〈bi, h〉,  h ∈ Rn.
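The formula φ1′(x; h) = max over active i of 〈ai, h〉 is easy to sanity-check against a one-sided finite difference, since for small t > 0 the max is attained among the active pieces. A sketch with illustrative data (at x = (1, 1), both pieces of max(x1, x2) are active):

```python
import numpy as np

def phi1(x, A, c):
    return float(np.max(A @ x - c))

def dir_deriv(x, h, A, c, tol=1e-10):
    # phi1'(x; h) = max over active indices i of <a_i, h>
    vals = A @ x - c
    active = vals >= vals.max() - tol
    return float(np.max((A @ h)[active]))

A = np.eye(2); c = np.zeros(2)       # phi1(x) = max(x1, x2)
x = np.array([1.0, 1.0]); h = np.array([0.3, -0.7])
t = 1e-6
fd = (phi1(x + t * h, A, c) - phi1(x, A, c)) / t   # finite-difference check
print(dir_deriv(x, h, A, c), round(fd, 6))          # 0.3 0.3
```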


Tangent sets

For θ1 = φ1 ◦ λ:

• Tangent set of the epigraph:

  T_{epi θ1}(X, θ1(X)) = epi θ′1(X; ·) := { (H, y) ∈ Sn × R | θ′1(X; H) ≤ y }

• The lineality space:

  T^lin_θ1(X) := { H ∈ Sn | θ′1(X; H) = −θ′1(X; −H) }

Proposition
H ∈ T^lin_θ1(X) if and only if 〈z, λ′(X; H)〉 is constant over z ∈ ∂φ1(λ(X)), i.e.,

  〈λ′(X; H), ai − aj〉 = 0   ∀ i, j ∈ ι1(λ(X)).

15


Tangent sets (cont'd)

For θ2 = φ2 ◦ λ:

• θ2 = δ_K with

  K = { X ∈ Sn | λ(X) ∈ dom φ } = { X ∈ Sn | ζ(X) ≤ 0 },  where ζ = ψ ◦ λ

• Tangent set of K:

  T_K(X) = { H ∈ Sn | ζ′(X; H) ≤ 0 } = { H ∈ Sn | 〈bi, λ′(X; H)〉 ≤ 0 ∀ i ∈ ι2(λ(X)) }

• The lineality space:

  lin(T_K(X)) = { H ∈ Sn | ζ′(X; H) = −ζ′(X; −H) = 0 }

Proposition
H ∈ lin(T_K(X)) if and only if 〈bi, λ′(X; H)〉 = 0 for any i ∈ ι2(λ(X)).

16


Tangent sets: SDP

Sn− = { X ∈ Sn | λmax(X) ≤ 0 } = { X ∈ Sn | max_{1≤i≤n} 〈ei, λ(X)〉 ≤ 0 }

  X = V diag( 0_α, 0_β, Λ_γ(X) ) V^T,    ι2(λ(X)) = α ∪ β =: γ̄

T_{Sn−}(X) = { H ∈ Sn | 〈ei, λ′(X; H)〉 ≤ 0 ∀ i ∈ ι2(λ(X)) } = { H ∈ Sn | V_γ̄^T H V_γ̄ ⪯ 0 }

lin(T_{Sn−}(X)) = { H ∈ Sn | 〈ei, λ′(X; H)〉 = 0 ∀ i ∈ ι2(λ(X)) } = { H ∈ Sn | V_γ̄^T H V_γ̄ = 0 }

17
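A quick numerical illustration of the second characterizations (a sketch with toy matrices, not from the paper; V0 below plays the role of V_γ̄, an orthonormal basis of the kernel of X ⪯ 0):

```python
import numpy as np

def kernel_basis(X, tol=1e-8):
    """Orthonormal eigenvectors of X for (numerically) zero eigenvalues."""
    w, V = np.linalg.eigh(X)
    return V[:, np.abs(w) <= tol]

def in_tangent_cone(X, H, tol=1e-8):
    """H in T_{S^n_-}(X)  <=>  V0^T H V0 is negative semidefinite."""
    V0 = kernel_basis(X)
    return np.linalg.eigvalsh(V0.T @ H @ V0).max() <= tol

def in_lineality_space(X, H, tol=1e-8):
    """H in lin(T_{S^n_-}(X))  <=>  V0^T H V0 = 0."""
    V0 = kernel_basis(X)
    return np.abs(V0.T @ H @ V0).max() <= tol

X = np.diag([0.0, 0.0, -1.0])        # X <= 0, kernel = first two coordinates
H1 = np.diag([-1.0, 0.0, 5.0])       # V0^T H1 V0 = diag(-1, 0), PSD-negative
H2 = np.diag([1.0, 0.0, 0.0])        # V0^T H2 V0 has eigenvalue 1 > 0

print(in_tangent_cone(X, H1))        # True
print(in_tangent_cone(X, H2))        # False
print(in_lineality_space(X, np.diag([0.0, 0.0, 7.0])))  # True
```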


Critical cone

For θ1 = φ1 ◦ λ:

• Let Y ∈ ∂θ1(X). Denote A = X + Y.
• Critical cone:

  C(A; ∂θ1(X)) := { H ∈ Sn | θ′1(X; H) ≤ 〈Y, H〉 } = { H ∈ Sn | θ′1(X; H) = 〈Y, H〉 }

Proposition
H ∈ C(A; ∂θ1(X)) if and only if H ∈ Sn satisfies, for any i, j ∈ η1(x, y),

  〈diag(U^T H U), ai〉 = 〈diag(U^T H U), aj〉 = max_{i∈ι1(x)} 〈λ′(X; H), ai〉,

where the index set η1(x, y) ⊆ ι1(x) is

  η1(x, y) := { i ∈ ι1(x) | Σ_{i∈ι1(x)} ui ai = y, Σ_{i∈ι1(x)} ui = 1, 0 < ui ≤ 1 }

with x := λ(X) and y := λ(Y).

18


Critical cone (cont'd)

For θ2 = φ2 ◦ λ:

• Let Z ∈ N_K(X). Denote B = X + Z.
• Critical cone:

  C(B; N_K(X)) := T_K(X) ∩ Z⊥ = { H ∈ Sn | ζ′(X; H) ≤ 0, 〈Z, H〉 = 0 }

Proposition
H ∈ C(B; N_K(X)) if and only if H ∈ Sn satisfies, for any i ∈ η2(x, z),

  0 = 〈diag(V^T H V), bi〉 = max_{i∈ι2(x)} 〈λ′(X; H), bi〉,

where the index set η2(x, z) ⊆ ι2(x) is

  η2(x, z) := { i ∈ ι2(x) | Σ_{i∈ι2(x)} ui bi = z, ui > 0 }

with x := λ(X) and z := λ(Z).

19


Critical cone: SDP

• Sn− = { X ∈ Sn | λmax(X) ≤ 0 } = { X ∈ Sn | max_{1≤i≤n} 〈ei, λ(X)〉 ≤ 0 }
• Z ∈ N_{Sn−}(X)

  X + Z = V diag( Λ_α(Z), 0_β, Λ_γ(X) ) V^T,    ι2(x) = α ∪ β =: γ̄,  η2(x, z) = α

H ∈ C(B; N_{Sn−}(X)) if and only if for any i ∈ η2(x, z),

  0 = 〈diag(V^T H V), bi〉 = max_{i∈ι2(x)} 〈λ′(X; H), bi〉,

i.e., writing H̃ = V^T H V in the (α, β, γ) block partition,

  H̃_αα = 0,  H̃_αβ = 0,  H̃_ββ ⪯ 0,

with the remaining blocks of H̃ unconstrained.

20
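The block characterization can be tested numerically. The sketch below is illustrative (toy diagonal data, so V = I); the helper in_critical_cone and its tolerance handling are our own conventions, not from the paper.

```python
import numpy as np

def in_critical_cone(H, V, alpha, beta, tol=1e-8):
    """Check Ht[a,a] = 0, Ht[a,b] = 0 and Ht[b,b] negative semidefinite,
    where Ht = V^T H V and a = alpha, b = beta index the blocks."""
    Ht = V.T @ H @ V
    ok_aa = np.abs(Ht[np.ix_(alpha, alpha)]).max() <= tol if alpha else True
    ok_ab = (np.abs(Ht[np.ix_(alpha, beta)]).max() <= tol
             if (alpha and beta) else True)
    Hbb = Ht[np.ix_(beta, beta)]
    ok_bb = (np.linalg.eigvalsh(Hbb).max() <= tol) if beta else True
    return ok_aa and ok_ab and ok_bb

V = np.eye(3)                        # common eigenbasis of X and Z (toy case)
alpha, beta, gamma = [0], [1], [2]   # Z = diag(1,0,0), X = diag(0,0,-1)

H_in  = np.array([[0., 0., 3.], [0., -2., 0.], [3., 0., 1.]])
H_out = np.array([[1., 0., 0.], [0., 0., 0.], [0., 0., 0.]])  # Ht[a,a] != 0

print(in_critical_cone(H_in,  V, alpha, beta))   # True
print(in_critical_cone(H_out, V, alpha, beta))   # False
```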


The “σ-term”

For θ1 = φ1 ◦ λ:

• Let Y ∈ ∂θ1(X). Denote A = X + Y and let H ∈ C(A; ∂θ1(X)).
• θ1 is (parabolically) second-order directionally differentiable:

  z(W) := θ″1(X; H, W) = φ″1(λ(X); λ′(X; H), λ″(X; H, W))

The σ-term of θ1 is the conjugate function z∗(Y). Moreover,

  z∗(Y) = 2 Σ_{l=1}^{r} 〈Λ(Y)_{αl αl}, U_{αl}^T H (X − vl I)† H U_{αl}〉 =: Υ¹_X(Y, H)

and

  Υ¹_X(Y, H) = −2 Σ_{1≤l<l′≤r} Σ_{i∈αl} Σ_{j∈αl′} [ (λi(Y) − λj(Y)) / (λi(X) − λj(X)) ] (U_{αl}^T H U_{αl′})²_{ij}

21
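The agreement of the two displayed formulas for Υ¹_X(Y, H) can be sanity-checked numerically in the simplest setting U = I with X diagonal. This is an illustration with arbitrary toy eigenvalue data, not a proof; the block/eigenvalue layout below is our own setup.

```python
import numpy as np

x = np.array([1.0, 1.0, -2.0])        # eigenvalues of X: blocks {0,1}, {2}
y = np.array([0.5, -0.3, 2.0])        # toy eigenvalues of Y
blocks = [[0, 1], [2]]                # index sets alpha_l
v = [1.0, -2.0]                       # distinct eigenvalues v_l of X
H = np.array([[1., 2., 3.], [2., 0., -1.], [3., -1., 4.]])
X = np.diag(x)

# Pseudo-inverse form: 2 * sum_l <Lam(Y)_{al,al}, U_al^T H (X - v_l I)^+ H U_al>
S1 = 0.0
for l, al in enumerate(blocks):
    P = np.linalg.pinv(X - v[l] * np.eye(3))
    M = H[al, :] @ P @ H[:, al]
    S1 += 2.0 * np.trace(np.diag(y[al]) @ M)

# Divided-difference form: -2 * sum over cross-block index pairs
S2 = 0.0
for l in range(len(blocks)):
    for lp in range(l + 1, len(blocks)):
        for i in blocks[l]:
            for j in blocks[lp]:
                S2 += -2.0 * (y[i] - y[j]) / (x[i] - x[j]) * H[i, j] ** 2

print(np.isclose(S1, S2))             # True
```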


The “σ-term” (cont'd)

For θ2 = φ2 ◦ λ = δ_K with

  K = { X ∈ Sn | λ(X) ∈ dom φ } = { X ∈ Sn | ζ(X) ≤ 0 },  where ζ = ψ ◦ λ.

Let Z ∈ N_K(X). Denote B = X + Z and let H ∈ C(B; N_K(X)).

The “σ-term” of K is the support function of T²_K(X, H):

  δ∗_{T²_K(X,H)}(Z) = 2 Σ_{l=1}^{r} 〈Λ(Z)_{αl αl}, V_{αl}^T H (X − vl I)† H V_{αl}〉 =: Υ²_X(Z, H)

and

  Υ²_X(Z, H) = −2 Σ_{1≤l<l′≤r} Σ_{i∈αl} Σ_{j∈αl′} [ (λi(Z) − λj(Z)) / (λi(X) − λj(X)) ] (V_{αl}^T H V_{αl′})²_{ij}

22


The “σ-term”: SDP

• Sn− = { X ∈ Sn | λmax(X) ≤ 0 }
• Z ∈ N_{Sn−}(X), B = X + Z, H ∈ C(B; N_{Sn−}(X))

  X + Z = V diag( Λ_α(Z), 0_β, Λ_γ(X) ) V^T

The “σ-term” of Sn−:

  Υ²_X(Z, H) = 2 Σ_{i∈γ, j∈α} [ λj(Z) / λi(X) ] H̃²_{ij},   cf. (Sun, 2006)

where H̃ = V^T H V.

23
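Since here Z is supported on the zero-eigenvalue block of X, the double sum appears to coincide with the closed form 2〈Z, H X† H〉, which gives a convenient cross-check. The sketch below uses toy data and our own helper names (illustrative only).

```python
import numpy as np

def sigma_term(X_eigs, Z_eigs, Ht, alpha, gamma):
    """2 * sum_{i in gamma, j in alpha} lam_j(Z)/lam_i(X) * Ht[i,j]^2."""
    return 2.0 * sum(Z_eigs[j] / X_eigs[i] * Ht[i, j] ** 2
                     for i in gamma for j in alpha)

V = np.eye(3)                         # common eigenbasis (diagonal toy case)
X_eigs = np.array([0.0, 0.0, -2.0])   # alpha, beta carry zero eigenvalues of X
Z_eigs = np.array([3.0, 0.0, 0.0])    # Z is positive on alpha only
alpha, gamma = [0], [2]

H = np.array([[0., 0., 1.], [0., -1., 0.], [1., 0., 4.]])
Ht = V.T @ H @ V

val = sigma_term(X_eigs, Z_eigs, Ht, alpha, gamma)
print(val)                            # 2 * (3 / -2) * 1^2 = -3.0

# Cross-check against 2 * <Z, H X^dagger H>:
X = V @ np.diag(X_eigs) @ V.T
Z = V @ np.diag(Z_eigs) @ V.T
ref = 2.0 * np.trace(Z @ H @ np.linalg.pinv(X) @ H)
print(np.isclose(val, ref))           # True
```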


Robinson CQ

CMatOP:
  minimize_{x∈X}  f(x) + θ1(g(x))
  subject to  h(x) = 0,  g(x) ∈ K

Let x̄ ∈ X be a feasible point of the CMatOP. We say that the Robinson CQ (RCQ) holds at x̄ if

  ( h′(x̄), g′(x̄) ) X + {0} × T_K(g(x̄)) = Y × Sn.

Proposition
The set of Lagrange multipliers M(x̄) is a nonempty, convex and compact subset if and only if the RCQ holds at x̄.

24


Second-order optimality conditions

Critical cone of the CMatOP:

  C(x̄) := { d ∈ X | h′(x̄)d = 0, g′(x̄)d ∈ C(A; ∂θ1(g(x̄))), g′(x̄)d ∈ C(B; N_K(g(x̄))) }

Theorem (“no gap” second-order optimality conditions)
Suppose that x̄ ∈ X is a locally optimal solution of the CMatOP and the RCQ holds at x̄. Then, for any d ∈ C(x̄),

  sup_{(y,Y,Z)∈M(x̄)} { 〈d, L″xx(x̄, y, Y, Z)d〉 − Υ¹_{g(x̄)}(Y, g′(x̄)d) − Υ²_{g(x̄)}(Z, g′(x̄)d) } ≥ 0.

Conversely, let x̄ be a feasible point of the CMatOP such that M(x̄) is nonempty, and suppose that the RCQ holds at x̄. Then the condition

  sup_{(y,Y,Z)∈M(x̄)} { 〈d, L″xx(x̄, y, Y, Z)d〉 − Υ¹_{g(x̄)}(Y, g′(x̄)d) − Υ²_{g(x̄)}(Z, g′(x̄)d) } > 0   ∀ d ∈ C(x̄) \ {0}

is necessary and sufficient for the quadratic growth condition at x̄: there exist ρ > 0 and a neighborhood N of x̄ such that, for any x ∈ N with h(x) = 0 and g(x) ∈ K,

  f(x) + φ1 ◦ λ(g(x)) ≥ f(x̄) + φ1 ◦ λ(g(x̄)) + ρ‖x − x̄‖².

25


Strong second-order sufficient condition

Definition
Let x̄ ∈ X be a stationary point of the CMatOP. We say the strong second-order sufficient condition holds at x̄ if, for any d ∈ Ĉ(x̄) \ {0},

  sup_{(y,Y,Z)∈M(x̄)} { 〈d, L″xx(x̄, y, Y, Z)d〉 − Υ¹_{g(x̄)}(Y, g′(x̄)d) − Υ²_{g(x̄)}(Z, g′(x̄)d) } > 0,

with

  Ĉ(x̄) := ⋂_{(y,Y,Z)∈M(x̄)} app(y, Y, Z),

where for any (y, Y, Z) ∈ M(x̄) the set app(y, Y, Z) is given by

  app(y, Y, Z) := { d ∈ X | h′(x̄)d = 0, g′(x̄)d ∈ aff(C(A; ∂θ1(g(x̄)))), g′(x̄)d ∈ aff(C(B; N_K(g(x̄)))) }.

26

Constraint nondegeneracy (LICQ)

The constraint nondegeneracy for the CMatOP is defined as follows:

  ( h′(x̄), g′(x̄), g′(x̄) ) X + {0} × T^lin_θ1(g(x̄)) × lin(T_K(g(x̄))) = Y × Sn × Sn.

27

Strong regularity of CMatOPs

Theorem
Let x̄ ∈ X be a stationary point of the CMatOP with multipliers (ȳ, Ȳ, Z̄), and consider:

(i) the strong second-order sufficient condition and constraint nondegeneracy hold at x̄;
(ii) every element in ∂F(x̄, ȳ, Ȳ, Z̄) is nonsingular;
(iii) (x̄, ȳ, Ȳ, Z̄) is a strongly regular solution of the KKT system.

It holds that (i) =⇒ (ii) =⇒ (iii).

(iii) =⇒ (i) can be established for particular CMatOPs:

• NLSDP (Sun, MOR 2006)
• CMatOPs with the sum of the k largest eigenvalues, etc. (in our work)

28


Thank you!