
    EE263 Autumn 2010-11 Stephen Boyd

Lecture 15: Symmetric matrices, quadratic forms, matrix norm, and SVD

    eigenvectors of symmetric matrices

    quadratic forms

    inequalities for quadratic forms

    positive semidefinite matrices

    norm of a matrix

    singular value decomposition


    Eigenvalues of symmetric matrices

suppose A ∈ R^{n×n} is symmetric, i.e., A = A^T

fact: the eigenvalues of A are real

to see this, suppose Av = λv, v ≠ 0, v ∈ C^n

then

    v̄^T A v = v̄^T (Av) = λ v̄^T v = λ ∑_{i=1}^n |v_i|²

but also

    v̄^T A v = (A v̄)^T v = (λ̄ v̄)^T v = λ̄ ∑_{i=1}^n |v_i|²

so we have λ = λ̄, i.e., λ ∈ R (hence, can assume v ∈ R^n)
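this fact is easy to check numerically; a minimal NumPy sketch (the random test matrix and seed below are illustrative, not from the lecture):

    import numpy as np

    rng = np.random.default_rng(0)
    B = rng.standard_normal((5, 5))
    A = (B + B.T) / 2                        # symmetrize: A = A^T

    evals = np.linalg.eigvals(A)             # general (complex) eigenvalue routine
    print(np.max(np.abs(evals.imag)))        # imaginary parts are zero up to roundoff
    print(np.sort(evals.real))
    print(np.linalg.eigvalsh(A))             # eigvalsh exploits symmetry and returns real values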


    Eigenvectors of symmetric matrices

fact: there is a set of orthonormal eigenvectors of A, i.e., q_1, …, q_n s.t. A q_i = λ_i q_i, q_i^T q_j = δ_{ij}

in matrix form: there is an orthogonal Q s.t.

    Q^{-1} A Q = Q^T A Q = Λ

hence we can express A as

    A = Q Λ Q^T = ∑_{i=1}^n λ_i q_i q_i^T

in particular, q_i are both left and right eigenvectors
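a sketch of how this decomposition can be computed and checked in NumPy (np.linalg.eigh handles the symmetric case; the random matrix is illustrative):

    import numpy as np

    rng = np.random.default_rng(1)
    B = rng.standard_normal((4, 4))
    A = (B + B.T) / 2                              # symmetric A

    lam, Q = np.linalg.eigh(A)                     # eigenvalues (ascending) and orthonormal eigenvectors
    Lam = np.diag(lam)

    print(np.allclose(Q.T @ Q, np.eye(4)))         # Q^T Q = I (Q orthogonal)
    print(np.allclose(A, Q @ Lam @ Q.T))           # A = Q Lambda Q^T
    # A as a sum of rank-one terms lambda_i q_i q_i^T
    A_sum = sum(lam[i] * np.outer(Q[:, i], Q[:, i]) for i in range(4))
    print(np.allclose(A, A_sum))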


    Interpretations

A = Q Λ Q^T

(block diagram: x → [Q^T] → Q^T x → [Λ] → Λ Q^T x → [Q] → Ax)

linear mapping y = Ax can be decomposed as

resolve x into q_i coordinates

scale coordinates by λ_i

reconstitute with basis q_i


or, geometrically,

rotate by Q^T

diagonal real scale (dilation) by Λ

rotate back by Q

decomposition

    A = ∑_{i=1}^n λ_i q_i q_i^T

    expresses A as linear combination of 1-dimensional projections


    example:

    A = [ −1/2   3/2 ; 3/2   −1/2 ]
      = (1/√2) [ 1  1 ; 1  −1 ] · [ 1  0 ; 0  −2 ] · ( (1/√2) [ 1  1 ; 1  −1 ] )^T

(figure: an input x is resolved into components q_1 q_1^T x and q_2 q_2^T x, these are scaled to λ_1 q_1 q_1^T x and λ_2 q_2 q_2^T x, and their sum gives Ax)
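a quick NumPy check of this example, also carrying out the resolve/scale/reconstitute steps from the previous slide (the test vector x is made up for illustration):

    import numpy as np

    A = np.array([[-0.5, 1.5], [1.5, -0.5]])
    lam, Q = np.linalg.eigh(A)               # eigh sorts ascending, so lam = [-2, 1]
    print(lam)

    x = np.array([1.0, 2.0])                 # illustrative input
    coeff = Q.T @ x                          # resolve x into q_i coordinates
    Ax = Q @ (lam * coeff)                   # scale by lambda_i, reconstitute with basis q_i
    print(np.allclose(Ax, A @ x))            # matches direct multiplication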


proof (case of λ_i distinct)

since the λ_i are distinct, we can find v_1, …, v_n, a set of linearly independent eigenvectors of A:

    A v_i = λ_i v_i,   ‖v_i‖ = 1

then we have

    v_i^T (A v_j) = λ_j v_i^T v_j = (A v_i)^T v_j = λ_i v_i^T v_j

so (λ_i − λ_j) v_i^T v_j = 0; for i ≠ j we have λ_i ≠ λ_j, hence v_i^T v_j = 0

in this case we can say: eigenvectors are orthogonal

in the general case (λ_i not distinct) we must say: eigenvectors can be chosen to be orthogonal


    Example: RC circuit

(circuit figure: capacitors c_1, …, c_n with node voltages v_1, …, v_n and currents i_1, …, i_n flowing into a resistive circuit)

    c_k v̇_k = −i_k,   i = G v

G = G^T ∈ R^{n×n} is the conductance matrix of the resistive circuit

thus v̇ = −C^{-1} G v, where C = diag(c_1, …, c_n)

note

−C^{-1} G is not symmetric


use state x_i = √c_i v_i, so

    ẋ = C^{1/2} v̇ = −C^{-1/2} G C^{-1/2} x

where C^{1/2} = diag(√c_1, …, √c_n)

we conclude:

eigenvalues λ_1, …, λ_n of −C^{-1/2} G C^{-1/2} (hence, of −C^{-1} G) are real

eigenvectors q_i (in x_i coordinates) can be chosen orthogonal

eigenvectors in voltage coordinates, s_i = C^{-1/2} q_i, satisfy

    −C^{-1} G s_i = λ_i s_i,   s_i^T C s_j = δ_{ij}
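a minimal NumPy sketch of this change of coordinates; the conductance matrix G and capacitances c below are randomly generated placeholders, not a particular circuit:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 4
    M = rng.standard_normal((n, n))
    G = M @ M.T                              # illustrative symmetric conductance matrix, G = G^T
    c = rng.uniform(1.0, 2.0, n)             # illustrative capacitances
    C = np.diag(c)
    Cmhalf = np.diag(1.0 / np.sqrt(c))       # C^{-1/2}

    S = -Cmhalf @ G @ Cmhalf                 # symmetric matrix governing x-dot = S x
    lam, Qm = np.linalg.eigh(S)              # real eigenvalues, orthonormal q_i

    ev = np.linalg.eigvals(-np.diag(1.0 / c) @ G)
    print(np.allclose(np.sort(lam), np.sort(ev.real)))      # same (real) eigenvalues as -C^{-1}G

    Svec = Cmhalf @ Qm                       # columns are s_i = C^{-1/2} q_i
    print(np.allclose(Svec.T @ C @ Svec, np.eye(n)))        # s_i^T C s_j = delta_ij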


    Quadratic forms

a function f : R^n → R of the form

    f(x) = x^T A x = ∑_{i,j=1}^n A_{ij} x_i x_j

is called a quadratic form

in a quadratic form we may as well assume A = A^T since

    x^T A x = x^T ((A + A^T)/2) x

((A + A^T)/2 is called the symmetric part of A)

uniqueness: if x^T A x = x^T B x for all x ∈ R^n and A = A^T, B = B^T, then A = B
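a short numerical check of the symmetric-part identity (illustrative random A and x):

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.standard_normal((4, 4))              # not symmetric in general
    x = rng.standard_normal(4)

    Asym = (A + A.T) / 2                         # symmetric part of A
    print(np.isclose(x @ A @ x, x @ Asym @ x))   # the two quadratic forms agree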


    Examples

‖Bx‖² = x^T B^T B x

∑_{i=1}^{n−1} (x_{i+1} − x_i)²

‖Fx‖² − ‖Gx‖²

sets defined by quadratic forms:

{ x | f(x) = a } is called a quadratic surface

{ x | f(x) ≤ a } is called a quadratic region


    Inequalities for quadratic forms

suppose A = A^T, A = Q Λ Q^T with eigenvalues sorted so λ_1 ≥ λ_2 ≥ ⋯ ≥ λ_n

    x^T A x = x^T Q Λ Q^T x
            = (Q^T x)^T Λ (Q^T x)
            = ∑_{i=1}^n λ_i (q_i^T x)²
            ≤ λ_1 ∑_{i=1}^n (q_i^T x)²
            = λ_1 ‖x‖²

i.e., we have x^T A x ≤ λ_1 x^T x


a similar argument shows x^T A x ≥ λ_n ‖x‖², so we have

    λ_n x^T x ≤ x^T A x ≤ λ_1 x^T x

sometimes λ_1 is called λ_max, λ_n is called λ_min

note also that

    q_1^T A q_1 = λ_1 ‖q_1‖²,   q_n^T A q_n = λ_n ‖q_n‖²,

so the inequalities are tight
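a small NumPy sketch confirming these bounds on random test vectors (the matrix and vectors are illustrative):

    import numpy as np

    rng = np.random.default_rng(4)
    B = rng.standard_normal((5, 5))
    A = (B + B.T) / 2
    lam, Q = np.linalg.eigh(A)              # ascending: lam[0] = lambda_min, lam[-1] = lambda_max

    for _ in range(1000):
        x = rng.standard_normal(5)
        r = (x @ A @ x) / (x @ x)           # Rayleigh quotient x^T A x / x^T x
        assert lam[0] - 1e-9 <= r <= lam[-1] + 1e-9

    # the bounds are tight: attained at the extreme eigenvectors
    print(np.isclose(Q[:, -1] @ A @ Q[:, -1], lam[-1]))
    print(np.isclose(Q[:, 0] @ A @ Q[:, 0], lam[0]))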


    Positive semidefinite and positive definite matrices

suppose A = A^T ∈ R^{n×n}

we say A is positive semidefinite if x^T A x ≥ 0 for all x

denoted A ≥ 0 (and sometimes A ⪰ 0)

A ≥ 0 if and only if λ_min(A) ≥ 0, i.e., all eigenvalues are nonnegative

not the same as A_{ij} ≥ 0 for all i, j

we say A is positive definite if x^T A x > 0 for all x ≠ 0

denoted A > 0

A > 0 if and only if λ_min(A) > 0, i.e., all eigenvalues are positive
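a sketch of the eigenvalue test for A ≥ 0, plus a small counterexample for the "not the same as A_ij ≥ 0" remark (both matrices are illustrative):

    import numpy as np

    def is_psd(A, tol=1e-10):
        """Check A = A^T >= 0 via the smallest eigenvalue."""
        return np.linalg.eigvalsh(A)[0] >= -tol

    A1 = np.array([[2.0, -1.0], [-1.0, 2.0]])   # negative entries, but eigenvalues 1, 3 >= 0
    A2 = np.array([[1.0, 2.0], [2.0, 1.0]])     # nonnegative entries, but eigenvalues 3, -1
    print(is_psd(A1))    # True
    print(is_psd(A2))    # False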


    Matrix inequalities

we say A is negative semidefinite if −A ≥ 0

we say A is negative definite if −A > 0

otherwise, we say A is indefinite

matrix inequality: if B = B^T ∈ R^{n×n}, we say A ≥ B if A − B ≥ 0, A < B if B − A > 0, etc.

for example:

A ≥ 0 means A is positive semidefinite

A > B means x^T A x > x^T B x for all x ≠ 0


many properties that you'd guess hold actually do, e.g.,

if A ≥ B and C ≥ D, then A + C ≥ B + D

if B ≤ 0, then A + B ≤ A

if A ≥ 0 and α ≥ 0, then αA ≥ 0

A² ≥ 0

if A > 0, then A^{-1} > 0

matrix inequality is only a partial order: we can have

    A ≱ B,   B ≱ A

(such matrices are called incomparable)


    Ellipsoids

if A = A^T > 0, the set

    E = { x | x^T A x ≤ 1 }

is an ellipsoid in R^n, centered at 0

(figure: the ellipsoid E with semi-axes s_1 and s_2)


semi-axes are given by s_i = λ_i^{-1/2} q_i, i.e.:

eigenvectors determine directions of semiaxes

eigenvalues determine lengths of semiaxes

note:

in direction q_1, x^T A x is large, hence the ellipsoid is thin in direction q_1

in direction q_n, x^T A x is small, hence the ellipsoid is fat in direction q_n

√(λ_max/λ_min) gives maximum eccentricity

if Ẽ = { x | x^T B x ≤ 1 }, where B > 0, then E ⊆ Ẽ ⟺ A ≥ B
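a short NumPy sketch computing the semi-axes from the eigendecomposition and checking that their endpoints lie on the boundary of E (the matrix A is illustrative):

    import numpy as np

    A = np.array([[2.0, 0.5], [0.5, 1.0]])     # A = A^T > 0 (illustrative)
    lam, Q = np.linalg.eigh(A)                  # ascending eigenvalues

    S = Q / np.sqrt(lam)                        # column i is s_i = lambda_i^{-1/2} q_i
    for i in range(2):
        s = S[:, i]
        print(np.isclose(s @ A @ s, 1.0))       # each semi-axis endpoint satisfies x^T A x = 1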


    Gain of a matrix in a direction

suppose A ∈ R^{m×n} (not necessarily square or symmetric)

for x ∈ R^n, ‖Ax‖/‖x‖ gives the amplification factor or gain of A in the direction x

obviously, gain varies with direction of input x

questions:

what is maximum gain of A (and corresponding maximum gain direction)?

what is minimum gain of A (and corresponding minimum gain direction)?

how does gain of A vary with direction?


    Matrix norm

the maximum gain

    max_{x≠0} ‖Ax‖/‖x‖

is called the matrix norm or spectral norm of A and is denoted ‖A‖

    max_{x≠0} ‖Ax‖²/‖x‖² = max_{x≠0} (x^T A^T A x)/‖x‖² = λ_max(A^T A)

so we have ‖A‖ = √(λ_max(A^T A))

similarly, the minimum gain is given by

    min_{x≠0} ‖Ax‖/‖x‖ = √(λ_min(A^T A))
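a minimal sketch comparing this formula with NumPy's built-in spectral norm (the matrix is illustrative):

    import numpy as np

    rng = np.random.default_rng(5)
    A = rng.standard_normal((6, 4))

    lam = np.linalg.eigvalsh(A.T @ A)                  # eigenvalues of A^T A, ascending
    print(np.sqrt(lam[-1]), np.linalg.norm(A, 2))      # max gain: both give ||A||
    print(np.sqrt(lam[0]))                             # min gain sqrt(lambda_min(A^T A))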


note that

A^T A ∈ R^{n×n} is symmetric and A^T A ≥ 0, so λ_min, λ_max ≥ 0

max gain input direction is x = q_1, eigenvector of A^T A associated with λ_max

min gain input direction is x = q_n, eigenvector of A^T A associated with λ_min


example: A = [ 1 2 ; 3 4 ; 5 6 ]

    A^T A = [ 35 44 ; 44 56 ]
          = [ 0.620 −0.785 ; 0.785 0.620 ] · [ 90.7 0 ; 0 0.265 ] · [ 0.620 −0.785 ; 0.785 0.620 ]^T

then ‖A‖ = √(λ_max(A^T A)) = 9.53:

    ‖ [0.620 ; 0.785] ‖ = 1,   ‖ A [0.620 ; 0.785] ‖ = ‖ [2.18 ; 4.99 ; 7.78] ‖ = 9.53


min gain is √(λ_min(A^T A)) = 0.514:

    ‖ [−0.785 ; 0.620] ‖ = 1,   ‖ A [−0.785 ; 0.620] ‖ = ‖ [0.46 ; 0.14 ; −0.18] ‖ = 0.514

for all x ≠ 0, we have

    0.514 ≤ ‖Ax‖/‖x‖ ≤ 9.53
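the numbers in this example can be reproduced with a few lines of NumPy (a sketch; small differences from the slide's figures come from rounding):

    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
    lam, Q = np.linalg.eigh(A.T @ A)            # eigenvalues approximately [0.265, 90.7]
    print(np.sqrt(lam))                         # gains approximately [0.514, 9.53]

    q_max, q_min = Q[:, 1], Q[:, 0]             # eigenvectors for lambda_max and lambda_min
    print(np.linalg.norm(A @ q_max))            # about 9.53
    print(np.linalg.norm(A @ q_min))            # about 0.514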


    Properties of matrix norm

consistent with vector norm: the matrix norm of a ∈ R^{n×1} is

    √(λ_max(a^T a)) = √(a^T a)

for any x, ‖Ax‖ ≤ ‖A‖ ‖x‖

scaling: ‖aA‖ = |a| ‖A‖

triangle inequality: ‖A + B‖ ≤ ‖A‖ + ‖B‖

definiteness: ‖A‖ = 0 ⟺ A = 0

norm of product: ‖AB‖ ≤ ‖A‖ ‖B‖


    Singular value decomposition

a more complete picture of the gain properties of A is given by the singular value decomposition (SVD) of A:

    A = U Σ V^T

where

A ∈ R^{m×n}, Rank(A) = r

U ∈ R^{m×r}, U^T U = I

V ∈ R^{n×r}, V^T V = I

Σ = diag(σ_1, …, σ_r), where σ_1 ≥ ⋯ ≥ σ_r > 0


with U = [u_1 ⋯ u_r], V = [v_1 ⋯ v_r],

    A = U Σ V^T = ∑_{i=1}^r σ_i u_i v_i^T

σ_i are the (nonzero) singular values of A

v_i are the right or input singular vectors of A

u_i are the left or output singular vectors of A
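a sketch of computing this compact (rank-r) SVD with NumPy and rebuilding A from the dyads; the test matrix is an illustrative random one:

    import numpy as np

    rng = np.random.default_rng(6)
    A = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 4))   # 5x4, rank 3 (generically)

    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    r = int(np.sum(s > 1e-10 * s[0]))                  # numerical rank
    U, s, Vt = U[:, :r], s[:r], Vt[:r, :]              # keep only the r nonzero singular values

    print(np.allclose(U.T @ U, np.eye(r)))             # U^T U = I
    print(np.allclose(Vt @ Vt.T, np.eye(r)))           # V^T V = I
    A_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r))
    print(np.allclose(A, A_sum))                       # A = sum_i sigma_i u_i v_i^T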


    A^T A = (U Σ V^T)^T (U Σ V^T) = V Σ² V^T

hence:

v_i are eigenvectors of A^T A (corresponding to nonzero eigenvalues)

σ_i = √(λ_i(A^T A)) (and λ_i(A^T A) = 0 for i > r)

‖A‖ = σ_1


similarly,

    A A^T = (U Σ V^T)(U Σ V^T)^T = U Σ² U^T

hence:

u_i are eigenvectors of A A^T (corresponding to nonzero eigenvalues)

σ_i = √(λ_i(A A^T)) (and λ_i(A A^T) = 0 for i > r)

u_1, …, u_r are an orthonormal basis for range(A)

v_1, …, v_r are an orthonormal basis for N(A)^⊥


    Interpretations

    A = U Σ V^T = ∑_{i=1}^r σ_i u_i v_i^T

(block diagram: x → [V^T] → V^T x → [Σ] → Σ V^T x → [U] → Ax)

linear mapping y = Ax can be decomposed as

compute coefficients of x along input directions v_1, …, v_r

scale coefficients by σ_i

reconstitute along output directions u_1, …, u_r

difference with eigenvalue decomposition for symmetric A: input and output directions are different


    v1 is most sensitive (highest gain) input direction

    u1 is highest gain output direction

A v_1 = σ_1 u_1


    SVD gives clearer picture of gain as function of input/output directions

example: consider A ∈ R^{4×4} with Σ = diag(10, 7, 0.1, 0.05)

input components along directions v_1 and v_2 are amplified (by about 10) and come out mostly along the plane spanned by u_1, u_2

input components along directions v_3 and v_4 are attenuated (by about 10)

‖Ax‖/‖x‖ can range between 10 and 0.05

    A is nonsingular

    for some applications you might say A is effectively rank 2
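a sketch of building such an A from the given Σ (the orthogonal U and V are randomly generated, purely for illustration) and checking the gain behavior:

    import numpy as np

    rng = np.random.default_rng(7)
    U, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # random orthogonal U
    V, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # random orthogonal V
    Sigma = np.diag([10.0, 7.0, 0.1, 0.05])
    A = U @ Sigma @ V.T

    print(np.linalg.svd(A, compute_uv=False))          # [10, 7, 0.1, 0.05]
    print(np.linalg.norm(A @ V[:, 0]))                 # 10: max gain, input along v1
    print(np.linalg.norm(A @ V[:, 3]))                 # 0.05: min gain, input along v4

    x = V[:, 0] + V[:, 1]                              # input in span{v1, v2}
    print(np.linalg.norm(A @ x) / np.linalg.norm(x))   # gain between 7 and 10: amplified
    x = V[:, 2] + V[:, 3]                              # input in span{v3, v4}
    print(np.linalg.norm(A @ x) / np.linalg.norm(x))   # gain between 0.05 and 0.1: attenuated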


example: A ∈ R^{2×2}, with σ_1 = 1, σ_2 = 0.5

resolve x along v_1, v_2: v_1^T x = 0.5, v_2^T x = 0.6, i.e., x = 0.5 v_1 + 0.6 v_2

now form Ax = (v_1^T x) σ_1 u_1 + (v_2^T x) σ_2 u_2 = (0.5)(1) u_1 + (0.6)(0.5) u_2

(figure: x shown with input directions v_1, v_2, and its image Ax shown with output directions u_1, u_2)
