Quadratic Forms


Definition: Let {X1, X2, ..., Xn} be n (non-random) variables. A quadratic form Q is, by definition, an expression of the form:

Q = ∑i ∑j aij Xi Xj

where aij (the coefficients of the form) are real numbers.

So, a quadratic form is a second-degree, homogeneous (no constant term) polynomial in the Xi.

Matrix Notation:

* A = [aij] denotes the n × n matrix whose (i, j)-th entry is aij.

* A quadratic form Q is then defined by the matrix equation:

Q = X'AX

where X = (X1, X2, ..., Xn)' is the column vector of the variables.

Note: The matrix A is assumed to be a symmetric matrix. This costs no generality, since aij and aji can each be replaced by their average (aij + aji)/2 without changing Q.
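As an illustration (a minimal sketch in NumPy, not part of the original notes; the matrix A and the vector x below are arbitrary made-up values), the quadratic form can be evaluated both from the double-sum definition and from the matrix expression X'AX, after symmetrizing A:

```python
# Minimal sketch: evaluate Q = X'AX from the double-sum definition and
# from the matrix product, for an arbitrary (made-up) A and x.
import numpy as np

rng = np.random.default_rng(0)

n = 3
A = rng.normal(size=(n, n))
A = (A + A.T) / 2          # symmetrize: averaging a_ij and a_ji leaves Q unchanged
x = rng.normal(size=n)

# Double-sum definition: Q = sum_i sum_j a_ij * x_i * x_j
Q_sum = sum(A[i, j] * x[i] * x[j] for i in range(n) for j in range(n))

# Matrix form: Q = x'Ax
Q_mat = x @ A @ x

print(Q_sum, Q_mat)        # the two values agree up to floating-point error
```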

Expectation of a quadratic form

Let:

* X be a random vector with mean µ and covariance matrix Σ.

* Q = X'AX be a quadratic form.

Then E[X'AX] = tr(AΣ) + µ'Aµ

Note: The above formula is independent of the distribution of X.
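As a quick illustration (a minimal sketch, not part of the original notes; µ, Σ and A below are arbitrary made-up values), the expectation formula can be checked by Monte Carlo simulation:

```python
# Minimal sketch: Monte Carlo check of E[X'AX] = tr(A Sigma) + mu'A mu.
import numpy as np

rng = np.random.default_rng(1)

n = 3
mu = np.array([1.0, -2.0, 0.5])
L = rng.normal(size=(n, n))
Sigma = L @ L.T + n * np.eye(n)       # a valid (positive definite) covariance matrix
A = rng.normal(size=(n, n))
A = (A + A.T) / 2

# Simulate X ~ N(mu, Sigma); the formula itself does not require normality.
X = rng.multivariate_normal(mu, Sigma, size=200_000)
Q = np.einsum('ij,jk,ik->i', X, A, X)      # Q_m = x_m' A x_m for each draw

print(Q.mean())                            # Monte Carlo estimate of E[X'AX]
print(np.trace(A @ Sigma) + mu @ A @ mu)   # theoretical value tr(A Sigma) + mu'A mu
```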

Quadratic forms in a multivariate Normal vector and the Chi-square distribution:

1. If X is a multivariate normal vector, we write X ~ N(µ, Σ), where µ is the mean vector and Σ is the variance-covariance matrix.

2. Let {X1, X2, ..., Xn} be n independent standard normal variables: Xi ~ N(0, 1) for all i.

Then the chi-square distribution with n degrees of freedom is the distribution of the variable X² defined as:

X² = ∑i Xi²

and is usually denoted by X² ~ χ²n.

Observe that X² is a special quadratic form X'AX with A equal to the identity matrix of order n.

Question: Does there exist some other matrix P such that X'PX follows a chi-square distribution?

Result 1: If X ~ N(0, In), then Q = X'PX ~ χ²r if and only if P is idempotent and rank(P) = r.

Result 2: If X ~ N(µ, Σ), then Q = X'PX ~ χ²r (non-central) if and only if both of the following conditions are satisfied:

(a) P = PΣP, and (b) rank(P) = r. Further, the non-centrality parameter is µ'Pµ.
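The following simulation (an illustrative sketch, not from the original notes; the dimension, rank and seed are arbitrary) is consistent with Results 1 and 2: with Σ = In, a symmetric idempotent P of rank r gives a quadratic form whose sample mean and variance match those of a χ²r variable (r and 2r):

```python
# Minimal sketch of Result 1: for X ~ N(0, I_n) and a symmetric idempotent P
# of rank r, X'PX should behave like a chi-square variable with r df.
import numpy as np

rng = np.random.default_rng(2)

n, r = 5, 2
# Build a rank-r orthogonal projection matrix P = U U' with r orthonormal columns.
U, _ = np.linalg.qr(rng.normal(size=(n, r)))
P = U @ U.T

print(np.allclose(P @ P, P))              # idempotent
print(np.linalg.matrix_rank(P))           # rank r

X = rng.normal(size=(100_000, n))         # rows are draws of X ~ N(0, I_n)
Q = np.einsum('ij,jk,ik->i', X, P, X)

# Chi-square(r) has mean r and variance 2r; the sample moments should be close.
print(Q.mean(), Q.var())
```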

Independence of two quadratic forms in a multivariate normal vector

Result 3: Two linear forms A'X and B'X in a multinormal vector X with covariance matrix Σ are independent if and only if A'ΣB = 0.

Result 4 (Craig's theorem): Let X ~ Np(µ, Σ), and let X'AX and X'BX be two (chi-square distributed) quadratic forms. Then these two forms are independent if and only if AΣB = 0.

Note: It is not assumed that the chi-square distributions of the quadratic forms are central or have the same number of degrees of freedom.
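As an illustration of Craig's theorem (a minimal sketch, not from the original notes; Σ, A and B below are made-up matrices), take Σ = Ip and two projections onto orthogonal coordinate subspaces, so that AΣB = 0; the sample correlation of the two quadratic forms should then be near zero, which is a necessary consequence of their independence (the simulation checks only this weaker condition):

```python
# Minimal sketch of Craig's theorem with Sigma = I_p and orthogonal projections.
import numpy as np

rng = np.random.default_rng(3)

p = 4
Sigma = np.eye(p)
A = np.diag([1.0, 1.0, 0.0, 0.0])     # projects onto the first two coordinates
B = np.diag([0.0, 0.0, 1.0, 0.0])     # projects onto the third coordinate

print(np.allclose(A @ Sigma @ B, 0))  # Craig's condition A Sigma B = 0 holds

X = rng.multivariate_normal(np.zeros(p), Sigma, size=100_000)
QA = np.einsum('ij,jk,ik->i', X, A, X)
QB = np.einsum('ij,jk,ik->i', X, B, X)

print(np.corrcoef(QA, QB)[0, 1])      # close to 0, consistent with independence
```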

Previous Year Questions:

1. (December 2014) Section C

Ans: 1, 2, 4

Explanation:

The covariance matrix is the identity, and A and B are idempotent and symmetric.

Option 1: correct, by Result 4.


Option 2: We know that Y'AY and Y'BY are both chi-square random variables. We also know that the sum of two chi-square random variables is chi-square if and only if they are independent. By these two results, option 2 is correct.

Option 3: The difference of two chi-square random variables is not chi-square, so this option is not correct.

Option 4: already discussed above.

2. (June 2014) Section B

Ans: (d). Explanation: Using Result 2, the quadratic form is chi-square distributed; the expectation of a chi-square variable equals its degrees of freedom p, and its variance is 2p.

3. (December 2012) Section C

Ans: 1, 4. Explanation: For q2 the matrix P is {0.5 -0.5 0; -0.5 0.5 0; 0 0 0} (row-wise), and for q1 the P matrix is {0.5 0.5 0; 0.5 0.5 0; 0 0 1} (row-wise). The variance-covariance matrix is the identity matrix.

Result 2 gives that both q1 and q2 are non-central chi-square, but for q1 the non-centrality parameter is zero, hence q1 is central, while for q2 it is not. So option 1 is true but option 2 is not; hence option 3 is not true. Also, by Result 4, option 4 is correct.
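These claims can be checked numerically (a minimal sketch, assuming Σ = I3 as stated above; the non-centrality parameter is not checked here because the mean vector from the original question is not reproduced in these notes):

```python
# Minimal sketch: check the two P matrices quoted above, with Sigma = I_3.
import numpy as np

P_q1 = np.array([[0.5, 0.5, 0.0],
                 [0.5, 0.5, 0.0],
                 [0.0, 0.0, 1.0]])
P_q2 = np.array([[ 0.5, -0.5, 0.0],
                 [-0.5,  0.5, 0.0],
                 [ 0.0,  0.0, 0.0]])

for name, P in [("q1", P_q1), ("q2", P_q2)]:
    # Result 2 with Sigma = I reduces to: P idempotent, rank(P) = degrees of freedom.
    print(name, np.allclose(P @ P, P), np.linalg.matrix_rank(P))

# Result 4 with Sigma = I: the forms are independent if and only if P_q1 P_q2 = 0.
print(np.allclose(P_q1 @ P_q2, 0))
```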

4. (December 2011, Part B) Let X be a p-dimensional random vector that follows the N(0, Ip) distribution, and let A be a real symmetric matrix. Which of the following is true?

(a) X'AX has a chi-square distribution if A² = A, but the converse is not true.

(b) If X'AX has a chi-square distribution, then the degrees of freedom is p.

(c) If X'AX has a chi-square distribution, then the characteristic roots of A are either 0 or 1.


(d) If X'AX has a chi-square distribution, then A is necessarily positive definite.

Ans: (c). Explanation: Using Result 1, if X'AX has a chi-square distribution then A is idempotent, so its characteristic roots are either 0 or 1; hence option (c) is correct. (Option (a) fails because Result 1 is an if-and-only-if statement, so the converse does hold.)
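A small numerical check of this reasoning (an illustrative sketch, not part of the original notes; the dimension and rank are arbitrary): a symmetric idempotent A of rank r < p has r eigenvalues equal to 1 and the rest equal to 0, so it is positive semi-definite but not positive definite, which is why options (b) and (d) need not hold.

```python
# Minimal sketch: eigenvalues of a symmetric idempotent matrix are 0 or 1.
import numpy as np

rng = np.random.default_rng(4)

p, r = 4, 2
U, _ = np.linalg.qr(rng.normal(size=(p, r)))
A = U @ U.T                               # symmetric, idempotent, rank r < p

print(np.allclose(A @ A, A))              # idempotent
print(np.round(np.linalg.eigvalsh(A), 10))  # r eigenvalues equal 1, the rest 0
```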