Mixed Moments, Correlation & Independence, Independent Sum

Tutorial 7, STAT1301 Fall 2010, 09NOV2010, MB103@HKU. By Joseph Dong.


Transcript of Mixed Moments, Correlation & Independence, Independent Sum

Page 1: Mixed Moments, Correlation & Independence, Independent Sum

Tutorial 7, STAT1301 Fall 2010, 09NOV2010, MB103@HKU
By Joseph Dong

MIXED MOMENTS

CORRELATION & INDEPENDENCE,

INDEPENDENT SUM

Page 2: Mixed Moments, Correlation & Independence, Independent Sum

Recall: Univariate Moment of Order $k$, and Generalization: Mixed Moments

• The $k$th moment of a random variable $X$ is defined as $\mu_k' = \mathbb{E}[X^k]$.

• The $k$th central moment of a random variable $X$ is defined as $\mu_k = \mathbb{E}[(X - \mathbb{E}X)^k]$.

• Q: How do we generalize these two definitions to the case of a random vector of $n$ dimensions?
• A: We can define the mixed moments of an $n$-dimensional random vector.
• Define the mixed moment of order $(k_1, \ldots, k_n)$ of a random vector $(X_1, \ldots, X_n)$ as $\mathbb{E}\!\left[X_1^{k_1} X_2^{k_2} \cdots X_n^{k_n}\right]$.

• Define the mixed central moment of a random vector as $\mathbb{E}\!\left[(X_1 - \mathbb{E}X_1)^{k_1} (X_2 - \mathbb{E}X_2)^{k_2} \cdots (X_n - \mathbb{E}X_n)^{k_n}\right]$.
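These definitions are easy to check empirically: a mixed central moment can be estimated by a sample average. A minimal Python sketch, assuming illustrative distributions and a hypothetical helper name not taken from the tutorial:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative bivariate sample (not from the slides): y is built to have
# covariance 0.5 with x, since Cov(X, 0.5X + Z) = 0.5 Var(X) = 0.5.
n = 200_000
x = rng.normal(0.0, 1.0, size=n)
y = 0.5 * x + rng.normal(0.0, 1.0, size=n)

def mixed_central_moment(a, b, j, k):
    """Sample estimate of E[(A - EA)^j (B - EB)^k]."""
    return np.mean((a - a.mean()) ** j * (b - b.mean()) ** k)

m11 = mixed_central_moment(x, y, 1, 1)  # the (1,1) case is the covariance
print(m11)  # close to 0.5 for this construction
```

The $(1,1)$ case computed here is exactly the covariance defined on the next slide.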

Page 3: Mixed Moments, Correlation & Independence, Independent Sum

Bivariate Mixed Moment of $(X, Y)$

• The mixed moment of order $(j, k)$ of $X$ and $Y$ is defined by $\mathbb{E}[X^j Y^k]$.

• The mixed central moment of order $(j, k)$ of $X$ and $Y$ is defined by $\mathbb{E}\!\left[(X - \mathbb{E}X)^j (Y - \mathbb{E}Y)^k\right]$.

• The covariance of two random variables is defined as the 2nd-order ($j = k = 1$) bivariate mixed central moment: $\operatorname{Cov}(X, Y) = \mathbb{E}\!\left[(X - \mathbb{E}X)(Y - \mathbb{E}Y)\right]$.

• Covariance is a bivariate concept.

• A convenient identity: $\operatorname{Cov}(X, Y) = \mathbb{E}[XY] - \mathbb{E}X \, \mathbb{E}Y$.

• Properties of $\operatorname{Cov}$:
  • Symmetry: $\operatorname{Cov}(X, Y) = \operatorname{Cov}(Y, X)$
  • Positive semi-definiteness: $\operatorname{Cov}(X, X) = \operatorname{Var} X \geq 0$
  • Linearity: $\operatorname{Cov}(aX + bY, Z) = a \operatorname{Cov}(X, Z) + b \operatorname{Cov}(Y, Z)$
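The convenient identity and the properties of covariance can be verified numerically on sample covariances. A sketch with made-up distributions (not the handout's examples):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative dependent sample: y contains x, so Cov(x, y) != 0.
n = 100_000
x = rng.exponential(2.0, size=n)
y = rng.normal(1.0, 3.0, size=n) + x
z = rng.uniform(-1.0, 1.0, size=n)

def cov(a, b):
    """Sample covariance E[(A - EA)(B - EB)], the (1,1) mixed central moment."""
    return np.mean((a - a.mean()) * (b - b.mean()))

# Convenient identity: Cov(X, Y) = E[XY] - E[X] E[Y]
lhs = cov(x, y)
rhs = np.mean(x * y) - x.mean() * y.mean()

# Symmetry: Cov(X, Y) = Cov(Y, X)
sym_gap = cov(x, y) - cov(y, x)

# Linearity: Cov(2X + 3Z, Y) = 2 Cov(X, Y) + 3 Cov(Z, Y)
lin_gap = cov(2 * x + 3 * z, y) - (2 * cov(x, y) + 3 * cov(z, y))
print(lhs - rhs, sym_gap, lin_gap)  # all zero up to floating-point error
```

These gaps are zero as algebraic identities of the sample moments, not just in the large-sample limit.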

Page 4: Mixed Moments, Correlation & Independence, Independent Sum

Standardization in Statistics

• Express the position of a number label using its distance from the expectation, in terms of a multiple of the standard deviation.
• This is as if we re-coordinatize the state space using the location of the expectation as the origin and the standard deviation as the unit length.

• Standardization of a random variable is a one-one transformation (a centralization plus a rescaling) of the random variable.

• Using angle brackets to denote the resultant random variable of standardization: $\langle X \rangle = \dfrac{X - \mathbb{E}X}{\sqrt{\operatorname{Var} X}}$

• Purpose of standardization: ease of describing positions. For example: what is the relative position of the number label 5.3 in the state space of a normally distributed random variable? Since $\langle X \rangle$ follows $N(0, 1)$ (show this if you are not convinced), and the standardized value 3.25 is a very high quantile of $N(0, 1)$, the label 5.3 is located quite unusually far right in the original state space.
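A quick numerical illustration of standardization (the sample's Normal parameters below are arbitrary, not the slide's example):

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative sample; after standardization <X> = (X - E[X]) / sd(X),
# the sample sits at mean ~0 with standard deviation ~1, so each value's
# new coordinate says how many sd's it lies from the expectation.
x = rng.normal(10.0, 2.0, size=500_000)

standardized = (x - x.mean()) / x.std()
print(standardized.mean(), standardized.std())  # ~0 and ~1
```

This is the "re-coordinatization" described above: origin moved to the expectation, unit length set to the standard deviation.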

Page 5: Mixed Moments, Correlation & Independence, Independent Sum

Correlation as Standardized Covariance

• Covariance is a bivariate concept; so is correlation.
• Compare:
  • Covariance of $X$ & $Y$: $\operatorname{Cov}(X, Y) = \mathbb{E}\!\left[(X - \mathbb{E}X)(Y - \mathbb{E}Y)\right]$
  • Quick question: what if $X$ and $Y$ are independent?

  • Correlation ($\rho$) of $X$ & $Y$: $\rho(X, Y) = \operatorname{Cov}(\langle X \rangle, \langle Y \rangle) = \dfrac{\operatorname{Cov}(X, Y)}{\sqrt{\operatorname{Var} X}\sqrt{\operatorname{Var} Y}}$

• Very interestingly, the correlation of any pair of r.v.'s is always bounded, $-1 \leq \rho \leq 1$, while their covariance can explode.
• Pf. By the Cauchy–Schwarz inequality, $|\rho| = \left|\mathbb{E}[\langle X \rangle \langle Y \rangle]\right| \leq \sqrt{\mathbb{E}\langle X \rangle^2}\sqrt{\mathbb{E}\langle Y \rangle^2} = 1$.
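The contrast between an exploding covariance and a bounded correlation is easy to see numerically. A sketch with deliberately large, illustrative scales:

```python
import numpy as np

rng = np.random.default_rng(3)

# Large-scale illustrative sample: Cov(X, 2X + Z) = 2 Var(X) = 2e6 here,
# yet the correlation (covariance of the standardized pair) stays in [-1, 1]
# by the Cauchy-Schwarz inequality.
n = 100_000
x = rng.normal(0.0, 1000.0, size=n)
y = 2.0 * x + rng.normal(0.0, 500.0, size=n)

cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
rho = cov_xy / (x.std() * y.std())
print(cov_xy, rho)  # covariance on the order of 2e6; rho within [-1, 1]
```

Rescaling either variable inflates the covariance arbitrarily but leaves $\rho$ unchanged, which is the point of standardizing first.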

Page 6: Mixed Moments, Correlation & Independence, Independent Sum

Exploring Correlation

• Visit demonstration.wolfram.com
• Download the Mathematica™ Player if you don't have one.
• Search "correlation"
• And explore…

Page 7: Mixed Moments, Correlation & Independence, Independent Sum

Covariance/Correlation Calculation Exercises

• Handout Problem 1
• Handout Problem 2
• Handout Problem 3

• Find the correlation of two random variables which are functionally dependent as
(1) . What about ?

Page 8: Mixed Moments, Correlation & Independence, Independent Sum

Independent Sum

• An independent sum refers to the sum $S = X_1 + X_2 + \cdots + X_n$ of independent random variables.

• $S$ is a random variable itself: it has a sample space, a state space, and a probability measure (and distribution) on the sample space.

• We're now interested in finding the following moments of $S$:
  • Expectation (too easy, and independence is not actually needed): just sum the expectations, $\mathbb{E}S = \sum_i \mathbb{E}X_i$.
  • Variance (a bit of proof work required; uses independence of all the $X_i$): it turns out that this is also just the sum of the variances, $\operatorname{Var} S = \sum_i \operatorname{Var} X_i$. The proof uses the properties of covariance.
  • MGF (now easy, because we have proved Theorem C of Tutorial 6): $M_S(t) = \prod_i M_{X_i}(t)$. Finding the MGF is equivalent to finding the distribution.

• If we consider a pair of independent sums, we are also interested in finding their covariance (this is easy too), again using the properties of covariance.
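The expectation and variance claims can be checked by simulation. A sketch with three arbitrary illustrative distributions (not from the tutorial):

```python
import numpy as np

rng = np.random.default_rng(4)

# Three independent summands with known moments:
#   x1 ~ Normal(1, 4):      mean 1, variance 4
#   x2 ~ Exponential(3):    mean 3, variance 9
#   x3 ~ Uniform(0, 6):     mean 3, variance 3
n = 400_000
x1 = rng.normal(1.0, 2.0, size=n)
x2 = rng.exponential(3.0, size=n)
x3 = rng.uniform(0.0, 6.0, size=n)

s = x1 + x2 + x3
print(s.mean())  # close to 1 + 3 + 3 = 7
print(s.var())   # close to 4 + 9 + 3 = 16
```

The variance adds only because the cross terms $\operatorname{Cov}(X_i, X_j)$, $i \neq j$, vanish under independence.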

Page 9: Mixed Moments, Correlation & Independence, Independent Sum

Independent Sum: Finding Its Distribution

• The previous slide gives one method for finding the distribution of $S$: through its moment generating function, since under the independence condition the MGF is very convenient to derive. But there is one problem: what if you don't recognize the resulting form of the MGF? (Assuming we are blind to the divinely clever integral-transform methods.)
• We can also find the distribution of $S$ by working with the probability measure directly.
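For discrete independent summands, working with the probability measure directly amounts to convolving the probability mass functions. A minimal sketch using two fair dice (an illustrative example, not a handout problem):

```python
import numpy as np

# pmf of one fair die: P(X = k) = 1/6 for k = 1..6
pmf_die = np.full(6, 1.0 / 6.0)

# pmf of the sum S = X + Y of two independent dice, by discrete convolution:
# P(S = s) = sum_k P(X = k) P(Y = s - k). Entry i corresponds to s = 2 + i.
pmf_sum = np.convolve(pmf_die, pmf_die)

print(pmf_sum)     # the familiar triangular distribution on 2..12, peak at 7
print(pmf_sum[5])  # P(S = 7) = 6/36
```

For continuous summands, the analogous step is the convolution integral of the densities; the MGF route of the previous slide just turns that convolution into a product.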

Page 10: Mixed Moments, Correlation & Independence, Independent Sum

Exercises: Handout Problems 4, 5, 6