
Copyright © 2010 by The McGraw-Hill Companies, Inc. All rights reserved.

McGraw-Hill/Irwin

Simple Linear Regression Analysis

Chapter 13


Simple Linear Regression

13.1 The Simple Linear Regression Model and the Least Squares Point Estimates

13.2 Model Assumptions and the Standard Error

13.3 Testing the Significance of the Slope and y-Intercept

13.4 Confidence and Prediction Intervals


Simple Linear Regression Continued

13.5 Simple Coefficients of Determination and Correlation

13.6 Testing the Significance of the Population Correlation Coefficient (Optional)

13.7 An F Test for the Model

13.8 Residual Analysis (Optional)

13.9 Some Shortcut Formulas (Optional)


13.1 The Simple Linear Regression Model and the Least Squares Point Estimates

The dependent (or response) variable is the variable we wish to understand or predict

The independent (or predictor) variable is the variable we will use to understand or predict the dependent variable

Regression analysis is a statistical technique that uses observed data to relate the dependent variable to one or more independent variables


Form of the Simple Linear Regression Model

y = β0 + β1x + ε

β0 + β1x is the mean value of the dependent variable y when the value of the independent variable is x

β0 is the y-intercept; the mean of y when x is 0

β1 is the slope; the change in the mean of y per unit change in x

ε is an error term that describes the effect on y of all factors other than x


Regression Terms

β0 and β1 are called regression parameters

β0 is the y-intercept and β1 is the slope

We do not know the true values of these parameters

So, we must use sample data to estimate them

b0 is the estimate of β0 and b1 is the estimate of β1


The Simple Linear Regression Model Illustrated

Figure 13.4


The Least Squares Point Estimates

Point Estimate of the Slope

b1 = SSxy / SSxx

where SSxy = Σ(xi − x̄)(yi − ȳ) = Σxiyi − (Σxi)(Σyi)/n

and SSxx = Σ(xi − x̄)² = Σxi² − (Σxi)²/n

Point Estimate of the y-Intercept

b0 = ȳ − b1x̄

where ȳ = Σyi/n and x̄ = Σxi/n

Estimation/Prediction Equation

ŷ = b0 + b1x
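As a quick sketch, the point estimates above can be computed directly in Python (the data here are hypothetical, chosen to lie exactly on a line so the fit recovers it):

```python
def least_squares(x, y):
    """Least squares point estimates b0 and b1 from paired data."""
    n = len(x)
    ss_xy = sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y) / n
    ss_xx = sum(xi ** 2 for xi in x) - sum(x) ** 2 / n
    b1 = ss_xy / ss_xx                   # point estimate of the slope
    b0 = sum(y) / n - b1 * (sum(x) / n)  # point estimate of the y-intercept
    return b0, b1

# Hypothetical data lying exactly on y = 2 + 3x
b0, b1 = least_squares([1, 2, 3, 4], [5, 8, 11, 14])
print(b0, b1)  # 2.0 3.0
```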


Example 13.3: Fuel Consumption Case #1


Example 13.3: Fuel Consumption Case #2 (Slope b1)

SSxy = Σxiyi − (Σxi)(Σyi)/n = 3413.11 − (351.8)(81.7)/8 = −179.6475

SSxx = Σxi² − (Σxi)²/n = 16874.76 − (351.8)²/8 = 1404.355

b1 = SSxy / SSxx = −179.6475 / 1404.355 = −0.1279


Example 13.3: Fuel Consumption Case #3 (y-Intercept b0)

ȳ = Σyi/n = 81.7/8 = 10.2125

x̄ = Σxi/n = 351.8/8 = 43.98

b0 = ȳ − b1x̄ = 10.2125 − (−0.1279)(43.98) = 15.84


Example 13.3: Fuel Consumption Case #4

Prediction (x = 28): ŷ = b0 + b1x = 15.84 + (−0.1279)(28) = 12.2588 MMcf of gas
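The whole example can be reproduced from the summary sums on the preceding slides (a Python sketch; note that 12.2588 comes from the rounded b0 and b1, so the unrounded prediction differs slightly):

```python
# Summary sums from the fuel consumption data (n = 8 weeks)
n, sum_x, sum_y = 8, 351.8, 81.7
sum_xy, sum_x2 = 3413.11, 16874.76

ss_xy = sum_xy - sum_x * sum_y / n  # -179.6475
ss_xx = sum_x2 - sum_x ** 2 / n     # 1404.355
b1 = ss_xy / ss_xx                  # about -0.1279
b0 = sum_y / n - b1 * (sum_x / n)   # about 15.84
y_hat = b0 + b1 * 28                # prediction at x = 28
print(round(b1, 4), round(b0, 2), round(y_hat, 2))
```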


13.2 Model Assumptions and the Standard Error

1. Mean of Zero
2. Constant Variance Assumption
3. Normality Assumption
4. Independence Assumption


An Illustration of the Model Assumptions

Figure 13.10


Sum of Squares

Sum of Squared Errors: SSE = Σei² = Σ(yi − ŷi)²

Mean Square Error: s² = MSE = SSE/(n − 2)

Standard Error: s = √MSE = √(SSE/(n − 2))
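These three quantities can be sketched as follows (Python; the observed and fitted values below are hypothetical):

```python
import math

def standard_error(y, y_hat):
    """Return SSE, MSE = s^2, and the standard error s for a simple regression."""
    sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
    mse = sse / (len(y) - 2)  # divide by n - 2 degrees of freedom
    return sse, mse, math.sqrt(mse)

# Hypothetical observed vs fitted values with residuals of +/- 0.5
sse, mse, s = standard_error([5.0, 8.0, 10.0, 15.0], [5.5, 7.5, 10.5, 14.5])
print(sse, mse, round(s, 4))  # 1.0 0.5 0.7071
```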


13.3 Testing the Significance of the Slope and y-Intercept

A regression model is not likely to be useful unless there is a significant relationship between x and y

To test significance, we use the null hypothesis:

H0: β1 = 0

Versus the alternative hypothesis:

Ha: β1 ≠ 0


Testing the Significance of the Slope #2

Alternative | Reject H0 if | p-Value

Ha: β1 > 0 | t > tα | Area under the t distribution to the right of t

Ha: β1 < 0 | t < −tα | Area under the t distribution to the left of t

Ha: β1 ≠ 0 | |t| > tα/2 (that is, t > tα/2 or t < −tα/2) | Twice the area under the t distribution to the right of |t|


Testing the Significance of the Slope #3

Test Statistic

t = b1 / sb1, where sb1 = s / √SSxx

100(1 − α)% Confidence Interval for β1

[b1 ± tα/2 sb1]

t, tα/2, and p-values are based on n − 2 degrees of freedom
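A minimal sketch of the test statistic (Python): b1 and SSxx are the fuel consumption values from the earlier slides, but s = 0.6542 is an illustrative standard error not given in this transcript.

```python
import math

def slope_t(b1, s, ss_xx):
    """t = b1 / s_b1 with s_b1 = s / sqrt(SSxx)."""
    s_b1 = s / math.sqrt(ss_xx)
    return b1 / s_b1

# s here is an assumed value for illustration
t = slope_t(b1=-0.1279, s=0.6542, ss_xx=1404.355)
print(round(t, 2))  # about -7.33, well beyond typical critical values
```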


13.4 Confidence and Prediction Intervals

The point on the regression line corresponding to a particular value x0 of the independent variable x is ŷ = b0 + b1x0

It is unlikely that this value will equal the mean value of y when x equals x0

Therefore, we need to place bounds on how far the predicted value might be from the actual value

We can do this by calculating a confidence interval for the mean value of y and a prediction interval for an individual value of y


A Confidence Interval and a Prediction Interval

Confidence Interval

ŷ ± tα/2 · s · √(1/n + (x0 − x̄)²/SSxx)

Prediction Interval

ŷ ± tα/2 · s · √(1 + 1/n + (x0 − x̄)²/SSxx)
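The two half-widths can be sketched as below (Python). The values are illustrative: t_crit = 2.447 is the .025 t point for 6 degrees of freedom, and s is an assumed standard error; n, x̄, and SSxx are the fuel consumption values from earlier slides.

```python
import math

def interval_halfwidths(t_crit, s, n, x0, x_bar, ss_xx):
    """Half-widths of the confidence interval (mean y) and prediction interval (individual y) at x0."""
    d = 1 / n + (x0 - x_bar) ** 2 / ss_xx
    ci = t_crit * s * math.sqrt(d)      # confidence interval half-width
    pi = t_crit * s * math.sqrt(1 + d)  # prediction interval half-width
    return ci, pi

# s = 0.6542 is an assumed value for illustration
ci, pi = interval_halfwidths(t_crit=2.447, s=0.6542, n=8, x0=40.0, x_bar=43.98, ss_xx=1404.355)
print(round(ci, 3), round(pi, 3))
```

Note the prediction interval's extra "1" under the square root, which is why it is always wider than the confidence interval.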


Which to Use?

The prediction interval is useful if it is important to predict an individual value of the dependent variable

A confidence interval is useful if it is important to estimate the mean value

The prediction interval will always be wider than the confidence interval


13.5 The Simple Coefficient of Determination and Correlation

How useful is a particular regression model?

One measure of usefulness is the simple coefficient of determination

It is represented by the symbol r²


Calculating the Simple Coefficient of Determination

1. Total variation is given by the formula Σ(yi − ȳ)²

2. Explained variation is given by the formula Σ(ŷi − ȳ)²

3. Unexplained variation is given by the formula Σ(yi − ŷi)²

4. Total variation is the sum of explained and unexplained variation

5. r² is the ratio of explained variation to total variation
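Steps 1-5 can be sketched directly (Python; the data are hypothetical, and the fitted values come from the least squares line ŷ = −0.5 + 2.2x for that data):

```python
def r_squared(y, y_hat):
    """r^2 = explained variation / total variation."""
    y_bar = sum(y) / len(y)
    total = sum((yi - y_bar) ** 2 for yi in y)
    explained = sum((yh - y_bar) ** 2 for yh in y_hat)
    unexplained = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
    # For a least squares fit, total = explained + unexplained
    assert abs(total - (explained + unexplained)) < 1e-9
    return explained / total

# Hypothetical data x = [1, 2, 3, 4], y = [2, 4, 5, 9]; least squares gives y_hat = -0.5 + 2.2x
r2 = r_squared([2, 4, 5, 9], [1.7, 3.9, 6.1, 8.3])
print(round(r2, 4))  # about 0.9308
```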


The Simple Correlation Coefficient

• The simple correlation coefficient measures the strength of the linear relationship between y and x and is denoted by r

• Where b1 is the slope of the least squares line

r = +√r² if b1 is positive, and r = −√r² if b1 is negative
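This sign rule is a one-liner (Python sketch; the inputs are hypothetical):

```python
import math

def correlation(r2, b1):
    """r = +sqrt(r2) if b1 is positive, -sqrt(r2) if b1 is negative."""
    return math.copysign(math.sqrt(r2), b1)

print(correlation(0.81, -2.5))  # about -0.9: strong negative linear relationship
```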


Different Values of the CorrelationCoefficient

Figure 13.21


13.6 Testing the Significance of the Population Correlation Coefficient (Optional)

The simple correlation coefficient (r) measures the linear relationship between the observed values of x and y from the sample

The population correlation coefficient (ρ) measures the linear relationship between all possible combinations of observed values of x and y

r is an estimate of ρ


Testing ρ

We can test to see if the correlation is significant using the hypotheses

H0: ρ = 0 versus Ha: ρ ≠ 0

The test statistic is

t = r√(n − 2) / √(1 − r²)

This test will give the same results as the test for the significance of the slope coefficient b1
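The statistic can be sketched as follows (Python; r = 0.6 and n = 27 are hypothetical values chosen so the arithmetic is clean):

```python
import math

def correlation_t(r, n):
    """t = r * sqrt(n - 2) / sqrt(1 - r^2), with n - 2 degrees of freedom."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)

t = correlation_t(r=0.6, n=27)
print(round(t, 2))  # about 3.75
```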


13.7 An F Test for the Model

For simple regression, this is another way to test the null hypothesis

H0: β1 = 0

This is the only test we will use for multiple regression

The F test tests the significance of the overall regression relationship between x and y


Mechanics of the F Test

To test H0: β1 = 0 versus Ha: β1 ≠ 0 at the α level of significance

Test statistic:

F(model) = Explained variation / (Unexplained variation / (n − 2))

Reject H0 if F(model) > Fα or p-value < α

Fα is based on 1 numerator and n − 2 denominator degrees of freedom
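The statistic is a short computation (Python sketch; the variation values are hypothetical, taken from a 4-point fit):

```python
def f_model(explained, unexplained, n):
    """F(model) = explained variation / (unexplained variation / (n - 2))."""
    return explained / (unexplained / (n - 2))

# Hypothetical variations: explained = 24.2, unexplained = 1.8, n = 4
f = f_model(24.2, 1.8, 4)
print(round(f, 2))  # about 26.89
```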


Mechanics of the F Test Graphically

Figure 13.22


13.8 Residual Analysis (Optional)

Regression assumptions are checked by analyzing the regression residuals

Residuals (e) are the difference between the observed value of y and the predicted value of y, e = y − ŷ

If the regression assumptions are valid, the population of potential error terms will be normally distributed with a mean of zero and a variance σ²

Different error terms will be statistically independent


Residual Analysis #2

Residuals should look like they were randomly and independently selected from a normal population with μ = 0 and variance σ²

With real data, the assumptions will not hold exactly

Mild departures do not affect our ability to make statistical inferences

We are looking for pronounced departures from the assumptions

We only require the residuals to approximately fit the description above


Residual Plots

1. Residuals versus the independent variable

2. Residuals versus predicted y's

3. Residuals in time order (if the response is a time series)


Constant Variance Assumption

Figure 13.25


Assumption of Correct Functional Form

If the relationship between x and y is something other than a linear one, the residual plot will often suggest a form more appropriate for the model

For example, if there is a curved relationship between x and y, a plot of residuals will often show a curved relationship


Normality Assumption

If the normality assumption holds, a histogram or stem-and-leaf display of residuals should look bell-shaped and symmetric

Another way to check is a normal plot of the residuals:

Order the residuals from smallest to largest

Plot e(i) on the vertical axis against z(i)

If the normality assumption holds, the plot should have a straight-line appearance
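The construction of the plot points can be sketched as below (Python). The (i − 0.5)/n plotting position used here is one common convention for the normal scores z(i), and the residuals are hypothetical.

```python
from statistics import NormalDist

def normal_plot_points(residuals):
    """Pair each ordered residual e_(i) with a normal score z_(i)."""
    n = len(residuals)
    ordered = sorted(residuals)
    # z_(i): the standard normal point with area (i - 0.5)/n to its left (one common convention)
    scores = [NormalDist().inv_cdf((i - 0.5) / n) for i in range(1, n + 1)]
    return list(zip(scores, ordered))

# Hypothetical residuals
points = normal_plot_points([0.5, -1.0, 0.2, -0.3])
for z, e in points:
    print(round(z, 3), e)
```

Plotting e(i) against z(i) should then give roughly a straight line if the normality assumption holds.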


Independence Assumption

The independence assumption is most likely to be violated when the data are time-series data

If the data are not time series, they can be reordered without affecting the analysis

For time-series data, the time-ordered error terms can be autocorrelated

Positive autocorrelation is when a positive error term in time period i tends to be followed by another positive value in period i + k

Negative autocorrelation is when a positive error term tends to be followed by a negative value


Independence Assumption Visually: Positive Autocorrelation

Figure 13.27 (a)


Independence Assumption Visually: Negative Autocorrelation

Figure 13.27 (b)


13.9 Some Shortcut Formulas

SSyy = Σ(yi − ȳ)² = Σyi² − (Σyi)²/n

SSxx = Σ(xi − x̄)² = Σxi² − (Σxi)²/n

SSxy = Σ(xi − x̄)(yi − ȳ) = Σxiyi − (Σxi)(Σyi)/n

Total variation: SSTO = SSyy

Explained variation: SSR = SSxy² / SSxx

Unexplained variation: SSE = SSyy − SSxy² / SSxx
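The shortcut formulas can be sketched as follows (Python; the data are hypothetical), including a check of the identity SSTO = SSR + SSE:

```python
def shortcut_sums(x, y):
    """SSTO, SSR, SSE via the shortcut formulas."""
    n = len(x)
    ss_yy = sum(yi ** 2 for yi in y) - sum(y) ** 2 / n
    ss_xx = sum(xi ** 2 for xi in x) - sum(x) ** 2 / n
    ss_xy = sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y) / n
    ssto = ss_yy                      # total variation
    ssr = ss_xy ** 2 / ss_xx          # explained variation
    sse = ss_yy - ss_xy ** 2 / ss_xx  # unexplained variation
    return ssto, ssr, sse

# Hypothetical data
ssto, ssr, sse = shortcut_sums([1, 2, 3, 4], [2, 4, 5, 9])
print(ssto, round(ssr, 4), round(sse, 4))  # 26.0 24.2 1.8
assert abs(ssto - (ssr + sse)) < 1e-9      # SSTO = SSR + SSE
```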