Multiple Regression

Page 1

Multiple Regression

Page 2

Introduction

• In this chapter, we extend the simple linear regression model. Any number of independent variables is now allowed.

• We wish to build a model that fits the data better than the simple linear regression model.

Page 3

• Computer printout is used to help us:
  – Assess/Validate the model
    • How well does it fit the data?
    • Is it useful?
    • Are any of the required conditions violated?
  – Apply the model
    • Interpreting the coefficients
    • Estimating the expected value of the dependent variable

Page 4

Model and Required Conditions

• We allow for k independent variables to potentially be related to the dependent variable:

Y = β0 + β1X1 + β2X2 + … + βkXk + ε

where Y is the dependent variable, X1, …, Xk are the independent variables, β0, β1, …, βk are the coefficients, and ε is the random error variable.
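The sketch below (not from the textbook; every number in it is made up) simulates data of exactly this form for k = 2 and recovers the coefficients by least squares, which is what the regression procedures used in this chapter do behind the scenes.

```python
# Minimal sketch: simulate Y = b0 + b1*X1 + b2*X2 + e and estimate the
# coefficients by least squares. All values here are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 100
beta_true = np.array([10.0, 2.0, -3.0])      # b0, b1, b2 (made up)

X = np.column_stack([np.ones(n),             # column of 1s for the intercept
                     rng.uniform(0, 10, n),  # X1
                     rng.uniform(0, 10, n)]) # X2
eps = rng.normal(0, 1.5, n)                  # normal errors, mean 0, constant sd
y = X @ beta_true + eps

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated coefficients:", beta_hat)   # should be close to beta_true
```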

Page 5

Multiple Regression for k = 2, Graphical Demonstration

• The simple linear regression model allows for one independent variable, "X":

Y = β0 + β1X + ε

• The multiple linear regression model allows for more than one independent variable:

Y = β0 + β1X1 + β2X2 + ε

• Note how the straight line becomes a plane.

[Figure: the line Y = β0 + β1X plotted against X, and the plane Y = β0 + β1X1 + β2X2 plotted against the X1 and X2 axes.]
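For readers following along in Python rather than Excel, a minimal matplotlib sketch of this line-versus-plane picture might look as follows; the coefficient values are arbitrary, chosen only to draw the shapes.

```python
# Sketch: the fitted surface goes from a line (k = 1) to a plane (k = 2).
# Coefficient values below are hypothetical, for illustration only.
import numpy as np
import matplotlib.pyplot as plt

b0, b1, b2 = 5.0, 2.0, -1.5                   # made-up coefficients

x1 = np.linspace(0, 10, 30)
x2 = np.linspace(0, 10, 30)
X1, X2 = np.meshgrid(x1, x2)
Y_plane = b0 + b1 * X1 + b2 * X2              # multiple regression: a plane

fig = plt.figure(figsize=(10, 4))

# Simple regression: a straight line in (X, Y)
ax1 = fig.add_subplot(1, 2, 1)
ax1.plot(x1, b0 + b1 * x1)
ax1.set_xlabel("X"); ax1.set_ylabel("Y")
ax1.set_title("k = 1: a line")

# Multiple regression with k = 2: a plane in (X1, X2, Y)
ax2 = fig.add_subplot(1, 2, 2, projection="3d")
ax2.plot_surface(X1, X2, Y_plane, alpha=0.6)
ax2.set_xlabel("X1"); ax2.set_ylabel("X2"); ax2.set_zlabel("Y")
ax2.set_title("k = 2: a plane")

plt.tight_layout()
plt.show()
```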

Page 6

Required Conditions for the Error Variable

• The error ε is normally distributed.
• The mean is equal to zero and the standard deviation is constant (σε) for all possible values of the Xi's.
• All errors are independent.

Page 7

Estimating the Coefficients and Assessing the Model

• The procedure used to perform regression analysis:
  – Obtain the model coefficients and statistics using Excel.
  – Assess the model fit using statistics obtained from the sample.
  – Diagnose violations of required conditions. Try to remedy problems when identified.
  – If the model assessment indicates good fit to the data, use it to interpret the coefficients and generate predictions.

Page 8

• Example 18.1 Where to locate a new motor inn?
  – La Quinta Motor Inns is planning an expansion.
  – Management wishes to predict which sites are likely to be profitable, defined as having a 50% or higher operating margin (net profit expressed as a percentage of total revenue).
  – Several potential predictors of profitability are:
    • Competition (room supply)
    • Market awareness (nearest competing motel)
    • Demand generators (office space and college enrollment)
    • Demographics (household income)
    • Physical quality/location (distance to downtown)

Page 9

[Diagram: predictors of Operating Margin (profitability), grouped by category:
  Competition/Supply: Rooms (number of hotel/motel rooms within 3 miles of the site);
  Market Awareness: Nearest (distance to the nearest motel);
  Demand/Customers: Office Space and College Enrollment;
  Community: Income (median household income);
  Physical: Disttwn (distance to downtown).]

Page 10

Model and Data

• Data were collected from 100 randomly selected inns that belong to La Quinta, and the following model was estimated:

Margin = β0 + β1 Rooms + β2 Nearest + β3 Office + β4 College + β5 Income + β6 Disttwn + ε

• A portion of the data file Xm18-01:

Margin  Number  Nearest  Office Space  Enrollment  Income  Distance
 55.5    3203     4.2        549           8         37       2.7
 33.8    2810     2.8        496          17.5       35      14.4
 49      2890     2.4        254          20         35       2.6
 31.9    3422     3.3        434          15.5       38      12.1
 57.4    2687     0.9        678          15.5       42       6.9
 49      3759     2.9        635          19         33      10.8

Page 11

Excel Output

SUMMARY OUTPUT

Regression Statistics
Multiple R          0.7246
R Square            0.5251
Adjusted R Square   0.4944
Standard Error      5.51
Observations        100

ANOVA
             df      SS       MS       F      Significance F
Regression    6    3123.8    520.6   17.14        0.0000
Residual     93    2825.6     30.4
Total        99    5949.5

               Coefficients  Standard Error   t Stat   P-value
Intercept         38.14           6.99          5.45    0.0000
Number            -0.0076         0.0013       -6.07    0.0000
Nearest            1.65           0.63          2.60    0.0108
Office Space       0.020          0.0034        5.80    0.0000
Enrollment         0.21           0.13          1.59    0.1159
Income             0.41           0.14          2.96    0.0039
Distance          -0.23           0.18         -1.26    0.2107

This is the sample regression equation (sometimes called the prediction equation):

Margin = 38.14 - 0.0076 Rooms + 1.65 Nearest + 0.020 Office + 0.21 College + 0.41 Income - 0.23 Disttwn
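The same output can be reproduced outside Excel. The sketch below is one possible way to refit the model in Python with pandas and statsmodels; the file name and column labels are assumptions based on the worksheet shown above (Xm18-01 exported to CSV).

```python
# Sketch: refit the La Quinta model in Python. Assumes Xm18-01 has been
# saved as "Xm18-01.csv" with the column labels shown in the worksheet.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("Xm18-01.csv")              # Margin, Number, Nearest, ...

predictors = ["Number", "Nearest", "Office Space",
              "Enrollment", "Income", "Distance"]
X = sm.add_constant(df[predictors])          # adds the intercept column
y = df["Margin"]

model = sm.OLS(y, X).fit()
print(model.summary())                       # coefficients, R^2, ANOVA F, etc.
```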

Page 12

Model Assessment

• The model is assessed using three measures:
  – The standard error of estimate
  – The coefficient of determination
  – The F-test of the analysis of variance

• The standard error of estimate is used in the calculations for the other measures.

Page 13

Standard Error of Estimate

• The standard deviation of the error is estimated by the standard error of estimate:

sε = √[ SSE / (n − k − 1) ]

(k + 1 coefficients were estimated)

• The magnitude of sε is judged by comparing it to the mean of the dependent variable, ȳ.
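As a quick arithmetic check, plugging the ANOVA values from the printout into this formula reproduces the reported standard error:

```python
# Standard error of estimate from the printout: SSE = 2825.6, n = 100, k = 6
SSE, n, k = 2825.6, 100, 6
s_e = (SSE / (n - k - 1)) ** 0.5
print(round(s_e, 2))   # about 5.51, matching the Excel "Standard Error"
```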

Page 14

• From the printout, sε = 5.51.
• The mean value of Y is ȳ = 45.739.
• It seems that sε is not particularly small (relative to the mean of Y).
• Question: Can we conclude the model does not fit the data well? Not necessarily.

Page 15

Coefficient of Determination

• The definition is:

R² = 1 − SSE / Σ(yi − ȳ)²

• From the printout, R² = 0.5251: 52.51% of the variation in operating margin is explained by the six independent variables; 47.49% remains unexplained.

• When adjusted for the impact of k relative to n (intended to flag potential problems with a small sample size), we have:

Adjusted R² = 1 − [SSE/(n − k − 1)] / [SS(Total)/(n − 1)] = 0.4944, i.e. 49.44%.
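Both numbers can be verified from the ANOVA values in the printout:

```python
# R^2 and adjusted R^2 from the printout: SSE = 2825.6, SS(Total) = 5949.5
SSE, SS_total, n, k = 2825.6, 5949.5, 100, 6

r2 = 1 - SSE / SS_total
adj_r2 = 1 - (SSE / (n - k - 1)) / (SS_total / (n - 1))
print(round(r2, 4), round(adj_r2, 4))   # about 0.5251 and 0.4944
```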

Page 16

Testing the Validity of the Model

• Consider the question: Is there at least one independent variable linearly related to the dependent variable?

• To answer this question, we test the hypotheses:

H0: β1 = β2 = … = βk = 0
H1: At least one βi is not equal to zero.

• If at least one βi is not equal to zero, the model has some validity.

• The test is similar to an analysis of variance (ANOVA).

Page 17

• The hypotheses can be tested by an ANOVA procedure. The Excel output is:

ANOVA
             df      SS       MS       F      Significance F
Regression    6    3123.8    520.6   17.14        0.0000
Residual     93    2825.6     30.4
Total        99    5949.5

where, in the table:
  – df: Regression = k, Residual = n − k − 1, Total = n − 1
  – SS: Regression = SSR, Residual = SSE
  – MS: MSR = SSR/k, MSE = SSE/(n − k − 1)
  – F = MSR/MSE

SSR: Sum of Squares for Regression
SSE: Sum of Squares for Error

Page 18

• As in analysis of variance, we have:

[Total variation in Y] = SSR + SSE

• A large F indicates a large SSR; that is, much of the variation in Y is explained by the regression model. Therefore, if F is large, the model is considered valid and hence the null hypothesis should be rejected.

• The test statistic is:

F = (SSR / k) / (SSE / (n − k − 1)) = MSR / MSE

• The rejection region: F > Fα,k,n−k−1
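The F statistic, critical value, and p-value can be reproduced from the printout values; the sketch below uses scipy only to look up the F distribution.

```python
# F-test of overall validity, using the ANOVA values from the printout
from scipy import stats

SSR, SSE, n, k, alpha = 3123.8, 2825.6, 100, 6, 0.05

MSR = SSR / k                      # mean square for regression
MSE = SSE / (n - k - 1)            # mean square for error
F = MSR / MSE                      # about 17.14, as in the printout

F_crit = stats.f.ppf(1 - alpha, k, n - k - 1)   # rejection region: F > F_crit
p_value = stats.f.sf(F, k, n - k - 1)           # "Significance F"

print(round(F, 2), round(F_crit, 2), p_value)
```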

Page 19

Fα,k,n−k−1 = F0.05,6,100−6−1 = 2.17
F = 17.14 > 2.17

Also, the p-value (Significance F) = 0.0000.
Reject the null hypothesis.

ANOVA
             df      SS       MS       F      Significance F
Regression    6    3123.8    520.6   17.14        0.0000
Residual     93    2825.6     30.4
Total        99    5949.5

Conclusion: There is sufficient evidence to reject the null hypothesis in favor of the alternative hypothesis: at least one of the βi is not equal to zero. Thus, at least one independent variable is linearly related to Y. This linear regression model is valid.

Page 20

Interpreting the Coefficients

• b0 = 38.14. This is the intercept, the value of Y when all the independent variables take the value zero. Since the data ranges of the independent variables do not include zero, do not interpret the intercept.

• b1 = -0.0076. In this model, for each additional room within 3 miles of the La Quinta inn, the operating margin decreases on average by 0.0076% (assuming the other variables are held constant).

Page 21

• b2 = 1.65. In this model, for each additional mile between a La Quinta inn and its nearest competitor, the operating margin increases on average by 1.65%, when the other variables are held constant.

• b3 = 0.020. For each additional 1,000 sq ft of office space, the operating margin increases on average by 0.02%, when the other variables are held constant.

• b4 = 0.21. For each additional thousand students, the operating margin increases on average by 0.21%, when the other variables are held constant.

Page 22

• b5 = 0.41. For each increment of $1,000 in median household income, the operating margin increases on average by 0.41%, when the other variables remain constant.

• b6 = -0.23. For each additional mile to the downtown center, the operating margin decreases on average by 0.23%, when the other variables are held constant.

Page 23

Testing Individual Coefficients

• The hypotheses for each βi are:

H0: βi = 0
H1: βi ≠ 0

• Test statistic: t = (bi − βi) / s_bi, with d.f. = n − k − 1

• Excel output:

               Coefficients  Standard Error   t Stat   P-value
Intercept         38.14           6.99          5.45    0.0000
Number            -0.0076         0.0013       -6.07    0.0000
Nearest            1.65           0.63          2.60    0.0108
Office Space       0.020          0.0034        5.80    0.0000
Enrollment         0.21           0.13          1.59    0.1159
Income             0.41           0.14          2.96    0.0039
Distance          -0.23           0.18         -1.26    0.2107

• Insufficient evidence for Enrollment and Distance (large p-values of 0.1159 and 0.2107): we cannot conclude that these variables are linearly related to the operating margin.
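The t statistics and p-values can be checked from the coefficient and standard-error columns. The sketch below recomputes them with scipy from the rounded printout values, so the results differ slightly from the printed t statistics.

```python
# t-tests for individual coefficients: t = b_i / s_bi  (testing beta_i = 0)
from scipy import stats

n, k = 100, 6
dof = n - k - 1                                   # 93

coefs = {                                         # (b_i, s_bi) from the printout
    "Number":       (-0.0076, 0.0013),
    "Nearest":      ( 1.65,   0.63),
    "Office Space": ( 0.020,  0.0034),
    "Enrollment":   ( 0.21,   0.13),
    "Income":       ( 0.41,   0.14),
    "Distance":     (-0.23,   0.18),
}

for name, (b, sb) in coefs.items():
    t = b / sb                                    # from rounded values, so approximate
    p = 2 * stats.t.sf(abs(t), dof)               # two-tailed p-value
    print(f"{name:14s} t = {t:6.2f}  p = {p:.4f}")
```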

Page 24

La Quinta Inns, Point Estimate

• Predict the average operating margin of an inn at a site with the following characteristics (data file Xm18-01):
  – 3815 rooms within 3 miles,
  – Closest competitor 0.9 miles away,
  – 476,000 sq ft of office space,
  – 24,500 college students,
  – $35,000 median household income,
  – 11.2 miles to the downtown center.

MARGIN = 38.14 - 0.0076(3815) + 1.65(0.9) + 0.020(476) + 0.21(24.5) + 0.41(35) - 0.23(11.2) = 37.1%
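The point estimate is just the sample regression equation evaluated at the site's values, for example:

```python
# Point estimate for the proposed site, using the sample regression equation
b = [38.14, -0.0076, 1.65, 0.020, 0.21, 0.41, -0.23]   # b0 .. b6 from the printout
x = [1, 3815, 0.9, 476, 24.5, 35, 11.2]                # 1, Rooms, Nearest, Office,
                                                       # College, Income, Disttwn
margin_hat = sum(bi * xi for bi, xi in zip(b, x))
print(round(margin_hat, 1))   # about 37.1 (%)
```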

Page 25

Regression Diagnostics

• The conditions required for the model assessment to apply must be checked:
  – Is the error variable normally distributed? Draw a histogram of the residuals.
  – Is the error variance constant? Plot the residuals versus the predicted values of Y.
  – Are the errors independent? Plot the residuals versus the time periods.
  – Can we identify outliers?
  – Is multicollinearity (correlation between the Xi's) a problem?
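These residual plots can be produced directly from a fitted model. The sketch below assumes the statsmodels fit from the earlier sketch (the `model` object) and matplotlib; the residuals-versus-time plot is meaningful only when the observations are recorded in time order.

```python
# Residual diagnostics for a fitted statsmodels OLS result called `model`
import matplotlib.pyplot as plt

resid = model.resid                 # residuals e_i = y_i - y_hat_i
fitted = model.fittedvalues         # predicted values of Y

fig, axes = plt.subplots(1, 3, figsize=(12, 3.5))

axes[0].hist(resid, bins=15)                        # check normality
axes[0].set_title("Histogram of residuals")

axes[1].scatter(fitted, resid)                      # check constant variance
axes[1].axhline(0, linestyle="--")
axes[1].set_title("Residuals vs predicted Y")

axes[2].plot(range(len(resid)), resid, marker="o")  # check independence over time
axes[2].axhline(0, linestyle="--")
axes[2].set_title("Residuals vs observation order")

plt.tight_layout()
plt.show()
```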