Statistics for Business Analysis
Day 12, Session I: Simple Regression Analysis
Learning Objectives
- How to use regression analysis to predict the value of a dependent variable based on an independent variable
- The meaning of the regression coefficients b0 and b1
- How to evaluate the assumptions of regression analysis and know what to do if the assumptions are violated
- To make inferences about the slope and correlation coefficient
- To estimate mean values and predict individual values
Created by: Prabhat Mittal profmittal@yahoo.co.in
Introduction to Regression Analysis
Regression analysis is used to:
- Predict the value of a dependent variable based on the value of at least one independent variable
- Explain the impact of changes in an independent variable on the dependent variable

Dependent variable: the variable we wish to predict or explain
Independent variable: the variable used to explain the dependent variable
Simple Linear Regression Model
- Only one independent variable, X
- Relationship between X and Y is described by a linear function
- Changes in Y are assumed to be caused by changes in X
Types of Relationships
[Sketches: linear relationships (positive and negative slope); curvilinear relationships]
Types of Relationships
(continued)
[Sketches: strong relationships; weak relationships]
Types of Relationships
(continued)
[Sketch: no relationship]
Simple Linear Regression Model
Yi = β0 + β1Xi + εi

where:
Yi = dependent variable
β0 = population Y intercept
β1 = population slope coefficient
Xi = independent variable
εi = random error term

Linear component: β0 + β1Xi
Random error component: εi
Simple Linear Regression Model
(continued)

Yi = β0 + β1Xi + εi

[Diagram: for a given Xi, the observed value of Y lies a random error εi above or below the line; the predicted value of Y for Xi lies on the line, which has intercept β0 and slope β1]
Simple Linear Regression Equation (Prediction Line)
The simple linear regression equation provides an estimate of the population regression line
Ŷi = b0 + b1Xi

where:
Ŷi = estimated (or predicted) Y value for observation i
b0 = estimate of the regression intercept
b1 = estimate of the regression slope
Xi = value of X for observation i
The individual random error terms ei have a mean of zero
Least Squares Method
b0 and b1 are obtained by finding the values of b0 and b1 that minimize the sum of the squared differences between Y and Ŷ:

min Σ(Yi − Ŷi)² = min Σ(Yi − (b0 + b1Xi))²
Interpretation of the Slope and the Intercept
- b0 is the estimated average value of Y when the value of X is zero
- b1 is the estimated change in the average value of Y as a result of a one-unit change in X
Simple Linear Regression Example
A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet).
- A random sample of 10 houses is selected
- Dependent variable (Y) = house price in $1000s
- Independent variable (X) = square feet
Sample Data for House Price Model
House Price in $1000s (Y)   Square Feet (X)
245                         1400
312                         1600
279                         1700
308                         1875
199                         1100
219                         1550
405                         2350
324                         2450
319                         1425
255                         1700
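The least-squares formulas can be checked directly on this sample; a minimal NumPy sketch (variable names are my own, not from the slides):

```python
import numpy as np

# Sample data from the slides: X = square feet, Y = price in $1000s
X = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700], dtype=float)
Y = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255], dtype=float)

# Least-squares estimates: b1 = S_xy / S_xx, b0 = Ybar - b1 * Xbar
x_dev = X - X.mean()
y_dev = Y - Y.mean()
b1 = (x_dev * y_dev).sum() / (x_dev ** 2).sum()
b0 = Y.mean() - b1 * X.mean()

print(round(b0, 5), round(b1, 5))  # 98.24833 0.10977
```

These match the intercept and slope in the Excel output shown later in the session.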
Graphical Presentation
House price model: scatter plot
[Scatter plot: House Price ($1000s) vs. Square Feet]
Regression Using Excel
Tools / Data Analysis / Regression
Excel Output
Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error       41.33032
Observations         10
The regression equation is:
house price = 98.24833 + 0.10977 (square feet)
ANOVA
             df   SS           MS           F         Significance F
Regression   1    18934.9348   18934.9348   11.0848   0.01039
Residual     8    13665.5652   1708.1957
Total        9    32600.5000
              Coefficients   Standard Error   t Stat    P-value   Lower 95%    Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720    232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374      0.18580
Graphical Presentation
House price model: scatter plot and regression line
[Scatter plot: House Price ($1000s) vs. Square Feet, with the fitted regression line]
Slope = 0.10977
Intercept = 98.248
house price = 98.24833 + 0.10977 (square feet)
Interpretation of the Intercept, b0
house price = 98.24833 + 0.10977 (square feet)
b0 is the estimated average value of Y when the value of X is zero (if X = 0 is in the range of observed X values)
Here, no houses had 0 square feet, so b0 = 98.24833 just indicates that, for houses within the range of sizes observed, $98,248.33 is the portion of the house price not explained by square feet
Interpretation of the Slope Coefficient, b1
house price = 98.24833 + 0.10977 (square feet)
b1 measures the estimated change in the average value of Y as a result of a one-unit change in X
Here, b1 = .10977 tells us that the average value of a house increases by .10977($1000) = $109.77, on average, for each additional one square foot of size
Predictions using Regression Analysis
Predict the price for a house with 2000 square feet:
house price = 98.25 + 0.1098 (sq.ft.) = 98.25 + 0.1098(2000) = 317.85
The predicted price for a house with 2000 square feet is 317.85($1,000s) = $317,850
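This point prediction is just the fitted equation evaluated at X = 2000; a quick sketch using the slide's rounded coefficients (the helper name is mine):

```python
# Rounded coefficients from the fitted house price model (from the slides)
b0, b1 = 98.25, 0.1098

def predict_price(sq_ft: float) -> float:
    """Predicted house price in $1000s for a given size in square feet."""
    return b0 + b1 * sq_ft

print(round(predict_price(2000), 2))  # 317.85
```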
Interpolation vs. Extrapolation
When using a regression model for prediction, only predict within the relevant range of data
Relevant range for interpolation
[Scatter plot: House Price ($1000s) vs. Square Feet, with the relevant range of observed X values marked]
Do not try to extrapolate beyond the range of observed Xs
Measures of Variation
Total variation is made up of two parts:
SST = SSR + SSE

Total Sum of Squares = Regression Sum of Squares + Error Sum of Squares

SST = Σ(Yi − Ȳ)²
SSR = Σ(Ŷi − Ȳ)²
SSE = Σ(Yi − Ŷi)²

where:
Ȳ  = average value of the dependent variable
Yi = observed values of the dependent variable
Ŷi = predicted value of Y for the given Xi value
Measures of Variation
(continued)
- SST = total sum of squares: measures the variation of the Yi values around their mean Ȳ
- SSR = regression sum of squares: explained variation attributable to the relationship between X and Y
- SSE = error sum of squares: variation attributable to factors other than the relationship between X and Y
Measures of Variation
(continued)
[Diagram: at a given Xi, the total deviation (Yi − Ȳ) splits into the unexplained part (Yi − Ŷi) and the explained part (Ŷi − Ȳ); squaring and summing gives SST = Σ(Yi − Ȳ)², SSE = Σ(Yi − Ŷi)², and SSR = Σ(Ŷi − Ȳ)²]
Coefficient of Determination, r2
The coefficient of determination is the portion of the total variation in the dependent variable that is explained by variation in the independent variable. It is also called r-squared and is denoted r².

r² = SSR / SST = regression sum of squares / total sum of squares

Note: 0 ≤ r² ≤ 1
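The decomposition SST = SSR + SSE and the resulting r² can be verified numerically on the house data; a sketch, refitting the line from scratch:

```python
import numpy as np

X = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700], dtype=float)
Y = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255], dtype=float)

# Fit the least-squares line
b1 = ((X - X.mean()) * (Y - Y.mean())).sum() / ((X - X.mean()) ** 2).sum()
b0 = Y.mean() - b1 * X.mean()
Y_hat = b0 + b1 * X

SST = ((Y - Y.mean()) ** 2).sum()      # total sum of squares
SSR = ((Y_hat - Y.mean()) ** 2).sum()  # regression (explained) sum of squares
SSE = ((Y - Y_hat) ** 2).sum()         # error sum of squares

assert abs(SST - (SSR + SSE)) < 1e-6   # SST = SSR + SSE holds
r2 = SSR / SST
print(round(SST, 1), round(r2, 5))  # 32600.5 0.58082
```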
Examples of Approximate r2 Values
r² = 1: perfect linear relationship between X and Y; 100% of the variation in Y is explained by variation in X.
[Sketches: scatter plots with all points exactly on the regression line]
Examples of Approximate r2 Values
0 < r² < 1: weaker linear relationships between X and Y; some but not all of the variation in Y is explained by variation in X.
[Sketches: scatter plots with points loosely clustered around the line]
Examples of Approximate r2 Values
r² = 0: no linear relationship between X and Y; the value of Y does not depend on X (none of the variation in Y is explained by variation in X).
[Sketch: scatter plot with a horizontal fitted line]
Excel Output
Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error       41.33032
Observations         10

r² = SSR / SST = 18934.9348 / 32600.5000 = 0.58082
58.08% of the variation in house prices is explained by variation in square feet
ANOVA
             df   SS           MS           F         Significance F
Regression   1    18934.9348   18934.9348   11.0848   0.01039
Residual     8    13665.5652   1708.1957
Total        9    32600.5000

              Coefficients   Standard Error   t Stat    P-value   Lower 95%    Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720    232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374      0.18580
Standard Error of Estimate
The standard deviation of the variation of observations around the regression line is estimated by
S_YX = sqrt( SSE / (n − 2) ) = sqrt( Σ_{i=1..n} (Yi − Ŷi)² / (n − 2) )

where:
SSE = error sum of squares
n = sample size
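S_YX for the house model follows directly from SSE = 13665.5652 and n = 10; a one-line check:

```python
import math

SSE = 13665.5652  # error sum of squares from the house price model
n = 10            # sample size

s_yx = math.sqrt(SSE / (n - 2))  # standard error of the estimate
print(round(s_yx, 5))  # 41.33032
```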
Excel Output
Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error       41.33032
Observations         10

S_YX = 41.33032
ANOVA
             df   SS           MS           F         Significance F
Regression   1    18934.9348   18934.9348   11.0848   0.01039
Residual     8    13665.5652   1708.1957
Total        9    32600.5000

              Coefficients   Standard Error   t Stat    P-value   Lower 95%    Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720    232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374      0.18580
Comparing Standard Errors
S_YX is a measure of the variation of observed Y values around the regression line.
[Sketches: two fitted lines, one with points tight around the line (small S_YX), one with points widely scattered (large S_YX)]
The magnitude of S_YX should always be judged relative to the size of the Y values in the sample data; e.g., S_YX = $41.33K is moderately small relative to house prices in the $200K to $300K range.
Assumptions of Regression
Use the acronym LINE: Linearity
The underlying relationship between X and Y is linear
Independence of Errors
Error values are statistically independent
Normality of Error
Error values (ε) are normally distributed for any given value of X
Equal Variance (Homoscedasticity)
The probability distribution of the errors has constant variance
Residual Analysis
ei = Yi − Ŷi

The residual for observation i, ei, is the difference between its observed and predicted value. Check the assumptions of regression by examining the residuals:
- Examine for linearity assumption
- Evaluate independence assumption
- Evaluate normal distribution assumption
- Examine for constant variance for all levels of X (homoscedasticity)
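Residuals are simple to compute once the line is fitted; a sketch on the house data (least-squares residuals average to zero by construction, so the useful information is in their pattern against X, not their mean):

```python
import numpy as np

X = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700], dtype=float)
Y = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255], dtype=float)

# Fit the least-squares line
b1 = ((X - X.mean()) * (Y - Y.mean())).sum() / ((X - X.mean()) ** 2).sum()
b0 = Y.mean() - b1 * X.mean()

residuals = Y - (b0 + b1 * X)  # e_i = Y_i - Yhat_i

# Plotting `residuals` vs. X is then used to check the linearity,
# independence, normality, and equal-variance assumptions.
print(abs(residuals.mean()) < 1e-9)  # True
```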
Graphical Analysis of Residuals
Can plot residuals vs. X
Residual Analysis for Linearity
[Sketches: residual plots vs. X — a curved pattern in the residuals indicates the relationship is not linear; a random scatter around zero indicates it is linear]
Residual Analysis for Independence
[Sketches: residual plots vs. X — a systematic pattern across observations indicates the errors are not independent; a random scatter indicates independence]
Residual Analysis for Normality
A normal probability plot of the residuals can be used to check for normality: Percent
[Normal probability plot: Percent vs. Residual — points falling along a straight line indicate approximately normal residuals]
Residual Analysis for Equal Variance
[Sketches: residual plots vs. X — a funnel shape (spread changing with X) indicates non-constant variance; a uniform band around zero indicates constant variance]
Excel Residual Output
RESIDUAL OUTPUT Predicted House Price 1 2 3 4 5 6 7 8 9 10 251.92316 273.87671 284.85348 304.06284 218.99284 268.38832 356.20251 367.17929 254.6674 284.85348 Residuals -6.923162 38.12329
Residuals 80 60 40 20 0 -20 -40 -60 Square Feet 0 1000 2000 3000
House Price Model Residual Plot
-5.853484 3.937162 -19.99284 -49.38832 48.79749 -43.17929 64.33264 -29.85348
Does not appear to violate any regression assumptions
Measuring Autocorrelation: The Durbin-Watson Statistic
- Used when data are collected over time to detect if autocorrelation is present
- Autocorrelation exists if residuals in one time period are related to residuals in another period
Autocorrelation
Autocorrelation is correlation of the errors (residuals) over time
[Time (t) residual plot: residuals cycle above and below zero over time]
Here, residuals show a cyclic pattern, not random scatter. Cyclical patterns are a sign of positive autocorrelation.
Violates the regression assumption that residuals are random and independent
The Durbin-Watson Statistic
The Durbin-Watson statistic is used to test for autocorrelation:
H0: residuals are not correlated
H1: positive autocorrelation is present
D = Σ_{i=2..n} (e_i − e_{i−1})² / Σ_{i=1..n} e_i²

- The possible range is 0 ≤ D ≤ 4
- D should be close to 2 if H0 is true
- D less than 2 may signal positive autocorrelation; D greater than 2 may signal negative autocorrelation
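The statistic is easy to compute from a residual series; a minimal sketch (the example residuals below are invented to show a positively autocorrelated, cyclic pattern):

```python
import numpy as np

def durbin_watson(residuals):
    """D = sum of squared successive differences / sum of squared residuals."""
    e = np.asarray(residuals, dtype=float)
    return (np.diff(e) ** 2).sum() / (e ** 2).sum()

# Hypothetical residuals with a slow cycle (positive autocorrelation)
e = [1.0, 2.0, 3.0, 2.0, 1.0, -1.0, -2.0, -3.0, -2.0, -1.0]
print(round(durbin_watson(e), 3))  # 0.316 -- well below 2
```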
Testing for Positive Autocorrelation
H0: positive autocorrelation does not exist
H1: positive autocorrelation is present

- Calculate the Durbin-Watson test statistic D (the Durbin-Watson statistic can be found using Excel or Minitab)
- Find the values dL and dU from the Durbin-Watson table (for sample size n and number of independent variables k)

Decision rule: reject H0 if D < dL

[Number line: Reject H0 (below dL) | Inconclusive (dL to dU) | Do not reject H0 (above dU)]
Testing for Positive Autocorrelation
(continued)
Suppose we have the following time series data:
[Time series plot: Sales vs. Time, with fitted trend line y = 30.65 + 4.7038x, R² = 0.8976]
Is there autocorrelation?
Testing for Positive Autocorrelation
Example with n = 25:
Excel/PHStat output:

Durbin-Watson Calculations
Sum of Squared Difference of Residuals   3296.18
Sum of Squared Residuals                 3279.98
Durbin-Watson Statistic                  1.00494

[Time series plot: Sales vs. Time, with fitted trend line y = 30.65 + 4.7038x, R² = 0.8976]

D = Σ_{i=2..n} (e_i − e_{i−1})² / Σ_{i=1..n} e_i² = 3296.18 / 3279.98 = 1.00494
Testing for Positive Autocorrelation
(continued)
- Here, n = 25 and there is k = 1 independent variable
- Using the Durbin-Watson table, dL = 1.29 and dU = 1.45
- D = 1.00494 < dL = 1.29, so reject H0 and conclude that significant positive autocorrelation exists
- Therefore the linear model is not the appropriate model to forecast sales

Decision: reject H0 since D = 1.00494 < dL = 1.29
[Number line: Reject H0 (below dL = 1.29) | Inconclusive (1.29 to dU = 1.45) | Do not reject H0 (above 1.45)]
Inferences About the Slope
The standard error of the regression slope coefficient (b1) is estimated by

S_b1 = S_YX / sqrt(SSX) = S_YX / sqrt( Σ(Xi − X̄)² )

where:
S_b1 = estimate of the standard error of the least squares slope
S_YX = sqrt( SSE / (n − 2) ) = standard error of the estimate
Excel Output
Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error       41.33032
Observations         10

S_b1 = 0.03297

ANOVA
             df   SS           MS           F         Significance F
Regression   1    18934.9348   18934.9348   11.0848   0.01039
Residual     8    13665.5652   1708.1957
Total        9    32600.5000

              Coefficients   Standard Error   t Stat    P-value   Lower 95%    Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720    232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374      0.18580
Comparing Standard Errors of the Slope
S_b1 is a measure of the variation in the slope of regression lines fitted from different possible samples.
[Sketches: bundles of sample regression lines — slopes tightly clustered (small S_b1) vs. widely varying (large S_b1)]
Inference about the Slope: t Test
t test for a population slope
Is there a linear relationship between X and Y?
Null and alternative hypotheses:
H0: β1 = 0   (no linear relationship)
H1: β1 ≠ 0   (linear relationship does exist)

Test statistic:

t = (b1 − β1) / S_b1,   d.f. = n − 2

where:
b1 = regression slope coefficient
β1 = hypothesized slope
S_b1 = standard error of the slope
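For the house model, the test statistic follows directly from the regression output; a sketch using the slide's values:

```python
b1 = 0.10977     # estimated slope (from the Excel output)
s_b1 = 0.03297   # standard error of the slope
beta1_h0 = 0.0   # hypothesized slope under H0

t_stat = (b1 - beta1_h0) / s_b1  # d.f. = n - 2 = 8
print(round(t_stat, 3))  # 3.329
```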
Inference about the Slope: t Test
(continued)
House Price in $1000s (Y)   Square Feet (X)
245                         1400
312                         1600
279                         1700
308                         1875
199                         1100
219                         1550
405                         2350
324                         2450
319                         1425
255                         1700
Simple Linear Regression Equation:
house price = 98.25 + 0.1098 (sq.ft.)
The slope of this model is 0.1098. Does square footage of the house affect its sales price?
Inferences about the Slope: t Test Example
H0: β1 = 0
H1: β1 ≠ 0

From Excel output:
              Coefficients   Standard Error   t Stat    P-value
Intercept     98.24833       58.03348         1.69296   0.12892
Square Feet   0.10977        0.03297          3.32938   0.01039

Here b1 = 0.10977 and S_b1 = 0.03297, so

t = (b1 − β1) / S_b1 = (0.10977 − 0) / 0.03297 = 3.32938
Inferences about the Slope: t Test Example
(continued)
Test statistic: t = 3.329

H0: β1 = 0
H1: β1 ≠ 0
d.f. = 10 − 2 = 8, so α/2 = .025 in each tail gives critical values ±t_{α/2} = ±2.3060

[Rejection regions: reject H0 if t < −2.3060 or t > 2.3060; do not reject H0 in between. Here t = 3.329 falls in the upper rejection region]

Decision: Reject H0.
Conclusion: There is sufficient evidence that square footage affects house price.
Inferences about the Slope: t Test Example
(continued)
P-value = 0.01039

H0: β1 = 0
H1: β1 ≠ 0

From the Excel output, the p-value for Square Feet is 0.01039. This is a two-tail test, so the p-value is P(t > 3.329) + P(t < −3.329) = 0.01039 (for 8 d.f.).

Decision: p-value < α, so reject H0.
Conclusion: There is sufficient evidence that square footage affects house price.
F Test for Significance
F test statistic:

F = MSR / MSE

where:
MSR = SSR / k
MSE = SSE / (n − k − 1)

F follows an F distribution with k numerator and (n − k − 1) denominator degrees of freedom (k = the number of independent variables in the regression model).
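F for the house model can be reproduced from the ANOVA sums of squares; a sketch:

```python
SSR, SSE = 18934.9348, 13665.5652  # from the house model ANOVA table
n, k = 10, 1                       # sample size, number of independent variables

MSR = SSR / k            # mean square due to regression
MSE = SSE / (n - k - 1)  # mean square error
F = MSR / MSE
print(round(F, 4))  # 11.0848
```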
Excel Output
Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error       41.33032
Observations         10

F = MSR / MSE = 18934.9348 / 1708.1957 = 11.0848
(with 1 and 8 degrees of freedom; the p-value for the F test is Significance F = 0.01039)

ANOVA
             df   SS           MS           F         Significance F
Regression   1    18934.9348   18934.9348   11.0848   0.01039
Residual     8    13665.5652   1708.1957
Total        9    32600.5000

              Coefficients   Standard Error   t Stat    P-value   Lower 95%    Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720    232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374      0.18580
F Test for Significance
(continued)
H0: β1 = 0
H1: β1 ≠ 0
α = .05, df1 = 1, df2 = 8
Critical value: F_.05 = 5.32

Test statistic: F = MSR / MSE = 11.08

Decision: reject H0 at α = 0.05, since F = 11.08 > F_.05 = 5.32.
Conclusion: There is sufficient evidence that house size affects selling price.

[F distribution: do not reject H0 for F below 5.32; reject H0 above]
Confidence Interval Estimate for the Slope
Confidence interval estimate of the slope:

b1 ± t_{n−2} S_b1     (d.f. = n − 2)

Excel printout for house prices:

              Coefficients   Standard Error   t Stat    P-value   Lower 95%    Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720    232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374      0.18580

At the 95% level of confidence, the confidence interval for the slope is (0.0337, 0.1858).
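The interval can be reproduced from the output values and the critical t; a sketch using the slide's numbers:

```python
b1 = 0.10977      # estimated slope
s_b1 = 0.03297    # standard error of the slope
t_crit = 2.3060   # t value for 95% confidence, d.f. = 10 - 2 = 8

lower = b1 - t_crit * s_b1
upper = b1 + t_crit * s_b1
print(round(lower, 4), round(upper, 4))  # 0.0337 0.1858
```

These match the Lower 95% and Upper 95% columns of the Excel printout.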
Confidence Interval Estimate for the Slope
(continued)

              Coefficients   Standard Error   t Stat    P-value   Lower 95%    Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720    232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374      0.18580

Since the units of the house price variable are $1000s, we are 95% confident that the average impact on sales price is between $33.70 and $185.80 per additional square foot of house size.

This 95% confidence interval does not include 0.
Conclusion: There is a significant relationship between house price and square feet at the .05 level of significance.
t Test for a Correlation Coefficient
Hypotheses:
H0: ρ = 0   (no correlation between X and Y)
H1: ρ ≠ 0   (correlation exists)

Test statistic (with n − 2 degrees of freedom):

t = (r − ρ) / sqrt( (1 − r²) / (n − 2) )

where r = +sqrt(r²) if b1 > 0, and r = −sqrt(r²) if b1 < 0
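The statistic can be reproduced from r = 0.76211 (the Multiple R in the house model output); a sketch:

```python
import math

r = 0.76211  # sample correlation between square feet and price
n = 10       # sample size

t_stat = (r - 0) / math.sqrt((1 - r ** 2) / (n - 2))  # d.f. = n - 2 = 8
print(round(t_stat, 3))  # 3.329
```

Note this equals the t statistic for the slope: testing ρ = 0 and testing β1 = 0 are equivalent in simple regression.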
Example: House Prices
Is there evidence of a linear relationship between square feet and house price at the .05 level of significance?
H0: ρ = 0   (no correlation)
H1: ρ ≠ 0   (correlation exists)
α = .05, d.f. = 10 − 2 = 8

t = (r − ρ) / sqrt( (1 − r²) / (n − 2) ) = (.762 − 0) / sqrt( (1 − .762²) / (10 − 2) ) = 3.329
Example: Test Solution
t = (r − ρ) / sqrt( (1 − r²) / (n − 2) ) = (.762 − 0) / sqrt( (1 − .762²) / (10 − 2) ) = 3.329

d.f. = 10 − 2 = 8, so α/2 = .025 in each tail gives critical values ±t_{α/2} = ±2.3060

[Rejection regions: reject H0 if t < −2.3060 or t > 2.3060; here t = 3.329 falls in the upper rejection region]

Decision: Reject H0.
Conclusion: There is evidence of a linear association at the 5% level of significance.
Estimating Mean Values and Predicting Individual Values
Goal: form intervals around Ŷ to express uncertainty about the value of Y for a given Xi

Ŷ = b0 + b1Xi

- Confidence interval for the mean of Y, given Xi
- Prediction interval for an individual Y, given Xi
Confidence Interval for the Average Y, Given X
Confidence interval estimate for the mean value of Y given a particular Xi:

Confidence interval for μ_{Y|X=Xi}:   Ŷ ± t_{n−2} S_YX sqrt(hi)

The size of the interval varies according to the distance of Xi from the mean, X̄:

hi = 1/n + (Xi − X̄)² / SSX = 1/n + (Xi − X̄)² / Σ(Xi − X̄)²
Prediction Interval for an Individual Y, Given X
Confidence interval estimate for an individual value of Y given a particular Xi:

Prediction interval for Y_{X=Xi}:   Ŷ ± t_{n−2} S_YX sqrt(1 + hi)

The extra 1 under the square root adds to the interval width to reflect the added uncertainty in predicting an individual case.
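Both margins for the house model at Xi = 2000 can be computed from these formulas; a sketch (S_YX and the critical t are taken from the earlier output):

```python
import math
import numpy as np

X = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700], dtype=float)
s_yx = 41.33032   # standard error of the estimate
t_crit = 2.3060   # t value for 95% confidence, d.f. = 8
x_i = 2000.0      # size of the house being predicted

# h_i = 1/n + (x_i - Xbar)^2 / sum((X - Xbar)^2)
h_i = 1 / len(X) + (x_i - X.mean()) ** 2 / ((X - X.mean()) ** 2).sum()

ci_margin = t_crit * s_yx * math.sqrt(h_i)       # for the mean of Y at x_i
pi_margin = t_crit * s_yx * math.sqrt(1 + h_i)   # for an individual Y at x_i
print(round(ci_margin, 2), round(pi_margin, 2))  # 37.12 102.28
```

The prediction-interval margin is much wider, reflecting the extra 1 under the square root.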
Estimation of Mean Values: Example
Confidence interval estimate for μ_{Y|X=Xi}

Find the 95% confidence interval for the mean price of 2,000-square-foot houses.

Predicted price: Ŷi = 317.85 ($1000s)

Ŷ ± t_{n−2} S_YX sqrt( 1/n + (Xi − X̄)² / Σ(Xi − X̄)² ) = 317.85 ± 37.12

The confidence interval endpoints are 280.66 and 354.90, or from $280,660 to $354,900 (endpoints computed from the unrounded prediction Ŷ = 317.78).
Estimation of Individual Values: Example
Prediction interval estimate for Y_{X=Xi}

Find the 95% prediction interval for an individual house with 2,000 square feet.

Predicted price: Ŷi = 317.85 ($1000s)

Ŷ ± t_{n−2} S_YX sqrt( 1 + 1/n + (Xi − X̄)² / Σ(Xi − X̄)² ) = 317.85 ± 102.28

The prediction interval endpoints are 215.50 and 420.07, or from $215,500 to $420,070 (endpoints computed from the unrounded prediction Ŷ = 317.78).
Finding Confidence and Prediction Intervals in Excel
- In Excel, use PHStat | Regression | Simple Linear Regression
- Check the "Confidence and Prediction Interval for X =" box and enter the X-value and confidence level desired
Finding Confidence and Prediction Intervals in Excel
(continued)
[Screenshots: PHStat dialog input values; output worksheet showing the confidence interval estimate for μ_{Y|X=Xi} and the prediction interval estimate for Y_{X=Xi}]
Pitfalls of Regression Analysis
- Lacking an awareness of the assumptions underlying least-squares regression
- Not knowing how to evaluate the assumptions
- Not knowing the alternatives to least-squares regression if a particular assumption is violated
- Using a regression model without knowledge of the subject matter
- Extrapolating outside the relevant range
Strategies for Avoiding the Pitfalls of Regression
- Start with a scatter diagram of X vs. Y to observe the possible relationship
- Perform residual analysis to check the assumptions
  - Plot the residuals vs. X to check for violations of assumptions such as homoscedasticity
  - Use a histogram, stem-and-leaf display, box-and-whisker plot, or normal probability plot of the residuals to uncover possible non-normality
Strategies for Avoiding the Pitfalls of Regression
(continued)
- If there is violation of any assumption, use alternative methods or models
- If there is no evidence of assumption violation, then test for the significance of the regression coefficients and construct confidence intervals and prediction intervals
- Avoid making predictions or forecasts outside the relevant range