Basic Business Statistics
Chapter 14: Introduction to Multiple Regression
The Multiple Regression Model
Idea: Examine the linear relationship between
1 dependent variable (Y) and 2 or more independent variables (Xi)

Multiple regression model with k independent variables:

Yi = β0 + β1X1i + β2X2i + ··· + βkXki + εi

where β0 is the Y-intercept, β1, …, βk are the population slopes, and εi is the random error
Multiple Regression Equation
The coefficients of the multiple regression model are
estimated using sample data
Multiple regression equation with k independent variables:
Ŷi = b0 + b1X1i + b2X2i + ··· + bkXki

where Ŷi is the estimated (or predicted) value of Y, b0 is the estimated intercept, and b1, …, bk are the estimated slope coefficients
In this chapter we will use Excel to obtain the regression
slope coefficients and other regression summary
measures.
Multiple Regression Equation
(continued)
Two-variable model:

Ŷ = b0 + b1X1 + b2X2

(Figure: the fitted regression plane over the X1 and X2 axes, with Y on the vertical axis)
Example:
2 Independent Variables
A distributor of frozen dessert pies wants to
evaluate factors thought to influence demand
Dependent variable: Pie sales (units per week)
Independent variables: Price (in $)
Advertising ($100’s)
Data are collected for 15 weeks
Pie Sales Example

Week   Pie Sales   Price ($)   Advertising ($100s)
  1       350        5.50            3.3
  2       460        7.50            3.3
  3       350        8.00            3.0
  4       430        8.00            4.5
  5       350        6.80            3.0
  6       380        7.50            4.0
  7       430        4.50            3.0
  8       470        6.40            3.7
  9       450        7.00            3.5
 10       490        5.00            4.0
 11       340        7.20            3.5
 12       300        7.90            3.2
 13       440        5.90            4.0
 14       450        5.00            3.5
 15       300        7.00            2.7

Multiple regression equation:

Sales = b0 + b1 (Price) + b2 (Advertising)
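As a check on the Excel and Minitab results that follow, the same least-squares fit can be reproduced from the data table above with NumPy (a sketch; the slides use Excel and Minitab, and NumPy is assumed to be available):

```python
import numpy as np

# Pie sales data from the table above
price = np.array([5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50,
                  6.40, 7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00])
adv   = np.array([3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0,
                  3.7, 3.5, 4.0, 3.5, 3.2, 4.0, 3.5, 2.7])
sales = np.array([350, 460, 350, 430, 350, 380, 430,
                  470, 450, 490, 340, 300, 440, 450, 300], dtype=float)

# Design matrix with an intercept column; solve by least squares
X = np.column_stack([np.ones_like(price), price, adv])
b, *_ = np.linalg.lstsq(X, sales, rcond=None)
b0, b1, b2 = b
print(f"Sales = {b0:.3f} + {b1:.3f}(Price) + {b2:.3f}(Advertising)")
```

The coefficients should agree with the Excel and Minitab output shown next (306.526, -24.975, 74.131).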
Excel Multiple Regression Output
Regression Statistics
Multiple R 0.72213
R Square 0.52148
Adjusted R Square 0.44172
Standard Error 47.46341
Observations 15
Sales = 306.526 - 24.975(Price) + 74.131(Advertising)
ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333
Coefficients Standard Error t Stat P-value Lower 95% Upper 95%
Intercept 306.52619 114.25389 2.68285 0.01993 57.58835 555.46404
Price -24.97509 10.83213 -2.30565 0.03979 -48.57626 -1.37392
Advertising 74.13096 25.96732 2.85478 0.01449 17.55303 130.70888
Minitab Multiple Regression Output
Sales = 306.526 - 24.975(Price) + 74.131(Advertising)
The regression equation is
Sales = 307 - 25.0 Price + 74.1 Advertising
Predictor Coef SE Coef T P
Constant 306.50 114.30 2.68 0.020
Price -24.98 10.83 -2.31 0.040
Advertising 74.13 25.97 2.85 0.014
S = 47.4634 R-Sq = 52.1% R-Sq(adj) = 44.2%
Analysis of Variance
Source DF SS MS F P
Regression 2 29460 14730 6.54 0.012
Residual Error 12 27033 2253
Total 14 56493
The Multiple Regression Equation

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)

where
Sales is in number of pies per week
Price is in $
Advertising is in $100's

b1 = -24.975: sales will decrease, on average, by 24.975 pies per week for each $1 increase in selling price, net of the effects of changes due to advertising

b2 = 74.131: sales will increase, on average, by 74.131 pies per week for each $100 increase in advertising, net of the effects of changes due to price
Using The Equation to Make Predictions

Predict sales for a week in which the selling price is $5.50 and advertising is $350:

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)
      = 306.526 - 24.975(5.50) + 74.131(3.5)
      = 428.62

Note that Advertising is in $100's, so $350 means that X2 = 3.5. Predicted sales is 428.62 pies.
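A minimal sketch of this prediction in Python, with coefficients taken from the fitted equation above:

```python
# Fitted coefficients from the regression output above
b0, b1, b2 = 306.526, -24.975, 74.131

def predict_sales(price_dollars, advertising_dollars):
    """Predicted weekly pie sales; advertising enters the model in $100s."""
    x2 = advertising_dollars / 100  # convert $ to the $100s scale used in the model
    return b0 + b1 * price_dollars + b2 * x2

print(round(predict_sales(5.50, 350), 2))  # 428.62
```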
Coefficient of Multiple Determination
Reports the proportion of total variation in Y
explained by all X variables taken together
r² = SSR / SST = regression sum of squares / total sum of squares
Multiple Coefficient of Determination In Excel

Regression Statistics
Multiple R         0.72213
R Square           0.52148
Adjusted R Square  0.44172
Standard Error     47.46341
Observations       15

r² = SSR / SST = 29460.0 / 56493.3 = .52148

52.1% of the variation in pie sales is explained by the variation in price and advertising
Multiple Coefficient of Determination In Minitab

From the Minitab output shown earlier (S = 47.4634, R-Sq = 52.1%, R-Sq(adj) = 44.2%):

r² = SSR / SST = 29460.0 / 56493.3 = .52148

52.1% of the variation in pie sales is explained by the variation in price and advertising
Adjusted r2
r2 never decreases when a new X variable is
added to the model
This can be a disadvantage when comparing
models
What is the net effect of adding a new variable?
We lose a degree of freedom when a new X
variable is added
Did the new X variable add enough
explanatory power to offset the loss of one
degree of freedom?
Adjusted r2
(continued)
Shows the proportion of variation in Y
explained by all X variables adjusted for the
number of X variables used
Penalizes excessive use of unimportant independent
variables
Smaller than r2
Useful for comparing models
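The adjustment uses the standard formula r²adj = 1 - (1 - r²)(n - 1)/(n - k - 1). A quick check against the pie sales output, using SSR and SST from the ANOVA table:

```python
# Values from the pie sales ANOVA table
SSR, SST = 29460.027, 56493.333
n, k = 15, 2  # 15 weeks of data, 2 independent variables

r2 = SSR / SST
r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(round(r2, 5), round(r2_adj, 5))  # 0.52148 0.44172
```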
Adjusted r2 in Excel

Regression Statistics
Multiple R         0.72213
R Square           0.52148
Adjusted R Square  0.44172
Standard Error     47.46341
Observations       15

r²adj = .44172

44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the number of independent variables
Adjusted r2 in Minitab

From the Minitab output shown earlier (R-Sq(adj) = 44.2%):

r²adj = .44172

44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the number of independent variables
Are Individual Variables Significant?

Use t tests of individual variable slopes
A t test shows whether there is a linear relationship
between the variable Xj and Y, holding
constant the effects of the other X variables
Hypotheses:
H0: βj = 0 (no linear relationship)
H1: βj ≠ 0 (linear relationship does exist
between Xj and Y)
Are Individual Variables Significant?
(continued)
H0: βj = 0 (no linear relationship)
H1: βj ≠ 0 (linear relationship does exist
between Xj and Y)
Test Statistic:

tSTAT = (bj - 0) / Sbj     (df = n - k - 1)

t Test: compare the t statistic bj / Sbj against the critical value determined by α and the degrees of freedom
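A sketch of the test statistic computed for the pie sales model, with coefficients and standard errors taken from the Excel output below:

```python
# (coefficient, standard error) for each slope, from the Excel output
coef = {"Price": (-24.97509, 10.83213), "Advertising": (74.13096, 25.96732)}

for name, (b, se) in coef.items():
    t_stat = (b - 0) / se  # test statistic under H0: beta_j = 0
    print(f"{name}: tSTAT = {t_stat:.3f}")
```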
Are Individual Variables Significant?
Excel Output
(continued)
Regression Statistics
Multiple R         0.72213
R Square           0.52148
Adjusted R Square  0.44172
Standard Error     47.46341
Observations       15

t Stat for Price is tSTAT = -2.306, with p-value .0398
t Stat for Advertising is tSTAT = 2.855, with p-value .0145
Inferences about the Slope:
t Test Example
From the Excel and Minitab output:
For Price, tSTAT = -2.306, with p-value .0398
For Advertising, tSTAT = 2.855, with p-value .0145

H0: βj = 0
H1: βj ≠ 0

d.f. = 15 - 2 - 1 = 12
α = .05
tα/2 = 2.1788

The test statistic for each variable falls in the rejection region (|tSTAT| > 2.1788; p-values < .05)

Decision: Reject H0 for each variable
Conclusion: There is evidence that both Price and Advertising affect pie sales at α = .05

(Figure: two-tailed rejection regions with α/2 = .025 in each tail and critical values ±2.1788)
Confidence Interval Estimate
for the Slope

Confidence interval for the population slope βj:

bj ± tα/2 Sbj     where t has (n – k – 1) d.f.

             Coefficients   Standard Error
Intercept     306.52619       114.25389
Price         -24.97509        10.83213
Advertising    74.13096        25.96732

Here, t has (15 – 2 – 1) = 12 d.f.

Example: Form a 95% confidence interval for the effect of changes in
price (X1) on pie sales:

-24.975 ± (2.1788)(10.832)

So the interval is (-48.576, -1.374)

(This interval does not contain zero, so price has a significant effect on sales)
Confidence Interval Estimate
for the Slope
(continued)
Confidence interval for the population slope βj
Coefficients Standard Error … Lower 95% Upper 95%
Intercept 306.52619 114.25389 … 57.58835 555.46404
Price -24.97509 10.83213 … -48.57626 -1.37392
Advertising 74.13096 25.96732 … 17.55303 130.70888
Example: Excel output also reports these interval endpoints:
Weekly sales are estimated to be reduced by between 1.37 and
48.58 pies for each increase of $1 in the selling price, holding the
effect of advertising constant
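The interval endpoints can be reproduced directly from the output values (t = 2.1788 for 12 d.f.; small differences from the Excel endpoints are rounding in the tabled t value):

```python
b1, se_b1 = -24.97509, 10.83213  # Price slope and its standard error
t_crit = 2.1788                  # t critical value, alpha = .05, 12 d.f.

margin = t_crit * se_b1
lower, upper = b1 - margin, b1 + margin
print(f"95% CI for the Price slope: ({lower:.3f}, {upper:.3f})")  # (-48.576, -1.374)
```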
Using Dummy Variables
A dummy variable is a categorical independent
variable with two levels:
yes or no, on or off, male or female
coded as 0 or 1
Assumes the slopes associated with numerical
independent variables do not change with the
value for the categorical variable
If more than two levels, the number of dummy
variables needed is (number of levels - 1)
Dummy-Variable Example
(with 2 Levels)
Ŷ = b0 + b1 X1 + b 2 X 2
Let:
Y = pie sales
X1 = price
X2 = holiday (X2 = 1 if a holiday occurred during the week)
(X2 = 0 if there was no holiday that week)
Dummy-Variable Example
(with 2 Levels)
(continued)
Ŷ = b0 + b1X1 + b2(1) = (b0 + b2) + b1X1     Holiday
Ŷ = b0 + b1X1 + b2(0) = b0 + b1X1            No Holiday

Different intercept, same slope

If H0: β2 = 0 is rejected, then "Holiday" has a significant effect on pie sales

(Figure: sales Y versus price X1, two parallel lines with intercepts b0 + b2 and b0)
Interpreting the Dummy Variable
Coefficient (with 2 Levels)
Example: Sales = 300 - 30(Price) + 15(Holiday)

Sales: number of pies sold per week
Price: pie price in $
Holiday: 1 if a holiday occurred during the week, 0 if no holiday occurred

b2 = 15: on average, sales are 15 pies greater in weeks with a holiday than in weeks without a holiday, given the same price
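A quick illustration of this interpretation, using the example equation above:

```python
def predicted_sales(price, holiday):
    """Example equation from the slide: Sales = 300 - 30(Price) + 15(Holiday)."""
    return 300 - 30 * price + 15 * holiday

# At any fixed price, the holiday week is predicted to sell 15 more pies
diff = predicted_sales(6.00, holiday=1) - predicted_sales(6.00, holiday=0)
print(diff)  # 15
```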
Dummy-Variable Models
(more than 2 Levels)
The number of dummy variables is one less than
the number of levels
Example:
Y = house price ; X1 = square feet
If style of the house is also thought to matter:
Style = ranch, split level, colonial
Three levels, so two dummy
variables are needed
Dummy-Variable Models
(more than 2 Levels)
(continued)
Example: Let “colonial” be the default category, and
let X2 and X3 be used for the other two categories:
Y = house price
X1 = square feet
X2 = 1 if ranch, 0 otherwise
X3 = 1 if split level, 0 otherwise
The multiple regression equation is:
Ŷ = b0 + b1X1 + b2X2 + b3X3
Interpreting the Dummy Variable
Coefficients (with 3 Levels)
Consider the regression equation:

Ŷ = 20.43 + 0.045X1 + 23.53X2 + 18.84X3

For a colonial:     X2 = X3 = 0      Ŷ = 20.43 + 0.045X1
For a ranch:        X2 = 1, X3 = 0   Ŷ = 20.43 + 0.045X1 + 23.53
For a split level:  X2 = 0, X3 = 1   Ŷ = 20.43 + 0.045X1 + 18.84

With the same square feet, a ranch will have an estimated average price of 23.53 thousand dollars more than a colonial, and a split level will have an estimated average price of 18.84 thousand dollars more than a colonial.
Interaction Between Independent
Variables
Hypothesizes interaction between pairs of X
variables
Response to one X variable may vary at different
levels of another X variable
Contains two-way cross product terms
Ŷ = b0 + b1X1 + b2X2 + b3X3
  = b0 + b1X1 + b2X2 + b3(X1X2)
Interaction Between Independent
Variables
Ŷ = b0 + b1X1 + b2X2 + b3X3
  = b0 + b1X1 + b2X2 + b3(X1X2)

When X2 = 0:  Ŷ = b0 + b1X1
When X2 = 1:  Ŷ = (b0 + b2) + (b1 + b3)X1
Interaction Example

Suppose X2 is a dummy variable and the estimated
regression equation is Ŷ = 1 + 2X1 + 3X2 + 4X1X2

X2 = 1:  Ŷ = 1 + 2X1 + 3(1) + 4X1(1) = 4 + 6X1
X2 = 0:  Ŷ = 1 + 2X1 + 3(0) + 4X1(0) = 1 + 2X1

Slopes are different if the effect of X1 on Y depends on the X2 value

(Figure: the two fitted lines plotted for X1 from 0 to 1.5)
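A minimal check of the two slopes implied by the interaction term:

```python
def y_hat(x1, x2):
    """Estimated equation from the example: Yhat = 1 + 2*X1 + 3*X2 + 4*X1*X2."""
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

# Slope in X1 at each level of the dummy X2 (rise over a unit run in X1)
slope_x2_0 = y_hat(1, 0) - y_hat(0, 0)  # b1 = 2
slope_x2_1 = y_hat(1, 1) - y_hat(0, 1)  # b1 + b3 = 6
print(slope_x2_0, slope_x2_1)
```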