5.6 Multiple Linear Regression
We’ve covered Simple Linear Regression, where “Simple” means just one independent variable. Next we’ll talk about Multiple Linear Regression, where “Multiple” means, unsurprisingly, more than one independent variable.
\[Y = b_0 + b_1X_1 + b_2X_2 + b_3X_3 + \ldots\]
This is actually quite a straightforward extension, once we get the hang of the interpretation. We simply extend the linear model to include \(X_2\), \(X_3\) and so forth, and now \(b_1\) is called the coefficient on \(X_1\), or the partial regression coefficient on \(X_1\).
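Nothing about the fitting procedure changes when we add predictors: we still estimate the coefficients by ordinary least squares, just with a wider design matrix. Here is a minimal sketch in Python using NumPy on made-up data (the true coefficients 3.0, 2.0 and -1.5 are assumptions chosen purely for illustration):

```python
import numpy as np

# Illustrative data: 100 observations, two predictors (values are made up)
rng = np.random.default_rng(0)
X1 = rng.uniform(0, 10, size=100)
X2 = rng.uniform(0, 5, size=100)
y = 3.0 + 2.0 * X1 - 1.5 * X2 + rng.normal(0, 1, size=100)

# Design matrix: a column of ones for the intercept b0, then X1 and X2
X = np.column_stack([np.ones_like(X1), X1, X2])

# Ordinary least squares solves for b = (b0, b1, b2)
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(b)  # roughly [3.0, 2.0, -1.5], recovering the values used above
```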
Here’s the most difficult part.
When we interpret each partial coefficient, we do so holding all the other independent variables (IVs) constant. So \(b_1\) represents the expected change in \(Y\) when \(X_1\) increases by one unit, holding all the other variables constant.
This is so important that I’ll say it twice more. If you really understand this point, then you know how to do multiple regression.
The partial regression coefficients represent the expected change in the dependent variable when the associated independent variable is increased by one unit while the values of all other independent variables are held constant.
And a third time, in different words:
Each coefficient \(b_i\) estimates the mean change in the dependent variable (\(Y\)) per unit increase in \(X_i\), when all other predictors are held constant.
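This interpretation falls straight out of the model equation: compare the predicted value at \(X_1 + 1\) with the predicted value at \(X_1\), leaving every other predictor at the same value, and everything except \(b_1\) cancels:

\[\left[b_0 + b_1(X_1 + 1) + b_2X_2 + b_3X_3 + \ldots\right] - \left[b_0 + b_1X_1 + b_2X_2 + b_3X_3 + \ldots\right] = b_1\]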
Here’s an example:
\[\text{Profit} = -2000 + 2.5 \times \text{ExpenditureOnAdvertising} + 32 \times \text{NumberOfProductsSold}\]
Coefficient | Interpretation
---|---
\(b_0\) | Monthly profit is -$2000 (a $2000 loss) when nothing is spent on advertising and zero products are sold.
\(b_1\) | Holding the number of products sold constant, every dollar spent on advertising increases profit by $2.50.
\(b_2\) | Holding advertising expenditure constant, every product sold increases profit by $32.
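To make the “holding constant” idea concrete, here is a short sketch that plugs two scenarios into the profit equation above (the scenario values 1000 and 500 are made up). Only advertising expenditure changes, by one dollar, while products sold stays fixed, so the difference in predicted profit is exactly \(b_1 = 2.5\):

```python
def profit(advertising, products_sold):
    """Predicted monthly profit from the example equation above."""
    return -2000 + 2.5 * advertising + 32 * products_sold

# Two scenarios that differ only by one dollar of advertising,
# with the number of products sold held constant at 500.
baseline = profit(advertising=1000, products_sold=500)
one_more_dollar = profit(advertising=1001, products_sold=500)

print(one_more_dollar - baseline)  # 2.5, i.e. b1
```

The same exercise with products sold increased by one (advertising held fixed) gives a difference of 32, which is \(b_2\).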