## 4.4 Partial Effects

As we discussed at the beginning of this chapter, besides predicting outcomes, a second main goal for us is to think about the partial effect of a regressor on the outcome. We'll consider partial effects over the next few sections.

Consider the model \[\begin{align*} \mathbb{E}[Y | X_1, X_2, X_3] &= \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \beta_3 X_3 \end{align*}\]

If \(X_1\) is continuous, then \[\begin{align*} \beta_1 = \frac{\partial \mathbb{E}[Y|X_1,X_2,X_3]}{\partial X_1} \end{align*}\]

Thus, \(\beta_1\) is the partial effect of \(X_1\) on \(Y\). In other words, \(\beta_1\) should be interpreted as how much \(Y\) increases, on average, when \(X_1\) increases by one unit holding \(X_2\) and \(X_3\) constant. *Make sure to get this interpretation right!*

**Example 4.2** Continuing the same example as above about intergenerational income mobility, where \(Y\) denotes child's income, \(X_1\) denotes parents' income, \(X_2\) denotes mother's education, and

\[ \mathbb{E}[Y|X_1,X_2] = 20,000 + 0.5 X_1 + 1000 X_2 \] the partial effect of parents' income on child's income is 0.5. This means that, for every one-dollar increase in parents' income, child's income is 0.5 dollars higher on average, holding mother's education constant.
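This calculation can be checked directly. Writing the example's conditional expectation as an R function (using the hypothetical coefficients from the example, and picking an arbitrary value of 12 years for mother's education), a one-dollar increase in parents' income changes predicted child's income by exactly the coefficient on \(X_1\):

```
Ey <- function(x1, x2) 20000 + 0.5 * x1 + 1000 * x2  # example's coefficients
# one-dollar increase in parents' income, mother's education held fixed at 12
Ey(50001, 12) - Ey(50000, 12)
#> [1] 0.5
```

Note that, because the model is linear, the answer is the same no matter which starting values for \(X_1\) and \(X_2\) we plug in.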

### 4.4.1 Computation

Let’s run the same regression as in the previous section, but think about partial effects in this case.

```
reg1 <- lm(mpg ~ hp + wt, data=mtcars)
summary(reg1)
#>
#> Call:
#> lm(formula = mpg ~ hp + wt, data = mtcars)
#>
#> Residuals:
#> Min 1Q Median 3Q Max
#> -3.941 -1.600 -0.182 1.050 5.854
#>
#> Coefficients:
#> Estimate Std. Error t value Pr(>|t|)
#> (Intercept) 37.22727 1.59879 23.285 < 2e-16 ***
#> hp -0.03177 0.00903 -3.519 0.00145 **
#> wt -3.87783 0.63273 -6.129 1.12e-06 ***
#> ---
#> Signif. codes:
#> 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
#>
#> Residual standard error: 2.593 on 29 degrees of freedom
#> Multiple R-squared: 0.8268, Adjusted R-squared: 0.8148
#> F-statistic: 69.21 on 2 and 29 DF, p-value: 9.109e-12
```

The partial effect of horsepower on miles per gallon is -0.032. In other words, we estimate that if horsepower increases by one then, on average, miles per gallon decreases by 0.032 holding weight constant.

The t-statistic and p-value are computed for the null hypothesis that the corresponding coefficient is equal to 0. For example, for `hp` the t-statistic is equal to `-3.519`, which is greater than 1.96 in absolute value and indicates that the partial effect of `hp` is statistically significant at a 5% significance level. The corresponding p-value for `hp` is `0.00145`, indicating that there is only about a 0.15% chance of getting a t-statistic this extreme if the partial effect of `hp` were actually 0 (i.e., under \(H_0 : \beta_1=0\)).
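Rather than reading these numbers off the printed output, they can also be pulled from the coefficient table returned by `summary` (a sketch using the same regression as above):

```
reg1 <- lm(mpg ~ hp + wt, data=mtcars)
ctab <- summary(reg1)$coefficients  # matrix of estimates, SEs, t-stats, p-values
ctab["hp", "t value"]   # t-statistic for hp (about -3.52)
ctab["hp", "Pr(>|t|)"]  # p-value for hp (about 0.00145)
```

Extracting the numbers this way avoids transcription errors when reporting results.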

Practice: What is the partial effect of `wt` in the previous example? Provide a careful interpretation. Is the partial effect of `wt` statistically significant? Explain. What is the p-value for `wt`? How do you interpret the p-value?

Side-Comment: One horsepower is a very small increase in horsepower, so it might be a good idea to multiply the coefficient by some larger number, say 50. In this case, we could say that we estimate that if horsepower increases by 50 then, on average, miles per gallon decreases by 1.59 (\(=50 \times 0.03177\)) holding weight constant. From the above discussion, we know that this effect is statistically different from 0. That said, it is not clear to me whether we should interpret this as a large partial effect; I do not know too much about cars, but a 50 horsepower increase seems rather large while a 1.59 decrease in miles per gallon seems relatively small (at least to me).
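The rescaled effect can be computed directly from the fitted model rather than by hand (again a sketch based on the regression above):

```
reg1 <- lm(mpg ~ hp + wt, data=mtcars)
50 * coef(reg1)["hp"]  # estimated effect of a 50-horsepower increase (about -1.59)
```

Multiplying the coefficient by a constant like this is valid precisely because the model is linear: the partial effect per unit of `hp` is the same everywhere.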