## 3.9 Mean and Variance of Linear Functions

SW 2.2

For this part, suppose that $$Y=a + bX$$ where $$Y$$ and $$X$$ are random variables while $$a$$ and $$b$$ are fixed constants.

Properties of Expectations

1. $$\mathbb{E}[a] = a$$ [In words: the expected value of a constant is just the constant. This holds because there is nothing random about $$a$$ — we just know what it is.]

2. $$\mathbb{E}[bX] = b\mathbb{E}[X]$$ [In words: the expected value of a constant times a random variable is equal to the constant times the expected value of the random variable. We will use this property often this semester.]

3. $$\mathbb{E}[a + bX] = a + b\mathbb{E}[X]$$ [In words: expected values “pass through” sums. We will use this property often this semester.]

You’ll also notice the similarity between the properties of summations and expectations. This is not a coincidence — it holds because expectations are defined as summations (or very closely related, as integrals).
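Since expectations of discrete random variables are just probability-weighted sums, we can check the three properties numerically. The sketch below is illustrative only (the support, probabilities, and constants are made up for the example): it computes $$\mathbb{E}[a + bX]$$ directly from the definition of expectation and compares it with $$a + b\mathbb{E}[X]$$.

```python
# Illustrative check of Expectation Property 3 (values assumed for this sketch):
# compute E[a + bX] directly from the definition and compare with a + b*E[X].

x_vals = [1, 2, 3]          # support of X (assumed for illustration)
probs = [0.2, 0.5, 0.3]     # P(X = x) for each value in the support

a, b = 4.0, 2.0             # fixed constants (assumed for illustration)

# E[X] from the definition: a probability-weighted sum
E_X = sum(p * x for x, p in zip(x_vals, probs))

# E[a + bX] computed directly from the definition, without using the property
E_aplusbX = sum(p * (a + b * x) for x, p in zip(x_vals, probs))

print(E_X)            # 2.1
print(E_aplusbX)      # 8.2, computed from the definition
print(a + b * E_X)    # 8.2, computed from Property 3; they agree
```

Changing the support, probabilities, or constants will change the numbers, but the two ways of computing $$\mathbb{E}[a + bX]$$ will always agree.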

Properties of Variance

1. $$\mathrm{var}(a) = 0$$ [In words: the variance of a constant is equal to 0.]

2. $$\mathrm{var}(bX) = b^2 \mathrm{var}(X)$$ [In words: a constant can come out of the variance, but it needs to be squared first.]

3. $$\mathrm{var}(a + bX) = \mathrm{var}(bX) = b^2 \mathrm{var}(X)$$ [In words: adding a constant does not change the variance. This holds because variance measures spread around the mean, and adding a constant shifts the random variable and its mean by the same amount.]
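As with the expectation properties, we can verify the variance properties numerically. The sketch below (again using made-up values for the support, probabilities, and constants) computes $$\mathrm{var}(a + bX)$$ directly from the definition of variance and compares it with $$b^2 \mathrm{var}(X)$$.

```python
# Illustrative check of Variance Property 3 (values assumed for this sketch):
# compute var(a + bX) directly from the definition and compare with b^2 * var(X).

x_vals = [1, 2, 3]          # support of X (assumed for illustration)
probs = [0.2, 0.5, 0.3]     # P(X = x) for each value in the support
a, b = 4.0, 2.0             # fixed constants (assumed for illustration)

E_X = sum(p * x for x, p in zip(x_vals, probs))
var_X = sum(p * (x - E_X) ** 2 for x, p in zip(x_vals, probs))

# var(a + bX) from the definition; the mean of a + bX is a + b*E[X]
E_Y = a + b * E_X
var_aplusbX = sum(p * ((a + b * x) - E_Y) ** 2 for x, p in zip(x_vals, probs))

print(var_X)            # ≈ 0.49
print(var_aplusbX)      # ≈ 1.96, computed from the definition
print(b ** 2 * var_X)   # ≈ 1.96, computed from Property 3; they agree
```

Notice that the constant $$a$$ drops out entirely: changing it has no effect on `var_aplusbX`, exactly as Property 3 says.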

Example 3.11 Later on in the semester, it will sometimes be convenient for us to “standardize” some random variables. We’ll talk more about the reason to do this later, but for now, I’ll just give the typical formula for standardizing a random variable and we’ll see if we can figure out what the mean and variance of the standardized random variable are.

$$Y = \frac{X - \mathbb{E}[X]}{\sqrt{\mathrm{var}(X)}}$$ Just to be clear here, we are standardizing the random variable $$X$$ and calling its standardized version $$Y$$. Let’s calculate its mean:

\begin{aligned} \mathbb{E}[Y] &= \mathbb{E}\left[ \frac{X - \mathbb{E}[X]}{\sqrt{\mathrm{var}(X)}} \right] \\ &= \frac{1}{\sqrt{\mathrm{var}(X)}} \mathbb{E}\big[ X - \mathbb{E}[X] \big] \\ &= \frac{1}{\sqrt{\mathrm{var}(X)}} \left( \mathbb{E}[X] - \mathbb{E}\big[\mathbb{E}[X]\big] \right) \\ &= \frac{1}{\sqrt{\mathrm{var}(X)}} \left( \mathbb{E}[X] - \mathbb{E}[X] \right) \\ &= 0 \end{aligned} where the first equality just comes from the definition of $$Y$$, the second equality holds because $$1/\sqrt{\mathrm{var}(X)}$$ is a constant and can therefore come out of the expectation, the third equality holds because the expectation can pass through the difference, the fourth equality holds because $$\mathbb{E}[X]$$ is a constant and therefore $$\mathbb{E}\big[\mathbb{E}[X]\big] = \mathbb{E}[X]$$, and the last equality holds because the term in parentheses is equal to 0. Thus, the mean of $$Y$$ is equal to 0. Now let’s calculate the variance.

\begin{aligned} \mathrm{var}(Y) &= \mathrm{var}\left( \frac{X}{\sqrt{\mathrm{var}(X)}} - \frac{\mathbb{E}[X]}{\sqrt{\mathrm{var}(X)}} \right) \\ &= \mathrm{var}\left( \frac{X}{\sqrt{\mathrm{var}(X)}}\right) \\ &= \left( \frac{1}{\sqrt{\mathrm{var}(X)}} \right)^2 \mathrm{var}(X) \\ &= \frac{\mathrm{var}(X)}{\mathrm{var}(X)} \\ &= 1 \end{aligned} where the first equality holds by the definition of $$Y$$, the second equality holds because the second term is a constant and by Variance Property 3 above, the third equality holds because $$(1/\sqrt{\mathrm{var}(X)})$$ is a constant and can come out of the variance but needs to be squared, the fourth equality holds by squaring the term on the left, and the last equality holds by cancelling the numerator and denominator.

Therefore, we have shown that the mean of the standardized random variable is 0 and its variance is 1. This is, in fact, the goal of standardizing a random variable: to transform it so that it has mean 0 and variance 1. The particular transformation given in this example delivers a new random variable with exactly these properties.