Homework 2

\(\newcommand{\E}{\mathbb{E}}\)

Due: At the start of class on Thursday, Feb. 12. Please turn in a hard copy.

Textbook Questions 4.6, 7.7, 7.28 (part a only), and 9.29. (In case 9.29 is unclear: for the regression in part (a), include as regressors education, indicators for the demographic groups mentioned in the problem, and the interaction terms mentioned in the problem.)

Extra Question 1 The Fibonacci sequence is the sequence of numbers \(0,1,1,2,3,5,8,13,21,34,55,\ldots\) that comes from starting with \(0\) and \(1\) and where each subsequent number is the sum of the previous two. For example, the 5 in the sequence comes from adding 2 and 3; the 55 in the sequence comes from adding 21 and 34.

  1. Write a function called fibonacci that takes a number n and returns the nth element of the Fibonacci sequence (counting from the first element, 0). For example, fibonacci(5) should return 3 and fibonacci(8) should return 13.

  2. Consider an alternative sequence where, starting with the third element, each element is the sum of the previous two (the same rule as the Fibonacci sequence) but where the first two elements can be arbitrary. Write a function alt_seq(a,b,n) where a is the first element of the sequence, b is the second element, and n indicates which element of the sequence to return. For example, if \(a=3\) and \(b=7\), then the sequence would be \(3,7,10,17,27,44,71,\ldots\) and alt_seq(a=3,b=7,n=4) = 17.
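To make the 1-based indexing convention in the examples above concrete, here is a small sketch (the helper name seq is made up for illustration; it is not the required function signature) that generates such a sequence by repeatedly applying the sum-of-previous-two rule:

```python
def seq(a, b, n):
    """Return the nth element (1-indexed) of the sequence that starts
    with a, b, where each later element is the sum of the previous two."""
    for _ in range(n - 1):
        a, b = b, a + b  # shift the window forward one step
    return a

# The Fibonacci sequence is the special case a=0, b=1:
print(seq(0, 1, 5))   # 3
print(seq(0, 1, 8))   # 13
print(seq(3, 7, 4))   # 17
```

This matches the worked examples in the problem statement: the 5th Fibonacci number is 3, the 8th is 13, and the 4th element of the sequence starting 3, 7 is 17.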

Extra Question 2 Consider the following long linear projection:

\[ Y = X_1'\beta_1 + X_2'\beta_2 + e\]

where \(X_1\) is a \(k_1\) dimensional vector and \(X_2\) is a \(k_2\) dimensional vector, as well as the following auxiliary linear projections:

\[ Y = X_2'\delta_2 + u \quad \text{and} \quad X_1 = \Lambda_{12} X_2 + v \]

where \(\Lambda_{12}\) is a \(k_1 \times k_2\) matrix of coefficients from projecting each element of \(X_1\) on \(X_2\). From the Frisch-Waugh-Lovell theorem, we know that

\[ \beta_1 = \left( \E[v v'] \right)^{-1} \E[v u] \]
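As a quick numerical sanity check of this identity (a simulation sketch with made-up coefficient values, not part of what you are asked to turn in), one can simulate data from the long projection and confirm that the sample analogue of \(\left(\E[vv']\right)^{-1}\E[vu]\) recovers \(\beta_1\):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k1, k2 = 100_000, 2, 3

# Simulated design: X1 is correlated with X2 through an arbitrary loading matrix
X2 = rng.normal(size=(n, k2))
X1 = X2 @ rng.normal(size=(k2, k1)) + rng.normal(size=(n, k1))

# Arbitrary "true" long-projection coefficients (chosen for illustration)
beta1 = np.array([1.5, -0.7])
beta2 = np.array([0.3, 0.0, 2.0])
Y = X1 @ beta1 + X2 @ beta2 + rng.normal(size=n)

# v: residuals from projecting each column of X1 on X2
# u: residuals from projecting Y on X2
v = X1 - X2 @ np.linalg.lstsq(X2, X1, rcond=None)[0]
u = Y - X2 @ np.linalg.lstsq(X2, Y, rcond=None)[0]

# Sample analogue of (E[vv'])^{-1} E[vu]
beta1_fwl = np.linalg.solve(v.T @ v, v.T @ u)
print(beta1_fwl)  # approximately [1.5, -0.7] up to sampling error
```

With a large simulated sample, beta1_fwl should be close to the \(\beta_1\) used to generate the data, which is the content of the Frisch-Waugh-Lovell theorem.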

  1. Show that, in the special case where \(k_1 = 1\), \(\beta_1 = \frac{\E[vY]}{\E[v^2]}\).

  2. Show that, in the special case where \(X_2=1\) (i.e., \(X_2\) consists only of the intercept), \(\beta_1 = \frac{\text{Cov}(X_1, Y)}{\text{Var}(X_1)}\).