(Solution Library)
Question: Consider two separate linear regression models
\[\underset{(n \times 1)}{y_{1}}=\underset{\left(n \times k_{1}\right)}{X_{1}}\,\underset{\left(k_{1} \times 1\right)}{\beta_{1}}+\underset{(n \times 1)}{u_{1}}\tag{1}\]
and
\[\underset{(n \times 1)}{y_{2}}=\underset{\left(n \times k_{2}\right)}{X_{2}}\,\underset{\left(k_{2} \times 1\right)}{\beta_{2}}+\underset{(n \times 1)}{u_{2}}\tag{2}\]
For concreteness, assume that the vector \(y_{1}\) contains observations on the wealth of \(n\) randomly selected individuals in Australia and \(y_{2}\) contains observations on the wealth of \(n\) randomly selected individuals in New Zealand. The matrix \(X_{1}\) contains \(n\) observations on \(k_{1}\) explanatory variables which are believed to affect individual wealth in Australia, and the matrix \(X_{2}\) contains \(n\) observations on \(k_{2}\) explanatory variables which are believed to affect individual wealth in New Zealand. Notice that we are not assuming that the same regressors appear in both \(X_{1}\) and \(X_{2}\). For simplicity, we assume that
\[\operatorname{Var}\left(u_{i}\right)=\sigma_{i}^{2} I_{n}, \quad i=1,2.\]
In econometrics we often wish to combine two or more linear regression equations into a single equation.
(a) Combine the linear regression models given by (1) and (2) into a single model of the form
\[y=X \beta+u\tag{3}\]
where \(y\), \(u\) and \(\beta\) are partitioned vectors and \(X\) is a partitioned matrix. Carefully specify the dimensions of each sub-matrix (sub-vector) of \(y\), \(u\), \(\beta\) and \(X\) in (3). Note: the partitioning must satisfy the conformability conditions for matrix addition and matrix multiplication.
(b) The formula for the OLS estimator of \(\beta\) in (3), which we denote by \(b\), is
\[b=\left(X^{\prime} X\right)^{-1} X^{\prime} y.\tag{4}\]
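As an illustrative sketch (not part of the original question), the stacked model in (3) can be built numerically with a block-diagonal regressor matrix. The dimensions \(n\), \(k_1\), \(k_2\) and the simulated data below are arbitrary assumptions for the demonstration.

```python
import numpy as np

# Assumed dimensions for the demo only.
rng = np.random.default_rng(0)
n, k1, k2 = 50, 3, 2

X1 = rng.standard_normal((n, k1))   # Australian regressors (n x k1)
X2 = rng.standard_normal((n, k2))   # New Zealand regressors (n x k2)
y1 = X1 @ rng.standard_normal(k1) + rng.standard_normal(n)
y2 = X2 @ rng.standard_normal(k2) + rng.standard_normal(n)

# Stack: y is (2n x 1), beta is ((k1+k2) x 1), and X is the
# (2n x (k1+k2)) block-diagonal matrix [[X1, 0], [0, X2]].
y = np.concatenate([y1, y2])
X = np.block([[X1, np.zeros((n, k2))],
              [np.zeros((n, k1)), X2]])

# OLS from (4): b = (X'X)^(-1) X'y, computed via a linear solve.
b = np.linalg.solve(X.T @ X, X.T @ y)
print(X.shape)  # (100, 5)
```

Using `np.linalg.solve` rather than forming the inverse explicitly is the standard numerically stable way to evaluate (4).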
i) Derive the partitioned matrix \(X^{\prime} X\). Specify the dimensions of each element of the partitioned matrix. Hint: Let the partitioned matrix
\[A=\left[\begin{array}{ll} A_{1} & A_{2} \\ A_{3} & A_{4} \end{array}\right]\]
Then
\[A^{\prime}=\left[\begin{array}{ll} A_{1}^{\prime} & A_{3}^{\prime} \\ A_{2}^{\prime} & A_{4}^{\prime} \end{array}\right]\]
ii) Use the properties of partitioned matrices stated in lectures to derive the partitioned matrix \(\left(X^{\prime} X\right)^{-1}\). Specify the dimensions of each element of the matrix.
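A quick numerical check of what parts i) and ii) assert (a sketch with assumed dimensions, not a derivation): when \(X\) is block diagonal, the off-diagonal blocks of \(X^{\prime}X\) vanish, and its inverse is obtained by inverting each diagonal block separately.

```python
import numpy as np

# Assumed shapes for illustration only.
rng = np.random.default_rng(1)
n, k1, k2 = 40, 3, 2
X1 = rng.standard_normal((n, k1))
X2 = rng.standard_normal((n, k2))
X = np.block([[X1, np.zeros((n, k2))],
              [np.zeros((n, k1)), X2]])

XtX = X.T @ X
# Off-diagonal blocks are X1' * 0 and 0 * X2, i.e. exactly zero:
print(np.allclose(XtX[:k1, k1:], 0))  # True

# The inverse of the block-diagonal X'X is block diagonal with
# blocks (X1'X1)^(-1) and (X2'X2)^(-1):
inv_blockwise = np.block([
    [np.linalg.inv(X1.T @ X1), np.zeros((k1, k2))],
    [np.zeros((k2, k1)), np.linalg.inv(X2.T @ X2)],
])
print(np.allclose(np.linalg.inv(XtX), inv_blockwise))  # True
```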
iii) Derive the partitioned vector \(X^{\prime} y\). Specify the dimensions of each element of the vector.
iv) Show that the partitioned vector \(b\) in (4) may be written as
\[b=\left[\begin{array}{l} b_{1} \\ b_{2} \end{array}\right]\]
where
\[\begin{aligned} b_{1} &= \left(X_{1}^{\prime} X_{1}\right)^{-1} X_{1}^{\prime} y_{1} \\ b_{2} &= \left(X_{2}^{\prime} X_{2}\right)^{-1} X_{2}^{\prime} y_{2}. \end{aligned}\]
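The claimed result can be verified numerically: the OLS estimator of the stacked system coincides with the two equation-by-equation OLS estimators. The dimensions and simulated data below are arbitrary choices for the check.

```python
import numpy as np

# Assumed dimensions for the demonstration only.
rng = np.random.default_rng(2)
n, k1, k2 = 60, 4, 3
X1 = rng.standard_normal((n, k1))
X2 = rng.standard_normal((n, k2))
y1 = X1 @ rng.standard_normal(k1) + rng.standard_normal(n)
y2 = X2 @ rng.standard_normal(k2) + rng.standard_normal(n)

# Stacked system from (3).
X = np.block([[X1, np.zeros((n, k2))],
              [np.zeros((n, k1)), X2]])
y = np.concatenate([y1, y2])

b = np.linalg.solve(X.T @ X, X.T @ y)        # stacked OLS, equation (4)
b1 = np.linalg.solve(X1.T @ X1, X1.T @ y1)   # OLS on equation (1) alone
b2 = np.linalg.solve(X2.T @ X2, X2.T @ y2)   # OLS on equation (2) alone

print(np.allclose(b, np.concatenate([b1, b2])))  # True
```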
(c) What is the practical implication of the result in (b) iv) above?
(d) Assume that the matrix \(X\) in (3) is an \(n \times k\) matrix. Prove that if the column rank of \(X\) is less than \(k\), then \(X^{\prime} X\) is a singular matrix and the OLS estimator of \(\beta\) in (3) is not defined.
Hint: Use the properties of the rank operator stated in the lecture notes.
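The phenomenon in part (d) is easy to exhibit numerically (a sketch, not the requested proof): forcing one column of \(X\) to be an exact linear combination of the others reduces the column rank below \(k\), and the rank of \(X^{\prime} X\) drops with it, so \(\left(X^{\prime} X\right)^{-1}\) does not exist. The dimensions below are arbitrary.

```python
import numpy as np

# Assumed dimensions for the demo only.
rng = np.random.default_rng(3)
n, k = 30, 4
X = rng.standard_normal((n, k))
# Force exact collinearity: column 4 is a linear combination of
# columns 1 and 2, so rank(X) = 3 < k.
X[:, 3] = 2.0 * X[:, 0] - X[:, 1]

print(np.linalg.matrix_rank(X))          # 3
# rank(X'X) = rank(X) < k, so X'X is singular and has no inverse:
print(np.linalg.matrix_rank(X.T @ X))    # 3
```

This uses the rank property referenced in the hint, \(\operatorname{rank}\left(X^{\prime} X\right)=\operatorname{rank}(X)\).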
Deliverable: Word Document 