Question: Weighted least squares. In least squares, the objective (to be minimized) is
\[\|A x-b\|^{2}=\sum_{i=1}^{m}\left(\tilde{a}_{i}^{T} x-b_{i}\right)^{2}\]where \(\tilde{a}_{i}^{T}\) are the rows of \(A\), and the \(n\)-vector \(x\) is to be chosen. In the weighted least squares problem, we minimize the objective
\[\sum_{i=1}^{m} w_{i}\left(\tilde{a}_{i}^{T} x-b_{i}\right)^{2}\]where \(w_{i}\) are given positive weights. The weights allow us to assign different levels of importance to the different components of the residual vector.
- Show that the weighted least squares objective can be expressed as \(\|D(A x-b)\|^{2}\) for an appropriate diagonal matrix \(D\). This allows us to solve the weighted least squares problem as a standard least squares problem, by minimizing \(\|B x-d\|^{2}\), where \(B=D A\) and \(d=D b\).
- Show that when \(A\) has linearly independent columns, so does the matrix \(B\).
- The least squares approximate solution is given by \(\hat{x}=\left(A^{T} A\right)^{-1} A^{T} b\). Give a similar formula for the solution of the weighted least squares problem. You might want to use the matrix \(W=\operatorname{diag}(w)\) in your formula.
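The parts above can be checked numerically. The sketch below (using NumPy, with made-up data; the variable names and the closed-form expression \(\hat{x}=\left(A^{T} W A\right)^{-1} A^{T} W b\) are my own working assumptions, not taken from the posted solution) compares solving the scaled problem \(\|D(Ax-b)\|^{2}\) with \(D=\operatorname{diag}(\sqrt{w_1},\ldots,\sqrt{w_m})\) against that closed-form formula:

```python
import numpy as np

# Illustrative data (not from the problem statement): m = 6, n = 3.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)
w = np.array([1.0, 2.0, 0.5, 4.0, 1.5, 3.0])  # given positive weights

# Approach 1: standard least squares on B = D A, d = D b,
# where D = diag(sqrt(w)) so that ||D(Ax - b)||^2 equals the weighted objective.
D = np.diag(np.sqrt(w))
x1, *_ = np.linalg.lstsq(D @ A, D @ b, rcond=None)

# Approach 2: the candidate closed form (A^T W A)^{-1} A^T W b, W = diag(w).
W = np.diag(w)
x2 = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)

print(np.allclose(x1, x2))
```

When `A` has linearly independent columns, both computations are well defined and agree, since \(B^{T} B = A^{T} D^{T} D A = A^{T} W A\).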