Multi-class classifier via matrix least squares


Question: Consider the least squares multi-class classifier described in section 14.3 of the textbook, with the regression model \(\tilde{f}_{k}(x)=x^{T} \beta_{k}\) for the one-versus-others classifiers. (We assume that the offset term is included using a constant feature.) Show that the coefficient vectors \(\beta_{1}, \ldots, \beta_{K}\) can be found by solving the matrix least squares problem of minimizing \(\left\|X^{T} \beta-Y\right\|^{2}\), where \(\beta\) is the \(n \times K\) matrix with columns \(\beta_{1}, \ldots, \beta_{K}\), and \(Y\) is an \(N \times K\) matrix.
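One way to see why the matrix formulation works (a sketch, using the notation above): writing \(y_k\) for the \(k\)th column of \(Y\), the squared norm separates column by column,

\[
\left\|X^{T}\beta - Y\right\|^{2} \;=\; \sum_{k=1}^{K} \left\|X^{T}\beta_{k} - y_{k}\right\|^{2},
\]

so minimizing the matrix objective over \(\beta\) is the same as minimizing each one-versus-others least squares objective over \(\beta_k\) separately.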

  1. Give \(Y\), i.e., describe its entries. What is the \(i\)th row of \(Y\)?
  2. Assuming the rows of \(X\) (i.e., the data feature vectors) are linearly independent, show that the least squares estimate is given by \(\hat{\beta}=\left(X^{T}\right)^{\dagger} Y\).
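The construction in part 1 and the pseudoinverse formula in part 2 can be illustrated numerically. The following is a minimal sketch in NumPy, not the textbook's official solution; the data sizes, the random features, and the label vector are all hypothetical, and the \(\pm 1\) entries of \(Y\) encode the one-versus-others targets.

```python
import numpy as np

# Hypothetical small example: N = 6 data points, n = 3 features
# (including a constant feature for the offset), K = 3 classes.
rng = np.random.default_rng(0)
N, n, K = 6, 3, 3
X = rng.standard_normal((n, N))   # columns of X are the feature vectors x_i
X[0, :] = 1.0                     # constant feature carries the offset term
labels = np.array([0, 1, 2, 0, 1, 2])  # hypothetical class labels

# Row i of Y has entry +1 in the column of the true class of x_i
# and -1 elsewhere (the one-versus-others encoding).
Y = -np.ones((N, K))
Y[np.arange(N), labels] = 1.0

# Matrix least squares estimate: beta_hat = (X^T)^dagger Y.
beta_hat = np.linalg.pinv(X.T) @ Y

# Each column of beta_hat solves its own one-versus-others problem.
for k in range(K):
    b_k = np.linalg.lstsq(X.T, Y[:, k], rcond=None)[0]
    assert np.allclose(b_k, beta_hat[:, k])
```

The per-column check at the end is the numerical counterpart of the decomposition of the matrix objective into \(K\) separate least squares problems: solving them independently gives exactly the columns of \(\hat{\beta}\).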
