Question: A statistic S is said to be an unbiased estimator of the unknown parameter \(\theta\) if \(E(S)=\theta\); otherwise S is said to be biased, and its \(\text{bias}=E(S)-\theta\). Suppose that two independent measurements, m(1) and m(2), are made of the length of a side of a square piece of sheet metal of unknown length L. Each measurement is subject to a measurement error \(e(i)\) with mean 0 and standard deviation \(\sigma\); that is, \(m(i)=L+e(i)\). Two estimators are suggested for estimating the unknown area \(L^{2}\): \(S(1)=\left( \frac{m(1)+m(2)}{2} \right)^{2}\) and \(S(2)=\frac{m(1)^{2}+m(2)^{2}}{2}\). That is, S(1) averages the two measurements and then squares the average, whereas S(2) squares the two measurements and then averages the squares. Which one of the following statements is true?
Both estimators S(1) and S(2) are unbiased.
Both estimators S(1) and S(2) are biased, but S(1) has less bias than S(2).
Both estimators S(1) and S(2) are biased, but S(2) has less bias than S(1).
Both estimators S(1) and S(2) are biased and have equal bias.
Deliverable: Word Document 
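As a quick empirical check of the two estimators, one could approximate \(E(S(1))\) and \(E(S(2))\) by Monte Carlo simulation and compare them with the true area \(L^2\). The sketch below is illustrative only: it assumes hypothetical values L = 2 and \(\sigma\) = 0.1 and draws the errors from a normal distribution, which the problem does not specify (it only requires mean 0 and standard deviation \(\sigma\)).

```python
import numpy as np

# Monte Carlo check of E(S(1)) and E(S(2)) against the true area L^2.
# L and sigma below are hypothetical values chosen only for illustration.
rng = np.random.default_rng(0)
L, sigma = 2.0, 0.1            # hypothetical true side length and error SD
n_trials = 1_000_000

# Two independent measurements m(1), m(2): true length plus zero-mean error.
m1 = L + rng.normal(0.0, sigma, n_trials)
m2 = L + rng.normal(0.0, sigma, n_trials)

s1 = ((m1 + m2) / 2) ** 2      # S(1): average first, then square
s2 = (m1**2 + m2**2) / 2       # S(2): square first, then average

print("true area L^2        :", L**2)
print("mean of S(1) and bias:", s1.mean(), s1.mean() - L**2)
print("mean of S(2) and bias:", s2.mean(), s2.mean() - L**2)
```

Comparing each simulated mean with \(L^2\) gives an estimate of each estimator's bias under these assumed values; the written Word-document answer should still justify the choice analytically using \(E(S)-\theta\).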