Problem 1: This question considers the Bayesian estimators for the population mean and variance.
- Let \(Y_i\) be iid \(N(\mu, \sigma^2)\) for \(i = 1, \ldots, n\). Show that the ML estimators of \(\mu\) and \(\sigma^2\) are, respectively,
\[\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} y_i = \bar{y}\]
and
\[\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} (y_i - \bar{y})^2 = s^2.\]
- We know that \(\hat{\sigma}^2\) always tends to underestimate \(\sigma^2\), particularly when the sample size \(n\) is small. Now we run a simulation to assess the Bayesian estimation of \(\sigma^2\).
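As a quick check of this underestimation claim (a standard bias calculation, not part of the question itself): since \(\sum_{i=1}^{n}(y_i - \bar{y})^2 / \sigma^2 \sim \chi^2_{n-1}\) has expectation \(n - 1\),

\[E\left[\hat{\sigma}^2\right] = \frac{1}{n}\,E\left[\sum_{i=1}^{n}(y_i - \bar{y})^2\right] = \frac{n-1}{n}\,\sigma^2 = \left(1 - \frac{1}{n}\right)\sigma^2 < \sigma^2,\]

so the downward bias \(-\sigma^2/n\) vanishes as \(n\) grows but is pronounced for small \(n\).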
- Generate \(n = 10\) observations from a \(N(10, 2^2)\) distribution. Display your MATLAB code and the data.
- From this simulated data set, what are the ML estimates of \(\mu\) and \(\sigma^2\)?
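A minimal sketch of this simulation step. The assignment asks for MATLAB; the Python below mirrors the same steps (the seed value is an arbitrary choice, not part of the assignment):

```python
import numpy as np

# Simulate n = 10 draws from N(mu = 10, sigma = 2); seed chosen arbitrarily
rng = np.random.default_rng(1)
n, mu, sigma = 10, 10.0, 2.0
y = rng.normal(mu, sigma, size=n)

# ML estimates: the sample mean, and the variance with divisor n (not n - 1)
mu_hat = y.mean()
sigma2_hat = np.sum((y - y.mean()) ** 2) / n   # same as np.var(y, ddof=0)

print("data:", y)
print("mu_hat =", mu_hat, " sigma2_hat =", sigma2_hat)
```

The equivalent MATLAB would use `normrnd(10, 2, n, 1)` and `var(y, 1)` (the second argument `1` selects the divisor-\(n\) variance).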
Problem 2:
We have learned how to fit a simple logistic regression model by the ML method. This question considers fitting a logistic regression by the Bayesian method.
Suppose binary-valued independent responses \(y_i\), \(i = 1, \ldots, n\), follow the logistic regression model
\[\Pr(y_i = 1) = \frac{\exp(\beta_0 + \beta_1 x_i)}{1 + \exp(\beta_0 + \beta_1 x_i)},\]
and we want to estimate \(\beta_0\) and \(\beta_1\) from observations on \(y\) and \(x\), and in particular the posterior mean estimators of \(\beta_0\) and \(\beta_1\). The prior distributions for \(\beta_0\) and \(\beta_1\) are assumed to be
\[\beta_0 \sim \mathrm{unif}\left(\hat{\beta}_0 - 2,\ \hat{\beta}_0 + 2\right)\]
\[\beta_1 \sim \mathrm{unif}\left(\hat{\beta}_1 - 0.2,\ \hat{\beta}_1 + 0.2\right)\]
where \(\hat{\beta}_0\) and \(\hat{\beta}_1\) are the ML estimates of \(\beta_0\) and \(\beta_1\), respectively. Data on the \(y\) and \(x\) variables are given in the file "radio.dat", which can be obtained from the unit web page.
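One way the posterior means under these uniform priors can be approximated is with a grid approximation: since the priors are flat on a box, the posterior is proportional to the likelihood restricted to that box. The sketch below illustrates this in Python. The contents of "radio.dat" are not reproduced here, so a small synthetic data set and assumed ML estimates (`b0_hat`, `b1_hat`) are used purely for illustration; with the real data, those would come from a logistic fit:

```python
import numpy as np

def posterior_means(y, x, b0_hat, b1_hat, m=201):
    """Grid approximation to the posterior means of (beta0, beta1) under
    unif(b0_hat - 2, b0_hat + 2) and unif(b1_hat - 0.2, b1_hat + 0.2) priors."""
    b0 = np.linspace(b0_hat - 2.0, b0_hat + 2.0, m)
    b1 = np.linspace(b1_hat - 0.2, b1_hat + 0.2, m)
    B0, B1 = np.meshgrid(b0, b1, indexing="ij")
    # Log-likelihood of simple logistic regression at every grid point
    eta = B0[..., None] + B1[..., None] * x          # shape (m, m, n)
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)), axis=-1)
    # Flat priors on the box => posterior weights proportional to likelihood
    post = np.exp(loglik - loglik.max())
    post /= post.sum()
    return np.sum(B0 * post), np.sum(B1 * post)

# Synthetic illustration only -- radio.dat itself is not reproduced here
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=40)
p = 1 / (1 + np.exp(-(-2.0 + 0.5 * x)))
y = (rng.uniform(size=40) < p).astype(float)

# Assumed ML estimates for the synthetic data (hypothetical values)
b0_post, b1_post = posterior_means(y, x, b0_hat=-2.0, b1_hat=0.5)
print("posterior means:", b0_post, b1_post)
```

By construction the posterior means always land inside the prior boxes \([\hat{\beta}_0 - 2, \hat{\beta}_0 + 2]\) and \([\hat{\beta}_1 - 0.2, \hat{\beta}_1 + 0.2]\); a Monte Carlo sampler over the same priors would be an equally valid alternative.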
Deliverable: Word Document