Maximum Likelihood Estimators
Properties
An estimator is a function of the sample measurements that computes an estimate targeting a parameter.
Let \(X_1,\ldots,X_n\overset{iid}{\sim}F(\boldsymbol \theta)\), where \(F(\cdot)\) is a known distribution function and \(\boldsymbol\theta\) is a vector of parameters. Let \(\boldsymbol X = (X_1,\ldots, X_n)^\mathrm{T}\) be the collected sample.
Let \(X_1,\ldots,X_n\) be a random sample from a distribution with parameter \(\theta\), and let \(\hat \theta\) be an estimator of \(\theta\). Then \(\hat \theta\) is an unbiased estimator if \(E(\hat \theta) = \theta\); otherwise, \(\hat\theta\) is biased.
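For example, for any random sample with finite mean \(\mu\), the sample mean \(\bar X\) is unbiased for \(\mu\), since by linearity of expectation
\[ E(\bar X)=\frac{1}{n}\sum_{i=1}^n E(X_i)=\frac{1}{n}\,n\mu=\mu. \]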
Let \(X_1,\ldots,X_n\) be a random sample from a distribution with parameter \(\theta\). The estimator \(\hat \theta\) is a consistent estimator of \(\theta\) if \(\hat\theta\) converges in probability to \(\theta\) as \(n\rightarrow\infty\); that is, for every \(\varepsilon>0\),
\[ \lim_{n\rightarrow\infty}P\left(|\hat\theta-\theta|>\varepsilon\right)=0. \]
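For instance, \(\bar X\) is consistent for the mean \(\mu\) whenever the variance \(\sigma^2\) is finite: by Chebyshev's inequality, for any \(\varepsilon>0\),
\[ P\left(|\bar X-\mu|>\varepsilon\right)\le\frac{\mathrm{Var}(\bar X)}{\varepsilon^2}=\frac{\sigma^2}{n\varepsilon^2}\rightarrow 0 \quad\text{as } n\rightarrow\infty. \]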
If \(\hat \theta\) is an ML estimator of \(\theta\), then for any one-to-one function \(g\), the ML estimator for \(g(\theta)\) is \(g(\hat\theta)\).
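For example, if \(\hat\lambda\) is the ML estimator of a Poisson rate \(\lambda\), then, since \(g(\lambda)=e^{-\lambda}\) is one-to-one on \(\lambda>0\), the ML estimator of \(P(X=0)=e^{-\lambda}\) is \(e^{-\hat\lambda}\).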
Let \(X_1,\ldots,X_n\) be a random sample from a distribution with parameter \(\theta\), and let \(\hat \theta\) be the maximum likelihood estimator (MLE) of \(\theta\). As \(n\rightarrow\infty\), \(\hat \theta\) is approximately normally distributed with mean \(\theta\) and variance \(1/\{n\,I(\theta)\}\), where
\[ I(\theta)=E\left[-\frac{\partial^2}{\partial\theta^2}\log\{f(X;\theta)\}\right] \]
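As a quick worked illustration, consider \(X_i\overset{iid}{\sim}\mathrm{Poisson}(\lambda)\), for which \(\log\{f(x;\lambda)\}=-\lambda+x\log\lambda-\log(x!)\). Then
\[ -\frac{\partial^2}{\partial\lambda^2}\log\{f(X;\lambda)\}=\frac{X}{\lambda^2}, \qquad I(\lambda)=E\left(\frac{X}{\lambda^2}\right)=\frac{1}{\lambda}, \]
so the MLE \(\hat\lambda=\bar X\) is approximately \(N(\lambda,\lambda/n)\) for large \(n\).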
Let \(X_1,\ldots,X_n\overset{iid}{\sim}\mathrm{Bin}(m, p)\). Find the MLE of \(p\) and check whether it is an unbiased estimator.
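A sketch of the solution, taking \(m\) as known: the log-likelihood is
\[ \ell(p)=\sum_{i=1}^n\left\{\log\binom{m}{x_i}+x_i\log p+(m-x_i)\log(1-p)\right\}, \]
and solving \(\ell'(p)=\frac{\sum x_i}{p}-\frac{nm-\sum x_i}{1-p}=0\) gives \(\hat p=\bar X/m\). Since \(E(X_i)=mp\), \(E(\hat p)=E(\bar X)/m=p\), so \(\hat p\) is unbiased.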
Check whether \(\bar X\) is an unbiased estimator of \(\lambda\) for the Poisson distribution.
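Sketch: under the Poisson model \(E(X_i)=\lambda\), so \(E(\bar X)=\frac{1}{n}\sum_{i=1}^n E(X_i)=\lambda\), and \(\bar X\) is unbiased for \(\lambda\).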
Using the MLEs for a normal distribution, \(\hat\mu = \bar X = \frac{1}{n}\sum X_i\) and \(\hat\sigma^2 = \frac{1}{n}\sum (X_i-\bar X)^2\), check whether they are unbiased estimators of \(\mu\) and \(\sigma^2\).
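A sketch of the key computation: \(E(\hat\mu)=E(\bar X)=\mu\), so \(\hat\mu\) is unbiased. For \(\hat\sigma^2\), write \(\sum(X_i-\bar X)^2=\sum X_i^2-n\bar X^2\); using \(E(X_i^2)=\sigma^2+\mu^2\) and \(E(\bar X^2)=\sigma^2/n+\mu^2\),
\[ E\left\{\sum_{i=1}^n(X_i-\bar X)^2\right\}=n(\sigma^2+\mu^2)-n\left(\frac{\sigma^2}{n}+\mu^2\right)=(n-1)\sigma^2, \]
so \(E(\hat\sigma^2)=\frac{n-1}{n}\sigma^2\neq\sigma^2\). The MLE of \(\sigma^2\) is therefore biased, although the bias vanishes as \(n\rightarrow\infty\).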
Let \(X_1,\ldots,X_n\overset{iid}{\sim}\mathrm{Bernoulli}(p)\). Show that the MLE of \(p\) is \(\bar X\), and find the distribution of the MLE as \(n\rightarrow\infty\).
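Sketch: the log-likelihood is \(\ell(p)=\sum x_i\log p+\left(n-\sum x_i\right)\log(1-p)\), and solving \(\ell'(p)=0\) gives \(\hat p=\bar X\). Since \(I(p)=1/\{p(1-p)\}\), the asymptotic-normality property above gives \(\hat p\approx N\left(p,\,p(1-p)/n\right)\) for large \(n\).

This limit can also be checked by simulation; below is a minimal sketch assuming NumPy is available (the values of \(p\), \(n\), the number of replications, and the seed are arbitrary illustrative choices).

```python
import numpy as np

# Simulate many Bernoulli samples and compare the empirical distribution of
# the MLE p_hat = X_bar with the asymptotic N(p, p(1-p)/n) approximation.
rng = np.random.default_rng(42)  # arbitrary seed
p, n, reps = 0.3, 500, 10_000    # arbitrary illustrative values

samples = rng.binomial(1, p, size=(reps, n))  # each row is one sample of size n
p_hat = samples.mean(axis=1)                  # MLE computed for each sample

print("mean of p_hat:      ", p_hat.mean())   # should be close to p
print("variance of p_hat:  ", p_hat.var())    # should be close to p(1-p)/n
print("asymptotic variance:", p * (1 - p) / n)
```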