Maximum Likelihood Estimators

Learning Outcomes

  • Maximum Likelihood Estimators

  • Properties

Background Information

Estimators

An estimator is a rule, computed from the measurements in a sample, that produces an estimate targeting the parameter of interest.

Data

Let \(X_1,\ldots,X_n\overset{iid}{\sim}F(\boldsymbol \theta)\), where \(F(\cdot)\) is a known distribution function and \(\boldsymbol\theta\) is a vector of parameters. Let \(\boldsymbol X = (X_1,\ldots, X_n)^\mathrm{T}\) be the sample collected.

MLE Properties

Unbiased Estimators

Let \(X_1,\ldots,X_n\) be a random sample from a distribution with parameter \(\theta\). Let \(\hat \theta\) be an estimator for a parameter \(\theta\). Then \(\hat \theta\) is an unbiased estimator if \(E(\hat \theta) = \theta\). Otherwise, \(\hat\theta\) is considered biased.
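A minimal Monte Carlo sketch of the definition: if \(\bar X\) is unbiased for the mean \(\mu\), then the average of many independent realisations of \(\bar X\) should sit close to \(\mu\). The parameter values and simulation sizes below are illustrative choices, not from the notes.

```python
import numpy as np

# Monte Carlo check of unbiasedness of the sample mean for a Normal mean.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 3.0, 2.0, 50, 20000

# Each row is one sample of size n; each row mean is one realisation of X-bar.
estimates = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)

# If E(X-bar) = mu, the average over reps should differ from mu only by
# Monte Carlo error.
print(abs(estimates.mean() - mu))
```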

Consistent Estimators

Let \(X_1,\ldots,X_n\) be a random sample from a distribution with parameter \(\theta\). The estimator \(\hat \theta\) is a consistent estimator of \(\theta\) if

  1. \(E\{(\hat\theta-\theta)^2\}\rightarrow0\) as \(n\rightarrow \infty\), or
  2. \(P(|\hat\theta-\theta|\ge \epsilon)\rightarrow0\) as \(n\rightarrow \infty\) for every \(\epsilon>0\).

Condition 1 (convergence in mean square) implies condition 2 (convergence in probability) by Chebyshev's inequality, so condition 1 is a convenient sufficient condition to check.
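Condition 1 can be illustrated numerically: for \(\bar X\) estimating the mean of an Exponential distribution, the simulated mean-squared error shrinks roughly like \(1/n\). Parameter values and sample sizes below are illustrative.

```python
import numpy as np

# Sketch of mean-square consistency: MSE of X-bar for an Exponential mean
# (scale theta = 1) shrinks as n grows.
rng = np.random.default_rng(1)
theta, reps = 1.0, 10000

mses = []
for n in (10, 100, 1000):
    est = rng.exponential(theta, size=(reps, n)).mean(axis=1)
    mses.append(np.mean((est - theta) ** 2))
print(mses)  # roughly 0.1, 0.01, 0.001 -- heading toward 0
```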

Invariance Property

If \(\hat \theta\) is an ML estimator of \(\theta\), then for any one-to-one function \(g\), the ML estimator for \(g(\theta)\) is \(g(\hat\theta)\).
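A numerical sketch of invariance: for Bernoulli data the MLE of \(p\) is \(\bar X\), so the MLE of the one-to-one log-odds transform \(g(p)=\log\{p/(1-p)\}\) should be \(g(\bar X)\). The check below maximises the likelihood directly in the transformed parameter (the data-generating values are illustrative).

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Invariance check: maximise the Bernoulli log-likelihood over the
# log-odds eta and compare with g(p-hat).
rng = np.random.default_rng(2)
x = rng.binomial(1, 0.3, size=500)
p_hat = x.mean()

def neg_loglik(eta):
    p = 1 / (1 + np.exp(-eta))  # invert the log-odds transform
    return -(x.sum() * np.log(p) + (len(x) - x.sum()) * np.log(1 - p))

eta_hat = minimize_scalar(neg_loglik, bounds=(-5, 5), method="bounded").x
print(eta_hat, np.log(p_hat / (1 - p_hat)))  # the two maximisers agree
```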

Large Sample Theory

Let \(X_1,\ldots,X_n\) be a random sample from a distribution with parameter \(\theta\). Let \(\hat \theta\) be the MLE of \(\theta\). As \(n\rightarrow\infty\), \(\hat \theta\) is approximately normal with mean \(\theta\) and variance \(1/\{nI(\theta)\}\), where

\[ I(\theta)=E\left[-\frac{\partial^2}{\partial\theta^2}\log\{f(X;\theta)\}\right] \]
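As a sketch of this result for the Poisson mean, \(I(\lambda)=1/\lambda\), so \(\hat\lambda=\bar X\) should have variance close to \(1/\{nI(\lambda)\}=\lambda/n\). The parameter values below are illustrative.

```python
import numpy as np

# Large-sample variance check for the Poisson MLE X-bar:
# simulated variance of X-bar should be close to lambda / n.
rng = np.random.default_rng(3)
lam, n, reps = 4.0, 200, 20000
est = rng.poisson(lam, size=(reps, n)).mean(axis=1)
print(est.var(), lam / n)  # both near 0.02
```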

Examples

Binomial Distribution

Let \(X_1,\ldots,X_n\overset{iid}{\sim}\mathrm{Bin}(m, p)\). Find the MLE of \(p\) and check whether it is an unbiased estimator.
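A sketch of the answer: differentiating the log-likelihood gives \(\hat p = \bar X/m\), and \(E(\hat p)=E(\bar X)/m=mp/m=p\), so \(\hat p\) is unbiased. The check below maximises the log-likelihood numerically and compares with the closed form (data-generating values are illustrative).

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Numerical MLE for Binomial(m, p) data, compared with p-hat = X-bar / m.
rng = np.random.default_rng(4)
m, p, n = 10, 0.4, 300
x = rng.binomial(m, p, size=n)

def neg_loglik(q):
    # Constant binomial-coefficient terms are dropped: they do not affect
    # the maximiser.
    return -(x.sum() * np.log(q) + (n * m - x.sum()) * np.log(1 - q))

p_numeric = minimize_scalar(neg_loglik, bounds=(1e-6, 1 - 1e-6),
                            method="bounded").x
print(p_numeric, x.mean() / m)  # numeric maximiser matches the closed form
```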

Poisson Distribution

Check if \(\bar X\) is an unbiased estimator for \(\lambda\) in the Poisson distribution.
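A one-line check using linearity of expectation: since \(E(X_i)=\lambda\) for each \(i\),

\[ E(\bar X)=\frac{1}{n}\sum_{i=1}^n E(X_i)=\frac{1}{n}\cdot n\lambda=\lambda, \]

so \(\bar X\) is unbiased for \(\lambda\).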

Normal Distribution

Using the MLE’s for a normal distribution, \(\hat\mu = \bar X = \frac{1}{n}\sum X_i\) and \(\hat\sigma^2 = \frac{1}{n}\sum (X_i-\bar X)^2\) check if there are unbiased estimators for \(\mu\) and \(\sigma^2\).
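A sketch of the expected answer: \(\hat\mu=\bar X\) is unbiased, but \(E(\hat\sigma^2)=\frac{n-1}{n}\sigma^2\), so the MLE of the variance is biased downward. The Monte Carlo check below illustrates this (parameter values are illustrative).

```python
import numpy as np

# Bias check for the Normal variance MLE, which divides by n (ddof=0).
rng = np.random.default_rng(5)
mu, sigma2, n, reps = 0.0, 4.0, 20, 40000
samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))

sigma2_hat = samples.var(axis=1)  # ddof=0: the MLE, dividing by n
print(sigma2_hat.mean(), (n - 1) / n * sigma2)  # both near 3.8, not 4.0
```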

Bernoulli Distribution

Let \(X_1,\ldots,X_n\overset{iid}{\sim}\mathrm{Bernoulli}(p)\). Show that the MLE of \(p\) is \(\bar X\), and find the distribution of the MLE as \(n\rightarrow\infty\).
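A sketch of the large-sample part: for the Bernoulli, \(I(p)=1/\{p(1-p)\}\), so by the result above \(\bar X\) is approximately \(N\!\left(p,\, p(1-p)/n\right)\). A Monte Carlo check (illustrative parameter values):

```python
import numpy as np

# Asymptotic normality check for the Bernoulli MLE X-bar:
# centre should be p and variance should be close to p(1-p)/n.
rng = np.random.default_rng(6)
p, n, reps = 0.25, 500, 20000
p_hat = rng.binomial(1, p, size=(reps, n)).mean(axis=1)

print(p_hat.mean(), p)               # centred at p
print(p_hat.var(), p * (1 - p) / n)  # variance near p(1-p)/n
```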