Maximum Likelihood Estimators
Log-Likelihood Functions
An estimator is a rule, computed from the measurements in a sample, that produces an estimate targeting the parameter of interest.
Let \(X_1,\ldots,X_n\overset{iid}{\sim}F(\boldsymbol \theta)\), where \(F(\cdot)\) is a known distribution function and \(\boldsymbol\theta\) is a vector of parameters. Let \(\boldsymbol X = (X_1,\ldots, X_n)^\mathrm{T}\) be the collected sample.
Using the joint pdf or pmf of the sample \(\boldsymbol X\), the likelihood function is defined, as a function of \(\boldsymbol \theta\) given the observed data \(\boldsymbol X =\boldsymbol x\), by
\[ L(\boldsymbol \theta|\boldsymbol x)=f(\boldsymbol x|\boldsymbol \theta) \]
If the data are iid, then
\[ f(\boldsymbol x|\boldsymbol \theta) = \prod^n_{i=1}f(x_i|\boldsymbol\theta) \]
Since the logarithm is strictly increasing, maximizing \(\ell(\boldsymbol\theta) = \ln\{L(\boldsymbol \theta)\}\) yields the same maximizer as maximizing \(L(\boldsymbol\theta)\) itself.
The maximum likelihood estimator (MLE) is the value of \(\boldsymbol \theta\) that maximizes \(\ell(\boldsymbol\theta)\).
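For iid data, taking the logarithm turns the likelihood product into a sum, which is easier to differentiate:
\[ \ell(\boldsymbol\theta) = \sum_{i=1}^{n}\ln f(x_i|\boldsymbol\theta) \]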
Let \(X_1,\ldots,X_n\overset{iid}{\sim}\mathrm{Pois}(\lambda)\). Show that the MLE of \(\lambda\) is \(\bar x\).
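A worked sketch for this exercise: with pmf \(f(x|\lambda) = e^{-\lambda}\lambda^{x}/x!\), the log-likelihood is
\[ \ell(\lambda) = \sum_{i=1}^{n}\left(-\lambda + x_i\ln\lambda - \ln x_i!\right) = -n\lambda + \ln\lambda\sum_{i=1}^{n}x_i - \sum_{i=1}^{n}\ln x_i! \]
Setting the derivative to zero,
\[ \frac{d\ell}{d\lambda} = -n + \frac{1}{\lambda}\sum_{i=1}^{n}x_i = 0 \quad\Longrightarrow\quad \hat\lambda = \bar x, \]
and \(\frac{d^2\ell}{d\lambda^2} = -\frac{1}{\lambda^2}\sum_{i=1}^{n}x_i < 0\) confirms this is a maximum.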
Let \(X_1,\ldots,X_n\overset{iid}{\sim}N(\mu,\sigma^2)\). Show that the MLEs of \(\mu\) and \(\sigma^2\) are \(\bar x\) and \(\frac{n-1}{n}s^2\), respectively.
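A sketch of the argument: the log-likelihood is
\[ \ell(\mu,\sigma^2) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2. \]
Setting \(\frac{\partial\ell}{\partial\mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu) = 0\) gives \(\hat\mu = \bar x\). Substituting \(\hat\mu\) and setting
\[ \frac{\partial\ell}{\partial\sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^{n}(x_i-\bar x)^2 = 0 \]
gives \(\hat\sigma^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i-\bar x)^2 = \frac{n-1}{n}s^2\).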
Let \(X_1,\ldots,X_n\overset{iid}{\sim}\mathrm{Exp}(\lambda)\). Find the MLE of \(\lambda\).
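A sketch, assuming the rate parameterization \(f(x|\lambda) = \lambda e^{-\lambda x}\), \(x>0\): the log-likelihood is
\[ \ell(\lambda) = n\ln\lambda - \lambda\sum_{i=1}^{n}x_i, \]
so \(\frac{d\ell}{d\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n}x_i = 0\) gives \(\hat\lambda = 1/\bar x\). (Under the mean parameterization \(f(x|\lambda) = \frac{1}{\lambda}e^{-x/\lambda}\), the same steps instead give \(\hat\lambda = \bar x\).)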