# 13.3 Variance and interval estimation

In general, it is not easy to determine the variance of complicated estimators such as the mle. However, it is possible to approximate the variance. The key is a theorem that can be found in most mathematical statistics books. The particular version stated here and its multiparameter generalization are taken from [95] and stated without proof. Recall that *L*(θ) is the likelihood function and *l*(θ) its logarithm. All of the results assume that the population has a distribution that is a member of the chosen parametric family.

**Theorem 13.5** *Assume that the pdf* (*pf in the discrete case*) *f* (*x*; θ) *satisfies the following for θ in an interval containing the true value* (*replace integrals by sums for discrete variables*):

**(i)** $\ln f(x;\theta)$ *is three times differentiable with respect to* $\theta$.

**(ii)** $\displaystyle\int \frac{\partial}{\partial\theta} f(x;\theta)\,dx = 0$. *This formula implies that the derivative may be taken outside the integral, and so we are just differentiating the constant* 1.

**(iii)** $\displaystyle\int \frac{\partial^{2}}{\partial\theta^{2}} f(x;\theta)\,dx = 0$. *This formula is the same concept for the second derivative.*

**(iv)** $\displaystyle -\infty < \int f(x;\theta)\,\frac{\partial^{2}}{\partial\theta^{2}} \ln f(x;\theta)\,dx < 0$. *This inequality establishes that the indicated integral exists and that the location where the derivative is zero is a maximum.*

**(v)** *There exists a function* $H(x)$ *such that* $\displaystyle\int H(x)\,f(x;\theta)\,dx < \infty$ *and* $\left|\dfrac{\partial^{3}}{\partial\theta^{3}} \ln f(x;\theta)\right| < H(x)$.
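Under these regularity conditions, the variance of the mle can be approximated by the reciprocal of the observed information, the negated second derivative of *l*(θ) evaluated at the mle. The following is a minimal numerical sketch of that approximation; the exponential population, the seed, and the sample size are illustrative assumptions, not from the text.

```python
import numpy as np

# Illustrative assumptions (not from the text): an exponential population
# with mean theta_true, a fixed seed, and n = 1000 observations.
rng = np.random.default_rng(0)
theta_true = 2.0
x = rng.exponential(theta_true, size=1000)
n = len(x)

# The mle of the exponential mean is the sample mean.
theta_hat = x.mean()

# Loglikelihood: l(theta) = -n ln(theta) - sum(x)/theta.
def loglik(t):
    return -n * np.log(t) - x.sum() / t

# Observed information is -l''(theta_hat); analytically, at the mle this
# simplifies to n / theta_hat^2 (because sum(x) = n * theta_hat).
info = n / theta_hat**2
var_approx = 1.0 / info  # approximate variance of the mle

# When l'' is unavailable in closed form, a central finite difference
# of the loglikelihood yields the same approximation numerically.
h = 1e-4
l2 = (loglik(theta_hat + h) - 2 * loglik(theta_hat) + loglik(theta_hat - h)) / h**2
var_fd = -1.0 / l2

print(theta_hat, var_approx, var_fd)
```

For this population the exact variance of the sample mean is θ²/*n*, so the approximation can be checked directly; in general the same recipe applies to any mle satisfying the theorem's conditions.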
