In this section, we describe several important properties of estimators. Since an estimator is a function of the samples (which are random variables), it is also a random variable (or a random vector in the multidimensional case) and thus has a nontrivial distribution. Note that this distribution generally depends on the number of samples: as more samples are included in the estimator, features of its distribution (such as the variance) change. Typically, it is desirable that the distribution of the estimator become more concentrated about the parameter θ as N increases.

Example 9.28. Let {Xn} be iid Gaussian random variables with unknown mean μ and known σ. Recall that the sample mean X̄ = (1/N) Σₙ₌₁ᴺ Xn is a minimal sufficient statistic for μ. It is straightforward to show that X̄ is also Gaussian with mean μ and variance σ²/N. Examples of the pdf are shown in Figure 9.8, which illustrate how the estimator becomes more accurate as N is increased (the width of the pdf decreases).

FIGURE 9.8 Pdf of the sample mean in Example 9.28 as the number of samples N increases.
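The concentration described in Example 9.28 can be checked numerically: over many independent trials, the empirical standard deviation of the sample mean should shrink like σ/√N. The sketch below uses Python's standard library (the function name `sample_mean_std` and the parameter values are illustrative, not from the text):

```python
import math
import random

def sample_mean_std(N, mu=0.0, sigma=1.0, trials=20000, seed=42):
    """Empirical standard deviation of the sample mean of N iid
    Gaussian samples, estimated over many independent trials."""
    rng = random.Random(seed)
    # One sample mean per trial: average of N iid Gaussian draws.
    means = [sum(rng.gauss(mu, sigma) for _ in range(N)) / N
             for _ in range(trials)]
    m = sum(means) / trials
    # Empirical standard deviation of the collection of sample means.
    return math.sqrt(sum((x - m) ** 2 for x in means) / trials)

for N in (1, 4, 16):
    print(f"N={N:2d}  empirical={sample_mean_std(N):.4f}  "
          f"theory={1.0 / math.sqrt(N):.4f}")
```

For σ = 1, the empirical values should track the theoretical standard deviation 1/√N, mirroring the narrowing pdfs in Figure 9.8.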
