October 2014
576 pages
The method of moments is a very simple idea. Suppose that we observe a sample $X_1, \dots, X_n$ from a distribution $F_\theta$, where $\theta$ is the parameter vector, which may be $d$-dimensional. The idea of this method is to match the empirical moments estimated from the data with the theoretical moments calculated from the distribution $F_\theta$.
Obviously, the theoretical moments need to exist, and one needs at least $d$ moments to obtain $d$ equations and thus estimate all the components of the parameter vector $\theta$. An example of a distribution for which this method does not work is the Cauchy distribution, since its theoretical moments are not defined (even the mean does not exist).