18 Data Analysis based on Entropies and Measures of Divergence
In this chapter, we discuss entropies and measures of divergence, which are extensively used in data analysis and statistical inference. Tests of goodness of fit are reviewed and their asymptotic theory is discussed. Simulation studies are undertaken to compare their performance.
18.1. Introduction
Measures of divergence are powerful statistical tools directly related to statistical inference, including robustness, with diverse applicability (see, for example, Papaioannou 1985; Basu et al. 2011; Ghosh et al. 2013). On one hand, they can be used for estimation purposes, the classical example being the well-known maximum likelihood estimator (MLE), which arises from minimizing the famous Kullback–Leibler measure. On the other hand, measures of divergence are applicable in tests of fit, where they quantify the degree of agreement between the distribution of an observed random sample and a theoretical, hypothesized distribution. The problem of goodness of fit (gof) to any distribution on the real line is frequently treated by partitioning the range of the data into a number of disjoint intervals. In all cases, a test statistic is compared against a known critical value to accept or reject the hypothesis that the sample is from the postulated distribution. Over the years, numerous non-parametric gof methods, including the chi-squared test and various empirical distribution function (edf) tests (D’Agostino and ...
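As a minimal sketch of the partition-based approach described above (not taken from the chapter; the cell counts and hypothesized distribution below are illustrative assumptions), the following computes the Kullback–Leibler divergence between the empirical cell proportions and a hypothesized cell distribution, together with Pearson's chi-squared gof statistic over the same partition:

```python
# Sketch, assuming a fixed partition of the data range into k disjoint cells.
# `observed` holds hypothetical cell counts; `q` is a hypothesized uniform
# cell distribution. Neither comes from the chapter's examples.
import math


def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p || q) between discrete
    distributions defined over the same cells (0 * log 0 taken as 0)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)


def chi_squared_stat(observed, expected):
    """Pearson's X^2 = sum over cells of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))


# Hypothetical sample of n = 100 observations falling into k = 4 cells,
# tested against a uniform hypothesized distribution over the cells.
observed = [30, 20, 25, 25]
n = sum(observed)
q = [0.25, 0.25, 0.25, 0.25]

p = [o / n for o in observed]          # empirical cell proportions
expected = [n * qi for qi in q]        # expected counts under H0

x2 = chi_squared_stat(observed, expected)  # refer to chi-square, k - 1 df
kl = kl_divergence(p, q)
```

Note the link between the two quantities: the likelihood-ratio statistic G = 2 Σ O log(O/E) equals 2n · KL(p ∥ q), and both G and X² are asymptotically chi-squared distributed with k − 1 degrees of freedom under the null hypothesis, which is why divergence measures arise naturally in this setting.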