The Biostatistics of Aging: From Gompertzian Mortality to an Index of Aging-Relatedness by Gilberto Levy, Bruce Levin


APPENDIX E: EXPECTATION–CONDITIONAL MAXIMIZATION (ECM) ALGORITHM

The expectation–maximization (EM) algorithm is an iterative method widely applied in modern statistics for finding maximum likelihood estimates (MLEs) (Dempster et al., 1977; McLachlan and Krishnan, 2008). The EM algorithm is numerically stable, in that it monotonically increases the likelihood function, but it can be extremely slow to converge in certain situations. Even before the article by Dempster et al. (1977), estimation for mixture models had been recognized as a natural application of the EM algorithm (McLachlan and Peel, 2000, pp. 47–48; Redner and Walker, 1984). We interpret our mixture model as an incomplete-data problem and let yi denote the “missing” group indicator (yi = 1 for an observation arising from the Gompertz distribution and yi = 0 for an observation arising from the Weibull distribution), so that the hypothetical complete data for subject i are given by the vector (ai, δi, xi, yi). Taking right-censoring into account and conditioning on age at entry into the study, the complete-data likelihood function follows from the factorization P[(δi, xi, yi) | ai] = P[yi | ai] × (likelihood for δi, xi given yi and ai):

(E.1) [equation image not reproduced in this extract]

where

(E.2) [equation image not reproduced in this extract]

By introducing a function denoted π(ai), involving the five parameters and age at entry into ...
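Since the equation images are not reproduced in this extract, the following is a minimal sketch of the E-step implied by the setup above: given current parameter values, it computes for each subject the posterior probability wi = P[yi = 1 | ai, δi, xi] that the observation arises from the Gompertz component. The parameterizations (rate–shape Gompertz `lam, theta`; shape–scale Weibull `k, s`) and the constant mixing proportion `p` used in place of the book's age-dependent function π(ai) are illustrative assumptions, not the authors' notation. Each component's contribution is conditioned on survival to age at entry ai (left truncation), with the density used for deaths (δi = 1) and the survival function for right-censored observations (δi = 0).

```python
import numpy as np

# Illustrative parameterizations (assumptions, not the book's notation):
# Gompertz hazard  h(t) = lam * exp(theta * t)
# Weibull hazard   h(t) = (k / s) * (t / s)**(k - 1)

def gompertz_logpdf(t, lam, theta):
    # log f(t) = log(lam) + theta*t - (lam/theta) * (exp(theta*t) - 1)
    return np.log(lam) + theta * t - (lam / theta) * (np.exp(theta * t) - 1.0)

def gompertz_logsf(t, lam, theta):
    # log S(t) = -(lam/theta) * (exp(theta*t) - 1)
    return -(lam / theta) * (np.exp(theta * t) - 1.0)

def weibull_logpdf(t, k, s):
    # log f(t) = log(k/s) + (k-1)*log(t/s) - (t/s)**k
    return np.log(k / s) + (k - 1.0) * np.log(t / s) - (t / s) ** k

def weibull_logsf(t, k, s):
    # log S(t) = -(t/s)**k
    return -((t / s) ** k)

def e_step(a, delta, x, lam, theta, k, s, p):
    """Posterior probability w_i = P[y_i = 1 | a_i, delta_i, x_i] that
    subject i's observation arises from the Gompertz component, with a
    constant mixing proportion p standing in for the age-dependent pi(a_i).
    Each component likelihood is divided by S(a_i) to condition on
    survival to age at entry (left truncation)."""
    lg = (delta * gompertz_logpdf(x, lam, theta)
          + (1.0 - delta) * gompertz_logsf(x, lam, theta)
          - gompertz_logsf(a, lam, theta))
    lw = (delta * weibull_logpdf(x, k, s)
          + (1.0 - delta) * weibull_logsf(x, k, s)
          - weibull_logsf(a, k, s))
    num = np.log(p) + lg
    # logaddexp keeps the mixture denominator numerically stable
    den = np.logaddexp(num, np.log1p(-p) + lw)
    return np.exp(num - den)
```

In the M-step (split into CM-steps in the ECM variant), these weights would enter the expected complete-data log-likelihood, with each block of parameters maximized conditionally on the others.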
