
112 Kaplan–Meier estimator for F
so that $I\{|\delta^m_n(t)|^2 > \varepsilon\} = 0$ for all $t$ as soon as $1/n < \varepsilon$, and the sum
becomes equal to $0$. To establish (b), note that
$$\Bigl\langle \frac{1}{\sqrt{n}}\, M_n \Bigr\rangle(t) = \frac{1}{n}\langle M_n\rangle(t) = \int_0^t \frac{Y_n(s)}{n}\,\mu(s)\,ds. \tag{10.6}$$
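The integrand in (10.6) involves the normalized at-risk process $Y_n(s)/n$. As a sanity check, a small simulation sketch (assuming, purely for illustration, exponential $F$ and $G$ with hypothetical rates `lam` and `gam`) shows it tracking $[1-F(s)][1-G(s)]$:

```python
import numpy as np

# Illustration only: F and G chosen as exponentials (rates lam, gam are
# hypothetical), so [1-F(s)][1-G(s)] = exp(-(lam+gam)*s).
rng = np.random.default_rng(0)
n, lam, gam = 100_000, 1.0, 0.5

T = rng.exponential(1 / lam, n)   # survival times T_i ~ F
C = rng.exponential(1 / gam, n)   # censoring times C_i ~ G
T_tilde = np.minimum(T, C)        # observed times, tilde T_i

for s in [0.5, 1.0, 2.0]:
    y_frac = np.mean(T_tilde >= s)       # Y_n(s)/n: fraction still at risk at s
    limit = np.exp(-(lam + gam) * s)     # [1-F(s)][1-G(s)]
    print(f"s={s}: Y_n/n={y_frac:.4f}, limit={limit:.4f}")
```

By the Glivenko–Cantelli theorem the agreement is in fact uniform in $s$, which is what the argument that follows uses.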
But
$$\frac{Y_n(s)}{n} = \frac{1}{n}\sum_{i=1}^{n} I\{\widetilde{T}_i \ge s\}$$
is just the tail of the empirical distribution function based on independent and identically distributed random variables $\widetilde{T}_i$. Therefore, from
the Glivenko–Cantelli theorem (see Lecture 3), it follows that
$$\frac{Y_n(s)}{n} \to [1 - F(s)][1 - G(s)], \qquad n \to \infty,$$
uniformly in $s$ with probability 1. Therefore, for all $t > 0$,
$$\Bigl\langle \frac{1}{\sqrt{n}}\, M_n \Bigr\rangle(t) \to \int_0^t [1 - F(s)][1 - G(s)]\,\mu(s)\,ds = \int_0^t [1 - G(s)]\,dF(s) = A(t)$$
with probability 1, and not only in probability. Hence the limit theorem ...
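The two integral expressions for $A(t)$ agree because $[1-F(s)]\,\mu(s)\,ds = dF(s)$, i.e. $\mu$ is the hazard of $F$. A quick numerical check of the identity (again a sketch, with hypothetical exponential choices of $F$ and $G$, for which $A(t)$ has a closed form):

```python
import numpy as np

# Hypothetical illustration: F and G exponential with rates lam and gam,
# so mu(s) = lam (the hazard of F) and A(t) has a closed form.
lam, gam, t = 1.0, 0.5, 2.0

s = np.linspace(0.0, t, 200_001)
surv_F = np.exp(-lam * s)          # 1 - F(s)
surv_G = np.exp(-gam * s)          # 1 - G(s)
dens_F = lam * np.exp(-lam * s)    # density f(s), so dF(s) = f(s) ds

def trapezoid(y, x):
    """Plain trapezoidal rule, to avoid depending on a specific numpy API."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

lhs = trapezoid(surv_F * surv_G * lam, s)  # integral of [1-F][1-G] mu ds
rhs = trapezoid(surv_G * dens_F, s)        # integral of [1-G] dF
closed = lam / (lam + gam) * (1 - np.exp(-(lam + gam) * t))  # A(t), closed form
print(lhs, rhs, closed)
```

All three numbers coincide up to quadrature error, consistent with the identity above.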