
of the network's weights, is called an epoch. The number of samples
in the subset is called the epoch size. Some researchers use an epoch
size of one, meaning that the weights are updated after each training
case is presented. The author usually favors using the entire training
set for each epoch, as this promotes stable convergence to the
optimal weights. Compromises between these extremes are popular.
When the epoch size is less than the entire training set, it is important
that the subset be selected randomly each time, or ugly oscillations
may occur. Epochs of training are repeated until the network's
performance is satisfactory ...
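
To make the trade-off concrete, here is a minimal sketch in Python (not from the book) that trains a linear model by gradient descent, with an epoch_size parameter controlling how many randomly chosen training cases feed each weight update. The function and variable names (train, epoch_size, and so on) are illustrative assumptions, not the author's code.

    # A minimal sketch of the epoch-size trade-off described above,
    # using plain gradient descent on a linear model. All names here
    # are illustrative, not from the text.
    import numpy as np

    rng = np.random.default_rng(0)

    def train(X, y, epoch_size, lr=0.01, epochs=500):
        """Gradient descent where each weight update uses epoch_size
        randomly chosen training cases (the subset the text describes)."""
        n_samples, n_features = X.shape
        w = np.zeros(n_features)
        for _ in range(epochs):
            # Select the subset randomly each time, as the text advises,
            # to avoid oscillations from a fixed, unrepresentative subset.
            idx = rng.choice(n_samples, size=epoch_size, replace=False)
            Xb, yb = X[idx], y[idx]
            # Gradient of the mean squared error over the subset.
            grad = 2.0 * Xb.T @ (Xb @ w - yb) / epoch_size
            w -= lr * grad
        return w

    # Synthetic data: y = 3*x0 - 2*x1 plus a little noise.
    X = rng.normal(size=(200, 2))
    y = X @ np.array([3.0, -2.0]) + 0.1 * rng.normal(size=200)

    w_online = train(X, y, epoch_size=1)       # update after every case
    w_full = train(X, y, epoch_size=len(X))    # the author's preference
    w_mini = train(X, y, epoch_size=32)        # a popular compromise
    print(w_online, w_full, w_mini)

Drawing a fresh random subset before every update mirrors the text's advice: a fixed or unrepresentative subset can pull the weights back and forth between epochs, producing the oscillations mentioned above.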