
434 Efficient Neural Network Training Algorithms
and build up H row by row:

H = \begin{pmatrix} v_1^\top H \\ \vdots \\ v_{n_w}^\top H \end{pmatrix}.
The excerpt from the IDL program FFNCG__DEFINE.PRO shown in Listing
B.1 extends the object class FFN introduced in Chapter 6 and implements a
vectorized version of the above determination of v^⊤H (method FFNCG::ROP)
and H (method FFNCG::HESSIAN). The eigenvalues of H are calculated in
method FFNCG::EIGENVALUES. The corresponding Python class Ffncg is coded
in the Python module supervisedclass.py.
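Since the Hessian of the cost function is symmetric, its eigenvalues are real and can be computed with a symmetric eigensolver. A minimal sketch, using a small hypothetical Hessian for illustration:

```python
import numpy as np

# Hypothetical 2x2 symmetric Hessian, for illustration only
H = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eigvalsh exploits symmetry and returns real eigenvalues
# in ascending order
eigenvalues = np.linalg.eigvalsh(H)
```

For this matrix the eigenvalues are (5 ± √5)/2, both positive, indicating a locally convex cost surface.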
B.2 Scaled conjugate gradient training
The backpropagation algorithm of Chapter 6 attempts to minimize the cost
function locally, that is, weight updates