The Backpropagation Algorithm
  • Initialization: Initialize all the weights with small random values from a pseudorandom sequence generator.
  • Forward computations: For each of the training feature vectors x(i), i = 1, 2,…, N, compute all the v_j^r(i) and y_j^r(i) = f(v_j^r(i)), j = 1, 2,…, k_r, r = 1, 2,…, L, from (4.7). Compute the cost function for the current weight estimate from (4.5) and (4.14).
  • Backward computations: For each i = 1, 2,…, N and j = 1, 2,…, k_L, compute δ_j^L(i) from (4.15); then compute δ_j^{r−1}(i) from (4.22) and (4.23) for r = L, L − 1,…, 2 and j = 1, 2,…, k_{r−1}.
  • Update the weights: For r = 1, 2,…, L and j = 1, 2,…, k_r, set w_j^r(new) = w_j^r(old) + Δw_j^r, where the gradient-descent correction Δw_j^r = −μ ∂J/∂w_j^r is formed from the δ_j^r(i) and the outputs of layer r − 1.
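The four steps above can be sketched as a short NumPy program. This is a minimal illustration, not the book's code: the layer sizes, learning rate mu, sigmoid activation, and squared-error cost are assumptions made here, and the equation references (4.7), (4.15), (4.22)–(4.23) are replaced by their standard gradient-descent forms.

```python
import numpy as np

def sigmoid(v):
    # Logistic activation f(v) = 1 / (1 + e^(-v)); note f'(v) = f(v)(1 - f(v)).
    return 1.0 / (1.0 + np.exp(-v))

def init_weights(layer_sizes, rng):
    # Initialization: small random values. W[r] maps the bias-augmented
    # output of layer r to layer r+1, hence the "+ 1" column.
    return [rng.uniform(-0.1, 0.1, size=(kr, kp + 1))
            for kp, kr in zip(layer_sizes[:-1], layer_sizes[1:])]

def train_epoch(weights, X, T, mu=0.5):
    # One batch-mode epoch over all N training pairs (x(i), t(i)):
    # forward pass, backward pass, then a single weight update.
    grads = [np.zeros_like(W) for W in weights]
    cost = 0.0
    for x, t in zip(X, T):
        # Forward computations: y^0 is the input; each layer's output is
        # augmented with a constant 1 so the bias is an ordinary weight.
        ys = [np.append(x, 1.0)]
        for W in weights:
            ys.append(np.append(sigmoid(W @ ys[-1]), 1.0))
        out = ys[-1][:-1]
        cost += 0.5 * np.sum((out - t) ** 2)
        # Backward computations: delta at the output layer (squared-error
        # cost, sigmoid activation), then propagate to earlier layers.
        delta = (out - t) * out * (1.0 - out)
        for r in range(len(weights) - 1, -1, -1):
            grads[r] += np.outer(delta, ys[r])
            if r > 0:
                y_prev = ys[r][:-1]  # drop the bias entry before propagating
                delta = (weights[r].T[:-1] @ delta) * y_prev * (1.0 - y_prev)
    # Update the weights: gradient descent with step size mu.
    for W, G in zip(weights, grads):
        W -= mu * G
    return cost
```

Repeated calls to `train_epoch` drive the cost down; because the update is applied once per epoch after accumulating all N gradients, this sketch is the batch-mode variant of the algorithm.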
Remarks ...
