Training our neural network

Okay, so now we have some good motivation for why this combination of nodes might help us make predictions. How are we actually going to adjust all of the sub-functions of our neural network nodes based on some input data? The answer is called backpropagation.

Backpropagation is a method for training our neural network that involves doing the following iteratively over a series of epochs (that is, passes through our training dataset):

  • Feeding our training data forward through the neural network to calculate an output
  • Calculating errors in the output
  • Using gradient descent (or another optimization method) to determine how we should change our weights and biases based on the errors
  • Backpropagating these weight/bias changes back through the network to update the weights and biases
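The loop above can be sketched in Go for the simplest possible case: a single sigmoid "neuron" trained on the logical OR function. The dataset, the starting weights, the learning rate, and the epoch count here are illustrative assumptions, not the book's own example, but each numbered comment maps to a step in the list.

```go
package main

import (
	"fmt"
	"math"
)

func sigmoid(x float64) float64 { return 1.0 / (1.0 + math.Exp(-x)) }

// predict feeds a single input forward through the one-neuron "network".
func predict(w []float64, b float64, x []float64) float64 {
	return sigmoid(w[0]*x[0] + w[1]*x[1] + b)
}

// train runs the iterative loop described above for a number of epochs.
func train(inputs [][]float64, targets []float64, epochs int, lr float64) ([]float64, float64) {
	w := []float64{0.1, 0.1} // arbitrary starting weights
	b := 0.0
	for epoch := 0; epoch < epochs; epoch++ {
		for i, x := range inputs {
			// 1. Feed the training example forward to calculate an output.
			out := predict(w, b, x)
			// 2. Calculate the error in the output.
			err := targets[i] - out
			// 3. Gradient descent: for a squared-error loss, the gradient
			//    with respect to the pre-activation is err * out * (1 - out).
			grad := err * out * (1 - out)
			// 4. Backpropagate the changes into the weights and bias.
			w[0] += lr * grad * x[0]
			w[1] += lr * grad * x[1]
			b += lr * grad
		}
	}
	return w, b
}

func main() {
	// Toy dataset: logical OR.
	inputs := [][]float64{{0, 0}, {0, 1}, {1, 0}, {1, 1}}
	targets := []float64{0, 1, 1, 1}

	w, b := train(inputs, targets, 5000, 0.5)
	for i, x := range inputs {
		fmt.Printf("input %v -> %.3f (target %.0f)\n", x, predict(w, b, x), targets[i])
	}
}
```

In a real network the same four steps apply per layer, with the chain rule carrying the gradient backward from the output layer to earlier weights; here there is only one "layer", so step 4 collapses to a single weight update.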
