How to do it...

In the previous section, we learned to perform the following steps on the input data to arrive at the error value in forward propagation (the code file is available as Neural_network_working_details.ipynb on GitHub):

  1. Initialize weights randomly
  2. Calculate the hidden layer unit values by multiplying input values with weights
  3. Perform activation on the hidden layer values
  4. Connect the hidden layer values to the output layer
  5. Calculate the squared error loss

A function to calculate the squared error loss values across all data points is as follows:

import numpy as np

def feed_forward(inputs, outputs, weights):
    # Step 2: multiply the input values with the hidden layer weights and add the bias
    pre_hidden = np.dot(inputs, weights[0]) + weights[1]
    # Step 3: apply the sigmoid activation on the hidden layer values
    hidden = 1/(1 + np.exp(-pre_hidden))
    # Step 4: connect the hidden layer values to the output layer
    out = np.dot(hidden, weights[2]) + weights[3]
    # Step 5: calculate the squared error between the prediction and the actual output
    squared_error = np.square(out - outputs)
    return squared_error
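
To see the function in action, here is a minimal sketch of calling it on a toy dataset. The shapes chosen here (two input features, three hidden units, one output) and the random seed are illustrative assumptions, not values from the recipe:

# Illustrative example (shapes and seed are assumptions, not from the recipe)
x = np.array([[1, 1]])            # one data point with two input features
y = np.array([[0]])               # the expected output for that data point
np.random.seed(0)                 # for reproducible random initialization
weights = [
    np.random.randn(2, 3),        # Step 1: hidden layer weights (2 inputs -> 3 hidden units)
    np.random.randn(1, 3),        # hidden layer bias
    np.random.randn(3, 1),        # output layer weights (3 hidden units -> 1 output)
    np.random.randn(1, 1)         # output layer bias
]
print(feed_forward(x, y, weights))  # squared error loss for the data point

Because the weights are initialized randomly, the loss printed at this stage is essentially arbitrary; the point of forward propagation is to produce this error value so that it can later be minimized by updating the weights.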
