All expressions are graphs

Now we can finally return to the preceding example.

Our problem, if you recall, was that we had to specify the neural network twice: once for prediction and once for learning. We then refactored the program so that the network is specified only once. However, we still had to write out the backpropagation expressions by hand, which is error-prone, especially with larger neural networks like the one we're about to build in this chapter. Is there a better way? The answer is yes.

Once we understand and fully internalize that neural networks are essentially mathematical expressions, we can take the lessons learned from tensors and model a neural network where the entire neural network is ...
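To make the idea concrete, here is a minimal, self-contained sketch (not the library we will use later) of an expression as a graph: each node is either a leaf holding a value or an operation whose inputs are other nodes, and evaluating the expression is just walking the graph. The `node`, `leaf`, `add`, and `mul` names are illustrative inventions for this sketch.

```go
package main

import "fmt"

// node is one vertex in an expression graph: either a leaf holding a
// value, or an operation whose inputs are other nodes.
type node struct {
	op       string  // "", "+", or "*"
	val      float64 // used only by leaves
	children []*node
}

func leaf(v float64) *node { return &node{val: v} }
func add(a, b *node) *node { return &node{op: "+", children: []*node{a, b}} }
func mul(a, b *node) *node { return &node{op: "*", children: []*node{a, b}} }

// eval walks the graph and computes the value of the expression.
func (n *node) eval() float64 {
	switch n.op {
	case "+":
		return n.children[0].eval() + n.children[1].eval()
	case "*":
		return n.children[0].eval() * n.children[1].eval()
	default:
		return n.val
	}
}

func main() {
	// w*x + b, the core expression of a single neuron, built as a graph.
	x, w, b := leaf(2), leaf(3), leaf(1)
	expr := add(mul(w, x), b)
	fmt.Println(expr.eval()) // prints 7
}
```

Because the expression is now data rather than hand-written arithmetic, a library can traverse the same graph to compute gradients automatically, which is exactly what frees us from writing backpropagation by hand.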
