Understanding computational graphs helps us think of complex models in terms of small subgraphs and operations.
Let's look at an example of a neural network with only one hidden layer and what its computation graph might look like in TensorFlow:
So, we have a hidden layer that we compute as the ReLU activation of a parameter matrix W times an input x, plus a bias term b: h = ReLU(Wx + b). The ReLU function takes the element-wise maximum of its input and zero.
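Concretely, the hidden-layer computation can be sketched in plain NumPy; the shapes and values below are illustrative choices, not taken from the text:

```python
import numpy as np

def relu(z):
    # ReLU: element-wise maximum of the input and zero
    return np.maximum(z, 0)

# Illustrative shapes: 3 inputs, 4 hidden units (assumed for this sketch)
x = np.array([1.0, -2.0, 0.5])          # input vector
W = np.array([[1.0, 0.0, 0.0],          # parameter matrix
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
b = np.array([0.1, -5.0, 0.1, 0.1])     # bias term

# Hidden layer: h = ReLU(Wx + b); negative pre-activations are clipped to 0
h = relu(W @ x + b)
```

Here the second and fourth pre-activations are negative, so ReLU zeroes them out while passing the positive values through unchanged.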
The following diagram shows what the graph might look like in TensorFlow:
In this graph, we have variables for our b and W and we ...