May 2018 · Beginner · 490 pages · 13h 16m · English
As in Chapter 9, Getting Your Neurons to Work, an algorithm with adaptive learning rates trains the model: the Adam optimizer minimizes a cross-entropy loss function, as shown here:
with tf.name_scope('train'):
    train_step = tf.train.AdamOptimizer(FLAGS.learning_rate).minimize(cross_entropy)
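Adam's "adaptive learning rate" comes from per-parameter moment estimates rather than a single fixed step size. The following is a minimal NumPy sketch of the Adam update rule applied to a toy one-variable problem; it is a hypothetical standalone illustration of the algorithm, not the book's TensorFlow code (the variable names `m`, `v`, `beta1`, `beta2` follow the standard Adam formulation):

```python
import numpy as np

# Toy problem: minimize f(x) = x^2, whose gradient is 2x.
lr, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8
x, m, v = 2.0, 0.0, 0.0
for t in range(1, 501):
    g = 2.0 * x                          # gradient of f at x
    m = beta1 * m + (1 - beta1) * g      # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * g * g  # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)         # bias correction for the warm-up steps
    v_hat = v / (1 - beta2 ** t)
    x -= lr * m_hat / (np.sqrt(v_hat) + eps)  # adaptive step: scaled per parameter
```

Dividing by the square root of the second-moment estimate is what makes the effective step size adapt: parameters with consistently large gradients take proportionally smaller steps. `tf.train.AdamOptimizer` performs these same updates for every trainable variable in the graph.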
The train node's Adam optimizer takes inputs from several other nodes: cross-entropy, layer2, dropout, layer1, and x-input, as shown in the following graph:
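The cross-entropy node that feeds the train step computes the loss being minimized. A plain NumPy sketch of softmax cross-entropy over a batch may make the computation concrete; this is a hypothetical illustration with made-up logits and one-hot labels, not the book's code:

```python
import numpy as np

def softmax(logits):
    # Subtract the row-wise max for numerical stability before exponentiating.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(logits, labels):
    # labels are one-hot; return the mean negative log-likelihood over the batch.
    probs = softmax(logits)
    return -np.mean(np.sum(labels * np.log(probs), axis=1))

logits = np.array([[2.0, 0.5, 0.1],
                   [0.2, 3.0, 0.3]])   # hypothetical network outputs
labels = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])   # hypothetical one-hot targets
loss = cross_entropy(logits, labels)
```

The loss is low when the softmax probability assigned to the correct class is high, and it grows without bound as that probability approaches zero, which is exactly the pressure the Adam train step exploits.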
At this point, TensorFlow provides all the necessary ...