Cost function

Before defining our cost function, we need to specify how long we are going to train and how the learning rate should decay over time:

# Number of training epochs
num_epochs = 700

# Defining our learning rate schedule (exponential decay)
learning_rate = tf.train.exponential_decay(learning_rate=0.0008,
                                           global_step=1,
                                           decay_steps=train_input_values.shape[0],
                                           decay_rate=0.95,
                                           staircase=True)

# Defining our cost function - squared error (tf.nn.l2_loss gives half the sum of squared differences)
model_cost = tf.nn.l2_loss(activation_output - output_values, name="squared_error_cost")

# Defining our gradient descent optimizer
model_train = tf.train.GradientDescentOptimizer(learning_rate).minimize(model_cost)
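Under the hood, tf.train.exponential_decay computes base_rate * decay_rate ** (global_step / decay_steps), flooring the exponent when staircase=True so the rate drops in discrete steps rather than continuously. The following is a minimal plain-Python sketch of that schedule; decay_steps=100 is an assumed stand-in for train_input_values.shape[0]:

import math

# Rough equivalent of the schedule tf.train.exponential_decay implements
def decayed_learning_rate(global_step, base_rate=0.0008,
                          decay_rate=0.95, decay_steps=100,
                          staircase=True):
    exponent = global_step / decay_steps
    if staircase:
        # With staircase=True the exponent is floored, so the rate
        # only changes once per full decay period
        exponent = math.floor(exponent)
    return base_rate * decay_rate ** exponent

print(decayed_learning_rate(0))    # 0.0008 (no decay yet)
print(decayed_learning_rate(150))  # 0.0008 * 0.95 (one full decay period)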

Now, it's time to execute our computational graph through the session variable.

So first off, we need to initialize ...
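In the TensorFlow 1.x workflow, that means creating a tf.Session, running tf.global_variables_initializer(), and then looping over the training epochs while feeding in the training data. The following is a minimal sketch, assuming TensorFlow has been imported as tf earlier in the chapter; input_values and train_target_values are hypothetical names for the input placeholder and the training labels:

# Create the session that will execute the graph defined above
sess = tf.Session()

# Initialize all model variables before running any other ops
sess.run(tf.global_variables_initializer())

for epoch in range(num_epochs):
    # One gradient descent step, also fetching the current cost value
    _, cost_value = sess.run([model_train, model_cost],
                             feed_dict={input_values: train_input_values,
                                        output_values: train_target_values})
    if epoch % 100 == 0:
        # Logging interval is illustrative, not the book's exact code
        print("Epoch %d, cost %f" % (epoch, cost_value))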
