Deep Learning By Example by Ahmed Menshawy

Cost function

Before defining our cost function, we need to specify how long we are going to train (the number of epochs) and how the learning rate should decay over the training run:

# Number of training epochs
num_epochs = 700

# Defining our learning rate iterations (decay)
learning_rate = tf.train.exponential_decay(learning_rate=0.0008,
                                           global_step=1,
                                           decay_steps=train_input_values.shape[0],
                                           decay_rate=0.95,
                                           staircase=True)

# Defining our cost function - squared error (tf.nn.l2_loss halves the sum of squared differences)
model_cost = tf.nn.l2_loss(activation_output - output_values, name="squared_error_cost")

# Defining our Gradient Descent optimizer
model_train = tf.train.GradientDescentOptimizer(learning_rate).minimize(model_cost)
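For reference, tf.train.exponential_decay follows the formula decayed_rate = learning_rate * decay_rate ^ (global_step / decay_steps), and staircase=True truncates the exponent to an integer so the rate drops in discrete steps. The small sketch below reproduces that arithmetic (the helper name and sample values are illustrative only); note that with the constant global_step=1 passed above, the staircased exponent floors to zero, so the effective rate stays at 0.0008.

import math

def decayed_rate(base_rate, global_step, decay_steps, decay_rate, staircase=True):
    # Mirrors the arithmetic behind tf.train.exponential_decay
    exponent = global_step / decay_steps
    if staircase:
        exponent = math.floor(exponent)
    return base_rate * decay_rate ** exponent

# Illustrative values: base rate 0.0008, 100 training rows
print(decayed_rate(0.0008, 1, 100, 0.95))    # 0.0008 (exponent floors to 0)
print(decayed_rate(0.0008, 250, 100, 0.95))  # 0.0008 * 0.95**2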

Now, it's time to execute our computational graph through the session variable.

So first off, we need to initialize ...
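The rest of this step is cut off in this excerpt, but the usual TensorFlow 1.x pattern is to initialize all variables and then run the optimizer inside a session once per epoch. The sketch below assumes that input_values and output_values are placeholders in the graph and that train_input_values and train_target_values hold the training arrays; these exact names outside this excerpt are assumptions.

# A minimal sketch, assuming input_values/output_values are placeholders and
# train_target_values holds the training labels (names are assumptions).
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    for epoch in range(num_epochs):
        # One full-batch gradient descent step, returning the current cost
        _, epoch_cost = sess.run([model_train, model_cost],
                                 feed_dict={input_values: train_input_values,
                                            output_values: train_target_values})
        if epoch % 100 == 0:
            print("Epoch {}, cost {:.4f}".format(epoch, epoch_cost))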
