© Hisham El-Amir and Mahmoud Hamdy 2020
H. El-Amir, M. Hamdy, Deep Learning Pipeline, https://doi.org/10.1007/978-1-4842-5349-6_10

10. Improving Deep Neural Networks

Hisham El-Amir and Mahmoud Hamdy
Jizah, Egypt

Optimizers in TensorFlow

We are still on the subject of gradient descent, but let's now turn to gradient optimization because of its importance to training. Gradient descent is an optimization method for finding the minimum of a function, and it is central to deep learning: it updates the weights of a neural network using the gradients computed through backpropagation.
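To make the update rule concrete, here is a minimal NumPy sketch of gradient descent on a one-dimensional least-squares problem. This is an illustrative stand-in for what TensorFlow's optimizers do automatically (computing gradients and applying `w -= lr * grad`); the variable names and the toy problem are assumptions, not code from this book.

```python
import numpy as np

# Toy least-squares problem: loss(w) = mean((x * w - y)^2).
# The true weight is 2.0, so gradient descent should converge toward it.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x

w = 0.0    # initial weight
lr = 0.05  # learning rate

for _ in range(200):
    # Analytic gradient of the loss with respect to w:
    # d/dw mean((x*w - y)^2) = 2 * mean(x * (x*w - y))
    grad = 2.0 * np.mean(x * (x * w - y))
    # The gradient descent update rule: step against the gradient.
    w -= lr * grad

print(round(w, 3))  # converges to 2.0
```

In a real network the gradient is not derived by hand; backpropagation computes it for every weight, and the optimizer applies this same update to each one.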

So how do we optimize the variants of gradient descent we discussed previously (batch, mini-batch, and stochastic)? Let's start with the benefits of these optimization methods ...
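The three variants named above differ only in how many samples contribute to each weight update: batch uses the whole dataset, mini-batch uses a small subset, and stochastic uses a single sample. The sketch below is a hedged illustration of that difference on a toy linear-regression problem; the helper name `sgd_fit` and all parameter values are assumptions for demonstration, not the book's code.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x + rng.normal(scale=0.1, size=100)  # true weight is 3.0

def sgd_fit(batch_size, epochs=50, lr=0.1):
    """Minimize loss(w) = mean((x*w - y)^2), using `batch_size` samples per update."""
    w = 0.0
    n = len(x)
    for _ in range(epochs):
        idx = rng.permutation(n)  # shuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            grad = 2.0 * np.mean(x[b] * (x[b] * w - y[b]))
            w -= lr * grad
    return w

# batch (all 100 samples), mini-batch (10), stochastic (1):
# all three converge near 3.0, but take very different numbers of steps.
for bs in (100, 10, 1):
    print(bs, round(sgd_fit(bs), 2))
```

Batch gradient descent takes one smooth step per epoch, stochastic takes one noisy step per sample, and mini-batch sits between the two, which is why it is the usual default in practice.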
