
Programming Machine Learning

by Paolo Perrotta
March 2020
Beginner to intermediate content level
342 pages
8h 38m
English
Pragmatic Bookshelf
Content preview from Programming Machine Learning

Batch by Batch

The style of gradient descent that we've used so far is also called batch gradient descent, because it groups all the training examples into one big batch and calculates the gradient of the loss over that entire batch. A common alternative is mini-batch gradient descent. Maybe you already guessed what it does: it splits the training set into smaller batches, and then takes a step of gradient descent for each batch.
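
To make that idea concrete, here is a minimal sketch of a mini-batch training loop for a linear model with a mean squared error loss. The gradient() helper, the variable names, and the hyperparameters are assumptions for this illustration, not the book's own code:

import numpy as np

def gradient(X, Y, w):
    # Gradient of the mean squared error loss for a linear model X @ w.
    return 2 * X.T @ (X @ w - Y) / X.shape[0]

def train_mini_batch(X, Y, iterations, lr, batch_size):
    w = np.zeros((X.shape[1], 1))
    for _ in range(iterations):
        # One gradient descent step per mini-batch, rather than a single
        # step computed over the whole training set.
        for start in range(0, X.shape[0], batch_size):
            x_batch = X[start:start + batch_size]
            y_batch = Y[start:start + batch_size]
            w -= lr * gradient(x_batch, y_batch, w)
    return w

The inner loop is the only change from the batch version: each slice of the data produces its own gradient and its own update to the weights.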

You might wonder how small batches help speed up training. Stick with me for a moment: let’s implement mini-batch GD and give it a test drive.

Implementing Batches

In most cases, we should shuffle a dataset before we split it into batches. That way, we’re sure that each batch contains a nice mix of examples—as ...
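
As a rough illustration of that shuffling step, here is one way to shuffle a dataset and carve it into batches with NumPy. The prepare_batches() name and its parameters are assumptions for this sketch, not necessarily how the book implements it:

import numpy as np

def prepare_batches(X, Y, batch_size):
    # Shuffle examples and labels with the same permutation,
    # so each row of X stays paired with its label in Y.
    shuffled = np.random.permutation(X.shape[0])
    X, Y = X[shuffled], Y[shuffled]
    # Split the shuffled data into batches of roughly batch_size rows.
    n_batches = int(np.ceil(X.shape[0] / batch_size))
    return list(zip(np.array_split(X, n_batches),
                    np.array_split(Y, n_batches)))

# Example usage: one gradient descent step per shuffled batch.
# for x_batch, y_batch in prepare_batches(X, Y, batch_size=32):
#     w -= lr * gradient(x_batch, y_batch, w)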



Publisher Resources

ISBN: 9781680507706