Programming Machine Learning

by Paolo Perrotta
March 2020
Beginner to intermediate
342 pages
8h 38m
English
Pragmatic Bookshelf
Content preview from Programming Machine Learning

Gradient Descent

Let’s look for a better implementation of train. The job of train is to find the parameters that minimize the loss, so let’s start by focusing on the loss itself:

 def loss(X, Y, w, b):
     return np.average((predict(X, w, b) - Y) ** 2)

Look at this function’s arguments. X and Y contain the input variables and the labels, so they never change from one call of loss to the next. To make the upcoming discussion easier, let’s also temporarily fix b at 0. So now the only variable is w.
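To make that concrete, here is a small sketch (not the book’s code) of what the loss looks like at a few values of w with b fixed at 0. The linear form of predict (X * w + b) and the toy data are assumptions for illustration:

```python
import numpy as np

# Assumed linear model; the book defines predict() earlier.
def predict(X, w, b):
    return X * w + b

def loss(X, Y, w, b):
    return np.average((predict(X, w, b) - Y) ** 2)

# Made-up toy data, perfectly fit by w = 2 when b = 0.
X = np.array([1.0, 2.0, 3.0])
Y = np.array([2.0, 4.0, 6.0])

# With X, Y, and b fixed, the loss depends only on w.
for w in [0, 1, 2, 3]:
    print(w, loss(X, Y, w, b=0))
```

Note how the loss drops to zero at w = 2 and grows as w moves away from it in either direction.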

How does the loss change as w changes? I put together a program that plots loss for w ranging from -1 to 4, and draws a green cross on its minimum value. Check out the following graph (as usual, that code is among the book’s source code):
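The plotting program itself isn’t shown in this preview. A minimal sketch of the same scan might look like the following; the toy X, Y, and the linear predict are assumptions, and instead of drawing the graph it just locates the w that the green cross would mark:

```python
import numpy as np

# Assumed linear model; the book defines predict() earlier.
def predict(X, w, b):
    return X * w + b

def loss(X, Y, w, b):
    return np.average((predict(X, w, b) - Y) ** 2)

# Made-up toy data for illustration.
X = np.array([1.0, 2.0, 3.0])
Y = np.array([2.1, 4.2, 5.9])

# Scan w from -1 to 4, as in the book's plot, with b fixed at 0.
ws = np.linspace(-1, 4, 501)
losses = np.array([loss(X, Y, w, b=0) for w in ws])

# The w with the smallest loss: where the green cross would go.
best_w = ws[np.argmin(losses)]
print(best_w)
```

Plotting ws against losses (e.g., with matplotlib) would reproduce the curve: a single smooth valley whose bottom sits at best_w.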

Nice curve! Let’s ...



Publisher Resources

ISBN: 9781680507706