
4. Gradient Descent

David Paper, Logan, Utah, USA

Gradient descent (GD) is an algorithm that minimizes (or maximizes) functions. To apply it, start from an initial set of parameter values and iteratively move toward a set of values that minimizes the function. This iterative minimization is achieved using calculus, by repeatedly taking steps in the negative direction of the function's gradient. GD is important because optimization is a big part of machine learning. It is also easy to implement, generic, and efficient (fast).
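
As a minimal sketch of the idea (not the chapter's own code; the function f(x) = x**2, its gradient 2x, the learning rate, and the starting point are all illustrative assumptions), the update rule "step opposite the gradient" can be run in a few lines of Python:

def f(x):
    return x ** 2              # assumed function to minimize

def gradient(x):
    return 2 * x               # derivative (gradient) of f

x = 5.0                        # assumed starting point
learning_rate = 0.1            # assumed step size
for i in range(25):
    x = x - learning_rate * gradient(x)    # step in the negative gradient direction

print('approximate minimum at x =', round(x, 4), 'with f(x) =', round(f(x), 6))

Each pass shrinks x toward 0, where f is smallest; the learning rate controls how large each step is.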

Simple Function Minimization (and Maximization)

GD is a first-order iterative optimization algorithm ...
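
The same update rule also handles maximization: step with the gradient instead of against it (gradient ascent). As a hedged sketch under assumed inputs (the concave function f(x) = -(x - 3)**2 + 4, learning rate 0.1, and starting point 0.0 are illustrative choices, not the chapter's example):

def f(x):
    return -(x - 3) ** 2 + 4       # assumed concave function with its maximum at x = 3

def gradient(x):
    return -2 * (x - 3)            # derivative of f

x = 0.0                            # assumed starting point
learning_rate = 0.1                # assumed step size
for i in range(50):
    x = x + learning_rate * gradient(x)    # ascend: step in the positive gradient direction

print('approximate maximum at x =', round(x, 4), 'with f(x) =', round(f(x), 4))

Here x converges toward 3, the point where the gradient vanishes and f reaches its maximum.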
