Descending Gradients
Now that you know a little more about optimization and how it applies to machine learning, it’s time to see how it is implemented in the context of a real problem. In this section, you’ll implement stochastic gradient descent to estimate the true parameters that were used to generate some training data.
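To ground that setup, here’s a minimal sketch of what such training data might look like: a linear model with known “true” parameters plus a little noise. The parameter values, shapes, and seed are illustrative assumptions, not the book’s actual setup.

```elixir
key = Nx.Random.key(42)

# "True" parameters we will later try to recover with SGD
# (arbitrary, illustrative values).
m_true = 2.0
b_true = -1.0

# 100 random inputs and noisy linear targets: y = m * x + b + noise.
{x, key} = Nx.Random.uniform(key, shape: {100, 1})
{noise, _key} = Nx.Random.normal(key, 0.0, 0.05, shape: {100, 1})
y = x |> Nx.multiply(m_true) |> Nx.add(b_true) |> Nx.add(noise)
```

An optimizer that works should drive its parameter estimates toward `m_true` and `b_true` as it sees more of this data.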
Gradient descent is an iterative optimization routine that uses the gradient of a function, evaluated at a particular point, to minimize that function. As you learned in the previous chapter, the gradient of a scalar function points in the direction of steepest ascent, so moving against it tells you how to navigate a function in search of a minimum. Thanks to Nx’s ...
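To make the idea concrete before the full implementation, here’s a minimal sketch of a gradient descent loop in Nx on a toy one-dimensional function. The module name, objective, learning rate, and step count are illustrative assumptions, not the book’s code.

```elixir
defmodule Descent do
  import Nx.Defn

  # Toy objective: f(x) = (x - 3)^2, minimized at x = 3.
  defn objective(x), do: (x - 3) * (x - 3)

  # One descent step: evaluate the gradient at the current point
  # and move against it, scaled by the learning rate.
  defn step(x, learning_rate) do
    x - learning_rate * grad(x, &objective/1)
  end
end

# Iterate a fixed number of steps from an arbitrary starting point.
Enum.reduce(1..100, Nx.tensor(0.0), fn _i, x ->
  Descent.step(x, 0.1)
end)
|> IO.inspect()
# => approaches 3.0
```

Each iteration nudges `x` downhill; with a small enough learning rate, the sequence of points settles near the minimizer.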