Optimizer for machine learning

An algorithm that decides how the model's parameters are updated in each iteration is called an optimizer. The optimizer starts from an initial guess and improves the approximation of the optimal value iteration by iteration: each step produces a better value than the previous one, so the sequence of iterates approaches the optimum. The optimization algorithms used in machine learning are mainly derived from gradient descent. Gradient descent is a first-order iterative optimization algorithm that uses the first-order gradient of the objective function to determine the direction to move in at each iteration. Let's say the objective function is f(θ) and the parameter to be adjusted is θ.
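The iterative update described above can be sketched as follows. This is a minimal, illustrative implementation, not the book's own code: the function name, the example objective (x − 3)², and the learning rate are all assumptions chosen for demonstration.

```typescript
// A minimal sketch of gradient descent on a one-dimensional objective,
// here f(x) = (x - 3)^2, whose first-order gradient is f'(x) = 2 * (x - 3).
// The function name, objective, and hyperparameters are illustrative.

function gradientDescent(
  grad: (x: number) => number, // first-order gradient of the objective
  init: number,                // initial guess
  learningRate: number,        // step size for each update
  steps: number                // number of iterations
): number {
  let x = init;
  for (let i = 0; i < steps; i++) {
    // Move against the gradient: the direction of steepest descent.
    // Each iterate is a better approximation of the minimizer than the last.
    x = x - learningRate * grad(x);
  }
  return x;
}

const minimum = gradientDescent((x) => 2 * (x - 3), 0, 0.1, 100);
console.log(minimum); // converges toward 3, the minimizer of (x - 3)^2
```

Starting from the initial guess 0, each update moves x a fraction of the way toward 3, so the iterates improve monotonically on this convex objective.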
