The Jacobian and Hessian matrices

Sometimes, we need to optimize functions whose input and output are both vectors, so for each component of the output vector we need to compute a gradient vector. For a function f: ℝⁿ → ℝᵐ, we will have m gradient vectors. Arranging them as the columns of a matrix, we get an n x m matrix of partial derivatives, called the Jacobian matrix.
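As a concrete illustration, here is a minimal sketch that approximates such a Jacobian with central finite differences; the toy function f and the step size eps are assumptions for illustration, not something prescribed by the text:

```python
import numpy as np

def f(x):
    # Assumed toy map f: R^3 -> R^2 (so n = 3, m = 2)
    return np.array([x[0] * x[1], np.sin(x[2]) + x[0] ** 2])

def jacobian(f, x, eps=1e-6):
    """Approximate the n x m Jacobian of f at x with central differences.

    Column j holds the gradient of output component j, matching the
    n x m arrangement described in the text.
    """
    x = np.asarray(x, dtype=float)
    n, m = x.shape[0], f(x).shape[0]
    J = np.zeros((n, m))
    for i in range(n):
        step = np.zeros(n)
        step[i] = eps
        # Central difference along input dimension i
        J[i, :] = (f(x + step) - f(x - step)) / (2 * eps)
    return J

x0 = np.array([1.0, 2.0, 0.5])
print(jacobian(f, x0))  # shape (3, 2): one column per output component
```

In a deep learning framework you would normally obtain the same matrix by automatic differentiation rather than finite differences; the numerical version above is only meant to make the definition tangible.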

For a real-valued function of a single variable, if we want to measure the curvature of the function curve at a point, we need to compute how fast the first derivative changes as we vary the input. For a function of many variables, these second-order partial derivatives are arranged in a square matrix, called the Hessian matrix.
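In the same spirit, the following sketch measures curvature numerically: the second derivative of a one-variable function, and the Hessian of a scalar-valued multivariable function via central differences. The sample functions and the step size are again assumptions for illustration:

```python
import numpy as np

def second_derivative(f, x, eps=1e-5):
    # Curvature of a single-variable function: how fast f' changes at x
    return (f(x + eps) - 2 * f(x) + f(x - eps)) / eps ** 2

def hessian(f, x, eps=1e-5):
    """Approximate the Hessian (the matrix of second-order partial
    derivatives) of a scalar-valued f: R^n -> R at point x."""
    x = np.asarray(x, dtype=float)
    n = x.shape[0]
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei, ej = np.zeros(n), np.zeros(n)
            ei[i], ej[j] = eps, eps
            # Mixed second partial derivative d^2 f / dx_i dx_j
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * eps ** 2)
    return H

print(second_derivative(np.sin, 0.0))  # ~ -sin(0) = 0
print(hessian(lambda v: v[0] ** 2 + v[0] * v[1], np.array([1.0, 2.0])))
# ~ [[2, 1], [1, 0]]
```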
