Hands-On Transfer Learning with Python

by Dipanjan Sarkar, Raghav Bali, Tamoghna Ghosh
August 2018
Intermediate to advanced
438 pages
12h 3m
English
Packt Publishing
Content preview from Hands-On Transfer Learning with Python

The Jacobian and Hessian matrices

Sometimes, we need to optimize functions whose input and output are vectors. So, for each component of the output vector, we need to compute the gradient vector. For $f: \mathbb{R}^n \rightarrow \mathbb{R}^m$, we will have m gradient vectors, one per output component. Arranging them as the columns of a matrix, we get an $n \times m$ matrix of partial derivatives, $\left(\partial f_i / \partial x_j\right)$, called the Jacobian matrix.
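
To make the $n \times m$ layout concrete, here is a minimal sketch (not taken from the book) that approximates the Jacobian of a small, hypothetical vector-valued function with central finite differences; the function f and the numerical_jacobian helper are illustrative choices, not the authors' code.

    import numpy as np

    def f(x):
        """Hypothetical f: R^2 -> R^3, so n = 2 inputs and m = 3 outputs."""
        x1, x2 = x
        return np.array([x1 * x2, np.sin(x1), x1 + x2 ** 2])

    def numerical_jacobian(func, x, eps=1e-6):
        """Approximate the n x m matrix whose columns are the gradient
        vectors of the output components (central finite differences)."""
        x = np.asarray(x, dtype=float)
        m = func(x).size
        jac = np.zeros((x.size, m))              # n rows, m columns
        for j in range(x.size):                  # perturb one input at a time
            step = np.zeros_like(x)
            step[j] = eps
            jac[j, :] = (func(x + step) - func(x - step)) / (2 * eps)
        return jac

    print(numerical_jacobian(f, [1.0, 2.0]))     # shape (2, 3)

Each column of the result is the gradient of one output component with respect to all n inputs, which is exactly the arrangement described above.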

For a real-valued function of a single variable, if we want to measure the curvature of the function's curve at a point, then we need to compute how fast the first derivative changes as we change the input. ...
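
As a one-dimensional illustration of that idea (the rate of change of the first derivative is the second derivative, the single-variable counterpart of the Hessian), here is a small finite-difference sketch; the function and helper are assumptions for illustration, not code from the book.

    def second_derivative(func, x, eps=1e-4):
        """Central-difference estimate of f''(x): how fast the first
        derivative changes as the input changes (the curvature)."""
        return (func(x + eps) - 2.0 * func(x) + func(x - eps)) / eps ** 2

    # f(x) = x**3 has f''(x) = 6x, so the estimate at x = 2 is close to 12.
    f = lambda x: x ** 3
    print(second_derivative(f, 2.0))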

Publisher Resources

ISBN: 9781788831307