Mastering Python for Finance - Second Edition

by James Ma Weiming
April 2019
Intermediate to advanced
426 pages
11h 13m
English
Packt Publishing

Stochastic gradient descent

Stochastic gradient descent (SGD) is a form of gradient descent that iteratively estimates the gradient of an objective loss function in order to minimize it, for example the loss of a linear support vector machine or a logistic regression model. The term stochastic refers to the fact that the samples used to estimate the gradient are chosen at random. With a high learning rate, larger steps are taken toward the minimum, so fewer iterations are needed; with a small learning rate, the steps are smaller and more iterations are required. SGD is a popular choice among machine learning practitioners, as it has been used effectively in large-scale text classification and natural language processing models. ...
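The update rule described above can be sketched in plain Python. This is a minimal illustration, not code from the book: it fits a one-variable linear model with SGD on squared error, with the sample order shuffled at random each epoch (the "stochastic" part) and the step size controlled by the learning rate `lr`. The function name and parameters are chosen here for illustration.

```python
import random

def sgd_linear_fit(xs, ys, lr=0.05, epochs=500, seed=0):
    """Fit y ~ w*x + b by stochastic gradient descent on squared error."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    indices = list(range(len(xs)))
    for _ in range(epochs):
        rng.shuffle(indices)           # stochastic: visit samples in random order
        for i in indices:
            err = (w * xs[i] + b) - ys[i]  # gradient of 0.5*err**2 w.r.t. the prediction
            w -= lr * err * xs[i]          # step size scaled by the learning rate
            b -= lr * err
    return w, b

# Noiseless data from y = 2x + 1, so SGD should recover w close to 2 and b close to 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]
w, b = sgd_linear_fit(xs, ys)
```

A larger `lr` would reach the neighbourhood of the solution in fewer epochs but risks overshooting; a smaller `lr` takes smaller, more numerous steps, mirroring the trade-off described in the text.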



ISBN: 9781789346466