Machine Learning Foundations: Calculus I: Limits & Derivatives
Using Differentiation, including AutoDiff, in Python to Optimize Learning Algorithms
The Machine Learning Foundations series of online trainings provides a comprehensive overview of all of the subjects (mathematics, statistics, and computer science) that underlie contemporary machine learning techniques, including deep learning and other artificial intelligence approaches.
All of the classes in the series bring theory to life through the combination of vivid full-color illustrations, straightforward Python examples within hands-on Jupyter notebook demos, and comprehension exercises with fully-worked solutions.
The focus is on providing you with a practical, functional understanding of the content covered. Context will be given for each topic, highlighting its relevance to machine learning. You will be better positioned to understand cutting-edge machine learning papers and you will be provided with resources for digging even deeper into topics that pique your curiosity.
The eight classes in the series are organized into four couplets:
Linear Algebra
Calculus
Statistics
Computer Science
The content in the second class of each couplet follows directly from the content of the first; however, you’re most welcome to pick and choose between any of the individual classes based on your particular interests or your existing familiarity with the material. (Note that at any given time, only a subset of these classes will be scheduled and open for registration. To receive notifications of upcoming classes in the series, sign up for the instructor’s email newsletter at jonkrohn.com.)
This class, Calculus I: Limits & Derivatives, introduces the mathematical field of calculus, the study of rates of change, from the ground up. It is essential because computing derivatives via differentiation is the basis of optimizing most machine learning algorithms, including those used in deep learning such as backpropagation and stochastic gradient descent. Through the measured exposition of theory paired with interactive examples, you’ll develop a working understanding of how calculus is used to compute limits and differentiate functions. You’ll also learn how to apply automatic differentiation within the popular TensorFlow 2 and PyTorch machine learning libraries. The content covered in this class is itself foundational for several other classes in the Machine Learning Foundations series, especially Calculus II and Optimization.
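To give a flavor of what differentiating a function looks like in code, here is a minimal sketch of the limit definition of the derivative, using plain Python (the function `f` and the step size `h` are illustrative choices, not part of the course materials):

```python
# The derivative is defined as a limit:
#   f'(x) = lim as h -> 0 of (f(x + h) - f(x)) / h
# Here f(x) = x**2, whose analytic derivative is 2x.

def f(x):
    return x ** 2

def derivative_estimate(f, x, h=1e-6):
    """Approximate f'(x) by taking a small (but nonzero) step h."""
    return (f(x + h) - f(x)) / h

# At x = 3, the true derivative is 2 * 3 = 6;
# the finite-difference estimate lands very close to it.
print(derivative_estimate(f, 3.0))
```

Shrinking `h` further pushes the estimate toward the true limit, which is exactly the idea the class builds on before introducing the differentiation rules that compute derivatives analytically.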
What you'll learn and how you can apply it
 Develop an understanding of what’s going on beneath the hood of machine learning algorithms, including those used for deep learning.
 Be able to more intimately grasp the details of machine learning papers as well as many of the other subjects that underlie ML, including partial-derivative calculus, statistics, and optimization algorithms.
 Compute the derivatives of functions, including by using AutoDiff in the popular TensorFlow 2 and PyTorch libraries.
This training course is for you because...
 You use high-level software libraries (e.g., scikit-learn, Keras, TensorFlow) to train or deploy machine learning algorithms, and would now like to understand the fundamentals underlying the abstractions, enabling you to expand your capabilities
 You’re a software developer who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems
 You’re a data scientist who would like to reinforce your understanding of the subjects at the core of your professional discipline
 You’re a data analyst or A.I. enthusiast who would like to become a data scientist or data/ML engineer, and so you’re keen to deeply understand the field you’re entering from the ground up (very wise of you!)
Prerequisites
 All code demos will be in Python, so experience with it or another object-oriented programming language would be helpful for following along with the hands-on examples.
Materials, downloads, or supplemental content needed in advance:
 During class, we’ll work on Jupyter notebooks interactively in the cloud via Google Colab. This requires zero setup and instructions will be provided in class.
Resources:
 Familiarity with the basics of math and algebra (e.g., the topics covered in Chapters 1 and 2 of Hadrien Jean’s book) will make the class easier to follow along with.
About your instructor

Jon Krohn is Chief Data Scientist at the machine learning company untapt. He authored the 2019 book Deep Learning Illustrated, an instant #1 bestseller that was translated into six languages. Jon’s also the presenter of dozens of hours of popular video tutorials such as Deep Learning with TensorFlow, Keras, and PyTorch. And he’s renowned for his compelling lectures, which he offers in-person at Columbia University, New York University, and the NYC Data Science Academy. Jon holds a PhD in neuroscience from Oxford and has been publishing on machine learning in leading academic journals since 2010.
Schedule
The timeframes are only estimates and may vary according to how the class is progressing.
Segment 1: Limits (40 min)
 What Calculus Is
 A Brief History of Calculus
 The Method of Exhaustion
 Calculating Limits
 Q&A and Break
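The "Calculating Limits" topic above can be previewed numerically: evaluate a function at inputs ever closer to the point of interest and watch the outputs converge. This sketch (the choice of sin(x)/x is a classic illustrative example, not necessarily the one used in class) estimates a limit that cannot be found by direct substitution:

```python
import math

# sin(x)/x is undefined at x = 0 (division by zero), yet its
# limit as x approaches 0 exists. Evaluate at ever-smaller x
# and watch the values converge toward that limit, 1.
for x in [0.1, 0.01, 0.001, 0.0001]:
    print(f"x = {x:>8}  sin(x)/x = {math.sin(x) / x:.10f}")
```

Each row lands closer to 1, which is the limit; this numerical "sneaking up" on a value mirrors the method of exhaustion covered earlier in the segment.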
Segment 2: Computing Derivatives with Differentiation (90 min)
 The Sum Rule
 The Product Rule
 The Quotient Rule
 Relating Differentiation to Machine Learning
 Cost (or Loss) Functions
 Calculating the Derivative of a Cost Function
 Q&A and Break
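As a taste of the "Calculating the Derivative of a Cost Function" topic above, here is a sketch using a hypothetical single-parameter quadratic cost for one toy training example (the values of `x`, `y`, and `w` are made up for illustration); the analytic gradient from the chain rule is cross-checked against a finite difference:

```python
# Hypothetical cost for one training example (x, y) and weight w:
#   C(w) = (y - w*x)**2
# By the chain rule, its derivative is:
#   dC/dw = -2 * x * (y - w*x)

x, y = 2.0, 10.0  # toy training example

def cost(w):
    return (y - w * x) ** 2

def cost_gradient(w):
    return -2 * x * (y - w * x)

# Cross-check the analytic gradient against a finite difference.
w, h = 3.0, 1e-6
numeric = (cost(w + h) - cost(w)) / h
print(cost_gradient(w), numeric)  # both should be near -16.0
```

Knowing this derivative is what lets gradient-based optimizers such as stochastic gradient descent decide which direction to nudge `w` to reduce the cost.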
Segment 3: Automatic Differentiation (80 min)
 AutoDiff with PyTorch
 AutoDiff with TensorFlow 2
 The Future: Differentiable Programming
 Final Exercises and Q&A
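To hint at what "AutoDiff" in Segment 3 refers to, here is a toy, pure-Python sketch of reverse-mode automatic differentiation, the technique behind PyTorch's autograd and TensorFlow's GradientTape. Real libraries are vastly more general and efficient; this handles only scalar addition and multiplication, and the class `Var` is an invention for this sketch:

```python
class Var:
    """A scalar that records how it was computed, so gradients
    can later flow backward through the computation graph."""

    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent_var, local_gradient)
        self.grad = 0.0

    def __add__(self, other):
        # d(a + b)/da = 1 and d(a + b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a * b)/da = b and d(a * b)/db = a  (the product rule's pieces)
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, upstream=1.0):
        # Accumulate the chain-rule product down to every input.
        self.grad += upstream
        for parent, local_grad in self.parents:
            parent.backward(upstream * local_grad)

# y = x*x + x, so dy/dx = 2x + 1, which is 7 at x = 3.
x = Var(3.0)
y = x * x + x
y.backward()
print(x.grad)  # 7.0
```

This "record the graph forward, replay the chain rule backward" pattern is exactly what backpropagation does at scale, and the class shows the production versions of it via PyTorch and TensorFlow 2.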