## Video description

9 Hours of Video Instruction

Hands-on approach to learning the probability and statistics underlying machine learning

Overview

Probability and Statistics for Machine Learning (Machine Learning Foundations) LiveLessons provides you with a functional, hands-on understanding of probability theory and statistical modeling, with a focus on machine learning applications.

Jon Krohn is Chief Data Scientist at the machine learning company untapt. He authored the book Deep Learning Illustrated, an instant #1 bestseller that has been translated into six languages. Jon is renowned for his compelling lectures, which he offers in person at Columbia University and New York University, as well as online via O'Reilly, YouTube, and the SuperDataScience podcast. Jon holds a PhD from Oxford and has been publishing on machine learning in leading academic journals since 2010; his papers have been cited over a thousand times.

Skill Level
• Intermediate

Learn How To
• Understand the appropriate variable type and probability distribution for representing a given class of data
• Calculate all of the standard summary metrics for describing probability distributions, as well as the standard techniques for assessing the relationships between distributions
• Apply information theory to quantify the proportion of valuable signal that's present among the noise of a given probability distribution
• Hypothesize about and critically evaluate the inputs and outputs of machine learning algorithms using essential statistical tools such as the t-test, ANOVA, and R-squared
• Understand the fundamentals of both frequentist and Bayesian statistics, as well as appreciate when one of these approaches is appropriate for the problem you're solving
• Use historical data to predict the future using regression models that take advantage of frequentist statistical theory (for smaller data sets) and modern machine learning theory (for larger data sets), including why we may want to consider applying deep learning to a given problem
• Develop a deep understanding of what's going on beneath the hood of predictive statistical models and machine learning algorithms

Who Should Take This Course
• You use high-level software libraries (e.g., scikit-learn, Keras, TensorFlow) to train or deploy machine learning algorithms and would now like to understand the fundamentals underlying the abstractions, enabling you to expand your capabilities
• You're a software developer who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems
• You're a data scientist who would like to reinforce your understanding of the subjects at the core of your professional discipline
• You're a data analyst or AI enthusiast who would like to become a data scientist or data/ML engineer, and so you're keen to deeply understand the field you're entering from the ground up (very wise of you!)

Course Requirements
• Mathematics: Familiarity with secondary school-level mathematics will make it easier for you to follow along with the class. If you are comfortable dealing with quantitative information--such as understanding charts and rearranging simple equations--then you should be well-prepared to follow along with all of the mathematics.
• Programming: All code demos will be in Python, so experience with it or another object-oriented programming language would be helpful for following along with the hands-on examples.

Lesson Descriptions

Lesson 1: Introduction to Probability
In Lesson 1, Jon starts by orienting you to the machine learning foundations series and covering what probability theory is. He then begins coverage of the most essential probability concepts, which are reinforced by comprehension exercises. The lesson ends with a comparison of Bayesian and frequentist statistics, as well as a discussion of applications of probability to machine learning.

Lesson 2: Random Variables
Lesson 2 focuses on random variables, a fundamental probability concept that is a prerequisite for understanding the later lessons. Jon starts off with an exploration of discrete and continuous variables as well as the probability distributions to which they correspond. The lesson wraps up with calculation of the expected value of random variables.
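As a taste of the kind of computation this lesson builds toward, here is a minimal Python sketch (illustrative only, not taken from the course notebooks) of the expected value of a discrete random variable, using a fair six-sided die as the example:

```python
# Expected value of a discrete random variable: E[X] = sum of x * P(X = x).
# Example distribution: a fair six-sided die.
outcomes = {x: 1 / 6 for x in range(1, 7)}

expected_value = sum(x * p for x, p in outcomes.items())
print(expected_value)  # E[X] for a fair die is 3.5
```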

Lesson 3: Describing Distributions
Lesson 3 is all about metrics for describing probability distributions. Jon covers measures of central tendency, quantiles, box-and-whisker plots, measures of dispersion, and measures of relatedness.
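For a flavor of these metrics in code, here is a small stdlib-only Python sketch (with made-up data, not from the course notebooks) computing the kinds of measures the lesson covers:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # hypothetical sample

data_mean = statistics.mean(data)      # measures of central tendency
data_median = statistics.median(data)
data_mode = statistics.mode(data)

# Quartiles (the quantiles behind a box-and-whisker plot), using the
# statistics module's default "exclusive" method.
q1, q2, q3 = statistics.quantiles(data, n=4)

data_sd = statistics.stdev(data)       # dispersion: sample standard deviation
```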

Lesson 4: Relationships Between Probabilities
In Lesson 4, Jon explores the core relationships between probabilities, including joint distributions, marginal and conditional probabilities, the chain rule, and independence.
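These relationships fit in a few lines of Python. The sketch below (made-up probabilities, not from the course notebooks) derives a marginal and a conditional probability from a joint distribution, then confirms the chain rule:

```python
# Joint distribution P(weather, activity) as a table of made-up probabilities.
joint = {
    ("sunny", "walk"): 0.30, ("sunny", "read"): 0.10,
    ("rainy", "walk"): 0.05, ("rainy", "read"): 0.55,
}

# Marginal probability: P(sunny) = sum of the joint over all activities.
p_sunny = sum(p for (weather, _), p in joint.items() if weather == "sunny")

# Conditional probability: P(walk | sunny) = P(sunny, walk) / P(sunny).
p_walk_given_sunny = joint[("sunny", "walk")] / p_sunny

# Chain rule: P(sunny, walk) = P(walk | sunny) * P(sunny).
assert abs(joint[("sunny", "walk")] - p_walk_given_sunny * p_sunny) < 1e-12
```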

Lesson 5: Distributions in Machine Learning
Having now led you through mastering probability theory in general, in Lesson 5 Jon details the most important probability distributions in machine learning, including the uniform and normal distributions, as well as the critical concept of the central limit theorem. He also covers the log-normal, exponential, discrete, and Poisson distributions, as well as mixtures of distributions and how to prepare distributions for input into a machine learning model.
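The central limit theorem in particular is easy to see empirically. This sketch (illustrative, not from the course notebooks) shows that means of samples drawn from a uniform distribution cluster around the distribution's mean of 0.5:

```python
import random
import statistics

random.seed(42)  # fixed seed so the sketch is reproducible

# Means of many samples from a uniform distribution on [0, 1): by the
# central limit theorem, these sample means are approximately normally
# distributed around 0.5, despite the non-normal shape of the source.
n, trials = 50, 2000
sample_means = [statistics.fmean(random.random() for _ in range(n))
                for _ in range(trials)]

print(statistics.fmean(sample_means))  # close to 0.5
```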

Lesson 6: Information Theory
In Lesson 6, Jon provides you with an introduction to information theory, a field of study related to probability theory that includes some key concepts that are ubiquitous in machine learning. Specifically, he defines self-information, Shannon entropy, KL divergence, and cross-entropy.
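All four of those quantities can be computed in a few lines. Here is a minimal sketch with two made-up distributions (not from the course notebooks):

```python
from math import log2

p = [0.5, 0.25, 0.25]  # hypothetical "true" distribution
q = [1/3, 1/3, 1/3]    # hypothetical model distribution

entropy = -sum(pi * log2(pi) for pi in p)                    # Shannon entropy H(p)
cross_entropy = -sum(pi * log2(qi) for pi, qi in zip(p, q))  # H(p, q)
kl_divergence = cross_entropy - entropy                      # D_KL(p || q) >= 0
```

The identity on the last line, cross-entropy equals entropy plus KL divergence, is why minimizing a classifier's cross-entropy loss amounts to minimizing the KL divergence between the data and model distributions.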

Lesson 7: Introduction to Statistics
From Lesson 7 onward, Jon shifts gears from general probability theory to the statistical models that probability theory facilitates. He starts by explaining how statistics are applied to machine learning and reviewing the most essential probability theory you absolutely must know to move forward. He then introduces new statistics concepts, specifically z-scores and p-values.
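Both new concepts can be sketched with the standard library alone. The example below (made-up population parameters, not from the course notebooks) computes a z-score and its two-tailed p-value under the standard normal:

```python
from math import erf, sqrt

# z-score: how many standard deviations an observation lies from the mean.
mu, sigma, x = 100.0, 15.0, 130.0  # made-up population parameters and observation
z = (x - mu) / sigma               # 2.0

# Two-tailed p-value under the standard normal, via its CDF
# Phi(z) = (1 + erf(z / sqrt(2))) / 2 -- stdlib only, no SciPy needed here.
phi = (1 + erf(z / sqrt(2))) / 2
p_value = 2 * (1 - phi)            # roughly 0.046
```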

Lesson 8: Comparing Means
In Lesson 8, Jon teaches you to use probability and statistics to compare distributions with t-tests. He covers all the critical types, including the single-sample, independent, and paired varieties. Jon provides specific applications of t-tests to machine learning, and then wraps the lesson up with a discussion of related concepts, namely, confidence intervals and analysis of variance.
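As a stdlib-only sketch of the independent-samples variety (made-up data, not from the course notebooks), the t statistic with pooled variance looks like this; in practice a library such as SciPy would also supply the p-value from the t distribution with n_a + n_b - 2 degrees of freedom:

```python
from math import sqrt
from statistics import mean, variance

# Hypothetical samples: e.g., model accuracies under two configurations.
a = [0.82, 0.79, 0.85, 0.80, 0.83]
b = [0.76, 0.74, 0.78, 0.75, 0.77]

n_a, n_b = len(a), len(b)

# Pooled variance, then the independent two-sample t statistic.
pooled_var = ((n_a - 1) * variance(a) + (n_b - 1) * variance(b)) / (n_a + n_b - 2)
t = (mean(a) - mean(b)) / sqrt(pooled_var * (1 / n_a + 1 / n_b))
```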

Lesson 9: Correlation
Lesson 9 builds on the introduction to correlation in Lesson 3. You are now armed with enough statistical knowledge to calculate p-values for correlations and to compute the coefficient of determination. Jon finishes off the lesson with important discussions about inferring causation and correcting for multiple comparisons.
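Both quantities follow directly from their definitions; here is a stdlib-only sketch with made-up paired observations (not from the course notebooks):

```python
from math import sqrt

# Made-up paired observations.
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.9]

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n

# Pearson correlation coefficient r.
covariance = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
r = covariance / sqrt(sum((xi - mean_x) ** 2 for xi in x)
                      * sum((yi - mean_y) ** 2 for yi in y))

# Coefficient of determination: proportion of variance in y explained by x.
r_squared = r ** 2
```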

Lesson 10: Regression
You're in for a treat with Lesson 10, which brings together the preceding lessons with practical, real-world demonstrations of regression--a powerful, highly extensible approach to making predictions. Jon distinguishes independent from dependent variables and uses linear regression to predict continuous variables--first with a single model feature and then with many, including discrete features. The lesson concludes with logistic regression for predicting discrete outcomes.
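The single-feature case the lesson starts with reduces to two closed-form quantities. Here is a stdlib-only sketch of ordinary least squares on made-up data (not from the course notebooks):

```python
# Ordinary least squares for a single feature: fit y = slope * x + intercept.
x = [1, 2, 3, 4, 5]            # hypothetical feature
y = [2.0, 4.1, 6.0, 8.2, 9.9]  # hypothetical continuous outcome

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n

slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
intercept = mean_y - slope * mean_x

prediction = slope * 6 + intercept  # predict y at an unseen x = 6
```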

Lesson 11: Bayesian Statistics
Lesson 11 is on Bayesian statistics. Jon provides a guide to when frequentist or Bayesian statistics might be the appropriate option for the problem you're solving, then introduces the most essential Bayesian concepts. Finally, he leaves you with resources for studying probability and statistics beyond what we had time for in these LiveLessons.
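At the heart of those concepts is Bayes' rule. This minimal sketch (made-up diagnostic-test numbers, not from the course notebooks) shows a prior being updated by evidence:

```python
# Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E), with made-up
# diagnostic-test numbers to show how a prior is updated by evidence.
p_disease = 0.01            # prior: P(H)
p_pos_given_disease = 0.95  # likelihood: P(E | H)
p_pos_given_healthy = 0.05  # false-positive rate: P(E | not H)

# Evidence term via the law of total probability.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

posterior = p_pos_given_disease * p_disease / p_pos  # P(H | E), about 0.16
```

Even with a 95%-sensitive test, the posterior stays modest because the prior is so small, which is exactly the kind of intuition Bayesian reasoning builds.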

Notebooks are available at github.com/jonkrohn/ML-foundations