Video description
When evaluating ML models, it can be difficult to tell the difference between what the models have learned to generalize from training and what they have simply memorized. That difference can be crucial in some ML tasks, such as when models are trained on sensitive data. Recently, new techniques have emerged for differentially private training of ML models, including deep neural networks (DNNs), that use a modified form of stochastic gradient descent to provide strong privacy guarantees for training data.
Those techniques are now available, and they're both practical and easy to use. That said, they come with their own set of hyperparameters that must be tuned, and they necessarily make learning less sensitive to outlier data, in ways that are likely to slightly reduce utility. Úlfar Erlingsson explores the basics of ML privacy, introduces differential privacy and explains why it's considered a gold standard, describes the concrete use of ML privacy and the principled techniques behind it, and dives into intended and unintended memorization and how it differs from generalization.
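The modified stochastic gradient descent mentioned above (often called DP-SGD) clips each example's gradient to a fixed norm and adds calibrated Gaussian noise before the update. Below is a minimal NumPy sketch of that idea for a linear-regression loss; it is not the TensorFlow Privacy API, and the function name, default hyperparameters, and loss choice are illustrative assumptions.

```python
import numpy as np

def dp_sgd_step(w, X, y, l2_norm_clip=1.0, noise_multiplier=1.1,
                lr=0.1, rng=None):
    """One differentially private SGD step for linear regression
    (squared loss), sketching the DP-SGD recipe: clip per-example
    gradients, add Gaussian noise, average, then step."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(X)
    # Per-example gradients of 0.5 * (x.w - y)^2 with respect to w.
    grads = (X @ w - y)[:, None] * X
    # Clip each example's gradient to L2 norm at most l2_norm_clip,
    # bounding any single example's influence on the update.
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    grads = grads / np.maximum(1.0, norms / l2_norm_clip)
    # Sum the clipped gradients and add Gaussian noise whose scale is
    # calibrated to the clip norm, then average over the batch.
    noisy_sum = grads.sum(axis=0) + rng.normal(
        scale=noise_multiplier * l2_norm_clip, size=w.shape)
    return w - lr * noisy_sum / n
```

The `l2_norm_clip` and `noise_multiplier` hyperparameters are exactly the extra knobs the description refers to: the clip bounds each example's contribution, and the noise scale (relative to that bound) determines the strength of the privacy guarantee.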
Prerequisite knowledge
- Experience using TensorFlow to train ML models
- A basic understanding of stochastic gradient descent
What you'll learn
- Learn what it means to provide privacy guarantees for ML models and how such guarantees can be achieved in practice using TensorFlow Privacy
Product information
- Title: TensorFlow Privacy: Learning with differential privacy for training data
- Author(s): Úlfar Erlingsson
- Release date: February 2020
- Publisher(s): O'Reilly Media, Inc.
- ISBN: 0636920373469