
Deep Learning with Python Video Edition

Video Description

"The clearest explanation of deep learning I have come across...it was a joy to read."
Richard Tobias, Cephasonics

Deep Learning with Python introduces the field of deep learning using the Python language and the powerful Keras library. Written by Keras creator and Google AI researcher François Chollet, this book builds your understanding through intuitive explanations and practical examples.

Machine learning has made remarkable progress in recent years. We went from near-unusable speech and image recognition to near-human accuracy. We went from machines that couldn't beat a serious Go player to defeating a world champion. Behind this progress is deep learning—a combination of engineering advances, best practices, and theory that enables a wealth of previously impossible smart applications.

Inside:
  • Deep learning from first principles
  • Setting up your own deep-learning environment
  • Image-classification models
  • Deep learning for text and sequences
  • Neural style transfer, text generation, and image generation

This Video Editions book requires intermediate Python skills. No previous experience with Keras, TensorFlow, or machine learning is required.
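As a taste of the first bullet above, "deep learning from first principles": under the hood, training reduces to gradient-based optimization. A toy sketch in plain Python (the one-parameter model, squared-error loss, and learning rate here are illustrative, not from the book):

```python
# Toy gradient descent: fit w so that w * x approximates y.
# Loss: L(w) = (w*x - y)**2; gradient: dL/dw = 2*x*(w*x - y).

def fit(x, y, lr=0.01, steps=200):
    w = 0.0
    for _ in range(steps):
        grad = 2 * x * (w * x - y)  # derivative of the loss at current w
        w -= lr * grad              # step against the gradient
    return w

w = fit(x=2.0, y=6.0)  # the ideal w is 3.0
print(round(w, 3))     # → 3.0
```

Real networks apply the same update rule to millions of weights at once, with the gradients computed automatically—which is exactly what Keras and TensorFlow handle for you.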

François Chollet works on deep learning at Google in Mountain View, CA. He is the creator of the Keras deep-learning library, as well as a contributor to the TensorFlow machine-learning framework. He also does deep-learning research, with a focus on computer vision and the application of machine learning to formal reasoning. His papers have been published at major conferences in the field, including the Conference on Computer Vision and Pattern Recognition (CVPR), the Conference and Workshop on Neural Information Processing Systems (NIPS), the International Conference on Learning Representations (ICLR), and others.

"An excellent hands-on introductory title, with great depth and breadth."
David Blumenthal-Barby, Babbel

"Bridges the gap between the hype and a functioning deep-learning system."
Peter Rabinovitch, Akamai

"The best resource for becoming a master of Keras and deep learning."
Claudio Rodriguez, Cox Media Group

NARRATED BY MARK THOMAS

Table of Contents

  1. PART 1: THE FUNDAMENTALS OF DEEP LEARNING
    1. Chapter 1. What is deep learning? 00:08:40
    2. Chapter 1. Learning representations from data 00:09:40
    3. Chapter 1. Understanding how deep learning works, in three figures 00:05:14
    4. Chapter 1. Don’t believe the short-term hype 00:07:04
    5. Chapter 1. Before deep learning: a brief history of machine learning 00:08:49
    6. Chapter 1. Decision trees, random forests, and gradient boosting machines 00:10:56
    7. Chapter 1. Why deep learning? Why now? 00:08:58
    8. Chapter 1. A new wave of investment 00:06:45
    9. Chapter 2. Before we begin: the mathematical building blocks of neural networks 00:08:52
    10. Chapter 2. Data representations for neural networks 00:08:59
    11. Chapter 2. Real-world examples of data tensors 00:07:25
    12. Chapter 2. The gears of neural networks: tensor operations 00:05:56
    13. Chapter 2. Tensor dot 00:07:20
    14. Chapter 2. The engine of neural networks: gradient-based optimization 00:09:33
    15. Chapter 2. Stochastic gradient descent 00:08:35
    16. Chapter 2. Looking back at our first example 00:04:01
    17. Chapter 3. Getting started with neural networks 00:10:04
    18. Chapter 3. Introduction to Keras 00:07:31
    19. Chapter 3. Setting up a deep-learning workstation 00:07:26
    20. Chapter 3. Classifying movie reviews: a binary classification example 00:10:12
    21. Chapter 3. Validating your approach 00:05:49
    22. Chapter 3. Classifying newswires: a multiclass classification example 00:10:34
    23. Chapter 3. Predicting house prices: a regression example 00:10:21
    24. Chapter 4. Fundamentals of machine learning 00:10:21
    25. Chapter 4. Evaluating machine-learning models 00:08:44
    26. Chapter 4. Data preprocessing, feature engineering, and feature learning 00:08:28
    27. Chapter 4. Overfitting and underfitting 00:06:58
    28. Chapter 4. Adding weight regularization 00:06:33
    29. Chapter 4. The universal workflow of machine learning 00:06:49
    30. Chapter 4. Developing a model that does better than a baseline 00:07:32
  2. PART 2: DEEP LEARNING IN PRACTICE
    1. Chapter 5. Deep learning for computer vision 00:04:06
    2. Chapter 5. The convolution operation 00:08:36
    3. Chapter 5. The max-pooling operation 00:04:31
    4. Chapter 5. Training a convnet from scratch on a small dataset 00:08:06
    5. Chapter 5. Data preprocessing 00:08:54
    6. Chapter 5. Using a pretrained convnet 00:12:57
    7. Chapter 5. Fine-tuning 00:06:34
    8. Chapter 5. Visualizing what convnets learn 00:07:47
    9. Chapter 5. Visualizing convnet filters 00:09:47
    10. Chapter 6. Deep learning for text and sequences 00:09:08
    11. Chapter 6. Using word embeddings 00:12:03
    12. Chapter 6. Putting it all together: from raw text to word embeddings 00:06:05
    13. Chapter 6. Understanding recurrent neural networks 00:07:49
    14. Chapter 6. Understanding the LSTM and GRU layers 00:09:23
    15. Chapter 6. Advanced use of recurrent neural networks 00:07:41
    16. Chapter 6. A common-sense, non-machine-learning baseline 00:06:50
    17. Chapter 6. Using recurrent dropout to fight overfitting 00:10:42
    18. Chapter 6. Going even further 00:03:59
    19. Chapter 6. Sequence processing with convnets 00:05:21
    20. Chapter 6. Combining CNNs and RNNs to process long sequences 00:06:39
    21. Chapter 7. Advanced deep-learning best practices 00:07:46
    22. Chapter 7. Multi-input models 00:04:13
    23. Chapter 7. Directed acyclic graphs of layers 00:09:48
    24. Chapter 7. Layer weight sharing 00:04:31
    25. Chapter 7. Inspecting and monitoring deep-learning models using Keras callbacks and TensorBoard 00:05:58
    26. Chapter 7. Introduction to TensorBoard: the TensorFlow visualization framework 00:06:29
    27. Chapter 7. Getting the most out of your models 00:07:39
    28. Chapter 7. Hyperparameter optimization 00:06:02
    29. Chapter 7. Model ensembling 00:08:35
    30. Chapter 8. Generative deep learning 00:06:53
    31. Chapter 8. A brief history of generative recurrent networks 00:08:33
    32. Chapter 8. Implementing character-level LSTM text generation 00:05:55
    33. Chapter 8. DeepDream 00:07:37
    34. Chapter 8. Neural style transfer 00:06:41
    35. Chapter 8. Neural style transfer in Keras 00:07:04
    36. Chapter 8. Generating images with variational autoencoders 00:03:58
    37. Chapter 8. Variational autoencoders 00:09:45
    38. Chapter 8. Introduction to generative adversarial networks 00:05:59
    39. Chapter 8. A bag of tricks 00:08:18
    40. Chapter 9. Conclusions 00:06:08
    41. Chapter 9. How to think about deep learning 00:09:38
    42. Chapter 9. Key network architectures 00:08:42
    43. Chapter 9. The space of possibilities 00:04:21
    44. Chapter 9. The limitations of deep learning 00:05:44
    45. Chapter 9. Local generalization vs. extreme generalization 00:04:57
    46. Chapter 9. The future of deep learning 00:09:35
    47. Chapter 9. Automated machine learning 00:09:12
    48. Chapter 9. Staying up to date in a fast-moving field 00:05:34