Machine Learning with PyTorch

Video description

6+ Hours of Video Instruction

Learn the main concepts and techniques used in modern machine learning and deep neural networks through numerous examples written in PyTorch

Overview

This course begins with the basic concepts of machine and deep learning. Subsequently, you gain a reasonable familiarity with the main features of PyTorch and learn how it can be applied to some popular problem domains.

About the Instructor

David Mertz has been involved with the Python community for 20 years, with data science (under various earlier names), and with machine learning (since way back when it was more likely to be called “artificial intelligence”). He was a director of the Python Software Foundation for six years and continues to serve on, or chair, a variety of PSF working groups.

He has also written quite a bit about Python: the column “Charming Python” for IBM developerWorks, for many years; the book Text Processing in Python (Addison-Wesley, 2003); and two short books for O’Reilly. He created the data science training program for Anaconda, Inc., and was a senior trainer for them.

Skill Level
Intermediate

Learn How To

  • Apply various machine and deep learning techniques
  • Understand the difference between various machine and deep learning libraries
  • Create classifiers
  • Enhance an existing classifier

Who Should Take This Course
Programmers and statisticians interested in using Python and the PyTorch library to implement machine learning

Course Requirements
Programming experience

Lesson Descriptions

Lesson 1: What Is Machine Learning? What Is Deep Learning?
The first lesson begins with a high-level overview of the course. It then presents general concepts in machine learning and concepts specifically relevant to neural networks and deep learning, including ideas every data scientist should understand. The main libraries available for machine learning, and for deep learning specifically, are presented with an eye toward how they compare to PyTorch. The lesson contains an overview of basic concepts in neural networks, including the basic idea of a perceptron and the enormous expansion of such simple models made possible by the hardware that has become available in the last decade. The lesson then delves into the main types of network layers available in neural networks, along with activation functions. It finishes with the importance of metrics in guiding refinements of machine learning models, a few of the most commonly used metrics, and the occasional need for custom, domain-specific metrics.
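For a concrete picture of these building blocks, the following minimal sketch (not taken from the course materials; the shapes, labels, and layer sizes are illustrative assumptions) shows a single fully connected layer, a few common activation functions, and a simple accuracy metric in PyTorch.

    # A minimal sketch of the Lesson 1 building blocks; names and shapes are illustrative.
    import torch
    from torch import nn

    x = torch.randn(8, 4)                 # a batch of 8 samples with 4 features
    layer = nn.Linear(4, 2)               # a fully connected ("dense") layer
    activations = [nn.ReLU(), nn.Tanh(), nn.Sigmoid()]

    logits = layer(x)
    for act in activations:
        print(act.__class__.__name__, act(logits).shape)

    # A basic metric: accuracy of predicted classes against (random) true labels
    y_true = torch.randint(0, 2, (8,))
    y_pred = logits.argmax(dim=1)
    accuracy = (y_pred == y_true).float().mean()
    print(f"accuracy: {accuracy:.2f}")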

Lesson 2: Comparing Several Libraries
The second lesson of this course compares several different software libraries and shows the particular philosophy and programming style of PyTorch. The first library is scikit-learn, using polynomial feature engineering, random forest classification, and recursive feature elimination. The second library is TensorFlow with its Keras interface, in which we attempt to recreate capabilities similar to those of the scikit-learn example. The final library we examine is PyTorch, in which we create a neural network identical to the one built with TensorFlow, primarily to look at philosophical and API differences between those two popular deep learning libraries. We end by using PyTorch to classify images.
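As a rough illustration of the kind of scikit-learn workflow this lesson describes, here is a hedged sketch combining polynomial feature engineering, recursive feature elimination, and a random forest classifier. The synthetic data and parameter values are placeholders, not the course's actual notebook.

    # A sketch of the scikit-learn style of pipeline discussed in Lesson 2.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import RFE
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import PolynomialFeatures

    X, y = make_classification(n_samples=500, n_features=6, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    pipeline = Pipeline([
        ("poly", PolynomialFeatures(degree=2, include_bias=False)),
        ("rfe", RFE(RandomForestClassifier(n_estimators=100, random_state=0),
                    n_features_to_select=10)),
        ("clf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ])
    pipeline.fit(X_train, y_train)
    print("test accuracy:", pipeline.score(X_test, y_test))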

Lesson 3: Understanding PyTorch
The third lesson of this course examines the fundamental abstractions underlying PyTorch: the concept of a tensor, and the capability of performing automatic differentiation on operations applied to tensors. The lesson examines the similarity between PyTorch tensors and the arrays in NumPy or other vectorized numeric libraries. It also introduces the two key additions PyTorch provides: automatic gradients that record the functional history of transformations, and easy targeting of GPUs. The lesson next establishes a low-level neural network and then turns to implementing a neural network with torch.nn. We also take the opportunity, with our simple neural network, to explain why a bias input is important in fine-tuning network layers. The lesson ends by briefly demonstrating the speed gain from using GPUs and mentioning the availability of torch.distributed for cluster computation.
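The following minimal sketch (an assumption-laden illustration, not the course code) shows the abstractions this lesson covers: NumPy-like tensors, autograd, optional GPU placement, and a small torch.nn model whose Linear layers carry a learnable bias.

    # A sketch of the Lesson 3 abstractions; sizes and the toy loss are illustrative.
    import torch
    from torch import nn

    # Tensors behave much like NumPy arrays, but can record their operation history.
    x = torch.linspace(-1, 1, steps=100).unsqueeze(1)
    w = torch.randn(1, requires_grad=True)    # autograd tracks operations on w
    loss = ((x * w) ** 2).mean()
    loss.backward()                           # gradient computed from the recorded graph
    print(w.grad)

    # Easy GPU targeting when a GPU is available.
    device = "cuda" if torch.cuda.is_available() else "cpu"

    # A small network built with torch.nn; bias=True adds the learnable offset
    # that lets each layer fit data not centered at the origin.
    model = nn.Sequential(
        nn.Linear(1, 16, bias=True),
        nn.ReLU(),
        nn.Linear(16, 1, bias=True),
    ).to(device)
    print(model(x.to(device)).shape)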

Lesson 4: Tasks with Networks
The fourth lesson of this course goes into the most depth, presenting several different types of networks or other models in its sub-lessons. The first sub-lesson addresses a problem David works on in his day job: making recommendations for clothing sizes based on some basic survey data about shoppers. In many ways, this is a very “classical” machine learning problem that tries to match a small number of features to a small number of output classes. Then the lesson turns to image classification, utilizing convolutional and pooling layers to predict target labels in a commonly used image data set. The network created is of moderate complexity but succeeds relatively well in categorizing images by the pictured object they contain. Next, regression prediction is applied to the same problem, in an attempt to make prediction more effective by reframing the task as regression rather than classification. Then clustering with PyTorch is covered. This is a brief departure from neural networks as such, and addresses another common need in machine learning and data science: we implement the k-means algorithm using PyTorch and its vectorized, GPU-targeted tensor operations. One of the “hot” topics in deep learning is covered next: generative adversarial networks (GANs). David creates something akin to supervised learning in a framework that is, strictly speaking, unsupervised. This is accomplished by letting two neural networks, a “generator” and a “discriminator,” compete against each other to, respectively, create and detect forgeries. Finally, neural networks are applied to another valuable area of their use: natural language processing. The network created, as with other examples in this lesson, is relatively simple, but it uses a recurrent layer to correctly classify parts of speech in sentences, even ones with homonyms and lexical ambiguity.
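To illustrate one of these sub-lessons, here is a hedged sketch of k-means clustering written directly with PyTorch's vectorized tensor operations (and therefore able to run on a GPU). The data, number of clusters, and iteration count are illustrative assumptions, not the course's values.

    # A sketch of k-means using vectorized PyTorch tensor operations.
    import torch

    def kmeans(points, k=3, iters=20):
        # Initialize centroids from randomly chosen points.
        centroids = points[torch.randperm(points.shape[0])[:k]]
        for _ in range(iters):
            # Pairwise distances between every point and every centroid.
            distances = torch.cdist(points, centroids)
            labels = distances.argmin(dim=1)
            # Recompute each centroid as the mean of its assigned points;
            # keep the old centroid if a cluster is empty.
            centroids = torch.stack([
                points[labels == i].mean(dim=0) if (labels == i).any() else centroids[i]
                for i in range(k)
            ])
        return labels, centroids

    points = torch.randn(300, 2)
    labels, centroids = kmeans(points)
    print(labels.shape, centroids)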

Lesson 5: Enhancing an Image Classifier
The fifth and final lesson of this course looks at an important capability of deep neural networks: transfer learning. It is possible to treat a highly trained complex model almost as a software library for development of new capabilities on top of it. David leverages a very large and previously trained network as a sophisticated tool for feature engineering. Using work already done by others, and only a comparatively small increment of additional computation, David is able to create a network that accurately classifies images against novel labels that are not present in the original training dataset.
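A minimal transfer-learning sketch in the spirit of this lesson follows: reuse a pretrained torchvision model as a fixed feature extractor and train only a new final layer for novel labels. The choice of resnet18 and the class count are assumptions for illustration.

    # A sketch of transfer learning with torchvision.models.
    import torch
    from torch import nn
    from torchvision import models

    # Load a pretrained backbone (older torchvision versions use pretrained=True instead).
    model = models.resnet18(weights="DEFAULT")

    # Freeze the pretrained weights; they act as fixed feature engineering.
    for param in model.parameters():
        param.requires_grad = False

    # Replace the final fully connected layer to predict, say, 5 novel classes.
    model.fc = nn.Linear(model.fc.in_features, 5)

    # Only the new layer's parameters are optimized.
    optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
    print(sum(p.numel() for p in model.parameters() if p.requires_grad), "trainable params")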

About Pearson Video Training

Pearson publishes expert-led video tutorials covering a wide selection of technology topics designed to teach you the skills you need to succeed. These professional and personal technology videos feature world-leading author instructors published by your trusted technology brands: Addison-Wesley, Cisco Press, Pearson IT Certification, Sams, and Que. Topics include: IT Certification, Network Security, Cisco Technology, Programming, Web Development, Mobile Development, and more. Learn more about Pearson Video training at http://www.informit.com/video.

Table of contents

  1. Machine Learning with PyTorch: Introduction
    1. 0.0 Course Introduction
    2. 0.1 Lesson Introductions
    3. 0.2 Installing the lessons and platform
  2. Lesson 1: What Is Machine Learning? What Is Deep Learning?
    1. Learning objectives
    2. 1.1 Understand the course at a high level
    3. 1.2 Describe the techniques used in machine learning
    4. 1.3 Describe the libraries used in machine learning
    5. 1.4 Understand the difference between deep learning and other ML techniques
    6. 1.5 Utilize additional concepts in ML
    7. 1.6 Understand the types of network layers and activation functions
    8. 1.7 Understand metrics
  3. Lesson 2: Comparing Several Libraries
    1. Learning objectives
    2. 2.1 Perform a task in scikit-learn
    3. 2.2 Perform a task in Keras (with TensorFlow)
    4. 2.3 Perform a task in PyTorch
    5. 2.4 Classify an image with PyTorch
  4. Lesson 3: Understanding PyTorch
    1. Learning objectives
    2. 3.1 Use tensors, autograd, and NumPy interfaces
    3. 3.2 Establish a low-level neural network
    4. 3.3 Implement a neural network with torch.nn
    5. 3.4 Understand why bias is important
    6. 3.5 Identify other torch tools
  5. Lesson 4: Tasks with Networks
    1. Learning objectives
    2. 4.1 Create a simple feature classifier--Part 1
    3. 4.2 Create a simple feature classifier--Part 2
    4. 4.3 Create an image classifier
    5. 4.4 Utilize regression prediction
    6. 4.5 Do clustering with PyTorch
    7. 4.6 Use generative adversarial networks--Part 1
    8. 4.7 Use generative adversarial networks--Part 2
    9. 4.8 Use a part of speech tagger
  6. Lesson 5: Enhancing an Image Classifier
    1. Learning objectives
    2. 5.1 Start with torchvision.models
    3. 5.2 Retrain pretrained models
    4. 5.3 Modify network layers
  7. Summary
    1. Machine Learning with PyTorch: Summary

Product information

  • Title: Machine Learning with PyTorch
  • Author(s): David Mertz
  • Release date: November 2019
  • Publisher(s): Pearson
  • ISBN: 0135627109