The Deep Learning Workshop

Book description

Take a hands-on approach to understanding deep learning and build smart applications that can recognize images and interpret text

Key Features

  • Understand how to implement deep learning with TensorFlow and Keras
  • Learn the fundamentals of computer vision and image recognition
  • Study the architecture of different neural networks

Are you fascinated by how deep learning powers intelligent applications such as self-driving cars, virtual assistants, facial recognition devices, and chatbots, enabling them to process data and solve complex problems? Whether you are already familiar with machine learning or new to the field, The Deep Learning Workshop makes deep learning easy to understand through engaging examples and exercises throughout.

The book starts by clarifying the relationship between deep learning, machine learning, and artificial intelligence, and gets you comfortable with the TensorFlow 2.0 programming structure through hands-on exercises. You'll learn how neural networks work, how a perceptron is structured, and how to use TensorFlow to create and train models. You'll then explore the fundamentals of computer vision by performing image recognition exercises with convolutional neural networks (CNNs) in Keras. As you advance, you'll make your models more powerful by working with text embeddings and sequence data using popular deep learning techniques. Finally, you'll get to grips with bidirectional recurrent neural networks (RNNs) and build generative adversarial networks (GANs) for image synthesis.

By the end of this deep learning book, you'll have learned the skills essential for building deep learning models with TensorFlow and Keras.
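
To give a feel for the workflow the book teaches, here is a minimal sketch, not taken from the book, of building and training a small multiclass classifier on MNIST with TensorFlow 2 and Keras; the layer sizes and hyperparameters are illustrative choices rather than the book's own:

```python
# Minimal sketch of the TensorFlow 2 / Keras workflow the book builds toward.
# Layer sizes and hyperparameters are illustrative assumptions, not the book's.
import tensorflow as tf

# Load and normalize the MNIST handwritten-digit dataset (28x28 grayscale images).
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small multilayer network: flatten each image, pass it through one hidden
# ReLU layer with dropout, and output a 10-way softmax over the digit classes.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.2),  # guards against overfitting
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Compile with an optimizer, a loss, and a metric, then train and evaluate.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)
```

The same build-compile-fit pattern recurs throughout the book's CNN and RNN chapters.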

What you will learn

  • Understand how deep learning, machine learning, and artificial intelligence are different
  • Develop multilayer deep neural networks with TensorFlow
  • Implement deep neural networks for multiclass classification using Keras
  • Train CNN models for image recognition
  • Handle sequence data and use it in conjunction with RNNs
  • Build a GAN to generate high-quality synthesized images

Who this book is for

If you are interested in machine learning and want to create and train deep learning models using TensorFlow and Keras, this workshop is for you. A solid understanding of Python and its packages, along with basic machine learning concepts, will help you learn the topics quickly.

Table of contents

  1. The Deep Learning Workshop
  2. Preface
    1. About the Book
      1. Audience
      2. About the Chapters
      3. Conventions
      4. Code Presentation
      5. Setting Up Your Environment
      6. Hardware Requirements
      7. Installing Anaconda on Your System
      8. Launching Jupyter Notebook
      9. Installing Libraries
      10. Installing TensorFlow 2.0
      11. Installing Keras
      12. Accessing the Code Files
  3. 1. Building Blocks of Deep Learning
    1. Introduction
      1. AI, Machine Learning, and Deep Learning
      2. Machine Learning
      3. Deep Learning
      4. Using Deep Learning to Classify an Image
        1. Pre-Trained Models
        2. The Google Text-to-Speech API
        3. Prerequisite Packages for the Demo
      5. Exercise 1.01: Image and Speech Recognition Demo
      6. Deep Learning Models
        1. The Multi-Layer Perceptron
        2. Convolutional Neural Networks
        3. Recurrent Neural Networks
      7. Generative Adversarial Networks
    2. Introduction to TensorFlow
      1. Constants
      2. Variables
        1. Defining Functions in TensorFlow
      3. Exercise 1.02: Implementing a Mathematical Equation
      4. Linear Algebra with TensorFlow
      5. Exercise 1.03: Matrix Multiplication Using TensorFlow
      6. The reshape Function
      7. Exercise 1.04: Reshaping Matrices Using the reshape() Function in TensorFlow
      8. The argmax Function
      9. Exercise 1.05: Implementing the argmax() Function
      10. Optimizers
      11. Exercise 1.06: Using an Optimizer for a Simple Linear Regression
      12. Activity 1.01: Solving a Quadratic Equation Using an Optimizer
    3. Summary
  4. 2. Neural Networks
    1. Introduction
    2. Neural Networks and the Structure of Perceptrons
      1. Input Layer
        1. Weights
        2. Bias
        3. Net Input Function
        4. Activation Function (G)
        5. Perceptrons in TensorFlow
      2. Exercise 2.01: Perceptron Implementation
    3. Training a Perceptron
      1. Perceptron Training Process in TensorFlow
      2. Exercise 2.02: Perceptron as a Binary Classifier
      3. Multiclass Classifier
        1. The Softmax Activation Function
      4. Exercise 2.03: Multiclass Classification Using a Perceptron
      5. MNIST Case Study
      6. Exercise 2.04: Classifying Handwritten Digits
    4. Keras as a High-Level API
      1. Exercise 2.05: Binary Classification Using Keras
      2. Multilayer Neural Network or Deep Neural Network
      3. ReLU Activation Function
      4. Exercise 2.06: Multilayer Binary Classifier
      5. Exercise 2.07: Deep Neural Network on MNIST Using Keras
    5. Exploring the Optimizers and Hyperparameters of Neural Networks
      1. Gradient Descent Optimizers
      2. The Vanishing Gradient Problem
      3. Hyperparameter Tuning
      4. Overfitting and Dropout
    6. Activity 2.01: Building a Multilayer Neural Network to Classify Sonar Signals
    7. Summary
  5. 3. Image Classification with Convolutional Neural Networks (CNNs)
    1. Introduction
    2. Digital Images
    3. Image Processing
      1. Convolution Operations
      2. Exercise 3.01: Implementing a Convolution Operation
      3. Stride
      4. Padding
    4. Convolutional Neural Networks
    5. Pooling Layers
      1. CNNs with TensorFlow and Keras
      2. Exercise 3.02: Recognizing Handwritten Digits (MNIST) with a CNN Using Keras
      3. Data Generator
      4. Exercise 3.03: Classifying Cats versus Dogs with Data Generators
    6. Data Augmentation
      1. Horizontal Flipping
      2. Vertical Flipping
      3. Zooming
      4. Horizontal Shifting
      5. Vertical Shifting
      6. Rotating
      7. Shearing
      8. Exercise 3.04: Image Classification (CIFAR-10) with Data Augmentation
      9. Activity 3.01: Building a Multiclass Classifier Based on the Fashion MNIST Dataset
    7. Saving and Restoring Models
      1. Saving the Entire Model
        1. Saving the Architecture Only
        2. Saving the Weights Only
    8. Transfer Learning
    9. Fine-Tuning
      1. Activity 3.02: Fruit Classification with Transfer Learning
    10. Summary
  6. 4. Deep Learning for Text – Embeddings
    1. Introduction
    2. Deep Learning for Natural Language Processing
      1. Getting Started with Text Data Handling
      2. Text Preprocessing
        1. Tokenization
        2. Normalizing Case
        3. Removing Punctuation
        4. Removing Stop Words
      3. Exercise 4.01: Tokenizing, Case Normalization, Punctuation, and Stop Word Removal
      4. Stemming and Lemmatization
      5. Exercise 4.02: Stemming Our Data
        1. Beyond Stemming and Lemmatization
        2. Downloading Text Corpora Using NLTK
      6. Activity 4.01: Text Preprocessing of the Alice in Wonderland Text
      7. Text Representation Considerations
    3. Classical Approaches to Text Representation
      1. One-Hot Encoding
      2. Exercise 4.03: Creating One-Hot Encoding for Our Data
      3. Term Frequencies
      4. The TF-IDF Method
      5. Exercise 4.04: Document-Term Matrix with TF-IDF
      6. Summarizing the Classical Approaches
    4. Distributed Representation for Text
      1. Word Embeddings and Word Vectors
        1. word2vec
      2. Training Our Own Word Embeddings
      3. Semantic Regularities in Word Embeddings
      4. Exercise 4.05: Vectors for Phrases
        1. Effect of Parameters – "size" of the Vector
        2. Effect of Parameters – "window size"
      5. Skip-gram versus CBOW
        1. Effect of Training Data
      6. Exercise 4.06: Training Word Vectors on Different Datasets
      7. Using Pre-Trained Word Vectors
      8. Bias in Embeddings – A Word of Caution
      9. Other Notable Approaches to Word Embeddings
      10. Activity 4.02: Text Representation for Alice in Wonderland
    5. Summary
  7. 5. Deep Learning for Sequences
    1. Introduction
    2. Working with Sequences
      1. Time Series Data – Stock Price Prediction
      2. Exercise 5.01: Visualizing Our Time-Series Data
    3. Recurrent Neural Networks
      1. Loops – An Integral Part of RNNs
      2. Exercise 5.02: Implementing the Forward Pass of a Simple RNN Using TensorFlow
      3. The Flexibility and Versatility of RNNs
      4. Preparing the Data for Stock Price Prediction
      5. Parameters in an RNN
      6. Training RNNs
      7. Exercise 5.03: Building Our First Plain RNN Model
      8. Model Training and Performance Evaluation
      9. 1D Convolutions for Sequence Processing
      10. Exercise 5.04: Building a 1D Convolution-Based Model
      11. Performance of 1D Convnets
      12. Using 1D Convnets with RNNs
      13. Exercise 5.05: Building a Hybrid (1D Convolution and RNN) Model
      14. Activity 5.01: Using a Plain RNN Model to Predict IBM Stock Prices
    4. Summary
  8. 6. LSTMs, GRUs, and Advanced RNNs
    1. Introduction
    2. Long-Range Dependence/Influence
    3. The Vanishing Gradient Problem
    4. Sequence Models for Text Classification
      1. Loading Data
      2. Staging and Preprocessing Our Data
    5. The Embedding Layer
    6. Building the Plain RNN Model
      1. Exercise 6.01: Building and Training an RNN Model for Sentiment Classification
    7. Making Predictions on Unseen Data
    8. LSTMs, GRUs, and Other Variants
      1. LSTMs
    9. Parameters in an LSTM
      1. Exercise 6.02: LSTM-Based Sentiment Classification Model
    10. LSTM versus Plain RNNs
    11. Gated Recurrent Units
      1. Exercise 6.03: GRU-Based Sentiment Classification Model
      2. LSTM versus GRU
    12. Bidirectional RNNs
      1. Exercise 6.04: Bidirectional LSTM-Based Sentiment Classification Model
    13. Stacked RNNs
      1. Exercise 6.05: Stacked LSTM-Based Sentiment Classification Model
    14. Summarizing All the Models
    15. Attention Models
    16. More Variants of RNNs
      1. Activity 6.01: Sentiment Analysis of Amazon Product Reviews
    17. Summary
  9. 7. Generative Adversarial Networks
    1. Introduction
      1. Key Components of Generative Adversarial Networks
      2. Problem Statement – Generating a Distribution Similar to a Given Mathematical Function
      3. Process 1 – Generating Real Data from the Known Function
      4. Exercise 7.01: Generating a Data Distribution from a Known Function
      5. Process 2 – Creating a Basic Generative Network
      6. Building the Generative Network
      7. Sequential()
        1. Kernel Initializers
        2. Dense Layers
        3. Activation Functions
      8. Exercise 7.02: Building a Generative Network
      9. Setting the Stage for the Discriminator Network
      10. Process 3 – Discriminator Network
        1. Implementing the Discriminator Network
        2. Function to Generate Real Samples
        3. Functions to Generate Fake Samples
        4. Building the Discriminator Network
        5. Training the Discriminator Network
      11. Exercise 7.03: Implementing the Discriminator Network
      12. Process 4 – Implementing the GAN
        1. Integrating All the Building Blocks
      13. Process for Building the GAN
      14. The Training Process
      15. Exercise 7.04: Implementing the GAN
    2. Deep Convolutional GANs
      1. Building Blocks of DCGANs
      2. Generating Handwritten Images Using DCGANs
        1. The Training Process
      3. Exercise 7.05: Implementing the DCGAN
      4. Analysis of Sample Plots
      5. Common Problems with GANs
        1. Mode Collapse
        2. Convergence Failure
      6. Activity 7.01: Implementing a DCGAN for the Fashion MNIST Dataset
    3. Summary
  10. Appendix
    1. 1. Building Blocks of Deep Learning
      1. Activity 1.01: Solving a Quadratic Equation Using an Optimizer
      2. Solution
    2. 2. Neural Networks
      1. Activity 2.01: Building a Multilayer Neural Network to Classify Sonar Signals
      2. Solution
    3. 3. Image Classification with Convolutional Neural Networks (CNNs)
      1. Activity 3.01: Building a Multiclass Classifier Based on the Fashion MNIST Dataset
      2. Solution
      3. Activity 3.02: Fruit Classification with Transfer Learning
      4. Solution
    4. 4. Deep Learning for Text – Embeddings
      1. Activity 4.01: Text Preprocessing of the Alice in Wonderland Text
      2. Solution
      3. Activity 4.02: Text Representation for Alice in Wonderland
      4. Solution
    5. 5. Deep Learning for Sequences
      1. Activity 5.01: Using a Plain RNN Model to Predict IBM Stock Prices
      2. Solution
    6. 6. LSTMs, GRUs, and Advanced RNNs
      1. Activity 6.01: Sentiment Analysis of Amazon Product Reviews
      2. Solution
    7. 7. Generative Adversarial Networks
      1. Activity 7.01: Implementing a DCGAN for the Fashion MNIST Dataset
      2. Solution

Product information

  • Title: The Deep Learning Workshop
  • Author(s): Mirza Rahim Baig, Thomas V. Joseph, Nipun Sadvilkar, Mohan Kumar Silaparasetty, Anthony So
  • Release date: July 2020
  • Publisher(s): Packt Publishing
  • ISBN: 9781839219856