Hands-On Natural Language Processing with PyTorch 1.x

Book description

Become a proficient NLP data scientist by developing deep learning models for NLP and extracting valuable insights from structured and unstructured data

Key Features

  • Get to grips with word embeddings, semantics, labeling, and high-level word representations using practical examples
  • Learn modern approaches to NLP and explore state-of-the-art NLP models using PyTorch
  • Improve your NLP applications with innovative neural networks such as RNNs, LSTMs, and CNNs

In the internet age, where an increasing volume of text data is generated daily from social media and other platforms, being able to make sense of that data is a crucial skill. With this book, you'll learn how to extract valuable insights from text by building deep learning models for natural language processing (NLP) tasks.

You'll start by learning how to install PyTorch and use CUDA to accelerate processing, then explore how NLP architectures work with the help of practical examples. This PyTorch NLP book will guide you through core concepts such as word embeddings, CBOW, and tokenization in PyTorch. You'll then learn techniques for processing textual data and see how deep learning can be applied to NLP tasks. The book demonstrates how to implement deep learning and neural network architectures to build models that can classify and translate text and perform sentiment analysis. Finally, you'll learn how to build advanced NLP models, such as conversational chatbots.

By the end of this book, you'll understand the different NLP problems that can be solved using deep learning with PyTorch, and you'll be able to build models to solve them.
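As a small taste of the workflow the book opens with, here is a minimal sketch (assuming PyTorch is already installed) of creating a tensor and moving computation onto a CUDA device when one is available:

```python
import torch

# Use a CUDA device if available, otherwise fall back to the CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Create a small tensor directly on the chosen device
x = torch.tensor([[1.0, 2.0], [3.0, 4.0]], device=device)

# Operations run on whichever device holds the data
y = x @ x  # matrix multiplication
print(y.device, y.shape)  # e.g. cpu torch.Size([2, 2]) on a machine without a GPU
```

The same pattern — check for CUDA once, then pass `device` to tensors and models — is used throughout PyTorch code to keep scripts portable between GPU and CPU machines.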

What you will learn

  • Use NLP techniques for understanding, processing, and generating text
  • Understand PyTorch, its applications, and how it can be used to build deep linguistic models
  • Explore the wide variety of deep learning architectures for NLP
  • Develop the skills you need to process and represent both structured and unstructured NLP data
  • Become well-versed with state-of-the-art technologies and exciting new developments in the NLP domain
  • Create chatbots using attention-based neural networks

Who this book is for

This PyTorch book is for NLP developers, machine learning and deep learning developers, and anyone interested in building intelligent language applications using both traditional NLP approaches and deep learning architectures. If you're looking to adopt modern NLP techniques and models for your development projects, this book is for you. Working knowledge of Python programming, along with basic working knowledge of NLP tasks, is required.

Publisher resources

Download Example Code

Table of contents

  1. Hands-On Natural Language Processing with PyTorch 1.x
  2. Contributors
  3. About the author
  4. About the reviewers
  5. Packt is searching for authors like you
  6. Preface
    1. Who this book is for
    2. What this book covers
    3. To get the most out of this book
    4. Download the example code files
    5. Download the color images
    6. Conventions used
    7. Get in touch
    8. Reviews
  7. Section 1: Essentials of PyTorch 1.x for NLP
  8. Chapter 1: Fundamentals of Machine Learning and Deep Learning
    1. Overview of machine learning
      1. Supervised learning
      2. Unsupervised learning
      3. How do models learn?
    2. Neural networks
      1. Structure of neural networks
      2. Activation functions
      3. How do neural networks learn?
      4. Overfitting in neural networks
    3. NLP for machine learning
      1. Bag-of-words
      2. Sequential representation
    4. Summary
  9. Chapter 2: Getting Started with PyTorch 1.x for NLP
    1. Technical requirements
    2. Installing and using PyTorch 1.x
      1. Tensors
    3. Enabling PyTorch acceleration using CUDA
    4. Comparing PyTorch to other deep learning frameworks
    5. Building a simple neural network in PyTorch
      1. Loading the data
      2. Building the classifier
      3. Implementing dropout
      4. Defining the forward pass
      5. Setting the model parameters
      6. Training our network
      7. Making predictions
      8. Evaluating our model
    6. NLP for PyTorch
      1. Setting up the classifier
      2. Training the classifier
    7. Summary
  10. Section 2: Fundamentals of Natural Language Processing
    1. In this section, you will learn the fundamentals of building a Natural Language Processing (NLP) application. You will also learn how to use various NLP techniques, such as word embeddings, CBOW, and tokenization, in PyTorch.
  11. Chapter 3: NLP and Text Embeddings
    1. Technical requirements
    2. Embeddings for NLP
      1. GloVe
      2. Embedding operations
    3. Exploring CBOW
      1. CBOW architecture
      2. Building CBOW
    4. Exploring n-grams
      1. N-gram language modeling
    5. Tokenization
    6. Tagging and chunking for parts of speech
      1. Tagging
      2. Chunking
    7. TF-IDF
      1. Calculating TF-IDF
      2. Implementing TF-IDF
      3. Calculating TF-IDF weighted embeddings
    8. Summary
  12. Chapter 4: Text Preprocessing, Stemming, and Lemmatization
    1. Technical requirements
    2. Text preprocessing
      1. Removing HTML
      2. Converting text into lowercase
      3. Removing punctuation
      4. Replacing numbers
    3. Stemming and lemmatization
      1. Stemming
      2. Lemmatization
    4. Uses of stemming and lemmatization
      1. Differences in lemmatization and stemming
    5. Summary
  13. Section 3: Real-World NLP Applications Using PyTorch 1.x
  14. Chapter 5: Recurrent Neural Networks and Sentiment Analysis
    1. Technical requirements
    2. Building RNNs
      1. Using RNNs for sentiment analysis
      2. Exploding and shrinking gradients
    3. Introducing LSTMs
      1. Working with LSTMs
      2. LSTM cells
      3. Bidirectional LSTMs
    4. Building a sentiment analyzer using LSTMs
      1. Preprocessing the data
      2. Model architecture
      3. Training the model
      4. Using our model to make predictions
    5. Deploying the application on Heroku
      1. Introducing Heroku
      2. Creating an API using Flask – file structure
      3. Creating an API using Flask – API file
      4. Creating an API using Flask – hosting on Heroku
    6. Summary
  15. Chapter 6: Convolutional Neural Networks for Text Classification
    1. Technical requirements
    2. Exploring CNNs
      1. Convolutions
      2. Convolutions for NLP
    3. Building a CNN for text classification
      1. Defining a multi-class classification dataset
      2. Creating iterators to load the data
      3. Constructing the CNN model
      4. Training the CNN
      5. Making predictions using the trained CNN
    4. Summary
  16. Chapter 7: Text Translation Using Sequence-to-Sequence Neural Networks
    1. Technical requirements
    2. Theory of sequence-to-sequence models
      1. Encoders
      2. Decoders
      3. Using teacher forcing
    3. Building a sequence-to-sequence model for text translation
      1. Preparing the data
      2. Building the encoder
      3. Building the decoder
      4. Constructing the full sequence-to-sequence model
      5. Training the model
      6. Evaluating the model
    4. Next steps
    5. Summary
  17. Chapter 8: Building a Chatbot Using Attention-Based Neural Networks
    1. Technical requirements
    2. The theory of attention within neural networks
      1. Comparing local and global attention
    3. Building a chatbot using sequence-to-sequence neural networks with attention
      1. Acquiring our dataset
      2. Processing our dataset
      3. Creating the vocabulary
      4. Loading the data
      5. Removing rare words
      6. Transforming sentence pairs to tensors
      7. Constructing the model
      8. Defining the training process
      9. Defining the evaluating process
      10. Training the model
    4. Summary
  18. Chapter 9: The Road Ahead
    1. Exploring state-of-the-art NLP machine learning
      1. BERT
      2. BERT – Architecture
      3. Applications of BERT
      4. GPT-2
      5. Comparing self-attention and masked self-attention
      6. GPT-2 – Ethics
    2. Future NLP tasks
      1. Constituency parsing
      2. Semantic role labeling
      3. Textual entailment
      4. Machine comprehension
    3. Summary
  19. Other Books You May Enjoy
    1. Leave a review - let other readers know what you think

Product information

  • Title: Hands-On Natural Language Processing with PyTorch 1.x
  • Author(s): Thomas Dop
  • Release date: July 2020
  • Publisher(s): Packt Publishing
  • ISBN: 9781789802740