Book Description
Write modern natural language processing applications using deep learning algorithms and TensorFlow
About This Book
 Focuses on more efficient natural language processing using TensorFlow
 Covers NLP as a field in its own right, to improve your understanding when choosing between TensorFlow tools and other deep learning approaches
 Provides choices for how to process and evaluate large unstructured text datasets
 Learn to apply the TensorFlow toolbox to specific tasks in one of the most interesting fields in artificial intelligence
This book is for Python developers with a strong interest in deep learning who want to learn how to leverage TensorFlow to simplify NLP tasks. Fundamental Python skills are assumed, as well as some knowledge of machine learning and undergraduate-level calculus and linear algebra. No previous natural language processing experience is required, although some background in NLP or computational linguistics will be helpful.
What You Will Learn
 Core concepts of NLP and various approaches to natural language processing
 How to solve NLP tasks by applying TensorFlow functions to create neural networks
 Strategies to process large amounts of data into word representations that can be used by deep learning applications
 Techniques for performing sentence classification and language generation using CNNs and RNNs
 How to employ state-of-the-art RNN models, such as long short-term memory (LSTM), to solve complex text generation tasks
 How to write automatic translation programs and implement an actual neural machine translator from scratch
 The trends and innovations that are shaping the future of NLP
Natural language accounts for much of the data available to deep learning applications, and TensorFlow is one of the most widely used deep learning frameworks available today. Natural Language Processing with TensorFlow brings TensorFlow and NLP together to give you invaluable tools for working with the immense volume of unstructured data in today's data streams, and for applying these tools to specific NLP tasks.
Thushan Ganegedara starts by giving you a grounding in NLP and TensorFlow basics. You'll then learn how to use Word2vec, including its advanced extensions, to create word embeddings that turn sequences of words into vectors accessible to deep learning algorithms. Chapters on classical deep learning algorithms, such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), demonstrate important NLP tasks such as sentence classification and language generation. You will learn how to apply high-performance RNN models, such as long short-term memory (LSTM) cells, to NLP tasks. You will also explore neural machine translation and implement a neural machine translator.
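As a quick illustration of what "word embeddings accessible to deep learning algorithms" means in practice, here is a minimal Python/TensorFlow sketch (not code from the book; the vocabulary size, vector dimensionality, and word IDs are arbitrary placeholders): a table of word vectors is stored as a matrix, and a sentence is converted to vectors by looking up each word's ID.

    import tensorflow as tf

    # Illustrative sketch only: in Word2vec these vectors would be learned;
    # here they are randomly initialized. All sizes are arbitrary choices.
    vocabulary_size = 10000   # number of distinct words
    embedding_size = 128      # dimensionality of each word vector

    embeddings = tf.Variable(
        tf.random.uniform([vocabulary_size, embedding_size], -1.0, 1.0))

    word_ids = tf.constant([12, 345, 6789])  # a three-word "sentence" as word IDs
    word_vectors = tf.nn.embedding_lookup(embeddings, word_ids)
    print(word_vectors.shape)  # (3, 128)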
After reading this book, you will have a solid understanding of NLP, the skills to apply TensorFlow in deep learning NLP applications, and the ability to perform specific NLP tasks.
Style and approach
The book emphasizes both the theory and practice of natural language processing. It introduces the reader to existing TensorFlow functions and explains how to apply them while writing NLP algorithms. The popular Word2vec method is used to teach the essential process of learning word representations. The book focuses on how to apply classical deep learning to NLP, as well as exploring cutting-edge and emerging approaches. Specific examples are used to make the concepts and techniques concrete.
Downloading the example code for this book
You can download the example code files for all Packt books you have purchased from your account at http://www.PacktPub.com. If you purchased this book elsewhere, you can visit http://www.PacktPub.com/support and register to have the files emailed directly to you.
Table of Contents

Natural Language Processing with TensorFlow
 Contributors
 Preface
 1. Introduction to Natural Language Processing
 2. Understanding TensorFlow

3. Word2vec – Learning Word Embeddings
 What is a word representation or meaning?
 Classical approaches to learning word representation
 Word2vec – a neural network-based approach to learning word representation
 The skip-gram algorithm
 The Continuous Bag-of-Words algorithm
 Summary

4. Advanced Word2vec
 The original skip-gram algorithm
 Comparing skip-gram with CBOW
 Extensions to the word embeddings algorithms
 More recent algorithms extending skip-gram and CBOW
 GloVe – Global Vectors representation

Document classification with Word2vec
 Dataset
 Classifying documents with word embeddings
 Implementation – learning word embeddings
 Implementation – word embeddings to document embeddings
 Document clustering and t-SNE visualization of embedded documents
 Inspecting several outliers
 Implementation – clustering/classification of documents with K-means
 Summary
 5. Sentence Classification with Convolutional Neural Networks

6. Recurrent Neural Networks
 Understanding Recurrent Neural Networks
 Backpropagation Through Time
 Applications of RNNs

Generating text with RNNs
 Defining hyperparameters
 Unrolling the inputs over time for Truncated BPTT
 Defining the validation dataset
 Defining weights and biases
 Defining state persisting variables
 Calculating the hidden states and outputs with unrolled inputs
 Calculating the loss
 Resetting state at the beginning of a new segment of text
 Calculating validation output
 Calculating gradients and optimizing
 Outputting a freshly generated chunk of text
 Evaluating text results output from the RNN
 Perplexity – measuring the quality of the text result

Recurrent Neural Networks with Context Features – RNNs with longer memory
 Technical description of the RNN-CF

Implementing the RNN-CF
 Defining the RNN-CF hyperparameters
 Defining input and output placeholders
 Defining weights of the RNN-CF
 Variables and operations for maintaining hidden and context states
 Calculating output
 Calculating the loss
 Calculating validation output
 Computing test output
 Computing the gradients and optimizing
 Text generated with the RNN-CF
 Summary
 7. Long Short-Term Memory Networks

8. Applications of LSTM – Generating Text
 Our data

Implementing an LSTM
 Defining hyperparameters
 Defining parameters
 Defining an LSTM cell and its operations
 Defining inputs and labels
 Defining sequential calculations required to process sequential data
 Defining the optimizer
 Decaying learning rate over time
 Making predictions
 Calculating perplexity (loss)
 Resetting states
 Greedy sampling to break unimodality
 Generating new text
 Example generated text
 Comparing LSTMs to LSTMs with peephole connections and GRUs
 Improving LSTMs – beam search
 Improving LSTMs – generating text with words instead of n-grams
 Using the TensorFlow RNN API
 Summary

9. Applications of LSTM – Image Caption Generation
 Getting to know the data
 The machine learning pipeline for image caption generation
 Extracting image features with CNNs
 Implementation – loading weights and inferencing with VGG
 Learning word embeddings
 Preparing captions for feeding into LSTMs
 Generating data for LSTMs
 Defining the LSTM
 Evaluating the results quantitatively
 Captions generated for test images
 Using TensorFlow RNN API with pretrained GloVe word vectors
 Summary

10. Sequence-to-Sequence Learning – Neural Machine Translation
 Machine translation
 A brief historical tour of machine translation
 Understanding Neural Machine Translation
 Preparing data for the NMT system
 Training the NMT
 Inference with NMT
 The BLEU score – evaluating machine translation systems
 Implementing an NMT from scratch – a German to English translator
 Training an NMT jointly with word embeddings
 Improving NMTs
 Attention
 Other applications of Seq2Seq models – chatbots
 Summary
 11. Current Trends and the Future of Natural Language Processing
 A. Mathematical Foundations and Advanced TensorFlow
 Index
Product Information
 Title: Natural Language Processing with TensorFlow
 Author(s): Thushan Ganegedara
 Release date: May 2018
 Publisher(s): Packt Publishing
 ISBN: 9781788478311