Video description
Nearly 4 Hours of Video Instruction
An intuitive introduction to processing natural language data with TensorFlow-Keras deep learning models.
Overview
Deep Learning for Natural Language Processing LiveLessons, Second Edition, is an introduction to building natural language models with deep learning. These lessons bring intuitive explanations of essential theory to life with interactive, hands-on Jupyter notebook demos. Examples feature Python and Keras, the high-level API for TensorFlow 2, the most popular Deep Learning library. Early lessons cover the specifics of working with natural language data, including how to convert natural language into numerical representations that can be readily processed by machine learning approaches. Later lessons leverage state-of-the-art Deep Learning architectures to make predictions with natural language data.
About the Instructor
Jon Krohn is Chief Data Scientist at the machine learning company untapt. He presents a popular series of deep learning tutorials published by Addison-Wesley and is the author of the bestselling book Deep Learning Illustrated. Jon teaches his deep learning curriculum in the classroom at the New York City Data Science Academy and guest lectures at Columbia University and New York University. He holds a doctorate in neuroscience from Oxford University and has been publishing on machine learning in leading journals since 2010.
Skill Level
- Intermediate
Learn How To
- Preprocess natural language data for use in machine learning applications
- Transform natural language into numerical representations with word2vec
- Make predictions with Deep Learning models trained on natural language
- Apply state-of-the-art NLP approaches with Keras, the high-level API for TensorFlow 2
- Improve Deep Learning model performance by selecting appropriate model architectures and tuning model hyperparameters
Who Should Take This Course
These LiveLessons are perfectly suited to software engineers, data scientists, analysts, and statisticians with an interest in applying Deep Learning to natural language data. Code examples are provided in Python, so familiarity with it or another object-oriented programming language would be helpful.
Course Requirements
The author’s Deep Learning with TensorFlow, Keras, and PyTorch LiveLessons, or familiarity with the topics covered in Chapters 5 through 9 of his book Deep Learning Illustrated, is a prerequisite.
Lesson Descriptions
Lesson 1: The Power and Elegance of Deep Learning for NLP
This lesson starts off by examining Natural Language Processing and how it has been revolutionized in recent years by Deep Learning approaches. Next comes a review of how to run the code in these LiveLessons. This is followed by the foundational Deep Learning theory upon which an NLP specialization can be built. Finally, the lesson provides a sneak peek at the capabilities you’ll develop over the course of all five lessons.
Lesson 2: Word Vectors
The lesson begins with a brief linguistics section that introduces computational representations of natural language elements. It then turns to illustrating what word vectors are, as well as how the beautiful word2vec algorithm creates them.
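For a sense of what creating word vectors can look like in practice, here is a minimal sketch using gensim's Word2Vec implementation. Gensim is assumed here purely for illustration and is not necessarily the library used in the course's notebooks; the toy corpus and hyperparameters are placeholders.

```python
# Minimal word2vec sketch using gensim (assumed for illustration only;
# corpus, vector_size, and window are illustrative placeholders).
from gensim.models import Word2Vec

# Toy corpus: each document is a list of lower-cased tokens.
corpus = [
    ["deep", "learning", "models", "process", "natural", "language"],
    ["word", "vectors", "capture", "the", "meaning", "of", "words"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
]

# sg=1 selects the skip-gram architecture; vector_size is the embedding
# dimension (named `size` in gensim versions prior to 4.0).
model = Word2Vec(sentences=corpus, vector_size=64, window=5, min_count=1, sg=1)

# Inspect a learned vector and its nearest neighbours in the vector space.
print(model.wv["language"].shape)                # (64,)
print(model.wv.most_similar("language", topn=3))
```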
Lesson 3: Modeling Natural Language Data
In the preceding lesson, you learned about vector-space embeddings and creating word vectors with word2vec. That process identified shortcomings of our natural language data, so this lesson begins with coverage of best practices for preprocessing language data. Next, on the whiteboard, Jon works through how to calculate a concise and broadly useful summary metric called the Area Under the Curve of the Receiver Operating Characteristic. You immediately learn how to calculate that summary metric in practice by building and evaluating a dense neural network for classifying documents. The lesson then goes a step further by showing you how to add convolutional layers into your deep neural network as well.
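The following is a minimal sketch of that workflow in TensorFlow 2 Keras, assuming the IMDB review data bundled with Keras as stand-in classification documents; layer sizes, epochs, and other hyperparameters are illustrative placeholders rather than the course's own settings.

```python
# Dense document classifier sketch in TensorFlow 2 Keras, evaluated with
# ROC AUC via scikit-learn. All hyperparameters are illustrative.
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.preprocessing.sequence import pad_sequences
from sklearn.metrics import roc_auc_score

max_words, max_len = 5000, 100

# IMDB reviews arrive as integer-encoded word indices.
(x_train, y_train), (x_valid, y_valid) = keras.datasets.imdb.load_data(num_words=max_words)
x_train = pad_sequences(x_train, maxlen=max_len)   # pad/truncate to a fixed length
x_valid = pad_sequences(x_valid, maxlen=max_len)

model = keras.Sequential([
    keras.Input(shape=(max_len,)),
    layers.Embedding(max_words, 64),               # learned word-vector layer
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),         # probability of the positive class
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=128, validation_data=(x_valid, y_valid))

# Area under the ROC curve on the held-out reviews.
print(roc_auc_score(y_valid, model.predict(x_valid).ravel()))
```

Replacing the Flatten layer with, for example, a Conv1D layer followed by GlobalMaxPooling1D is one way to introduce the convolutional layers mentioned above.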
Lesson 4: Recurrent Neural Networks
This lesson kicks off by delving into the essential theory of Recurrent Neural Networks, a Deep Learning family that’s ideally suited to handling data that occur in a sequence like languages do. You immediately learn how to apply this theory by incorporating an RNN into your document classification model. Jon then provides a high-level theoretical overview of especially powerful RNN variants--the Long Short-Term Memory Unit and the Gated Recurrent Unit--before showing you how to incorporate these variants into your deep learning models as well.
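As a rough sketch of what incorporating a recurrent layer can look like in Keras, assuming the same padded, integer-encoded inputs as in the previous sketch and with illustrative layer widths:

```python
# RNN-based document classifier sketch; an LSTM is shown, with GRU and
# SimpleRNN as drop-in alternatives. Sizes are illustrative placeholders.
from tensorflow import keras
from tensorflow.keras import layers

rnn_model = keras.Sequential([
    keras.Input(shape=(100,)),            # fixed-length, integer-encoded documents
    layers.Embedding(5000, 64),
    layers.LSTM(64, dropout=0.2),         # could also be layers.GRU(64) or layers.SimpleRNN(64)
    layers.Dense(1, activation="sigmoid"),
])
rnn_model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```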
Lesson 5: Advanced Models
This lesson expands your natural language modeling capabilities further by examining special cases of the LSTM, namely the Bi-Directional and Stacked varieties. Jon also arms you with a rich set of natural language data sets that you can use to train powerful Deep Learning models. To wrap up these LiveLessons, Jon takes you on a journey through other advanced approaches, including sequence generation, seq2seq models, attention, transfer learning, non-sequential network architectures, and financial time series applications.
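A minimal sketch of those two LSTM variants in Keras, with illustrative layer widths, might look like the following; note that stacking requires the lower recurrent layer to return its full output sequence:

```python
# Stacked Bi-Directional LSTM classifier sketch; widths and depth are
# illustrative placeholders, not the course's settings.
from tensorflow import keras
from tensorflow.keras import layers

adv_model = keras.Sequential([
    keras.Input(shape=(100,)),
    layers.Embedding(5000, 64),
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),  # lower stacked layer passes sequences upward
    layers.Bidirectional(layers.LSTM(64)),                         # upper layer returns only its final output
    layers.Dense(1, activation="sigmoid"),
])
adv_model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```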
About Pearson Video Training
Pearson publishes expert-led video tutorials covering a wide selection of technology topics designed to teach you the skills you need to succeed. These professional and personal technology videos feature world-leading author instructors published by your trusted technology brands: Addison-Wesley, Cisco Press, Pearson IT Certification, Prentice Hall, Sams, and Que. Topics include IT Certification, Network Security, Cisco Technology, Programming, Web Development, Mobile Development, and more. Learn more about Pearson Video training at http://www.informit.com/video.
Table of contents
- Introduction
- Lesson 1: The Power and Elegance of Deep Learning for NLP
- Lesson 2: Word Vectors
- Topics
- 2.1 Computational Representations of Natural Language Elements
- 2.2 Visualizing Word Vectors with word2viz
- 2.3 Localist versus Distributed Representations
- 2.4 Elements of Natural Human Language
- 2.5 The word2vec Algorithm
- 2.6 Creating Word Vectors with word2vec
- 2.7 Pre-Trained Word Vectors and doc2vec
- Lesson 3: Modeling Natural Language Data
- Lesson 4: Recurrent Neural Networks
- Lesson 5: Advanced Models
- Summary
Product information
- Title: Deep Learning for Natural Language Processing, 2nd Edition
- Author(s): Jon Krohn
- Release date: February 2020
- Publisher(s): Pearson
- ISBN: 0136620019