Hands-On Natural Language Processing with PyTorch 1.x

by Thomas Dop
July 2020
Beginner to intermediate
276 pages
6h 5m
English
Packt Publishing
Content preview from Hands-On Natural Language Processing with PyTorch 1.x

Chapter 3: NLP and Text Embeddings

There are many different ways of representing text in deep learning. While we have covered basic bag-of-words (BoW) representations, there is a far more sophisticated way of representing text data known as embeddings. Whereas a BoW vector acts only as a count of the words within a sentence, embeddings numerically represent the actual meaning of words.
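To make the contrast concrete, here is a minimal Python sketch (not taken from the book, and using illustrative toy vectors rather than trained weights): a BoW vector records only how many times each vocabulary word occurs, while an embedding assigns each word a dense vector whose values can encode similarity of meaning.

```python
from collections import Counter

vocab = ["the", "cat", "sat", "on", "mat"]
sentence = "the cat sat on the mat".split()

# Bag-of-words: one count per vocabulary word -- word order and meaning are lost.
counts = Counter(sentence)
bow_vector = [counts[w] for w in vocab]
print(bow_vector)  # [2, 1, 1, 1, 1]

# Embeddings: each word maps to a dense vector. In practice these values are
# learned during training; the 3-dimensional vectors below are placeholders
# chosen only to illustrate that related words can end up with similar vectors.
embeddings = {
    "cat": [0.2, -1.1, 0.4],
    "mat": [0.3, -0.9, 0.5],   # close to "cat" in this toy space
    "the": [1.5, 0.0, -0.2],   # far from both
}
print(embeddings["cat"])
```

In PyTorch, such a lookup table is typically provided by an embedding layer whose weights are learned alongside the rest of the model, which is what the chapter goes on to build.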

In this chapter, we will explore text embeddings and learn how to create embeddings using a continuous BoW model. We will then move on to discuss n-grams and how they can be used within models. We will also cover various ways in which tagging, chunking, and tokenization can be used to split up text into its constituent parts. Finally, ...


Publisher Resources

ISBN: 9781789802740