Deep Learning for Natural Language Processing

by Stephan Raaijmakers
November 2022
Content level: Beginner to intermediate
296 pages
8h 27m
English
Manning Publications
Content preview from Deep Learning for Natural Language Processing

3 Text embeddings

This chapter covers

  • Preparing texts for deep learning using word and document embeddings
  • Using self-developed vs. pretrained embeddings
  • Implementing word similarity with Word2Vec (see the sketch after this list)
  • Retrieving documents using Doc2Vec

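The last two items, word similarity with Word2Vec and document retrieval with Doc2Vec, can be previewed in a few lines of code. The sketch below is a minimal illustration using the gensim library and a three-sentence toy corpus; it is an assumption made for this preview, not the book’s own listing, and the chapter’s code may be organized differently.

from gensim.models import Word2Vec
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

# Toy corpus: each document is a list of tokens (illustrative data only).
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "slept", "on", "the", "rug"],
    ["stocks", "fell", "sharply", "on", "monday"],
]

# Word similarity with Word2Vec: train a small model, then ask which
# words have vectors closest to the vector for "cat".
w2v = Word2Vec(sentences, vector_size=50, window=2, min_count=1, epochs=100)
print(w2v.wv.most_similar("cat", topn=3))

# Document retrieval with Doc2Vec: embed whole documents, infer a vector
# for an unseen query, and look up the nearest stored document.
tagged = [TaggedDocument(words=tokens, tags=[i]) for i, tokens in enumerate(sentences)]
d2v = Doc2Vec(tagged, vector_size=50, min_count=1, epochs=100)
query_vector = d2v.infer_vector(["a", "cat", "on", "a", "mat"])
print(d2v.dv.most_similar([query_vector], topn=1))

On a corpus this small the similarities are noisy; the point is only the shape of the workflow: tokenized text in, vectors out, and nearest neighbors retrieved by comparing those vectors.
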
After reading this chapter, you will have a practical command of basic and popular text embedding algorithms, and insight into how to use embeddings for NLP. We will go through a number of concrete scenarios to reach that goal. But first, let’s review the basics of embeddings.

3.1 Embeddings

Embeddings are procedures for converting input data into vector representations. As mentioned in chapter 1, a vector is like a container of numbers (such as an array). Every vector lives in a multidimensional ...
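To make the idea concrete, here is a small hand-made sketch in which three words are mapped to four-dimensional vectors and compared with cosine similarity. The numbers are invented toy values, not learned embeddings; real embedding vectors typically have hundreds of dimensions and are learned from data.

import numpy as np

# Toy embedding table: each word maps to a 4-dimensional vector.
# The values here are made up purely for illustration.
embeddings = {
    "cat": np.array([0.21, -0.53, 0.77, 0.05]),
    "dog": np.array([0.19, -0.48, 0.81, 0.02]),
    "car": np.array([-0.62, 0.33, -0.11, 0.90]),
}

def cosine(u, v):
    # Cosine similarity: near 1.0 for vectors pointing in the same direction.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Words with related meanings should end up close together in vector space.
print(cosine(embeddings["cat"], embeddings["dog"]))  # high (close to 1.0 here)
print(cosine(embeddings["cat"], embeddings["car"]))  # much lower (negative here)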

Publisher Resources

ISBN: 9781617295447