
Deep Learning with PyTorch

by Vishnu Subramanian
February 2018
Intermediate to advanced
262 pages
6h 59m
English
Packt Publishing
Content preview from Deep Learning with PyTorch

Word embedding

Word embedding is a very popular way of representing text data in problems solved by deep learning algorithms. A word embedding provides a dense representation of a word, filled with floating-point numbers. The vector dimension varies according to the vocabulary size; it is common to use embedding dimensions of 50, 100, 256, 300, and sometimes 1,000. The dimension size is a hyperparameter that we need to tune during the training phase.
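
As a minimal sketch, assuming a 20,000-word vocabulary and a 300-dimensional embedding (both chosen here purely for illustration), PyTorch's nn.Embedding layer expresses this idea directly: a learnable lookup table that maps integer word indices to dense floating-point vectors:

import torch
import torch.nn as nn

# Illustrative sizes: 20,000-word vocabulary, 300-dimensional vectors.
vocab_size = 20000
embedding_dim = 300

# A learnable (vocab_size x embedding_dim) lookup table; its rows are
# tuned during training like any other model parameter.
embedding = nn.Embedding(vocab_size, embedding_dim)

# A batch of word indices (assumed to come from some tokenizer/vocabulary).
word_indices = torch.tensor([10, 256, 3])

# Each index is replaced by its dense floating-point vector.
vectors = embedding(word_indices)
print(vectors.shape)  # torch.Size([3, 300])

Because embedding_dim is a hyperparameter, changing it only resizes the lookup table; the rest of the model code stays the same.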

If we try to represent a vocabulary of size 20,000 with a one-hot representation, we end up with 20,000 x 20,000 numbers, most of which are zero. The same vocabulary can be represented as a word embedding of size 20,000 x dimension size, where the dimension size could ...
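
To make the size comparison concrete, here is a quick back-of-the-envelope sketch; the 300-dimensional size is only an assumed example value:

vocab_size = 20000

# One-hot: each word is a vocab_size-long vector with a single 1,
# so the full vocabulary costs vocab_size * vocab_size numbers.
one_hot_numbers = vocab_size * vocab_size       # 400,000,000, mostly zeros

# Embedding: each word is a dense vector of the chosen dimension.
embedding_dim = 300                             # assumed for illustration
embedding_numbers = vocab_size * embedding_dim  # 6,000,000

print(one_hot_numbers // embedding_numbers)     # 66 -> roughly 66x fewer numbers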



Publisher Resources

ISBN: 9781788624336