Document classification with Word2vec
Although Word2vec offers an elegant way of learning numerical representations of words, as we saw quantitatively (the loss value) and qualitatively (t-SNE visualizations of the embeddings), learning word representations alone is not enough to demonstrate the power of word vectors in real-world applications. Word embeddings serve as the feature representation of words for many tasks, such as image caption generation and machine translation. However, those tasks involve combining different learning models (such as Convolutional Neural Networks (CNNs) with Long Short-Term Memory (LSTM) models, or two LSTM models), which will be discussed in later chapters. To understand a real-world usage of word embeddings, let's stick to a simpler ...
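As a sketch of the idea, one simple way to classify documents with word embeddings is to average the Word2vec vectors of a document's words and compare the result against per-class centroids. The tiny hand-made embedding table, the vocabulary, and the class labels below are all hypothetical placeholders standing in for vectors learned by Word2vec; this is a minimal illustration, not the book's implementation:

```python
import numpy as np

# Hypothetical 2-D embeddings standing in for learned Word2vec vectors.
embeddings = {
    "stock":  np.array([0.9, 0.1]),
    "market": np.array([0.8, 0.2]),
    "goal":   np.array([0.1, 0.9]),
    "match":  np.array([0.2, 0.8]),
}

def doc_vector(tokens):
    """Represent a document as the mean of its in-vocabulary word vectors."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    return np.mean(vecs, axis=0)

# Class centroids built from a few (assumed) labeled example documents.
centroids = {
    "finance": doc_vector(["stock", "market"]),
    "sports":  doc_vector(["goal", "match"]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def classify(tokens):
    """Assign the class whose centroid is most cosine-similar to the document."""
    d = doc_vector(tokens)
    return max(centroids, key=lambda c: cosine(d, centroids[c]))

print(classify(["market", "stock"]))  # finance
print(classify(["goal"]))             # sports
```

With real Word2vec embeddings (hundreds of dimensions, a large vocabulary), this same average-then-compare scheme already gives a usable baseline classifier, which is why document classification makes a good first end-to-end application of word vectors.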