Hands-On Natural Language Processing with Python by Rajalingappaa Shanmugamani, Rajesh Arumugam

Summary

In this chapter, we discussed Word2vec and its variants, and walked through the code for a skip-gram model that learns word relationships. We used TensorBoard to visualize the resulting word embeddings and saw how its various projections can provide useful views of the embedding space. We then discussed a logical extension of Word2vec that produces a document representation, which we improved by leveraging tf-idf weights. Finally, we covered doc2vec and its variants for building document-level vector representations, and used TensorBoard to see how document embeddings can reveal the topics present in a collection of documents.
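The tf-idf weighted document representation mentioned above can be sketched as follows. This is a minimal illustration, not the book's exact code: the toy corpus is hypothetical, randomly initialized vectors stand in for trained skip-gram embeddings, and the helper name `doc_vector` is our own.

```python
import math
from collections import Counter

import numpy as np

# Hypothetical toy corpus; a real pipeline would use embeddings
# trained by the skip-gram model from this chapter.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "ate", "my", "homework"],
]

# Random 8-dimensional vectors stand in for learned word embeddings.
rng = np.random.default_rng(42)
vocab = sorted({w for doc in corpus for w in doc})
embeddings = {w: rng.normal(size=8) for w in vocab}

# Inverse document frequency over the toy corpus; words that occur in
# every document (here, "the") get an idf of 0 and thus zero weight.
n_docs = len(corpus)
idf = {
    w: math.log(n_docs / sum(1 for doc in corpus if w in doc))
    for w in vocab
}

def doc_vector(doc):
    """tf-idf weighted average of the word vectors in `doc`."""
    tf = Counter(doc)
    weights = {w: (tf[w] / len(doc)) * idf[w] for w in tf}
    total = sum(weights.values())
    if total == 0:
        # Degenerate case: every word appears in every document,
        # so fall back to a plain (unweighted) average.
        return np.mean([embeddings[w] for w in doc], axis=0)
    return sum(weights[w] * embeddings[w] for w in weights) / total

vec = doc_vector(corpus[0])
print(vec.shape)  # (8,)
```

Compared with a plain average of word vectors, the tf-idf weighting downplays words that appear in most documents and emphasizes words that are distinctive to a document, which tends to produce document vectors that separate better by topic.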

In the next chapter, we will look at using deep neural networks for text classification. We will look ...
