Hands-On Natural Language Processing with Python by Rajalingappaa Shanmugamani, Rajesh Arumugam


The effects of different pretrained word embeddings

There are a number of pretrained word embeddings available for us to leverage. In effect, these are words and their corresponding n-dimensional word vectors, trained by different research groups on large corpora. Notable pretrained word vectors are GloVe, Word2vec, and fastText. In our work, we use the pretrained Word2vec word vectors, although any of the preceding word embeddings should be useful in building the NER system that we have discussed in this chapter. Note that the reading and processing of these pretrained word embedding models differs, since each is distributed in its own file format.
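As a minimal sketch of what "reading and processing" such a model looks like, the snippet below parses GloVe-style text lines (`word v1 v2 ...`) into a dictionary and builds an embedding lookup matrix for an NER vocabulary, leaving out-of-vocabulary rows at zero. The toy vectors, vocabulary, and 3-dimensional size are illustrative, not from the book; the binary Word2vec format used in this chapter would instead typically be loaded with a library such as gensim.

```python
import io
import numpy as np

def load_embeddings(lines):
    """Parse 'word v1 v2 ...' lines into a {word: vector} dict."""
    vectors = {}
    for line in lines:
        parts = line.rstrip().split(" ")
        vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

def build_matrix(vocab, vectors, dim):
    """Row i holds the vector for vocab[i]; unknown words stay zero."""
    matrix = np.zeros((len(vocab), dim), dtype=np.float32)
    for i, word in enumerate(vocab):
        if word in vectors:
            matrix[i] = vectors[word]
    return matrix

# Toy 3-dimensional vectors standing in for a real embedding file.
toy_file = io.StringIO("the 0.1 0.2 0.3\ncity 0.4 0.5 0.6\n")
vectors = load_embeddings(toy_file)
matrix = build_matrix(["the", "city", "unseen"], vectors, dim=3)
```

The resulting matrix can be handed to the model's embedding layer (for example, via `tf.nn.embedding_lookup` in TensorFlow), with the zero rows either left as-is or randomly initialized for unknown words.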

[Figure: TensorBoard graph]

The graph in the preceding diagram, from TensorBoard, ...
