Word embedding

Computers need to be taught to deal with context. Take, for example, the sentence "I like eating apple." The computer needs to understand that here, apple refers to a fruit and not a company. We want words with the same meaning to have the same, or at least a similar, representation, so that machines can recognize that the words mean the same thing. The main objective of word embedding is to capture as much contextual, hierarchical, and morphological information about the word as possible.

Word embedding can be categorized in two ways:

  • Frequency-based embedding
  • Prediction-based embedding

From the name, it is clear that frequency-based embedding uses a counting mechanism (such as word counts or co-occurrence counts), whereas prediction-based embedding uses a predictive model, typically a shallow neural network, to learn word representations.
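To make the distinction concrete, here is a minimal sketch of the frequency-based idea: each word is represented by the counts of the words that appear near it within a sliding window over a toy corpus. The class name `CooccurrenceDemo`, the window size, and the example sentences are illustrative assumptions, not code from the book.

```java
import java.util.*;

public class CooccurrenceDemo {

    // Build a co-occurrence count vector for each word: for every token,
    // count the other tokens that appear within `window` positions of it.
    public static Map<String, Map<String, Integer>> counts(List<String[]> sentences, int window) {
        Map<String, Map<String, Integer>> matrix = new HashMap<>();
        for (String[] tokens : sentences) {
            for (int i = 0; i < tokens.length; i++) {
                int lo = Math.max(0, i - window);
                int hi = Math.min(tokens.length - 1, i + window);
                for (int j = lo; j <= hi; j++) {
                    if (j == i) continue; // skip the word itself
                    matrix.computeIfAbsent(tokens[i], k -> new HashMap<>())
                          .merge(tokens[j], 1, Integer::sum);
                }
            }
        }
        return matrix;
    }

    public static void main(String[] args) {
        // Toy corpus (hypothetical): two short sentences sharing context words.
        List<String[]> corpus = Arrays.asList(
            "i like eating apple".split(" "),
            "i like eating banana".split(" "));
        // "apple" and "banana" end up with similar count vectors because
        // they occur in the same contexts ("like", "eating").
        System.out.println(counts(corpus, 2).get("apple"));
        System.out.println(counts(corpus, 2).get("banana"));
    }
}
```

Because "apple" and "banana" share the neighbors "like" and "eating", their count vectors are similar, which is exactly the property a frequency-based embedding exploits; prediction-based methods reach a similar outcome by training a model to predict a word from its context instead of counting.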
