Understanding probabilistic language modeling

A language model (LM) has two basic goals:

  • An LM assigns a probability to a sentence or, more generally, to any sequence of words
  • An LM estimates the probability of the upcoming word: given the preceding word sequence, it tells us which word is most likely to come next (see the sketch after this list)

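To make these two goals concrete, here is a minimal sketch using a tiny toy corpus and an unsmoothed bigram count model; the corpus, function names, and estimates are illustrative assumptions, not taken from the book:

```python
from collections import Counter

# Toy corpus; a real LM is estimated from a much larger text collection.
corpus = [
    ["i", "like", "natural", "language", "processing"],
    ["i", "like", "deep", "learning"],
    ["i", "enjoy", "natural", "language", "processing"],
]

# Count unigrams and bigrams.
unigrams = Counter(w for sent in corpus for w in sent)
bigrams = Counter(
    (sent[i], sent[i + 1]) for sent in corpus for i in range(len(sent) - 1)
)

def bigram_prob(prev, word):
    """Estimate P(word | prev) from raw counts (no smoothing)."""
    if unigrams[prev] == 0:
        return 0.0
    return bigrams[(prev, word)] / unigrams[prev]

def sentence_prob(sentence):
    """Goal 1: assign a probability to a whole word sequence."""
    prob = unigrams[sentence[0]] / sum(unigrams.values())  # P(w1)
    for prev, word in zip(sentence, sentence[1:]):
        prob *= bigram_prob(prev, word)                     # P(w_i | w_{i-1})
    return prob

def next_word(prev):
    """Goal 2: predict the most likely upcoming word given the previous word."""
    candidates = {w: bigram_prob(prev, w) for w in unigrams}
    return max(candidates, key=candidates.get)

print(sentence_prob(["i", "like", "natural", "language", "processing"]))
print(next_word("like"))  # 'natural' or 'deep' under these toy counts
```

Here the sentence probability is built by multiplying conditional probabilities of each word given the previous one, which is exactly the chain-rule idea developed next.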
Any model that can compute either of these quantities is called a language model. An LM relies on the chain rule of conditional probability, which is simply an extension of the basic definition of conditional probability. We have already seen the equation:

P(A|B) = P(A and B) / P(B)

P(A and B) = P(A,B) = P(A|B) P(B)

Here, P(A,B) is called the joint probability. Suppose you have multiple events that occur together, say A, B, C, and D. The chain rule extends the same idea to decompose their joint probability into a product of conditional probabilities:

P(A,B,C,D) = P(A) P(B|A) P(C|A,B) P(D|A,B,C)

Applied to a sentence, this means the probability of a word sequence w1, w2, ..., wn can be written as:

P(w1,w2,...,wn) = P(w1) P(w2|w1) P(w3|w1,w2) ... P(wn|w1,...,wn-1)
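As a quick check of this decomposition, the following sketch estimates joint and conditional probabilities from a small list of observed three-event sequences and confirms that the product of the chain-rule factors matches the directly estimated joint probability; the event names and observations are made up purely for illustration:

```python
from collections import Counter

# Made-up observations of three events occurring in sequence.
sequences = [
    ("rain", "traffic", "late"),
    ("rain", "traffic", "on_time"),
    ("rain", "clear_roads", "on_time"),
    ("sun", "clear_roads", "on_time"),
]

n = len(sequences)
count_a = Counter(s[0] for s in sequences)     # counts of A
count_ab = Counter(s[:2] for s in sequences)   # counts of (A, B)
count_abc = Counter(sequences)                 # counts of (A, B, C)

a, b, c = "rain", "traffic", "late"

# Direct estimate of the joint probability P(A,B,C).
joint = count_abc[(a, b, c)] / n

# Chain-rule factors: P(A) * P(B|A) * P(C|A,B).
p_a = count_a[a] / n
p_b_given_a = count_ab[(a, b)] / count_a[a]
p_c_given_ab = count_abc[(a, b, c)] / count_ab[(a, b)]

print(joint)                             # 0.25
print(p_a * p_b_given_a * p_c_given_ab)  # 0.25 -- identical, as the chain rule guarantees
```

The two printed values agree because the chain rule is an identity: multiplying the conditional factors simply reassembles the joint probability.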
