Quick-thoughts for sentence embeddings

Quick-thoughts is another interesting algorithm for learning sentence embeddings. In skip-thoughts, we saw how an encoder-decoder architecture is used to learn sentence embeddings. In quick-thoughts, instead of reconstructing the surrounding sentences with a decoder, we try to learn whether a given sentence is related to a candidate sentence. So, in place of a decoder, we use a classifier that learns whether the given input sentence is related to the candidate sentence.

Let s be the input sentence and S_cand be the set of candidate sentences, containing both valid context sentences and invalid (non-context) sentences. The classifier is trained to pick out the valid context sentence from S_cand.
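To make this concrete, the following is a minimal sketch of the quick-thoughts objective in PyTorch. The use of PyTorch, the GRU encoders, the layer sizes, and the helper function quick_thoughts_loss are all assumptions for illustration, not the book's own code: two encoders, f and g, embed the input sentence and the candidate sentences, dot-product scores between the embeddings feed a softmax classifier, and the model is trained to identify the valid context sentence in the candidate set.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class QuickThoughtsEncoder(nn.Module):
    """GRU encoder mapping a padded batch of token ids to sentence vectors."""
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, token_ids):
        _, h_n = self.gru(self.embed(token_ids))
        return h_n.squeeze(0)            # (batch, hidden_dim) sentence embeddings

def quick_thoughts_loss(f, g, sentences, context_index):
    """Score each candidate against the input sentence and classify
    which candidate is the valid context sentence.

    sentences: (1 + num_candidates, seq_len) token ids -- row 0 is the
    input sentence, the remaining rows are the candidate set S_cand.
    context_index: index (within the candidates) of the valid context sentence.
    """
    input_vec = f(sentences[:1])             # (1, hidden_dim)
    candidate_vecs = g(sentences[1:])        # (num_candidates, hidden_dim)
    scores = candidate_vecs @ input_vec.T    # (num_candidates, 1) dot-product scores
    logits = scores.T                        # (1, num_candidates) for the softmax classifier
    target = torch.tensor([context_index])
    return F.cross_entropy(logits, target)   # softmax + negative log-likelihood

# Toy usage: vocabulary of 50 tokens, 1 input sentence plus 3 candidates of length 6
f, g = QuickThoughtsEncoder(50), QuickThoughtsEncoder(50)
batch = torch.randint(1, 50, (4, 6))
loss = quick_thoughts_loss(f, g, batch, context_index=0)
loss.backward()
```

In the original quick-thoughts setup, the outputs of the two encoders are concatenated at test time to form the final sentence embedding.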
