7

Text Representation

So far, we have addressed classification and generation problems with the transformers library. Text representation is another crucial task in modern natural language processing (NLP), especially for unsupervised settings such as clustering, semantic search, and topic modeling. This chapter explains how to represent sentences with models such as the Universal Sentence Encoder (USE) and Sentence-BERT (SBERT), together with complementary frameworks such as SentenceTransformers. Zero-shot learning with BART will also be explained, and you will learn how to apply it. Few-shot learning methodologies and unsupervised use cases such as semantic text clustering and topic modeling will also be described. Finally, one-shot learning will be discussed.
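As a quick preview of sentence representation, here is a minimal sketch using the SentenceTransformers library; the checkpoint name and the example sentences are illustrative assumptions, not fixed choices from this chapter:

```python
# A minimal sketch of producing sentence embeddings with
# SentenceTransformers; the checkpoint below is an assumption,
# and any SentenceTransformer model would work.
from sentence_transformers import SentenceTransformer, util

sentences = [
    "The new movie is awesome",
    "The latest film was great",
    "I burned my dinner again",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(sentences)

# Semantically similar sentences receive a high cosine similarity.
print(util.cos_sim(embeddings[0], embeddings[1]))  # high
print(util.cos_sim(embeddings[0], embeddings[2]))  # low
```

Once sentences live in a shared vector space like this, tasks such as semantic search reduce to nearest-neighbor lookups over the embeddings.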
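Zero-shot classification with BART can likewise be sketched in a few lines via the transformers pipeline. The candidate labels here are illustrative; the technique classifies text against labels never seen during training:

```python
# A minimal sketch of zero-shot classification using a BART model
# fine-tuned on MNLI, exposed through the transformers pipeline.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="facebook/bart-large-mnli",
)

result = classifier(
    "One day I will see the world",
    candidate_labels=["travel", "cooking", "dancing"],
)
print(result["labels"][0])  # the most probable label, e.g. "travel"
```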
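For the unsupervised use cases, a rough sketch of semantic text clustering follows, assuming sentence embeddings as input and k-means from scikit-learn as the clustering step; the sentences and cluster count are illustrative:

```python
# A rough sketch of semantic text clustering: embed sentences,
# then group the vectors with k-means. The cluster count is an
# assumption chosen to match the toy data below.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

sentences = [
    "The stock market rallied today",
    "Shares climbed after the earnings report",
    "The recipe calls for two eggs",
    "Whisk the flour and sugar together",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(sentences)

# Two clusters: finance-related vs. cooking-related sentences.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
print(kmeans.fit_predict(embeddings))  # e.g. [0 0 1 1]
```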
