Sentence-BERT is one of the most interesting variants of BERT and is widely used for computing sentence representations. We will begin the chapter by understanding how Sentence-BERT works in detail. We will explore how Sentence-BERT computes sentence representations using Siamese and triplet network architectures. Next, we will look at the sentence-transformers library and learn how to use a pre-trained Sentence-BERT model to compute sentence representations with it.
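To give a feel for what is coming, the following is a minimal sketch of computing sentence representations with the sentence-transformers library. It assumes the library is installed and uses the pre-trained 'bert-base-nli-mean-tokens' model as an illustrative choice; the example sentences are placeholders.

```python
# A minimal sketch, assuming sentence-transformers is installed
# (pip install sentence-transformers).
from sentence_transformers import SentenceTransformer

# Load a pre-trained Sentence-BERT model (illustrative model name)
model = SentenceTransformer('bert-base-nli-mean-tokens')

sentences = ['Paris is a beautiful city', 'I love visiting Paris']

# encode() returns one fixed-size embedding vector per sentence
sentence_embeddings = model.encode(sentences)

for sentence, embedding in zip(sentences, sentence_embeddings):
    print(sentence, embedding.shape)
```

We will cover this workflow, and how the underlying model is trained, step by step in this chapter.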
Moving on, we will learn in detail how to make a monolingual model multilingual using knowledge distillation. Next, we will learn about several interesting domain-specific BERT ...