Video description
Embeddings from transformer models such as BERT can serve as vector representations of sentences. Matteus Tanha uses these embeddings to match similar sentences or paragraphs, exploring how different distance metrics affect the quality of the matches.
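The matching idea the description refers to can be sketched in a few lines: once each sentence has been turned into a fixed-length embedding vector (e.g., by a BERT-based model), candidates are ranked by a distance or similarity metric against a query vector. The toy 4-dimensional vectors and sentence labels below are illustrative stand-ins, not real model output; actual BERT embeddings typically have hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (||a|| * ||b||); measures direction, not magnitude.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def euclidean_distance(a, b):
    # Straight-line distance between the two embedding vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical low-dimensional "embeddings" standing in for real sentence vectors.
query = [0.9, 0.1, 0.3, 0.4]
candidates = {
    "sentence A": [0.8, 0.2, 0.3, 0.5],
    "sentence B": [0.1, 0.9, 0.7, 0.0],
}

# Rank candidates by cosine similarity to the query embedding.
best = max(candidates, key=lambda k: cosine_similarity(query, candidates[k]))
print(best)  # → sentence A (its vector points in nearly the same direction as the query)
```

Cosine similarity is a common default for text embeddings because it ignores vector magnitude; Euclidean distance is an alternative the video's exploration of metrics would cover, and the two can rank candidates differently.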
Product information
- Title: Working with Transformer-Based Embeddings for Text Similarity Matching
- Author(s):
- Release date: May 2021
- Publisher(s): Manning Publications
- ISBN: 10000MNHV202217
You might also like
book
Natural Language Processing in Action
Natural Language Processing in Action is your guide to creating machines that understand human language using …
book
Natural Language Processing with PyTorch
Natural Language Processing (NLP) provides boundless opportunities for solving problems in artificial intelligence, making products such …
book
Natural Language Processing with Spark NLP
If you want to build an enterprise-quality application that uses natural language text but aren’t sure …
video
Introduction to Transformer Models for NLP: Using BERT, GPT, and More to Solve Modern Natural Language Processing Tasks
11+ Hours of Video Instruction Learn how to apply state-of-the-art transformer-based LLMs, including BERT, ChatGPT, GPT-3, …