Video description
Embeddings from transformer models such as BERT can be used as representations of sentences. Matteus Tanha works with these embeddings to match similar sentences or paragraphs by exploring different distance metrics.
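The core idea — scoring sentence pairs with different distance metrics over their embedding vectors — can be sketched as follows. This is a minimal illustration with toy low-dimensional vectors; in practice the embeddings would be, say, 768-dimensional outputs of a BERT-style encoder (for example via the sentence-transformers library, which is an assumption here, not something named in the video).

```python
import math

# Toy 4-dimensional "embeddings" standing in for real BERT sentence vectors.
emb_a = [0.9, 0.1, 0.3, 0.0]   # "The cat sat on the mat."
emb_b = [0.8, 0.2, 0.4, 0.1]   # "A cat is sitting on a rug."  (paraphrase of a)
emb_c = [0.0, 0.9, 0.1, 0.8]   # "Quarterly revenue grew 4%."  (unrelated)

def cosine_similarity(u, v):
    """Angle-based metric: 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def euclidean_distance(u, v):
    """Magnitude-sensitive metric: 0.0 means identical vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# The paraphrase pair (a, b) scores closer than the unrelated pair (a, c)
# under both metrics.
print(cosine_similarity(emb_a, emb_b), cosine_similarity(emb_a, emb_c))
print(euclidean_distance(emb_a, emb_b), euclidean_distance(emb_a, emb_c))
```

Cosine similarity ignores vector magnitude and is the more common choice for sentence embeddings, while Euclidean distance is sensitive to it — comparing how the two rank the same pairs is one way to explore which metric suits a given embedding model.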
Product information
- Title: Working with Transformer-Based Embeddings for Text Similarity Matching
- Author(s): Matteus Tanha
- Release date: May 2021
- Publisher(s): Manning Publications
- ISBN: 10000MNHV202217