6
Transformers
Transformer-based architectures have become nearly universal in Natural Language Processing (NLP), and increasingly beyond it, for solving a wide variety of tasks, such as:
- Neural machine translation
- Text summarization
- Text generation
- Named entity recognition
- Question answering
- Text classification
- Text similarity
- Offensive message/profanity detection
- Query understanding
- Language modeling
- Next-sentence prediction
- Reading comprehension
- Sentiment analysis
- Paraphrasing
and many more.
In less than four years after the Attention Is All You Need paper was published by Google Research in 2017, transformers took the NLP community by storm, breaking records that had stood for the previous thirty years.
Transformer-based models ...