3. Emergent vs Downstream Tasks: The Unseen Depths of Transformers
Transformers reveal their full potential when we take pretrained models and watch them perform downstream Natural Language Understanding (NLU) tasks. Pretraining and fine-tuning a transformer model takes considerable time and effort, but the effort pays off when we see a multi-billion-parameter transformer model in action across a range of NLU tasks.
Advanced NLP models have achieved the long-sought goal of outperforming the human baseline, which represents human performance on NLU tasks. Humans learn transduction at an early age and quickly develop inductive thinking. We humans perceive the world directly with our senses. Machine intelligence relies entirely on our ...