13
Summarization with T5 and ChatGPT
In the first seven chapters, we explored the architecture, training, fine-tuning, and usage of several transformer ecosystems. In Chapter 7, The Generative AI Revolution with ChatGPT, we discovered that OpenAI has begun experimenting with zero-shot models that require no fine-tuning or development and can be implemented in a few lines of code.
The underlying concept of this evolution is that transformers teach a machine to understand a language and express itself in a human-like manner. Thus, we have moved from training a model to teaching languages to machines.
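To make the "few lines of code" claim concrete, here is a minimal zero-shot summarization sketch using the OpenAI Python client; the model name, prompt wording, and sample text are illustrative assumptions, not the book's exact settings:

```python
# Minimal zero-shot summarization sketch with the OpenAI Python client (openai>=1.0).
# The model name and prompt wording are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

document = "Transformers learn language representations from large corpora ..."

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model; any chat-capable model works
    messages=[
        {"role": "system", "content": "Summarize the user's text in two sentences."},
        {"role": "user", "content": document},
    ],
)

print(response.choices[0].message.content)
```

No fine-tuning, no training loop: the model summarizes the text directly from the instruction in the prompt.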
ChatGPT, New Bing, Gemini, and other end-user software can summarize, so why bother with T5? Because Hugging Face T5 might be the ...
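For comparison, a T5 summarizer run locally through Hugging Face is also only a few lines. The sketch below is a minimal illustration assuming the `transformers` library and the public `t5-small` checkpoint; the model size and generation parameters are illustrative choices, not prescriptions:

```python
# Minimal T5 summarization sketch with Hugging Face transformers.
# t5-small and the generation parameters are illustrative choices.
from transformers import pipeline

summarizer = pipeline("summarization", model="t5-small")

document = "Transformers learn language representations from large corpora ..."

summary = summarizer(document, max_length=60, min_length=20, do_sample=False)
print(summary[0]["summary_text"])
```

Unlike a hosted chat service, this pipeline runs on your own hardware, can be fine-tuned on domain data, and gives you direct control over the model and its generation settings.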