2
A Hands-On Introduction to the Subject
So far, we have taken an overall look at the evolution of natural language processing (NLP) using deep learning (DL)-based methods, and we have covered the basics of transformers and their architecture. In this chapter, we will take a deeper look at how a transformer model can be used. Tokenizers and models, such as Bidirectional Encoder Representations from Transformers (BERT), will be described in more technical detail with hands-on examples, including how to load a tokenizer/model and how to use community-provided pretrained models. Before using any specific model, however, we will walk through the installation steps required to set up the necessary environment using Anaconda. ...
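As a preview of the hands-on material, the following sketch shows one common way to load a community-provided pretrained model and its tokenizer with the Hugging Face `transformers` library. The model name `bert-base-uncased` and the example sentence are illustrative choices, not prescribed by the chapter:

```python
# A minimal sketch: load a pretrained BERT tokenizer and model from the
# Hugging Face Hub, then encode a sentence and run a forward pass.
# "bert-base-uncased" is an assumed example checkpoint.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and return PyTorch tensors.
inputs = tokenizer("Transformers are powerful.", return_tensors="pt")

# The model returns contextual embeddings of shape
# (batch size, sequence length, hidden size).
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```

The `Auto*` classes infer the correct tokenizer and model architecture from the checkpoint's configuration, so the same two calls work for many community models on the Hub.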