Chapter 9: spaCy and Transformers
In this chapter, you will learn about the latest hot topic in NLP, transformers, and how to use them with TensorFlow and spaCy.
First, you will learn about transformers and transfer learning. Second, you'll learn the architectural details of a widely used transformer architecture, Bidirectional Encoder Representations from Transformers (BERT). You'll also learn how the BERT tokenizer and the WordPiece algorithm work. Then you will learn how to quickly get started with the pre-trained transformer models of the HuggingFace library. Next, you'll practice how to fine-tune HuggingFace Transformers with TensorFlow and Keras. Finally, you'll learn how spaCy v3.0 integrates transformer models as pre-trained pipelines.
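To give a flavor of the WordPiece tokenization mentioned above, here is a minimal sketch of its greedy longest-match-first strategy: a word is split into the longest vocabulary pieces available, with continuation pieces marked by a `##` prefix. The tiny vocabulary below is invented for illustration; real BERT vocabularies contain roughly 30,000 entries.

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Greedy longest-match-first WordPiece tokenization (illustrative sketch)."""
    tokens = []
    start = 0
    while start < len(word):
        end = len(word)
        piece_found = None
        # Try the longest remaining substring first, shrinking until a match.
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # continuation pieces carry a ## prefix
            if piece in vocab:
                piece_found = piece
                break
            end -= 1
        if piece_found is None:
            return [unk]  # no piece matched: the whole word becomes [UNK]
        tokens.append(piece_found)
        start = end
    return tokens

# Toy vocabulary, made up for this example
vocab = {"play", "##ing", "##ed", "un", "##known"}
print(wordpiece_tokenize("playing", vocab))   # ['play', '##ing']
print(wordpiece_tokenize("unknown", vocab))   # ['un', '##known']
```

The real tokenizer adds details such as lowercasing, punctuation splitting, and a maximum word length, but the greedy subword matching shown here is the core idea.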