9 Transfer learning with pretrained language models
This chapter covers
- Using transfer learning to leverage knowledge from unlabeled textual data
- Using self-supervised learning to pretrain large language models such as BERT
- Building a sentiment analyzer with BERT and the Hugging Face Transformers library (see the sketch after this list)
- Building a natural language inference model with BERT and AllenNLP
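Before diving in, here is a taste of the kind of code this chapter builds toward: a minimal sketch of a BERT-style sentiment analyzer and of BERT's masked language modeling objective, both via the Hugging Face Transformers `pipeline` API. The specific checkpoints named below are assumptions for illustration; the chapter's own walkthrough uses a more explicit setup.

```python
from transformers import pipeline

# Sentiment analysis on top of a pretrained transformer.
# The checkpoint name is an assumption for illustration: a DistilBERT
# model fine-tuned on the SST-2 sentiment dataset.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("The movie was a delight from start to finish."))
# e.g., [{'label': 'POSITIVE', 'score': 0.99...}]

# The self-supervised objective behind BERT's pretraining is masked
# language modeling: predict a masked-out token from its context.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for prediction in fill_mask("The capital of France is [MASK]."):
    print(prediction["token_str"], prediction["score"])
```

Notice that neither task required labeled data from us: the sentiment model was fine-tuned by someone else on top of a pretrained checkpoint, and the fill-mask demo shows the label-free pretraining task itself. That reuse of pretrained knowledge is exactly the transfer learning this chapter is about.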
The year 2018 is often called “an inflection point” in the history of NLP. Sebastian Ruder, a prominent NLP researcher, dubbed this change “NLP’s ImageNet moment” (https://ruder.io/nlp-imagenet/), borrowing the name of the popular computer vision dataset and the powerful models pretrained on it to point out that a similar shift was underway in the NLP community. Powerful ...