© The Author(s), under exclusive license to APress Media, LLC, part of Springer Nature 2022
S. M. Jain, Introduction to Transformers for NLP
https://doi.org/10.1007/978-1-4842-8844-3_6

6. Fine-Tuning Pretrained Models

Shashank Mohan Jain1  
(1)
Bangalore, India
 

So far, we have seen how to use Hugging Face APIs with pretrained models to build simple applications. Wouldn’t it be amazing if you could train your own model using only your own data?

Transfer learning is the most effective strategy if you do not have a large amount of spare time or computing resources at your disposal. There are two main advantages of using transfer learning with Hugging Face as opposed to training from scratch ...
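The core of this transfer-learning pattern is to load a pretrained encoder, attach a fresh task-specific head, and train only what you need. Below is a minimal sketch of that pattern using the Hugging Face transformers library; note that for brevity it builds a tiny randomly initialized DistilBERT from a config (the config sizes are arbitrary illustrative values), whereas in real fine-tuning you would call `DistilBertForSequenceClassification.from_pretrained("distilbert-base-uncased")` to download pretrained weights:

```python
from transformers import DistilBertConfig, DistilBertForSequenceClassification

# Tiny illustrative config; a real run would use from_pretrained(...) instead,
# which loads the full pretrained encoder weights from the Hugging Face Hub.
config = DistilBertConfig(dim=64, hidden_dim=128, n_layers=2, n_heads=2,
                          num_labels=2)
model = DistilBertForSequenceClassification(config)

# Freeze the base encoder so only the new classification head is updated
# during fine-tuning -- this is what makes transfer learning cheap.
for param in model.distilbert.parameters():
    param.requires_grad = False

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable parameters: {trainable} of {total}")
```

Freezing the encoder is optional; unfreezing it and training end to end with a small learning rate usually gives better accuracy at higher compute cost.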
