Chapter 6: Hyperparameter Optimization in Flair
Grasping the concept of sequence tagging and gaining a basic understanding of how it works is generally not a problem. What is less straightforward is understanding all the parameters that govern model training and choosing the values that yield the desired results. A technique called hyperparameter optimization (also known as hyperparameter tuning) helps us achieve exactly that.
We will start by providing a general overview of what hyperparameter tuning is, why it is useful, and what optimization methods are available. We'll then dive into how to perform tuning in Python with the Hyperopt library. We will conclude the chapter with a hands-on exercise where we will find the optimal hyperparameters ...