About This Book
This book is meant to give you the knowledge you need to use large time-series models effectively and adapt them to your own use cases. We begin by exploring the transformer architecture, which still powers most foundation forecasting models. Then we build a tiny foundation model of our own to experiment with concepts such as pretraining, fine-tuning, and transfer learning. This hands-on experience is a great way to appreciate the challenges of building a truly foundational model for forecasting.
Next, we explore foundation models built specifically for time-series forecasting, from TimeGPT to TimesFM. Then we experiment with LLMs applied to forecasting. For each method, we examine its inner workings and pretraining procedures, ...