Chapter 9. Streaming and Real-Time Features
If you want to implement a scalable real-time ML system that has a feature freshness of just a few seconds, you need streaming feature pipelines. A streaming feature pipeline is a stream-processing program that runs 24/7, consuming events from a streaming data source, potentially enriching those events from other data sources, applying data transformations to create features, and writing the output feature data to a feature store.
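To make the shape of such a pipeline concrete, here is a minimal sketch, assuming PySpark Structured Streaming as the stream processor. The broker address, the `cc_transactions` topic name, and the event schema are illustrative placeholders rather than details from this chapter, and the console sink stands in for the write to a feature store:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("txn_feature_pipeline").getOrCreate()

# Schema of the incoming (hypothetical) credit card transaction events
schema = StructType([
    StructField("cc_num", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Consume events 24/7 from a Kafka topic (broker and topic names are placeholders)
txns = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "cc_transactions")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("t"))
    .select("t.*")
)

# Apply a simple per-event transformation to create a feature
features = txns.withColumn("amount_over_1000", (col("amount") > 1000).cast("int"))

# Write the feature data out; in practice this sink would be the feature store's
# streaming ingestion API, shown here as console output for illustration
query = features.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```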
Operationally, streaming pipelines have more in common with microservices than with batch pipelines. If a streaming pipeline breaks, it often needs to be fixed immediately; you don’t have until the next scheduled batch run to fix it.

Stream-processing programs partition the unbounded stream of events into windows: time-bound groups of related events that are processed together. For example, a streaming pipeline could create a window that groups credit card transactions by credit card number for the last hour and compute features over those events, such as the number of card transactions in the last hour for each card. In such a case, you need to decide what to do with late-arriving data after its processing window has closed. For example, what should you do with a credit card transaction that arrives two hours late?

Despite these challenges, streaming feature pipelines are increasingly being used to build real-time ML systems. They are also becoming more accessible ...
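The windowed credit card example above can be sketched as follows, again assuming PySpark Structured Streaming with the same placeholder topic and schema as the earlier sketch. The watermark duration is an assumed value chosen to illustrate how late-arriving events are handled:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, from_json, window
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("txn_window_features").getOrCreate()

# Same illustrative transaction schema and Kafka source as in the earlier sketch
schema = StructType([
    StructField("cc_num", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

txns = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "cc_transactions")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("t"))
    .select("t.*")
)

# Group transactions per card into 1-hour windows. The watermark tells the
# engine to wait at most 15 minutes for late events before finalizing a
# window; events arriving later than that (e.g., two hours late) are dropped.
txn_counts = (
    txns.withWatermark("event_time", "15 minutes")
    .groupBy(window(col("event_time"), "1 hour"), col("cc_num"))
    .agg(count("*").alias("num_txns_last_hour"))
)

# Emit updated window counts downstream (a feature store in practice)
query = txn_counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```

The watermark is the knob that trades feature completeness against freshness: a longer watermark tolerates later events but delays when a window's feature values are finalized.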