Micro-batch stream processing

There are typically two ways that incoming messages can be processed. The most common way is to process each message one at a time, and this is the approach we have discussed throughout this chapter. One-message-at-a-time stream processing offers very low latency and is simple to reason about, because you deal with only a single message at any given moment. The downside of one-at-a-time processing is that a great deal of time may be wasted by the underlying processors and streaming systems on non-business tasks, such as fetching each individual message from the kernel's page cache, loading it into memory, transferring it across the network, and passing it down through the underlying system. All of this can significantly increase the load ...
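The trade-off is easier to see in code. Below is a minimal sketch, not taken from any particular streaming framework, that contrasts handling each message individually with buffering messages into micro-batches flushed by size or by time. All class and method names here are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;

public class MicroBatchSketch {

    // One-at-a-time: every message pays the full per-message overhead
    // (fetch, deserialize, network hop, downstream hand-off).
    static void processOneAtATime(String message) {
        System.out.println("processed: " + message);
    }

    // Micro-batching: accumulate messages and hand the whole batch downstream,
    // amortizing the fixed per-call overhead across many messages.
    static class MicroBatcher {
        private final List<String> buffer = new ArrayList<>();
        private final int maxBatchSize;
        private final long maxWaitMillis;
        private long batchStartedAt = System.currentTimeMillis();

        MicroBatcher(int maxBatchSize, long maxWaitMillis) {
            this.maxBatchSize = maxBatchSize;
            this.maxWaitMillis = maxWaitMillis;
        }

        void add(String message) {
            buffer.add(message);
            boolean sizeReached = buffer.size() >= maxBatchSize;
            boolean timeReached =
                    System.currentTimeMillis() - batchStartedAt >= maxWaitMillis;
            if (sizeReached || timeReached) {
                flush();
            }
        }

        void flush() {
            if (buffer.isEmpty()) {
                return;
            }
            // A single downstream call for the whole batch instead of one per message.
            System.out.println("processed batch of " + buffer.size() + " messages");
            buffer.clear();
            batchStartedAt = System.currentTimeMillis();
        }
    }

    public static void main(String[] args) {
        MicroBatcher batcher = new MicroBatcher(100, 500);
        for (int i = 0; i < 1_000; i++) {
            batcher.add("msg-" + i);
        }
        batcher.flush(); // drain whatever remains at the end
    }
}
```

The size and time thresholds capture the essential tension of micro-batching: larger batches amortize more of the fixed per-message cost, while the time bound puts an upper limit on how long any single message waits, which is what keeps latency from growing without bound.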
