Chapter 31
Request Batch
Combine multiple requests to optimally utilize the network.
Problem
When a large number of requests, each carrying only a small amount of data, is sent to cluster nodes, the network latency and the per-request processing time (including serialization and deserialization of the request on the server side) can add significant overhead.
For example, on a network with 1 Gbps capacity, if the latency plus request processing time is 100 microseconds, this per-request overhead significantly limits the overall throughput even when the client sends hundreds of requests at the same time and each request is just a few bytes.
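A rough back-of-the-envelope illustration (the one-at-a-time handling and the 100-byte payload are assumptions made only for this example): if each request costs 100 microseconds of latency and processing and requests are completed one after another, at most 1 s / 100 µs = 10,000 requests can finish per second. At roughly 100 bytes per request, that is only about 1 MB/s of useful data, a small fraction of the roughly 125 MB/s a 1 Gbps link can carry; the overhead per request, not the link capacity, becomes the bottleneck.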
Solution
Combine multiple requests together into a single request batch. The batch of requests will be sent to the cluster node for processing, with ...
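The following is a minimal sketch of this idea, not the book's implementation; the names RequestBatcher, Request, and sendBatch are hypothetical. A client-side batcher accumulates requests and sends them as one network call once either a batch-size limit or a small wait window is reached:

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical sketch: accumulate requests and send them as one batch
    // when either the batch size limit or the maximum wait time is reached.
    public class RequestBatcher {
        private final int maxBatchSize;
        private final long maxWaitMillis;
        private final List<Request> pending = new ArrayList<>();
        private long firstRequestTime;

        public RequestBatcher(int maxBatchSize, long maxWaitMillis) {
            this.maxBatchSize = maxBatchSize;
            this.maxWaitMillis = maxWaitMillis;
        }

        public synchronized void submit(Request request) {
            if (pending.isEmpty()) {
                firstRequestTime = System.currentTimeMillis();
            }
            pending.add(request);
            if (shouldFlush()) {
                flush();
            }
        }

        private boolean shouldFlush() {
            return pending.size() >= maxBatchSize
                    || System.currentTimeMillis() - firstRequestTime >= maxWaitMillis;
        }

        private void flush() {
            List<Request> batch = new ArrayList<>(pending);
            pending.clear();
            sendBatch(batch); // one network call carrying all accumulated requests
        }

        private void sendBatch(List<Request> batch) {
            // Serialize the whole batch and send it to the cluster node
            // as a single request. A real implementation would also flush
            // on a background timer so a lone request is not delayed forever,
            // and would route individual responses back to their callers.
        }

        // Placeholder for the application's own request type.
        public static class Request { }
    }

The batch-size limit keeps any single batch from growing unboundedly, while the wait window bounds the extra latency a request can incur while waiting for the batch to fill.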