Chapter 7. Mission Criticality

When it comes to streaming integration – or simply any type of real-time data processing – it’s fairly straightforward to build the integrations we’ve been talking about in a lab scenario. You can try it, piece it together with bits of open source, test it, and generally get things working as a viable proof of concept (POC).

But when it comes time to put these pipelines into production, that’s another matter altogether. When you’re building continuous data pipelines that need to run around the clock and support the applications the business depends on, many questions arise:

  • How do I make sure that it works 24/7?

  • How do I ensure that it can handle failure?

  • How can I be sure that it will scale to the levels I need, given the expected growth of data within my organization?

  • How do I make sure that the data remains secure as it moves around the enterprise at high speeds?

This is all about mission criticality. And it can’t be an afterthought. You need to design your streaming integration infrastructure with mission criticality in mind. This is one of the biggest issues to consider when investigating streaming integration platforms.

Clustering

We’ve talked about the benefits of building a distributed platform for scaling streaming integration. Clustering is one of the most popular methods for building a distributed environment. But there are several ways to build a scalable and reliable cluster. Your choice of stream-processing ...
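
To make the clustering idea concrete, here is a minimal sketch, assuming Apache Kafka Streams as the stream-processing engine (the book does not prescribe one) and hypothetical topic names. Every instance launched with the same application.id joins the same processing group, and the work is distributed across those instances:

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class ClusteredPipeline {
        public static void main(String[] args) {
            Properties props = new Properties();
            // All instances that share this application.id form one processing
            // group; Kafka rebalances topic partitions across them, which is
            // how the pipeline scales out and tolerates node failures.
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "orders-pipeline");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092,broker2:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // "orders" and "orders-normalized" are hypothetical topic names
            // used only for illustration.
            KStream<String, String> orders = builder.stream("orders");
            // A trivial transformation; the point is that the same code runs
            // unchanged on one node or on ten.
            orders.mapValues(value -> value.toUpperCase())
                  .to("orders-normalized");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }

Starting a second copy of this process on another machine, with the same application.id, adds it to the cluster: partitions are rebalanced across the instances, and if one instance fails, its share of the work is reassigned to the survivors. Other stream-processing engines and in-memory data grids achieve the same effect through their own clustering mechanisms.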
