The first application I want to dive into is data integration. Let me start by explaining what I mean by data integration and why I think it’s important, then we’ll see how it relates to logs.
Data integration means making available all the data that an organization has to all the services and systems that need it.
The phrase “data integration” isn’t all that common, but I don’t know a better one. The more recognizable term ETL (extract, transform, and load) usually covers only a limited part of data integration—populating a relational data warehouse. However, much of what I am describing can be thought of as ETL that is generalized to also encompass real-time systems and processing flows.
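To make that generalization concrete, here is a minimal sketch, with entirely hypothetical names (`transform`, `batch_etl`, `streaming_etl`), in which the same transformation step serves both a one-shot batch ETL job and a continuous processing flow over an unbounded event stream:

```python
# Illustrative only: all function names are hypothetical, not from any
# real ETL tool or system described in the text.

def transform(record):
    """A shared transformation step: normalize a raw record."""
    return {"user": record["user"].lower(), "action": record["action"]}

def batch_etl(source_rows, warehouse):
    """Classic ETL: extract a finite batch, transform, load into a warehouse."""
    for row in source_rows:
        warehouse.append(transform(row))

def streaming_etl(event_stream, sink):
    """Generalized flavor: the same transform applied continuously to an
    unbounded sequence of events rather than a one-shot batch."""
    for event in event_stream:   # could be an infinite iterator
        sink(transform(event))

warehouse = []
batch_etl([{"user": "Alice", "action": "click"}], warehouse)

out = []
streaming_etl(iter([{"user": "Bob", "action": "view"}]), out.append)
```

The point of the sketch is only that the transformation logic is identical in both cases; what changes is whether the data arrives as a bounded batch or an ongoing flow.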
You don’t hear much about data integration in all the breathless interest and hype around the idea of big data; nevertheless, I believe that this mundane problem of making the data available is one of the more valuable goals that an organization can focus on.
Effective use of data follows a kind of Maslow’s hierarchy of needs. The base of the pyramid shown in Figure 2-1 involves capturing all the relevant data and being able to put it together in an applicable processing environment (whether a fancy real-time query system or just text files and Python scripts). This data needs to be modeled in a uniform way to make it easy to read and process. Once the basic needs of capturing data in a uniform way are taken care of, it is reasonable to work on infrastructure to ...