Chapter 11 Statistical Approach for Data Tracing
11.0 Introduction
Centralized data quality assessment can perform reasonability, boundary, and validity checks, but centralized accuracy checks are difficult to perform. Tracing data back to the source system and validating it against the source data is the best way to perform accuracy checks. However, given the vast quantity and complexity of data passing through the organization, it is economically infeasible to carry out full, 100 percent accuracy checks for ongoing operational processes. Therefore, there is a need for a statistical approach to tracing operations that includes sampling schemes and statistical process control (SPC) to prioritize the top critical data elements (CDEs), trace them back to the source system, and take proactive measures to monitor and control them. A good data tracing approach also supports data lineage activities. Data lineage is about understanding where data resides and how it flows and transforms across the corporate network. In this chapter, we describe the tracing methodology, its important aspects, and how it can be linked to data lineage. As we will see, the tracing methodology is quite useful in the Assess and Improve phases of the DAIC approach.
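The sampling idea can be illustrated with a short sketch. The code below is a minimal illustration under stated assumptions, not the methodology prescribed in this book: it draws a simple random sample of records for a single CDE and compares each sampled value against the authoritative source value to estimate an accuracy rate. The record structure, the source_lookup mapping, and the sample size in the usage comment are hypothetical.

```python
import random

def estimate_cde_accuracy(target_records, source_lookup, sample_size, seed=2024):
    """Estimate the accuracy of one critical data element (CDE) by sampling.

    target_records : list of (record_id, cde_value) pairs from the downstream system.
    source_lookup  : dict mapping record_id to the authoritative source value.
    Returns the fraction of sampled records whose value matches the source.
    """
    rng = random.Random(seed)                  # fixed seed for a repeatable audit sample
    n = min(sample_size, len(target_records))
    sample = rng.sample(target_records, n)     # simple random sample without replacement
    matches = sum(1 for record_id, value in sample
                  if source_lookup.get(record_id) == value)
    return matches / n

# Hypothetical usage: estimate accuracy from a 200-record sample
# instead of a full, 100 percent check.
# accuracy = estimate_cde_accuracy(customer_records, source_values, sample_size=200)
```

Sampling in this way trades a small, quantifiable margin of error for a check that remains affordable as data volumes grow.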
11.1 Data Tracing Methodology
A data tracing operation can be defined as an end-to-end activity to perform data quality accuracy checks for CDEs, prioritize CDEs, trace prioritized CDEs to source systems, and proactively monitor and control ...
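For the monitor and control step, SPC charts are commonly applied to the sampled accuracy results over time. The sketch below is an illustrative example rather than the book's procedure: it computes the p-chart center line and three-sigma control limits for the proportion of sampled records failing the accuracy check in each period. The defect counts and sample sizes in the usage comment are hypothetical.

```python
import math

def p_chart_limits(defect_counts, sample_sizes):
    """Compute the p-chart center line and 3-sigma limits for the proportion
    of sampled records that fail the accuracy check in each period.

    defect_counts : mismatched records found in each period's sample.
    sample_sizes  : number of records sampled in each period.
    Returns (p_bar, [(lcl, ucl), ...]) with one limit pair per period.
    """
    p_bar = sum(defect_counts) / sum(sample_sizes)    # overall defect proportion
    limits = []
    for n in sample_sizes:
        sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)  # binomial standard error
        limits.append((max(0.0, p_bar - 3 * sigma),   # limits clipped to [0, 1]
                       min(1.0, p_bar + 3 * sigma)))
    return p_bar, limits

# Hypothetical usage: flag periods whose defect proportion breaches the limits.
# counts, sizes = [4, 6, 3, 12], [200, 200, 200, 200]
# p_bar, limits = p_chart_limits(counts, sizes)
# alerts = [i for i, (d, n, (lcl, ucl)) in enumerate(zip(counts, sizes, limits))
#           if not (lcl <= d / n <= ucl)]
```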