12.2. Data Flow Optimization
Integration Services is a platform, meaning that its purpose is to support applications built on its features. The data flow, for example, has no value in and of itself; its value is realized when it is applied to a data-processing purpose (such as warehouse ETL, system integration processes, or DBA data management operations). For data flow optimization, this means there are no guaranteed formulas that yield performance gains, because every data flow design presents a unique scenario. This section covers principles and common problem areas that you can evaluate against your own data flows to determine applicability.
12.2.1. Pipeline Architecture Review
The SSIS data flow engine uses data buffers to manage data as it flows through the pipeline. As data is extracted from the sources, it is read into reserved memory spaces, called buffers. These buffers are acted upon by the various transformations as the data moves through the pipeline to the destination. A complete review of the SSIS pipeline engine is found in Chapter 10 of the Professional SQL Server 2005 Integration Services book (Indianapolis: Wiley, 2006). Here are key points to remember:
When data is loaded into a buffer, the transformation logic is applied to the data in place (where possible). In a sense, the transformations pass over the buffers, giving SSIS the capability to stream data through the pipeline. These inline buffer changes are cheap as compared ...
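The buffer-based streaming model can be sketched conceptually. The following is a Python analogy, not SSIS code; the buffer size, row shape, and transformation are hypothetical, chosen only to illustrate how a synchronous transformation modifies a buffer in place rather than copying it:

```python
# Conceptual analogy of the SSIS pipeline buffer model (not actual SSIS code).
# Rows are read into fixed-size buffers; a synchronous (row-by-row)
# transformation updates each buffer in place, so no copy is made as the
# data streams from source to destination.

def read_buffers(rows, buffer_size=3):
    """Source adapter: yield the incoming row stream as fixed-size buffers."""
    buffer = []
    for row in rows:
        buffer.append(row)
        if len(buffer) == buffer_size:
            yield buffer
            buffer = []
    if buffer:
        yield buffer  # final partial buffer

def uppercase_name(buffer):
    """Synchronous transformation: modify rows in place within the buffer."""
    for row in buffer:
        row["name"] = row["name"].upper()
    return buffer  # same buffer object is passed on -- no data copy

# Hypothetical source data and destination load loop.
source_rows = [{"name": n} for n in ["ann", "bob", "carl", "dee"]]
loaded = []
for buf in read_buffers(source_rows):
    loaded.extend(uppercase_name(buf))  # destination consumes each buffer

print([r["name"] for r in loaded])  # ['ANN', 'BOB', 'CARL', 'DEE']
```

Because the transformation returns the same buffer it received, the pipeline streams data through without allocating new memory per step, which is the property that makes inline buffer changes inexpensive.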