Sooner or later you need to pump data into a destination database. We’ve already seen how DBAPI-compliant modules allow us to insert a list of tuples in a single call, and how this provides optimal performance. In a case such as a data warehouse, the number of destination tables and fields will be quite small, so it’s no trouble to build the SQL statements by hand for each table; and we already have the list of tuples ready to go:
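A minimal sketch of that pattern, using the standard library's sqlite3 module (the table and column names here are illustrative, not from the text):

```python
import sqlite3

# Hypothetical rows already assembled as a list of tuples.
rows = [
    ("2023-01-05", "widgets", 120),
    ("2023-01-05", "gadgets", 45),
    ("2023-01-06", "widgets", 98),
]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sales (sale_date TEXT, product TEXT, quantity INTEGER)"
)

# One executemany() call inserts the whole list, letting the
# driver optimize the repeated statement instead of round-tripping
# one INSERT at a time.
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
print(count)
```

The same hand-built statement works for any DBAPI driver; only the parameter placeholder style (`?`, `%s`, and so on) varies.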
Where there are many destination tables, you can take a shortcut if
the field names are simple and match the underlying database:
write a routine that uses the field names in the
DataSet to generate a matching SQL
INSERT statement.
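Such a routine is short to write. A sketch, assuming the field names are held as a plain list and are already valid column names in the destination database (the function name and sample names are hypothetical):

```python
def make_insert(table, fields):
    """Build a parameterized INSERT statement from a table name
    and a list of field names, using '?'-style placeholders."""
    cols = ", ".join(fields)
    params = ", ".join(["?"] * len(fields))
    return f"INSERT INTO {table} ({cols}) VALUES ({params})"

sql = make_insert("sales", ["sale_date", "product", "quantity"])
print(sql)
```

The generated statement can then be handed to executemany() along with the DataSet's list of tuples, one call per destination table.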
Often there are better ways to bulk-load data. The important thing is
to know that your
DataSets are correctly structured for the destination database; if
so, you can often save them to a tab- or
comma-delimited file and use the database’s bulk-load facility
with far greater speed.
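Writing such a file is a one-liner with the csv module. A sketch (the rows and the tab delimiter are illustrative; the loading command itself varies by database, e.g. PostgreSQL's COPY or MySQL's LOAD DATA INFILE):

```python
import csv
import io

# Hypothetical rows, already in destination-table order.
rows = [
    ("2023-01-05", "widgets", 120),
    ("2023-01-05", "gadgets", 45),
]

# A StringIO stands in for the output file here; in practice
# you would open a real file and point the bulk loader at it.
buf = io.StringIO()
writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
writer.writerows(rows)
print(buf.getvalue())
```

The bulk loader then parses and inserts the file server-side, skipping the per-row statement overhead entirely.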