I talked in a previous post about the possibility of using the format command and the bulk insert task inside a foreach loop in order to load all your tables using a single package… I don’t much like the method, as it means you have to run each table load in series, and you don’t take advantage of SSIS’s strengths: the high-speed data flow task, parallelism, and the ability to eliminate staging with a single-pass transformation.
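As a rough illustration of the kind of T-SQL the package executes on each loop iteration, a single table load driven by a format file might look like this (a sketch only — the table, file, and format-file names here are hypothetical, not taken from the demo solution):

```sql
-- Hypothetical example of one iteration of the dynamic load:
-- BULK INSERT a comma-delimited file into a staging table,
-- with the column layout described by a bcp-generated format file.
BULK INSERT Staging.dbo.Customer
FROM 'C:\DynamicLoad\Data\Customer.csv'
WITH (
    FORMATFILE = 'C:\DynamicLoad\Format\Customer.fmt', -- generated with bcp ... format
    TABLOCK                                            -- allows minimally logged loads where possible
);
```

Inside the foreach loop, the table name and the two file paths would come from variables populated from the control metadata, with the statement built dynamically per iteration.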
However, the previous post is probably the most popular on this blog and I’ve had requests for more detail. So, as part 2 of that post, I’ve created a demo solution for you to try out. Before implementing something like this, though, I’d think very carefully about what it is you’re trying to achieve. If you only have a small number of tables/records to load, with a well organised and controlled framework, this may work for you. Certainly, one package is easier to keep track of than 100! Having said this, one package running all loads means that a single failure will halt every table load.
Here is the solution – Dynamic Table Loading Solution
- Run the “ExampleDynamicLoad.sql” script in the miscellaneous folder to create an example “control” and “staging” database with the necessary tables and metadata. This will also create the folder structures on your local drive to accommodate the format and comma-delimited load files.
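To give a feel for what “control” metadata means here, a table like the following would be enough to drive the loop — one row per table to load. This is a hypothetical sketch; the actual structure created by ExampleDynamicLoad.sql may differ.

```sql
-- Hypothetical control table: one row per table the package should load.
CREATE TABLE dbo.TableLoadControl (
    TableName      sysname        NOT NULL,  -- target staging table
    DataFilePath   nvarchar(260)  NOT NULL,  -- comma-delimited source file
    FormatFilePath nvarchar(260)  NOT NULL,  -- bcp format file for BULK INSERT
    IsEnabled      bit            NOT NULL DEFAULT (1)  -- switch loads on/off
);
```

The foreach loop then simply iterates the enabled rows of this table, mapping each column to a package variable.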
- Run the package “GenerateDynamicLoadFiles.dtsx”. This will create 3 text files in your newly created folder structures, each with a million rows.
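If you’d rather see how a million test rows can be produced without the package, a set-based tally query is one way to do it in T-SQL (again a sketch under my own assumptions, not the logic inside GenerateDynamicLoadFiles.dtsx):

```sql
-- Hypothetical: generate one million numbered rows by cross joining
-- two system catalogs; the result set could be exported to a flat file.
WITH Tally AS (
    SELECT TOP (1000000)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS RowId
    FROM sys.all_objects AS a
    CROSS JOIN sys.all_objects AS b
)
SELECT RowId,
       'SampleValue' + CAST(RowId AS varchar(10)) AS SampleColumn
FROM Tally;
```

The demo package writes rows like these straight into the comma-delimited files in the folders created earlier.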
- Finally, now that the environment has been set up, you can run the package.
Note that this demo was created to run against a single development machine, so localhost or “.” is used in place of specific server names.