Dynamic table loading in SSIS (Part 1)

The quick answer to this is that you can’t… at least not in a dataflow. Whilst you can dynamically change environments and databases, the column metadata has to remain the same, so what you can load “on the fly” is fairly limited.

There is a workaround, however, for those of you who want a looping mechanism in a single package, with a single task loading the data.


What I used to use quite a lot in a past life was the bcp utility in SQL Server 2000. The utility is also available in 2005 and has three modes of operation: in, out, and format. Using the format mode, we can generate a format file for use at run time.
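To make the three modes concrete, here is a minimal sketch of the command lines each one produces, built as strings in Python. The database, table, server, and file paths are hypothetical placeholders, not anything from a real system.

```python
# Sketch of the three bcp modes as command strings.
# dbName, TableName, serverName, and the c:\ paths are hypothetical placeholders.
db, table, server = "dbName", "TableName", "serverName"

# "out": export the table's data to a file (-T trusted connection, -n native format)
bcp_out = f"bcp {db}..{table} out c:\\{table}.dat -T -S{server} -n"

# "in": import a data file into the table
bcp_in = f"bcp {db}..{table} in c:\\{table}.dat -T -S{server} -n"

# "format": generate only a format file; "nul" means no data is moved
bcp_format = f"bcp {db}..{table} format nul -fc:\\{table}.fmt -T -S{server} -n"

print(bcp_format)
```

It’s the format mode that matters here: it touches no data at all, but describes the table’s current column layout in a file that Bulk Insert can consume.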

So the idea is that by issuing a bcp format command prior to loading the table in question, you can create a format file for use in a Bulk Insert task. The task’s FormatFile property can be set using an expression, as can the destination table and the source file. I think you can see what I’m getting at.

The process flow for your package would be something like this.

  1. Create a recordset listing the table names you want to load (use a dataflow task with a Recordset destination, or an Execute SQL task sending its results into an object variable).
  2. Create a ForEach loop container mapping the table name to a new variable.
  3. Inside the loop, issue a bcp format command against the table concerned (set the arguments using an expression). The result of the expression will look something like this: bcp dbName..TableName format nul -fc:\TableName.fmt -T -SserverName -n
  4. Also inside the loop, create a Bulk Insert task, using expressions for the format file name (created in step 3) and the destination table name.
  5. In your file connection manager properties, use an expression to set the connection string for your data file.

Once you have done this you are ready to go. One package – no dataflows – dynamically changing source and destinations.

As I said, not pretty, but it works.

See a working example in part 2.

3 thoughts on “Dynamic table loading in SSIS (Part 1)”

  1. I’d just like to add that the Bulk Insert task in SSIS does have various limitations, especially in regard to permissions and file locations. Also, it’s probably worth mentioning, although I’m sure you all know this, that you can’t transform the data in any way. However, it will let you accomplish a simple import of data if that’s all you are trying to do.


  2. I see this is a file -> database solution. Now how do we implement database 1 -> database 2 in a similar way? I guess in that case we need to write a lot of custom code.

  3. You’re right. You couldn’t do this from one database to another… however, it could be achieved, if you absolutely had to have that ability in a single package, by executing a process task which runs the export utility on the source system, then loading the files using Bulk Insert. But having said this, it rather defeats the purpose of Integration Services, in that you can’t transform any of the data and you can’t load it in a single pass. Also, the dataflow is faster than Bulk Insert, so you lose out on performance too!! So after all that there would be no point using SSIS!

    I used to use this method in a data warehouse I worked on, where all our source databases dropped CSV files onto a central repository; from there I would call a single SSIS package, passing in the table name, to get the data loaded into a staging area. From there it required 100+ packages for the transforms.

    In the end there is no way around it!!
