The quick answer is that you can’t… at least not in a dataflow. Whilst you can dynamically change environments and databases, the column metadata has to stay the same, so you’re fairly limited in what you can load “on the fly”.
There is a workaround, however, for those of you who want a looping mechanism in a single package, with a single task doing the loading.
What I used to use quite a lot in a past life was the bcp utility in SQL 2000. The utility is also available in 2005, and the modes relevant here are in, out, and format. If we use the format mode we can generate a file describing the table’s layout at run time.
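For illustration, a one-off format run from the command line might look like this (the database, table, and path are placeholders, and -T assumes a trusted connection to the default local instance):

    bcp Staging.dbo.Customers format nul -f C:\fmt\Customers.fmt -n -T

The nul argument tells bcp not to copy any data; combined with format and -f it simply writes out a format file, and -n asks for native data types.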
So the idea is that by running a bcp format command prior to loading the table in question, you can create a format file for use in the SSIS Bulk Insert task. The task’s FormatFile property can be set using an expression, as can the destination table and the source file. I think you can see what I’m getting at.
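As a sketch, the FormatFile expression might be built from a package variable; I’m assuming a string variable called User::TableName holding the current table name (note that SSIS expressions escape backslashes, hence the doubled \\):

    "c:\\" + @[User::TableName] + ".fmt"

The same variable can drive the destination table and source file expressions, which is what makes the single-task loop possible.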
The process flow for your package would be something like this:
- Create a recordset listing the table names you want to load (use a dataflow task with a Recordset destination, or an Execute SQL task sending its results into an object variable; a sample query follows this list).
- Create a Foreach Loop container mapping the table name to a new variable.
- Issue a bcp format command against the table concerned, setting the arguments using an expression (an example of the expression itself follows this list). The result of the expression will look something like this: "bcp dbName..TableName format nul -fc:\TableName.fmt -T -SserverName -n"
- Also inside the loop, create a Bulk Insert task, using expressions for the FormatFile property (pointing at the file created in the previous step) and the destination table name (sample expressions follow this list).
- In your file connection manager’s properties, use an expression to set the connection string for your data file.
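For the first step, a minimal query to feed the recordset might look like this (assuming SQL 2005’s catalog views; swap in your own filtering):

    SELECT t.name AS TableName
    FROM sys.tables t
    WHERE t.is_ms_shipped = 0;

Run it from an Execute SQL task with the result set type set to “Full result set”, mapped to an object variable, then point the Foreach loop’s ADO enumerator at that variable.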
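To issue the bcp format command, one option is an Execute Process task with bcp.exe as the executable and its Arguments property set by an expression along these lines (dbName and serverName are placeholders, exactly as in the example above):

    "dbName.." + @[User::TableName] + " format nul -fc:\\" + @[User::TableName] + ".fmt -T -SserverName -n"

Evaluated against a TableName value of, say, Orders, this yields the literal bcp arguments shown in the list above.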
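And for the Bulk Insert task and the file connection manager, the matching expressions might be something like this (the data file path is my assumption; I’m imagining one text file per table, named after it):

    FormatFile:           "c:\\" + @[User::TableName] + ".fmt"
    DestinationTableName: "dbName.." + @[User::TableName]
    ConnectionString:     "c:\\data\\" + @[User::TableName] + ".txt"

Set the first two on the task’s Expressions page and the last on the connection manager’s Expressions property, so they re-evaluate on every iteration of the loop.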
Once you have done this you are ready to go. One package, no dataflows, dynamically changing sources and destinations.
As I said, not pretty, but it works.
See a working example in part 2.