Or more accurately: Preparing a TDE-encrypted database for restore on a Standard Edition SQL Server
I recently had the challenge of restoring an encrypted database onto a Standard Edition server to enable further development of the database code. It took some work, but to save you some time, I’ve listed in this article the steps (and the T-SQL) you need to take in order to accomplish this.
Continue reading “Disabling TDE on SQL Server 2008”
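The full walkthrough is behind the link, but the core of removing TDE before taking a backup can be sketched in T-SQL. This is a sketch only: the database name is a placeholder, and decryption must finish before the key is dropped.

```sql
-- Sketch only: MyTdeDb is a placeholder name.
-- 1. Turn encryption off and let the background decryption scan run.
ALTER DATABASE MyTdeDb SET ENCRYPTION OFF;

-- Monitor progress: encryption_state = 1 means the database is unencrypted.
SELECT DB_NAME(database_id) AS db, encryption_state
FROM sys.dm_database_encryption_keys;

-- 2. Once decryption has completed, drop the database encryption key.
USE MyTdeDb;
DROP DATABASE ENCRYPTION KEY;

-- 3. Take a fresh backup; this one can be restored on Standard Edition.
BACKUP DATABASE MyTdeDb TO DISK = 'C:\Backups\MyTdeDb.bak';
```

Dropping the key before the scan completes will fail, so the check against `sys.dm_database_encryption_keys` is not optional.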
I love the idea of Web 2.0 – I jumped on the social networking bandwagon at a fairly early stage, getting involved with AIM Pages, AOL’s attempt to compete with MySpace. It’s great that people with shared interests can join a network and interact with each other in a way that couldn’t be done before.
And I currently maintain profiles at LinkedIn, Flickr, Facebook, Twitter, YouTube, Picasa, pprune.org and SSC. These profiles I can handle quite nicely, along with my own hobby sites and the various shopping websites that I use. I have always argued that privacy concerns can be managed…
Continue reading “Google Buzz – Web 2.0 Narcosis”
I’ve been working at Canary Wharf for 6 months now. It’s a fantastic-looking place, and it lends itself well to photography. I thought I’d post a couple of shots while I think up a useful database article to write about.
Continue reading “Canary Wharf”
I was reading a post by BI Monkey and found myself in agreement with what he says. It also got me thinking about the wider implications of the problem of not “helping you make better decisions”.
I’d add to BI Monkey’s question the specific one I always ask business analysts as they begin to list out the attributes they want added to a dimension in a requirements meeting: “What question does this answer?” It’s a BI specialist’s responsibility not just to resolve the technical aspects of the requirements, but also to help steer them so that something useful is delivered at the end.
I think this is one of the reasons why support from the business can fade. And without champion users in the wider business, it’s very likely that a BI implementation will be paralysed. People don’t understand the information provided and so don’t use it. Why keep funding the project if nobody uses it? Or the last (over-scoped) project took so long that the team isn’t trusted to take on new projects. Here are some scenarios…
Amazon Link: Professional Microsoft SQL Server 2008 Reporting Services (Wrox Programmer to Programmer)
Continue reading “What question does this answer?”
The script task in SSIS was a fantastic improvement on the DTS ActiveX Script task. In my view, the reasons for this are twofold:
- Firstly, by allowing the user to code in Visual Studio, it is far easier to develop and debug. Because the developer can make use of IntelliSense and Object Explorer, even the casual programmer can read and write code.
- Secondly, because we can use it to access external .NET assemblies, it provides functionality for logic that can’t be achieved using the standard SSIS components.
These are both reasons in themselves that make it a compelling choice when deciding how to develop your package logic.
However, I have noticed, both at client sites and on the forums, that there are those (the script task junkies!) who use the script task in almost any scenario, even when it is just not necessary. It makes maintenance and support tricky at best and sometimes impossible…
Continue reading “SSIS – The script task junkie!”
I don’t know about you, but one of the things that drives me to distraction when writing SSIS packages is being forced to use Excel as a data source. I’d qualify that statement by adding that it’s the use of Excel, in the face of all reason, for jobs that other software would be better suited to. Unfortunately, corporations and people being what they are, Excel sources are a fact of life for an ETL developer.
Amazon Link: Excel 2007 Power Programming with VBA (Mr. Spreadsheets Bookshelf)
Continue reading “Excel VBA to prevent user generated errors in SSIS”
Here’s the scenario… You have a RAW file which contains data from many files.
In the subsequent dataflow you need to perform a lookup against a large reference table; however, you want just the subset that reflects the period contained within your RAW file.
Question 1: how do you find out the earliest date used within your RAW file data? And question 2: how do you write it to a variable so that you can use it in the subsequent data flow?
Continue reading “SSIS – Writing to a package variable in a dataflow”
Many of you will have heard the mantra that loops are bad, set based is good. But how do you get around them?
The fact of the matter is that there are very few circumstances where a loop is the only way to achieve your objective. A look through the SQL forums will show you many a technique for turning iterative, row-by-row processing into single-transaction, set-based processing. I’m referring to hierarchical query techniques (see nested set theory) and, of course, the subject of this post: the numbers table.
Continue reading “A practical use of a numbers table.”
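To give a flavour of the technique, a numbers table is built once and then joined against, replacing the loop entirely. The table and column names below are illustrative, not from the post: the example expands each booking into one row per day without iterating.

```sql
-- Build a numbers table (1 to 10,000) by cross-joining a system view
-- against itself; works on SQL Server 2005 and later.
SELECT TOP (10000)
       ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
INTO   dbo.Numbers
FROM   sys.all_columns a
CROSS JOIN sys.all_columns b;

-- Set-based alternative to a loop: one row per day of each booking,
-- produced in a single statement by joining to the numbers table.
SELECT b.BookingId,
       DATEADD(DAY, n.n - 1, b.StartDate) AS BookingDay
FROM   dbo.Bookings b
JOIN   dbo.Numbers  n
  ON   n.n <= DATEDIFF(DAY, b.StartDate, b.EndDate) + 1;
```

The join condition does the work a WHILE loop would otherwise do, one transaction instead of one per row.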
In an earlier post, I demonstrated how to send HTML formatted mail using the script task. Another frequently requested use of the mail task is to be able to send query results within the email message body. Here’s how…
Continue reading “SSIS – Writing SQL results to a string variable”
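On the T-SQL side, one way to shape query results into a single string suitable for a message body (a server-side alternative to concatenating rows in the script task — the table and column names here are illustrative) is the FOR XML PATH trick:

```sql
-- Concatenate one '; '-separated entry per row into a single variable.
-- dbo.Orders, OrderId and OrderDate are illustrative names.
DECLARE @body NVARCHAR(MAX);

SELECT @body = STUFF(
    (SELECT '; ' + CAST(o.OrderId AS VARCHAR(10)) + ': '
          + CONVERT(VARCHAR(10), o.OrderDate, 120)   -- yyyy-mm-dd
     FROM dbo.Orders o
     ORDER BY o.OrderDate
     FOR XML PATH('')),
    1, 2, '');   -- strip the leading separator

SELECT @body AS MessageBody;
```

The resulting value can then be read into an SSIS variable by an Execute SQL Task with a single-row result set.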
I talked in a previous post about the possibility of using the format command and the Bulk Insert task inside a Foreach Loop to load all your tables using a single package… I don’t much like the method, as it means you have to run each table load in series, and you’re not taking advantage of SSIS (the high-speed dataflow task, parallelism, and eliminating staging with a single-pass transformation).
Continue reading “Dynamic table loading in SSIS (Part 2)”