In my last posting I spoke about changing databases. In this posting I want to jump into a little more detail on things that improve the success of a migration project. I’ll spare you the lecture on planning - but remember, the best way to improve the success of any migration project is to ensure you spend time planning. So here are my tips for what I’ve seen work:
First, as part of the planning phase be sure to inventory all upstream and downstream dependencies. What systems feed into the one being replaced? What assumptions are those systems making about it? This shouldn’t be too difficult, but it will take a little time. Downstream dependencies are far harder and are going to require significant detective work. The main challenge here is degrees of separation. You should be able to find the systems that are one degree away by analyzing connections and SSIS/DTS packages. This won’t be straightforward, but you should be able to account for the majority of cases. The farther you move down the dependency chain, the harder it’ll be and the longer it’ll take.
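As a rough illustration of that first-degree analysis, here is a minimal Python sketch that scans a folder of SSIS packages (.dtsx files, which are XML) for connection strings. The function name and the directory layout are my own assumptions, and a real inventory would also cover linked servers, ODBC DSNs, and application config files:

```python
import glob
import xml.etree.ElementTree as ET

def find_connections(package_dir):
    """Scan SSIS/DTS packages (.dtsx files) under package_dir and
    collect every connection string found, keyed by package path.
    This surfaces the systems one degree away from the database."""
    results = {}
    for path in glob.glob(f"{package_dir}/**/*.dtsx", recursive=True):
        tree = ET.parse(path)
        conns = []
        for elem in tree.iter():
            # Connection strings show up as (namespaced) attributes
            # on connection-manager elements inside the package XML.
            for attr, value in elem.attrib.items():
                if "ConnectionString" in attr:
                    conns.append(value)
        results[path] = conns
    return results
```

Running this over your package repository gives a first cut of the one-degree dependency list; anything beyond that is where the detective work starts.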
Second, you’ll need robust test data. You won’t be able to account for every scenario, so you need to think through the interesting cases. Hopefully you have a starting point: whatever you currently use to validate new releases of the existing system. You may need to expand this set to be more encompassing, given you’re changing the entire system. When you create your test data, think through any special processing windows (end of month, end of quarter, end of year) that may have different requirements. You may need a different data set to test each of these special processing windows. To create the test data you may want to use existing tools that capture data/transactions, some of which can obfuscate the data.
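To make the special-windows point concrete, here is a small sketch (my own helper, not a standard tool) that enumerates the boundary dates your test data sets should exercise for a given year:

```python
import calendar
from datetime import date

def window_dates(year):
    """Build the boundary dates for special processing windows:
    each month-end, each quarter-end, and the year-end close.
    Test data sets should include transactions on these dates."""
    month_ends = [date(year, m, calendar.monthrange(year, m)[1])
                  for m in range(1, 13)]
    quarter_ends = [d for d in month_ends if d.month in (3, 6, 9, 12)]
    return {
        "month_end": month_ends,
        "quarter_end": quarter_ends,
        "year_end": [date(year, 12, 31)],
    }
```

Note that `calendar.monthrange` handles leap years for you, so February 29 shows up automatically in leap years - exactly the kind of edge case a migration test set should cover.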
Third, running test data through the system isn’t enough. You will need tools to compare the results from each system to ensure the new system yields the same results. Depending on your situation you may have to build custom tools. This may not seem like a good investment but it’ll pay dividends.
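The custom comparison tooling can start very simply. Here is a hedged sketch of the core of such a tool, assuming both systems can dump their results as rows keyed by a primary key:

```python
def compare_results(old_rows, new_rows, key):
    """Compare result sets from the old and new systems.
    Rows are dicts; `key` names the primary-key column.
    Reports rows missing from either side and rows whose
    remaining columns differ."""
    old_by_key = {r[key]: r for r in old_rows}
    new_by_key = {r[key]: r for r in new_rows}
    shared = set(old_by_key) & set(new_by_key)
    return {
        "missing_in_new": sorted(set(old_by_key) - set(new_by_key)),
        "missing_in_old": sorted(set(new_by_key) - set(old_by_key)),
        "different": sorted(k for k in shared
                            if old_by_key[k] != new_by_key[k]),
    }
```

A production version would add tolerance for floating-point rounding, type coercion between platforms, and ordering differences - those discrepancies are where most of the interesting migration bugs hide.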
Fourth, running test data through each system is necessary, but once you’ve achieved parity an even better validation is to run the systems in parallel. The old system should stay primary, but each transaction should secondarily be routed to the new system. For a batch processing system this will be easy; for on-line systems you’ll need a capture/replay tool. Though the overhead is usually minimal, these tools do add load to the system, so be sure to plan accordingly.
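The routing logic of a parallel run boils down to a dual-write pattern. This is a minimal sketch under my own assumptions (callables standing in for the two systems and a simple list as the failure log); the key property is that the shadow write to the new system can never break the primary path:

```python
def route_transaction(txn, primary, secondary, log):
    """Dual-write router for a parallel run: the old system stays
    primary and its result is authoritative; each transaction is
    also sent to the new system. A failure in the secondary is
    logged for later comparison but never surfaces to callers."""
    result = primary(txn)       # old system: the real answer
    try:
        secondary(txn)          # new system: shadow write only
    except Exception as exc:
        log.append((txn, exc))  # record for the parity report
    return result
```

In a batch system the same idea is even simpler: run the batch through both systems and feed both outputs to your comparison tool.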
Fifth, create a bridge between the new and old systems. That is, once the new system is primary for upstream systems, keep loading the old system to lessen the urgency of migrating downstream dependencies. This is a temporary solution until all downstream dependencies are migrated to the new system.
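Structurally, that bridge is just a replication loop from the new system’s change feed into the old system. A sketch, assuming you can supply two adapters (`fetch_changes` for the new system’s change feed and `apply_to_old` for the legacy loader - both hypothetical names):

```python
def bridge_sync(fetch_changes, apply_to_old, checkpoint):
    """Temporary bridge: replay changes committed to the new (now
    primary) system into the old one, so downstream consumers of
    the old system keep working until they are migrated.
    `checkpoint` persists the last change id applied, so the loop
    can be re-run safely on a schedule."""
    changes = fetch_changes(since=checkpoint["last_seen"])
    for change in changes:
        apply_to_old(change)
        checkpoint["last_seen"] = change["id"]
    return len(changes)
```

Because the bridge is throwaway code, resist the urge to make it fancy - its job is simply to buy time while the downstream migrations happen.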
Lastly, once all dependencies have been migrated to the new system, you can unplug the old one. The team has worked extremely hard to reach this milestone, and you don’t want to let it pass without celebration.
Every migration will be a little bit different. Some will be easy, taking as little as a weekend, while others will be extremely complex, taking a year or more. The most important advice I can offer is to break it into multiple phases and validate at the end of each phase before moving to the next!
I’m not sure why, but there seems to be a lot of chatter lately about changing DB platforms. Information Technology Intelligence Corp (ITIC) recently ran a survey about DB migrations. You can find the results here (search the page for “ITIC Sunbelt 2010 SQL Server Survey Results”). At the risk of sounding like an advertisement, Microsoft has provided a SQL Server Migration Tool Kit for some time. It aids migration to SQL Server from Oracle, Sybase, MySQL, and even Access. But this is never a slam dunk. It takes careful planning and testing to ensure the applications that use the data source continue to function properly.
I’ve never heard of a DBA waking up one morning and deciding to migrate from one DB platform to another. DB platforms are pretty sticky, and there is almost always some catalyst for the change: license/contract renewal, a change in the platforms supported by a packaged application, vendor consolidation, etc. Let’s be honest: migrations are scary projects, and you want to be sure the drivers are well understood by the project team and the stakeholders. Then you want to take your time to plan and analyze the situation.
Over the course of my career I’ve worked on a few very large migration projects that involved more than just the database platform. These had large budgets and were extremely stressful. It would have been very easy to become complacent and ignore the need to migrate, but the opportunity would have been lost. Most of the projects were successful (even the greatest baseball player of all time doesn’t bat 1.000). They resulted in cost savings, staying in a supported environment, or creating more flexibility for the business. The reward was worth the risk. I’d say the two key things that contributed to the success were breaking the project into meaningful stages and a comprehensive set of regression tests (running against both old and new and comparing the results).
I’ll have more to share about migrations in my next posting. Bottom line, don’t let FUD keep you from doing what you need to do to achieve your business goals.
Donald Farmer, he of BI fame, has been on a roll lately. He and I have something of a competition going most of the time, but it's a "friendly" competition. See, my audience is usually more administration and programming focused; his, more BI focused. And no one can deny that BI has a lot more eye-candy. Of course, I explain to Donald that I actually look better than he does, so we're even on the eye-candy part.
OK, enough of that. BI is a hot topic, and people are asking me about it quite a bit, much to Donald's amusement. Since he held a great chat on BI a while back, I thought I would share the link: The Business Intelligence Agenda for 2010. It's a free listen, but you do have to register.
OK, Donald. You owe me one.
Some things just really work well together. Peanut butter and chocolate. Abbott and Costello. Coffee and...well...anything. You get the idea. But there are some real advantages in using SQL Server 2008 (and of course the upcoming SQL Server 2008 R2 release) with Windows Server 2008 and higher. It's not just that they are both better than their predecessors; SQL Server actually takes advantage of the improvements in Windows Server 2008.
One example is in how Windows Server 2008 handles the infamous "drive offset". This is a small block-size shift at the first part of the hard drive sectors - it's an internal thing - but it causes real issues for software that exercises the I/O subsystem and makes its own calls there. Like SQL Server. In the past, the data professional had to follow a process called "Partition Alignment", and this had to be done when the system was set up. That's all now a thing of the past - with SQL Server 2008 and Windows Server 2008, this just happens.
Another example is in how Windows Server 2008 deals with the "sliding TCP/IP window". This enhancement directly affects how fast SQL Server can send large frames of data - especially with Replication and large binary objects. At Microsoft we noticed tremendous speed gains just by moving to Windows Server 2008.
There are lots of other examples - from new virtualization and consolidation changes in both products to clustering enhancements, and now in SQL Server 2008 R2 the ability to run the "sysprep" utility after SQL Server has been installed. You can read more about this "pairing effect" in this White Paper. And be sure to check out John Kelbley's post on the Windows Server blog, where he also talks about ways that SQL Server and Windows Server work "Better Together".