My post announcing the next version of SQL Server picked up a couple of comments, and rather than squeeze my reply into another comment I thought I would make it into a post.
Most customers started adopting SQL Server 2000 in 2001/2. Back in those days my laptop had 256MB of RAM coupled with a Pentium III running at 650MHz and a 20GB hard disk. When working from home I had 64kbps ISDN. On the server front I might be lucky enough to be using a dual Pentium III or an early Xeon with maybe 1-4GB of RAM and perhaps some 40GB drives to play with.
If I look at what I have in my bag as I write this, I have a dual-core laptop with 4GB of RAM (all of which I can use, as I am on 64-bit Windows 7) and a terabyte of disk. I was talking to a client yesterday who had 48 cores in each of his servers, with 256GB of RAM and multi-terabyte databases running across a geo-cluster.
While technology has been marching on, what else has changed over the last 7 years? Personally I use a lot more multimedia for work, and I am permanently connected to the web, which means I can find what I want very quickly but also means I have to be careful. Another revolution in the Microsoft world since 2002 is SharePoint, reflecting the need to collaborate on and share all sorts of data, all of which ends up in SQL Server.
So what does this mean for the DBA? It means more data in SQL Server, more data per DBA and more varied data. So development of SQL Server has focused on scalability, manageability and the need to store all of these new kinds of data, as well as the traditional OLTP workloads (do we still use this term?), in the database.
Most of the flak on my last post was about how often things change. You might argue about how often a release of SQL Server should come out. The five years between SQL Server 2000 and 2005 was seen by many as far too long, as server advancements and security threats appeared in the meantime. Three years coincides pretty well with Windows Server itself, and if you look at the latest releases, each version of SQL Server comes out shortly after the operating system. This means that SQL Server 2008 can support 64 cores because Windows Server 2008 can; with Windows Server 2008 R2 this rises to 256 cores, so SQL Server 2008 R2 will support that. One other simple but obvious point is that every feature in SQL Server 2008 was asked for by someone who needed it for a very good business reason.
It would be great for Microsoft if everybody bought every release of our software, but the reality is this does not happen. So the biggest competitor to Vista is XP, and the biggest competitor to SQL Server 2008 is SQL Server 2000. In the real world, when you have a project to do or need to change your infrastructure to reflect some business change, you should carefully consider whether there is an advantage in using the latest or upcoming release of the product. However, I would also caution against using a superseded version just because it is the standard. What you get if you do this is an increasingly ageing infrastructure which suddenly needs to be completely replaced, which in turn means a huge spike in your capex and a huge need for resources to do the migration. On the other hand you do need some stability, so what this means is that you generally run a mixed economy on your servers: using the latest and greatest where it makes sense, and maybe holding onto older versions because of constraints such as third-party applications.
However, if you have to run SQL Server 2000 today because some mission-critical legacy application needs it, you could have SQL Server 2000 supported to 2015 as per Microsoft's support policy. BTW this compares pretty well with Oracle, for example: 10g came out in 2005 and will end mainstream support in 2010/11, with extended support to 2013 (http://www.orafaq.com/wiki/Oracle_10g).
So this all comes down to managing change, and that is what top IT professionals, including DBAs, do.
Finally, I am very interested in comments like those made on my last post, so thanks to Michael Swart and Shihab Hassan for the feedback.