One of the best parts of my job is helping people, and although I am not a Microsoft support guy I will respond if I can to requests on my blog, Twitter etc., particularly if they're from people new to a particular technology.  This is partly out of a recognition that we're all experts in something, and even within the IT department not everyone is deep into SQL Server, System Center or Windows.  SQL Server actually presents its own problems as it often comes with a business application which hasn't been procured through IT, and somewhat naturally when the business asks for help IT will push back.  However applications like this can be mission critical, and in one case (I am not allowed to name names) this resulted in a user running into trouble with a 2TB (yes, that is 2 terabytes) database filling up his desktop machine.

It was clear from the tone of the e-mail that although this chap is highly technical and has some basic knowledge of IT, he hadn't really done anything with SQL Server before and was feeling his way.  His database turned out to be so large because it was mostly Filestream, a data type introduced in SQL Server 2008 for storing large unstructured data files in such a way as to keep them under SQL Server's control while still essentially being separate files capable of being accessed individually through the Win32 API.  So if you looked at the disk you would see the files for the database itself, the log file, and then a folder containing these Filestream files in a set of SQL Server generated folders and file names, i.e. they won't have the same names as when they were added to the database.
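To make that concrete, here's a minimal sketch of how a Filestream-enabled table is typically declared (the table and column names are my own illustrative choices, not from the user's system, and it assumes FILESTREAM is already enabled on the instance and the database has a filegroup declared with CONTAINS FILESTREAM):

```sql
CREATE TABLE dbo.Documents
(
    -- FILESTREAM tables require a UNIQUE ROWGUIDCOL column
    DocumentID UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
    DocumentName NVARCHAR(260) NOT NULL,
    -- the actual data lives as a file on the NTFS volume, not in the data file
    DocumentData VARBINARY(MAX) FILESTREAM NULL
);
```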

There wasn't time or space on this machine to back up this database, but fortunately the user was using a test database to work out how to move it and reuse it elsewhere.  I recommended detach –> copy –> attach without realising that Filestream doesn't like that if you just do it as is: the attach left the database in an unusable state as it was still looking for the Filestream data in its original location.  Fortunately you can run the detach from T-SQL:

USE [master]
GO
EXEC master.dbo.sp_detach_db @dbname = N'MyDatabase'

where MyDatabase is the name of my database.  If you are new to SQL Server, those files will be in something like

C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\

and that's part of the problem: a default install of SQL Server will drop these data files on the C drive along with the SQL Server program and the operating system. Anyway, you would then copy the database files to a new, larger hard disk and then run this:

USE [master]
GO
CREATE DATABASE [MyDatabase] ON
( FILENAME = N'D:\DATA\MyDatabase.mdf' ),
( FILENAME = N'D:\DATA\MyDatabase_log.ldf' ),
-- the filegroup name must match the FILESTREAM filegroup in the original database
FILEGROUP [FileStreamGroup] CONTAINS FILESTREAM
( NAME = N'MyDatabase_FSData', FILENAME = N'D:\DATA\FileStreamData' )
FOR ATTACH
GO

where D:\DATA is the folder you copied the files to.
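If you want to check where SQL Server now thinks the files live, this query against the standard sys.master_files catalog view will show the physical path of each file in the reattached database:

```sql
-- lists the data, log and FILESTREAM locations for the database
SELECT name, type_desc, physical_name
FROM sys.master_files
WHERE database_id = DB_ID(N'MyDatabase');
```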

The next problem we hit was that his new server came back with an error:

Msg 1813, Level 16, State 2, Line 1

Could not open new database 'MyDatabase'. CREATE DATABASE is aborted.

Msg 948, Level 20, State 1, Line 1

The database 'MyDatabase' cannot be opened because it is version 661. This server supports version 655 and earlier. A downgrade path is not supported.

This is pretty simple – the new server was running an older version of SQL Server, and while you can take an old database, detach it, copy it and attach it to a newer server (which is a supported way of upgrading a database), you can't go backwards to an older version.  In this case the original database was SQL Server 2008 R2 and the second server was running SQL Server 2008.  Given this is an end user scenario it's not surprising that he had differing versions around, and my final tip is to run this to find out what you've got on a server by server basis:

SELECT SERVERPROPERTY('productversion'), SERVERPROPERTY ('productlevel'), SERVERPROPERTY ('edition')


The version starts 10.0 for SQL Server 2008 and 10.50 for SQL Server 2008 R2.
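As an aside, the internal database version numbers in the error above (661 for a SQL Server 2008 R2 database, 655 for SQL Server 2008) are separate from the product version.  You can read a database's internal version like this, though note the Version property is documented as informational only:

```sql
-- returns the internal database version, e.g. 661 for a 2008 R2 database
SELECT DATABASEPROPERTYEX(N'MyDatabase', 'Version');
```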

If you have lots of servers to perform SQL Server discovery on then you'll be much better off using the free Microsoft Assessment & Planning Toolkit, which will also return operating system and server information plus a host of other detail as you need it.


So imminent disaster averted and the user fairly happy, but all I have done is put some gaffer tape over the problem.  In reality this data needs to be migrated to a proper server so that as the database grows it will perform well, be available to the whole team, and hopefully be properly protected by the IT guys.
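When that migration does happen, backup and restore is a safer route than detach/attach, as the database stays attached and usable while the backup runs.  A sketch, with my own illustrative paths and logical file names (yours will differ):

```sql
-- On the desktop machine: needs enough free space somewhere for the backup file.
-- Backup compression works on 2008 R2 Standard edition and above.
BACKUP DATABASE [MyDatabase]
TO DISK = N'E:\Backups\MyDatabase.bak'
WITH COMPRESSION;

-- On the proper server: restore data, log and Filestream to new locations.
RESTORE DATABASE [MyDatabase]
FROM DISK = N'E:\Backups\MyDatabase.bak'
WITH MOVE N'MyDatabase'        TO N'D:\DATA\MyDatabase.mdf',
     MOVE N'MyDatabase_log'    TO N'L:\LOGS\MyDatabase_log.ldf',
     MOVE N'MyDatabase_FSData' TO N'D:\DATA\FileStreamData';
```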