I am a pretty lazy DBA, and I use the UI wherever I can to get the job done as simply and quickly as possible, but laziness can be taken too far, especially when it comes to dates.
It is possible to write a query like this
select getdate() + 1
..which adds a day to the current date, known as an implicit add. If you try this
select sysdatetime() + 1
..you’ll get the same thing, unless you are using SQL Server 2008, in which case you’ll get the error
Operand type clash: datetime2 is incompatible with int
This doesn’t work because the implicit add has been deliberately disabled for the new date and time data types in SQL Server 2008, and sysdatetime returns one of these (datetime2). Why change the data type? Because of its increased precision.
So the customer who brought this up has either got to use the first query (which still works in SQL Server 2008) or, in my opinion, write something a little longer like
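Presumably something like an explicit DATEADD call, which spells out the unit being added:

```sql
-- explicit date arithmetic; works with the new datetime2 type
select dateadd(day, 1, sysdatetime());
```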
..which is not only just as quick to run, but can also be understood by humans, making it easier to maintain and document.
Other lazy habits to avoid: not fully qualifying objects (e.g. use mydb.myschema.mytable) and using * (as in select * from mytable), especially in SQL Server 2008 now that we have IntelliSense!
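As a quick sketch of both habits (the table and column names here are placeholders):

```sql
-- lazy: unqualified object name and select *
select * from mytable;

-- better: fully qualified object and an explicit column list
select t.id, t.name
from mydb.myschema.mytable as t;
```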
Cumulative updates have not always been popular with DBAs, as many of you prefer to wait for service packs, like the upcoming SP3 for SQL Server 2005, to reduce testing and downtime. However, if you are hit with a particular problem then hotfixes and cumulative updates are the solution.
I am surprised there is already one of these, with an attendant Knowledge Base (KB) article, for SQL Server 2008 here, but do check it if you are already evaluating or developing on SQL Server 2008.
If you want to understand how service packs, hotfixes and cumulative updates fit together then check this post from the SQL Server Global Release Services Team.
Unlike Manchester United, SQL Server 2008 is under new management. What I mean by this is that there is a load of new tools for configuration, tuning and auditing, which are essential to keep a modern database in the premiership.
While I have done a few events on this theme in the London area, the TechNet team have given me the bus fare to Manchester so that I can show all this new stuff to those unable to justify a trip south.
The event is SQL Server Under New Management on October 7th from 9:00 (registration) to 12:30 at the Manchester Regus:
3000 Aviator Way Manchester M22 5TG United Kingdom
..and you can register for it here
I need to use PowerPoint to explain the way these tools fit together, but I am concentrating on showing how all this stuff works, such as:
So come along if you can, to learn, share and network.
A BarCamp is where about 100 web geeks spend all weekend together discussing what’s hot and what’s not, and the latest one has just finished at eBay’s very cool Richmond offices. The idea is to give a chat on any topic; for example, one I saw was called “The Art of Stabbing People” - had I made a mistake in turning up in a Microsoft t-shirt, I wondered.
My personal favourite was by Tom Whitwell of Times Online, on headline writing. He gave a number of top tips to get traffic and interest in what you are doing by being specific, and not being deliberately funny or clever.
My headline of “Ask Microsoft (no stabbing allowed)” got a little interest but was up against “css3” and “Bring your own cool gadgets”. Other sessions were on web anonymity, sleeping with the enemy (given by a couple where each is a designer for a competing web company), and loads of web tech hack stuff.
The most controversial session for me was a discussion about pseudonymity, i.e. anonymity on the web, by improbulus. This person is working as hard as possible to remain invisible (except of course by going to BarCamps!). I can accept the need to be really careful about how much you reveal about yourself, to protect your identity from theft and because there are trolls and stalkers out there. But if you are commenting on other people’s opinions or work then surely you should be answerable and identifiable in some way. Finally, none of the measures discussed will protect you from any concerted attack on your identity, whether lawful or otherwise.
So a really great weekend. I wonder if anyone would come if we did one at our office?
When it comes to dashboards, Microsoft’s business intelligence stack seems to have some overlap in it, so how can you decide what to use when?
“Dashboard” is an overused term, so what I am talking about here is the ability to provide an interactive instrument panel for all or part of your business, which should have the following characteristics:
There’s a starter for ten in this TechNet article, but IMHO what it is missing is a discussion of what you are using your dashboard for.
If you are using a dashboard to decide what to change in your business from a high-level view, for example as part of a balanced scorecard, then I would submit that dashboards are just one element of a larger system. Using a dashboard to monitor business activity would then lead to some analysis when unexpected behaviour is noticed (be that good or bad), followed by a plan to change that activity. To do this, these three activities (Monitoring, Analysis and Planning) need to be integrated, which is essentially what PerformancePoint is.
If you are new to business intelligence or have a specific use for a dashboard, perhaps to monitor minute-by-minute activity in a call centre, then using SharePoint or Excel Services is probably sufficient for your needs.
Most of the work to build a dashboard is sourcing the data and defining the metrics on it, so while moving from a SharePoint dashboard to a PerformancePoint one will involve some technical rework, this is minor by comparison and shouldn’t prevent you from considering migration if that is appropriate.
Update 10 March 2009
Since writing this article Microsoft has announced that PerformancePoint Server will not be developed further after SP3 (to come out later this year). The monitoring and analytics part of the product (including the dashboards) will be rolled up into a PerformancePoint service that will be part of SharePoint from Office 14 onwards. I will post more on this when the next version of SharePoint goes through its beta releases.
Quite a few people have asked me where the decks are for the SQL Server unleashed event and I now have an answer:
which I need to put on my blog so I can remember it myself!
We also filmed this, and if you couldn’t make it, I am hoping it will be on UGTV shortly as well.
Thanks to all of the enthusiastic SQL squirrels, who gave up the most glorious Saturday in months to sit in a darkened lecture theatre. Thanks too to Tony Rogerson, Simon Sabin, Martin Bell and the rest of the SQL Bits team for organising the event, and to all the speakers for sharing their hard-earned knowledge and for wearing the greenest polo shirts I have ever seen…
I know many of you will have noticed that there was a serious attempt to film everything, and a couple of the speakers have already asked me about getting access to the content, so before I am inundated with requests for links etc., I should say that I don’t actually have an answer yet.
I did a couple of shorts, including James Rowland-Jones and his lego, for TechNet Edge, but it will be a few weeks before they’re up there. This is because the guy responsible for the crew, Alan le Marquand, is actually part of Microsoft Corp and so he’s off to Redmond for the next three weeks. He’s also got a fair amount of editing to do (including the organisers’ session on Rock Band!), so when I know, you’ll know.
Remember that while watching the sessions is useful if you couldn’t make it, it’s not the same as being there, as you can’t ask the presenters questions, hang out with like-minded individuals, or stock up on swag and catch up with the exhibitors. So I will be expecting a good turnout for the next one, and if you’re up for it, submit a session!
I have seen quite a few concerts, plays etc. at the Queen Elizabeth Hall in the South Bank Centre, but I never thought I would be on that stage, or presenting alongside Steve Ballmer, CEO of Microsoft, on 1st October.
The event “Technologies to Change your Business” marks the release of SQL Server 2008 and Hyper-V, and is aimed at IT Managers and CxOs. We will not only be introducing these new server technologies but, more importantly, showcasing customers who are taking these products and using them as the catalyst for significant change in their business.
Chris Ingle, the Consulting & Research Director with IDC, will also present on how you should be preparing for the next wave of technology and where you can drive increased business value through technology, in areas such as
Additionally, we will be showcasing some of the future emerging technologies to come from Microsoft Research.
The full agenda is:
So Register Now to join us for an afternoon of innovation, debate & insight as we launch the next wave of technologies to help you deliver new value to your organisation.
Thanks to everyone who came along to celebrate SQL Server 2008 at our unleashed event yesterday, and especially to the audience members who volunteered to sit in the overflow room as we had a few AV niggles. As you can see from the image above, David Hobbs-Mallyon, one of the SQL Server product managers, is also a whizz at the new AutoCollage tool we have launched, and has done a great job of capturing the buzz of the day. His other top demo of the day was on Rock Band as a screaming vocalist and manic drummer! Also a big thank you to our MVPs, Tony Rogerson, Martin Bell, Simon Sabin and Jasper Smith, and to wannabe MVP Suranjan Som, for their demos during the day, and to Allan Mitchell for the best demo of the day.
If you couldn’t make it or were told we were full (which we really were), there is hope in the form of SQL Bits this Saturday (13th September) at the de Havilland Conference Centre in Hatfield, Hertfordshire. It’s free, there are four tracks to choose from (Dev, DBA, BI and SQL Server 2008), and those same MVPs and a few others will be there as well.
BTW my session will be an extended version of what I showed today, so if you were at SQL Server unleashed, my top tip would be “don’t come to my session”, unless you are stalking me! Seriously though, please come to this event: it will be just as much fun as SQL Server unleashed, and another great opportunity to share knowledge, debate and network.
If you have been following this blog over the last few days, we are now at the stage where we know what we want to measure to get the pulse of our business, and we have all of that on a scorecard. However, just because we are measuring performance doesn’t mean our work is done.
What do we do when we notice that we are not hitting our numbers? I would submit that the first step is to understand what is going on behind those numbers. This is where analysis comes in. The key technical tool for analysis of large numbers of transactions, events, etc. is called OLAP (OnLine Analytical Processing), which tells you precisely nothing about what it is. I prefer to call the thing I am about to describe FASMI:
For the business users out there, think Excel pivot tables on steroids and speed. The key difference between what I am talking about and Excel pivot tables is that the data doesn’t live in the spreadsheet; it’s hidden away in a special type of database known as an OLAP engine. Since I am a Microsoft evangelist, I would add that the Microsoft tool is called Analysis Services, which comes with the Standard and Enterprise editions of SQL Server, and that there is built-in support for connecting to Analysis Services in Excel.
Anyway how does this help and how can you use it?
If you have used pivot tables you will have seen how you can choose what appears on each axis so you can have products on rows and then time on columns, and a given metric as the numbers in the grid. OLAP adds to this by:
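To make the pivot-table analogy concrete, here is what a products-on-rows, time-on-columns layout looks like as an MDX query against Analysis Services. The cube, dimension and measure names are taken from the Adventure Works sample database and are illustrative only:

```mdx
-- a measure in the grid, calendar years on columns, product categories on rows
SELECT
    [Date].[Calendar Year].Members ON COLUMNS,
    [Product].[Category].Members ON ROWS
FROM [Adventure Works]
WHERE ([Measures].[Sales Amount])
```

Swapping an axis, or drilling from category down to individual product, is just a change to the sets on ROWS and COLUMNS; the OLAP engine does the aggregation for you.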
The only downside is that the back-end technical work to deliver this is a skilled job, and you are going to need external help to get started. I would expect this to take 10-20 days for a small business, depending on the complexity of the implementation and the quality of your data. That could be a significant cost for a small business, but as I have mentioned before in this series, it needs to be justified in terms of the potential savings arising from the use of this tool.
If you have read my last two posts and are interested in applying BI to your business, how do you get started?
I would begin by deciding what success looks like for your business, and this should be a balanced view taking in such factors as:
The above is not an exhaustive list, just your starter for ten; I am no strategy consultant. Anyway, moving on: for each area you come up with, you need to be able to do the following:
These metrics would then be combined to form a scorecard: a one-page overview divided into four to six headings like those above, each with a set of metrics underneath. This then provides the big picture on how your business is doing. Having established that, you then need a set of analytical tools to help you discover what is happening behind these metrics when things don’t go according to plan, and that will be the next post.
Georgina (Curious George on the TechNet team) sent me some feedback about my last post: she wanted to know whether there was any evidence to support my assertion that BI is being used more widely in small business.
Is there any independent advice from Gartner, IDC, Forrester etc? Generally their research does support this trend, but they don’t focus on this market so much because the vendors who commission the reports haven’t traditionally been interested in small business.
However, I have my own experience, and a good network of friends in the BI industry. We have put BI into brewers, franchises, printers and fashion houses, to name a few sectors. The companies had between £15m and £50m turnover, and most had fewer than 250 employees.
Initially, when I started doing this, the big barrier to wider adoption was the licence costs, which could account for 50% of the whole implementation and sometimes nearly as much as the ERP system they were using. Fortunately, as more and more tools like SQL Server appeared, and Excel got better at analysing the data, expensive BI user tools were no longer needed, or were needed in smaller numbers, making the whole solution more affordable.
The next generation of BI tools will mean that this investment can be further reduced by doing all the hard work in the internet cloud. In future, any business will be able to choose to have their line-of-business data hosted in the cloud, and to use a service to analyse that data.
Many vendors are developing services like this. Microsoft’s approach to this area (what the IT industry calls Software plus Services) is a little different in that you can mix and match what is kept on premise (say your data) and what you want to do in the cloud. A good example is the black art of data mining, a set of techniques to discover relationships (in customer behaviour, for example) that were not previously apparent. Microsoft is designing its cloud offering to let you run data mining against data in the cloud or data you hold locally, and then analyse the results from a browser or from Excel.
Of course all of this stuff could be used by big business, but my bet is that small startups not hindered by the prejudices of large entrenched IT departments will go this way and then stay with it as they thrive and grow.
While many IT professionals equate business intelligence (BI) with large organisations, the reality is that there are many small and medium sized businesses who are quietly implementing BI because they see the benefit of it. I think there are three reasons for this:
The perceived barriers to wider adoption of business intelligence are also worth noting:
I use the term perceived here because I believe these views are not necessarily correct when put in context, and are often based on outdated information.
Yes, business intelligence can be complicated, depending on the complexity of the business and the measures used to gauge success. At its simplest it should be a one-page report summarising where you are, with a subsidiary sub-report giving a more detailed analysis of each quadrant of performance.
As for cost, IT should be treated like any other investment, and the challenge here is to measure the effectiveness of that investment. Measuring the costs of an IT investment is relatively easy; the hard part is assessing the benefits, particularly of BI projects. I think this can only be done by assessing the contribution BI makes to the decisions being made and the ensuing benefits of those decisions.
For example, you browse your Profit and Loss and realise you are paying a lot of interest and bank charges. Analysis leads you to notice that your corporate clients are paying right at the end of their payment terms (their savvy finance guys have BI as well, of course), while your suppliers have tighter payment terms with you because you are small, so essentially you are giving interest-free loans to your corporate clients. BI has identified the problem; it’s up to you what you do to fix it, but whatever you decide, you need to set up monitoring to watch this while keeping an eye on your sales, so that your tactics for fixing this don’t seriously affect your other strategies. The possible benefits arising from BI in this scenario might be:
To conclude, BI is like any other business investment, and while the benefits may be hard to quantify, there is no doubt that the reduced cost of ownership makes it more and more attractive to small businesses.
This may be a short post in stating the [insert your favourite expletive here] obvious, but you can restore a SQL Server 2000 backup onto a SQL Server 2008 instance, and that’s a good way to do a version migration. However, that doesn’t mean your functions and stored procedures are going to work, so before you do this you should definitely run the SQL Server 2008 Upgrade Advisor. This checks for all the known problems in the database and gives you a report of the issues you need to address.
That’s a good place to start, but you also need to remember to check all the code in the applications that are accessing this database. Assuming for a moment this database is something you have written in house, you can do that by turning on Profiler, tracing all the SQL hitting the database, and then testing the output using the same Upgrade Advisor. This exercise needs as good a coverage of your application as possible, perhaps including any setup scripts.
If you are migrating a third party application then check with the vendor that it’s OK to run on SQL Server 2008 and refer them to me if it isn’t yet so I can get them the assistance they need.
Making the further assumption that you have done all of this and have now restored your legacy database onto SQL Server 2008, there is still some more work you should do:
1. If possible change the compatibility level of the database to SQL Server 2008 so it runs fast and you can use all the new stuff.
2. If you are migrating from SQL Server 2000, change the torn page error detection method to checksum in database properties –> options, as it is a much better test.
BTW, bear in mind that although this option is set immediately, it only takes effect as pages are rewritten by the new version of SQL Server (which is when the checksums are created for each page). DBCC CHECKDB is one way to exercise the whole database, and you might want to run it anyway to ensure all is well after the migration.
3. Rebuild all of the indexes, if you are migrating from SQL Server 2000.
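For reference, the three steps above as T-SQL, assuming a database called mydb (swap in your own database, schema and table names):

```sql
-- 1. move the database up to the SQL Server 2008 compatibility level (100)
ALTER DATABASE mydb SET COMPATIBILITY_LEVEL = 100;

-- 2. switch page verification from TORN_PAGE_DETECTION to CHECKSUM
ALTER DATABASE mydb SET PAGE_VERIFY CHECKSUM;

-- optional: a full consistency check to confirm the migration is clean
DBCC CHECKDB ('mydb');

-- 3. rebuild the indexes, table by table
ALTER INDEX ALL ON myschema.mytable REBUILD;
```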
There’s a ton of other stuff you might want to do as well, but this is a pretty good start.
I have done a blogcast and a few posts around configuration servers in the betas of SQL Server 2008; however, in SQL Server 2008 as released these are now called Central Management Servers. For those of you who have held off from trying the latest version until it was released, this feature is a sort of extension to the registered servers that have always been in the SQL Server management tools.
What you do now is register any instance of SQL Server as a Central Management Server in a given domain (the stuff here only works across servers in the same or trusted domains) in the registered servers tab in SQL Server Management Studio.
Underneath this you can then create groups or register individual servers, as we all used to do in SQL Server 2000. When you right-click on the Central Management Server or a group you can do a couple of useful things..
You can register the same server multiple times to put it into different groups. For example, you could have a group for each version of SQL Server you have (say 2000, 2005 and 2008) and a set of groups for dev, test and prod, with each server registered twice: once in the group for the version it is running and once for its function (dev, test or prod).
With these two sets of groups you can now run SQL or evaluate policies against all your dev servers or all of your 2005 servers.
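For example, selecting a group and opening a new query window runs whatever you type against every server in the group, with the server name prefixed to each result set; something as simple as this gives you a version inventory of the whole group:

```sql
-- run from a Central Management Server group query window:
-- returns one row per server in the group
select serverproperty('ProductVersion') as version,
       serverproperty('Edition') as edition;
```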
Two things to be aware of:
Thanks for all of this to Brent Ozar from Quest, the demo guru at last night’s Quest tools launch event.
If you want to know more I will be covering this at length at SQL Bits cubed and these TechNet events:
If you glance to the side of this post you will see that one of the few books on my bookshelf is the Microsoft Data Warehouse Toolkit, and so I am annoyed that I can’t make the course based on the book, presented by its authors, Joy Mundy and Warren Thornthwaite.
For those not familiar with these two eminent consultants, they work with Ralph Kimball at the Kimball Group. For those not familiar with Ralph Kimball, he is one of the most influential figures in business intelligence. Many BI vendors have adopted some or all of his concepts in their products, and if you look at his way of designing a warehouse using what he calls the bus architecture..
(adapted from p171 of The Data Warehouse Lifecycle Toolkit)
and compare it to the dimension usage designer in the SQL Server BI Design Studio here
..you can see the similarities, i.e. dimensions on rows, facts on columns, and the white boxes representing where the facts (measure group) apply to a dimension.
So if you’re serious about data warehouse design and you are using or considering the Microsoft product stack, then this course could start to look interesting. The course is at the London Heathrow Crowne Plaza on 20-23 October and you can register for it here. It isn’t cheap, but like it says on the cosmetics adverts, “You’re Worth It”. If your boss, accountant or partner doesn’t agree, then you need to make them aware of the quality of the course, the cachet of being taught by the best, and the networking opportunities. If that doesn’t work then it might be a good idea to look at some influencing skills training, or a career move to an outfit that offers quality training to its staff.
These courses are pretty rare in the UK, and I have always managed to be busy when they are being run.
Ralph Kimball’s Data Warehouse Lifecycle Toolkit is also there, as I believe these two books are a very good start for anyone interested in business intelligence, partly because of the sound theories and approaches and partly because they have been written on the back of years of practical experience.
Following on from yesterday’s post, I wanted to encourage the DBAs among you to demonstrate your influencing skills by persuading your managers to support your careers and send you to TechEd EMEA for IT Professionals, 3-7 November 2008 in Barcelona.
If you haven’t been before, it’s like an intensive summer school for IT professionals. You get to learn from the experts in formal sessions, hands-on labs, instructor-led labs and, my favourite, the chalk talks. Networking is also a big part of the week, with Microsoft staff, industry experts, other vendors and 5,000 like-minded delegates.
Even if you’re deep into your subject (and I am assuming that this is SQL Server, as you are reading my blog), then you would probably benefit from pitching up to sessions on Security, SharePoint or Windows Server 2008 and Hyper-V virtualisation.
So how do you convince your manager to let you go, as it’s a week out of the office and you need to get there, eat and pay for the conference?
If you can’t persuade them, then consider an influencing skills course before the next TechEd comes round, or maybe it’s time to review your future with a company that can’t afford to reward its most valuable member of staff, the DBA!
One of the things that isn’t really clear from all the blurb about SQL Server 2008 is how the various new features affect each other and what’s already in SQL Server. So I wasn’t surprised to see questions being asked about Change Data Capture (CDC) and its effect on the log, and specifically on log shipping.
CDC is in some ways similar to transactional replication: both rely on the transaction log to work out what’s been happening to the database, and both use the same server entry points (sp_replcmds and sp_repldone) to advance the truncation point in the log.
In this case the truncation point is no longer the last committed transaction, but the last committed transaction that has been harvested by these features; anything not yet harvested remains in the active portion of the log.
Back to my original question: does this mean you can’t use log shipping with CDC? No, you can use them together, and log shipping itself is improved in SQL Server 2008, as the log shipping traffic is now compressed.
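For anyone who wants to try the combination, enabling CDC is just a couple of system procedure calls (the database, schema and table names here are placeholders):

```sql
USE mydb;

-- enable CDC at the database level
EXEC sys.sp_cdc_enable_db;

-- then track an individual table
EXEC sys.sp_cdc_enable_table
    @source_schema = N'myschema',
    @source_name   = N'mytable',
    @role_name     = NULL;   -- NULL means no gating role is required
```

Once enabled, the capture job harvests changes from the log, which is exactly why the truncation point behaviour described above comes into play.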