Mary-Jo Foley recently wrote a blog post about being better together. At Microsoft this is something we take seriously. In her post she includes some snippets of an interview with Donald Farmer, who discussed how we exchanged team members with the Excel team in order to build a great integration story. Anyone in management knows how important people are and will understand that moving people between teams is serious business.
So why do we do this? Because it creates the greatest value for our customers. Creating seamless experiences between products – built on open APIs – ensures customers get the greatest benefit from their investments. It also means that end users (information workers, as we like to call them) get the most natural level of interaction with the products without having to think much about it.
I’ll offer one proof point: I was recently talking with a friend of mine who’s at a large packaged-food company. Mind you, he’s in sales, not IT, so he offers more of a business than a technology perspective. They just started a pilot of SharePoint, Excel, and PowerPivot, and he couldn’t agree more with the value of better together. They initially started out looking at non-Microsoft products and quickly realized that the cost of employees learning another tool – one that didn’t naturally integrate into their daily routine or with other tools – would be too disruptive and costly to the business. This really hit home for me: the price on the box is only a small part of the overall equation. The productivity of your employees is far more important.
I mentioned elsewhere that you should be thinking about a "data roadmap" – a way of thinking about
how you intend to leverage all of that data you're collecting.
And in that post I mentioned that
Microsoft has a "roadmap" for SQL Server. We think about how to
evolve the product all the time, with one overriding goal - to be the
"Information Platform" for your organization. It's more than just storing
and securing your data - it's about giving you tools that let you access
that data from anywhere, quickly and safely. In SQL Server 2008 R2 you'll see
big investments in this kind of thinking, from Master Data Services to StreamInsight
and PowerPivot and even interaction with SQL Azure. We're giving you
new ways to do the basic inputs and outputs, but we're also thinking beyond
those operations to new ways of allowing you to provide that data to your users.
You can check out a review of some of these features here.
So make your data roadmap with our
product roadmap in mind - it's not just data, it's information.
"Five" seems to be a big number these days - is it compression to have a "top-5" instead of a "top-ten"? :) Over at Network World they've posted a quick slide deck of "Five things we love about SQL Server 2008 R2". They even include...the Start menu? OK - I guess everyone has different likes and dislikes. I think they're happy that we gave SQL Server 2008 R2 its own menu area, rather than lumping it in with SQL Server 2008 (you're welcome).
I agree with their first four for sure - and I'll throw in a few things that you might not see as often, like being able to connect to SQL Azure from SQL Server Management Studio and increased database sizes in SQL Server Express, just to name a couple. By the way - if you haven't rushed over to check out the resources we have for you to learn the tech for SQL Server 2008 R2, what are you waiting for? They're free! Enjoy.
As announced a few weeks ago on the SQL Azure blog, SQL Azure now supports Data-tier Applications, which were introduced with Visual Studio 2010 and SQL Server 2008 R2. Here’s a quick tour of the feature.
Here’s what I did:
It’s that simple!
So what does this look like? Here are some screen shots:
This is the New Project dialog in Visual Studio 2010. I select the Data-tier Application project.
Here’s a shot of the project after I imported the local database. I could have also created the database schema from scratch.
Here’s a shot of the deployment properties within the Data-tier Application project. The connection string is exactly the same as a local instance except the server name is my SQL Azure server.
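As a sketch, the two connection strings might look like this (the server, database, and credential names here are hypothetical, and note that SQL Azure requires SQL authentication rather than Windows integrated security):

```text
-- Local instance (illustrative names)
Data Source=MYLOCALSERVER;Initial Catalog=Pubs;Integrated Security=True

-- SQL Azure: same shape, but the server name points at your SQL Azure server
-- and you supply a SQL login instead of integrated security
Data Source=myserver.database.windows.net;Initial Catalog=Pubs-Azure;User ID=myuser@myserver;Password={your password}
```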
I select build and deploy. Here are the results: the system built the DACPAC, connected to my SQL Azure instance, and deployed the Data-tier Application.
The new database, Pubs-Azure, appears in my SQL Azure admin dashboard.
Connecting SQL Server 2008 R2 Management Studio to my SQL Azure instance, I see the databases as well as the metadata for the Data-tier Application.
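On an on-premises instance, the Data-tier Application metadata that Management Studio displays is kept in msdb, so you can also inspect registrations with a quick query (a sketch based on the SQL Server 2008 R2 catalog; column names may vary slightly by build):

```sql
-- List Data-tier Applications registered on an on-premises instance
-- (SQL Server 2008 R2 records DAC registrations in msdb)
SELECT instance_name, type_name, type_version, description
FROM msdb.dbo.sysdac_instances;
```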
With Visual Studio 2010, SQL Server 2008 R2 Management Studio, and the latest release of SQL Azure we have a symmetrical experience for creating, registering, and deploying Data-tier Applications. Unfortunately, at this time we don’t support upgrading a Data-tier Application on SQL Azure as we do with an on-premises installation of SQL Server. To move your database to the next version of your application you’d deploy side-by-side and migrate your data using your favorite data migration tool.
Try this out for yourself and I think you’ll find it extremely valuable. In addition, if you want to share your database schema with other people you no longer have to provide a bunch of Transact-SQL scripts; you can give them a DAC Pack file. The file contains the name of the application, the version, and the database schema definition. This will make management of applications, including deployment, much simpler!
Though short, this article in PC World makes an important point: the PowerPivot feature in SQL Server 2008 R2 is incredibly valuable to small businesses. I mentioned in a previous post that in another “life”, in enterprise IT for a Fortune 10 company, I built a “BI” app using a SQL Server backend and pivot tables in Excel. Given all the complexities back then, the end users relied on my expertise to deliver their precious data each month. The operative word there is “enterprise”. Back then this sort of environment was out of reach of most small businesses and required specialized expertise.
But isn’t BI important to small business too? Absolutely! And until now it has, for the most part, been too complex for small businesses. With the introduction of PowerPivot, small businesses have access to an intensely powerful set of capabilities that are also extremely friendly. If you own or work for a small business, I encourage you to check it out for yourself.
Just after a new release of SQL Server, I often get e-mails and calls from folks with this question: “Can I upgrade from Community Technology Preview (CTP) x, Beta x, or Release Candidate (RC) x to the ‘Released to Manufacturing’ (RTM) version?”
Unfortunately, no. Right up until the last minute, things are changing in the code – and you want that to happen. Our internal testing runs right up until the second we lock down for release, and we watch the CTP/RC/Beta reports to make sure there are no show-stoppers, and fix what we find. And it’s not just “big” changes you need to worry about – a simple change in one line of code can have a massive effect.
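If you're not sure whether an instance you've inherited is a pre-release build, a quick check of the server properties will tell you (this query works on SQL Server 2005 and later):

```sql
-- Check whether an instance is a pre-release build before planning a move to RTM
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel;
-- ProductLevel reports the release level, e.g. 'RTM', a service pack such as 'SP1',
-- or a CTP designation on pre-release builds
```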
Even if you've done this before and things seemed to go well, you may be in a difficult situation because of it. I’ve dealt with someone who faced this exact situation in SQL Server 2008. They upgraded from a CTP to the RTM version (which is clearly prohibited in the documentation) over a year ago. Everything was working fine.
But then…one day they had an issue. They couldn’t fix it themselves, we took a look, days went by, and we finally had to call in the big guns for support. It turns out the upgrade was the problem. So we had to come up with some elaborate schemes to get the system migrated over while they were in production. This was painful for everyone involved. So in general it's just not a good idea.
There is one caveat to this story – if you are a “TAP” customer (you’ll know if you are), we help you move from the CTP products to RTM, but that’s a special case that we track carefully and send along special instructions and tools to help you along. That level of effort isn’t possible on a large scale, so it’s not just a magic tool that we run to upgrade from CTP to RTM. So again, unless you’re a TAP customer, it’s a no-no.
This past week we released SQL Server 2008 R2 to manufacturing. This is a huge accomplishment for the team and our customers are anxious to get their hands on it. I came across one blog post that expressed disappointment that the only thing they could download was the evaluation edition – they couldn’t wait to get their hands on a fully licensed edition, which will be available shortly.
Rather than go into a laundry list of what’s in the release here are links to a few of the RTM stories:
Even though I’m a Manageability Guy and there are some terrific manageability features in R2, the most important feature, in my opinion, is PowerPivot. PowerPivot is going to change everything about business intelligence for IT and information workers. Early in my career as an IT Pro I designed a system that used Excel Pivot Tables loaded with massive volumes of sales data. Unfortunately I had to have tens of Pivot Tables spread out across an almost equal number of Excel Workbooks. Since there were so many files and tables, I had to build a monthly process for refreshing the data. Plus, if one of the users wanted a new view of the data I had to craft it by hand for them. It’s an understatement to say this was a pain. If I had PowerPivot back then it would have greatly simplified my life and better supported the needs of my users. As you read up on PowerPivot you’re going to think it’s too good to be true. Take it for a test drive to convince yourself how truly remarkable this technology is and how it’ll transform the way you think about BI.
The reason for the release schedule was to properly align Microsoft's flagship
database product with Microsoft Office, and with Microsoft's "cloud" strategy.
One of the strengths of the SQL Server platform is that it works well with our
other products, and in Microsoft Office 2010 and the latest release of
SharePoint we have included an amazing array of Business Intelligence features for
the "non-IT" worker. This means your business users can get at the data they
need and want, and the IT department can still control and protect the data the
way it should be. It's the best of all worlds.
It doesn't stop there. As you may have heard, Microsoft is "all in", with a
comprehensive cloud strategy. We have not only a complete cloud
development platform (Azure) but also a relational database offering (SQL
Azure) that goes beyond just hosting a SQL Server Instance in a rack somewhere.
SQL Server 2008 R2 allows you to connect to SQL Azure like you're connecting to
a local server. You now have capacity on demand, without losing any of your
local systems or control.
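As an illustration of how familiar the experience is, connecting with the sqlcmd utility looks just like connecting to a local server (the server, login, and database names below are hypothetical; SQL Azure logins use the user@server form):

```shell
# Connect to SQL Azure from the command line, just as you would a local instance
sqlcmd -S myserver.database.windows.net -U myuser@myserver -P {your password} -d mydatabase
```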
And there's more - this release also includes the "Datacenter" edition, with
support for up to 256 logical processors, data and backup compression (from SQL
Server 2008) and the ability to use SQL Server with "Live Migration" - a
virtualization technology that lets you move virtualized servers without
downtime. These features, along with rapid adoption in the most
mission-critical, enterprise-class environments, mean that you should consider
SQL Server as a "Tier 1" application platform.
These are indeed exciting times for the data professional. Make sure you hit these
links to learn more - your organization is counting on you as the data
professional to know what's new and useful in the data world. You can also post
any questions you have on this post - I'll try and make sure someone gets back
SQL Server 2008 R2 Launch Site: http://www.sqlserverlaunch.com/
Microsoft Site for SQL Server 2008 R2: http://www.microsoft.com/sqlserver/2008/en/us/R2.aspx
If you want to learn more about SQL Server 2008 R2 this collection of videos is a terrific resource. There’s one covering an overview of the release and subsequent videos that drill in to specific feature areas. Each video is between 3 and 6 minutes long. The site also contains links to other resources and a listing of SQL Server 2008 R2 events across the globe.
I originally posted this on my blog, but I think this audience might find it interesting. Yesterday I was having a conversation with the User Experience Program Manager on my team regarding two icons, where one is supposed to represent an override of the other. It’s a deceptively simple problem. First, it seems there is no standard glyph for “override”. This means we have to invent something. As we set off doing this we have many things to consider. First, the icons have to be the same but different. Second, the differences must have meaning; we want them easy to understand and remember. For an expert user this is pretty easy. They’ll use the system every day and quickly become comfortable with the icons. However, the novice user (or casual user) could end up getting lost and, worse, could make a bad decision if they don’t understand the differences between the icons. Most of the time it’s not readily apparent to customers or end users how much internal discussion goes into every detail of every feature; you just see the final result. If you could be a fly on the wall for a day, week, or month I think you’d walk away with a new appreciation of how difficult it is to develop great software and how much the SQL Server team loves doing it!
<Begin Original Posting>
In my day to day work I interact mostly with experts in the field of database technology and database administration. Every once in a while I get a gentle reminder that not everyone is an expert. Those reminders often come from forum questions from newbie and intermediate DBAs. When I come across one of these questions I’m thankful for it. It’s a great reminder that there is a spectrum of experience level out there.
Believe me, it’s hard work to make a feature easy for a newbie while at the same time powerful for an expert. One of the tricks we use to accomplish this is the script option on dialogs in SSMS. The dialog supports newbies (either new to SQL Server or new to the feature) whereas the scripting support gives them the option to learn what’s happening behind the scenes and build their expertise.
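For example, filling in the New Database dialog in SSMS and using the Script button produces Transact-SQL along these lines (a simplified sketch; the actual generated script includes many more options, and the database and file names here are hypothetical):

```sql
-- Simplified version of what the New Database dialog's Script button emits
CREATE DATABASE [Sales]
ON PRIMARY
( NAME = N'Sales',     FILENAME = N'C:\Data\Sales.mdf',     SIZE = 100MB )
LOG ON
( NAME = N'Sales_log', FILENAME = N'C:\Data\Sales_log.ldf', SIZE = 10MB );
```

A newbie can accomplish the task through the dialog, then read the generated script to see exactly what the tool did on their behalf.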
As we develop a new feature we’re intrinsically the resident experts. We take for granted certain knowledge of the internals. If we don’t challenge this we can easily end up shipping the wrong experience, which can result in a few bad things, such as delaying the adoption of a new feature. One of the easiest ways to avoid this trap is to conduct usability studies early and often. We recruit DBAs of all experience levels and have them run through various tasks. Sometimes we get it right (when this happens we like to high-five each other and talk about how awesome we are) and other times we have to go back to the drawing board and make corrections to the experience (these are far more somber moments).
As a DBA I’m sure you encounter everyone from newbie to expert users (other DBAs, developers, and end users). Do you change the way you interact with each person based on their experience level? Next time a newbie comes to you with a question, think about how you can help them become an expert.