Today I'm here in Washington DC, demo-ing in Bob Muglia's keynote at our Worldwide Partner Conference. This time, I'm showing something quite new - not just a new product, but a new concept in data - Microsoft's premium information marketplace codenamed "Dallas."
"Dallas" may be better known as a village of 200 or so people in northern Scotland, but now it has a new claim to fame. "Dallas" is a place where developers on any platform, as well as information workers, can find data from commercial content providers and from the public domain. They can consume this data and construct new BI scenarios and new apps.
To illustrate how easy it will be for ISVs to integrate this content into their apps, and to showcase a BI scenario, in the keynote demo I'll be using United Nations data from "Dallas" integrated into an early build of Tableau Software's solution to create a visualization exploring study-abroad trends around the world. This took only a few minutes to create using "Dallas" and Tableau Public.
To break down the demo, here are the three key pillars behind this scenario:
Discover Data of All Types: Microsoft Codename "Dallas" makes data very easy to find and consume. The vision is to be able to post and access any data set, including rich metadata, using a common format.
Explore and Consume: Using Tableau and "Dallas" together means you can explore any data set simply by dragging and dropping fields to visualize it. This is a very powerful idea: anyone can easily explore and understand data without doing any programming.
Publish and Share: Once you've found a story in your data, you'll want to share it. Using Tableau Public you can embed a live visualization in your blog, just like the one above.
"Dallas" creates a lot of opportunities for a company like Tableau. It makes it possible for bloggers, interested citizens and journalists to more easily find public data and tell important stories, creating a true information democracy. "Dallas" also makes it easier for Tableau's corporate customers to find relevant data to mash up with their own company data, making Tableau's corporate tools that much more compelling.
For more information on how you can be a provider or how an ISV can plug into the Dallas partnership opportunities, send mail to DallasBD@Microsoft.com.
The SQL Server Best Practices Analyzer (BPA) came out for SQL Server 2008 R2 recently, and I’ve been asked what the difference is between the BPA and Policy Based Management (PBM) that was introduced in SQL Server 2008.
While it’s true both of these tools can do similar things, each has strengths and weaknesses. The Best Practices Analyzer has a long history, with various “rules” that compare settings on a server and provide guidance through some very nice reports. Many of these rules became Policies in SQL Server 2008. The BPA requires a separate install; PBM is installed with SQL Server 2008, though the reports are something you would have to create yourself. PBM can be run on a schedule, from a SQL Server Agent job step, or inside PowerShell; BPA doesn’t do that out of the box. PBM also has a “SQL” task where you can define whatever check you would like, and BPA doesn’t have that capability in exactly the same way.
Probably the biggest difference between the two tools, however, is that PBM can be set (under certain circumstances) to prevent an action from being taken. For instance, you can actually stop a developer from naming a database object in a certain way. Again, there are restrictions on this feature, but you can use it from time to time.
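To make that idea concrete, here is a minimal sketch of the kind of naming rule a policy can express, such as blocking the "sp_" prefix that is reserved by convention for system stored procedures. This is Python purely as an illustration of the concept; real PBM conditions are defined in SSMS using facets and conditions, not code like this.

```python
import re

# Hypothetical illustration of a naming-convention condition: object names
# must be valid identifiers and must not start with the reserved "sp_" prefix.
NAME_RULE = re.compile(r"^(?!sp_)[A-Za-z_][A-Za-z0-9_]*$")

def is_valid_proc_name(name: str) -> bool:
    """Return True if the proposed stored-procedure name passes the rule."""
    return bool(NAME_RULE.match(name))

print(is_valid_proc_name("usp_GetOrders"))  # True  - allowed
print(is_valid_proc_name("sp_GetOrders"))   # False - reserved prefix, blocked
```

In PBM, a condition like this can be evaluated on demand, on a schedule, or (for some facets) enforced "on change: prevent," which is the behavior described above.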
So which is better? Neither! Both have their uses, and in fact I use them both. One of the greatest strengths of Microsoft products is that you can usually do the same task in multiple ways. Of course, it’s one of our great weaknesses as well!
So as usual, the answer is “it depends”. You should learn more about both, and figure out what works best for you.
Most of us have some help we can provide back to the community. Even if you're new, you can write down some of the things you've learned. And there are several easy ways to do that - you can certainly jump in on the forums (http://social.msdn.microsoft.com/Forums/en-US/category/sqlserver/) and answer any questions you know the answer to, and you can even participate by helping to write a form of documentation - did you know we run a Wiki, where you can edit the documentation? Check it out here: http://social.technet.microsoft.com/wiki/
So take some time, peruse those resources, and write something up. We learn best when we teach others, so don't think that you can't give back - you can!
At this point there should be no question that Microsoft is fully embracing the cloud. Windows Azure and SQL Azure launched earlier this year and have been receiving positive feedback. I’ve been happily using SQL Azure for several months. One of the great aspects of SQL Azure is the integration with the standard tooling, e.g. SQL Server Management Studio (SSMS). This is sort of like how I can connect Outlook to my personal e-mail account with my service provider.
However, I’m not always on a machine with Outlook or with Outlook configured for my mail account. In these cases I opt for the browser email client provided by my service provider. Up until last week the only option provided by Microsoft for using SQL Azure was SSMS. If you didn’t already have SSMS you could download and install it, which isn’t always a practical solution. You’ll notice I said up until last week. Last week we announced a Community Technology Preview (CTP) for Project Houston. In short, Houston is a Silverlight client for managing your SQL Azure database, developed by none other than my team.
Mary-Jo Foley recently wrote about Project Houston: http://www.zdnet.com/blog/microsoft/microsoft-delivers-test-build-of-tool-for-cloud-database-development/6910
In addition, a simple Twitter search on the tag #SQLHouston will yield a plethora of results. We’re also tweeting about it under the user name @SQLHouston.
You can access the CTP of Project Houston at https://manage.sqlazurelabs.com/ and be sure to send us your feedback and suggestions. You can learn more about Project Houston and how to send us your feedback here: http://blogs.msdn.com/b/sqlazure/archive/2010/07/26/10042571.aspx
I know, SQL Server 2008 R2 was released some time back, but with so much content out there I thought I might simplify it a bit. I've written a complete overview that you can read through in about 10 minutes, and there's a great new whitepaper on one of the main features, PowerPivot. Thought I'd share here:
My article: http://www.informit.com/guides/content.aspx?g=sqlserver&seqNum=359
PowerPivot Whitepaper: http://whitepapers.zdnet.com/abstract.aspx?docid=1911939&promo=100303
As Mary Jo Foley recently reported, migrating to SQL Azure just got a little easier with the latest version of the SQL Server Migration Assistant (SSMA), released on August 12. Included in the update are the typical cast of source databases: Oracle, Sybase, and Access. But there are two new things. First, MySQL was added as a source database. And second, SQL Azure was added as a target. If you have an existing MySQL database there’s no better way to get started with SQL Azure than migrating your existing db. The process is fast and painless. You can read more about this release of SSMA on the SSMA blog.
With SQL Azure, you're no longer in charge of the Instance of SQL Server that you're running on. You're dropped into your environment at a database level. Also, the size of those databases is more restricted than you might be used to for your on-premise system (keep in mind that SQL Azure has a set of use-cases, and those aren't always the same as the on-premise installation of SQL Server or other RDBMS systems). For these two reasons, you want to start "thinking at the database level". In one case, this means a shift in thinking at a lower level, and in another, at a higher level.
First, because you're at the database level when you enter the system, you don't need to control the Instance-level settings, or work with security and so on at the higher level. That means that you should focus "down-level" to the database settings you do control.
Second, because of those size limits you need to think differently about the strategies you use for dealing with "big" data - in this case, as of this writing, the 1-50GB databases you can create on SQL Azure. In SQL Server on-premise installations, you can "partition" large sets of data by breaking them out using a Partition Scheme and a Partition Function - more on that here, with a great explanation of previous versions of SQL Server Partitioning. You also have access to FileGroups, which point to files that can be placed on different physical devices. In SQL Azure, you can think about the database as a container like the tables are in on-premise systems, in effect, "thinking up" to the database level.
This actually has some advantages - by placing data sets that you might partition by date, customer ID and so on into different databases, each database runs on a different logical system, gaining you CPU, memory and so on. My friend Wayne Berry has an excellent series of articles dealing with this starting here. He develops a strategy on Partitioning in SQL Azure in that series.
The point is that using SQL Azure requires understanding the way it holds and processes data. While it's ideally suited to data in the sub-50GB range, that doesn't mean that you don't have options to work with larger sets.
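If it helps to see "thinking up" to the database level in code, here is a minimal sketch of how an application might route a row to one of several SQL Azure databases partitioned by customer ID range - the same job a Partition Function does inside a single on-premise database, lifted up one level. The database names and boundary values are hypothetical, and this is not an official API.

```python
from bisect import bisect_right

# Hypothetical range partitioning across databases: customer IDs below each
# boundary (exclusive) land in the corresponding database.
SHARD_BOUNDARIES = [10_000, 20_000, 30_000]
SHARD_DATABASES = ["Sales_0", "Sales_1", "Sales_2", "Sales_3"]

def database_for_customer(customer_id: int) -> str:
    """Pick the database holding this customer's range of the data."""
    return SHARD_DATABASES[bisect_right(SHARD_BOUNDARIES, customer_id)]

print(database_for_customer(5_000))    # Sales_0
print(database_for_customer(25_000))   # Sales_2
```

The application's connection logic would then open a connection to the database this function names, instead of relying on a single Instance-level Partition Scheme.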
Since my team started work on Project Houston, hardly a day goes by that I’m not in some discussion about the Cloud. It also hits me when I go through my RSS feeds each day: cloud this and cloud that. There is obviously a significant new trend here; it may even qualify as a wave, as in the Client/Server Wave or the Internet Wave or the SOA Wave, to name a few – ok, to name all the ones I can name. Almost 100% of what I read is focused on how the Cloud Wave impacts IT and the “business” in general. Don’t get me wrong, I completely agree it’s real and it’s going to have a profound impact. I think this posting by my friend Buck Woody (and fellow BIEB blogger) sums it up pretty succinctly. The primary point being it doesn’t matter what size your business is today or tomorrow, the Cloud will impact you in a very meaningful way.
What I don’t see much discussion about is how the Cloud Wave (is it just me or does that sound like a break dancing move?) changes the way software vendors (ISVs) develop software. In addition to Project Houston, my team is responsible for keeping SQL Server Management Studio (SSMS) up to date with the SQL Azure feature set. If we step back for a second and look at what we have, we have a product that runs as a service on Windows Azure (running in multiple data centers across the globe) and the traditional thick client application that is downloaded, installed, and run locally. These are very different beasts, but they are both challenged with keeping pace with SQL Azure.
SSMS has traditionally been on the release rhythm of the boxed product. This means a release every three years. The engineering system we use to develop software is finely tuned to the three year release cycle. The way we package up and distribute the software is also tuned to the three year release cycle. It’s a pretty good machine and by and large it works. When I went to the team who manages our release cycle and explained to them that I needed to release SSMS more frequently, as in at least once a quarter if not more often, they didn’t know what to say. This isn’t to say these aren’t smart people, they are. But they had never thought about how to adjust the engineering system to release a capability like SSMS more often than the boxed product, let alone every quarter. I hate to admit it but it took a couple of months of discussion just to figure how we could do this. It challenged almost every assumption made about how we develop, package and release software. But the team came through and now we’re able to release SSMS almost any time we want. There are still challenges but at least we have the backing of the engineering system. I’m pretty confident we would have eventually arrived at this solution even without SQL Azure. But given the rapid pace of innovation in the Cloud we were forced to arrive at it sooner.
Project Houston is an entirely different story. There is no download for Project Houston; it only runs in the Cloud. The SQL Azure team runs a very different engineering system (although it is a derivation) than what we run for the boxed product. It’s still pretty young and not as tuned, but it’s tailored to suit the needs of a service offering. When we first started Project Houston we tried to use our current engineering system. During development it worked pretty well. However, when we got to our first deployment it was a complete mess. We had no idea what we were doing. We worked with an Azure deployment team and we spoke completely different languages. It took a few months of constant discussion and troubleshooting to figure out what we were doing wrong and how we needed to operate to be successful. Today we snap more closely with the SQL Azure engineering system and we leverage release resources on their side to bridge the gap between our dev team and the SQL Azure operations team. It used to take us weeks to get a deployment completed. Now we can do it, should we have to, in a matter of hours. That’s a huge accomplishment by my dev team, the Azure ops team, and our release managers.
There’s another aspect to this as well. Releasing a product that runs as a service introduces an additional set of requirements. One in particular completely blindsided us. Sure, when I tell you it’ll be obvious, but it caught my team completely off guard. As we get close to releasing any software (pre-release or GA) we do a formal security review. We have a dedicated team that leads this. It’s a very thorough investigation of the design in an attempt to identify problems. And it works – let me just leave it at that. In the case of Project Houston we hit a situation no one anticipated. The SQL Azure gateway has built-in functionality to guard against DOS (Denial of Service) attacks. Project Houston runs on the opposite side of the gateway from SQL Azure, on the Windows Azure platform. Since Project Houston handles multiple users connecting to different servers and databases, there’s an opportunity for a DOS. During the security review the security team asked how we were guarding against a DOS. As you can imagine, our response was a blank stare, and the words D-O-what were uttered a few times.
We had been heads down for 10 months with never a mention of handling a DOS. We were getting really close to releasing the first CTP. We could barely spell D-O-S, much less design and implement a solution in a matter of a few weeks. The team jumped on it, calling on experts from across the company. We reviewed 5 or 6 different designs, each with its own set of pros and cons. The team finally landed on a design and got it implemented. We did miss the original CTP date, but not by much.
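For readers curious what a guard like this can look like, here is a minimal sketch of one classic approach: a per-user token-bucket rate limiter, which allows short bursts but refuses sustained floods of requests. This is an illustration of the general technique only, not the design the Houston team actually shipped.

```python
import time

class TokenBucket:
    """Allow bursts up to `capacity` requests, refilling at a steady rate."""

    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity
        self.tokens = float(capacity)      # start with a full bucket
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Consume one token if available; refuse the request otherwise."""
        now = time.monotonic()
        # Refill tokens for the time elapsed, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A burst of 5 rapid requests against a bucket of capacity 3:
bucket = TokenBucket(capacity=3, refill_per_sec=1.0)
results = [bucket.allow() for _ in range(5)]
print(results)  # the burst beyond capacity is refused
```

In a multi-tenant service, a gateway would keep one bucket per user (or per source address) and reject or delay requests when a bucket runs dry.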
You’re probably an IT person wondering why this is relevant to you. The point in all this is simple: when you’re dealing with a vendor who claims their product is optimized for the Cloud or designed for the Cloud, do they really know what they’re talking about, or did the marketing team simply change the product name and redesign the logo and graphics to make it appear Cloud enabled? Moving from traditional boxed software to the Cloud is easy. Doing it right is hard – I know, I’m living through it every day.
I've mentioned elsewhere that you should be thinking about a "data roadmap". A way of thinking about how you intend to leverage all of that data you're collecting.
And in that post I mentioned that Microsoft has a "roadmap" for SQL Server. We think about how to evolve the product all the time, with one overriding goal - to be the "Information Platform" for your organization. It's more than just storing and securing your data - it's about giving you tools that let you access that data from anywhere, quickly and safely. In SQL Server 2008 R2 you'll see big investments in this kind of thinking, from Master Data Services to StreamInsight and PowerPivot and even interaction with SQL Azure. We're giving you new ways to do the basic inputs and outputs, but we're also thinking beyond those operations to new ways of allowing you to provide that data to your organization. Bloomberg has a review on some of these features that you can check out here.
So make your data roadmap with our product roadmap in mind - it's not just data, it's information.
Mary-Jo Foley recently wrote a blog post about being better together. At Microsoft this is something we take seriously. In her blog post she includes some snippets of an interview with Donald Farmer. Donald discussed how we exchanged some team members with the Excel team in order to have a great integration story. Anyone in management knows how important people are and will understand that moving people between teams is serious business.
So why do we do this? We do it because it creates the greatest value for our customers. Creating seamless experiences between products – built on open APIs – ensures customers get the greatest benefit out of their investments. It also means that end users (information workers as we like to call them) end up with the most natural level of interaction with the products without having to think much about it.
I’ll offer one proof point: I was recently talking with a friend of mine who’s at a large packaged food company. Mind you, he’s in sales, not in IT, so he offers more of a business rather than technology perspective. They just started a pilot of SharePoint, Excel and PowerPivot. He couldn’t agree more with the value of better together. They initially started out looking at non-Microsoft products and quickly realized the cost of employees learning another tool which didn’t naturally integrate into their daily routine or with other tools would be too disruptive and costly to the business. This really hit home for me that the price on the box is only a small part of the overall equation. The productivity of your employees is far more important.
I hope you’re able to join me next week at TechEd 2010 U.S. in New Orleans, Louisiana. I love taking this time each year to learn in an environment where I’m “out of my element”. What I mean by that is I have not only SQL Server knowledge all around me, but also knowledge for the operating system, Office Automation products, SharePoint, and one of my new/old favorites, High-Performance Computing. It's a time where I can focus for an entire week on "recharging" my knowledge batteries.
If you are coming (and I really hope you can), make sure you get the most out of the visit. Check out this link: http://blogs.msdn.com/b/cdndevs/archive/2010/05/31/a-few-tips-amp-tricks-on-attending-microsoft-teched-2010.aspx to learn a lot more about how you can prepare for a TechEd Conference.
We’re all facing budget pressures, so you have to figure out whether you should go to a conference right now. To help you make an informed decision, check out this link: http://northamerica.msteched.com/default.aspx?fbid=TJgYwH0BwHo
And if you are coming down south with me, stop by and say hello – I’m doing a presentation or two and also working at the booths, including our Surface Computing System demo area.
Well, just look again. $250 million is a helluva partnership investment, even for giants in the industry. It's always the case that press releases tend to be somewhat "glossy" - frankly, they all sound the same to me - so again, it's easy to overlook the details. Better than the press release, look here, to some of the applications that are already available, some of the case studies that already prove the value of the collaboration, and some of the details of hardware, software and services that are being integrated: http://bit.ly/5VTD4k
From a purely personal viewpoint, I have to say that the people I work with in Microsoft and HP are genuinely excited about the extended partnership. My friend at HP, John Santaferraro, has been tweeting like crazy! For us it's not just marketing, and that's particularly true in the database and business intelligence fields. SQL Server is the most rapidly growing database platform, our BI is making strides in the market, and our partners in HP have also made some very compelling business intelligence investments.
So what is it, that we are offering?
For one thing, pre-configured, packaged solutions for OLTP, BI and DW workloads for different sizes of business. For SQL Server, this is great. Today, our customers see the cost-effectiveness, ease-of-use and power of the SQL Server and Microsoft BI platforms, but they often see only that. We are after all a platform company (having, I suspect, more platforms than the Jackson Five.) The new solutions build on the platform with HP's services and hardware experience which are, of course, first class. If this were just a case of packaging up a services, software and hardware offering to make it easy to market, I think I would be unimpressed myself, but there is more to it. We are developing tools and services specifically for these offerings - from Microsoft, tools to make virtualization easier; from HP, customized BI professional services in information governance, master data management and so on. These professional services from HP are critical - they have over 11,000 certified Microsoft Professionals, so building out a portfolio of service offerings specifically for them greatly increases our clout.
For the mid-market, I can see some of our other partners being concerned at first reading of the announcement. However, I see good news for them, too. Naturally, the noise has been about the big things HP and Microsoft can do together, even if somewhat exclusively, but the channel remains very important to us. ("Super-important" as Microsoft execs are wont to say.) After all, we have 32,000 HP and Microsoft channel partners.
They will see much larger investment in our marketing programs - something like ten times the current spend, I believe. The investment will go into bundled software and hardware packages that should reduce sales cycles, new financing options to make integrated solutions easier to acquire, and - hugely important and hopefully worthy of a cheer - integrated support from dedicated field engineers.
I hope this gives some impression of why we are excited by the announcement. Of course, the proof of the pudding will be in the eating, and we'll be watching over the next year and more for the restaurant reviews of this particular dessert to come rolling in. I'm looking forward to them.
Meanwhile, it's time for me to get out of my hotel room, away from this laptop, and out into the Arizona sun. Even after so many visits, I'm not going to take it for granted. Look at the HP / MSFT announcement in that light.
One evening last week I was hanging out with a friend who is a professional photographer. As so often happens on such occasions, we whiled away some time comparing new toys, for we both had new cameras. Mine is small and perfectly formed (an Olympus, if you must know) and he had a high-end Nikon of such weight that I suspect it is mostly recommended by chiropractors looking for new work. However, my friend always carries a small point-and-shoot in his pocket, because, as he always reminds me: "The best camera is the one you have in your hand." It's no use having a great camera at home, if it's not with you when an opportunity arises; and, when the opportunity does arise, the camera to hand is indeed best.
Last week I also had six separate customer briefings in the Executive Briefing Center at Redmond. Now that the SQL Server and Office teams have just released their November CTPs, these were great opportunities to advise customers on what is coming in our next release and how to prepare for it. PowerPivot is by far and away the most popular feature, but I also had some surprising discussions around Master Data Services, our first foray into Master Data Management.
What surprised me, was that two of my customers, independently, said "We have needed Master Data for a while, but we could not find tools that we like. We'll certainly wait for Microsoft's solution."
Now, I'm flattered that they want to see Microsoft's offering, but really, if you have problems with master data you need to be looking at a solution, tools or no tools. (If you're new to the concept of Master Data Management or MDM, William McKnight has a couple of great articles, here and here.)
Fortunately, for these particular customers, even they can get started with MDM with the November CTP. For all SQL Server Enterprise Edition customers, Microsoft's MDS will be the tool to hand for Master Data, and therefore, as the photographers would say, the best tool for the job. Indeed, MDS is quite a comprehensive toolset, featuring: a master data hub based on the SQL Server relational engine; a thin-client stewardship portal for managing master data entities, and all their related hierarchies and versioning requirements; workflow integration and extensible business rules; and role-based security.
During the briefings, we all agreed that we would start to review the tools technically, and to review the customers' systems and governance needs, as a matter of urgency.
To understand just how urgent the need is, I must return to my photographer friend. After comparing notes on cameras, our conversation turned to his finances. In particular, he was fuming about confused, duplicate and sometimes outdated information he was getting from a service provider following a merger: a classic master data problem. I am sure you have guessed already: he is an unhappy customer of one of my customers: and I know the advice he would give them. "The best tool for the job, is the tool you have. Just get on with it!"
It's getting close to that time of year when you're going to start seeing lots of "the year in review" specials on television. I started in my new role working with our customers last December, so it seems only fitting that I take a moment and go over some of the highlights in the SQL Server community in the last year - what I've seen, what I've learned, and what has hit the headlines.

I have a wonderful vantage point, working with our partners, our clients, and with the SQL Server team here in Redmond. I've traveled to several states, participated in lots of user groups, presentations and conferences, and I've learned a lot about how people use SQL Server in their organizations and what we've done to make that a better experience.

Most companies started the year with a big emphasis on cost-saving and getting the most value out of SQL Server. I've helped lots of organizations figure out how they can migrate applications to SQL Server, and how to consolidate those servers onto fewer Instances - saving on hardware and software costs. This is a two-edged sword - you have to carefully plan these migrations and consolidations out, and understanding the right process to use (database stacking, Instance stacking and Virtualization) is vital to keeping the organization happy. Microsoft announced they would support using SQL Server in a Virtualized environment, and also began work on SQL Server 2008 R2 - which has even more options for consolidation.
And some organizations wanted to have even more flexibility, so 2009 saw the release of SQL Azure, the "database in the cloud". Each month I've seen more and more chatter on this offering, from small organizations that don't want to manage a server all the way up to huge companies that want the flexibility to rapidly create, deploy and manage their databases. Far from removing the need for a DBA, the data professionals are finding that their role is to help with their organization's data strategy, explaining how and when to use these kinds of offerings to reach the business goal.
This year has also been called "the year of the community", with the SQL Saturday movement becoming wildly popular, as well as an amazing turnout at the PASS conference. Almost 40% of the attendees to PASS this year were first-timers - and from the comments I heard, it won't be their last time either. At PASS the SQL Server Most Valuable Professionals (MVPs) wrote a book (which I'm still reading) called "Deep Dives" - with all of the proceeds donated to War Child, an international charity. They literally took Bill Gates at his word when he said to "give back". Amazing.
Along with consolidation, many data professionals are focusing on performance tuning. They need to get the most out of the systems they already have. I predict that the consolidation efforts will continue, as well as the emphasis on perf tuning. I've taught several performance tuning seminars this year, and I've been asked to do several more next year as well.
So where will 2010 take us? Well, a new release of SQL Server, Visual Studio, new modeling languages, developer tools and administration needs. Look for a bigger emphasis on PowerShell - it allows you to manage almost any Microsoft product, and talks equally well to other platforms and database systems. I also think that you'll see a pent-up demand for new projects as inventories run low and companies ramp up to meet demand. So buckle in. It's going to be a busy time.
My father was an engineer and spent many weekends in the garage nursing his various old cars. Frankly, I was little help, but there was one job I loved: lifting the big old monsters onto the axle-stands with the hydraulic jack. At 10 years old, I felt like the young superman, raising a hefty Rover P5 with one hand. It was magical. Years later, when I understood some of the mechanics and hydraulics, I could still smile to myself, lifting up my own cars to work on them.

In my teens, our family invested in our first computer: as for so many at the time, a Sinclair. Within a few hours I had worked out the basics of Basic, and programmed the Sieve of Eratosthenes. And then - that same magical feeling! - I calculated all the primes to 1 million in seconds. Then on to 10 million, then more and more.

My intellectual life was never the same again. Researching in archaeology, our databases enabled us to analyse the details of hundreds of land transactions, discovering patterns of social change in late-mediaeval Scotland. Working on fish-farms, our software applications tracked and projected growth of salmon and trout, handling dozens of complex variables.

Yet, despite my fun with the hydraulic jack, I never did become a mechanic, and even now, working at Microsoft, I'm not really "into" computers. For example, I could not tell you the model, or processor or graphics card of my laptop or desktop, and I only know the RAM capacity because recently I have had to refer to it. It was never the hydraulic jack which delighted me, or the home computer. It was always what these technologies enabled: on one hand mechanical advantage, on the other, intellectual leverage. In both cases, they extended my abilities to match my imagination, and further yet.

Today, I work in the SQL Server Business Intelligence team, and we have a simple mission: to enable everyone to make better business decisions, informed by the right data, in the right form, when they need it.
A simple mission to claim, perhaps, but complex to put into execution. And right there is the secret as to why I enjoy working in this space so much: I want the users of our Business Intelligence software to experience that same moment of insight. "I can do this! With a scale and a power and speed that takes me beyond what I thought was possible, I can do this; and, by doing this, I can profoundly change my business."

In other words, I want to see our users enjoy that same intellectual leverage. I want the marketing manager to assess campaigns with a more profound insight than before. I want the accounts manager to analyze millions of transactions to check her own hypotheses, not just to rely on high-level reports served up by others. I want the salesperson to make smarter offers informed with the right data when they need it, at the moment of negotiating the sale.

There is another aspect of leverage that matters greatly to me. It is the power of talented and enthusiastic individuals to leaven a whole community of users with new ideas. I speak at many events, all over the world: and, for better or worse, my presentations are evaluated by the audience. I don't worry too much about the scores, unless they suck! Don't get me wrong, it's nice to be a high-scoring speaker, but I am looking for something else. I am deeply interested in the written comments from the audience, more than their scores. Typically people are very kind. Of course, there are some who may not like my style, or feel I am covering old ground, or I am being too technical, or not technical enough. That's fine, and I do listen carefully to criticism.
Nevertheless, what I am looking for is some sign that I really connected with at least some in the audience. It's a good day when someone tells me that what they heard changed their whole view of what is possible, or that they now see more clearly where they can lead their business or their personal practice with analytics.

Those enthusiasts are a joy! They will go back to their offices and try new things, they will tell others, they will spread their enthusiasm: they will leverage a community I could not reach on my own.

It's such a privilege to do this work. There is not one day when I don't feel, at some point, that spellbinding sense of a 10 year old, lifting his father's car with one hand. I do hope, as you read my blog, that I may convey just a little of that, almost magical, surprise.

Donald Farmer
twitter: @donalddotfarmer
Hello, I'm Buck Woody - Microsoft's "Real World DBA". I go by that title not because I'm a DBA here (we do have a lot of those, though) but because for over 25 years I've worked as a Data Professional. I've worked at organizations from NASA facilities to hospitals, and from legal offices to manufacturing firms and software development companies. I've been a DBA, a data developer, and a database consultant on everything from mainframes running COBOL flat-file databases to microcomputers running Oracle, DB/2, Sybase, SQL Server, Postgres and Ingres. I've written a few books on SQL Server, been a SQL Server Most Valuable Professional (MVP) and I've run several user groups over the years.
I joined Microsoft just a few years ago and since that time I've worked on the product team as a Program Manager working on the SQL Server 2008 product, and I now work as a Senior Technical Specialist on SQL Server, helping our clients figure out where the SQL Server product fits in their architecture. I also teach a database design course at the University of Washington, and I still volunteer as a DBA so that I keep my hands in the tools and the trade.
From time to time I'll post information here that is different from my daily blogs and weekly articles - but like those posts I'll always make sure that the information has "real world" value. Functions and features are great, but they have to *do something* meaningful before I get excited about them. When I do, you'll be the first to hear it!
Nineteenth century London was famous, or perhaps notorious, for the jokey catchphrases heard in its streets. They sprang from who-knows-where and spread around the city, amongst urchins and gentry alike, in a matter of hours. Charles MacKay, in his classic Extraordinary Popular Delusions and the Madness of Crowds, gives numerous examples, including, as you may have guessed: What a shocking bad hat! and (a particular favourite of mine) Has your mother sold her mangle? These were the viral memes of the day; a Victorian equivalent of Rick-Rolling or videos like Hamster on a Piano. Naturally enough, they have all but vanished from use, except for one or two: the phrase to flare up, which quickly became a cliché after the Reform Act riots; and, curiously, the cry Tom and Jerry! which may have originated from a line in a play.
A couple of things strike me about this phenomenon: the enthusiasm and speed with which a catchy phrase spread; and the difficulty of predicting, or even of understanding in retrospect, which phrases would persist and why.
I am thinking about this just now, because I see a similar, if less amusing, phenomenon at work in many of my customers' businesses. There's a banking customer who discovered - the hard way - that a summer intern's programming project had become widely relied on in their foreign exchange department. It came to their attention because it was hosted on the desktop machine of an administrative assistant. Whenever she stressed her bandwidth sending a fax or, perhaps more likely, viewing a viral video over the net, the application ground to a halt for its many users, who soon complained to IT about the performance of their application. You can imagine that IT were mightily confused - they did not know the application even existed - until they tracked down the problem. It's a good example of how business solutions can spread virally and persist in your operations when found to be useful. It also shows how difficult it may be for IT to understand where such initiatives might spring from, and which of them are likely to become mission-critical.
This is an important issue for us in SQL Server, as we prepare, with our 2008 R2 release, to give business users ever more analytic power, and with that also, tools for readily sharing their analyses. We call this Self-Service Business Intelligence; but I must qualify that further. We call these techniques and technologies Managed Self-Service Business Intelligence, and the difference is significant.
With a self-service application such as the Project Gemini add-in for Excel, we give business users unprecedented computational power within their familiar tools. With such power, most anyone has the potential to build a compelling, and attractive, BI solution: a solution they will be happy to share, and one that others may well find answers their business needs unequivocally, since it comes directly from another business user. When those others find the solution useful, they will pass that knowledge along too.
How do these practices help the IT department? Are they not an invitation to yet more problems? Not when the full picture is seen.
First of all, self-service business intelligence unburdens IT from responding to numerous ad-hoc requests for reports and analyses. They can manage their resources more effectively, by giving users the means to help themselves with the Gemini add-in for Excel. Secondly, managed self-service does not cut IT out of the loop: it involves them deeply, for IT provision the required services for collaboration, with SharePoint and the Gemini add-in for SharePoint. IT will also still provide much of the data for analysis, especially the authoritative master data (with SQL Server Master Data Services) or the traditionally warehoused historical data of the enterprise at any scale (with Project Madison).
By managing the infrastructure for collaboration, IT have unique oversight of, and insight to, the sharing and spreading of successful solutions. Microsoft will provide the tools for administrators to discover which solutions are flaring up. When IT discover a new application growing to unexpected responsibility beyond its original desktop environment, they can ask, and act on, the equivalent of another viral Victorian catch-phrase: Does your mother know you're out?
Donald Farmer
twitter: @donalddotfarmer
Hello,
I've always found introductions, especially virtual introductions, awkward. But since it's good for you to know who's sitting at the keyboard I'll give it a shot. My name is Dan Jones, and I'm a principal group program manager on the SQL Server product team focusing on SQL Server Manageability. I've been with the SQL Server team for over five years. My team is staffed with program managers (PMs) who drive product and feature planning, design, and implementation. It's a lot of fun in that we get to work with a wonderful cross-section of people: marketing, developers, testers, executives, and, most importantly, customers and partners.
During the development of SQL Server 2008 my team was responsible for two essential manageability features: PowerShell integration and Policy-Based Management (PBM). Both of these are very cool technologies that empower DBAs to be more efficient and productive in their day to day work. If you haven't explored them I highly encourage you to do so. These are some of the technologies that we're using as a foundation for new features and capabilities.
Today my team is wrapping up development on SQL Server 2008 R2 and starting to look ahead to the next major release of SQL Server. I think that's one of the challenging aspects of my job: customers are just becoming acquainted with one release and I'm already off thinking one or two releases ahead. Anyway, in R2 my team is delivering Application and Multi-Server Management (AMM). This encompasses the Data-tier Application (DAC - not to be confused with the Dedicated Administrator Connection) and the SQL Server Control Point. The DAC is about removing friction in the development, deployment and management of the data-tier portion (think database schema) of an application. The Control Point is designed to help DBAs understand the utilization of resources within their SQL Server environment (CPU and disk space) and quickly and easily identify consolidation candidates. These are v1 technology stakes that will grow in capability and scope in subsequent releases. If you haven't pulled down the August CTP of SQL Server 2008 R2, do so and let me know what you think of these new features.
Prior to joining Microsoft I spent a few years working for a couple of different start-ups; one in the software development tools space and another in the IT business intelligence area. Prior to that I spent a little over eight years working in enterprise IT for a Fortune 50 company. There I worked on several different technologies: mainframe, client-server, packaged software (ERP), business intelligence and e-commerce.
My professional passions gravitate toward SQL Server and how to make it the best data platform for our customers; probably a little too motherhood-and-apple-pie, but it'll do. Some of the technologies/concepts that are firmly on my radar these days include PowerShell, declarative management (think PBM), virtualization (Hyper-V), and cloud computing. I enjoy talking with customers and users to hear what's on their mind, understand how they're using the product and how we can make it better for them.
On the personal front, I'm a drummer currently playing in a Journey tribute band. I also love fine wine and food. And I love golf but with three kids I just don't have as much time to enjoy it as I wish. Plus the Pacific Northwest weather isn't always as accommodating as I wish.
I look forward to sharing my perspectives and hearing back from you on what you think and what you want to know more about. Cheers, Dan
In early 2001 I worked @ a start-up company with about 150 people. I call it a start-up because it had no revenue but did have free breakfast, lunch and dinner. @ the company there were five people named Michelle and six named Dan (including yours truly). One day the VP of operations got confused about which Michelle and Dan he was talking about; he followed up with the comment – you can’t swing a dead cat around here without hitting a Dan or a Michelle. Ok, I apologize to all cat lovers – I’m one myself and I for sure don’t condone the swinging of dead animals, or live ones for that matter. But I feel the same way about virtualization and consolidation. Hardly a day goes by when I’m not in a conversation about these topics. It’s important to remember that while virtualization has a coolness factor, I recommend you approach it as an enabler to meeting business objectives and not the silver bullet that’ll solve all your woes. Also, while virtualization and consolidation are often mentioned in the same breath, they’re not the same thing. A couple of guys on my team put together the following graphic to discuss SQL Server consolidation options:
As you move left to right across the picture you move from Higher Isolation (which equates to higher cost) to Higher Density (which equates to lower costs). High isolation refers to resource and security isolation. Obviously you could take this to the extreme and have each application (instance of SQL Server) reside on its own hardware sitting in its own data center residing in its own building on its own power grid. But that’s crazy, right? Higher density means the greatest sharing of resources.
Here’s a brief explanation of each lane:
The main point here is that there is no one-size-fits-all solution or technology, and virtualization is just one of the ways to meet consolidation needs. You will likely employ multiple solutions within your environment, with the key being to choose the right technology to meet the business requirements of the application. As an IT professional it’s your job to understand the technology and how to apply it to solve business problems.
There are several different models out there for approaching consolidation. The high-level steps that resonate with me are:
Finally, I couldn’t end this article without mentioning my favorite virtualization related features in Win7 and Win2K8 R2:
Virtualization and consolidation are concepts that have been around for at least 30 years. With the ever increasing pressure on IT to do more with less they are realities that can no longer be ignored. As I mentioned in my opening blog post, one of the challenging aspects of my job is I’m always thinking one to two releases out and I’m talking these days a lot about consolidation and virtualization… The first of this wave of capabilities sees the light of day with SQL Server 2008 R2 Application and Multi-Server Management.
Cheers,Dan
Bar Bosch, in Palma de Mallorca, is one of my favourite places to hang out. It's hectic and noisy, but still leaves space and time for endless discussions or reflection. It is the perfect place to pretentiously pore over Žižek or Derrida. Yet in the autumn of '98 I found myself sitting there, ordering carajillos and reading up as much as I could about Oracle's Business Intelligence tools.
It was before I joined Microsoft, and I had just started a new job with a BI team focused on Oracle products. I had to get up to speed quickly, but, fortunately, I had some vacation already booked. So, one week after I joined, my wife and son headed to the beaches of Cala Ferrera, and I headed to Bar Bosch with my Oracle books under my arm. It was a long week of reading, and mostly in vain. When we returned home, my CEO had news for me. "All that Oracle stuff," he said. "Forget it. We're focussing 100% on SQL Server now." The rest, as far as my personal story is concerned, is history. I've never looked back.
What happened? Some of the C-level team had attended the SQL Server 7 technical preview at Microsoft's invitation in Redmond. On day one, during a break, they phoned back to our office from the lobby and told the development team to stop Oracle development. They found the Microsoft BI story, told then by Bill Baker and Amir Netz, so compelling that they immediately changed direction. Microsoft presented three irresistible messages: a vision of what BI is about, a commitment to BI customers and partners, and a compelling pricing model.
I could wish all Oracle migrations were so easy. Some are, but I always maintain, as in a recent discussion with Ted Cuzillo, that migrating technologies is easier than migrating people. When one has established a way of working, it's difficult to adapt to a new way, even if it is technically equivalent or superior. It's just easier to carry on with one's current practices. Software manufacturers know this, of course. We all want our products to be, as the marketers say, "sticky" - and all of us in the software business like features and methodologies that build loyalty. That loyalty, once gained, is naturally a most valuable asset.
Yet, recently, Oracle appear to have taken a different approach: one that customers increasingly resent. They are tying companies into the Oracle ecosystem financially, especially by acquiring other strategic suppliers: writing checks, not code, as Larry Ellison himself once said. Their customers are complaining in stark terms, as they find themselves not only tied in, but subject to ever-increasing prices and intransigent demands. Business Week headlined that "Oracle has Customers Over a Barrel", quoting one customer as saying "Once you've made a deal with the devil, it's hard to get away."
This approach from Oracle is surely a deeply alienating move, and it tastes to me of a certain desperation. Personally, I find it baffling. Here is a company with a truly great database product, a broad range of applications and excellent engineers, but they are driving their own customers to despair with licensing practices that, according to some, feel like extortion. So, while there have always been Oracle customers willing to make the effort to change, at Microsoft we are expecting to see that number increase.
Naturally, we're encouraging them, and why not? SQL Server is growing rapidly: over 11% in 2008, according to IDC. In the past, much of our growth came from net new customers: we have always been aware that we win many businesses choosing their first enterprise database or BI implementation. Now, we are seeing a large number of switchers: customers willing to make that migration from another technology. To win more of these customers, we need to show that they need not be tied into the Oracle universe, even when their existing commitments are substantial. This case study of the Turkish appliance manufacturer Arçelik describes them effectively moving a 5TB SAP implementation from Oracle to SQL Server, gaining substantially in performance and lowering costs. We also have a neat little TCO calculator that has proved very popular. To be sure, you don't need an animated tool to find significant TCO savings persuasive, but it's fascinating and fun to play with the various combinations of servers and staff and requirements.
Of course, I would not deny that Oracle are still growing. There's a lot to admire in the efficiency with which they target and execute on acquisitions. Frankly, they sometimes take the rest of us in the industry by surprise; which is partly a sign that they are imaginative and bold, yet partly suggests that they are chasing growth by acquisition at all costs, often seemingly at random. This week they acquired Hyperroll. That gives Oracle four OLAP products. It's little surprise that their customers, and from what I hear, their internal teams, are puzzled and often anxious about their direction. On a larger scale, their acquisition of Sun sent many of the same confusing signals.
How does all this affect someone like me, making suggestions and decisions about our own BI futures in Microsoft? In some ways, it affects my work remarkably little. When Oracle buy a company like Hyperion, it may mean that there's a new flag on the island, but it takes a much longer time to build bridges and integration. As a result, we in Microsoft, especially in the product teams (rather than, say, in marketing) may look at the Oracle applications and see relatively few feature-for-feature challenges. In other cases, such as Oracle's own in-database-OLAP features, we largely had to ignore their feature set when considering their future direction, because it seemed so unclear.
Oracle integration remains an important priority for us, for several reasons. For one thing, it helps our customers who are coming off Oracle to do so in a planned, progressive manner. Supporting Oracle systems in our tools - such as Analysis Services, Reporting Services and Integration Services - also helps us to provide BI features to those customers who do find themselves on the wrong side of the devil's bargain and unable to extract themselves from it. For example, we see many customers using Analysis Services, Reporting Services or Integration Services with Oracle systems. In fact, demand for Oracle support was so great that in SQL Server 2008 we introduced an Integration Services high-performance data loader for Oracle. Yes, a "data loader", not an extractor: we released a feature that makes it easier to get data into an Oracle database!
Seeing these customer satisfaction issues at another major and successful vendor, and at the risk of sounding too much like a motivational call, I am convinced that we, in the SQL Server team, must keep to our own core value propositions: a persuasive vision, a commitment to our customers' success, and a compelling pricing model. After all, these are the same key points that were so significant to my old team over 10 years ago at that technical preview in Redmond. For their part, our friends at Redwood Shores appear to be doing their best to send customers our way. Is that what they call synergy? I'll ponder that, although this autumn it will be over a latte in Third Place Books rather than a carajillo at the Bar Bosch.
As you may know we've announced that Microsoft's cloud computing platform, Windows Azure, will have commercial availability as of the first day of the upcoming Professional Developer's Conference in Los Angeles on November 17th, 2009. In addition to data storage via Windows Azure, included in our offering will be SQL Azure. I've been following the product team's progress with SQL Azure for quite some time now, given my long-term interest in, and professional use of, SQL Server.
In fact, just this week the product group announced on the SQL Azure blog that the current build is now feature-complete for PDC09. The product team's most recent blog details features that have been added to the most current CTP, such as the ability to configure firewall (access) rules, support for bulk copy (mostly for initial data load-in), and more. I've been watching and waiting, eager to ask lots of detailed questions of the product group as we start our first phase of commercial availability. Of course, the paramount questions are around the security of your data in our cloud. We have a large number of sessions at the upcoming PDC in Los Angeles, which runs from November 17 to November 19. The announced schedule to date already includes 9 dedicated sessions on SQL Azure. These sessions are being conducted by members of the SQL Azure product group.
If you can't attend the PDC and want to get started learning the capabilities of SQL Azure, then I recommend downloading the October 2009 Windows Azure Platform Training Kit. It includes PowerPoint decks, demos, hands-on labs and more. Of course this kit will be updated to reflect changes in the product as we add features. There will also be a good bit of SQL Azure coverage at the upcoming SQL PASS Summit in Seattle from November 2 to 5. In addition, there will be sessions at TechEd Europe in Berlin from November 9 to 13.
An interesting new development is the recent update to the SQL Azure management portal. The CTP access URL is changing as of PDC (from https://ctpportal.database.windows.net to https://sql.azure.com) and the portal itself has been updated to reflect the newly-added features of SQL Azure.
Due to the level of interest in SQL Azure (including my own interest), I have decided to write a technical book about the topic. Readers of this blog will get to preview partial chapters, as I plan to begin the writing in December. I intend to write about topics that will be of interest to developers, ITPros and architects. These will include development of .NET and non-.NET front-end solutions (i.e. PHP, Java) which use SQL Azure as a partial or entire storage solution, as well as deployment and management considerations such as auditing and synchronization between cloud and local copies of data stores. Of course there will be a strong emphasis on security implementation best practices throughout the book.
I am quite interested in your feedback if you have worked with any version of the SQL Azure CTP (beta). Take a minute to drop me a mail via this blog to tell me what you've liked or not liked about your experience so far.
November is the month of conferences. There are four pretty major conferences going on and we have SQL Server Manageability sessions @ all of them. Here’s the rundown:
SQL PASS
We also have a surprise in store @ SQLPASS during Ted Kummert’s keynote. Also @ PDC there’s a session on SQL Azure futures (“The future lifecycle of database development with SQL Azure”) that covers some of the future technology we’re exploring. If you’re @ PDC this is a must-attend session. There is still space available @ all of these conferences. Unfortunately the only conference I’m going to make is SQLPASS. I was excited to go to SQL Connections but my schedule just won’t allow it. Each conference has a pretty amazing set of speakers with incredible knowledge in the area of managing SQL Server. If your schedule and budget permit you should pick a conference and go. I’m certain you’ll see a positive ROI.
I was really in Vegas, for a couple of days. We stayed downtown on Fremont Street, visiting with Alison's vacationing parents, just before the conference, then shuffled up the strip to ritzier lodgings for the main event. It's not much of a boundary to cross, perhaps a formality in some ways, but nevertheless significant. Fremont Street has its colourful fun, but it isn't the Strip: and The Four Seasons is surely not the Lady Luck.
I have recently been crossing other boundaries too. For one thing, I have a new role within Microsoft. Just like driving up from Las Vegas to the Strip, to the casual observer you may not notice much difference, but difference there is. Here's the change ...
I have always enjoyed being very close with customers, partners and the wider Business Intelligence community. Since I first joined Microsoft, back in 2001, I have been working in engineering teams, striving sometimes to keep up with the deeply technical side, but still trying to keep in touch with what our customers and partners needed in the real world. It has been a great role, but a difficult one to balance. So now, rather than being in a single vertical product team - Analysis Services, or Integration Services, for example - I'll be working in a cross-team role. My focus will be to help Microsoft articulate a vision for business intelligence, and to improve our technical engagement with analysts, partners and other teams. As part of this role, I'll still need to keep technically close to the development teams: still contributing to engineering execution and vision as I can. (I still have some patents up my sleeve!) Nevertheless, the role does change. I'll miss leading my group of Program Managers - but I'll still be working with them daily. Maybe I'll even miss the hassle of integrating trees of code, fighting bug fires, and juggling development and test resources. However, I have much to look forward to, especially as we enter a whole new area with our "managed self-service" approach to business intelligence.
If you want to keep up with my new role, and the various technical and community initiatives in which I'll be involved, please do read my blog here. It's a Microsoft blog, for sure: in the sense that it is focused on our products, and our issues. For example, my next post will cover our new PowerPivot product, explaining some of the thinking behind our direction, and behind the name. You can also read my other blog where I'll be rather more expansive on issues of broader interest in the BI community. And you can follow me on Twitter: donalddotfarmer.
It's going to be fun. For me, better than Vegas.
The SQLPASS Community Summit is one of the most exciting annual conferences for the SQL Server team. Bringing together thousands of professionals to talk about SQL Server is simply amazing. Last year I had the honor of standing on stage with Ted Kummert (Senior Vice President for the Business Platform Division @ Microsoft) during his keynote to demo an early prototype build of Application and Multi-Server Management. Standing in front of 2,500+ SQL Server fans is truly exhilarating.
For the past few months we’ve been working on our content for this year’s conference. The focus on SQL Server 2008 R2 will be intense, to say the least. One of the overriding themes this year is consolidation. In the sessions on consolidation we will be emphasizing the investments we’re making in R2 and Hyper-V. We have another session focusing on developing data-tier applications using Visual Studio 2010.
In addition to the public break out sessions we’re conducting several private sessions. These private sessions are under NDA and include participants from the SQL Server MVP community as well as people representing companies that are part of the SQL Server Customer Advisory Network, or SCAN. Over the past 5 years (my tenure with the SQL Server group) the emphasis on gathering customer feedback early in the development cycle has intensified.
These break out sessions provide a look into features on the plate for the next release. The sessions are a combination of PowerPoint slides and demos of early prototype builds. Participants are asked to fill out surveys to provide their input to our planning process. I can attest that the feedback we receive in these sessions directly impacts the shape of features.
One of the absolute highlights for me, though, is interacting with you. Spending time hearing how you’re using the product, the challenges you face in your business, and how you see the industry changing are invaluable. Even though we may not be talking under NDA about future developments these conversations have a direct impact on the product. All of these data points are fed back into the planning process and impact the multitude of decisions we make throughout the development cycle.
This year I was asked again to participate in Ted’s keynote and I jumped at the opportunity to show off my team’s hard work. I’ll be demoing the latest (yet-to-be-released CTP) build of SQL Server 2008 R2 along with the already released Visual Studio 2010 Beta 2. The demo is absolutely rocking. Ted’s keynote is packed with demos, all of them extremely cool – although I think ours is the coolest of the lot. There’s one demo near the end that’s sure to bring the house down. I won’t give it away so you’ll just have to come to the keynote and see it for yourself. But I’m certain it’ll be picked up by the blogging & twitter communities.
If you see me @ the conference don’t hesitate to come up and say hi and bend my ear about what you do for a living, what you like about the product and what we can do to make you more efficient and effective in your job.