With SQL Azure, you're no longer in charge of the Instance of SQL Server that you're running on. You're dropped into your environment at a database level. Also, the size of those databases is more restricted than you might be used to for your on-premise system. (Keep in mind that SQL Azure has a set of use-cases, and those aren't always the same as those of an on-premise installation of SQL Server or another RDBMS.) For these two reasons, you want to start "thinking at the database level". In one case, this means a shift in thinking at a lower level, and in another, at a higher level.
First, because you're at the database level when you enter the system, you don't need to control the Instance-level settings, or work with security and so on at the higher level. That means you should focus "down-level" on the database settings you do control.
Second, because of those size limits you need to think differently about the strategies you use for dealing with "big" data - in this case, as of this writing, the 1-50GB databases you can create on SQL Azure. In SQL Server on-premise installations, you can "partition" large sets of data by breaking them out using a Partition Scheme and a Partition Function - more on that here, along with a great explanation of Partitioning in previous versions of SQL Server. You also have access to FileGroups, which point to files that can be placed on different physical devices. In SQL Azure, you can think of the database as a container, the way tables are containers in on-premise systems - in effect, "thinking up" to the database level.
This actually has some advantages - by placing data sets that you might partition by date, customer ID and so on in different databases, you gain the advantage of a different logical system running each database, which gains you CPU, memory and so on. My friend Wayne Berry has an excellent series of articles dealing with this, starting here. He develops a strategy for Partitioning in SQL Azure in that series.
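Since SQL Azure itself doesn't route rows across databases for you, that partition-by-database logic lives in your application. Here's a minimal sketch in Python of what the routing might look like, assuming a hypothetical set of shard databases keyed by customer ID (the database names and the modulo scheme are illustrative, not part of any SQL Azure API):

```python
# Hypothetical shard databases - in SQL Azure these would be separate
# databases, each getting its own slice of CPU and memory.
SHARD_DATABASES = [
    "Sales_Shard_0",
    "Sales_Shard_1",
    "Sales_Shard_2",
    "Sales_Shard_3",
]

def shard_for_customer(customer_id: int) -> str:
    """Pick the database that holds this customer's rows.

    A simple modulo scheme; the application opens its connection to
    whichever database this returns.
    """
    return SHARD_DATABASES[customer_id % len(SHARD_DATABASES)]
```

A lookup- or range-based map is usually preferable to plain modulo in production, since it lets you move one customer's data to a new database later without re-hashing everyone else's.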
The point is that using SQL Azure requires understanding the way it holds and processes data. While it's ideally suited to data in the sub-50GB range, that doesn't mean that you don't have options to work with larger sets.
Since the introduction of SQL Azure, there has been some confusion about how it should be used. Some think that the primary goal for SQL Azure is simply “SQL Server Somewhere Else”. They evaluate a multi-terabyte database in their current environment, along with the maintenance, backups, disaster recovery and more and try to apply those patterns to SQL Azure. SQL Azure, however, has another set of use-cases that are very compelling.
What SQL Azure represents is a change in how you consider your options for a solution. It allows you to think about adding a relational storage engine to an application where the storage needs are below 50 gigabytes (although you could federate databases to get larger than that – stay tuned for a post on that process) that needs to be accessible from web locations, or to be used as a web service accessible from Windows Azure or other cloud provider programs. That’s one solution pattern.
Another pattern is a “start there, come here” solution. In this case, you want to rapidly create and deploy a relational database for an application using someone else’s hardware. SQL Azure lets you spin up an instance that is available worldwide in a matter of minutes with a simple credit-card transaction. Once the application is up, the usage monitoring is quite simple – you get a bill at the end of the month with a list of what you’ve used. From there, you can re-deploy the application locally, based on the usage pattern, to the “right” server. This gives your organization a “tower of Hanoi” approach for systems architecture.
There’s also a “here and there” approach. This means that you can place the initial set of data in SQL Azure, and use Sync Services or other replication mechanisms to roll off a window of data to a local store, using the larger storage capabilities for maintenance, backups, and reporting, while leveraging the web for distribution and access. This protects the local store of data from the web while providing the largest access footprint for the data you want to provide.
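The "here and there" pattern boils down to deciding which rows stay inside the cloud window and which roll off to the local store. As a rough sketch (the field name and window size here are assumptions for illustration), the split might look like this:

```python
from datetime import date, timedelta

def split_by_window(rows, window_days, today):
    """Split rows into (keep_in_cloud, roll_to_local).

    Rows created within the last `window_days` stay in the SQL Azure
    database; older rows are candidates to sync down to the larger
    on-premise store for maintenance, backups and reporting.
    """
    cutoff = today - timedelta(days=window_days)
    keep = [r for r in rows if r["created"] >= cutoff]
    roll = [r for r in rows if r["created"] < cutoff]
    return keep, roll
```

In practice Sync Services (or another replication mechanism) does the actual data movement; the point is only that the window boundary is something you define.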
These are only a few of the options you have – and they are only that, options. SQL Azure isn’t meant to replace a large on-premise solution, and the future for SQL Server installations remains firm. This is another way of providing your organization the application data they need.
There are some valid questions about a “cloud” offering like SQL Azure. Some of these include things like security, performance, disaster recovery and bringing data back in-house should you ever decide to do that. In future posts here I’ll address each of these so that you can feel comfortable in your choice. I’ve found that the more you know about a technical solution, the better your design will be. It’s like cooking with more ingredients rather than the same two or three items you’re used to.
Since my team started work on Project Houston, hardly a day goes by that I’m not in some discussion about the Cloud. It also hits me when I go through my RSS feeds each day: cloud this and cloud that. There is obviously a significant new trend here; it may even qualify as a wave, as in the Client/Server Wave or the Internet Wave or the SOA Wave, to name a few – ok, to name all the ones I can name. Almost 100% of what I read is focused on how the Cloud Wave impacts IT and the “business” in general. Don’t get me wrong, I completely agree it’s real and it’s going to have a profound impact. I think this posting by my friend Buck Woody (and fellow BIEB blogger) sums it up pretty succinctly. The primary point being: it doesn’t matter what size your business is today or tomorrow, the Cloud will impact you in a very meaningful way.
What I don’t see much discussion about is how the Cloud Wave (is it just me or does that sound like a break dancing move?) changes the way software vendors (ISVs) develop software. In addition to Project Houston, my team is responsible for keeping SQL Server Management Studio (SSMS) up to date with the SQL Azure feature set. If we step back for a second and look at what we have, we have a product that runs as a service on Windows Azure (running in multiple data centers across the globe) and the traditional thick client application that is downloaded, installed, and run locally. These are very different beasts, but they are both challenged with keeping pace with SQL Azure.
SSMS has traditionally been on the release rhythm of the boxed product. This means a release every three years. The engineering system we use to develop software is finely tuned to the three year release cycle. The way we package up and distribute the software is also tuned to the three year release cycle. It’s a pretty good machine and by and large it works. When I went to the team who manages our release cycle and explained to them that I needed to release SSMS more frequently, as in at least once a quarter if not more often, they didn’t know what to say. This isn’t to say these aren’t smart people, they are. But they had never thought about how to adjust the engineering system to release a capability like SSMS more often than the boxed product, let alone every quarter. I hate to admit it, but it took a couple of months of discussion just to figure out how we could do this. It challenged almost every assumption made about how we develop, package and release software. But the team came through and now we’re able to release SSMS almost any time we want. There are still challenges, but at least we have the backing of the engineering system. I’m pretty confident we would have eventually arrived at this solution even without SQL Azure. But given the rapid pace of innovation in the Cloud we were forced to arrive at it sooner.
Project Houston is an entirely different story. There is no download for Project Houston; it only runs in the Cloud. The SQL Azure team runs a very different engineering system (although it is a derivation) from what we run for the boxed product. It’s still pretty young and not as tuned, but it’s tailored to suit the needs of a service offering. When we first started Project Houston we tried to use our current engineering system. During development it worked pretty well. However, when we got to our first deployment it was a complete mess. We had no idea what we were doing. We worked with an Azure deployment team and we spoke completely different languages. It took a few months of constant discussion and troubleshooting to figure out what we were doing wrong and how we needed to operate to be successful. Today we snap more closely to the SQL Azure engineering system, and we leverage release resources on their side to bridge the gap between our dev team and the SQL Azure operations team. It used to take us weeks to get a deployment completed. Now we can do it, should we have to, in a matter of hours. That’s a huge accomplishment by my dev team, the Azure ops team, and our release managers.
There’s another aspect to this as well. Releasing a product that runs as a service introduces an additional set of requirements. One in particular completely blindsided us. Sure, once I tell you it’ll seem obvious, but it caught my team completely off guard. As we get close to releasing any software (pre-release or GA) we do a formal security review. We have a dedicated team that leads this. It’s a very thorough investigation of the design in an attempt to identify problems. And it works – let me just leave it at that. In the case of Project Houston we hit a situation no one anticipated. The SQL Azure gateway has built-in functionality to guard against DOS (Denial of Service) attacks. Project Houston, however, runs on the opposite side of the gateway from SQL Azure, on the Windows Azure platform. Since Project Houston handles multiple users connecting to different servers & databases, there’s an opportunity for a DOS. During the security review the security team asked how we were guarding against a DOS. As you can imagine, our response was a blank stare, and the words D-O-what were uttered a few times.
We had been heads down for 10 months with never a mention of handling a DOS. We were getting really close to releasing the first CTP. We could barely spell D-O-S, much less design and implement a solution in a matter of a few weeks. The team jumped on it, calling on experts from across the company. We reviewed 5 or 6 different designs, each with its own set of pros and cons. The team finally landed on a design and got it implemented. We did miss the original CTP date, but not by much.
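I won’t share the design we landed on, but a common building block for this kind of throttling is a per-caller token bucket: each request spends a token, tokens refill at a fixed rate, and a caller with an empty bucket is turned away before the request ever reaches the database tier. A minimal sketch of the general technique (the capacity and rate are made-up numbers, and this is an illustration, not our actual implementation):

```python
class TokenBucket:
    """Per-caller rate limiter.

    Each request costs one token; tokens refill at `rate` per second
    up to `capacity`. When the bucket is empty the request is rejected
    instead of being passed through to the back end.
    """
    def __init__(self, capacity, rate):
        self.capacity = capacity
        self.rate = rate
        self.tokens = capacity
        self.last = 0.0  # timestamp of the last call, in seconds

    def allow(self, now):
        # Refill based on elapsed time, then try to spend one token.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

You’d keep one bucket per user or per source address, and tune the capacity and refill rate against legitimate usage patterns so real customers never notice it.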
You’re probably an IT person wondering why this is relevant to you. The point in all this is simple. When you’re dealing with a vendor who claims their product is optimized for the Cloud or designed for the Cloud, do they really know what they’re talking about, or did the marketing team simply change the product name and redesign the logo and graphics to make it appear Cloud enabled? Moving from traditional boxed software to the Cloud is easy. Doing it right is hard – I know, I’m living through it every day.
I follow the Database Trends and Applications magazine because it is pretty platform independent, with a section for Oracle, SQL Server and DB2. I found it interesting that when I looked there yesterday and today, "Cloud" was among the top stories - front and center. It's easy to see why. When we think about technology today, we've got three big "circles" of components, like a Venn diagram. In one circle is the "Platform", which includes hardware and the software that runs on it. Developers live here, as do System Administrators and even Data Professionals. In another circle is "Connectivity", which includes network and security. In yet another circle is "Data", which includes data professionals of course, developers, SAN admins and Sysadmins, from file shares to SharePoint.
The first circle - Platform - has all of the hardware in it. That's a capital expense that also carries operational expenses, like the people who maintain, manage and monitor it all. Companies really want to tone that down wherever they can. That's where this whole "Cloud" thing comes into play. The devs are still there, writing apps for the organization. The data specialists are still there, managing and designing data. The network will always be there. But hardware and software? Does your CEO really want to pay to have that lying around? And what about capacity growth and shrinkage - not easy to manage when you own everything. So many of them are asking you about the Cloud. And you're probably wondering where it fits in.
Well, Microsoft is certainly one of the main players in this area. With Windows Azure, the Application Fabric and SQL Azure, we have real, live offerings you can leverage right now. But is SQL Azure ready? For many applications, yes. No, I don't recommend that you grab your 1-Terabyte database that has a 500GB ETL process running on it and just toss it unchanged into SQL Azure. But I think there's a place for it - and it keeps growing every day. There's a great article over on Softpedia where you can read more about the changes, improvements and enhancements to SQL Azure that we've just released. But here's a side of that you might not have considered: you don't have to do anything to get those improvements. There's no install. There's no patching. To be sure, there is still testing and so on that you need to do, but it just changes and gets better. Certainly this is one argument for the Cloud. Again, it's another tool in the toolbox - not meant to replace every on-premise environment today, but something you should learn about to see where it does fit.
So check out that article, and post your comments here and there. I'm curious about what you think.
As Mary Jo Foley recently reported, migrating to SQL Azure just got a little easier with the latest version of the SQL Server Migration Assistant (SSMA), released on August 12. Included in the update is the typical cast of source databases: Oracle, Sybase, and Access. But there are two new things. First, MySQL was added as a source database. And second, SQL Azure was added as a target. If you have an existing MySQL database, there’s no better way to get started with SQL Azure than migrating your existing db. The process is fast and painless. You can read more about this release of SSMA on the SSMA blog.
I know, SQL Server 2008 R2 was released some time back, but with so much content out there I thought I might simplify it a bit. I've written a complete overview that you can read through in about 10 minutes, and there's a great new whitepaper on one of the main features, PowerPivot. Thought I'd share here:
My article: http://www.informit.com/guides/content.aspx?g=sqlserver&seqNum=359
PowerPivot Whitepaper: http://whitepapers.zdnet.com/abstract.aspx?docid=1911939&promo=100303
At this point there should be no question that Microsoft is fully embracing the cloud. Windows Azure and SQL Azure launched earlier this year and have been receiving positive feedback. I’ve been happily using SQL Azure for several months. One of the great aspects of SQL Azure is the integration with the standard tooling, e.g. SQL Server Management Studio (SSMS). This is sort of like how I can connect Outlook to my personal e-mail account with my service provider.
However, I’m not always on a machine with Outlook, or with Outlook configured for my mail account. In these cases I opt for the browser email client provided by my service provider. Up until last week, the only option provided by Microsoft for using SQL Azure was SSMS. If you didn’t already have SSMS you could download and install it, which isn’t always a practical solution. You’ll notice I said up until last week. Last week we announced a Community Technology Preview (CTP) for Project Houston. In short, Houston is a Silverlight client for managing your SQL Azure database, developed by none other than my team.
Mary-Jo Foley recently wrote about Project Houston: http://www.zdnet.com/blog/microsoft/microsoft-delivers-test-build-of-tool-for-cloud-database-development/6910
In addition, a simple Twitter search on the tag #SQLHouston will yield a plethora of results. We’re also tweeting about it under the user name @SQLHouston.
You can access the CTP of Project Houston @ https://manage.sqlazurelabs.com/ and be sure to send us your feedback and suggestions. You can learn more about Project Houston and how to send us your feedback here: http://blogs.msdn.com/b/sqlazure/archive/2010/07/26/10042571.aspx
Most of us have some help we can provide back to the community. Even if you're new, you can write down some of the things you've learned. And there are several easy ways to do that - you can certainly jump in on the forums (http://social.msdn.microsoft.com/Forums/en-US/category/sqlserver/) and answer any questions you know the answer to, and you can even participate by helping to write a form of documentation - did you know we run a Wiki, where you can edit the documentation? Check it out here: http://social.technet.microsoft.com/wiki/
So take some time, peruse those resources, and write something up. We learn best when we teach others, so don't think that you can't give back - you can!
The SQL Server Best Practices Analyzer (BPA) came out for SQL Server 2008 R2 recently, and I’ve been asked what the difference is between the BPA and Policy Based Management (PBM) that was introduced in SQL Server 2008.
While it’s true both of these tools can do similar things, each has strengths and weaknesses. The Best Practices Analyzer has a long history, and has various “rules” that compare settings on a server and provide guidance through some very nice reports. Many of these rules became Policies in SQL Server 2008. The BPA requires a separate install, while PBM is installed with SQL Server 2008 - though PBM’s reports are something you would have to create yourself. PBM can be run on a schedule, from a SQL Server Agent Job step or inside PowerShell; BPA doesn’t do that out of the box. PBM also has a “SQL” task where you can define whatever you would like; BPA doesn’t have that capability in exactly the same way.
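To make the policy idea concrete: a PBM condition is essentially a predicate evaluated against an object’s properties. Purely as an illustration (this is application-side Python, not PBM’s actual expression language), a naming policy like “no stored procedures prefixed sp_” boils down to a check like this:

```python
import re

def violates_naming_policy(object_name: str) -> bool:
    """True when a proposed object name breaks the (hypothetical) policy.

    The example rule rejects the "sp_" prefix, which SQL Server
    reserves for system stored procedures.
    """
    return re.match(r"sp_", object_name, re.IGNORECASE) is not None
```

In PBM the equivalent condition would be attached to a policy using the “On change: prevent” evaluation mode, so the offending CREATE statement is rolled back rather than merely reported.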
Probably the biggest difference between the two tools, however, is that PBM can be set (under certain circumstances) to prevent an action from being taken. For instance, you can actually stop a developer from naming a database object in a certain way. Again, there are restrictions on this feature, but you can use it from time to time.
So which is better? Neither! Both have their uses, and in fact I use them both. One of the greatest strengths of Microsoft products is that you can usually do the same task in multiple ways. Of course, it’s one of our great weaknesses as well!
So as usual, the answer is “it depends”. You should learn more about both, and figure out what works best for you.
Today I'm here in Washington DC, demo-ing in Bob Muglia's keynote at our Worldwide Partner Conference. This time, I'm showing something quite new - not just a new product, but a new concept in data - Microsoft's premium information marketplace codenamed "Dallas."
"Dallas" may be better known as a village of 200 or so people in northern Scotland, but now it has a new claim to fame. "Dallas" is a place where developers on any platform as well as information workers can find data from commercial content providers as well as public domain data. They can consume this data and construct new BI scenarios and new apps…
To illustrate how easy it will be for ISVs to integrate this content into their apps, and to showcase a BI scenario, in the keynote demo I'll be using United Nations data from "Dallas" integrated into an early build of Tableau Software's solution to create a visualization exploring study-abroad trends around the world. This took only a few minutes to create using "Dallas" and Tableau Public.
To break down the demo, here are the three key pillars behind this scenario:
Discover Data of All Types: Microsoft Codename "Dallas" makes data very easy to find and consume. The vision is to be able to post and access any data set, including rich metadata, using a common format.
Explore and Consume: Using Tableau and "Dallas" together means you can explore any data set simply by dragging and dropping fields to visualize it. This is a very powerful idea: anyone can easily explore and understand data without doing any programming.
Publish and Share: Once you've found a story in your data, you want to share it... Using Tableau Public you can embed a live visualization in your blog, just like the one above.
"Dallas" creates a lot of opportunities for a company like Tableau. It makes it possible for bloggers, interested citizens and journalists to more easily find public data and tell important stories, creating a true information democracy. "Dallas" also makes it easier for Tableau's corporate customers to find relevant data to mash up with their own company data, making Tableau's corporate tools that much more compelling.
For more information on how you can be a provider or how an ISV can plug into the Dallas partnership opportunities, send mail to DallasBD@Microsoft.com.
I’ve been using SQL Azure for several months now both for messing around (kicking the tires if you will) and for hosting a real database. I have a database with research data running in SQL Azure and I use Excel to connect to the database. I even share this out with others on my team. It works great and I don’t have to worry about back-ups, patching, up-time, etc.
I bet you’re sitting there thinking you should try out this cloud database stuff, but you’re not sure where or how to get started - am I close? I have some good news for you: we recently published a getting started guide for SQL Azure. You can download it here. The guide provides step-by-step instructions, with pictures, for setting up your Windows Azure Platform subscription and configuring your SQL Azure environment.
Be sure to read carefully the section regarding firewall settings. I got hung up on this and it took me a little while to get it properly configured; your internal and external IP Addresses may be different. The “Add Firewall Rule” window will show you your external IP address. This is the one you want to use when enabling an IP range, not your internal IP Address.
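A quick way to understand why the two addresses differ: if the address your machine reports is in one of the private (RFC 1918) ranges, you’re behind NAT, and SQL Azure sees your router’s public address instead - which is why the external address shown in the “Add Firewall Rule” window is the one to use. A small Python check, just for illustration:

```python
import ipaddress

def is_private_address(ip: str) -> bool:
    """True for RFC 1918 / loopback / link-local addresses.

    If your machine's own address tests True here, it is not the
    address SQL Azure sees - use the external address from the
    "Add Firewall Rule" window in your firewall rule instead.
    """
    return ipaddress.ip_address(ip).is_private
```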
The guide also contains links to more information on developing and deploying databases on SQL Azure, managing databases and logins, and a Transact-SQL reference – remember SQL Azure is a subset of the full Transact-SQL query language.
We’re working on a Silverlight-based database management tool for SQL Azure under the code name “Houston”. You can read and see more here. In the meantime you’ll want to download SQL Server Management Studio (SSMS) from here for an interactive experience. For more information on which tools support SQL Azure check out the SQL Azure team’s blog here.
Getting started with SQL Azure is fast and easy. You’ll be up and running faster than you can say “now which folder did I download SQL Server to?”
One of the great things about using SQL Server is the amount of resources you can find for it. One of those resources is the Developer's Training Kit - and there's a new version of it that you can download and learn from. This kit is full of videos, demonstrations, labs and sample code that you can use to get up and running with the latest version of SQL Server.
This isn't just for the developer, however. Whether you're a Database Administrator, Database Developer or Business Intelligence Professional, you'll find something new to learn and try out in this kit. The best part? It's free!
In the middle of May, just about 6 weeks ago, the SQL Server BI team launched PowerPivot. This weekend, we passed an important milestone: 100,000 downloads of the Excel add-in. For a BI application, this really is a huge number. I guess it truly underscores something we have said all along: that PowerPivot is aimed at a much wider market than traditional BI. (As Amir Netz says, those who can spell Excel, but can't spell BI.) I blogged about this very topic recently when I asked Who is PowerPivot For?
For me, it simply reminds me of one of the reasons I love working at Microsoft: we are the only company who can bring analytics of this power to this size of audience. It's not that other companies don't do great work; I love what many of our partners and rivals are able to accomplish. Yet only at Microsoft can we impact so many people. It's a huge privilege, and I know from my own experience that none of us take it lightly. Of course, it is early days yet. There is a lot more to come. Some of you may be wondering whether these 100,000 users already represent the bulk of those who may be interested. I can tell you that the download rate is increasing, and increasing rapidly.
For now, having passed this landmark, I think the team deserves to feel quite satisfied with the early reaction. It promises to be an exciting year!
You've probably seen some press lately about how pervasive Microsoft is in lots of areas (here, here and here). We've also seen SQL Server show up in more and more enterprises - you can read lots of case studies here. There are lots of reasons that you're seeing SQL Server numbers grow - we're less expensive than other offerings, have a pretty good track record and a great toolset to monitor and manage the system. We've also got good support, and a well-established reputation in large enterprises and all the way down to small businesses, but there's one factor that I'm seeing make a difference. It's our community.
I've used lots of software over the years (I'm really old) and I've seen lots of community efforts. Open-Source does a great job of keeping people in contact with each other, as do a few other vendors. But I see a LOT of community for SQL Server - people of all levels of skill, all around the world, in multiple industries helping each other out, freely and willingly. Thomas LaRock (SQL Server MVP) recently blogged about this phenomenon as well.
I think community makes a difference - if we (Microsoft as a company) listen. And I think we do. I know the folks I deal with constantly monitor social networking sites, blogs, press reports and other sources for a mention of their product area. When they see someone struggling with a feature, they really take that to heart. No, we don't base every product decision on a sampling of people talking, but we sure do listen to what they are talking about.
If you're not connecting with the SQL Server community through social networks, blogs, user groups, SQL Saturdays, PASS or any number of other sources, you're missing out. There's lots of folks out there willing to help - and happy to do it.
I monitor a number of different database discussion forums, and you can always find a healthy debate on the best database platform. The problem with this question is that the answer is always: it depends. It depends on the business and technical requirements today and over the next few years. We just published a case study on a company called Arabah. Arabah is the official U.S. distributor of the Jericho by Paloma line of beauty products made from Dead Sea minerals. Arabah started down the path with MySQL, but as they began to compare the business and technical requirements to what MySQL offered, they realized it wasn’t the correct choice. I’m sure you know how the story ends. The moral of this story isn’t that SQL Server 2008 is a better database platform than MySQL (even though it is); it’s that you must first define your requirements before you select your solution. Now, does anyone know what the best car is?
Ann All recently posted an interesting article at IT Business Edge: Microsoft’s Schizophrenic Approach to Business Intelligence. I’m not too insulted: I tend to agree with Janet Long that, “Part of being sane, is being a little bit crazy.”
Ann’s argument is straightforward enough. We have said that we hope to win fans in IT with PowerPivot, but Howard Dresner (and others) see PowerPivot as putting power in the hands of end users. Howard, and Ann’s, question is: “Who is your primary customer?”
Now I should say here that I rarely see a dichotomy without wanting to prove it false, and this one is no exception. There are, to my mind, at least three fatal flaws in this argument:
· I think the term “end user” is too inexact for the scenario in question;
· I doubt the distinction between business and IT is as clear as the question implies;
· I don’t agree that a “primary” customer is necessary or desirable.
Let’s take them in turn.
Who is the end user of PowerPivot? The Excel jockey? That is hardly a complete picture. In fact, the majority of users will see the end result of PowerPivot in the form of interactive apps, dashboards and reports served up with SharePoint, and with SQL Server on the back end. These end users are consumers of analyses and they typically outnumber producers – the Excel users – by perhaps ten to one. These end users do not serve themselves: they may explore, and interact with the analysis, but others have built the solution, and others manage the infrastructure that serves them. So, “end users” form a complex ecosystem, which PowerPivot serves in carefully differentiated ways, with a powerful Excel add-in for producers, and a manageable, easy-to-use server-side story for the consumers.
So, if end-users are a complex bunch, what of this distinction between IT and business? Well, that’s not so clear either. Many a department in many an enterprise – small or large – has a business user who, in effect, provides IT services for her peers. The SQL Server WorldWide User Group runs a splendid course for the Accidental DBA. Many small SharePoint installs, or Reporting servers, even departmental OLAP boxes are cared for by these untitled, and unsung, practitioners. They provision infrastructure for their peers, but their day job is on the business side. These are not the sole IT audience of PowerPivot, but they will be an important segment of the market.
Even with more “traditional” IT roles – with job titles to match – I find the artificial distinction between business and IT rankles somewhat. The implication is that IT has a set of concerns that are of little or no interest to business users, and possibly vice versa. Well, I must say that I rarely see the vice versa in action: even barely functional IT departments take great interest in the success of the business that pays their wages. Increasingly, business users, especially at higher levels, are aware of the business drivers of IT decisions. After all, companies can be fined, and company officers can even be imprisoned, for breaches of data and security. The integrity of a server may have as much bottom-line impact as the business transacted upon it.
But let's get back to our sheep, as the French say. The third flaw is the most serious. I simply doubt that a “primary” customer is needed – and indeed I reject the idea of either IT or business being “primary” in a decision to adopt software which affects both of them. To think that one must be primary is to assume that there cannot be a common interest between business and IT. It should not be so.
I expect at least two patterns of adoption for PowerPivot; no doubt there will be a greater variety than I can cover in this blog.
On the one hand, I see Excel power users discovering PowerPivot as a tool they can use to build more powerful analyses. They will adopt on the desktop but will soon need to share their work with consumers. As Ann said in another blog post: Happy Employees Are Collaborative Employees. At that time, they will push for server adoption from IT: either “real” IT or “accidental” departmental IT.
On the other hand, I can see IT taking a proactive role. Recognizing that PowerPivot provides a solution to the continuing demands of business users for ad-hoc analyses, IT can provision the infrastructure and advise business users to go forth and serve themselves.
Both scenarios are equally valid and I see no harm in us pursuing both of them. And of course, there is a third one. Business and IT together (they do talk, you know) decide PowerPivot is an appropriate solution and agree (it does happen, you know) to implement managed self-service BI.
Maybe I’m crazy after all, but I think it might just work.
I like it when *other* people review Microsoft software. I still volunteer as a DBA, so I still use multiple versions of our own products (and competitors', by the way) in the "real world". That's how I normally present, that's how I teach at the University of Washington, and that's how I work with my Microsoft clients. When I talk about the product, it's something I use - but of course I work here, so folks always like finding other people who have tried the new versions of our software and reading what they say. Sometimes that's good; sometimes the reviewer has issues that they talk about. In any case, I like it when they show detail about what they have tried and how it worked. And here is just such a review - it's by Jason Brooks over at eWEEK.com - enjoy: http://www.eweek.com/c/a/IT-Infrastructure/SQL-Server-2008-R2-Offers-Enhancements-New-Management-Capabilities-518969/
I first met Sean McCown when I worked on the team developing SQL Server Integration Services. Sean was simultaneously one of our most active supporters and one of our most trenchant critics. He can be very – what’s the word? – very forthright. You know, blunt, outspoken and generally on the money. So, I keep a look-out for Sean’s articles and certainly his reviews of our latest efforts.
So, frankly, I was at least relieved to see the headline of Sean’s latest article - Seven reasons to care about SQL Server 2008 R2 – and glad to read it. Sean calls out what I would agree are the important reasons to upgrade: Managed Self Service Business Intelligence, Report Components, Master Data Services, StreamInsight, Multi-Server management, DACPACs and Sysprep. But don’t just make do with the list: Sean’s comments are insightful, and critical where he needs to be, and he calls out great use cases for the features he likes.
For an example of his criticism, Sean takes us to task for the ways in which we have factored Enterprise (now supporting 8 CPUs) and DataCenter. Time and customers will tell – especially new customers. Mark Beyer, of Gartner, had a different view of our changes: “It’s about time. The amount is reasonable, the per socket is still better than any percentage per core that other vendors use, and SQL 2008 R2 remains a solid value for the price.” SKU factoring is always a difficult balance, particularly where existing customers find themselves right on the edge of two editions. We’ll have customers in that situation and I trust our field teams will help them make a choice based on our need to build a long-term relationship, rather than overselling an edition for the box price. Traditionally, this is exactly how we have worked – thus the very high numbers of SQL Server Standard Edition that we see.
Enjoy Sean’s post here and please do add your comments. Would love to see what you think is compelling in SQL Server 2008 R2 too.
Donald Farmer
Well, I've almost recovered from TechEd this year. I had a fantastic time - learning, teaching and most of all, interacting. I'm really glad that Microsoft stuck with New Orleans - they've had a tough go of it since Katrina, but the city didn't show a trace of it. It was clean, positive and vibrant. Lil' Buck insisted on coming as well, so I brought him along.
I got out most every morning for a walk and to meet with the "SQL Cool Kids" at Café du Monde, for coffee and French donuts. The first morning was with Donald Farmer and Brent Ozar.
I also visited with lots of the SQL Server community. While I did go to a few of the sessions, I have always thought that the most valuable part of an in-person conference was the in-person part - meeting with folks in and around my profession.
(Left to right: Denny Cherry, Andy Rowland, Sean and Jen McCown, Mark Stempski, Thomas LaRock, Steve Jones, Brad McGehee)
I also had an opportunity to visit several sessions this year that were far outside of the SQL Server arena. Knowing more about Windows, the cloud, development and SharePoint was my focus this year. (I saw the cloud - it's a big blue box that lives at TechEd and you can walk in it)
And of course I made liberal use of my evenings there in New Orleans to - well, be in New Orleans. My mother's side of the family comes from this area, so I was right at home.
So all in all, was it worth it? Oh yeah. I'll definitely be back.
I hope you’re able to join me next week at TechEd 2010 U.S. in New Orleans, Louisiana. I love taking this time each year to learn in an environment where I’m “out of my element”. What I mean by that is that I have not only SQL Server knowledge all around me, but also knowledge of the operating system, Office automation products, SharePoint, and one of my new/old favorites, High-Performance Computing. It's a time when I can focus for an entire week on "recharging" my knowledge batteries.
If you are coming (and I really hope you can), make sure you get the most out of the visit. Check out this link: http://blogs.msdn.com/b/cdndevs/archive/2010/05/31/a-few-tips-amp-tricks-on-attending-microsoft-teched-2010.aspx to learn a lot more about how you can prepare for a TechEd Conference.
We’re all facing budget pressures, so you have to figure out whether you should go to a conference right now. To help you make an informed decision, check out this link: http://northamerica.msteched.com/default.aspx?fbid=TJgYwH0BwHo
And if you are coming down south with me, stop by and say hello – I’m doing a presentation or two and also working at the booths, including our Surface Computing System demo area.
Mary-Jo Foley recently wrote a blog post about being better together. At Microsoft this is something we take seriously. In her blog post she includes some snippets of an interview with Donald Farmer. Donald discussed how we exchanged some team members with the Excel team in order to have a great integration story. Anyone in management knows how important people are and will understand that moving people between teams is serious business.
So why do we do this? We do it because it creates the greatest value for our customers. Creating seamless experiences between products – built on open APIs - ensures customers get the greatest benefit out of their investments. It also means that end users (information workers, as we like to call them) end up with the most natural level of interaction with the products, without having to think much about it.
I’ll offer one proof point: I was recently talking with a friend of mine who’s at a large packaged food company. Mind you, he’s in sales, not in IT, so he offers more of a business than a technology perspective. They just started a pilot of SharePoint, Excel and PowerPivot. He couldn’t agree more with the value of better together. They initially started out looking at non-Microsoft products and quickly realized that the cost of employees learning another tool, which didn’t naturally integrate into their daily routine or with other tools, would be too disruptive and costly to the business. This really hit home for me: the price on the box is only a small part of the overall equation. The productivity of your employees is far more important.
I've mentioned elsewhere that you should be thinking about a "data roadmap" - a way of thinking about how you intend to leverage all of that data you're collecting.
And in that post I mentioned that Microsoft has a "roadmap" for SQL Server. We think about how to evolve the product all the time, with one overriding goal - to be the "Information Platform" for your organization. It's more than just storing and securing your data - it's about giving you tools that let you access that data from anywhere, quickly and safely. In SQL Server 2008 R2 you'll see big investments in this kind of thinking, from Master Data Services to StreamInsight and PowerPivot, and even interaction with SQL Azure. We're giving you new ways to do the basic inputs and outputs, but we're also thinking beyond those operations to new ways of allowing you to provide that data to your organization. Bloomberg has a review on some of these features that you can check out here.
So make your data roadmap with our product roadmap in mind - it's not just data, it's information.
"Five" seems to be a big number these days - is it compression to have a "top-5" instead of a "top-ten"? :) Over at Network World they've posted a quick slide deck of "Five things we love about SQL Server R2". They even include...the Start Menu? OK - I guess everyone has different likes and dislikes. I think they're happy that we gave SQL Server 2008 R2 its own menu area, rather than lumping it in with SQL Server 2008 (you're welcome).
I agree with their first four for sure - and I'll throw in a few things that you might not see as often, like being able to connect to SQL Azure from SQL Server Management Studio and increased database sizes in SQL Server Express, just to name a couple. By the way - if you haven't rushed over to check out the resources we have for you to learn the tech for SQL Server 2008 R2, what are you waiting for? They're free! Enjoy.
As announced a few weeks ago on the SQL Azure blog, SQL Azure now supports Data-tier Applications, which were introduced with Visual Studio 2010 and SQL Server 2008 R2. Here’s a quick tour of the feature.
Here’s what I did:

1. Created a new Data-tier Application project in Visual Studio 2010.
2. Imported the schema from my local database.
3. Pointed the deployment connection string at my SQL Azure server.
4. Selected Build and Deploy.

It’s that simple!
So what does this look like? Here are some screen shots:
This is the New Project dialog in Visual Studio 2010. I select the Data-tier Application project.
Here’s a shot of the project after I imported the local database. I could have also created the database schema from scratch.
Here’s a shot of the deployment properties within the Data-tier Application project. The connection string is exactly the same as a local instance except the server name is my SQL Azure server.
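To make that concrete, a SQL Azure connection string of that era looked something like the following (the server name, database name, and credentials here are placeholders, not values from the walkthrough; note the user@server login convention and the encrypted connection):

```
Server=tcp:myserver.database.windows.net;Database=Pubs-Azure;
User ID=mylogin@myserver;Password=...;Encrypt=True;
```

Everything else about the deployment properties stays the same as for a local instance.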
I select build and deploy. Here are the results. The system built the DACPAC, connected to my SQL Azure instance, and deployed the Data-tier Application.
The new database, Pubs-Azure, appears in my SQL Azure admin dashboard.
Connecting SQL Server 2008 R2 Management Studio to my SQL Azure instance I see the databases as well as the meta-data for the Data-tier Application.
With Visual Studio 2010, SQL Server 2008 R2 Management Studio and the latest release of SQL Azure we have a symmetrical experience for creating, registering, and deploying Data-tier Applications. Unfortunately, at this time we don’t support upgrading a Data-tier Application on SQL Azure, as we do with an on-premises install of SQL Server. To move your database to the next version of your application, you’d deploy side-by-side and migrate your data using your favorite data migration tool.
Try this out for yourself and I think you’ll find this to be extremely valuable. In addition, if you want to share your database schema with other people you no longer have to provide a bunch of Transact-SQL scripts. You can give them a DAC Pack file. The file will contain the name of the application, the version, and the database schema definition. This will make management of applications, including deployment, much simpler!
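One nice property of the DAC Pack format: under the covers, a .dacpac is just a ZIP package containing XML parts that describe the application name, version, and schema, so you can peek inside one with any ZIP library. A rough sketch (the part names and contents below are made up for illustration and built on the fly, so the sketch runs on its own; real packages produced by the DAC tools will differ):

```python
import zipfile

def list_dacpac_parts(path):
    """A .dacpac is an ordinary ZIP package; return the names of the parts inside."""
    with zipfile.ZipFile(path) as pack:
        return pack.namelist()

# Build a stand-in package so this sketch is self-contained --
# these part names are illustrative only, not the exact ones the DAC tools emit.
with zipfile.ZipFile("Pubs-Azure.dacpac", "w") as pack:
    pack.writestr("DacMetadata.xml",
                  "<DacType><Name>Pubs</Name><Version>1.0.0.0</Version></DacType>")
    pack.writestr("model.xml", "<DataSchemaModel/>")

print(list_dacpac_parts("Pubs-Azure.dacpac"))
```

Handy if you ever want to confirm what schema definition you're handing someone before you send the file along.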
Though short, this article in PC World makes an important point: the PowerPivot feature in SQL Server 2008 R2 is incredibly valuable to small businesses. I mentioned in a previous post that in another “life”, in enterprise IT for a Fortune 10 company, I built a “BI” app using a SQL Server backend and pivot tables in Excel. Given all the complexities back then, the end-users relied on my expertise to deliver their precious data each month. The operative word in there is “enterprise”. Back then this sort of environment was out of reach of most small businesses and required certain expertise.
But isn’t BI important to small business too? Absolutely! And until now it has, for the most part, been too complex for small businesses. With the introduction of PowerPivot, small businesses have access to an intensely powerful set of capabilities that are extremely friendly. If you own or work for a small business, I encourage you to check it out for yourself.