• Does Consolidation = Virtualization?

    In early 2001 I worked at a start-up company with about 150 people. I call it a start-up because it had no revenue but did have free breakfast, lunch, and dinner. At the company there were five people named Michelle and six named Dan (including yours truly). One day the VP of operations got confused about which Michelle and which Dan he was talking about, and quipped: you can't swing a dead cat around here without hitting a Dan or a Michelle. (I apologize to all cat lovers. I'm one myself, and I certainly don't condone the swinging of dead animals, or live ones for that matter.) But I feel the same way about virtualization and consolidation: hardly a day goes by when I'm not in a conversation about these topics. It's important to remember that while virtualization has a coolness factor, I recommend you approach it as an enabler for meeting business objectives, not a silver bullet that will solve all your woes. Also, while virtualization and consolidation are often mentioned in the same breath, they are not the same thing. A couple of guys on my team put together the following graphic to discuss SQL Server consolidation options:

    [Image: SQL Server consolidation options]

    As you move from left to right across the picture, you move from higher isolation (which equates to higher cost) to higher density (which equates to lower cost). High isolation refers to resource and security isolation. Obviously you could take this to the extreme and have each application (instance of SQL Server) reside on its own hardware, sitting in its own data center, in its own building, on its own power grid. But that's crazy, right? Higher density means the greatest sharing of resources.

    Here’s a brief explanation of each lane:

    1. IT Managed Environment: This solution is the most expensive but provides the greatest level of isolation. Each application is placed on dedicated hardware (computers, storage, and possibly network).
    2. Virtual Machines: The next level provides terrific OS and application isolation: each application (SQL Server instance) runs in its own VM. Multiple VMs can run on the same physical machine, and resources of the physical environment can be allocated to each VM.
    3. Instances: Multiple instances of SQL Server can be installed on the same computer. While the resources (CPU, memory, etc.) of the computer are shared between the instances, each instance provides a very high degree of security isolation: each instance can have a different administrator.
    4. Databases: A single instance of SQL Server can house multiple databases. There's a strong security boundary between databases (a database user can be contained to that database), but the databases share system resources, and the administrator of the instance has access to all of them.
    5. Schemas: Within a single database, each database object (tables, views, stored procedures, etc.) is contained within a schema, and a database can have multiple schemas. Similar to how the database acts as a security boundary, permissions can be granted on schemas and schema-contained securables, e.g. tables.
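To make the schema lane concrete, here is a minimal T-SQL sketch (all names are hypothetical) of how permissions granted at the schema level can isolate one application's objects:

```sql
-- Hypothetical names throughout; a sketch of schema-level isolation,
-- not a production script.
CREATE SCHEMA Sales;
GO

-- A database user with no server login, contained to this database.
CREATE USER SalesAppUser WITHOUT LOGIN;
GO

-- Schema-level permissions cover every object in the schema,
-- including objects created later.
GRANT SELECT, INSERT, UPDATE ON SCHEMA::Sales TO SalesAppUser;
```

SalesAppUser can now read and write objects in the Sales schema, but has no access to objects in other schemas unless explicitly granted.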

    The main point here is that there is no one-size-fits-all solution or technology; virtualization is just one of the ways to meet consolidation needs. You will likely employ multiple solutions within your environment, the key being to choose the right technology to meet the business requirements of the application. As an IT professional, it's your job to understand the technology and how to apply it to solve business problems.

    There are several different models out there for approaching consolidation. The high-level steps that resonate with me are:

    1. Plan: Define the business and technology objectives. Define the boundaries of the project. Identify the key stakeholders and sponsors. Define the timeline.
    2. Inventory What You Have: There are many tools for creating an inventory of your technology assets. The Microsoft Assessment and Planning Toolkit does this, and it's free.
    3. Track How Things Are Being Used and Identify Consolidation Candidates: You can't start throwing applications together without understanding their usage model.
    4. Consolidate: Once you've identified the candidates, make the changes. This will require a lot of up-front planning to determine when apps can be moved (planned downtime) and the impact on the greater app ecosystem. Remember, database connection strings will change.
    5. Track How Things Are Being Used: Yes, this is a repeat of step 3, but it's important. You may get some things wrong (over-consolidation) and need to make adjustments. Plus, if your environment is like any I've seen, it's pretty dynamic, with new applications popping up.
    6. Post Mortem: Get the project team and key stakeholders together and discuss what went well and what didn't. Document these lessons so you'll have them the next time around. Consolidation isn't a one-time event; you will need to constantly monitor your environment for opportunities.
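For steps 3 and 5, the dynamic management views are a good starting point. As a rough sketch (a usage signal, not a full monitoring solution), the following T-SQL shows which databases currently have active requests, one way to start spotting consolidation candidates:

```sql
-- A rough usage signal: active requests per database right now.
-- A real assessment would sample this over time and add CPU, I/O,
-- and memory counters.
SELECT DB_NAME(r.database_id) AS database_name,
       COUNT(*)               AS active_requests
FROM sys.dm_exec_requests AS r
WHERE r.session_id > 50        -- skip system sessions
GROUP BY r.database_id
ORDER BY active_requests DESC;
```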


    Virtualization and consolidation are concepts that have been around for at least 30 years. With the ever-increasing pressure on IT to do more with less, they are realities that can no longer be ignored. As I mentioned in my opening blog post, one of the challenging aspects of my job is that I'm always thinking one to two releases out, and these days I'm talking a lot about consolidation and virtualization. The first of this wave of capabilities sees the light of day with SQL Server 2008 R2 Application and Multi-Server Management.

    Cheers,
    Dan

  • Divining Oracle

    Bar Bosch, in Palma de Mallorca, is one of my favourite places to hang out. It's hectic and noisy, but still leaves space and time for endless discussions or reflection. It is the perfect place to pretentiously pore over Žižek or Derrida. Yet in the autumn of '98 I found myself sitting there, ordering carajillos and reading up as much as I could about Oracle's Business Intelligence tools.

    It was before I joined Microsoft, and I had just started a new job with a BI team focused on Oracle products. I had to get up to speed quickly, but, fortunately, I had some vacation already booked. So, one week after I joined, my wife and son headed to the beaches of Cala Ferrera, and I headed to Bar Bosch with my Oracle books under my arm. It was a long week of reading, and mostly in vain. When we returned home, my CEO had news for me. "All that Oracle stuff," he said. "Forget it. We're focussing 100% on SQL Server now." The rest, as far as my personal story is concerned, is history. I've never looked back.

    What happened? Some of the C-level team had attended the SQL Server 7 technical preview at Microsoft's invitation in Redmond.  On day one, during a break, they phoned back to our office from the lobby and told the development team to stop Oracle development. They found the Microsoft BI story, told then by Bill Baker and Amir Netz, so compelling that they immediately changed direction. Microsoft presented three irresistible messages: a vision of what BI is about, a commitment to BI customers and partners, and a compelling pricing model.

    I could wish all Oracle migrations were so easy. Some are, but I always maintain, as in a recent discussion with Ted Cuzillo, that migrating technologies is easier than migrating people. When one has established a way of working, it's difficult to adapt to a new way, even if it is technically equivalent or superior. It's just easier to carry on with one's current practices. Software manufacturers know this, of course. We all want our products to be, as the marketers say, "sticky" - and all of us in the software business like features and methodologies that build loyalty. That loyalty, once gained, is naturally a most valuable asset.

    Yet, recently, Oracle appear to have taken a different approach: one that customers increasingly resent. They are tying companies into the Oracle ecosystem financially, especially by acquiring other strategic suppliers: writing checks, not code, as Larry Ellison himself once said. Their customers are complaining in stark terms, as they find themselves not only tied in, but subject to ever-increasing prices and intransigent demands. Business Week headlined that "Oracle Has Customers Over a Barrel," quoting one customer as saying, "Once you've made a deal with the devil, it's hard to get away."

    This approach from Oracle is surely a deeply alienating move, and it tastes to me of a certain desperation. Personally, I find it baffling. Here is a company with a truly great database product, a broad range of applications, and excellent engineers, yet they are driving their own customers to despair with licensing practices that, according to some, feel like extortion. So, while there have always been Oracle customers willing to make the effort to change, at Microsoft we are expecting to see that number increase.

    Naturally, we're encouraging them, and why not? SQL Server is growing rapidly: over 11% in 2008, according to IDC. In the past, much of our growth came from net new customers: we have always been aware that we win many businesses choosing their first enterprise database or BI implementation. Now, we are seeing a large number of switchers: customers willing to make that migration from another technology. To win more of these customers, we need to show that they need not be tied into the Oracle universe, even when their existing commitments are substantial. This case study of the Turkish appliance manufacturer Arçelik describes them moving a 5 TB SAP implementation from Oracle to SQL Server, gaining substantially in performance while lowering costs. We also have a neat little TCO calculator tool that has proved very popular. To be sure, you don't need an animated tool to find significant TCO savings persuasive, but it's fascinating and fun to play with the various combinations of servers, staff, and requirements.

    Of course, I would not deny that Oracle are still growing. There's a lot to admire in the efficiency with which they target and execute on acquisitions. Frankly, they sometimes take the rest of us in the industry by surprise; which is partly a sign that they are imaginative and bold, yet partly suggests that they are chasing growth by acquisition at all costs, often seemingly at random. This week they acquired Hyperroll. That gives Oracle four OLAP products. It's little surprise that their customers, and from what I hear, their internal teams, are puzzled and often anxious about their direction. On a larger scale, their acquisition of Sun sent many of the same confusing signals.

    How does all this affect someone like me, making suggestions and decisions about our own BI futures in Microsoft? In some ways, it affects my work remarkably little. When Oracle buy a company like Hyperion, it may mean that there's a new flag on the island, but it takes much longer to build bridges and integration. As a result, we in Microsoft, especially in the product teams (rather than, say, in marketing) may look at the Oracle applications and see relatively few feature-for-feature challenges. In other cases, such as Oracle's own in-database OLAP features, we largely had to ignore their feature set when considering their future direction, because it seemed so unclear.

    Oracle integration remains an important priority for us, for several reasons. For one thing, it helps our customers who are coming off Oracle to do so in a planned, progressive manner. Supporting Oracle systems in our tools, such as Analysis Services, Reporting Services, and Integration Services, also helps us provide BI features to those customers who do find themselves on the wrong side of the devil's bargain and unable to extract themselves from it. For example, we see many customers using Analysis Services, Reporting Services, or Integration Services with Oracle systems. In fact, demand for Oracle support was so great that in SQL Server 2008 we introduced an Integration Services high-performance data loader for Oracle. Yes, a "data loader," not an extractor: we released a feature that makes it easier to get data into an Oracle database!

    Seeing these customer satisfaction issues at another major and successful vendor, and at the risk of sounding too much like a motivational call, I am convinced that we, in the SQL Server team, must keep to our own core value propositions: a persuasive vision, a commitment to our customers' success, and a compelling pricing model. After all, these are the same key points that were so significant to my old team over 10 years ago at that technical preview in Redmond. For their part, our friends at Redwood Shores appear to be doing their best to send customers our way. Is that what they call synergy? I'll ponder that, although this autumn it will be over a latte in Third Place Books rather than a carajillo at Bar Bosch.

  • Introduction From Quentin Clark

    By way of introduction, I run a team in SQL Server called the Database Systems Group. It's the engineering team that builds the relational systems for SQL. The single largest piece of this is the core RDBMS itself (usually referred to around here as The Engine), along with the components most closely supporting that Engine that are also SQL-wide, such as manageability (SSMS, SMO, etc.) and connectivity (ODBC, PHP, SQL client, etc.). My team also includes our yet-to-be-shipped high-scale DW engine, currently code-named Madison (this is the DATAllegro MPP DW solution that we acquired a bit over a year ago), new relational streaming technology that will ship as part of SQL Server 2008 R2, and the Gray Systems Lab in Madison, WI, led by David DeWitt; they do advanced development in the broad database space.


    
    “Because it’s everybody’s business.” In the database space, this has never been more true. Businesses are realizing the value of information: not just data, but information that can be leveraged not only by systems but also presented to those who really need it to make the business more successful. In a world where internet search means you can know how many species of ants there are in Madagascar (418, according to the first result returned from Bing ;), knowledge workers have high expectations for the availability of the information they need to do their jobs. For the business I am part of, this means, just off the top of my head: high-scale, performant DW; consistent and usable data models; connectivity between heterogeneous systems; closing the “last mile of BI” to get information onto desktops; and making all of this into a cohesive, productive developer platform. It's about an IT platform that will allow the business to operate more efficiently and enable more innovation.
  • SQL Azure – Let’s Get Started

    As you may know, we've announced that Microsoft's cloud computing platform, Windows Azure, will reach commercial availability on the first day of the upcoming Professional Developers Conference in Los Angeles, November 17th, 2009. In addition to data storage via Windows Azure, our offering will include SQL Azure. I've been following the product team's progress with SQL Azure for quite some time now, given my long-term interest in, and professional use of, SQL Server.

    In fact, just this week the product group announced on the SQL Azure blog that the current build is now feature-complete for PDC09. The product team's most recent post details features that have been added to the most current CTP, such as the ability to configure firewall (access) rules, support for bulk copy (mostly for initial data load-in), and more. I've been watching and waiting, eager to ask lots of detailed questions of the product group as we start our first phase of commercial availability. Of course, the paramount questions are around the security of your data in our cloud. We have a large number of sessions at the upcoming PDC in Los Angeles, which runs from November 17 to November 19. The announced schedule to date already includes nine dedicated sessions on SQL Azure, conducted by members of the SQL Azure product group.
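As a taste of the firewall-rule feature mentioned above, rules can also be managed with T-SQL against the server's master database. The following is a sketch based on the CTP as described on the SQL Azure blog; the rule name and IP range are hypothetical, and the exact procedure surface may change before release:

```sql
-- Run in the master database of your SQL Azure server.
-- Creates (or updates) a named rule allowing connections
-- from the given IPv4 range (hypothetical name and addresses).
EXEC sp_set_firewall_rule
    @name = N'OfficeNetwork',
    @start_ip_address = '203.0.113.1',
    @end_ip_address   = '203.0.113.254';

-- Current rules are visible in the firewall_rules view.
SELECT name, start_ip_address, end_ip_address
FROM sys.firewall_rules;
```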

    If you can't attend the PDC and want to start learning the capabilities of SQL Azure, I recommend downloading the October 2009 Windows Azure Platform Training Kit. It includes PowerPoint decks, demos, hands-on labs, and more. Of course, this kit will be updated to reflect changes in the product as we add features. There will also be a good bit of SQL Azure coverage at the upcoming SQL PASS Summit in Seattle, November 2 to 5th. In addition, there will be sessions at TechEd Europe in Berlin, November 9th to 13th.

    An interesting new development is the recent update to the SQL Azure management portal.  The CTP access URL is changing as of PDC (from https://ctpportal.database.windows.net to https://sql.azure.com) and the portal itself has been updated to reflect the newly-added features of SQL Azure.

    Due to the level of interest in SQL Azure (including my own), I have decided to write a technical book on the topic. Readers of this blog will get to preview partial chapters, as I plan to begin writing in December. I intend to cover topics of interest to developers, IT pros, and architects. These will include development of .NET and non-.NET front-end solutions (e.g. PHP, Java) that use SQL Azure as a partial or complete storage solution, as well as deployment and management considerations such as auditing and synchronization between cloud and local copies of data stores. Of course, there will be a strong emphasis on security implementation best practices throughout the book.

    I am quite interested in your feedback if you have worked with any version of the SQL Azure CTP (beta).  Take a minute to drop me a mail via this blog to tell me what you've liked or not liked about your experience so far.

  • Month O' Conferences

    November is the month of conferences. There are four pretty major conferences going on, and we have SQL Server Manageability sessions at all of them. Here's the rundown:

    • SQL PASS

      1. When: November 2-5, 2009
      2. Where: Seattle, Washington
      3. Site: http://summit2009.sqlpass.org/
      4. Sessions: 2 focused on SQL Server 2008 R2/Visual Studio 2010 Application and Multi-Server Management
    • SQL Connections
      1. When: November 9-12, 2009
      2. Where: Las Vegas, Nevada
      3. Site: http://www.devconnections.com/shows/FALL2009SQL
      4. Sessions: 3 – 1 on SQL Server 2008 R2/Visual Studio 2010 (Application and Multi-Server Management) and 1 on SQL Server 2008 (automation with PowerShell)
    • TechEd Europe
      1. When: November 9-13, 2009
      2. Where: Berlin, Germany
      3. Site: http://www.microsoft.com/europe/teched/
      4. Sessions: 3 – 1 on SQL Server 2008 R2/Visual Studio 2010 (Application and Multi-Server Management) and 2 on SQL Server 2008 (automation with PowerShell & Performance Monitoring & troubleshooting)
    • PDC
      1. When: November 17-19, 2009
      2. Where: Los Angeles, California
      3. Site: http://microsoftpdc.com/
      4. Sessions: 1 focused on SQL Server 2008 R2/Visual Studio 2010 (Data-tier Application Development)

    We also have a surprise in store at SQL PASS during Ted Kummert's keynote. Also, at PDC there's a session on SQL Azure futures ("The future lifecycle of database development with SQL Azure") that covers some of the future technology we're exploring. If you're at PDC, this is a must-attend session. There is still space available at all of these conferences. Unfortunately, the only conference I'm going to make is SQL PASS. I was excited to go to SQL Connections, but my schedule just won't allow it. Each conference has a pretty amazing set of speakers with incredible knowledge in the area of managing SQL Server. If your schedule and budget permit, you should pick a conference and go. I'm confident you'll see a positive ROI.