Hot on the heels of the announcements around the beta of SP1 for Windows Server 2008 R2 and Windows 7, two new videos have been published to TechNet Edge, showcasing firstly the upcoming Dynamic Memory, and secondly RemoteFX.
And the second video…
Looking forward to getting my hands on the beta! Less than 2 months to go! All I have to do now is hope it plays nicely with the other technologies around it… ;-)
I've had the pleasure of knowing the Orinoko chaps for some time now. Heck, we've even chatted virtualisation over Peri-Peri chicken! One thing I've learnt about these guys very quickly is how well they know the desktop. Whether it's upgrading or deploying it with technologies like System Center Configuration Manager, streaming applications to it with technologies like App-V, or, more recently, virtualising it with technologies like Hyper-V, Remote Desktop Services, and Citrix XenDesktop - whichever scenario comes about from a desktop perspective, chances are these guys have seen it.
Recently, as some of you may know, we held a ‘Best of MMS 2010’ event, in (downtown ;-)) London, focusing on the System Center suite of technologies and the surrounding ecosystem. At the event, Jeff Wettlaufer, Senior Technical Product Manager in the System Center team, blogger, and all-round good egg, sat down with John and Carl from Orinoko to chat ‘desktops’ - what exactly the optimised desktop meant to the guys, and how, among other things, they were seeing Windows 7 adoption. You can check out the video, below:
So, onto the second half of the post title - if any of you out there are using SCCM, or are thinking about using it, one of the nifty things I found out it could do (via John's blog) was take an export of your Microsoft license statement from the Microsoft Volume Licensing Site and compare it with the results of the Asset Intelligence check, so you can start to see where you're perhaps over-licensed or, worse, under-licensed! I've pinched a screengrab from John's blog to illustrate what I'm on about:
I’d check out John’s blog if you get stuck, as it is possible to go wrong, particularly when you’re cleaning up the Excel spreadsheet ready for conversion to XML.
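To give a flavour of that clean-up-and-convert step, here's a minimal Python sketch. It's purely illustrative: the element names below (LicenseStatement, Product and so on) are simplified placeholders of my own, not the actual schema the Asset Intelligence import expects, so follow John's blog for the real process.

```python
# Illustrative only: convert a tidied-up licence statement CSV into
# simple XML. Element names here are hypothetical placeholders,
# NOT the real SCCM Asset Intelligence import schema.
import csv
import io
import xml.etree.ElementTree as ET

def license_csv_to_xml(csv_text):
    """Turn rows of (Product, Quantity) into an XML document string."""
    root = ET.Element("LicenseStatement")
    for row in csv.DictReader(io.StringIO(csv_text)):
        product = ET.SubElement(root, "Product")
        ET.SubElement(product, "Name").text = row["Product"].strip()
        ET.SubElement(product, "Quantity").text = row["Quantity"].strip()
    return ET.tostring(root, encoding="unicode")

# A couple of made-up rows, as they might look after cleaning the export
sample = "Product,Quantity\r\nWindows Server 2008 R2,25\r\nSQL Server 2008,4\r\n"
print(license_csv_to_xml(sample))
```

The fiddly part in practice is the spreadsheet clean-up itself (stray headers, merged cells, blank rows) - get that right first, and the conversion step is mechanical.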
Hot on the heels of Thursday's post on virtualising SharePoint 2010, I thought I'd share with you some useful resources around virtualising SQL 2008 R2 on Hyper-V R2, and the kind of performance you can expect from this combination of technologies.
High Performance SQL Server Workloads on Hyper-V White Paper
This whitepaper focuses on the advantages of deploying Microsoft SQL Server database application workloads to a virtualisation environment using Microsoft Windows Server 2008 R2 Hyper-V. It demonstrates that Hyper-V provides the performance and scalability needed to run complex SQL Server workloads in certain scenarios. It also shows how Hyper-V can improve performance when used in conjunction with advanced processor technologies. This paper assumes that the reader has a working knowledge of virtualisation, Windows Server Hyper-V, SQL Server, Microsoft System Center concepts and features.
The whitepaper discusses, in some detail, a number of different tests that were performed, and from page 30 you can start to read about the best practices for running workloads like SQL on Hyper-V. The best practices section provides guidance on configuration for the parent OS, networking, VHD considerations, and more.
If you supplement the information in this whitepaper with some of the other resources below, you should be in a good position to optimise the performance of SQL in a virtual environment.
SQL Server 2008 Virtualization
SQL Server Analysis Services Virtualization
Short notice I know, but a number of places have become available on this VDI course, which may be of interest to some of you in the UK.
44CO181 - Desktop Optimisation: Implementing a VDI Infrastructure with Microsoft & Citrix

Summary:
Join us for this 2-day, instructor-led VDI course with supporting hands-on labs.
Audience: IT Professionals who have experience in virtualization technologies, but are not familiar with the construction of VDI-based solutions.
The course is set to run over 2 days, is instructor led, and includes hands-on labs. It's in Wokingham, near Reading, and is priced at £400 for the 2 days. Not bad, considering you'll get exposure to both the Microsoft stuff and the Citrix stuff, and understand how it all fits together.
If you’re interested, you can register here.
If you're a Microsoft Partner, and you're interested in learning more about the recently released System Center Data Protection Manager 2010, or System Center Essentials 2010, here are a couple of webcasts that you may want to check out:
MGT77PAL: Technical Introduction to System Center Data Protection Manager 2010
Presented by: Rahul Jacob
15th June 2010 – 6pm GMT
System Center Data Protection Manager 2010 provides new backup and recovery capabilities at a low cost. Because of the significant new capabilities in DPM, it is highly important that the field, partners and customers are aware of the various solutions and opportunities we have with DPM 2010. This session will help you get started with easy setup and configuration.
MGT78PAL: Application Workloads and DPM - Better Together
Presented by: Rahul Jacob
17th June 2010 – 6pm GMT
Microsoft System Center Data Protection Manager (DPM) is designed for IT generalists and uses wizards and workflows to help ensure that you can protect your Exchange, SQL and SharePoint data without any advanced degree of storage and backup knowledge. This session will help small and medium business IT administrators plan backups, recoveries, and further disaster recoveries. The solutions will enable you to take advantage of Exchange 2010 and DPM 2010.
MGT79PAL: Technical Introduction to System Center Essentials 2010
Presented by: Ashok Kumar G
22nd June 2010 – 6pm GMT
System Center Essentials 2010 has now hit RTM, so it is important that the Microsoft field, and our partners who focus on midsized businesses with fewer than 50 servers and 500 clients, learn about the value of this new offering for physical and virtualization management. This session will help you get started with easy setup and configuration. Come see it first, and get ready for when your customers ask you about it.
Topics include: SCE 2010 Overview, Architecture, Demo, Market challenges, Solution, Licensing
You and I both know that training typically isn't cheap. It's not just the course fees that can be expensive; the time out of the office can, in fact, be more costly. It's a pretty simple relationship – when your sales team are out of the office they aren't selling, and if they aren't selling, you aren't generating revenue! I'm sure this is just one of the reasons why, over recent months, more and more content is being delivered on demand, through the browser, enabling employees within organisations to learn at a time that's flexible for them and for the business. The Microsoft Virtualisation Sales Specialist certification is no different.
Why’s it important?
Well, believe it or not, very shortly the Microsoft Partner Network will retire the Gold, Certified, and Registered Partner levels, and will replace said levels with competencies. These competencies will more accurately reflect Partner skills, and make it easier for Customers to find, and work with, the right Partner for them. Competencies exist in 2 flavours for a particular area. Using virtualisation as an example, there will be the virtualisation competency and the advanced virtualisation competency, each with different requirements and different benefits (although advanced will be in addition to the regular competency). If this is news to you, and you're reading this thinking, ‘wha?', I suggest you head on over to the Partner Network before continuing!
Now that we're all up to speed, why is the Sales Specialist certification important? Well, these types of certifications are going to contribute towards the competency. For so long, being Gold, or Certified, has to a large extent been about technical exams, MCPs and MCSEs, but this break-from-the-norm is ensuring that not only can your techies deploy the technology, but your sales team can also articulate the benefits, construct the deal, and license it accordingly. Oh, and we'll throw a bit of competitive training in there too ;-)
You can access the Virtualisation Sales Specialist training and exams by heading over to the Microsoft Sales Specialist site, but you'll need a Windows Live ID associated with a Partner to take advantage of this – if you've ever accessed the Microsoft Partner Portal and signed in successfully, use that Live ID! When you log in, you'll see that, as of today, there are 2 tracks:
If we drill into the virtualisation track, you'll see that there are 2 main sections. The first takes a product-understanding perspective, covering an overview before providing a deeper look at Microsoft's Server Virtualisation story, then its Desktop story; the second is dedicated solely to Licensing, and Selling in Competitive Situations. There is also a separate assessment for each section which, if you pass, rewards you with the certification logo that you see above, as well as a certificate you can print out. It's also important to know that the accreditation is valid for 2 years.
What are the courses actually like? Well, for me, they could do with a bit of a brush up from a presentation perspective. Considering we have PowerPoint 2010 now, and the ability to make some of the slickest presentations, graphically, we've ever seen, some of the slides in the sessions leave a little to be desired. But the messaging is spot on and covers everything to a good level of detail, and trust me, the exams are pretty tricky! Whenever you launch a course, it registers you onto it via the Microsoft Partner Network, which will be logged against your Partner ID and enable you, as a Partner, to identify who's been on which courses.
Overall, I’d say it’s a pretty useful resource, and definitely one to get under your belt. Even if they don’t contribute towards the competency (the Partner Network says the online courses are available in October, yet this one is here now!), they’re a valuable way of understanding the Microsoft Virtualisation story, across desktop and datacenter.
Head on over to the Microsoft Sales Specialist site for more information!
For those of you using System Center Operations Manager to monitor your environment, you'll understand the concept of management packs, and the value they bring. Fundamentally, OpsMgr wouldn't be the product it is today without the management pack framework. These management packs contain the knowledge to monitor, at a granular level, workloads like Exchange, SQL, and SharePoint, but also non-Microsoft applications from Partners like Citrix, F5, Brocade, Dell, HP, NetApp and more. Without the packs, who would we rely on to configure the monitoring elements? We couldn't rely on the OpsMgr team, as they don't have knowledge of every monitorable technology in the world! We couldn't rely on individual IT admins within organisations, as this would be complex and time-consuming for the individual involved - and without a very deep knowledge of product X, say, Exchange, how would you know where to start from a monitoring perspective? Thankfully, the management pack framework, for the most part, takes the pain away when it comes to monitoring key applications and workloads. Sure, some MPs are better than others, but they're all improving, and the ecosystem is growing. You only have to look at the number of Partners who are producing PRO-enabled management packs to see the ever-growing ecosystem:
These are just the Partners who are building PRO MPs, never mind the huge ecosystem creating regular MPs too! For me, that just shows that Partners get it. They get the fact that management is a key focus for the future, so providing value-add to their customers, through integration with Microsoft technologies, helps to unify a customer's infrastructure, and ease the management process for them.
One of our key MP Partners within the ecosystem is Bridgeways. The reason I'm aware of Bridgeways is that, among other things, they allow OpsMgr to monitor VMware environments. This is a very useful add-on for an environment where VMware technologies have been deployed as the virtualisation layer of choice, but more knowledge about what's inside the virtual workloads is now required. That's where OpsMgr comes in - but not being able to see everything in OpsMgr would be disappointing, hence Bridgeways provide the MP to hook OpsMgr in with VMware technologies. You can see a demo of this here.
Aside from the VMware MP, Bridgeways also provide a free MP for Hyper-V. Now, you can download the Microsoft MP for Hyper-V here; however, this is very much a base MP, and effectively gives you the following:
Not very much there! Fairly useful, but to be honest, the information is more about listing info than monitoring performance. That's where Bridgeways have come in and extended the MP in multiple directions, to give you:
You know what the crazy thing is? Bridgeways have even linked to a blog, which walks you through taking the standard Hyper-V Base MP, and extending it to produce the MP that you can download! For those of you who’ve never edited or modified an MP, and want to understand how it’s done, and the results it can produce, this is a very useful series of tutorials:
Part 1: Getting Started with writing a more robust Hyper-V MP
Part 2: Adding your own Monitors
Part 3: Adding Rules and Performance Views
Part 4: Adding Dashboards
Hat-tip to the chaps at the xplatxperts blog for all the information – very useful indeed!
I've been a big fan of NetApp technologies for ages, and I've worked closely with people like Steve Winfield and Pete Mason to produce a number of videos showcasing some of the collaborative work that's gone on between Microsoft and NetApp, resulting in products like SnapManager for Hyper-V, SnapDrive 6.2 and more. We've got some fantastic joint wins on the platform now too, at both small and large customers, so it's all good from that perspective.
I'm currently building out my team's internal demo infrastructure, which currently consists of 1 Dell T605 with Hyper-V R2 and a number of System Center technologies virtualised on top, along with a cluster of 2 Dell R710s hooked up to a NetApp FAS3050c. Now, this FAS3050c isn't the latest model, and it doesn't have the most capacity in the world (my DS14 disk shelf gives me around 570GB of usable space), but then it was kindly donated to me by NetApp, who were replacing some of their older kit with newer kit for our Microsoft Technology Center in Reading, UK. The great thing for me is that I can still run the latest version of Data ONTAP, it'll work with the latest and greatest versions of SnapDrive and SnapManager for Hyper-V, and it still gives me all the features I need, like snapshotting, thin provisioning, and, best of all, deduplication. I'll be honest with you right now: I love dedupe. I think it's fantastically clever and streamlined, and because it's at the block level, rather than the file level, it'll even dedupe stuff that you think, on the surface, has no chance of being deduped. Crazy stuff. Let me explain more.
Firstly, for those of you not sure what deduplication with NetApp is, and how it works, there’s a great explanation over at the Dr DeDupe blog.
As I said, my cluster environment is 2 Nodes, and to that cluster, I’m presenting 4 LUNs of storage, which in my NetApp environment, are in 4 separate Volumes. You don’t have to do it like this, and who knows, maybe I’ll change it in the future, but right now, this is how it is:
As you can see, I've got a dedicated LUN for my witness disk (I'm using Node and Disk Majority for my 2-node cluster), and 3 LUNs presented to the cluster, which have been selected to be Cluster Shared Volumes. They aren't huge: 100GB each for two of them, plus a 25GB CSV that will hold the swap files of my key VMs (each host only has 12GB RAM, so having 25GB for swap VHDs is fine!). You'll see from the image above that I'm currently using around 51% of my CSV2. It's got a 40GB (ish) fixed VHD with WS2008 R2 inside, but CSV2 also has another, dynamic, VHD with Windows 7 x86 inside it, currently expanded to around 8GB. Total consumption of that CSV is 51GB:
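As an aside, the Node and Disk Majority arithmetic is easy to sanity-check: each node gets a vote, the witness disk gets a vote, and the cluster stays up while a strict majority of votes remain. Here's a quick Python sketch of that logic - a simplified model of the voting maths, not anything to do with the actual Windows clustering stack:

```python
# Simplified Node and Disk Majority quorum maths (illustrative model only,
# not the Windows Failover Clustering implementation).
def has_quorum(nodes_up, witness_up, total_nodes):
    votes_total = total_nodes + 1            # one vote per node, plus the witness disk
    votes_present = nodes_up + (1 if witness_up else 0)
    return votes_present > votes_total // 2  # strict majority required

# 2-node cluster with a witness disk (3 votes total, so 2 are needed):
print(has_quorum(nodes_up=1, witness_up=True, total_nodes=2))   # True: node + witness survive
print(has_quorum(nodes_up=1, witness_up=False, total_nodes=2))  # False: one vote isn't a majority
```

Which is exactly why the witness disk matters in a 2-node cluster: without it, losing either node would take the whole cluster down.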
So, that means I’ll lose 51GB on my SAN, right? Wrong! We’re actually using a grand total of 17.5GB!
If we go over to NetApp System Manager, and take a look at this particular volume, you can see for yourself:
Just think about this for a minute. Because this is block-level deduplication, we can look inside the contents of the VHD files and so on, see where the blocks match, and deduplicate them - so in this case we've saved a grand total of 37.62GB, which amounts to 60%. Obviously Windows still thinks it's using 51GB, even though, under the covers, the SAN hasn't lost that space. This is where thin provisioning starts to help, as you can make Windows think it has more storage available to it.
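If you want to convince yourself why block-level matching finds savings that a file-level comparison would miss, here's a toy Python illustration. To be clear, this has nothing to do with NetApp's actual WAFL implementation - it just hashes fixed-size blocks and counts the unique ones:

```python
# Toy block-level dedupe: two different "VHDs" that happen to share many
# identical blocks (think common OS binaries) cost far less to store than
# their file sizes suggest. Purely illustrative, artificial data.
import hashlib

BLOCK_SIZE = 4096  # dedupe granularity for this toy example

def dedupe_savings(data):
    """Return (logical_bytes, physical_bytes) for a chunk of raw data."""
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    unique = {hashlib.sha256(b).digest() for b in blocks}
    return len(blocks) * BLOCK_SIZE, len(unique) * BLOCK_SIZE

common = bytes(BLOCK_SIZE) * 256         # 1MB of blocks both files share
vhd_a = common + b"A" * BLOCK_SIZE       # differs from vhd_b at the file level
vhd_b = common + b"B" * BLOCK_SIZE

logical, physical = dedupe_savings(vhd_a + vhd_b)
print(f"logical: {logical} bytes, physical: {physical} bytes, "
      f"saved: {1 - physical / logical:.0%}")
```

A file-level comparison would see two distinct files and save nothing; the block-level view finds all the shared blocks regardless, which is exactly the "no chance of being deduped" effect I mentioned earlier.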
Deduplication hasn't just been applied to my CSVs. Oh no. I've used it on the witness disk where, even though the whole volume is only 1GB and the consumption was 50MB for the quorum information, deduplication still managed to save me 10MB, which is 20%. What about my other savings? Well, on my SCVMM Library, where I'm storing a couple of VHDs but also some ISO files, I've saved a total of 15%, and on my actual backup store, being used by Data Protection Manager 2010 to protect Hyper-V and SQL so far, I'm saving just under 39GB, which equates to 58%. These savings are real, and are enabling me to get even greater levels of consolidation on my SAN than I would normally. Brilliant stuff, NetApp.
Now I just need to get ApplianceWatch PRO working… :-)
If you’re thinking about virtualising SharePoint 2010, whether it’s on Hyper-V, or otherwise, Microsoft have released a number of useful pieces of documentation to aid you through the process. Whether it’s support and licensing information you need, or more technical planning of the virtualisation architectures, this TechNet library should give you plenty to get started with.
SharePoint 2010 – Virtualisation Planning
It covers a number of key sections:
If that isn’t enough, there’s a webcast coming soon, delivered by the TechNet team, on the same topics, so if you’d rather hear/see it, than read it, here are all the details:
Language(s): English
Product(s): Hyper-V
Audience(s): IT Generalist
Duration: 60 Minutes
Start Date: Tuesday, June 15, 2010 7:00 PM GMT
Event Overview: Virtualising business-critical applications can deliver significant customer benefits, including cost savings, enhanced business continuity, and an agile and efficient management solution. In this webcast, we discuss virtualising Microsoft SharePoint 2010 using Microsoft solutions. We present the benefits of Microsoft virtualization technologies over key competitors such as VMware, and we provide guidance for virtualising SharePoint 2010 for production and test/development scenarios, focusing on scale, load balancing, dynamic provisioning, and high availability. Other topics we cover include Microsoft virtualization technical details with best practices and customer evidence and results from lab deployment tests.
Presenters: Arno Mihm, Senior Program Manager, Microsoft Corporation, and Bill Baer, Program Manager, Microsoft Corporation
If you’re interested, and you’re free at that time, you can register here.
Love reading? Love technology? Like money?
If the answer to those questions is Yes, then these eBook links are for you. Covering a wide spectrum of technologies, from Office through to Virtualisation, and Windows Phone through to Visual Studio, this array of free eBooks should keep you going for a while…