I have just got back to my blog after a few days at various events and I see the Evaluation edition of Windows Server 2012 R2 has been released. I need this for my lab ops because I am building and blowing away VMs for er… evaluations, and I don’t want to have to muck about with license keys. For example, I have a script to create my FileServer1 VM, but if I use the media from MSDN for this and don’t add a license key to my answer file, the machine will pause at the license key screen until I intervene. Now I have the Evaluation Edition I can build VMs that will start automatically and, while they are running, continue to configure them. For example, for the FileServer1 VM I created in earlier posts in this series, I can add a line to the end of that script which will run on the VM itself once it is properly alive after its first boot:
invoke-command -ComputerName $VMName -FilePath 'E:\UK Demo Kit\Powershell\FileServer1 Storage Spaces.ps1'

..and this will go away and set up FileServer1 with my storage spaces.
Note that the script to create FileServer1 (FileServer1 Setup.ps1), the XML it uses to add features into that VM (File server 1 add features.xml), and the FileServer1 Storage Spaces.ps1 script referenced above are all on my SkyDrive for you to enjoy.
One good use case for executing PowerShell scripts remotely like this is when working on a cluster. Although I have put the Remote Server Administration Tools (RSAT) on my host to have access to the Failover Clustering cmdlets, I get a warning about running these against a remote cluster:
WARNING: If you are running Windows PowerShell remotely, note that some failover clustering cmdlets do not work remotely. When possible, run the cmdlet locally and specify a remote computer as the target. To run the cmdlet remotely, try using the Credential Security Service Provider (CredSSP). All additional errors or warnings from this cmdlet might be caused by running it remotely.
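The warning points at CredSSP as the workaround, so it is worth sketching what that setup looks like. This is a minimal example, not a hardened configuration; the server name FileServer1.contoso.com is an assumption for illustration:

```powershell
# On the admin workstation: allow delegating fresh credentials to the cluster node
Enable-WSManCredSSP -Role Client -DelegateComputer "FileServer1.contoso.com" -Force

# On the cluster node itself: accept CredSSP authentication
Enable-WSManCredSSP -Role Server -Force

# Back on the workstation: run the clustering cmdlet in a CredSSP-authenticated session
Invoke-Command -ComputerName "FileServer1.contoso.com" -Authentication Credssp `
    -Credential (Get-Credential) -ScriptBlock { Get-ClusterGroup }
```

Bear in mind CredSSP hands your credentials to the remote machine, so only enable it for hosts you trust.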
While on the subject of new downloads, the RSAT for managing Windows Server 2012 R2 from Windows 8.1 is now available, so you can look after your servers from the comfort of Windows 8.1 with your usual tools like Server Manager, the Active Directory Administrative Console, Hyper-V Manager and so on. On my admin VM I have also put on the Virtual Machine Manager console and SQL Server Manager and a few other admin tools.
Before you ask, the RSAT tools you put on each client version of Windows only manage the equivalent version of Windows Server and earlier. For example, you can’t put the RSAT tools for managing Windows Server 2012 R2 onto Windows 8 or Windows 7.
So using my lab ops guides or the more manual guides on TechNet, you can now get stuck into playing with Windows Server 2012R2, as a way of getting up to speed on the latest Windows Server along with the R2 courses on the Microsoft Virtual Academy.
By Alan Richards, Senior Consultant at Foundation SP and SharePoint MVP.
This time last year the computing world’s view of Microsoft Windows changed forever. Windows 8 changed the way we interact with not only our PCs but also our laptops, phones and tablet devices. Windows 8 was not only a new operating system for your PC; it was also a new way of working: a single consistent interface across all your Windows-based devices, with the ability to have all your settings, documents, images and videos accessible from any Windows device you logged onto.
Now that was the really cool bit, suddenly all my Windows devices were personal to me, I took a photo on my Windows phone and it was immediately available on all my other devices, no more emailing it to myself. This was just cool, no other description for it.
So a year on, everything has settled down and the release of Windows 8.1 has been and gone. Have you made the move yet? Perhaps it’s now time you took that step and upgraded all your devices to Windows 8. Let’s look at your options in three distinct areas: hardware requirements, ways to upgrade and licensing.
The hardware requirements for Windows 8 vary depending on what device you want to run it on: do you want touch, do you want game-level graphics, or do you simply want a device to get some work done?
The basic hardware requirements are very reasonable, in fact if you have a device that runs Windows 7 it will quite easily run Windows 8.
The lists below show the system requirements for Windows 8 and Windows 7.

Windows 8:

· Processor: 1 gigahertz (GHz) or faster with support for PAE, NX, and SSE2

· RAM: 1 gigabyte (GB) (32-bit) or 2 GB (64-bit)

· Hard disk space: 16 GB (32-bit) or 20 GB (64-bit)

· Graphics: Microsoft DirectX 9 graphics device with WDDM driver

Windows 7:

· Processor: 1 gigahertz (GHz) or faster 32-bit (x86) or 64-bit (x64) processor

· RAM: 1 gigabyte (GB) (32-bit) or 2 GB (64-bit)

· Hard disk space: 16 GB available (32-bit) or 20 GB (64-bit)

· Graphics: DirectX 9 graphics device with WDDM 1.0 or higher driver
If you still have Windows XP and are looking to upgrade, you may very well need to buy a new Windows 8 device, of which there are numerous choices as you can see from the images scattered around this article: from normal PCs and laptops to convertible devices, from tablets to all-in-one computers.
So you are going to take the plunge, get your new view on the world and personalise your device experience – and why not? Upgrading is easy. Let’s look at the options. If you are simply upgrading your personal device then you can get an anytime upgrade if you have a compatible current Windows OS, or you could go out and buy the DVD and install it on to your device. If you want to check in advance, then why not use the Windows 8 upgrade advisor, a nifty little tool supplied by Microsoft.
If you are a large organisation then you have a few more choices because, let’s face it, running around hundreds of devices with a DVD is not really an option.
This one is a bit like a DVD install, but you use the contents of your volume-licensed Windows 8 media to do an in-place upgrade of your current version of Windows – assuming, of course, that your current version has a direct upgrade path (use the upgrade advisor to find out). This is only really any good for a small number of devices; I really wouldn’t recommend it for hundreds of devices.
Microsoft Deployment Toolkit
This toolkit brings together various pieces of software, such as the Windows Automated Installation Kit, to provide a system which can capture and deploy images of Windows to both bare-metal devices and devices that already have a Windows installation. The toolkit is downloadable from Microsoft for free.
Install it onto a server with Windows Deployment Services and you have a system from which you can create base devices, sysprep them, capture the image over the network and then deploy that image to multiple devices.
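The sysprep step above is worth seeing on its own. A minimal sketch, run on the reference device before capture (the default Sysprep path is assumed; MDT’s capture task sequence normally runs this for you):

```powershell
# Generalise the reference machine ready for image capture
C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown
```

/generalize strips machine-specific identifiers, /oobe makes the image boot into out-of-box setup for its next user, and /shutdown powers the device off ready for the image to be captured over the network.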
The toolkit also has the ability to preload drivers to ensure the Windows installation goes without a hitch; all you need to do is load the driver software into the toolkit’s management interface.
All of the deployment options are configured using scripts which you can adjust to meet your needs. You can have a deployment that asks the end user all the usual installation questions right the way up to a completely zero touch installation.
This beast of a piece of software is the installation gold standard. In essence it gives you the same functionality as the Deployment Toolkit, in that you can capture and deploy images and do it all using scripts. However, the big difference is the functionality and control you have over the installation. System Center gives you so much more, allowing you to send packages to machines once they are installed, automate the installation of applications and service packs, view the hardware of devices, and check upgrade statuses.
If you are licensed for its use then System Center should be your choice for full control over your devices.
Unfortunately something this good doesn’t come free, but Microsoft licensing makes it fairly simple to get your hands on Windows 8. How you license your copy of Windows will depend on your circumstances: individual, business or education.
As an individual you can purchase an upgrade version of Windows 8 as long as you are currently using a licensed previous version of Windows. If you currently don’t use Windows, or are using a version that doesn’t fulfil the upgrade requirements, you will need to purchase the full version from an IT store.
Businesses and education customers have a multitude of ways to purchase Windows 8. You can purchase the full version or an upgrade in the same way that individuals can; however, you can also use your current licensing arrangements to purchase your upgrade to Windows 8. Volume licensing with Software Assurance gives you access to the latest software for your organisation, so upgrading to Windows 8 is as simple as checking your hardware meets the requirements and then downloading the install package. Do remember, though, that Software Assurance only allows you to install an upgrade version of Windows, so the device you are installing it to must already have a fully licensed previous version of Windows.
Windows 8.1 has now been released, so we should look at the licensing arrangements around the latest upgrade. Well, the simple answer is: it’s free!
If you are a personal user already running Windows 8, then simply update your device to the latest Windows 8.1 version.
If you are a business or education customer who buys copies of Windows outright, then the same process applies as for an individual user. However, if you have a volume licensing agreement with Software Assurance, you can download the Windows 8.1 upgrade as an ISO to install onto your devices – remembering, of course, that they must have a fully licensed copy of a previous version of Windows.
Windows 8 is new, it’s different, and it’s personal. Now all your files and settings follow you from device to device, and there are plenty of new devices ready to take advantage of the greatest features of Windows 8. With its reasonable hardware requirements, though, it’s also easy to keep your current device and upgrade to Windows 8 using any of the easy-to-access licensing methods, whether that’s as an individual, business or education customer.
So in summary, go out and get rid of that old version of Windows and upgrade to Windows 8 and open up a new view in your world.
Why is it so hard to get time off for training? It can be hard, because there is a cost associated with a training event. However, whilst SQLRelay offers free training, you may still need some help to explain the value of SQLRelay to your boss. Here are some useful reasons you can provide when you ask for the time to attend a SQLRelay event, and we hope to see you there!
Here are some signs that your organisation needs to send you to SQLRelay…
Well, lots of conferences can give you that! Let’s look at why SQLRelay isn’t like other conferences…
How SQLRelay can make things easier for people who don’t attend…..
To summarise, attending a free training event, given by Microsoft and world-experts, is an excellent investment of your organisation’s time, resources and energy. SQLRelay is coming to a location near you. Come and join us: learn, and get help with your SQL Server issues, for free. We look forward to seeing you there!
SQLRelay is a series of day-long conferences held around the UK by local community organisers. Each event covers a wide range of SQL Server related content delivered by expert speakers from around the world. In its fourth iteration, it’ll be appearing in a city near you during November. For more details consult sqlrelay.co.uk
Help spread the word by getting in touch with us via Twitter (@SQLRelay2013), Facebook (/SQLRelay2013) or LinkedIn.
Event Speaker: Jen Stirrup - Most Valuable Professional (MVP) - SQL Server
Jen is best-known for her work in Big Data, Business Intelligence and Data Visualisation. She is Joint Owner of Copper Blue Consulting, delivering business-critical solutions that add enterprise value in addition to provisioning technical integrity. Jen is a Director-At-Large (Elect) for the Professional Association of SQL Server (PASS) organisation, holding the EMEA seat. Jen is also a current holder of the SQL Server ‘Most Valuable Professional’ Award (MVP) who has also won the SQLPASSion Award, presented by PASS at Summit 2012, for her work in helping the European SQL Server community. Jen has presented at a variety of world-class events including TechEd North America, TechEd Europe, PASS Summit, PASS Business Analytics Conference, SQL Live! 360 and SQLBits, along with SQLSaturday events throughout Europe and the United States.
The iconic kickstand, a better full-HD screen, a lighter form factor and superior sound make Simon May, Microsoft Evangelist, rather obviously fall in love with the new Surface 2 and Surface Pro 2 devices. But are they good for the IT guy?
Last week I was lucky enough to be one of the first people to go hands-on with the new Surface 2 and Surface Pro 2 devices from Microsoft. As always, this series is about what they’re like for IT pros, which I’ll get onto in a few lines, but before I do let me tell you how I use my current Surface devices. Currently I only own Surface RTs – three of them, in fact, two of which are for demo purposes. My main Surface device spends most of its time sat by the sofa and is used for casual non-work stuff, but it’s also used heavily for commuting. For the times I go into London for work I only take my Surface; I don’t need anything else for emails, meetings, blogging or my general day-to-day non-technical work. Surface RT is the perfect device for this because it’s light and I don’t need to charge it. I also have an Android tablet sat there; invariably I prefer the Surface RT.
Let’s start off looking at the new Surface 2, which runs Windows RT. The very first thing I noticed when I grabbed the device was how much lighter it feels than the Surface RT. I am sure there’s not much of a weight difference, but it’s enough to be noticeable. The very next thing I did was to try the iconic kickstand. It feels as solid as the Surface RT’s, with that pleasing spring when it gets to the end of its movement, but it can be pulled to move a smidge further and provide a flatter working angle. I moved the kickstand to the second position and was quite surprised by how that affected my ability to type. In the first position, and on the Surface RT, it’s pretty cumbersome to type on screen; with the Surface 2’s second kickstand position it’s easy to type with both hands – almost touch-typing.
My very next move was to power the device up and log in to set it up. Immediately I noticed how sharp the 1080p screen is compared with the 720p screen on the Surface RT, which made even the Surface logo that little bit smoother. It’s also noticeable on the labels on live tiles, which are just that little bit more readable. Personally I prefer to have more tiles, so I quickly set my Surface 2 to display four rows, and the 1080p screen handles that really nicely too. Within about 10 minutes my apps had started to sync down, so I jumped onto Twitter, which did exactly what you’d expect on a 1080p screen. Wanting to test the screen more, I popped into the Windows Store and installed the 500px app to view some beautiful photography. I have to say the clarity of the screen, the contrast of the colours – everything about the screen makes it wonderful to look at.
Taking a look at the desktop to use the Microsoft Office apps also didn’t disappoint me. The higher resolution makes Office just that little bit nicer to work with, which I think is because it’s slightly more congruous with the display on my Asus Zenbook Prime; things just seem to be the right dimensions.
Everything around the interface feels snappier than my Surface RT, with apps loading just that little bit more quickly. Overall I found the Surface 2 to be a pretty great improvement over the Surface RT for me; I’ll probably be buying my own. Sometimes people say to me that it’s not a great device for IT pros because it doesn’t run desktop apps. I, however, find that it does almost everything I need for short periods, and does it much better than anything else I’ve ever used. I have easy access to PowerShell and to Remote Desktop, and in fact through Remote Desktop I deliver a couple of apps I need occasionally (like the RSAT) using RemoteApp, and they basically feel like native tools.
Another thing I like, which is actually a Windows 8 feature, is the ability to wipe my device. The device I used for this review wasn’t mine, was not going to be mine, and other people needed to use it, so I used the reset ability of Windows 8 to just reset the device and take away all my customisations before I handed it off. Very handy for recycling your old Surface RT device, I thought.
Surface Pro 2 for the Professional
Next I moved on to taking a look at the Surface Pro 2. A colleague had signed into this device first and it was set up with their Microsoft Account. The very first thing I did was play a movie trailer from Xbox Video – not so that I could see the screen (it’s 1080p just like the Surface Pro) but so I could hear the sound. The Surface Pro 2, and actually the Surface 2, have Dolby audio built in, and wow do they sound good! The sound is excellent and probably the best of any tablet device: both have two speakers (lots of tablets only have one, i.e. mono) with multiple drivers, and they sound superb. I could happily use the Surface Pro 2 as a music device or to watch whole movies on.
I wanted to give the USB 3.0 port on the device a try, so I moved a huge amount of data over from a USB 3.0 memory stick, and transfer speeds averaged about 34 MB/s. Copying from the Surface 2 to the stick managed a similar average transfer speed, so we can tick the “it just works” box. I also ran some benchmarks on the device and it outperformed my new laptop (Asus Zenbook Prime) in almost every way, from drive speed to 3D graphics performance and various CPU tests. I have to say it was impressive in every respect and obviously a total laptop replacement for an IT pro – with this you’d only need one device for everything in your life, even a little bit of virtualisation!
In Windows Server you can create two kinds of Virtual Desktop Infrastructure (VDI) collection: personal or pooled. A personal collection is a bit like a company car scheme where everyone chooses their own car. This means there needs to be a car for everyone, even if they are on leave or sick, and each car needs to be individually maintained. However, the employees are really happy as they can pimp their transport to suit their own preferences. Contrast that with a car pool of identical cars, where an employee just takes the next one out of the pool and, when it’s brought back, it’s refuelled and checked ready for the next user; you don’t need a car for everyone as there’ll be days when people just come to the office or use public transport to get to their destination. That seems to be a better solution than company cars for the employer, but not so good for the employees. Pooled VDI collections work like pool cars in that they are built from one template and so only one VM has to be maintained, but that means every user gets the same experience, which might not be so popular. However, pooled VDI in Windows Server 2012 has a method for personalising each user’s experience while still offering the ability to manage just one template VM, and that’s why I want to use pooled VDI in my demos.
Carrying on from my last post, I right-click on RD Virtualisation Host and select Create Virtual Desktop Collection.

Now I get to specify the collection type.

Having chosen the collection type, I now need to pick a template on which to base the pool.
I found out that you can’t use the new Hyper-V generation 2 VMs as a VDI template, even in Windows Server 2012 R2 RTM. This does mean you can still use that WimToVHD PowerShell script I have been promoting in earlier posts in this series to create my template directly from the Windows installation media.
Note: you’ll need Windows 8.1 Enterprise for this, which is currently only available on MSDN until 8.1 is generally available in a couple of weeks, when there should be an evaluation edition available.
In fact, for a basic VDI demo the VHD this creates can be used as is; all you need to do is create a new VM from this VHD, configured with the settings each of the VDI VMs will inherit, such as CPU, dynamic memory settings, virtual NICs and which virtual switches they are connected to, as well as any bandwidth QoS you might want to impose.
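If you prefer to script the template VM rather than click through Hyper-V Manager, a minimal sketch might look like this. The VM name and VHD path are assumptions for illustration; FabricNet matches the virtual switch mentioned below:

```powershell
# Create a generation 1 template VM from the sysprepped VHD
New-VM -Name "VDITemplate" -Generation 1 `
    -VHDPath "E:\VMs\Win81Template.vhdx" -SwitchName "FabricNet"

# Settings like these are inherited by every VM in the pooled collection
Set-VMProcessor -VMName "VDITemplate" -Count 2
Set-VMMemory -VMName "VDITemplate" -DynamicMemoryEnabled $true `
    -MinimumBytes 512MB -StartupBytes 1GB -MaximumBytes 2GB
```

Whatever you set here becomes the baseline for the whole pool, which is exactly why the template deserves a little care.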
Here you can see the settings for my template VM, such as it being connected to my FabricNet virtual switch.
Normally when you build VMs from templates you will want to inject an unattend.xml file into the image to control its settings as it comes out of sysprep (as I have done in earlier posts in this series). This wizard helps you with that, or you can just enter basic settings in the wizard itself, as I have done, and not bother with an unattend.xml file at all.
Now I can start to configure my collection by giving it a name, setting how many VMs it will contain and specifying who can access it.
In a production environment you would have several virtualization hosts to run your collection of VMs and here you can specify the load each of those hosts will have.
Having specified which hosts to use, I can now get into the specifics of what storage the VMs will use. I am going for a file share, specifically one of the file shares I created earlier in this series, which will make use of the enhancements to storage in R2. Note the option to store the parent disk on a specific disk; this might be a good use of some of the new flash-based devices, as the parent disk will be read a lot but rarely updated.
My final choice is whether to make use of user profile disks. This allows all of a user’s settings and work to be stored in their own virtual hard disk; whenever they log in and get a pooled VM, this disk is mounted to give them access to their stuff. This is really useful if all your users only ever use VDI, as you don’t need to worry about roaming profiles and so on. However, if your users sometimes use VDI and sometimes want to work on physical desktops such as laptops, then you’ll want to use the usual tools for handling their settings across all of this so they get the same desktop whatever they use – remember, we work for these people, not the other way around!
That’s pretty much it – the desktops will build and your users can log in via the web access server, in my case by going to http://RDWebAccess.contoso.com/RDWeb
To demo the differences in performance of a pooled VDI collection that sits on a storage space that's had deduplication enabled, I could create another collection on the Normal* shares I created in my post on storage spaces by doing all of this again. Or I could just run a PowerShell command, New-RDVirtualDesktopCollection, and set the appropriate switches:
$VHost = "Orange.contoso.com"
$RDBroker = "RDBroker.contoso.com"
$CollectionName = "ITCamps"

# The VDI template is a sysprepped VM whose virtual hard disk, network settings etc. all the pooled VMs will inherit.
# The VHD runs Windows 8.1, configured and sysprepped with any applications and settings needed by end users.
$VDITemplateVM = Get-VM -ComputerName $VHost -Name "Win81x86 Gen1 SysPrep"

New-RDVirtualDesktopCollection -CollectionName "ITCamp" -PooledManaged -StorageType CentralSmbShareStorage -VirtualDesktopAllocation 5 -VirtualDesktopTemplateHostServer $VHost -VirtualDesktopTemplateName $VDITemplateVM.Name -ConnectionBroker $RDBroker -Domain "contoso.com" -Force -MaxUserProfileDiskSizeGB 40 -CentralStoragePath "\\fileserver1\NormalVMs" -VirtualDesktopNamePrefix "ITC" -OU "VDICampUsers" -UserProfileDiskPath "\\fileserver1\NormalProfiles"

My good friend Simon May can then gradually add more and more VMs into the collection with the Add-RDVirtualDesktopToCollection cmdlet to see how much space he can save.
The other really clever thing about a pooled VDI setup like this is maintaining it. Clearly you will want to change the template the pooled collection is based on from time to time, for example to add or remove versions of applications and to keep patches up to date. All you have to do is make another template VM with the new applications and latest patches and then update the collection from the collection management screen, or via the Update-RDVirtualDesktopCollection PowerShell cmdlet, for example:
PS C:\> Update-RDVirtualDesktopCollection -CollectionName "ITCamp" -VirtualDesktopTemplateName "$VDITemplateName" -VirtualDesktopTemplateHostServer $VHost -ForceLogoffTime 12:00am -DisableVirtualDesktopRollback -VirtualDesktopPasswordAge 31 -ConnectionBroker $RDBroker
where I would have set $VDITemplateName to the modified and sysprepped VM to base the updated collection on. Note the -ForceLogoffTime setting; that’s when users will be thrown out and forced to log on again. If you don’t set this, they’ll only get the new version when they log out and log in again. However you manage that, if you have used user profile disks in the collection, as I have done, their preferences and settings will persist on the updated collection.
So that’s the basics of setting up VDI on a laptop for your evaluations. From here I could go on to add other parts of the Microsoft remote desktop solution, such as:
However, I would be interested to know what you would like me to post next, so please add comments or, if you are shy, e-mail me.
By Dana Simberkoff, Vice President, Risk Management & Compliance, AvePoint.
For most organisations worldwide, it’s no longer a matter of “if” they will move to the cloud but rather “what” they will put in the cloud. Keeping everything on-premises within the walls of an organisation is unrealistic. Cloud is increasingly a part of more and more IT business strategies, at least judging by the rapid spending on cloud-related services. According to a recent IDC study, public cloud services spending will reach $98 billion USD in 2016, with a compound annual growth rate five times that of the IT industry overall.
Why? Companies are constantly looking for ways to do more, to collaborate better, to create more products, to keep pushing the revenue needle forward – all while enabling an increasingly global workforce. Cloud computing offers many advantages to technology providers and their customers, allowing companies to invest far less in infrastructure and resources that they must host, manage, administer and maintain internally. It instead allows them to invest in the advanced applications they build on an externally hosted and fully redundant environment that they can access at a fraction of the cost – not just saving on what are traditionally capital expenditures on hardware, but, more importantly, gaining business agility. The business landscape has never been more competitive, and every enterprise is looking for an edge. Judging by the numbers, many believe utilising the cloud to manage enterprise systems and content with repositories such as Microsoft SharePoint Online will help pave their way to victory.
With this great reward, however, comes great risk. Hosting SharePoint through Microsoft Office 365 could reduce cost and improve global access to content. However, for organisations subject to regulatory requirements (and that’s essentially every organisation today regardless of size, vertical, or geography), the move to the cloud isn’t without risk. Enterprises have tremendous concerns about storing business data outside their own walls because it means relinquishing control – control of information, user access, authentication, and data exposure (whether intentional or accidental) of sensitive personally identifiable information, classified information, or otherwise non-compliant content.
So you accidentally let someone take a peek at the wrong data – how much harm can that data breach possibly do? About $5.4 million USD worth, according to a recent study by Ponemon. The study found that the average organisational cost for a single breached record – a document, user ID, email, email address – is $188 USD. Think about the number of emails clogging up your own inbox, and documents in your shared drive right now … it adds up very, very quickly.
Before you call the sales representative selling you a cloud platform and tell her no thanks, consider this: there is a way to gain value from cloud computing while addressing compliance concerns. Many companies are offloading select content or workloads into the cloud and keeping their most regulated content on-premises. You won’t be alone: a report by IDC found that 80 percent of the world’s 2,000 largest companies will still have greater than 50 percent of their IT onsite by 2020.
So, what’s your move to start the migration from your old on-premises technology platforms to the cloud? Here’s your four-step playbook:
1) Assess existing sites and content. Identify at-risk content and sensitive data within your “as is” on premises environments – including SharePoint or file share content – that could potentially violate your compliance policy. Perform a risk analysis to understand exposure levels for a defined scope of content, as well as the effectiveness of existing controls to determine the overall sensitivity of an existing SharePoint environment.
2) Report on and classify content. Implement an effective and realistic compliance program that can be enforced, measured and modified as needed. Identify what data your organisation collects, processes and stores (and where it comes from) and decide on applicable/mandatory privacy and security requirements – what, where, why, and how. Provide information classification based on risk exposure to the organisation. Define minimum content and physical security access controls based on risk classification. Assign metadata and restrict access to sensitive content.
3) Design compliance information architecture. This is your chance to expose, access, and manage all content residing in your network and/or the cloud for centralised document management in SharePoint based on your specific business requirements – such as restructuring permissions, and adjusting access, metadata and security settings of content. Strictly regulate user-generated content to prevent the creation or uploading of non-compliant, harmful content.
4) Determine cloud migration approach. Utilise content and site assessment reports and subsequent tagging to develop a best practices approach to migrate select content and workloads to the cloud. You can do so by identifying cloud-appropriate content for migration with customisable filters based on metadata or content types you established in Step 3. Scan, flag and/or block all contents prior to upload to ensure compliance. Detect and make changes to content and/or user permissions and access that violate your policy. Then, just as in any other migration – determine your schedule and project milestones to ensure that the project meets your business needs and keeps your end users focused on what they should be focused on: doing their jobs.
As companies and government agencies move their applications increasingly to a cloud based infrastructure, they must also understand and fully review the associated privacy and security considerations. Privacy is a global issue, and one thing is certain, even if you build software applications to serve a very specific market segment - you cannot ignore privacy as a fundamental issue that your customers will demand. Change can be hard, but this is a positive change. You’ll be utilising a new way of working in the cloud that can vastly improve your business agility, while keeping traditional hardware costs low and safeguarding your sensitive data. In the meantime, keep your feet firmly planted on the ground as your applications move to the sky!
Following the success of MVP Cloud OS Week held this September at Microsoft's Victoria Office, CloudOS is going on the road.
We are pleased to announce the next phase of our MVP-led events, the MVP Cloud OS (Infrastructure) Relay, to be held November 11-15th and 25th to 29th in selected cities nationwide.
Join Microsoft and a panel of MVP speakers, to learn about Cloud OS and how you can use the technology suite from Microsoft to transform your business.
Today's business runs on IT. Every department, every function, needs technology to stay productive and help your business compete. And that means a wave of new demands for applications and resources. The datacenter is the hub for everything you offer the business, all the storage, networking and computing capacity. To ride the wave of demand, you need a datacenter that isn't limited by the restrictions of the past. You need to be able to take everything you know and own today and transform those resources into a datacenter that is capable of handling changing needs and unexpected opportunities. With Microsoft and its Cloud OS strategy, you can transform the datacenter.
Where and when are the events?
If you want to go to the SQL Relay, click here.
These will be great events: top speakers, top topics and a little bit more…
Speakers will include
MVP Gordon McKenna, MVP David Allen, MVP Damian Flynn, MVP Raphael Perez, MVP Rob Marshall, MVP David Nudelman, Martyn Coupland, Sam Erskine (author of two cookbooks), John Quirk and myself, MVP Simon Skinner.
One of the great things about virtualisation is that the host operating system running the hypervisor is independent of the operating systems in the VMs. For example, VMware ESXi is not the same as the Linux and Windows operating systems in the guest VMs that reside on it. You might be a little confused when you look at Hyper-V in the same way, but actually the same principle applies: you could run Windows Server 2012 Hyper-V and have Windows Server 2003/2008/2008 R2 in the VMs, and contrary to popular belief Linux also works well on Hyper-V and is fully supported for the latest versions. Note: it is technically possible to run much older operating systems on Hyper-V, such as MS-DOS, OS/2 and Windows for Workgroups; it’s just that those aren’t supported, because those operating systems aren’t supported at all, even when run on physical hardware.
The point I want to make here is about what effect upgrading the hypervisor has on the guest operating system in the virtual machines. This can be likened to reinstalling that operating system on new hardware, which in turn means driver support. VMware Tools / Hyper-V Integration Components provide synthetic drivers for such things as CPU, storage, networking and time synchronisation, and also feed back to the hypervisor the state and usage of those resources. So from the perspective of the guest operating system, moving hypervisors means ripping out and replacing these drivers. From the host’s perspective, it might mean a change to the metadata and hard disk files that represent that virtual machine.
None of this is difficult, but it does involve some change, albeit less than changing the guest operating system. So why bother upgrading or changing a hypervisor at all?
If I look at what Hyper-V offers in Windows Server 2012 R2 compared with the original version that shipped with Windows Server 2008, then everything has got easier and faster, with corresponding improvements in high availability (HA) and the different but equally important world of disaster recovery (DR). Some of this reflects what modern hardware can do, such as NUMA in CPUs and SR-IOV on network cards, while other improvements are entirely down to reworking the hypervisor itself to provide access to parallel processing without getting caught up waiting for threads to become available on CPU cores.
So you’d have to have some obscure use case to stop you upgrading from one to the other, as there would be no license cost involved: Microsoft doesn’t charge for Hyper-V, just for the licensing in the VMs, and I am assuming you are already licensed for those! So in return for a bit of work you get access to all the new stuff in Hyper-V.
Of course, you could also move from Hyper-V in whatever version of Windows Server to VMware, using one of their many licensing options to suit your HA and DR needs, the number of VMs per server, and so on. In preparing a cost-benefit analysis of this against moving to Windows Server 2012 R2, it’s worth bearing in mind that you’ll still need licenses for the operating systems in the VMs themselves, whichever hypervisor you choose. Often the best way to do that is to license the host with Windows Server Datacenter edition, which covers you to run as many VMs as you want on that host, each licensed to run Windows Server, and covers running Hyper-V on the host as well. For a few edge cases that analysis might weigh in favour of VMware, or the premium might be worth paying for some particular feature like VM fault tolerance that doesn’t exist in Hyper-V. I say edge cases because I don’t see that happening a lot in the current market.
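The Datacenter-edition point above is really just arithmetic: once you run enough Windows Server VMs on one host, a single licence covering unlimited VMs beats stacking per-VM licences. The sketch below illustrates the shape of that calculation only; the prices and the two-VMs-per-Standard-licence assumption are hypothetical placeholders, not Microsoft list prices.

```python
# Back-of-envelope sketch of the host-licensing trade-off.
# Prices below are made-up placeholders; Standard edition is
# assumed (hypothetically) to cover two Windows Server VMs per license.

STANDARD_PRICE = 900      # hypothetical cost per Standard license
DATACENTER_PRICE = 4800   # hypothetical cost per Datacenter license
VMS_PER_STANDARD = 2

def cheapest_option(vm_count):
    """Compare stacking Standard licenses against one Datacenter license."""
    licenses_needed = -(-vm_count // VMS_PER_STANDARD)  # ceiling division
    standard_total = licenses_needed * STANDARD_PRICE
    if standard_total < DATACENTER_PRICE:
        return ("Standard", standard_total)
    return ("Datacenter", DATACENTER_PRICE)

for vms in (4, 12):
    print(vms, "VMs ->", cheapest_option(vms))
```

With these placeholder numbers the break-even lands around ten VMs per host; plug in your actual agreement pricing to find yours.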
What I do see is movement from VMware to Hyper-V. I don’t propose to do a feature comparison here (if you are interested, Keith Mayer’s post is as good as it gets). What I want to focus on is three things:
1. Hyper-V’s advances over the last five years have outstripped VMware’s enhancements. For example, the new features from Windows Server 2012 to 2012 R2 all enhance Hyper-V in some way, be that for VDI, storage or DR. That rate of change isn’t evident from vSphere 5.1 to 5.5, where most of the changes simply bring the scalability numbers into line with Windows Server 2012 R2.
2. Windows is Windows. If you know how to manage a Windows server, you can manage Hyper-V. This reduces the staff costs associated with running server virtualisation, because you don’t need a separate team with different skills. A good example is firing up Server Manager and seeing your physical hosts alongside your virtual machines on one screen. This is actually good for us IT professionals in those teams: we can either acquire broad Windows Server knowledge, including virtualisation, in a smaller team, or transfer our skills and find career progression in a larger one.
3. Hyper-V is fit for purpose. While the Hyper-V that you buy in Windows Server 2012 R2 is not exactly the same as the one running behind Azure, Office 365, Bing and so on, there is a lot of common code. I could rattle off a list of references who are on Hyper-V now, like Royal Mail, Unilever and Aston Martin, but perhaps the best evidence of Hyper-V being ready for business is silence. By this I mean that when anything technical goes wrong these days, forums and social media are alive with it very quickly, and that has not been the case with Hyper-V.
So my assertion is that when you upgrade your hypervisor, you need to consider Hyper-V.
Here is the final programme update for each day of Tech.Days Online, running Wednesday to Friday, November 6-8. Not only are we delighted to confirm that Steve Ballmer will be joining us on the first day, but we also have British Lions rugby legend Will Greenwood confirmed to share the British Lions and Microsoft story, as well as the Deputy CIO of Lotus F1 on board to talk about Office 365. You can still send in your questions to Microsoft CEO Steve Ballmer before Wednesday: just send them to firstname.lastname@example.org and we’ll select the best to ask him during the interview.
All sessions are 30 minutes, and the technical experts running them, a mix of Microsoft product experts and Microsoft Most Valuable Professionals (MVPs), will be available after each session for further online chat and follow-up on any questions you have.
Remember that there will also be competitions and prizes to be won throughout each day, from T-shirts to an Xbox One, so do switch on, tune in and join us for all the sessions you want to participate in by registering for Tech.Days Online, starting on Wednesday November 6th.
Tech.Days Online – November 6-8 - The Final Programme Update
Wednesday November 6 – Windows Client for IT Pros and Developers
Session Title (all sessions are 30 minutes)
Overview of the day
Windows 8.1 – devices galore! + interview with Will Greenwood on devices and British Lions
MDOP 2013 Overview and Deeper Dive on changes in Application and User Experience Virtualisation
Management in the cloud with Windows Intune Configuration Manager
Steve Ballmer, Microsoft CEO, Live Interview
Device Management – Heterogeneous Device Management
Office 365 – The Evolving Service + Interview with Michael Taylor, Deputy CIO, Lotus F1 team
Building business Apps with Visual Studio DevOps
Windows 8.1 – Workplace Join
Windows 8.1 - VDI
Find out about what you can do with Intel vPro
Windows 8.1 Enterprise
Wrap-up of Day 1 (inc. announcement of today’s Xbox One winner)
Thursday November 7 – Server and Cloud for IT Pros
2012 R2 - Virtualisation
Building Windows Server 2012 R2 Networking with System Center 2012 R2 Virtual Machine Manager
2012 R2 - Storage
Extreme automation - Learn automation or get better at golf!
What’s new in Ops Manager
Cluster in a box
Moving VMs from on-premise to Azure
Automating the Azure Datacentre with PowerShell
Ask the Experts – Your questions answered by today’s expert presenters
Windows Azure Platform
Wrap-up of Day 2 (inc. announcement of today’s Xbox One winner)
Friday November 8 – Visual Studio, Azure, Dev tools for Developers
Asynchronous C# development in Visual Studio 2013
Agile development with Team Foundation Server
Quick and Easy Cloud Back-Ends for Mobile Apps
Using the Nokia Music C# API on Windows Phone 8 / Windows 8
Azure Cloud Services Architecture
From Whiteboard to deployed in 15 minutes
What's new in Visual Studio 2013 for Web Developers
What's new in Visual Studio 2013 for App Developers
What's new in Windows 8.1 for App Development
Wrap-up of Day 3 (inc. announcement of today’s Xbox One winner)
Remember to register for Tech.Days Online from November 6-8 here
By Geoff Evelyn, SharePoint MVP and owner of SharePointGeoff.com
There are several elements to SharePoint, all of which require support. These elements include back-end, front-end, integrated, business and configuration support. When supporting customers using the SharePoint platform and the relevant SharePoint solutions on that platform, there will be instances where what you support is not entirely the same as what the customer thinks you support (though that's another conversation altogether).
So, a key success criterion for SharePoint is the success of its support service. The capability of that support service rests on one key element: defining, understanding and setting exactly what is being supported, and setting customer expectations accordingly.
So what does SharePoint support cover? I thought the best way to explain would be to craft a basic table giving a high-level view of the key areas:
Support area: Advanced usage, such as data integration, third-party components, internally/externally developed Apps and add-ins
Support provided by: Peers, product creator, user representative, user training

Support area: Usage of SharePoint, site management, site administration
Support provided by: SharePoint champion, user training, helpdesk, SharePoint support

Support area: SharePoint configuration management - installation, interfacing to other systems, connection to third-party products, change control
Support provided by: SharePoint support, IT helpdesk

Support area: Environment - server operating systems, client operating systems, back-end software support
Support provided by: SharePoint support, helpdesk, engineering and platform support

Support area: Implementation and deployment
Support provided by: SharePoint support, helpdesk
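The table above is essentially a routing matrix: given a support area, it tells you which teams can field the query. A minimal sketch of that lookup follows; the area keys and team labels mirror the table, but the routing function itself is purely illustrative.

```python
# Illustrative sketch: the support table as a routing lookup.
# Area names and team labels follow the table above; the fallback
# behaviour for unknown areas is a hypothetical design choice.

SUPPORT_MATRIX = {
    "advanced usage": ["peers", "product creator",
                       "user representative", "user training"],
    "site usage and administration": ["SharePoint champion", "user training",
                                      "helpdesk", "SharePoint support"],
    "configuration management": ["SharePoint support", "IT helpdesk"],
    "environment": ["SharePoint support", "helpdesk",
                    "engineering and platform support"],
    "implementation and deployment": ["SharePoint support", "helpdesk"],
}

def route(area):
    """Return the teams responsible for a support area, or flag for triage."""
    return SUPPORT_MATRIX.get(area.lower(), ["triage required"])

print(route("Environment"))
```

Encoding the matrix like this makes the gaps visible too: any query that falls through to "triage required" is exactly the kind of unowned area the rest of this article is about.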
SharePoint 'support' could not possibly provide 100% cover for all the areas described in the above table. Also, support tends to become more vague as the problem area tends towards user specialisation, moving from the bottom of the table upward. An example of specialisation is the use of third-party SharePoint tools or SharePoint Apps, which may require specialised skills from the third party, which SharePoint support then draws on. Another example would be SharePoint solutions built internally by bought-in external expertise, which will require an element of internal support once delivered. Again, those skills cannot be provided solely by an internally facing SharePoint team. Similarly, a finance specialist writing macros in a spreadsheet and expecting SharePoint support may find that he or she is alone; and yet, if that finance specialist needs to upload that macro-laden document, he or she may need support. In any case, it is quite possible that at this level users often fall back on their own resources, the training they have had, or contact with other users with similar problems.
Hence, one conclusion that can be drawn from the above table is that one method of delivering support is rarely enough. The same can be seen from the individual weaknesses of the different forms of support.
To explain further, I am now going to describe a fictitious scenario: a software house that has developed SharePoint Apps and add-ons, and provides a SharePoint support service for those products to its clients. This may give you an understanding of the successes and failures associated with that model.
1: Scenario - Fabrikam SharePoint software house
Fabrikam is a software house which develops SharePoint tools and apps and sells them to customers. To look after these products, it provides a support team serving a number of external clients. With a head office and several branches in the UK, plus branches in two other European countries, it also has a large number of users of its own internal SharePoint platform to support, in addition to its tools and apps customers.
Support to external users is slick and professional. Fabrikam has split its SharePoint customer support service into two. One part is a 'Maintenance' operation. Support members use an administrative helpdesk system through which customers report product failures, and from there they contact the clients to resolve the issues. These support members also have access to a further 'technical' helpdesk which they can contact for assistance with the SharePoint issue. In this way the support team has an extremely high spot rate: it rarely needs to pass on queries, and to keep this spot rate high it maintains a SharePoint site acting as a 'technical library'. The technical helpdesk also re-contacts key clients, first to see if the issue can be resolved without SharePoint support getting directly involved, and second to decide whether additional resources are required.
The second part of the customer support service is the 'Users helpdesk'. This is dedicated to solving problems and challenges users have. This has two tiers; first, general query and answer helpdesk, and a second, a chargeable premium support helpdesk for customers requiring a fully fledged problem-solving service.
Both parts of the customer service have a strong emphasis on user relationships, keen to build up a rapport with their external customers. The service takes queries from customers who are themselves technically competent (usually calling from a client's internal helpdesk service), answers those it can immediately, and passes what it cannot answer to a separate, central 'technical support' function, which solves the problem and contacts the customer. Technical support also provides software solutions to the system maintenance helpdesk. Clients and the company teams can also use the technical library, which as well as retaining a professionally managed catalogue of technical volumes also offers online technical documents.
In contrast, support to internal users is slender. Fabrikam has a 'development' function responsible for coding SharePoint Apps, Tools and patches. It answers the occasional user query but does not operate a formal helpdesk. User training is virtually non-existent. Users are expected to learn from their office colleagues and no responsibility for the standard of user ability is taken. Due to the regular nature of the interaction between internal users and development, a single, overstretched 'technical support' operative emerges, moving from user to user, solving problems reactively.
The contrast in this scenario between the professionalism of external customer support and the sparsity of internal support is not uncommon. SharePoint support grows out of commercial imperative under the watchful eye of managing the corporate purse. Internal support can risk a little complacency, for it can always turn to its better equipped and better financed colleagues down the corridor. However, the lack of internal user training and poor internal support reduces the effectiveness of the workforce and worsens the learning curve for new recruits. It also occasionally causes customer support to be diverted from its true purpose in order to help out with internal support.
That the customer support side is successful is not in doubt. With such a broad range of services, plus a technical library to catch those support queries which don't fit the other services, their coverage is as broad as could be expected. They have separated loss-making from profit-making customer support work, and use one to fund and sustain the other.
The separation of helpdesk and technical support causes problems in some implementations, however, here it is a distinct advantage. Great store is set by the helpdesk's ability to answer queries on the spot. This pleases external customers, who of course want the fastest possible answer, as well as minimizing the number of staff needed in technical support. In order to make sure technical support does not get out of touch with customers, job rotation takes place between the helpdesk and resolver function. The main weakness of this focus on a high spot rate is the tendency of some technicians to mark a query as closed, when in fact the client may have wanted more. To control this tendency, the helpdesk manager needs to keep a careful watch on answer quality, particularly ensuring that requests for aid are directed and recorded correctly.
Where the systems helpdesk fails is in its inherent inability to manage user expectations for internal support services. In order to attract custom, the company sales force naturally tends to overstate the capability of SharePoint support to its customers, and this increases demand to a level at which the quality of support becomes unsustainable and unusable for internal staff.
The lesson here is that if support is free, it will be oversubscribed, whether it is needed or not. The chargeable, premium support is in a much better position to control expectations through service level agreements.
On the SharePoint product and Apps maintenance side, the separation of the administrative from the technical causes considerable duplication of work. This confuses some customers, who see no reason for two contacts on the same topic before the relevant support individuals are assigned.
From the company's point of view, there is a trade-off between reducing technical people to logging fault calls from users and having a separate, less-skilled but still expensive service just to answer the telephone. To put this into context, in the past companies may have solved this dilemma by issuing their customers with batches of fax header sheets, pre-printed with their account details, for reporting issues or requirements. Nowadays, customers are asked to fill in an online support form, which is then routed to the engineers via an administrative helpdesk.
Arguably, no administrative helpdesk is needed at all: the faults could come straight into the technical helpdesk, which can then decide whether to call the client for further information before passing the report on to an administrator to allocate the job to the relevant support individual.
4: In Conclusion
Concerning the scenario described in this article, the more products a software house creates, the more diverse and complex the nature of its support becomes.
However, whether the SharePoint solutions are created externally, internally, or a combination of both, the increase in products decreases the impetus for internal support staff to have their skills upgraded.
The answer to the question of who supports what is therefore dependent on the level of support that you wish to provide to your customers.
If the product is an App, then most likely you will not support anything other than that App. But where do you start offering support, and where do you stop? Do you start at the client PC, through the web application, to the site collection, to the App? Or just the App? And if you do, is that clear to the customer, so that the level of support provided is what they expect? What happens when another product is created which sits closer to the client PC? How does that change the support for the App?
And what if we move from custom development to solution development by a business user? Say, for example, a solution using a third-party web part component in a SharePoint site is configured by a business user. Where does support start and end for that business user?
And in order to support internal customers, they need to be ranked by importance: some internal customers will be as important as (if not more important than) external customers. That needs to be identified and wrapped into the professional offering provided. Support for any SharePoint solution must be aligned to SharePoint service delivery, so that the resources required to support released solutions are defined and agreed. The reason for that statement is made clear in the following blog:
Six action points to properly define what is SharePoint supported are (in order):
1. Identify the customer base - define their priorities.
2. Inventory the products the customers use and map accordingly.
3. Associate products to specific levels of support
4. Create Service Level Agreements backed up by operational statements aligned to the products offered. A service level agreement's aim is to set expectations, inspire confidence in the availability of the solution provided, and outline support details. http://www.sharepointgeoff.com/articles-2/sharepoint-service-level-agreement/
5. Associate skillsets to the areas of support identified in the Service Level Agreement
6. Ensure that all products are part of a roadmap that includes configuration management
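The first three action points above amount to building a small data model: customers with priorities, the products they use, and the support level attached to each product. The sketch below shows one possible shape for that model; every name, priority and support level in it is made up purely for illustration.

```python
# Hypothetical sketch of action points 1-3: identify customers and
# priorities, inventory their products, and associate each product
# with a defined support level. All values are illustrative.

from dataclasses import dataclass, field

@dataclass
class Product:
    name: str
    support_level: str                      # e.g. "app-only", "platform", "full"
    skills: list = field(default_factory=list)  # skillsets per action point 5

@dataclass
class Customer:
    name: str
    priority: int                           # 1 = highest importance
    products: list = field(default_factory=list)

timesheet_app = Product("Timesheet App", "app-only", ["App development"])
internal_hr = Customer("HR department (internal)", priority=1,
                       products=[timesheet_app])

def support_scope(customer):
    """Answer 'who supports what': each product and its agreed level."""
    return {p.name: p.support_level for p in customer.products}

print(support_scope(internal_hr))  # {'Timesheet App': 'app-only'}
```

Once this mapping exists, the Service Level Agreements in action point 4 simply document it, and the skillsets in action point 5 hang off each product record.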