Insufficient data from Andrew Fryer

The place where I page to when my brain is full up of stuff about the Microsoft platform

September, 2010

  • Windows 7 for friends and family

    It’s not always Microsoft’s fault…

    For the last month or so, my mum’s Asus Revo mini desktop PC kept starting Windows Repair intermittently on boot, and her local tech support man (I couldn’t get down to her) thought Windows updates were the problem. Also, when it did power up it was taking ages to start. My inclination was a hard disk problem, and as I couldn’t get it to start Windows at all I stripped the whole thing down (not the easiest machine to do this on), swapped out the hard disk and proceeded to install Windows 7 and the other stuff she would need.

    It all took about 4 hours, and yes it rebooted several times and installed upwards of fifty updates, but I was out having a meal during that bit, and when I came back it was all done first time. I then put the old hard disk in a drive bay enclosure and got her data off it and onto the new hard disk, so now the only reason I need to visit is as a loving son rather than an IT Professional.

    The other deployment scenario I worked on at home recently was to upgrade our home machines with the acquisition of two barebones systems from Novatech. These machines were both running Windows 7, so all I had to do was move the old system hard disks across and fire up the new barebones machines for everything to work. Despite what I have read on some forums, including Microsoft social, you definitely don’t need to do a reinstall with Vista or Windows 7: they just auto-detect the hardware, reboot and ask you to reactivate your Windows license. The limiting factor on my Windows Experience Index score is now those hard disks, and I would need to reinstall Windows if I wanted to swap those out too, of course.

    While I would defer to my colleague Simon and the excellent Springboard resources for mass deployments of Windows 7, it was simple enough for me to do all of this.  So you could gain some kudos by doing your own local deployments like this for your neighbours, friends and family and perhaps spend less time fixing their issues and more time socialising with them as a result.

  • Spatial and OLAP data in Reporting Services

    I spent a very interesting afternoon last week with the IT team at the RSPB to explore what they could do to combine their mapping data with the OLAP cubes they have in Analysis Services. The only out-of-the-box solution in SQL Server for spatial data is the map control in Reporting Services (in SQL Server 2008 R2). This map control allows you to create the appearance of drilling down to more detail, as you can pass details of where on the map is being clicked to another report, much like passing parameters in previous versions of Reporting Services. The other interesting thing about the map control is that it allows you to combine sets of data in the one control. This would allow the RSPB to combine a relational query containing the spatial data with an MDX query and then join them on a common field.

    The spatial hierarchy would use the standard UK postcode:

    [changed 20/09/10 as I got the labelling of the postcodes mixed up - thanks to Robert Edgson for picking this up]

    Postal town/London area (first 1 or 2 characters) –> outbound code (the group before the space) –> sector (the first number after the space) –> full postcode

    for example for Microsoft UK:

    RG –> RG6 –> RG6 1 –> RG6 1WG
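    If the full postcodes were held in a SQL Server table, the levels of that hierarchy could be derived with a little string handling. Here’s a rough T-SQL sketch (illustrative only; real postcodes need more careful validation, and the names are made up):

    DECLARE @Postcode varchar(8) = 'RG6 1WG';

    SELECT @Postcode                                            AS FullPostcode,   -- RG6 1WG
           LEFT(@Postcode, PATINDEX('%[0-9]%', @Postcode) - 1)  AS PostalTown,     -- RG
           LEFT(@Postcode, CHARINDEX(' ', @Postcode) - 1)       AS OutboundCode,   -- RG6
           LEFT(@Postcode, CHARINDEX(' ', @Postcode) + 1)       AS Sector;         -- RG6 1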

    This would then translate into three linked reports:

    • A static UK-wide report with all the postal town boundaries on it, e.g. RG – Reading.
    • A postal town report which takes a postal town as a parameter (e.g. RG) and shows the outbound codes in that postal town/London area.
    • An outbound code report which takes an outbound code as a parameter and shows the sectors and/or postcodes in the area specified by that outbound code.

    I would then publish the map report parts for these maps, another new feature in SQL Server 2008 R2 Reporting Services.

    The OLAP data in the cube has a location dimension with the parts of the postcode in a postcode hierarchy: postal town –> outbound code –> sector –> postcode. The parameter passed into the postal town report would then filter the map to that town, and because the OLAP data is inner joined to the spatial data for that town you’ll only get the OLAP data for that town.
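    To make that concrete, the spatial dataset behind the postal town report might be a parameterised query along these lines (the table and column names are hypothetical, since the real boundary data has to be licensed):

    SELECT p.OutboundCode,
           p.Boundary                          -- geography column holding the outbound code polygon
    FROM   dbo.PostcodeBoundaries AS p
    WHERE  p.PostalTown = @PostalTown;         -- e.g. 'RG', passed in from the UK-wide report

    The MDX dataset would be filtered on the same postal town and joined to this in the map control on the outbound code.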

    It is then up to the users as to which visualisation they wish to use to express high and low values e.g. heat colour the polygons or add symbols with different colours or sizes to show the measures in the cube.

    What you lose in surfacing BI in this way is the ad hoc interaction you have with Analysis Services proper; what you gain is a quick way of seeing cube data on a map that can be created by an end user.

    In the meantime here’s a gratuitous painting I did of a merlin to show I have more than a casual interest  in the success of the RSPB:

    [Painting of a merlin]

    I’ll keep you posted on how the RSPB work goes (subject to their permission of course), and in my next couple of posts I’ll show you how to do some of this using freely available (and unfortunately US-based) data.

  • Just because you can doesn’t mean you should, mixing Hyper-V and other applications in Windows Server

    I have the highest respect for Hyper-V. It’s simple to use and I don’t need to learn a new interface (it’s just another MMC snap-in) in which to manage the virtual world. However, when someone asks me if they can install other roles alongside Hyper-V in production I say “Oi mate, NO, don’t do it”.

    The problem here is that it is possible to do this with Hyper-V, for example to run SQL Server alongside Hyper-V in the host operating system. It’s not a FAQ, but I do get asked about the advisability of this by customers and I have seen internal threads asking the same thing. Rather than ask why this is a bad idea, let me debunk some “myths” about why people think it is a good idea:

    • You’ll get more performance from the system this way. The theory runs that SQL Server will run only slightly slower in a virtual machine (VM) than it will on the same physical hardware. However, if you are after performance, SQL Server is now going to have to contend for resources with the hosted virtual machines on the same server; the hypervisor expects to get the whole physical server and sits underneath the host operating system anyway, so it will still affect SQL Server performance either way. In fact the host operating system is bottom of the list for memory resources (the virtual machines get priority) and you can’t reserve CPU for the host either.
    • It’s cheaper from a licensing perspective. No it’s not. If you use the host operating system for anything other than Hyper-V you must license that operating system. You won’t be able to use any fewer licenses of any other application like SQL Server by doing this either. Microsoft’s licensing as applied to virtualisation takes the decision to virtualise or not out of the equation.
    • It’s best practice. Really? Send me a link to the site that says so and I’ll send you some swag.
    • I’ve been told to. Please tell me who told you to do this so I can enlighten them.

    In fact the only reason I put applications alongside Hyper-V is for ease of use and a better experience when I am showing how infrastructure stuff works in my demo environment. So I have Office installed on the base OS alongside Hyper-V for demos.

    If I were putting a virtual machine infrastructure into production, I would use the free Hyper-V Server as the host operating system and remotely manage it with Server Manager or System Center on a client machine, as this is lightweight and secure and still easy to manage.

  • SQL Server Reporting Services interop part 2

    I forgot one thing in my recent post on Reporting Services interop: Report Builder. What is Report Builder? It’s an end user (information worker in Microsoft speak) tool that creates a report that can run in SQL Server Reporting Services. What this actually means is that it creates an XML file with an .rdl extension that is a set of instructions to run the report, in a structure known as Report Definition Language (RDL – hence the extension of the file).

    Report Builder 1 (RB1) came out with SQL Server 2005 Reporting Services and allowed end users to create simple reports from a report model. It was a ClickOnce application, and for me it was flawed because if you used BI Development Studio (BIDS) to tweak a report originally created in RB1 it could no longer be opened in RB1, and it only worked off report models – a semantic layer over the underlying data which defines the joins and calculations to be used in the RB1 reports.

    Report Builder Window with model open.

    With the arrival of SQL Server 2008, Reporting Services got a complete overhaul and a completely new report builder, Report Builder 2 (RB2). This was created by skinning the report designer in BIDS with an Office 2007 style ribbon and making it a ClickOnce application that could be downloaded from Report Manager or SharePoint (if you have Reporting Services running in integrated mode).

    So RB2 had the same functionality as Report Designer in BIDS: you could use all the new charts, write queries against any source and so on.

    SQL Server 2008 R2 has now been released and is essentially an update and enhancement to the BI tools in SQL Server, including Reporting Services. So the new Report Builder (RB3) supports maps and new charts like sparklines, and allows parts of a report to be saved off and re-used.

    Each version of Report Builder only works with its equivalent version of SQL Server, as the features in a particular version of Report Builder depend in turn on the features available in that version of Reporting Services, e.g. the new charts in RB2 are only available in the report definition language in Reporting Services 2008.

    Reporting Services | Report Builder | Compatibility
    2005               | 1              | OK
    2008               | 2              | OK
    2008 R2            | 3              | OK

    Hopefully this all makes sense, but I have been asked this before and it still turns up on internal e-mail threads, hence this post.

    Finally, what of the future? All I know is that there are plans for Reporting Services to be included in SQL Azure (I don’t know when), and in that scenario the version of Report Builder you’ll need will again be specific to that version of Reporting Services, so I am not going to guess what that will be.

  • IT Professional vNext

    I recently interviewed one of our interns, Jonathan Lickiss (here), about his aspirations to be an IT Professional (IT Pro) and to call out all the useful resources on the TechNet On portal. Some would say that Jonathan is mad for selecting this as a career, and that the TechNet On site is just a piece of cynical marketing given Microsoft’s drive to provide all of its services in the cloud and therefore make IT Pros redundant.

    If you look at what has happened to the work done by an IT Pro since my dad’s time in IT (the 1960s), the trend has been to automate or outsource any repeatable task. So you could just as easily point to DHCP, DNS, PowerShell, System Center or Windows 7 as sounding the death knell for the IT Pro, as they all get rid of this kind of work. There were also concerns about outsourcing and taking various IT work offshore before the cloud came along. However, the number of IT jobs in the UK is still buoyant and we are still short of these skills despite the recession, if e-mail recruitment spam is anything to go by.

    The main attraction of the cloud is cost management: to be clear, not necessarily a reduction in cost, but management of that cost. This essentially means moving costs from capex to opex and paying only for what is actually being used. For example, rather than trying to predict license costs and the number of servers needed and buying all that upfront, the business only pays for the number of active users, the amount of data used and so on. This smooths cashflow and means a business has to borrow less money from reluctant banks, i.e. Microsoft takes the risk and cost out of the business. It also improves agility, as the business can rent more of what it needs when it needs it, and pay less at quiet times when the services aren’t needed.

    With those kinds of attractions, any finance director will want to evaluate this as an option, in the same way that many businesses end up selling their offices and then renting them back. So whether Microsoft provides this service or not, the cloud is coming to a business near you.

    For the IT Pro, what the cloud does is remove more drudgery, such as fiddling around with actual kit. At Microsoft the intention is to go further than this and remove the need to worry about the operating system and some of the key Microsoft applications, be that Exchange, SharePoint or SQL Server. However, what neither Microsoft nor any other cloud provider will do for you is manage the actual data or the connections to it.

    It’s also important to understand that the cloud doesn’t affect some parts of the IT Pro world, like end user support. Yes, there is Intune to simplify the process (see Simon’s blog for more on this), but the helpdesk IT Pro must still work with the end user to fix their problem.

    The public cloud is not going to be the right answer for certain businesses, for a number of reasons, be it compliance or culture. Also, it will be some time before the cloud predominates, and IT Pros will need to be on hand to help with transitions and the integration of on and off premise services. Some services might never make it into the cloud; a mixed economy will be the norm for some time to come, and this too will require careful management by competent IT Pros.

    I think this means that the modern IT Pro role will evolve into one where we are much closer to the business. This might even see the end of the traditional IT department, with the IT Pro becoming another specialist in a multifunction business team. This sort of IT Pro will have to put more thought, imagination and planning into their work, but will be a trusted advisor to that team.

    I think this will be rewarding and challenging work, and I for one can see why it is attractive to new entrants like Jonathan, as it was the boring stuff that my dad did that put me off IT.

    Discuss!

  • A worked example of using spatial and olap data in Reporting Services

    In my last post I went through a possible approach for showing spatial and OLAP data that the RSPB are planning to evaluate; now I want to show you how this might look in an example.

    First of all, an apology: I am not going to pay the £1500 to license UK postcodes as this is just a demo. Instead I am going to show how it might work with example data from the US Census boundaries. Not only is this free, but the data I am using is available on CodePlex (here), so you can recreate what I am doing to get the idea.

    These files are pretty big SQL scripts, so the best way to load them in is to create a database (mine’s called USCensus) and then use SQLCMD to run the script for each SQL file:

    SQLCMD -S "MIAMI\PowerPivot" -E -d "USCensus" -i "d:\samples\US Census\postcode.table.zip"

    where:

    -S is the server and instance

    -E signifies Windows authentication

    -d is the database

    -i is the SQL file you want to run (change and rerun this for each of the four files in the census)

    The State, City and County tables have the relevant boundary as a geography column, and County and City both show which state they’re in.
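    Before building any reports it’s worth a quick sanity check that the scripts loaded properly; something like the following works (the table names are what I would expect the sample to create, so adjust them if the scripts name things slightly differently):

    USE USCensus;

    SELECT 'State'  AS TableName, COUNT(*) AS RowsLoaded FROM dbo.State
    UNION ALL
    SELECT 'County', COUNT(*) FROM dbo.County
    UNION ALL
    SELECT 'City',   COUNT(*) FROM dbo.City;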

    I can then combine this with the Adventure Works sample cube (the version for SQL Server 2008 R2), which I can also get from CodePlex here, as this has a geography dimension.

    Now I have the data I need, I can start to use Reporting Services, specifically by opening Report Builder 3 and using the map wizard. Unlike all the other controls in Report Builder, the map control allows you to combine different sets of data in the one control. In my case I need the state polygons from the US Census database I have created and the analytical data from the cube. The wizard guides you through this process, including identifying how the datasets are to be joined; in this case the join is between the state name in the State table and the State Province attribute in the Adventure Works cube.
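    The relational half of that join is nothing more exotic than the state name and its polygon, so a minimal version of the spatial dataset query might look like this (the column names are assumptions; check what the census script actually creates):

    SELECT s.StateName,                    -- matched to the State Province attribute in the cube
           s.StateGeom AS StateBoundary    -- geography column holding the state polygon
    FROM   dbo.State AS s;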

    Rather than grab every screen, I have made a short video which you can pause and replay to see what I have done.

    There’s also a bunch of sessions on Report Builder and spatial data at the next SQLBits in York on 1–2 October 2010.

  • 3 clouds and a screen

    No, I haven’t got the numbers the wrong way round. The three clouds I am referring to are the three tracks we have put together for TechDays on 8th October, and the screen is where you’ll be watching, as this is online rather than in person. Those three tracks are:

    • Cirrus Room - A high-level summary of the key technologies in the Windows Azure Platform such as App Fabric from Microsoft UK experts
    • Altocumulus – Hear from companies who have already developed for Azure including myworldcup.com
    • Stratocumulus – Get down to the low level clouds in our deeper technical sessions

    While this is a developer-focused event, IT Professionals need to understand some of this, as the cloud isn’t going away and Azure services still need to be managed in order to integrate with the on-premise infrastructure. So there will certainly be sessions in the top two tracks that will be of interest.

    As ever, the sessions will be recorded if you can’t make the date, but the value of listening live is that you can put your questions to the speakers. I’ll be there to ensure those questions are answered by our panel of experts, and to learn some of the deeper stuff myself.

  • The (IT Professional) Apprentice

    The old Catch-22 situation of not having experience and not finding anyone willing to provide that experience still exists today, as it did when I started work. If I reverse that argument, every company in the current economic climate could argue that there is no budget to train staff, so they have to recruit experienced staff. This leaves companies with the additional worry that if they do train staff, they will lose those staff to their competitors once the employee has enough experience. The statistics bear this corporate behaviour out: a massive 88% of UK companies don’t have a formal work experience program.

    Microsoft’s solution to this dilemma is the Britain Works initiative, with the aim of getting 500,000 more people into IT by 2012. A year in, the scheme has worked for 104,000, so not quite on track but making a difference to quite a few people. The scheme now has a special section, Young Britain Works, offering advice and apprenticeships. There are training vouchers for IT Professionals to cross-train or upgrade their skills, as well as sections for start-ups and for charities who are interested in the program. Social media is also important to connect all of this, so you’ll find Britain Works as @britainworks on Twitter and there are groups on LinkedIn and Facebook.

    I realise it’s not always possible to do this in a partner or small business, but can I at least ask you to take a look and possibly connect with some of the candidates on there who are looking to work in our IT industry.


  • A worked example of using spatial and olap data in Reporting Services part 2

    Following on from my post on Monday, I wanted to show you how to link reports together using the mapping control in Report Builder 3. This is actually pretty easy to do, as any polygon or point on a map can have an action associated with it, and one of the available action types allows you to call another report. The clever bit is that the polygon has data associated with it, so you can pass this information as a parameter to another report. In my example I have a US State report which is run merely by clicking on the state in the USA-wide overall sales map I created in my last post.
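    Behind the scenes the US State report is just another parameterised report. A plausible dataset for it takes the state name passed by the map action and returns whatever you want to show within that state, for example the county polygons (table and column names are illustrative):

    SELECT c.CountyName,
           c.CountyGeom AS CountyBoundary   -- geography column holding the county polygon
    FROM   dbo.County AS c
    WHERE  c.StateName = @StateName;        -- supplied by the action on the state polygon clicked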

     


    I have made a screencast for you to follow, and if you want to try it for yourself then please refer to part 1, which I posted on Monday, as you’ll need to understand that first.

  • Woodland Trust Part 3–the New Building

    In my continuing saga on the Woodland Trust, I was lucky enough to go to the formal opening of their new building last week. Unfortunately, while all the IT is going smoothly, it is the lack of water that is holding up the move, and this means that they can’t bring the new servers online because they are water cooled.

    [Photo: the new data centre]

    The data centre may look fairly conventional, but only the cabinets are air conditioned, not the whole room.

    To quote their IT director Lionel Wilson on this:

    “If you asked an IT Professional to stop milk from going off, they would put the pint in the middle of a room and turn on the air con, rather than storing it in a fridge.”

    BTW those circular things on the right aren’t a bank of speakers; they’re the cooling fans, which spin up based on sensors in front of each server rack and actually sound more like jet engines.

     

    As I have said before, sustainability is a key part of the new building, but despite the high-tech look and features it has only cost 10% more than a conventional steel and glass carbuncle of the same size.

    [Photo: the new building]

    BTW it may seem odd for the Woodland Trust to be using wood, the very thing they are trying to preserve, to make their building, but the wood used is sustainable, fast-growing larch.

    However, back to the IT: their 200 users’ Remote Desktop Services sessions will run from a cluster of three physical servers. I am pretty sure you couldn’t get that server density from any VDI solution, but the experience is still very good; each user has dual 24” LED monitors to deliver a rich corporate desktop. The rest of their key services, Exchange and their CRM system, all run on Hyper-V and are managed by the System Center suite of tools.

    I had hoped that Lionel would be on hand to discuss this at IP Expo, as his passion for and knowledge of sustainable IT make him an excellent speaker, but the slip in move-in dates has prevented that, so he has made a short video for the event organisers which you can watch here.

  • Content is King?

    Many of us in the technical world seem to get caught up in the devices we use rather than what we use them for and how often. For example, it doesn’t really matter which e-book reader you use; what matters is what books you read. Repeat that for the films you watch on your home cinema setup, and the quality of the photographs you took on your last holiday…

    Simon and I took a day off work to set up some old laptops for a charity. Yes, they were pretty ancient, e.g. Dell Latitude D400s and D600s, but they all ran Windows 7 just fine after a bit of digging for the odd driver. More importantly for the users, we got Office Professional Plus 2010 on there, plus Live Essentials and Microsoft Security Essentials to protect them. However, they weren’t well received because the PCs themselves weren’t new, so we brought them home again, which is why I am wondering which is more important, the device or the content?

    I would go for the content, based on my wife’s horrendous office laptop. It’s a lovely little HP and should be rocket fast, but it’s running Windows XP and Office 2003. Actually, they aren’t the real problem; the issue is the encryption, VPN and anti-virus that are part of the standard build where she works to make XP as secure as Windows 7. To add insult to the injury of having an unreliable and slow machine, she really has to put in the hours to get her work done using this older generation of software.

    As for our work on those laptops, we’ve found a community centre that can make really good use of them to raise digital literacy levels in a very deprived area, so it all turned out alright in the end. Not only that but Windows 7 has given these and many other machines a second life, which is also a good sustainability story.

  • An introduction to Remote Desktop Services options

    I am sometimes confused when things get renamed, but it made a lot of sense for Microsoft to rename Terminal Services to Remote Desktop Services, as it has changed out of all recognition from where it started. In fact we made this simple video to show how much has changed in Windows Server 2008 R2 for its launch last year.

    With this release of Windows Server there is a one-stop shop, the Remote Desktop Web Portal, to bring together all of the different options: a remote application, a remote desktop, or the desktop of an individual’s own virtual machine using VDI (virtual desktop infrastructure). Our simple pin board diagram gets the basics across well enough, but if you want to see the full details of what’s under the covers there’s now a poster you can download from here.

    [Pin board diagram of the Remote Desktop Services options]

    This might seem complicated, but all the bits are in Windows Server 2008 R2 as is, and the simple portal for the users that I show in this video proves that.

    You’ll also see how remote applications appear in Windows 7, so the users don’t even need to go to the portal to see them once they have registered it.

    Like all things Microsoft you might well be confused about when to use which, so I’ll try and clear that up in my next post.

  • Remote desktop and VDI

    In my post yesterday I gave you an introduction to Remote Desktop Services; today I want to compare and contrast a traditional remote desktop approach with the shiny new thing that is Virtual Desktop Infrastructure (VDI), in order to understand when to use each.

    First of all why would you use either of them?

    • Efficiency. You might decide remote desktops are a simple way of provisioning IT in hot desk areas, branch offices, everywhere, and deploy thin clients in place of PCs. An added advantage is that any user can use any desktop.
    • Environment. You might achieve better control of your power usage by deploying low power thin clients on desks and pulling the computing power needed into a well managed data centre. However, I am cautious about this: I think the savings are marginal unless you are also doing this for other reasons, like moving up to a newer operating system and doing a lot of consolidation, and the energy and real costs of replacing hardware mean the payback period may well be long.
    • Security. With a remote desktop solution all the data is in the data centre, so data loss whether accidental or deliberate is less of an issue.
    • Manageability.  Virtualising desktops is seen as a way of controlling the desktop by the IT department.  

    RDS has long been used as a solution to these issues, so what makes VDI better or indeed worse?

    • Efficiency and the environment. The idea behind any desktop virtualisation solution is to put all the computing power used by your users in the data centre, with only the screen, keyboard and mouse on the user’s actual desk. The question then is how many concurrent desktops I can cram into one server in my data centre, as this determines the energy and cost savings that can be made.

    If I elect to use VDI, I create a virtual machine (VM) for each of my users, so the number of users I can support will depend on how many VMs I can run concurrently. Each VM will have its own operating system, plus any applications the users want to run. RDS works completely differently, as each user session is not another copy of the OS with all its applications. Buried in the Remote Desktop Session capacity planning guidelines there is this table:

    Server Configuration                                                   | Scenario                                | Capacity
    AMD Opteron Quad-core CPU, 2.7 GHz, 512 KB L2 cache, 32 GB memory      | Knowledge Worker v2                     | 110 users
    2 x AMD Opteron Quad-core CPU, 2.7 GHz, 512 KB L2 cache, 32 GB memory  | Knowledge Worker v2                     | 200 users
    AMD Opteron Quad-core CPU, 2.7 GHz, 512 KB L2 cache, 32 GB memory      | Knowledge Worker v2 without PowerPoint  | 180 users
    2 x AMD Opteron Quad-core CPU, 2.7 GHz, 512 KB L2 cache, 32 GB memory  | Knowledge Worker v2 without PowerPoint  | 300 users

    Contemporary VDI solutions would be pushed to support 20 users on this hardware, based on the memory alone (at roughly 1.5 GB per Windows 7 VM, 32 GB gives you around 20 concurrent VMs), so RDS is at least 5x as efficient as VDI for this kind of workload, and I am being very generous to VDI here.

    Security. Both Microsoft RDS and VDI are equally secure, as they both use the same protocol (RDP, the Remote Desktop Protocol) to connect to the desktop from the local device.

    Manageability. In RDS you manage the server and the set of applications that are installed on it for use by the users. Also, the user state doesn’t persist, so the desktop returns to a known state at the end of a session. In VDI the operating system and the user’s applications are installed in each VM, all of which needs to be patched, updated and monitored. The fact that you are running users’ desktops in VDI doesn’t actually change the manageability problem at all, and any tools like System Center you would use to do this would work equally well on physical desktops or VDI. BTW there is nothing to stop you running the Remote Desktop Session Host servers as VMs to get the benefits of server virtualisation.

    So why would you use VDI at all?

    RDS works well where you have a large community of users with similar and basic needs, e.g. information workers who need to use Office, access the company intranet and use internal applications like ERP, and who can thus have one of a limited number of standard desktops with no persistent user state.

    VDI gives each user a very personal desktop, and the user’s state persists between sessions. This might be essential for some power users with specialised needs, for example contractors who need special tools like Visual Studio and Expression, actuaries, specialist financial staff and so on. VDI enables these users to use the same thin clients as their colleagues, and to work remotely (where there are reasonable communications) from thin clients or low power PCs.

    With the way the new component parts of the Microsoft Remote Desktop infrastructure work, these two classes of users will have the same user experience and will not know whether they are in fact using RDS or VDI.

    To conclude

    • For most thin client solutions RDS is much more efficient than VDI, but there are niche use cases where VDI makes sense.
    • VDI is no easier to manage than if that user’s desktop were running on the equivalent physical PC; in fact the same management tools work in both cases.

     

    Further Reading

  • Back Soon


    I am out of the office at the moment, but my spies tell me the IE9 beta is here.

    I’ll be doing a bit of icon design on the side when I get back, as these come in quite useful when you tear off tabs and put them on the Windows 7 taskbar.

  • Microsoft security update reliability


    The security community continually despairs of how bad we are at keeping our systems up to date with the latest security updates, be they from Microsoft, Oracle, Adobe etc. In fact this trend probably matches the classic bell curve of adoption of any new technology: there are a few early adopters, then the mainstream, followed by a small but long tail of those who cannot, don’t know how to, or can’t be bothered to move away from old technologies like XP and IE6. This is a cultural thing and is therefore very hard to change. While a cautious approach to many technologies might be justified, I just can’t see any justification for not keeping up to date with security patches, with the possible exception of concerns about reliability.

    Given that security and critical updates are a fact of life, I would expect a company to have a process for checking the reliability of security updates, so that if a particular update causes problems the precise issue can be flagged back to the vendor for resolution. What I can’t understand is a vague notion that some new update might be unreliable and hanging back to see if anyone else has an issue with it; I am not sure what event or time lag makes applying the patch later more likely to work than when it was released.

    Vendors obviously want to ensure these patches are reliable, but they only have access to so many testing environments, and this can’t cover the variety of environments that are out there. Microsoft tackles this problem by recognising that there are early adopters who want sight of the patches asap for testing, and has a process in place to make use of this: the Security Update Validation Program (SUVP). Selected customers are invited to join and get the patches prior to release in order to test them (i.e. not put them into production), which means they can deploy the patches as soon as they are released, as they will already have done their testing, while Microsoft gets feedback from all the SUVP customers that the patch is OK to release.

    So your choices for applying security updates are:

    1. You rely on the quality of the patches and deploy them as soon as they arrive.
    2. You do your own deep testing of security patches as soon as they’re released, and deploy them after testing.
    3. You understand the risk of not deploying these patches immediately, have balanced the cost of testing and applying them against the risk and cost of losing service and/or data as the result of an attack, and have got sign-off for this approach from the business.
    4. Your CV is up to date.

    Further Reading:

    Microsoft Security Centre