• Securing Microsoft’s Cloud Infrastructure

A lot of people and companies are talking about “the Cloud” today. I guess there are not too many companies that share the same track record of running online services as Microsoft: we launched MSN in 1994, and we have been in this business ever since.

Microsoft Global Foundation Services (the group responsible for this infrastructure) just published a document called Securing Microsoft’s Cloud Infrastructure, which is definitely worth reading. In my opinion, a few items will be key when talking about a trustworthy cloud, one of them being transparency: transparency about how your data is handled, how software is written and operated, how incidents are dealt with, etc. This paper definitely helps us drive in this direction, although we have already done a lot in this respect, like making the Security Development Lifecycle available and communicating transparently about security challenges.

    To show the importance of security for our online services as well, I would like to quote the paper:

    The core driver to creating an effective security program is having a culture that is aware of and highly values security.  Microsoft recognizes that such a culture must be mandated and supported by company leaders. The Microsoft leadership team has long been committed to making the proper investments and incentives to drive secure behavior. In 2002, the company formed the Trustworthy Computing initiative with Bill Gates committing Microsoft to fundamentally changing its mission and strategy in key areas. Today, Trustworthy Computing is a core corporate value at Microsoft, guiding nearly everything the company does. At the foundation of this initiative are these four pillars: Privacy, Security, Reliability, and Business Practices. For more information on Trustworthy Computing, see the Microsoft Trustworthy Computing page.

    Microsoft understands that success in the rapidly changing business of online services is dependent upon the security and privacy of customers’ data and the availability and the resiliency of the services Microsoft offers. Microsoft diligently designs and tests applications and infrastructure to internationally recognized standards in order to demonstrate these capabilities and compliance with laws and with internal security and privacy policies. As a result, Microsoft customers benefit from more focused testing and monitoring, automated patch delivery, cost-saving economies of scale, and ongoing security improvements.

    Here are the links to the different papers we published today:

    Roger

  • How we do IT: Direct Access

    You might know that we have something we call the Microsoft IT Showcase, where our internal IT shows how they use our technology to run our environment.

Now, we just published a new article that might be interesting for you to read, called Using DirectAccess to Provide Secure Access to Corporate Resources from Anywhere.

    I tell you (as a long-term user of DirectAccess): This technology really rocks!

    Roger

  • Patch Management, a key step towards compliance!

As you might have read, I recently blogged about my infrastructure and the future of a platform for better compliance management – honestly, I actually just played with our latest technology.

    I wrote about

Now, a necessary and very important next step towards compliance, as well as towards a secure environment, is a sound Patch Management process and then – in second place – the underlying technology. I have blogged about Patch Management several times already, as I see a lot of companies failing to deliver on this. I recently wrote a post called Patch Management – Cover the Whole 9 Yards, in which I mention different papers you could/should read:

    and I reference Christopher Budd’s Ten Principles of Patch Management:

    1. Service packs should form the foundation of your patch management strategy
    2. Make Product Support Lifecycle a key element in your strategy
    3. Perform risk assessment using the Severity Rating System as a starting point
    4. Use mitigating factors to determine applicability and priority
    5. Only use workarounds in conjunction with deployment
    6. Issues with Security Updates are documented in the Security Bulletin Master Knowledge Base Article
    7. Test updates before deployment
    8. Contact Microsoft Customer Support Services if you encounter problems in testing or deployment
    9. Use only methods and information recommended for detection and deployment
    10. The Security Bulletin is always authoritative

First of all (and you see that in the articles referenced above), it is of utmost importance to have a process in place. Basically, the core schema to run such a process is:

[Diagram: the patch management process]

I have seen different levels of complexity in deploying such processes, from highly complex to pretty simple and straightforward ones. Those of you who know me know that my preference is KISS (Keep It Simple, Stupid). So, make the process as complex as necessary and as slim as possible.
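To make the idea concrete, here is a deliberately simple sketch of such a process cycle. The phase names follow the assess / identify / evaluate-and-plan / deploy model described in the patch management guidance referenced above; everything else (function names, data shapes) is invented for illustration and is not a real tool:

```python
# A KISS sketch of one patch management cycle. The four phases follow the
# assess / identify / evaluate-and-plan / deploy model; the data shapes
# are illustrative only.

def assess(inventory):
    """Phase 1: know what you run - hosts and their OS versions."""
    return {host: info["os"] for host, info in inventory.items()}

def identify(assets, published_updates):
    """Phase 2: which newly published updates apply to which machines?"""
    return {host: [u for u in published_updates if u["os"] == os]
            for host, os in assets.items()}

def evaluate_and_plan(applicable):
    """Phase 3: prioritize by severity and decide what to deploy first."""
    return {host: sorted(updates, key=lambda u: u["severity"], reverse=True)
            for host, updates in applicable.items()}

def deploy(plan):
    """Phase 4: roll out and report - in real life via WSUS/SCCM."""
    return {host: [u["id"] for u in updates] for host, updates in plan.items()}

inventory = {"srv01": {"os": "Windows Server 2008"},
             "pc17": {"os": "Windows XP"}}
published = [{"id": "MS09-001", "os": "Windows Server 2008", "severity": 3},
             {"id": "MS09-002", "os": "Windows XP", "severity": 4}]

result = deploy(evaluate_and_plan(identify(assess(inventory), published)))
# result maps each host to the update IDs scheduled for it
```

The point is not the code but the shape: each phase has a clear input and output, so you can keep the process slim while still knowing exactly where a given update stands.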

So, once you have the process in place and have taken a conscious decision, the question is about deployment and reporting.

    So, let’s talk about technology now.

In order to get an overview of the state of your computers, you might use the Microsoft Baseline Security Analyzer. This is an excellent tool to scan your Windows machines and get an overview of their security state. It might not deliver the same level of sophistication as very expensive tools, but the difference is: we provide it for free, and – in my opinion – it gives you a good starting point to look at vulnerabilities, including the Security Update level of a given PC. Here is an example of one of these assessments:

[Screenshot: MBSA assessment]

But this does not really resolve your underlying problem: the Security Update compliance of the computers on your network, and the distribution of the updates to them. From my point of view, there are different options to address this:

• If you are a small or medium business, one of the coolest solutions is System Center Essentials. It is System Center Configuration Manager, System Center Operations Manager and Windows Server Update Services in one package. However, it is limited to 30 servers and 500 clients. If you are within this limit, it rocks.
    • System Center Configuration Manager: If you already use this technology to distribute software and configurations, leverage this.
• Windows Server Update Services: It is kind of unbelievable, but this is free! So, to be clear – we do not charge for it! You can download and install it, and it scales even for large enterprises (did I tell you already that it is free?).
    • A third-party solution

I am using WSUS and am more than happy with it. The way I am organized, I regularly get a mail from WSUS with the current state of “the nation”:

[Screenshot: WSUS status mail]

As I am mail-driven, this allows me to see what I have to do with regards to WSUS. I can then log on to my WSUS server to get more granular reports:

[Screenshot: WSUS report overview]

From here on, I can decide which actions I want to take, based on the detailed reports I can get by clicking one of the links in the UI:

[Screenshots: WSUS detailed reports]

BTW: this machine has been patched in the meantime – so do not even think about it!

Even if you cannot technically enforce the security update level that way (we will talk about Network Access Protection in a later post), it at least helps you understand where you stand and what you have to do in order to get compliant.
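The reporting logic behind such a “state of the nation” overview is simple enough to sketch: compare the updates you approved against what each machine reports as installed, and compute a compliance percentage. The names and data shapes below are illustrative only – this is not the WSUS API:

```python
# Sketch: WSUS-style compliance reporting. Compare approved updates
# against what each host has installed; names and shapes are
# illustrative, not the real WSUS object model.

def compliance(approved, installed_by_host):
    """Map each host to (compliance percentage, sorted missing updates)."""
    report = {}
    for host, installed in installed_by_host.items():
        missing = approved - installed
        pct = 100 * (len(approved) - len(missing)) // len(approved)
        report[host] = (pct, sorted(missing))
    return report

approved = {"MS09-001", "MS09-002", "MS09-003", "MS09-004"}
installed = {"pc17": {"MS09-001", "MS09-002", "MS09-003"},
             "srv01": set(approved)}

report = compliance(approved, installed)
# pc17 is at 75% with MS09-004 missing; srv01 is fully compliant
```

A report like this is exactly what you need to close the loop of the process above: it tells you not just that updates were approved, but whether they actually arrived everywhere.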

Again (as I have done so often), my call to action: make sure that you have a straightforward process in place, and then use technology (like WSUS) to deploy the updates and ensure that you have deployed them correctly!

    Roger

  • Security Development Lifecycle Template – Your next step to “Secure Development”

You might remember it: on January 15th, 2002, Bill Gates wrote the famous memo on Trustworthy Computing to all employees at Microsoft. This was probably one of the biggest initiatives at Microsoft and radically changed the way we develop software (and much, much more). I remember the first time I was on stage talking about Trustworthy Computing in 2002. I said that this is an industry initiative and not something for Microsoft only. A lot of people just smiled at me and told me that this was just another attempt to get out of our responsibility and blame the industry for our problems. However, we have come a long way since then.

If you look at Bill’s memo back in 2002, there are a few remarkable statements in there when it comes to the industry collaboration piece. He said: “We must lead the industry to a whole new level of Trustworthiness in computing.” and “It’s about smart software, services and industry-wide cooperation.”

So, we started to introduce a process we call the Security Development Lifecycle at Microsoft. At a high level, the process looks pretty familiar (I hope at least):

[Diagram: the SDL process]

The effect of this process was pretty impressive. Let’s look at a few key figures from our latest Security Intelligence Report. If we investigate the Security Bulletins we had to release in H1 2008 and compare the impact on Windows Vista and Windows XP, it looks like this:

[Chart: Security Bulletin impact, Windows Vista vs. Windows XP]

And our overall share of industry-wide vulnerabilities has dropped steadily:

[Chart: Microsoft share of industry-wide vulnerabilities]

It definitely had an effect on us – but we always wanted to share what we are doing within Microsoft, to help you as a developer profit from what we learned. So, we have made the SDL available for quite a while as books, trainings, etc. Today we go an additional step to help reduce the other 97% of the industry-wide vulnerabilities as well.

Today we announce the availability of a template for Visual Studio with which you can integrate the SDL into Visual Studio Team System – and I tell you, this is really, really cool. And as always with such initiatives, it is free!

    As a teaser, here are a few screenshots:

[Screenshot: SDL guidance page]

This is the guidance page on the SDL – kind of your starting point.

[Screenshot: SDL dashboard]

To run your project, you have a dashboard view.

[Screenshot: SDL requirements]

And last but definitely not least, you have an overview of the SDL requirements –

    and there is much, much more!

    Now, I leave the word to the real pros. Read the blog post by our SDL team: Making Secure Code Easier

I wish you all a lot of success implementing the SDL – and let’s reduce the industry-wide vulnerabilities!

And – by the way – did I tell you already that we make it available for FREE?

    Roger

  • File Classification Infrastructure in Windows Server 2008 R2

We recently revealed the File Classification Infrastructure in Windows Server 2008 R2. This infrastructure can help you classify files not only based on the location where they are stored but based on their content as well. However, there is not too much value in me blogging more about that; let the experts speak: Classifying files based on location and content using the File Classification Infrastructure (FCI) in Windows Server 2008 R2
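To give you a feel for the content-based part of the idea, here is a tiny sketch: scan file content for patterns (a “confidential” marker, or something that looks like a credit card number) and tag the file accordingly. The real File Classification Infrastructure does this inside Windows Server 2008 R2 via File Server Resource Manager; the rules and names below are purely illustrative:

```python
# Sketch of content-based file classification: match patterns against
# file content and return classification tags. The rules here are
# invented for illustration - the real FCI lives in Windows Server
# 2008 R2, not in a Python script.
import re

RULES = {
    "confidential": re.compile(r"\bconfidential\b", re.IGNORECASE),
    "has_card_number": re.compile(r"\b\d{4}[- ]?\d{4}[- ]?\d{4}[- ]?\d{4}\b"),
}

def classify(text):
    """Return the set of classification tags whose pattern matches."""
    return {tag for tag, pattern in RULES.items() if pattern.search(text)}

tags = classify("This document is CONFIDENTIAL. Card: 4111-1111-1111-1111")
# -> both the "confidential" and the "has_card_number" tag apply
```

Once files carry such tags, you can drive policy from them – which is exactly what makes classification by content so much more powerful than classification by folder alone.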

    Roger
