• Are we sitting on a time bomb?

    I just read another of these studies: Enterprises sitting on security time bomb as office workers compromise company data. Let's briefly look at the findings first:

    • 38% of U.S. office workers admit to storing work documents on personal cloud tools and services
    • […] almost a fifth (16 percent) of people use Dropbox to store work documents, while Google Drive and Apple iCloud came in second and third place with 15% and 12% respectively
    • […] 91% of workers also stating that they use personal devices to store, share, access or work on company documentation […]
    • Regarding personal devices, almost two thirds (64 percent) of office workers use external hard drives to store work documents and almost half (46 percent) use USB drives. More than a third (34 percent) of people admit to using USBs to share documentation with others and 43% use external personal hard drives for the same purpose
• Half of U.S. office workers want to be able to work from anywhere and almost half (49 percent) want to access all of their work documents in one place
    • A fifth of U.S. workers also want to use their personal smartphones, laptops and tablets for work

    According to the research, technology adds to people's frustrations in the office as key annoyances are:

    • Not being able to send large files via email (31 percent)
    • Wasting time searching for electronic documents (28 percent)
    • Ensuring that you are using the most up to date version of any given document (21 percent)
    • Getting documents approved by others (18 percent)
    • Figuring out who has specific information about a project or task (17 percent)

    In order to share and work on documents with people outside of their company:

    • Almost two thirds (65 percent) of office workers continue to revert to sending email attachments
    • Nearly a fifth (16 percent) use USB drives
    • A similar amount (15 percent) send hard copies of documents via courier
    • Eight percent send CDs or DVDs via mail

Shocking, no? Do we need to go out now and change the policies and punish the users? Well, this is what happens most of the time: we change the policies and then feel really good about it. However, I would guess that your users do all these things for a reason – and that reason is probably not to feel cool but to get their job done. A few weeks ago, I posted Will the user define security policies in the future?, where I quoted a study saying that at least 40% of sales people had to circumvent security policies to do their job – to get access to the information they needed to win a sale.

I guess it is time to re-think. Almost all of the scenarios above can be handled in a secure way with today's technology, such as Rights Management Services or BitLocker To Go. So it is probably more about helping users do their job – in a secure and safe way – than about tightening the policies, no? Do you have a different view on that?

    Roger

  • The Moscow Rules in the Cyberspace

Doing the basics is a given when you defend your assets: keeping your computers updated, staying on the latest versions, dynamic network zones, incident response, identity management, monitoring, etc. – and last but not least (or probably first) knowing your assets and having your data classified, so that you understand which parts of your business need which level of protection.

That's the basic stuff, which almost all companies do at different levels of maturity. But what about intelligence? What about leveraging sources outside your company (and combining them with information inside your company) to be able to look at least a tiny little bit into the future? This rarely happens – or rather, I have not seen many organizations really doing it intensively and successfully. Additionally, there is the question of how to behave if you are going to set something like that up. We are all used to working in a defensive mode, but not necessarily in intelligence.

Back during the Cold War, the US had rules for how a spy should operate behind enemy lines. These rules were called the Moscow Rules. If you look at them, they can quite simply be applied to cyberspace as well. Read them for yourself – it is worth thinking about them, and then about how we can start to predict attacks: Moscow Rules: The original protocol for operating in the presence of adversaries can be applied to cyber defense

    Roger

  • Enabling the Hybrid Cloud with Microsoft Technology

    When I talk with customers about the Cloud, we always talk about a few key themes:

• Identity: I am convinced that you need to be able to federate your identity from your on-premises solutions to the cloud. You will want to control the process of decommissioning an identity and make sure that if you have to lay somebody off, this person no longer has access to the data – especially in the public Cloud, as this part of your infrastructure can be accessed from anywhere.
    • Transparency: If you move to the public Cloud, you will want to have a certain level of transparency about how the software your business runs on is built and operated.
    • Data Classification: A lot of customers raise concerns about having their data leaving their premises, especially if they leave the country. However, for a lot of data in your environment, most probably you do not really care as the data is not sensitive at all. Then there is data ("the keys to the bomb") you will never ever move to the public Cloud.

    Especially the last point typically leads to a hybrid approach as you want to leverage the public Cloud (for the non-sensitive part of your data) and keep the sensitive data in a private Cloud.
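The classification-driven placement decision described above can be sketched in a few lines. This is a minimal illustration under my own assumptions – the sensitivity labels and the cut-off level are hypothetical, not taken from any Microsoft product or from the paper:

```python
# Sketch: route data to the public or private cloud based on a
# classification label. Labels and threshold are illustrative only.

from enum import IntEnum

class Sensitivity(IntEnum):
    PUBLIC = 0        # published material - nobody cares where it lives
    INTERNAL = 1      # day-to-day business documents
    CONFIDENTIAL = 2  # contracts, personal data
    SECRET = 3        # "the keys to the bomb" - never leaves the premises

# Highest sensitivity level we are willing to host off-premises.
PUBLIC_CLOUD_CEILING = Sensitivity.INTERNAL

def placement(label: Sensitivity) -> str:
    """Decide where a document with this classification may be stored."""
    return "public-cloud" if label <= PUBLIC_CLOUD_CEILING else "private-cloud"
```

The point is simply that once data is classified, the hybrid decision ("public for the non-sensitive part, private for the rest") becomes a mechanical rule rather than a case-by-case debate.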

Our French team just published a paper which you will want to leverage when you are in such a situation: Enabling Hybrid Cloud Today with Microsoft Technologies. To quote the abstract:

    With the ambient credo to "do more with less, with a better agility and time to market", IT inevitably becomes a service provider for its enterprise and need to run like a business. The undertaking also requires a step further in the way the IT delivers its services to its customers: internal businesses and beyond. IT has indeed to deliver the services in an industrialized way for greater speed and lower cost. This requires increasing their overall core infrastructure operational maturity in two main areas.

    The first one aims at improving the management of their own on-premises IT landscape and traditional services by evolving towards a private cloud, i.e. an optimized and more automated way to provision and operate (a catalog of) services for businesses. The second one consists in enhancing their service offerings by utilizing off-premises public cloud augmentations (for acceptable cases of use) as (lower-cost) add-ons to existing services in the catalog.

    The combination and interaction of these two cloud paradigms results in the emergence of the hybrid cloud which meets the demands and the expectations of the business. Hybrid cloud spans the two above implementations. A service request can be instantiated in either implementation, or moved from one to another, or can horizontally grow between the two implementations (cloud bursting for instance).

    This paper discusses how Microsoft can help your organization achieve a successful hybrid cloud strategy and present the enabling technologies from on-premises, cross-premises, and off-premises implementation of (parts of) the services. Several typical patterns and common scenarios are illustrated throughout this paper.

    So, download and leverage it!

    Roger

  • Is there a future for Product Certifications?

Often, when I talk to customers, product certification is one of the key themes they want to address. In particular, they want to know about our commitment to Common Criteria and whether our products are certified. Typically we certify an operating system at Common Criteria EAL 4+, the highest level that seems achievable for multi-purpose operating systems. However, I personally do not think that product certifications are the future, for several reasons:

• The certification is static. In other words, there is a configuration at a given time, with a given product build, which is certified. The next hotfix or update basically invalidates the certification, and you rarely run the product in the certified configuration. We make all the policies and configurations public – if you want to use them, feel free.
    • It is slow. Even though we have plenty of experience by now, it still takes us more than 12 months to get a product certified.
• It is expensive. I will not go into the details here, but it costs us a lot of money. This is the cost of doing business, I get that, but there has to be a better way to address it.
• Typically the protection profile certified against does not completely meet the customer's requirements. This means a lot of additional effort by the customer and by us to go the final mile. Unfortunately, this leads to local requirements, local certifications, and accreditations. Again, cost of doing business.

Being an engineer, I am deeply convinced that a secure product is the result of a sound and strong process that embeds security into the lifecycle from the beginning. I am convinced as well that product certification gives you a certain level of assurance – but not too much. The process would probably give you much, much more to build your risk management on.

    At the Security Development Conference this week, we declared conformance with ISO 27034-1, the first part of a standard on secure software development. Here is the official statement:

    Microsoft has used a risk based approach to guide software security investments through a program of continuous improvement and processes since the Security Development Lifecycle (SDL) became a company-wide mandatory policy in 2004. In 2012, Microsoft used ISO/IEC 27034-1, an international application security standard as a baseline to evaluate mandatory engineering policies, standards, and procedures along with their supporting people, processes, and tools.

    All current mandatory application security related policies, standards, and procedures along with their supporting people, processes, and tools meet or exceed the guidance in ISO/IEC 27034-1 as published in 2011.

Basically, this means that we are convinced that our Security Development Lifecycle fulfills ISO 27034-1. Transparency in this context is absolutely key in my opinion – much more so than any product certification or any statement along the lines of "we trust our (fill in a role); he/she has been in the business for so long, he/she knows what to do". No joke, I have heard this statement more than once.

In the future – and in the Cloud – transparency about how software is built and ultimately run, how a company does incident response, etc., gains more and more importance. Looking at the purchasing processes of our customers, they are still far too focused on the product itself, in my humble opinion. I am convinced that this should change, and change rapidly.

If I may give you a piece of advice: if you do not want to rely on a relatively new standard, you might just start by asking your vendors how they develop software, how they react to incidents and product vulnerabilities, what support you get if you are compromised on their platform and – if you move to the Cloud – how they run your environment. Use your common sense before any standard when you judge their answers and see what the outcome is. I have done this more than once, and the answers are amazing (fairly often not in a good way).

    Roger

  • Will the user define security policies in the future?

I think I blogged about this event earlier already: years ago I was meeting a customer and talking about the future of IT. I told the audience (about 10 people, including the Security Officer) that there is a good chance that IT will no longer define a set of hardware, but that users will buy their own and use it for business. Additionally, different people have different needs: my notebook is set up differently from a lot of others within Microsoft's internal network – just because I have different needs and I use one piece of hardware for both private and business purposes. Actually, in my case it is even my own hardware. Back then, at this point, the CSO left the room complaining that I was completely nuts.

Where are we today? We all talk of Consumerization of IT (CoIT), we talk of "Bring Your Own Device" (BYOD) – but the mindset in a lot of companies has not changed at all. They run BYOD projects and define a set of acceptable hardware models – which is outdated the moment they publish it, as it takes them a few months. I think we need to change our approach, as the world has changed. We need to think "policy and requirements", not "hardware models and OS builds". We might decide in a security policy that a device must have a TPM chip to protect its keys before it is allowed access to sensitive resources. We might require disk encryption to be switched on (BitLocker with TPM in our case). We might require IPsec policies to be deployed to authenticate the device to the server. We might require full patching, etc. But why the hardware? We might care with regard to support, but if I bring my own device, hardware issues are my problem, aren't they? As long as this is clear, we can head that way and still offer supported hardware with a standard build to people who want the full internal service.
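To make the "policy and requirements, not hardware models" idea concrete, here is a small hypothetical sketch of my own. The attribute names and the specific checks are illustrative assumptions, not an actual Microsoft policy engine – the point is that the policy tests device state, not the device's make and model:

```python
# Hypothetical sketch: a BYOD policy expressed as requirements on device
# *state*, independent of who built or owns the hardware.

from dataclasses import dataclass

@dataclass
class Device:
    has_tpm: bool               # TPM chip available to protect keys
    disk_encrypted: bool        # e.g. BitLocker with TPM switched on
    ipsec_policy_applied: bool  # device authenticates to servers via IPsec
    fully_patched: bool         # all required updates installed

def may_access_sensitive_resources(d: Device) -> bool:
    """A device qualifies by meeting the policy - no hardware model list."""
    return all([d.has_tpm,
                d.disk_encrypted,
                d.ipsec_policy_applied,
                d.fully_patched])
```

A policy written this way never goes stale when a new laptop model ships; a brand-new personal device qualifies the moment it meets the requirements.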

    What will happen if we do not follow that path? To me it is fairly simple: Almost half of employees admit to bypassing security controls.

    […] half of sales-focused employees say their job is hindered because they aren't getting access to all the information they need. And with more than half of the respondents working for large organizations (the majority employing more than 5,000 people), the potential ramifications are notable.

    A lot of security people I know have a false sense of security. Do you think that internal security knows of these bypasses? No, not at all:

    That's breeding apathy, too: 40% admitted that if they were breached no one would notice.

    If we do not help our users to do their job in a secure and safe way, we risk our business. Think about it again:

    While 40% of companies have lost a sales opportunity because employees weren't able to access the information they needed, an alarming 46% avoided the possibility of losing a sales opportunity by bypassing security controls to access necessary sensitive information to get the job done.

Can you really blame your sales people? Kind of – but they are measured on making money. What would you do in their shoes?

How much does this have to do with BYOD? Well, a lot, to me, as it is just the next big wave – actually the one we have been riding since smartphones came up. Our users need access to information wherever they are, in the way they need it. Our job is to protect the company's assets in this context.

The way we do it at Microsoft is to apply something we call the "Variable User Experience". It is based on several factors:

• Identity: How did the user authenticate? With a consumer identity like LiveID or Facebook, or with Active Directory? For certain access, the Microsoft Account (formerly LiveID) might be good enough. Did he/she authenticate with user ID/password or with two factors? (We use virtual smartcards in Windows 8 today, so my computer acts as the second factor and I get strong authentication.)
    • Device: Who manages it? Is it IT Managed, Employee Managed (but still in AD), or unmanaged? Is it authenticated? Is it in a policy compliant state?
    • Location: Is the device on the network or outside on Direct Access/VPN? In which country is the device?
    • Data/Application: What kind of data is being accessed? Which sensitivity level?

    Based on these factors, I might have different routes to what I need to do:

• Direct Access/VPN: I might be able to access the network and all the data through DA/VPN, which requires strong authentication
• VDI/Citrix: If not, for whatever reason, the fallback is a Terminal Server session, where I have no local data but might still need strong authentication.
• Web SSL: Web-based apps requiring simple authentication – maybe even through Office 365, which I personally use very often.
    • Denied.
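The factor-to-route mapping above can be sketched as a simple decision function. This is my own illustrative simplification – the parameter names, the "high/low" sensitivity split, and the exact precedence of the routes are assumptions, not Microsoft's actual implementation:

```python
# Sketch of a "Variable User Experience" decision: map the factors
# (identity strength, device state, data sensitivity) to an access route.
# All names and thresholds are illustrative.

def access_route(two_factor: bool,
                 device_managed: bool,
                 device_compliant: bool,
                 data_sensitivity: str) -> str:
    if data_sensitivity == "high":
        # Sensitive data with full network access (DA/VPN) needs strong
        # authentication on a managed, policy-compliant device.
        if two_factor and device_managed and device_compliant:
            return "DirectAccess/VPN"
        # Strong auth on an unmanaged device: fall back to a session
        # where no data lands on the device itself.
        if two_factor:
            return "VDI/Citrix"
        return "Denied"
    # Non-sensitive data: web apps with simple authentication suffice.
    return "Web SSL"
```

Note that "Denied" is an explicit outcome of the function, not an afterthought – which is exactly the point of the next paragraph.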

I am not saying that we should loosen everything up so that every user can access highly sensitive data on an unpatched, unencrypted iPad or Windows XP machine. There is a "Denied" in there, and there has to be. There are administrative (HR) processes for policy violations, and there have to be. But we have to give users different – SIMPLE – ways to achieve their goals, or they will spend a lot of energy finding ways around our security controls.

    I know that we as security people can sleep well when we have all the controls in place. We did everything to secure the data and if it fails, the user is to blame and not us. But this does not help the business we are supporting, does it?

If a CEO or CIO reads this and is nodding now (which happens often when I talk about this subject) – you have a role in this as well: if the "*** hits the fan" and a security incident happens, think twice before you fire the Security Officer. We are talking about managing risks, and risks have a tendency to materialize once in a while – basically, you should fire the CSO only if he did not do his homework or does not have a proper incident process. Otherwise you create a culture of "CYA" (Cover Your Ass) instead of the openness and trust you need to land such a strategy.

I guess a lot of people disagree now :-) – let me know!

    Roger