I think I blogged about this event before: years ago I met a customer and talked about the future of IT. I told the audience (about ten people, including the Security Officer) that there was a good chance IT would no longer define a set of hardware; instead, users would buy their own devices and use them for business. Additionally, different people have different needs, and my notebook is set up differently from a lot of others on Microsoft's internal network – simply because I have different needs and I use one piece of hardware for both private and business purposes. In my case it is actually my own hardware. At that point the CSO left the room, complaining that I am completely nuts.
Where are we today? We all talk about Consumerization of IT (CoIT) and "Bring Your Own Device" (BYOD) – but the mindset in a lot of companies has not changed at all. They run BYOD projects and define a set of acceptable hardware models – which is outdated the moment they publish it, as it takes them a few months. I think we need to change our approach, because the world has changed. We need to think "policy and requirements", not "hardware models and OS builds". We might decide in a security policy that a device must have a TPM chip to protect its keys before it is allowed to access sensitive resources. We might require disk encryption to be switched on (BitLocker with TPM in our case). We might require IPSec policies to be deployed to authenticate the device to the server. We might require full patching, and so on. But why mandate the hardware? We might care about support, but if I bring my own device, hardware issues are my problem, aren't they? As long as this is clear, we can head that way and still offer supported hardware with a standard build to people who want the full internal service.
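To make the shift concrete, here is a minimal sketch of what "policy and requirements" instead of "hardware models" could look like in code. This is purely illustrative – the class, function names, and the exact set of requirements are my own assumptions, not an actual Microsoft policy engine:

```python
# Hypothetical sketch: an access policy expressed as requirements,
# not as a list of approved hardware models. All names are illustrative.
from dataclasses import dataclass

@dataclass
class DevicePosture:
    has_tpm: bool               # TPM chip available to protect keys
    disk_encrypted: bool        # e.g. BitLocker with TPM
    ipsec_policy_applied: bool  # device authenticates itself to the server
    fully_patched: bool

def meets_sensitive_access_policy(d: DevicePosture) -> bool:
    """True if the device satisfies the policy for sensitive resources."""
    return (d.has_tpm and d.disk_encrypted
            and d.ipsec_policy_applied and d.fully_patched)

# A personally owned device can qualify, as long as it meets the policy:
byod = DevicePosture(has_tpm=True, disk_encrypted=True,
                     ipsec_policy_applied=True, fully_patched=True)
print(meets_sensitive_access_policy(byod))  # → True
```

Note that nothing in the check mentions a vendor or a model – the policy stays valid even as hardware generations change.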
What will happen if we do not follow that path? To me it is fairly simple: Almost half of employees admit to bypassing security controls.
[…] half of sales-focused employees say their job is hindered because they aren't getting access to all the information they need. And with more than half of the respondents working for large organizations (the majority employing more than 5,000 people), the potential ramifications are notable.
A lot of security people I know have a false sense of security. Do you think internal security knows about these bypasses? No, not at all:
That's breeding apathy, too: 40% admitted that if they were breached no one would notice.
If we do not help our users to do their job in a secure and safe way, we risk our business. Think about it again:
While 40% of companies have lost a sales opportunity because employees weren't able to access the information they needed, an alarming 46% avoided the possibility of losing a sales opportunity by bypassing security controls to access necessary sensitive information to get the job done.
Can you really blame your sales people? Kind of, but they are measured on making money. What would you do in their shoes?
How much does this have to do with BYOD? A lot, to me, as it is just the next big wave – actually the one we have been riding since smartphones came up. Our users need access to information wherever they are, in the way they need it. Our job is to protect the company's assets in this context.
The way we do it at Microsoft is to apply something we call "Variable User Experience". It is based on different factors:
Based on these factors, I might have different routes to what I need to do:
I am not saying that we should loosen everything up so that every user can access highly sensitive data on an unpatched and unencrypted iPad or Windows XP machine. There is a "Denied" in there, and there has to be. There are administrative (HR) processes in there for violations of policy, and there have to be. But we have to give users different – SIMPLE – ways to achieve their goals, or they will spend a lot of energy finding ways around our security controls.
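A variable user experience of this kind can be sketched as a simple decision function: the same user gets a different route to the data depending on device posture and data classification, with "Denied" as one possible outcome. Again, the function, the classification labels, and the route names are my own illustrative assumptions:

```python
# Hypothetical sketch of a "variable user experience" decision.
# Labels and routes are illustrative, not an actual product's policy.

def route(device_compliant: bool, data_sensitivity: str) -> str:
    """Map device posture and data classification to an access route."""
    if data_sensitivity == "public":
        return "full access"          # public data: any device will do
    if device_compliant:
        return "full access"          # compliant device: the simple path
    if data_sensitivity == "internal":
        return "limited access"       # e.g. web-only view, no local copy
    return "denied"                   # sensitive data never reaches a
                                      # non-compliant device

print(route(False, "highly sensitive"))  # → denied
```

The point of the "limited access" branch is exactly the SIMPLE alternative argued for above: the user still gets their job done, just without the data landing on an unmanaged device.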
I know that we as security people can sleep well when we have all the controls in place. We did everything to secure the data, and if it fails, the user is to blame, not us. But this does not help the business we are supporting, does it?
If a CEO or CIO reads this and nods now (which happens often when I talk about this subject) – you have a role in this as well: if the "*** hits the fan" and a security incident happens, think twice before you fire the Security Officer. We are talking about managing risks, and risks have a tendency to materialize once in a while – basically, you should fire the CSO only if he did not do his homework or does not have a proper incident process. Otherwise you create a culture of "CYA" (Cover Your Ass) rather than the openness and trust you need to land such a strategy.
I guess a lot of people disagree now :-) – let me know!