Many Information Security people have mused why on Earth "Real People" (i.e. those without propellers)...
There are two parts to the human equation that your article reminds me of - the first is that which you've already covered, that we need to make our software suggest to the user that they want to make the right decision, the secure decision.
The second is we need to make it impossible to write malware that suggests to the user that they are taking the right decision, when they are not.
The latter is the harder part, even though the former is already incalculably hard.
I agree that we shouldn't give up on training users - there's always a new crop on the way, which is one reason why training doesn't seem to be successful in turning the tide. Rather than saying "these users will never learn", and giving up, we should instead say "maybe my teaching is ineffective", and study why it's ineffective, and how we can improve it.
I said that in the last article: IMHO it is important for Security Personnel to get out of the ivory tower and accept the fact that they have to work _for_ people, and especially that their biggest (only?) goal is to expect irrational, malicious and uneducated behaviour.
So finding a balance (or accepting calculated risks) is the most important task. And this is as dirty as it can get, since no clean-room model or formula can avoid dealing with humans in that role. Never ever look at your user as a hated annoyance.
BTW: just as I was re-reading your article, I had to think about software which _tries_ to be for people. The dialog in Figure 1 would look like:
Alert: Intrusion from attacker against your computer detected. Accept connection: yes/no.
i.e. no technical details, no IP, no port number, no protocol number, maybe the name of the target application (but in that case the window title, not the image name), etc.
I bet the idea behind that is to have a dialog which does not confuse the non-techies. Of course it is neither helpful to techies nor to the user. AND it pisses off the techies, who need to explain those messages and can't, because critical information was suppressed.
This kind of message was typical of some 3rd-party firewall software, but it is also quite common for MS software to hide technical details. (For example, there is software out there that will show neither whether the traffic is TCP or UDP, nor what the sender and receiver IPs were.)
In the case of inbound personal firewalls there is only one implementation useful for the human user: drop the noise from outside without comment (i.e. a hidden logfile).
I am not sure what to say about outbound connections (because their usefulness is pretty doubtful), but at least I do think dialogs can be made more explanatory for common cases like DNS, update checks, packets from anonymous service processes (for some strange IP stack reasons), and web browsing.
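To make the point concrete, a dialog could serve both audiences at once: a plain-language headline for the non-techies, with the technical details (protocol, addresses, ports) preserved below for anyone who has to explain the message later. A minimal sketch, assuming a hypothetical `format_alert` helper; the field names and wording are my own, not from any real firewall product:

```python
def format_alert(direction, proto, src_ip, src_port, dst_ip, dst_port, app_title):
    """Build a firewall alert with a plain-language headline
    plus the technical details a techie would need."""
    headline = f"Blocked {direction} connection to '{app_title}'."
    details = (f"Protocol: {proto}  "
               f"From: {src_ip}:{src_port}  To: {dst_ip}:{dst_port}")
    return headline + "\n" + details

# Example: an inbound TCP probe against a file-sharing service
msg = format_alert("inbound", "TCP", "203.0.113.7", 4455,
                   "192.168.1.10", 445, "File Sharing")
print(msg)
```

The non-techie reads only the first line; the second line is what the techie on the phone asks for, so nothing critical is suppressed.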
"Rather than saying 'these users will never learn', and giving up, we should instead say 'maybe my teaching is ineffective', and study why it's ineffective, and how we can improve it."
That one's easy mate; you're not using a long enough lump of 2x4.
I think thoughts such as "security professionals need to know people better.. etc etc" are an overanalysis. It sounds like the thesis of a misguided university student looking for subject matter.
Security is still just a problem-solving exercise like everything else. The goals are somewhat non-human, e.g. "maximise the difference between a real alert and a fake alert". An average user is just an extrapolation from an expert. Even experts need to look out for signs to determine what is a scam and what is not; average users just need it more. But why go halfway with anything? Why make something subtle when it can be obvious?
Andrew, you are probably right. However, what I do not know, and would like to, is what defines the expert. If we can enumerate what the expert does to make a security decision - what signs he or she looks for, what the relative importance of each is - then maybe we can teach others to look for the same things? In fact, maybe we could even build software that highlights those? I'm not sure anyone has done that analysis, which is why I started out by saying we need to be more like normal people, and at the same time, they need to be enabled to become more like us.
Jesper: I see where you are coming from. But that is somewhat different from the article, which was not asking "What defines an expert?" but "What defines the others?". What defines an expert is knowledge and intelligent decision making. And turning the thought processes of experts into software is an excellent method of bridging the gap (along with education). A pop-up that says "this .exe file may run malicious software" is a simplified version of translating an expert's thought process of ".exe files are executable -> executable files can run processes that compromise my computer" into something other people can use.
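That kind of translation can be as simple as a lookup from file type to a plain-language warning. A toy sketch, assuming a hypothetical extension table; the extensions chosen and the wording are my own, purely illustrative:

```python
# Hypothetical mapping from risky file extensions to the plain-language
# version of the expert's "this can run code" reasoning.
RISKY_EXTENSIONS = {
    ".exe": "This file can run programs on your computer.",
    ".scr": "Screen savers are programs and can do anything a program can.",
    ".vbs": "This is a script that can change your files and settings.",
}

def warning_for(filename):
    """Return an expert-style warning for a filename, or None if not flagged."""
    name = filename.lower()
    for ext, warning in RISKY_EXTENSIONS.items():
        if name.endswith(ext):
            return warning
    return None

print(warning_for("dancing_pigs.EXE"))
print(warning_for("holiday_photo.jpg"))
```

The expert's chain of inference is baked into the table once, so the user only ever sees its conclusion.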
I guess what I am saying is: you don't necessarily need to delve into the minds of the average user with the help of complex studies and cognitive psychologists, when you can simply extrapolate down to them having no thought processes above common sense, and no knowledge of computers above what is required to simply operate the software.
And so I guess my difference of opinion lies in suggesting that it is not so hard to identify what an expert does, and contrast that with what someone with no knowledge, and no intelligence above common sense, can do. We are all humans and all similar.
I know there is probably a valid argument that "we cannot see other people's perspectives" etc etc, and I have worked with organisational psychologists before who have told me how much insight they get from visiting and observing people at the lowest level of intelligence. But I don't know; maybe you are right, maybe I am right, who knows. It is very situational and hard to generalise. I mean, just thinking of the people at the lowest level of intelligence, how hard would it be to get them to run something, no matter what measures or educational resources were in place? I think designing on the assumption that the user has only basic common sense is all you can do. Going below that would be impractical from a design perspective.
I can easily see that people do not understand files, directories, operating systems. So basically they do not understand anything. You can either hope they are motivated enough to learn, or make the software teach them. But I don't think it is too hard to simply accept that "hey, these people don't know anything". But of course, they can learn if they want, and I have faith in people just as you do. And what you are suggesting, getting into more complex analysis of situations to assess their security impact (and converting this into a software solution), is way above what the average user can comprehend, as they do not even know what an operating system is. The other day a friend of mine said "I bought a whole new operating system: Office 2003". So I don't think there needs to be a study to find out what people think, because they don't think anything. They will do whatever is slightly convincing. Then again there probably is value in finding out how they are fooled by simple scams, and the thought processes underlying that. But I don't know if it is that hard to deduce, based on their zero knowledge.
I think there are definitely areas for improvement, but I don't think things are failing because of really poor software design. Each release of Outlook, for example, has better ways of bridging the gap between those who know nothing and experts, without downgrading user friendliness to an unacceptable level. I think there are bigger issues in the areas of motivation (the general population do not care about computer security; they don't lose any money, don't hear about people losing lots of money, and hardly ever lose their data) and lack of proper education when they seek it ("If I want to be secure, I just have to use a virus scanner").
Anyway I am not sure I made any points, but at least I have made this comment long enough for nobody to read and critique it. I am not sure we disagree on much.
In the blog comments, a poster says: "Every time I write up a memo to use WSUS it gets shot down..."
Andrew, I think you made several important points. Among them, you made the point that transferring expertise from the experts to the non-experts is important. That is of course what trainers do, and I have spent much of my life as a trainer. Regardless, I am not sure how to proceed here. The first step, I still claim, is for those of us who have some level of expertise to step back and analyze the differences between us and those who do not. That would create a gap analysis, which we can then use to figure out how to bridge the gap.
Jesper: I agree with regard to the gap analysis, but am suggesting it is something someone can do in their head, and that it should be focused more on the expert than on the inexperienced user.
Basically, an expert could chart their own thought processes and assume that the average user has only the tool of common sense. Everything else is the gap. I don't think comparing a normal user's thought processes with an expert's will be meaningful, as the normal user does not have the knowledge of computers required to have meaningful problem-solving skills worth analysing.
So you would start your process at zero, basically. Your example of the firewall message meaning something different to the average user can be explained in one way: they do not have the knowledge to comprehend the message. What I think you are suggesting is, "they have some different thought process that must be analysed so that we can design for these people". What I am suggesting is that lack of knowledge means they see the picture as "fuzzy", without all the detail, which means it becomes "click Allow to see the dancing pigs". To some degree they will think "maybe this is dangerous", but their level of motivation is too low to instigate learning.
I agree that there are a lot of complex human-behaviour decisions involved in designing software, but I don't agree with the idea that "we are different from them, and will never be able to design properly without a cognitive psychologist explaining their thought patterns".
I also don't think it is difficult to explain your own processes to yourself. It would involve a lot of work, though, as you basically have to piece everything together from "what is a domain name" upwards. When you get above the lack-of-knowledge stage, into the different-thought-processes stage, you would probably need to compare someone with the same level of knowledge as you, but a different level of intelligence, to see what the difference in thought processes would be. Comparing the thought processes of people with different levels of knowledge would not have any meaning.
When I first moved from the business to IT, I was presented with two signs to fix to my desk. One read "Never underestimate the stupidity of a user", the other "Once a user - always a user". Hopefully, over the past 20 years I've managed to outgrow the second, but the first is as true now as it was then.
IMHO the problem is very simple, people only care about things that matter to them and things only matter when they realise they affect them.
The trick we have to perfect is making users realise that Information Security is not the same as plane spotting, i.e. of interest to just the limited few (apologies to all plane spotters - no flames please), but is actually part of their daily life and should 'just be the way they do things', like looking before you cross the road, or assessing the risk of buying a new widget before handing over your money, or not writing the PIN for your hole-in-the-wall card on the front of the card.
Until we achieve that, they will still double-click on links in e-mails and choose obvious passwords.
Jesper: I think the people are paramount to security - from Social Engineering to noticing the abnormal - particularly at gross levels. I find it amazing that anyone would write off people - or allow them to be complacent and not an integral part of a comprehensive security posture. Granted some users will take this to different levels than others and providing a good balance of guidelines is difficult. But encouraging users to exercise their common sense and notice the obvious is so critical from physical security to information security. Many users view IT Departments as a superior, unapproachable staff - that should never be questioned. Or when they do try to communicate – it is painful to them or demeaning. This is very, very bad.
Think of the USS Cole. I was an active duty sailor at one time, and we were taught to FOLLOW ORDERS - not to think for ourselves, in many cases. So being in a foreign port, as a sailor in that mindset, I, like others, may not have thought twice about a fuel/oil barge with two Turkish-looking men smoking cigarettes pulling alongside my ship while docked. Even with the .45 revolver on my hip, I would not have thought to even consider taking it out of the holster unless directed by a superior officer. Amazing how deadly complacency can be!
I feel that this same complacency holds true in IT. To top it off, many IT staff personnel are indirectly teaching "fear" of computer systems to our users, rather than teaching them to protect, take charge, or actively notice changes in the behavior of systems, people, space, their environment in general. Your statement about Empowering the People is key. Sometimes you do not have to teach them everything; ask for their help in the overall security posture, as their responsibility as an employee (empowering). Give examples, ask them to think for themselves and make good judgment calls, perhaps using examples from an IT perspective. This is definitely a great start. Why is this viewed as so impossible?
After thinking about it for a bit, maybe you are right Jesper. Maybe there are people other than leading software developers that can teach people better, through both software and traditionally. I think intelligent software engineers are maybe a lot more abstract in their thinking, and maybe it is hard to transfer concepts at a level best suited to the average person. I remember an article once about scientists needing a "spokesperson", who could better translate things to the average person, as it was proposed that the scientists were too abstract and theoretical.
I wonder, though; there are a lot of questions unanswered in that theory. You would think those with a more abstract level of thinking would be able to come up with more novel approaches to translating information. I am not really sure about the whole idea of "some people are better teachers". If you are a really bad communicator, though, you are going to be really bad in the role of someone who has to design software that teaches people security. But I think there are a lot of people who wear propeller hats that are very good communicators. I also wonder if, by reducing the level at which the information is presented, you are reducing its effectiveness in combating security threats.
I totally agree with the article, though; there is definitely a somewhat new role for security developers, or maybe for a new type of person altogether, involving teaching and communicating rather than developing abstract concepts and implementing structured systems. It was a good article.