During the past few months many people have lamented that Windows lacks a nuclear missile-style control option for administrator passwords. Surely you've read about or seen photographs of missile silos where two operators, separated by a distance greater than the span of a single human's arms, must each simultaneously turn a key in a switch to launch a missile. Such a fail-safe is important for missile launches: presumably, a nation can't be committed to global thermonuclear war by the deranged whims of a single raving lunatic.
At first glance, it seems reasonable to allow for similar control over domain and enterprise administrator accounts. A while back I wrote about the fundamental requirement of trust in administrators; missile control-style passwords (is there some official term for this?) might lessen the requirement for such trust, goes the thinking. Well, I'm not convinced that the logic that works for missile silos extends to administrator passwords. Let's examine the differences.
It works for missile silos because the fail-safe is tuned to the characteristics of its environment. It takes two keys, each of which must be rotated simultaneously, and they're separated by around ten feet or so: therefore, two humans absolutely are required. To accidentally or intentionally launch a missile when not under orders, both people must be either equally stupid or equally insane -- and in the second case, each must also trust that the other is, in fact, a criminal, rather than a double agent attempting to entrap him. Furthermore, both operators perform the function in full view of a whole lot of government staff and military officers. The environment and the fail-safe work together to keep the deadly missiles in the ground. Another important aspect is this: the silo and its control system are designed by and operated by the same entity, the government.
Now compare that to a domain controller. Let's say that it's possible to enable a feature that requires entering two passwords. Where would you do this? A logon screen with two password entry fields lacks both physical and human separation: one person could enter both passwords if he or she knew them. It's no better with smartcards -- again, one person could insert both cards into the readers. Replicating a silo-like environment using a pair of computers isn't the answer, because unlike the silos and their control systems, Microsoft designs Windows but you operate it. The fail-safe works for the silos because of the required physical separation. Microsoft can't dictate, and certainly can't enforce, that you have two domain controllers, separated by at least ten feet. Not everyone can afford all the necessary hardware; plus, think of the demands that would place on space and power in a data center. And besides, even with separated domain controllers, a malicious admin need only enter the first password or insert the first smartcard at one computer, then wheel over to the other one and enter the second password or smartcard there. I'm not sure there's a way to check for simultaneous credential entry.
Separation and delegation of administrative duties is, of course, a good and important concept, one that we'll continue to refine throughout the operating system. There's a lot of power granted to administrators right now; we will help you segregate this power among multiple roles (humans) in your organization. But because of the nature of computer systems, any human granted a particular bit of administrative power must be trusted with that power. Computer systems and the data they store, process, and protect aren't silos; applying silo-style security is the wrong approach to mitigating security risk.
It's great to see you posting again. The Trustworthy Administrator article is an interesting read!
I've seen an implementation of this down here, Steve. The Enterprise Admin and Schema Admin password is broken into two halves: one admin enters the first half, a second admin enters the second. This is in a DoD environment; they're anal about security and do *not* trust anyone.
But you've got to admit, the idea of having the "two keys" option for some uberadmin account really does seem cool sometimes! The whole two-keys-ten-meters-apart thing really does endow the keyholder with a sense of importance he may otherwise not get elsewhere :) People, in my experience, are led almost solely by how secure they perceive themselves/their companies to be, often in ignorance of the "real" truth - buying this kind of mumbo jumbo increases their perceived level of security and makes them happy. This is only really challenged if a company is ever hacked AND actually discovers it. But does it matter? Many small to mid-sized companies make good "script kiddie" targets but are of little interest to other commercial organisations. Should they care? When does security become philosophy? ;)
WAYNE-- It still isn't the same thing. In the silo, even if one person obtains both keys, that person can't launch the missile because his arms won't span the distance between the switches. The silo always requires two "authenticators." A split password is still only a single authenticator from the point of view of the secured device; the device has no proof or even knowledge that the single authenticator is divided in half with each half given to individual humans. Because the device can't enforce the security model, sufficiently motivated humans can defeat it. There's also no way to audit which human entered which half.
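To make the point concrete, here's a hypothetical sketch (not how Windows actually validates credentials -- the function and variable names are mine): from the verifier's perspective, a split password is still one string checked against one stored value, so the system has no way to know, prove, or audit that two different humans supplied the halves.

```python
import hashlib

# The system stores one hash. It has no idea the password was ever "split."
stored_hash = hashlib.sha256(b"firstHalf" + b"secondHalf").hexdigest()

def verify(entered: bytes) -> bool:
    """The verifier sees exactly one credential: a single string."""
    return hashlib.sha256(entered).hexdigest() == stored_hash

# Two admins each type their half at the same keyboard...
half_a = b"firstHalf"   # typed by admin 1
half_b = b"secondHalf"  # typed by admin 2 -- or by admin 1, if he learned it

# ...but the device can't distinguish that from one person typing it all.
assert verify(half_a + half_b)
```

The silo's fail-safe is enforced by physics (arm span); this one is enforced only by policy, which sufficiently motivated humans can ignore.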
SQUIDGE-- "When does security become philosophy?" As far as I see it, these are synonyms :)
I have used, and on occasion mandated, the password split into two parts because (a) the perception of security mattered in the project being delivered - ah, the politics of project delivery - and (b) doing so at least slowed down attacks on the account in question, since both safes (or the teams managing the safes) needed to be compromised or persuaded to act together.
Doing this can be of value in some situations, but it delivers only an incremental increase in security, if that, on the never-ending path toward security - and sometimes a little is a lot better than nothing at all.
I saw a movie once (the name has escaped me) that presented a scenario to overcome the two-key system. One missile control operator shot the other, then used a long pole with a clamp on the end to turn the dead operator's key while turning his own key with his other hand. In the movie, the operator assembled the pole and clamp from parts in a duffel bag he brought in with him, which also held personal items like his lunch. I wonder if after this movie NORAD stopped allowing operators to brown-bag it?
Are there really only the two operators in the control room? I thought there'd be a lot more folks hanging around, meaning that if Alice shot Bob, someone else almost certainly would immediately shoot Alice. Assuming that folks in the control room can be armed, of course.
In my opinion, the post assumes that a malicious person could find out both passwords. In that case, the physical security of a missile launch wouldn't hold either. Still, spreading two passwords across two admins certainly increases security against intentional or unintentional internal tampering, because an attacker needs to find out both passwords first (or, if the malicious person is one of the admins, needs to find out the second password).
What about biometrics? I've worked in an environment where three administrators each typed one third of a password - we really wanted an n-of-m scenario (3 of 7 administrators had to authenticate, for example), but that's not so easy to do with just a split password. Seems like a system with two (or more) fingerprint scanners would fill the need (or one reader that required two unique fingerprints within a few seconds of one another).
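The n-of-m scenario the commenter describes does have a standard cryptographic answer: secret sharing (Shamir's scheme), where any threshold number of shares reconstructs the secret and fewer reveal nothing. A minimal sketch follows - an illustration of the math, not a production implementation, and all function names are mine:

```python
import random

PRIME = 2**127 - 1  # prime modulus; the secret must be smaller than this

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it.

    Embeds the secret as the constant term of a random polynomial of
    degree threshold-1, then hands out points on that polynomial.
    """
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 yields the polynomial's constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# The 3-of-7 case the commenter wanted:
secret = 123456789
shares = make_shares(secret, threshold=3, count=7)
assert recover(shares[:3]) == secret    # any 3 shares suffice
assert recover(shares[2:5]) == secret   # any other 3 work too
```

Note, though, that this doesn't escape Steve's objection: once the shares are combined and the secret is typed in, the device still sees a single authenticator.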
…and I partially agree with Squidge – the benefit of "high security" is largely perception – most "normal" measures can be mitigated pretty easily, but having highly visible security measures in place at least makes people think twice about the importance of what they're doing… it's not so much about "happiness" though, it's about keeping people aware of the sensitivity of their work.
Many IT people I know require their users to come up with complex passwords and require them to change those passwords regularly.
There is a term for "missile control-style passwords": it's known as "dual control."
Dual control of passwords is not foolproof, nor is it by itself a preventive system control. When implementing it, a process must be in place to change the passwords regularly, with each of the parties involved changing his own portion.
There should also be a strong change control and monitoring process to track the usage and activities of powerful IDs and ensure that they are authorized.
While most folks will scream "what the hell," this is in fact a good practice, and it can be used to protect good administrators when things go wrong. They can safely say, "It wasn't me, because I don't have the full password to do anything crazy."