Rant: Who watches the watchers?

Computer security can be an arcane subject, especially for the “uninitiated” who don’t know what phrases like “risk mitigation”, “threat profile”, and “single loss expectancy” mean. But a lot of computer security boils down to fundamental ideas about trust that we’re already used to in the real world. This week at work I was handed a very frustrating example of these fundamentals.

In security jargon, we talk about “controls”, especially “technical controls” vs. “procedural controls”. Let me break that down into plain English for you. A procedural control basically means “we told someone not to do a bad thing, and we trust that they’ll listen to us.” A technical control means “we don’t have to trust someone, because the system won’t do the bad thing even if the person wants to.” In the security world, technical controls are almost always preferable, since they allow your organization to take someone’s trustworthiness out of the equation.

A simple real-life example of these two types of controls is the lock on a door. In some situations, say college roommates who grew up together, locking doors isn’t necessary because the people involved are trustworthy. In another situation, the exterior door of your apartment, you can’t trust the other people, so you demand a reasonable lock to secure your living space. And in further extremes, like protecting weapons or biological agents, the people involved are trustworthy but the potential damage is so high that strong locks and other controls (guards, video cameras, fences, etc.) are required.

As you can see from these examples, just because the people involved are trustworthy doesn’t mean a system with lax controls is adequate. If the risk of damage is large, prudence demands that we design a system that “watches the watchers”, so to speak.

The example from work wasn’t nearly as dangerous as biological agents. But it was all the more frustrating because, just a few days earlier, I had pointed out how easily the operations team could implement better controls on their patching process. Then yesterday it came out that the swing-shift operators had installed software patches on the wrong boxes, an error made possible by the lack of technical controls and by the operations leader’s attitude that the problem was “reminding the swing shift guys they shouldn’t patch those machines.”
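To make that concrete, here’s a minimal sketch of what a technical control on a patching process could look like: a wrapper script that refuses to patch any host that isn’t on an approved list. The file path, the host list, and the patch command here are all assumptions for illustration, not our actual process; the point is simply that the system enforces the rule instead of relying on a reminder.

```python
#!/usr/bin/env python3
"""Hypothetical technical control for patching: only run on approved hosts.

The approved-hosts file path and the patch command are assumptions made
for this sketch, not the real operations setup.
"""

import socket
import subprocess
import sys

APPROVED_HOSTS_FILE = "/etc/patching/approved_hosts.txt"  # assumed path


def main() -> int:
    hostname = socket.gethostname()

    # Load the list of hosts cleared for this patch cycle.
    try:
        with open(APPROVED_HOSTS_FILE) as f:
            approved = {line.strip() for line in f if line.strip()}
    except FileNotFoundError:
        print("No approved-hosts list found; refusing to patch.", file=sys.stderr)
        return 1

    if hostname not in approved:
        # The system says no, regardless of who is at the keyboard.
        print(f"{hostname} is not approved for this patch cycle; aborting.",
              file=sys.stderr)
        return 1

    # Placeholder for the real patch command (assumed here to be yum).
    return subprocess.call(["yum", "-y", "update"])


if __name__ == "__main__":
    sys.exit(main())
```

A dozen lines like these turn “please remember not to patch those machines” into something the swing shift can’t do by accident.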

No, the problem is that you aren’t even willing to learn from your mistakes and implement new controls after you’ve been burned once…