Feature

Highly secure systems work only if they need no input from users

In the second extract from his book, security expert Bruce Schneier argues that no end-user can be trusted to keep systems secure.

It has been said that the most insecure system is the one that is not used. And more often than not, a security system is not used because it is just too irritating.

I once did some work for the security group in a major multinational corporation. They were concerned that their senior management was doing business on insecure phones - land lines and cellular - sometimes in a foreign country. Could I help?

There were several secure-voice products, and we talked about them and how they worked. The voice quality was not as good as normal phones. There was a several-second delay at the start of the call while the encryption algorithm was initialised. The phones were a little larger than the smallest and sexiest cellular phones. But their conversations would be encrypted.

Not good enough, said the senior executives. They wanted a secure phone, but they were unwilling to live with inferior voice quality or longer call-setup time. And in the end, they continued talking over insecure phones.

People want security, but they do not want to see it working. It is instructive to talk with people who remember when a front door lock was first installed on their house. Some of these people are still alive, usually in rural areas. (City houses have had door locks for centuries; rural areas went without them for a long time.)

These people talk about what an imposition a front door lock was. They did not think it was right that they had to fish around for a key, put it in a lock, and then turn the key, just to get into their own home.

Computer security is no different. Find someone who used computers before there were passwords and permissions and limitations. Ask them how much they liked it when security measures were added. Ask them if they tried to get around the security, just because it was easier.

Even today, when the deadline approaches, people do not think twice about bypassing security. They will prop the fire door open so that someone can get into the building more easily, and they will give out their password or take down a firewall because the work has to get done.

John Deutch, the former director of the US Central Intelligence Agency, brought classified files home with him on his insecure laptop because it was easier.

People cannot be trusted to implement computer security policies, just as they cannot be trusted to lock their car doors, not lose their wallets, and not tell anyone their mother's maiden name.

They cannot be trusted to do things properly. In a 1999 usability study at Carnegie Mellon University, researchers found that most people could not use the PGP e-mail encryption program correctly.

Of the 12 people who participated in the experiment, eight never managed to figure out how PGP 5.0 worked. Four of them accidentally sent out unencrypted messages that revealed confidential information. And this was with the program's easy-to-use graphical interface.

Moreover, end-users cannot be trusted to make intelligent security decisions. After the Melissa and Worm.ExploreZip scares of 1999, you might think people would have learned not to open attachments they were not expecting. But the infection rate of the ILoveYou worm (and its dozens of variants) taught us that no, people cannot be trained not to open attachments.

A revised paperback edition of Secrets and Lies: Digital Security in a Networked World, by Bruce Schneier, was published this year by Wiley, priced £11.99.

www.wileyeurope.com

This was first published in November 2004
