Increasingly, computer security is focusing on the human component within a system. About five years ago, security research and practice accepted that most users do not comply with security policies, and that this non-compliance is what makes attacks on computer systems possible.
People are the weakest link in the security chain and, as reformed hacker Kevin Mitnick points out, social engineering attacks succeed because hackers now make the effort to acquire knowledge of these human factors, whereas the designers of security systems do not.
What can system designers do to prevent undesirable user behaviour? A major conference in London in May, supported by the BCS, will look at determining the threat and minimising the risk while maintaining the integrity and availability of information.
Unfortunately, usability and security are often seen as competing goals in system design. When it comes to security, there is a widely held belief that "if it's not hurting, it's not working": good security, the thinking goes, is necessarily cumbersome and difficult. This short-sighted view ignores the fact that the purpose of security is to protect essential business processes, not to stop them.
Decisions about security mechanisms are often made with the sole aim of protecting the technology, with no thought for how the mechanisms fit into employees' tasks or the organisation's business processes. Employees then spend far too much time on the phone to security helpdesks, waiting for cryptic passwords they cannot remember to be reset, or trying to think up passwords that will pass the proactive password checker.
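A "proactive password checker" is a tool that vets a candidate password against composition rules before accepting it. A minimal sketch of the idea, with hypothetical rules rather than any particular product's policy, might look like:

```python
import re

def check_password(password: str) -> list[str]:
    """Return a list of policy violations; an empty list means the password passes.

    The rules below are illustrative examples of the composition policies
    such checkers typically enforce, not a recommended policy.
    """
    problems = []
    if len(password) < 8:
        problems.append("shorter than 8 characters")
    if not re.search(r"[A-Z]", password):
        problems.append("no upper-case letter")
    if not re.search(r"[a-z]", password):
        problems.append("no lower-case letter")
    if not re.search(r"[0-9]", password):
        problems.append("no digit")
    # Real checkers use large dictionaries; this tiny blocklist stands in for one.
    if password.lower() in {"password", "letmein", "qwerty123"}:
        problems.append("appears in a blocklist of common passwords")
    return problems
```

Each rule makes a password harder to guess, but also harder for the user to remember, which is exactly the trade-off that drives people to helpdesks and sticky notes.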
An everyday example of this burden can be seen in airports. Increasing numbers of business travellers are carrying two laptops around. Why? Are today's laptops so unreliable that you need to bring a spare? Do they have to keep competing clients' work strictly separate? The answer is more bizarre.
Laptops are frequently issued with a standard configuration specified by the employer's security department. One businessman ruefully explained his situation: "I need to access site X and run tool Y, and the standard configuration does not allow it. So I bought myself a personal laptop on which I can do these things, and I use a USB stick to transfer files between them."
Even the most usable security mechanism may create extra work for users. It is human nature to look for shortcuts, especially when people do not understand how their behaviour might undermine security.
Organisations can change the current disregard that many employees have for security through well-designed security awareness, education and training programmes.
In addition to usable and appropriate security mechanisms, organisations need a positive security culture, based on a shared understanding that security matters to the business and is a shared responsibility.
Security is everyone's business, not just that of the security department, but we must stop burdening users with over-complicated mechanisms, impossible demands, and impenetrable jargon.
Angela Sasse, professor of human-centred technology at University College London, will speak at the Information Security in the Public Sector conference in London in May