Feature

Encryption: the key to secure data?

Is there such a thing as totally secure encryption? And which technologies are commercially viable? Danny Bradbury explores approaches to transmitting information securely

For as long as modern computers have been around, they have been associated with encryption in one way or another. It is no coincidence that the first semi-programmable computer, Colossus, was developed to decrypt messages during the Second World War.

Encryption relies on encoding information in a way that makes it difficult to decode without either the key or an awful lot of mathematical muscle. The longer the key (measured in bits), the harder the encryption is to break. Although many encryption techniques are unbreakable in practice, very few are unbreakable in theory, given enough time or processing power.
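To get a feel for why key length matters so much, the back-of-the-envelope Python sketch below (purely illustrative, not tied to any product mentioned in this article) compares the number of keys a brute-force attacker would have to search for a 56-bit key against a 128-bit one.

```python
# Rough brute-force comparison: every extra key bit doubles the search space.
des_keyspace = 2 ** 56     # the key size used by DES (see below)
aes_keyspace = 2 ** 128    # the minimum key size used by AES

print(f"56-bit keyspace:  {des_keyspace:.3e} possible keys")
print(f"128-bit keyspace: {aes_keyspace:.3e} possible keys")
print(f"Ratio: a 128-bit key offers {aes_keyspace // des_keyspace:.3e} times more keys")
```

Each extra bit doubles the attacker's work, which is why the industry's response to faster processors has largely been longer keys.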

Encryption techniques separate into two main types, explains Bernard Parsons, chief technology officer at security software company BeCrypt. Symmetric encryption dates back to the Roman empire and beyond, but asymmetric encryption is more recent.

Commonly used for file encryption (for example, to protect the data on a laptop in the event of theft), symmetric encryption uses a single key to both encrypt and decrypt data. "As the field of cryptography became better understood in the public domain, a number of algorithms were proposed, all of them based on difficult mathematical problems," says Parsons.

The trick is to ensure that the mathematical problem is sufficiently complex to stop it being solved by current computing technology. Developing such problems requires not only significant mathematical skill, but also an agreement between multiple parties to use the same mathematical algorithm to encrypt and decrypt data, in order to exchange files.

Consequently, standards became important in the early days of modern computerised encryption in the mid-1970s. One of the first was the Data Encryption Standard (DES), an algorithm built around a 56-bit key. DES was at one time considered strong enough to be used for banks' automatic teller machines, but as processing power increased it was replaced by triple DES, which runs each piece of data through the DES algorithm three times for extra strength.

"Towards the end of the 1980s questions were asked about the appropriateness of triple DES for a number of reasons, one being performance," says Parsons. A new encryption standard called AES (Advanced Encryption Standard) was established in 2001, and it is still considered to be state-of-the-art.

Symmetric encryption is all very well, but if you want to send the key to someone else so that they can decrypt your message, how do you prevent it falling into the wrong hands?

You could encrypt the key with a second key, but then the problem of sending that second key safely to the recipient remains. Short of physically handing the key over, which is unworkable in a commercial context, there is no safe way to distribute keys so that others can decrypt your messages.

This is where the second type of encryption - asymmetric or public key encryption - comes in. Public key encryption uses two keys: a private one and a public one. If one key is used to encrypt, the other will decrypt. If company A wants to send a message to company B it uses B's public key, which is available to everyone, to encrypt the message. Once it is encrypted, the only thing that can decrypt the message is B's private key, to which only it has access. The original developers of this technology formed RSA Security, which still uses the algorithm today in its products.
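A minimal sketch of that exchange, again assuming the Python cryptography package and hypothetical companies A and B, looks like this: B publishes its public key, and only B's private key can recover what A encrypts with it.

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Company B generates a key pair and publishes the public half.
b_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
b_public_key = b_private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Company A encrypts using B's freely available public key...
ciphertext = b_public_key.encrypt(b"Hello from company A", oaep)

# ...and only B, holding the private key, can decrypt the result.
assert b_private_key.decrypt(ciphertext, oaep) == b"Hello from company A"
```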

"Symmetric key encryption is always faster than asymmetric, so what you do is encrypt a piece of data using a symmetric key and then encrypt the key using the RSA algorithm," says Mike Vegara, director of product management at RSA. Whereas AES has a minimum cipher length of 128 bits, the RSA algorithm starts at 1,024 bits. But the trade-off is that RSA is incredibly difficult to break, says Nicko van Someren, chief technical officer at cryptographic hardware provider nCipher. He says that 1,024-bit RSA takes 10 times as many processor cycles to do the computation, but it takes in the order of 30,000 times longer to break.

An alternative to the RSA algorithm is elliptic curve cryptography, which achieves comparable security with keys as short as 160 bits and can be a useful form of asymmetric cryptography on resource-constrained devices such as PDAs and smartphones.
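Elliptic curve keys are typically used for key agreement or digital signatures rather than for encrypting data directly. The sketch below, another illustrative assumption built on the Python cryptography package, shows two parties deriving the same shared secret from each other's public keys; it uses the 256-bit P-256 curve rather than a 160-bit one, but the idea is the same: far fewer bits than RSA for similar strength.

```python
from cryptography.hazmat.primitives.asymmetric import ec

# Each side generates a small elliptic curve key pair...
device_key = ec.generate_private_key(ec.SECP256R1())
server_key = ec.generate_private_key(ec.SECP256R1())

# ...and each derives the same shared secret from the other's public key (ECDH).
secret_on_device = device_key.exchange(ec.ECDH(), server_key.public_key())
secret_on_server = server_key.exchange(ec.ECDH(), device_key.public_key())
assert secret_on_device == secret_on_server
```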

But this does not solve the problem of authentication. If company A encrypts a key using company B's public key and sends it to B, B has no way of knowing that the key really came from A. Perhaps a third party sent the encrypted key to fool B into thinking that the message encoded with it was from A. Digital signatures provide a way around this by enabling senders to "sign" their keys and messages.

Company A creates its digital signature using its private key. As before, it encrypts the message it wants to send using a symmetric algorithm, then encrypts the symmetric key using B's public key. But it also runs the unencrypted message through a mathematical algorithm called a hashing function, which produces a short string of characters that is effectively unique to that message. It then encrypts this string (known as a hash) with its own private key. Everything is then sent to B.

As before, B uses its private key to decrypt the symmetric key, which it then uses to decrypt A's message. It then uses A's public key to decrypt the hash, and runs the decrypted message through the same hashing algorithm A used. If B's hash matches A's, it knows two things: first, that the message is the same one A ran through the algorithm, so it has not been tampered with en route; and second, that the message came from A, because a hash that decrypts with A's public key must have been encrypted with A's private key.
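Modern libraries bundle the hash-then-encrypt-with-the-private-key step into a single sign-and-verify operation. A minimal sketch, once more assuming the Python cryptography package (the RSA-PSS padding and SHA-256 hash are choices made here for illustration, not something the article specifies):

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Company A's key pair; the public half is what company B holds.
a_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
a_public_key = a_private_key.public_key()

message = b"Purchase order from company A"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

# A signs: the library hashes the message and binds the hash to A's private key.
signature = a_private_key.sign(message, pss, hashes.SHA256())

# B verifies with A's public key; verify() raises InvalidSignature if the message
# was altered in transit or was not signed with A's private key.
a_public_key.verify(signature, message, pss, hashes.SHA256())
```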

Like encryption algorithms, hashing algorithms come in various flavours. MD5 is still used in many systems, but has been superseded by SHA-1, created by the National Security Agency in the mid-1990s. However, the safety of SHA-1 has been questioned by the cryptography community following some reported attacks.
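The common hashing functions ship with Python's standard library, which makes their behaviour easy to demonstrate; the snippet below is a simple illustration rather than anything tied to the systems discussed above.

```python
import hashlib

message = b"The same input always produces the same digest"

print("MD5:    ", hashlib.md5(message).hexdigest())      # 128-bit digest, now considered weak
print("SHA-1:  ", hashlib.sha1(message).hexdigest())     # 160-bit digest
print("SHA-256:", hashlib.sha256(message).hexdigest())   # one of the stronger successors

# Changing even one byte of the input produces a completely different hash.
print("SHA-1:  ", hashlib.sha1(message + b"!").hexdigest())
```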

Just because an algorithm has been attacked, however, does not mean that it is commercially unusable. "When you have a full attack on a function, it normally does not have immediate consequences," says David Naccache, vice-president of research and innovation at smartcard supplier Gemplus.

He adds that many attacks are theoretical and would be impractical to carry out in real-world conditions. "On the other hand, it is not a healthy indicator," he says. After a theoretical attack has been discovered, the cryptography community analyses the risk to existing users and recommends appropriate action.

Public key encryption still faced a major challenge, which was to verify that people's private and public keys were not being created fraudulently. Trusted certificate authorities (such as VeriSign) were set up to help govern the creation of keys, in what became known as public key infrastructures (PKIs).

Within a PKI, a certificate authority would create and sign company A's key to verify it. But users were slow to take up PKIs, and the result was some spectacular failures, such as that of supplier Baltimore Technologies.

So what went wrong? It was too much, too soon, says Andy Mulholland, global chief technical officer at Capgemini. Five years ago when PKI was being promoted, the wrong people were trading online. Trading volume was being pushed by consumer purchases while online corporate trading was still relatively small. "We got the PKI boom without enough commercial activity to warrant us doing it," he says. "If PKI was launched in 2005, the reaction to it would have been very different."

To counter the idea that consumers found PKI difficult to use, PKI advocates such as Vegara often cite public key technologies such as Secure Sockets Layer (SSL) and its successor, Transport Layer Security (TLS), which provide the padlock icon seen in secure browser sessions. These often required no authentication of the user, but were there simply to authenticate the server and secure the transaction.

In many cases where authentication of both parties was required, PKI simply was not transparent to users, says Arthur Barnes, principal consultant at Diagonal Security.

Sceptics need look no further than a paper presented by Alma Whitten of Carnegie Mellon University at the Usenix Security Symposium in 1999, around the time when PKI marketing was going into overdrive. Called "Why Johnny Can't Encrypt", the paper revealed that when given 90 minutes to sign and encrypt a message, most of Whitten's test participants failed.

The test participants were using PGP, a freely available public key encryption tool developed by Phil Zimmermann in 1991. PGP is significant because it offered an alternative, called the "web of trust", to the top-down certificate authority model used in most PKIs. In this model, certificate authorities were replaced by trusted individuals who endorsed other people's keys by signing them, leading to the phenomenon of key-signing parties.

PGP put Zimmermann at loggerheads with the US government, which said that making the encryption technology available overseas by putting it online violated export controls. The government dropped its case against him in 1996. But government involvement in encryption remains a long-standing issue for privacy advocates. Apart from export controls, the government's ability to obtain keys and decrypt information is a particular bone of contention, especially in the UK, following the passing of the Regulation of Investigatory Powers Act 2000.

Gavin McGinty, solicitor at IT legal consultancy Pinsent Masons, says the act enables the government to obtain keys under certain conditions. "The general principle is that if it is possible to give the information that is encrypted then you can give that without giving the key, whereas if it is not possible for you to unlock the information, you might be required to give the key as well," he says.

This is all very well in principle, but techniques such as steganography call the effectiveness of such legislation into question. Steganography hides one piece of information in the background noise of another: for example, a Word document inside a JPEG file.

Barnes is not convinced by its effectiveness against an interceptor that is looking for such encoding. "Steganography is sold as undetectable but it is detectable. You just have to know what you are looking for," he says. Generally, for example, the background noise in a JPEG image will not be entirely random. An investigator finding a large degree of randomness in the image may deduce that there is some hidden information, says Barnes.
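Barnes's point about detection can be illustrated with a toy statistical check. The sketch below is a hypothetical heuristic written for this article, not a description of any real steganalysis product: it measures the Shannon entropy of the least significant bit of each byte of an uncompressed image file, on the assumption that simple steganography hides data in exactly those bits. A result very close to 1 bit suggests the "noise" is suspiciously random.

```python
import math
from collections import Counter

def lsb_entropy(data: bytes) -> float:
    """Shannon entropy, in bits, of the least significant bit of each byte."""
    bits = [b & 1 for b in data]
    counts = Counter(bits)
    total = len(bits)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Natural image data tends to show some structure in its low-order bits;
# LSBs that look perfectly random (entropy close to 1.0) can hint at hidden content.
with open("holiday_photo.bmp", "rb") as f:   # hypothetical file name
    score = lsb_entropy(f.read())
print(f"LSB entropy: {score:.4f} bits (closer to 1.0 means more random)")
```

Real steganalysis tools use far more sophisticated statistics, but the principle Barnes describes, looking for noise that is too random, is the same.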

Encryption is now a mature market in which world-changing developments come infrequently. There are enough symmetric and asymmetric encryption algorithms to satisfy the most ardent of cryptographers, and to the average IT director most of them will be indistinguishable from one another.

Nevertheless, the biggest challenges remain. The poor performance of PKI leaves a gaping hole in the encryption market, which must be filled by some sort of identity management model. This will either take the form of a renewed, more transparent PKI system, or some other as yet undiscovered signing and cryptography initiative. Unravelling that problem could prove to be the biggest puzzle of all.

This article is part of Computer Weekly's Information Security Special Report produced in association with Citrix




This was first published in April 2005

 
