Black Hat: SSL is fragile

Ian Grant

Researchers at the Black Hat security conference in Las Vegas have proved that the Secure Sockets Layer (SSL) protocol is fragile and could be broken at any time.

This puts smartcard-based systems at risk, as well as so-called secure websites, because SSL is the web's most widely used authentication system.

However, Verisign, which manages SSL security for the .com domain, and internet browser makers were cooperating with each other and the researchers to fix the flaws. New, more robust software and processes would be available by the end of the year, they said.

Alexander Sotirov and Mike Zusman demonstrated a tool that allowed them to take over SSL sites (the ones that start https:// and display a closed padlock in the browser's address bar) by exploiting poor practices used by certificate authorities in verifying the owners of a website. The tool worked even on sites that used Extended Validation (EV) SSL certificates, they said.

Researcher Dan Kaminsky, working separately, said SSL sites were only as secure as the weakest certificate authority. Anyone with $100,000 could become a certificate authority, allowed to issue digital certificates attesting to the true owner of a website, he said.

This could be abused if certificate authorities were not diligent in identifying the true owners of a website and their intentions. It was possible for criminals to set up legitimate-looking websites to deliver malware or steal account numbers and login details.

He said the X.509 standard for public key certificates, which underpins SSL, contained ambiguities that meant browser makers could interpret a site's digital certificate differently rather than consistently.
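One well-known example of this class of ambiguity, sketched below purely as an illustration (it is not the researchers' tool, and the domain names are hypothetical): an X.509 common name is a length-prefixed value, so it may legally contain an embedded NUL byte. A validator that treats the name as a C-style string stops reading at the NUL, while a length-aware ASN.1 parser reads the whole value, so two implementations can reach different conclusions about the same certificate field.

```python
# Illustrative sketch of a NUL-byte parsing ambiguity in a certificate
# common name. Not the researchers' actual tool; names are made up.

cn = b"www.bank.example\x00.attacker.example"  # hypothetical common name

def c_string_view(name: bytes) -> bytes:
    """Read up to the first NUL byte, as C string routines would."""
    return name.split(b"\x00", 1)[0]

def length_aware_view(name: bytes) -> bytes:
    """Read the full length-prefixed value, as an ASN.1 parser would."""
    return name

print(c_string_view(cn))      # appears to be the bank's domain
print(length_aware_view(cn))  # the full value ends in the attacker's domain
```

A validator using the first view would accept the certificate for the bank's site, while the issuing authority, using the second view, believed it had issued it for the attacker's domain.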

Kaminsky said 60% of attacks on websites exploited differences in the authentication procedures between the site and the browser. The work needed for an attack to succeed had fallen from 2 to the power of 104 operations in 2004 to 2 to the power of 63 today.
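To put those figures in perspective, a quick back-of-envelope calculation (reading them as attack work factors, which the article does not define explicitly):

```python
# Rough scale check on the quoted figures, treated as attack work factors.
work_2004 = 2 ** 104
work_today = 2 ** 63

ratio = work_2004 // work_today  # 2**41, roughly 2.2 trillion
print(f"An attack today is ~{ratio:.1e} times cheaper than in 2004")
```

A factor of 2 to the power of 41 means an attack that was utterly impractical in 2004 is roughly two trillion times cheaper today, which is why the remaining margin is uncomfortable.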

This left too little margin for comfort given the increase in computing power available, he said. "Some companies may already be wondering how they got done over," he said.

Tim Callan, Verisign's vice president for product marketing, said Verisign had changed the way it issued certificates to make the authentication process more reliable. It was phasing in the change because some websites had incorporated the insecure process deep in their code.

"We are trying to fix things in a way that's compatible with the ecosystem (how fast people can adapt their websites)," he said. "We want to give you plenty of warning before we break your system."

Callan said Verisign was working to improve the X.509 standard, but Kaminsky said he would prefer to scrap it in favour of adapting the internet's domain name system (DNS) to issue authentication certificates.

