Hackers are using automated tools to compromise websites and plunder company databases, security researchers have found.
Automated tools allow hackers to attack more web applications than ever before, according to a report released to coincide with Infosecurity Europe 2012, taking place in London.
Incapsula, a subsidiary of Imperva that monitors 1,000 websites, found that on average half of all website traffic is automated, and that malicious bots account for 30% of total traffic. The conundrum for security professionals and website owners is that the remaining 20% of traffic is benign automated activity from search engines.
"The challenge is to block the malicious traffic, while still allowing the websites to be indexed by Google and other search engines," said Rob Rachwald, director of security strategy at Imperva.
Although SQL injection (SQLi) is a well-known attack method, few sites are designed to prevent it, and 70% of websites were created using PHP, making them vulnerable to remote file inclusion (RFI) attacks, he told Computer Weekly.
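The core of the SQLi problem is straightforward: sites that concatenate user input into SQL strings let attackers rewrite the query itself. The sketch below, using Python's built-in sqlite3 module purely for illustration (the report does not name any particular stack), contrasts a vulnerable query with the parameterised form that defuses the same payload.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# A classic injection payload: it closes the quoted string and
# appends a tautology, so the WHERE clause matches every row.
payload = "' OR '1'='1"

# Vulnerable: attacker input is concatenated straight into the SQL text.
vulnerable = conn.execute(
    "SELECT * FROM users WHERE name = '" + payload + "'"
).fetchall()

# Safe: a parameterised query treats the input as data, never as SQL syntax.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (payload,)
).fetchall()

print(len(vulnerable))  # the injection matched all rows
print(len(safe))        # the literal string matched nothing
```

Tools like Havij simply automate the discovery and exploitation of the first pattern at scale, which is why parameterised queries remain the standard defence.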
In both cases, he said, all hackers have to do is use automated tools to identify and exploit one or both of these vulnerabilities to export the data from underlying databases.
Rachwald, in part, blames the imbalance in security spending. Of the $26bn spent on IT security each year, only about half a billion is devoted to application and data security.
"Hackers will always go to where the weaknesses are. If organisations are not spending time and money on the application side, that will be a natural target," he said.
The remaining $25.5bn is spent on anti-virus, network firewalls, data leakage prevention systems and the like, but none of these will stop SQLi or RFI, said Rachwald.
Automated attacks typically involve extremely fast interactions with websites, and that speed is key to identifying and stopping attacks in progress.
"If click-rates are too fast, they are clearly not human; also if multiple interactions take place over several minutes it is unlikely to be human," said Rachwald.
Automated tools typically open every page on a website, so organisations should monitor for protracted interactions and shut down associated browser sessions.
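The two timing signals Rachwald describes can be combined into a simple heuristic over a session's request timestamps. The thresholds below are hypothetical placeholders, not figures from the report; a real deployment would tune them against observed traffic.

```python
from datetime import datetime, timedelta

# Illustrative thresholds (assumptions, not values from the report):
# requests arriving faster than a human could click, or a session that
# keeps clicking steadily for a protracted period, both look automated.
MIN_HUMAN_INTERVAL = timedelta(milliseconds=500)
MAX_SESSION_SPAN = timedelta(minutes=5)

def is_probably_automated(timestamps):
    """Return True if a session's request timing looks non-human."""
    if len(timestamps) < 2:
        return False
    ts = sorted(timestamps)
    gaps = [b - a for a, b in zip(ts, ts[1:])]
    too_fast = min(gaps) < MIN_HUMAN_INTERVAL     # inhuman click-rate
    too_long = ts[-1] - ts[0] > MAX_SESSION_SPAN  # protracted crawl
    return too_fast or too_long
```

A session flagged by either test would then have its browser session shut down, as the article suggests.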
Organisations can also protect their websites by monitoring HTTP request headers, such as the User-Agent, for the names of common attack tools like Havij, which automates SQLi, and blocking the associated sessions.
The important thing to note about automated attacks, said Rachwald, is that they are indiscriminate and all organisations need to assume they will be targeted regardless of size or type.