The use of internet search engines to uncover vulnerabilities for launching attacks will increase as malicious hackers seek exploitable information, according to security experts.
Hackers have long used search engines to parse through a website's source code, seeking clues about what the site contains and configuration information that may be useful in launching an attack.
"People have discovered that they can make a really tight Google query that comes back with results that show lots of vulnerabilities at once," said Matt Fisher, an application security analyst at SPI Dynamics. "The hackers are getting a bunch of potential targets with one web search."
He said past software development practices for websites often resulted in insecure code containing critical information. Hackers, using a web browser and a search engine, frequently parse websites looking for just such exposed nuggets of exploitable information.
As examples, Fisher cited back-up files and source code stored in clear text or as HTML files, embedded comments containing passwords, and exposed database schemas.
"Any invalid file extension, or a file ending in .inc, .bak or .old, will get source code," he said. "The issue is poor web application security" and does not reflect on search engine security practices, he added. "Developers are not taught secure coding. They are taught functional and efficient coding, but not security. There is simply a lack of awareness."
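The exposed-file problem Fisher describes can be caught before a search engine ever indexes it. A minimal defensive sketch, assuming a conventional web-root directory layout (the extension list mirrors the ones he names but is illustrative, not exhaustive):

```python
import os

# Extensions Fisher flags as likely to leak source code or credentials.
# Extend this set for your own environment (illustrative assumption).
RISKY_EXTENSIONS = {".inc", ".bak", ".old"}

def find_exposed_files(web_root):
    """Walk a web root and return paths whose extensions suggest
    backup copies or include files that should not be web-accessible."""
    exposed = []
    for dirpath, _dirnames, filenames in os.walk(web_root):
        for name in filenames:
            _, ext = os.path.splitext(name)
            if ext.lower() in RISKY_EXTENSIONS:
                exposed.append(os.path.join(dirpath, name))
    return sorted(exposed)
```

Running such a sweep before deployment catches the same files a crafted search query would otherwise surface to an attacker later.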
Web application vulnerabilities are not homogeneous, and every website is unique, Fisher said. "You can't issue a patch for a web application vulnerability. You've got to fix it yourself, and since Port 80 must be open, firewalls won't protect this type of vulnerability."
Google spokesman Nate Tyler declined to comment, citing the silent period required by the US Securities and Exchange Commission before the search engine company's pending initial public offering. Spokesmen for Lycos and Yahoo did not return calls.
Hackers have compounded the problem by using search engines to conceal their locations and complicate forensics, said Chris Wysopal, vice-president of engineering at security assessment company @stake.
"When you search for a particular vulnerability using a search engine, the search engine pulls all the (targeted) files into the search engine cache, which does not leave the hacker's IP address, so it covers their tracks," Wysopal said.
Ethical "white hat" hackers conducting penetration tests and security assessments also commonly use search engines, he said.
The recent Mydoom.O worm targeted search engine caches seeking e-mail addresses. Search engines would have to strip out functionality to thwart hackers exploiting their caches, Wysopal said, and that is not feasible in today's competitive internet marketplace, which depends on powerful search engines to sift through the internet's boundless information.
"If a search engine were to cut its functionality (to limit its use by hackers), people would start using another search engine. It's not the search engine's responsibility to make sure your website is secure. It's the site owner's responsibility to make sure these problems aren't there," Wysopal said.
"Attackers are getting more sophisticated, and part of that is using tools like search engines," he said. "Search engines definitely increase the power of the attacker, and I expect this trend will continue.
"People also need to understand how attackers work. They are not going after one particular site, typically, but they have a goal in mind, such as stealing credit cards," Wysopal said.
"Search engines allow hackers to see if the site has anything interesting to them. Almost anything can come down to a particular text string that can give a big clue as to whether the site is vulnerable. Once you have source code for the application, it's a lot easier to find an exploit," he said.
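The text-string matching Wysopal describes can be sketched as a simple scan of a page body for telltale fragments. The indicator strings below are common examples of my own choosing, not ones named in the article; real assessments use much larger lists:

```python
# Fragments that often betray misconfiguration or leaked source
# (illustrative assumptions, not an authoritative list).
INDICATORS = [
    "Index of /",    # open directory listing
    "ODBC Error",    # raw database error leaked into the page
    "<?php",         # unrendered PHP source in the response
    "password=",     # credentials embedded in code or comments
]

def telltale_strings(page_body):
    """Return the indicator strings present in a page body -- the
    'big clues' that a site may expose exploitable information."""
    return [s for s in INDICATORS if s in page_body]
```

A site owner can run the same check against their own pages; any hit is a string an attacker's search query could just as easily find.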
Mark Willoughby writes for Computerworld
This was first published in July 2004