You can stay off Google's radar by not linking to your secure website and by running it on non-standard port numbers. If being indexed by Google is important, restrict crawling to the public areas of the site with robots.txt, a text file that tells search engine crawlers which pages you would prefer they not visit. Also ensure the Web server is fully patched and penetration tested, so that no exploitable vulnerabilities remain even when the site is indexed by Google.
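As a sketch, a robots.txt placed at the site root might look like the following; the directory names here are purely illustrative, and you would substitute the private areas of your own site:

```
# Applies to all crawlers; paths below are hypothetical examples
User-agent: *
Disallow: /admin/
Disallow: /secure/
Allow: /
```

Keep in mind that robots.txt is advisory only: well-behaved crawlers such as Googlebot honor it, but it is not an access control, and it publicly lists the paths you name in it. That is why the patching and penetration testing mentioned above still matter.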
Related Q&A from Richard Brain