How do I prevent search engines from indexing my site?


There are situations where you do not want search engines to index a site, such as when preparing to launch a new service or while a site is still under development. Most bot authors have agreed on a uniform method for blocking well-behaved search engine bots: a robots.txt file placed at the root of your site. Note that compliance is voluntary, so robots.txt only stops crawlers that choose to honor it.

To disallow all bots:

User-agent: *
Disallow: /

To prevent access to specific directories:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /store/
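Before deploying a robots.txt file, it can help to verify that the rules block what you intend. The sketch below uses Python's standard urllib.robotparser module to test the directory rules above against a couple of sample paths; the bot name "ExampleBot" and the paths are placeholders, not anything from this article.

```python
# Sanity-check robots.txt rules locally before deploying them.
# Uses only the Python standard library.
from urllib.robotparser import RobotFileParser

# The same rules as the directory example above.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /store/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Paths under a disallowed directory are blocked for any user agent,
# because the "*" entry applies to all bots.
print(parser.can_fetch("ExampleBot", "/store/checkout"))  # False
print(parser.can_fetch("ExampleBot", "/about.html"))      # True
```

Running a check like this catches common mistakes, such as a missing trailing slash that would block more (or less) than intended.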

Additional information on setting up a robots.txt file can be found here:

http://www.robotstxt.org
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=40360
http://en.wikipedia.org/wiki/Robots.txt

For a related approach, see: How do I prevent malicious bots/spiders from accessing my site?







© 2011-2013 Rackspace US, Inc.

Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License

