
How to Remove Your Site from the Google Index

If, for some reason, you want to remove your website from the Google index, the process is slightly more involved. Place a special text file in the root directory of your website's server. This file must be named robots.txt and should include the following text:

User-agent: Googlebot
Disallow: /

This code tells the Googlebot crawler not to crawl your site. If you want to remove your site from all search engines (by preventing all robots from crawling it), include the following text instead:

User-agent: *
Disallow: /
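Before deploying a robots.txt file, you can check how crawlers will interpret it with Python's standard-library urllib.robotparser module. This sketch parses the site-wide rules shown above; the example.com URLs are placeholders:

```python
from urllib.robotparser import RobotFileParser

# The site-wide rules described above: block every crawler.
rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# No crawler may fetch any page, including the home page.
print(parser.can_fetch("Googlebot", "https://example.com/"))          # False
print(parser.can_fetch("AnyOtherBot", "https://example.com/a.html"))  # False
```

In a live check you would call parser.set_url("https://example.com/robots.txt") and parser.read() instead of parsing a string.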

If you want to remove only certain pages of your site from the Google index, insert the following text into the robots.txt file, replacing page.html with the filename of the specific page:

User-agent: Googlebot
Disallow: /page.html

Finally, you can use the robots.txt file to exclude all pages within a specific directory. To do this, insert the following text, replacing directory with the name of the directory:

User-agent: Googlebot
Disallow: /directory
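Note that Disallow matches by prefix, so /directory also blocks paths such as /directory-old; append a trailing slash to limit the rule to the directory's contents. The directory rule can be verified with urllib.robotparser as well (the example.com URLs and path names are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Exclude one directory for Googlebot only, as in the snippet above,
# with a trailing slash so the rule matches only the directory's contents.
rules = """\
User-agent: Googlebot
Disallow: /directory/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is blocked from pages inside the directory...
print(parser.can_fetch("Googlebot", "https://example.com/directory/page.html"))  # False
# ...while other paths, and other crawlers, remain crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/other.html"))           # True
print(parser.can_fetch("OtherBot", "https://example.com/directory/page.html"))   # True
```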

Tip

To disallow other specific crawlers and spiders, see the (long) list at www.robotstxt.org/wc/active/html/.
