Prevent GoogleBot Crawl Errors on Glype Proxy

Posted by Vectro on 20 June 2011

Sometimes Googlebot tries to index Glype's browse.php file and the proxied pages served through it. These pages do not need to be indexed for Google to find and list your proxy site. In some cases Googlebot encounters an error, retries, and becomes stuck in a loop, which can slow down your site and put strain on the server. To prevent this, place the following rule in your robots.txt file:

User-agent: *
Disallow: /browse.php

If you have renamed your browse.php file, use the new name in the Disallow rule instead; see the earlier post about renaming browse.php for details.
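For example, assuming you had renamed browse.php to a hypothetical secret.php, the rule would become:

User-agent: *
Disallow: /secret.php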

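As a complementary measure, you can also ask crawlers not to index any proxied pages they do manage to fetch by sending an X-Robots-Tag response header from the script itself. The following is a minimal sketch, not part of Glype; note that once robots.txt blocks the path, Googlebot will stop fetching the page entirely, so this header mainly helps before the robots.txt change takes effect, or with crawlers that honor the header but ignore robots.txt:

<?php
// Sketch: place near the top of browse.php (or your renamed copy),
// before any output is sent, so the header can still be set.
header('X-Robots-Tag: noindex, nofollow');
?>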
