New robots.txt commands: make sure that Google can index your website more efficiently

It seems that search engine spiders are currently experimenting with new robots.txt commands. If your robots.txt file accidentally contains some of these new commands, it is possible that it instructs the spiders not to index your site.

What is a robots.txt file?
The robots.txt file is a small text file that must be placed in the root folder of your website (http://www.anysite.com/robots.txt). It tells search engine bots which sections of your website may be indexed and which sections should be left alone.

You can use any text editor to create a robots.txt file. The content of a robots.txt file consists of so-called “records”.

Each record contains the information for a particular search engine bot. A record consists of two fields: the user agent line and one or more Disallow lines. Here’s an example:

User-agent: googlebot
Disallow: /cgi-bin/

This robots.txt file would allow the bot “googlebot”, which is Google’s search engine spider, to index every section of your website except files from the “cgi-bin” folder. All files in the “cgi-bin” folder will be ignored by googlebot.
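As a further illustration (the folder names here are just placeholders), a record can also use the wildcard “*” as the user agent to address all spiders, and it can contain several Disallow lines:

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/

This record tells every search engine bot to stay out of the “cgi-bin” and “images” folders. An empty Disallow line (“Disallow:”) would allow the bots to index the whole site.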