Knowledgebase: General Information
What is a robots.txt file? How do I make one?
The robots.txt file is a plain-text file, created by the site owner, that tells spiders/web crawlers which areas of the site they are allowed to visit and which areas they are not.
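For example, a minimal robots.txt might look like this (the /private/ path is only an illustration):

```
User-agent: *
Disallow: /private/
```

Here `User-agent: *` applies the rule to all crawlers, and `Disallow: /private/` asks them not to fetch anything under that path.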

A popular page that details how the robots.txt file works is located here:

You can also use Robots META tags inside your HTML pages to tell some spiders/web crawlers whether to index a page, by using the following:
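A typical robots META tag goes in the page's `<head>` section. The `noindex, nofollow` values shown here are the standard directives for keeping a page out of search results and stopping link-following; swap in `index, follow` (or omit the tag) to allow both:

```
<meta name="robots" content="noindex, nofollow">
```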

A handy utility that we found on the web will generate the robots.txt file for you.

Its step-by-step guide walks you through creating a robots.txt file for your site. Once the file is created, just upload it to the /public_html folder (the document root of your site) and search engines will find it at /robots.txt.
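Before uploading, you can sanity-check your rules with Python's standard-library `urllib.robotparser` module. This sketch parses the example rules shown earlier (the paths and domain are illustrative) and asks whether a crawler may fetch given URLs:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules: block everything under /private/ for all crawlers
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A disallowed path should come back False, anything else True
print(rp.can_fetch("*", "https://example.com/private/secret.html"))
print(rp.can_fetch("*", "https://example.com/index.html"))
```

You can also point `RobotFileParser` at a live file with `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()`, which is handy for checking the file after you have uploaded it.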

Note that the robots.txt file is not honored by ALL search engines; compliance is voluntary, so it keeps well-behaved crawlers out but is not a security measure.