By: admin

Robots.txt file and how/when/why to use it.

March 28, 2006

What is a robots.txt file, and is it important? The robots.txt file sits in the root directory of your website and tells crawlers (search engine spiders) which parts of the site they may visit and which they should stay away from. You may be saying to yourself, “Why wouldn’t I want a search engine to visit my site?” There are several reasons:

1. Your site is brand new and it’s going to take a few months to finish. You really don’t want a search engine to index your site, or crawl it for consideration, while it’s incomplete. Being crawled while “under construction” and full of broken links can actually hurt you on some search engines. (See the first example after this list.)

2. Let’s say you have a page with information that you don’t want the search engines to index. It could be a page you only want your company to have access to, but you don’t want to password protect it. Keep in mind that robots.txt only asks well-behaved crawlers to stay away; it doesn’t actually hide or secure the page. (See the second example below.)
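
Here are minimal examples of what the file itself looks like (the domain and page name below are just placeholders, not anything from a real site). The file must live at the root of the domain, e.g. http://www.example.com/robots.txt. To ask every crawler to stay off an unfinished site (reason 1):

    User-agent: *
    Disallow: /

To keep crawlers away from a single page (reason 2) while leaving the rest of the site open:

    User-agent: *
    Disallow: /private-page.html

The “User-agent: *” line means the rules apply to all crawlers, and each “Disallow” line lists something they should not request. A blank Disallow value (“Disallow:”) means nothing is blocked at all.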

So much more could be added to this, but understand that the robots.txt file is a tool we can use to help the search engines crawl our site better. Google has implemented a really good tool to help you check your robots.txt file, along with some other related tools; if you don’t have an account, sign up for one at Google Site Maps.
