How to use robots.txt

What is the purpose of the robots file?

When a search engine crawls (visits) your website, the first thing it looks for is your robots.txt file. This file tells crawlers which parts of your site they may and may not crawl, and therefore which pages can be indexed (saved and returned as search results to the public). It can also point crawlers to the location of your XML sitemap.
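For example, a typical robots.txt that keeps crawlers out of one private directory and advertises a sitemap might look like this (the directory name and domain below are placeholders, not part of any specific site):

```text
# Keep all crawlers out of one private directory, allow everything else
User-agent: *
Disallow: /private/

# Tell search engines where to find your XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```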

You can simply create a blank file and name it robots.txt. This prevents the "file not found" errors that crawlers otherwise generate when requesting the file, and it allows all search engines to crawl and index anything they want.

If you want to stop all search engines from crawling and indexing your site, use this code:

# Block all search engines from crawling the entire site
User-agent: *
Disallow: /
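You can check how a crawler would interpret these rules using Python's standard-library robots.txt parser. This is just an illustrative sketch; the example.com URLs are placeholders:

```python
# Parse the "block everything" rules above with Python's built-in parser
# and ask whether a crawler is allowed to fetch a given URL.
import urllib.robotparser

rules = "User-agent: *\nDisallow: /\n"

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# With "Disallow: /", every path is off-limits to every user agent.
print(parser.can_fetch("Googlebot", "https://example.com/"))           # False
print(parser.can_fetch("Googlebot", "https://example.com/page.html"))  # False
```

An empty robots.txt, by contrast, would make `can_fetch` return True for every URL, matching the "blank file" behavior described above.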

View more robots.txt examples here.
