The importance of robots.txt file to a Blogger blog

What is a robots.txt file?
A robots.txt file is a plain-text file, placed at the root of your website or blog, in which you tell search engine robots (web crawlers) which pages they may crawl and which they must skip.

Updated: 4 May 2017
Why do you need a robots.txt file for Blogger?
1. Blogger archives all your posts monthly and yearly. Search engines like Google treat these archive pages as duplicate content, which can lower your blog's ranking in search results; a robots.txt file lets you block them.
2. It lets you tell search engines where your sitemap is.
3. It lets you stop search engines from crawling particular pages, such as a login page.
4. It lets you stop search engines from crawling certain images and JavaScript files.

For example, have a look at this sample robots.txt file:
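A typical robots.txt file for a Blogger blog looks like the sketch below. The blogspot.com address is a placeholder, so substitute your own blog's URL, and the Mediapartners-Google section is an optional rule commonly included for the AdSense crawler:

```
# Allow the AdSense crawler everywhere (optional)
User-agent: Mediapartners-Google
Disallow:

# Rules for all other crawlers
User-agent: *
Disallow: /search
Allow: /

# Point crawlers to the blog's sitemap
Sitemap: https://example.blogspot.com/sitemap.xml
```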

Note the following line:
Disallow: /search

This line tells crawlers such as Googlebot not to crawl any URL on your blog whose path begins with /search.
This effectively means that archive and label pages, whose addresses look like /search/label/SomeLabel, won't be crawled by Google.
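You can check the effect of this rule with Python's standard `urllib.robotparser` module. The URLs below are placeholders for illustration:

```python
import urllib.robotparser

# The same rules a typical Blogger robots.txt would contain.
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Label/archive pages under /search are blocked...
print(rp.can_fetch("*", "https://example.blogspot.com/search/label/News"))     # False
# ...while ordinary post URLs remain crawlable.
print(rp.can_fetch("*", "https://example.blogspot.com/2017/05/my-post.html"))  # True
```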

How to create a robots.txt file?
A robots.txt file can be created easily using Google Webmaster Tools.

Please follow the steps below with extreme caution. If you are not sure what you are doing, simply use the example robots.txt file and replace the blog address in it with your own.

Steps to create a robots.txt file
1. Go to Google Webmaster Tools and make sure your site is already added there. If it is not, see the page Google Webmasters for Blogger, which explains how to add a Blogger blog to Google Webmaster Tools.

2. Click on Crawler access

3. Copy the generated code into your Blogger blog's custom robots.txt setting. Make sure you test the code with the help of the Test button.

To learn more
1. Robots.txt file tutorial 

© 2007. The content is copyrighted to Sundeep Machado.

Note: The author is not responsible for damages resulting from improper use of the software, techniques, and tips described, or from copyright claims.