How To Set Up A Custom Robots.txt On Your Blogger Site

Setting up a custom robots.txt really helps make your blog visible to search engines, and it can improve your organic traffic too.


Remember that Google helps you get more visitors to your blog or website, and it is currently the leading search engine. Its rankings are based on keywords and other factors such as backlinks. Having a good template is also very important for SEO.


Many times you will find it difficult to submit your site to Google Webmaster Tools; this is often because you have not yet set up your custom robots.txt.
Below are some helpful tips on how to set up a custom robots.txt on your Blogger blog.




Step 1
Go to your Blogger dashboard and click on the blog you want to work on.


Step 2
On your Blogger dashboard, click the Settings button >> Basic >>


Step 3
Now look at the top and you will find Description. Type in what your blog is all about. Keep it short and meaningful, because search engines favor concise descriptions.


Step 4
Now click the Search preferences button, and below you will find Custom robots.txt and Custom robots header tags.



Step 5
Click on Custom robots.txt, click Edit, and then paste this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.skilztools.com/feeds/posts/default?orderby=UPDATED

Step 6
Now change www.skilztools.com to your blog's URL, and then click Save.
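If you'd like to sanity-check the rules before relying on them, Python's standard-library urllib.robotparser can parse the same text locally. This is only a sketch, and yourblog.blogspot.com is a placeholder domain:

```python
from urllib.robotparser import RobotFileParser

# The same rules pasted into Blogger, with a placeholder domain.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yourblog.blogspot.com/feeds/posts/default?orderby=UPDATED
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Label pages under /search are blocked for ordinary crawlers...
print(parser.can_fetch("*", "http://yourblog.blogspot.com/search/label/SEO"))   # False
# ...but regular posts (and the homepage) stay crawlable.
print(parser.can_fetch("*", "http://yourblog.blogspot.com/2013/03/post.html"))  # True
```

If both checks print what you expect, the file you pasted into Blogger behaves the same way for real crawlers that honor robots.txt.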


Step 7
If you want to view the robots.txt file on your blog, use a URL like the example below. When you open it, you will see the whole robots.txt code.


http://www.skilztools.com/robots.txt


Thanks everyone for reading this tutorial, and don't forget to drop your comment below.

Explanation

This code is divided into three sections. Let's study each of them in turn.
  1. User-agent: Mediapartners-Google
  2. This line is for the Google AdSense robots and helps them serve better ads on your blog. Whether or not you use Google AdSense on your blog, simply leave it as it is.
  3. User-agent: *
  4. This is for all robots, marked with the asterisk (*). In the default settings, our blog's label links are restricted from being indexed by search crawlers; that means the web crawlers will not index our label page links, because of the code below.
    Disallow: /search
    That means links having the keyword search just after the domain name will be ignored. See the example below, which is a link to a label page named SEO.
    http://www.skilztools.com/search/label/SEO 
    And if we remove Disallow: /search from the above code, then crawlers will access our entire blog and index and crawl all of its content and web pages.
    Here, Allow: / refers to the homepage, which means web crawlers can crawl and index our blog's homepage.
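    You can verify this behavior locally with Python's urllib.robotparser. Below is a hypothetical rule set with the Disallow: /search line removed; with only Allow: / left, even label pages become crawlable:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical variant of the rules with Disallow: /search removed:
# every URL on the blog, including label pages, is now crawlable.
parser = RobotFileParser()
parser.parse(["User-agent: *", "Allow: /"])

print(parser.can_fetch("*", "http://www.skilztools.com/search/label/SEO"))  # True
```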

    Disallow Particular Post
    Now suppose we want to exclude a particular post from indexing; then we can add the lines below to the code.
    Disallow: /yyyy/mm/post-url.html
    Here yyyy and mm refer to the publishing year and month of the post, respectively. For example, if we published a post in March 2013, then we would use the format below.
    Disallow: /2013/03/post-url.html
    To make this task easy, you can simply copy the post URL and remove the blog name from the beginning.
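    That copy-and-trim step simply keeps the path part of the URL. A small sketch of the idea (the disallow_line helper is illustrative, not part of Blogger):

```python
from urllib.parse import urlparse

def disallow_line(post_url):
    # Keep only the path portion of the post URL, dropping the
    # scheme and domain; helper name is illustrative.
    return "Disallow: " + urlparse(post_url).path

print(disallow_line("http://www.skilztools.com/2013/03/post-url.html"))
# Disallow: /2013/03/post-url.html
```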

    Disallow Particular Page
    If we need to disallow a particular page, then we can use the same method as above. Simply copy the page URL and remove the blog address from it, which will look something like this:
    Disallow: /p/page-url.html
  5. Sitemap: http://skilztools.com/feeds/posts/default?orderby=UPDATED
    This line refers to the sitemap of our blog. By adding the sitemap link here, we are simply optimizing our blog's crawl rate: whenever web crawlers scan our robots.txt file, they will find a path to our sitemap, where all the links of our published posts are present. That makes it easy for crawlers to find all of our posts, so there is a better chance they will crawl every blog post without ignoring a single one.
    Note: This sitemap will only tell the web crawlers about the 25 most recent posts. If you want to increase the number of links in your sitemap, replace the default sitemap with the one below. It will work for the first 500 recent posts.
    Sitemap: http://skilztools.com/atom.xml?redirect=false&start-index=1&max-results=500
    If you have more than 500 published posts on your blog, then you can use two sitemaps like the ones below:
    Sitemap: http://skilztools.com/atom.xml?redirect=false&start-index=1&max-results=500
    Sitemap: http://skilztools.com/atom.xml?redirect=false&start-index=501&max-results=500

    Change skilztools.com to your website URL.
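    If you'd rather not compute the start-index values by hand, a short sketch can generate the paged sitemap lines for any post count (the sitemap_lines helper is illustrative; it assumes Blogger's feed start-index is 1-based and pages hold at most 500 results):

```python
def sitemap_lines(blog_url, total_posts, page_size=500):
    # Page through the posts 500 at a time: start-index is 1-based,
    # so pages begin at 1, 501, 1001, ... Helper name is illustrative.
    return [
        f"Sitemap: {blog_url}/atom.xml?redirect=false"
        f"&start-index={start}&max-results={page_size}"
        for start in range(1, total_posts + 1, page_size)
    ]

# A blog with 750 posts needs two sitemap lines.
for line in sitemap_lines("http://skilztools.com", 750):
    print(line)
```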
