How to setup and fix robots.txt in Blogger for SEO [Step by Step instructions]

Robots.txt file is a very important part of SEO and plays a big role in this field. It tells the search engine bot what to index and what not to

Hello guys, this is bdbloggerhub and today I’ve come with another post about SEO and the robots.txt file. Robots.txt is a very important part of SEO and plays a big role in it. It tells search engine bots what to index and what not to. I have added the perfect robots.txt code, along with instructions on how to set it up, down below. I hope you will read the whole post carefully.


There are a lot of newbie bloggers who are trying to get noticed on search engine results pages. Most of them are starting a blog to earn some money online, but they don’t know how or where to start. If you are one of them, then you’ve come to the right place.


In today’s post I will be explaining what robots.txt is and how to set it up in a Blogger blog for SEO. If you don’t know what SEO is, you are very much welcome to read this broad post about “What is SEO and why we need to do SEO”.


What is robots.txt?



Robots.txt is a file that contains various instructions for search engine bots. It basically tells a search engine crawler what to index from your site and what not to index. Let’s assume you have a blog in some niche and you are trying to rank it in search engine results by doing SEO. You’d want all your posts to be indexed on Google, but there might be some pages or posts that you don’t want indexed. How would you do that? That’s where the robots.txt file comes into play. You can write instructions in a format that the crawlers understand. You can also specify which crawler is allowed to crawl certain pages and which is not.


Robots.txt is important for SEO. Before a crawler visits your site, it will look for your robots.txt file to see where it can crawl and where it cannot. If it is not allowed to crawl a certain page, it will not crawl it and will not index it in search results.
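To see how a crawler interprets such rules, here is a minimal sketch using Python’s standard urllib.robotparser module. The rules, domain and page paths below are hypothetical examples, not Blogger’s actual defaults.

```python
import urllib.robotparser

# Hypothetical rules: hide one private page from all crawlers, allow the rest.
rules = [
    "User-agent: *",
    "Disallow: /p/private-page.html",
]

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)

# A normal post is crawlable; the disallowed page is not.
print(parser.can_fetch("*", "https://example.com/2020/01/a-post.html"))    # True
print(parser.can_fetch("*", "https://example.com/p/private-page.html"))    # False
```

This is the same check a well-behaved crawler performs before fetching any URL from your blog.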


How does the robots.txt code work?

User-agent: Mediapartners-Google 
Disallow:   
[Google AdSense’s crawler (Mediapartners-Google) is allowed to crawl everything on this blog]  


User-agent: * 
Disallow: /search 
Allow: /  
[‘*’ means this block applies to every search engine crawler: they are not allowed to crawl the https://www.bdbloggerhub.com/search pages but are allowed to crawl any other page or post on the blog.]  


Sitemap: https://www.bdbloggerhub.com/sitemap.xml  
[This is just a link to my sitemap. It’s not strictly necessary to include it, but I like to have it in my robots.txt file]



Robots.txt code:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.bdbloggerhub.com/sitemap.xml

Copy the above code, replace ‘https://www.bdbloggerhub.com/sitemap.xml’ with your own sitemap’s link and follow the instructions below.
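If you want to sanity-check what the code above does before saving it, you can feed the same rules to Python’s standard urllib.robotparser. This is just a verification sketch; “Googlebot” and the post URL are example values.

```python
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The AdSense crawler may fetch everything, including /search.
print(parser.can_fetch("Mediapartners-Google", "https://www.bdbloggerhub.com/search"))  # True

# Every other crawler is blocked from /search ...
print(parser.can_fetch("Googlebot", "https://www.bdbloggerhub.com/search"))             # False

# ... but may fetch normal posts and pages.
print(parser.can_fetch("Googlebot", "https://www.bdbloggerhub.com/2020/01/a-post.html"))  # True
```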


How to setup Robots.txt file on Blogger



Setting up a robots.txt file in Blogger is pretty easy. Blogger already includes an option to set up a custom robots.txt file; all you have to do is put the instructions in the box and save. It helps a lot with SEO.


  • Step 1: Log in to your Blogger blog
  • Step 2: Go to Settings 

  • Step 3: Go to Search Preferences 



  • Step 4: Enable “Custom robots.txt” if it is not already enabled.
  • Step 5: Click on Edit and you will see an empty box there.


  • Step 6: If there is any text or code in the box, remove it. If not, proceed to the next step.
  • Step 7: Copy the code I have given above, paste it in that box and save it.



Now you will have a fully working robots.txt file that instructs search engine crawlers about what to crawl and what not to.


How to fix “Blocked by robots.txt” warnings?



When you have a robots.txt file on your website and you have disallowed “/search”, Google Search Console might show you a warning saying your website’s robots.txt has blocked “your-site-address/search”. It happens because you have blocked crawlers from crawling the /search pages of your blog. This warning is nothing to worry about and it won’t affect your site’s SEO. 


If you still want to fix this issue, just edit your robots.txt file and remove “/search” from the “Disallow: /search” line under “User-agent: *”. This will fix the issue.
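You can confirm the effect of that one-line change with the same urllib.robotparser sketch as before; “Googlebot” here is just a representative crawler name.

```python
import urllib.robotparser

def search_page_allowed(rules):
    """Return True if a crawler may fetch the blog's /search page under these rules."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(rules)
    return parser.can_fetch("Googlebot", "https://www.bdbloggerhub.com/search")

# Original rules: /search is blocked, which triggers the Search Console warning.
print(search_page_allowed(["User-agent: *", "Disallow: /search", "Allow: /"]))  # False

# After removing "/search" from the Disallow line, /search becomes crawlable.
print(search_page_allowed(["User-agent: *", "Disallow:", "Allow: /"]))          # True
```

Keep in mind that allowing /search means Blogger’s search result pages can be crawled again, which is exactly the trade-off the warning is about.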


So guys, that’s all for today. I hope you have read the complete article about robots.txt and that it helps you. If you have any questions, feel free to comment below.

Source: https://www.bdbloggerhub.com/2020/01/how-to-setup-fix-robots-txt-in-blogger.html


