The default robots.txt of a Blogger blog is not SEO friendly, because it allows search engine bots to index all of the blog's pages: your posts, the search pages used to navigate from old to new posts, label pages, and monthly archive pages. This is bad for the blog's SEO health and lowers its search engine ranking.

Blogger's search pages, which include the post label and navigation URLs, are considered bad because they produce duplicate content and duplicate title errors. Search pages also have dynamic URLs, which not all search engines support. So when a search engine bot indexes your blog's labels, it indexes posts that already exist in its index, which produces duplicate content.
So it is necessary to have a robots.txt for your Blogger blog that tells all bots not to index your archive, label, and navigation pages. I use a custom robots.txt for Techofy, which has had very positive results for my blog; every line was tested one by one, and it removed several errors reported in Webmaster Tools. It will:
- Remove duplicate meta description errors from your Blogger blog.
- Remove duplicate title errors from your Blogger blog.
- Stop all search bots from indexing the search, label, and archive pages of the blog.
Previously it was not possible to edit the robots.txt of a Blogger-based blog, but now you can use a custom robots.txt on your Blogger blog. Follow the steps below to enable it:
- Go to Blogger Dashboard > Settings and click on Search Preferences.
- In the Crawlers and Indexing section, click Edit next to Custom robots.txt.
- Click Yes, add your custom robots.txt, and then click Save changes.
User-agent: *
Disallow: /search
Disallow: /*_archive.html
Disallow: /cgi-bin/
Allow: /
Sitemap: http://yourblogURL/atom.xml?redirect=false&start-index=1&max-results=500
Copy the code above and paste it into your blog's custom robots.txt, replace yourblogURL in the Sitemap line with your blog's URL, and save.
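Before saving, you can sanity-check the Allow/Disallow rules with Python's built-in robots.txt parser. This is just a quick sketch: the blog URL is a placeholder, and the `/*_archive.html` wildcard rule is left out because Python's parser treats `*` literally rather than as a wildcard.

```python
from urllib.robotparser import RobotFileParser

# The literal rules from the robots.txt above (the Sitemap line and the
# wildcard archive rule are omitted; RobotFileParser only evaluates
# literal Allow/Disallow paths).
rules = """\
User-agent: *
Disallow: /search
Disallow: /cgi-bin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

blog = "http://example.blogspot.com"  # placeholder blog URL

# A normal post URL stays crawlable.
print(parser.can_fetch("*", blog + "/2013/05/my-post.html"))
# Label/search pages are blocked, which is what removes the duplicates.
print(parser.can_fetch("*", blog + "/search/label/Software"))
print(parser.can_fetch("*", blog + "/cgi-bin/anything"))
```

Running this prints True for the post URL and False for the search and /cgi-bin/ URLs, confirming that posts stay indexable while the duplicate-producing pages are blocked.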
User-agent: *

This line applies the rules to all bots that support robots.txt exclusion.
Disallow: /search

This line tells bots not to index your blog's label pages (like my blog's Software label), which stops the duplicate content errors.
Disallow: /*_archive.html

This line tells bots not to index your blog's monthly archive pages, which contain the list of all posts for a particular month.
Disallow: /cgi-bin/

This is an optional line. I have included it because Webmaster Tools sometimes reports 404 pages from your blog with /cgi-bin/ in the URL; these 404 pages are auto-generated by an internal Blogger error, so this line stops search engines from indexing them.
Allow: /
Sitemap: http://yourblogURL/atom.xml?redirect=false&start-index=1&max-results=500
The Allow line permits everything not disallowed above, and the Sitemap line points bots to your blog's sitemap so they can discover and index your post links. Including the sitemap in robots.txt makes it easier to submit to every search engine; otherwise, you have to submit the sitemap to each search engine manually.
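Blogger's feed is commonly reported to return at most 500 entries per request, so a blog with more posts would need additional Sitemap lines with a shifted start-index. A small sketch of how those URLs could be generated (the helper name and example blog URL are hypothetical):

```python
# Hypothetical helper: builds the Blogger feed sitemap URLs for a blog,
# assuming the feed returns at most `page_size` entries per request.
def sitemap_urls(blog_url, post_count, page_size=500):
    urls = []
    # start-index is 1-based: 1, 501, 1001, ...
    for start in range(1, post_count + 1, page_size):
        urls.append(
            f"{blog_url.rstrip('/')}/atom.xml"
            f"?redirect=false&start-index={start}&max-results={page_size}"
        )
    return urls

# A blog with 1200 posts needs three Sitemap lines.
for url in sitemap_urls("http://example.blogspot.com", 1200):
    print(url)
```

Each printed URL can be added as its own Sitemap: line in the custom robots.txt.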
See the robots.txt of Techofy, and ask any questions you have in the comments.