How to Create a Sitemap for a Blogger Blog and Create a Robots.txt File: Full Guide

Creating a Sitemap for your Blogger blog is an essential step in getting your content into search results and improving SEO. To get the full benefit, you also need to submit the Sitemap to webmaster tools such as Google Search Console and Bing Webmaster Tools.

A Sitemap acts as a roadmap for your website, listing all the published content. If your website isn't well-linked, a Sitemap helps search engines discover all your content, including orphaned content.

Why Create a Sitemap and Submit it to Google Search Console?

Understanding Sitemaps

A Sitemap is a layout of your website or blog, categorizing all the pages and posts within it. There are two types of Sitemaps:

1. XML Sitemap: Designed for search engines such as Google, Bing, and Yandex, and submitted through tools like Google Search Console, Bing Webmaster Tools, and Yandex Webmaster.

2. HTML Sitemap: A regular page on your blog that lists your posts, making it easy for human visitors to browse all your content in one place.

How XML Sitemaps Work

XML Sitemaps are read by search engine crawlers rather than human visitors. They list the location of every published post and page on your blog. Search engines collect this data, crawl the listed URLs, and use them to serve results for user queries. Sitemaps become even more important when internal links between blog posts are weak.

How Does a Sitemap Work?

Whenever you publish a new page or post, it is automatically added to the Sitemap. This signals search engines that there is new content to index, and they then crawl the new pages or posts, even if nothing else links to them yet.
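To make this concrete, the snippet below is a simplified illustration of what a single entry in an XML Sitemap looks like (the URL and date are placeholders, not taken from a real blog). Blogger generates this file for you automatically, so you never have to write it by hand:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.blogspot.com/2024/01/sample-post.html</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>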

Creating a New Sitemap for Blogger Blog Posts and Pages

Blogger generates your XML Sitemap automatically. To access it, simply add "sitemap.xml" or "sitemap-pages.xml" to the end of your blog's URL. For instance, if your blog URL is "example.blogspot.com," the Sitemap addresses will be:
1. https://example.blogspot.com/sitemap.xml (for all Blogger blog posts)
2. https://example.blogspot.com/sitemap-pages.xml (for all Blogger pages)
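If you open sitemap.xml in a browser, Blogger typically serves a sitemap index that points to one or more child sitemap files; how many child files appear depends on how many posts you have. The snippet below is only an illustration with a placeholder address:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.blogspot.com/sitemap.xml?page=1</loc>
  </sitemap>
</sitemapindex>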

Submitting Sitemaps to Google Search Console

- Navigate to Google Search Console (search.google.com/search-console/about) and sign in to the same account in which you previously registered your blog, as explained in the previous post.
- Select your blog, go to the Sitemaps option, and enter the following sitemaps for posts, pages, and RSS feeds:
  • sitemap.xml
  • sitemap-pages.xml
  • rss.xml
  • atom.xml
  • atom.xml?redirect=false&start-index=1&max-results=500
- Submit these sitemaps one by one. Search Console may require the full URL rather than just the sitemap name, for example: https://actually.blogspot.com/sitemap.xml
- If a submission is successful, the number of discovered topics appears in green.
- Once all the new sitemaps have been added, Search Console shows the number of posts and pages it found, or a processing status while it fetches them.

Submitting a Sitemap to Bing Webmaster Tools

After Google Search Console, Bing Webmaster Tools is the other major tool to submit your sitemap to. The steps are similar:

- Go to Bing Webmaster Tools (bing.com/webmasters) and log in.
- Register with a Hotmail or Microsoft account, then add your blog.
- Scroll to the Sitemaps section of the Webmaster Tools dashboard.
- Click the "Submit Sitemap" button.
- Add your sitemap URL (YourDomain.com/sitemap.xml).

When search engines crawl your blog, they first read the blog's robots.txt file. This file contains crawling rules for search engine bots and can also list your Sitemap, guiding crawlers to your content.

Optimizing the Robots.txt File for Blogger

The robots.txt file tells search engine crawlers which parts of your blog they may and may not crawl. It consists of User-agent, Allow, and Disallow rules, and it can also declare your Sitemap. Creating a custom robots.txt file for Blogger gives you control over crawling and helps your SEO.

Creating an HTML Sitemap Page for Blogger

To create an HTML Sitemap page, go to your Blogger dashboard, select "Pages," and create a new page. Switch the page editor to HTML view and paste in an HTML sitemap script, such as the one sketched below. Publish the page and your HTML Sitemap is ready.
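One common approach, sketched below for illustration only, is a small script that reads your blog's public JSON feed and prints a linked list of post titles. In this sketch, example.blogspot.com is a placeholder for your blog address and loadSitemap is just an arbitrary callback name; it assumes your blog feed is public (the default setting).

<div id="html-sitemap">Loading posts…</div>
<script>
// Illustrative sketch: build a simple linked list of post titles
// from the Blogger JSON feed passed to this callback.
function loadSitemap(json) {
  var entries = (json.feed && json.feed.entry) || [];
  var list = document.createElement('ul');
  for (var i = 0; i < entries.length; i++) {
    var entry = entries[i];
    var url = '';
    // The "alternate" link of a feed entry is the post's public URL.
    for (var j = 0; j < entry.link.length; j++) {
      if (entry.link[j].rel === 'alternate') { url = entry.link[j].href; break; }
    }
    var item = document.createElement('li');
    var anchor = document.createElement('a');
    anchor.href = url;
    anchor.textContent = entry.title.$t;
    item.appendChild(anchor);
    list.appendChild(item);
  }
  var container = document.getElementById('html-sitemap');
  container.innerHTML = '';
  container.appendChild(list);
}
</script>
<!-- Replace example.blogspot.com with your own blog address -->
<script src="https://example.blogspot.com/feeds/posts/summary?alt=json-in-script&max-results=500&callback=loadSitemap"></script>

Once published, the page fills itself in with your post titles each time it loads.
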
Back to the robots.txt file: each site hosted on Blogger has its own default robots.txt file, which looks something like this:

Robots.txt Type 1

# Mediapartners-Google is the Google AdSense crawler; an empty Disallow lets it crawl the whole blog
User-agent: Mediapartners-Google
Disallow: 
# Rules for all other crawlers
User-agent: *
# Block search and label result pages, which would create duplicate content
Disallow: /search
# Block Blogger's internal /b paths (such as post previews)
Disallow: /b
# Allow everything else
Allow: /
# Tell crawlers where the sitemap is
Sitemap: https://www.yourblogurl.blogspot.com/sitemap.xml

Robots.txt Type 2

User-agent: Mediapartners-Google
Disallow: 
User-agent: *
Disallow: /search
Disallow: /b
Allow: /
Sitemap: https://www.yourblogurl.blogspot.com/feeds/posts/default?orderby=updated

Please make sure to replace "https://www.yourblogurl.blogspot.com" with your actual blog address or custom domain, then follow the steps below to add the custom robots.txt file to your Blogger site. If you want search engine bots to discover your posts through the blog feed, use robots.txt Type 2 together with the note below; if your blog already has more than 500 posts, add an additional sitemap line as in Type 3.

Note: The default feed sitemap used in Type 2 only informs web crawlers about the most recent 25 posts. If you wish to include more links in your sitemap, replace it with the line below, which covers the 500 most recent posts.

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

If your blog has more than 500 published posts, you can use two sitemaps as shown below:

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
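
Following the same pattern, if your blog grows beyond 1,000 posts you can keep adding one line per batch of 500 posts, for example (an illustrative continuation of the pattern above):

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1001&max-results=500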

Robots.txt Type 3

User-agent: Mediapartners-Google
Disallow: 
User-agent: *
Disallow: /search
Disallow: /b
Allow: /
Sitemap: https://www.yourblogurl.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://www.yourblogurl.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500

 

Editing Robots.txt File in Blogger

You can add the custom robots.txt file from your Blogger dashboard as follows:
Step 1: Sign in to your Blogger blog.
Step 2: Go to Settings and scroll down to the "Crawlers and indexing" section.
Step 3: Turn on the "Enable custom robots.txt" option, then click "Custom robots.txt."
Step 4: Paste the code of your robots.txt file into the box.
Step 5: Click "Save" to finish.

How to Verify Your robots.txt File

To verify your robots.txt file for your blog, simply append "/robots.txt" to the end of your blog's URL in your web browser. Here's an example:

http://www.yourblogurl.blogspot.com/robots.txt

Upon accessing that URL, you will see the complete code you implemented in your customized robots.txt file.

Conclusion

In this article, we've covered creating Sitemaps for Blogger blogs, submitting Sitemaps to Google Search Console and Bing Webmaster Tools, and creating a robots.txt file. These steps are essential for optimizing your blog's visibility and SEO performance.

By: Kar
Online content writer and chartered accountant.