Procedure to Create XML Sitemap and robots.txt Files

Creating an XML Sitemap

The easiest way to improve Google's ability to crawl your Blogger blog is to add an XML sitemap. Google encourages the use of XML sitemaps for swift and effective crawling. Search engines such as Google and Bing use XML sitemaps to find and crawl posts and pages on your website that their bots may have missed during routine crawling. An XML sitemap therefore helps Google index your blog correctly.
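
For reference, an XML sitemap is a plain XML file that lists your blog's URLs, following the sitemaps.org protocol. A minimal sketch is shown below; the blog address and post URL are placeholders, not values from your actual blog:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per post or page on the blog (placeholder URL shown) -->
      <url>
        <loc>http://www.yourblogurl.blogspot.com/2024/01/sample-post.html</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
    </urlset>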

  • Log in to a sitemap generator tool.
  • Type in your blog's URL (website address).
  • To create the XML file for your sitemap, click Generate Sitemap.
  • Copy the entire text.
  • Go to Blogger.com, open your blog, and select Settings from the dashboard. Under Crawlers and indexing, turn on the Enable custom robots.txt option.
  • Paste the generated sitemap text into the custom robots.txt field. Change Disallow: /search to Disallow: /none, as shown in the sketch after this list. This allows Google's crawler to access all of your blog's pages, including Labels (Categories), whose URLs contain the word "search". Otherwise, the Google crawler will not be permitted to crawl and index those "search" page URLs.
  • Save your changes. 
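
Below is a sketch of what the finished custom robots.txt might look like after the edit. The Mediapartners-Google block is Blogger's default rule for AdSense crawling, and the blog address in the Sitemap line is a placeholder for your own URL:

    # Default Blogger rule for the AdSense crawler
    User-agent: Mediapartners-Google
    Disallow:

    # Changed from "Disallow: /search" so label (category) pages can be crawled
    User-agent: *
    Disallow: /none
    Allow: /

    # The XML sitemap generated in the earlier step (placeholder address)
    Sitemap: http://www.yourblogurl.blogspot.com/sitemap.xml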

Check Your Custom Robots.txt File

  • To check your robots.txt file, type your site's URL into a web browser and append "/robots.txt".
  • Example: http://www.yourblogurl.blogspot.com/robots.txt
  • This will display the contents of your custom robots.txt file.
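
If you prefer the command line, a quick alternative check (using the same placeholder address as above) is:

    curl http://www.yourblogurl.blogspot.com/robots.txt

Either way, confirm that the Sitemap line and the edited Disallow rule appear exactly as you saved them.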

