Procedure to create XML Sitemap and robots.txt files.

Creating an XML sitemap.

The easiest way to improve Google's ability to crawl your Blogger blog is to add an XML sitemap. Google recommends XML Sitemaps and similar tools for fast, effective crawling. Search engines such as Google and Bing use XML Sitemaps to discover and crawl posts and pages on your website that their search bots may have missed during routine crawling. As a result, an XML sitemap helps Google index your blog correctly.
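For reference, an XML sitemap is simply a list of page URLs in the standard sitemaps.org format, which a sitemap generator produces for you automatically. A minimal sketch, using a placeholder blog address (www.yourblogurl.blogspot.com) and made-up post URLs, looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one <url> entry per post or page -->
      <url>
        <loc>http://www.yourblogurl.blogspot.com/2024/01/sample-post.html</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
      <url>
        <loc>http://www.yourblogurl.blogspot.com/p/about.html</loc>
      </url>
    </urlset>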

  • Log in to the Sitemap Generator.
  • Type in your blog's URL (website address).
  • To create the XML file for your sitemap, click Generate Sitemap.
  • Copy the entire text.
  • Go to Blogger.com, open your blog, and select Settings from the dashboard. Under Crawlers and indexing, turn on the Enable custom robots.txt option.
  • Paste the generated sitemap text into the custom robots.txt field. Change Disallow: /search to Disallow: /None (see the example robots.txt after this list). This allows Google's crawler to access all of your blog's pages, including Label (category) pages whose URLs contain the word "search"; otherwise, the crawler will not be permitted to crawl and index those "search" URLs.
  • Save your changes. 
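To illustrate the steps above, here is a rough sketch of what the finished custom robots.txt might look like after pasting the generated text and editing the Disallow line; the blog address is a placeholder and your generator's output may differ slightly:

    # Pasted from the sitemap generator; Disallow line edited per the steps above
    User-agent: Mediapartners-Google
    Disallow:

    User-agent: *
    Disallow: /None
    Allow: /

    Sitemap: http://www.yourblogurl.blogspot.com/sitemap.xml

The Sitemap: line tells crawlers where to find your sitemap, and because /search is no longer disallowed, Label (category) pages can be crawled and indexed.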

Check Your Custom Robots.txt File

  • To check your robots.txt, open your site's URL in a web browser and add "/robots.txt" to the end.
  • Example: http://www.yourblogurl.blogspot.com/robots.txt
  • This will display the contents of your custom robots.txt file.
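You can also fetch the file from the command line; a quick check, using the same placeholder address as above, might look like this:

    curl http://www.yourblogurl.blogspot.com/robots.txt

If the custom file was saved correctly, the output should match the text you pasted in Settings, including the Sitemap: line.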

