How to Fix Crawling and Indexing Issues in Blogger?

 Are you facing an indexing error on your Blogger website?

After reading this guide, you will be able to solve crawling and indexing issues, as we go through every relevant setting in detail.

Here, I will cover how to set up a custom robots.txt file, the homepage, archive, and post page header tags, meta tags, and the Search Console settings.

What is crawling?

Crawling is the process by which search engine bots, such as Googlebot, visit a website to discover what is on each page. It helps search engines understand the content so they can index it in their database.
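To make this concrete, here is a toy sketch in Python (not part of Blogger itself) of the crawling step: download one page and list the links a bot would follow next. The URL is only a placeholder.

import re
import urllib.request
from urllib.parse import urljoin

START = "https://www.example.com/"  # placeholder; use your own blog URL

# Fetch the page, pull out every href, and resolve it to an absolute URL.
html = urllib.request.urlopen(START, timeout=10).read().decode("utf-8", "ignore")
links = [urljoin(START, href) for href in re.findall(r'href=["\']([^"\']+)["\']', html)]
print("Discovered", len(links), "links to crawl next")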

What is search engine indexing?

Indexing is the process by which search engines organize the crawled information so they can respond to queries faster. They store the crawled data on their servers, organize it for different queries, and show only the relevant search results.
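As a rough illustration, here is a toy sketch (Python) of the idea behind an index: map each word to the pages that contain it, so answering a query becomes a quick lookup instead of re-reading every page. The page texts below are made-up examples.

# Build a tiny inverted index from two pretend crawled pages.
crawled_pages = {
    "/fix-indexing": "fix crawling and indexing issues in blogger",
    "/robots-txt": "set up a custom robots txt file in blogger",
}

inverted_index = {}
for url, text in crawled_pages.items():
    for word in set(text.split()):
        inverted_index.setdefault(word, set()).add(url)

# A query is now a dictionary lookup.
print(inverted_index.get("robots", set()))    # {'/robots-txt'}
print(inverted_index.get("indexing", set()))  # {'/fix-indexing'}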

Search engine ranking

After analyzing the indexed data, search engines rank the relevant websites in priority order and show them in the SERPs (search engine result pages).

Ranking depends on many factors, such as relevance, PageRank, website authority, backlinks, and much more. In fact, Google uses more than 200 ranking factors to show accurate results.

Google Search Indexing Cycle

How to fix crawling & indexing issues in Blogger?

Fixing crawling and indexing issues is a bit technical and depends on many factors. Here, I will explain the settings that help you avoid these problems.

Let’s start with the Blogger settings. 

#1: Privacy Setting 

Go to the Blogger dashboard and click on the Settings tab. Now find the Privacy option and turn on the setting “Visible to search engines”. If this is turned off, search engines will stop indexing your web pages.

Indexing setting in Blogger #1

#2: Crawlers and indexing settings

Now scroll down to the “Crawlers and indexing” section and turn on the custom robots.txt option.

Now add the robots.txt rules in the format below, replacing example.com with your own website URL.

User-agent: *
Disallow: /search
Disallow: /category/
Disallow: /tag/
Allow: /
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-pages.xml
Sitemap: https://www.example.com/atom.xml?redirect=false&start-index=1&max-results=500

If your website has more than 500 posts, you can add multiple sitemaps. Just add an extra line after the code above.

Sitemap: https://example.com/atom.xml?redirect=false&start-index=501&max-results=500

In this way, you can cover up to 1,000 posts in your Blogger XML sitemap and solve the indexing issue.
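For even larger blogs, you simply keep increasing the start-index value by 500. Here is a tiny sketch (Python) that prints those extra Sitemap lines; both the domain and the post count are placeholders you would replace with your own.

# Print one Sitemap line per block of 500 posts.
BLOG = "https://www.example.com"  # placeholder domain
TOTAL_POSTS = 1700                # placeholder post count

for start in range(1, TOTAL_POSTS + 1, 500):
    print(f"Sitemap: {BLOG}/atom.xml?redirect=false&start-index={start}&max-results=500")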

You can also use our Robots.txt Generator tool to generate it for your Blogger website.
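Before saving, you can sanity-check the rules with Python's standard-library robots.txt parser. This is only a quick sketch; the post and label URLs are placeholders, and the Sitemap lines are left out because only Allow/Disallow matters for this check.

from urllib.robotparser import RobotFileParser

# Paste in the same rules you are about to save in Blogger.
rules = """
User-agent: *
Disallow: /search
Disallow: /category/
Disallow: /tag/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A normal post should be crawlable; internal search/label pages should not be.
print(parser.can_fetch("*", "https://www.example.com/2024/01/sample-post.html"))  # True
print(parser.can_fetch("*", "https://www.example.com/search/label/Blogger"))      # False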

Now enable the “Custom robots header tags” option and set the three groups of header tags below (a quick way to verify them on the live blog is sketched after the list).

  1. Open Homepage tags, select the “all” and “noodp” options, and save. 
  2. Open Archive and search page tags, select the “noindex” and “noodp” options, and save. 
  3. In Post and page tags, select the “all” and “noodp” options, and save. 
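Once saved, each page type should expose the matching robots meta tag in its HTML. Here is a minimal sketch (Python) that fetches a page and prints that tag; the two URLs are placeholders for one of your posts and one of your /search pages.

import re
import urllib.request

def robots_meta(url):
    """Return the content of the page's robots meta tag, if present."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    tag = re.search(r"<meta[^>]*name=['\"]robots['\"][^>]*>", html, re.I)
    if not tag:
        return "(no robots meta tag found)"
    content = re.search(r"content=['\"]([^'\"]+)['\"]", tag.group(0), re.I)
    return content.group(1) if content else "(empty robots meta tag)"

# A post should NOT show noindex; an archive/search page should.
print(robots_meta("https://www.example.com/2024/01/sample-post.html"))
print(robots_meta("https://www.example.com/search/label/SEO"))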

#3: Submit sitemap in Search Console

Now you have to submit the sitemap in Google Search Console. If you don’t have an account, create one and then verify your domain.

Now click on the Sitemaps option in Search Console and enter the sitemap URL in the following format, then click Submit.

https://www.example.com/sitemap.xml

Now your sitemap is submitted to Google. You can also submit the sitemap in Bing Webmaster Tools, or link your Search Console account with it.
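Before or after submitting, it is worth confirming that the sitemap URL actually resolves and lists your content. A minimal sketch (Python) that fetches it and counts the entries; the URL is a placeholder for your own sitemap.

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
    print("HTTP status:", resp.status)  # should be 200
    root = ET.fromstring(resp.read())

# Blogger's sitemap.xml is usually a sitemap index pointing at per-post
# sitemaps, so count either <sitemap> or plain <url> entries.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
entries = root.findall("sm:sitemap", ns) or root.findall("sm:url", ns)
print("Entries listed:", len(entries))
for entry in entries[:5]:
    print(" -", entry.findtext("sm:loc", default="", namespaces=ns))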

After you submit the sitemap, your website is automatically crawled by search engine bots and indexed in the search results.

Even so, some websites still face crawling issues. This sometimes happens due to crawl budget limitations or redirect problems.

You can fix the crawl budget issue by regularly updating your website and posting articles frequently. 

You can also manually submit your blog URLs with the URL Inspection tool in Search Console. Just paste the newly published article link and click on Request Indexing.

After that, Google will prioritize the URL and crawl it within a short time.

URL-Inspection-tool for indexing

How can I avoid indexing and crawling issues?

There are several methods you can implement to avoid indexing and crawling issues on your website. These methods will help search engines index your pages faster. 

  • Post articles frequently and update your old articles. 
  • Focus on interlinking your articles, as it helps search engines discover new pages easily. 
  • Share each article on social media to get some initial traffic to that page. 
  • Fix broken internal links (a small link-checking sketch follows this list).
  • Fix redirect loops (they happen when two pages redirect to each other).
  • Improve page loading speed.
  • Fix duplicate page issues.
  • Use an HTML sitemap in Blogger.
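As mentioned above, here is a minimal sketch (Python) for spotting broken internal links on a single page. The starting URL is a placeholder; the script only checks links that stay on your own domain and reports any that return an error status.

import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

PAGE = "https://www.example.com/2024/01/sample-post.html"  # placeholder

class LinkCollector(HTMLParser):
    """Collect every href in the page, resolved against the page URL."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(PAGE, href))

html = urllib.request.urlopen(PAGE, timeout=10).read().decode("utf-8", "ignore")
collector = LinkCollector()
collector.feed(html)

# Keep only internal links, then report anything that answers with 4xx/5xx.
host = urlparse(PAGE).netloc
for link in sorted(set(l for l in collector.links if urlparse(l).netloc == host)):
    try:
        status = urllib.request.urlopen(link, timeout=10).status
    except urllib.error.HTTPError as err:
        status = err.code
    if status >= 400:
        print("BROKEN:", status, link)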

Conclusion

Now you have learned how to fix crawling and indexing issues on your Blogger website. Just check the robots.txt file and meta tags carefully and follow the best practices shown above.

If you are still facing any type of crawling or indexing issue, let me know in the comment section.

Read Also: How to fix “Avoid chaining critical request” in Blogger?
