What Do Google’s Index Coverage Issue Notices Actually Mean?

Google Search Console (GSC) may appear intimidating at first glance – so many functions, reports and metrics that make very little sense to a newcomer. But there’s no way around it – GSC is an essential tool for every blogger. To cope with the initial overwhelm, browse our previous guide explaining the basics.

And if you already feel confident with GSC, it’s time to level up your skills. Today’s post will introduce you to another neat reporting feature that will help you maintain a “healthy”, well-ranking website. This post may get a bit technical, but it will make sense in the end. Let’s dive in.

 

GOOGLE SEARCH CONSOLE FEATURE TO NOTE: INDEX COVERAGE REPORT 

One of the most powerful features of the Google Search Console tool is the Index Coverage report. It presents an overview of all the pages on your site that Google’s bots tried to crawl and index, along with any issues they encountered along the way. Better yet, GSC will email you an update whenever new index coverage issues appear!

To give you a better understanding of what each Index Coverage Issue update means, we’ve put together a detailed glossary (with examples of potential causes) for all 7 ‘Error’ Index Coverage Issues below. Keep scrolling to learn more!

INDEX COVERAGE ISSUES – ‘ERROR’ GLOSSARY 

Server error (5xx):

Servers return a 500-level error when they’re unable to serve a certain page on your website. This could be due to wider server issues (eek!), although often the cause is a brief server disconnection that doesn’t allow Google to load and crawl the page, generating a Console error. Pay attention to these, but don’t fret too much – the problem may disappear during the next crawl.
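If you want to check whether a flagged URL still returns a server error without clicking through manually, a short script will do it. Here’s a minimal sketch in Python using the third-party requests library (the URL is just a placeholder):

    import requests

    url = "https://www.yourdomainname.com/some-page"  # placeholder URL
    response = requests.get(url, timeout=10)
    if 500 <= response.status_code < 600:
        print(f"Still failing: server returned {response.status_code}")
    else:
        print(f"Looks fine now: server returned {response.status_code}")

If the error doesn’t reproduce, it was likely a temporary blip that the next crawl will clear.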

Redirect error:

It’s common for redirect errors to get reported when a website’s primary URL has changed a few times. When this is the case, redirects sometimes point to other redirects, forming a chain. Confused? Okay, for instance, say you’ve recently redesigned your website. Along the way, you decided to change your domain from http://yourdomainname.com to http://www.yourdomainname.com – here’s the first redirect. Next, you decided to move to a more secure protocol (https:), so your website then redirects users again, to https://www.yourdomainname.com.

The problem can be fixed by going through your internal links and updating them to point directly at the final URL – either manually or with a tool.
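To see a chain like this for yourself, you can trace every hop a URL takes. Here’s a minimal sketch in Python using the requests library (the starting URL is a placeholder):

    import requests

    url = "http://yourdomainname.com"  # placeholder starting URL
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:  # each intermediate redirect response
        print(f"{hop.status_code} -> {hop.url}")
    print(f"Final destination: {response.status_code} {response.url}")

If you see more than one or two hops, update your links to skip straight to the final destination.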

Submitted URL blocked by robots.txt:

This error can happen when you submit a page for indexing, but your robots.txt file blocks Google from crawling it. Usually, the culprit is a Disallow line in your robots.txt file that tells Google it’s not allowed to access this page. Quick tip: if you do submit a page for indexing, try testing it using the robots.txt tester first!
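You can run the same kind of check yourself. Here’s a minimal sketch using Python’s built-in urllib.robotparser module (the domain and page path are placeholders):

    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://www.yourdomainname.com/robots.txt")  # placeholder
    parser.read()  # downloads and parses the robots.txt file

    page = "https://www.yourdomainname.com/blog/my-post"  # placeholder page
    if parser.can_fetch("Googlebot", page):
        print("Googlebot is allowed to crawl this page.")
    else:
        print("Blocked: robots.txt disallows Googlebot on this page.")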

Submitted URL marked ‘noindex’:

Your website can contain both public pages that are accessible to anyone (including search engines) and thus appear in search results, and private “noindex” pages, e.g. payment forms or gated content on your blog.

This error occurs when you submit a page for indexing, but the page carries a ‘noindex’ directive, either in a meta tag or in an HTTP response header. If you want this page to be indexed and appear in search results, you must remove the ‘noindex’ directive from the meta tag or the response header!
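A quick way to confirm the directive is to look in both places it can live. Here’s a minimal sketch with Python’s requests library (the URL is a placeholder, and the HTML check is deliberately crude):

    import requests

    url = "https://www.yourdomainname.com/gated-content"  # placeholder URL
    response = requests.get(url, timeout=10)

    # 'noindex' can arrive as an HTTP response header...
    header = response.headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower():
        print("Found 'noindex' in the X-Robots-Tag response header.")

    # ...or inside the page's HTML, in a robots meta tag.
    if "noindex" in response.text.lower():
        print("The HTML mentions 'noindex' - inspect its meta robots tag.")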

Submitted URL seems to be a Soft 404:

Soft 404s are pages that look broken or empty to Google but return a 200 (OK) response instead of a proper 404 Not Found. Typical causes are an empty category page, or website themes that automatically create pages that shouldn’t exist.

There are a few quick remedies for this error:

  • Keep your blog back-end neatly organized – avoid hoarding multiple themes, plugins, and draft posts/pages, and create proper content taxonomies.
  • Make sure that your website is configured to return a 404 (not found) or a 410 (gone) response code to a request for a non-existing page – you can verify this with the sketch below.
  • Don’t redirect users to another page (e.g. your homepage) instead of showing a 404 page – Google recommends against it.
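To verify the second point, request a page that definitely doesn’t exist and check what comes back. Here’s a minimal sketch with Python’s requests library (the URL is a placeholder):

    import requests

    # A healthy site returns 404 or 410 for a page that doesn't exist.
    url = "https://www.yourdomainname.com/this-page-should-not-exist"  # placeholder
    response = requests.get(url, timeout=10)
    if response.status_code in (404, 410):
        print(f"Good: the server correctly returns {response.status_code}.")
    else:
        print(f"Possible soft 404: got {response.status_code} instead.")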

Submitted URL not found (404):

If you removed a page from your website but forgot to delete it from your sitemap, you’re likely to receive this error. But this is easily preventable with regular maintenance of your sitemap file.
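Checking every sitemap entry by hand gets tedious, so a short script can flag dead entries for you. Here’s a minimal sketch using Python’s requests library and the standard-library XML parser (the sitemap location is a placeholder):

    import requests
    import xml.etree.ElementTree as ET

    sitemap_url = "https://www.yourdomainname.com/sitemap.xml"  # placeholder
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)

    # Sitemap files declare this XML namespace for their <loc> elements.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    for loc in root.findall(".//sm:loc", ns):
        status = requests.get(loc.text, timeout=10).status_code
        if status == 404:
            print(f"Remove from sitemap: {loc.text} returns 404")

Any URL the script flags should be removed from the sitemap (or restored on the site).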

Submitted URL has crawl issue:

This error usually occurs when something prevents Google’s bots from fully downloading and rendering the contents of a page on your website. Long page load times and blocked resources are potential reasons for this, so make sure that your website’s speed is lightning fast! A lot of (needless) JavaScript code in the page header can also result in this error. Google recommends the Fetch as Google tool to help you find the possible bottlenecks and make quick fixes.
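As a rough first check on load times, you can measure how long the server takes to respond. Here’s a minimal sketch with Python’s requests library (the URL is a placeholder; this measures server response time only, not full page rendering):

    import requests

    url = "https://www.yourdomainname.com/slow-page"  # placeholder URL
    response = requests.get(url, timeout=30)
    # elapsed measures the time from sending the request until the
    # response headers arrive - a quick proxy for server speed.
    seconds = response.elapsed.total_seconds()
    print(f"Fetched in {seconds:.2f}s with status {response.status_code}")

If responses take several seconds, start by trimming heavy scripts and checking your hosting.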

Now off you go and check how your website performs!

 

Don’t forget to download our FREE SEO Checklist so you always remember to keep an eye on Google Search Console and the other tactics that will help you increase your domain ranking.

