Help! My Google Search Console Is Not Updating

Written by Ryan Jones. Updated on 6 September 2024

Logging in to your Google Search Console account and not seeing any new data is incredibly annoying.

In this blog post, we will talk through how you can tell if your Google Search Console is not updating, the possible reasons why it is not updating and what you can do about it.

How can you tell if your Google Search Console is not updating?

When you log into your Google Search Console dashboard, you will be greeted with your “Overview” page. This will give you a snapshot of your page performance report, your page indexing report, your page experience report, and your enhancements report.

If you head to the left-hand side of your screen, and click on “Search results” under the “Performance” section of the toolbar, you will be taken to the full-page performance report. It will look something like this:

Last updated date in Google Search Console.

You can see, in the top right-hand corner of the screenshot above, we have highlighted the wording “Last updated: 4 hours ago”. This means the data we are seeing is only 4 hours old, which is essentially up to date.

If, on the other hand, you log into your page performance report and see a message saying this:

Google Search Console not updating for 65 hours example.

A message denoting “Last updated: 65 hours ago” (or, sometimes, even longer) means there is an issue with Google Search Console refreshing its data. We have seen some Google Search Console accounts out of date by well over 100 hours.

Why is Google Search Console not updating?

If you have found that your Google Search Console is not updating, there are a number of possible reasons. Some come down to Google itself; others are issues with your website.

Issues with Google Search Console.

The first, and often the most common, reason that Google Search Console is not updating is an issue with Google and their servers. Sometimes even a company as large and as powerful as Google has issues with downtime.

If this is the case, you will often be notified. The two most common ways of notifying customers about issues with Google Search Console not updating are through the Google Search Central Twitter account, and through Google Search Console itself.

If you head to the Google Search Central Twitter account and see a message looking similar to this:

Google Search Central informing about a Google Search Console reporting issue.

You can be certain that the downtime you are experiencing is through no fault of your own. Google Search Central will also tweet once these issues have been resolved, letting you know when you can expect your GSC data to be back up to date.

Sometimes, Google will also let you know about issues with Google Search Console data directly within Google Search Console. If this happens, you will see a message looking like this:

Google Search Console label warning users about a reporting issue.

You can then click on the “See here for more details” message which will tell you more about the issue Google Search Console experienced at the time.

In this instance, the issue we have seen is down to a logging error. Here is the error message, in full, from Google:

“Due to a logging error, sites may see a small drop in data for this day. This is a logging issue only; it does not reflect changes in search performance or user behaviour. We hope to replace most of the missing data soon.”

Again, if this is the case and you see this type of message from Google within your performance report, you can be sure that Google Search Console is not updating through no fault of your own.

Issues with reading your website’s sitemap.

One common reason for Google Search Console not updating is an issue with your sitemap, or, more precisely, with Google reading your XML sitemap.

Aside from you manually asking Google to index a page via the URL Inspection tool’s “Request Indexing” feature, Google relies on crawling sitemaps and the web itself to find pages worth indexing and ranking. If your website is particularly new and does not have a lot of backlinks from external sources, Google may rely on your sitemap even more to index and rank your pages.

If Google has an issue reading your sitemap, for whatever reason, this could delay Google indexing pages on your website and, therefore, delay showing your data until it has crawled the necessary pages.

You can take a look at the status of your sitemap directly within Google Search Console. Head to the “Sitemaps” section under the “Indexing” toolbar and you will be greeted with your sitemap report:

Sitemaps page in Google Search Console.

Here you will be able to see information such as:

  • Your sitemap URL.
  • Your sitemap type.
  • The date your sitemap was first submitted to Google.
  • The date your sitemap was last read.
  • The current status of your sitemap.

The two most important things to focus on here are the date your sitemap was last read and the status of your sitemap.
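If you want to sanity-check a sitemap outside of Google Search Console, a quick local parse can catch malformed XML before Google trips over it. Below is a minimal sketch in Python using the standard library; the inline sample sitemap and the example.com URLs are placeholders, and in practice you would fetch your real sitemap (e.g. https://yourwebsiteurl.com/sitemap.xml) instead:

```python
import xml.etree.ElementTree as ET

# Hypothetical inline sample; in practice, fetch your live
# sitemap from https://yourwebsiteurl.com/sitemap.xml instead.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""

# The sitemaps.org namespace every standard XML sitemap uses.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Parse a sitemap and return its <loc> URLs.

    Raises xml.etree.ElementTree.ParseError on malformed XML,
    which is one reason Google may fail to read a sitemap.
    """
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]

urls = sitemap_urls(SITEMAP_XML)
print(f"{len(urls)} URLs found")
for u in urls:
    print(u)
```

If the parse raises an error, or the URL count does not match what you expect your site to contain, that is worth fixing before resubmitting the sitemap to Google.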

Changes to your robots.txt file.

In some instances, changes to your robots.txt file can cause delays with Google Search Console processing your data.

Your robots.txt file controls how search engine crawlers like Googlebot crawl and interact with your website. If you have made changes to your robots.txt file that are stopping Googlebot (and other crawlers) from crawling large sections of your site, you are going to experience a delay in your data appearing in Google Search Console.

This can happen occasionally, especially if large-scale changes are made to a robots.txt file, or if you are deploying a staging website to live. Whatever the reason, it’s best to double-check your file and make sure Googlebot is able to crawl all the pages you want crawled.
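As a hypothetical illustration, the kind of robots.txt often left behind by a staging deployment looks like the first group of rules below, which blocks every crawler from the entire site. The second group shows what a live site typically wants instead (the /admin/ path and sitemap URL are placeholders):

```
# Staging leftover: blocks all crawlers from everything
User-agent: *
Disallow: /

# A typical live-site setup instead: block only private areas
User-agent: *
Disallow: /admin/

Sitemap: https://yourwebsiteurl.com/sitemap.xml
```

If your live file still contains a blanket `Disallow: /`, that alone can explain why Google Search Console has nothing new to show you.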

Significant site structure changes.

If your website goes through a major update that sees significant changes to the structure of the website, you may find that Google takes longer than usual to re-crawl your site and, therefore, display your data within Google Search Console.

Due to the sheer size of the web and the number of pages Google has to crawl, it is highly unlikely Google will crawl your website every single day. This is especially true for smaller websites. So if your website has undergone an update and there are lots of pages for Google to re-crawl, you may find this takes a while.

This is less of an issue for larger, more active, websites as Google tends to crawl these websites multiple times a day. But if you are a smaller, less active website, then Google may wait longer to re-crawl your site.

Manual actions.

Occasionally, although rarely, Google will issue a website with a “manual action” if it believes the site has been engaging in black hat SEO practices such as purchasing backlinks. This will likely cause a drop in rankings.

To establish whether your website has received a manual action, you can head to the “Manual actions” section of your GSC account, found within the “Security & Manual Actions” section of the toolbar.

It should look something like this:

Manual actions page in Google Search Console.

However, if Google has applied a manual action, you will see something like this:

Example of a website with a manual action.

How can I get Google Search Console to start processing my data again?

Depending on the reason for Google Search Console not updating your account, there are a number of things you should do.

Just wait.

In most cases, the only thing you need to do is wait, especially if Google Search Console is not showing data because of an issue on Google’s end or because of changes to your website structure.

If it’s a Google issue and not an issue with your website, there is nothing you can do other than wait. Google is, in general, very quick to resolve issues with Google Search Console.

You can, sometimes, encourage Google to re-crawl your website faster by submitting a new sitemap or by manually requesting recrawls of individual pages through the URL Inspection tool. Keep in mind, however, that manual URL submissions are limited to 10 per day. In most cases, you are better off letting Google crawl, index and rank your pages at its own speed anyway.

Fix any issues with your sitemap.

Sometimes, Google has an issue reading the entire sitemap. If this is the case, it will show an error in the status section. Or, if you find that Google has not read your sitemap in a while, you can re-submit it and see if this gets the ball rolling quicker.

It’s also worth noting that you can click into each individual sitemap to find any issues with specific pages.

Submitted sitemap in Google Search Console.

With this, we can see Google has discovered 64 pages, and the sitemap was last read on the 24th of July 2023 at the time of writing. We can click into the “SEE PAGE INDEXING” section to see if there are any issues with Google indexing our pages:

Pages that aren't indexed report in Google Search Console.

We can see from the report above that two of our URLs have been crawled by Google but not indexed. They are listed with the “Crawled – currently not indexed” status.

I can see, from further investigation, that the URLs in question are:

  • https://seotesting.com/blog/feed
  • https://seotesting.com/blog/niche-site-owners/

Whilst the first URL is simply our blog feed, the second URL is interesting. This is a blog post we wrote on why niche site owners should be using SEOTesting. This is now marked in my calendar for further investigation into why this particular page has not been indexed, even though it has been crawled.

Fix any issues with your robots.txt file.

If you suspect the issue is down to your robots.txt file, take a look and find out. Head to https://yourwebsiteurl.com/robots.txt and scan your file. Can you see any instances where robots.txt is blocking Googlebot from visiting your pages, or even your entire site?

Head to your page indexing report; you may find that some pages are listed as “Indexed, though blocked by robots.txt”. Fixing these issues is sure to get the ball rolling and get Googlebot crawling your website again.
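One way to check this programmatically is Python’s built-in urllib.robotparser, which evaluates robots.txt rules the same way a well-behaved crawler would. The sketch below uses an inline, hypothetical robots.txt; in practice you would point the parser at your live file with set_url() and read(), and test the URLs you care about:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks Googlebot from the blog.
# For a live check, use instead:
#   rp.set_url("https://yourwebsiteurl.com/robots.txt"); rp.read()
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /blog/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Check a few URLs you expect Googlebot to be able to crawl.
for url in ["https://example.com/", "https://example.com/blog/post"]:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{url} -> {verdict}")
```

If a page you want indexed comes back BLOCKED, the matching Disallow rule is a likely culprit for your stalled Google Search Console data.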

Resolve issues that lead to manual action.

If you have been given a manual action by Google, they will explain the reasoning in the “Manual actions” section of your Google Search Console account. It may be that they have spotted some “spammy” backlinks pointing to your website, or that your site hosts lots of spammy user-generated content.

Whatever the reason for the manual action, getting it fixed quickly is imperative. Address the reasons Google has stated and resolve them as soon as possible. You might need to change the structure of your website or the way you write posts, or you may even need to disavow some backlinks. Once you believe the issues are fixed, you can submit a reconsideration request from the same “Manual actions” page, which is the route back to ranking as normal.

There we have it. If your Google Search Console is not updating, hopefully this article gives you all the information you need to diagnose the problem and, more importantly, fix it, so you can get back to ranking your websites rather than worrying about issues with GSC.

If you are looking for a tool that can make more of your Google Search Console data, consider giving SEOTesting a try! We have a range of useful reports and other tools that will help you get more from your GSC data, without any additional effort. We are currently running a free 14-day trial, with no credit card required to sign up.