Google Search Console Data Limitations

Written by Ryan Jones. Updated on 14 November 2024

While Google Search Console search performance data is considered a first-party data source, there are some limitations to the data that you need to be aware of when working with it.

We take a look at those search performance report limitations and also cover the other areas of Search Console where the data only gives us part of the story.

If you are interested, we think we have the best Google Search Console alternative currently on the market.


User Interface and Export Limitations

With or without filters applied, the maximum amount of data viewable in Search Console via the user interface, or downloadable via a spreadsheet export, is 1,000 rows.

The data returned in the user interface is always ordered by clicks and impressions. Any filtering applied once the user interface has loaded is done via client-side JavaScript on the initial 1,000 rows of data. For a large site, this means you could be missing data if, for example, you want to sort by average position to find all the queries sitting on page 2 (by the way, our Striking Distance Keywords report does this for you!).

The only way around the 1,000-row limit is either to build a custom solution using the Search Console API or to use a tool like SEOTesting.com.
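If you go the custom route, the Search Analytics endpoint of the API returns up to 25,000 rows per request and supports paging. The sketch below is a minimal example assuming a Google Cloud service account JSON key (here called key.json) that has been added as a user on the property; the site URL and date range are placeholders.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Authenticate with a service account that has access to the GSC property.
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file("key.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

rows, start_row = [], 0
while True:
    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/",
        body={
            "startDate": "2024-08-01",
            "endDate": "2024-10-31",
            "dimensions": ["query"],
            "rowLimit": 25000,   # API maximum per request
            "startRow": start_row,
        },
    ).execute()
    batch = response.get("rows", [])
    rows.extend(batch)
    if len(batch) < 25000:
        break
    start_row += 25000

# With the full dataset in hand you can sort however you like,
# e.g. by average position to find queries sitting on page 2.
page_two = sorted((r for r in rows if 11 <= r["position"] <= 20), key=lambda r: r["position"])
print(f"Fetched {len(rows)} rows, {len(page_two)} of them on page 2")
```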

Keyword Data Sampling

Google Search Console does provide information about the keywords your website appears for in the SERPs, but this data is not completely comprehensive.

Under the banner of ‘user privacy’, Search Console does not report every query for which a page appears in the search results.

This mostly affects long-tail queries that only get a few searches a month.

You can see this issue occurring by looking at how many clicks or impressions a page gets and then looking at the queries the page is ranking for.

Google Search Console only shows a sample of the queries generating clicks to a page.

The sum of the clicks and impressions for all the page’s queries will almost never equal the impressions and clicks of the page itself.
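You can see the gap for yourself with the API. The sketch below, which reuses the service object from the earlier snippet, pulls the page-level totals for a single URL and then the query-level rows filtered to that same URL; the page URL and dates are placeholders.

```python
PAGE = "https://www.example.com/blog/some-post/"
DATES = {"startDate": "2024-08-01", "endDate": "2024-10-31"}

# Page-level totals for one URL.
page_resp = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={**DATES, "dimensions": ["page"],
          "dimensionFilterGroups": [{"filters": [
              {"dimension": "page", "operator": "equals", "expression": PAGE}]}]},
).execute()
page_clicks = sum(r["clicks"] for r in page_resp.get("rows", []))

# Query-level rows for the same URL.
query_resp = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={**DATES, "dimensions": ["query"], "rowLimit": 25000,
          "dimensionFilterGroups": [{"filters": [
              {"dimension": "page", "operator": "equals", "expression": PAGE}]}]},
).execute()
query_clicks = sum(r["clicks"] for r in query_resp.get("rows", []))

# The query-level sum is usually lower: the difference is the sampled-out long tail.
print(f"Page clicks: {page_clicks}, sum of query clicks: {query_clicks}")
```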

While Google Search Console query data is sampled, it is the most accurate data source we have.

Impression Data

Impression data reported in Google Search Console can sometimes seem over-inflated because URLs appear as sitelinks when users perform brand searches.

In our experience, while sitelinks take up extra real estate in the SERPs, they hardly ever get clicked. As the user is searching for the brand name, their destination of choice is most often the site’s homepage.

Having impressions increased by URLs appearing as sitelinks can drag down a page’s click-through rate.
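A quick worked example with made-up numbers shows the effect: the clicks barely change, but the extra sitelink impressions drag the reported CTR down.

```python
# Hypothetical numbers, purely for illustration.
clicks = 120
organic_impressions = 1_500    # impressions where the page ranked in the normal results
sitelink_impressions = 3_000   # extra impressions from appearing as a sitelink on brand searches

ctr_without_sitelinks = clicks / organic_impressions
ctr_as_reported = clicks / (organic_impressions + sitelink_impressions)

print(f"{ctr_without_sitelinks:.1%} without sitelink impressions")  # 8.0%
print(f"{ctr_as_reported:.1%} as reported in Search Console")       # 2.7%
```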

Average Position

For queries and pages that get a low number of appearances in the SERPs, the average position reported in Google Search Console can be really misleading.

The average position includes localised and personalised results, so if it is just you and your employees searching for your own site, you could be artificially giving your page an average position of 1 as reported in Search Console.
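To see how few impressions it takes to skew the number, here is a small illustration with invented figures: a handful of your own (personalised) searches at position 1 mixed in with genuine impressions at position 8.

```python
# Hypothetical impressions, purely for illustration.
own_searches = [1] * 5        # you and your employees, personalised to position 1
real_impressions = [8] * 20   # what everyone else actually sees

positions = own_searches + real_impressions
average_position = sum(positions) / len(positions)

print(round(average_position, 1))  # 6.6, rather than the 8.0 real users experience
```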

The more impressions reported in Search Console, the more accurate the average position will be.

Geographical Data Limitations

You can only filter Search Console data by country. This is only partly helpful if you are a small local business targeting a specific city.

Limited Date Range

The data available in Google Search Console goes back 16 months.

This is enough to allow you to compare months on a year-on-year basis, but not over longer comparison periods.

Once data reaches 16 months of age, it is deleted from Google Search Console’s databases, and you will not be able to access it again. There are tools that will export your data to Microsoft Excel or Google Sheets so you can maintain your own data source that reaches back beyond 16 months.
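If you would rather roll your own, the sketch below (reusing the service object from the earlier snippet) appends each day's query data to a local CSV file that you can then open in Excel or import into Google Sheets; the file name, site URL, and the assumed few-day reporting lag are placeholders.

```python
import csv
from datetime import date, timedelta

# Performance data lags by a few days, so pull a day from a few days back.
day = (date.today() - timedelta(days=3)).isoformat()

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={"startDate": day, "endDate": day,
          "dimensions": ["date", "query", "page"], "rowLimit": 25000},
).execute()

# Append to a running CSV so history survives past the 16-month window.
with open("gsc_backup.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for row in response.get("rows", []):
        writer.writerow(row["keys"] + [row["clicks"], row["impressions"], row["ctr"], row["position"]])
```

Scheduled daily, for example via cron, this gives you a permanent archive that Search Console’s retention window cannot touch.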

Exporting backups of your data is simply good practice.

Single Data Source

Naturally, Google Search Console only provides data from one source… Google. This means that Search Console is an excellent tool for measuring your traffic from Google’s SERPs, but it will not give you a full picture of your standing in search. You can’t see your search traffic from other search engines like Bing or DuckDuckGo.

We’d suggest complementing your Search Console data with data from other analytics sources like Google Analytics and Bing Webmaster Tools. Doing so will give you a more complete understanding of your website’s standing within search as a whole, not just the Google silo.

Backlink Data Limits

If you look at your backlinks within your Google Search Console dashboard, it would be easy to think that these are all the backlinks your website has acquired over the years. The reality, however, is slightly more complicated.

Google Search Advocate John Mueller has said in the past that the backlinks found within Google Search Console are the only ones you really need to be focussed on, especially when thinking about disavowing links.

In practice, Google may be finding external links to your site but choosing not to report them to you in Search Console.

We recommend signing up for a tool such as Ahrefs or Semrush. These tools have their own crawlers and will usually find backlinks that Search Console is not reporting.

Crawl Data Limits & Inaccuracies

Google Search Console will provide some information on how Google crawls your website, but this data is limited and is not 100% accurate.

Firstly, Google does not crawl every page on your website. It has to follow the rules set out in your robots.txt file. Any pages that Google is told not to crawl should not appear in the crawl reports.
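If you want to sanity-check which URLs Googlebot is even allowed to request, Python's standard library can evaluate your robots.txt rules for you. In the sketch below, the domain, the paths, and the assumption that /admin/ is disallowed are all placeholders.

```python
from urllib import robotparser

# Load and parse the live robots.txt for the site.
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# URLs that return False here are blocked from crawling,
# so they should not be showing up in Search Console's crawl reports.
print(rp.can_fetch("Googlebot", "https://www.example.com/blog/some-post/"))  # True
print(rp.can_fetch("Googlebot", "https://www.example.com/admin/settings"))   # False if /admin/ is disallowed
```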

It is also worth noting that Google’s crawl data is delayed by three days, meaning any information you see in your Search Console dashboard is three days old rather than real-time. To get around this, we’d recommend using a real-time website crawling tool such as Screaming Frog.