Written by Ryan Jones. Updated on 8 April 2024
This error message within Google Search Console is a confusing one, not least because we could (technically) write two completely different articles about it. We may do this in the future, but for now, we are going to focus on writing about the Blocked Due to Unauthorised Request (401) error within Google Search Console. We will explain what the error is, how you can find the pages that have been impacted, and how to fix the error and stop it from showing up in GSC.
Perhaps, in a future article, we will talk about how you can fix the error if you are seeing it when trying to access someone else’s website. But, for now, we’re focussed on helping you make the most of Google Search Console.
Image Credit: https://www.onely.com/
When you see the Blocked Due to Unauthorised Request (401) error within your Google Search Console dashboard, it means that Googlebot has tried to access some of your pages in order to crawl them, and has been met with a 401 (Unauthorized) HTTP status code.
Googlebot sees this status code and realises that, even though it has been asked to crawl the page, it cannot access it. It therefore reports the error in Google Search Console.
When Googlebot starts crawling the pages on your sitemap, it assumes that it is supposed to crawl almost every single one of them, unless it comes across a disallow rule within your website’s robots.txt file or a noindex meta tag on a page. If it does not come across either of these, it will try to crawl every page on your site.
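You can check how a crawler will interpret your robots.txt rules without waiting for a crawl. Here is a minimal sketch using Python’s standard-library `urllib.robotparser`, with a made-up robots.txt that blocks a single directory (the `example.com` URLs and the `/private/` path are hypothetical):

```python
from urllib import robotparser

# A hypothetical robots.txt that blocks one directory for every crawler.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Ask the parser the same question Googlebot effectively asks before crawling.
allowed_home = rp.can_fetch("Googlebot", "https://example.com/")
allowed_private = rp.can_fetch("Googlebot", "https://example.com/private/page")
print(allowed_home, allowed_private)  # True False
```

Note that a robots.txt disallow produces a “Blocked by robots.txt” report rather than a 401; this check is simply a quick way to rule robots.txt in or out as the culprit.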
When it tries to crawl a page, it naturally requires access to that page in order to crawl, render, index and rank its content. If it cannot access the page, it will report an error within Google Search Console. The Blocked Due to Unauthorised Request (401) error simply means that it has encountered a 401 status code, so it will not attempt to crawl the page any further.
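To make the mechanics concrete, here is a small self-contained Python sketch (it assumes nothing about your actual site) that spins up a local page requiring HTTP Basic authentication, then requests it without credentials, receiving the same 401 status that Googlebot would see:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ProtectedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.headers.get("Authorization") is None:
            # No credentials supplied: respond exactly as a
            # password-protected page would respond to Googlebot.
            self.send_response(401)
            self.send_header("WWW-Authenticate", 'Basic realm="Staging"')
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # keep the demo quiet

# Port 0 asks the OS for any free port.
server = HTTPServer(("127.0.0.1", 0), ProtectedHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/page"

status = None
try:
    urllib.request.urlopen(url)
except urllib.error.HTTPError as err:
    status = err.code  # the crawler's-eye view of the page

print(status)  # 401
server.shutdown()
```

A crawler with no way to supply the credentials is stopped at exactly this point, which is what Search Console then surfaces as the error.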
To find all of the pages on your website that have been impacted by this error, you need to head to the Page Indexing report within Google Search Console.
Log in to your Google Search Console dashboard, and then click on the “Pages” link within the “Indexing” section on the left-hand side toolbar:
Once here, you will see a screen looking very similar to this:
From here, scroll down on this page and you will be greeted with a list of all the reasons certain pages can’t be indexed. If your website has been receiving any of the Blocked Due to Unauthorised Request (401) errors, they will appear here:
Image Credit: https://www.onely.com/
You can then click on this entry and you will find all of your website’s pages that have been impacted. This list can then be exported and worked through one page at a time.
If you are finding issues with restricted page access on Google Search Console, various factors could be at play. The page you are trying to access might need authentication, and if Googlebot does not have the required credentials then it will not be able to access the page to crawl it. At times, marketers or webmasters might accidentally impose authentication requirements on pages that should be accessible to anyone, so it’s always a good idea to review your website’s configuration and CMS settings. Additionally, servers can be set up incorrectly, causing them to require authentication for content that shouldn’t be behind any “barriers” to access.
Problems can also stem from specific files and server set-ups. It’s very important to make sure that your robots.txt file is not mistakenly preventing Googlebot from accessing specific pages. Of course, the main role of a disallow directive is to stop bots from crawling certain parts of a site, but in some cases, mixed configurations might lead to a 401 error.
Sometimes, technology that is used to improve or secure a website can be its downfall. CDNs or firewalls, for example, might misinterpret requests from search engine crawlers as malicious and block them, so ensuring these are configured to accommodate search engine bots like Googlebot is crucial. Website plugins or extensions (particularly those centred around security) can also occasionally cause authentication issues. If you try a new plugin or extension and suddenly start getting these errors, it’s worth delving into the settings to see if this is the cause of the problem.
Luckily, there are certain things that you can do in order to stop these Blocked Due to Unauthorised Request (401) errors from happening on your website, and within your Google Search Console dashboard.
Below, we have outlined all the steps you should take. The first steps in the list are the ones you should try right off the bat; the further down the list you go, the closer you get to the “last resort” for fixing the issue.
One of the first things you should do in order to diagnose a potential issue with accessing your URLs is to head to the URL yourself, within a browser such as Google Chrome or Mozilla Firefox.
By doing this, you will be able to see for yourself whether there is an authentication prompt for all users, or whether it is just appearing for Googlebot.
Having this kind of hands-on approach will offer an immediate insight into the accessibility of your website’s content, and can help differentiate between widespread issues and issues that are only occurring for bots.
You can then work on fixing this. If the authentication is happening for everyone, then you can remove this (if applicable) and Googlebot should be able to access the page without issue.
Your website’s server logs are a treasure trove of information about how your website behaves for both users and bots. By taking a look at your server logs, you can find any 401 errors that have occurred and, more importantly, when they occurred and how often.
These logs will not only record any and all instances of these errors, but will also give you additional context about the circumstances under which they happened. This will help you identify the cause and find the right solution.
So if you are seeing a 401 error every time a certain page is accessed, for both users and bots, you know that there is probably some authentication requirement on that page that might need to be removed.
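As a sketch of this kind of log check, the snippet below scans a few hypothetical combined-format access-log lines (the IPs, paths and user agents are made up; real logs typically live somewhere like /var/log/nginx/access.log) and counts which paths are returning 401s:

```python
import re
from collections import Counter

# Hypothetical combined-format log lines standing in for a real access log.
log_lines = [
    '66.249.66.1 - - [08/Apr/2024:10:01:02 +0000] "GET /members/pricing HTTP/1.1" 401 381 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [08/Apr/2024:10:01:05 +0000] "GET /blog/post HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
    '203.0.113.9 - - [08/Apr/2024:10:02:11 +0000] "GET /members/pricing HTTP/1.1" 401 381 "-" "Mozilla/5.0"',
]

# Capture the request path and the three-digit status code.
pattern = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3})')

unauthorised = Counter(
    match.group(1)
    for line in log_lines
    if (match := pattern.search(line)) and match.group(2) == "401"
)
print(unauthorised)  # each path returning 401, with a count
```

Here the hypothetical /members/pricing path is returning 401 to both Googlebot and a regular browser, which points to an authentication requirement on the page itself rather than bot-specific blocking.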
Every single website will have a set of configurations that define its behaviour. These are set in a number of ways, like through server settings, .htaccess files, or a CMS such as WordPress.
If you find any mistakes in these settings, it is important to fix them quickly. Having proper, and up-to-date, configurations will help to avoid issues that can block users and search engines from properly accessing your site.
By combing through these configurations, you may find that your CMS has certain page blocks in place that stop people (whether that’s actual users, bots, or both) from accessing certain content. Removing these page blocks will allow users and bots to access your pages without issue.
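As a hypothetical example on an Apache server, a leftover .htaccess block like the one below would password-protect everything in its directory (the realm name and file path here are placeholders, not taken from any real site). If it was added for a staging phase and forgotten, removing it, or scoping it to genuinely private paths, resolves the 401s:

```apache
# .htaccess — this block requires a password for the whole directory.
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Equivalent accidental protection can also be configured at the server level (e.g. in a virtual host) or by a CMS security plugin, so check all three places.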
Google Search Console includes a “URL Inspection Tool” within its platform. This tool allows webmasters and marketers to see exactly how Googlebot views a specific URL on their site.
By using the URL Inspection Tool, you can see whether the 401 error persists for Googlebot, or whether any fixes you have put in place (like updating website configurations or removing password protection from the page) have resolved the problem.
This is an incredibly efficient way to validate whether a change has worked, or whether you need to look for an alternative solution.
Sometimes, trying to fix an error like this can become incredibly complex. If you find yourself out of your depth when dealing with technical website issues, don’t hesitate to speak to someone more qualified.
Finding the help of a web developer or your server’s administrator, especially one familiar with your website hosting environment, can speed up the process of fixing these errors on your site.
Having their expertise on-side can provide clarity, ensure optimal website configurations are used, and prevent any potential pitfalls.
In the ever-changing world of websites, understanding technical errors such as the Blocked Due to Unauthorised Request (401) error within Google Search Console is incredibly important. As we’ve discussed in this guide, this error can be a result of various issues, from server configurations to accidental password protection. Finding and resolving these issues ensures that your website remains visible and accessible to Googlebot. Whether you’re able to diagnose and fix these errors yourself or you need the help of a professional, addressing the error will ensure your content is always available online to whoever needs it!
Want to make better use of your Google Search Console data? SEOTesting offers a range of useful reports, designed to help you make better decisions from this data. And that’s on top of our testing functionalities! We’re currently running a 14-day free trial, with no credit card required. So sign up today for 14 days of free access to the tool.