How to Fix Blocked Due to Access Forbidden (403) Error in Google Search Console

Written by Ryan Jones. Updated on 16 April 2025

The Blocked Due to Access Forbidden (403) error in Google Search Console is common. It indicates that Googlebot tried to crawl a URL on your website but received a 403 Forbidden HTTP status code, which tells the crawler that it does not have permission to access the content.
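To see what this looks like from a crawler's point of view, the sketch below spins up a throwaway local web server that returns 403 to any request whose User-Agent contains "Googlebot". The server and rule are purely hypothetical, to simulate a misconfigured site:

```python
# Simulate a misconfigured site: a throwaway local server that
# returns 403 Forbidden whenever the User-Agent looks like Googlebot.
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class BlockGooglebot(BaseHTTPRequestHandler):
    def do_GET(self):
        if "Googlebot" in self.headers.get("User-Agent", ""):
            self.send_response(403)  # what Googlebot sees on affected URLs
        else:
            self.send_response(200)  # regular visitors are unaffected
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), BlockGooglebot)
threading.Thread(target=server.serve_forever, daemon=True).start()
base_url = f"http://127.0.0.1:{server.server_port}/"

def status_for(user_agent):
    req = urllib.request.Request(base_url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as err:
        return err.code

browser_status = status_for("Mozilla/5.0")
googlebot_status = status_for("Googlebot/2.1 (+http://www.google.com/bot.html)")
print(browser_status, googlebot_status)  # 200 403
server.shutdown()
```

This is also a useful manual check: request an affected URL with a Googlebot User-Agent string and compare the status code to a normal browser request.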

URL inspection showing page 'Blocked due to access forbidden.'

There are several reasons why this error occurs, but it usually comes down to rules on your site or server-level restrictions.

What Causes the Blocked Due to Access Forbidden (403) Error in Google Search Console?

Robots.txt Disallowance

One of the first things to check when you see crawl errors is your site’s robots.txt file. You might have rules that disallow Googlebot from accessing certain URLs on your site.

Note that a robots.txt disallow is reported in Google Search Console under the separate “Blocked by robots.txt” status rather than as a 403. Even so, it is worth ruling out: sections of a site that are locked down at the server level are often disallowed in robots.txt as well, and the two issues frequently appear together.
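You can check a rule set against Googlebot with Python’s standard-library robots.txt parser. The rules and URLs below are hypothetical, purely for illustration:

```python
# Check whether a robots.txt rule set blocks Googlebot from a URL,
# using Python's standard-library parser.
from urllib import robotparser

rules = """\
User-agent: Googlebot
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

blocked = rp.can_fetch("Googlebot", "https://example.com/private/page")
allowed = rp.can_fetch("Googlebot", "https://example.com/blog/post")
print(blocked, allowed)  # False True
```

If a URL listed in the report turns out to be disallowed here, the fix lives in robots.txt rather than in your server configuration.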

Server-Level Restrictions

This error might also be down to your server settings.

Your web server might be set to block certain IP addresses, usually for security reasons. But if your server is restricting any of the IP addresses that Googlebot crawls from, you will get the error.
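Google’s documented way to confirm that a request really comes from Googlebot is a reverse-DNS lookup followed by a forward-confirming lookup. Here is a minimal sketch in Python, assuming working DNS; the sample hostname in the comment is illustrative:

```python
# Verify a crawler IP the way Google documents it: reverse-DNS the IP,
# check the hostname belongs to googlebot.com or google.com, then
# forward-resolve the hostname and confirm it maps back to the same IP.
import socket

GOOGLE_DOMAINS = (".googlebot.com", ".google.com")

def hostname_is_google(hostname):
    # e.g. "crawl-66-249-66-1.googlebot.com" -> True
    return hostname.rstrip(".").endswith(GOOGLE_DOMAINS)

def is_verified_googlebot(ip):
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)           # reverse lookup
    except socket.herror:
        return False
    if not hostname_is_google(hostname):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward confirm
    except socket.gaierror:
        return False
    return ip in forward_ips
```

If `is_verified_googlebot` returns True for an IP address your firewall is rejecting, that block is the likely source of the 403s.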

Authentication Requirements

Some parts of your website may need login details.

Googlebot does not have these login credentials, so it cannot access URLs behind these walls. If this happens, you will see the Blocked Due to Access Forbidden (403) error.

Content Management System (CMS) Settings

Some CMS platforms may have built-in settings or plugins that block crawlers. If this happens with your CMS, you may find this is denying Googlebot access to your URLs. This may be causing the Blocked Due to Access Forbidden (403) error you are seeing in GSC.

Misconfigured .htaccess File

If your website runs on an Apache server, incorrect rules in your .htaccess file might be blocking Googlebot. If so, correcting those rules will resolve the error.
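As a hypothetical illustration (the environment-variable name is made up), an Apache 2.4 .htaccess fragment like this would produce exactly this error, because it denies any request whose User-Agent matches Googlebot:

```apache
# Hypothetical misconfiguration: denies requests whose User-Agent
# contains "Googlebot", so Googlebot receives 403 Forbidden.
SetEnvIfNoCase User-Agent "Googlebot" blocked_bot
<RequireAll>
    Require all granted
    Require not env blocked_bot
</RequireAll>
```

Removing the `Require not env blocked_bot` line (or the whole block) would restore Googlebot’s access.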

Bandwidth Limitations

Some hosting providers may block requests from Googlebot. This usually happens if you exceed bandwidth restrictions set in your plan.

This is likely to happen if you are using a shared hosting provider. It can also happen if you experience a sudden uptick in website traffic.

Geographic Restrictions

You may find that your server blocks requests from certain geographic locations. This can block Googlebot if it is crawling from one of those regions.

Ensure you are not blocking requests from a location Google is trying to crawl from.

Manual Block

Sometimes a simple mistake blocks Googlebot from viewing your website. For example, a webmaster or SEO professional may accidentally block Googlebot from accessing parts of the site.

If this happens, Googlebot will receive a 403 Forbidden error when trying to access your site’s content. And this is what leads to the Blocked Due to Access Forbidden (403) error in Google Search Console.

To fix this, you will need to decide whether these pages should be blocked from search engine bots or not.

How to Find Blocked Due to Access Forbidden (403) Errors in Google Search Console

Head to your Google Search Console dashboard. Click on the “Pages” link. Find this under the “Indexing” section on the left-hand toolbar.

Arrow pointing in Google Search Console to the Pages Indexing tab.

You will see your Page Indexing Report.

Google Search Console Pages Indexing report.

Scroll down on the Page Indexing Report to find the list of indexing issues. These are all the indexing issues Google Search Console has found for your website.

Note: This list will be different for every website.

The list will look like this:

List of reasons why pages aren't indexed.

Look for the “Blocked Due to Access Forbidden (403)” error within the list.

Blocked due to access forbidden indexing status in Google Search Console.

Click into the error and you will see a list of all affected URLs.

From here, you can decide if you need to fix these or not.

How to Fix the Blocked Due to Access Forbidden (403) Error

Fixing the Blocked Due to Access Forbidden (403) error is usually straightforward.

First, you need to find the reason Googlebot is being blocked from certain pages on your site.

Check your website’s:

  • .htaccess File
  • Server Configuration Files
  • Security Plugins
  • Firewalls

Look for rules that are blocking Google’s crawlers. You should also ensure that the affected URLs are not disallowed by your robots.txt file.

Once you have identified the cause, correct the configuration that is blocking access.

If it is a .htaccess or server configuration issue, adjust the rules to allow Googlebot to crawl. 

If it’s due to a plugin or firewall, update the settings within those tools to whitelist Google’s crawlers.

Always be cautious when making these changes. Maintain site security while also allowing search engines to crawl your site.

Once you have made any changes, you can inspect your URLs with GSC. This is to ensure Googlebot can crawl and index your URLs without issue.

Watch your Google Search Console dashboard for a few days to ensure that the 403 errors do not reappear.

If errors persist, you may need to conduct a more thorough review of your site’s settings, or consult your web development team or hosting support.

Should you Fix All Pages with an Access Forbidden (403) Error?

Whether you want to fix the Blocked Due to Access Forbidden (403) error is down to you. You should base it on the content and intended accessibility of those pages.

For pages that need to be public, fixing these errors is a must. You need Googlebot to crawl, index, and rank your pages.

If pages are being restricted by choice, the next steps are different. These pages may require special access or contain sensitive or private content. In this case, the 403 errors are serving their intended purpose: the pages are not supposed to be publicly available.

Additionally, these URLs should be disallowed in your website’s robots.txt file. This is to prevent search engines from attempting to crawl them.
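For example, a hypothetical members-only area could be kept out of crawl attempts with an entry like this (the path is illustrative):

```
# robots.txt — hypothetical entry for an intentionally restricted area
User-agent: *
Disallow: /members/
```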

Summing Up

The Blocked Due to Access Forbidden (403) error in Google Search Console is common. It can hinder your website’s SEO performance. This is because it prevents Googlebot from accessing and indexing crucial URLs.

Addressing this error involves a careful examination of various potential causes, from server configurations to CMS settings. Fixing it requires a precise approach: change the settings that are blocking Googlebot, and ensure the solution meets the dual needs of accessibility for search engines and security for the site.

Resolving these errors is critical for public-facing pages to ensure their visibility. But it is important to recognize when such blocks are intentional. Blocks for private and restricted content should remain intact.

Want to make better use of your Google Search Console data? SEOTesting archives your website’s performance data for longer than GSC’s default 16 months. It also has a range of useful reports to help you identify keyword cannibalization, quick-win keywords, and more. Sign up for our 14-day free trial today. No credit card required.