Written by Tiago Silva. Updated on 9 August 2023
After you publish a new piece of content, nothing is more satisfying than seeing it in the Google search results. But how can you accelerate the crawling and indexing process?
Recently we’ve seen reports on Twitter of people waiting weeks for new pages to appear in the search results. We have put this guide together to give you a checklist of things to do for every article you publish, so each one has the best chance of appearing quickly in Google.
The first step is to use the URL Inspection tool in Google Search Console and request indexing.
For new content, the easiest approach is to paste the page’s URL into the search box at the top of Google Search Console (GSC). Press Enter and you’ll see the URL Inspection results for that URL.
As this is a new URL for a new piece of content, the crawl and index status will display as ‘URL is not on Google’. Click Request Indexing to add the URL to the site’s priority crawl queue.
Check out our Google Search Console Chrome Extension which allows you to check a page’s URL Inspection status while you browse your site.
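If you prefer to check index status programmatically, the Search Console API exposes the same URL Inspection data. Below is a minimal Python sketch, assuming you already have an OAuth 2.0 access token with the webmasters scope; the token, site URL, and page URL are placeholders. Note that this API only reports status: requesting indexing still has to be done in the Search Console interface.

```python
# Minimal sketch: check a page's index status via the Search Console
# URL Inspection API. Assumes an OAuth 2.0 access token with the
# https://www.googleapis.com/auth/webmasters scope (placeholder below).
import requests

ACCESS_TOKEN = "ya29.placeholder-oauth-token"      # hypothetical token
SITE_URL = "https://www.example.com/"              # the property as verified in GSC
PAGE_URL = "https://www.example.com/new-article/"  # the page you just published

response = requests.post(
    "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"inspectionUrl": PAGE_URL, "siteUrl": SITE_URL},
)
response.raise_for_status()

status = response.json()["inspectionResult"]["indexStatusResult"]
# coverageState reads "URL is not on Google" for a brand-new, uncrawled page.
print(status.get("verdict"), "-", status.get("coverageState"))
```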
An easy way to get a page indexed by Google is to include it in the XML sitemap file.
XML sitemaps help Google crawl a website efficiently, as they list all the URLs you want crawled and indexed.
You should submit a site’s XML sitemap in Google Search Console. A single sitemap is limited to 50MB (uncompressed) and 50,000 URLs, but Search Console will happily let you submit multiple sitemaps with the URLs split between them.
For WordPress sites, there are lots of plugins that create XML sitemap files automatically and update them as new content gets released on the site. Yoast, Rank Math, and AIOSEO are popular options.
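If your platform doesn’t generate a sitemap for you, the file itself is simple to produce. Here is a rough Python sketch that writes a basic sitemap.xml from a list of page URLs; the URLs and output filename are placeholders for illustration, and you would still submit the resulting file in Search Console as described above.

```python
# Rough sketch: build a basic sitemap.xml from a list of page URLs.
# The URLs below are placeholders; swap in the pages you actually publish.
import xml.etree.ElementTree as ET

PAGE_URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/new-article/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in PAGE_URLS:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Produces <urlset><url><loc>...</loc></url>...</urlset> with an XML declaration,
# ready to upload to the site root and submit in Search Console.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Keep each file within the 50,000 URL and 50MB limits mentioned above, splitting across multiple sitemaps if needed.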
Adding links from existing pages that Google already crawls on the site will likely speed up the process of Google crawling your new page. This is called internal linking and is an excellent way to show that the new page you want to index is important for your site.
The process is straightforward: find existing pages on the site that are relevant to the new content, ideally ones Google already crawls regularly, and add links from them to the new page using descriptive anchor text.
Internal links are essential for SEO. This process requires little effort and can have a significant impact.
Ever wondered why most sites have a ‘latest blog posts’ or ‘articles’ section on their homepage?
A link from your site’s homepage signals to Google that your new page is important and should be crawled and indexed quickly.
This link can be temporary: once the page is in the search results, you can consider removing it, although the page’s rankings may drop. Links from the homepage usually pass a large amount of link equity, so removing the link removes that equity.
This is more anecdotal than a proven tactic, but we have seen instances ourselves where we shared a piece of content on Twitter and it was well received, liked, and retweeted. On the same day, that piece of content appeared in the Google search results, ranking on the first page.
Of course, this means that the piece of content you create needs to be useful enough for people to want to like and share it, so it’s not going to work for everything you publish on your site. I’m not sure your terms of service or privacy page will get many retweets!
If you are sharing your content on Twitter, paying to promote your tweets can help you gain likes and retweets, improving the chances that Google discovers and indexes the content via Twitter.
Desperate times call for desperate measures. If all of the above has failed and, after waiting a few weeks, your content is still not getting crawled, create a new XML sitemap file containing only the URL you are struggling to get crawled by Google and submit it in Search Console.
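As a sketch of that emergency step, the following Python snippet writes a sitemap containing just the one stubborn URL (a placeholder here); upload the file to your site and submit it in the Sitemaps report in Search Console.

```python
# Sketch: write a one-URL sitemap for the page that refuses to get crawled.
# The URL and output filename are placeholders for illustration.
from pathlib import Path

STUBBORN_URL = "https://www.example.com/blog/uncrawled-article/"

sitemap = f"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>{STUBBORN_URL}</loc>
  </url>
</urlset>
"""

# Upload the resulting file to the site and submit it in the Sitemaps report.
Path("sitemap-priority.xml").write_text(sitemap, encoding="utf-8")
```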
You should be doing most of the things mentioned above by default whenever you publish or update content. Requesting indexing in Search Console should happen every time content is published or updated, and internal linking is a general best practice that helps both Google and your visitors.
We’ve also covered a couple of extra options, such as creating a separate XML sitemap file just for the URL that isn’t getting crawled. There’s no need to do this for everything you publish or update, but it’s a useful extra option to have when you need it.