Striking distance keywords report using Google Search Console API and C#

Written by Nick Swan. Updated on 1 June 2023

Welcome back to the second article in this series.

This time we are going to look at how we can extend the code from our introduction to Google Search Console and C#. We want to filter the set of queries we bring back to those that appear on page 2 of Google. These are known as striking distance keywords.

This article makes up part of our Google Search Console tutorials and training section, so make sure to check the others out.


What are striking distance keywords?

Striking distance keywords are given the name because they are within ‘striking distance’ of page one of the search results.

There is a massive drop-off in click-through rate as your page sinks from page 1 to page 2. A lot of people just don’t click through to the second page of the results.

If you can do some ‘good old’ SEO and get these keywords onto page 1 of the search results, you should see an increase in click-through rate and clicks from Google.

The first job though is finding out which queries your site is lurking on page 2 for...

Using the Google Search Console API to export striking distance keywords

In our introduction article we went through how to set up Google Cloud Platform and write your first C# code. That code did a dump of the top 1,000 queries a site has ranked for over the past 16 months or so.

We want to make a few changes to our code so that:

  1. We narrow down the date range so we only look at query data for the past 7 days
  2. As well as each query, we get the URL that is being shown in the search results
  3. We get all the query rows available for our site
  4. We filter the data returned so we only see queries with an average position between 10 and 20 (page 2 of the results)

Change the date period


DateTime startDate = DateTime.Now.AddDays(-10);
DateTime endDate = DateTime.Now;

In our initial code we had a 500-day spread between the startDate and endDate. This is roughly 16 months, which is the maximum period of data available from Google Search Console.

With the striking distance keywords, we want to be looking at more relevant data. Keywords move up and down the search results, so positional data from 12 months ago wouldn’t be much use here. We want recent data to work with.

Finalised data from Google Search Console is usually at least 2 days behind. This is why we use a 10-day spread. It may not bring back exactly 7 days of data (sometimes more, sometimes less, depending on when GSC last updated), but it will be close enough to work with.

[You can query a specific date to check whether data is available for it, so you could find the most recent finalised day and count six more days back from there. But for the sake of simplicity in this article, we’ll just use a 10-day spread.]
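If you did want an exact 7-day window of finalised data, one way to find the most recent available day is to probe single dates using the "date" dimension, stepping back from today until the API returns rows. The snippet below is only an illustrative sketch; it assumes the service and searchConsoleUrl objects that are set up in the full listing at the end of this article.


// Illustrative sketch: step back from today until GSC returns data for a single day.
// Assumes the 'service' and 'searchConsoleUrl' variables from the full listing below.
DateTime latestAvailableDate = DateTime.Now.Date;

for (int daysBack = 0; daysBack <= 10; daysBack++)
{
    DateTime candidate = DateTime.Now.Date.AddDays(-daysBack);

    var probe = new SearchAnalyticsQueryRequest();
    probe.StartDate = candidate.ToString("yyyy-MM-dd");
    probe.EndDate = candidate.ToString("yyyy-MM-dd");
    probe.Dimensions = new List<string> { "date" };
    probe.RowLimit = 1;

    var probeResponse = service.Searchanalytics.Query(probe, searchConsoleUrl).Execute();

    if (probeResponse.Rows != null)
    {
        latestAvailableDate = candidate; // most recent day with data available
        break;
    }
}

// latestAvailableDate could then be used as the endDate, with
// startDate = latestAvailableDate.AddDays(-6) for exactly 7 days of data.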

Add page to the dimension list


List<string> dimensionList = new List<string>();
dimensionList.Add("query");
dimensionList.Add("page");

Now we want to request all the queries, and the page that is being displayed in the search results for the query.

Request the maximum number of rows


request.RowLimit = 25000;

If you don’t specify a value, GSC API will only return the top 1,000 rows. We want to request all the query data available so we set this to the maximum of 25,000 rows. We’ll also have to make multiple requests, changing the StartRow parameter each time, to make sure we get all the data we can.

Loop through and request the data


bool keepGoing = true;
int start = 0;
 
List<ApiDataRow> rows = new List<ApiDataRow>();
          
while(keepGoing)
{
    request.StartRow = start;
    var response = service.Searchanalytics.Query(request, searchConsoleUrl).Execute();
               
    if(response.Rows != null)
    {
        rows.AddRange(response.Rows);
        start = start + 25000;
    }
    else
    {
        keepGoing = false;
    }
}

We can get up to 25,000 rows of data from Search Console with each request. What we can do is make a request, and if it contains data, make further requests but increase the starting row by 25,000 each time.

If no rows come back from the current request, we know we have all the data and we can set keepGoing to false, and stop looping through.


StringBuilder sb = new StringBuilder();
 
sb.AppendLine("Query,Url,Clicks,Impressions,CTR,AvgPosition");
 
foreach(var row in rows)
{
    if(row.Position > 10 && row.Position < 20.1)
    {
        sb.Append(row.Keys[0]);
        sb.Append(",");
        sb.Append(row.Keys[1]);
        sb.Append(",");
        sb.Append(row.Clicks);
        sb.Append(",");
        sb.Append(row.Impressions);
        sb.Append(",");
        sb.Append(row.Ctr);
        sb.Append(",");
        sb.Append(row.Position);
        sb.AppendLine();
    }
}

Next, we loop through the collection of rows we retrieved from Search Console.

As we only want queries that appear on page 2 of the search results, we use an ‘if’ statement so that only the relevant rows of data are added.

Within the ‘if’ block of code, we append the data to a StringBuilder object, formatted so it can be opened as a CSV file, which is written out at the end of the program.
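One thing worth flagging: queries (and occasionally URLs) can contain commas, which would throw the columns out of line when the file is opened in a spreadsheet. A small escaping helper along these lines could be wrapped around each value before it is appended; the CsvEscape name here is just for illustration and is not part of the original listing.


// Illustrative helper: quote values that contain commas or quotes,
// doubling any embedded quotes, so the CSV columns stay aligned.
static string CsvEscape(string value)
{
    if (string.IsNullOrEmpty(value))
    {
        return "";
    }

    if (value.Contains(",") || value.Contains("\""))
    {
        return "\"" + value.Replace("\"", "\"\"") + "\"";
    }

    return value;
}

// Usage inside the loop: sb.Append(CsvEscape(row.Keys[0]));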

What to do with striking distance keywords

Once you have the data exported from Google Search Console via the API and open in your favourite spreadsheet, you can work through the queries and see which ones you want to try to get onto page 1.

This can be done by improving the content, adding sections related to the striking distance keyword, and/or internally linking from other articles on the site that use the striking distance keyword already in their content.

You should check whether the page you are planning to work on is already targeting a different, higher-volume or more important keyword. Optimizing for the striking distance keyword while de-optimizing for that target keyword would not be a good result!

Full code

Here's the full code listing so you can copy and paste it into your program.cs file in VS Code.


using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.Webmasters.v3;
using Google.Apis.Webmasters.v3.Data;

namespace console
{
    class Program
    {
        static void Main(string[] args)
        {
            var credentialsPath = "./service-key-filename.json";
            var searchConsoleUrl = "sc-domain:yourdomain.com";

            var stream = new FileStream(credentialsPath, FileMode.Open);

            var credentials = GoogleCredential.FromStream(stream);

            if (credentials.IsCreateScopedRequired)
            {
                credentials = credentials.CreateScoped(new string[] { WebmastersService.Scope.Webmasters });
            }

            var service = new WebmastersService(new BaseClientService.Initializer()
            {
                HttpClientInitializer = credentials,
                ApplicationName = "Console App"
            });

            DateTime startDate = DateTime.Now.AddDays(-10);
            DateTime endDate = DateTime.Now;

            // Request both the query and the page shown in the search results
            List<string> dimensionList = new List<string>();
            dimensionList.Add("query");
            dimensionList.Add("page");
                        
            var request = new SearchAnalyticsQueryRequest();

            request.StartDate = startDate.ToString("yyyy-MM-dd");
            request.EndDate = endDate.ToString("yyyy-MM-dd");
            request.Dimensions = dimensionList;
            request.RowLimit = 25000;

            bool keepGoing = true;
            int start = 0;

            // Collects the rows returned across all paged requests
            List<ApiDataRow> rows = new List<ApiDataRow>();
            
            while(keepGoing)
            {
                request.StartRow = start;
                var response = service.Searchanalytics.Query(request, searchConsoleUrl).Execute();
                
                if(response.Rows != null)
                {
                    rows.AddRange(response.Rows);
                    start = start + 25000;
                }
                else
                {
                    keepGoing = false;
                }
            }

            StringBuilder sb = new StringBuilder();

            sb.AppendLine("Query,Url,Clicks,Impressions,CTR,AvgPosition");

            foreach(var row in rows)
            {
                if(row.Position > 10 && row.Position < 20.1)
                {
                    sb.Append(row.Keys[0]);
                    sb.Append(",");
                    sb.Append(row.Keys[1]);
                    sb.Append(",");
                    sb.Append(row.Clicks);
                    sb.Append(",");
                    sb.Append(row.Impressions);
                    sb.Append(",");
                    sb.Append(row.Ctr);
                    sb.Append(",");
                    sb.Append(row.Position);
                    sb.AppendLine();
                }
            }
            

            File.WriteAllText("striking-distance-queries.csv", sb.ToString());
        }
    }
}
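Once the listing is in place, run the project with dotnet run from the project folder. A striking-distance-queries.csv file should be written to the working directory, ready to open in your spreadsheet of choice.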