Google Search Console. Our way of feeling as though we have control over Google.
Previously known as Webmaster Tools, Google Search Console (GSC) is a free service that helps you monitor and maintain your site’s presence in search engine results pages (SERPs).
Do you need it to rank in Google? No.
Should you use it? Definitely.
It allows website owners to help Google better understand their site, including:
- Indexing website URLs
- Solving any URL issues
- Submitting new content for quicker indexing
- Identifying which search queries drive traffic
- Setting URL parameters
- And much more…
If you haven’t already, here’s a nice guide on how to set up GSC.
Let’s dive in and look at 8 ways GSC can give you the insight to supercharge your SEO strategy.
1. Improving Meta Descriptions & Title Tags
How to find it:
Search Appearance > HTML Improvements:
This section highlights any cases of meta description and title tag issues, including:
- Duplicate meta descriptions: Meta descriptions that are identical across at least 2 URLs.
- Long meta descriptions: *2018 UPDATE*: Google are now permitting descriptions of around 230-300 characters (depending on your data source) as identified by SearchEngineLand here, but Google won’t give you an exact figure.
- Short meta descriptions: Under 70-100 characters. Again, Google doesn’t give us an exact figure for these.
- Missing Title Tags: URLs without any content in between the <title></title> elements in the HTML.
- Duplicate title tags: Title tags that are identical across at least 2 URLs
- Long title tags: Title tags that exceed 55-60 characters, but Google doesn’t give us an exact figure for these.
- Short title tags: Title tags that are under 30 characters. Again, Google doesn’t give us any indication as to the exact figure but research has shown it sits around this character limit.
- Non-informative title tags: These are usually title tags that have been auto-generated as placeholders, such as “Title” or “Untitled”.
- Non-indexable content issues: These are pages containing non-indexable content. This could be rich media files, video, or some imagery.
Clicking on the element that has the issues will reveal the URLs.
It’s always good to use your keyword research when creating any meta data. Although meta data is not a ranking factor, it does help increase your CTR, meaning it’s vital that all meta descriptions are sufficiently descriptive and within character limits.
Get creative with them.
It is also worth noting that Google may truncate titles or meta data, despite them seeming to fall within character limits. This may be because they exceed maximum pixel width.
It is always worth checking your new titles and meta data are appearing as you desire them to!
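As a sanity check before uploading new meta data, the character-limit rules above can be sketched in a short script. The thresholds below are the rough figures quoted above, not official Google limits, and Google truncates by pixel width rather than characters, so treat this as a first-pass filter only:

```python
# Rough length checks for title tags and meta descriptions.
# Thresholds are approximations from the guidance above, not official limits.
TITLE_MIN, TITLE_MAX = 30, 60
DESC_MIN, DESC_MAX = 70, 300

def check_meta(title, description):
    """Return a list of warnings for a page's title tag and meta description."""
    warnings = []
    if not title.strip():
        warnings.append("missing title")
    elif len(title) < TITLE_MIN:
        warnings.append("short title")
    elif len(title) > TITLE_MAX:
        warnings.append("long title")
    if len(description) < DESC_MIN:
        warnings.append("short description")
    elif len(description) > DESC_MAX:
        warnings.append("long description")
    return warnings
```

Run this across a crawl export of your titles and descriptions to spot problem pages in bulk, then eyeball the flagged ones in the SERPs.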
2. Identifying hidden keywords
How to find it:
Search Traffic > Search Analytics:
We all know that Google’s Keyword Planner can give us a range of target keywords to optimize our pages for, but GSC also shows us which keywords our pages generate the most clicks and impressions for.
Below you’ll find a range of keywords that, depending on your search criteria, have driven the most clicks, impressions and their average position.
This all depends on the date range selected; by default it covers the last 90 days.
Most of the top results will likely be brand related, and this will only grow as you scale your business and grow your marketing communication channels.
So how can we find opportunities here?
What we can do is look for instances where we drive a high number of impressions but a relatively low number of clicks. Working backwards from this data, you can see which queries drive a high number of impressions but a low CTR:
- Select Pages in the radio option provided
- Sort by impressions and identify which pages have a relatively high number of impressions but a low CTR
- Select the page you wish to analyze the keywords for and then select queries
- The below then shows the queries generating impressions and driving clicks to the target page
You can also filter out any brand queries using the filter option in step 3 under “queries”.
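The steps above can also be run as a small script over a Search Console CSV export. This is a sketch: the column names (`query`, `clicks`, `impressions`) and thresholds are assumptions, so adjust them to match your actual export:

```python
import csv

def load_export(path):
    """Load a Search Console 'Queries' CSV export.
    Column names ('query', 'clicks', 'impressions') are assumptions --
    rename them to match your actual export file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def high_impression_low_ctr(rows, min_impressions=1000, max_ctr=0.01):
    """Flag queries with many impressions but a low CTR -- candidates for
    title/meta split tests. Returns (query, impressions, ctr) tuples,
    highest impressions first."""
    flagged = []
    for row in rows:
        clicks = int(row["clicks"])
        impressions = int(row["impressions"])
        if impressions >= min_impressions and clicks / impressions <= max_ctr:
            flagged.append((row["query"], impressions, clicks / impressions))
    return sorted(flagged, key=lambda r: r[1], reverse=True)
```

You could extend the filter to drop brand queries by excluding any row whose query contains your brand name.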
With this information, assess which target keywords you wish for the target page to rank for and split test different title and meta copy to see if this has any effect on increasing your CTR.
It wouldn’t be uncommon to see that you have multiple pages competing for the same keywords. This can confuse Google as it struggles to understand which pages to rank for which target keywords.
Ensuring that you have optimized your on-page SEO for the target keyword and made good use of internal linking can help prevent this from happening.
You can also use these queries as a basis on which to form your content strategy and content creation. Identify which keywords you wish to target and either optimize existing pages for those keywords or build completely new pages with the goal of ranking that URL on page 1 for the target keyword.
A little bonus… use keywords to help increase PPC efficiencies
It’s a natural battle, PPC vs. Organic.
In one company I worked for both of these channels were managed in different teams. So when it came to reporting on our weekly data, the organic channel was, to an extent, at the mercy of the PPC spend.
The bigger the investment in one PPC campaign, the more paid search would monopolize the traffic away from organic.
One little tip that you can use though is using your ranked keywords as a guide for PPC.
Once you’ve done your keyword research and identified your target keywords, either through Search Console or a Keyword Planner, you may find that for a fair number of keywords you’re not ranking so well at this point.
The top query is a target keyword for this client, although the average rank is relatively poor:
If we plug this query into the Google Keyword Planner, we can see that it drives an average of 90,500 searches a month yet we are only gaining a small fraction of that:
This would be an ideal one to really build a strong PPC campaign for whilst working on a longer-term strategy for SEO.
We can also do the reverse and identify target keywords that we are ranking top spot for and potentially reduce the PPC budget invested into these campaigns:
This example shows a series of relatively low volume queries that we are ranking, on average, in position 1.
It’s always worth keeping an eye on fluctuations, but a PPC campaign built around these queries is arguably not as necessary as our previous example.
Of course, other considerations such as CPC and competition for the keyword will need to be considered before targeting the keyword for a PPC campaign, but this helps to guide which keywords should be considered as priorities.
It’s also worth noting the caveat to the Average Position provided in Search Console. The more popular a query is, the more pages it may begin to rank for (including non-target pages). As a result, this can drag the average position down, since pages ranking as low as position 100 are included in the average.
Just one thing to keep an eye out for.
3. Do your target pages perform better on a different device?
With Google’s long, drawn-out move to a mobile-first index, and Google themselves telling the world that ‘mobile-friendliness’ is a ranking signal, it’s key to ensure that your entire website is responsive.
It’s also a good idea to understand how users are viewing your content. For example, if your site is mobile responsive but built primarily for desktop users, the user experience on the site may not be as great as it could be.
So how can we see page performance for each device?
- Select an important marketing page and then click ‘Devices’:
- The below will then list the performance:
For this particular page we can see that, due to a higher average position of the keywords driving traffic to this page, we drive more impressions and clicks via a mobile device than both tablet and desktop.
With this information you can usually assume that the page is mobile friendly, hence Google’s willingness to rank it so well, but it’s always important to check:
- The user experience of the page on each device
- Whether key CTAs are in optimal positions
- The load speed of the page on multiple devices
Check out our awesome infographic on Conversion Rate Optimisation if you are a little unsure on how best to build a page for conversion.
If this is the first time you are reviewing this data, then it’s worth checking two time periods in Google Analytics for this:
- A check 12 months ago if your date range permits it (in Google Analytics)
- A check 3 months ago
This will allow you to see if there has been any major changes in device usage of your page.
Have a look at your key marketing pages that you wish to rank for in search. Is there more opportunity in a different device or is there work to be done on the most popular?
At the bare minimum, all your pages should be mobile responsive.
4. Reindex your updated pages
Once any updates to your pages have been made, whether on-page or off-page, we want to tell Google about them.
The Fetch tool allows us to see whether Googlebot can easily access the page, how the page renders and appears to Googlebot, and whether any elements are blocked.
You have up to 500 fetches a week.
It is also our way of giving Google a little prompt to reindex the page(s) that we have made changes to.
How to find it:
Crawl > Fetch as Google
How to fetch a URL
- Enter the URL after the root domain forward slash.
For example, if you wanted to crawl the URL: https://www.mywebsite.com/products/brand-x/ then you would enter products/brand-x/ into the URL box.
- Select either a Desktop or Smartphone crawl. Note: It’s best to run both for the same URL to ensure that you are not unintentionally blocking any JS or CSS required for responsiveness.
- Hit fetch to simply crawl the URL. Hit Fetch and Render to get a view of how Googlebot sees your page, it may be struggling to crawl elements that won’t be immediately obvious with just using ‘Fetch’.
- Hit Request Indexing and this will prompt Google to reindex the page.
(After you prove you are real, of course)
Check the status of your pages in Google’s SERPs across the next few weeks to see if your changes to any title and meta data have been updated.
You may even find yourself ranking slightly higher for your target keyword.
Google offer a full guide on fetching URLs here if you run into any problems.
Let’s take a look at the technical elements for those medium term wins.
Google now places an immense amount of emphasis on user experience, ensuring that the result it provides to the query is exactly the result the user requires to solve their problem.
5. Identify crawl errors
It’s not uncommon to find masses of technical debt from legacy URLs accumulated over the years, especially if the webmaster or development team didn’t know it existed.
It’s also not uncommon to see this happening after a CMS migration.
You can identify any URLs that Google had trouble crawling and subsequently reported as errors.
How to find it:
Crawl > Crawl Errors
Here, we can identify:
- Server errors: Response code 500 -> the server blocked Googlebot or the request for the page timed out.
- Soft 404: The page looks like an error page to the user but returns a 200 response instead of a 404.
- Access denied: Server blocking Googlebot access to the URL.
- Not Found: URL points to a 404 page.
- Other: Googlebot was unable to crawl the URL but the issue is undetermined.
The graph shows the error trend across the last 90 days.
It’s very normal to see Google finding errors within your site structure.
Google gives you the option to export all the errors to Excel. If you find that Google is showing errors, export the document and sit down with your developers to better understand why. Common causes include:
- Site URL structure changed when in development
- CMS migration left URLs broken
- Incorrect URL rules were set up.
Fixing crawl errors helps to improve user experience, a huge thumbs up in Google’s book.
Once you believe you have fixed the errors in question, hit ‘Mark As Fixed’ to clear them from Search Console.
If the errors still persist, Google will report them once they attempt to re-crawl them.
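Before marking errors as fixed, you can re-check the exported URLs yourself. This is a sketch assuming a plain list of URLs; the bucket names follow the Crawl Errors categories listed above:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_status(url, timeout=10):
    """Return the HTTP status code for a URL, or None if unreachable."""
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "crawl-check"})
        return urlopen(req, timeout=timeout).status
    except HTTPError as err:
        return err.code            # e.g. 404, 500
    except URLError:
        return None                # DNS failure, timeout, etc.

def classify(status):
    """Map a status code onto the Crawl Errors buckets described above."""
    if status is None:
        return "other"
    if status >= 500:
        return "server error"
    if status == 404:
        return "not found"
    if status in (401, 403):
        return "access denied"
    return "ok"
```

Loop `classify(check_status(url))` over your exported error URLs; anything still not returning “ok” isn’t ready to be marked as fixed.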
If the URLs you fixed were in your XML Sitemap, then it’s worth generating a new one and updating this with Google.
- Generate your XML Sitemap with Screaming Frog
- In Search Console, go to Crawl > Sitemaps
- Hit Add/Test Sitemap
- Submit the sitemap URL, which should be yourdomain.co.uk/sitemap.xml
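For reference, a minimal sitemap follows the sitemaps.org protocol; the URLs and date below are placeholders for your own site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.co.uk/products/brand-x/</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
</urlset>
```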
It’s also worth reviewing your robots.txt file to ensure that you aren’t unintentionally blocking any resources that may inhibit Googlebot from crawling pages effectively.
If you wish to view, edit and resubmit your robots.txt file, go to:
Crawl > robots.txt Tester
Hit Submit and select the step you wish to take:
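As a reminder of the format, a minimal robots.txt might look like this (the paths and sitemap URL are placeholders for your own site):

```
# Applies to all crawlers
User-agent: *
# Keep bots out of non-public sections
Disallow: /admin/
# Point crawlers at your sitemap
Sitemap: https://yourdomain.co.uk/sitemap.xml
```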
6. Manage your backlinks
One of the first things you should do when in search console is manage the links pointing towards your site.
The authority, relevance, anchor text and positioning of the links pointing towards your domain are a huge ranking factor in SEO, so it’s important to keep on top of them.
Search Console allows you to view a large quantity of the links pointing towards your domain, but not all.
How to find it:
Search Traffic > Links to Your Site
The summary presents you with three pieces of data:
- Who links the most
- Your most linked content
- How your data is linked
Who links the most
GSC will show you the top 1,000 domains linking to your site, sorted by the number of times each domain links to yours.
Links shows how many times the domain has linked back to you, and Linked pages shows the number of URLs on your site that they link to.
Click ‘Download latest links’ and this will provide you with a higher quantity of domains linking to your domain.
What can I do with this?
Depending on the quantity of backlinks, it’s worth installing Moz’s free Domain Authority checker and manually checking whether each backlink is suitable.
This can be done by assessing whether:
- The link is paid for
- The link is relevant to the page it is linking to.
- The URL or domain links out to a high volume of domains.
- The Domain Authority of the domain is far lower than your domain. Although, exceptions can be made in some cases. For example, a blogger may have been blogging for years but have a relatively low Domain Authority. The blog may still be relevant to your link and so would be fine.
- The link is littered with advertising.
- The link generally has a poor User Experience
Disavow the poor quality backlinks
Google gives you the opportunity to submit a list of domains pointing to your website that you wish to exclude. This is your way of asking Google not to consider these links when assessing the authority of your website.
To disavow a URL or domain:
- Download all links pointing to your website and identify the poor quality links you wish to disavow
- Paste these links into a text file, one link per line. If you wish to disavow an entire domain, add domain:[website address]. For example, domain:spamlink.com
- Upload disavow file to the disavow links tool page
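A disavow file is a plain text file with one entry per line; lines starting with # are comments. The domain below is a placeholder:

```
# Individual spammy URL
http://spamlink.com/bad-directory/link-page.html
# Entire low-quality domain
domain:spamlink.com
```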
It may take some time for Google to process all the information provided. You will see a message in your GSC dashboard once Google has acknowledged the backlink portfolio.
Google offer more guidance on this here
Your most linked content
- This looks at which content is most linked to on your domain. It can help provide an idea of which content is popular and which may need some promotion.
- A URL with few links pointing to it could be an indication that the content needs to be improved or further outreach put behind it, assuming this is a URL we want to rank for a target keyword.
How your data is linked – Optimize your anchor text
- This shows the anchor text that is used as a link back to your website.
- Anchor text remains a ranking signal with Google, which means there is an opportunity to optimize it.
Ahrefs recently did a study that identified Google’s sensitivity with regard to anchor text.
They concluded that exact match keyword anchor text should only make up around 2% of your total anchors and phrase match at around 30%. This leaves the rest to brand, website links or non-targeted anchors.
Search Console provides a list of anchor texts but unfortunately doesn’t show which links use the anchor text in question.
SEMRush offer an anchor text service allowing you to see which links are using which anchors.
This allows you to reach out to webmasters linking back to you with non-optimized anchor text and request that it be changed to an exact match or phrase match of the target keyword for that URL.
Just make sure your exact match anchors don’t exceed 2%!
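To check where you stand against those percentages, you can tally anchor texts from a backlink export. This is a sketch; the input is assumed to be a plain list of anchor-text strings, one per backlink:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Share of each anchor text across all backlinks.
    `anchors` is assumed to be a plain list of anchor-text strings,
    e.g. one per backlink, pulled from a backlink export."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = len(anchors)
    return {anchor: count / total for anchor, count in counts.items()}
```

Compare the share of your exact match target keyword against the ~2% guideline from the Ahrefs study above.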
Search Console offers a great snapshot of organic performance for keywords and URLs, but it also lets us guide Googlebot, making its crawling and indexing of our website more efficient and more effective.
Search console can help shape your first SEO strategy by:
- Improving meta descriptions and title tags.
- Identifying hidden keywords to guide content plans.
- Increasing PPC efficiencies.
- Assessing page performance by device.
- Reindexing updated pages for quicker results.
- Identifying crawl errors and technical issues with the domain.
- Managing backlinks.
- Optimizing anchor text.
Of course, it can do so much more as well.
The truth is, we’ve only touched the surface of what Search Console can offer, but this should give you a guide on how it can be integral to shape your first B2B SEO Strategy.
Get the full SEO Guide here