In the Lawyer’s Guide to Google Search Console (Part 1), we covered the Dashboard through the AMP reports. Now we’re going to tackle the rest of Search Console.
The Search Traffic section is where most of the exciting Search Console data lives.
First, we have the Search Analytics report. Here you can view your site’s performance in Google Search. This report includes data on:
- Clicks
- Impressions
- Click-through rate (CTR)
- Average position
You can also filter this data across a variety of dimensions including:
- Search Type
- Search Appearance
There are a couple of important points to understand about Search Analytics data. First, the data is completely separate from Google Analytics data. Put simply, it’s generated in a completely different way, and you will likely find significant discrepancies between the two data sets. Second, there are a lot of questions surrounding the reliability of Search Console data, particularly as it relates to Search Analytics.
Despite some of these issues, this data is particularly useful for identifying search queries for which there is significant volume. This can help inform content ideation and on-page optimization. At a minimum, this data is useful to track directional trends of pages’ average positions in Google.
Links to Your Site
The Links report contains three sections:
- Who links the most
- Your most linked content
- How your data is linked
This report contains external links that point to your pages from other sites around the web.
It’s important to keep in mind that this is not a comprehensive list of all the links Google has found that link to your pages. In fact, other link indices (see Majestic, Ahrefs, and Moz’s Linkscape) are much more comprehensive. Nonetheless, the Links report is useful to see some of what Google sees when it comes to links and anchor text that point to your site. As you probably know, links and anchors are taken into consideration by Google in deciding when / where to rank pages in their search results.
The Internal Links report lists internal links on your site. This is useful for understanding how Google views your site’s structure and page hierarchy.
Hopefully this section always says:
No manual webspam actions found.
The Manual Actions report will list situations where one of Google’s human reviewers has found that your site is in violation of Google’s webmaster quality guidelines. There are a bunch of different types of manual actions. Most of them relate to some form of spam activity. Some of the most severe manual actions can have serious consequences on your site’s visibility in search results. In fact, some actions can result in a complete de-indexing of your site!
If you experience a manual action penalty, you’ll want to resolve the violation as soon as possible and begin the long, arduous journey toward submitting a successful reconsideration request. You’ll probably also want to get some help from someone who has experience submitting reconsideration requests. Manual actions are among the most severe and difficult-to-remove issues you can face.
The International Targeting section allows you to target specific audiences according to language or country. This really only comes into play if your site is intended to display the relevant language and country version of your pages. Getting international targeting right can be a bit tricky. We regularly see really bad implementations on law firm websites intended to serve multilingual and multi-location audiences.
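The usual mechanism behind international targeting is the `hreflang` annotation. Here’s a minimal sketch for a hypothetical bilingual firm site (the domain and paths are placeholders, not a real implementation); note that every version of a page must reference all versions, including itself:

```html
<!-- Hypothetical hreflang annotations in the <head> of the English page.
     Each language/region version lists all versions, itself included. -->
<link rel="alternate" hreflang="en-us" href="https://example-lawfirm.com/practice-areas/" />
<link rel="alternate" hreflang="es-us" href="https://example-lawfirm.com/es/practice-areas/" />
<link rel="alternate" hreflang="x-default" href="https://example-lawfirm.com/practice-areas/" />
```

Missing return links between versions is one of the most common mistakes, and it’s exactly the kind of error the International Targeting report flags.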
The Mobile Usability report is also among the most important Search Console features. Google recognizes the importance of mobile usability, and issues relating to your site’s lack of mobile-friendliness can cause serious problems with how your pages appear in search results.
Hopefully, you’ve already implemented a responsive web design. If not, get familiar with responsive web design basics, and contact your developer to get this fixed yesterday.
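If you want a feel for what “responsive” means in practice, here’s a bare-bones sketch (class names and the breakpoint are illustrative assumptions, not a drop-in fix): the viewport meta tag plus a media query that stacks a two-column layout on narrow screens.

```html
<!-- Hypothetical minimal responsive setup. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { float: right; width: 30%; }
  /* On screens 600px wide or narrower, stack the sidebar full-width. */
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```

A real responsive design involves much more than one breakpoint, which is why this is a job for your developer rather than a copy-paste fix.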
The Google Index section provides insights on how your pages are being indexed by Google.
The Index Status report lists how many total pages Google has indexed. It also provides this data over time so that you can see whether Google is indexing new pages as you create them. The Advanced tab will also let you filter by blocked and removed pages.
Blocked resources are those that can’t be accessed by Googlebot. This can make it difficult for Google to properly render and index the pages. With some limited exceptions, I tend to recommend that webmasters allow Googlebot to crawl just about everything. Maximizing the amount of information Googlebot can understand about your pages typically helps maximize your pages’ visibility.
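Blocked resources are usually the result of overly aggressive robots.txt rules. As a rough sketch (the paths here are placeholders assuming a WordPress-style site, not a recommendation to copy verbatim), you can explicitly allow the directories that hold your CSS and JavaScript while still blocking genuinely private areas:

```
# Hypothetical robots.txt: let Googlebot fetch theme and plugin assets
# so pages render properly, while keeping a private area blocked.
User-agent: Googlebot
Allow: /wp-content/themes/
Allow: /wp-content/plugins/
Disallow: /private/
```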
Of course, there are some situations in which you don’t want pages to be crawled, indexed, and served in Google results. This is where the Remove URLs report can come in handy. Take note that this tool is only useful for temporary removals. For permanent content removal, you should remove the target page and follow the steps to permanently block and remove content.
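For pages that should stay out of the index permanently but remain on your site, one common approach is a `noindex` robots meta tag. A minimal sketch (note the page must remain crawlable, since Googlebot has to fetch the page to see the tag):

```html
<!-- Hypothetical example: placed in the <head> of a page that should
     not appear in Google's search results. -->
<meta name="robots" content="noindex">
```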
Next we move to Search Console’s crawl section. This section contains reports related to Google’s ability to crawl your pages. Proper crawling is critically important to indexation and serving of pages in search results.
The Crawl Errors report lists URL errors that Google encountered when crawling your pages. Common errors can include:
- Server errors
- Errors related to blocked access
- Not found or 404 errors
You should regularly monitor this report to see what errors Google finds. Don’t forget to review both the desktop and smartphone tabs. As you fix errors here, be sure to mark them as fixed.
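To make the error categories above concrete, here’s a rough sketch of how HTTP status codes map to the buckets in the Crawl Errors report. The mapping is illustrative and simplified, not Google’s exact classification logic.

```python
# Illustrative mapping from HTTP status codes to crawl-error categories.
# This is a simplification; Google's actual report logic is more nuanced.

def categorize_crawl_error(status_code: int) -> str:
    """Map an HTTP status code to a rough crawl-error category."""
    if status_code in (404, 410):
        return "not found"          # page is gone or never existed
    if status_code in (401, 403):
        return "access denied"      # Googlebot was blocked from the page
    if 500 <= status_code <= 599:
        return "server error"       # the server failed to respond properly
    return "other"


print(categorize_crawl_error(404))  # not found
print(categorize_crawl_error(503))  # server error
```

The practical takeaway: the category tells you where to look (your server, your access controls, or your internal links pointing at dead pages).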
The Crawl Stats report shows Googlebot activity for your site over the previous 90-day period. There are three metrics listed here:
- Pages crawled per day
- Kilobytes downloaded per day
- Time spent downloading a page (in milliseconds)
Generally speaking, you’re looking for large spikes and drops in crawl rate. Otherwise, for most law firm sites, there’s probably not a lot to worry about here.
Fetch as Google
The Fetch as Google report helps webmasters understand how Google “sees” your pages. Put simply, it simulates a crawl and render of your pages. This is particularly useful for identifying nasty issues that are preventing proper crawling. It’s useful to periodically run a fetch and render and click into the specifics of the code Google sees.
The robots.txt Tester helps you check for robots.txt errors that might exist. For example, your robots.txt file may be completely blocking Googlebot from crawling and indexing your site! Unfortunately, I’ve seen unscrupulous law firm web marketing companies edit a firm’s robots.txt file to block a site after they were fired by the firm!
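You can run a quick check in the spirit of the robots.txt Tester locally with Python’s standard-library parser. The robots.txt content below is a hypothetical example of the kind of “block everything” rule an unscrupulous vendor might sneak in:

```python
# Check whether a given robots.txt blocks Googlebot from a URL,
# using Python's standard-library robots.txt parser.
import urllib.robotparser

# Hypothetical robots.txt content: a blanket rule blocking the whole site.
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# "Disallow: /" under "User-agent: *" blocks Googlebot from every page.
print(parser.can_fetch("Googlebot", "https://example.com/practice-areas/"))  # False
```

If a check like this ever returns False for pages you want ranked, fix your robots.txt immediately.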
This is where you can submit your site’s Sitemap directly to Google. I’m a strong proponent of submitting Sitemaps. While Google’s ability to find and crawl pages has greatly improved, Sitemaps are still useful to provide Google a full picture of your site’s structure and organization. Making things easier for Google is likely to help your site’s visibility in search results. If you do submit Sitemaps, make sure they are error-free and only contain the pages you want Google to crawl and index.
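Sitemaps are just XML files following the sitemaps.org protocol, so they’re easy to generate programmatically. Here’s a minimal sketch using Python’s standard library; the URLs are placeholders, and a real Sitemap should list only canonical, indexable pages:

```python
# Generate a minimal XML sitemap per the sitemaps.org protocol,
# using only Python's standard library.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(urls):
    """Return a sitemap XML string containing a <url><loc> entry per URL."""
    ET.register_namespace("", NS)  # emit the default sitemap namespace
    urlset = ET.Element(f"{{{NS}}}urlset")
    for url in urls:
        url_el = ET.SubElement(urlset, f"{{{NS}}}url")
        loc = ET.SubElement(url_el, f"{{{NS}}}loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")


# Placeholder URLs for a hypothetical firm site.
sitemap = build_sitemap([
    "https://example-lawfirm.com/",
    "https://example-lawfirm.com/practice-areas/",
])
print(sitemap)
```

Most CMS platforms (and SEO plugins) will generate this for you, but understanding the format makes it easier to spot errors before you submit.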
Most of you probably won’t need to worry about the URL Parameters report. Making mistakes here can have serious consequences. For most sites, letting Googlebot decide how to handle parameters is the way to go.
The Security Issues report contains instances where Google has detected security issues with your site. Recently, we’ve seen an uptick in the number of hacked law firm websites. This is typically a result of:
- Failure to regularly update site files
- Using outdated or unsupported plugins or themes
- Unsecured hosting environments
Site hacks can wreak havoc on your site’s rankings in search results.
Finally, the Other Resources section contains links to:
- The Structured Data Testing Tool
- Structured Data Markup Helper
- Email Markup Tester
- Google My Business
- Google Merchant Center
- PageSpeed Insights
- Custom Search
- Google Domains
- Webmaster Academy
I’m not going to go through all of these here, but I will recommend that you explore Google My Business, PageSpeed Insights, and Webmaster Academy. Each of these plays a particularly important role in improving your pages’ visibility in various results.
Well, that’s Search Console in a nutshell. It’s free, relatively easy to implement, and provides very useful information about how Google sees your site. If you have questions about Search Console, don’t hesitate to ask them in the comments below.