CHAPTER FOUR
Your SEO Toolbox
Success in SEO is highly dependent on having the right tools. Before you can learn the tricks of the trade, you need to have the tools of the trade. The services and utilities covered in this chapter will enable you to analyze your site and identify technical or structural problems, discover the most cost-effective topics and keywords, compare your site’s performance to that of its top competitors, track incoming links, and measure visitor behavior. You’re probably already familiar with some of them, but we’re going to reintroduce them from an SEO perspective. These tools all have a variety of purposes and functions that are useful for SEO, but the common thread for most of them is their utility in keyword research, a topic that we cover in depth in Chapter 6.
The first and most important tool is a simple spreadsheet application to aggregate data from multiple sources, and to calculate the best opportunities. This is a requirement for maximum SEO productivity and efficiency.
Next, we’ll help you explore some options for technical SEO. It doesn’t make sense to optimize a site that isn’t being indexed properly. There are many technical utilities and site analysis features of larger SEO service packages that can help you solve technical site problems. Some are standalone (like Google Search Console), but every major SEO platform and most marketing services suites have their own site and page analysis features.
There are three data perspectives on website activity: server-side, client-side (visitor-side), and search-side. Since your company or client controls the web server, it makes sense to start this chapter by first analyzing the data you already have. Next, we’ll explain how you can supplement that data with extra visitor context from on-page JavaScript trackers. Finally, we’ll introduce you to SEO platforms that provide search data for the keywords you’re targeting, the current search rank for every indexed page on your site, and other valuable features that will help you optimize your site and increase search traffic.
Some of the tools we cover in this chapter are free, but most require a license fee, paid subscription plan, or SaaS contract. Paid tools tend to charge on a per-user, per-site (or property), or per-client basis. While you don’t have to make any decisions about any of these tools right now, in order to follow many of the processes and examples throughout the rest of this book, you must have at least one SEO platform subscription and a web analytics service or tag manager deployed on your site, and you must set up and configure Google Search Console.
Spreadsheets
Our dream careers rarely align well with reality. Real archaeologists spend a lot of their days on their knees in the hot sun with a toothbrush and a garden shovel, not dodging ancient booby traps with a whip and a revolver à la Indiana Jones. Real lawyers spend much of their time on administrative tasks such as billing, collections, office management, case law research, and reading and filing formal documents, not winning clever courtroom battles with hostile witnesses or delivering dramatic closing arguments before a jury. And a professional SEO often spends more billable time working in a spreadsheet than a web browser or code editor. Hopefully that doesn’t throw too much water on your fire. This is still a fun and fascinating industry!
SEO practitioners rely heavily on spreadsheets, and most commonly that means Microsoft Excel, but you can use any modern equivalent. Regardless of which spreadsheet app you use, you must be proficient enough with it to create and maintain proper keyword plans for your company or clients. Specifically, you must be comfortable working with data tables, basic formulas, filters, and pivot tables. If you have a few knowledge gaps in these areas or you don’t feel confident in your spreadsheet skills, then you should invest in training, or at least be prepared to use the Help menu and Google to figure out how to use these advanced features to filter, sort, and calculate your keyword lists.
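To give a flavor of the kind of merge, formula, filter, and sort work involved, here is a small illustrative sketch in Python with pandas rather than a spreadsheet. The filenames, column names, and the crude scoring column are invented for the example; they are not the keyword plan methodology from Chapter 6.

import pandas as pd

# Hypothetical exports: one file with search volume, one with CPC data.
volume = pd.read_csv("keyword_volume.csv")   # columns: keyword, monthly_searches
cpc = pd.read_csv("keyword_cpc.csv")         # columns: keyword, cpc

# Join the two sources on the keyword column (like a VLOOKUP in a spreadsheet).
plan = volume.merge(cpc, on="keyword", how="left")

# A crude placeholder score for sorting; your own valuation will differ.
plan["est_value"] = plan["monthly_searches"] * plan["cpc"].fillna(0)

# Filter and sort, then save the result for use in your keyword plan.
plan = plan[plan["monthly_searches"] >= 100].sort_values("est_value", ascending=False)
plan.to_csv("keyword_plan.csv", index=False)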
This topic is covered in more detail in Chapter 6, where we walk you through the process of creating a keyword plan spreadsheet.
Traffic Analysis and Telemetry
In order to analyze data, first you must collect it. There are two paradigms for visitor data collection: raw web server logs that record all incoming traffic from the internet, and JavaScript trackers (also known as tags) that are embedded in the source code of every page on your site. Each has its advantages and disadvantages, and to get a holistic perspective on traffic and user behavior it’s often preferable to incorporate both web server log input and JavaScript tracking. However, what type of solution will work best for you will depend on your individual feature requirements, and what’s already deployed (or required due to vendor contracts) at your company.
Before you proceed, be aware that there is almost certainly already some kind of web analytics package (or perhaps several) deployed somewhere at your company. It isn’t unusual for a web-based business to use a variety of separate analytics tools to measure different metrics or supply data to different services or utilities. You should begin by taking stock of what’s already deployed (and paid for) before making any decisions on analytics tools. As a consultant, there will also be times when you’re stuck using a client’s preferred vendor or solution, so you’ll have to learn to work within those boundaries.
If nothing is currently deployed, then free analytics packages such as Google Analytics and Open Web Analytics are an excellent starting point. Even if a free service doesn’t ultimately meet your needs, at the very least you can use it as a basis for comparing against paid alternatives.
Whether you’re evaluating an existing solution or a new tool, make note of the features that you find valuable and any gaps in functionality that a competing program might be able to cover. Then look for ways to modify or extend the services you’re using, or for a higher-end solution that covers those gaps. This is a long journey, not a one-time event. As you gain more experience in SEO, you’ll continue to develop your requirements and preferences, and will likely end up with a few different go-to options for different scenarios.
Be wary of services (especially free ones) that want you to upload customer data or server logfiles; those service providers may collect your data for other purposes, and this would probably represent a privacy and/or security violation at your company. JavaScript trackers may also share or collect data about your traffic and visitors, and though this is less invasive and dangerous, it still may violate privacy laws or your internal IT policies. (Legal and privacy issues are covered in more detail in Chapter 13.)
Google Search Console
Google Search Console is a free service that provides a lot of technical site information that Google Analytics lacks. With it, you can test your site for indexing, view inbound search query data (keywords, impressions, click-through rate, rank), generate and test XML sitemaps, test mobile compatibility, analyze page performance, and measure the performance of structured data elements that generate SERP features such as enriched results and OneBox answers.
Google Search Console should almost always be the first service you configure for an initial site audit, because it offers a quick way to identify low-level problems. Its keyword research utility is limited to SERP impressions your site is already receiving, but the data on existing search traffic is useful.
Server-Side Log Analysis
Web server software outputs a constant stream of text that describes all of the activity it is handling and stores it in a file somewhere on the server. A typical server log is a plain-text list of HTTP requests. Here’s an example of a line that you might see in an Apache web server log:
127.0.0.1 [05/Nov/2022:21:43:06 -0700] "GET /requested_page.html HTTP/1.1" 200 1585 "https://www.example.com/referring_page.html" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:36.0) Gecko/20100101 Firefox/36.0"
From left to right, here’s what that data represents:
- The IP address of the machine (computer, mobile device, or server) that made the request
- The time, date, and time zone of the request (relative to the web server)
- The HTTP request method (typically GET or POST) and the resource being requested (in this example, it’s a web page named requested_page.html)
- The HTTP status code (200 represents a successful request)
- The size of the response, in bytes (the amount of data returned to the client, such as the size of the file being requested)
- The full URL of the page that referred to this resource (sometimes there isn’t a referrer, such as when someone directly types or pastes a URL into a browser; referrers are only shown if the request came from an HTTP resource, such as when a user or crawler follows a link on a web page)
- The user agent string, which identifies the browser or crawler name and version (Firefox, like most modern browsers, reports itself as Mozilla for historical compatibility reasons), the platform (X11 refers to the graphical environment, Ubuntu is the Linux distribution, and x86_64 refers to an Intel or AMD 64-bit processor), the Gecko release version (rv:36.0, which matches the Firefox version number), and the rendering engine token (Gecko is Firefox’s native rendering engine, and 20100101 is a fixed build date identifier used in Firefox user agent strings)
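If you want to pull a line like this apart yourself, a short regular expression will do it. The sketch below (Python, standard library only) assumes the simplified format shown above; a real Apache combined log may include additional fields, so treat the pattern as illustrative rather than universal.

import re

LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) '
    r'\[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) (?P<protocol>[^"]+)" '
    r'(?P<status>\d{3}) (?P<size>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

line = ('127.0.0.1 [05/Nov/2022:21:43:06 -0700] '
        '"GET /requested_page.html HTTP/1.1" 200 1585 '
        '"https://www.example.com/referring_page.html" '
        '"Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:36.0) Gecko/20100101 Firefox/36.0"')

match = LOG_PATTERN.match(line)
if match:
    fields = match.groupdict()
    print(fields["path"], fields["status"], fields["user_agent"])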
NOTE
To learn more about the elements of an HTTP request and how to adjust the verbosity and format of a logfile, consult your server software documentation.
Everything about these logfiles is configurable, from the level of detail and verbosity to the location of the file and the conditions under which one logfile should close and another one should begin. The web server log is the oldest tool in the box. When it’s properly configured and its data is filtered and aggregated for human consumption, it can be useful for a variety of SEO purposes, such as:
- Determining how often search engines are crawling the site and each of its pages (and which pages they aren’t crawling at all)
- Determining how much time is spent crawling low-value pages (ideally, the search engines would spend this time crawling more important pages on your site)
- Identifying pages that redirect using a means other than a 301 redirect
- Identifying chains of redirects
- Identifying pages on your site that return status codes other than “200 OK”
- Backlink discovery
- Finding missing pages/bad links
- Measuring site performance
- Determining visitors’ platforms (device, operating system, and browser version)
- Determining visitors’ locales
Web server logs can also be merged with other data sources to provide insights on conversion rates from paid and organic campaigns, server optimization, and URL canonicalization for duplicate content, among other things.
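To make the first item on that list concrete, here is a minimal sketch that counts how many requests per day come from clients claiming to be Googlebot, straight from a raw access log. The logfile path is an assumption, and because it matches on the user agent string alone, impostor bots (discussed later in this chapter) will be counted too.

import re
from collections import Counter

crawl_days = Counter()

with open("access.log") as log:            # assumed logfile location
    for line in log:
        if "Googlebot" not in line:        # matches the claimed user agent only
            continue
        # Pull the date portion (e.g., 05/Nov/2022) out of the timestamp field.
        m = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        if m:
            crawl_days[m.group(1)] += 1

for day, hits in sorted(crawl_days.items()):
    print(day, hits)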
NOTE
Some web hosting providers may restrict or deny access to raw server logs and configuration files. Web server logs are useful for technical SEO, so if the hosting company won’t give you the data you need, you may want to switch to a more SEO-friendly provider.
The raw logfile from a web server, while human readable, isn’t human comprehensible. You’ll need a third-party analysis tool to cleanse, combine, sort, and slice this transactional information into facts and dimensions that actually mean something to you. There are so many different tools and methods for logfile analysis that describing them could take up a whole (rather boring and repetitive) chapter of this book, so in this section we’ll just mention a few.
First, as always, check to see if there’s already something deployed, and whether it will meet your needs. Keep in mind that a server log is nothing more than a plain-text transactional data source. Aside from dedicated web logfile analysis tools, many companies have business analytics packages that can use server logs to produce useful reports and dashboards. A business analyst within the organization can potentially work with you to pull web log data into it and configure the output to meet your needs. Also, check to see if the company is already using an enterprise services suite (such as Atlassian, Salesforce, IBM, or Oracle) that has an optional web log analysis component. Even if the company is not currently paying for that component, you’ll have a much easier time convincing management to expand the budget for an existing solution than to buy into a completely new one.
The following are some logfile analysis tools that we’ve used and would recommend:
- Splunk
Splunk bills itself as a “data to everything platform,” meaning it can be configured to use any logfile or database as input and produce any kind of report, chart, or file as output. Therefore, Splunk isn’t functionally different from most other business intelligence or data analytics solutions. It may be a more affordable alternative to larger analytics suites, though, because Splunk is more of an engine than a service, meaning you have to develop your own solution with it rather than just copying and pasting some code or clicking a few buttons in a web interface as with Google Analytics.
Splunk is an extremely high-end option, and it may be overkill if you’re only analyzing relatively small transactional web logfiles. However, it’s popular enough that it may already be deployed elsewhere within a large organization, which would make it cheaper and easier for you to adopt it for your SEO project.
- BigQuery
BigQuery is Google’s platform for analyzing data at scale; it can run queries over petabytes of data. Google bills it as “a serverless, cost-effective and multi-cloud data warehouse designed to help you turn big data into valuable business insights.” If you’re dealing with large volumes of data, then this platform may be your best bet (see the sketch at the end of this section for one way to query web logs in BigQuery).
- Datadog
Whereas Splunk and BigQuery are analysis tools, Datadog is more of a real-time monitoring service. It can also take input from several different logfiles, but its output is geared more toward real-time results. This is a good solution for measuring the efficacy of time-limited campaigns, short-term promotions, and multivariate testing efforts.
- Screaming Frog SEO Log File Analyser
Screaming Frog’s SEO Log File Analyser may be the perfect standalone tool for web logfile analysis for SEO projects. It’s inexpensive and its only purpose is SEO-oriented web server analytics, so you aren’t paying for features and functions that have nothing to do with your work.
If you don’t want to buy into a long-term agreement with a service provider but need solid server-side analytics, Screaming Frog should be your first consideration. The free version has all the same features as the paid version, but it’s limited to importing only one thousand log lines, so it’s more of an “evaluation edition” than a fully featured free edition—you’ll be able to see if it has the features you need, but with such a big limitation on the amount of input, it isn’t viable in production the way other free solutions are (such as Google Analytics and Open Web Analytics).
- Oncrawl
Oncrawl is a full-service SEO platform that has particularly good SEO-oriented site analysis tools which combine crawler-based performance metrics with logfile data. You don’t have to subscribe to the full suite—you can just pay for the logfile analyzer—but its other SEO components are worth considering.
- Botify
Botify is a large-scale technical SEO analysis service that can use multiple local and external data sources to provide insights and metrics on all of your digital assets. As with Oncrawl, logfile analysis is just part of its larger toolset, and it’s worth your while to evaluate this tool in a larger technical SEO context.
- Sitebulb
Sitebulb is a website crawler that focuses on delivering actionable data insights for SEOs. It’s been built to crawl sites of up to tens of millions of pages, and also offers a business model with no project limits and does not charge extra for rendering JavaScript.
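As promised above, here is a hedged sketch of querying access logs in BigQuery once they have been loaded into a table. The project, dataset, table, and column names are hypothetical, and the code uses the official google-cloud-bigquery Python client; loading the logs into the table in the first place is a separate step not shown here.

from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses your configured Google Cloud credentials

# Hypothetical table my_project.weblogs.access with columns
# request_time (TIMESTAMP), path (STRING), status (INT64), user_agent (STRING).
query = """
    SELECT path, status, COUNT(*) AS hits
    FROM `my_project.weblogs.access`
    WHERE status != 200
    GROUP BY path, status
    ORDER BY hits DESC
    LIMIT 50
"""

for row in client.query(query).result():
    print(row.path, row.status, row.hits)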
JavaScript Trackers
Web server log stats are a valuable SEO asset, but they can only show you what the server can record. From this perspective, it can sometimes be challenging to tell the difference between a real visitor and a bot using a false user agent string, and you’ll have little or no ability to analyze user activity across multiple devices and sessions. While most search crawlers identify themselves as bots via unique user agents such as Googlebot, Bingbot, DuckDuckBot, and archive.org_bot, anyone can write a simple program to scrape web pages and use whatever user agent they like, and your server will record whatever they report themselves as. A badly behaved bot can circumvent a site’s robots.txt restrictions and pollute your server log with bad data, not to mention overloading your server and network. Most of the time these bots aren’t malicious and it’s just somebody clumsily trying to understand what’s on your site. To be safe, you can block them with .htaccess directives or a service like IPBlock.com.
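Google documents a way to confirm whether a request really came from Googlebot rather than an impostor copying its user agent string: do a reverse DNS lookup on the requesting IP, check that the hostname ends in googlebot.com or google.com, and then do a forward lookup to confirm it resolves back to the same IP. Here is a minimal sketch of that check using only Python’s standard library; production use would want caching and broader error handling.

import socket

def is_real_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the hostname, then forward-confirm it."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)             # reverse DNS lookup
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]     # forward confirmation
    except OSError:
        return False

print(is_real_googlebot("66.249.66.1"))   # an address in a published Googlebot range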
Both Google and Bing execute JavaScript in order to be able to fully render the web pages they crawl, so it would be natural to expect that they would be trackable by JavaScript trackers. However, both Googlebot and Bingbot recognize the JavaScript of most popular JavaScript trackers and skip executing them to save on resources. Most other bots don’t execute any JavaScript at all.
JavaScript trackers also offer more in-depth visitor information such as time on site, session recording or behavior analytics data (the exact mouse clicks or screen taps a visitor makes when navigating a site), and demographics (if visitor data is being stored in a cookie that your analytics package has access to). However, many people use virtual private networks (VPNs), browser plug-ins, and other privacy countermeasures that prevent JavaScript trackers from working properly (though data on those users will still show up in your server logs). Assuming that your trackers are not blocked, they’ll provide you with a highly detailed view of a subset of the true visitor data. Over time, this method will become less effective as privacy and security awareness increase. For that reason, many modern solutions attempt to show you the whole picture by pulling in external data from server logs or other tracking services and integrating it with what’s been collected via JavaScript.
Google Marketing Platform/Google Analytics
Google Marketing Platform is Google’s suite of web analytics products, which encompasses the following components:
- Google Analytics
A web traffic and visitor behavior data collection and analysis tool. Google Analytics is the key component in this platform because it’s the primary data collector; you can use it on its own without involving other aspects of the Google Marketing Platform. The rest of this section has more details.
- Looker Studio (formerly known as Data Studio)
A basic data analysis tool that enables you to combine data from several sources and create reports and dashboards to visualize certain metrics and goals. This is not nearly as powerful as a dedicated business intelligence solution such as Cognos, Qlik, or Oracle, but you may be able to make it work for your project.
- Tag Manager
A framework for deploying multiple JavaScript trackers using one lightweight tag. Refer to “Tag Managers” for more details.
- Surveys
An easy-to-use utility that helps you create and deploy visitor surveys on your site.
Everything in Google Marketing Platform is free, but all data recorded through these services will be collected and used by Google. You can also upgrade to the paid enterprise service level, which enables you to keep your data private, adds new services that help you track marketing campaigns and manage digital media assets (Campaign Manager 360, Display & Video 360, and Search Ads 360), and includes “360” branded editions of the free services listed above, such as Analytics 360 and Tag Manager 360. The paid tier offers more integrations with other services, more filtering and funneling options, and higher limits on web property views and data caps, among other perks and upgrades.
In this chapter we’re only covering Google Analytics and Google Tag Manager. The rest of the Google Marketing Platform services may be useful to you for other marketing purposes, but they either have little to do with SEO or are inferior to the equivalent components in proper SEO platforms and business analytics suites.
As shown in Figure 4-1, Google Analytics is active on over 72% of websites (the data shown in this figure predates the replacement of Google Analytics with Google Analytics 4 [GA4], which took place on July 1, 2023). The basic version of GA4 is free of charge, easy to deploy, and offers a deep view of visitor data.
If you’re an in-house SEO and only manage one site, and you don’t have any security or privacy concerns with Google retaining the data it collects from your Analytics deployment (Google has access to much of this data anyway via its Googlebot crawler, cookie trackers, Chrome users, and visitors who are logged into a Google account during their browsing session), then the free version might be the perfect solution for you. If you have some concerns with Google’s data collection practices, or if you’re a freelance SEO and have multiple web properties to monitor, then the upgrade to Analytics 360 is worth evaluating.
As with all individual pieces of larger service suites, when you’re using other Google products like Google Ads, Google Search Console, or Google Ad Manager, the unique integrations that they have with Google Analytics may prove to be a major advantage over other analytics tools.
Obtaining keyword-specific data
Google Analytics doesn’t reveal the keywords that people search for when they click through to your pages from a SERP. To obtain keyword insights, you’ll have to use a third-party tool to add that missing information. Popular options include:
- Keyword Hero
This tool was specifically designed to provide data about on-site user behavior per keyword in Google Analytics. It shows you how users respond to each landing page per search query, as well as to your website at large for each keyword. You get everything from behavioral metrics to performance metrics such as conversions and revenue per keyword.
- Keyword Tool
Keyword Tool seeks to fill in the data gaps in the Google Ads Keyword Planner, which makes it a good secondary resource for optimizing Google Ads campaigns. Because it incorporates data from Google’s autocomplete feature, Keyword Tool is also particularly good for local keyword research.
NOTE
Some SEO platforms (covered later in this chapter) can provide missing keyword data, too: most notably Semrush, Ahrefs, and Searchmetrics.
- Kissmetrics
Traffic analysis is all Kissmetrics does, so you may want to give it extra consideration if you prefer to build your SEO toolset with interchangeable standalone services, or if you’re unhappy with the web analytics capabilities of a larger marketing suite that is already deployed at the company and are looking for a one-off replacement.
There are two different Kissmetrics analytics products: one for SaaS sites, and one specialized for ecommerce sites. Both services attempt to identify individual visitors and build a profile for them that encompasses all of their actions during all of their visits across all of their devices and browsers. This has obvious SEO advantages, but it could also be useful as a data source for expanding the information in your customer relationship management (CRM) database.
- Adobe Analytics
Adobe offers a comprehensive suite of online marketing tools in its Adobe Experience Cloud. Adobe Analytics is used by many large enterprises, either separately or as a part of Adobe Experience Cloud, due to the scalability of the platform.
As a standalone solution, Adobe Analytics has interesting features that most of its competitors don’t. These include access to a wider array of potential data sources beyond what is collected via its JavaScript tag, and an AI component that can predict future traffic levels based on patterns and anomalies in past visitor data. It is also highly flexible and customizable, and it integrates with other data platforms. According to Builtwith.com, Adobe Analytics is used on over 200,000 websites.
- Open Web Analytics
Open Web Analytics is an open source web analytics package written in PHP. It isn’t hosted externally; you have to deploy it on your web server and configure it yourself (if you’re not in a small business, your system administrator should be able to do this for you). The advantages are that you control your data and your deployment in-house, you can pare it down to just the data and visualizations that you need, and you have the unique ability to track via both PHP and JavaScript, which allows you to collect some data about people who use browser plug-ins or VPNs to block trackers, or have JavaScript disabled.
When properly deployed, Open Web Analytics looks and acts a lot like Google Analytics without all of its extra marketing metrics and Google product integrations. It’s an excellent bare-bones web analytics package with no bloat, no monthly fees (though if you use it in production, you should give back to the project via a donation), and no limit on how many sites you can deploy it to. The downside is that someone at your company must spend time and effort to deploy and maintain it.
Tag Managers
Your company or client may have several different JavaScript trackers deployed on the same site, for various reasons. Multiple trackers will potentially interfere with one another, and the calls these tags make to external servers can slow page load times. Ultimately, it’s better to reduce the number of tags per page, but in a large company that may not be possible due to interdepartmental politics or budgeting constraints. Tag management services combine several different JavaScript trackers into one short tag, which reduces page load times, bandwidth usage, and the effort of managing multiple tags on a large site. Examples include:
- Google Tag Manager
This is an extremely popular free tag management solution. The only downside is that it may not support some less frequently used JavaScript trackers, and you’ll need to use custom tags for unsupported trackers. However, that’s only a deal-breaker if you aren’t savvy enough with JavaScript (or don’t have access to someone who is) to create and maintain that custom code.
Google Tag Manager is undeniably the first solution you should explore, not just because it’s free, but because it might be the simplest option.
- Tealium
Tealium is a commercial tag manager that supports substantially more tracker code templates than Google Tag Manager, which makes it easier for nondevelopers to deploy. However, it may incorrectly render your pages for visitors who block all trackers, and a broken page can cost you their business; ultimately it’s the conversions that matter, not the visitor stats.
Search Engine Tools and Features
We’ve already covered a few Google services that are designed specifically for website optimization, but if you approach many of Google’s other properties from a creative angle, you’ll find a lot of hidden SEO value—particularly for keyword discovery and valuation. In this section we’ll review several search engine tools and features that can play an important role in your SEO program.
Autocomplete
Google uses algorithms to predict the rest of your query when you start typing in the search field and shows you a list of top-ranked predictions below it. This is an excellent way to see the most popular keywords for your topics. For example, typing in lilac might reveal suggestions like those shown in Figure 4-2.
Google won’t tell you how many times lilac sugar cookies has been searched for, but because it appears at the top of the list of suggestions, you can infer that it was probably searched for more often than the phrases that appear below it. This can give you important insight into what searchers are looking for, or what they search for in relation to a specific topic.
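If you want to gather these predictions programmatically instead of typing queries one at a time, the suggestion endpoint that powers autocomplete has historically been reachable over plain HTTP. It is unofficial and undocumented, so treat the URL, parameters, and response format below as assumptions that can change or be rate limited at any time; this is only a rough sketch.

import requests

def autocomplete(query: str) -> list[str]:
    # Unofficial suggestion endpoint; behavior is not guaranteed by Google.
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": query},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()[1]   # the second element holds the predicted queries

print(autocomplete("lilac"))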
NOTE
Autocomplete predictions are strongly influenced by the user’s location (e.g., wet n wild phoenix might be a prediction that shows up when a user in Phoenix, AZ, types the letter w into the Google search box). If you’re searching from a mobile device, you might also see apps in the list of suggestions.
Google Ads Keyword Planner
The Google Ads Keyword Planner can analyze your site content and deliver a list of relevant keywords that are likely to be converted to clicks in a Google Ads campaign. It’s free to use, but you have to have a Google account (which is also free) in order to log in. It has excellent documentation and a walkthrough for new users.
Regardless of whether your company invests in Google advertising, you can use the Keyword Planner to get good suggestions for related terms, search volume estimates, search trends, and ad cost estimates for any keyword or URL that you enter. You could use these numbers to calculate keyword popularity, keyword difficulty, and cost per click (CPC, covered in Chapter 6). Unfortunately, unless you are running a paid ad campaign, the search volume will be approximated in wide ranges, so the data won’t be precise enough for you to use for anything other than quick, basic keyword valuation. You can work around this and get better data by setting up a small, low-budget campaign; or, if you know someone who is spending money on Google Ads campaigns, you can ask them to add you as an authorized user to their account.
The CPC data is much more precise than the other metrics and can be useful for gauging keyword difficulty for both paid listings and (in a general way) organic searches. You can get more exact estimates by selecting specific budgets or costs per click, and you can forecast the traffic impact and conversion rate based on data from the past two weeks.
The Google Ads Keyword Planner is most useful for projects that will include advertising, but it can also serve as a free (or low-cost) option for organic keyword research. Due to its limitations, though, this is no replacement for search data from any of the major SEO platforms (covered later in this chapter).
Google Trends
Google Trends enables you to view the relative popularity of a keyword over time and by geography. You can also compare trend data between two or more keywords. There are two datasets you can use: realtime (search data from the past 7 days, up to the past hour), and non-realtime (search data spanning the entire Google archive, starting in 2004 and ending about 36 hours ago). Both datasets are sampled (a random, representative sample is taken from the complete source, similar to a poll), normalized according to time and locale, and indexed to the time span (on the line graph, the 0 point represents the lowest search volume over that time span, and the 100 point represents peak search volume).
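If the indexing part of that description is hard to picture, here is a toy illustration of scaling a series so its lowest point becomes 0 and its peak becomes 100, which is the relative view the Trends line graph gives you. Google’s real pipeline also samples and normalizes by total query volume and locale, so this is a simplification, not its actual algorithm.

def index_to_range(values):
    """Scale a list of raw counts so the minimum becomes 0 and the maximum 100."""
    low, high = min(values), max(values)
    return [round(100 * (v - low) / (high - low)) for v in values]

weekly_searches = [1200, 1500, 900, 4400, 2100]   # made-up raw volumes
print(index_to_range(weekly_searches))            # [9, 17, 0, 100, 34]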
Google Trends is useful for identifying and predicting spikes in search volume for your topics, and for finding potentially related topics and queries. By drilling down into spikes in the trend graph, you can see the events and news stories that contributed to them, which can give you some good ideas for site content and media outreach.
There’s more you can do with Google Trends, including comparing relative volumes of different keywords over time (what’s bigger, car repair or auto repair?), competitive brand strength research, and seeing trends isolated down to any country in the world. You can read more about this in Chapter 6.
Google Trends is free to access (with a free Google account) and easy to use, so there’s no harm or risk in looking up some of your topics or keywords to see if there’s any useful information on them. Only popular search phrases are sampled, though, so it’s unlikely to give you insights about obscure long-tail keywords.
Google News
Google News (like other searchable news aggregators) enables you to gauge the media activity around a topic and the recent and real-time popularity (and therefore competitiveness/difficulty) of keywords. You can also search for mentions of your brand or company name (in quotes if it’s a phrase). Results are sortable by time, all the way down to stories from the past hour, which makes Google News an excellent resource for extremely down-to-the-minute current keywords. The data from most other keyword research tools is at least a day old.
Search Operators
The site: operator is one of the most used operators for researching information on a site. It’s a fast and easy way to see what pages Google has indexed for a website. Note, however, that Google normally limits the number of pages it will show (the limit changes from time to time, but in the past it has been 300 or 400 pages). An example of these results for mit.edu is shown in Figure 4-3.
The pages shown by Google are usually the ones that it sees as being more important; however, as there is a limit to the number of pages shown, the absence of pages in this list does not mean that they are of poor quality.
The before: and after: query operators limit search results to those that were added to the index before or after a given date (in YYYY, YYYY-MM, or YYYY-MM-DD format). If you only specify a year, the rest of the date is assumed to be 01-01 (January 1). You can use both operators in the same query to limit results to a specific window of time. For instance, if you wanted to see what people were saying about the Blizzard Entertainment game Overwatch 2 between its release date (October 4, 2022) and a date roughly six months later (April 24, 2023), you could use this query:
Blizzard overwatch 2 before:2023-04-24 after:2022-10-04
You can also accomplish this via the Tools menu on the Google SERP (by selecting “Any time,” then selecting “Custom range”). That’s fine for a single query, but if you’re going to look up several keywords or use several different dates, it’s quicker to use the search operators instead.
SEO Platforms
SEO platforms offer incredibly valuable tools for a wide array of SEO projects. Though their exact feature sets vary, the best ones can analyze your site for optimization gaps, suggest topics for new content, compare your site against your competitors, and find backlinks. Perhaps more importantly, they also harvest search data from Google, cleanse it of bot traffic, and connect it to in-depth keyword exploration features that can help you calculate the best opportunities.
These services are not free. Expect to pay anywhere between $100 and $500 per month for the good ones, depending on the level of service you need. It’s usually free or inexpensive to do a limited trial, though, and the customer support is excellent (for the platforms we use and recommend, anyway). If search traffic is important to you, then this is money well spent. A professional SEO may even subscribe to several platforms, because they each do something a little better (or just differently in a way that you prefer) than the others, and certain projects may benefit from a particular niche tool or feature.
To do a proper valuation of your keyword list (more on this in Chapter 6), you must have a reliable and current data source for, at the very least, monthly search traffic and CPC, though ideally you’d have keyword difficulty and current rank data from an SEO platform as well. We recommend any of the services discussed here for this purpose (and more—they’re much more than just keyword research tools!), but there are several others on the market that might work better for your industry or region, and new ones may appear after this book has gone to press.
Semrush
Semrush is more than an SEO platform; it also provides tools for market research, social media management, content creation, and ad optimization. Within its SEO package, there are tools for keyword research, page optimization, local search optimization, rank tracking, backlink building, and competitive analysis. All of these services will be helpful for your SEO work, but we really want to highlight Semrush’s superior keyword research capabilities.
If you only pay for one keyword research tool, Semrush should be a top candidate. It’s not that the others are bad (some are actually better for other purposes), but Semrush’s biggest strength is its keyword database: at over 22 billion keywords (as of this printing), to our knowledge it’s the largest and most comprehensive in the industry. The database is built by scraping Google SERPs for the top 500 million most popular keywords, then analyzing the sites in the top 100 positions for each. The only potential shortcoming of this huge database is that it’s only updated about once per month. That’s fine for most SEO purposes (except for researching emerging trends), though, and it’s on par with the update cycles of most other SEO platforms.
Semrush has a variety of keyword research tools, including:
- Keyword Overview
Produces a comprehensive report showing a keyword’s popularity, difficulty, CPC, trends, and related keywords and questions. You can analyze up to 1,000 keywords at a time by copying and pasting them from your spreadsheet or text file into the Bulk Analysis field.
- Organic Research
Provides a report on the keywords used by the top 100 sites that relate to or compete with yours.
- Keyword Magic
Based on the topic or keyword you give it, finds every related relevant keyword in the Semrush database. You can also specify broad match, phrase match, or exact match modes to tune how many keywords it provides.
- Keyword Gap
Analyzes your competitor sites and identifies the best opportunities for keyword targeting and page optimization.
- Keyword Manager
Performs a real-time analysis of how up to 1,000 of your keywords perform on SERPs and with competitors.
- Organic Traffic Insights
Provides insights related to the keywords driving your site traffic data.
The other Semrush features either speak for themselves or should be evaluated on an individual basis. In general, this platform is well-documented and easy to use, but Semrush’s customer support is among the best in the industry if you end up needing help.
Ahrefs
Ahrefs includes clickstream data from a wide variety of sources beyond Google, such as Yandex, Baidu, Amazon, and Bing. It doesn’t offer the big-picture marketing services beyond SEO like Semrush does, but its SEO toolset is feature equivalent. Ahrefs provides the following tools:
- Site Audit
This tool provides a comprehensive SEO analysis of your site, using over 100 optimization criteria. The report shows technical issues with HTML, CSS, and JavaScript; inbound and outbound link impact; performance; and content quality analysis.
- Site Explorer
You can use this tool to get a report on your competitors’ sites, including the keywords they rank for, their ad campaigns, and a backlink analysis.
- Keywords Explorer
Enter up to 10,000 keywords to get search volume and other valuable data. This tool provides a large number of filtering options, including a “topic” column.
- Content Explorer
You provide a topic, and Content Explorer shows you an analysis of the top-performing articles and social media posts related to it. Excellent for finding backlink opportunities.
- Rank Tracker
Monitors your site’s performance relative to your competitors. You can have updated reports delivered to you via email every week.
Searchmetrics
Like Semrush, Searchmetrics is a larger marketing services and consulting company that sells a powerful SEO suite. Searchmetrics was acquired by Conductor in February 2023. There are four tools in the Searchmetrics Suite:
- Research Cloud
A domain-level market research tool that identifies valuable topics and keywords, analyzes competing sites, and reveals content gaps.
- Content Experience
Provides data that helps you write effective search-optimized content for a given topic, including the best keywords to use, seasonal considerations, searcher intent, and competitive analysis.
- Search Experience
Provides performance monitoring and gap analysis for your site and your competitors’ sites in organic search. Whereas Research Cloud is centered on your site, Search Experience is centered on the search journey that leads to it. This service has a much wider global reach than most others in this space.
- Site Experience
Produces a technical audit of your site that reveals potential problems with search indexing, broken links, orphaned pages, and mobile responsiveness.
Searchmetrics Content Experience is outstanding for topic research. One feature that really stands out is the Topic Explorer. You provide a topic, and Topic Explorer creates a report containing search data and statistics, and a color-coded interactive mind map that shows how it performs relative to other semantically related topics in terms of search volume, rank, seasonality, search intent, sales funnel, and level of competitiveness. You can drill down into any of those topics to get a more refined view of the keywords within them.
Moz Pro
Moz Pro offers SEO tools for keyword research, rank tracking, site auditing, on-page optimization, and backlink building. Its main advantages are the Page Authority and Domain Authority scores, which help you find optimal backlink opportunities. The backlink research tool, Link Explorer, states that it has data on over 47 trillion backlinks.
For keyword research, the Moz Keyword Explorer is an excellent resource, especially for natural language questions. It also has one of the best keyword difficulty scoring systems in the industry.
Moz Pro’s site auditing tool can be set up to perform a full site audit and identify a wide range of issues that may be hampering your SEO. In addition, you can set up alerts that will check your site and proactively let you know when problems are discovered. Moz also offers a Page Optimization Score to help identify issues with your pages. This offers content optimization suggestions that enable you to improve the ability of your pages to rank.
Rank Ranger
Rank Ranger (acquired by SimilarWeb in May 2022) is another marketing services suite that includes excellent SEO tools for keyword research, rank tracking, site auditing, and more. Its biggest selling point is its high degree of customizability. Most SEO platforms have a few static presets for charts and graphs; Rank Ranger enables you to build your own. What we want to highlight in particular, though, is the superior natural language question research capabilities of the Rank Ranger Keyword Finder. If mobile search is a higher priority for your site than desktop search, and you only want to pay for one SEO platform, Rank Ranger should be the first one you evaluate.
Other Platforms
As an SEO consultant, you may be asked to use the keyword research functions that are included in a marketing services suite that your client has already paid for or is already familiar with. The following are some other good SEO platforms we’ve worked with in this capacity that we want to mention:
Again, they do most of the same things, and you’ll likely find that you prefer one over the others for specific purposes. Of course, there are many more SEO platforms and keyword research services than the ones we’ve listed in this chapter. These are just the ones we’ve used successfully and are comfortable recommending. If you want to use a platform that isn’t covered in this book, some important questions to ask before subscribing to it are:
- Where is the data coming from, and how often is it updated?
- Does the data apply to the region or locale that my site is being marketed to?
- Can I import my whole keyword list via copy and paste or by uploading a CSV?
- Can I export keyword data to a CSV (or XLS) file?
- Does it offer metrics like monthly search volume, keyword difficulty (or keyword competition), CPC, and rank?
- What are its unique reporting capabilities?
Another useful tool is SerpApi. This is a programmatic interface to Google and other search engines that enables you to run automated queries and returns SERP data in JSON format. You can plug that data into a dashboard or reporting engine, or convert it to CSV or XLS and work with it in a spreadsheet. This is similar to what most SEO platforms do to scrape search data, except SerpApi offers many more data points, customization options, and access to a wider array of search engines.
If you only need access to search data, and you have a web developer or Python guru available to help, then SerpApi is a cheaper alternative to a more comprehensive SEO platform.
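Here is a short sketch of what a SerpApi call can look like from Python, hitting its HTTP endpoint with the requests library and flattening the organic results into a CSV. The parameter names reflect SerpApi’s documented usage as we understand it, but check the current API documentation before relying on them; the API key is a placeholder.

import csv
import requests

params = {
    "engine": "google",          # which search engine to query
    "q": "lilac sugar cookies",  # the keyword to look up
    "api_key": "YOUR_SERPAPI_KEY",
}
data = requests.get("https://serpapi.com/search", params=params, timeout=30).json()

# Flatten the organic results into a CSV for spreadsheet work.
with open("serp_results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["position", "title", "link"])
    for result in data.get("organic_results", []):
        writer.writerow([result.get("position"), result.get("title"), result.get("link")])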
Automation
Typically, retrieving keyword rank data from an SEO platform is a manual process: you provide a list of keywords, then filter and sort the data, then export it to a CSV or XLS file. From there, you’d import or copy/paste the data into a spreadsheet or analytics engine.
Some Google Sheets extensions exist that pull data from Google services like Google Search Console and Google Analytics. One of our favorites is Search Analytics for Sheets.
Some SEO platforms also offer access to their data via an application programming interface (API), which enables you to script part or all of the export process. If your preferred platform or data provider has an API, expect to pay extra for an API key and a certain number of monthly API usage units.
Some services allow you to pull just about any data from them via APIs, not just keyword reports. You can use this data in custom or third-party dashboards or business analytics packages and CMSs, or you can write a quick Python script to fetch your new keyword data CSV every month.
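As an example of this kind of scripted export, the sketch below pulls a month of query data from the Google Search Console Search Analytics API and writes it to a CSV using the official google-api-python-client. It assumes you have created a service account with read access to your verified property and downloaded its JSON key; the property URL, dates, and filenames are placeholders.

import csv
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Assumes a service account JSON key with read access to the property.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",   # your verified property
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query"],
        "rowLimit": 1000,
    },
).execute()

with open("gsc_keywords.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["query", "clicks", "impressions", "ctr", "position"])
    for row in response.get("rows", []):
        writer.writerow([row["keys"][0], row["clicks"], row["impressions"],
                         row["ctr"], row["position"]])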
If you have a very large (by your own standards) keyword dataset, or if you maintain separate keyword lists for subdomains, content directories, or product lines, then you may find value in using AI to do the categorization and sorting for you. Specifically, we want to call out BigML as an easy-to-use platform for building machine learning models for sorting large keyword datasets.
Note that generative AI tools such as ChatGPT can be used to perform specific functions here too. One example is to have these tools help you classify a list of keywords based on the nature of their intent (transactional, informational, navigational), or group them based on semantic relevance.
You can read about more applications for ChatGPT in executing SEO tasks in Chapter 2.
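As a rough, purely local stand-in for the kind of grouping those tools perform, you can cluster a keyword list by lexical similarity with scikit-learn. TF-IDF over word n-grams captures shared terms rather than true meaning, so this is only an approximation of semantic grouping, but it shows the shape of the workflow; the keywords and cluster count are invented for the example.

from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

keywords = [
    "buy running shoes", "best running shoes 2024", "running shoe sale",
    "how to train for a marathon", "marathon training plan",
    "trail running tips", "what are the best trail running shoes",
]

# Vectorize on word n-grams so phrases sharing terms end up near each other.
vectors = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(keywords)

# Cluster into a handful of groups; pick n_clusters to suit your list.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

for cluster in sorted(set(labels)):
    group = [kw for kw, lab in zip(keywords, labels) if lab == cluster]
    print(f"group {cluster}: {group}")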
YouTube Optimization
YouTube is the world’s second most popular website (Google being #1), and it has so much content that it’s nearly impossible to navigate without its built-in search function. Unfortunately, unlike Google’s video vertical search feature, it only returns results for videos on YouTube, so its benefit to SEO is at best indirect (this is covered in more detail in Chapters 11 and 12).
That said, if you’re optimizing for YouTube search, vidIQ is one resource that can help: it’s focused on gaining YouTube views and subscribers. Another excellent YouTube resource to consider is TubeBuddy, which provides useful features such as a keyword explorer and A/B testing of video titles and thumbnails.
NOTE
Some SEO platforms (such as Ahrefs and Rank Ranger) also have YouTube keyword research and rank-tracking capabilities.
Conclusion
In this chapter we covered the SEO tools that we like to use (or in some cases are forced to use because of budget limitations or client preferences), but there are hundreds of other options out there that you may prefer for a variety of reasons. Perhaps there are tools that are better suited to non-English languages, or plug-ins for enterprise service or analytics suites that you’ll have to learn to use because they’re already deployed and paid for. You may also find that your preferred toolset for one project may not be as good a fit for a completely different project.
Beyond that, search technology, the web in general, and the field of SEO are all constantly evolving. New needs and concerns will arise as old ones fade away; new utilities and services will continue to enter the market, and some will merge with others or disappear entirely. SEO platforms are very competitive and will add new features and capabilities over time, so even if you’re an in-house SEO with only one site to manage, it’s still a good idea to re-evaluate your SEO tools now and then.
Regardless of which platforms, services, and utilities you select, the most important thing is that you have the right tools to do SEO your way.