Whether you’re a novice or an expert in the field, when performing a site audit you need to start with the basics to identify any glaring issues that require immediate attention. In this article, I’m not going to dive into keyword research. While keyword research is an important step in evaluating your site’s SEO, it requires a longer explanation to do properly. If you want to dive into keyword research, one of our SEO experts, Joe Robledo, covers the steps of a low-hanging fruit keyword audit here.

For the steps of the SEO site audit I’ll be covering, you’ll need to use a few tools and platforms. Here are the ones I recommend for this audit:

  • Google Analytics
  • GAchecker.com
  • Google’s Mobile-Friendly Test
  • Screaming Frog
  • GTmetrix
  • Ahrefs

In the sections below, I’ll break down what to look for, the tools you’ll need for each check, and the potential issues each one can uncover.

Elements of an SEO Audit

  • Analytics
  • Indexation
  • On-Page Elements
  • Technical Aspects

Analytics

To fully understand the success of your efforts, you must have accurate data. There are many issues that come from incorrect data or lack of data. I’ll be speaking specifically about Google Analytics in this section. The first step is to review the goals that are set up within your view. Ask yourself, are these goals valuable? Do they represent my team’s KPIs? Does this information help us track revenue and ROI?

After you’ve ensured your goals are tracking the exact data you’re after, open the Filters tab within your reporting view and review your filters. If it’s not already there, add your company’s IP address as an exclude filter on your view. You don’t want employee visits to your webpages skewing your data.

[Screenshot: IP exclusion filter settings in Google Analytics]

Once you’ve properly filtered your IP, go back to your home tab and view your traffic channels, located under the ‘All Traffic’ dropdown in your ‘Acquisition’ tab. How is traffic distributed across your channels? If your Direct, (Other), or (not set) channels each account for more than 5% of traffic, there’s a possibility of a tracking error somewhere on your website. One common culprit is a login page on your website where current users haven’t been excluded with a filter.

[Screenshot: Acquisition > All Traffic > Channels report in Google Analytics]

Next, check your organic traffic over the past few years. Are there any major dips in traffic? If there’s anything substantial, your site may have been hit with a penalty in the past. Gather as much information as you can about any past penalties and determine how they were resolved.

The last step to verify that all traffic is being recorded properly is to run your site through GAchecker.com. This tool shows you which pages across your website are missing your UA tracking code. Make sure every page you want tracked appears in the list of pages that have the tracking code.
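
If GAchecker flags a page, view its source and look for the Analytics snippet. As a reference point, the standard gtag.js snippet looks roughly like the example below; the UA-XXXXXXXX-X property ID is a placeholder for your own.

    <!-- Global site tag (gtag.js) - Google Analytics -->
    <script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXXX-X"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', 'UA-XXXXXXXX-X');
    </script>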

Indexation

Once you’ve verified that Google Analytics is set up correctly, you’ll need to verify that your site is being indexed in Google. The first items to check are your robots.txt file and sitemap. The robots.txt file can usually be found by typing yourdomain.com/robots.txt. Your robots.txt file should look something like this:

[Screenshot: example robots.txt file]

Robots.txt File

Your robots.txt file tells Google’s bots which parts of your site they should and shouldn’t crawl. Any pages that you don’t want showing up in the SERPs should be added as a Disallow rule. Such pages include login pages, internal assets, or any page that doesn’t need to be indexed by Google. In the robots.txt file, you also want your sitemap listed so Google’s bots can easily locate and crawl it.
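
For reference, a bare-bones robots.txt file might look like the sketch below. The disallowed paths and sitemap URL are placeholders; swap in your own.

    User-agent: *
    Disallow: /login/
    Disallow: /internal-assets/

    Sitemap: https://www.example.com/sitemap.xml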

Sitemap

What is a sitemap? A sitemap is a file that lists the pages of your site you want Google’s bots to crawl and index. Even without this file, Google can still crawl and index your site’s pages, but with a sitemap you make your pages easier to crawl and improve your chances of sending Google to the right pages.
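
Here’s a minimal sketch of what an XML sitemap contains; the URLs and date are placeholders for your own pages.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2019-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/blog/seo-site-audit/</loc>
      </url>
    </urlset>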

Duplicate Content

Another indexation problem that I’ve seen across a majority of sites is duplicate content. Duplicate content can happen in multiple ways and may cause Google to drop your keyword rankings because it’s not sure which page should rank for certain keywords. The first duplication issue to check is the URLs themselves. Are there variations of the same page that return a 200 status code instead of redirecting to one page? Examples of potential duplicate URLs are below.

  • https://www.example.com/
  • http://www.example.com/
  • https://example.com/
  • http://example.com/
  • https://example.com
  • https://Example.com/

Each of these URLs loads the same page, yet they show up as different addresses to Google. This could be happening across your website, and there are two ways to fix the issue. The first solution is to set up site-wide redirects that send any variation of a URL to one specific destination. If your site is ‘HTTPS’ without ‘www’ and contains a trailing slash, then set up all variations to redirect to https://yourdomain.com/.
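
If your site runs on Apache and you can edit its .htaccess file, the rules below are a rough sketch of that site-wide redirect; example.com is a placeholder, and you’d flip the pattern around if your preferred version uses ‘www’.

    RewriteEngine On

    # Send any HTTP request to the HTTPS, non-www version
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]

    # Strip 'www' from any remaining HTTPS requests
    RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
    RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]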

If you’re unable to set up 301 redirects due to the platform you’re using, or if you need to keep those pages, then set up a rel=canonical tag. The rel=canonical tag signals to Google that a given page is a duplicate and that the canonical URL it points to is the main version to index.
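
The tag is a single line in the <head> of each duplicate page, pointing at your preferred URL (example.com is a placeholder here):

    <link rel="canonical" href="https://example.com/" />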

JavaScript

Another problem that lowers your chance of showing up in Google’s SERPs is JavaScript. Google has a hard time reading JavaScript, so if you have on-page elements rendered in JavaScript, it’s possible they won’t show up for Google’s bots. HubSpot’s Matthew Barby explained this issue when describing a robust content category page his team built that wasn’t showing up in Google’s SERPs. While troubleshooting, they used a JavaScript Switcher extension to see what the page showed to Google, and saw nothing. After recognizing the error, a few tweaks to the page allowed them to rank for their focus keywords.
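
You can run a rough version of this check yourself without an extension: fetch the raw HTML from the command line and search it for a phrase that should be on the page (the URL and phrase below are placeholders). If nothing comes back, that content is probably being injected by JavaScript rather than served in the HTML.

    curl -s https://example.com/category-page/ | grep -i "focus keyword"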

Mobile Friendliness

The last step in reviewing the indexation of your website is checking your mobile-friendliness. Use Google’s Mobile-Friendly Test to see how Google evaluates your mobile pages. With Google’s mobile-first index, the content on the mobile version of your site affects how Google ranks your pages. If you’re not serving a mobile-friendly page, you could be hurting your rankings.
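
One common reason pages fail the test is a missing viewport meta tag, which tells browsers to scale the page to the device’s width. If your pages lack it, adding a tag like this to the <head> is the usual starting point:

    <meta name="viewport" content="width=device-width, initial-scale=1">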

On-Page Elements

After reviewing your analytics and the indexation of your webpages, it’s time to dig into the content on your pages. When Google is crawling your page, you want to ensure everything is in working order and optimized to match the search intent behind searchers’ queries. When reviewing your on-page content, Screaming Frog will help you uncover issues involving meta information. Run your domain through Screaming Frog’s spider tool and review the items below:

  • Duplicate title tags: A title tag is displayed in Google’s SERPs, and if multiple pages show the same title tag, you may have trouble ranking the correct page in Google.
  • Missing title tags: Again, you need a title tag that matches the search intent of users to have a chance at ranking for specific keywords.
  • Duplicate H1s: Similar to duplicate title tags, duplicate H1s can leave Google’s bots confused as to which page answers the search intent of the query your page is built around.
  • Multiple H1s: An H1 tells Google what your page is about. Remove any conflicting signals and stick with one H1 that explains the content of your page.
  • Missing H1s: A missing H1 is similar to a missing title tag. You’ll want to send Google as much information about your page as possible, and adding an H1 helps clarify the content of the page.
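
For reference, here’s where those two elements live in a page’s HTML. Each page should have one unique title tag and one H1; the wording below is just a placeholder for whatever query the page targets.

    <head>
      <title>SEO Site Audit Checklist | Example Brand</title>
    </head>
    <body>
      <h1>How to Run an SEO Site Audit</h1>
    </body>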

Technical Aspects

The last step in your SEO site audit is to review additional technical aspects of your website that might improve your site’s rankings. Identifying and fixing major technical issues can be the difference between ranking at the top of page two and reaching the top of page one in Google’s SERPs.

  • Page speed: Put your webpages through a tool like gtmetrix.com to identify site speed issues. Google values fast pages because they greatly enhance user experience, so the faster your result loads for a user, the more likely Google is to serve it above others. The issues most often behind slow pages include oversized images, failure to leverage browser caching, and excessive JavaScript.
  • Broken internal links: Through Screaming Frog’s spider tool, you can export a list of all the 4XX errors on your site. The report shows the destination of each broken internal link on your site. Using this report, you can fix those URLs, redirect the broken links, or remove the links entirely.
  • 302s: Sometimes a webmaster will leave a 302 redirect in place as a placeholder in case the original page comes back. Unless you know the original page will be used again, switch these temporary 302s to permanent 301 redirects.
  • Broken backlinks: An easy way to identify broken backlinks is to put your domain through Ahrefs’ Site Explorer tool. In the left-hand menu, select ‘Best by links’, then filter the results by 404 status codes and you’ll have a list of 404 pages on your site with backlinks pointing to them. Take that list, find new pages with similar content, and redirect the 404 URLs to the new pages to recover any link juice flowing to the broken pages.

[Screenshot: Ahrefs ‘Best by links’ report filtered to 404 status codes]
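
If your site runs on Apache, each of those recovery redirects can be a one-line directive in the .htaccess file; the paths below are placeholders for a broken URL and its closest replacement.

    Redirect 301 /old-broken-page/ https://example.com/new-similar-page/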

Final Thoughts: Site Audit First, Build Content And Optimize Second

Before you begin diving into new content and off-page optimizations, review this audit list to ensure the site’s health is in top shape. Once your site’s technical issues are in order, you’ll be ready to crank out new content.