Keeping your website in optimal condition gives your business an edge over the competition. Regular website audits help you identify and resolve issues that may hurt performance, user experience, or search engine visibility. With consistent, comprehensive audits, you can keep your website functioning at its best, reaching your audience, and supporting your business objectives. Here, I’ll give the SparkNotes version of how to perform a successful audit and maximize your website’s potential.
Whether you’re a novice or an expert in the field, when performing a website audit you need to start with the basics to identify any glaring issues that require immediate attention. In this article, I’m not going to dive into keyword research. While keyword research is an important step in evaluating your website’s SEO, it deserves a longer explanation to do properly. If you want to dive into keyword research, one of our SEO experts, Joe Robledo, covers the steps of a low-hanging-fruit keyword audit here.
For the steps of the SEO website audit I’ll be covering, you’ll need a few tools and platforms. Here are my recommendations for this audit:
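- Google Analytics (for goals, filters, and traffic data)
- GAchecker.com (for verifying tracking-code coverage)
- Screaming Frog (for crawling your meta information)
- Google’s Mobile-Friendly Test (for checking mobile usability)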
In the sections below, I’ll break down what to look for, the tools each step requires, and the potential issues you might uncover. Welcome to your SEO audit crash course.
To fully understand the success of your website efforts, you must have accurate data; many issues stem from incorrect or missing data. I’ll be speaking specifically about Google Analytics in this section. The first step is to review the goals set up within your view. Ask yourself: Are these goals valuable? Do they represent my team’s KPIs? Does this information help us track revenue and ROI?
After you’ve ensured your goals are tracking exactly the data you’re after, open the Filters tab within your reporting view and review your filters. If it isn’t already there, add your company’s IP address as an exclude filter. You don’t want employee visits to your own pages skewing your data.
Once you’ve properly filtered your IP, go back to your home tab and view your traffic channels, located under the ‘All Traffic’ dropdown in your ‘Acquisition’ tab. What is the traffic distribution across your channels? If your direct, ‘(Other)’, or ‘(not set)’ channels each account for more than 5% of traffic, there may be a tracking error somewhere on your website. One possible culprit: your website has a login page and you haven’t excluded current users with a filter.
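As a rough illustration of that 5% check, here’s a minimal Python sketch. The channel names match Google Analytics’ defaults, but the session counts are made up; in practice you’d pull these numbers straight from your Acquisition report.

```python
# Hypothetical session counts from the Acquisition > All Traffic report.
channel_sessions = {
    "Organic Search": 6200,
    "Referral": 1100,
    "Direct": 900,
    "(Other)": 450,
    "(not set)": 150,
}

# Channels that shouldn't carry much traffic when tracking is healthy.
SUSPECT_CHANNELS = {"Direct", "(Other)", "(not set)"}

total = sum(channel_sessions.values())
for channel, sessions in channel_sessions.items():
    share = sessions / total * 100
    flag = "  <-- possible tracking gap" if channel in SUSPECT_CHANNELS and share > 5 else ""
    print(f"{channel}: {share:.1f}%{flag}")
```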
Next, check your organic traffic over the past few years. Are there any major dips? If you spot a substantial drop, your website may have been penalized at some point. Gather as much information as you can about past penalties and determine how they were resolved.
The last step in verifying that all traffic is being recorded properly is to use GAchecker.com. This tool shows you which pages across your website are missing your UA tracking code. Ensure every page you want tracked appears in the list of pages with the tracking code.
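If you’d rather spot-check this yourself, a small script can fetch each page and look for your tracking ID in the raw HTML. The page list and tracking ID below are placeholders; note that if your tag is injected through Google Tag Manager, you’d search for the GTM container ID instead.

```python
import requests

# Hypothetical pages to audit; in practice, pull this list from your sitemap.
PAGES = [
    "https://yourdomain.com/",
    "https://yourdomain.com/about/",
    "https://yourdomain.com/blog/",
]

TRACKING_ID = "UA-XXXXXXX-1"  # replace with your actual property ID

for url in PAGES:
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    verdict = "tracking code found" if TRACKING_ID in html else "MISSING tracking code"
    print(f"{url}: {verdict}")
```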
Indexation
Once you’ve verified that Google Analytics is set up correctly, you’ll need to verify that your website is being indexed in Google. The first items to check are your robots.txt file and sitemap. The robots.txt file can usually be found at yourdomain.com/robots.txt and should look something like this:
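```
# An illustrative example; your directives and paths will differ.
User-agent: *
Disallow: /login/
Disallow: /internal-assets/

Sitemap: https://yourdomain.com/sitemap.xml
```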
Your robots.txt file tells Google’s bots what they should and shouldn’t crawl on your website. Any pages you don’t want showing up in the SERPs should be added as a Disallow rule: login pages, internal assets, or any other page that doesn’t need to appear in Google’s search results. Your robots.txt file should also list your sitemap so Google’s bots can easily locate and crawl it.
What is a sitemap? A sitemap is your website’s repository of pages, neatly compiled for Google’s bots to crawl. In short, it’s the file that helps Google’s bots navigate your website and find the pages you want indexed. Even without this file, Google can still crawl and index your website’s pages, but a sitemap makes your pages easier to crawl and improves your chances of sending Google to the right ones.
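For reference, a bare-bones sitemap.xml follows the structure below; the URLs and date are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2019-06-01</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/blog/</loc>
  </url>
</urlset>
```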
Another indexation problem I’ve seen across a majority of sites is duplicate content. Duplicate content can arise in multiple ways and may cause Google to drop your keyword rankings because it isn’t sure which page should rank for which keywords. The first duplication issue to check is your URLs: are there variations of the same page that return a 200 status code instead of redirecting to a single page? Examples of potential duplicate URLs are below.
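- http://yourdomain.com
- http://www.yourdomain.com
- https://www.yourdomain.com
- https://yourdomain.com
- https://yourdomain.com/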
Each of these URLs serves the same page, yet they show up as different addresses. This could be happening across your website, and there are two ways to fix the issue. The first solution is to set up site-wide redirects that send every variation of a URL to one specific destination. If your website is ‘HTTPS’ without ‘www’ and contains a trailing slash, then set up all variations to redirect to https://yourdomain.com/.
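If you want to confirm the redirects behave as expected, a quick script can show how each variant responds. The URLs are the same placeholders as above.

```python
import requests

VARIANTS = [
    "http://yourdomain.com",
    "http://www.yourdomain.com",
    "https://www.yourdomain.com",
    "https://yourdomain.com",
]

for url in VARIANTS:
    # allow_redirects=False exposes the raw status code of each variant.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    print(f"{url} -> {resp.status_code} {location}")
    # Every variant except the canonical https://yourdomain.com/ should
    # return a 301 pointing at that one destination.
```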
If you’re unable to set up 301 redirects because of the platform you’re using, or if you need to keep those pages live, then add a rel=canonical tag instead. The rel=canonical tag signals to Google that a given page is a duplicate and that the canonical URL is the primary version to reference when indexing.
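The tag itself sits in the duplicate page’s <head> and points at the primary URL. With our placeholder domain, it looks like this:

```html
<link rel="canonical" href="https://yourdomain.com/" />
```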
Another problem that lowers your chances of showing up in Google’s SERPs is JavaScript. Google has a hard time reading JavaScript, so if you have on-page elements rendered in JavaScript, they may not show up for Google’s bots. HubSpot’s Matthew Barby described this issue with a robust content category page his team built that wasn’t showing up in Google’s SERPs. While troubleshooting, they used a JavaScript Switcher extension to see what the page showed to Google, and saw nothing. Once they recognized the error, a few tweaks to the page allowed it to rank for their focus keywords.
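A crude first check you can run yourself: fetch the raw HTML, which involves no JavaScript execution, and see whether your key on-page content actually appears in it. The URL and phrase below are hypothetical.

```python
import requests

URL = "https://yourdomain.com/resources/"        # hypothetical page to test
CRITICAL_PHRASE = "Browse all marketing guides"  # text that should be visible

# requests does not execute JavaScript, so this approximates what a
# non-rendering crawler sees on its first pass over the page.
html = requests.get(URL, timeout=10).text

if CRITICAL_PHRASE in html:
    print("Content is present in the raw HTML.")
else:
    print("Content missing from raw HTML; it may be rendered by JavaScript.")
```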
One of the last steps in reviewing your website’s indexation is checking its mobile-friendliness. Use Google’s Mobile-Friendly Test to see how Google evaluates your mobile pages. With Google’s mobile-first index, the content on the mobile version of your website affects how Google ranks your pages. If you’re not serving a mobile-friendly page, you could be hurting your rankings.
After reviewing your analytics and the indexation of your website pages, it’s time to dig into your on-page content. When Google crawls your pages, you want everything to be in working order and optimized to match the search intent behind searchers’ queries. When reviewing your on-page content, Screaming Frog will help you uncover issues with your meta information. Run your domain through its spider tool and review items like the following:
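- Page titles: flag any that are missing or duplicated, or that fall well outside the roughly 50–60 character range.
- Meta descriptions: look for missing or duplicate descriptions, and any long enough to get truncated in the SERPs.
- Header tags: every indexable page should have one, and only one, H1.
- Response codes: hunt down 4XX errors and long redirect chains.
- Canonical tags: confirm each page’s canonical points where you expect.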
The last step in your SEO website audit is to review the additional technical aspects of your website that might improve its rankings. Identifying and fixing major technical issues can be the difference between ranking at the top of page two and reaching the top of page one in Google’s SERPs.
There are a few common mistakes made when conducting website audits. Keep an eye out for these pitfalls as you conduct your own; they’re easily avoidable when you know what to look for.
Before you dive into new content and off-page optimizations, run through this audit list to ensure your site’s health is in top shape. Once your site’s technical issues are in order, you’ll be ready to crank out new content.