How to Do a Technical SEO Audit for Your Business Website

Digital marketing is incredibly fast-paced, and your business has to keep up to maintain relevance. The latest website designs and marketing strategies seem to change on a monthly basis, which might make taking shortcuts seem like a good idea.

Unfortunately, shortcuts can lead to a build-up of hidden errors that weigh down the performance of your site and your marketing efforts. Poor organic rankings and a bad user experience can cause your target audience to avoid your website like the plague.

Technical SEO debt is one of the silent killers of digital marketing success, but it doesn’t have to be.

What is technical SEO?
Technical SEO is the process of organizing your website for optimal crawling and indexing by search engine robots. Effective technical SEO maximizes search engine access to your content, thereby making it easier for crawlers to interpret and index your web pages.

What is a technical audit in the context of SEO?
A technical SEO audit is a careful assessment of every component of a website that affects search engine crawling and indexing. A technical site audit typically involves the examination of website details that are not explicitly visible to users but that impact a site’s rankings and performance.

Why do I need a website technical audit?
Think of an SEO audit for your website like an emissions check for your car. Your car could look pristine on the outside: freshly waxed, sporting performance wheels and leather seats. However, if you haven’t had an oil change in 10,000 miles and your frame is rusting out, your car wouldn’t be considered a top performer.

The same goes for your website. You can have a seemingly state-of-the-art website with a beautiful page layout that is very aesthetically pleasing, but if your site loads slowly or your code is burdened with mistakes and HTTP status code errors, Google won’t consider your pages to be valuable.

Conducting a technical audit and fine-tuning your website can have a dramatic and positive impact on your search rankings.

How to perform a technical SEO audit
The following technical audit checklist will help you maintain or improve the performance of your website. A small upfront investment in technical SEO can ultimately have a lasting positive impact on your bottom line. Here’s how to get started.

Crawl your business website
The first item on your technical SEO checklist is to perform a site crawl. Google’s web crawlers discover and crawl your pages before ranking them, so it’s important that you identify and resolve any issues.

You’ll need software assistance to crawl your site. We highly recommend using a technical SEO audit tool like Screaming Frog. This software is budget-friendly; a yearly license costs only $170.

Use Screaming Frog to:

  • Identify and resolve 404 errors and 301 redirects
  • Identify and resolve redirect chains
  • Identify and resolve missing, duplicate or lengthy page titles and H1 tags
  • Find pages with missing meta descriptions and create them
  • Find, resize and compress oversized images that could be slowing your site

A lot of today’s popular SEO software tools offer crawling functionality as well. For instance, Moz and SEMrush can crawl your site and identify duplicate content and thin content issues that Screaming Frog won’t tell you about. However, subscriptions to each of these tools cost $100 a month, so we don’t recommend using them solely for their crawling functionality.
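
If you’d like a quick do-it-yourself spot check before investing in a crawler, a short script can flag some of the same issues on a handful of pages. The sketch below uses the third-party requests and BeautifulSoup libraries (pip install requests beautifulsoup4), and the page URLs are placeholders; it’s not a substitute for a full crawl.

# Spot-check a few URLs for status codes, redirect chains, titles,
# H1 tags and meta descriptions. The URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services/",
]

for url in PAGES:
    response = requests.get(url, allow_redirects=True, timeout=10)

    # Any hops requests followed on the way to the final URL
    if response.history:
        chain = " -> ".join(hop.url for hop in response.history)
        print(f"{url}: redirect chain {chain} -> {response.url}")

    if response.status_code != 200:
        print(f"{url}: returned HTTP {response.status_code}")
        continue

    soup = BeautifulSoup(response.text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    h1_tags = soup.find_all("h1")
    meta_desc = soup.find("meta", attrs={"name": "description"})

    if not title:
        print(f"{url}: missing page title")
    elif len(title) > 60:
        print(f"{url}: title is {len(title)} characters; consider shortening")
    if len(h1_tags) != 1:
        print(f"{url}: found {len(h1_tags)} H1 tags (expected exactly one)")
    if meta_desc is None or not meta_desc.get("content", "").strip():
        print(f"{url}: missing meta description")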

Check Robots.txt
If for some reason you don’t want a page to be crawled, you can use your site’s robots.txt file to keep search engine crawlers from finding it. To see whether a site has a robots.txt file set up, simply append /robots.txt to the root domain. For example:

https://www.example.com/robots.txt

This should be a high-priority item on your SEO site audit checklist, because whether or not your entire site shows up in search results could depend on this. In fact, you can’t use Screaming Frog (as mentioned in the previous step) if your site is disallowed. Here’s a more thorough guide about how to block Google from crawling your site.

If you’re having ranking issues with your site, you might find that either you or your web developer has accidentally disallowed your entire site from being crawled. Use an SFTP client like FileZilla to upload a revised robots.txt file.
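
If you’re comfortable with a little scripting, you can also verify this programmatically. Here’s a minimal sketch using Python’s built-in urllib.robotparser to confirm that Googlebot is allowed to crawl a few key paths; the domain and paths are placeholders.

# Check whether robots.txt is accidentally blocking Googlebot.
# Uses only the standard library; the domain and paths are placeholders.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()

for path in ["/", "/blog/", "/services/"]:
    allowed = parser.can_fetch("Googlebot", f"{SITE}{path}")
    print(f"Googlebot -> {SITE}{path}: {'allowed' if allowed else 'DISALLOWED'}")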

Do a Google site search
Another item on your checklist should be using the site search command to find out if you’re having any indexing issues. The site search operator is one of the many Google search commands that you can use to refine your search results.

Type the following into a Google search bar to see only pages from your domain in the results:

site:example.com

If you have a site with hundreds of pages, but you’re only seeing a small subset in the search results, you’ve just identified that you’re having crawling or indexing issues.
If you’re trying to check your Google rankings for a particular page but can’t seem to find it, you can combine the URL with the info search command to see if it’s being indexed.

info:example.com/particular-page/

If your page shows up, it’s being crawled and indexed and doesn’t need immediate technical SEO attention. The issue more likely lies with link- and content-related SEO factors, which aren’t inherently technical problems.
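
You can also rule out the most common technical causes of an indexing problem directly from the page itself: a non-200 status code, a noindex robots meta tag or a noindex X-Robots-Tag header. A minimal sketch (again using the third-party requests and BeautifulSoup libraries, with a placeholder URL):

# Check whether a page is technically indexable: HTTP status,
# X-Robots-Tag header and robots meta tag. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/particular-page/"
response = requests.get(url, timeout=10)

print(f"HTTP status: {response.status_code}")

# noindex delivered as an HTTP response header
robots_header = response.headers.get("X-Robots-Tag", "")
if "noindex" in robots_header.lower():
    print(f"X-Robots-Tag header blocks indexing: {robots_header}")

# noindex delivered as a robots meta tag in the HTML
soup = BeautifulSoup(response.text, "html.parser")
robots_meta = soup.find("meta", attrs={"name": "robots"})
if robots_meta and "noindex" in robots_meta.get("content", "").lower():
    print(f"Robots meta tag blocks indexing: {robots_meta['content']}")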

HTTPS and mixed content
Mixed content is when your page uses a combination of HTTP and HTTPS resources. A good analogy would be installing new deadbolt locks on all your doors to secure your home but leaving the windows wide open. While you’ve taken care of your most obvious problems, you’re far from being secure.

If you load a single HTTP resource over an HTTPS connection, your web page is not secure. When this happens, the Google Chrome web browser will display a “Not Secure” warning to the left of the URL in the address bar.

You can view the page source by right-clicking (on a PC) or Control-clicking (on a Mac) anywhere on the page and selecting “View Page Source” from the context menu. This will show you the HTML version of your page. Search it for any instance of “http://” to find the unsecured resource(s) causing your mixed content issues.

You may be linking to an external script that used to be served over HTTP but has since switched to HTTPS. More likely, though, you’ve hard-coded an internal link to one of your own pages or images that needs to be updated to HTTPS.
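
If you’d rather not hunt through the source by hand, a short script can list the insecure references for you. This sketch fetches a page and flags any src or href attribute that still points to an http:// resource (placeholder URL; requires requests and BeautifulSoup):

# Flag mixed content: resources referenced over plain HTTP on an HTTPS page.
# The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

insecure = []
for tag in soup.find_all(["img", "script", "link", "iframe", "source"]):
    resource = tag.get("src") or tag.get("href")
    if resource and resource.startswith("http://"):
        insecure.append((tag.name, resource))

if insecure:
    print(f"Found {len(insecure)} insecure reference(s) on {url}:")
    for name, resource in insecure:
        print(f"  <{name}> {resource}")
else:
    print(f"No mixed content found on {url}")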

Mobile-friendly check
To check whether your webpage or site is mobile-friendly, navigate to the Google Mobile-Friendly Test and enter your homepage URL. The tool will tell you whether your page passes Google’s mobile-friendliness checks.

If you have verified your site in Google Search Console (which is highly recommended), you can also view a site-wide mobile usability report.

These reports will prescribe specific measures that you can take to optimize the mobile friendliness of all your web pages. If you have some web development skills, you can easily implement these changes.

The most common problem you’ll see is content overflow. This happens when an HTML element is wider than the page container. For a mobile device, this could mean that you have an image that’s larger than your screen or a sentence that disappears off the side of the page.
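
At the time of writing, Google also offers the Mobile-Friendly Test as an API, which is convenient if you want to check more than a handful of pages. The sketch below assumes you’ve created an API key in a Google Cloud project with the Search Console API enabled; the key and page URL are placeholders, and the response fields shown may change as Google updates the API.

# Query Google's Mobile-Friendly Test API for a single URL.
# API_KEY and the page URL are placeholders; the API must be enabled
# in your own Google Cloud project.
import requests

API_KEY = "YOUR_API_KEY"
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlTestingTools/mobileFriendlyTest:run"

response = requests.post(
    ENDPOINT,
    params={"key": API_KEY},
    json={"url": "https://www.example.com/"},
    timeout=60,
)
result = response.json()

print("Verdict:", result.get("mobileFriendliness", "UNKNOWN"))
for issue in result.get("mobileFriendlyIssues", []):
    print("Issue:", issue.get("rule"))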

Make sure XML sitemap is submitted and updated
Submitting your sitemap in Bing Webmaster Tools and Google Search Console will improve your site’s crawlability. This isn’t a difficult step, but it’s an easy one to forget.

A sitemap is like a table of contents for your website, and it makes it easy for search engines to crawl your pages.

If you’re using WordPress, we recommend using the Yoast SEO plugin to automatically generate an XML sitemap for your site. This doesn’t require any technical skills, and it will automatically update your sitemap when you add or remove pages from your site.

You can find your Yoast sitemap by appending /sitemap_index.xml to your domain. To submit your sitemap in Search Console (assuming you have your site verified), use the following URL, replacing the encoded domain at the end of the query string with your own:

https://search.google.com/search-console/sitemaps?resource_id=https%3A%2F%2Fwww.example.com%2F&hl=en

This is where you’ll enter your sitemap_index.xml URL so that Google has a table of contents for your website.
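
It’s also worth confirming that your sitemap resolves and that the URLs it lists actually return 200s. The sketch below fetches a Yoast-style sitemap index with the standard library’s XML parser, recurses into each child sitemap and spot-checks every listed URL; the domain is a placeholder.

# Fetch an XML sitemap (or sitemap index) and flag listed URLs that
# don't return HTTP 200. The domain is a placeholder.
import xml.etree.ElementTree as ET
import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch_urls(sitemap_url):
    """Return all <loc> values, recursing into a sitemap index if needed."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    if root.tag.endswith("sitemapindex"):
        urls = []
        for loc in root.findall("sm:sitemap/sm:loc", NS):
            urls.extend(fetch_urls(loc.text.strip()))
        return urls
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]

for page_url in fetch_urls("https://www.example.com/sitemap_index.xml"):
    # Some servers reject HEAD requests; switch to requests.get if needed
    status = requests.head(page_url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{page_url}: HTTP {status}")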

Internal and external links
Internal and external links are great SEO signals for your site and are critical for web navigation. In fact, links from third-party sites were among Google’s original ranking signals.

The Google Search Console Links Report displays both internal links and external links for your site. You can use this data to figure out how you currently prioritize your web pages internally (more internal links to a page indicates that it’s more important) and what other sites think is valuable (more external links to a page indicates that it’s valuable).

Links to resources that redirect or return a 404 error become more common as your site ages. This happens because other sites upgrade to HTTPS, adjust page URLs or even remove content altogether.

You can use the Internal and External reports within Screaming Frog to make sure none of the links on your website redirect or point to pages that no longer exist.
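
If you want to double-check a single important page by hand, the same idea fits in a short script: collect the links on the page and report any that redirect or return an error. The page URL is a placeholder; requires requests and BeautifulSoup.

# Check the links on one page for redirects and broken targets.
# The page URL is a placeholder.
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

page_url = "https://www.example.com/blog/some-post/"
soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")

links = {urljoin(page_url, a["href"]) for a in soup.find_all("a", href=True)}

for link in sorted(links):
    if not link.startswith("http"):
        continue  # skip mailto:, tel: and fragment-only links
    try:
        response = requests.head(link, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"{link}: request failed ({exc})")
        continue
    if 300 <= response.status_code < 400:
        print(f"{link}: redirects to {response.headers.get('Location')}")
    elif response.status_code >= 400:
        print(f"{link}: HTTP {response.status_code}")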

We also recommend using a report from a tool like SEMrush to evaluate the safety of your backlink profile. When you run an audit in SEMrush, it includes a backlink audit to identify any backlinks that stem from notoriously spammy sites. These links could reflect poorly on your site’s authority, so you may want to disavow them. Google provides instructions on how to do this.

Evaluate site speed
As of Google’s July 2018 Speed Update, site speed is a ranking signal for mobile searches as well as desktop. When your website is fast and easy to navigate, Google considers it more useful to users.

You can start evaluating your site speed using Google’s PageSpeed Insights tool, where you can also read about the July 2018 speed update. This will give you a Page Speed score and an Optimization score along with instructions for improving each.

What’s the difference between your speed and optimization scores?

Think of two high school students: one who earns good grades without much effort, and another who has to invest significant time and energy to pass. The naturally bright student aces tests and demonstrates subject matter expertise (a high Page Speed score), but if homework counts for 60 percent of the overall grade and goes undone (a low Optimization score), that student still won’t perform well in the class. The hardworking student, on the other hand, may not test well (a low Page Speed score) but completes all of the homework assignments (a high Optimization score) and earns acceptable grades.

Ideally, your web pages will load quickly and be optimized. This will help your page rise through the rankings when Google takes your site speed into account.
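
If you’d like to track these scores for several pages over time, the PageSpeed Insights API lets you script the check. Here’s a minimal sketch against version 5 of the API, which reports a Lighthouse-based performance score; the page URL is a placeholder, and an API key is only needed for heavier usage.

# Fetch a PageSpeed Insights performance score for one URL (API v5).
# The page URL is a placeholder.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(API, params=params, timeout=120).json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")

# List audits that scored poorly; these map to PSI's improvement suggestions
for audit in data["lighthouseResult"]["audits"].values():
    if audit.get("score") is not None and audit["score"] < 0.5:
        print("Needs work:", audit["title"])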

Analyze user behavior metrics
The last item on your SEO site audit checklist is to look into your user behavior metrics. In recent years, Google has placed more and more weight on user engagement with your web pages.

If a user finds your site, visits multiple pages and spends several minutes browsing your content, your web pages are likely to seem valuable and relevant to Google. If a user visits a page and leaves your site 10 seconds later, your website may appear to be less deserving of ranking on page one.

To dive into these metrics, make sure you have Google Analytics data available. Look for pages with bounce rates at or near 100 percent and low average time on page. Analyze those pages to determine why they might be underperforming.
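
One low-tech way to scan for these pages in bulk is to export the relevant page report from Google Analytics as a CSV and filter it with a script. The file name and column headings below (Page, Bounce Rate, Avg. Time on Page) are assumptions based on a typical “All Pages” export; adjust them to match your own file.

# Flag underperforming pages in a Google Analytics CSV export.
# The file name and column headings are assumptions; match them to your export.
import csv

MIN_SECONDS = 15  # flag pages visitors abandon almost immediately

with open("all_pages_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        bounce_rate = float(row["Bounce Rate"].rstrip("%"))
        hours, minutes, seconds = (int(p) for p in row["Avg. Time on Page"].split(":"))
        time_on_page = hours * 3600 + minutes * 60 + seconds

        if bounce_rate >= 99 or time_on_page < MIN_SECONDS:
            print(f"{row['Page']}: bounce {bounce_rate:.0f}%, {time_on_page}s on page")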

If your underperforming pages don’t offer valuable information about your products, services or industry, you may want to consider altering their content, merging them into another page or removing them altogether. Pruning your underperforming pages can improve the health of your entire site.

Walking through each of these technical SEO steps can help you diagnose site health issues. Even a site with highly valuable content may never reach page one of the search results if its technical SEO is poor.

If you have more questions about your technical site audit, please leave a comment below!

Author Bio

Tony Mastri is an editor at Local SEO Bee and an experienced digital marketing strategist. He is passionate about business growth and strives to share his knowledge with the marketing community.
