Technical SEO Audit Tools

At Erudite we perform a lot of Technical Audits, each tailored to our clients’ objectives but all underpinned by our Technical Audit Checklist. During these audits we use a number of different tools, either for specific tasks or to review particular areas.

In today’s post we’re going to look at the tools that are used in each of the 5 main stages of our technical auditing process: Crawl, Integrity, De-duplicate, Optimise and Refine.

Crawl & Integrity

In our audit process we separate the Crawl stage of the audit from the Integrity review, but as they largely use the same tools we’ve combined them here. The crawl phase of our audit process identifies problems found in crawls (and makes sure the website can actually be crawled without inhibitions), whilst the integrity phase looks at finding solutions to the issues discovered during those crawls.

Screaming Frog SEO Spider

 


No technical SEO should be without the Screaming Frog SEO Spider in their arsenal. It helps you understand how the most important optimisation elements on a website are configured, identifies internal redirect chains, finds crawl errors and much more. What’s more, you can use this crawling tool on competitor websites too.

If you need to crawl any medium-sized websites, make sure you alter the default RAM settings, and if you’ve got pages in the region of 100k+ then you’ll need to think about deploying it to a server with a mountain of RAM available. The spider is scalable to every need we’ve encountered at Erudite so far.

When it comes to much larger websites we’ve found that you don’t actually need to crawl all of the pages to identify the primary issues that need fixing. Most of the time when we encounter a website that the Spider is having issues with, it’s down to a spider trap of some description causing infinite loops that need to be resolved before you can get a clean crawl.
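
A quick way to spot likely spider traps is to look for URLs with repeating path segments in a partial crawl export. Here is a minimal sketch, assuming a hypothetical crawl_urls.txt export with one URL per line:

    from collections import Counter
    from urllib.parse import urlparse

    def looks_like_trap(url, repeat_threshold=3):
        """Flag URLs whose path repeats the same segment several times,
        a common symptom of relative-link loops and calendar-style traps."""
        segments = [s for s in urlparse(url).path.split("/") if s]
        return any(n >= repeat_threshold for n in Counter(segments).values())

    with open("crawl_urls.txt") as f:  # hypothetical export from a partial crawl
        suspects = [line.strip() for line in f if looks_like_trap(line.strip())]

    for url in suspects[:20]:
        print(url)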

It is also incredibly customisable. You can choose to cap the number of items it crawls, which parameters should be included, the click depth and so on.

Whilst there is a free version that limits the crawl options and caps crawls at 500 items, the shackles really come off when you pay the nominal £99 per annum licence fee.

Webmaster Tools

Both Google & Bing Webmaster Tools offer useful insights into how search engines are interpreting your website. During our Crawl & Integrity phases we pay close attention to the details of crawl errors encountered on the website, which need to be resolved. To start with we will also review page load speeds, levels of indexing, any issues with XML sitemaps, data on the number of pages crawled and the amount of data downloaded by day, and the URL Parameter configuration options. Going through all of this data with a close eye helps you gain a greater understanding of what’s occurring with the website and identifies problems that require resolution.

Google and Bing’s crawlers are likely to encounter more URLs than any other website crawling tool, as most crawlers rely on internal links to discover pages when crawling. Google and Bing crawl the wider web, so they encounter external links pointing to pages that might be orphaned or no longer exist, and they often have a long history of crawling the website, giving them a much bigger historical picture of its pages.
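
That difference is easy to put to work: compare what your own crawler found with the URLs Google reports, and anything Google knows about that your crawl couldn’t reach is a candidate orphan page. A minimal sketch, assuming two hypothetical exports with one URL per line:

    def load_urls(path):
        with open(path) as f:
            return {line.strip().rstrip("/") for line in f if line.strip()}

    crawled = load_urls("crawl_export.txt")          # what our own crawler found
    reported = load_urls("search_console_urls.txt")  # what Google has seen

    # Known to Google but unreachable via internal links in our crawl
    orphans = sorted(reported - crawled)
    print(f"{len(orphans)} potential orphan pages")
    for url in orphans[:20]:
        print(url)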

One of the first things we look at in Google Webmaster Tools in particular is the “manual action” section. If there is anything to note there, it might be time to pause the technical audit for the time being and go and resolve that problem first.

Other Crawlers

There are times when Screaming Frog isn’t quite able to do what’s needed with a crawl. For example, if there is markup in the <head> of a document that isn’t quite correct, the spider can have issues extracting Meta Descriptions for a site. Whilst you want to get any issues like those fixed anyway, during an initial audit you still need to pull all of that data, and this is where we’ve found the Beam Us Up crawler particularly useful, as it extracts similar data from websites.

DeepCrawl is another crawling tool with some great features, geared up to Enterprise-level sites. What we particularly like about DeepCrawl is the comparison between crawls, allowing you to review what has or hasn’t been implemented. It also has a cloud interface, meaning it is accessible from anywhere.

Google Search

Whilst this perhaps isn’t a “tool” in quite the same way as the others that we are looking at, checking results in Google Search, particularly reviewing the pages that are indexed using site: or allinurl: commands, is an important part of the SEO auditing process.

Another tool that works well in conjunction with these commands is the Dataminer Google Chrome extension. It allows you to right click on a part of a page and “Get Similar” – which is a really quick way of extracting all of the links in those Google listings on a page in a few seconds. Whilst Google limits how many results it will give for a site: or allinurl: search, this can give you a good sample of what you are searching for.

We tend to use site: searches to check how well indexed a particular sub-folder is on a website, whilst the allinurl: operator can be useful for identifying when those pesky URL parameters have made it into the index.

Pagespeed Checks

There are a few tools that we use to review page load speeds for our clients. In our experience, the numbers quoted by most of these tools for total page load speed are considerably higher than the average page load times you might see in the Google Webmaster Tools console. The most useful insights from these tools are the breakdowns of how elements are called onto a page, along with the recommendations on what can be improved to shave a few milliseconds off load speeds. There are three in particular that we come back to most frequently.
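
Before reaching for those, a very rough first check can be scripted with the requests library. A minimal sketch with a placeholder URL; remember that dedicated page speed tools also measure rendering, so their numbers will be higher:

    import time
    import requests

    def rough_timing(url):
        """Very rough timing check: time to first byte and full download time."""
        start = time.perf_counter()
        response = requests.get(url, timeout=30)
        ttfb = response.elapsed.total_seconds()  # time until response headers arrive
        total = time.perf_counter() - start      # includes downloading the body
        return response.status_code, ttfb, total

    status, ttfb, total = rough_timing("https://www.example.com/")  # placeholder URL
    print(f"status={status} ttfb={ttfb:.2f}s total={total:.2f}s")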

UA Switcher

There are many uses for this handy User Agent switching Google Chrome plugin but there are 2 primary reasons to use it in our audits.

Firstly, if you suspect a website might be hacked, we’ve often found out what type of intrusion has occurred by switching the User Agent of the browser to Googlebot and viewing the page as their web crawlers “see” it. It can be useful to compare one page view to another, and on more than one occasion we’ve found that internal links have been switched to link to pharma spam pages injected onto a website. If this is the case, you need to close down that intrusion as quickly as possible and get to work cleaning up everything that’s made its way into search indices.
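
That comparison can also be scripted: fetch a page with a normal browser User Agent and again as Googlebot, then compare what comes back. A minimal sketch with a placeholder URL (bear in mind that some intrusions also check the visitor’s IP, so a clean result here isn’t a guarantee):

    import requests

    URL = "https://www.example.com/"  # placeholder URL
    USER_AGENTS = {
        "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    }

    responses = {
        name: requests.get(URL, headers={"User-Agent": ua}, timeout=30).text
        for name, ua in USER_AGENTS.items()
    }

    # A big difference in page size or link count between the two views
    # is worth investigating as possible cloaking or injected spam.
    for name, html in responses.items():
        print(f"{name}: {len(html)} bytes, {html.lower().count('<a ')} links")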

The second reason we use a UA switcher is to mimic a mobile device when looking at a website, something that has become a larger part of our auditing process over the last few years.

There are lots of other extensions available that offer similar user agent switching capabilities – this one is good because you can usually specify your own user agents too.

Ayima Redirect Path

One of the handiest Chrome plugins there is for a technical SEO, the Ayima Redirect Path plugin gives a nice breakdown of the header status path when you’ve clicked on a link and followed it through to a new page. This is useful when you’re actually looking at a website and clicking on links, rather than just reviewing the data a crawler has collected. It illustrates the hops in a redirect from that link, or returns the header status for that URL.
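
If you need the same information outside the browser, the redirect chain can be traced with requests too. A minimal sketch with a placeholder URL:

    import requests

    def redirect_chain(url):
        """Print each hop in a redirect chain with its status code."""
        response = requests.get(url, allow_redirects=True, timeout=30)
        for hop in list(response.history) + [response]:
            print(f"{hop.status_code}  {hop.url}")

    redirect_chain("http://example.com/")  # placeholder URL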


BuiltWith Technology Profiler

If you don’t know what content management system a website is running on, or the details of its server configuration, when you start a new audit process, the BuiltWith Technology Profiler is a useful quick check to see what server technology, frameworks and content delivery networks are being used on a website.


De-duplicate

The next phase of our audit process is to look for copies of websites, or pages of the website, either internally on the site you are auditing or elsewhere on the wider web.

Search engines prefer content to exist in only one location, be it a specific blog post or an entire website. Don’t confuse them by allowing substantial amounts of duplication elsewhere on the web, away from the main website you are looking to increase search traffic to.

Screaming Frog

We use the Screaming Frog Spider a little differently when looking for duplication issues. Look out for any URLs that have a duplicate value in the Hash column as they’ll have identical HTML.

Also, look for pages that have duplicate values in their page <title>s or Meta Descriptions. Even if they aren’t duplicate pages, you are duplicating the optimisation targeting of those pages. More often than not when we review sites for duplicate titles or descriptions, those pages are substantially similar and require a rethink in structure or canonicalisation to resolve those duplication issues.
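
The same checks can be reproduced outside the Spider if you already have a list of URLs to hand. A minimal sketch, assuming a hypothetical urls.txt file with one URL per line, that hashes each page’s HTML and groups pages sharing a <title>:

    import hashlib
    import re
    from collections import defaultdict
    from urllib.request import urlopen

    by_hash = defaultdict(list)   # identical HTML -> identical hash
    by_title = defaultdict(list)  # duplicate <title> -> duplicated targeting

    with open("urls.txt") as f:   # hypothetical list of URLs, one per line
        for url in (line.strip() for line in f if line.strip()):
            html = urlopen(url, timeout=30).read().decode("utf-8", errors="ignore")
            by_hash[hashlib.md5(html.encode()).hexdigest()].append(url)
            match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
            if match:
                by_title[match.group(1).strip()].append(url)

    for title, urls in by_title.items():
        if len(urls) > 1:
            print(f"Duplicate title '{title}':", *urls, sep="\n  ")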

Copyscape

Copyscape is a tool that you load lists of URLs into; it then finds pages elsewhere on the web that are using portions of the text found on those URLs.

This is useful for identifying where product descriptions might be in use across all the retailers selling the same product, or simply where your blog content is being scraped.

Once you’ve worked out who’s copying your content, or where you are using pre-supplied content from product manufacturers, you can work out how best to deal with it: either by requesting that scraped content is removed, or by deciding whether you need to rewrite the product descriptions on your own website to ensure you aren’t using the same text as all the competition.
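
For a quick in-house check of how much text two specific pages share, difflib from the Python standard library gives a rough similarity score. A minimal sketch with placeholder URLs; a proper check across the whole web still needs a tool like Copyscape:

    import re
    from difflib import SequenceMatcher
    from urllib.request import urlopen

    def visible_text(url):
        """Crude text extraction: drop scripts, styles and tags, collapse whitespace."""
        html = urlopen(url, timeout=30).read().decode("utf-8", errors="ignore")
        text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html, flags=re.S | re.I)
        text = re.sub(r"<[^>]+>", " ", text)
        return re.sub(r"\s+", " ", text).strip()

    a = visible_text("https://www.example.com/product")       # placeholder URLs
    b = visible_text("https://www.example.org/same-product")
    print(f"Shared text similarity: {SequenceMatcher(None, a, b).ratio():.0%}")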

Whois

There are various Whois tools available for looking up who registered a website and what other websites they might own. In this phase in particular, we are looking for a list of all the other websites owned by the registrant. We then run a quick check on them to make sure none of them holds a complete copy of the website that you’re trying to make more visible in search.

Google Analytics

Google Analytics is used throughout the auditing process to review what is already working on the website. But one report that is particularly useful for spotting duplication issues (if they haven’t already been filtered out in the account configuration) is the Hostnames report. We’ve lost count of the number of times we’ve looked at these reports and found at least one copy of a client’s website available at a distinct and different URL – a fully indexed duplicate of the site you are looking to boost in search.

Optimise

Next, we look at what changes need to be made to a website in order to get it ranking, looking at factors such as titles, descriptions, headings and content changes. If we haven’t performed a specific keyword analysis project for a client (which we nearly always do in parallel to a technical audit), we’ll also need to identify which phrases we’ll be looking to get that website visible for moving forwards.

Keyword Planner

Whilst the data that the Google Keyword Planner gives you isn’t completely reliable, and keyword research methods require a post of their own, you need to research which phrases you are going to optimise a website for before you know what changes to make to a particular page. The Google AdWords Keyword Planner gives useful data, with month-by-month seasonal trends in usage, for the keywords that your audience searches for.

Ubersuggest

If you are stuck for ideas on the keywords you need to research, Ubersuggest is one of the best tools for seeking inspiration from an initial root phrase. It gives you lists of terms related to what you initially put in, scraped from Google’s suggestions for particular regional search engines. You can then export the suggestions you like best and go back to the Keyword Planner to review their search volumes.
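
The underlying data is much the same as Google’s own autocomplete. Purely as an illustration, here is a minimal sketch that queries the unofficial suggest endpoint that tools like this build on – it’s undocumented, so it may change or be rate limited at any time:

    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    def suggestions(seed):
        """Fetch autocomplete suggestions for a seed phrase (unofficial endpoint)."""
        params = urlencode({"client": "firefox", "q": seed})
        url = f"https://suggestqueries.google.com/complete/search?{params}"
        with urlopen(url, timeout=30) as response:
            payload = json.loads(response.read().decode("utf-8", errors="replace"))
        return payload[1]  # the second element holds the list of suggestions

    for phrase in suggestions("technical seo audit"):
        print(phrase)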

SearchMetrics & SEMRush

Both SearchMetrics & SEMRush have similar feature sets and applications when it comes to our audit process (though both have wider capabilities that are useful at other stages of an SEO project). If we’ve not specifically carried out keyword analysis for a client when we are auditing, we look at their search ranking visibility data to review which existing keyword rankings need to be retained whilst we make changes to improve the website in new areas.

There is no point increasing traffic on a particular set of keyword terms if you are going to lose visibility and traffic elsewhere that counteracts it.

Both SearchMetrics & SEMRush are also useful for spotting algorithmic devaluations on a website.


XML Tree

“Out of the box” Google Chrome doesn’t display XML files nicely – that’s about the only thing Internet Explorer actually does better than Chrome. XML Tree is a simple Google Chrome plugin that displays XML in a cleaner, more structured fashion.
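
If you’d rather do the same thing outside the browser, the Python standard library can pretty-print a sitemap as well. A minimal sketch with a placeholder sitemap URL:

    from urllib.request import urlopen
    from xml.dom import minidom

    url = "https://www.example.com/sitemap.xml"  # placeholder URL
    document = minidom.parseString(urlopen(url, timeout=30).read())
    print(document.toprettyxml(indent="  "))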

Nofollow

Another simple Google Chrome plugin, Nofollow underlines any nofollowed links, internal or external, when you look at a webpage. It saves having to view the source when you want to check these, though Screaming Frog is better for reviewing follow/nofollow links in large numbers.
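
For a single page, the same check is easy to script with the standard library’s HTML parser. A minimal sketch with a placeholder URL that lists every link carrying rel="nofollow":

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class NofollowFinder(HTMLParser):
        """Collect the href of every link carrying rel="nofollow"."""
        def __init__(self):
            super().__init__()
            self.nofollow_links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                attrs = dict(attrs)
                if "nofollow" in (attrs.get("rel") or "").lower():
                    self.nofollow_links.append(attrs.get("href"))

    url = "https://www.example.com/"  # placeholder URL
    finder = NofollowFinder()
    finder.feed(urlopen(url, timeout=30).read().decode("utf-8", errors="ignore"))
    print(*finder.nofollow_links, sep="\n")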

Refine

In the Refine section of our audits we look for the final enhancements that can really set your website apart from the competition in search results.

Rich Snippets

Various rich snippets are available to help improve your search listing appearance – though Google has been reducing some of the options available for these over the last year or so. You can see more about the different types of Rich Snippets that you can utilise to supercharge your search results in our previous blog post, but in terms of tools that we use as part of this process we look at:

Schema Creator – this helps you generate the structured data markup needed to produce the rich snippets (a minimal markup sketch follows this list).
Google Structured Data Testing Tool – this allows you to test how any search results will appear once you’ve implemented structured data on your page.
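
For reference, the markup these tools help you produce is typically JSON-LD. Here is a minimal sketch that builds and prints a product snippet – all the values are placeholders:

    import json

    # Placeholder values for illustration only.
    product_snippet = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Product",
        "description": "A short placeholder description.",
        "offers": {
            "@type": "Offer",
            "priceCurrency": "GBP",
            "price": "99.00",
            "availability": "https://schema.org/InStock",
        },
    }

    # Paste the output inside <script type="application/ld+json"> ... </script>
    print(json.dumps(product_snippet, indent=2))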

HREFLANG

Whilst HREFLANG isn’t needed as part of every technical audit that we perform, we are increasingly working with brands that have an international presence and require appropriate international targeting. HREFLANG is implemented either in XML sitemaps, where we would use our own HREFLANG Sitemap Generator Tool, or in the <head> source code, where Aleyda’s HREFLANG Tag Generator Tool really comes in handy.
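
As a reminder of what those tools output for the <head> approach, here is a minimal sketch that prints the reciprocal tags for a set of placeholder locale URLs – every listed page should carry the full set of alternates, including a self-reference:

    # Placeholder locale-to-URL mapping for illustration only.
    alternates = {
        "en-gb": "https://www.example.com/uk/",
        "en-us": "https://www.example.com/us/",
        "de-de": "https://www.example.com/de/",
        "x-default": "https://www.example.com/",
    }

    for lang, url in alternates.items():
        print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')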

 

Summary

Those are most of the tools that we use as part of our Technical SEO Audits. There are plenty of other tools that we use for other phases of the work, and we’d love to hear in the comments or on social media about any new tools we might not have looked at closely enough. Are there any tools that you wish existed but no one seems to have created yet?

We do like finding new tools to play with, and hope you’ve found some here that are new to you or that can be used in slightly different ways than you already use them. Just remember, a tool is only as good as the person using it. Many of the tool providers have online tutorials and even live web demos to help you upskill.

Like what you see? Let’s talk. Get In Touch