The following chapter is about the SEO Audit Checklist, which is part of the full Website Audit Checklist definitive guide.
An SEO audit can be one of the most valuable things you can do on your website.
It uncovers gems that have the potential to drive you more traffic.
During a recent audit I found a web page that was no longer available…
…and guess what?
It had over 127 links from other high-authority websites.
That’s a super valuable page that is no longer available, so a simple redirect to another live page gives that page a big boost.
It’s like hitting the turbo button on a car.
And don’t just do the audits once!
Re-run audits every quarter. I just found 2 important broken redirects today that I would have never noticed otherwise.
Brian Dean, Backlinko
Our friends over at SEMRush did research on the most common SEO issues found during an audit and here’s a section on issues found during an on-page audit:
There are many potential issues and this is just one area!
In the following checklist we’ve broken down SEO into 4 categories:
- General SEO
- International SEO
- Ecommerce specific SEO
- Local SEO
So here’s our ultimate SEO checklist:
This is where the majority of our tests are!
Are all relevant pages indexed?
If your web pages are not indexed you won’t get organic search traffic.
Make sure your pages can be crawled and indexed. Without this, nothing else is going to have an impact.
Patrick Stox, Technical SEO, Ahrefs
Why wouldn’t they be indexed?
- You’ve blocked search engines from indexing your content
- Google doesn’t find all your content (e.g. there are some orphan pages)
- There are errors in your sitemap
- You have not submitted your sitemap in Google Search Console.
There are many reasons.
So, how can you check if your website and all relevant pages on it are indexed?
It’s really simple:
Use the search operator site:yourdomain.com in the Google search bar to run a site search. This will show you the pages Google has indexed.
If an agency is building a new website for you and sends you a link to a test version, go to Google and type site: followed by the test site’s domain. If you see pages listed, the test site has been indexed, which you do not want. It’s a common mistake agencies make.
If you have a large website (more than 500 pages) you can use the Index Coverage report in Search Console to check which pages have been indexed and if there were any issues indexing your site.
Does it have relevant Sitemap(s)?
When Google crawls through a website to index content it finds a lot of pages through internal links.
The menu links to a lot of pages.
The web pages have internal links to pages.
This helps Google find content.
But…a sitemap is also important because not all pages are linked on a website.
The ones that aren’t are called orphan pages. If there are orphan pages on a website you need to make sure the important ones are linked to from other pages on the site (and are also in the sitemap).
The sitemap will help Google pick up all the pages.
There are several ways of finding out if you have a sitemap, for example, you can use the sitemap test tool.
Here’s what you want to check:
- Is there a sitemap (there may be more than one)? It should be listed in the robots.txt file.
- Are there any errors on the sitemap? You can check this with the Sitemaps report in Google Search Console.
- Does the sitemap include all the necessary pages?
- How large is the sitemap? A large sitemap may need to be split and here’s an instruction from Google on how to do it.
The sitemap should only contain HTTP status 200 URLs, not redirects or deleted pages, so that Google has the freshest information about the available content.
Goran Tepsic, CTO, FourDots
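If you want to script a quick check yourself, here’s a minimal sketch that pulls the URLs out of a sitemap file so you can spot-check them. The XML and domain below are made-up examples; in practice you would load your own /sitemap.xml.

```python
# Minimal sketch: extract every URL listed in a sitemap document.
import xml.etree.ElementTree as ET

# Placeholder sitemap; replace with the contents of your real sitemap.xml
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/about</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Return every <loc> URL listed in a sitemap document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", ns)]

print(sitemap_urls(SITEMAP_XML))  # ['https://www.example.com/', 'https://www.example.com/about']
```

From there you can feed each URL into your crawler or rank tracker to confirm the pages are live and return a 200 status.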
Is all content secure (https) and are there any instances of mixed content?
Have you noticed how Chrome displays a security error when you are browsing a website without a secure certificate (i.e. not https)?
Would you trust a website that is marked ‘Not Secure’?
I doubt it.
You need to make sure your website is secured for the sake of users, but you’ll also get other benefits, too….
… like better visibility in the search engines.
There can also be instances of mixed content, where some parts of your web page are considered secure and others are not.
This happens when certain resources on a page, like images or videos, are loaded over an insecure HTTP connection.
Here’s a useful mixed content test.
When you run this tool it will let you know if you have any mixed content issues:
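If you’d rather script a rough check yourself, here’s a hedged sketch that scans a page’s HTML for resources loaded over plain http://. It only covers the common resource tags, so treat it as a starting point rather than a full scanner; the page fragment is a made-up example.

```python
# Rough mixed-content check: flag resources loaded over plain http://
from html.parser import HTMLParser

class MixedContentFinder(HTMLParser):
    """Collects resource URLs loaded over insecure http:// from an HTML page."""
    RESOURCE_TAGS = {"img", "script", "iframe", "video", "audio", "source", "link"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag in self.RESOURCE_TAGS:
            for name, value in attrs:
                if name in ("src", "href") and value and value.startswith("http://"):
                    self.insecure.append(value)

# Made-up page fragment: one insecure image, one secure script
finder = MixedContentFinder()
finder.feed('<img src="http://example.com/a.png">'
            '<script src="https://example.com/b.js"></script>')
print(finder.insecure)  # ['http://example.com/a.png']
```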
Does the website domain resolve correctly?
What do I mean?
Razoraudit.com and www.razoraudit.com could be considered 2 websites.
You need to make sure that all the variations of your domain point to the same place.
For example, razoraudit.com, http://razoraudit.com and http://www.razoraudit.com should all redirect to https://www.razoraudit.com
Go to Google and type in variations of your website to see what happens!
Are there multiple subdomains that are not needed?
A subdomain does not have the same authority as your main domain, so if you put your blog on blog.[name of domain], for example, that puts the blog at a disadvantage.
Of course there are times when having a subdomain is relevant but we need to check.
Are internal links used effectively and are there broken internal links?
Internal links are hyperlinks that point to another page on the same website.
This could be links within the content of the page, links below the content or above the content through your navigation menu.
Besides helping visitors find their way around a website, they also help pass link equity from page to page which is important for ranking.
Link equity is when you link to another page and you pass value to that page. This can help with ranking of the page you’re linking to.
Internal links provide the most value to users when they are contextual.
This means they are linking to relevant, related content.
These links also help search engines discover what content on your website is related and to assign value to that content.
Good internal linking is incredibly important, so you need to make sure you don’t have broken links.
Broken links are those on your site that point to non-existent pages – they are bad for user experience but they can also have a negative impact on your rankings.
You can use a tool like Ahrefs Broken Link Checker to find all your broken links.
Ideally, you’ll replace all your broken links with live links or you can simply remove them.
Anchor text is the words on a webpage that are linked to another page on your website or to another website. The anchor text tells Google what the linked page is about, so use it wisely.
Are the right keywords targeted?
For the existing content, you want to identify the keywords/topics that were targeted.
The content may be targeting keywords that the business has no chance of ranking for.
Or you could be targeting keywords that are not the right keywords for that post.
We want to identify some opportunities for quick wins.
Are there any opportunities for the top ranking keywords?
You can lose rankings for top keywords so it’s important to track these using a tool such as SEMRush.
We run a ranking analysis of the top keywords to identify opportunities for improvement.
| Phrase | Keyword Ranking | Average Searches |
| --- | --- | --- |
Are there other missed keyword opportunities?
As well as checking the ranking for the top keywords you should also identify other opportunities for targeting keywords that the business may not be ranking for at all.
Are redirects set up correctly?
Imagine you had an old website and you migrated to a new one.
What happens to the old pages that are not going to be on the new website?
What happens when you have the same page but it’s a different URL?
There are many reasons to set up redirects so you just need to make sure that you set them up correctly.
What’s the difference between a 301 and a 302 redirect?
I’m glad you asked!
When you click on a URL and you end up going to a different URL it’s redirected. A 301 redirect tells Google this is a permanent redirect and to pass all the value of the original URL to the new one.
A 302 redirect is only a temporary redirect.
A tool like Screaming Frog will show you an analysis of your pages and redirects.
Just make sure you pick the right one!
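As a quick reference, the way an audit treats the common redirect status codes can be sketched like this (a simplified mapping for illustration, not an exhaustive list):

```python
# Simplified mapping of redirect status codes, as treated in an SEO audit
PERMANENT = {301, 308}       # pass link equity to the destination URL
TEMPORARY = {302, 303, 307}  # original URL keeps its value

def redirect_kind(status_code):
    """Classify an HTTP status code for audit purposes."""
    if status_code in PERMANENT:
        return "permanent"
    if status_code in TEMPORARY:
        return "temporary"
    return "not a redirect"

print(redirect_kind(301))  # permanent
print(redirect_kind(302))  # temporary
```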
Are all on-page elements used effectively?
On-page SEO is all about ‘optimizing’ your website pages so they rank higher in the search engines.
It includes optimizing your content, page URLs, headlines, HTML tags (title, meta, and header), and images. Here’s our top level checklist:
- Duplicate title tags
- Title tags too long or too short
- Description too long or too short
- Broken internal links
- Broken external links
When it comes to content, you need to make sure you have the right types of content for each stage of the buyer’s journey.
You also need to check if the keywords are used naturally.
As you can see in the infographic above, there’s plenty to consider in this area.
If you want to make sure you are optimising your content correctly, SEMRush provides an SEO writing assistant that can be added to Google docs. When you create your article in Google docs this will validate how well it’s optimised for relevant keywords.
Are there opportunities for content consolidation or content pruning?
Do you have content that brings no traffic or conversions but has high-quality links pointing to it?
Or 2 pieces of content where neither is performing that well?
Combine pages that aren’t performing. Sometimes this makes more sense than deleting them outright.
Brian Dean, Backlinko
This is an opportunity to strengthen the ability of one piece of content to drive more traffic.
What’s content pruning?
This is where you have content with no traffic, links or conversions.
Hit that delete button!
To identify this content you’ll need to use an SEO tool such as SEMRush/Ahrefs or Cognitive SEO.
By removing low value content you are focusing Google’s attention on the higher quality content on your website.
You can also identify related content that is not ranking and see if there’s an opportunity to combine this content to make one powerful post.
Map organic traffic, link and conversion data to each URL on the site using Screaming Frog API integration. Then, look for content consolidation (no traffic/ conversions, but unique content and/or links) or pruning (no traffic/links/conversions) opportunities.
Robbie Richards, Robbierichards.com
Is the website mobile friendly?
Google has moved to mobile first indexing for most websites at this stage.
What does this mean?
It means you’d better have a fully responsive mobile website if you want to do well in Google search results!
It’s that simple.
The starting point to see if you have a mobile friendly website is running Google’s mobile friendly test tool:
If you see this you can take a big sigh of relief!
Are there pages that can be pushed up in search results?
Your content can go down the ranks over time…
You’ve heard of tooth decay before. This is content decay…
Have a look at this chart from Backlinko which shows the organic click-through rate based on position in the search results:
Here’s an idea for a quick win:
Find a post that is already ranking high and look for opportunities to optimise and increase its ranking.
1. Link to it from other content on your website
Linking from other relevant content using good anchor text will give it a boost especially if that content is a high authority piece of content.
If you want to find other content on your website that is related go to Google and type in:
[Keyword you want to rank for] site:[name of website]
Website audit site:www.razoraudit.com
2. Get an external link to it
Andy Drinkwater recently shared this on his LinkedIn profile to show that getting external links to your content is still very important.
Getting external links to a post from relevant high authority websites will give it a boost.
If you want help with outreach use a tool such as Mailshake.
3. Add more content
Adding good content can’t do any harm.
Go to Ahrefs and put in the keyword you want to rank for and scroll down to the search engine results page.
Look at the keyword column:
This shows you all the keywords the posts are ranking for.
This will give you some great ideas for extra content to add.
4. Do more on page optimisation
You might find your meta title is not optimised correctly.
Or your headings on the page are not great.
Or you don’t have enough internal links to your post.
Some additional optimisation can give it the boost you need.
5. Republish it
To republish the post change the date of the post within WordPress to the current date.
And then publish it again.
When you republish a piece of content it’s going to get prominence on your website.
It’s going to get social shares.
You can also promote it to your email list.
And it will hopefully get some links.
Use URL filters in your keyword research tool to prioritize this process and align with your site monetization model. E.g. an ecommerce store may first apply a ranking filter to a /product or /category folder, whereas an affiliate site might apply it to a /review folder, or any URLs containing the “best” modifier.
Robbie Richards, Robbierichards.com
Any instances of high impressions with low click-through rate?
Imagine you were getting a lot of impressions on Google for particular pages on your website but there was a very low click through rate.
This could be a perfect opportunity to revisit the title tag as well as the meta description and optimise this further.
Find pages that are getting high impressions in Google Search Console but low CTRs. These are great candidates for testing better title tags.
Eric Siu, SingleGrain
Is the website (site) architecture good?
This is the structure of your website and how the pages are organised.
Good architecture is good for users because they find it easy to navigate through your site, find what they want and complete the relevant actions.
…it’s also good for the search engines because it makes it easier to index everything.
For a review of site architecture you’ll assess:
1. Page hierarchy: can you access the more important pages more quickly than the less important ones?
Here’s a basic structure:
You start off with your home page which is the most important page on your website.
You should then be able to quickly access the main sections of your website. This is generally through the menu.
But then there may be further pages related to these pages that could be accessible through sub menus or directly on the other pages.
The more important the page the higher it is up the tree (or at least should be!).
Ideally, you won’t have any orphan pages on your website which are pages that are not linked to from anywhere.
These are pages that Google may not find when it’s crawling your website unless you have them in your sitemap.
2. URL structure
The URL structure of your website is also an important part of your website structure.
Google prefers simple URL structures.
This is not simple:
This is simple:
Think of your dropbox folder structure.
You have folders and sub folders.
You use names that are short and easy to understand and sub folders that are related to the main folder.
- Shorter descriptive words are better than longer ones
- Use hyphens to separate words
- Use logical category structure
If you’ve got loads of numbers in your URL name you’re doing something wrong.
What’s a crawl budget? Crawl budget is the amount of time and resources Google will spend crawling your site. If Google spends too much of it on low-value URLs, the budget can be exhausted before your whole site is crawled. A good website architecture helps ensure you don’t have crawl budget issues.
Note: Crawl budget typically only applies when you have hundreds of thousands of pages on your website.
Are there any content duplication issues?
Duplicate content is all content that is available on multiple locations (URLs) on or off your website.
Duplicate content can cause SEO problems so it’s important to identify and fix any duplication issues.
Ahrefs audit will show you duplicate content on your website.
This report will also show you the reason for duplication so you can take the appropriate action.
Canonicalization can help with content duplication that makes sense. You can have multiple copies of the same page and tell Google about it through a canonical tag. You point all pages to the master copy.
Are there thin content issues?
Thin content refers to pages that don’t have much content on them.
These pages typically don’t provide any real value to your website visitors and are bad for SEO.
So, what actions do you need to take to resolve thin content issues?
Use a tool like Screaming Frog to crawl your website and sort the results by word count.
Find pages with the least amount of words on them (less than 300 words) and decide what you want to do with this content.
You have several options:
- Add more content. Update the page with new research and statistics, quality imagery and actionable advice, and make it more valuable to readers and search engines alike.
- Remove the content. If the content doesn’t have any valuable backlinks, targets irrelevant keywords and topics, and generates no traffic then the best option is to remove that page.
- Redirect the content. In cases where a page with thin content has some SEO value e.g. it has backlinks from high authority websites and generates traffic and some conversions, then you should consider using a 301 redirect and point it to a closely related page on your website.
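The word-count step above can be sketched like this. The page texts and the 300-word threshold are placeholders; in practice the text would come from your crawl export.

```python
# Minimal sketch of the word-count step: flag pages under a chosen threshold.
def thin_pages(pages, threshold=300):
    """Return the URLs whose body text has fewer than `threshold` words."""
    return [url for url, text in pages.items() if len(text.split()) < threshold]

# Placeholder crawl data: URL -> extracted body text
pages = {
    "/long-guide": "word " * 500,        # 500 words, fine
    "/thin-page": "just a few words",    # well under the threshold
}
print(thin_pages(pages))  # ['/thin-page']
```

Once flagged, each URL goes through the add/remove/redirect decision above.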
Is Open Graph data set up correctly?
Open graph tags are important because they let you control what content appears when a link from your website is shared on social media.
You need to make sure you have the following set up:
- og:title – The title of your content that will be displayed
- og:type – The type of content e.g., “video.movie”
- og:image – An image URL which should be displayed when the content is shared on social.
- og:url – this is the canonical URL for the page you are sharing.
Since Open Graph tags are snippets of code you’ll most likely need help from a web developer to help you implement them on your website.
If you’re on WordPress, plugins like YoastSEO can be really helpful.
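For reference, here’s what those four tags look like in a page’s head section. The values below are placeholders, so swap in your own:

```html
<!-- Hypothetical example values; replace with your own content -->
<meta property="og:title" content="The Ultimate SEO Audit Checklist" />
<meta property="og:type" content="article" />
<meta property="og:image" content="https://www.example.com/images/audit.png" />
<meta property="og:url" content="https://www.example.com/seo-audit-checklist" />
```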
What is the backlink profile like?
Backlinks are one of the most important factors to ranking pages on your website.
A great backlink profile will make life a lot easier!
When you’re reviewing backlinks you want to answer the following questions:
Do you have great backlinks to a post that is not ranking? There’s an opportunity to redirect this post to a related page to help this page rank.
Do you have backlinks linking to an invalid page? If so then you can redirect the invalid page.
Do you have bad backlinks to your website? There is an option to ‘disavow’ these links within Google.
When evaluating backlinks it’s important to consider website and page authority. Many SEO tools estimate the value of a domain based on a ranking out of 100. They also do the same for pages on a website. You want to get links from relevant high authority domains and within these domains you ideally want links from high authority pages.
Is Robots.txt configured correctly?
The robots.txt file contains some instructions for Google and for other search engines.
It basically tells the search engines which parts of your website they may crawl and which parts they should skip.
You want to check that the directives in this file are valid and that none are missing!
Here’s an example robots.txt:
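The file below is a minimal sketch matching the directives described next; the domain is a placeholder.

```text
User-agent: *
Disallow: /go/
Disallow: /sme
Sitemap: https://www.example.com/sitemap.xml
```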
The instructions are as follows:
- User-agent: * – the rules that follow apply to all crawlers
- Disallow /go/ and /sme – we’re asking search engines not to crawl anything under these sub directories
- Sitemap directive – this tells search engines where the sitemap is so they can use it to find pages on the website.
You can check if you have a robots.txt file by adding /robots.txt after your domain name in the address bar.
Are there any issues in the log files?
When Google is crawling your website you’ll see what it’s crawling through your log files.
There are tools, such as Screaming Frog’s Log File Analyser, that you can use to get a report on this.
You’re reviewing to uncover things like:
- Are your important pages crawled regularly?
- Any errors, broken links etc.?
- Any problematic redirects?
- Any slow pages?
Google search console will show you any rendering errors.
Are Canonical tags set up correctly?
Sounds quite techie doesn’t it?
It’s actually pretty straightforward.
A Canonical tag is a line of code on your web page that tells Google if this page is the original source or not.
Most of the time you’ll see the canonical tag pointing to the same page (which is known as self referencing) but it can also point to another page.
Here’s what you see in the source code:
<link rel="canonical" href="http://www.example.com/product.html" />
But you can also see it pointing to another page on your website which is a duplicate.
You are telling Google which is the master page.
If you want to test a specific page for this use the Canonical tag test tool.
Why use canonical tags?
Imagine you wanted the exact same page to appear on your website under 2 different categories.
You’ll tell Google that one page is the main one and one is just a copy by using canonical tags.
Is there non-spammy anchor text?
Anchor text is the text which is hyperlinked when linking to another page on your website or to someone else’s page.
We look at the anchor text for links coming into the domain and expect that the majority will be branded terms. If that’s not the case it’s worth investigating. We want to see lots of branded anchor text and a good variety of other anchor text (i.e. not all focused on similar words).
Are there any differences between content on mobile and desktop?
A company I chatted to recently said their agency had told them to cut down a lot of the text on mobile compared to desktop.
Google has moved to mobile-first indexing, so if you trim down your content on mobile you may not get everything indexed correctly.
Of course there are some changes you will need to make for mobile.
- Change the top of the page horizontal menu to a hamburger menu
- Change images that don’t work well on mobile
… we need to check how the content looks and performs on mobile!
Are there any AMP errors (if used)?
AMP (Accelerated Mobile Pages) is a stripped-down version of your desktop pages designed to load quickly.
Loads quicker = better user experience
If AMP is implemented we want to make sure there are no AMP errors appearing in Google Search Console.
Are there any redirect chains?
A redirect chain is when you redirect a page to a redirected page!
You only want one redirect in place.
Redirect chains can happen over time because you lose track of what is redirected to what!
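If you have a crawl export that maps each redirecting URL to its target, a rough sketch like this can surface chains. The URLs are placeholders; real data would come from a crawler such as Screaming Frog.

```python
# Rough sketch: spot redirect chains (more than one hop) in a redirect map.
def redirect_chains(redirects):
    """Return every redirect path that is longer than a single hop."""
    chains = []
    for start in redirects:
        path, url = [start], start
        # Follow the map, stopping if we hit a loop
        while url in redirects and redirects[url] not in path:
            url = redirects[url]
            path.append(url)
        if len(path) > 2:  # start -> hop -> final means two redirects chained
            chains.append(path)
    return chains

# Placeholder map: each key redirects to its value
redirects = {"/old": "/newer", "/newer": "/newest"}
print(redirect_chains(redirects))  # [['/old', '/newer', '/newest']]
```

The fix is to point every URL in the chain straight at the final destination.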
Any issues when you view source?
It’s also useful to view the source of pages on the website to see if there are issues that may affect SEO.
For example, there could be a bunch of scripts loaded that are not needed, causing poor performance.
Just removing unused scripts will give website speed a boost.
Is .htaccess configured correctly?
If you’re building a website with PHP and Apache (web server) there’s a powerful .htaccess file that can be used to apply certain configurations to your website:
- You can set up URL redirects
- Load custom error pages, such as 404 pages
- Force your site to use https instead of http
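As a hedged illustration, a minimal .htaccess covering those three jobs might look like this (assumes Apache with mod_rewrite enabled; the paths and domain are placeholders):

```apache
# Hypothetical sketch; assumes Apache with mod_rewrite enabled
RewriteEngine On

# Force https on every request
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Custom 404 error page
ErrorDocument 404 /404.html

# A single permanent URL redirect
Redirect 301 /old-page https://www.example.com/new-page
```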
So, it’s important for us to review this file to see what has been configured.
Is pagination handled correctly?
We will investigate how pagination is handled.
Google recently announced that it doesn’t look at rel=next, rel=prev when indexing pages. But if you have this in place it’s ok to leave it and it is still good for accessibility.
But some issues that can happen with pagination are:
- Only indexing the first page – this leaves the deeper pages as orphan pages.
- Setting the canonical tag on every paginated page to point to the first page only.
We’ll see what is set up and whether it will cause any issues.
Is there any section with infinite scroll?
An infinite scroll is where you see a subset of a list of items and as you scroll you see more and more items.
Google may not see content that is only loaded as you scroll, so it can cause issues with SEO.
Is the website fast enough?
This is becoming increasingly important and anyone in the SEO world will tell you the importance of having a fast website.
Ideally you’ll score high (above 90) on Google’s PageSpeed Insights test for the important pages on your website.
We have covered this in detail in the technology and performance audit.
If you have adapted your website to suit different countries or languages you’ll want to review this section.
Is there good international link acquisition?
If you want to rank in other countries you’ll ideally have links from country specific domains back to relevant pages.
A link from a .fr domain (French) back to your French page.
Is it obvious keyword research has been done for each country?
For country pages we’ll need to analyse the optimisation of these pages to ensure that it is focused on the international terms.
How is the website structured to support internationalisation and is this the right approach?
There are various ways of setting up international versions of your page. For example, on this ecommerce website they use sub domains:
On each product page they let Google know, using the hreflang attribute, what page to send people to based on their country.
On this website they have everything on the one domain but have different pages:
<link rel="alternate" href="https://www.guinness-storehouse.com/fr" hreflang="fr">
And some websites will use country specific domains e.g. www.boohoo.fr.
There’s advantages/disadvantages to each of the approaches.
Where are domains hosted?
Where the domains are hosted will have an impact on speed for local users, so it’s important to check where the pages on subdomains or country-specific domains are hosted.
The fundamentals of SEO that we covered so far also apply to ecommerce websites.
But…there are some specifics that you need to be aware of.
Is the website structured correctly with categories and products?
You want to have optimised product pages but you’ll also want to have good category pages if you have a lot of products.
You want to rank based on category and based on product.
Is breadcrumb navigation set up?
This helps the user experience when browsing through products, and it also helps Google understand the structure of your site.
Do product pages have enough content?
You ideally want at least 250 words of content for each page. The more content the better (as long as it’s readable and doesn’t feel overwhelming for the user).
Google doesn’t really care that it’s a product page. It’s just a page of content.
Is there relevant schema markup?
Schema markup is additional information you add to the product/category page that may appear in search results. It can improve your click-through rates, which is good for SEO.
The second entry listed here stands out because of the reviews. If you visit the product page you’ll see there is schema markup for reviews added to the page.
If you’ve ever read Robert Cialdini’s book about the 6 principles of persuasion you’ll remember that ‘Social Proof’ is important in persuasion.
And reviews are an important social proof you can add to your product pages!
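As an illustration, review markup is typically added as a JSON-LD snippet like the one below. The product name and rating values are placeholders:

```html
<!-- Hypothetical JSON-LD snippet; values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Tracksuit",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```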
Are related products listed on the page?
Having related products listed on the page is good for:
a). Keeping people on the website
b). Giving people an alternative or add on to the current product
c). Providing links to other pages which will help with ranking and indexing
Are meta titles/descriptions created correctly?
We check this as part of a regular SEO audit but it’s worth mentioning this for an ecommerce website also.
If there are a lot of products in your shop you probably won’t be able to create meta titles and descriptions manually, so they will be generated automatically.
That’s ok, but we need to ensure they are generated correctly.
What are people searching for?
We need to track all searches to see what people are searching for.
They may be searching for products you don’t have.
They may be searching for queries you don’t have answers to.
This can lead to good ideas for additional content for the site to drive more traffic!
How is Faceted Navigation handled?
Boohoo is one of my favorite shops for buying my clothes!
On the left you can see the various filters I can apply.
This is known as faceted navigation (a facet is a different aspect or feature of something).
If I filter on tracksuits without putting in size or any other parameters the URL is:
Now I apply a filter for size medium.
And now I change this to small:
But what if I wanted to filter on small size, that’s a bomber style and is between 20 and 100 euro:
What I get as a result are these tracksuits which are actually pretty nice….mmm…I might buy one!
But that is not why I’m showing you this!
What is happening is that pages are being dynamically created based on the filters you apply.
But you don’t want all of these pages indexed.
You want to focus your attention on the ‘Tracksuits’ page.
There are several ways of making sure that these parameterised pages are not indexed.
For example, use the canonical tag.
The parameterised page carries a canonical tag which says that the real page to index is the tracksuits page, not the parameterised version.
And here’s how this is handled by Boohoo:
<link rel="canonical" href="https://ie.boohoo.com/mens/tracksuits" />
Is there NAP consistency across channels?
NAP (name, address, phone) is important to have on your website but you want to also make sure that it is consistent across your other profiles particularly on Google My Business.
Are keywords relevant to local customers?
If you’re going to target local customers you want to make sure you have content which is targeting local phrases that people are using.
Are there local pages?
If you have businesses in multiple locations having a page for each business is a good idea. Each page should have details of the business, NAP, Google Map etc.
If you only have one business you should still have a contact page which includes the business name, address and phone number.