A 16-Step SEO Audit Process To Boost Your Google Rankings

David McSweeney
David is the owner of Top5SEO and a white hat SEO evangelist. SEO case studies make him a lot happier than they should, and he has a tendency to overuse ellipses...

Article stats

  • Referring domains 47
  • Organic traffic 924
Data from Content Explorer tool.
    Follow this 16-step SEO audit process to nail your site's SEO and boost your search traffic.

    The little things are important in SEO.

    But so are the big things.

    And sometimes we can spend so much time sweating the small stuff that we completely fail to see the 800-pound gorilla that’s stomping all over our Google rankings.

    Fortunately, running an SEO audit on your site can uncover those gorillas and give you a ton of ideas for boosting your search traffic.

    Here's how to do it.

    Tools Required For The SEO Audit Process

    Here are the tools I will be using during the audit process.

    Not all of them are essential. But they will help to make the process easier.

    Editor's note

    For this walk-through, I’ll be auditing 2 sites.

    In steps 1-8 I will be looking at Simple Life Insure - a life insurance broker from California.


    I will introduce the second site before step 9.

    Let's get started!

    Step 1: Start A Website Crawl

    In a moment we’re going to be running through a number of manual checks on the website.

    But first we'll start a website crawl running in the background.

    An SEO crawler will spider the site in the same way as Google and give us some useful information on structure and current SEO setup.

    I’m going to use Beam Us Up for the crawl (FREE), but you could also use Screaming Frog’s SEO Site Auditor (£99 per year).

    If you are using Beam Us Up, just enter your site’s address in the URL field and hit ‘Start’.


    The crawler will start working away in the background while we continue with the audit.

    Sidenote.
    We're currently working on major updates to our Domain Health and Crawl Report tools. We expect to relaunch the tools later this year, which will allow you to view a full crawl report from within your Ahrefs account.

    Step 2: Check That Only 1 Version Of The Site Is Browseable

    Consider all the ways someone could type your website address into a browser (or link to it).

    For example:

    • http://yourdomain.com
    • http://www.yourdomain.com
    • https://yourdomain.com
    • https://www.yourdomain.com

    Only one of these should be accessible in a browser. The others should be 301 redirected to the canonical version.

    In Simple Life Insure's case, there's a problem.

    You can browse the site at both http:// and https://.

    So that's something that needs to be fixed as a priority.

    Recommendation

    We would recommend using https:// (either www or non-www) as there is a slight rankings boost for SSL-enabled sites. Plus, it also keeps your site secure and increases trust.

    You can get a free SSL certificate for your site from Let's Encrypt.

    Either way, make sure that your site is only accessible through one URL.

    All other versions should be 301 redirected.
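    If you want to script this check, here's a minimal sketch in Python (standard library only). The variant URLs shown are placeholders — swap in your own domain:

```python
import http.client
from urllib.parse import urlparse

def status_of(url, timeout=10):
    """Return the raw HTTP status for a URL *without* following redirects."""
    parts = urlparse(url)
    Conn = (http.client.HTTPSConnection if parts.scheme == "https"
            else http.client.HTTPConnection)
    conn = Conn(parts.netloc, timeout=timeout)
    conn.request("HEAD", parts.path or "/")
    status = conn.getresponse().status
    conn.close()
    return status

def audit_versions(statuses):
    """statuses: {url: http_status}. Exactly one URL should be live (200);
    everything else should be a permanent (301) redirect."""
    live = sorted(u for u, s in statuses.items() if s == 200)
    bad = sorted(u for u, s in statuses.items() if s not in (200, 301))
    return live, bad

# Usage (requires network; yourdomain.com is a placeholder):
# variants = ["http://yourdomain.com", "http://www.yourdomain.com",
#             "https://yourdomain.com", "https://www.yourdomain.com"]
# live, bad = audit_versions({u: status_of(u) for u in variants})
```

    If `live` contains more than one URL, or `bad` is non-empty (e.g. a 302 where a 301 should be), that's your fix list.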

    Step 3: Manually Check Home Page SEO

    Let’s start by looking at a few on-page fundamentals.

    We’ll go to the site’s home page in a browser and right click to view the source.


    We’re just going to have a quick check on some basic on-page SEO here. To make things easy we’ll ask a few questions:

    1. Does the page contain a well-crafted, clickable title? Does it conform to SEO best practices?
    2. Is there a custom meta description? Is it optimised for maximising click-throughs?
    3. Is there one instance of the H1 tag? Does it conform to SEO best practices?
    4. Are subheaders (H2, H3 etc) properly used and conforming to SEO best practices?

    Let’s find out the answers.

    Check The Home Page Title Tag

    The title tag remains the single most important on-page ranking factor in 2016.

    In his excellent title tag best practice guide, Shaun Anderson of Hobo Web explains:

    The Page Title Tag (or more accurately the HTML Title Element) is still, however, arguably the most important on-page seo factor to address on any web page. Keywords in page titles can HELP your pages rank higher in Google results pages (SERPS). The page title is also often used by Google as the title of a search snippet link in search engine results pages. Keywords in page titles often end up as links to your web page.
    — Shaun Anderson, Hobo Web

    Let’s take a look at the current title tag.

    <title>Affordable low cost life insurance + instant quotes online | SimpleLifeInsure.com</title>

    There are a couple of issues here:

    1. Generally for the home page, I prefer to see the brand first.
    2. The title tag is too long and will truncate in search.

    Here’s how the page is currently displayed in search.


    Sure enough, we can see that it’s truncating. And actually, Google have flipped the title round themselves.

    Recommended Changes

    1. Rewrite the title to put the brand first.
    2. Keep the title to around 55 characters in length to avoid truncation.
    3. Target 1 high volume keyword in our strapline.

    We can check search volume for the keyword “low cost life insurance” in Ahrefs Keywords Explorer.

    Not bad.

    But it turns out there’s actually more volume for “affordable life insurance”.

    And since “affordable” and “low cost” basically mean the same thing, I would probably go for:

    <title>SimpleLifeInsure.com: Instant affordable life insurance quotes</title>

    Now, I know I said "target 1 high volume keyword".

    But I flipped "instant" to the start of the strapline so we could also get the exact keyword "life insurance quotes" in there.

    That keyword has some mad volume...

    So we might as well throw it in!

    Recommendation
    While 55 characters is a good rule of thumb for title length, it's actually pixels that matter, rather than characters.

    You currently get around 512px to play with, although Google are tweaking this a bit at the moment (for example mobile results may display more).

    While it's easy to figure out how many characters your titles are, working out pixel width is a bit more of a challenge.

    Fortunately there's a cool free tool that will do the work for you.

    Just enter your title tag, hit "Yo, Get Pixel Width" and you'll get a preview of your title along with a pixel count.


    Sweet!
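    If you'd rather check titles in bulk (say, from a crawl export), a character count makes a reasonable first-pass filter. Here's a rough Python sketch — remember it's only a proxy for the pixel width that actually determines truncation:

```python
import re

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)

def check_title(html, max_chars=55):
    """Flag missing or truncation-prone title tags. Character count is a
    rough proxy - pixel width is what actually matters."""
    match = TITLE_RE.search(html)
    if not match:
        return None, "missing title tag"
    title = match.group(1).strip()
    if len(title) > max_chars:
        return title, "too long ({} chars) - likely to truncate".format(len(title))
    return title, "ok"
```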

    Check The Home Page Meta Description

    The meta description doesn't directly influence your ranking.

    But a well written meta description can help to get you more clicks.

    And since it's generally accepted that Google is using click-through rate as a ranking signal, you'll want to make sure your home page's meta description is:

    • a) well written
    • b) super enticing

    Think of it as your advert in the search results.

    Ramesh Ranjan puts it well in this article for Hubspot:

    The meta description is one of your last hopes on search engine results pages (SERPs) to attract a searcher to come to your site. This is something that digital marketers constantly neglect to focus on — perhaps because they think it just doesn’t matter anymore. But if you’re not putting effort into your meta descriptions, you could be missing out on good website traffic that can bring in lots of new leads and customers.
    — Ramesh Ranjan, rameshranjan.com

    A good meta description should communicate key USPs for the product or service you are offering.

    For example, an e-commerce site could include points like ‘FREE delivery’, ‘No quibble returns’, ‘Cheapest prices’ etc.

    Here's the current meta description for Simple Life Insure's home page.


    I think that does a pretty good job.

    So no change required.

    Check The Home Page Header Tags

    The H1 tag is still an important on-page ranking factor.

    We want to make sure that every page on the site has a unique, descriptive H1 tag.

    For the home page, you will generally want to:

    • communicate the site’s main purpose
    • include 1 or 2 high level keywords in the process

    Here is the current H1 tag:

    <h1>Compare Life Insurance Quotes Within 1 Minute</h1>

    It's pretty good.

    But I would recommend adding in the word "affordable". That way we're hitting the keyword "affordable life insurance" in both the title and H1 tag.

    <h1>Compare Affordable Life Insurance Quotes within 1 Minute</h1>

    All things being equal, including that exact phrase could help to improve rankings for the keyword.

    Check Subheaders (H2, H3 etc)

    Subheaders should be used in a logical way to break up each page’s content. They also provide us with a good opportunity to target secondary keywords/phrases.

    Try to avoid using generic phrases (“more information”, “details” etc) in subheaders. Instead, these phrases should be wrapped in a div and styled with CSS.

    Simple Life Insure's home page makes good use of subheaders and does not require amendment.
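    These header checks are easy to automate across a whole site. Here's a small sketch using Python's built-in HTML parser — it simply collects the headings on a page so you can confirm there's exactly one H1 and that the subheaders are descriptive:

```python
from html.parser import HTMLParser

HEADING_TAGS = ("h1", "h2", "h3", "h4", "h5", "h6")

class HeadingCollector(HTMLParser):
    """Collect (tag, text) pairs for every heading on a page."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self._current = None
        self._buf = []

    def handle_starttag(self, tag, attrs):
        if tag in HEADING_TAGS:
            self._current, self._buf = tag, []

    def handle_data(self, data):
        if self._current:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if tag == self._current:
            self.headings.append((tag, "".join(self._buf).strip()))
            self._current = None

def audit_headings(html):
    parser = HeadingCollector()
    parser.feed(html)
    h1s = [text for tag, text in parser.headings if tag == "h1"]
    return h1s, parser.headings
```

    A healthy page should come back with exactly one item in `h1s`.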

    Step 4: Analyse The Crawl Report

    Once your crawl report is complete you'll get a list of on-site issues.

    Our crawl of Simple Life Insure found a number of issues that require our attention.


    We'll want to go through and fix them to make sure our on-site SEO is up to scratch.

    The duplicate title tags issue above is common on WordPress sites. Basically, it's caused by the blog archive pages (i.e. blog/page/2/ etc) all being:

    • a) indexable
    • b) given exactly the same title

    We can either append "- Page X" to our titles, or just add the noindex, follow tag to everything except page 1.

    Personally I prefer the second option. It's easy to do in YOAST.

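    If your crawler lets you export URLs with their titles, spotting duplicates takes only a few lines of Python. A quick sketch (the (url, title) pairs would come from your own crawl export):

```python
from collections import defaultdict

def duplicate_titles(pages):
    """pages: iterable of (url, title) pairs, e.g. from a crawl export.
    Returns {title: [urls]} for any title shared by more than one URL."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {title: urls for title, urls in by_title.items() if len(urls) > 1}
```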

    Step 5: Check Content Is Unique

    Google HATES duplicate content.

    Too much of it and your site can get crushed by Panda.

    Content duplicated across multiple pages on your site is bad. But when it's duplicated on other websites it's even worse.

    You can easily find potential duplicate content issues across the web with a premium Copyscape account. For $10 you can check up to 200 URLs using their batch tool.

    Here's the process.

    Grab the URLs from your crawl report and paste them into the batch tool.


    Click add and the tool will verify the number of URLs you entered. It will also give an approximate time for the batch scan to complete.


    When it's done you'll get a list of all scanned URLs showing the number of duplicate content matches and a colour coded "Risk" score.


    You can click an individual URL to see the matches.


    Looks like in this case the issue is simply the text disclaimer in the footer of the site.


    That shouldn't be a problem on most pages.

    But it might be on category pages where there is very little unique content.


    Recommendation
    There are a couple of solutions.

    Firstly, the text disclaimer could be replaced with an image.

    That certainly gets rid of the duplicate content problem. But it doesn't address the issue of the category pages having very little unique content of their own. They might still get flagged by Panda (or at least struggle to rank).

    So, the better solution would be to beef up the category pages by adding a unique introduction. Doing that will increase the unique content on the page, mitigating the boilerplate text.

    And as a bonus, it may help to boost the rankings of the category pages themselves.

    Step 6: Run Some Tests On Google

    We'll now head over to Google to run some searches.

    Search For Brand

    First of all, we’ll do a search for the site’s brand.

    Unless a site is very new (or the brand is a very generic phrase), I would expect to see it rank at position 1.

    If not then this is a strong indication of deeper problems, such as heavy algorithmic or manual penalties.

    Simple Life Insure actually rank at position 2.

    The site is relatively new, so it's probably not anything to be overly concerned about.

    But it does indicate to me that Google might not completely trust the brand yet.

    How To Increase The Site's Trust

    To increase the site's trust I would recommend:

    • Building some strong, branded links
    • Building some citations on business directories
    • Making sure the site has a Google Business listing
    • Ensuring the site has a presence on all the major social networks

    Perform A Search Using The “site:” Operator

    Next we’ll perform a search using the site: operator, which will show us how many pages Google currently has indexed for the domain.

    This can be a good early indicator of indexing issues.

    The format for the search is “site:yourdomain.com” without the quotes.


    We can see that there are currently 177 pages included in the index for this domain.

    That seems a little high considering our crawl only discovered 92 URLs.

    If we scroll down we can see there are quite a few "junk" pages that we'll want to get rid of.

    For example, there are lots of tag pages indexed.

    These have no content on them and should be removed from the index.

    Again, we can easily manage that through YOAST by adding noindex,follow to all tag pages with one click.


    It's well worth scrolling through to find other thin/low quality pages to remove from Google's index.
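    One way to speed this up: export the indexed URLs and diff them against your crawl. Any URL Google has indexed that your own crawl never reached is a good candidate for noindexing. A minimal sketch:

```python
def junk_candidates(indexed_urls, crawled_urls):
    """URLs in Google's index that our own crawl never reached - often tag
    pages, archives and other thin content worth noindexing."""
    return sorted(set(indexed_urls) - set(crawled_urls))
```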

    Editor's note
    For the next part of the audit process we will switch sites to ToyUniverse.com.au - a toy retailer from Australia.


    For traffic illustration we will be looking at data from last year.

    Step 7: Analyse Search Traffic

    Clearly the aim of any SEO audit is to identify ways to increase a website’s traffic. So it makes sense to take a look at how the site is currently performing.

    We’ll run a few reports in Google Analytics to give us a quick overview.

    First we'll look at the site’s current search traffic.

    The data for August 2015 shows an average of around 250 visitors a day from search.

    Acquisition > Overview > Organic Search

    Google analytics data

    From the same report, we can click on "Landing Page" to discover which pages are currently bringing in the most search traffic.

    A broad spread of landing pages (in this case there are 1,265) would suggest the site is in reasonable health.

    Google analytics landing pages

    Finally, we will set a wide range (I’m going to go back to July 2013) and view traffic by week. We're looking for any noticeable spikes, or dips in traffic.

    spikes in traffic

    In this case we can see some big spikes at the end of each year, followed by drops in January. I suspect that's down to the seasonal nature of the product as generally traffic has been pretty stagnant over the period.

    Of course, in other cases, a drop in traffic may be indicative of a Google penalty. That's something we will take a closer look at shortly.

    Step 8: Gather Data From Google Search Console

    Google Search Console (formerly Webmaster Tools) will give us some useful data for our SEO audit.

    Look For Crawl Errors

    First off, we’ll take a look to see if Google is reporting any issues with crawling the site (Crawl > Crawl Errors).

    We can see that Google is reporting a number of 404 (not found) errors, and several 403 (access denied) errors.

    Google search console crawl errors

    On closer inspection, it appears that these 404s are thrown by old products which have been removed from the site.

    In this case I would recommend that expired products are 301 redirected to either a similar product, or a parent category.

    This will preserve any link equity that the pages may have acquired (from both external and internal links), which would be lost with the 404.

    The pages returning 403 errors may require a little deeper digging to find out how Google is finding them.
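    If you have a lot of expired products, it's worth scripting the redirect map rather than writing rules by hand. Here's a sketch of the idea — it maps each dead product URL to its parent category when one is known (the URL structure shown is hypothetical):

```python
from urllib.parse import urlparse

def redirect_target(url, category_targets, fallback="/"):
    """Pick a 301 target for an expired product URL: its parent category
    when we know one, otherwise a sensible fallback page."""
    path = urlparse(url).path
    parent = path.rstrip("/").rsplit("/", 1)[0] + "/"
    return category_targets.get(parent, fallback)
```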

    Check For HTML Improvements

    We can take a look at the HTML Improvements report (Search Appearance > HTML Improvements) to get a quick overview of any on-page issues Google has found when indexing the site.

    In this case Google is reporting several issues, including duplicate title tags and duplicate meta descriptions.

    Google search console HTML improvements

    We should have already picked up on most of those in our own crawl of the site. But it's worth double checking to be sure.

    Editor's note
    For steps 9, 10, 11 and 15 you will require an Ahrefs account. If you don't have one yet, you can take a free 14-day trial.

    Step 9: Check The "Curve"

    The overview report in Ahrefs Site Explorer gives us some useful information on the general direction of travel for the site. We can get some quick visual feedback on whether our SEO efforts are having a positive impact on rankings and traffic.

    We'll start by entering the domain into Site Explorer and clicking 'Explore'.

    Site Explorer > Enter domain > Explore


    This will take us to the overview report for the domain.


    The number of referring domains pointing to the website has been steadily increasing, which is good to see.

    Next we'll click on the "Organic Search" tab and take a look at keywords and estimated traffic.


    We can see that there has been an increase in both. Although there does appear to have been a dip in rankings between April and July this year, which has now recovered.

    Tip
    These reports are particularly useful for newer sites, where improvements in rankings may not yet have resulted in increased traffic.

    If we can see that keywords and traffic are moving steadily upwards we can assume that continuing our SEO efforts will soon yield positive results.

    Step 10: Find Pages Ranking in Positions 5-10 for High Volume Keywords

    Ranking at position 5 for a high-volume keyword sounds pretty good.

    Until you look at the difference moving up just a couple of positions could make to your traffic.


    The good news is that if you are already ranking reasonably well, a little proactive SEO should be enough to push you into the top spots.

    What we want to look for is high volume keywords where we rank in positions 5-10. It's easy to do with Ahrefs Site Explorer.

    We can simply run the organic keywords report and set the position and volume filters accordingly.

    Site Explorer > Enter domain > Explore > Organic Keywords > Position (from: 5 to: 10) > Volume (min: 1,000 max: blank)


    We'll then get a list of keywords (and content) we can focus on for some quick wins.


    So how do we boost these pages and push them into the top spots?

    There are a few options:

    1. Add some more internal links to the pages
    2. Build some fresh backlinks to the pages
    3. Update and relaunch the content
    4. Make sure on-page is optimised for the exact keyword

    Sneaky Tip
    Want to build some fresh backlinks to the page in seconds?

    Just find a page on your site that's not bringing in much traffic (but has accumulated a few inbound links) and 301 redirect it to the page you want to boost.

    It's a little naughty...

    But it works.

    Step 11: Analyse Backlink Profile

    We’ll be conducting a quick, manual audit on the site’s link profile to look for any obvious issues.

    Ahrefs makes this super easy for us!

    Step 1: Anchor Text Distribution

    We'll go back to the overview report for the domain and scroll down to the "Anchors Cloud". This gives us a quick, easy to digest visual of the site’s anchor text distribution.


    In this case we can see that naked URLs and brand links make up over one third of the site’s link profile.

    We can also see other anchors which include the Toy Universe brand name.

    This is good news from an SEO perspective as branded links help to build trust.

    If you were to see a large number of keyword anchors in here, then that is something you may have to take a closer look at.

    But we’re all good in this case 🙂

    Step 2: Broken Backlinks and Some Quick Wins

    Next, we’ll take a look at the ‘Broken Backlinks’ report under ‘Inbound Links’, which can often be the source of some quick wins.

    This will show us links to pages on our site that are currently returning a 404 error – effectively losing the link equity.

    Site Explorer > Enter domain > Explore > Backlink profile > Backlinks > Broken

    In this case the report has found 3 broken links.


    We can easily fix them (and get back the link juice) by creating 301 redirects from the old, expired URLs to relevant, related pages.

    Step 3: Quick Link Audit/Sniff Test

    Finally, we will have a quick run through the site’s referring domains, to see if anything sticks out as suspicious, or low quality.

    We’ll click on "DR" to reverse the order of the report and put the lowest quality links at the top.

    Site Explorer > Enter domain > Explore > Backlink profile > Referring domains > Sort by DR (lowest first)


    Over time you will develop a sixth sense for spotting low quality links.

    If anything looks suspicious you can simply click on the number under the "Backlinks" column, then click-through to the linking URL to take a look.


    You can spend time going through this in more detail when you conduct a full audit.

    But for now, look for anything that looks low quality, or purposely manipulative.

    Also look out for obvious signs of ‘SEOing’ (it’s a word ok!). For example lots of directory links, or links from multiple forum profiles.

    Check out our guide to finding bad links for more on what to look out for.

    In this case, the link profile looks relatively clean, so we can move on to the next stage of our SEO audit.

    Step 12: Identify Any Penalties

    Editor's note
    Both Panda and Penguin are now baked into Google's algorithm and influence rankings on a "real-time" basis.

    However the following method is still useful for finding historical penalties which may continue to impact your rankings.

    I mentioned earlier that we would take a closer look at drops in traffic that may be indicative of Google penalties.

    Fortunately for this task there is a great (FREE) tool called "Panguin" that will do most of the hard work for us.

    Here’s how you can use the Panguin tool to quickly identify drops in traffic that align with Google’s algorithmic updates.

    Step 1: Log Into Analytics

    Load up the Panguin tool (here) and log in to your Google Analytics account.

    Panguin tool

    Step 2: Select Your Site

    Once logged in to your Analytics account, select the site you wish to review from the left hand menu.

    select your site

    This should be the same as your list of sites in Google Analytics.

    Step 3: Identify Penalties

    Once selected, you will be presented with a chart showing your organic traffic. The chart will be overlaid with various colour-coded lines which align with Google updates.

    identify Google penalties

    You can turn the various updates (Panda, Penguin etc) on or off, by toggling the buttons on the right hand side.

    In the image below I have turned off all algorithmic updates with the exception of Penguin.

    showing Penguin penalties

    We can see that the Penguin updates in December 2014 align with the traffic drop. However, I suspect this is merely coincidence and that the fall is just down to the seasonal drop-off.

    What to look for
    Traffic drops aligning with algorithmic updates.

    Step 13: Check Out Site Speed

    It has long been confirmed that site speed is one of Google’s many ranking factors (although there is an interesting study here which suggests crawl speed may actually be more important than page load speed).

    Certainly a fast loading site provides a better user experience than a slow loading site.

    So it's always good practice to ensure that your website is optimised to load as quickly as possible.

    You could probably spend days tweaking your website and server. But a good starting point for identifying major issues is Google’s PageSpeed Insights Tool.

    Just enter your URL in the free tool, hit "Analyse" and Google will return a speed ‘score’ for your site (out of 100) for both mobile and desktop. They will also give you a list of other areas they think you can improve.

    pagespeed insights

    In our case, the site gets a score of 57/100 for mobile and 71/100 for desktop. Not the worst I’ve seen, but definite room for improvement!

    pagespeed score

    For more on the importance of site speed to SEO, and some great tips on how to optimise your site, check out this guide by Albert Costill for Search Engine Journal.
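    You can also pull these scores programmatically via the PageSpeed Insights API, which is handy for checking lots of pages at once. A sketch using the v5 endpoint (check Google's API docs for the current response shape — the field names below are correct at the time of writing):

```python
import json
import urllib.request
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_psi(url, strategy="mobile"):
    """Query the PageSpeed Insights API for a URL (network call)."""
    query = urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen("{}?{}".format(PSI_ENDPOINT, query), timeout=60) as resp:
        return json.load(resp)

def performance_score(psi_json):
    """Extract the 0-100 performance score from a PSI response
    (the API reports it as a 0-1 fraction)."""
    return round(psi_json["lighthouseResult"]["categories"]["performance"]["score"] * 100)

# Usage (requires network):
# report = fetch_psi("https://yourdomain.com", strategy="mobile")
# print(performance_score(report))
```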

    Sidenote.
    You should also check that the site is mobile ready, preferably with a responsive design.

    Step 14: Test Structured Data

    The next step in our audit process will be to test any structured data that the site may contain and ensure it is properly formed.

    Why is Structured Data important?

    Proper use of structured data can increase visibility in the search results and lead to improvement in click-through rate.

    It looks like this:

    Google structured data

    Image from Google’s structured data guide.

    Examples of content that may benefit from the inclusion of structured data include:

    • Reviews
    • Product information
    • Events

    That said, you can mark up most types of content to a certain degree (just don’t spam!).
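    For illustration, here's roughly what JSON-LD structured data (the format Google recommends) looks like for a product page. The product details below are entirely made up:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Toy Robot",
  "description": "A hypothetical product used to illustrate the markup.",
  "offers": {
    "@type": "Offer",
    "price": "29.99",
    "priceCurrency": "AUD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "24"
  }
}
</script>
```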

    Additionally, the correct use of structured data allows Google to better understand your content. Nathan Yerian puts it well here.

    Structured data allows search engines to not only crawl your site, but to truly understand it. Yes, even search engines can have a tough time deciphering web page content. Some elements that seem perfectly obvious to us humans are meaningless to web crawlers.
    — Nathan Yerian, Adhere Creative

    We can use Google’s Structured Data Testing Tool to test existing structured data.

    Simply paste in the URL you wish to test and hit "Fetch & Validate".

    structured data testing tool

    Google will evaluate the structured data for the page and return any errors. In this case we can see there are several errors which will need to be investigated and fixed.

    structured data errors

    Step 15: Find "Content Gaps"

    What's a content gap?

    It's a keyword that your competitors currently rank for... and you don't!

    Needless to say, that's something you'll want to rectify.

    There's a simple report you can run in Ahrefs Site Explorer to find content gaps. We've cunningly called it "Content gap".

    But first you'll want to figure out who your closest competitors are in search. To do that, run the "Competing Domains" report.

    Site Explorer > Enter domain > Explore > Organic search > Competing domains

    The report will show sites where we found a big overlap in keyword rankings with your domain.


    Ignore any marketplace-type sites like eBay or Amazon. Above I have selected 3 domains that I consider to be close competitors of Toy Universe.

    Now I'll head over to the Content Gap report and enter those 3 competitors.

    Site Explorer > Enter domain > Explore > Organic search > Content gap > Enter competitors > Show keywords


    I've set the volume filter to show keywords with maximum search volume of 20,000. That way we'll filter out branded searches and other keywords that it won't be worth our while targeting.

    We can see that there are a number of keywords with high volume that we can target with content. If you're running an e-commerce store, this can also give you a good idea of new products to stock.


    Step 16: Conduct a Full Content Audit

    The final step in our audit process might sound a little unintuitive. That's because it will often involve deleting a load of pages from your site.

    So how will that help to increase search traffic?

    The short answer is that lots of low quality (or underperforming) pages can drag your whole site down.

    Last year we deleted over 200 low performing posts from the Ahrefs blog and quickly saw a nice increase in our search traffic.

    You can read the full strategy in this post, but here it is in a nutshell.

    1. Find low quality pages on your site which are not currently bringing in any search traffic
    2. If they can be improved then update them and relaunch
    3. Otherwise, delete them from your site and 301 redirect the URL to a relevant page

    A word of warning however:

    Be very careful when taking a knife to your site.

    Your first choice should be to improve the content and relaunch.

    But if something is very low quality and simply not worth the hassle of updating, then go ahead and delete it. Just be sure to set up a 301 redirect to a relevant page so you don't lose any link equity.

    Pro Tip
    Another thing to look out for is multiple pieces of content targeting the same keyword. That's known as "keyword cannibalisation".

    Ideally only one page on your site should target each keyword.

    A good tactic is to combine multiple posts targeting the same keyword into one, super authoritative post. We recently combined 3 posts into one to create our definitive guide to anchor text.
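    If you can export your rankings as (keyword, URL) pairs — from Search Console or a rank tracker — cannibalisation candidates are easy to flag. A quick sketch:

```python
from collections import defaultdict

def cannibalised_keywords(rankings):
    """rankings: iterable of (keyword, url) pairs. Returns keywords where
    more than one of our URLs is ranking - merge/redirect candidates."""
    by_keyword = defaultdict(set)
    for keyword, url in rankings:
        by_keyword[keyword.lower()].add(url)
    return {kw: sorted(urls) for kw, urls in by_keyword.items() if len(urls) > 1}
```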

    And… You’re Done!

    Our SEO audit is complete!

    By following these steps you should have uncovered a number of changes you can make to your site to improve your rankings.

    A full forensic SEO audit is a much lengthier process (taking anything from a few hours to a few days, depending on the size of the site). But this process is a great way to kick start any new SEO campaign and get a feel for how a site is currently set up.

    If we wanted to continue digging, we could now go back to the crawler which we started at the beginning of this process, and take a more detailed look at the site’s structure, internal linking, other on-page factors etc.

    We may also return to Ahrefs at this point to more thoroughly investigate the site’s link profile and look for additional keyword opportunities.

    But, in this case, we’ll leave that for another day…

    Consider that 800lb gorilla well and truly banished!

    Over To You

    If you have any questions about the audit process, or have any tips you would like to share, then please leave a comment below.


    Get notified of new articles

    47,083 marketers are already subscribed to Ahrefs blog. Leave your email to get our weekly newsletter.

    • Muhammad Junaid

      A great info about how to perform SEO audit but I think its requires more than 15 minutes, because analysis takes time. My question is how much you can rely on these tools, are they approved from Google ? or may be these tools getting your site access and data which can be use for any purpose. “Toy Universe” get the High Quality Back Link after this post published, if the tag is “dofollow” and ofcourse nothing is Free in this world. 😛

      • Hey Muhammad, glad you enjoyed the post. It definitely can be done in 15 minutes — did you watch the video (that’s real time)? Of course, as I explained, this is the starting point for a deeper audit, but in just 15 minutes you can uncover a host of information about a site that you can take note of and act upon. Which tools do you mean? It’s quite a tool-lite process actually, and in fact, 4 out of the 7 tools used in the process are from Google themselves. All of this is analysis, so we’re not doing anything that would get us into trouble with Google — we’re fixing problems that might be holding our site back from performing in search.

    • Daryl Wathen

      Hi David, thank you for this nice overall audit into a site. You explained each step very clearly, a useful guide for anyone, even those fairly new to SEO.

    • Hi!
      This is a nice post for everyone, experienced and beginners alike. I am following all your posts of simplified SEO; it’s a great effort and I have learned a lot of things.
      Thanks

    • Awesome post, right on point. I love the infographic. Should not take longer than 15m!!

    • awesome blog post

    • Nickalas D’Urso

      I am trying to follow along. No idea how to edit the code, etc. I would love to work with some people here on our site and hire someone. I am a little lost, tbh.

      • Which CMS are you using? Most will have a way to do that through the admin, or if not you can install an SEO extension (e.g. Yoast in WordPress)

    • I deleted some pages and posts from my site, Wording Well, in my efforts to “clean up” my site. 

      Now, after reading this article, I see that I have a ton of crawl errors! Yikes!

      How do I fix this issue???

      Help! (I am non-techie, so please be gentle with me!)

      • Don’t worry. Re-submit your sitemap in Search Console, also try Fetch as Google, and do update your best articles.
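        To expand on that reply (a general sketch, not specific to any particular site): the usual fix for crawl errors caused by deleted pages is to 301 redirect each removed URL to the closest related live page. A hypothetical .htaccess example (the paths are invented for illustration):

        ```apache
        # 301 a deleted post to the closest surviving page, so the old URL
        # stops returning a 404 to crawlers and passes on its link equity
        Redirect 301 /old-deleted-post/ /closest-related-post/
        ```

        If no closely related page exists, it is generally fine to let the old URL return a 404 and simply remove it from your sitemap, rather than redirecting everything to the homepage.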


    • gravymatt

      I ran Panguin for my company site and we are doing well (cross yo’ fingers), but what are the green lines that indicate a Structural algo bump in Jan ’16? I had heard of the Google animal updates, but this was the first I had seen about Structural.

    • PandaBalls

      Hi David

      Great reading as always.

      I notice in this article that you refer to “toy universe” being a brand link. 

      Do you not think that this is a “partial match” link? i.e. does Google not see “toy” and, in this case, ignore the fact that you are referring to the brand and consider the link to be a keyword link instead?

      Do you think Google evaluates the site to try and determine its “identity” and, as a result, may treat the presence of those anchors differently? This has serious ramifications if one builds links on the assumption that “pete’s shoes” is a brand, when in fact Google might decide it is not.

      Thanks,

      Ryan

      • That’s a very good question Ryan and one that I couldn’t give you a definitive answer to. Although my guess would be that they can definitely figure out what your brand is and treat links accordingly.

    • Great stuff as usual David! and double thanks;)

    • Casey Weisbach

      Awesome blog post David! Thank you for this

    • Emma James

      Doing a site audit is a rather hard task and this guide can be like a checklist)
      I also use special tools for auditing the websites, thanks for such a comprehensive description of Ahrefs usage, I’ve noted some cool tips.
      And as for crawling tools, I mainly use Netpeak Spider. It automatically detects site issues and this additionally eases the task.

      • I’ve not used that tool before. Thanks for sharing — will check it out 🙂

    • Fantastic breakdown David. Thanks much!

    • Thanks for sharing this comprehensive guide. We follow a similar process but will be trying some of the alternate tools you mention here, like Barracuda and Beam Us Up. How often do you recommend auditing a site? 

      • No worries, Mhairi. I would say you should be running a full audit at least every 6 months.

        • @Outsmarts:disqus I’d say it also depends on your conversion window, on the frequency of your publishing and your business model. If you’re a mass medium like a magazine or newspaper, I would audit much more often than if you sell life insurances online. What do you think @davidmcsweeney:disqus?

    • I’m glad to see that the tactic of merging low traffic/conversion posts into a super post is a good one, because my most recent goal was all about that. I’ll tell you in a couple of months 😛 Great write-up as usual!

    • Another good article David 🙂

      A question about keyword cannibalisation: it’s likely that after a business writes several blog posts, the pages will start to share some of the same keywords, right? How do you avoid that?

      For example, I could write pieces entitled “10 ways to create a content strategy”, “Why a content strategy is so important for your business” and “Mapping your content strategy to your customer journey”.

      Assuming they’re decent articles, after Google has crawled them the different pages will each rank for “content strategy” (perhaps in different guises), but each article targets different aspects of content strategy.

      How would you avoid keyword cannibalisation?

      Many thanks.

      Robin

      • Hi Robin,

        In that case I would either:

        1. Focus on creating one mega guide to content strategy
        2. Target each piece of content at a semi-long tail with good volume (that’s what we do with things like ‘link building’ — i.e. ‘link building strategies’, ‘link building tips’, ‘white hat link building’ etc)

        Although I should also point out that user experience trumps all.

        • Yes, agree. Good advice. Thanks David.

    • Kuba

      Hi David, thanks for that post.

      I also tried 301-ing about 30% of our blog posts that brought little traffic. Hard to say if it worked or not — we observed an increase in traffic, although not immediately, so I can’t be 100% sure about the reason. However, pruning is most likely to work when a website has competing pages. 

      Quick question: what is the best way of handling the category pages? (blog/2 etc.)

      So far, we’ve been using canonical tag — it keeps the pages in the index, though. Plus, there is the issue of duplicate title tags and meta descriptions (not sure if it is important though in this case.)

      Is “noindex”, as you suggested, the best option?

      • I tend to just noindex,follow them. I’ve never been 100% convinced that Google uses the canonical tag correctly. I see what you see: pages remaining in the index when I would think they should drop them for the de facto page.

        Also, technically if you think about it canonical is the wrong tag there as it’s not the same page really. You just don’t want it in the index (as it’s thin).
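        To make the noindex,follow suggestion concrete, this is the tag that would go in the <head> of each paginated category page (a generic sketch; most SEO plugins, e.g. Yoast on WordPress, can output it for you):

        ```html
        <!-- On /blog/2, /blog/3, etc.: keep the page out of Google's index,
             but still let the crawler follow its links to the posts it lists -->
        <meta name="robots" content="noindex,follow">
        ```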

        • Kuba

          Thanks David. You’re right — now when I look at it, canonical doesn’t make sense in that scenario.

    • miranda barnett

      Hi David
      This is a fabulous blog — as all of yours are.
      I have been working my way through all of these top tips and have a question about browseable sites.
      Our site is set as http:// but I did a check as suggested and it also displays as https://
      I know that I should apply a 301 redirect and lose a bit of link juice.
      However, our site was originally a .com which our SEO company recommended we change to .co.uk which we did and created a 301 redirect. We then created a brand new website and redirected the old .co.uk site to the new one. So I have two 301 redirects in place and adding another to redirect https to http will make it a third. Will this be detrimental to our site performance, link juice and how Google views us?
      Thanks for any suggestions!

      • that sounds like a bit of a complicated setup, so I wouldn’t want to make a call on it without taking a look at the actual site/redirects. Feel free to drop me an email to david.mcsweeney@ahrefs.com 🙂
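        As a general illustration of the principle behind the question (hypothetical domains, not a recommendation for this specific site): the standard advice is to collapse redirect chains so that every legacy URL 301s straight to the final destination in a single hop, rather than passing through each intermediate redirect. A hedged .htaccess sketch:

        ```apache
        RewriteEngine On

        # Old .com domain: jump straight to the final .co.uk site in one 301,
        # instead of chaining .com -> old .co.uk -> new site
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]

        # One canonical scheme on the final domain (here http, matching the
        # commenter's setup; these days you would normally pick https instead)
        RewriteCond %{HTTPS} on
        RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]
        ```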

    • xtopher66

      What’s the difference between Beam Us Up and Ahrefs’ health check tool?

      • The Domain Health report is undergoing a big update at the moment. But yes, soon you should be able to do all this (and more) in Ahrefs 🙂

        • When is the Domain Health report going to be released? You mentioned it about 6 months ago 🙁

    • ur a fraud

      I like your blog a lot guys, but the tool you refer to doesn’t work on Mac (anymore? Since your original post is old).
      I know you can’t control what’s going on on other websites, but one thing that’s more important than SEO is UX.
      Here, your UX sucks because the first step of your tutorial doesn’t work

      • Does it definitely not work on Mac? They definitely have a Mac version: https://uploads.disquscdn.com/images/f5ccfbec3aff973f02dc85e76bbadd20598e9a12b7ffe66744e7f7eb19cbf7bc.png

        • ur a fraud

          Yeah, they have a Mac version and I downloaded it.
          But there’s a problem with the installation of the tool: “Unable to load Java Runtime Environment”.
          Like I said, I know you’re not there to support the tools you provide in your examples, but anybody who reads your tutorial would be stuck at this, unless they are tech-savvy.

          Anyway, thanks for all the info you gave us; it’s always a pleasure to read you

          • I use Screaming Frog on my Mac — the free version is good — however the paid version allows you to export data and a few other things — I love it 🙂

    • Abrar Shahriar

      Great post.
      You just lit a fire under SEO, and this is jaw-dropping content.

    • Thanks David 🙂 I learned a few things about ahrefs I didn’t know — and I thought your article made perfect sense, all the way through.

    • Thank you for sharing; this is very useful. Kindly post more useful tips like this.

    • Pranish Shakya

      Thank you for the help. Got a lot of valuable information from this article. Keep on posting more. 🙂

    • Catalin Moraru

      Hi David.

      I loved your article very much. But I have one question: how do you solve duplicate content created by other websites? I have created unique content for my products, but other websites are taking the content as it is and uploading it to their own sites.

      Being physical products, you can’t really create other specifications so they are taking the content as it is. What should I do?

    • Most of the time, an SEO audit is used to find the weaknesses in your strategy!

    • Sarah

      Hi, I’m Sarah. I love this post so much; I tried some of the tips above and they really work on my site. My biggest concerns are content and site performance.

    • You said “Google have flipped the title round”. Is there any way to stop this? And what about the meta description? Most of the time Google shows a description taken from the post content, but I want it to show my actual post description.

    • Lexi Pivovarova

      Thanks for explaining things so clearly and concisely. It is so easy to understand. I really like the way you evaluate the on-page SEO checklist. Great work.

    • Thanks for the guide with very useful tips.
      Thank you again. Matt

    • SURAJ BHATT

      Hi David, thank you for writing such an insightful post; it helped me immensely in my institute’s SEO project. While working on the project I came across title pixel lengths greater than 512px. Is it OK to have a pixel length greater than 512px, and if not, what can be done to improve it? Thanks again, and keep writing such posts 🙂

    • Hi David, I just searched Google for “seo audit” and this article came up in position 3 on the first page. I read the content and learned a lot. Thank you for this valuable article.

    • SEO

      Thanks for the post. I learned a lot from this article. Very helpful to me for SEO audits and improvements.

    • Brendan Massengale

      I know this post is old but I doubt it will ever be irrelevant. I had some trouble getting Beam Us Up to work on my windows machine and had to look for free alternatives. I am currently using Wildshark SEO Spider. Fully featured and free is a great combination. 🙂

    • Mejsny

      Hi David, how often do you recommend to do this SEO audit?

    • Shahryaar Rehman

      After reading this and applying the audit to my client’s website, I realised that I was missing many on-site issues (some of them really high priority to solve).
      Many Thanks 🙂 @davidmcsweeney:disqus

    • Bill Widmer

      SO glad I stumbled upon this post. So much good info — now I have a step-by-step system to run my client’s sites through. Thanks, David! 🙂

    • Hi David,

      Great article, but I have a question for you.

      Do you use other methods to test your structured data besides the one that you mentioned?

      Kind regards,
      Filip

    • You probably should also mention that Google takes the keywords in the URL into consideration. It seems to me that at the moment this carries more weight than on-page SEO and backlinks. This is my opinion, based on a competitor ranking better with zero of the items listed in this post.