Editor’s Note: When we say “Ultimate Guide,” we mean it! Here’s a handy table of contents to help navigate the post:
- Introduction: Let’s figure out what an SEO audit is all about.
- How to Perform an SEO Audit
- Technical Audit
- On-page Audit (Content Audit)
- Off-page Audit
- Finishing Touches
I’ll be the first one to admit it: “SEO Audit” sounds brain-numbingly boring. Audits are the whole reason that people make jokes about accountants and IRS employees. And here you are reading an article on creating an SEO audit.
Let me just say it: An SEO audit is anything but boring.
Sure, you might have to dive into some of the tech geekery of SEO, but the results that you produce are powerful, game-changing, revenue-impacting, and thoroughly actionable.
You think I overstate my case?
Quick example: I conducted an SEO audit for a large manufacturing firm selling multi-million dollar processing equipment with a 12-24 month sales cycle. Needless to say, a single click on their CTA is cause for a company-wide celebration.
One tech audit, and one month later, they were receiving dozens of inquiries per month.
The business impact? Millions of dollars in additional revenue.
From whence came such a bounty of money? From the humble yet powerful SEO audit.
An SEO audit is one of the most powerful tools in an SEO’s arsenal. A thorough audit answers questions, solves problems, and produces an array of benefits that you may never have anticipated.
It’s my goal to show you exactly how to do an SEO audit.
First, we’re going to do some introduction stuff. Don’t skip it. Hopefully, it will set you up for the awesomeness of the audit that follows.
When you read the introduction, you’ll know the answers to all the critical questions that impact an SEO audit:
- What is an SEO Audit?
- Who needs one?
- What’s it good for?
- How smart do you need to be in order to run one of these?
- What’s the goal of an SEO audit, anyway?
- What’s the result of an SEO audit?
- What are some of the challenges?
Introduction: Let’s figure out what an SEO audit is all about.
To set the stage, we’re going to ask a series of questions. By the end of this introduction, you’ll know exactly what we’re going to accomplish with an SEO audit.
Plus, you’ll be salivating to begin your own audit.
What is the definition of “SEO Audit?”
An SEO audit is a broad evaluation of a website’s search engine performance.
Holy cow, that’s a huge definition.
You could potentially spend two years creating a detailed audit based on that definition.
So, having dished up that Amazon of a scope, I’m going to whittle it down.
The SEO audit I’m going to propose is not as detailed as it could be. The audit I’m tracing out below is comprehensive, but not exhaustive.
- You don’t need to analyze every single freaking page of content on the website. You could, but you don’t have to. I’ve audited sites that have millions of pages.
- It’s not necessary to produce a spreadsheet of every potentially harmful backlink pointing to the site. Again, if that’s your thing, go for it. But you don’t have to.
- I’m not going to suggest that you measure the size of every image on the site to determine compression capabilities. Not a bad idea, but not necessary.
The audit you are going to learn to perform below is not a freebie whip-it-up-in-minutes approach, either. Google “free SEO audit,” and you’ll find a legion of tools that will “audit” your website.
“Audit” can mean anything you want it to, but I’m showing you a method that free tools simply can’t provide.
There are all kinds of detailed audits that you could subject your website to. For the purposes of this article, I’m going to suggest an audit that gives you powerful results without draining your time and sucking your soul of its vitality and humanity.
Who needs an SEO audit?
At the risk of sounding clichéd: every website does. I believe that every website could benefit from an SEO audit.
I’ll be totally transparent, though. I’ve looked at some websites, and thought, No. We’re not going to do an audit. First, we’re going to do a redesign.
You see what I mean? Some websites are so far gone that no amount of SEO auditing is going to bring them back.
Use your best judgment.
What is an SEO audit good for?
The basic answer is this: It gives you an actionable plan for improving your site’s search engine performance and visibility.
There are two groups of people who will benefit from an SEO audit:
- This audit is ideal for agencies who want to offer a great product.
Advertise it on your website, package it up, and start selling it. This thing is a killer product, and you can charge anywhere from $1k to $5k to run one, depending on the size of the client’s site, and the complexity of your audit.
Alternatively, you can provide the audit as a complimentary product. An audit is essentially a roadmap of website improvements, and you can then offer to implement those improvements at a cost.
- This audit is also beneficial for any website seeking to improve its SEO.
It’s hard to simply “improve a website” without knowing where to start and what to do.
Your SEO audit shows you precisely what features you should focus on when seeking to enhance your website search performance.
What skill level should I have before running this audit?
I’d like to think that this audit is easy enough for my eight-year-old to run, provided she follows the instructions.
But, truth be told, there are a few technical terms and concepts that you should have in place before attempting this audit.
- You should know the major SEO terms: URL, meta description, ALT tag, image compression, robots.txt. If these terms are Greek to you, then this audit will be challenging.
- It will be helpful if you’ve used some basic SEO tools. The more tools you’re familiar with, the faster and better you’ll be able to perform this audit. I’ll suggest some free and paid tools below. Some of the most helpful are Majestic, SEMRush, Ahrefs, Moz, Screaming Frog, etc.
- You should have some technical chops. I don’t consider myself a super-technical person. I studied mostly languages and literature in college, not a STEM field. However, once I found my groove in the marketing world, I developed some facility in technical activities. If tech just isn’t your thing, then this audit could be a lesson in suffering.
If you’re an SEO newbie, have never used an SEO tool in your life, and prefer playing Chopin on the piano to poring over sitemaps on your MacBook, don’t worry: You can still do this.
What is the goal of an SEO audit?
The goal is simple: To improve your website’s search engine performance.
This is an obvious point, but it brings up two important not-so-obvious points:
- The audit itself is not going to change your website. The audit only shows you what you should change.
- In order for the audit to be successful, it should provide a tactical list of things that need to be changed. It does no good to explain, “you have 289,189 backlinks.” Who cares?! The information in an audit should be tied to something actionable.
And the ultimate goal is even better: More revenue for your business.
What is the result of an SEO audit?
A good audit will provide the following:
- An SEO audit is an actionable plan. You instantly know what needs to be changed and how to do it.
- An SEO audit is the big picture of a site. You can understand at a glance how competitive your website is from an SEO perspective.
- An SEO audit displays your website’s weaknesses. And then, it shows how to improve upon them.
- An SEO audit helps to set expectations. SEO audits are great at providing a competitive baseline for a website’s future improvements.
- An SEO audit provides ongoing opportunities for improvement. You probably won’t be able to improve on everything on your SEO audit list right away. Instead, you have a variety of things to work on for several weeks or months.
What are some challenges of an SEO audit?
- Not providing a helpful overview. Executives don’t really care about robots.txt optimization unless it impacts the bottom line. Rise above brushstroke detail, and create a panoramic landscape — the big picture overview.
- Not providing actionable takeaways. I know I’m repeating it, but this is important. Listen up folks. The audit has to tell people what to do.
- Not providing something of more value than an automated audit. I once did work for an agency where we provided complimentary SEO audits for every client who came on board. Those audits took me five minutes to complete. They were valuable, yes, but they didn’t provide the same sort of deep dive analysis that an SEO audit should provide. I’ve also done work for a (different) agency that conducted technical audits (tech only; no content, UI, nav, etc., auditing) that required as much as twenty hours of intensive work. They were expensive audits, but they were good.
How to Perform an SEO Audit
Ready to roll? Here’s how we’re going to do it.
There are three main areas of an audit, which cover the three main areas of SEO.
- Technical – the nuts-and-bolts of a website. The best person for making improvements in this area is a web developer or designer. A website can’t succeed unless it has a strong technical foundation. This is where it all begins.
- On-page — It’s all about content in this area. Having the right keywords, content silos, and information is critical. Local SEOs should analyze NAP presence in this phase.
- Off-page — This area covers backlinks, brand mentions, social media, and external ranking factors. If a site has been impacted by algorithmic penalties, you’ll uncover why and how.
The audit I explain below contains three main sections, corresponding to these three main SEO areas.
We’ll start off with the technical, move into on-page, and conclude with off-page.
Structure of This Guide
I’m assuming that you have some knowledge of SEO terms. Rather than complicate the article with definitions and descriptions, I’m simply going to give you the most valuable information for conducting the audit:
- What it is — What to analyze
- How to do it — Tools or resources you can use
- What results to look for — How to turn the information into actionable insights
At this point, don’t obsess over how to format your final document and present it to the client. Your goal is to gather all the information and dive into the nitty-gritty of getting the site audited.
Technical Audit
The goal of a technical audit is to find out how accessible the site is to search engines and robots. Indexation is the goal. In order for that to happen, you need to make sure that all the technical pieces are in place.
This is the most tedious and unsexy part of the audit, but it’s foundational.
Ignoring a site’s technical capabilities and accessibility is like designing a vehicle, but forgetting to add an engine.
A site’s technical innards are that engine. Ensuring the engine is well-oiled and tuned-up is critical to having a functional vehicle.
Here’s how to tune up the engine on your or your client’s website.
DNS Errors
What It Is
Check to make sure that search bots aren’t getting errors when crawling the site.
How To Do It
- Log in to Google Search Console.
- Check “current status” on the dashboard.
- If your DNS is all good, you’ll see a green check.
Run a free test here: http://dnscheck.pingdom.com/. This is a very thorough analysis.
What Results to Look For
The free test above will give you an all-clear sign, or an explanation of any problems that you should address.
Server Errors
How To Do It
Check the Google Search Console dashboard, and look for the green check.
What Results to Look For
Drill down on any server issues, and find out exactly what’s going on.
You can go into detail on server errors in Search Console:
- Crawl errors
- Server connectivity
Robots.txt
How To Do It
Robots.txt fetch is one of the dashboard elements in Search Console, so you can see its status at a glance.
You can use Google’s tool found here: https://www.google.com/webmasters/tools/robots-testing-tool
You can also access the tool in Search Console:
- Robots.txt tester
What Results to Look For
As long as you’re getting good results (no errors), you should be fine. However, it’s important to make sure that there are no major sections of the website being blocked by the site’s robots.txt.
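If you want to sanity-check a robots.txt file without any paid tooling, you can run it through Python’s standard-library parser. A minimal sketch, assuming a made-up robots.txt and a made-up list of important paths:

```python
# Sketch: check whether important sections of a site are blocked by
# robots.txt, using Python's standard-library parser.
# The robots.txt content and the paths below are made-up examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Paths you would expect search engines to be able to crawl
important_paths = ["/", "/blog/", "/products/", "/admin/"]

for path in important_paths:
    allowed = parser.can_fetch("Googlebot", path)
    print(f"{path}: {'crawlable' if allowed else 'BLOCKED'}")
```

In a real audit, you would fetch the live robots.txt and feed your site’s top-level sections through the same check.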
404 Handling and Errors
What It Is
Check for the volume and velocity of missing pages, which could signal site performance or server issues.
How To Do It
- Google Search Console
- Crawl Errors
- Not Found
What Results to Look For
Most sites have a few 404s. Don’t stress over it. If there are thousands of 404s, and it comprises a huge percentage of the site, then you should start to stress. Additionally, major spikes in 404s could signal problems, too.
If the report shows thousands of 404s or a sudden spike, make a note to dive deeper into why so many pages are 404ing.
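If you export crawl results (for example, from Search Console or Screaming Frog) as URL/status pairs, a few lines of Python can flag an unhealthy share of 404s. A sketch with made-up data and an arbitrary example threshold:

```python
# Sketch: flag a site whose crawl report shows an unhealthy share of 404s.
# The crawl data below is a made-up stand-in for a Search Console or
# Screaming Frog export of (URL, HTTP status) pairs.

def broken_share(crawl_results):
    """Return the fraction of crawled URLs that returned 404."""
    if not crawl_results:
        return 0.0
    broken = sum(1 for _, status in crawl_results if status == 404)
    return broken / len(crawl_results)

crawl = [
    ("/", 200),
    ("/about", 200),
    ("/old-product", 404),
    ("/blog/post-1", 200),
]

share = broken_share(crawl)
# A handful of 404s is normal; a large share deserves investigation.
if share > 0.05:  # threshold is an arbitrary example, not a Google rule
    print(f"{share:.0%} of crawled pages are 404s -- investigate")
```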
Site Security (HTTPs)
What It Is
Google gives search preference to secure sites. Ensuring that your site is secure will reassure users and possibly provide a small uptick in search rankings.
How To Do It
Is the URL http:// or is there an “s” — https://? The “s” signals that the site has security in place.
Most browsers display this information with a graphical feature, such as the green padlock in Chrome.
What Results to Look For
The search advantage of a secured site isn’t drastic, but it’s important to consider in order to improve user experience and trust.
It is especially important for any payment pages or on sections of the site that are requesting personal information.
HTML Validation
What It Is
It’s a good idea to run a free check on the validity of the site’s HTML.
How To Do It
Use the free tool from the World Wide Web Consortium: https://validator.w3.org/
What Results to Look For
The validator is going to be super picky, so don’t despair if you find a lot of errors.
You may wish to hand a copy of the report to your developer, and let him or her make any relevant improvements.
Look especially for “fatal errors,” in order to focus on these issues first.
Indexed Pages
What It Is
Make sure that Google is indexing all the pages that it should. The number of indexed pages that your site shows in a Google search should be roughly equivalent to the number of indexed pages reported in Search Console.
How To Do It
Search Google for “site:[yoursite]”, and identify the number of indexed pages.
Check Search Console:
- Google Index
- Index Status
What Results to Look For
The results should be equivalent, though they are rarely the same. Major discrepancies should be investigated.
Structured Data
How To Do It
- Search Console
- Search Appearance
- Structured Data
What Results to Look For
Google provides a comprehensive report. Structured data can be time-consuming to add to a site, but it does provide some advantage for search results.
XML Sitemap
What It Is
An XML sitemap is for bots only, not people. A site without an XML sitemap is a website with a missed SEO opportunity.
You can look at the sitemap of any website you choose, provided the site has one.
How To Do It
First, you want to make sure that the website has an XML sitemap. You can find out by checking Search Console.
Second, you want to ensure that the XML is simple, straightforward, and uncluttered. Adding optional tags and instructions can actually weaken the integrity of the XML sitemap, and create search bot confusion.
To perform a quick scan of your XML, you can use W3’s syntax checker: http://www.w3schools.com/xml/xml_validator.asp
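A quick well-formedness check can also be scripted with Python’s standard library; if parsing fails, the sitemap isn’t valid XML. The sitemap content here is a minimal made-up example:

```python
# Sketch: parse an XML sitemap and pull out its URLs with the standard
# library. A parse failure here means the sitemap isn't well-formed XML.
import xml.etree.ElementTree as ET

sitemap_xml = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>
"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)  # raises ParseError if malformed
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]
print(f"{len(urls)} URLs listed")
```

For a live site, you would fetch /sitemap.xml and feed its contents through the same parse.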
Malware Scan and Canonical Header Check
What It Is
Most search engines are adept at identifying malware. However, it is helpful to run a scan on your site to examine headers and take a close look at potential risk factors.
How To Do It
Use the tool found here: https://aw-snap.info/file-viewer/
What Results to Look For
There are two elements to look for:
- On the results, press CTRL+F, and search for “canonical.” The site should have a rel=canonical tag.
- Make sure that the header codes are valid (i.e., not expired).
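The canonical check can be scripted, too. This sketch uses Python’s standard-library HTML parser to look for a rel="canonical" link; the HTML is a made-up example:

```python
# Sketch: scan a page's HTML for a rel="canonical" link using the
# standard-library HTML parser. The HTML below is a made-up example.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

html = """
<html><head>
  <link rel="canonical" href="https://example.com/page">
</head><body>Hello</body></html>
"""

finder = CanonicalFinder()
finder.feed(html)
print("canonical:", finder.canonical)
```

A page with `finder.canonical` still `None` after the scan is missing its canonical tag.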
Site Speed
What It Is
Site speed is one of the biggest areas of potential SEO improvement. Small changes in site speed can produce an enormous positive impact on indexability, user experience, and conversions.
If you have to pick one element of a technical audit to conduct, it should be site speed. There is a 99% likelihood of finding something to fix, and a 100% likelihood of seeing measurable improvements in the website and its SEO performance.
How To Do It
I suggest using two tools for this section of the audit: Pingdom and Google PageSpeed Insights:
Each tool provides unique advantages and insights. Pingdom provides a comparative analysis and speed in raw seconds. Their waterfall analysis shows you exactly where the delays are happening.
The advantage of Google is the mobile analysis. Load time on mobile devices is essential for improving the quality of a site and its search traffic.
Keep in mind, you probably won’t get a perfect score. Even Google.com has several “should fix” errors in its own page speed analysis.
What Results to Look For
There are a lot of things that could slow down a website, but I want to point out a few of the big ones:
- Code to text ratio — A lot of cruft can slow down a website quickly. For an extra measure of investigation into this number, check out http://www.whatsmyip.org/text-to-code-ratio/.
- Page Size — This is often considered a standard measurement in page speed analysis. Although analyzing page size alone is insufficient, you should still give it some consideration. Pages bigger than 1.5MB are at risk of sluggish load times.
- Image file size — Big images will kill your page speed. If images comprise more than 75% of the entire page’s file size, it could signal a problem. By all means, compress every image on the page to conserve space.
- Caching — A simple cache installation can remove precious seconds from load times. Ensure that caching is installed, enabled, and optimized to include as many cacheable resources as possible.
- Script minification, externalization, and combination — Getting carried away with scripts is a leading cause of site speed slowdowns. A competent developer will be able to assess which scripts can be removed, minified, or externalized to improve site speed.
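As a rough illustration of the code-to-text ratio idea, you can strip markup with Python’s standard library and compare lengths. This is a simplification (it doesn’t exclude script or style contents), and the sample page is made up:

```python
# Sketch: estimate a page's text-to-code ratio by stripping markup with
# the standard-library HTML parser. Simplified: text inside <script> and
# <style> tags is not excluded here.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def text_to_code_ratio(html):
    extractor = TextExtractor()
    extractor.feed(html)
    visible = "".join(extractor.chunks).strip()
    return len(visible) / len(html)

page = "<html><body><p>Some actual copy for the reader.</p></body></html>"
print(f"ratio: {text_to_code_ratio(page):.2f}")
```

What counts as a “good” ratio varies by tool and site type, so treat the number as a comparison point, not a pass/fail score.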
Google provides an easy-to-use report that provides a guide on how to improve site speed elements.
Follow the instructions in this report (or hand it off to your developer), and you’ll see a noticeable uptick in your site’s SEO.
Mobile Compatibility
How To Do It
Use the Google Developers Mobile-Friendly Test
What Results to Look For
You’re looking for a simple “Awesome” from Google. If you don’t get the awesome, you’ll need to address the listed issues.
Websites need to be mobile friendly in order to appear in Google mobile results. If the site lacks mobile compatibility, changes are in order.
On-page Audit (Content Audit)
A word about structure
A thoroughgoing content audit will produce a list of every single page of a website in a spreadsheet, with detailed notes and data on each page.
Should you be producing such a thorough audit? It all depends. If the site is hundreds of thousands of pages, probably not. I’ve conducted content audits of websites with 1.5 million pages. Obviously, I did not look at every page.
If the site is content-rich with a few dozen pages, sure. Go through every page with a fine-toothed comb. It all depends on the needs of the client, the cost of the audit, and your ultimate goal.
If you’re working with a massive website, you should audit the major content features (navigation, etc.) and then create a list of the pages with the highest amount of traffic. Using a benchmark of 25 or 50 pages, conduct your detailed audit on these pages.
Limiting your audit in this way will keep it from taking 15 years to complete, while at the same time giving your client the most immediate results.
A word about tools
The best tool I’ve used for conducting a content audit is the Screaming Frog SEO Spider. Although you can use a free version of the tool, I recommend purchasing a license.
Screaming Frog provides at-a-glance analysis for many of the issues that I explain below. It is particularly useful for the page-by-page analysis discussed later in this section.
A word about variety
What about all the variety — e-commerce sites, content sites, blogs, B2B sites, lead generation sites, news sites?
This content audit is intended to be general enough for you to conduct it on any website, regardless of the nature, intent, audience, or size.
That being said, I recommend that you revise the “page-by-page analysis” section in order to align with your content audit goals and objectives.
Main Navigation
Navigation is critical to UX, which is critical to SEO. Put on your user experience hat, and assess the functionality and layout of the menu navigation.
Good SEO is a result of accurate and intuitive website design. Accurate and intuitive website design is the nexus of user experience, information architecture, design, and development. This is a lot of information to juggle, and a tall order to fill.
- Is it comprehensive? Does it get you to the main sections of the website within one or two clicks with no CSS/mouseover confusion?
- Is it simple? Navigation menus with seven or more main elements begin to tax the limits of the brain.
- Is it logically arranged? Creative navigation is rarely a good design move. Conventional main navigation usually consists of a horizontal bar at the top of the page, but there’s room for flexibility. As long as it “makes sense,” you should be okay.
- Is the navigation keyword-focused? Top-level navigation is a critical SEO factor. Make sure that main navigation elements have an obvious keyword focus. No, you shouldn’t stuff each navigation item with a massive long tail keyword, but you should make it obvious to the user what they’re going to see. There’s no need to overthink this. Some items are standard: An “about us” page can be “About.” A “blog” is a “blog.” Industry-specific items, however, should include relevant keywords. For example, if you are analyzing an industrial carpet cleaning website, you should see the terms “carpet” and “cleaning” in some of the menu items.
Footer Navigation
If the site has navigation at the bottom of the page, you should analyze it for the following features:
- Is it useful? Does it provide navigation to the most relevant pages of the site? Are there links to social profiles?
- Is it logical? Does it follow the pattern of the top navigation?
- Is it strategic? Most of the time, the footer is a good place to add the business’s name, address, and phone number (NAP).
The footer is one of the places where SEOs of yore went overboard in their optimization efforts. Here are some telltale signs of an over-optimized website footer:
- Lots of copy. The footer should not contain huge amounts of copy. I’ve seen footers with 1,000+ words of raw keywords. This is a violation of user experience best practice and a flagrant abuse of SEO. It’s called keyword-stuffing, and it’s something that will compromise a site’s integrity in the search engines. Plus, there’s the duplicate content issue. Since a footer is repeated on every page of the website, you run the risk of over-optimized duplicate content.
- Lots of links. The footer nav is not the place for a full sitemap. Keep your links to a minimum. Keep what’s important (top nav replication), and what’s necessary (terms of service or site disclosures).
- Sitewide external links. A sitewide external link is a text or image hyperlinked to another outside website. Since a website’s footer is usually sitewide, such a link creates, in effect, thousands of links to the target website. I’ve witnessed entire sites penalized due to receiving sitewide links from an external website’s footer. It’s definitely not a good idea to link to other websites in your footer.
URL Structure
URLs are a potent SEO force, and they need to be optimized from two angles: usability and indexation. Since the URL is the path that the search engine follows, it’s crucial to include the right keywords.
At the same time, the URL should convey to the user what the page is about. These two goals do not contradict each other.
Check URLs for the following:
- Keyword presence. Use just enough keywords to make the page relevant and informative.
- Conciseness. Don’t make your URLs too long.
- No numbers or symbols. Except in the infrequent case of tracking URLs, parameterization, or landing pages, URLs should be clean and symbol-free.
- Capitalization. There’s no need to capitalize words in URLs. Doing so can create duplicate URLs and confusion. Consistency in capitalization is important, and the industry standard is no capitalization in URLs whatsoever.
- Dashes and underscores. Separating words in the URL is generally done using dashes (as opposed to underscores). Whatever the site’s convention is, it should be followed consistently.
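The URL conventions above lend themselves to a simple lint script. The rules and thresholds below are illustrative examples, not an official standard:

```python
# Sketch: lint a URL path against the URL conventions described above.
# The rules and sample URLs are illustrative, not an official standard.
import re

def url_issues(path):
    issues = []
    if path != path.lower():
        issues.append("contains capital letters")
    if "_" in path:
        issues.append("uses underscores instead of dashes")
    if re.search(r"\d", path):
        issues.append("contains numbers")
    if re.search(r"[^a-z0-9\-/._]", path.lower()):
        issues.append("contains symbols")
    return issues

print(url_issues("/Industrial_Carpet-Cleaning"))
print(url_issues("/blog/carpet-cleaning-tips"))
```

In an audit, you would run every crawled URL through a check like this and report the offenders.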
NAP Presence
Businesses with a local presence should display their NAP across the entire website. NAP stands for Name, Address, and Phone number.
Ideally, the NAP should be present in the website header or footer, in order to achieve sitewide presence.
UpCity, for example, has the NAP in its footer.
In addition to a footer presence, the NAP should have a solid presence on an additional page such as the “about” page or a “location” page.
Integrating maps and any relevant local keywords (county name, city neighborhood name, etc.) is important to improving user experience and achieving keyword relevance for local queries.
The most important aspect of NAP presence is consistency. If the business has ever had a different name, address, or phone number, then it is at risk for NAP inconsistency, which can compromise local SEO potential.
As part of an audit, you should recommend scouring the web for occurrences of the business NAP, including directory listings, and making the relevant changes. The NAP must be consistent across all of its online mentions.
To make sure that your NAP is consistent, you can use Yext. It’s a tool that will take most of the pain out of the challenge that is NAP consistency.
Important: Although it falls outside of the scope of a strict SEO audit, you should make sure that the website’s Google My Business/Google+ profiles are aligned with the NAP on the website.
HTML Sitemap
One of the best ways to develop a strong site structure and full indexation of the site’s content is by means of an HTML sitemap.
The sitemap should be one click away from any page on the site. Thus, putting it in the footer is the best option.
The HTML sitemap should provide an accurate visual layout of the main pages or sections of the site. The sitemap URL should be /sitemap. For example, http://example.com/sitemap
Breadcrumbs
If the site contains breadcrumbs, conduct a quick check to ensure the relevancy and functionality of the breadcrumbs. Breadcrumbs are helpful for navigation, as long as they accurately convey the page’s position in the overall site structure. Breadcrumbs also provide an opportunity for schema markup.
Main Page Content
The most-visited page of a website needs the greatest amount of SEO content care. Here’s what you should look for on the main page content:
- The correct keyword focus. Does the page target the correct longtail keywords for its industry and target audience?
- The correct keyword presence. Does the page include sufficient saturation of the keyword? Are there enough semantic variations of the keyword?
- Enough content. A glaring SEO shortcoming of many sites is the lack of optimized content. Image-rich websites provide sufficient visual engagement, but the site needs written content in order to best capture the attention of users and search engines. A line of above-the-fold copy is critical, and a minimum of 400 words of content on the homepage is recommended.
Consistent and Continual Content Production
In order to stay competitive, a website should be producing content on a regular basis. You can easily identify how Google is indexing this content by performing a simple Google query, site:www.example.com, and then clicking “search tools” → “anytime” → “past week” (or whatever time frame you prefer).
Sites updated weekly or more frequently are maintaining good content publication velocity, which will keep them actively indexed and ranked.
Page-by-Page Analysis
It’s time to take a deep dive into each page of the website — or as many as you’ve chosen. To make this as simple as possible, create a checklist of things to look for on each page.
Here’s a sample checklist:
- Page Title
- Does the page have a title?
- Is it within the optimal length of 30-55 characters?
- Is it relevant?
- Does it contain keywords?
- Is there a consistent standard of title creation? It helps to have a consistent convention for titling pages. For example, it may look something like this: [This Page is About Keywords] | [Company Name]. Make sure that there is some consistency in how the title is arranged, and what symbol is used to separate elements. (The vertical bar — | — is the standard.)
- Meta description
- Does the page have a meta description?
- Is it engaging for users?
- Is it the correct length?
- Does the page have an H1?
- Is it relevant and keyword-focused?
- Keyword focus and representation
- Generally speaking, each page should have a unique keyword focus. Does the page target a single keyword?
- Does the keyword occur on the page with enough frequency and semantic variety?
- Is the keyword represented in the title, the H1, and the body copy?
- Does the page have sufficient copy?
- Does the copy reflect the keyword focus of the page?
- Visual and interactive content (the more, the better)
- Does the page have engaging and relevant visual imagery?
- Do the videos play?
- Are there functional interactive elements?
- Alt tags
- Are images optimized with relevant filenames and tagged with appropriate keywords?
- Internal linking
- Does the page contain copy with relevant text links to other internal pages? A site with strong internal linking helps to create a site with strong internal architecture. Checking for at least one internal contextual link (apart from navigation links) is important.
- Last updated
- Frequently updated pages tend to rank higher. Google’s freshness factor rewards pages with significant revisions of major sections of content, improvements to the H1, and even new copy. Pages that have not been updated in a year or more should be revised.
- Broken links
- Make sure each page has no broken links.
- Spelling, grammar, etc.
- You may not have the resources to perform a full-on proofreading of every single page. However, it’s important to scan for accuracy and to eliminate any typos. A typo is a user experience issue, and therefore, it’s an SEO issue, too.
- Are the pages authored by copywriters who possess authority and mastery of the content?
- Are authors attributed in these content pages, and are there links to social profiles?
- Unique content
- Make sure that content is not unduly duplicated across pages. H1s, title tags, and body copy should be largely unique for nearly every page on the website.
- Content that is duplicated on external sites is a warning sign, signaling scraped or copied content, and running the risk of duplicate content penalization.
- Ads
- Unless the purpose of the site is ad revenue, heavy ad placement can distract from the purpose of the site and reduce user experience.
- Bounce rates
- Analyze the bounce rates of main pages to ensure that they are neither too high nor too low. Note that abnormally low bounce rates can be just as problematic as high ones; they often signal analytics misconfiguration rather than engaged visitors.
- Layout and Readability
- Ensure that content width, font face, font size, kerning, and font color are all contributing to enhanced readability and layout of the site.
- Check for headers (e.g., H2, H3, etc.) with appropriate CSS or formatting.
- Scan for readability with paragraphs, chunks of text, bullet points, etc.
- Outbound links
- Are there appropriate outbound links to authoritative websites, in order to validate claims or statements in the copy? Linking out to strong and authoritative websites (e.g., a news site, a reference site, etc.) can improve the site’s co-citation and co-occurrence.
- Overall value
- Central to the SEO question is “does this page provide value to the user?” The question is subjective, and the answer may be hard to come by. If the page serves a purpose, answers a question, or solves a problem, it’s obviously valuable. If it doesn’t, then the page should be revised or redirected to ensure that the website is streamlined, powerful, and clear.
- Call to action
- Every page should contribute to the overall business goal of the website. If further engagement or action is expected, the page should contain relevant CTAs to encourage the user to take the next logical step.
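The unique-content check above can be partly automated. Here’s a minimal sketch, assuming you have a crawl export of (URL, title tag) pairs; the `find_duplicate_titles` helper and the sample URLs are hypothetical:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group page URLs by title tag; titles shared by 2+ URLs are flagged.

    `pages` is a list of (url, title) tuples, e.g. from a crawler export.
    """
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = [
    ("https://example.com/", "Acme Widgets | Home"),
    ("https://example.com/about", "About Us | Acme Widgets"),
    ("https://example.com/widgets", "Acme Widgets | Home"),  # duplicate title
]
dupes = find_duplicate_titles(pages)
print(dupes)
# {'acme widgets | home': ['https://example.com/', 'https://example.com/widgets']}
```

The same grouping trick works for H1s or meta descriptions; only pages that intentionally share templates should ever collide.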
Off-page Audit
The final phase of the SEO audit is arguably the most significant in terms of SEO value. To be clear, the off-page audit is mostly about backlinks.
(Note: for the sake of brevity, we’ve chosen not to include social media profiles in this discussion, except for a mere mention. Social media audits are an entity all their own.)
Backlinks are a bickering ground among SEOs, but are nonetheless a critical force in SEO.
This audit should be conducted by someone who is familiar with backlinks. Out of all three sections of the audit, this section requires the greatest amount of informed opinion, experience-based knowledge, and synthesis of data.
A word about tools
You will need tools. To achieve complete backlink analysis, you may need to use several tools. Here are the ones that I recommend:
- Majestic SEO
- Ahrefs
- Moz
- Link Research Tools
Each tool has built up a glossary of unique terms that define their link theory. For example, Majestic uses measurements built on “trust flow” and “citation flow.”
Moz’s famous “DA” (domain authority) and “PA” (page authority) scores are another method of measurement.
Link Research Tools uses the opaque terms of CEMPER Power and CEMPER Trust to explain their scoring methodology.
If you’re not aware of the differences between tools or how the tools work, this array of terms and scoring factors can be confusing.
There are two options you have for conducting an audit:
- Option 1: Stick with a single tool and use their images, graphs, charts, and data.
- Option 2: Synthesize tools and produce your own conclusions, recommendations, and assessments.
The first option is recommended for those who do not have deep experience with backlink audits. The second option, synthesis, should be the approach taken by someone with more extensive experience and familiarity with the array of tools.
A word about comparison
Backlink analysis is a comparison game. Before you begin the audit, you should have a clear idea of at least five of the business’s competitor sites.
If you don’t have any comparative benchmarks, it will be difficult to gauge the success (or lack thereof) of the off-site SEO elements.
For example, if the industry standard for websites in your industry is a domain authority of 75, and the audited website is a 16, that would be a problem. However, if you weren’t aware that the industry average was in the 70s, then you wouldn’t be able to assess this major shortcoming.
Before conducting an off-site audit, I strongly recommend creating a list of the top five competitors of the business and gaining a sense of their off-site metrics.
Unlike the technical and content audits, which deal with data under the site owner’s control, a website’s off-site data is open to anyone who has the tools and interest to access it. There are no secrets. You can find out, with some accuracy, exactly which websites link to, say, the CIA or FBI website.
Coming up with metrics from competitor sites isn’t hard, and it should be a first step in conducting an off-site audit.
One of the most helpful ways to present this information to a client is to rank their website alongside the website of the competitors.
Where to get your data
Most of the audit elements below require that you have your site’s backlink data.
There are plenty of services that provide backlink data, which is why I suggest that you use one of the tools mentioned above.
Google provides some of the backlink data, and it’s a helpful starting point, especially if you’re willing to work with raw data instead of the analysis or presentation provided by backlink SaaSes.
To download your backlink data, log into Google Search Console → Search Traffic → Links to Your Site → under “Who links the most,” click “More” → “Download more sample links.”
The CSV data you download can be tweaked, massaged, and analyzed, depending on your skill or comfort level with spreadsheets.
One simple way to analyze these links for your audit is to eliminate all duplicate domain links and determine the domain authority for each referring domain. (Excel skills are a good thing to have in your back pocket.) Instantly, you’ll reveal the topics and authority of the site’s backlink profile.
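As a sketch of that spreadsheet step, assume the exported CSV has been read into a plain list of linking URLs; the `unique_referring_domains` helper and the sample URLs are illustrative:

```python
from urllib.parse import urlparse

def unique_referring_domains(urls):
    """Collapse a list of linking URLs (e.g. from a Search Console
    'Download more sample links' export) to unique referring domains.
    DA lookup for each domain would still be done via a tool like Moz."""
    domains = set()
    for url in urls:
        if "//" not in url:          # some exports omit the scheme
            url = "http://" + url
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    return sorted(domains)

links = [
    "https://www.example.org/blog/post-1",
    "http://example.org/resources",
    "https://partner.com/links",
]
print(unique_referring_domains(links))  # ['example.org', 'partner.com']
```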
Incoming Search Traffic
Google’s “Search Analytics” is a frontline resource for auditing a website’s search traffic. Although not as robust as some tools, it nonetheless gives you an accurate measure of a website’s search health and is a great auditing tool.
To access this report, go to the Search Console → Search Traffic → Search Analytics.
The chart gives you an excellent understanding of the site’s search viability, ranking, and performance, especially when you track clicks, impressions, CTR, and position.
This chart can serve as the starting point for a deeper dive into the site’s off-site elements. It prepares you as the auditor with an understanding of the site’s traffic and search potential, before opening the Pandora’s box of link issues.
Penalty Assessment
First, you want to answer the glaring question: is the site penalized?
There are two types of penalties — manual and algorithmic. I’ll show you how to assess both:
Manual Penalty
This is a penalty levied by Google herself and enacted by a real-life human being in the Googleplex. If the site has a manual penalty, you’re in real trouble. And you probably already know it.
To check, go to Search Console → Search Traffic → Manual Actions.
Hopefully, you’ll see the message “No manual webspam actions found.”
If you don’t see that message, the website is in serious trouble and should seek immediate remediation.
Algorithmic Penalty
An algorithmic penalty is one that is levied automatically, based on the various fluctuations in Google’s search algorithm.
The best tool for checking for algorithmic damage is Barracuda’s free Panguin Tool, which only requires that the website have Google Analytics installed and functioning.
The tool presents your website’s search traffic overlaid with a timeline of algorithmic changes. A decline in traffic that corresponds to an algorithm change signals an algorithmic penalty.
Domain Authority
Moz’s “domain authority” ranking provides a logarithmically scaled number value to every site. The higher the number, the more authoritative the website. The DA is developed from a mashup of metrics, most notably backlink data.
Using Moz’s free tool, you can find out the DA of any website. I use Moz’s browser plugin, which presents me immediately with the DA of every website I visit. It’s a quick way to answer the question, “Is this website authoritative?”
Check whether the website you’re auditing has a DA within a ten-point range of the industry average.
Specifically, compute the average DA of the five competitors and compare. A site more than ten points below that average has a significant shortcoming to address; a site above it is leading the field.
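The competitor comparison boils down to simple arithmetic; here’s a sketch with made-up DA values:

```python
def da_gap(site_da, competitor_das):
    """Return (competitor average, gap), where a negative gap means the
    audited site trails the field. All DA values here are illustrative."""
    avg = sum(competitor_das) / len(competitor_das)
    return avg, site_da - avg

avg, gap = da_gap(site_da=62, competitor_das=[75, 70, 68, 72, 80])
print(round(avg), round(gap))  # 73 -11  (more than ten points behind)
```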
Page Authority
Page authority is similar to Moz’s domain authority, but is page specific. I recommend identifying the PA for the most important pages of a website.
A website’s offsite link profile includes not merely the links to the domain in general, but to the links pointing at specific pages within that domain.
Link Quantity
The overall quantity of links is a good gauge of the overall power and authority of the domain. You can’t depend on quantity alone as a barometer of site health, however. Some of the most toxic (and penalized) websites have the most links — links generated by years of blackhat SEO and spam.
New websites may have a few hundred links. Over time, this number should grow. Many websites have millions of inbound links, gained from years of steady content production and growth of authority.
Link Velocity
“Velocity” describes the rate at which links are acquired or lost. Generally, a website should see a gradual and steady increase in links over time.
Major spikes in link growth signal one of two things: 1) the website went viral (woot!), or 2) the website is the victim of negative SEO. Either way, the trend should be monitored and the links analyzed.
A significant velocity spike of low-quality links may signal spam factors, putting the site at risk for a penalty.
The best tools for measuring link velocity are Majestic, Ahrefs and Link Research Tools.
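If you export a per-month count of new links from one of those tools, a spike check can be sketched like this; the threshold factor and the sample counts are illustrative:

```python
def velocity_spikes(monthly_new_links, factor=3.0):
    """Flag month indexes where new-link counts jump to `factor`x the
    trailing average. Input is a chronological list of new-link counts,
    e.g. pulled from a backlink tool's 'new links per month' chart."""
    spikes = []
    for i in range(1, len(monthly_new_links)):
        trailing = monthly_new_links[:i]
        avg = sum(trailing) / len(trailing)
        if avg and monthly_new_links[i] >= factor * avg:
            spikes.append(i)
    return spikes

counts = [40, 45, 50, 48, 400, 55]   # month 4 shows a suspicious spike
print(velocity_spikes(counts))  # [4]
```

Any flagged month deserves a manual look at the links acquired during it: a viral post and a negative-SEO blast look identical on a chart.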
Deep Link Ratio
A website should have more deep links than homepage links.
Deep links are those that target the website’s internal pages. Links to deep/internal pages are deemed more trustworthy, and therefore more valuable to a website’s overall authority and integrity.
Successful content marketing efforts will gain far more deep links than homepage links. Any ratio over 50% deep links is acceptable, and a deep link ratio of 75% is optimal.
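Given a list of backlink target URLs (e.g., exported from a backlink tool), the deep link ratio can be computed like this; the sample URLs are hypothetical:

```python
from urllib.parse import urlparse

def deep_link_ratio(target_urls):
    """Share of backlinks pointing at internal pages rather than the
    homepage. Any target with a path beyond '/' counts as a deep link."""
    deep = sum(1 for u in target_urls if urlparse(u).path not in ("", "/"))
    return deep / len(target_urls)

targets = [
    "https://example.com/",
    "https://example.com/blog/guide",
    "https://example.com/products/widget",
    "https://example.com/about",
]
print(deep_link_ratio(targets))  # 0.75 — at the optimal end of the range
```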
TLD Distribution
TLDs, or top-level domains, refer to a website’s domain type (.org, .com, etc.). A healthy website has mostly .org and .com referring links.
A high percentage of .biz, .co, or foreign-language links can be a sign of negative SEO.
The most valued TLDs are .gov and .edu. Links from these sites are usually authoritative, and can confer some serious link cachet to the receiving websites.
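A rough TLD tally over a list of referring URLs might look like this; note that the naive last-label split misreads multi-part TLDs such as .co.uk, so a production audit would resolve hosts against the public-suffix list instead:

```python
from collections import Counter
from urllib.parse import urlparse

def tld_distribution(referring_urls):
    """Tally referring hosts by their last dot-separated label."""
    tlds = Counter()
    for url in referring_urls:
        host = urlparse(url).netloc
        if "." in host:
            tlds["." + host.rsplit(".", 1)[-1]] += 1
    return tlds

refs = [
    "https://news.example.com/story",
    "https://university.edu/research",
    "https://nonprofit.org/partners",
    "https://shop.example.com/links",
]
print(tld_distribution(refs))  # Counter({'.com': 2, '.edu': 1, '.org': 1})
```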
Link Trust and Authority
Finding out the trust and authority of the referring links is a crucial measure of the site you’re auditing. Too many untrustworthy or low-quality links can compromise the site.
Tools such as Link Research Tools can help assess this aspect. Using Google’s backlink data, you can also conduct manual research on linking websites to determine their trust and authority.
Link Topic or Theme
The site should be receiving links from other relevant websites. For example, a local construction contractor might receive links from local directories, local news services, and vendors such as plumbers or electricians.
These types of sites can be classified according to their general theme or topic — computer, business, gaming, etc.
The easiest way to assess the link topic or theme is to use a tool like Majestic or Link Research Tools.
Check the incoming links to make sure that the topic of the linking site is relevant. A buildup of foreign language, ad-heavy, pharma, or adult-themed links is a warning sign regarding the health of the link profile.
Anchor Text Analysis
The healthiest anchor texts are branded, meaning that the referring links should use anchors that contain the brand name.
The second-healthiest anchor texts are naked URLs.
Too many keyword-rich anchors are a spam signal. For example, if your site is selling “cheap cell phones,” and 75% of your anchor texts contain some “cheap cell phone” variation, that’s not healthy.
Image anchors are fine, but if you have 50% or more image anchors, it signals a concern.
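A rough classification of exported anchor texts can be sketched as follows; the brand name, money terms, bucket rules, and sample anchors are all illustrative assumptions, so substitute the audited site’s real values:

```python
def classify_anchors(anchors, brand="acme", money_terms=("cheap cell phone",)):
    """Bucket anchor texts into image / naked URL / branded / keyword / other,
    returning rounded percentages of the total."""
    buckets = {"image": 0, "naked": 0, "branded": 0, "keyword": 0, "other": 0}
    for a in anchors:
        text = a.strip().lower()
        if not text:
            buckets["image"] += 1   # image links often carry empty anchor text
        elif text.startswith(("http://", "https://", "www.")):
            buckets["naked"] += 1   # bare URL used as the anchor
        elif brand in text:
            buckets["branded"] += 1
        elif any(term in text for term in money_terms):
            buckets["keyword"] += 1
        else:
            buckets["other"] += 1
    return {k: round(100 * v / len(anchors)) for k, v in buckets.items()}

anchors = ["Acme", "https://acme.example", "", "cheap cell phones", "click here"]
print(classify_anchors(anchors))
# {'image': 20, 'naked': 20, 'branded': 20, 'keyword': 20, 'other': 20}
```

On a real profile, a “keyword” share creeping toward 50–75% is the spam signal described above.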
Link Status
There are three types of incoming links:
- Redirects — links that automatically redirect from another website domain or internal page
- Follow — the standard href link from one site to another
- No-follow — an incoming link tagged with rel="nofollow"
Most of the links to a website will be followed. A ratio of 90%+ followed links is normal. If there is an overwhelming number of no-followed or redirected links, it’s important to draw attention to it and find out why.
Link Toxicity
One of the most vexing questions in link profile auditing has been how to measure the spamminess of links. “Toxicity” is a word sometimes used to describe the harmfulness of a link pointing to the website.
To date, there is no cut-and-dried solution to this concern. The popular Link Research Tools is the leading SaaS in the space, but its results are oftentimes skewed toward the negative. Moz’s recent foray into harmful link scoring is slightly more modest, but its results can be incomplete.
I recommend using whichever tool you are most comfortable with, but not leaving the answer to the question in the purview of a specific tool. Instead, you should examine and audit the whole scope of link profile concerns, and develop a more nuanced answer to the question. In addition, I recommend downloading a website’s backlinks and cherry-picking through them to get a feel for the nature and authority of the linking sites.
Every website is going to have toxic links. The real concern is, what action will you take?
Link Attrition
In addition to gaining links, websites also lose links. Analyze link attrition metrics to find out how quickly or slowly it’s happening. If a site is hemorrhaging links at a rapid pace, it could reduce the site’s authority, while also possibly signaling an internal or technical problem.
Brand Mentions
Brand mentions are distinct from links. A brand mention happens when a website is mentioned, but not directly linked.
For example, let’s say the website you’re auditing is called “Methodical Coffee,” which happens to be the name of a great coffee shop in my city. (I have no financial interest in, nor do I work for, Methodical Coffee. I occasionally sip $5.00 espressos there. That’s about it.)
A brand mention of “methodical coffee” comes from the prestigious National Geographic, along with a photo and description. In the caption, however, there is no link to Methodical Coffee’s website.
Branded mentions are a growing source of power and authority for websites. The trend highlights the need for brands to promote not only their websites but also their brand reputation, in the interest of improving search potential.
Moz’s brand mention tool is one of the most helpful methods of measuring these occurrences. You can also perform some Google gymnastics to get an accurate measure of brand mentions with or without a link.
The Google query technique involves removing your website and social profiles from the search, and manually examining the SERPs.
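As an illustration of that query technique, a search along these lines surfaces unlinked mentions (the domain and the set of excluded social sites are hypothetical; adjust them to the brand you’re auditing):

```
"methodical coffee" -site:methodicalcoffee.com -site:facebook.com -site:twitter.com -site:instagram.com
```

Scanning the remaining results by hand reveals pages that mention the brand without linking to it.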
Finishing Touches
Packaging your report in a presentable and beautiful way is important, especially if you want to present it to a client or use it as a guide.
Here are some suggestions:
- Make it visually appealing. Eyeballs can glaze over when you throw walls of text and screenshots of code into a document. Embrace white space and big colorful pie charts.
- Include images, screenshots, charts, and graphs in your explanation and analysis.
- Provide a one-paragraph summary of the entire audit.
- Include a detailed checklist of things to fix.
- Don’t be afraid of being detailed, as long as you can provide a brief summary as well.
Audits aren’t the kind of resource that you simply send off with a hope and a prayer. They should be accompanied by a meeting in order to walk through the findings and explain their potential impact.