Follow this 16-step SEO Audit checklist to find SEO errors and boost your Google rankings.

For 2020 the ugly days of keyword stuffing your way up the SERPs are far away in the rearview mirror. Google’s algorithm now has over 200 ranking factors, and they seem to be adding more every month.

Does this mean that an SEO Audit in 2020 needs more than 200 steps to be any good? Does it need to take weeks and hundreds of man-hours?

No, it doesn’t.

Google weighs each factor differently, and by focusing on the most critical factors, you can get over 90% of the results with less than 10% of the effort.

Here's a list of what we will be covering in our SEO audit:

  1. Determine how SEO fits into your overall marketing strategy
  2. Crawl your website
  3. Find indexing errors & technical problems
  4. Remove low-quality content
  5. Use Robots.txt and Robots Meta Tags to resolve technical issues
  6. Identify speed & mobile page issues
  7. Find structured data errors
  8. Test your meta descriptions
  9. Analyze your organic traffic
  10. Learn from your competition
  11. Improve your content & on-page SEO
  12. Optimize your internal links for rank boosts
  13. Optimize your crawl budget with your site structure
  14. Improve your backlink strategy
  15. Track your results
  16. Run recurring audits with a checklist

If you're looking for a simple approach to an SEO audit, check out the video below, and if you're just starting out, see our What is SEO guide:

Tools needed for the SEO audit

Before we jump into the SEO audit process, there are a few tools that you’ll want to experiment with to help make this entire process go as smoothly as possible. Are they required to overhaul your site from an SEO audit point of view? Not necessarily – but they will make the process far easier and more effective, which is why every one of them is worth a closer look.

#1 Decide How SEO Fits Into Your Overall Marketing Strategy

This is basic advice, but understanding how SEO fits into your marketing strategy is the first step of our SEO audit process.  New marketing platforms are gaining traction, and SEO/SEM is no longer the end-all-be-all of digital marketing.

Some companies spend as much as 60% of their previous Google Search budget on Amazon. And social media spending rose to 13.8% of the total marketing budget in 2018.

So you can’t expect SEO to deliver 100% of your results when you might only be spending 30% of your energy and money on it.

So you have to decide what actions you are trying to drive with your SEO strategy.

One simple question that will help you is:

When are you trying to reach your customers?

  • Before they know they are interested in a product/service.
  • When they are researching/comparing alternatives.
  • Right before the point of sale. (When they are searching for dealers/suppliers.)

Unless you know what you want SEO to do for your business, it’s impossible to evaluate performance.

If you know that you want to focus on reaching customers when they are researching/comparing alternatives, everything from then on gets a lot easier.

An SEO Strategy Worth $35M+ in ARR

Zapier is a great example of an SEO strategy dedicated to a specific step in the buyer’s cycle: the Zero Moment of Truth.

Zapier gets more than 8.49 million visits a month from over 25,000 unique landing pages.

They achieve this by understanding exactly where SEO fits into their overall marketing strategy, and what stage they are trying to reach potential customers in.

The majority of their 25,000 landing pages focus on people searching for integrations between two pieces of software.

SERPs like this are the bread and butter of Zapier, a business that reached $35M in ARR last year. Every time they partner with a new company, they make an integration page for each of the possible integrations.

They make it easy to rank because partners are encouraged to link to their unique integration page.

With the added cash flow, they eventually added a vibrant company blog and other elements to their SEO & content strategy, but this simple strategy generated millions of dollars.

If you know when you want to reach customers via organic search, and you create a strategy around that, you significantly increase your chances of success.

Tip: Here are 19 actionable SEO tips for 2020 that will help boost your rankings.

#2 Crawl Your Website

As part of the process used to determine search engine rankings, search engines like Google will use a crawler (or “spider”) to essentially analyze the structure and current SEO setup of your site, looking for various elements that determine where you’ll rank for certain terms.

Tip: Google indexes and ranks JavaScript pages in two different waves, days apart.

Therefore, if you really want to conduct an SEO audit, you’ll want to crawl your site yourself.

There is a wide range of tools that you can use to do this, some of which are paid and some of which are free. I recommend using Screaming Frog’s SEO Spider to kick off your SEO audit (it’s free for the first 500 URLs and then £149/year after that).

Once you’ve signed up for an account, it’s time to select your crawl configuration. This is important: you need to configure your crawler to behave like the search engine you’re focused on (Googlebot, Bingbot, etc.). You can select your user-agent by clicking on configuration > user agent in Screaming Frog (I’ve included a short gif below).

Next, you want to decide how the spider should behave. You can do this by selecting “spider” from the configuration menu in Screaming Frog. Do you want the spider to check images, CSS, JavaScript, canonicals, etc.? I suggest allowing the spider to access all of the above (I will share my setup below!).

If you have a site that relies a lot on JavaScript (like SpyFu) you’d want to make sure the spider can render your pages correctly. To do that, click on the rendering tab, and select JavaScript from the drop-down. You can also enable rendered page screenshots to ensure that your pages are rendering correctly.

When everything is finalized, enter your URL and click “Start” to get started. Depending on the size of your domain, this could take quite a while – so remember, patience is a virtue.
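Under the hood, a crawler's per-page work boils down to parsing the HTML and recording a few key elements. Here's a minimal sketch using Python's standard library (the sample page markup is hypothetical); tools like Screaming Frog do far more, but the core idea is the same:

```python
from html.parser import HTMLParser

class AuditParser(HTMLParser):
    """Collect the basics a crawler records for each page:
    the title, meta robots directives, and outgoing links."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.robots = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots = attrs.get("content") or ""
        elif tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page markup standing in for a fetched URL.
page = """<html><head><title>Pricing</title>
<meta name="robots" content="noindex"></head>
<body><a href="/features">Features</a> <a href="/blog">Blog</a></body></html>"""

parser = AuditParser()
parser.feed(page)
print(parser.title, parser.robots, parser.links)  # Pricing noindex ['/features', '/blog']
```

Run this over every URL on your site and you have the raw data the rest of the audit works from.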

#3 Find Indexing Errors & Technical Problems

Even in 2020, fundamental technical SEO issues are more common than you’d like to think, and an SEO audit is one sure way to find them.

50% of analyzed web pages had duplicate content & indexation issues, 45% had broken image and alt tag issues, and 35% had broken links.

Find indexing issues (unnecessary pages, duplicate content), broken links and other technical issues by using a tool like Screaming Frog.

Just type in your URL, and let it tell you everything that is wrong with your website from an on-page SEO standpoint.

It helps you quickly identify significant issues and quick wins like:

  • Duplicate Content
  • Broken Links
  • Redirects
  • Too short/long Meta Tags
  • Broken Images
  • Noindex Tags

And since it identifies the specific URL for each problem, you can go in and fix each problem one by one.
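If you want to sanity-check one of these issues yourself, exact-duplicate content is easy to approximate by hashing normalized page text. A minimal sketch (the URLs and page text below are made up for illustration):

```python
import hashlib
from collections import defaultdict

def find_duplicates(pages):
    """Group URLs whose normalized body text hashes identically.
    `pages` maps URL -> extracted page text (hypothetical crawl output)."""
    groups = defaultdict(list)
    for url, text in pages.items():
        normalized = " ".join(text.lower().split())  # collapse case and whitespace
        digest = hashlib.sha256(normalized.encode()).hexdigest()
        groups[digest].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

pages = {
    "/widgets": "Our best widgets, built to last.",
    "/category/widgets": "Our best   widgets, built to last.",  # same text, extra spaces
    "/about": "We are a small team in Austin.",
}
print(find_duplicates(pages))  # [['/widgets', '/category/widgets']]
```

Near-duplicates (boilerplate-heavy pages with small differences) need fuzzier matching, which is where a dedicated crawler earns its keep.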

How To Fix a Common Duplicate Content Issue in WordPress

Duplicate content can often be caused by the content structure coded into your CMS (content management system). If you use WordPress, it publishes category pages by default.

For example, when you create separate category pages, Google might index both the category archive and the posts it contains, leading to duplicate content in the SERPs.

So duplicate content is a widespread issue for inexperienced WordPress users.

If you have the SEO by Yoast plugin installed, you can easily hide category and other taxonomy pages from the search results.

Go to SEO / Search Appearance / Taxonomies in the WP dashboard, and you can change the settings for categories and tags.

If your category pages are getting flagged for duplicate content, you should set the “Show Categories in search results” setting to “No.”

#4 Remove Excess / Low-Quality Content

While it seems counterintuitive, Google has said they don’t value the frequency of posting or volume of content as a ranking factor.

And it’s not just smoke and mirrors. Over the last few years, sites have managed to improve their search rankings by actually removing thousands of pages from search results. This strategy, sometimes called Content Pruning, can lead to significant traffic increases as big as 44%.

Just as with regular pruning, Content Pruning is a process where you go about removing the unnecessary, to maximize results.

Remove pages that serve little purpose and that don’t adequately answer the search queries of Google searchers.

While you should consider removing problematic pages discovered with the SEO tool in the last step, tools often don’t pick up on the majority of problem pages. So you have to do some manual searches.

What better place to start than Google itself?

Check how many pages Google has indexed, and if the number is outrageous, start pruning.

If you are using a CMS, you should make a note of the types of pages that are unnecessarily indexed as well.

There might be a setting in the backend that you can change to fix the issues, rather than manually deleting or no-indexing hundreds or thousands of pages.

A tidy site with good structure that highlights quality, cornerstone content is an important piece of the SEO puzzle in 2020.

#5 Robots.txt and Robots Meta Tags

Understanding how the Robots.txt file works is vital in any SEO audit -- Robots.txt tells Google and other web crawlers which parts of your site they may crawl, so your pages can be found. It also allows you to tell Google what sections of your site you DON’T want them to crawl (though these pages can still be indexed if anything links to them).

We accidentally set a bug in motion that basically told Google not to crawl any of our main site.

This bug, which was the programming equivalent of turning off the wrong light switch, told Google to deindex hundreds of thousands of pieces of content pretty much overnight. To avoid this, use a service like TechnicalSEO's robots.txt checker, which monitors your Robots.txt and notifies you if it detects any changes.

This lasted for 5 days without us realizing it. We found the problem and reverted it back, but the damage was done. We went from 500,000 pages indexed to 100,000.

We got some of that traffic back. But not all of it, and it was a crushing blow and an important lesson.

If you want to de-index specific pages, Google recommends using robots meta tags. “The robots meta tag lets you utilize a granular, page-specific approach to controlling how an individual page should be indexed and served to users in search results. Place the robots meta tag in the <head> section of a given page, like this:

<meta name="robots" content="noindex" />

“The robots meta tag in the above example instructs most search engines not to show the page in search results. The value of the name attribute (robots) specifies that the directive applies to all crawlers. To address a specific crawler, replace the robots value of the name attribute with the name of the crawler that you are addressing. Specific crawlers are also known as user-agents (a crawler uses its user-agent to request a page.) Google’s standard web crawler has the user-agent name Googlebot. To prevent only Googlebot from crawling your page, update the tag as follows:”

<meta name="googlebot" content="noindex" />

“This tag now instructs Google (but no other search engines) not to show this page in its web search results. Both the name and the content attributes are non-case sensitive.”

Although Google doesn't officially support using Noindex: in the robots.txt, we've found that the command has worked for us and others, most notably Wayfair. Your mileage may vary using this command in the Robots.txt file.

Tip: If you use robots meta tags, make sure that Google can crawl that page (do not block it via Robots.txt).
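You can also test robots.txt rules before deploying them, using Python's built-in urllib.robotparser. The rules below are a hypothetical example:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt; parse() accepts the file's lines directly,
# so you can verify rules before they go live.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /search

User-agent: Googlebot
Disallow: /admin/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "/pricing"))  # True  (no Googlebot rule blocks it)
print(rp.can_fetch("Googlebot", "/admin/"))   # False
print(rp.can_fetch("*", "/search"))           # False (blocked for all other crawlers)
```

A check like this, run against your live robots.txt on a schedule, would have caught the bug described above in hours instead of days.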

#6 Check Speed & Mobile Page Issues

Not only has Google announced that page speed is officially a ranking factor, but loading speed and mobile page display issues are easy to spot and can have a great impact on your bottom line.

Pages that load within two seconds have an average bounce rate of 9%, while pages that take five seconds to load have a bounce rate of 38%.

And after the introduction of RankBrain, bounce rates can impact rankings by drastically changing the average dwell time – another ranking factor.

If you still want to risk it: 79% of shoppers who have a bad experience with your website are less likely to return to shop again.

Test Your Website Speed

An easy way to check your website loading speed is to use a tool like Pingdom’s website speed test.

Simply type in the URL, choose a location, and run the test.

As you can see, Pingdom is almost failing us, giving us a D on on-page factors, but the key here is the actual load time, which is well below a healthy 2 seconds.

Because we fork out for quality servers, we get away with including a video above the fold & JS associated with our web app.

Your mileage may vary.

Fixing the issues can be as simple as installing a caching plugin (for clunky WordPress installations), downsizing image and video files, installing Lazy Loading scripts, or as complicated and expensive as moving servers.

Test Your Mobile-Friendliness

Testing if your website is mobile-friendly is just as easy. Head over to Google’s Mobile-Friendly Test, type in your URL and run the test.

Unless you are using an old theme in an old installation of WordPress, your scores should be okay.

If they are not, and you don’t have the expertise to fix the issue, you should contact a developer to help you sort this out.

Not only will it impact your rankings, but it will affect how every person that reaches your site through SERPs interacts with your website and business.

#7 Get Rid of Structured Data Errors

There is a wide range of pages on your domain that could benefit, in SEO terms, from the inclusion of structured data. These include, but are certainly not limited to, product or service reviews, product or service information pages, and pages outlining an upcoming event you’re going to be participating in.

Head over to Google’s own Structured Data Testing Tool and enter the URL of a site you want to check. Click the option labeled “Run Test” and Google will not only evaluate the structured data for the domain you just entered – it will also provide you with any errors that were found at the same time.

If any errors were uncovered, do whatever you need to do to fix them. Luckily, Google’s tool will tell you where they are. If you built your site yourself, dive back into the code and make the necessary changes. If you hired someone to do it, hand them the report you just received and let them get to work – it’s a good starting point and, again, the impact you’ll experience can be huge.
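As a rough illustration of what the tool is checking, the sketch below pulls JSON-LD out of a page and flags a Product block that is missing the properties Google's rich results look for. The regex-based extraction and the required-property list are simplifications for illustration (real pages deserve a proper HTML parser):

```python
import json
import re

def jsonld_blocks(html):
    """Pull JSON-LD payloads out of <script type="application/ld+json"> tags."""
    pattern = r'<script type="application/ld\+json">(.*?)</script>'
    return [json.loads(m) for m in re.findall(pattern, html, re.DOTALL)]

# Hypothetical page containing one Product block.
page = """<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product", "name": "SEO Toolkit"}
</script>"""

for block in jsonld_blocks(page):
    # Google's Product rich results expect at least one of review,
    # aggregateRating, or offers alongside the name (illustrative check).
    missing = [k for k in ("review", "aggregateRating", "offers") if k not in block]
    if len(missing) == 3:
        print(f'{block["name"]}: add review, aggregateRating, or offers')
```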

#8 Test and Rewrite Your Meta Descriptions

You’ll want to pay particular attention to your site’s meta descriptions – the short summaries of your pages that can appear under your titles in search results.

Google recently said that titles and meta descriptions are easy wins.

This is something that you want to pay attention to. More often than not, the problem that people run into has to do with duplicate meta tags for similar pages in multiple locations across the domain.

In the Google Search Console, click on the menu option labeled “Search Appearance” and click the button labeled “HTML Improvements.”

Then, click the option reading “Duplicate Meta Descriptions” to see how many meta tags you’re going to have to rewrite.

Remember that meta description information is designed to give people a very clear idea of the content you’re offering, thus encouraging them to click. Keep those descriptions short, sweet and to the point – and also make sure they’re unique as well.

As a bonus, you can also use the “HTML Improvements” window in the Google Search Console to look at missing and duplicate title tags, too.

Make sure that all title tags are accounted for, and in terms of duplicate tags, the same rules apply. Give your pages a clear title that immediately lets people know what it is you’re offering and, most critically, why they should care enough to click in the first place.

ABT – Always be testing applies to your titles and meta descriptions too!  Test your meta descriptions and keep an eye on your CTR for those pages!
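A simple way to start testing is to flag titles and descriptions that fall outside rough display limits. The thresholds below are approximations (Google actually truncates by pixel width, not characters), and the page data and `check_meta` helper are hypothetical:

```python
def check_meta(pages):
    """Flag titles and descriptions outside rough display limits.
    `pages` maps URL -> (title, meta description)."""
    issues = []
    for url, (title, description) in pages.items():
        if not (30 <= len(title) <= 60):
            issues.append((url, "title", len(title)))
        if not (70 <= len(description) <= 155):
            issues.append((url, "description", len(description)))
    return issues

pages = {
    "/pricing": ("Pricing", "Compare plans and pick the one that fits your team's workflow and budget."),
}
print(check_meta(pages))  # [('/pricing', 'title', 7)] – the title is too short
```

Anything flagged here is a candidate for a rewrite and a CTR experiment.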

#9 Analyze Keywords & Organic Traffic

Most websites get the majority of their traffic from minor keywords. Long tail searches account for 70% of all searches done on the web. And 50% of searches are 3 words or longer.

This tendency means that you probably get most of your traffic from lower volume, long-tail keywords.

The first step to making the most of this trend is to do an in-depth analysis of the keywords that already drive traffic to your site.

If you have connected Google Analytics with Search Console, you can analyze from within GA.

Simply open up the Acquisition breakdown, and click through to organic search.

Once you’ve arrived at the channel breakdown for Organic Search, select “Landing Page” as the primary dimension. (The Keyword view almost always shows “not provided” now and is not very useful.)

This will give you an overview of the most important pages on your site for organic search. Then you can cross-reference these results with the queries breakdown in Search Console.

Since Google doesn’t directly tie queries to landing pages in Search Console, and GA no longer receives keyword data directly from Google, this roundabout way is the best you can do with a third-party service.
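If you export both reports, the cross-referencing itself is straightforward: for each important landing page, collect the queries Search Console attributes to it. A sketch with made-up export data:

```python
# Hypothetical exports: organic sessions per landing page (from GA) and
# clicks per query with its top landing page (from Search Console).
ga_landing_pages = {"/blog/seo-audit": 1200, "/pricing": 450}
gsc_queries = [
    ("seo audit checklist", "/blog/seo-audit", 800),
    ("seo audit", "/blog/seo-audit", 300),
    ("pricing", "/pricing", 400),
]

# For each landing page (biggest first), list the queries that drive it.
for page, sessions in sorted(ga_landing_pages.items(), key=lambda kv: -kv[1]):
    queries = [q for q, p, clicks in gsc_queries if p == page]
    print(page, sessions, queries)
```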

If you want something more efficient, you can use the keyword research tools included in SpyFu.

SpyFu Keyword Tools

Just type in your URL, and you will instantly get a breakdown of your most valuable keywords. The only caveat is the traffic will be limited to estimates, whereas your GA traffic is based on real data.

The monthly search volume is real. So this can be a quick and easy pointer in the right direction if the deep dive in GA and Search Console is too confusing.

#10 Learn From Your Competition

The top 1 billion search terms make up only 35.7% of total searches. Since the other 99 billion+ terms make up the brunt of searches, there are too many to find with normal keyword research and ideation.

So how do you know which keywords to target, day after day, year after year?

“Borrow” a little from every one of your biggest competitors.

If you are new to SEO competitor analysis, don’t worry. With SpyFu, it’s no longer a complicated thing.

Type in your URL, press enter, and voila, your organic search competitors are served on a silver platter. Or rather, they are arranged in the form of a numbered list.

SpyFu Competitor Research Tool

Once you have this list of competitors, you can do a few things.

You can painstakingly go through the SERPs for important keywords, and try to figure out where you’re lacking.

Or you can use SpyFu’s Kombat to instantly surface keyword opportunities that multiple competitors target while you don’t.

SpyFu Domain Comparison Tool

The Kombat breakdown compiles a list of all the keywords targeted by 3 separate domains (yours being one of them).

Then it filters the keywords into three lists:

  1. Keywords everyone targets.
  2. Keywords your competitors are targeting, but not you.
  3. Keywords one competitor is targeting.

The second is the most valuable, and often contains gems you can turn around and create a content calendar with.
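The three lists are really just set operations, so you can reproduce the logic on any keyword exports you have. The keyword sets below are hypothetical:

```python
# Hypothetical keyword sets pulled from a research tool's export.
yours = {"seo audit", "rank tracker", "keyword research"}
competitor_a = {"seo audit", "backlink checker", "serp analysis"}
competitor_b = {"seo audit", "backlink checker", "content brief"}

everyone = yours & competitor_a & competitor_b      # 1. keywords everyone targets
gap = (competitor_a & competitor_b) - yours         # 2. both competitors, not you
unique = (competitor_a ^ competitor_b) - yours      # 3. only one competitor targets

print(sorted(everyone))  # ['seo audit']
print(sorted(gap))       # ['backlink checker']
print(sorted(unique))    # ['content brief', 'serp analysis']
```

The `gap` set is your content-calendar shortlist.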

Then, if you notice a single website is killing the competition in organic search, you can use our keyword research tool to find opportunities from just the one site.

#11 Check & Improve Content and On-Page SEO

While backlinks get most of the pomp and fanfare when it comes to SEO, on-page changes can deliver real results with much less investment.

In a recent case study, on-page improvements alone led to a 32% increase in organic traffic.

Breaking up content and adding relevant h2s/h3s/h4s alone led to a 12% improvement.

What does this mean?

Google either directly has Readability as a ranking factor, or it’s a usability thing that impacts rankings indirectly because of dwell time and RankBrain.

Either way, both your content’s structure and the content itself have to be improved if you want to get the most bang for your SEO dollars.

Improve Your Content

When it comes to SEO in 2020, content is still not the undisputed king, but you will struggle to rank content readers don’t like to read.

A great tool that can help you improve your content is Market Muse. It will help you strike a nice balance between SEO concerns and the actual content.

It analyzes all pages of your website, and pinpoints pages where the content may need some tweaks.

You can also actively analyze single pages against their competition.

It automatically suggests topics that you should include in pages and posts to get a fighting chance in the SERPs.

There are still a lot of factors that you have to improve by ear, like audience-voice match and creative, but Market Muse can act as the extra person a content team usually sorely needs.

For pure readability concerns, you can run your content through an app like Grammarly or Hemingway before publishing.

This AI-powered proofreading will help you pick up on grammatical errors, and make your content easier on the eyes of the humans on the other side of a blue screen.

Improve Your On-Page SEO

Making sure your On-Page SEO was on point used to be a very tedious process. You had a 30+ point checklist, and you went through it.

For each page.

Now there are a ton of tools out there to help you. And the best part is, most of them are free.

One of the best ones is Seobility’s SEO checker.

Just type your URL in and start the check.

Use Seobility to check your homepage as well as individual cornerstone pages that drive a lot of traffic, or have the potential to.

Follow up on the findings.

#12 Optimize Your Internal Links for Rank Boosts

Internal links are like the forgotten step-brother of the SEO audit process. They exist, and sometimes they get a mention, but most of the time professionals are stuck worrying and talking about backlinks.

Which is a shame, because optimizing your internal links alone could improve your traffic by as much as 40%, with a lot less work.

The consensus among SEOs is that many internal links are a signal to Google that a specific page is important.

You can check how your internal links are currently distributed with Google Search Console’s links report.

Your most linked-to pages are broken down, and you see a specific breakdown for internal links.

This breakdown shows all pages in descending order based on how many internal links they receive.

Privacy Policies, Terms and other pages linked to from the footer/sidebar of your website will inevitably rank highly here.

Google is smart enough for that not to be an issue.

Focus on the relevant, original content pages.

Are you giving too many internal links to content targeting low-volume keywords?

Does your cornerstone content have too few internal links?

These are the questions you want to answer by going through this breakdown.
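If you'd rather work from a crawl export than the Search Console UI, counting inbound internal links per page is simple. A sketch with a hypothetical link graph, filtering out footer boilerplate as suggested above:

```python
from collections import Counter

# Hypothetical crawl output: each page and the internal links found on it.
links_on_page = {
    "/": ["/pricing", "/blog", "/privacy"],
    "/blog": ["/blog/seo-audit", "/pricing", "/privacy"],
    "/blog/seo-audit": ["/pricing", "/privacy"],
}

# Count how many internal links each page receives.
inbound = Counter(target for targets in links_on_page.values() for target in targets)
boilerplate = {"/privacy"}  # footer/sidebar pages to ignore in the analysis

for page, count in inbound.most_common():
    if page not in boilerplate:
        print(page, count)
```

Pages at the bottom of this list that matter to your business are the ones to link to more.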

Once you know what your internal links currently look like, it’s time to get to work.

#13 Optimize Your Crawl Budget With Your Site Structure

Brian Dean of Backlinko recommends this site architecture for a reason.

When you structure a site this way, kind of like a very flat pyramid, it is an easy hierarchy to understand.

A search engine crawler doesn’t need the creative mind of a person to understand what’s important and what isn’t.

Remove internal links that make for a disorganized website, and work on grouping content through internal links to boost certain cornerstone content.

To improve your internal linking profile as you go along, make sure you link to relevant cornerstone pages as you add new content to your site.

There are many tools you can use. If you use WordPress, SEO by Yoast will suggest internal links to include in a post based on topic relevance.

But make sure to keep your site architecture in mind and don’t link too much across categories.

Once you’ve nailed every step of this internal link strategy, you will be rewarded with better rankings for your key pages.

#14 Improve Your Backlink Strategy

It’s not a secret that backlinks are still the backbone of any large scale SEO effort, even in 2020. Some experts even claim that as much as 75% of SEO is off-page, and backlinks are still one of the top ranking factors.

But the times have changed.

Google doesn’t care about straight volume anymore.

Context matters now.

Google cares whether the site that links to you is in the same category (good).

Google cares whether the site that links to you is authoritative (a popular site in the category).

It cares whether the link is in-content (good) or from a list/directory (not so good).

This means that buying links from link farms is no longer a one-and-done solution to your backlink troubles.

First, you should check what sites are already linking to you.

You can do this by returning to the Search Console link tab.

This time, focus on the external links.

If you find any high-quality, relevant domains, you can consider outreach to have them cover other products or cornerstone content in the future.

After finding the domains that linked to your website, it’s time to look elsewhere for inspiration: your competitors.

SpyFu makes finding their sources of backlinks as easy as pie. Simply head to the backlinks tool, type in their URL, and start checking.

SpyFu Backlink Builder

If you can find content that has a lot of backlinks, but is outdated or simply not that good, you have a potential goldmine on your hands.

One-up the original content, and run an outreach campaign to the sites that linked to the original piece.

Stay In Touch With Writers & Influencers in Your Space

This strategy is a bit of a slow burner but is perhaps the strategy that pays the most dividends per second of your time spent.

Chances are you already know of a few, but you can use a tool like Buzzsumo to easily search for more.

Write down the most important ones, or the ones you think you can get along with the most, and make a point of keeping in touch with them.

Let them know about updates as soon as (or before) they come out.

Inform them they have access to interview your team members if they want it.

Reach out to them when your/their favorite team wins the Super Bowl.

Go nuts.

Just make sure you are top of mind when they want to write something about a company in your space.

#15 Track Your Results

If you implement the changes blind and hope for results, you can’t call yourself an SEO professional.

The simple act of installing analytics – of starting to track the results of changes – is powerful. It can lead to 13% improvements in conversion rate all by itself, controlling for every other factor (no increase in budget or change of staff).

And SEO is no different.

If you didn’t track what happened after you implemented certain changes on your website, it would be like operating with a blindfold on.

You wouldn’t know what you should keep doing or stop doing. What you should do more of or less of.

Luckily SpyFu makes rank tracking easy.

Just open the tracking dashboard and you can see your historical ranks for any keyword.

SpyFu Rank Tracker

You can also easily save keywords while you are doing other things (like checking out your biggest winners).
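Whatever tool you use, the analysis you're after is simple: compare each keyword's current rank to where it started. A sketch with hypothetical rank history:

```python
# Hypothetical rank history: keyword -> ranks over successive weeks.
history = {
    "seo audit": [14, 11, 9],
    "rank tracker": [6, 6, 8],
}

for keyword, ranks in history.items():
    change = ranks[0] - ranks[-1]  # positive = moved up the SERP
    direction = "up" if change > 0 else "down" if change < 0 else "flat"
    print(f"{keyword}: {ranks[-1]} ({direction} {abs(change)})")
```

Reviewing these deltas after each change you ship tells you what to keep doing and what to stop.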

#16 Run Your Audits Regularly

It's wise to run your SEO audits on a regular basis for a number of reasons -- this will let you stay on top of your SEO efforts and catch errors as soon as (or soon after) they happen. Here's a quick list of tests that you should have on your weekly/monthly site audit.

1. Check indexed pages

  1. Do a “site:domain.com [keyword]” search. Follow that format by typing “site:” into Google’s search bar, followed by your domain (no space between the colon and domain), then a space and the search term you’re targeting, e.g. site:spyfu.com backlinks
  2. Review how many pages are returned. It’s notable but can be off, so don’t put too much stock in it.
  3. Do you expect your homepage to show up as the first result?
  4. If the homepage isn’t showing up as the first result, there could be issues, like a penalty or poor site architecture/internal linking, affecting the site. This may be less of a concern, as Google’s John Mueller recently said that your homepage doesn’t need to be listed first. Also, a homepage should be designed to navigate you elsewhere. If detailed info is on another page, give that other page more weight.

You can view a full list of Google Search Operators here: Google Search Operators

2. Review the number of organic landing pages in Google Analytics

  1. Does this match with the number of results in a site: search?
  2. This is often the best view of how many of the pages in a search engine’s index it actually finds valuable.

3. Search for the brand and branded terms

  1. Is the homepage showing up at the top, or are correct pages showing up?
  2. If the proper pages aren’t showing up as the first result, there could be issues – like a penalty – in play.

4. Check Google’s cache for key pages

  1. Is the content showing up?
  2. Are navigation links present?
  3. Are there links that aren’t visible on the site?

Tip: Don’t forget to check the text-only version of the cached page. Search for the page in Google, and use the drop-down option next to the result that lets you open the cached page. (Choose text-only when you click through.)

5. Do a mobile search for your brand and key landing pages

  1. Does your listing have the “mobile-friendly” label?
  2. Are your landing pages mobile friendly?
  3. If the answer is no to either of these, it may be costing you organic visits.

6. Make sure your title tags are optimized

  1. Title tags should be optimized and unique.
  2. Your brand name should be included in your title tag to improve click-through rates.
  3. Title tags should be about 55-60 characters (512 pixels) to be fully displayed. You can review title pixel widths in Screaming Frog.

7. Confirm that important pages have click-through rate optimized titles and meta descriptions

This will help improve your organic traffic independent of your rankings. Try SERP Turkey to help.

8. Check for pages missing page titles and meta descriptions

You can check this using Google Search Console: Search Appearance → HTML Improvements.

9. Ensure there is a significant amount of optimized, unique content on key pages

The on-page content should include the primary keyword phrase multiple times, as well as variations and alternate keyword phrases.

10. Optimize image file names

Image file names and alt text should include the primary keyword phrase associated with the page.
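Before optimizing alt text for keywords, it's worth catching images that have no alt text at all. A quick check with Python's standard-library HTML parser (the sample markup is illustrative):

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Record image sources that have no alt text."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "?"))

checker = AltChecker()
checker.feed('<img src="seo-audit-checklist.png" alt="SEO audit checklist">'
             '<img src="IMG_0042.jpg">')
print(checker.missing)  # ['IMG_0042.jpg']
```

Generic camera file names like the flagged one are also a sign the file name itself needs a descriptive rewrite.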

11. Make your URLs descriptive and optimized

While it is beneficial to include your keyword phrase in URLs, changing your URLs can negatively impact traffic when you do a 301. As such, I typically recommend optimizing URLs when the current ones are really bad or when you don’t have to change URLs with existing external links.

12. Aim for clean URLs

  1. No excessive parameters or session IDs.
  2. URLs exposed to search engines should be static.

13. Use Short URLs

Keep them 115 characters or shorter – this character limit isn’t set in stone, but shorter URLs are better for usability.
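Both of the URL checks above (clean and short) are easy to script. The length limit and session-ID parameter names below are heuristics, not hard rules:

```python
from urllib.parse import urlparse, parse_qs

def url_issues(url, max_length=115):
    """Flag long URLs, query parameters, and session IDs (heuristic sketch)."""
    issues = []
    parsed = urlparse(url)
    if len(url) > max_length:
        issues.append("too long")
    params = parse_qs(parsed.query)
    if params:
        issues.append("has parameters")
    if any(key.lower() in ("sid", "sessionid", "phpsessid") for key in params):
        issues.append("session id")
    return issues

print(url_issues("https://example.com/blog/seo-audit"))         # []
print(url_issues("https://example.com/p?id=7&PHPSESSID=ab12"))  # ['has parameters', 'session id']
```

Run it over your crawl's URL list and fix (or redirect) the worst offenders.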

Additional reading:

Best Practices for URLs

URL Rewriting Tool

mod_rewrite tips and reference

mod_rewrite Cheat Sheet

Creating 301 Redirects With .htaccess

14. Optimize homepage content

As a general rule, make sure the homepage has at least one paragraph. There has to be enough content on the page to give search engines an understanding of what a page is about. Based on my experience, I typically recommend at least 150 words.

15. Optimize your landing pages

These pages will be more specific and could have more content. Aim for at least a few paragraphs. This should be enough to give search engines an understanding of what the page is about.

Don’t just settle for template text used across your pages. Make it completely unique.

16. Be sure the site contains real and substantial content

There should be real content on the site (as opposed to a list of links).

17. Proper keyword targeting

The page needs to satisfy the search. Not “some general topic” but actual substance that delivers on what the reader is hoping to solve.

Get specific — create pages targeting head terms, mid-tail, and long-tail keywords.

18. Watch for keyword cannibalization

Do a site: search in Google for important keyword phrases. Finding “flight upgrades” on TripAdvisor would look like this:

site:tripadvisor.com flight upgrades

Check for duplicate content/page titles using the Moz Pro Crawl Test. (More on this in Part IV)

19. Make content to help users convert

It should be easily accessible to users and written for humans: in addition to search-engine-driven content, there should be content that educates users about the product or service.

20. Content formatting

  1. Is the content formatted well and easy to read quickly?
  2. Are H tags used?
  3. Are images used?
  4. Is the text broken down into easy-to-read paragraphs?

21. Write good headlines on blog posts

Good headlines capture readers, keep them on the page, and give you the opportunity to tie them to the targeted search phrase/keyword. The time-tested rule of a good headline is that it should make the reader want to read the first line of your content (while staying relevant).

22. Watch the amount of content vs. ads

Since the implementation of Panda, the amount of ad-space on a page has become a key point of consideration. There isn’t a magic ratio, but your ad space shouldn’t significantly compete with content. Aim for these guidelines:

  • Make sure there is significant unique content above the fold.
  • If you have more ads than unique content, you are probably going to have a problem.

23. There should be one URL for each piece of content

  1. Do URLs include parameters or tracking code? This will result in multiple URLs for a piece of content.
  2. Does the same content reside on completely different URLs? This is often due to products/content being replicated across different categories.

Tip: use Google Search Console to set up to 15 parameters for Google to ignore when indexing the site.

You will see Google list these as “Ignore” or “Don’t ignore”. This fights canonicalization issues when multiple URLs serve the same content, and it’s a good practice for protecting your overall rankings.

Read more at Search Engine Land.
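Alongside the Search Console parameter settings, a rel="canonical" tag tells search engines which URL variant should be indexed. A sketch with a hypothetical URL:

```html
<!-- In the <head> of every URL variant that serves this content,
     point at the one canonical version (example.com is hypothetical): -->
<link rel="canonical" href="https://www.example.com/widgets/blue-widget/" />
```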

24. Do a search to check for duplicate content

  1. Take a content snippet, put it in quotes and search for it.
  2. Does the content show up elsewhere on the domain?
  3. Has it been scraped? If so, you should file a content removal request with Google.

25. Check sub-domains for duplicate content

It’s tempting to duplicate content when you want to make sure that visitors find what they need. Watch for repeated copy from one sub-domain to another.

26. Check the robots.txt

Check whether the entire site or important content has been blocked. Also check whether link equity is being orphaned because pages are blocked via robots.txt.
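A single stray directive can block the whole site. A hypothetical robots.txt illustrating the difference between an accidental full-site block and a targeted one:

```text
# This blocks the ENTIRE site -- a common accidental leftover
# from a staging environment:
User-agent: *
Disallow: /

# What was probably intended -- block only private sections:
User-agent: *
Disallow: /admin/
Disallow: /cart/
```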

27. Turn off JavaScript, cookies, and CSS

Use the Web Developer Toolbar

Check to see if content is there. Do the navigation links work?

28. Now change your user agent to Googlebot

Use the User Agent Add-on

  • Are they cloaking?
  • Does it look the same as before?

Tip: use SEO Browser to do a quick spot check.
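If you'd rather script the check, you can send Googlebot's published user-agent string yourself and compare the response to a normal fetch. A sketch using Python's standard library; example.com is a placeholder, and the line that performs the actual fetch is commented out.

```python
import urllib.request

# Googlebot's published desktop user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def googlebot_request(url: str) -> urllib.request.Request:
    """Build a request that identifies itself as Googlebot."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = googlebot_request("https://www.example.com/")  # placeholder URL
# html = urllib.request.urlopen(req).read()  # compare against a normal fetch
print(req.get_header("User-agent"))
```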

29. Check for 4xx errors and 5xx errors

Use Screaming Frog or Sitebulb to check your website for 4xx and 5xx errors. You can also view your site's errors in Google Search Console.
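A crawl-light alternative is to request each URL yourself and classify the status code. A sketch using Python's standard library; the URL list is yours to supply, and the fetch loop is commented out so the snippet runs without network access.

```python
import urllib.error
import urllib.request

def fetch_status(url: str) -> int:
    """Return the HTTP status code for a URL. urllib raises HTTPError
    for 4xx/5xx, so catch it and read the code from the exception."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def classify(code: int) -> str:
    """Bucket a status code the way an audit report would."""
    if 400 <= code < 500:
        return "client error (4xx)"
    if 500 <= code < 600:
        return "server error (5xx)"
    return "ok"

# for url in ["https://www.example.com/", ...]:  # URLs from your crawl
#     print(url, classify(fetch_status(url)))
print(classify(404), classify(503), classify(200))
```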

30. XML sitemaps are listed in the robots.txt file
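The Sitemap directive takes an absolute URL and can appear anywhere in robots.txt (the domain here is hypothetical):

```text
# In robots.txt -- the Sitemap directive takes an absolute URL:
Sitemap: https://www.example.com/sitemap.xml
```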

31. XML sitemaps are submitted to Google/Bing Webmaster Tools

32. Check pages for meta robots noindex tag

Look for pages that are:

  • accidentally being tagged with the meta robots noindex command
  • missing the noindex command (when it’s needed)

Crawl tools to help: Moz or Screaming Frog
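The tag itself looks like this; "noindex, follow" keeps the page out of the index while still letting crawlers follow its links:

```html
<!-- In the <head> of a page you want crawled but NOT indexed
     (e.g. a thank-you/goal page): -->
<meta name="robots" content="noindex, follow" />
```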

33. Do goal pages have the noindex tag applied?

This is important to prevent direct organic visits from showing up as goals in analytics.

34. Check proper use of 301s

  1. Are 301s being used for all redirects?
  2. If the root is being redirected to a landing page, is it a 301 instead of a 302?
  3. Use Live HTTP Headers Firefox plugin to check 301s.

35. Check “bad” redirects

Poor redirect practice (for SEO, at least) includes 302s, 307s, meta refreshes, and JavaScript redirects. They pass little to no link equity.

Use Screaming Frog to identify them.

36. Point all redirects directly to the final URL

Do not use redirect chains. Each hop diminishes the amount of link equity that reaches the final URL, and Google stops following a chain after several redirects, at which point you lose the credit entirely.
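To spot chains, you can follow redirects one hop at a time and log each status code. A sketch using Python's standard library; the traced URL is hypothetical and the network call is left commented out.

```python
import urllib.error
import urllib.parse
import urllib.request

REDIRECT_CODES = (301, 302, 303, 307, 308)

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so each hop can be inspected."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def next_hop(code, location, base):
    """Return the next URL in a redirect chain, or None if the chain ends.
    The Location header may be relative, so resolve it against the base."""
    if code in REDIRECT_CODES and location:
        return urllib.parse.urljoin(base, location)
    return None

def trace_redirects(url, max_hops=10):
    """Follow a chain hop by hop, returning (status, url) for each step."""
    opener = urllib.request.build_opener(NoRedirect)
    hops = []
    while url and len(hops) < max_hops:
        try:
            resp = opener.open(url, timeout=10)
            code, headers = resp.status, resp.headers
        except urllib.error.HTTPError as e:  # 3xx/4xx/5xx land here
            code, headers = e.code, e.headers
        hops.append((code, url))
        url = next_hop(code, headers.get("Location"), url)
    return hops

# Example (hypothetical URL): trace_redirects("http://example.com/old-page")
print(next_hop(301, "/new-page", "https://example.com/old-page"))
```

A healthy redirect shows exactly one 3xx hop followed by a 200; anything longer is a chain worth flattening.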

Our SEO audit is complete!

Successfully conducting an SEO audit in 2020 is a lot like running a race.

There are a ton of factors you can choose to get into. With running, some people get caught up in the wind resistance of their fabric, the cushioning of their shoes, and the brands and inspirational playlists that “help you push further.”

Likewise, some SEOs get caught up in keyword density, perfectly stuffed meta and alt tags, word counts, and everything else that sounds good.

They lose sight of the bigger picture.

Of nailing the basics. Tracking the results, learning from them, and improving gradually. Of putting one foot in front of the other.

If you start by nailing your strategy and eradicating all major technical SEO errors, if you glean insights from your analytics and your competitors, and if you implement improvements based on that, you will be light-years ahead of someone who is stuck worrying about every “new” ranking factor.

You will get exponential results while they are fighting for incremental improvements.