The 10-Step SEO Audit for Organic Search Supremacy
Let’s face it: there is a lot of SEO audit methodology out there. Whether you’re a small business trying to optimize your site for organic search, or an agency doing the same for a client, it can be difficult to know where to begin, how in-depth your analysis should go, and which SEO tools will help you glean the most useful information.
To cut out some of the noise, we’ve nailed down 10 core elements to a successful SEO audit. Whether you’re a seasoned SEO, or a small business owner puttering around on your self-made Squarespace website, successfully running through these 10 steps will set your website on the path to organic search supremacy.
Let’s dive in.
A Note on SEO Software and SEO Tools
Any SEO software worth its salt—Ahrefs, Moz Pro, SEMrush—will have a site audit or site crawl tool that allows you to crawl your website and determine not just general search health, but common inefficiencies that are dragging down your site’s organic performance. They’re also replete with other tools—keyword research tools, backlink profiling tools, etc.—that, in addition to Google Search Console (formerly Webmaster Tools), are integral to performing the 10 steps below. I’d recommend procuring at least a free trial of one of these tools before attempting a complete SEO audit of your site.
A Note on Google Search Console
To do an SEO audit on your own site, you’re going to need to add it as a property in Google Search Console and Google Analytics, then verify that you own the site using one of Google’s verification methods (an HTML file upload, a DNS record, or your Google Analytics tracking code, for example).
If you’re auditing a site you don’t own, and you’d like access to Analytics and Search Console to perform that audit, ask permission from the existing site owner, who can manually add you as a user.
Now on to the audit.
Step 1: Identify Internal & External Link Building Opportunities
Building links is a vital part of accruing website authority, and no SEO audit is complete without recommendations for building site-specific internal and external links.
Building Internal Links
Internal links pass link equity within your own pages, and as such, are vital to forming authoritative hierarchies within your site. A straightforward and time-honored way to build internal links: when you create a new piece of content, do a site search for older, related content which, ideally, has built up some equity; then find anchor text within the old content to link to the new content. So, if you’ve created a new resource on Facebook ads, run a site search like site:yoursite.com "facebook ads":
These are the pages you’ll want to link from.
A Chrome extension like MozBar will give you this view. You’ll want to link from the page with the greatest page authority (PA). Page Authority and Domain Authority (DA) are not “official” metrics. Google doesn’t use them to rank pages. They were created by SEO software companies to provide estimates of a page’s or a domain’s authority. That said, they’re still pretty accurate, and they’re useful when determining which sites or pages to link from.
You also want to keep in mind user experience and information architecture (IA) when creating internal links. Linking from credible pages isn’t everything. Where would be a genuinely useful place to send a site visitor from a given page? What would bring them closer to purchasing your product or service? We’ll discuss this more in Step 2, but these are good questions to ask yourself when building internal links.
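If you’d rather script this than click through search results, here’s a minimal Python sketch of the idea: fetch a handful of older pages, check whether each mentions your target phrase, and flag any that don’t already link to the new resource. The URLs and phrase below are placeholders, not real pages.

```python
# Minimal sketch: find older pages that mention a target phrase but don't yet
# link to a new resource. The URLs and phrase are placeholders -- swap in your own.
import requests
from bs4 import BeautifulSoup

NEW_URL = "https://www.example.com/facebook-ads-guide"   # the new resource
PHRASE = "facebook ads"                                   # anchor-text candidate
OLD_PAGES = [
    "https://www.example.com/blog/social-media-strategy",
    "https://www.example.com/blog/ppc-basics",
]

for url in OLD_PAGES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    text = soup.get_text(" ").lower()
    already_linked = any(NEW_URL in (a.get("href") or "") for a in soup.find_all("a"))
    if PHRASE in text and not already_linked:
        print(f"Internal link opportunity: {url}")
```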
Building External Links
Earning links from a diverse set of authoritative domains is the essence of increasing Domain Authority. One straightforward way to build external links is to search for resource lists that could realistically feature your content and link to you. So, if you’re running an SEO audit for a private school in Massachusetts, and you want to make recommendations for external link building:
“Best prep schools in ma” will yield some nice resource lists that may or may not already feature your client, and that represent good opportunities to reach out for external links.
An easier way to build external links—and a way that yields higher return—is to search for unlinked mentions. Any good SEO tool will have a content explorer tool that allows you to search for places on the web where your brand has been mentioned:
After sorting by Domain Authority and assuring your target has not already linked to you, you can reach out to content managers via Twitter, email, condor, what have you; ask for links back to your homepage; and offer to share their article on your social accounts in return.
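Here’s a minimal sketch of that unlinked-mention check, assuming you’ve already exported a list of mention URLs from your SEO tool of choice. The domain and URLs below are placeholders.

```python
# Minimal sketch: given pages that mention your brand (e.g. exported from a
# content explorer), flag the ones that don't already link to your domain.
import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "example.com"
MENTION_URLS = [
    "https://someblog.com/best-seo-tools",
    "https://anothersite.org/marketing-roundup",
]

for url in MENTION_URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    links_to_me = any(MY_DOMAIN in (a.get("href") or "") for a in soup.find_all("a"))
    if not links_to_me:
        print(f"Unlinked mention -- worth an outreach email: {url}")
```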
Step 2: Identify Potential Information Architecture Improvements
Information architecture, or IA, is a pretty sweeping term that basically just means, “The way information is ordered/structured.”
Standard website IA.
For the purposes of an SEO audit, that means redistributing internal linking structures on your website to pass equity to the pages that need it. It also means working closely with developers and designers to develop user-friendly solutions that will improve page authority without compromising UX (User Experience).
Perhaps your blog index only lists ten pages at a time, which pushes older posts some 20-30 clicks from your homepage (where the most equity lies). Increasing that number of posts-per-page will bring those older posts closer to the homepage.
Or, perhaps the “related posts” and “popular posts” sections of your blog contain overlapping links; or your header and footer contain overlapping links. Replacing those duplicate links with links to other pages to which you want to pass equity will allow you to fully maximize that coveted link space.
Unless you’re auditing your own site, IA-based recommendations should hinge on the goals of project stakeholders. To which pages/parts of the site are they trying to push users? Is the goal to make the site intuitive or immersive? Your target audience and business goals will inform the way you define and organize content.
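If you want a rough, do-it-yourself read on click depth (the 20-30-clicks-from-the-homepage problem mentioned above), a small breadth-first crawl from the homepage will do. This is a simplified sketch with a placeholder start URL and a page cap; the crawler in your SEO software will be far more thorough.

```python
# Minimal sketch: measure click depth from the homepage with a breadth-first
# crawl, so you can spot pages buried many clicks deep. Stays on one domain.
from collections import deque
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"   # placeholder homepage
MAX_PAGES = 200                       # keep the crawl small

depth = {START: 0}
queue = deque([START])
while queue and len(depth) < MAX_PAGES:
    url = queue.popleft()
    try:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    except requests.RequestException:
        continue
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == urlparse(START).netloc and link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

# Print the 20 deepest pages found
for url, d in sorted(depth.items(), key=lambda kv: -kv[1])[:20]:
    print(d, url)
```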
Step 3: Identify Thin Content
Speaking of content…your pages aren’t going to get any respect in the SERP (Search Engine Results Page) if they’re thin. Panda, a 2011 Google algorithm update that cracked down on sites with thin content, made sure of that.
“Thin” content is content that fails to meet user needs. A 300-word blog post explaining a complex concept would be considered thin. That said, it’s not realistic to put 1000-2000 words on every page of your site. Pages closer to your homepage are going to be dominated by design work—hero images, icons, etc.—call-to-action buttons, and product-centric copy. So, what content should you be looking at?
- Copy deck. Your client has given you a deck of all the pages they’d like audited. You can go through each page individually and make page-level recommendations with respect to the other elements present on that page and how it fits into the overall site.
- Top pages. Export your top 25, 50, 100 pages by traffic—depending on site size—and make sure each is sufficiently beefed-up.
- All of it. Most site audit tools in the SEO software mentioned above will, after crawling your site, offer a broad report on content length and quality:
You can then export pages with little to no content and offer page-level recommendations on how to fix each one; or, simply offer them in aggregate as pages that should be looked at.
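For a quick, scriptable version of that content-length report, here’s a minimal sketch that flags pages falling under a word-count threshold. The URL list and the 300-word cutoff are assumptions; always weigh word count against what the page is actually trying to do.

```python
# Minimal sketch: flag pages whose visible body copy falls under a word-count
# threshold. "Thin" should be judged against user intent, not the number alone.
import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://www.example.com/blog/post-1",
    "https://www.example.com/blog/post-2",
]
MIN_WORDS = 300  # assumed cutoff

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Strip non-content elements before counting words
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()
    words = len(soup.get_text(" ").split())
    if words < MIN_WORDS:
        print(f"Possibly thin ({words} words): {url}")
```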
The two big benefits of beefing up on-page content: 1. More opportunities to link internally. 2. More opportunities to squeeze in target and ancillary keywords that will help that page get found (see Step 5).
Step 4: Identify Duplicate Content
Good news: when scanning for duplicate content, you’re going to be looking at the same subsets of your site that you looked at when scanning for thin content. Some SEOs get riled up about duplicate content and the penalties sites could potentially incur as a result. Here’s my take on it:
Google is smart enough to know whether or not you’re intentionally and maliciously duplicating content on your site to clog the SERP with your site’s URLs. In all likelihood, you’re not. A more likely scenario, if you have duplicate content, is that it’s happening unintentionally. Perhaps your CMS (Content Management System) is dynamically generating new pages that are near-identical in appearance and that haven’t been manually canonicalized (with a rel=canonical tag, for instance). WordPress does this with archive pages.
If you’re concerned about incurring penalties from unintentional duplicate content, this post from Google should set you at ease.
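If you want a rough sense of how similar two pages are before reaching for a crawl tool, here’s a minimal sketch that compares overlapping five-word “shingles” between pages with a Jaccard score. The URLs and the 0.8 threshold are placeholders, and this is a crude stand-in for what dedicated audit tools do more rigorously.

```python
# Minimal sketch: rough near-duplicate detection by comparing overlapping
# 5-word "shingles" between two pages. URLs and threshold are placeholders.
import requests
from bs4 import BeautifulSoup

def shingles(text, n=5):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def page_text(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return soup.get_text(" ")

a = shingles(page_text("https://www.example.com/page-a"))
b = shingles(page_text("https://www.example.com/page-b"))
jaccard = len(a & b) / len(a | b) if a | b else 0.0
if jaccard > 0.8:
    print(f"Likely duplicates (similarity {jaccard:.2f})")
```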
Step 5: Scan for Keyword Optimization
As with beefing up thin content, there are certain pages on your site where it’s not going to be feasible to fully optimize for target and ancillary keywords. Say you are an employee scheduling software company. You’ve squeezed “employee scheduling software” onto your homepage in multiple places. The highest volume, lowest competition keyword related to “employee scheduling software” is “best employee scheduling software.” While you’d like to rank for that keyword, it’s probably not a prudent move to say “we’re the best” on your homepage. That’s a keyword best reserved for a blog post.
Still—whenever possible, you should make sure all your pages are as optimized as possible for keywords that will help them show up in organic search. The first step in doing that is conducting keyword research.
How to Do Keyword Research
Using one of the above SEO products—or a free keyword tool—search for keywords related to your topic that have high intent, high volume (number of monthly search queries), and low difficulty (level of competition in the SERP for that keyword):
“How to do an seo audit,” for example, would be considered an ancillary keyword that I might want to fit into this post—in either an H2 or in body copy. Or, perhaps I know my post is going to have a section on conducting keyword research:
“How to Do Keyword Research” is a better-optimized H3 than “How to Conduct Keyword Research,” because it has far more volume. As you might imagine, doing vertical-specific keyword research is a great way to generate blog topics.
If possible, target keywords should be in the URL, the title, an H1 (the headline of your post), an H2 (a subhead within your post), the meta title (see Step 6), the meta description (ditto), and littered (in moderation) throughout the body copy. Ancillary keywords should, if possible, be used in H2s and in body copy. Naturally, if you’re creating a new piece of content, it’ll be easier to make sure you fully optimize all these elements. If you’re updating old pages, you just have to do the best you can. It wouldn’t necessarily make sense, for instance, to change a URL simply to optimize for a new keyword.
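To spot-check a single page against that checklist, here’s a minimal sketch that looks for the target keyword in the URL, title tag, H1, H2s, and meta description. The URL and keyword are placeholders.

```python
# Minimal sketch: check whether a target keyword appears in the on-page
# locations Step 5 recommends. URL and keyword are placeholders.
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/blog/seo-audit"
KEYWORD = "seo audit"

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")
meta = soup.find("meta", attrs={"name": "description"})
checks = {
    "url": KEYWORD.replace(" ", "-") in URL.lower(),
    "title": KEYWORD in (soup.title.string or "").lower() if soup.title else False,
    "h1": any(KEYWORD in h.get_text().lower() for h in soup.find_all("h1")),
    "h2": any(KEYWORD in h.get_text().lower() for h in soup.find_all("h2")),
    "meta description": KEYWORD in (meta.get("content", "").lower() if meta else ""),
}
for place, found in checks.items():
    print(f"{place}: {'OK' if found else 'missing'}")
```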
Step 6: Make Sure Meta Tags Are Optimized
Meta tags consist of a meta title and a meta description. They help Google determine the content of the page it’s crawling, and are two of the bigger factors Google takes into account when determining the order in which to rank pages. They also help users looking through search results determine the content of your page, and as such, act as promotions for your content.
Using your vertical-specific keyword research, write or rewrite your meta tags, making sure to optimize them for the keywords that will help them show up in search.
With meta tags especially—though this is also important for general site copy—you want to avoid squeezing in keywords for the sake of squeezing in keywords, also known as keyword stuffing. “10 Great Instagram Captions, Good Instagram Captions, and Funny Instagram Captions That Will Make You ROFL” is not a good title.
Here are some meta tag best practices:
- Titles. Google displays the first 50-60 characters of your title; after that, it truncates the title with an ellipsis. Not only can this cut off vital keywords, it just flat out looks bad in the SERP. Keep your titles under 60 characters, and roughly 90% of them will avoid truncation. This useful, free tool from Moz helps with title creation. Your target keyword should appear in its entirety in your title, and keeping the brand name in the title allows you to piggyback off its authority. As such, a best-practice title tag looks like this: Target Keyword | Brand Name. So: “How to Choose the Right Running Shoes | Nike.”
- Descriptions. Optimal description length has oscillated in recent years, but today it stands at 155-160 characters. In terms of content, here’s a formula to live by: target keyword + ancillary keywords (if natural) + descriptors + call-to-action = in the money. So: “SEO Audits are hard. WordStream makes them easy. We’ve nailed down 10 core steps to any successful SEO audit. Come check out our quick, simple 10-step SEO audit.” (A quick length-check sketch follows this list.)
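Here’s that length-check sketch: it pulls each page’s title and meta description and flags anything likely to be truncated. The URL list is a placeholder, and the 60/160-character cutoffs follow the guidelines in this step.

```python
# Minimal sketch: flag titles and meta descriptions likely to be truncated in
# the SERP. URLs are placeholders; cutoffs follow the guidelines above.
import requests
from bs4 import BeautifulSoup

PAGES = ["https://www.example.com/", "https://www.example.com/pricing"]

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    desc = meta.get("content", "").strip() if meta else ""
    if len(title) > 60:
        print(f"Title may truncate ({len(title)} chars): {url}")
    if not desc or len(desc) > 160:
        print(f"Description missing or long ({len(desc)} chars): {url}")
```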
Step 7: Identify Page Update Opportunities
Making even small updates to a page signals to Google that it should crawl that page. As such, regular updates will help keep your pages fresh and relevant in the eyes of the search engine.
There are two types of content that should be updated regularly. The first is a top page, which I discussed in Step 3. These are the pages that drive the most traffic for your business; updating them helps ensure they’ll continue to do so.
The second is an “opportunity page.” An opportunity page is one that, if it were to move up slightly in the SERP, would see a meaningful increase in traffic. Popular SEO products are equipped with rank trackers that allow you to see where pages rank for certain keywords, whether those rankings have changed, and how much traffic each keyword is generating.
Pages just outside the top 10.
Pages that rank just outside the top 10 should be updated religiously to give them a chance to move up to page one. A similar approach should be taken with pages that rank just outside the top 3. Traffic falls off precipitously after the first 3 spots, and nearly altogether after the first page.
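If your rank tracker lets you export keyword positions, a few lines of Python can surface opportunity pages for you. This sketch assumes a hypothetical CSV with url, keyword, position, and volume columns; adjust the column names to whatever your tool actually exports.

```python
# Minimal sketch: read a rank-tracker export and surface "opportunity pages"
# sitting just outside the top 3 or top 10. Filename and columns are assumptions.
import csv

with open("rank_tracker_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    position = int(row["position"])
    if 4 <= position <= 5 or 11 <= position <= 15:
        print(f"Opportunity: {row['url']} ranks #{position} "
              f"for '{row['keyword']}' ({row['volume']} searches/mo)")
```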
Process for Conducting Page Updates
One way to update pages is by adding new information and research. For example, if you have a blog post about Google algorithm updates, and a new algorithm update just came out, that post is ripe for an update. Keeping your content up to date will ensure that your organic visitors’ needs are being met, so they don’t bounce back to the SERP and click another result (which can ultimately harm your rankings).
A second way to update content is to identify related, ancillary keywords that have proven to be traffic drivers and add those keywords to your page.
A third way to update pages is to do general house cleaning: compress images and fix or eliminate broken links. Cumbersome images result in slow load times (more in Step 8). Broken links result in a poor user experience (more in Step 9). Remedying these on-page elements can give your page the boost it needs to achieve the desired ranking.
A fourth type of content update is actually removing content from your site. It may seem counterintuitive, but if there are pages on your site that get little to no organic traffic, they may be hurting your overall organic rankings by lowering the average value of your site in Google’s eyes. Look for pages with 0 or close to 0 organic visits in the past year, and if you can’t overhaul the pages right away to add more value, deindex them.
Step 8: Run Page Speed Analytics
Page speed is vital—especially since Google rolled out its Speed Update in July 2018. This is partly due to the rise of mobile search: anyone using a phone to look for something on the internet isn’t going to wait around for a slow-loading site. That means the faster your site is, the more likely Google is to reward you with strong organic rankings.
PageSpeed Insights is going to be your go-to tool here. It gives you granular looks at page speeds and offers suggestions for improvement. You can analyze page speed at both a site-wide level:
And at a page level:
As you can see, you can also analyze both mobile and desktop versions of your site.
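PageSpeed Insights also has a public API, which is handy if you want to pull scores for many pages at once rather than pasting URLs into the web tool. Here’s a minimal sketch that fetches the mobile Lighthouse performance score for one placeholder URL; for heavier use you’d want to add an API key.

```python
# Minimal sketch: query the PageSpeed Insights v5 API and print the Lighthouse
# performance score. The URL is a placeholder; an API key is optional for light use.
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(PSI, params=params, timeout=60).json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```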
Suggestions for improved page speed range from backend alterations, like eliminating render-blocking JavaScript and CSS in above-the-fold content, to simple image compression. If you’re looking for quick and effective page speed improvement, image compression is your best bet. Run each image listed in the report through a simple image compressor like this one, then re-upload them to your CMS. Depending on the before-and-after sizes of your images, you stand to gain a substantial increase in performance.
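If you’d rather compress images in bulk than one at a time, a short script can handle it before you re-upload. This is a minimal sketch using the Pillow library, with placeholder paths and an assumed quality setting; eyeball the output to make sure quality is still acceptable.

```python
# Minimal sketch: batch-compress JPEGs in a folder before re-uploading to the CMS.
# Paths and the quality setting are placeholders; check the results visually.
from pathlib import Path
from PIL import Image

SRC = Path("images")              # folder of original images (assumed)
DST = Path("images_compressed")
DST.mkdir(exist_ok=True)

for path in SRC.glob("*.jpg"):
    img = Image.open(path)
    img.save(DST / path.name, optimize=True, quality=80)
    print(f"{path.name}: {path.stat().st_size} -> {(DST / path.name).stat().st_size} bytes")
```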
Here are some other common things that could be slowing down your site.
Step 9: Scan for Site Errors
A 404 is a “page not found” error, generally caused by broken links and images within your site. For example, if a page on your site links to a piece of content that has since been deleted, anyone who clicks that link will get a 404 error.
Contrary to popular opinion, broken links don’t result in site penalties. As content cycles in and out, and your site’s structure changes over time, 404s occur naturally. That said, broken links in inopportune places can fracture your internal linking structure. They can also be a pain in the ass for users trying to navigate from one page to another.
Site audit tools can identify all the 404s within your site. Once you’ve identified them, fixing them is a matter of determining how important each link is to your linking structure and user experience. Have a page which no longer exists, or which now exists at a new URL? Make sure all high-traffic pages which formerly linked to that page now 301 redirect to the new page, or at least redirect back to your homepage.
This Chrome extension is useful for identifying broken links on a page-by-page basis.
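If you want to check a single page by script instead, here’s a minimal sketch that requests every outbound link on a page and reports anything returning a 4xx or 5xx. The page URL is a placeholder; your site audit tool will do this site-wide.

```python
# Minimal sketch: report outbound links on one page that return 4xx/5xx or
# fail to respond. Some servers reject HEAD requests, so treat results as a hint.
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

PAGE = "https://www.example.com/blog/some-post"  # placeholder

soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
for a in soup.find_all("a", href=True):
    link = urljoin(PAGE, a["href"])
    if not link.startswith("http"):
        continue
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"Broken or unreachable ({status}): {link}")
```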
Step 10: HTTP to HTTPS: Make the Switch
In today’s digital marketing landscape, running on HTTP is a decided no-no. HTTPS is faster, more secure, and is one of Google’s ranking signals.
Checking to make sure your site is running on HTTPS is as simple as manually entering the various non-HTTPS iterations of your site domain—www.site.com; site.com; http://www.site.com—and making sure they’ve all been 301 redirected to the HTTPS iteration.
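That manual check is easy to script, too. Here’s a minimal sketch that requests each non-HTTPS variant of a placeholder domain and confirms the first hop is a 301 to the canonical HTTPS version.

```python
# Minimal sketch: confirm each domain variant 301-redirects to the canonical
# HTTPS URL. Domain and variants are placeholders.
import requests

VARIANTS = [
    "http://www.example.com",
    "http://example.com",
    "https://example.com",        # non-www, assuming www is canonical
]
CANONICAL = "https://www.example.com/"

for url in VARIANTS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    first_hop = resp.history[0].status_code if resp.history else None
    ok = resp.url.rstrip("/") == CANONICAL.rstrip("/") and first_hop == 301
    print(f"{url} -> {resp.url} (first hop: {first_hop}) {'OK' if ok else 'CHECK'}")
```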
You’ll then want to scour the search index for places where non-HTTPS URLs appear. You can use the Index Status report in Search Console to see which version of your site’s URLs are canonicalized. If you have to, manually canonicalize the HTTPS versions.
For a super in-depth guide on making the switch from HTTP to HTTPS, read this.
SEO Audit: Not an Exact Science
There’s a reason there’s no one prescribed recipe for a successful SEO audit. It’s the same reason SEO “best practices” are so often inconsistent. Google is a mystical beast. It changes its algorithm almost daily, and it rarely tells us why or how it did so. A one-size-fits-all SEO audit is unrealistic.
That said: rest assured, the above 10 steps are time-tested, core elements to a successful SEO audit. Remedy the flaws they reveal, and you’ll be well on your way to organic search supremacy.