Thursday, 11 August 2016

10 Quick Fixes For Lasting SEO Benefits

Got an hour to spare?

By: Masha Maksimava




I know, I know, search engine optimization isn't "quick" or "simple" by definition. Many things in SEO take plenty of time and effort — think taking your site mobile, building links, or moving to HTTPS. But then… there are some SEO tweaks that really are both quick and effective. In this article, I've put together my top 10 list of the most worthwhile SEO fixes you can do in under an hour.

These simple hacks won't take you to the top of Google if you neglect the more time-consuming (but necessary) components of the search engine optimization process. But combined with your main efforts, they are sure to make a difference in your search engine visibility and traffic, and give you that ranking boost you've been looking for.

Technical SEO


  • Take care of your sitemap.


You surely know how crucial a sitemap is to your site. It tells search engines about your site structure and all the important pages that should be crawled, indexed, and (ideally) ranked in search results. A clean, frequently updated sitemap also has less obvious benefits. By adding fresh content to it immediately, you let search engines index your new pages more quickly, so if that content gets duplicated externally later on, search engines will know that yours was the original piece.
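
If you've never looked inside a sitemap file, it's just an XML document listing your URLs. A minimal sketch (the URL and dates below are hypothetical) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want crawled -->
  <url>
    <loc>http://www.example.com/blog/new-post/</loc>
    <lastmod>2016-08-11</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

The <lastmod> date is what helps search engines pick up your fresh content quickly.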

If your site is fairly big and frequently updated, chances are you use a plugin or tool to create and update your sitemap (like SEO PowerSuite's WebSite Auditor). But if you don't, my number one piece of advice is to go and review your sitemap right now to make sure all the important pages of your site can be found there, including the fresher ones.

Once you're positive your sitemap contains all the right pages, go on and check it for errors in Google Search Console. To do that, log in to your account, and go to Crawl > Sitemaps. At the top right you'll find an Add/Test Sitemaps button. Click on it, plug in the URL of your sitemap, and click Test:


When the test is complete, you can view the results by clicking on View test result. This will reveal if Google encountered any errors when parsing your sitemap (if that's the case, you'll see a list of problems to be fixed). If the test is successful, congrats! Your sitemap is clean and ready for upload.

Lastly, if for whatever bizarre reason you haven't done it yet, go on and submit your sitemap to Google. All you need to do is hit the blue Submit button.

If you're also looking to submit your sitemap to Bing, it's a similar process: log in to Bing Webmaster Tools, and under the Sitemaps widget, click Submit a Sitemap. This will reveal a URL field in which you can enter the location of your sitemap file.

Quick, eh?
  • Check your robots.txt.


What can go wrong with a robots.txt file, anyway? Much more than you'd think. Examples include webmasters carelessly creating a disallow rule for "/" (that is, all of the site's pages), blocking some of the older parts of the site for fear of content duplication (instead of using a redirect or canonical), and even SEO companies editing a client's robots.txt file to disallow indexing of the entire website after the site's owner stopped paying them.
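
For reference, here's what that disastrous disallow-all rule looks like. If you ever spot these two lines in your robots.txt and didn't put them there on purpose, remove them right away:

# Blocks ALL crawlers from ALL pages of the site
User-agent: *
Disallow: /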

The less dramatic (but much more frequent) scenario is when the site's JavaScript and CSS get blocked from search engine bots. If that happens, search engines won't be able to access and load your CSS; for responsively designed sites, that means they won't be able to figure out that the website is mobile-friendly, and will hence rank it lower (if at all) in mobile search.

To check if you are blocking any of your important pages and resources, you might want to review your robots instructions. The quickest way to do that is in SEO PowerSuite's WebSite Auditor. First, download the free version of the toolkit and install it. Fire up WebSite Auditor and create a project for your site — the initial analysis will take under a minute. Next, go to Site Structure > All Resources and go through your HTML, CSS, and JavaScript. Sort the records by the Robots instructions column and see if any of the important resources are getting blocked.


If you find out some of your important HTML pages are being blocked, you'll need to go and add an allow rule for those (it will override the disallow rule). If you discover some of your CSS and JavaScript are disallowed, the simplest solution to let Google crawl them and protect your site from similar issues in the future (recommended by Google's own Gary Illyes) is just 3 lines in your robots.txt file:

User-Agent: Googlebot
Allow: .js
Allow: .css


You can add these rules by hand if you prefer to edit the file manually, or use WebSite Auditor's robots.txt editor to do the same.
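
If it's individual pages rather than resource types that are blocked, the override works the same way. A hypothetical example, assuming an /archive/ section you've disallowed but one page in it you still want crawled:

User-agent: *
# Keep the old archive out of the crawl...
Disallow: /archive/
# ...but the more specific allow rule overrides it for this one page
Allow: /archive/evergreen-guide.html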



  • Audit your site for errors.


A site audit is one of the best quick SEO checks you can do. Don't get me wrong — many things in technical SEO take a while to fix and perfect, and there's no limit to how technically perfect a site can get. Still, some on-site factors are more important than others to the search engines. Validation errors are not a great thing to have on your pages, but, well… everyone has them. To let you spot the most important issues quickly, WebSite Auditor marks every factor in your site audit with a red Error sign (those are the critical SEO problems you can't ignore), yellow Warning sign (worth your attention once you've fixed the errors), blue Info sign (not all of those will need fixing, but it's important that you are aware of them), or green Correct sign.


The first thing you should do is look for the red errors — these are the gravest SEO issues you'll want to fix ASAP. To start the check, fire up WebSite Auditor, and enter the URL of your site to create a project (for an existing project, click the Rebuild Project button). In under a minute, your site audit will be complete. Now, look for the red error signs in your audit. These will reveal the issues that call for your immediate attention.


See? I just found a bunch of 4XX pages that are linked to internally, so they are definitely confusing visitors and letting lots of crawl budget go to waste. These are pages that were knowingly removed from the site, so all I need to do is fix those internal links, making sure they point to the up-to-date versions of the pages — a couple of minutes of work.
  • Test and improve page speed.


Page speed is one of the few factors Google has officially confirmed it uses to rank pages. Beyond SEO, page speed has a massive impact on user experience and conversions. According to Kissmetrics, as little as a 1-second delay in page loading time can result in up to a 7% reduction in conversions.

The good news is, page speed is easy to check and optimize. To do this, you'll also need to use WebSite Auditor. This time, open your project and go to Content Analysis. Click Add page, specify the URL you'd like to test, and enter your target keywords. In a moment, your page will be analyzed in terms of on-page optimization as well as technical SEO. Scroll to the Page Speed (Desktop) section of on-page factors to see if any problems have been found.


The factors in this section will show you exactly what (if anything) makes your page too heavy and slow. Often, compressing your images and resources can help speed up the page immensely. If you do have some uncompressed images on your page, click on this factor and switch to Recommendation — you'll find a direct download link to your images, compressed and optimized for faster loading.
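
Besides images, text resources like HTML, CSS, and JavaScript can usually be gzip-compressed on the server side. Here's a minimal sketch for an Apache server with mod_deflate enabled (if you're on Nginx or IIS, the equivalent setting lives elsewhere):

# In .htaccess: gzip-compress text-based resources before sending them
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
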
  • Ask search engines to re-crawl your site.


Whatever on-site changes you've made, chances are it'll take a while until Google notices them. But no, you don't have to wait — you can explicitly ask Google to re-crawl your pages to make sure the changes are taken into account immediately.

All you need to do is log in to Google Search Console and go to Crawl > Fetch as Google. Enter the URL of the page you want to be re-crawled (or leave the field blank if you'd like Google to crawl the homepage) and click Fetch.


Note that your fetch must have a complete, partial, or redirected status for you to be able to submit the page to Google's index (otherwise, you'll see a list of problems Google found on your site and will need to fix those and use the Fetch as Google tool again). If Googlebot can successfully fetch your page, just click the Submit to index button to encourage Google to re-crawl it.


You can submit either the exact URL to be re-crawled (up to 500 URLs per week), or the URL and all pages linked from it (up to 10 per month). If you choose the latter, Google will use this URL as a starting point in indexing your site content and will follow internal links to crawl the rest of the pages. Google doesn't guarantee to index all of your site's pages, but if the site is fairly small, it most probably will.


There's a similar option in Bing Webmaster Tools, too. Just locate the Configure My Site section in your dashboard and click on Submit URLs. Fill in the URL you need re-indexed, and Bing will typically crawl it within minutes.
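
If what you've changed is the sitemap itself, there's an even quicker trick: you can ping Google with your sitemap's location by opening a URL like this in your browser (substitute your own sitemap address for the hypothetical one below):

http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml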

Content


  • Fix duplicate content.


Duplicate content is one of those SEO issues that can get your site penalized severely — but one that's surprisingly easy to spot and fix.

To run a quick duplicate content check, open your WebSite Auditor project and examine the On-page section of SEO factors in your site audit. Look out for duplicate titles and meta descriptions. If any are found, click on the problematic factor to see where the duplication occurs.


Now, you'll need to take a look at the problem pages to see whether the duplication only occurs in the titles and/or descriptions, or if all content on the pages is duplicated. If it's the former, all you'll need to do is rewrite your titles. You can do this right in WebSite Auditor, by going to Content Analysis > Content Editor and switching to the Title & Meta tags tab.

If the page's content is duplicated too, you'll need a different approach. Typically, you'll want to set up a 301 redirect or use rel=canonical to tell Google which page is the more important one.
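
As a quick reference, the canonical tag goes into the <head> of the duplicate page and points to the version you want ranked, while a 301 redirect on an Apache server takes a single line in .htaccess. Both examples below use hypothetical URLs:

<!-- On the duplicate page, pointing to the preferred version -->
<link rel="canonical" href="http://www.example.com/original-page/"/>

# In .htaccess (Apache): permanently redirect the duplicate to the original
Redirect 301 /duplicate-page/ http://www.example.com/original-page/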
  • Optimize titles and descriptions for clicks.


This tactic can be a very effective CTR booster — and it only takes a couple of minutes and just a few small changes. This recent case study by Search Engine Watch is a great story of how a few title and meta description tweaks improved an entire site's click-through rate by 20%.

There are plenty of ways to optimize your Google snippet for more clicks. You may wish to mention your low price, the speed at which you deliver, or the values that set your brand apart from others (like locally sourced produce or being a non-profit).

Take a look at your competitors' search engine listings and think of something that will make yours stand out. But of course, don't forget to use your keywords or their variations too — think about both search engine bots and human searchers.
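
To make this concrete, here's a hypothetical snippet for a local grocery delivery page, with a unique selling point and the target keyword worked into both tags:

<title>Grocery Delivery in Boston - Fresh Local Produce in 2 Hours</title>
<meta name="description" content="Order groceries online and get fresh, locally sourced produce delivered to your door in under 2 hours. Free delivery on orders over $30.">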

To rewrite your titles and do some experimentation, use WebSite Auditor's Content Editor submodule (under Content Analysis). Switch to Title & Meta tags, and start composing your title and description. You will be able to preview your Google snippet right away, as you type.

Once you're happy with your snippet, hit Save page to save the upload-ready HTML file to your hard drive.



  • Get inspired by competitors" content.


You've heard the "content is king" fable a thousand times. And it kind of really is, if you think about it. But that doesn't mean you have to be a creative genius to come up with viral content ideas. There's a whole Internet at your disposal, full of content that has done exceptionally well in your industry, if you just look.

Here's the trick: web monitoring tools, like Google Alerts and Awario, can save you massive amounts of time in discovering those content ideas. The latter has a free trial, and the sign-up takes nothing more than your email address. Go on and sign up, and create an alert for your competitors, with their brand names as keywords. In a sec, you'll see a feed of mentions of those brand names (as well as their own posts on social media). Now, you can sort those mentions by the number of people exposed to each (aka Reach) by clicking on the three dots in the upper right corner of your mention feed. This way, you'll see which of their content has done best — and get inspired for your own next piece.



Off-page SEO


  • Run a quick backlink audit.


True, link auditing isn't always quick. There are plenty of auditing tips, tricks, and advanced how-tos that — let's be honest — you don't always have the time for (though if you are looking for a more in-depth guide on link auditing, here's one). A quick audit won't be as comprehensive, but chances are it will let you spot some 90% of potentially risky links that can lead to search engine penalties — and will only take a few minutes of your time.

To get started, launch SEO PowerSuite's SEO SpyGlass and create a project for your site. When the app has collected your backlinks, switch to Backlink Profile > Linking Domains and go to the Link penalty risk tab. Select all domains in your workspace and hit the Update Link Penalty Risk button to calculate how risky the links from each referring domain are.

Hang on a moment while SEO SpyGlass evaluates the domains. When it's done, click on the header of the Penalty Risk column to sort the domains by their riskiness. For details on why any one of the domains is considered risky, click the i button next to the domain's Penalty Risk value. This will reveal a list of factors that make this link potentially dangerous.


If you do find a few spammy links you need to disavow, go on and add them to your disavow file. To do that, select the risky linking domains, right-click the selection, and hit Disavow Backlinks. Most of the time, you'd want to choose Entire domain under Disavow mode. Next, go to Preferences > Disavow/Blacklist backlinks. Review the list, and click Export when you're happy with it.
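
If you're curious what the exported file contains, Google's disavow format is plain text: one domain or URL per line, with optional # comments. A hypothetical example:

# Spammy directory that links to us sitewide
domain:spammy-web-directory.example.com
# A single risky page rather than the whole domain
http://blog.example.net/suspicious-post.html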

Get the rest here
