
Technical SEO Issues

August 28, 2013

By Jill Whalen
While uncovering and fixing technical issues has always been an important part of SEO, in the wake of Panda and Penguin, technical SEO has moved closer to the forefront. You may have thought that the better Google gets, the less effect technical problems would have on SEO -- I know I did. But in fact, the opposite has happened. It's not that Google can't figure out technical SEO problems and work around them -- they most certainly can, and have done so for years. But it seems they have now decided to force webmasters to clean up their sites.

It does make sense from Google's perspective. Why should they waste their computing power sorting through badly coded websites and misconfigured servers? I can totally see them deciding that if you don't have the time or wherewithal to fix blatant errors, your website doesn't deserve to be shown to their users (aka the searchers).

Enter Google's Webmaster Tools

For many years Google has provided a host of free webmaster tools to diagnose technical SEO issues. Yet I imagine that only a very small percentage of website owners actually use the tools, and an even smaller percentage are likely to fix the problems. So it seems that Google eventually decided to take drastic measures by downgrading sites that had the most egregious technical issues.

What better way to make site owners take notice than taking away some of their traffic?

Now, I'm not saying that all sites with any technical problems are being downgraded by Google. They're most certainly not. But if sites have other issues that Panda and Penguin caught, PLUS they have a lot of technical issues, it's easy to imagine the creation of a perfect storm, so to speak. That may be partly why some sites that fix their spammy SEO issues without fixing their technical ones never quite recover.

Why would Google care about technical website problems?

In most cases, it's not the technical issues themselves that hurt your SEO efforts, but the poor user experiences those errors create. For instance, most of us agree that Google has made a big push toward showing the most user-friendly sites first in their search results. Well, what's less user-friendly than a site where many of the links produce "Internal Server Error" pages instead of what they're supposed to show?

Google is really a "referral engine."

Think of it this way: What if I recommended a particular product to you, but after you bought it, it didn't work very well? Would you trust me for future product recommendations? Probably not. It's the same with Google. They need to refer searchers to the most relevant results that also work as they should. A site with lots of technical errors *should* be downgraded by Google because it provides a poor user experience.

By now you're probably wondering what sorts of technical SEO issues might cause Google's black-and-white animal hitmen to downgrade your site. While the list is long, below are the ones I've compiled that I see the most often when I'm auditing penalized sites. They're generally ones that either cause a poor user experience or simply make your site harder for Google to crawl, index, or read.

Technical SEO issues include (but are definitely not limited to):
  • Server errors: This includes tons of 404 errors on a site (especially bad if the rest of the site is internally linking to them), what Google calls soft 404s, plus 500 server errors, and generally any pages that can't be accessed by Google (or any spider).
  • Incorrect HTTP header responses: This includes redirects that simply don't redirect at all, redirects that return a 302 status instead of a 301, and 404 pages that respond with a 200, 301, or 302 instead of a 404 response.
  • Multiple redirects: This includes any redirect that makes more than 1 hop before a user lands on the page they're ultimately supposed to land on. While Google can and does handle 1 or 2 hops, it's prudent to set your redirects to go directly to the actual URL that you want your users to land on without any stops in the middle.
  • Redirect loops: This is when you redirect a URL to a different URL that is redirecting back to the first URL (yes, I've actually seen this in action!).
  • Misconfigured canonical link elements: Ever see a site that inadvertently pointed all of the pages to the home page via rel="canonical"? I've seen many. (True confession here -- when the tag was new I even did it myself once with my forum...oops!)
  • Requiring JavaScript or cookies to view something: Search engines traditionally don't execute JavaScript or store cookies, so if the only way to see something on your site requires them, there's a good chance none of that information will be indexed.
  • Pages indexed that shouldn't be: I've seen all sorts of these, from server index pages to those that pop up Ajax errors.
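Several of the issues above can be spotted in one pass with a short script. Here's a minimal sketch (Python standard library only; the function names and any URLs you feed it are placeholders, not part of any particular tool) that requests a page without auto-following redirects, then traces the redirect chain hop by hop so you can see wrong status codes, multi-hop chains, and loops:

```python
# Sketch: inspect HTTP status codes and redirect chains, stdlib only.
# Some servers reject HEAD requests; swap in "GET" if yours does.
from urllib.parse import urlsplit, urljoin
import http.client

def fetch_status(url, timeout=10):
    """Return (status, Location header) for one request, no redirect following."""
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=timeout)
    try:
        path = parts.path or "/"
        if parts.query:
            path += "?" + parts.query
        conn.request("HEAD", path)
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

def trace_redirects(url, max_hops=10):
    """Follow redirects manually, recording each hop and detecting loops."""
    chain, seen = [], set()
    while len(chain) < max_hops:
        if url in seen:
            chain.append((url, "LOOP"))  # redirect loop detected
            break
        seen.add(url)
        status, location = fetch_status(url)
        chain.append((url, status))
        if status in (301, 302, 303, 307, 308) and location:
            url = urljoin(url, location)  # resolve relative Locations
        else:
            break
    return chain
```

A chain longer than two entries means extra hops you could eliminate; a 200 on a URL that should be gone, or a 302 where you meant a 301, shows up directly in the statuses.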

How to diagnose technical SEO issues

As previously mentioned, you can find most of these issues by digging into your Google Webmaster Tools account. You'll find lists of 404-errors, soft-404's, crawl errors, and pages that Google simply can't access. You can even try to fetch problematic pages as Googlebot to gain additional insight.
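One of those checks is easy to run yourself. A classic cause of soft 404s is a server that answers 200 for pages that don't exist, so a quick probe of a deliberately nonexistent path tells you whether your error handling is configured correctly. The helper below is a hypothetical sketch, standard library only:

```python
# Sketch: a soft-404 smoke test. Request a path that cannot exist and
# confirm the server answers with a real 404 status rather than a 200
# "page not found" page. The probe path is generated, not a real URL.
import urllib.request
import urllib.error
import uuid

def returns_real_404(base_url):
    """True if a clearly nonexistent path gets an actual 404 response."""
    probe = base_url.rstrip("/") + "/" + uuid.uuid4().hex  # gibberish path
    try:
        with urllib.request.urlopen(probe, timeout=10):
            pass
    except urllib.error.HTTPError as err:
        return err.code == 404
    return False  # a 200 here is the classic soft-404 symptom
```

If this returns False for your site, Google is likely seeing (and listing) soft 404s for every broken link.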

I also highly recommend using a spidering tool such as Screaming Frog. This tool will spider your entire site and provide you with all kinds of feedback. One thing to remember if you use a tool like this, however, is that just because the tool finds all kinds of strange things, it doesn't mean that Google is also finding them. Be sure to double-check Google's index before you panic!
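To demystify what a spidering tool actually does, here's a toy single-site crawler in that spirit: it walks internal links breadth-first and records each page's HTTP status. This is purely illustrative (my own function names, standard library only); a real crawl also needs robots.txt handling, throttling, and far more robustness:

```python
# Toy crawler: collect {url: status} for pages on one host, following
# <a href> links. Broken internal links surface as 404 entries.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlsplit
from collections import deque
import urllib.request
import urllib.error

class LinkParser(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(start_url, max_pages=50):
    """Return {url: status} for pages on the same host as start_url."""
    host = urlsplit(start_url).netloc
    statuses, queue = {}, deque([start_url])
    while queue and len(statuses) < max_pages:
        url = queue.popleft()
        if url in statuses:
            continue  # already visited
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                statuses[url] = resp.status
                parser = LinkParser()
                parser.feed(resp.read().decode("utf-8", errors="replace"))
                for href in parser.links:
                    absolute = urljoin(url, href)
                    if urlsplit(absolute).netloc == host:
                        queue.append(absolute)  # stay on this site
        except urllib.error.HTTPError as err:
            statuses[url] = err.code  # e.g. a 404 the site links to internally
        except OSError:
            statuses[url] = None  # unreachable
    return statuses
```

Filtering the result for statuses of 404 or 500 gives you the same kind of broken-link report a dedicated tool produces, just without the polish.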

The key takeaway here is to not just find your site's technical errors, but to actually fix them. Even if your site hasn't lost any traffic over the years, if you find a bunch of technical errors, they could be keeping you from receiving all the search engine traffic you deserve. It's very possible that spending a day fixing these problems could pay off handsomely in the long run. If nothing else, it will certainly keep your users happier!

Jill

 
Jill Whalen has been an SEO consultant and the CEO of High Rankings, a Boston-area SEO company, since 1995. Follow her on Twitter @JillWhalen.

If you learned from this article, be sure to sign up for the High Rankings Advisor SEO Newsletter so you can be the first to receive similar articles in the future!

 Charles said:
Thank you for the Screaming Frog suggestion, because I hadn't used it before (I just used Xenu to run through links). What was interesting to me was that AVG alerted me to a 'Blackhat SEO type 1730 exploit' in respect of an external link which SF listed. The plot thickened, because visiting the site with a browser set AVG off again. I checked with virustotal.com and sucuri.net and they reported no problems. Confident, I looked at various pages' source code and failed to spot the problem. (What seemed strange was 578 lines of blank code.) Even if I spoofed my user agent to various SE spiders, the same happened. SAHGB.org.uk is hardly a bad neighbourhood, and I cannot imagine them trying a dodgy technique knowingly.
I'm debating whether to remove that external link, just in case, or trust Google won't factor it in next month somehow.
 Alan Bleiweiss said:
Glad you're writing about this Jill.

I routinely use Screaming Frog on sites that aren't too big (running SF on a million page site is painful). I also rely on URIValet.com and WebPageTest.org for page code processing speed evaluations that help uncover issues likely behind slow processing speeds within GA's speed data.
 Riza said:
Whenever I read the “technical” word in a title, I get all nervous inside that I may not be able to understand it. I’m pretty sure others feel the same way. Which is really frustrating because these people want to understand as much as they can to be able to move forward professionally in this field.

Surprisingly, such fears are unfounded in this post. Your article is so simple to understand. I, for one, learned a lot! Very comprehensive.

Thanks, Jill!
 Warren Whitlock said:
I'm ambivalent on this one. Of course, no one is against doing the proofreading of SEO and every little bit helps but when I do this I'm often wondering if I'm straightening deck chairs on my Titanic.
 ben said:
410 Gone headers
I have an ecommerce site where all my one-off products have unique (friendly) urls but obviously once a product has sold the page returns a 404.
If I was to amend my 404 page code so that it redirected with a 410 in the event of a product page not being found e.g. checking for "/product/", would that please Google or would it be annoyed with me and punish me?
I was thinking of redirecting to the category home page for the sold product.

Great article btw, thanks
 Jill Whalen said:
@Ben, I always prefer 301 redirects if you can implement them.
 Ben said:
the problem is, there's no 301 that's as relevant as the now sold product, it would have to redirect to the parent category page. Won't Google object to that because it's not relevant enough to the original page?
 Ben also said:
p.s. because it's actually gone, not been moved
 Jill Whalen said:
Redirecting to the relevant category page should be fine, IMO.
 Ben said:
Thanks Jill, I'll do that then.
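The two options Ben and Jill discuss above -- a 301 redirect to the parent category versus a 410 Gone for a sold product -- can be sketched as a bare WSGI application. The product and category paths below are invented for illustration:

```python
# Sketch: handling sold-out product URLs. SOLD maps a retired product
# path to its parent category page (hypothetical paths).
SOLD = {"/product/antique-clock": "/category/clocks"}

def app(environ, start_response):
    path = environ["PATH_INFO"]
    if path in SOLD:
        # Option 1 (Jill's preference): permanent redirect to the category.
        start_response("301 Moved Permanently", [("Location", SOLD[path])])
        return [b""]
    if path.startswith("/product/"):
        # Option 2: tell crawlers the page is intentionally, permanently gone.
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This product has sold and is gone."]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]
```

Either response is honest about the page's fate; what you want to avoid is the soft-404 pattern of answering 200 with a "sorry, sold out" page.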