
Components Of A Technical Website Audit


6 replies to this topic

#1 ttw

    HR 5

  • Active Members
  • 379 posts
  • Location: San Mateo, California

Posted 16 June 2009 - 08:58 AM

What do you consider to be the most important elements to test during a technical website audit?

I have a client who thinks their website rankings have dropped because of a website refresh (though traffic isn't down), and since we haven't found any problems yet, I'm looking for other things to test. Thanks!

#2 Jill

    Recovering SEO

  • Admin
  • 33,004 posts

Posted 16 June 2009 - 11:38 AM

What do you mean by a website refresh? Did they do a redesign?

If so, you want to see whether they set up 301 redirects from the old URLs to the new ones. Missing redirects are usually the main cause of problems after a redesign.
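One way to verify the redirects is to script a check: each old URL should answer with a 301 whose Location header points at its new counterpart. A minimal Python sketch of that check — the URL map here is hypothetical, to be filled in from the client's redesign:

```python
import http.client
from urllib.parse import urlsplit

def redirect_ok(status, location, expected_url):
    """True if a response is a 301 pointing at the expected new URL."""
    if status != 301 or not location:
        return False
    return location.rstrip("/") == expected_url.rstrip("/")

def fetch_redirect(url):
    """HEAD-request a URL and return (status, Location header).

    http.client never follows redirects on its own, so we see the
    301 itself rather than the destination page."""
    parts = urlsplit(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location

# Hypothetical old-to-new map from the redesign (illustration only).
url_map = {
    "http://example.com/old-page.html": "http://example.com/new-page/",
}

# Live check (requires network access):
# for old, new in url_map.items():
#     status, location = fetch_redirect(old)
#     print(old, "OK" if redirect_ok(status, location, new)
#           else "BAD (%s -> %s)" % (status, location))
```

A 302 instead of a 301, or a redirect chain ending somewhere other than the mapped page, would both show up as failures here.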

#3 ttw

    HR 5

  • Active Members
  • 379 posts
  • Location: San Mateo, California

Posted 17 June 2009 - 11:00 PM

They said they did 301 redirects, but I'll check whether they mapped the old pages to the new pages properly.

#4 Randy

    Convert Me!

  • Moderator
  • 17,540 posts

Posted 18 June 2009 - 06:50 AM

FWIW, the main two things I look at are the redirects, if any have been done, and a site: search to see whether all, or at least most, of the pages have been indexed. If the number of indexed pages is reasonably close to the real total, there usually aren't any serious technical issues.

If it's a dynamic site where the number of files won't match the number of pages, you can usually get a pretty good idea by running something like Xenu against the site.
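Xenu-style auditing boils down to fetching each page and following the links found on it, counting pages and broken responses along the way. The link-extraction step at the heart of such a crawler can be sketched with Python's standard library (this is just the parsing piece, with none of a real crawler's robots.txt handling or rate limiting):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag

class LinkExtractor(HTMLParser):
    """Collects absolute link targets from anchor tags in an HTML page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative URLs against the page's own URL and
                    # drop #fragments so each page is queued only once.
                    absolute, _fragment = urldefrag(urljoin(self.base_url, value))
                    self.links.add(absolute)

def extract_links(html, base_url):
    """Return the set of absolute URLs linked from an HTML document."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

A crawler would feed each fetched page through this, queue any not-yet-seen URLs on the same host, and report the ones that come back 404 or 500.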

#5 Michael Martinez

    HR 10

  • Active Members
  • 5,157 posts
  • Location: Georgia

Posted 18 June 2009 - 01:20 PM

Usability and accessibility may shed some light on what could have changed. If the site is less usable it may be producing fewer conversions. If the site is less accessible it may indeed have lost search visibility (there is a correlation between making a site accessible and on-site optimization).


#6 ttw

    HR 5

  • Active Members
  • 379 posts
  • Location: San Mateo, California

Posted 19 June 2009 - 02:49 PM

I have learned that my client is using Akamai to analyze incoming requests for pages and, based on the location of the user, speed up delivery of page content while taking some of the load off the main server.

Have you had any experience with this product negatively impacting search engines?

#7 Randy

    Convert Me!

  • Moderator
  • 17,540 posts

Posted 19 June 2009 - 04:29 PM

I've not worked with it a lot, but I've never seen Akamai or any other correctly configured offloader cause a problem for the spiders. Though I guess needing one in the first place might indicate a potential problem.

My understanding of the Akamai service is that it's basically a caching engine with locations all around the world. They grab snapshots of the HTML pages of the real site, store them in several datacenter locations, then try to match each user with the closest location to cut down on latency. To my limited knowledge it doesn't specifically look for search engine spiders, and since the major spiders are all US-based, they get the US data, whether that comes from the real server or one of the Akamai servers.

Does this site have a Google Webmaster Tools account set up and verified by chance? If so, that would be a good place to start.

GWT provides some pretty helpful crawl info: not just robots.txt issues and crawl errors, but actual crawl and page-load latency data. You can see the crawl data in the Diagnostics section. Crawl Errors is pretty self-explanatory; if you see something there, you have issues to concern yourself with. The Crawl Stats area is where you'll find out whether there are latency issues. It's the bottom graph there, even though it isn't labeled "latency" exactly; I think it says something like "Time spent downloading pages." As long as that stays around the 250-400 millisecond range for the most part, you're in pretty good shape.
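If you pull the per-page download times out of that graph (or out of your own server logs), applying the rule of thumb is easy to script. A small sketch, with the thresholds hard-coded from the 250-400 ms band described above and the sample data invented for illustration:

```python
def latency_flags(download_times_ms, ok_high=400):
    """Classify per-page download times against a healthy-range rule of thumb.

    ok_high is the top of the "in good shape" band (~400 ms per the
    rule of thumb); anything over twice that gets flagged for review."""
    flags = {}
    for page, ms in download_times_ms.items():
        if ms <= ok_high:
            flags[page] = "ok"      # within the healthy band
        elif ms <= 2 * ok_high:
            flags[page] = "watch"   # somewhat slow; worth monitoring
        else:
            flags[page] = "slow"    # well outside the band; investigate
    return flags

# Hypothetical sample data: milliseconds to download each page.
sample = {"/": 180, "/products": 520, "/contact": 1200}
```

Running this over the sample would flag /contact as the page to dig into first.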

(FTR, I can't prove that latency plays any part in the ranking algorithm, but I have seen some empirical evidence suggesting it might, from a few really screwed-up sites I've helped clean up. In those cases the only change made was moving them to a new, more powerful server [for non-SEO reasons], yet rankings improved pretty dramatically in the weeks right after the server move with no other explanation. Those were all rather extreme cases, though. I've seen plenty of others where the site is slow as molasses but there was no apparent detrimental ranking effect.)





