Steps To Recover From A Penalty?
Posted 08 June 2007 - 06:11 AM
I've run a pretty big PR6 corporate website in the travel market for a while now. It's been around since 1999 and has always ranked well in Google. I'm sure this is a pretty familiar story, but on the 4th of May all 300 or so of the keywords and phrases we monitor carefully were thrown right to the back of the rankings. I have a feeling it's that -950 penalty everyone talks about, which hits niche authority sites.
We didn't change anything recently other than to upgrade our CRM (which produced invalid XHTML transitional code for about two weeks) and we also added a PayPal driven store to sell company tshirts and other memento-like things.
Luckily we still get the majority of our Google traffic as we're still No.1 when customers search our company name - we're an established brand in our market.
Does anyone have a 'recovery guide' for getting back to the top? I'm probably going to be sacked for this one so any advice would be greatly appreciated.
Posted 08 June 2007 - 06:20 AM
Interesting that you say -950 penalty; I've heard there was a -30 penalty, which is a big difference, and I still haven't seen anything from Google that documents any of this.
Have you noticed any new competitors appear in the list where you used to be? Has a competitor revamped their site and done a better SEO job than before, putting them above you?
It would seem odd, if these so-called "penalties" exist, that Google would apply one to you for no reason. That said, Google's systems - be it AdWords, Analytics, or PageRank - are always falling over or have major bugs, so I see no reason why Google's index shouldn't get screwed up from time to time.
I'm not sure anyone can offer a recovery plan; you can't undo something you say you haven't done!
Posted 08 June 2007 - 06:45 AM
That's neither here nor there. The fact that we got nailed is a serious problem for me. You can find information on the -950 penalty on www.webmasterworld.com/google/3215939.htm
In our search market there are many large affiliate websites selling accommodation across the board that do extremely well in the SERPs, and then you have people like us with smaller independent sites. We're the largest of the independent websites (I would think) and we're well known in our markets.
Competitors have been coming and going for years and have never affected us at all. As a company we keep expanding into new markets, which means we're sometimes the 'new competitor', and we've always done well in our markets within 2-3 months.
It's our strict policy that we don't do black-hat SEO. Don't get me wrong, we focus on our keywords, but we don't violate the quality guidelines at all.
I'd really appreciate your help, everyone! Does anyone have an SEO best-practice checklist?
Posted 08 June 2007 - 08:33 AM
I have a feeling you may be jumping the gun a bit. SERP falls can happen for numerous reasons and you seem to have locked in on one and only one possibility. Have you also looked into possibilities such as the web site having gone offline for a bit, DNS errors, a robots.txt that told the spiders to stay away, etc, etc?
IMO you need to first rule out all of the more logical potential causes before jumping to the conclusion that you've been penalized. The engines just don't go around penalizing sites that aren't doing anything wrong.
Posted 08 June 2007 - 10:22 AM
I've checked our DNS records, checked our server uptime, checked that the robots.txt file is allowing access, checked our sitemaps and I can't see anything wrong.
One thing I did notice in Webmaster Tools is that there were 50 or so strange broken links, which I suspect were Google trying to interpret the SQL functions of our CMS as URLs. This hadn't happened before.
Taking your suggestion, do you know of any SEO checklists I can go through that will help identify problems?
Posted 08 June 2007 - 02:59 PM
I'd run a spider against the site to make sure it can be spidered properly. Xenu Link Sleuth is a good freebie.
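If you want a quick sanity check before (or alongside) a full Xenu run, a minimal sketch like this pulls the links out of a page with the Python standard library, so you can confirm the site exposes crawlable anchors at all. The example URL in the usage note is a placeholder, not the poster's site.

```python
# Minimal link extraction sketch (not a substitute for Xenu Link Sleuth):
# parse a page's HTML and collect the absolute URLs its <a> tags point to.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record every href, resolved against the page's own URL
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))
```

Feed it the HTML you fetched for a page (e.g. `LinkCollector("https://example.com/").feed(html)`) and inspect `.links`; if that list is empty on pages you know have navigation, the spider can't see your links either.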
I'd run the domain through DNSReport.com to make sure things check out okay with it.
I'd look in the Webmaster Central tools to make sure Google was able to read my robots.txt correctly.
I'd run a few dozen pages through a header checker to make sure the server is delivering a 200 OK status and not something funky because of a misconfiguration. WebBug is a freebie tool you can use to do this, though make sure you check off HTTP Version 1.1 in the upper right corner if you use it since Ver 1.0 was incredibly buggy.
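The same header check can be scripted; a sketch along these lines uses Python's `http.client` (which speaks HTTP/1.1 by default) to report the status code per page. Host and path are whatever pages you want to audit.

```python
# Return the HTTP status code a server sends for a HEAD request,
# so you can spot 302s, 404s or 500s where you expect 200 OK.
from http.client import HTTPConnection

def check_status(host, path, port=80):
    conn = HTTPConnection(host, port, timeout=10)
    conn.request("HEAD", path)
    status = conn.getresponse().status
    conn.close()
    return status
```

Loop it over your key landing pages; anything other than 200 on a page that should rank is worth investigating before you assume a penalty.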
I'd look at the site while emulating a spider user-agent to give a quick test to make sure the host hasn't started doing anything stupid like blocking spider visits to save on their bandwidth. Alternatively you can parse through your access_logs to make sure the search engine spiders are able to get to your pages.
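For the log-parsing alternative, a rough sketch like this counts spider hits and their status codes in a combined-format access log. The regex assumes the common Apache combined log layout; the sample lines in the usage note are made up.

```python
# Count search-engine spider requests per HTTP status code
# from combined-format access log lines.
import re
from collections import Counter

LOG_RE = re.compile(
    r'"(?:GET|HEAD|POST) (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$'
)

def spider_hits(lines, token="Googlebot"):
    hits = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and token in m.group("agent"):
            hits[m.group("status")] += 1
    return hits
```

If `spider_hits(open("access_log"))` shows Googlebot getting 403s or nothing at all since early May, you've found your problem without any penalty being involved.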
If you don't have sole control over both the coding of the pages and the server I would download the current pages to make sure some idiot hasn't placed a bunch of hidden text, hidden links or anything else nefarious in the code that is invisible to the naked eye.
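A crude first pass over downloaded pages can flag the common hiding tricks automatically. The patterns below are illustrative, not exhaustive, and will false-positive on legitimate CSS, so treat any match as a lead to inspect by hand rather than proof of foul play.

```python
# Flag markup containing common text-hiding tricks:
# display:none, visibility:hidden, 0-1px font sizes.
import re

SUSPECT_PATTERNS = [
    re.compile(r"display\s*:\s*none", re.I),
    re.compile(r"visibility\s*:\s*hidden", re.I),
    re.compile(r"font-size\s*:\s*[01]px", re.I),
]

def suspicious_snippets(html):
    # Return the patterns that matched, as leads for manual review
    return [p.pattern for p in SUSPECT_PATTERNS if p.search(html)]
```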
I'd set up something with one of the several dozen services out there that will send you an email if they cannot reach your site every X minutes. Just because the server is technically "up" doesn't necessarily mean the site is reachable from outside.
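A bare-bones version of such a reachability check, assuming you run it from a machine outside your own network, might look like this; the URL and interval are placeholders, and a real monitoring service would also handle alerting for you.

```python
# Poll a URL and report when it stops being reachable from here.
import time
from urllib.request import urlopen
from urllib.error import URLError

def is_reachable(url, timeout=10):
    """True if the URL answers with a non-error HTTP status."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (URLError, OSError):
        return False

def watch(url, interval=300):
    """Poll forever, printing a line whenever the site is unreachable."""
    while True:
        if not is_reachable(url):
            print("DOWN:", url, time.ctime())
        time.sleep(interval)
```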
That's the short list of where I'd start, under the assumption there may be a technical issue at the heart of the problem.