
Some Farm Aid for the Afflicted: Google Farmer Update

March 2, 2011
By Jill Whalen

By now you've surely heard of the recent Google algorithm change dubbed the "Farmer Update" (ADDED: now often referred to as the Panda Update).

(See also this newer post regarding Panda: Why SEO in All the Right Places Doesn't Cut It Anymore.)

According to Google, about 12% of search queries were impacted by this update. The SISTRIX blog provided additional insight by posting the top 25 websites that overnight stopped showing up in Google for numerous keywords. I was interested in learning more about this update, and SISTRIX was kind enough to share their big list of over 300 sites that have had deep traffic losses. In addition, I've had various people send me their sites to look at.

I hoped to analyze the data to spot similarities between the sites that got hit so that I could understand the specific factors Google used when deciding which pages to nuke. As you can imagine, there was a lot of data to sort through and I feel as if I've only just gotten started. However, I do have some preliminary findings to share with you as quickly as possible.

Please note that just because I noticed similar things on sites that got hit, it doesn't mean those things were the cause of the loss of Google traffic. It's far too easy to make assumptions and mix up cause and effect in nearly every aspect of SEO. So I caution you to treat the information I'm providing as what it is -- preliminary findings that make me go "Hmmm." Also note that I've barely had enough time to look at the potential on-page factors that might be causing issues, and haven't even started to look at the off-page links that are pointing to these sites. Because we know that links and anchor text are Google's main squeeze, my on-page analysis could very well be completely off base.

With that caveat out of the way, below are some of the interesting things that made me go "Hmmm" with the small set of sites I've looked at so far.

Semi-hidden Content

One surprise finding, which may or may not relate to the loss of Google traffic, was that many of the sites had content that was behind tabs, and not visible all at once to someone using a typical browser. It's possible that this type of design element is so common on websites these days that many sites from a random sampling would also be using it, but it definitely struck me as odd. What made it especially interesting was that most of the sites using the tabs had a very large amount of content contained within them. With tabs such as these, a person only sees the content in one tab at a time, while Google sees all the content from all of the tabs, as if it were contained on one page. (Technically it is, because it's all one URL.) In many cases all the tabbed content put together added up to thousands of words, and often hundreds of links as well.

While there's nothing inherently wrong with using tabs this way (and many sites are currently using the technique), some cases might trigger red flags.

There are many different coding methods to "hide" content behind tabs. Two of the sites I reviewed that had lost Google traffic were using different methods: one had "display: none; visibility: hidden;" in its code, and the other had "overflow: hidden;".
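
To make the distinction concrete, here's a minimal sketch of the kind of tabbed markup I'm describing. The class names and content are mine for illustration, not taken from any of the sites I reviewed:

    <style>
      /* Method one: inactive panels are removed from view entirely */
      .is-hidden { display: none; visibility: hidden; }
      /* Method two: panels stay in the page flow but are clipped to nothing */
      .is-clipped { overflow: hidden; height: 0; }
    </style>

    <ul class="tabs">
      <li><a href="#overview">Overview</a></li>
      <li><a href="#specs">Specs</a></li>
      <li><a href="#reviews">Reviews</a></li>
    </ul>

    <div id="overview" class="panel">Visible introductory copy...</div>
    <div id="specs" class="panel is-hidden">Hundreds more words of tabbed content...</div>
    <div id="reviews" class="panel is-hidden">...and dozens more links.</div>

A click handler would typically swap the hidden class from one panel to another. Either way, a visitor sees one panel at a time, but since all the panels live in the same HTML document, Googlebot reads every word and every link at once.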

Why Google might not like it: Each site was using its tabs for different reasons, and I doubt that the "visibility: hidden" in and of itself caused Google to no longer like those pages. But perhaps Google took issue with the extremely long pages of content because they might appear to be less user-friendly (if Google didn't realize that the content is tabbed). In addition, the numerous extra links in some of the tabs might appear to go overboard.

In one instance, I set my browser's user-agent to Googlebot and tried to browse a page that was using tabs with tons of content behind them, but I got an error message that the page couldn't be viewed at all. The error seemed to have something to do with a very strange, hidden ad link contained in the tabbed content.
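
If you'd like to try a similar experiment without changing your browser, one common approach (my suggestion, not something any of these sites were using) is to fetch the page from the command line with Googlebot's user-agent string; the URL here is just a placeholder:

    curl -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" http://www.example.com/tabbed-page

Comparing that output with what you see in a normal browser can show you exactly which content and links are served only when the site thinks Googlebot is visiting.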

In another case of semi-hidden content, the pages were designed in a way that is very cool and easy to use for people, but all the content from the various hidden areas, when viewed on one long page as Google saw it, ends up looking like a disgusting keyword-stuffed mess! I have no idea if the site was purposely designed to stuff keywords in that way or not, but before the Farmer Update it was apparently working for them.

Completely Hidden Content

Another common finding among some of the sites I reviewed was having the real "meat" of the site behind a registration wall. While there would be some keyword-rich content on the page in question, you couldn't read the whole article unless you registered. Google has never been a fan of that, and even offers their "First Click Free" program so that content publishers who require registration to read their articles can still get their content indexed. But the site must show the entire piece of content to people who have not registered if they arrived at it from a Google search. The sites I reviewed were not using the First Click Free approach.
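
The gist of First Click Free is simple: if the visitor's click came from a Google search result, show the full article instead of the registration wall. Here's a minimal client-side sketch of the idea -- the element IDs are hypothetical, and a real implementation would more likely check the Referer header on the server:

    // Sketch only: did this visitor arrive from a Google search?
    var cameFromGoogle = /\.google\./.test(document.referrer);
    if (cameFromGoogle) {
      // First click from Google: hide the registration wall and show
      // the whole article ('registration-wall' and 'full-article' are
      // hypothetical IDs, not from any site I reviewed)
      document.getElementById('registration-wall').style.display = 'none';
      document.getElementById('full-article').style.display = 'block';
    }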

Why Google might not like it: They believe that if you want your content indexed, you should play by their rules, which in this case is the First Click Free rule. They probably also believe that a page with just a summary of information related to the searcher's query is likely not the best page for the user to land on. So it doesn't surprise me that those types of pages may have been hit in the Farmer Update.

Merry-Go-Round Sites Containing Mostly Ads or Links

Interestingly, I recognized one of the sites on the big SISTRIX list as one I had done a website review for last year. I have to say that it was one of the craziest sites I had ever seen, and I was shocked that Google was even showing it highly in the search results. So when I saw it got nuked bigtime by Farmer Google, I wasn't surprised. I noticed some similarities between that site and a few of the others that got nailed -- mostly that you felt you were going round and round in circles as you tried to find the information you were originally seeking at Google.

Here's what happens on this type of site: You get to a page that uses the keywords you typed into Google, only to find that you need to click a link on that page to really get the information. But when you click that link, you either end up at another site, or on another page on the same site -- and you still don't quite have the info you wanted. It seems that you could keep clicking that way forever and never find what you were looking for. Yet you always have the feeling that it's you who is doing something wrong, not that the site simply sucks wind. (Of course, the pages are also always full of Google AdSense and other ads.)

Similar to the merry-go-round sites, others I reviewed were simply aggregating others' content in one way or another. In many cases, it would make more sense for Google to just show the original site (or sites) rather than a page with a list of sites -- especially when the list of links is actually just running an ad platform that appears to be links.

One site was a niche comparison site, which seemed okay on the surface. But I found that when I browsed to a particular product and then tried to view it on the website that was listed as the cheapest, in many cases I was brought to either the home page of said site or a page that contained a product similar to the one I was looking at, but not the exact one. Ugh.

Why Google might not like it: Google stated that part of this update was to improve the quality of the results their searchers were receiving. All of the above types of sites have numerous pages that meet the "poor quality" label, assuming anyone ever paid attention. In these cases, I can see where it makes more sense for Google to show the pages being linked to directly in their search results, rather than the page that's doing the linking.

So there you have it -- my first impressions from a very small sample of sites.

What You Should Watch Out For

With everything I've seen, the consistent themes seem to be usability and the intent of the page in question. I can't say how Google is technically figuring out intent, but they appear to be going after pages that might frustrate users. Google's goal is to satisfy the search query of their user -- the searcher. Their goal is not to provide their searcher with pages that link to pages that link to other pages that satisfy the original search.

With all that said, after writing up my findings, I also looked at some of the new Google results, and, sadly, there are some even worse pages showing! In one case, the site I was reviewing, while not satisfying the search query itself (other than having the search words on the page), was beaten out by a pathetic little made-for-AdSense site that had no redeeming qualities whatsoever. How that one survived the Farmer Update, I'll never know.

It's key to remember that this update is most likely just the beginning. About the only thing I'm sure of at the moment is that Google still has a lot of tweaking to do over the next few months to truly sort things out.

Jill
 
Jill Whalen is the CEO of High Rankings, an SEO company in the Boston, MA area, since 1995. Follow her on Twitter @JillWhalen.

If you learned from this article, be sure to sign up for the High Rankings Advisor SEO Newsletter so you can be the first to receive similar articles in the future!
 
 
Comments

 Al Toman said:
I think your preliminary study conclusion is fairly accurate. As a result of Farmer Google, my pages dramatically went up the SERPs :O) for the majority of my keywords.

Tabs on a web page are cool but have always appeared to me to be anti-optimization for the very reasons you stated.

Hence, I'd rather link to 3 additional pages instead of having 3 tabs of content on the home page. This gives up to 12 additional keywords (4 per page max) and makes for a bigger target of landing pages.

Use tabs on an administrative page or no-follow page, etc.
Tabs are also effectively used on product pages, such as Lowe's hardware -- or was it Sears!?!

Also, there are many instances of Google highly ranking these keyworded pages that have nothing but unrelated links on them ... which, as you said, point to another page, another site, or to la-la land.

Maybe some day beyond my time we will have a highly efficient search engine and not this play toy marketing machine Google created.

???

Nah!
 Alex said:
Very interesting, Jill. I had not considered usability at all. I dislike the tabbed pages you mention -- they are very frustrating and are common on medical sites.

Why do you think article directories got slammed -- poor editorial control?
 Lauren said:
Just out of curiosity I tried searching for "free clip art", and none of the link farms I'm used to seeing as results popped up. Alas, the first result was still pretty spammy, but at least it was new?
 AJ Kohn said:
I've been trying to figure out the reason why sites with less 'chrome' seem to be ranking well. Your semi-hidden content is something I hadn't considered -- specifically the code sniffing around 'hidden'. Instead, I was pondering whether they'd reverted to a text-to-code ratio of some sort.

However they're doing it the result is the same in many cases, sites that look like they're the spawn of Geocities seem to rank quite well.

Not to pick on these guys but this is the type of site I see often trumping more ... polished sites. http://repair2000.com/wash.html

It may provide very good advice, but is this really superior UX for users? (And yes, this is returned in the top position for a handful of searches.)

In the end it seems like high volume MFA sites (Made for AdSense AND Made for Amazon) are being replaced by low volume MFA sites.
 Jill Whalen said:
@Alex, for the article repository sites, I think it was mainly a case of those sites having legacy authority from being around for so many years. I'm guessing Google may have removed some "authority points" from some sites, or maybe they put less weight on authority across all sites.

In addition, I think those article sites fit in with much of what I said in the article: they may make use of the specific keyword phrase, but a person reading them never ends up feeling satisfied that they got the info they wanted. Not to mention that most of the articles are only uploaded to those sites for link/anchor text purposes.
 Finn Skovgaard said:
Coincidence or not, I reported Ezinearticles to Google Legal Support, DMCA Complaints, on February 15, for copyright infringement. Someone had copied an article from my site, edited it a bit, published it on Ezinearticles, and then called himself the author. In the process, I found my content on about a dozen sites, all of which I reported. I don't think just one such report would result in such drastic action, but if others have used the same site for similar infringements, it could explain it.
 John Park said:
My site didn't suffer at all; actually, Google gave me some better rankings. One of my pages was nowhere to be found, and then it came right up to the first page for a query with 545,000,000 results. My blog is simple, I try to make it user friendly, and I don't try to trick the SEs or my visitors. They moved my PR from a 2 to a 3.
 Jane said:
I haven't had time to really study this. Most of our sites are smaller local sites and often have to compete in searches with the content farm sites. I'm not seeing much up or down in our results. But I really didn't notice much improvement in localized searches either. For example, when I searched for hair salons with a local town name, I got one or two actual salon sites on the first page along with a few Google Places pages. The rest is the same crap I'm used to seeing.
 Karen DeCrane said:
My sites and my client sites either remained the same or climbed a bit. Thanks for proving once again that you ignore source code at your own peril, that quality matters more than quantity, and that when you build on a solid foundation, Google changes don't affect you nearly as much as when you chase the latest "Google killer" shiny thing.
 Finn Skovgaard said:
As an individual, non-SEO specialist, I've managed to build a site over the years by following your advice and common sense, so that I presently hover somewhere between number 1 and 5 on Google US for "living and working in France" and rank very well for many other keywords. I've never paid for any Google ads for that site or used spammy techniques. Neither have I invested months of SEO effort. I don't claim that the site is perfect in SEO terms, but it shows that it's possible for someone who is determined to obtain good results without spamming or paying for ads. It's a bit difficult to follow those who claim that paying for Google ads should provide better search results. I'm happy to see that spammy sites, some of which simply use stolen content, are now being actively put back in the place where they belong.
 Donna said:
Is anyone seeing a vast difference in results across browsers?
 Steven W. said:
Browsers shouldn't make a difference; it's the back-end Google web crawling and results preparation that's at play here.
 Alicia said:
Jill: When you mention tabs, are you including setups like dropdown menus that have subnavigation links hidden by CSS or Javascript until you mouseover a tab?

Donna: if you haven't already, check that you aren't logged into Google Accounts in one browser but not another.
 Jill Whalen said:
@Alicia, no, not dropdowns, at least not in the sites that I was looking at.
 Herman said:
Please explain what you mean by tabs. Does this include javascript menus, image menus?
 Andreas said:
This event made me search for my website to see current rankings. To my total surprise, it was almost at position #1 in the results. This has never happened before. I was elated.

But then I realized that I was logged into my Google account, and because I had +1'd my own site, it got a boost for me. When logged out, my rankings seemed unchanged.

Most of our client sites are unchanged too, but then again we mostly deal with local sites.

I wonder if any off-page factors came into Panda. That surely is a very difficult thing to police for a search engine, because penalizing links from bad neighborhoods too harshly would result in everyone pointing links from bad neighborhoods at their competitors' sites.
 Jill Whalen said:
@Andreas, I did a follow-up article to see if anchor text link spam was a thing of the past after Panda, but sadly, it wasn't.