Recovering From Penguin


5 replies to this topic

#1 rob1

    HR 2

  • Members
  • 12 posts

Posted 21 November 2013 - 02:10 PM

I had a few questions about recovering from Penguin that maybe some of you experts could help me answer.
 
My site used to be in the top 3 for my competitive keywords. After the Penguin updates, it is now on page 8 for my main keyword and about page 16 for my secondary keyword. It has no manual penalty or messages. In addition, it comes up as number one for my brand name and has a PR of 4. So I believe my site is not banned, just hit by the Penguin algo.
 
My site has similar backlinks to my competitors; however, it has more links than any of my competitors, and it also has links from one or two hundred directories. In my judgment, this is probably what is causing me the grief. So I'm finally getting around to embarking on the arduous task of contacting each of these directory websites and asking that my links be removed. This leads me to a couple of questions.
 
1) I have read that if you don't have a manual penalty, then your drop in the SERPs is because of Penguin and the algo may be automatically penalizing you. If one addresses all of the problem areas, will Penguin's algo immediately reflect this, or do you have to wait until the next Penguin update? I read somewhere you have to wait until the next update, but if it is strictly algo driven that doesn't make sense.
 
2) I also read that if your site does not have a manual penalty, you should never use disavow. However, it is clear to me I have an algo penalty, and I fully expect most of my requests to webmasters to remove my links will be ignored. Won't I have to disavow those, or does disavowing links not help with the automatic Penguin algo?
 
3) Finally, has anyone actually had success in getting their SERPs restored from a Penguin hit, and how long did it actually take once you cleaned things up? I'm not interested in hearing about manual penalties; that's a whole different issue.

 



#2 Michael Martinez

    HR 10

  • Active Members
  • 5,157 posts
  • Location:Georgia

Posted 21 November 2013 - 02:57 PM

1) I have read that if you don't have a manual penalty, then your drop in the SERPs is because of Penguin and the algo may be automatically penalizing you. If one addresses all of the problem areas, will Penguin's algo immediately reflect this, or do you have to wait until the next Penguin update? I read somewhere you have to wait until the next update, but if it is strictly algo driven that doesn't make sense.


Not true. A drop in traffic could be due to any number of reasons, and not just Penguin.

Nor is it true that Penguin only looks at backlinks. It is also knocking down Websites for stuffing keywords on their pages. The keyword stuffing doesn't have to be blatantly obvious. It can be pretty subtle.

Further, even if the downgrade *IS* due to Penguin, you may not make changes in time for the next release of Penguin data to reflect those changes. The rule of thumb is that you'll have to wait through 2 releases of data to be sure that your changes did (not) address whatever issues Penguin is focused on.
 

2) I also read that if your site does not have a manual penalty, you should never use disavow. However, it is clear to me I have an algo penalty, and I fully expect most of my requests to webmasters to remove my links will be ignored. Won't I have to disavow those, or does disavowing links not help with the automatic Penguin algo?


Also not true. You can use the Disavow tool for any reason. You may or may not see improvement, though. The Disavow tool tells Bing or Google to ignore the anchor text and PageRank that might have passed from any link you are disavowing. While this might remove negative value from your site, if the link at one time helped you, you won't see THAT boon return.
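
For reference, Google's version of this is just a plain text file you upload through the Disavow Links tool in Webmaster Tools: one URL or domain per line, a "domain:" prefix to disavow every link from that domain, and lines starting with "#" treated as comments. Something like this (the domains here are placeholders, not real sites):

    # Directory owner was asked to remove the links but never responded
    domain:directory-example.com
    # Disavow a single page instead of the whole domain
    http://another-directory-example.net/links/widgets.html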
 

3) Finally, has anyone actually had success in getting their SERPs restored from a Penguin hit, and how long did it actually take once you cleaned things up? I'm not interested in hearing about manual penalties; that's a whole different issue.


I have come across several case studies where people seem to have fixed their Penguin issues. There was no consistent time frame among the case studies, and given that there was no way to confirm the authors' analyses (that the changes in SERPs were due to Penguin), you have to take their reports with a grain of salt.

#3 rob1

    HR 2

  • Members
  • 12 posts

Posted 21 November 2013 - 03:24 PM

Hi Michael, thanks for taking the time to reply and help me understand. First let me clarify: I said my SERPs, or search rankings, not my traffic. Also, I understand Penguin looks at on-page factors as well, though based on my analysis and the timing, I'm very confident my issue is with links and that it was caused by Penguin.

 

The point of my first question was that I thought Penguin is an algo change that basically can push your site down in the SERPs for doing what Google considers negative. If it is an algo, then why do you need to wait for another Penguin update once you clean up your site? Isn't the algo dynamic? Of course it will take several days for changes in links and on-page changes to be recrawled by Google, but at that point shouldn't they be reflected? Or are you saying that when a new version of Penguin is released, there is some sort of one-time run on every site out there, and any automatic penalties that are discovered are stored in some Google database and stay there until the next update? That doesn't make sense to me. I'm just trying to understand why you would need to wait across updates.

 

As far as case studies, yes, I've read those as well and most had to do with manual penalties. I was hoping someone here could report their results first hand. Given how many people have been impacted by Penguin, surely someone has tried and either succeeded or failed here.


Edited by rob1, 21 November 2013 - 03:26 PM.


#4 lesterj1

    HR 2

  • Members
  • 19 posts
  • Location:Tampa, FL

Posted 22 November 2013 - 09:47 AM

Hi Rob,

 

Penguin is a separately run algorithm that is applied periodically.  That's why there's always such a big deal when another Penguin update is released.  Panda was the same way for a while, but now has been integrated into the main algorithm.  So, Panda is dynamic, Penguin is not.

 

I'd love to be able to report about a recovery, but at this point I've only had one client hit by Penguin (they were hit 1 week after I brought them on board) and we're still in the process of cleaning up their link profile.  That said, I've seen dozens of first hand Penguin recovery stories posted around.  Just do a search and you should be able to find some.

 

Lastly, there is something you need to keep in mind about Penguin recoveries.  The term "recovery" is deceptive.  The recovery that happens is that the penalty (manual or algorithmic) is lifted.  It isn't that all your rankings come back.  A site hit by Penguin was hit because a bunch of links were considered to be against Google's guidelines and were intended to manipulate the search rankings.  In order to have that penalty removed, you have to remove the manipulative links, which means you will no longer have the benefits (keywords or link authority) coming from those links and your rankings will be lower.  So, don't expect to get your rankings back just by removing some links.  Your rankings will be lower.  The goal is to get the penalty removed so you are free to start building your site's authority back up naturally.



#5 rob1

    HR 2

  • Members
  • 12 posts

Posted 22 November 2013 - 10:42 AM

Thanks Lester, this explains it very well. As far as your recovery comments go, understood. However, my site shares the same type and number of quality links as my competitors on the first page. Unfortunately I have a whole bunch of directory links as well that I need to clean up. Even after cleaning up, I should still have just as many, if not more, good links, so I would expect to regain my rankings.

 

A few days ago I was down to page 8. I have already made some on-site changes to pages, removed all of my poor quality outgoing links, and consolidated all of my outgoing link pages into one quality, related-links page. Today I see my site up to page 6. Not sure if this is just Google fluctuation or a result of the changes. Now I have to start asking webmasters to remove me from some of these directories...ugh!



#6 Michael Martinez

    HR 10

  • Active Members
  • 5,157 posts
  • Location:Georgia

Posted 22 November 2013 - 10:58 AM

Hi Michael, thanks for taking the time to reply and help me understand. First let me clarify: I said my SERPs, or search rankings, not my traffic....

Search marketing is all about the traffic. You talk about SERPs, I talk about traffic. You won't see more traffic while you have low rankings in the SERPs that interest you.
 

...The point of my first question was that I thought Penguin is an algo change that basically can push your site down in the SERPs for doing what Google considers negative. If it is an algo, then why do you need to wait for another Penguin update once you clean up your site? Isn't the algo dynamic? Of course it will take several days for changes in links and on-page changes to be recrawled by Google, but at that point shouldn't they be reflected? Or are you saying that when a new version of Penguin is released, there is some sort of one-time run on every site out there, and any automatic penalties that are discovered are stored in some Google database and stay there until the next update? That doesn't make sense to me. I'm just trying to understand why you would need to wait across updates.

I will offer a slightly different explanation from that provided by lesterj1 (which is a perfectly fine explanation).

People talk about "algorithms" and search engines without really understanding what the search engineers mean by "algorithm". An algorithm is just a process for accomplishing a task. There are things we could call "meta algorithms" (meaning they are sets of algorithms working together). In computer science we call these "systems". In Web search we call them "Bing", "Baidu", "Google", "Yandex", "Blekko", et al.

In 2013 a Web search engine actually consists of several internal search engines: an image search engine, a news search engine, a blog search engine, etc. The user interfaces for the major search engines have been designed to pass your queries on to one or more of these internal search engines.

Each internal search engine has its own algorithms for finding content (crawling), including it in its database (indexing), evaluating that content (filtering, flagging, base scoring), and ordering that content in response to queries (ranking by relevance, freshness, fitness, etc.).

The "master Web search" engine -- the one most people tend to use -- may aggregate results from these various internal search tools (creating what we call a "universal search result" with news injections, image injections, local listings, etc.). The "master Web search" engine may also just provide a list of 10 links, some of which may have been provided by some of those "vertical" engines but in a more low-key way.

So algorithms like Panda and Penguin -- these are not "search engine algorithms". They are what search engineers call "document classifiers". A document classifier looks at a specific document (the content found on a given URL) and applies various rules to that document. The classifier may create metadata about the document, it may look for specific patterns in the document, it may extract specific types of data from the document, etc.

A major search engine like Bing or Google probably applies anywhere from several dozen to several hundred document classifiers to any given URL. Some of these classifiers are used to hunt for low quality content and/or spammy signals.
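
To make that concrete, here is a toy sketch (in Python) of what a document classifier does conceptually: it reads one document and emits metadata or flags about it. The function name, rule, and threshold below are invented purely for illustration; they are not anyone's real code.

    # Illustrative only: a toy "document classifier" in the sense described above.
    # The rule and threshold are invented for this sketch.
    def keyword_stuffing_classifier(text: str, keyword: str) -> dict:
        """Read one document and emit metadata about a single pattern."""
        words = text.lower().split()
        count = words.count(keyword.lower())
        density = count / len(words) if words else 0.0
        return {
            "keyword": keyword,
            "density": density,
            "flag_stuffing": density > 0.05,  # arbitrary threshold
        }

    doc = "cheap widgets cheap widgets buy cheap widgets today cheap widgets"
    print(keyword_stuffing_classifier(doc, "cheap"))
    # {'keyword': 'cheap', 'density': 0.4, 'flag_stuffing': True}

A real classifier would of course look at far more than raw keyword density, but the shape is the same: document in, metadata out.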

The Panda algorithm takes the data provided by other document classifiers and (we believe) probably computes some sort of score that [people outside of Google] may refer to as "a Panda score". This is purely a conjecture based on what Google has said about Panda. In 2011 Danny Sullivan summarized this "Panda score" concept in a pretty nice article. The Panda score would be added in to a page's base IR (Information Retrieval) score in response to each query, just like its internal PageRank score is added to that IR score.

The Penguin algorithm is not as well-described by Google as the Panda algorithm, but it is essentially trying to do automatically what the Google spam team has been doing manually: identify manipulative links and content. The Penguin algorithm, in my opinion, most likely works in a fashion similar to Panda. I believe that it probably takes data provided by other document classifiers and uses that data to compute a "Penguin Score". The Penguin Score may be added to a base IR score just like a Panda score or a PageRank value. There is no evidence supporting this point of view, but I find it hard to imagine Google NOT using the same model for a lot of its internal algorithms.
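
If that guess is right, the arithmetic would look something like the sketch below (again in Python, with names and numbers I made up for illustration). The only point is that the Penguin value would be one more term folded into a page's score for a query, not a separate ban list.

    # Conjectural sketch of the additive scoring model described above.
    # None of these names or numbers come from Google; they are placeholders.
    def page_score(ir_score, pagerank, panda_score, penguin_score):
        """Base relevance plus quality adjustments, summed per query."""
        return ir_score + pagerank + panda_score + penguin_score

    # A page with strong base relevance but a negative "Penguin" term can end
    # up ordered below a weaker page with a clean profile.
    print(page_score(ir_score=10.0, pagerank=2.0, panda_score=0.0, penguin_score=-5.0))  # 7.0
    print(page_score(ir_score=8.0, pagerank=1.5, panda_score=0.5, penguin_score=0.0))    # 10.0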

Because Panda and Penguin are trying to simulate human judgment, the Google engineers run them in "batch mode", offline. Their results are computed and stored -- probably evaluated through some sort of random quality testing process -- and then the data is integrated into the live search database(s). Earlier this year Google said that Panda was going to be more smoothly integrated into the process; everyone assumed that meant it would be automated, but Google subsequently revealed that it didn't happen automatically. They probably just streamlined their human interaction.

Penguin does not seem to be that far evolved, yet.

So every offline algorithm (and I am sure there are more we don't know about) involves a lag time between the fetching of the data that it processes and the release of its computed results into the online database. During that lag time a whole lot of crawling, indexing, and reordering goes on due to the "live" or "real-time" algorithms that ARE integrated into the main search tools.
 

As far as case studies, yes, I've read those as well and most had to do with manual penalties. I was hoping someone here could report their results first hand. Given how many people have been impacted by Penguin, surely someone has tried and either succeeded or failed here.

The problem with all the case studies is that it's impossible for a third party to confirm that these sites really were hit by Penguin.

Penguin did not necessarily replace all the other processes the spam team has in place. It most likely just automated some of the most repetitive tasks in the spam identification and penalization queues.

Technically, if the downgrade is not applied by a human Google won't call it a penalty. But so far they haven't shown that the effect is any different. In their internal consoles where they analyze Website performance in their search results they most likely see something like a "Penguin Score", "Penguin Flag", etc. that tells them a site has been downgraded by the algorithm. They almost certainly see something that says "manual action", too.

Edited by Michael Martinez, 22 November 2013 - 10:59 AM.




