
Removing Duplicated Content Using Only Noindex At Large Scale (80% Of Content)


2 replies to this topic

#1 LukasTheCurious


    HR 1

  • Members
  • 1 posts

Posted 13 October 2015 - 08:17 AM

Hi everyone,

I am taking care of a large "news" website (500k pages), which took a massive hit from Panda because of duplicated content (70% of it was syndicated). I recommended that all syndicated content be removed and that the website focus on original, high-quality content.

However, this was implemented only partially. All syndicated content is set to NOINDEX (they think it is good for users to see standard news alongside the original HQ content). Of course, it didn't help at all; there has been no change after months. If I were Google, I would definitely penalize a website that has 80% of its content set to NOINDEX because it is duplicated. I would consider such a site to be "cheating" and not worth showing to users.
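For concreteness, "set to NOINDEX" means each syndicated page serves a robots noindex directive, either as an X-Robots-Tag response header or a robots meta tag. This is a rough Python sketch of how I spot-check that coverage; the URLs are placeholders for illustration, not the site's real ones, and the regex is a simplification (it assumes the name attribute comes before content):

# Spot-check whether pages carry a noindex directive, either in the
# X-Robots-Tag response header or in a <meta name="robots"> tag.
# Hypothetical example; the URLs below are made up for illustration.
import re
import urllib.request

def is_noindexed(url: str) -> bool:
    with urllib.request.urlopen(url) as resp:
        # Server-side directive: X-Robots-Tag header
        if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
            return True
        html = resp.read().decode("utf-8", errors="replace")
    # Page-level directive: <meta name="robots" content="noindex, ...">
    # (simplified regex; assumes name= appears before content=)
    match = re.search(
        r'<meta[^>]*name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    return bool(match and "noindex" in match.group(1).lower())

for url in ["https://example.com/syndicated-story",
            "https://example.com/original-story"]:
    print(url, "->", "noindex" if is_noindexed(url) else "indexable")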

What do you think about this "theory"? What would you do?

Thank you for your help!



#2 qwerty


    HR 10

  • Moderator
  • 8,695 posts
  • Location:Somerville, MA

Posted 15 October 2015 - 05:33 AM

I think that if they're going to republish someone else's content, noindex is the way to go. That's not enough as a response to getting hit by Panda, though. You need to have more original content, and that content needs to be of high quality. It's not necessarily a matter of the ratio of original to syndicated content (especially if the syndicated content is noindexed). It's more about the quality of the content you want the search engine to index.
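To make that concrete: one common way to noindex syndicated pages without hiding them from users is the X-Robots-Tag response header, set server-side wherever the site flags an article as syndicated. This is only a minimal sketch, using Flask as a stand-in framework; the route and the in-memory article store are hypothetical, not anyone's real setup:

# Minimal sketch: serve syndicated articles with an X-Robots-Tag: noindex
# header so search engines drop them from the index while users still
# see them. Flask is a stand-in; the article store is hypothetical.
from flask import Flask, make_response

app = Flask(__name__)

ARTICLES = {  # slug -> (body, is_syndicated)
    "our-original-report": ("Original reporting...", False),
    "wire-service-story": ("Republished wire copy...", True),
}

@app.route("/news/<slug>")
def article(slug):
    body, is_syndicated = ARTICLES[slug]
    resp = make_response(body)
    if is_syndicated:
        # "noindex, follow": keep the page out of the index,
        # but still let crawlers follow its links.
        resp.headers["X-Robots-Tag"] = "noindex, follow"
    return resp

Using "noindex, follow" rather than bare "noindex" keeps the internal links on those pages crawlable even though the pages themselves stay out of the index.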



#3 Michael Martinez


    HR 10

  • Active Members
  • 5,325 posts
  • Location:Georgia

Posted 15 October 2015 - 02:25 PM

 

LukasTheCurious, on 13 October 2015, said:

    Hi everyone, I am taking care of a large "news" website (500k pages), which took a massive hit from Panda because of duplicated content (70% of it was syndicated).
I find it highly doubtful that this is what led to a Panda downgrade.

 

Removing the syndicated content isn't going to do much good either, although, depending on how the site is designed, the reduction could have a beneficial secondary effect.

 

With Panda you should be looking at the site design and improving user experience, not worrying about syndicated content.





