Article Distribution and Duplicate Content Bans
Posted 01 December 2005 - 12:46 PM
Posted 01 December 2005 - 01:29 PM
Posted 01 December 2005 - 05:07 PM
I think Google has some way to determine what's an article and what's not. You'll find thousands of articles listed (exact reprints) from countless authors on all sorts of sites.
Posted 01 December 2005 - 05:37 PM
Yes, we have thousands of articles distributed all over the net and have never had a glimpse of a problem. On-site duplication can be a different story, though, as with printer-friendly pages. Google may not penalize them, but it may filter them, or penalize them in the future, since duplicates consume resources. It may even filter both versions accidentally, or on purpose. G has problems with www for goodness' sake, and even with trailing slashes; they have canonical problems which they are trying to fix in the recent updates. If they are having such problems and fixes, why chance that they index the print version rather than the real version? Why chance them dropping the real page because of a glitch? See what I am saying?

Google had a glitch with 301s on our site, which I discussed with G engineers. They were indexing pages that contained URLs redirecting to external sites through a tracking script (302s were bad and caused hijackings, so we used 301s), and indexing them under the URL of the redirect script. Have tracking URLs like that and you have a whole mess of PR problems and duplicates. They also flipped and decided that the non-www version was the correct version, the non-www version which had 0 PageRank (it was 301-redirected), thus crapping out our site for 9 months. It appears to have been corrected, and we are waiting for a full crawl. Stuff goes wrong. Why give them extra stuff to go wrong with?
Since Google is cracking down hard on spammers and similar techniques, I wouldn't be surprised if duplicates are on their radar. A database can be sorted and distributed countless ways. A person can re-sort the database and create many tens of thousands of pages which link to other pages, pushing PR up to those pages, just as every new thread on a forum adds to the total and casts an internal vote for the home page, sub-pages, and so on.
As the poster above stated, linking to a print-only version with no links back creates orphan pages. The print versions that DO link back can be used to manipulate PR, and if Google EVER decides that this is "trying to manipulate", they can easily get rid of such sites automatically with their algo. The print versions also soak up PR that could instead be pushed to the better parts of your site.
When sending out articles for publication, why not do a quick revision first? That way your on-site version is ALWAYS unique and will always be credited (no matter what happens) as being unique. Who is to say they won't step up their filters and allow only one republished copy of an article in the SERPs? Who is to say they won't consider this "manipulation of PR" in the future? Google can filter and show only one duplicate, and who is to say yours will be the one that stays? How will they determine who wins? Most factors they could go by can be deceiving. Let the sites that accept the article deal with that, since your on-site version will be unique. Then no matter what happens, even if they filter the distributed articles down to a single listing, you will still have virtually two listings: one for your unique version and one for the version from the site that doesn't get filtered.
Edited by making-it-big, 01 December 2005 - 06:07 PM.
Posted 03 December 2005 - 03:09 AM
Looking at the source for this page, I find the tag <meta name="robots" content="noindex"> in the body, not the head. Couldn't that be causing the problem?
Posted 03 December 2005 - 03:36 AM
This is the classic example of mixing cause and effect.
The tag is right there in the code:
<meta name="robots" content="noindex">
Well done jbergerot for reminding each and everyone of us (me first, second and third in that list) that Occam is a Prophet: the simplest answer is often the best.
Posted 03 December 2005 - 10:52 AM
Posted 03 December 2005 - 11:55 AM
Well, it was used in the body, not the head of the page. It was contained in <noindex></noindex> tags. And it was used in several different places on the page. So maybe she was trying to exclude only parts of the page (presumably the duplicate content). Is that even possible?
Posted 03 December 2005 - 12:25 PM
Posted 04 December 2005 - 05:03 PM
AFAIK (and I haven't checked in a while), most meta tags don't need to be inside the <head></head> section to be picked up by an SE. The reason is that poor coding is still extremely common across the total number of sites, and strictly ignoring misplaced tags would cause issues. An SE's goal is to be as useful as possible, not to strictly enforce coding semantics.
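The point above, that a robots meta tag can take effect even when it sits in the body rather than the head, is easy to check for yourself. Here is a minimal sketch using Python's standard-library HTML parser that flags every <meta name="robots"> tag and reports whether it sits inside the head; the sample markup is a made-up stand-in for the page being discussed, not the actual site.

```python
from html.parser import HTMLParser


class RobotsMetaFinder(HTMLParser):
    """Collect every <meta name="robots"> tag, wherever it appears."""

    def __init__(self):
        super().__init__()
        self.in_head = False
        self.found = []  # list of (content attribute, was inside <head>)

    def handle_starttag(self, tag, attrs):
        if tag == "head":
            self.in_head = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.found.append((d.get("content", ""), self.in_head))

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False


# Hypothetical page with the tag misplaced in the body, as in the thread.
page = """<html><head><title>Article</title></head>
<body><p>Reprinted article text...</p>
<meta name="robots" content="noindex">
</body></html>"""

finder = RobotsMetaFinder()
finder.feed(page)
for content, in_head in finder.found:
    print(content, "inside head" if in_head else "OUTSIDE head")
# prints: noindex OUTSIDE head
```

A scan like this over your templates would have caught the stray noindex tag long before guessing at a duplicate content penalty.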
And that site is a classic example of a myth made real. <noindex> tags? Where did those come from? An SEO urban myth created a belief in a duplicate content penalty, when in reality it was just the poor use of a tag that blocks all indexing.
And that is why you should always be skeptical of forum claims. Without spending the time to investigate fully, no "I have proof" is really good enough. That can infuriate people, just as it infuriated Nicky, but as this case proves, there is usually a perfectly logical explanation that does not require a new penalty or filter.
In any case, seems to me that Nicky now owes jbergerot a beer, or whatever is his poison!
Edited by projectphp, 05 December 2005 - 12:30 AM.
Posted 04 December 2005 - 05:19 PM
Posted 04 December 2005 - 05:39 PM
Right, so now I have nothing conclusive regarding being de-indexed for duplicate content. I truly did believe that was the problem, hence why I was so adamant in this thread, but I have no problem holding my hands up when I have messed up, so sorry for questioning you guys. The fact is, I now have no idea what happens with DC and whether or not there is a penalty. (I would still be interested to know what Matt says about it, though.)
Well spotted jbergerot
I'll go and hide away in a dark corner somewhere now.
Posted 04 December 2005 - 05:56 PM
Please let us know in a few weeks whether, after removing the offending tags, you start to see more reindexing. Hopefully this was the only cause and things can get back to normal quickly for you.
Posted 04 December 2005 - 06:31 PM