
Dynamic URLs and Duplicate Content


1 reply to this topic

#1 prabhu

HR 2 · Active Members · 36 posts

Posted 26 November 2009 - 12:18 AM

I have dynamic URLs, and a few duplicate pages exist with the same content but different URLs.

For some of these duplicate URLs, both pages have been crawled, and when I check the cache, both show the same cached copy. I have not done any SEO work there, such as a 301 redirect or rel=canonical.

In a few other cases, only one URL has been crawled and the second has not. I have also noticed a drop in my rankings.

Please advise me on the right solution:

Should I block the duplicate URLs through robots.txt,
or should I 301 redirect all of the duplicate URLs to the original ones?

#2 Randy

Convert Me! · Moderator · 17,540 posts

Posted 26 November 2009 - 09:40 AM

You seem to have made the incorrect leap of thinking there is some sort of penalty for duplicate content. If so, that's not the cause of your ranking drop, because there is no such penalty for the type of duplication you've described.

Is it smart to make sure your unique content is available from only one unique URL? Yes. But the reasons have nothing to do with search engine penalties.

As to how to fix it, your best bet is to fix whatever links in your site navigation are pointing to the extra URLs. If you can't do that, or the extra URLs have already been discovered, 301 redirects are usually your best solution. They will help to merge any link popularity back into a single location.
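On an Apache server, a 301 redirect like this can be set up in .htaccess with mod_rewrite. A minimal sketch — the query string, script name, and target path here are all made-up examples, not anything from the original poster's site:

```apache
# Hypothetical example: permanently redirect a duplicate dynamic URL
# (/product.php?id=123&ref=sidebar) to its canonical version (/product/123).
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=123&ref=sidebar$
RewriteRule ^product\.php$ /product/123? [R=301,L]
```

The trailing `?` in the substitution drops the old query string, and `R=301` makes the redirect permanent so the engines transfer the link popularity rather than just following it.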

If you can't 301 for some reason, then a canonical <link> is the second-best choice.
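The canonical link is a single tag placed in the <head> of every duplicate version of the page, pointing at the one URL you want indexed. A sketch with a placeholder address:

```html
<!-- Goes in the <head> of each duplicate page;
     href is the one preferred URL (example.com is a placeholder) -->
<link rel="canonical" href="http://www.example.com/product/123" />
```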

If you can't do either of those, then robots.txt exclusion is at least an option.
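A robots.txt exclusion might look like the following, assuming (hypothetically) that the duplicate versions are the ones served through a particular script path — you'd substitute whatever pattern actually distinguishes your duplicates:

```
# robots.txt at the site root: block crawling of the duplicate
# dynamic path (example path only)
User-agent: *
Disallow: /product.php
```

Keep in mind this only stops the duplicates from being crawled; unlike a 301 or canonical link, it doesn't consolidate any link popularity the blocked URLs have already picked up.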

But no matter which of the three you choose, the first thing you need to do is fix whatever links or paths the spiders have used to find these extra URLs.



