Dynamic URLs and Duplicate Content
Started by prabhu, Nov 26 2009

#1  Posted 26 November 2009 - 12:18 AM

My site uses dynamic URLs, and a few duplicate pages exist there with the same content under different URLs.
For some of these duplicates, both URLs have been crawled, and when I check the cache, both show the same cached copy. I have not done any SEO work on them, such as 301 redirects or rel=canonical.
In a few other cases, only one of the two URLs has been crawled and the second has not. I have also noticed a drop in my rankings.
Please advise me on the right solution:
Should I block the duplicate URLs through robots.txt?
Or should I 301 redirect all the duplicate URLs to the original ones?

#2  Posted 26 November 2009 - 09:40 AM
You seem to have made the incorrect leap of thinking there is some sort of penalty for duplicate content. If so, that's not the cause of your ranking drop, because there is no such penalty for the type of duplication you've described.
Is it smart to make sure your unique content is available from only one unique URL? Yes. But the reasons have nothing to do with search engine penalties.
As to how to fix it, your best bet is to fix whatever links in your site navigation are pointing to the extra URLs. If you can't do that, or the extra URLs have already been discovered, 301 redirects are usually your best solution. They help merge any link popularity back into a single location.
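If your site runs on Apache, the 301s can often be handled in an .htaccess file. The lines below are only a sketch: the "print" parameter and the page paths are hypothetical examples, and the exact rules will depend on how your duplicate URLs differ from the originals.

    RewriteEngine On

    # If the query string contains a (hypothetical) "print=1" parameter,
    # send a 301 to the same path with the query string stripped
    RewriteCond %{QUERY_STRING} (^|&)print=1(&|$)
    RewriteRule ^(.*)$ /$1? [R=301,L]

    # A plain duplicate path with no query string can use a simple Redirect
    Redirect 301 /old-duplicate-page.html /original-page.html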
If you can't 301 for some reason, then a canonical <link> element is the second-best choice.
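For reference, the canonical link element goes in the <head> of each duplicate page and points at the version you want indexed; the href below is just a placeholder example.

    <!-- placed in the <head> of each duplicate page; the href is a placeholder URL -->
    <link rel="canonical" href="http://www.example.com/products/widget" />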
If you can't do either of those, then robots.txt exclusion is at least an option, though keep in mind that a blocked URL can't pass its link popularity back to the original, since the engines won't crawl it to see a redirect or canonical tag.
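A robots.txt exclusion might look like the lines below. The "print" parameter pattern is a hypothetical example, and note that the * wildcard in Disallow is an extension honored by the major engines rather than part of the original robots.txt standard.

    User-agent: *
    # Hypothetical pattern: block crawling of any URL that carries a print parameter
    Disallow: /*?print=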
But no matter which of the three you choose, the first thing you need to do is fix whatever links or paths the spiders have used to find these extra URLs.