Stalled In The Google Index
Posted 21 August 2009 - 02:40 PM
Now there are some 9,000 duplicate tag titles that I thought might be drawing a penalty, but even so there are a lot more URLs to index that Google can't seem to get going on. Any advice or help, please?
Posted 21 August 2009 - 10:29 PM
Well, you have a problem with this statement alone. Google caps XML Sitemaps at 50,000 URLs per Sitemap. So at a minimum you'd have to create multiple Sitemaps, where each contained 50,000 or fewer URLs, then either submit them separately or reference them in a Sitemap Index file.
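For what it's worth, the splitting is easy to script. Here's a minimal Python sketch of the idea; the file names, the example.com domain, and the write_sitemaps helper are just made up for illustration, not any official tool:

```python
# Sketch: split a big URL list into sitemaps of <= 50,000 URLs each,
# plus a Sitemap Index file that references them.
SITEMAP_LIMIT = 50000  # Google's per-Sitemap URL cap

def write_sitemaps(urls, base_url="https://example.com"):
    """Write sitemap1.xml, sitemap2.xml, ... and sitemap_index.xml."""
    chunks = [urls[i:i + SITEMAP_LIMIT]
              for i in range(0, len(urls), SITEMAP_LIMIT)]
    for n, chunk in enumerate(chunks, start=1):
        with open(f"sitemap{n}.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{url}</loc></url>\n")
            f.write("</urlset>\n")
    # The index file just points Google at each individual Sitemap.
    with open("sitemap_index.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in range(1, len(chunks) + 1):
            f.write(f"  <sitemap><loc>{base_url}/sitemap{n}.xml</loc></sitemap>\n")
        f.write("</sitemapindex>\n")
```

You'd then submit just the sitemap_index.xml in Webmaster Tools rather than each file individually.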
Beyond this though, having an XML Sitemap does not guarantee that those URLs will be crawled or indexed. Sitemaps can be useful for sites that have funky URLs that Googlebot has trouble crawling. However, G'bot obviously crawled all of those URLs before, so that doesn't seem to be the case. Even with the Sitemaps I wouldn't expect any miracles, given what you've described.
If it's something that's just happened recently it may be a temporary fluke that'll correct itself. Or it could be an issue with their reported numbers again. (Hey, it's happened before!) I assume you've already looked into possibilities like someone having changed the robots.txt file so that it's now excluding some of your URLs, or a meta robots tag being tweaked in your forum code.
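If you want to rule out the robots.txt angle quickly, Python's standard library can check specific URLs against your live file. A rough sketch; the domain and paths below are placeholders for your own:

```python
# Sketch: test whether Googlebot is blocked from given URLs by robots.txt.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for url in ["https://example.com/forum/tags/widgets",
            "https://example.com/forum/topic/1234"]:
    # can_fetch() answers: may this user agent crawl this URL?
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(url, "->", verdict)
```

Run it against a sample of the URLs that dropped out of the index; if any come back BLOCKED, there's your culprit.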
If it's not recent and there is nothing standing in Googlebot's way, then it comes down to Google changing the way they do things or changing the view they have of your site. Impossible to say on that one without a lot more details, since I've seen both happen and they look pretty darned similar.