Pages Not Crawled
Posted 04 May 2010 - 01:59 PM
Six months later, I have only 545 pages indexed, which is about 15% of all the pages on my site. The other pages are unique content pages, and some are linked from the navigation on most of my pages.
I was wondering if making a sitemap would speed up the process or if there was a way to tell Google to go index these pages.
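A sitemap is just an XML file listing the URLs you want crawled, following the sitemaps.org protocol. As a rough sketch (the example URLs are placeholders, not the poster's actual site), one could be generated with the Python standard library:

```python
# Hypothetical sketch: build a minimal sitemap.xml for a list of page URLs.
# The URLs below are made-up placeholders; substitute your real pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string following the sitemaps.org protocol."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

pages = ["http://www.example.com/", "http://www.example.com/about"]
print(build_sitemap(pages))
```

The resulting file would be uploaded to the site root and submitted through Webmaster Tools; whether that actually speeds up indexing is the question being asked here.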
Posted 04 May 2010 - 02:47 PM
Are you sure that your hosting company isn't still doing something to throttle the spiders?
That would be the likely cause.
Posted 04 May 2010 - 03:00 PM
Just wondering if the robots slowed down because there was a noindex tag a while ago. The robots might still be unsure whether they should index it.
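The noindex in question would be a robots meta tag in the page head, e.g. `<meta name="robots" content="noindex,follow">`. A quick way to check whether a leftover tag is still present is to parse the page HTML; a minimal sketch using only the standard library (the sample HTML is invented for illustration):

```python
# Sketch: detect a leftover "noindex" robots meta tag in a page's HTML.
# The sample markup below is made up, not from the poster's site.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        d = dict(attrs)
        if d.get("name", "").lower() == "robots" and "noindex" in d.get("content", "").lower():
            self.noindex = True

sample = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
parser = RobotsMetaParser()
parser.feed(sample)
print(parser.noindex)  # True means spiders are being told not to index the page
```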
Posted 04 May 2010 - 04:52 PM
If you're still getting about the same number of referrals from Google as before and you have not added much content, you should be okay.
If you have added pages and are getting more traffic from Google, you should also be okay.
If you are receiving less traffic, then maybe something is wrong.
It could also be that some of your inbound links are no longer passing value to your site.
Posted 05 May 2010 - 10:10 AM
I fixed it and removed the pages with the removal tool in Webmaster Tools. I think the bot got pretty mixed up by these changes.
That was about two months ago, and I removed them about a month ago. Maybe it will take longer for the spider to figure it out.
Just wondering whether a sitemap would be a benefit at this point, or whether I should just leave it as is.
Posted 05 May 2010 - 10:21 AM
If Google is trying to index URLs that don't exist, I don't think that adding a sitemap will change that. Assuming there are no bad links on the site pointing to the nonexistent URLs, Google will check those URLs however many times it needs to until it's convinced that the 404 responses it's getting are always going to be there, and then it will stop checking them. Hopefully, once that process is complete, it will move on to finding legitimate links on your site to legitimate pages and crawling those URLs.
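One way to see where that process stands is to group crawled URLs by the HTTP status they return. A small sketch (the URLs and status codes are invented examples, not real crawl data); note that a 410 Gone response tells crawlers the removal is permanent, which can settle the question faster than a 404:

```python
# Sketch: triage crawl results by HTTP status code.
# The crawl data below is made up for illustration.
def triage(results):
    """Split a {url: status_code} dict into live, gone, and redirected buckets."""
    live, gone, redirected = [], [], []
    for url, status in results.items():
        if status == 200:
            live.append(url)
        elif status in (404, 410):
            gone.append(url)  # crawlers will eventually stop re-checking these
        elif status in (301, 302):
            redirected.append(url)
    return live, gone, redirected

crawl = {"/page-a": 200, "/old-page": 404, "/moved": 301}
print(triage(crawl))
```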
Posted 05 May 2010 - 10:39 AM
I will wait and see what happens.
Posted 05 May 2010 - 12:52 PM
Thanks Jill. Can you tell me what a spider simulator is?
Posted 05 May 2010 - 01:16 PM
Another way to check your site for bad internal links is to use an actual spider program, like Xenu's Link Sleuth, but that requires you to download and install the program. It's very useful, though, not to mention free.
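In essence, a tool like Xenu's Link Sleuth fetches each page, extracts its links, and flags any that lead to missing pages. A toy version of that idea, with an in-memory "site" dict standing in for real HTTP fetches (the pages and the deliberate `/contatc` typo are entirely made up):

```python
# Toy sketch of what a desktop link checker does: walk internal links
# and report any that point to pages that don't exist.
# The "site" dict below stands in for real HTTP fetches and is made up.
import re

site = {
    "/": '<a href="/products">Products</a> <a href="/contatc">Contact</a>',
    "/products": '<a href="/">Home</a>',
    "/contact": '<a href="/">Home</a>',
}

def find_broken_links(pages):
    broken = []
    for page, html in pages.items():
        for target in re.findall(r'href="([^"]+)"', html):
            if target not in pages:
                broken.append((page, target))  # this link would return a 404
    return broken

print(find_broken_links(site))  # reports the misspelled /contatc link on the home page
```

A real checker would fetch live URLs and use a proper HTML parser, but the crawl-extract-verify loop is the same.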