What Is The Point Of Using Crawl-delay In Robots.txt
Posted 09 June 2009 - 11:03 AM
Posted 09 June 2009 - 11:45 AM
For very large sites where crawling might eat up a lot of bandwidth, it can make sense to include a Crawl-delay directive in the robots.txt file.
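To illustrate, here's a minimal sketch of what the directive looks like (the values are made up for illustration). Keep in mind Crawl-delay is a non-standard extension: Yahoo's Slurp and msnbot treat the value as seconds to wait between requests, while Googlebot ignores it entirely (Google offers a crawl-rate setting in Webmaster Tools instead).

```
User-agent: Slurp
Crawl-delay: 10

User-agent: msnbot
Crawl-delay: 5
```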
Posted 09 June 2009 - 02:45 PM
For example, when Googlebot does a deep spider of your site, it may have several spiders hitting many pages in a relatively short period. If this spidering causes the server to slow to a crawl, you'd definitely want to either fix the root cause of the problem or institute a crawl delay.
As Jill mentioned from the wiki article, crawl delay can also be useful in cases where the spiders are eating up too much bandwidth. Though frankly, in that case I'd encourage you to upgrade your hosting package if things are that tight. Real users are going to use far more bandwidth and server load than the spiders, so if the spiders' extra bit of usage is causing problems, you're always better off fixing the root cause rather than limiting the spiders' ability to crawl your site properly.
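For anyone curious what the directive actually does on the crawler's side, here's a minimal sketch of a polite crawler that honors it. The bot name, site, and paths are hypothetical; it uses the robotparser module from Python's standard library, which in recent versions exposes the Crawl-delay value via crawl_delay().

```python
import time
import urllib.request
import urllib.robotparser

# Hypothetical crawler name and site, for illustration only.
USER_AGENT = "ExampleBot"
SITE = "https://www.example.com"

rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

# Honor the site's Crawl-delay if one is declared for us;
# otherwise fall back to a conservative 1 second between requests.
delay = rp.crawl_delay(USER_AGENT) or 1

for path in ["/", "/about", "/products"]:
    url = SITE + path
    if not rp.can_fetch(USER_AGENT, url):
        continue  # robots.txt disallows this path for our user-agent
    req = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
    with urllib.request.urlopen(req) as resp:
        resp.read()
    time.sleep(delay)  # pause between requests so we don't hammer the server
```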
Posted 10 June 2009 - 01:08 AM
Posted 10 June 2009 - 08:00 AM
You might be surprised at the purpose those entries serve. There's no sense in allowing the robots to crawl areas that really shouldn't be indexed, since you want them to focus on the areas of importance.
It sounds like they're using their robots.txt file exactly as they should.
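For illustration, a file used that way might look something like the sketch below (the directory names are hypothetical). The Disallow lines keep compliant crawlers out of low-value areas so their attention goes to the pages that matter.

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /admin/
Disallow: /search-results/
Disallow: /print/
```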
Posted 10 June 2009 - 02:19 PM