
Crawling Error From Google


4 replies to this topic

#1 sanjeet


    HR 1

  • Members
  • 2 posts

Posted 13 January 2011 - 06:45 AM

Hi all,

I am facing a problem with my site, www dot logicspice dot com. In my Webmaster Tools account I received the following errors:

robots.txt unreachable
www dot logicspice dot com  Web  Missing robots.txt

Network unreachable: Network unreachable
We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit.

Please help me work out how to resolve the errors above. Please also take a look at my robots.txt and XML sitemap and let me know if there is an error in them.

Thanks
Sanjeet kumar

#2 Jill


    Recovering SEO

  • Admin
  • 32,963 posts

Posted 13 January 2011 - 10:13 AM

Perhaps your server was down for a bit when they tried to reach it?

#3 sanjeet


    HR 1

  • Members
  • 2 posts

Posted 15 January 2011 - 12:26 AM

QUOTE(Jill @ Jan 13 2011, 08:43 PM)
Perhaps your server was down for a bit when they tried to reach it?


Thanks for the information.
Actually, I was thinking that too, but it has been more than a week, there has been no downtime on the site, and I am still receiving that error.
Webmaster Tools is also showing "page not found" errors for pages that are not actually on my site. What should I do about this?
Please help.

#4 piskie


    HR 7

  • Active Members
  • 1,098 posts
  • Location: Cornwall

Posted 15 January 2011 - 04:51 AM

See if it makes any difference when you remove: Crawl-Delay: 10
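
For anyone who wants to confirm what their robots.txt actually declares before and after removing that line, here is a minimal sketch (mine, not piskie's) using Python 3's standard-library urllib.robotparser. The URL is a placeholder, not the poster's real file.

    # Minimal check of the crawl delay a robots.txt declares for Googlebot.
    # ROBOTS_URL is a placeholder; point it at your own /robots.txt.
    from urllib.robotparser import RobotFileParser

    ROBOTS_URL = "http://www.example.com/robots.txt"

    parser = RobotFileParser(ROBOTS_URL)
    parser.read()  # fetches and parses the file

    # crawl_delay() returns None when no Crawl-delay directive applies.
    print("Crawl-delay for Googlebot:", parser.crawl_delay("Googlebot"))
    print("Googlebot may fetch /:", parser.can_fetch("Googlebot", "/"))

If removing the directive changes what this prints, you know the file Google is fetching is the same one you are editing.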

#5 Michael Martinez


    HR 10

  • Active Members
  • 5,122 posts
  • Location: Georgia

Posted 18 January 2011 - 07:02 PM

This sort of error happens a lot. Usually it is for one of the following reasons:
  • The site in question is blocking Googlebot (could be through .htaccess or a software-implemented block)
  • The web hosting service is blocking Googlebot (some hosts have denied doing this, only to find out later that they were)
  • There is a router somewhere between your site and Googlebot that is causing timeouts
  • Your server is running so slowly that Googlebot is getting timeouts (a quick reachability check is sketched below)
  • Bufferbloat may be affecting the route between your server and Googlebot (it may straighten itself out for a while)
Bufferbloat is a relatively new concept that is probably not yet responsible for many, if any, Googlebot crawl errors, but I suspect it may become a major issue for SEO if commercial network engineers cannot resolve it.
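
To start narrowing down which of these it is, it helps to confirm from outside the server that robots.txt and the sitemap respond promptly at all. Below is a minimal reachability/timeout sketch (mine, not Michael's) using only Python's standard library; the URLs are placeholders for your own robots.txt and sitemap locations.

    # Time how long robots.txt and the sitemap take to respond, with a hard timeout.
    # The URLs are placeholders; substitute your own.
    import time
    import urllib.request

    URLS = [
        "http://www.example.com/robots.txt",
        "http://www.example.com/sitemap.xml",
    ]

    for url in URLS:
        start = time.time()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                elapsed = time.time() - start
                print(f"{url}: HTTP {resp.status} in {elapsed:.1f}s")
        except Exception as exc:  # DNS failure, refused connection, timeout, etc.
            print(f"{url}: FAILED ({exc})")

Keep in mind this only tests the route from your own machine; Googlebot may be coming over a different path, so a clean result here does not rule out the blocking scenarios above.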

Denying/blocking specific IP addresses you are suspicious about is a common enough way to accidentally block a search engine crawler.
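
One way to see whether an accidental block like that is actually firing is to check what your own access log says it returned to requests identifying as Googlebot. Here is a rough sketch (mine, not Michael's) that assumes an Apache-style combined log format; the log path and format are assumptions to adjust for your own server.

    # Tally the HTTP status codes returned to requests whose user agent
    # claims to be Googlebot, from an Apache/nginx combined-format access log.
    import re
    from collections import Counter

    LOG_PATH = "/var/log/apache2/access.log"  # placeholder path

    # combined format: ... "METHOD path HTTP/x.x" status size "referer" "user-agent"
    line_re = re.compile(r'"\S+ \S+ [^"]*" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$')

    statuses = Counter()
    with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = line_re.search(line)
            if match and "Googlebot" in match.group("agent"):
                statuses[match.group("status")] += 1

    print("Status codes returned to Googlebot:", dict(statuses))

A burst of 403s, or no Googlebot requests at all while other crawlers show up, points at a block on your side (.htaccess, firewall, or host) rather than a problem at Google's end.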

Here is how one ISP figured out their firewall had accidentally blocked Googlebot:
Our firewall has an automated mechanism which will block IP addresses deemed to be making too many concurrent connections to our server in a short time. Our security department has whitelisted the google network range that is noticed to make these connections. On top of that we have made the firewall less stringent in the sense we will allow a higher threshold of concurrent connections compared to previously. Based on your feedback, the configuration is just right.

NOTE: I am actually quoting the citation posted at the end of the article.



