New Robots.txt File Not Being Used



#1 doughayman

    HR 3

  • Active Members
  • 91 posts

Posted 07 May 2010 - 05:30 PM

Hi,

I have a robots.txt file for a domain. The file contains 230 Disallow statements, all of which are syntactically valid. Googlebot routinely reads this file, and WMT indicates that the processing of it is "successful".

My problem is that a URL covered by an entry in an old version of robots.txt from several months back is still getting blocked, when in fact I do not want it blocked.

For whatever reason, it seems that this old version of robots.txt is still being used by Google, despite the fact that I've made many changes to the file over the last month and Google has spidered it.

Is there a standard period of time that typically needs to elapse before a new version of robots.txt becomes the de facto standard for the site? Is there something I can do to force Google to use the new version?
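
In case it's useful, here is roughly how I've been sanity-checking the live file outside of WMT, using Python's built-in robots.txt parser. This is just a quick sketch -- example.com and the path are stand-ins for my real domain and the URL that is wrongly blocked:

import urllib.robotparser

# Fetch and parse the robots.txt that is live on the server right now.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# can_fetch() returns True if the given user-agent is allowed to crawl the URL.
# If this prints True, the live file does not block the URL, and the blocking
# reported in WMT is coming from somewhere else.
print(rp.can_fetch("Googlebot", "http://www.example.com/some-formerly-blocked-page/"))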

Thanks in advance !

#2 Scottie

    Psycho Mom

  • Admin
  • 6,294 posts
  • Location:Columbia, SC

Posted 07 May 2010 - 09:05 PM

So, you checked GWT and they are still showing as "Restricted by robots.txt"?

#3 doughayman

    HR 3

  • Active Members
  • 91 posts

Posted 09 May 2010 - 09:01 PM

QUOTE(Scottie @ May 7 2010, 10:05 PM)
So, you checked GWT and they are still showing as "Restricted by robots.txt"?


Yes Scottie, precisely. Either the reporting in GWT is erroneous, or for some reason they have my old robots.txt file in cache, and despite downloading the new robots.txt, they are still not using it. Perplexing, to say the least.
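
One thing I still want to rule out (just a guess on my part) is that the server, or something caching in front of it, is handing Googlebot an older copy of robots.txt than what a browser request gets. A rough way to check is to fetch the file while identifying as Googlebot -- sketch below, with example.com standing in for my real domain:

import urllib.request

# Request robots.txt with Googlebot's user-agent string, to confirm the server
# returns the same new file that a normal browser request sees.
req = urllib.request.Request(
    "http://www.example.com/robots.txt",
    headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
)
with urllib.request.urlopen(req) as resp:
    body = resp.read().decode("utf-8", errors="replace")

# Eyeball the start of the file to see whether it's the old or new version.
print(body[:500])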

#4 qwerty

    HR 10

  • Moderator
  • 8,546 posts
  • Location:Somerville, MA

Posted 10 May 2010 - 12:40 AM

When you say that WMT is reporting the file as restricted, I take it you mean that it's listed as restricted under Diagnostics > Crawl Errors.

Have you tried using the Test Robots.txt tab under Site Configuration > Crawler Access? You enter the URL in question, and if it's blocked, it should report which line in the file is being read as blocking it. If Google is accessing the current robots.txt file (and you say it is), and there's no rule in the file that's actually blocking the URL (and you say there isn't), then I wonder whether it's possible for one section of WMT to report a given URL as blocked while another section reports the opposite.
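
If you want a second opinion outside of WMT, something like the rough script below will at least tell you which Disallow line in the live file literally matches the URL path. It only does naive prefix matching (it ignores Allow rules and wildcards), so treat it as a sanity check rather than a reproduction of Googlebot's logic; the domain and path are placeholders you'd swap for your own:

import urllib.request

ROBOTS_URL = "http://www.example.com/robots.txt"   # placeholder domain
TEST_PATH = "/some-formerly-blocked-page/"         # placeholder path

# Download the live robots.txt and split it into lines.
with urllib.request.urlopen(ROBOTS_URL) as resp:
    lines = resp.read().decode("utf-8", errors="replace").splitlines()

# Report every Disallow line whose prefix matches the test path.
for lineno, line in enumerate(lines, start=1):
    rule = line.split("#", 1)[0].strip()            # drop trailing comments
    if rule.lower().startswith("disallow:"):
        prefix = rule.split(":", 1)[1].strip()
        if prefix and TEST_PATH.startswith(prefix):
            print("line {0}: {1!r} matches {2}".format(lineno, rule, TEST_PATH))

If nothing prints, the current file doesn't contain a rule that obviously matches the URL, which would point back at WMT's reporting.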



