Disallow: /css/? I Always Allowed Spiders to Crawl CSS
Posted 24 August 2010 - 01:26 PM
I just had a developer ask me what benefit there is to letting a spider crawl CSS.
Quite honestly, the only things I have disallowed over the years were the scripting folder, subdirectories that contain duplicate content, duplicate landing pages used for various marketing promotions, and other subdirectories with parts of the site that I do not want spidered.
I have always allowed spiders to crawl the CSS. Did I make a mistake here? As for the benefit, I always thought it had to do with the file names and their relationship to H tags, etc.
What is the answer? DJKay
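To make the scenario concrete, here is a minimal sketch using Python's standard-library `urllib.robotparser`, with a hypothetical robots.txt that blocks the /css/ directory the way the thread title describes. The domain and file names are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that disallows the CSS directory for all spiders
robots_txt = """\
User-agent: *
Disallow: /css/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A compliant crawler would skip the stylesheet but still fetch pages
print(rp.can_fetch("*", "https://example.com/css/style.css"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))     # True
```

Note that robots.txt is only a request to well-behaved crawlers; it does not actually hide the files, which is part of why blocking CSS buys so little.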
Posted 24 August 2010 - 01:39 PM
But I can definitely see a disadvantage to blocking it: doing so gives the search engine reason to question whether any of your formatting is serving the purpose of hiding content from users. Maybe that's just me being paranoid, but that's the first thing I'd think of if I came across a blocked CSS directory.
Posted 24 August 2010 - 02:44 PM
Search engines say they would prefer to crawl the files, but there is no harm in blocking them.
If you're using CSS to do something sneaky and you block the search engine crawlers, you will still probably fail a manual review. Simply blocking crawlers from CSS directories should not trigger a manual review on its own, but remember that competitors can report a sneaky site.
That's about the worst that could happen, so far as I can see.