Posted 19 May 2009 - 07:01 AM
I created a site a while ago. Google indexed about 140 pages of it, and I don't want most of those pages to be indexed anymore. I read an article saying it's possible to remove pages from Google's index by creating an appropriate robots.txt. Is that true? How often is the robots.txt file checked by the Google crawler?
Posted 19 May 2009 - 08:33 AM
[Live url removed per [url=http://www.highrankings.com/forum/index.php?act=boardrules]Forum Rules[/url].]
Posted 19 May 2009 - 08:45 AM
2. Since the pages are already indexed, it'll take a bit of time to get them de-indexed after making your robots.txt changes. Give it a few weeks.
3. If you want to ban all compliant spiders from everything, a simple robots.txt that says
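User-agent: *
Disallow: /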
will do the trick. That's not what you have in your robots.txt currently. Instead, you're disallowing all bots from some subdirectories and disallowing some bots from everything, and none of the blocks in that latter category refers to Googlebot. So as things stand, Googlebot will only stay out of the subdirectories listed in the first block of your robots.txt, since that block applies to all spiders. That's not going to keep it away from all of your pages, though.
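To illustrate (this is a made-up sketch of the structure described above, with hypothetical paths and bot names, since your actual file isn't quoted in the thread), a robots.txt laid out that way looks something like:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/

User-agent: SomeOtherBot
Disallow: /

The first block keeps all spiders out of the listed subdirectories only; the second shuts one named bot out entirely. To shut Googlebot out of everything as well, you'd need a block that names it explicitly:

User-agent: Googlebot
Disallow: /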
Posted 19 May 2009 - 08:58 AM