Web Stats Analysis
Posted 08 October 2003 - 02:09 PM
Just a quick question for you all. I'm not certain if this is the best location for it, so my apologies if this question should appear in another forum.
Specifically, does anyone know of a good list of spider IP addresses? I'm hoping someone has put together a list I could place in my web tracking software's filter to eliminate the lion's share of automated visits from my reports. Any suggestions?
Posted 08 October 2003 - 02:26 PM
Adding spider IP addresses to a stats tracking program won't eliminate the actual visits, if that's what you are trying to do. Are you simply trying to filter the bot visits out of the rest of the stats? If so, what tracking program are you currently using? I'm not sure I understand the question, but most tracking software should automatically separate bot visits from human visits.
a list that I could place in my web tracking software filter to help eliminate the lion's share of automated visits to my site
IMO, tracking bot IPs can be a full-time exercise in futility. That time is better spent working on the site.
Posted 08 October 2003 - 04:10 PM
I monitor the traffic to my site using a combination of DeepMetrix, Hi-Stats, and a home-grown Access database.
I don't want to deny the bots access to my site; rather, I just want to filter out the records of their visits when I review the website data. The only way I can do that is by IP address and/or IP range.
I have found a few lists on various websites covering the major robots and their IP ranges, but I was wondering if there are any lists that people here use and/or recommend.
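Once you have such a list, the filtering itself is straightforward. Here's a minimal sketch in Python of filtering visit records against known spider IP ranges; the specific ranges below are illustrative examples only, since real bot ranges change frequently:

```python
import ipaddress

# Hypothetical spider ranges for illustration -- a real list would need
# regular updates, as discussed elsewhere in this thread.
SPIDER_RANGES = [
    ipaddress.ip_network("66.249.64.0/19"),   # example: a crawler range
    ipaddress.ip_network("207.46.0.0/16"),    # example: another crawler range
]

def is_spider(ip_string):
    """Return True if the IP falls inside any known spider range."""
    ip = ipaddress.ip_address(ip_string)
    return any(ip in net for net in SPIDER_RANGES)

# Filter a batch of visit records down to (presumed) human traffic.
visits = ["66.249.66.1", "192.0.2.10", "207.46.13.5"]
human_visits = [ip for ip in visits if not is_spider(ip)]
print(human_visits)  # ['192.0.2.10']
```

The same range test could be expressed as a query against an Access table of CIDR blocks; the principle is identical.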
Posted 08 October 2003 - 05:58 PM
If you really do want IP addresses, here's how to find them (and an indication of what you'll be facing). The one application where the IP address is absolutely required is cloaking, so a search on spider cloaking should yield some good results.
What you likely will discover is that most of the sites that sell such scripts also provide frequent updates to their IP list, because without such updates cloaking is useless. That the IP addresses change fairly frequently is a given, else they couldn't make money with updates.
If you're using a custom database (Access), you could also program your own search utility. Any time the robots.txt file is accessed, it's a pretty safe bet you got yourself a spider. Create a table to store that information, and you'll soon have a personalized list of IP addresses.
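The robots.txt trick above can be automated against an ordinary server access log. This is a rough sketch, assuming a common/combined-format log; the regex and sample lines are illustrative, not tied to any particular stats package:

```python
import re
from collections import Counter

# Matches the client IP and request path of a common-format log line.
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (\S+)')

def spider_ips(log_lines):
    """Count IPs that requested /robots.txt -- almost always spiders."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and m.group(2) == "/robots.txt":
            hits[m.group(1)] += 1
    return hits

# Hypothetical log lines for demonstration.
sample = [
    '66.249.66.1 - - [08/Oct/2003:14:09:00 -0500] "GET /robots.txt HTTP/1.0" 200 24',
    '192.0.2.10 - - [08/Oct/2003:14:10:00 -0500] "GET /index.html HTTP/1.0" 200 512',
]
print(spider_ips(sample))  # Counter({'66.249.66.1': 1})
```

Run periodically, this builds exactly the personalized spider-IP table described above, with no dependence on third-party lists.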