Addon Domains And Robots.txt
Posted 16 June 2007 - 05:46 PM
My question is: how should I configure robots.txt for the main site and for each separate site? If I block the main robots.txt from crawling the other sites' directories and then give each site its own robots.txt in its respective directory, will each site get crawled properly?
And how will this setup affect my search engine results for the main site and the addons? Or should I reconfigure the server so that each site has its own exclusive /public_html/site/ root directory?
thanks in advance.
Posted 16 June 2007 - 06:37 PM
Posted 16 June 2007 - 10:42 PM
The robots.txt in your DocumentRoot (which is almost certainly public_html) should restrict access to the folders you've called myothersite1 and myothersite2. This will deter the spiders from crawling that content as part of the main site and eliminate the potential for duplicate content.
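As a sketch, the robots.txt in the main DocumentRoot would look something like this (assuming the addon folders are named myothersite1 and myothersite2 as above):

```
User-agent: *
Disallow: /myothersite1/
Disallow: /myothersite2/
```

That tells compliant spiders visiting the main domain to skip those two folders entirely.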
Inside each of those folders (which now become the DocumentRoot for each corresponding domain), you can put different copies of robots.txt -- if necessary -- to exclude anything specific to those domains. Those robots.txt files will only be seen when the spider visits the domains aligned with the folders.
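For example, a robots.txt placed inside public_html/myothersite1/ applies only to requests for that addon domain. The paths below are just hypothetical examples of things you might want to keep out of the index:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /admin/
```

A spider requesting myothersite1.com/robots.txt gets this file, not the one in the main site's root.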
If you have your own server and no limitations imposed by others, there's really no reason at all to create DocumentRoots inside of other DocumentRoots. It's certainly doable, as outlined above, and it's not even all that tough, but it IS an unnecessary complication with no discernible benefits. Most people who do it are on a shared server and want to save a few bucks on hosting.
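On your own server, the cleaner layout would be sibling DocumentRoots, one per domain. A minimal Apache sketch (the paths and hostnames here are placeholders, not anything from the original setup):

```apache
<VirtualHost *:80>
    ServerName www.mainsite.com
    DocumentRoot /home/user/sites/mainsite
</VirtualHost>

<VirtualHost *:80>
    ServerName www.myothersite1.com
    DocumentRoot /home/user/sites/myothersite1
</VirtualHost>
```

Each site then has its own robots.txt at its own root, with no overlap to worry about.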
Posted 17 June 2007 - 04:25 PM
And actually, I wish I had found this forum when I first started my site last year. It would have made things much easier.