
Google Listing Sub-domains


6 replies to this topic

#1 Jack (HR 2 · Member · 12 posts)

Posted 05 May 2011 - 09:57 PM

On my site's server, I have sub-directories set up for testing new changes to the site before applying them to the live shop. It turns out that Google is indexing all of the pages in those sub-directories. They are not mentioned anywhere (robots file, sitemap, etc.), but they are set up as sub-domains, like test.mydomain.com. All I can think is that Google is using the IP address, which they share, to locate them. Regardless of how they are being found, can someone please explain a way to stop it, short of adding noarchive tags to all of the files or making them password protected? I thought of adding another IP and setting the sub-domains up on it, but without knowing just how Google is finding them, that may be a wasted effort. I would appreciate any thoughts on how to set this up properly.

#2 chrishirst (A not so moderate moderator · Moderator · 7,028 posts · Blackpool, UK)

Posted 06 May 2011 - 10:18 AM

For Google (or any other search tool) to find a URL, there must be a link to it somewhere; search engines cannot see how the domains and subdomains on your server are structured.

#3 Jill (Recovering SEO · Admin · 32,983 posts)

Posted 06 May 2011 - 10:40 AM

I believe there's a way to block the subdomains, perhaps through robots.txt?

#4 qwerty (HR 10 · Moderator · 8,625 posts · Somerville, MA)

Posted 06 May 2011 - 11:27 AM

Yes, each subdomain can have its own robots.txt file.
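For anyone finding this later, a minimal robots.txt at the root of the test subdomain (test.mydomain.com is the example host name from the first post) that blocks all compliant crawlers would look like this:

```txt
# Served at http://test.mydomain.com/robots.txt
# Blocks every crawler that honours the Robots Exclusion Protocol
User-agent: *
Disallow: /
```

Note that the file must be served from the subdomain itself; the robots.txt on www.mydomain.com has no effect on test.mydomain.com.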

#5 Jack (HR 2 · Member · 12 posts)

Posted 06 May 2011 - 01:12 PM

QUOTE(chrishirst @ May 6 2011, 11:18 AM)
For Google (or any other search tool) to find a URL, there must be a link to it somewhere; search engines cannot see how the domains and subdomains on your server are structured.

No, there isn't such a link. It isn't mentioned anywhere on the server other than in its own files. That's why I think it is being reached via the IP.

#6 Jack (HR 2 · Member · 12 posts)

Posted 06 May 2011 - 01:17 PM

QUOTE(Jill @ May 6 2011, 11:40 AM)
I believe there's a way to block the subdomains, perhaps through robots.txt?

Thanks, but Google doesn't always pay attention to the robots file. They won't rank pages they've been told not to crawl, but they may still discover them and list any they think should be listed. But now that you've jogged my memory, I'll try adding a 301 to the sub-domain's .htaccess file.
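For the record, a blanket 301 of the kind described here would be a mod_rewrite sketch along these lines (host names are the examples from this thread; it assumes mod_rewrite is enabled and .htaccess overrides are allowed):

```apache
# .htaccess at the root of test.mydomain.com
RewriteEngine On
# Redirect every request on the test host to the same path on the live site
RewriteCond %{HTTP_HOST} ^test\.mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]
```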

#7 qwerty (HR 10 · Moderator · 8,625 posts · Somerville, MA)

Posted 06 May 2011 - 10:02 PM

If the purpose of a given subdomain is internal testing, I don't think a 301 is what you want. If you redirect requests for pages within the subdomain, you're not going to be able to access those pages for testing.

Robots.txt will keep Google out of the pages in the subdomain, but it's true that it doesn't stop them from being aware of those pages and including their URLs in the index (even if the content itself isn't crawled). You can use Webmaster Tools to have Google remove URLs, however.
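As a sketch of the password-protection option mentioned in the first post, HTTP Basic Auth in the subdomain's .htaccess keeps both crawlers and casual visitors out while the site stays usable for testing (the AuthUserFile path is a placeholder; it should point outside the web root):

```apache
# .htaccess at the root of the test subdomain
AuthType Basic
AuthName "Test site - authorized users only"
# Placeholder path; create the file with: htpasswd -c /path/to/.htpasswd username
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Google won't index pages it can't fetch, and URLs that start returning 401 should drop out of the index over time.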



