
Many Versions Of One Site In 13 Countries


4 replies to this topic

#1 Say Yebo

    HR 4

  • Active Members
  • 222 posts
  • Location: USA

Posted 10 June 2009 - 11:54 AM

I'm on a learning curve with a new client who has international needs that are new to me.

I cannot mention the industry due to privacy issues, but for the sake of this, let's pretend it's a ZOO.

The concept is a series of zoo web sites that dozens of zoos in 13 different countries can sign up for. The purpose of the exercise is for people to easily locate a nearby zoo in any one of the 13 countries.

Each zookeeper will get his own site where he can personalize some of the pages: Hours of Operation, Directions to our Zoo, Animal of the Month, etc.

In addition, each site will also have many pages of animal information that is common to every site, as an educational resource for zoo visitors.

My questions are:

1. Is there some code that should appear on each site to distinguish its country for the search engines? (Let's assume at this point that the URL may not identify the country.) See the sketch after these questions for the kind of thing I'm picturing.

2. Will the pages of duplicated animal information cause any issues? Should I block them from being indexed? Keep in mind the purpose of the sites is for people to be able to "locate a zoo nearby" - the animal info is merely to enhance their visit to the site once they've located it. The purpose is NOT primarily to provide people with animal info.
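Just to illustrate what I'm picturing in question 1 - something along these lines in the <head> of each country's pages (purely a sketch on my part; the region code and zoo name are made up, and I don't know yet whether this is even the right approach):

    <head>
      <!-- declares the language and country of this version of the site -->
      <meta http-equiv="content-language" content="en-ZA">
      <title>Hours of Operation - Pretend Zoo (South Africa)</title>
    </head>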

Thanks,
Caro

#2 dowhatnow

    HR 2

  • Active Members
  • 20 posts

Posted 30 July 2009 - 03:26 PM

I know where you are coming from - we are going into about 5 countries in the next year. All of our sites are going to be on subdomains, and I'm not sure how the sites that share the same language are going to perform in the search engines.

I've decided to nofollow a US site because of the duplicate content. Anyone got any ideas on how this will work?
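To be specific, what I had in mind for the duplicated US pages is something like this (just a sketch of the idea, not live anywhere yet):

    <!-- on each duplicated US page: ask crawlers not to index it or follow its links -->
    <meta name="robots" content="noindex, nofollow">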



#3 robmarketshare

    HR 4

  • Active Members
  • 258 posts

Posted 30 July 2009 - 05:33 PM

For starters (if it is a .com, .info, .org or .eu), one could use the geographic targeting setting in Google's Webmaster Tools for .com/fr, .com/nl, .com/be, .com/de, .com/at, and I would think sticking the country at the end of the <title> would not hurt either.
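For example, something like this (the zoo name and wording are just made up to show the pattern):

    <!-- country name tacked onto the end of the title -->
    <title>Opening Hours - Example Zoo Berlin - Germany</title>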

#4 mcanerin

    HR 7

  • Active Members
  • 2,242 posts
  • Location: Calgary, Alberta, Canada

Posted 31 July 2009 - 02:55 PM

This should help you:

As a general rule, duplicate content on sites that are *clearly geolocated differently* is not considered spammy by search engines. They will simply choose and serve the version closest to the visitor.

Just be aware that you are not going to be showing up multiple times for that duplicate content - only once, and the version that shows up will be (in order of importance):

1. The one geolocated to the visitor
2. The one with the higher PR (one of the few remaining uses of PR)
3. The one that is older or more likely to be the original.

Note: this order is based on my experience; I'm pretty sure I've never seen a search engine rep describe it in this manner.

Ian

#5 dowhatnow

    HR 2

  • Active Members
  • 20 posts

Posted 03 August 2009 - 02:48 AM

So basically, if I have a US subdomain with US dollar pricing that shares virtually the same content as the UK site (which is on the main domain), the search engines will not see this as duplicate content?

Edited by Jill, 03 August 2009 - 09:37 AM.




