
Sub-folder, Sub-domain Or Parameter= For White Label Sites


6 replies to this topic

#1 JonathanE (HR 1, Members, 4 posts)

Posted 13 June 2009 - 06:04 AM

We will soon be enabling partners to white label our service (e.g. header, footer, CSS, what content to display). Most wish to display our service within an iFrame on their own website for a number of reasons; however, at a later date we will also have partners who wish to add their own header/footer to our page and link to it in the main window. This will be an automated process in time.

Each white label partner will soon be able to customise the content that shows on their version for any of the products (e.g. hotel descriptions), either manually, or with us pulling their unique content from their own existing CMS where the cost is justifiable.

I am tossing up between using sub-domains, sub-folders, or having a 'parameter=value' added to each URL to identify the partner and have the page act accordingly.

I have spent some hours searching for discussion on this point, but have not found anything that relates to this specific scenario; most discussions are about the SEO benefits of hosting a blog on a sub-domain versus a sub-folder. I would be extremely grateful if some experienced/knowledgeable users could help me identify which method is best (for us and our white label partners) SEO/SEM-wise.

I believe that the parameter=value method will eliminate any issues regarding duplicate content (where they use our standard descriptions); however, it will stop them from being able to analyse their traffic on the white label solution using their own Google Analytics/AdWords accounts.

I am thinking that the sub-domain or sub-folder approach is best, but possibly with each of the white label pages having a noindex directive (or similar - e.g. disallowing the entire sub-domain/sub-folder in the robots.txt other than the sitemap and the pages specified) unless the owner/partner has customised the content and ticked that they want it indexed. Is this a sensible approach, or are there other/better alternatives?
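To make my thinking concrete, here is roughly what I have in mind for one partner's section (the 'examplepartner' name and paths below are just placeholders, and I gather the Allow line is an extension that not every engine supports):

    # robots.txt - block the partner's default-content pages,
    # but leave the sitemap and explicitly listed pages crawlable
    User-agent: *
    Disallow: /examplepartner/
    Allow: /examplepartner/sitemap.xml
    Allow: /examplepartner/hotels/customised-hotel-description

The per-page alternative, as I understand it, would be a robots tag in the head of each default-content page:

    <meta name="robots" content="noindex, follow">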

The sub-folder approach would probably provide the best SEO benefit to the main domain as well (from the inbound links, and the unique content being attributed to the domain), while also providing the white label partners with a better chance of the pages with their unique content showing up in the search results due to the credibility of the domain.

If we let their pages get indexed and show in search engine results, however, then users may end up on the pages without them being shown within the intended iFrame with the partner's header/footer. I am thinking that we could get around this by checking where the user came from, and if they did not come from the partner site we would redirect them to the partner's website with our page loaded within the iFrame (possibly by passing the page URL to the partner's page so it knows which page to display in the iFrame).
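I can't code this myself, so this is only a rough sketch of the idea (the partner URL and the 'page' parameter name are made up); rather than checking the referrer on the server, a simpler client-side check could just test whether our page is inside a frame at all:

    // Runs on our white label page. If the page is not inside a frame
    // (i.e. the visitor landed on it directly, e.g. from a search result),
    // send them to the partner's wrapper page and pass along the URL of
    // the page they wanted, so the partner can load it in the iFrame.
    const partnerWrapperUrl = "https://www.examplepartner.com/travel/"; // placeholder

    if (window.self === window.top) {
      const wanted = encodeURIComponent(window.location.href);
      window.location.replace(`${partnerWrapperUrl}?page=${wanted}`);
    }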

As you can see, there is quite a lot to take into account. I do not know how to program/code, but I do have a decent understanding of how things work and am a quick learner, and I would really like to get a better idea of the options, which is best suited, and the benefits and restrictions.

Thanks heaps for any help, guys.

#2 Jill (Recovering SEO, Admin, 33,244 posts)

Posted 13 June 2009 - 08:56 AM

I would probably want to use subdomains, since it sounds like these are meant to be separate websites rather than part of your website.

#3 JonathanE (HR 1, Members, 4 posts)

Posted 13 June 2009 - 09:31 AM

Hi Jill. Thank you for the reply.

You are correct that each solution will act as its own website in a number of ways, as they will belong to the white label partner and will be heavily customisable (e.g. the ability to alter product descriptions and rank products to determine their default position in the search results).

They will all be using the same standard/default content, however (some of which is sourced from several content providers/booking channels), and the same source code.

It will be a shame to lose the inbound link goodness from our white label partners' websites linking to our domain, and their pages will be dependent on their own efforts to build up the credibility of the sub-domain, but you are probably correct that sub-domains are the right way to go.

I will need to discuss the back end duplication issues with the developers, but from an SEO perspective, would it be wise to stop search engines from indexing the duplicate content on these multiple sub-domains? I believe this would have to be done using robots.txt, telling the engines which pages not to index on each sub-domain, and cannot be done at a content level (e.g. letting them index the page but telling them to ignore certain sections/content)?

#4 Randy (Convert Me!, Moderator, 17,540 posts)

Posted 13 June 2009 - 10:57 AM

At the end of the day the URL won't be so much about SEO as it is about structure, user perception and ease of maintenance. Or in other words, there's not a lot to gain (or lose) on the SEO front between the different types of URLs you mentioned. But there are some differences in the user's perception, the ease of maintenance, and definitely in the structure.

For all three of those I agree with Jill. Subdomains are probably your best bet.

From what you've said above I assume these white label partners are going to be using methods other than search to drive traffic to their "sites", right? And they're aware that they'll have to do this and that their pages will stand no chance of ranking? It's an important thing for them to understand.

As far as how to do it, you could simply robots.txt each of those subdomains. The entire thing. As far as losing the link pop, well, that's not really your link popularity, is it? It's garnered because of the work of your white label partners, so it isn't really yours to control in the first place.
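For the record, blocking an entire subdomain is just a two-line robots.txt served at the root of that subdomain:

    User-agent: *
    Disallow: /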

Are there potentially ways to preserve this for your main site? Maybe. You might be able to do something with the newish rel=canonical tag, so that each of those pages points over to the parent page. Implementation would be fairly easy to accomplish with a tiny bit of scripting, and in theory at least it should work. However, I would argue it's not really your link pop to attempt to preserve in the first place. And even if it were, the total effects of the canonical tag are not yet well understood, it being a relatively new thing. Frankly, I'm not sure I would risk it. There's just not enough data out there yet to give a good idea of all of the possible intended and unintended consequences.
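If you do want to experiment with it, the tag itself is only a link element in the head of each partner page, pointing at whatever you treat as the parent version (the URLs here are just placeholders):

    <link rel="canonical" href="http://www.yourmaindomain.com/hotels/some-hotel" />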

If it were me I'd set 'em up on subdomains and robots.txt those out of the engines. And make darned sure the white label partners fully understand that their pages are not going to get into the search engines.

#5 JonathanE (HR 1, Members, 4 posts)

Posted 13 June 2009 - 02:11 PM

Thanks for your input, Randy.

Our initial partners already have high-traffic websites, but wish to use our service to complement their own (provide a better user experience, and earn additional $ at the same time). As such, they will mostly continue to drive traffic to their own websites as they have done to date.

Some will be providing unique content to be shown through our solution, however (e.g. their own descriptions), and are likely to want to try and rank these pages if possible. As such, would the robots.txt solution allow us to let search engines crawl these specified pages (that have unique content) and block the rest? Maybe the best way would be to do this on the back end somehow, with the pages that have unique content having a different directory structure (e.g. .com/partner/unique/....) to those that are using the standard content, with this directory being allowed.

I will look into canonical meta tags, thanks, but will also be sure to take your advice/warnings on board.

#6 Randy (Convert Me!, Moderator, 17,540 posts)

Posted 13 June 2009 - 06:55 PM

Yeah, you'd be better off having that to-be-crawled content out in a different directory.

robots.txt exclusion pretty much goes only one way. It excludes. So to exclude some files in a subdirectory and not others, you'd have to exclude each individual page that needed it, as opposed to using one instruction to exclude an entire subdirectory.

Unfortunately robots.txt doesn't support the type of regex statements you would need to do a partial exclusion at the directory level. Some engines kinda/sorta support some wildcarding, but it's not part of the spec. And even for those that do support it, the wildcarding is pretty rudimentary.

Best to plan to have those crawlable pages in a different place from the very beginning.
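Just to illustrate the difference (the directory and file names here are only placeholders), splitting the content gives you one clean rule per partner:

    # default/duplicate content lives in its own directory - one rule blocks it all
    User-agent: *
    Disallow: /partnername/default/

    # versus mixing everything together and having to exclude page by page
    User-agent: *
    Disallow: /partnername/hotel-one.html
    Disallow: /partnername/hotel-two.html
    Disallow: /partnername/hotel-three.html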

#7 JonathanE (HR 1, Members, 4 posts)

Posted 13 June 2009 - 10:15 PM

Great, I will discuss the viability of doing that with the developers (having the pages with unique content appear to be in a different directory).

The search result pages will all be blocked, but we can always link to each description page from the sitemap so that search engines can still get to them that way.
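In other words, the partner sitemap would just list those description pages directly, something like this (the URLs below are only placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://partnername.ourdomain.com/unique/some-hotel-description</loc>
      </url>
      <url>
        <loc>http://partnername.ourdomain.com/unique/another-hotel-description</loc>
      </url>
    </urlset>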

Thanks again for your help, this has given me a much better understanding of the best way to move forward!



