
Redirects For A Global Website

2 replies to this topic

#1 BeantownSEO


    HR 2

  • Members
  • 20 posts

Posted 23 July 2009 - 04:09 PM

I am currently working on a client's global website and will be the first to admit I am not the most technically astute person. The client's global website, www.client.com, has a 302 redirect set up to //www.client.com/index.jspx;jsessioni...E9F44CAADE.pa05, which is then redirected to the English version of the site, www.client.com/en-US/index.jspx;jse...E9F44CAADE.pa05.

So in the search results for the brand, you see www.client.com, but when you click the result, you are redirected to www.client.com/en-US/index.jspx. They have IP detection set up on the main site, www.client.com, so that it detects where you are and redirects you to the proper international version of the site.

So my question is: is the 302 redirect the proper HTTP response for www.client.com, or should it be a 301 redirect, or should www.client.com return a 200 OK the way all the different versions of the site do (www.client.com/en-US/index.jspx, for example)? I want to make sure the proper redirects are in place so that all link popularity is funneled and transferred properly, and that multiple URLs aren't indexed for the same page because things aren't redirecting properly.
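To make the 301-vs-302 concern concrete, here is a minimal sketch that walks a redirect chain (modeled as a simple mapping) and reports whether every hop is a permanent 301, which is what you'd want for link popularity to be passed along. The URLs and the chain itself are hypothetical stand-ins for the client.com setup described above, not the site's actual responses.

```python
# Sketch: model a redirect chain as URL -> (status code, target URL) and check
# whether every hop is a permanent (301) redirect. Hypothetical data only.

def chain_is_permanent(chain, start):
    """Walk the chain from `start`; return (final_url, True if all hops are 301)."""
    url, all_301 = start, True
    while url in chain:
        status, target = chain[url]
        if status != 301:
            all_301 = False
        url = target
    return url, all_301

# A made-up chain mirroring the behavior described in the post:
example = {
    "http://www.client.com/": (302, "http://www.client.com/index.jspx"),
    "http://www.client.com/index.jspx": (302, "http://www.client.com/en-US/index.jspx"),
}

final, permanent = chain_is_permanent(example, "http://www.client.com/")
print(final, permanent)  # the chain ends at the en-US page and is not all-301
```

In a chain like this, any 302 hop signals "temporary" to the engines, which is why the status of each hop matters, not just the first one.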

They have a few other branded websites, and all of them handle this situation differently, which makes it difficult to say. Another one of their sites redirects in this manner:

#1 Server Response: http://www.client2.com
HTTP Status Code: HTTP/1.1 301 Moved Permanently
Date: Thu, 23 Jul 2009 19:31:02 GMT
Server: Apache/2.2.8 (Unix) mod_ssl/2.2.8 OpenSSL/0.9.7a PHP/5.1.6
Location: http://www.client2.com/index.jsp
Vary: Accept-Encoding
Content-Length: 240
Connection: close
Content-Type: text/html; charset=iso-8859-1
Redirect Target: http://www.client2.com/index.jsp

#2 Server Response: http://www.client2.com/index.jsp
HTTP Status Code: HTTP/1.1 301 Moved Permanently
Date: Thu, 23 Jul 2009 19:31:02 GMT
Set-Cookie: JSESSIONID=E4F6A7CF456FA8E15C54F490A07D4AD5.el46; Path=/
Last-Modified: Thu, 23 Jul 2009 19:31:02 GMT
Cache-Control: max-age=60
Location: /languagejump
Connection: close
Content-Type: text/html; charset=utf-8
Vary: Accept-Encoding
Redirect Target: /languagejump

#3 Server Response: /languagejump
Wrong service type or malformed URL

Any help on which is the proper way to ensure proper indexing and funneling of link popularity would be greatly appreciated!

#2 adibranch


    HR 5

  • Active Members
  • 332 posts

Posted 24 July 2009 - 06:31 AM

A redirect from the root to a subdirectory or file is bad, for various reasons. If they can't place the site in the root (which, as developers, they can; they just can't be bothered), then set up the folder/file pointer in .htaccess.
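As a rough illustration of what that .htaccess pointer could look like, here is a hypothetical sketch assuming Apache with mod_rewrite enabled; the /en-US/index.jspx path is borrowed from the original post and the exact rule would depend on the site's actual structure.

```
# Hypothetical .htaccess sketch (Apache, mod_rewrite assumed):
RewriteEngine On
# Send the bare root to the canonical index in a single permanent hop,
# instead of chaining through temporary redirects:
RewriteRule ^$ /en-US/index.jspx [R=301,L]
```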

#3 Randy


    Convert Me!

  • Moderator
  • 17,540 posts

Posted 24 July 2009 - 07:23 AM

Okay, there are definitely some redirect issues you'll probably want to get a better handle on. But that's not the part that concerns me most.

In your question you indicate the final URL has what looks to be a session ID in it. In this case a Java session ID, by the looks of it; that's what the jsessionid stuff is.

So my question is whether or not these session IDs actually show up in the URL strings. And if they do for you using a normal browser, does the server give the search engines a free pass around them? Or is it a required element? Or a required element only if the browser/user-agent doesn't allow a cookie to be set?
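For what those jsessionid URLs look like in practice: Java app servers can append a ";jsessionid=..." matrix parameter to the path when a cookie can't be set. This little sketch strips that parameter so URLs can be compared in canonical form; the sample URL is a made-up stand-in built from the values quoted earlier in the thread.

```python
import re

# Strip a Java ";jsessionid=..." matrix parameter from a URL so that the
# session-free, canonical form can be compared or reported on.

def strip_jsessionid(url):
    return re.sub(r";jsessionid=[^?#/]*", "", url, flags=re.IGNORECASE)

url = "http://www.client.com/en-US/index.jspx;jsessionid=E4F6A7CF456FA8E15C54F490A07D4AD5.el46"
print(strip_jsessionid(url))  # -> http://www.client.com/en-US/index.jspx
```

If the engines are indexing URLs with and without the session ID, that's exactly the duplicate-URL problem the original poster is worried about.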

Most likely you're going to want to do a little research and try to get as close as possible to viewing the site the same way the search engine spiders do. You can do this somewhat by using the site: operator at the various search engines, paying attention to the URL structure of the pages they have indexed, and reviewing the cache they have on file for each page.

Another way to do it is to set up your browser to look (and act) like the search engine spiders. In other words, you'll want to tweak the way the browser reports itself so that it looks like Googlebot or Slurp, and also have it not accept cookies and not process JavaScript or client-side Java.
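The same spider's-eye check can be scripted. Here is a minimal sketch that builds a request reporting itself as Googlebot with no cookie handling installed; the user-agent string is the commonly published Googlebot token, and the target URL is a hypothetical stand-in (the request is only constructed here, not sent).

```python
import urllib.request

# Build a request that identifies itself as Googlebot. Because no cookie
# processor is installed, opening it would send and store no cookies,
# roughly approximating what a spider's fetch looks like to the server.

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

req = urllib.request.Request(
    "http://www.client.com/",  # hypothetical target
    headers={"User-Agent": GOOGLEBOT_UA},
)
print(req.get_header("User-agent"))  # confirm the spoofed user-agent is set
```

Opening the request with urllib.request.urlopen(req) and inspecting the response headers and redirect hops would then show whether the server treats this "spider" differently (for example, by leaving the jsessionid out of the URLs).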

That's where I'd start. Mostly because whether the pages can be spidered and cached properly is the most important thing. Only when you've confirmed the pages can be spidered and cached do you need to start sorting out the redirect stuff. That's a whole other ball game.
