Permanent 301 Redirect



#16 Randy

    Convert Me!

  • Moderator
  • 17,540 posts

Posted 21 May 2004 - 08:04 AM

Yup, it's completely kosher if used properly, James. You're not showing the engines anything that you're not also showing to human visitors.

For the .com / .co.uk portion of your question, that's exactly how it should be used.
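For anyone setting this up on an Apache server, a minimal .htaccess sketch for permanently redirecting a secondary domain to the main one might look like the following. The domain names are hypothetical stand-ins for the .co.uk/.com pair being discussed, and this assumes mod_rewrite is enabled:

```apache
# Hypothetical domains, for illustration only: send every request on
# example.co.uk (with or without www) to the same path on example.com,
# returning a 301 so the engines treat the move as permanent.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```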

#17 Shane

    HR 6

  • Active Members
  • 850 posts
  • Location:Atlanta, GA

Posted 21 May 2004 - 08:47 AM

Ian,

You also need to check "A permanent redirection for this resource." Otherwise, IIS will send a 302 instead of a 301.

Shane

#18 Shane

    HR 6

  • Active Members
  • 850 posts
  • Location:Atlanta, GA

Posted 21 May 2004 - 08:51 AM

Randy,

Please correct me if I'm wrong, but I don't think the DNS solution works (or maybe I just read it wrong :aloha: ). If you CNAME www.myotherdomain.com to www.mydomain.com, then the site can be accessed via either domain. That means that the site could easily end up being indexed under both domain names. That would be great at the start, but eventually they'll detect the duplicate content and act accordingly. (I say accordingly, because I'm not 100% sure what they actually do in those cases :) )

Shane

#19 Jill

    Recovering SEO

  • Admin
  • 32,963 posts

Posted 21 May 2004 - 09:16 AM

Shane, it's not duplicate content, it's one site (one content) with 2 (or more) domains.

Google, for one, knows these are all the same site and will only show one of them, which is exactly what you want.

Jill

#20 Shane

    HR 6

  • Active Members
  • 850 posts
  • Location:Atlanta, GA

Posted 24 May 2004 - 08:22 AM

Google does know that these are the same sites? They infer that from the fact that the content and IP addresses are identical? What happens if you've located your site on both coasts for redundancy and are thus using multiple IP addresses? Seems like the possibility then exists for Google to see the same content under two different IP addresses and domain names. That's obviously not an issue for most sites, since we're talking about a single web server in most cases and almost exclusively a single location, but for larger ones it seems like it might be.

You also lose control over which domain Google picks. A year ago this month, we had that exact problem. We didn't have 301's set up for some of our very old domains, and Google decided out of the blue to start using one of those rather than our main one that had the prominence and name recognition. It threw everything we had done to that point into a tailspin. After (quickly) getting the 301's in place and waiting for the next Google update (which, thankfully, happens much more often now), everything was rectified and we pulled out of the spin.

After many lengthy discussions with them (including one in person out at Mountain View), they still could come up with no answers as to why it happened.

We're not going to take chances anymore, though. When it's important for a specific domain to be the one we show up under, we'll always use 301's for the domains that redirect to it. We know that does exactly what we want it to do (and I'm sure Yahoo will figure out how to handle them before long :cry:).

#21 Randy

    Convert Me!

  • Moderator
  • 17,540 posts

Posted 24 May 2004 - 09:28 AM

That's exactly the approach to take too, Shane. Like you, I learned this lesson the hard way a few years ago. It's one I'm not likely to forget!

I'm a big believer in leaving as little to chance as possible. Call me a control freak. :cry: It's true with things like this. I know which domain I want in the SERPs, so I leave nothing to chance.

FWIW on the redundancy example, most people wouldn't have dual domain names. They would simply have a single domain and set up the redundancy through a DNS round robin or via nameservers pointing to totally different locations in case one server went down.

In my experience that sort of setup (single domain on two servers with different IP's) wouldn't cause a duplicate content problem. The only time that seems to kick in is when there are different domain names.

#22 KirkVandenBerghe

    HR 1

  • Members
  • 1 posts
  • Location:Kauai, Hawaii

Posted 28 May 2004 - 03:19 PM

Aloha,

Great thread. This is my first post to this forum.

I just had to change a site's dedicated IP address. I've read--but haven't been able to confirm as fact or fiction--that Google maintains their own domain > IP database, and that it is often out of date relative to the worldwide nameservers.

So, I'm concerned that this change in IP may cause some of the indexed pages for the domain in question to drop out of the Google directory before they update and find the domain's new IP.

I do have the ability to associate a new domain with the old IP. Is there some way to use a redirect (any type) to say to the SE's "this domain isn't at this IP anymore, and you can find it at this new IP"?

Best,
Kirk out.

#23 Shane

    HR 6

  • Active Members
  • 850 posts
  • Location:Atlanta, GA

Posted 28 May 2004 - 03:34 PM

That's the first I've ever heard of that, Kirk. It would really surprise me if they did that.

Anyone else ever heard that?

#24 Ron Carnell

    HR 6

  • Moderator
  • 966 posts
  • Location:Michigan USA

Posted 28 May 2004 - 04:21 PM

Shane, it would be really surprising if they didn't do that. DNS lookups, to convert a domain name into an IP address, are notoriously time-consuming and Googlebot has a whole lot of pages in a whole lot of domains to find and index. Spiders really can't afford to look up a domain name for every page, so we can be pretty sure they cache it. The only real question is how LONG they cache it.

Irrespective of search engines, changing the IP address for a domain isn't an instant thing because the DNS database is spread out over the entire world. When you point your domain name at a new name server, with a new IP address, it typically takes about three days for the majority of the world to see the new server. Some people won't see the change for a week or more. During DNS propagation, a visitor may go to your old IP address or your new one, depending entirely on which part of the worldwide DNS database they hit. Knowing this, no one in their right mind changes to a new IP address and then just deletes everything from the old one. Instead, they keep two copies of the web site alive for a little while.

Returning to search engines, now, the trick is to extend your little while to a little while longer. You don't need to worry about a redirect, because the DNS is essentially doing that. Keep a copy of your site at both IP addresses for at least a few weeks (Googlebot is fairly quick) and preferably for a month (other search engine spiders can be much slower about updating their DNS cache). The copy on your old IP address doesn't have to be completely current, but keeping it there ensures the spiders find something to index and your site stays in their database. Keep an eye on the log files at your new IP address and eventually you'll see that all the major spiders have abandoned the old IP address and you can follow suit.
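Watching the logs for spider visits, as suggested above, can be as simple as filtering on user-agent substrings. A rough sketch in Python -- the log line format and the bot names below are illustrative assumptions, not anything specific from this thread:

```python
# Count visits per known spider in an Apache combined-format access log.
# The bot substrings are illustrative; adjust for the spiders you track.
from collections import Counter

SPIDER_SUBSTRINGS = ("Googlebot", "Slurp", "msnbot")

def spider_hits(log_lines):
    """Return a Counter mapping spider name -> number of matching lines."""
    hits = Counter()
    for line in log_lines:
        for bot in SPIDER_SUBSTRINGS:
            if bot in line:
                hits[bot] += 1
    return hits

# Hypothetical log lines, to show the shape of the output:
sample = [
    '1.2.3.4 - - [28/May/2004] "GET / HTTP/1.0" 200 512 "-" "Googlebot/2.1"',
    '5.6.7.8 - - [28/May/2004] "GET /a HTTP/1.0" 200 99 "-" "Mozilla/4.0"',
    '9.9.9.9 - - [28/May/2004] "GET /b HTTP/1.0" 200 42 "-" "Googlebot/2.1"',
]
print(spider_hits(sample))  # Counter({'Googlebot': 2})
```

Once the old IP's log stops producing any spider hits for a while, it's reasonably safe to retire that copy of the site.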

Bottom line is that it'll cost an extra month in hosting fees, but that's usually a very small price to pay. :)

#25 Randy

    Convert Me!

  • Moderator
  • 17,540 posts

Posted 29 May 2004 - 09:47 AM

Welcome Kirk ! :notworthy:

Ron's answer is spot on for 99.9%, so that's the best way to do it. Just pay an extra month of hosting and everything will be fine.

If you happen to have control over the local DNS at your old server there are ways to basically force the update. In a nutshell, instead of pointing the local DNS to the old server's IP number, change them to point to the new server's IP number.

Most people will not have this ability, so feel free to disregard the suggestion. I started doing this when retiring one server in exchange for a new, faster machine. Mainly to make sure everybody's email ended up all in one place.

A nice side benefit of tweaking the local DNS settings was that all of the bots seemed to pick up the new IP and update their records very quickly.

#26 Shane

    HR 6

  • Active Members
  • 850 posts
  • Location:Atlanta, GA

Posted 29 May 2004 - 04:27 PM

Ron,

Sorry, of course you're right :rofl: Everyone runs a caching DNS. My post wasn't very clear. What I would be surprised to find is that they let their DNS servers get way out of date.

Now that I think about it, how do you specify how often your caching DNS server updates? Isn't it at the mercy of the TTLs in each domain's authority record?

It wouldn't surprise me to find out that Google overrides this somehow, but it would surprise me if they overrode it and cached the record for more than a day or two. A DNS lookup takes a minuscule amount of time, especially compared to everything else they have to do.

Shane

#27 Randy

    Convert Me!

  • Moderator
  • 17,540 posts

Posted 29 May 2004 - 06:38 PM

They used to be a lot slower to update their DNS cache, Shane. I haven't seen any problems with changing an IP number over the last couple of years though.

#28 Ron Carnell

    HR 6

  • Moderator
  • 966 posts
  • Location:Michigan USA

Posted 29 May 2004 - 10:17 PM

The TTL (Time To Live) in the DNS record (not the same TTL attached to a UDP packet, for those following along) determines how long a name server should cache information. For example, when I go to a web site, the name server assigned to my local machine (usually set by the ISP) will query around until it finds the name server for the domain I requested. If I ask my local name server for the same site tomorrow, chances are it will still have the information in its cache, saving a lot of unnecessary bandwidth for everyone. However, my local name server will only maintain its cache of that information for however many seconds are specified in the TTL received from the authoritative name server during that initial request.
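The caching behavior described above can be sketched as a tiny TTL-bound cache. This is an illustration of the mechanism only, not how any real resolver is implemented; the clock is injectable so the expiry can be demonstrated without waiting:

```python
# Minimal sketch of TTL-bound caching, the mechanism a caching name server
# relies on: an answer is reused until its TTL (in seconds) expires, after
# which the authoritative server must be asked again.
import time

class TTLCache:
    def __init__(self, clock=time.monotonic):
        self._clock = clock   # injectable for testing/demonstration
        self._store = {}      # name -> (answer, expiry_time)

    def put(self, name, answer, ttl):
        self._store[name] = (answer, self._clock() + ttl)

    def get(self, name):
        """Return the cached answer, or None if absent or expired."""
        entry = self._store.get(name)
        if entry is None:
            return None
        answer, expires = entry
        if self._clock() >= expires:
            del self._store[name]   # stale: force a fresh lookup
            return None
        return answer

# Demonstration with a fake clock (hypothetical name and address):
now = [0.0]
cache = TTLCache(clock=lambda: now[0])
cache.put("www.example.com", "10.0.0.1", ttl=300)
print(cache.get("www.example.com"))  # 10.0.0.1
now[0] = 301.0
print(cache.get("www.example.com"))  # None (TTL expired)
```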

Which is a very long way of saying, Yea, Shane, you're absolutely right. ;)

However, historically, search engines have not maintained their domain/IP lookup in a name server (which is a very poor database model for millions upon millions of records), but have presumably kept it with their crawling schedule. As Randy said, over the last few years, it's rarely been a problem but I suspect that's because everyone has only been watching Google, and Google has set new standards. In the late Nineties, a failure to keep duplicate sites for at *least* a month was practically a guarantee of getting dropped from Excite or AV, the two big players of that era. Google does much better, but I honestly don't know if any of the other spiders have kept pace. I'm too chicken to find out. :)

FTR, if you do maintain duplicate sites for a while and examine the logs of both, you can very easily see Google crawling the OLD site for up to two weeks after the DNS swap, certainly far greater than the typical TTL settings. However, if duplicate sites aren't maintained, I still haven't heard of Google dropping a site from the index in recent years, because (1) Google doesn't drop sites as easily as in the old days, and (2) I "think" Googlebot recognizes the signs and triggers a new DNS lookup. The latter, unfortunately, is very difficult to document, perhaps in large part because I don't think I could program a spider to recognize those signs (under HTTP 1.1 there is no standard error code, you just go to the wrong site).

At any rate, Google is no longer the only kid on the block. And frankly, even when it was, I still didn't take any chances. The potential risk of four to six weeks of traffic is hard to justify against a month's hosting fees. Been there, done that, and now I'm so chicken I can sometimes be heard to cluck when crossing the road. :)

#29 pixpixpix

    HR 1

  • Members
  • 8 posts
  • Location:San Francisco Bay Area

Posted 05 June 2004 - 07:54 PM

Glad I found this thread. Currently I have alternate domains pointing to a directory on my main site. That is, 123.com has a URL redirect entry in the DNS record to go to Me.com/subdirectory.

Actually there are two entries. The second is for www.123.com to go to the same Me.com/subdirectory.

However, when I look at the headers with an HTTP viewer (http://www.rexswain.com/httpview.html) they are coming in as 302 Found rather than 301.
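If you're reading raw responses yourself rather than using an online viewer, the status code is simply the second field of the response's first line. A minimal parser, purely for illustration:

```python
# Parse the status line of a raw HTTP response, e.g. "HTTP/1.1 302 Found",
# to see at a glance whether a server is sending a 301 or a 302.
def parse_status_line(status_line):
    """Return (code, reason) from an HTTP/1.x status line."""
    version, code, reason = status_line.split(" ", 2)
    if not version.startswith("HTTP/"):
        raise ValueError("not an HTTP status line: %r" % status_line)
    return int(code), reason

print(parse_status_line("HTTP/1.1 302 Found"))              # (302, 'Found')
print(parse_status_line("HTTP/1.1 301 Moved Permanently"))  # (301, 'Moved Permanently')
```

A 301 tells the engines the move is permanent; a 302 says it's temporary, which is why the distinction matters here.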

Google seems to have separate entries for www.123.com and 123.com

Should a CNAME entry be set up to combine these? The eNom control panel doesn't seem to let me do that.

Can a CNAME entry point to another domain's subdirectory?

thanks

[Edited live link per forum guidelines -Randy]

Edited by Randy, 05 June 2004 - 08:35 PM.


#30 Randy

    Convert Me!

  • Moderator
  • 17,540 posts

Posted 05 June 2004 - 08:39 PM

Welcome pixpixpix ! ;)

I'm not sure that's possible with CNAMEs. I know I've certainly never tried it. If you have enough access at the server level you could certainly set up an alias to accomplish the task, but that may be asking a bit much if you're on shared hosting.

A question...

Do you have the ability to use an .htaccess file on the domains you're forwarding? If so, that would be the easiest and cleanest way to accomplish the task IMO.
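If .htaccess is available on the forwarded domain, a sketch along these lines would send every request to the subdirectory on the main site with a proper 301. The domain names mirror the hypothetical 123.com / Me.com from the question, and mod_rewrite is assumed:

```apache
# Placed in the web root of 123.com (hypothetical names from the question).
# Sends 123.com and www.123.com to Me.com/subdirectory with a 301,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?123\.com$ [NC]
RewriteRule ^(.*)$ http://www.Me.com/subdirectory/$1 [R=301,L]
```

This also answers the 302-vs-301 issue above, since the status code is stated explicitly rather than left to a registrar's URL-forwarding default.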



