Permanent 301 Redirect
Posted 21 May 2004 - 08:04 AM
For the .com / .co.uk portion of your question, that's exactly how it should be used.
Posted 21 May 2004 - 08:47 AM
You also need to check "A permanent redirection for this resource." Otherwise, IIS will send a 302 instead of a 301.
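To see which status code a server is really sending, request the page without following the redirect and read the raw status line. Here's a minimal, self-contained Python sketch: the local server and its two paths are hypothetical stand-ins for a misconfigured vs. corrected site (a real check would point `check_redirect` at your own domain and path).

```python
# Sketch: a tiny local server answers /temp with 302 (what IIS sends by
# default) and /perm with 301 (what you get after ticking the "permanent
# redirection" box). The client reads the raw status without following it.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        code = 301 if self.path == "/perm" else 302
        self.send_response(code)
        self.send_header("Location", "http://www.mydomain.com/")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

def check_redirect(host, port, path):
    """Return the raw status code without following the redirect."""
    conn = http.client.HTTPConnection(host, port)
    conn.request("GET", path)
    status = conn.getresponse().status
    conn.close()
    return status

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
print(check_redirect("127.0.0.1", port, "/temp"))  # 302
print(check_redirect("127.0.0.1", port, "/perm"))  # 301
server.shutdown()
```

The same `check_redirect` call works against any live host, which is handy for confirming that a 301 really is a 301 before the spiders see it.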
Posted 21 May 2004 - 08:51 AM
Please correct me if I'm wrong, but I don't think the DNS solution works (or maybe I just read it wrong). If you CNAME www.myotherdomain.com to www.mydomain.com, then the site can be accessed via either domain. That means that the site could easily end up being indexed under both domain names. That would be great at the start, but eventually they'll detect the duplicate content and act accordingly. (I say accordingly, because I'm not 100% sure what they actually do in those cases.)
Posted 21 May 2004 - 09:16 AM
Google, for one, knows these are all the same site and will only show one of them, which is exactly what you want.
Posted 24 May 2004 - 08:22 AM
You also lose control over which domain Google picks. A year ago this month, we had that exact problem. We didn't have 301's set up for some of our very old domains, and Google decided out of the blue to start using one of those rather than our main one that had the prominence and name recognition. It threw everything we had done to that point into a tailspin. After (quickly) getting the 301's in place and waiting for the next Google update (which, thankfully, happens much more often now), everything was rectified and we pulled out of the spin.
After many lengthy discussions with them (including one in person out at Mountain View), they still could come up with no answers as to why it happened.
We're not going to take chances anymore, though. When it's important for a specific domain to be the one we show up under, we'll always use 301's for the domains that redirect to it. We know that does exactly what we want it to do (and I'm sure Yahoo will figure out how to handle them before long).
Posted 24 May 2004 - 09:28 AM
I'm a big believer in leaving as little to chance as possible. Call me a control freak. It's true with things like this. I know which domain I want in the SERPs, so I leave nothing to chance.
FWIW on the redundancy example, most people wouldn't have dual domain names. They would simply have a single domain and set up the redundancy through a DNS round robin or via nameservers pointing to totally different locations in case one server went down.
In my experience that sort of setup (single domain on two servers with different IP's) wouldn't cause a duplicate content problem. The only time that seems to kick in is when there are different domain names.
Posted 28 May 2004 - 03:19 PM
Great thread. This is my first post to this forum.
I just had to change a site's dedicated IP address. I've read--but haven't been able to confirm as fact or fiction--that Google maintains their own domain > IP database, and that it is often out of date relative to the worldwide nameservers.
So, I'm concerned that this change in IP may cause some of the indexed pages for the domain in question to drop out of the Google directory before they update and find the domain's new IP.
I do have the ability to associate a new domain with the old IP. Is there some way to use a redirect (any type) to say to the SE's "this domain isn't at this IP anymore, and you can find it at this new IP"?
Posted 28 May 2004 - 03:34 PM
Anyone else ever heard that?
Posted 28 May 2004 - 04:21 PM
Irrespective of search engines, changing the IP address for a domain isn't an instant thing because the DNS database is spread out over the entire world. When you point your domain name at a new name server, with a new IP address, it typically takes about three days for the majority of the world to see the new server. Some people won't see the change for a week or more. During DNS propagation, a visitor may go to your old IP address or your new one, depending entirely on which part of the world-wide DNS database they hit. Knowing this, no one in their right mind changes to a new IP address and then just deletes everything from the old one. Instead, they keep two copies of the web site alive for a little while.
Returning to search engines, now, the trick is to extend your little while to a little while longer. You don't need to worry about a redirect, because the DNS is essentially doing that. Keep a copy of your site at both IP addresses for at least a few weeks (Googlebot is fairly quick) and preferably for a month (other search engine spiders can be much slower about updating their DNS cache). The copy on your old IP address doesn't have to be completely current, but keeping it there ensures the spiders find something to index and your site stays in their database. Keep an eye on the log files at your new IP address and eventually you'll see that all the major spiders have abandoned the old IP address and you can follow suit.
Bottom line is that it'll cost an extra month in hosting fees, but that's usually a very small price to pay.
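Watching the logs for that changeover is easy to automate. This is an illustrative Python sketch, not from the thread: it assumes Apache combined-format log lines, and the spider user-agent substrings are just common examples (pick the ones you care about).

```python
# Sketch: find the most recent visit date per spider in combined-format
# access logs, so you can tell when each one abandons the old IP address.
import re
from datetime import datetime

SPIDERS = ("Googlebot", "Slurp", "msnbot")  # assumed user-agent substrings

def last_spider_visits(log_lines):
    """Return the most recent visit date seen per spider."""
    last_seen = {}
    for line in log_lines:
        m = re.search(r'\[(\d{2}/\w{3}/\d{4})', line)
        if not m:
            continue
        when = datetime.strptime(m.group(1), "%d/%b/%Y").date()
        for bot in SPIDERS:
            if bot in line and (bot not in last_seen or when > last_seen[bot]):
                last_seen[bot] = when
    return last_seen

# Hypothetical sample lines standing in for the old server's log file:
sample = [
    '66.249.0.1 - - [01/Jun/2004:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.0.1 - - [09/Jun/2004:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
]
print(last_spider_visits(sample))
```

Once every spider's last-seen date is comfortably in the past, the old IP can be retired.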
Posted 29 May 2004 - 09:47 AM
Ron's answer is spot on for 99.9%, so that's the best way to do it. Just pay an extra month of hosting and everything will be fine.
If you happen to have control over the local DNS at your old server there are ways to basically force the update. In a nutshell, instead of leaving the local DNS records pointing at the old server's IP number, change them to point at the new server's IP number.
Most people will not have this ability, so feel free to disregard the suggestion. I started doing this when retiring one server in exchange for a new, faster machine. Mainly to make sure everybody's email ended up all in one place.
A nice side benefit of tweaking the local DNS settings was that all of the bots seemed to pick up the new IP and update their records very quickly.
Posted 29 May 2004 - 04:27 PM
Sorry, of course you're right. Everyone runs a caching DNS. My post wasn't very clear. What would surprise me is if they let their DNS servers get way out of date.
Now that I think about it, how do you specify how often your caching DNS server updates? Isn't it at the mercy of the TTL's in each domain's authority record?
It wouldn't surprise me to find out that Google overrides this somehow, but it <i>would</i> surprise me if they overrode it and cached the record for more than a day or two. A DNS lookup takes a minuscule amount of time, especially compared to everything else they have to do.
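The mechanism being asked about, a caching resolver that is "at the mercy of the TTL's", can be sketched in a few lines. This is an illustrative Python model, not a real DNS client: the lookup function and the clock are injected stand-ins so the TTL behavior is visible.

```python
# Sketch of TTL-honoring resolver behavior: answers are served from cache
# until the record's TTL expires, then the authority is queried again.
import time

class CachingResolver:
    def __init__(self, lookup_fn, clock=time.monotonic):
        self.lookup_fn = lookup_fn  # stand-in for an authoritative query;
                                    # returns (ip, ttl_seconds)
        self.clock = clock
        self.cache = {}             # domain -> (ip, expires_at)

    def resolve(self, domain):
        entry = self.cache.get(domain)
        if entry and self.clock() < entry[1]:
            return entry[0]                   # still fresh: serve from cache
        ip, ttl = self.lookup_fn(domain)      # expired or missing: re-query
        self.cache[domain] = (ip, self.clock() + ttl)
        return ip

# Demo with a fake clock so the TTL boundary is easy to see:
now = [0.0]
answers = [("192.0.2.1", 300), ("192.0.2.2", 300)]
resolver = CachingResolver(lambda d: answers.pop(0), clock=lambda: now[0])
print(resolver.resolve("example.com"))  # first lookup hits the authority
now[0] = 250
print(resolver.resolve("example.com"))  # within the 300s TTL: cached answer
now[0] = 350
print(resolver.resolve("example.com"))  # TTL expired: fresh answer
```

A resolver that ignored the TTL and cached records for days would behave like the out-of-date database the earlier posts worried about.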
Posted 29 May 2004 - 06:38 PM
Posted 29 May 2004 - 10:17 PM
Which is a very long way of saying: yeah, Shane, you're absolutely right.
However, historically, search engines have not maintained their domain/IP lookup in a name server (which is a very poor database model for millions upon millions of records), but have presumably kept it with their crawling schedule. As Randy said, over the last few years, it's rarely been a problem … but I suspect that's because everyone has only been watching Google, and Google has set new standards. In the late Nineties, a failure to keep duplicate sites for at *least* a month was practically a guarantee of getting dropped from Excite or AV, the two big players of that era. Google does much better, but I honestly don't know if any of the other spiders have kept pace. I'm too chicken to find out.
FTR, if you do maintain duplicate sites for a while and examine the logs of both, you can very easily see Google crawling the OLD site for up to two weeks after the DNS swap, certainly far greater than the typical TTL settings. However, if duplicate sites aren't maintained, I still haven't heard of Google dropping a site from the index in recent years, because (1) Google doesn't drop sites as easily as in the old days, and (2) I "think" Googlebot recognizes the signs and triggers a new DNS lookup. The latter, unfortunately, is very difficult to document, perhaps in large part because I don't think I could program a spider to recognize those signs (under HTTP 1.1 there is no standard error code, you just go to the wrong site).
At any rate, Google is no longer the only kid on the block. And frankly, even when it was, I still didn't take any chances. The potential risk of four to six weeks of traffic is hard to justify against a month's hosting fees. Been there, done that, and now I'm so chicken I can sometimes be heard to cluck when crossing the road.
Posted 05 June 2004 - 07:54 PM
Actually there are two entries. The second is for www.123.com to go to the same Me.com/subdirectory
However, when I look at the headers with an http viewer (http://www.rexswain.com/httpview.html) they are coming in as 302 Found rather than 301.
Google seems to have separate entries for www.123.com and 123.com
Should a CNAME entry be set up to combine these? The enom control panel doesn't seem to let me do that.
Can a CNAME entry point to another domain's subdirectory?
[Edited live link per forum guidelines -Randy]
Edited by Randy, 05 June 2004 - 08:35 PM.
Posted 05 June 2004 - 08:39 PM
I'm not sure that's possible with CNAME's. I know I've certainly never tried it. If you have enough access at the server level you could certainly set up an alias to accomplish the task, but that may be asking a bit much if you're on shared hosting.
Do you have the ability to use an .htaccess file on the domains you're forwarding? If so, that would be the easiest and cleanest way to accomplish the task IMO.
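For illustration, the `.htaccess` approach usually looks something like the fragment below. This is only a hedged sketch: it assumes Apache with mod_rewrite enabled, and the domain names are hypothetical placeholders, not anything from the thread.

```apache
# Sketch: permanently (301) redirect every request for myotherdomain.com,
# with or without the www, to the same path on www.mydomain.com.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?myotherdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]
```

Because the rule emits a real 301, the engines are told the move is permanent, which is exactly the behavior the rest of this thread recommends.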