Potential 301 Problem - Duplicate URLs
Posted 06 May 2009 - 08:28 AM
This led me to thinking that perhaps we did not do something correctly last year in creating the robots.txt. Maybe the redirects are not correct? This is what was done at that time:
301 redirect all requests for non-www URLs to the www version of the same URL
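A non-www to www redirect of this kind is usually done in Apache with mod_rewrite; a minimal sketch in .htaccess, assuming mod_rewrite is enabled (www.domain.com stands in for the real hostname):

```apache
# Redirect any request whose Host header does not start with "www."
# to the www version of the same URL, preserving path and query string
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```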
When 301 redirecting, always designate a full URL in the Location header instead of doing this:
08/29/08 11:34:32 Browsing www.domain.com/info.php?info_id=119%20
Fetching www.domain.com/info.php?info_id=119 ...
GET /info.php?info_id=119 HTTP/1.1
User-Agent: Sam Spade 1.14
HTTP/1.1 301 Moved Permanently
Date: Fri, 29 Aug 2008 15:34:34 GMT
Server: Apache/2.0.52 (Red Hat)
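Whether a Location header is a proper absolute URL can be checked programmatically. A rough sketch in Python using only the standard library (the sample header values are made up for illustration):

```python
from urllib.parse import urlparse

def location_is_absolute(location: str) -> bool:
    """A 301's Location header should carry an absolute URL
    (scheme + host), not be empty or a bare path."""
    parsed = urlparse(location.strip())
    return bool(parsed.scheme) and bool(parsed.netloc)

# Example values as they might appear in a redirect trace
print(location_is_absolute("http://www.domain.com/info.php?info_id=119"))  # True
print(location_is_absolute("/info.php?info_id=119"))  # False: path only
print(location_is_absolute(""))  # False: missing entirely
```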
Ideally, all internal links to the home page should be coded to point to / or www.domain.com/
My question is: does this eliminate the possibility of duplicate URLs being seen by Google? Is there something else I should direct the programmers to look at? We also have a blog - www.domainNews.com - that links to our primary website. I have been told that Google might be viewing this as a link farm and thus negatively influencing our site in some way; others have told me that is not true. I also know our code is not pristine when viewed in a validator, but it is comparable to competitive sites in my area in terms of errors.
I am looking for a road map to correct any issues I might have, either technical or content-related, that are negatively affecting our results in the Google SERPs. I also ran a Xenu report that I don't completely understand - especially the section on redirected URLs vs. valid URLs. It is attached.
[Domain and keyword references removed per [url=http://www.highrankings.com/forum/index.php?act=boardrules]Forum Rules[/url]. Please read them.]
Posted 06 May 2009 - 09:40 AM
First things first: the redirect you show a trace of. It's really bad form not to have the full URL in the Location: field. I checked your domain, and what you show in your trace is how it shows up today, with no URL.
Second, though robots.txt can help to eliminate duplicate pages, it does nothing to pass on any link popularity those old pages may have had to the new pages. So by simply excluding them, with no 301 redirect in the process, you've effectively lowered your own link popularity.
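To make the contrast concrete: a robots.txt exclusion of this kind only tells crawlers not to fetch the pages; any links pointing at them are simply wasted, whereas a 301 would forward that value. A hypothetical example (the path is a placeholder):

```text
# robots.txt - blocks crawling of the old pages,
# but passes none of their link popularity anywhere
User-agent: *
Disallow: /old-section/
```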
So to fix things properly, you not only need to fix the errors from your Xenu run, but you'll also need to go back and make sure calls to all of those old URLs are being redirected (via a 301 status) to the most appropriate new URL. That's a large job, and one that's not going to be solved for you from a forum post.
If you feel like taking it on yourself instead of hiring someone, you first need to map out a plan, including all of the old URLs and their new counterparts.
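Once that old-to-new map exists, each pair typically becomes a one-line 301 rule in the Apache configuration or .htaccess. A hypothetical sketch (all paths and the hostname are placeholders, not the poster's actual URLs):

```apache
# One 301 per retired URL, each pointing at its closest new equivalent,
# so any link popularity the old page earned is passed along
Redirect 301 /old-products.html http://www.domain.com/products/
Redirect 301 /about-old.html    http://www.domain.com/about/
```

Note that the plain Redirect directive matches paths only; retired URLs that differ by query string (like the info.php?info_id= ones in the trace) need mod_rewrite rules with a RewriteCond on %{QUERY_STRING} instead.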
Posted 06 May 2009 - 11:01 AM
Posted 06 May 2009 - 12:09 PM
On the rest, it's impossible to say whether the URL/redirection/duplicates issue is the root of the problem you've been seeing without knowing the whole history and doing a complete site review. It's a problem, but whether it's the root cause or just a contributing factor can't be determined without all of the details.
That's why it would be smart to contract with someone who knows what to look for. They should be able to sort it out fairly expediently once they're armed with all of the data.