Is It Necessary To Keep A Log Of All The 301 Redirects Implemented During A Site Relaunch?


7 replies to this topic

#1 ttw

    HR 5

  • Active Members
  • 395 posts
  • Location:San Mateo, California

Posted 28 September 2015 - 06:06 PM

We are working on cleaning up a site in preparation for a relaunch. 

 

The client has had hundreds and hundreds of 301 redirects because of changing blog platforms, acquiring companies, site changes, etc.

 

We have everything we are currently redirecting in an Excel file, but I was wondering if there's a need to keep a long list of redirects.

 

I'm wondering if it is necessary to maintain a history of 301 redirects implemented as part of this process along with all the other 301 redirects?

 

The client's platform is Sitecore.

 

Thank you for your input.

 

Rosemary

#2 chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,718 posts
  • Location:Blackpool UK

Posted 28 September 2015 - 06:09 PM

Only if the original URLs are still being requested.

#3 ttw

    HR 5

  • Active Members
  • 395 posts
  • Location:San Mateo, California

Posted 28 September 2015 - 06:17 PM

Thanks very much.

 

What tool would you use to see if URLs are still being requested? GWMTs - I know you don't like that tool ;)

 

If the URLs are still being requested would you then go in and update the 301s to the new URL?

 

Thanks!



#4 chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,718 posts
  • Location:Blackpool UK

Posted 28 September 2015 - 07:26 PM

The site access logs are the best place. Google tools only show URLs that are being shown to Google users, so they cannot account for direct traffic or traffic from other links or sources. And JavaScript tools such as Google Analytics cannot show anything if the documents are not there to host the scripts.
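
For illustration only, here's a rough Python sketch of that kind of log check. It assumes combined-format Apache/Nginx access logs and a plain-text list of the old URL paths; the file names are placeholders, not anything from this thread.

    # A minimal sketch, not a finished tool: it assumes Apache/Nginx
    # "combined"-format access logs and a plain text file listing the old URL
    # paths (one per line). The file names are placeholders.
    import re
    from collections import Counter

    LOG_FILE = "access.log"          # placeholder path to the server access log
    OLD_URLS_FILE = "old_urls.txt"   # placeholder list of previously redirected paths

    # The request portion of a combined-format log line looks like:
    #   "GET /old-blog/some-post HTTP/1.1"
    REQUEST_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+"')

    with open(OLD_URLS_FILE) as f:
        old_urls = {line.strip() for line in f if line.strip()}

    hits = Counter()
    with open(LOG_FILE) as f:
        for line in f:
            match = REQUEST_RE.search(line)
            if match and match.group("path") in old_urls:
                hits[match.group("path")] += 1

    # Old URLs still being requested -> their 301s are still earning their keep.
    for path, count in hits.most_common():
        print(f"{count:6d}  {path}")

    # Old URLs with no hits at all are candidates for eventual retirement.
    for path in sorted(old_urls - set(hits)):
        print(f"     0  {path}")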



#5 torka

    Vintage Babe

  • Moderator
  • 4,825 posts
  • Location:Triangle area, NC, USA, Earth (usually)

Posted 29 September 2015 - 07:36 AM

Absolutely, if URLs are still being requested, you should try to redirect to the nearest substitute. If there's nothing that works, falling back to a 404 is OK, but IMHO it's better to send them to the closest version of what they were looking for if at all possible.

 

--Torka :oldfogey:



#6 ttw

    HR 5

  • Active Members
  • 395 posts
  • Location:San Mateo, California

Posted 29 September 2015 - 08:09 AM

Thank you. I'm guessing the common way is to track via Excel?



#7 torka

    Vintage Babe

  • Moderator
  • 4,825 posts
  • Location:Triangle area, NC, USA, Earth (usually)

Posted 29 September 2015 - 08:36 AM

Excel seems like as good a way as any.

 

One thing -- since there have been so many changes over the years, you'll want to make sure you're not sending visitors through a series of redirects. Rather, each redirect should send people directly from the page they're requesting (however old it might be) straight to the latest-greatest substitute for the page they originally asked for, with no intermediary "hops."
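
To illustrate the "no hops" idea, here's a rough Python sketch that collapses chains in a redirect map before it ever reaches the server. It assumes the Excel list has been exported to a CSV with hypothetical old_url and new_url columns; nothing in it is Sitecore-specific.

    # A rough sketch of collapsing chains offline. It assumes the Excel list has
    # been exported to CSV with two hypothetical columns, old_url and new_url.
    import csv

    def load_map(path):
        with open(path, newline="") as f:
            return {row["old_url"]: row["new_url"] for row in csv.DictReader(f)}

    def flatten(redirects):
        """Point every old URL straight at its final destination, skipping intermediate hops."""
        flat = {}
        for old_url in redirects:
            target = redirects[old_url]
            seen = {old_url}
            # Keep following the map while the target is itself redirected
            # (the `seen` set guards against accidental loops).
            while target in redirects and target not in seen:
                seen.add(target)
                target = redirects[target]
            flat[old_url] = target
        return flat

    redirects = load_map("redirects.csv")   # placeholder file name
    for old_url, final_url in flatten(redirects).items():
        print(f"{old_url} -> {final_url}")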

 

As to maintaining a history, it might not hurt. (Remember, though, you're talking to a pack-rat here! ;) My theory: better to have and not need than to need and not have.) If all it involves is saving a version of the Excel file with all the current redirect info in it, I can't see where there's any harm in that, and it might be useful to have someday.

 

My :02:

 

--Torka :oldfogey:



#8 Michael Martinez

    HR 10

  • Active Members
  • 5,325 posts
  • Location:Georgia

Posted 29 September 2015 - 03:57 PM

If the Excel spreadsheet is so large that you are confused by all the listings, you can set it aside (do not delete it) for a month and track all the actual activity via the server log files.

 

When auditing crawl problems on Websites I do that anyway because it shows me what is really happening with the 301 redirects.  You want to keep the redirects as simple as possible, which may mean you'll have a lot of old URLs pointing to a small number of new URLs.

 

If an ancient URL is not requested within 30-90 days I usually remove it from my 301 redirect list and wait to see if it ever surfaces again.
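
As a rough illustration of that 30-90 day rule, here's a small Python sketch that parks any redirect whose old URL hasn't been requested recently. The URLs and dates are made up, and last_requested would come from the kind of log scan sketched earlier in the thread.

    # A small sketch of that pruning rule. The URLs and dates below are made up,
    # and in practice last_requested would come from a log scan like the one
    # sketched a few posts up.
    from datetime import date, timedelta

    MAX_AGE = timedelta(days=90)
    today = date.today()

    redirects = {
        "/old-blog/post-1": "/blog/post-1",
        "/acquired-site/about": "/company/about",
    }

    # Most recent request seen in the access logs for each old URL.
    last_requested = {
        "/old-blog/post-1": today - timedelta(days=14),
        # "/acquired-site/about" has not shown up in the logs at all.
    }

    keep, parked = {}, {}
    for old_url, new_url in redirects.items():
        seen = last_requested.get(old_url)
        if seen is not None and today - seen <= MAX_AGE:
            keep[old_url] = new_url
        else:
            parked[old_url] = new_url   # set aside, don't delete -- watch the logs

    print("keep:", keep)
    print("set aside:", parked)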

 

If I see a redirect leading to another redirect in the server logs I update the redirect map.

 

You only want one redirect at most for any old URL, although it's nothing to freak out about if that doesn't happen 100% of the time.





