
Print Pages Being Favored Over HTML Pages!


5 replies to this topic

#1 Say Yebo

  • HR 4
  • Active Members
  • 220 posts
  • Location:USA

Posted 01 October 2010 - 03:49 PM

Hi,

My client has thousands of 'print version' pages on his site, and now many of them are appearing in the SERPs instead of the proper web pages.

Putting noindex on them all could be quite a task... and it may not work anyway.

Is there a better way to resolve this? Can one list that many pages in the robots.txt file?

#2 Michael Martinez

  • HR 10
  • Active Members
  • 5,085 posts
  • Location:Georgia

Posted 01 October 2010 - 10:27 PM

You might be able to contrive a wildcard rule, but that probably would not help much. This may only be a temporary situation that lasts a few weeks. Every now and then Google seems to dump a lot of pages from its index, then recrawls the Web and rebuilds it. I have seen many, many complaints in numerous forums over the past couple of weeks about sites' pages vanishing from Google, and that is usually a clear signal of the temporary situation I describe.

I suppose that if you cannot add "noindex" meta tags to all those pages, implementing rel="canonical" tags would be just as cumbersome, but going forward you may want to work with your client on that.
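For reference, a minimal sketch of the two tags being discussed; the canonical URL below is just a placeholder, not your client's actual structure:

    <!-- on each print-version page: keep it out of the index -->
    <meta name="robots" content="noindex, follow">

    <!-- or point the print page at the main page it duplicates -->
    <link rel="canonical" href="http://www.example.com/some-page">

Either tag goes in the <head> of the print page; as noted, the real work is getting it into thousands of templates.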

In the meantime, one option some people have found helps is to acquire new trusted (high-value) links to their deeper content.

#3 Say Yebo

  • HR 4
  • Active Members
  • 220 posts
  • Location:USA

Posted 02 October 2010 - 08:51 AM

Thank you, Michael... that gives me some food for thought.

#4 Jill

  • Recovering SEO
  • Admin
  • 32,929 posts

Posted 02 October 2010 - 09:38 AM

Can you put them all in one directory and exclude that via robots.txt? That's my typical recommendation.
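As an illustration, if the print versions all lived under a single directory (the directory name here is just hypothetical), the robots.txt exclusion is only two lines:

    User-agent: *
    Disallow: /print/

Any URL whose path starts with /print/ would then be off-limits to compliant crawlers.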

#5 qwerty

  • HR 10
  • Moderator
  • 8,615 posts
  • Location:Somerville, MA

Posted 03 October 2010 - 09:34 PM

I've had bad results trying that a couple of times. Because the pages were already indexed, blocking them via robots.txt kept the spiders from rechecking them for changes, but it didn't drop them from the index. I don't know if that's what happens most of the time, but that was my experience.

You might try setting up a separate stylesheet for the pages that applies when the media type is print. That way you've only got one page for the content, but you still have the functionality of providing a print version that doesn't contain a bunch of unnecessary design elements. If you did that, you could delete the current print versions of the pages and redirect requests for those URLs to the remaining version.
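A rough sketch of that setup (the file names and the URL pattern are assumptions, not anything from the actual site):

    <!-- in the <head> of the single remaining page -->
    <link rel="stylesheet" href="screen.css" media="screen">
    <link rel="stylesheet" href="print.css" media="print">

    # on Apache, a catch-all 301 for the old print URLs, assuming they end in "/print"
    RedirectMatch 301 ^/(.*)/print$ /$1

The browser applies print.css only when the page is printed, so visitors never need a separate print URL at all.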

#6 Say Yebo

  • HR 4
  • Active Members
  • 220 posts
  • Location:USA

Posted 04 October 2010 - 11:58 AM

QUOTE(Jill @ Oct 2 2010, 10:38 AM)
Can you put them all in one directory and exclude that via robots.txt? That's my typical recommendation.


Well, what I forgot to mention is that, strictly speaking, it's not just one site. It's hundreds of location-based sub-sites within one main site, and each sub-site contains similar content, all of which has print versions.

So a typical URL looks like this:

www.mainsite.com/HartfordKitchen/HowToBakeMuffins

So I'm assuming each subsite would have to have a directory with all its print pages in it, right?
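For what it's worth, here is a sketch of how that could combine Jill's directory idea with the wildcard option Michael mentioned; the /print/ directory name is hypothetical, but Googlebot does support * wildcards in Disallow rules:

    User-agent: *
    Disallow: /*/print/

That single rule would cover a /print/ directory inside every location sub-site, e.g. www.mainsite.com/HartfordKitchen/print/HowToBakeMuffins.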



