
Over-Optimisation Penalties...


37 replies to this topic

#31 piskie

    HR 7

  • Active Members
  • 1,098 posts
  • Location:Cornwall

Posted 12 August 2012 - 08:05 AM

Maybe, Chris, and maybe not.
However, it was not my theory; I was merely floating a possibility in the hope that someone with more knowledge would enlighten me.

#32 chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,101 posts
  • Location:Blackpool UK

Posted 12 August 2012 - 03:15 PM

Never seen anything like that occur. Sites I've dealt with that had URLs that were just an image, a caption and the site navigation were crawled just as often as any other URLs were.

Edited by chrishirst, 12 August 2012 - 03:16 PM.


#33 Michael Martinez

    HR 10

  • Active Members
  • 5,157 posts
  • Location:Georgia

Posted 12 August 2012 - 03:23 PM

While we're there, is there any influence on Spidering Frequency by massive and complicated Code Bloat?


Absolutely none. Some of the most heavily bloated pages tend to be news sites, which are crawled frequently.

Google's "Page Speed" test is more about the combined time it takes to FETCH and RENDER a "page" than about any single detailed action.

Your Web browser makes multiple fetches before you can view a single page. Every file mentioned in your code -- Javascript, CSS, templates, server side includes, images, videos, etc. -- everything has to be fetched separately in sequence.

Search engines, in order to estimate page loading and rendering time, must also process all those fetches (and then virtually construct the pages) to gauge how long it really takes a user to see the content.
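
A rough sketch of what that implies, in Python. This is purely illustrative and not Google's actual process: the URL is a placeholder, and the fetches are done one after another to mirror the description above, whereas real browsers and crawlers fetch in parallel.

# Fetch the primary HTML, then every script, stylesheet and image it
# references, timing each fetch along the way.
import time
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class AssetCollector(HTMLParser):
    """Collects the URLs of scripts, stylesheets and images referenced by a page."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("src"):
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.assets.append(attrs["href"])
        elif tag == "img" and attrs.get("src"):
            self.assets.append(attrs["src"])

def timed_fetch(url):
    """Return (body, seconds taken) for a single URL."""
    start = time.perf_counter()
    body = urllib.request.urlopen(url, timeout=10).read()
    return body, time.perf_counter() - start

page_url = "https://www.example.com/"   # placeholder, not a real test target
html, total = timed_fetch(page_url)

collector = AssetCollector()
collector.feed(html.decode("utf-8", errors="replace"))

for asset in collector.assets:
    _, seconds = timed_fetch(urljoin(page_url, asset))
    total += seconds

print(f"{1 + len(collector.assets)} fetches, ~{total:.2f}s before everything is downloaded")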

Code-to-content ratio has never been a clean signal. It is a very complex matter that cannot be easily measured by looking at the content of the primary FETCH (what you can think of as "the display URL" or the "page" that the user thinks s/he is seeing).
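
For illustration only, here is roughly what a naive "code-to-text ratio" check does. It only ever sees that primary fetch, none of the CSS, Javascript or images that arrive separately:

from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Counts characters of visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.text_chars = 0
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.text_chars += len(data.strip())

def code_to_text_ratio(html: str) -> float:
    """Characters of markup per character of visible text (a crude metric at best)."""
    extractor = TextExtractor()
    extractor.feed(html)
    return len(html) / max(extractor.text_chars, 1)

sample = "<html><head><style>p{color:red}</style></head><body><p>Hello world</p></body></html>"
print(f"code-to-text ratio: {code_to_text_ratio(sample):.1f}")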

If search engines ever really took that into consideration, they probably stopped doing so years ago.

A page with low code-to-content ratio (by whatever standard you employ) may not be fetched for months at a time simply because it doesn't change, has few links pointing to it, and provides little to no content for a search engine to present to its visitors.

A page with high code-to-content ratio may be refetched several times a day simply because it changes often, has many links pointing to it, and provides a lot of content for a search engine to present to its visitors.

It's a rare Web page whose crawl schedule would be dominated by one factor, in my opinion.

Edited by Michael Martinez, 12 August 2012 - 03:23 PM.


#34 piskie

    HR 7

  • Active Members
  • 1,098 posts
  • Location:Cornwall

Posted 12 August 2012 - 06:42 PM

Thanks MM.

#35 torka

    Vintage Babe

  • Moderator
  • 4,636 posts
  • Location:Triangle area, NC, USA, Earth (usually)

Posted 13 August 2012 - 09:40 AM

If search engines ever really took that into consideration, they probably stopped doing so years ago.

Indeed. :)

Here's a video from 2006 in which Vanessa Fox (who at the time worked for Google) says that, as far as they were concerned, there was no such thing as a "code to text ratio."

--Torka :propeller:

#36 Jill

    Recovering SEO

  • Admin
  • 33,003 posts

Posted 13 August 2012 - 11:00 AM

While I don't think code-to-text ratio would be much of a factor (if any), 2006 was a long time before Google started their big push for speedy websites. So I'm not sure if that information is still as applicable as it once was.

#37 piskie

    HR 7

  • Active Members
  • 1,098 posts
  • Location:Cornwall

Posted 13 August 2012 - 06:37 PM

So there are shades of grey thinking between black and white.

#38 chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,101 posts
  • Location:Blackpool UK

Posted 14 August 2012 - 09:29 AM

While I don't think code-to-text ratio would be much of a factor (if any), 2006 was a long time before Google started their big push for speedy websites. So I'm not sure if that information is still as applicable as it once was.

Even so, to create any appreciable difference in "page speed" (whatever that may mean) it would have to be on the order of tens of megabytes. A few hundred KB is only going to make microseconds of difference, unless your server is running on ten-plus-year-old hardware, of course.
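
For scale, a back-of-envelope transfer-time calculation. The 10 Mbit/s connection speed is purely an assumption, and compression, caching and parallel fetching all change the real-world picture; this only shows how the two orders of magnitude compare.

# Pure transfer time for a given payload size at an assumed connection speed.
SPEED_MBIT = 10                                   # assumed connection speed
SPEED_BYTES_PER_SEC = SPEED_MBIT * 1_000_000 / 8

for label, size_bytes in [("a few hundred KB (300 KB)", 300 * 1024),
                          ("tens of megabytes (30 MB)", 30 * 1024 * 1024)]:
    seconds = size_bytes / SPEED_BYTES_PER_SEC
    print(f"{label}: ~{seconds:.2f}s of raw transfer time")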



