Matt Cutts Giving Out Clues
Posted 23 July 2007 - 12:18 PM
Why would Google have that information? Are they validating pages just for the fun of it? I found this rather shocking because most of us think validation has nothing to do with rankings. Maybe it has something to do with indexing, or the method of indexing used.
In Providence last month, Dan Crow told us that Google's ability to index the Internet is limited by power and bandwidth, not money. He suggested that Google only allocates so much time and bandwidth to spider each site, so for large sites it's in the site owner's interest to do things that make the indexing process more efficient, because that may lead to more pages being indexed.
One way to make a site easier to index is to reduce code bloat. Another might be clearing coding errors so that the indexing program can quickly figure out the page, rather than having to make guesses when the code is all screwy. Guessing and compensating for errors takes time, processor power, and ultimately electricity. When indexing billions of pages, trivial things that eat up CPU cycles can create a significant burden.
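To make the "guessing and compensating" point concrete, here is a minimal sketch (purely illustrative, and nothing like how Googlebot actually works) using Python's stdlib `html.parser`. The `TagBalanceChecker` class is made up for this example; it just counts the places where a lenient parser has to compensate for mismatched or unclosed tags:

```python
from html.parser import HTMLParser

class TagBalanceChecker(HTMLParser):
    """Hypothetical sketch: count tags the parser has to guess about."""

    VOID = {"br", "img", "hr", "meta", "link", "input"}  # tags with no close

    def __init__(self):
        super().__init__()
        self.open_tags = []   # stack of currently open tags
        self.unbalanced = 0   # closes that didn't match the open tag

    def handle_starttag(self, tag, attrs):
        if tag not in self.VOID:
            self.open_tags.append(tag)

    def handle_endtag(self, tag):
        if self.open_tags and self.open_tags[-1] == tag:
            self.open_tags.pop()
        else:
            self.unbalanced += 1  # mismatched close: the parser must compensate

clean = "<div><p>hello</p></div>"
messy = "<div><p>hello</div></p>"  # closes in the wrong order

for doc in (clean, messy):
    checker = TagBalanceChecker()
    checker.feed(doc)
    checker.close()
    problems = checker.unbalanced + len(checker.open_tags)
    print(doc, "->", problems, "problems")
```

Every one of those "problems" is extra work a real indexer's error-recovery code has to do, which is the CPU-cycles argument above in miniature.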
Posted 23 July 2007 - 01:00 PM
Posted 23 July 2007 - 01:06 PM
Come to think of it, this is all assuming he meant W3C validating and not some sort of "Google" validating, which may or may not be more useful depending on your viewpoint. It wouldn't surprise me, given Google's infamous "we can do it better than anyone else" approach to things.
Posted 23 July 2007 - 05:26 PM
Crikey, if it became a choice between W3C standards and not getting indexed, or Google standards and getting indexed, all hell is going to break loose in the 'standards compliant' community, and I for one won't be happy having just W3C'd all my sites. Oh well, it might keep me out of mischief, having to serve different pages to Google than to the general public. Oh, hang on, isn't that 'black hat' SEO?
And to what standard does Google's own website validate? Because it certainly isn't W3C. Stop it, my sides are hurting.
Edited by 1dmf, 23 July 2007 - 05:31 PM.
Posted 23 July 2007 - 06:41 PM
I wouldn't read anything into it, Jonathan, other than that people have probably been asking Matt for that sort of thing because they read in so many places that valid code will help rankings. (Which, as we all know, it doesn't.)
Posted 23 July 2007 - 07:27 PM
Posted 23 July 2007 - 07:31 PM
Posted 23 July 2007 - 08:36 PM
But... if G has limited computing power and bandwidth, it may use up whatever is allocated for a given site crawl on a few bloated pages and split.
I always try to keep my main pages short and sweet (300 to 400 words) and get them to validate to whatever doctype they declare. I also add internal links to those pages so they can be found quickly.
Edited by maleman, 23 July 2007 - 08:41 PM.
Posted 23 July 2007 - 09:31 PM
The answer is simple: they don't need to recrawl your site to validate it. They just need to run a parser over the downloaded HTML.
The parser is
Google need never know the outcome, and might not even bother to record the output / result.
The time thing is pretty obvious, and pretty useful to know. One wonders why so many sites are so code heavy. Stripping out newlines, tabs, and comments always seemed a good idea to me.
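The stripping idea above can be sketched in a few lines of Python. This is a deliberately naive, hypothetical minifier (the `minify_html` name is made up for this example): real minifiers have to preserve `<pre>` blocks and inline scripts, which this one would mangle.

```python
import re

def minify_html(html: str) -> str:
    """Naive sketch: drop HTML comments, then collapse runs of whitespace.
    Illustrative only -- it would mangle <pre> blocks and inline scripts."""
    html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)  # strip comments
    html = re.sub(r"[ \t\r\n]+", " ", html)                  # collapse whitespace
    return html.strip()

page = """
<html>
  <!-- navigation starts here -->
  <body>
      <p>short   and   sweet</p>
  </body>
</html>
"""
slim = minify_html(page)
print(len(page), "->", len(slim), "bytes")
```

Every byte saved is a byte the spider doesn't have to fetch, which is exactly the bandwidth argument earlier in the thread.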
Posted 23 July 2007 - 09:51 PM
Anyway, who says they're going to offer it? They're just asking what people think is important, no?
Posted 24 July 2007 - 05:42 AM
It wouldn't hurt to have a button next to each page in the Webmaster Tools section: you click Validate and it passes the page to the W3C in a new window, leaving the W3C to do the validating with its own tool and bandwidth, while Google provides a quick and easy access point in WMT.
Maybe they could even record the result next to each page. Then Google wouldn't have to validate while crawling; it could simply cross-reference against its DB to see whether a page is valid, and act accordingly. That seems to me a more sensible approach.
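The "record the result, then cross-reference" idea above can be sketched with an in-memory SQLite table. Everything here is hypothetical: the `record`/`is_valid` helpers and the schema are made up for illustration, not anything Google or WMT actually exposes.

```python
import sqlite3

# Hypothetical sketch: one stored validation verdict per URL, so a crawler
# can look it up cheaply instead of re-validating on every crawl.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE validation (url TEXT PRIMARY KEY, valid INTEGER, checked TEXT)"
)

def record(url: str, valid: bool) -> None:
    """Store (or overwrite) the validation verdict for a URL."""
    db.execute(
        "INSERT OR REPLACE INTO validation VALUES (?, ?, datetime('now'))",
        (url, int(valid)),
    )

def is_valid(url: str):
    """Return True/False if we have a verdict, None if the page is unseen."""
    row = db.execute(
        "SELECT valid FROM validation WHERE url = ?", (url,)
    ).fetchone()
    return None if row is None else bool(row[0])

record("https://example.com/", True)
record("https://example.com/broken", False)
print(is_valid("https://example.com/"))        # True
print(is_valid("https://example.com/unseen"))  # None
```

The design point is the same one made in the post: validation happens once, out of band, and the crawl path only ever does a cheap lookup.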
Posted 24 July 2007 - 06:14 AM
Code bloat and errors could be a factor, if not now then sometime in the future. Head-in-the-sand merchants can ignore the possibility at their own risk.
"They will never do that" is a brave assumption, because after all, never is a long time.
Posted 24 July 2007 - 06:45 AM
Occam's razor: the simplest answer is often the best.
No conspiracy. No new knowledge. Nothing to learn here. Move along.
Posted 24 July 2007 - 07:52 AM
But hey, like you say nothing to learn here.
Edited by 1dmf, 24 July 2007 - 08:41 AM.
Posted 24 July 2007 - 08:22 AM
I think you are a bit confused as to what the code does. But then, not looking at the code creates that issue.