QUOTE(gert @ Jan 14 2010, 09:45 PM)
Have heard that quite often as well... to me it would make sense as search engines' attempt to decrease the amount of data to crawl and to have website authors focus more on clean and slim coding - as the amount of data continuously increases, I could well imagine search engines increasingly considering code-related aspects ... I guess it could be interpreted more as "write clean and slim code" than as "have an X:Y code-to-content ratio"
HOW exactly does it "make sense"?
If I have a particular design that requires more markup code than the text contained within it, WHAT exactly has that got to do with "a search engine"?
They are NOT the ultimate authority on design and coding practices on the Internet.
"Decrease the amount of data to crawl"??
That shows a lack of understanding of what SE crawlers actually "do". Just as the people who come up with these wild theories do not understand the fundamentals of "how things work".
It takes microseconds for the source code to be "read" from the server; it would have to be several hundred megabytes of source code (images are NOT loaded) before that took any meaningful amount of time.
I could write a software routine that would remove all markup elements (and just leave the text) from a page of source code in a fraction of a second, so I'm pretty sure something like that could be coded by the programmers who write the SE indexing code.

http://videos.webpro...oogle-sitemaps/
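For what it's worth, here is a rough sketch (my own illustration, not taken from any search engine's actual code) of how trivial that kind of markup stripping is using nothing but Python's standard library. The TextExtractor class and the sample_html string are made up for the example; on a typical page the whole thing finishes in well under a millisecond.

# Minimal sketch: strip all markup, keep only the text, and time it.
from html.parser import HTMLParser
import time

class TextExtractor(HTMLParser):
    """Collects only the text content, discarding every markup element."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

    def text(self):
        return "".join(self.parts)

# Made-up sample page; in practice this would be the fetched source code.
sample_html = "<html><body><div class='wrap'><p>Hello <b>world</b> and welcome.</p></div></body></html>"

start = time.perf_counter()
parser = TextExtractor()
parser.feed(sample_html)
elapsed = time.perf_counter() - start

print(parser.text())               # -> "Hello world and welcome."
print(f"{elapsed * 1000:.3f} ms")  # typically a tiny fraction of a millisecond

Scale that up to a real page of a few hundred kilobytes and you are still only talking milliseconds, which is the point: parsing out the markup is not the bottleneck for an indexer.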