Matt Cutts Giving Out Clues
Posted 27 July 2007 - 11:32 AM
Sir Crow spoke about how little of the "Total Web" was actually indexed.
Increasing the total crawl depth by even a fraction would require an enormous amount of additional resources.
So, logically, JEH has a valid point.
If PageRank is high, they may expend more resources to crawl through messy code.
Posted 27 July 2007 - 10:53 PM
If I, as a user, search on Google and click through on a search result to a site that does not render properly in my browser (this happens to me often because I use Firefox), then the quality of my search experience is diminished. I have to either start up IE (and I don't do this very often, as I figure if someone can't design a site to render properly in more than just IE, then why should I bother to visit it), or try another site in Google's search results and hope that it will render properly. This can get more than a bit annoying.

But if every site I looked at rendered well in my browser (standards compliance would perhaps be one way of helping to achieve this), then from my point of view the quality of my search experience would improve. Since I personally don't often swap to IE anymore to view sites that don't render properly, that would be a vast improvement for me.

That said, I totally understand that not everyone would want search results skewed to include only the sites that render across the most popular browsers. Maybe a little checkbox next to the search box that people like me could tick if they wanted this option would be handy, though. I do think that search engines will continue to move towards providing a more personalised experience, because they do want to keep as many of their users happy as possible.
With an ever-increasing shift to browsers such as Firefox, search engines must be considering the ramifications and thinking of ways to combat problems such as the one I described above. This article http://www.xitimonit...-1-2-3-102.html describes how "The ascension of Firefox continues… Nearly 28% average use rate in Europe in the beginning of July 2007, with a progression in the totality of the 32 European countries studied." They state the perimeter of the study as: "conducted from Monday, July 2 to Sunday, July 8, 2007 with a Perimeter of 95,827 websites."
This study appears to have concentrated on 32 European countries; however, it does also mention a worldwide trend, which I found quite interesting.
Posted 27 July 2007 - 11:08 PM
Posted 28 July 2007 - 07:22 AM
To begin with, a site can quite easily pass normal standards testing yet still display quite badly in one browser or another. That's not so much a standards issue as it is a cross-browser support issue, as Bob noted above. Call it lazy design protocols.
However, if they made it part of Personalization it might be doable, since at the very least Personalization would allow you to remove a site from showing up in your future search results. Of course, this removal might also keep you from later seeing pages from that site that don't have design issues in FF, because Google could assume you were removing it for another reason entirely. This is why it would take a large leap of faith on their part to start penalizing sites like this.
FWIW, I think you're probably right, Bob. I've not seen any studies either, but I would hazard a guess that the percentage of cross-browser-compliant pages is quite small, especially if you start including things like the old Mac/IE browsers, older versions of Netscape, and even older versions of Windows/IE that are still running around out there.
Posted 30 July 2007 - 09:16 PM