How Smart Are SE Robots?
Posted 18 November 2003 - 05:22 AM
I highly doubt that I'm the first person to come up with this theory-- in fact, I imagine this has been discussed here before. But I've been thinking...
The more prominent SEs are constantly striving to produce more relevant results, and to do that, they try their best to make their robots think like actual people. Of course their robots can't visually process a page the way a person can. All they can do is process the text that makes up a web page.
What I've been thinking about is that there are clues in the text of a web page about its appearance.
First of all, I want to tell you why I'm thinking about appearance--the visual aspect of a website. I'm thinking about it because I believe it's the most important aspect of a website. I know that most people think content is the most important part of a website. But to the human eye, what's even more important is the way the content appears. For example, if it's not large enough to read, or if it doesn't contrast enough with the background, it's not much good to you. And there are a huge number of other factors in the visual aspect of a website that affect its overall readability and, more importantly, usability.
So here's a question for you. If the goal of most search engines is to produce more relevant results, why wouldn't things like the visual aspect of a website be considered? I mean, if we were all just a bunch of text-processing computers, I could see why content would be the only thing that matters. But because we're not computers, don't you think that appearance is important? And don't you think that SEs would consider the appearance of a website to be important?
Which of the following websites would the average person prefer? A website with top-notch information, with black text that covers the entire width of a computer screen, with no pictures... nothing but text links and choppy headings. Or a website with good (but not quite top-notch) information, laid out in Verdana instead of the browser's default font, with pleasing colors, a few graphics, and column widths that facilitate easy reading?
I really don't know for sure which website the average person would prefer (I honestly don't). But I'm guessing that the majority of people would choose good (but not the best) information over top-notch information, if it came in a nicer package. I know that most people go to websites looking for information... but I don't think people are so enthralled with information that they want nothing but information. I tend to think that most people want a pleasant viewing experience, and that doesn't come in your average "SiteSell" powered website (nothing against that product... it's just the best example I could think of).
Anyways, back to where I started. Let's say that people do consider the appearance of a website to be important, and let's say that it actually is a factor in the popularity of a website. Then don't you think that SEs would place some stock in the appearance of a website?
The thing is that even though SE robots can't look at a web page the way a person can, they can get a general idea of the appearance of the website. For example, they can (or could, if they wanted to) identify the main content section of a website, and determine if the lines of content appear in a readable width. They could also be checking the contrast between the background and the text. Even more, they could be looking for pleasing color combinations like dark-blue/yellow. And who knows, they might consider a few images and maybe even a Flash presentation to be an indicator of quality. And all of these might be factors in determining what they consider to be the quality of a website.
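To give an idea of how mechanical such a check could be, here's a rough Python sketch of just the text/background contrast part. This is my own speculation, not anything any engine is known to run; the colour values are made-up examples and the formula is the standard relative-luminance contrast calculation from web accessibility guidelines.

def luminance(hex_colour):
    # relative luminance of a '#rrggbb' colour, 0 = black, 1 = white
    r, g, b = (int(hex_colour.lstrip('#')[i:i + 2], 16) / 255.0 for i in (0, 2, 4))
    def channel(c):
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(colour_a, colour_b):
    # higher ratio means easier to read; 1.0 means the colours are identical
    lighter, darker = sorted((luminance(colour_a), luminance(colour_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(contrast_ratio('#777777', '#ffffff'))  # mid-grey text on white: roughly 4.5
print(contrast_ratio('#fefefe', '#ffffff'))  # near-white on white: barely above 1, unreadable

A robot that can pull colour values out of a page's markup could run something this simple and get a crude "can a human actually read this?" signal.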
I bet if they're not now, they will be some day.
Well, that's about it... that's something I've been thinking about lately. I'm curious what some of you experienced SEO people think about that. I'm also curious if you have any other thoughts about SE relevancy besides keyword density and link popularity/page rank.
Posted 18 November 2003 - 07:59 AM
As a general rule, the robots/spiders just fetch the pages of a site. Another piece of software analyzes the site later. Because sites are not always instantly accessible, the robot is usually programmed to make several attempts to retrieve a site. You can also select which files to retrieve, and which to ignore. Since you can't perform a proper analysis without all of the pieces, it would slow the process down considerably if you made the robot/spider do all of the work. The robot saves what it can fetch, and will go back for the rest later.
Yes, the robot could perform basic preselection, and flag a site as not worth the effort. Due to the number of sites that contain the same information, the surfer would still find what she is looking for.
This would also convince ... hopefully ... people with sloppy sites to clean up their act.
Speaking of fonts, while many tests have been conducted to determine which is more 'readable', have you noticed that some words just don't look right in some fonts? I sometimes double-check the spelling of words that I'm sure I've typed correctly, just because they don't look [ feel ] right. I don't mean the serif/sans serif issue. Sometimes the typed word doesn't look the right size/shape.
As for page rank, it was probably a good idea when it started, but now it just encourages people to produce sites in quantity rather than quality.
What do you think of this as a motto for a search engine:
'Your No Crap Search Engine' ... needs a little work, but I like it.
Going to stop typing now.
Must get more coffee.
Posted 18 November 2003 - 08:24 AM
The truth is, though, that THE most important thing is to get targeted visitors to your site. Spookily, I just posted a comment by a websurfer in another post; here it is again:
"If I'm going to a Web site, I want information. I want
information quickly. It could be written in 10 point pica
for all I care. I'm already interested in what might be
there, why turn me off?"
This is something I have always said: people will struggle if the content is good enough (not that they should have to). Content is king but usability is queen. Easy navigation and clear information presentation will keep your visitors, and convert better, but getting them to you is THE most important part. You could design a killer leaflet giving away £10 notes (a service everyone would want), but if you pin that leaflet to the top of every telegraph pole you ain't gonna get any custom.
Posted 18 November 2003 - 01:39 PM
Anyways, you said usability is queen. So we both agree that it's something that's very important to a user. My question then is why wouldn't it be important to an SE? And like I said, there are ways an SE could be checking a site for usability by processing the text that makes up the page--very easily.
The main question I was trying to ask in my previous post is: do you think it could be happening? I mean, do you think that SEs could be checking a website's usability? Or if they're not now, don't you think they will be someday? I personally think they'd be stupid not to be. I mean, the goal of every search engine is to provide users with what they're looking for. Users are looking for usable sites. Yes, they're looking for information. But if that information isn't presented in some kind of usable form, they'll probably click away in a few seconds. And when they click away in a few seconds, they go away unsatisfied because they didn't find what they were looking for. And that's not something an SE wants.
I don't know... I just don't think the average person is quite the information maniac we make them out to be. Yes, people want information, but I think they prefer to have it spoon-fed to them.
Just my opinion... but I'd make a bet with anyone right now that within 5 years usability will be a topic at SEO forums. And SEs really could be distinguishing between usable sites and unusable sites, if they took the notion.
Have a good one,
Posted 18 November 2003 - 02:10 PM
Just do a simple test. Take one of the most obscure web pages in your site. Block and copy 10 consecutive words. Then do a Google search of those 10 consecutive words within quotation marks. You'll be surprised at the result.
I just did it for the phrase "There are companies out there trying to develop product that" which occurs in one of my newsletters. Lo and behold, Google serves up the exact newsletter. It took only 0.92 seconds. I find that mind-boggling.
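For anyone who wants to repeat that test without the copy-and-paste, here's a trivial Python sketch. It only builds the ordinary quoted-query URL for the phrase quoted above; it says nothing about how Google performs the lookup on its end.

import urllib.parse

# the ten-word phrase from the newsletter mentioned in the post above
phrase = "There are companies out there trying to develop product that"
url = "https://www.google.com/search?q=" + urllib.parse.quote('"' + phrase + '"')
print(url)  # paste this into a browser to run the exact-phrase search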
A usability evaluation for every web page would be a much more complex process. If Google could get it down to an operating procedure, then why do we need Jakob Nielsen?
Posted 18 November 2003 - 02:17 PM
In an indirect way, however.
For instance, a site whose navigation is confusing and complicated may also not be crawler-friendly, and thus never get indexed.
There are certainly many instances of that.
They can't really check for usability, but very often what makes a site usable for a person also makes it usable for a search engine spider.
Posted 18 November 2003 - 02:43 PM
Back to bots, and usability testing ... it gets too complicated. Google is already using 10,000 computers. We could just line up all the bad designers/SEOs, and shoot every tenth one. Bet we wouldn't have to do that more than a couple of times. It's just a theory. Maybe we could test it on lawyers first.
Posted 18 November 2003 - 02:51 PM
I have always found that any site with a good navigation system gets a good score. Sure, this is the result of PR again, but it is obvious. And spiders can read a page; they can read text colours and background colours to stop spam.
But what designates good colour selection? If the site is being built for partially sighted visitors, then chances are it will have big fonts on a black background with white or bright yellow text. You or I may think it vile, but to a partially sighted person it will be heaven.
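Just to show how simple that colour check could be in principle, here's a rough Python sketch; it's my guess at the idea, not anything a real spider is known to do. It only looks at old-style bgcolor/font markup, and the sample page is invented; real pages with stylesheets would take far more work.

from html.parser import HTMLParser

class ColourCheck(HTMLParser):
    # collects the body's bgcolor and the colour of any font tags
    def __init__(self):
        super().__init__()
        self.bgcolour = None
        self.text_colours = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "body" and attrs.get("bgcolor"):
            self.bgcolour = attrs["bgcolor"].lower()
        elif tag == "font" and attrs.get("color"):
            self.text_colours.append(attrs["color"].lower())

# an invented example page with white text on a white background
page = '<body bgcolor="#ffffff"><font color="#ffffff">cheap cheap cheap</font></body>'
checker = ColourCheck()
checker.feed(page)
if checker.bgcolour and checker.bgcolour in checker.text_colours:
    print("text colour matches background colour: looks like hidden text")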
Posted 18 November 2003 - 03:15 PM
Search engines cannot visually see graphics, and IMO good-looking graphics combined with good content make for a great web site. Take the smileys that are available on this forum, for example. Almost all of them rank all over any other forum I have been on, but to a search engine they all look pretty much the same (<img src='imagename.gif' border='0' height='x' width='x' alt='x'>).
So if overall usability of a site is evaluated by not only how easy it is to navigate, how easy it is to find information, etc., but also on the aesthetic look of the site, then search engines will never be able to evaluate a site in the way we who have eyeballs can.
Posted 18 November 2003 - 03:43 PM
I kind of agree with what Jill said... sites that are user-friendly are often SE friendly. I was reading an article not too long ago though (I think it was here), about how SEO often makes sites very unfriendly to the user--the point being that relevancy alone doesn't give users what they're looking for.
I'm still fairly convinced that site quality and usability are important to SEs, and I don't believe that content alone is responsible for the quality and usability of a site. Yes, it's very, very important, but it's not the only thing that matters.
Maybe SEs don't check usability with their robots, and maybe there never will be a need for that. But the quality and usability of a website could be determined by other means, like link popularity. Isn't the main purpose of link popularity to determine the quality of a website?
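For what it's worth, the basic idea behind link popularity can be shown with a toy calculation: a page scores well if well-scoring pages link to it. This is only a textbook-style illustration of the published PageRank formula on an invented three-page web, not Google's actual system.

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}  # an invented three-page web: page -> pages it links to
damping = 0.85
rank = {page: 1.0 / len(links) for page in links}  # start with every page equal

for _ in range(50):  # repeat until the scores settle down
    new_rank = {}
    for page in links:
        # each page that links here passes along a share of its own score
        votes = sum(rank[src] / len(out) for src, out in links.items() if page in out)
        new_rank[page] = (1 - damping) / len(links) + damping * votes
    rank = new_rank

print({page: round(score, 3) for page, score in rank.items()})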
The problem with link popularity, like everything else, is that people will find a way to cheat with it. So basically, I see the trend in SE relevancy algorithms as making SE robots smarter, and making them think more like people while they analyze individual pages for quality and relevancy--not just relevancy. I'm not saying that other pages at the website and links to that website don't matter, or that they won't be a factor. But I see the futuristic SE robot "looking" at a page individually (the way a person would), and determining whether it's relevant and user-friendly, or relevant and not user-friendly, or not relevant and not user-friendly (something along those lines). Whatever it takes to give users what they're looking for, and to keep crafty SEO people from gaming the system.
As to Google already using 10,000 computers: well, I think they'd use 12,000 or 15,000 if that would allow them to produce more relevant results, because more relevant results would produce more users, which would produce more money.
Don't you think that when Google decided to start using PageRank they had to install more computers? I'm going to say they probably did. Why? Because they predicted that it would produce more relevant results, which is what they wanted.
OK, enough on that. This is nothing but thinking and speculation on my part. It's nothing that's going to help anyone get higher rankings, or produce more sales. I enjoy talking about stuff like this, though.
Thanks for your replies, and thanks for being a sport while I'm still learning SEO.
Posted 18 November 2003 - 04:28 PM
Perhaps not applicable yet, but it's really neat to consider.
One of the reasons I got really into SEO was that it seemed to make such good sense. And I've seen it mentioned elsewhere-- it's not just search engine optimization, it's website optimization. Because SEO goes hand in hand with usability (which is a very active topic on a lot of webmaster forums!), and with marketing, and with a whole bunch of departments. Properly, a website should be thoroughly considered and optimized for a number of factors-- aesthetics, usability, searchability, sales conversions, etc. So often, when you go to SEO a site, it's a good chance to get most of the rest of your problems tidied up. I've seen services proliferating today that offer a "holistic" approach-- by which I don't mean mumbo-jumbo, I mean in the real sense of "holistic", which means "considering the whole".
Anyway, interesting to ponder upon.
Posted 18 November 2003 - 05:03 PM
SEO doesn't have to make a site unfriendly. SEO that doesn't take the user into account (navigation, marketing, message) can be very unfriendly, and those rankings are useless.
"I was reading an article not too long ago though (I think it was here), about how SEO often makes sites very unfriendly to the user--the point being that relevancy alone doesn't give users what they're looking for."
I don't think search engines can or would take usability into account, but an SEO certainly should IMO. Without conversions, what good is traffic?
Posted 18 November 2003 - 05:16 PM
Well maybe they could if they asked the right people what they thought of the sites they visit!
"I don't think search engines can or would take usability into account"
I have said this before but will say it again, lol. If the SE monitored how much time a visitor spends on a domain, then surely this would be a vote of confidence in the site that is measurable and transcends all the other elements.
No one will stay on the site if they can't find stuff, nor will they stay on a site if it has no real unique content, but IF they do stay, then what better endorsement?
The Google toolbar could do this, I would have thought: if it records where you are, then it can number-crunch the time visitors spend on a site against the volume of visitors to it, resulting in a time-spent-on-site score, and it should be able to come up with a weighting for this in the algo.
It would then not need to read the content: if it was crap, people would hit it and go; if it was relevant, then they would stay; if it was relevant but poorly designed, they would stay less, making it score lower.
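To make the arithmetic concrete, here's a back-of-the-envelope Python sketch of that time-spent-on-site score. The visit records, the domain names, and the 10-minute cap are all invented; nothing here reflects what the Google toolbar actually records or reports.

from collections import defaultdict

# invented toolbar-style records: (domain, seconds the visitor stayed)
visits = [
    ("good-site.example", 240), ("good-site.example", 310), ("good-site.example", 95),
    ("crap-site.example", 8), ("crap-site.example", 12),
]

CAP = 600  # cap one visit at 10 minutes so a tab left open doesn't skew the average

totals = defaultdict(lambda: [0.0, 0])  # domain -> [capped seconds, visit count]
for domain, seconds in visits:
    totals[domain][0] += min(seconds, CAP)
    totals[domain][1] += 1

for domain, (secs, count) in totals.items():
    average = secs / count
    score = average / CAP  # normalise to 0..1 so it could be one weighting among many in the algo
    print(f"{domain}: average {average:.0f}s per visit, score {score:.2f}")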
Just my tuppence.
Posted 18 November 2003 - 06:28 PM
"I see the trend in SE relevancy algorithms as making SE robots smarter, and making them think more like people ..."
Which people? We ain't all built the same.
The problem with trying to quantify quality is that you inevitably muddy the waters beyond any possibility of seeing things clearly. You want a perfect recipe for failure? Try to build something, anything, for the "average" person. The only people you'll appeal to are the ones who really do have 2.3 children. Met any of them lately?
Posted 18 November 2003 - 06:31 PM
"If the SE monitored how much time a visitor spends on a domain, then surely this would be a vote of confidence in the site that is measurable and transcends all the other elements."
Except, if I click on a page and then decide to go out for a pint and come back to it later...