What does "increasing personalisation" mean. I agreed with everything you said in your post up until this point and then you lost me. Please help.
A site that doesn't use any optimisation techniques at all may still be irrelevant to the user.
"Relevancy" is whatever the user thinks it is. Obviously that will vary depending on ones perspective. I think that's why we'll be seeing increasing personalisation in future.
Posted 03 November 2003 - 09:07 PM
Posted 03 November 2003 - 10:51 PM
There's a good paper here.
Talk is one thing. Doing it is another. However, I've noticed this issue come up more and more in the past year.
Posted 04 November 2003 - 02:50 AM
The authors state that the problem they are trying to solve is that the degree of “precision & recall of commonly used search engines … are usually very low … Users often have to sift through many web pages to find a small set of documents that satisfy their information needs.”
To solve this problem they put human evaluations at the centre of their information retrieval (IR) system:
"Based on the user's relevance feedback ... our proposed approach can learn and approximate the user's underlying ranking function and apply it to unseen documents" Section 2, last Paragraph, page 4.
GIGO, an ancient acronym, comes to mind immediately: garbage in, garbage out. By which I mean that if any ranking system depends on human input, it will be open to manipulation at some level (i.e. spam) that will need to be filtered out to give the most “relevant” results.
OK, different people have different notions of relevance, depending partly on the reason for searching. An academic searching for “truth” may be an empirically minded nuclear scientist or a Wittgensteinian linguistic philosopher, each with very different notions of truth & relevancy. There are attempts to compartmentalize information according to groups of experts to give each academic the info relevant to their particular niche.
For authoritative results, Google's use of linking as a factor in relevance tried to automate the value given to the user's underlying ranking function. The result: link farms, selling links, weird & wonderful linking strategies, blog noise et al. The use of human reviewers via DMOZ has its own unique problems & quirks.
Teoma is often mentioned in the context of personalization of SERPs; it uses “related link collections by experts and enthusiasts”, so no real change there.
Alexa also tries to personalize with "also visit" & reviews. A look at the attempts at self-promotion in the reviews of Google shows pathetic & blatant abuse of the human review process. But they are an example of a weakness in any system that ultimately depends on "user's relevance feedback".
Amazon uses similar personalization, “also bought”, with great success, but I am doubtful as to whether it is an effective model for searching the entire web.
Google presumably has access to the largest collection of user search data but still returns far too many pages of irrelevant results in many cases. If Google is struggling to increase the relevancy of SERPs & filter out spammy results with all its experience, knowledge & PhDs, is there any chance of any other organization or business coming up with an IR system with personalization in the form of "user's relevance feedback," as proposed by Fan, Gordon & Pathak, as its foundation?
Their system may be useful in closed systems like intranets, but I think it is limited and limiting if applied to the WWW. To look beyond SERPs & perhaps go back to the idealism of freedom of information increasing through the development of the internet & WWW, I wish to point out that the control of knowledge is political. I relish the eclectic nature of information currently available. Even if my SERPs were to be limited/filtered by my own preferences & search history alongside whatever measures of authority, I remain doubtful about personalisation as the basis of the future of web search.
Edited by Jentotaltravel, 04 November 2003 - 03:36 AM.
Posted 04 November 2003 - 03:55 AM
Machine systems are also weak. Statistical weighting, such as calculations based on frequency, typified by keyword density and link popularity, do not necessarily generate relevance. A page matches a keyword query and is popular - so what? If I say "I'm hungry" that does not therefore mean I will always be satisfied with McDonalds.
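To make that criticism concrete, here is a minimal sketch (weights and pages invented for illustration) of the kind of frequency-plus-popularity scoring being questioned; a keyword-stuffed page with bought links can outscore a genuinely useful one:

```python
# Hypothetical frequency-times-popularity scorer. Nothing here is any
# engine's real formula; it just shows how matching and popularity
# alone need not produce relevance.
def naive_score(page_text, inbound_links, query):
    words = page_text.lower().split()
    term_count = words.count(query.lower())
    density = term_count / max(len(words), 1)
    return density * (1 + inbound_links)

spam = "cheap cars cheap cars cheap cars " * 20   # keyword stuffing
genuine = "We review family cars and list local dealers with prices."

print(naive_score(spam, 5000, "cars"))    # enormous score from stuffing + links
print(naive_score(genuine, 12, "cars"))   # tiny score for the useful page
```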
I'm no authority on the science of search; however, I believe providing relevancy to the user is the ultimate aim. IMHO I can't see the science moving in any useful direction other than providing increased personalisation.
Posted 04 November 2003 - 05:17 AM
So if personalisation becomes the base for providing relevancy to the individual user it will represent a seismic shift from objective ideals of relevancy to individual subjective world views. Somehow I think search will still need to retain an over-arching meta theory of relevancy & position the individual within it.
Posted 04 November 2003 - 06:42 AM
The other issue with personalization would be that, by definition, it favors some results over others. While the "favored" results may be based on the individual user's past actions, I don't see this as an always-spot-on kind of thing. People do different things, and they change. Therefore, basing search results on the user's past actions means not seeing what you might have seen had you not previously conducted a preponderance of one type of search or another.
That is, *if* I accepted a cookie that is *still* on my machine, and I've done a lot of searching in a particular sphere, my current search results would be swayed toward that sphere and related topics even if I were now embarking on a totally different type of search. If I've searched all year for software and, say, answers to web programming questions, will my search for "Christmas gifts" yield UltraEdit and Perl and PHP books for my 80-year-old grandmother? Of course, if I *knew* about the personalization factor, then I could simply delete the cookie -- which means I start at zero.
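A minimal sketch of that sway, with all names and weights hypothetical: a profile accumulated from past searches inflates the score of anything matching old interests, so the programming title can leapfrog the gift guide:

```python
# Hypothetical personalization bias. The profile could live in a cookie
# or a server-side log; deleting it resets the searcher to zero.
from collections import Counter

profile = Counter()

def record_search(query):
    """Accumulate past search terms into the user profile."""
    profile.update(query.lower().split())

def personalized_score(base_score, doc_terms):
    """Inflate a result's score by its overlap with past interests."""
    total = sum(profile.values()) or 1
    bias = sum(profile[t] for t in doc_terms) / total
    return base_score * (1 + bias)

# A year of software and web-programming searches:
for q in ["perl regex", "php sessions", "ultraedit macros"] * 20:
    record_search(q)

# Now a gift guide and a PHP book compete on the query "christmas gifts":
print(personalized_score(1.0, ["christmas", "gifts", "toys"]))        # 1.0
print(personalized_score(0.8, ["php", "perl", "programming", "gifts"]))  # ~1.07
# The programming book overtakes the gift guide for grandma.
```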
Given the number of people who (a) don't know anything about search, (b) are shocked to learn that some search results are paid ads, and (c) don't know what a browser is, the likelihood in the event of "swayed" search results is that they'll simply think the search engine is not that good. It would also confound those people whose results keep changing or are different than those of a friend/partner/associate. (And SEOs may have a good time proving rankings to clients.) Of course, if we're talking search initiated from a search engine contained on the user's own computer, or the fact that some people may not care if their searches are logged, that could be taken into account, too.
It could work, but I think much would have to be resolved. Unless we're talking about a big red "Personalize My Results" button. That would include, of course, the "small print" that warns that your search results are being logged. <g>
Posted 04 November 2003 - 07:00 AM
Given the number of people who (a) don't know anything about search, (b) are shocked to learn that some search results are paid ads, and (c) don't know what a browser is, the likelihood in the event of "swayed" search results is that they'll simply think the search engine is not that good.
So true. I met many members of the public while working for a UK government-backed computer-based training establishment, with courses in every topic but mainly computing, from absolute beginners to programming etc., & it is foolish to assume that the public generally have any understanding of the ins & outs of search - the on/off button used to be the starting point for many learners.
Re the issues & difficulties raised by personalisation, which you clarify in some detail: if it is possible at all, I agree that any such subjective results must be relegated to secondary position after some overall objective relevancy measurement. So personalisation may or may not be a valuable addition for some searchers, but only after a primary ordering of the information available, & thus not the base for the future of search in general.
Posted 04 November 2003 - 07:45 AM
I wonder whether Google's define: suggests a different way of handling the problem. If you do a search for define:lederhosen, then you get a definition of lederhosen. Suppose Google were to widen the concept. Why can't I put in the search field a first word that helps to define the type of thing I'm looking for, a colon, and then a keyword for what is currently interesting me? So I should be able to do searches like this:
You get the idea. How would this approach fly with everyone?
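As a sketch of how such a syntax might be handled (the qualifier names besides define: are hypothetical; nothing like this existed at the time):

```python
# Hypothetical "qualifier:keywords" parser. Only define: was real;
# shopping, news and local are invented examples of widening the concept.
KNOWN_QUALIFIERS = {"define", "shopping", "news", "local"}

def parse_query(raw):
    """Split 'qualifier:keywords' into a search vertical and the keywords.
    Queries with no recognised qualifier fall through to ordinary web search."""
    head, sep, tail = raw.partition(":")
    if sep and head.strip().lower() in KNOWN_QUALIFIERS and tail.strip():
        return head.strip().lower(), tail.strip()
    return "web", raw.strip()

print(parse_query("define:lederhosen"))     # ('define', 'lederhosen')
print(parse_query("shopping:lederhosen"))   # ('shopping', 'lederhosen')
print(parse_query("plain old lederhosen"))  # ('web', 'plain old lederhosen')
```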
Posted 04 November 2003 - 11:09 AM
Why can't I put in the search field a first word that helps to define the type of thing I'm looking for, a colon, and then a keyword for what is currently interesting me? So I should be able to do searches like this:
People who are already well practised in using various search syntaxes might be fine with that, but wouldn't Joe Public rather see those qualifiers already on the page - as they are in the directory, or as in Froogle for shopping? All of which detract from Google's super-shiny one-simple-box-suits-all & turn it into a portal of sorts.
I suppose, in the extreme, successful spammers, meaning those who manipulate search results to their ends, are already practising the ultimate in personalization by putting what they want the searcher to see in front of their eyeballs.
Not quite what a successful search company intends, but as such the spammers are making the results relevant to them - the question is how to stop that & return the most relevant SERPs from all the web, not just sections of it.
Until then it seems that the "bad guys are winning" for now.
Posted 04 November 2003 - 02:50 PM
such personalization would require either cookies (which can be deleted, intentionally or not) or a login
Or a toolbar.
I agree that any such subjective results must be relegated to secondary position after some overall objective relevancy measurement
Objective relevancy? Is there such a thing?
successful spammers, meaning those who manipulate search results to their ends
How about localisation? I see this as being the first step on the road to personalisation and it works very well, on AdWords at least. Different results based on originating query IP. I don't want to buy cars in California, I want to buy them in Wellington, New Zealand.
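A minimal sketch of that first step, with made-up IP ranges and ads: map the originating IP to a region, then prefer ads whose advertisers targeted that region:

```python
# Hypothetical IP-based ad localisation. The IP ranges, regions and ads
# are invented for illustration; real geo-IP databases are far larger.
import ipaddress

IP_REGIONS = [
    (ipaddress.ip_network("203.97.0.0/16"), "Wellington, NZ"),
    (ipaddress.ip_network("64.58.0.0/16"), "California, US"),
]

ADS = [
    {"text": "Cars for sale - Wellington", "target": "Wellington, NZ"},
    {"text": "Cars for sale - California", "target": "California, US"},
]

def region_for(ip):
    """Map the searcher's IP to a region, if any range matches."""
    addr = ipaddress.ip_address(ip)
    for net, region in IP_REGIONS:
        if addr in net:
            return region
    return None

def local_ads(ip):
    """Return only ads whose advertiser targeted the searcher's region."""
    region = region_for(ip)
    return [ad for ad in ADS if ad["target"] == region]

print(local_ads("203.97.10.5"))  # only the Wellington ad
```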
Posted 05 November 2003 - 02:38 AM
As to objective relevancy, well, Google has to decide whether the ford you are searching for is a car, a crossing place or a film star, etc., prior to assigning locality.
Targeted AdWords & web search results are about as similar as chalk & cheese in how location is assigned, according to Google. When describing regional targeting to AdWords advertisers, Google states that it relies on the searcher's IP address, which is then matched to the region/area already specified by the advertiser.
Searchers using Google Labs' "search the web by location" are asked for a
US address, city & state, or zip e.g. 123 Main Ave, 94043 or Dallas, TX or 94043
The geo search query is then matched to hints Google finds on web pages.
"We analyze the entire content of a page to extract hints or "signals" that enable us to assign a corresponding physical location, then return results that match the geographic range you specify. "
As to SEOs being spammers: yeah, spammers try to use optimisation techniques to get their URL to the top of the SERPs, but I think most of the mainstream SEOs would disassociate themselves from those who, for example, rank 1st for "Disney" then switch the searcher to hard-core porn sites. However, checking out popular spam techniques can indicate which techniques are likely to be the next to be penalised.
That takes me back to asking: does Google care about the techniques used if the result is relevant to the user? I say yes, because those techniques which work are the ones likely to be abused by spammers.
Posted 05 November 2003 - 06:28 AM
Longhorn will be local, in a sense embedded in the OS of your computer; no cookies, no nothing required to learn about and track not only your preferences but those in your sphere of influence as well.
That is, *if* I accepted a cookie that is *still* on my machine, and I've done a lot of searching in a particular sphere, my current search results would be swayed toward that sphere and related topics even if I were now embarking on a totally different type of search. If I've searched all year for software and, say, answers to web programming questions, will my search for "Christmas gifts" yield UltraEdit and Perl and PHP books for my 80-year-old grandmother?
It could leverage what you have on your own machine and augment that with "web results". It will know what you have searched for, for granny, in the past and will use that, not your "personal preferences". It can and will get smarter about you as you search more. Big Brother won't be in a TV set; he will be in your computer. That Big Brother will be M$ is, IMO, frightening, and I'm not an M$ basher; it's just a judgement based on past performance. Security is the "issue", or rather their lack of real concern about it.
IMO, SEs "indicate" relevancy; users "determine" relevancy, because it is subjective and personal. IMO, for that reason alone it makes sense to personalize search.
Joe's legitimate links page of dubious quality is likely relevant to Joe, or hopefully is, since that is the premise that some engines are using for ranking.
Posted 05 November 2003 - 10:50 AM
A bunch of personalized search patents have been granted in the last 3 months that are likely to begin affecting the web.
NEC, Philips, AltaVista, IBM, Ask Jeeves and a few others all kicking in.
One of them had to do with the creation of virtual user groups. The SE would not know who you were, but would know that you belonged to one or more "virtual user groups" with some common interests. It would then place more weight on results from that group. I.e. a search on "cat" would return cats for an animal user group member but Caterpillar Tractors for a construction company one.
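A minimal sketch of that idea, with invented group names, vocabularies and weights: the engine knows only a group membership, and boosts results sharing vocabulary with the group's interests:

```python
# Hypothetical virtual-user-group reranking; not the patent's actual method.
GROUP_INTERESTS = {
    "animal_lovers": {"pet", "feline", "breed", "veterinary"},
    "construction": {"tractor", "excavator", "heavy", "equipment"},
}

def rerank(results, group):
    """results: list of (title, terms, base_score). Boost docs that
    share vocabulary with the searcher's virtual group."""
    interests = GROUP_INTERESTS.get(group, set())
    def boosted(result):
        title, terms, base = result
        overlap = len(interests & set(terms))
        return base * (1 + 0.5 * overlap)
    return sorted(results, key=boosted, reverse=True)

results = [
    ("Caterpillar Tractors", {"tractor", "heavy", "equipment"}, 1.0),
    ("Cat Breeds Guide", {"feline", "breed", "pet"}, 1.0),
]
print(rerank(results, "animal_lovers")[0][0])  # Cat Breeds Guide
print(rerank(results, "construction")[0][0])   # Caterpillar Tractors
```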
Another patent used a database on the user's computer that personalized the results.
Oh - and Google was just granted a patent (September 2) that I think will change a LOT of how the results are posted. It's a duplication detector that seems quite good, but will have huge implications for SEOs and copyright holders. Not all of them good. I'm saving it for a different post (and re-reading it several times to make sure I fully understand it).
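The patent's actual mechanism isn't reproduced here; as a generic illustration of duplicate detection, a minimal w-shingling sketch shows how two pages with heavily overlapping shingle sets get flagged as near-duplicates:

```python
# Generic w-shingling duplicate detector, illustrative only.
def shingles(text, w=4):
    """Break a document into overlapping w-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(max(len(words) - w + 1, 1))}

def resemblance(a, b, w=4):
    """Jaccard overlap of the two shingle sets: ~1.0 means near-duplicate."""
    sa, sb = shingles(a, w), shingles(b, w)
    return len(sa & sb) / max(len(sa | sb), 1)

original = "the quick brown fox jumps over the lazy dog near the river bank"
scraped  = "the quick brown fox jumps over the lazy dog near the river"
print(resemblance(original, scraped))  # 0.9 => flagged as near-duplicate
```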
Edited by mcanerin, 05 November 2003 - 12:41 PM.
Posted 05 November 2003 - 04:36 PM
There were a bunch of personalized search patents
Indeed. There's a lot of work being done in this area of late. As I said, who knows if it will result in anything concrete, but it may give an indication as to the direction of future developments.
Edited by peter_d, 05 November 2003 - 05:24 PM.
Posted 10 November 2003 - 11:18 AM
If they implement the LocalRank patent, the results will be re-sorted according to links from within the same result set. Non-relevant links would still count towards overall link pop, but in the actual sorting, relevant links would be far more powerful.
Yes, there are a lot of holes in that strategy too, but I would be very interested to see how that works out.
Not quite, according to my understanding of LocalRank.
The sites are first ranked by the old ranking method, then the local ranks are added to the old ranks and the results re-sorted, but the kicker is that there are some undisclosed multipliers in the formula that could make either the old rank or the local rank dominant.
I don't see anything in the local rank formula that has to do with relevance, just with reducing the links from one class C address to one per page, and then re-ranking the page as above.
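Putting that understanding into a minimal sketch (the multipliers here are invented; the patent's are undisclosed): rank by the old method, count votes only from other pages within the result set, at most one per class C network, then combine and re-sort:

```python
# Hypothetical LocalRank-style reranker based on the understanding above,
# not the patent text itself.
def class_c(ip):
    """First three octets identify the class C network."""
    return ".".join(ip.split(".")[:3])

def local_rerank(results, links, m_old=1.0, m_local=2.0):
    """results: list of (url, ip, old_score).
    links: set of (from_url, to_url) pairs among pages in the results.
    m_old / m_local stand in for the undisclosed multipliers."""
    by_url = {url: (ip, old) for url, ip, old in results}
    local = {url: 0 for url in by_url}
    seen = set()  # (source class C, target) pairs already counted
    for src, dst in links:
        if src in by_url and dst in by_url and src != dst:
            key = (class_c(by_url[src][0]), dst)
            if key not in seen:       # one vote per class C per target
                seen.add(key)
                local[dst] += 1
    combined = {url: m_old * old + m_local * local[url]
                for url, (ip, old) in by_url.items()}
    return sorted(by_url, key=lambda u: combined[u], reverse=True)

results = [("a.com", "1.2.3.4", 10.0),
           ("b.com", "1.2.4.4", 8.0),
           ("c.com", "5.6.7.8", 6.0)]
links = {("a.com", "c.com"), ("b.com", "c.com")}
print(local_rerank(results, links))  # c.com's two local votes lift it above b.com
```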