Self-appointed SEO Expert Has An Opinion
Posted 06 January 2004 - 12:08 PM
Would this explain why so many directory sites seem to have occupied the top slots, with Google favouring resources as opposed to actual content in certain search topics?
Seems odd; after all, adding a bunch of links to resources is pretty quick and easy, and open to abuse.
Posted 06 January 2004 - 12:54 PM
I'll give you my understanding of this, though. Dan (or someone else with a more scientific approach) will probably have some good insights to add to my findings.
As with my observation that "themed" links have been a factor (at least in some way or another) since December 2002 or so, I've believed that outbound links do have some bearing on this. In the same way that an outbound link is a vote for another page, an outbound link can be a vote saying "this page has information related, at least in some way, to what I'm talking about here." We all know that's not always the case with outbound links, so a search engine needs a means of determining whether it's a link for the sake of relevance or a link for the sake of linking. In the same way that Dan's article describes how a page with a link to yours relates to your page, the same analysis can determine how a page that you link to relates to your page.
This can go a long way in determining the "authority" of a site. As a broad generalization, a page with lots of links to it but nothing going out may be useful, but it only presents one view. A page with lots of links to it that presents a view, and then provides links to alternative or more diverse views, is inherently more useful than the former.
Using this in the ranking algo also helps the search engine's goal - to bring someone who has a question as close to their answer as possible, as quickly as possible. The idea is to have the #1 result provide the best answer to the question. With many search terms (though we're seeing that Google's getting better at it) it's not always possible to determine exactly what the person is looking for. A search for "computers" could be someone looking for instructions on how to do something with a computer, or someone looking to buy a computer (or any number of other things). Rather than presenting a site that sells computers followed by a site that provides information on computers (in which case the #1 result is wrong, say, 50% of the time), it would be far better to link to an "authority" site that talks about computers in general and then provides a link to a place that sells them and another link to a place that provides complete information. Or maybe the site itself performs one of the tasks and freely links to one that does the other. Either way, this is a better #1 result because it gives the user something meaningful and then offers them the choice: "Okay, now, do you want to buy or learn?" So, when the user ends their session, the #1 result for their term took them to a site that answered their question - even if it was by linking to another site. This brings the effectiveness of the results from 50-50 to 100% (in this two-option example, anyway - obviously most terms are more complex than this).
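For what it's worth, the hub-and-authority intuition in the last two paragraphs has a published formalization: Kleinberg's HITS algorithm, where a page is a good "hub" when it links out to good "authorities," and a good authority when good hubs link to it. To be clear, this is NOT Google's actual algorithm (nobody outside Google knows that), and the page names and link graph below are invented just to make the shape of the idea concrete:

```python
# Toy HITS iteration. The "review-site" page links out to everything,
# so it should emerge as the best hub -- the kind of page the post above
# argues makes a good #1 result. Graph and names are made up.
links = {
    "review-site": ["shop-a", "shop-b", "tutorial"],  # links out broadly
    "shop-a": [],
    "shop-b": [],
    "tutorial": ["shop-a"],
}
pages = list(links)

hub = {p: 1.0 for p in pages}
auth = {p: 1.0 for p in pages}

for _ in range(20):  # iterate until the scores stabilize
    # authority score: sum of hub scores of the pages linking in
    auth = {p: sum(hub[q] for q in pages if p in links[q]) for p in pages}
    # hub score: sum of authority scores of the pages linked out to
    hub = {p: sum(auth[q] for q in links[p]) for p in pages}
    # normalize so the scores don't blow up
    for d in (hub, auth):
        norm = sum(v * v for v in d.values()) ** 0.5 or 1.0
        for p in d:
            d[p] /= norm

best_hub = max(pages, key=hub.get)        # review-site
best_auth = max(pages, key=auth.get)      # shop-a (linked from two hubs)
```

Nothing here implies Google ran HITS; it just shows that "a page that links out well deserves ranking credit" is a computable idea, not hand-waving.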
In late 2002 I noticed some things that led me to believe that this was happening in some way. Since then, I've done some experimentation (most notably in the directory site in my profile). That site has very few inbound links (except from our forums, where we sometimes link to categories as an aid to visitors with questions) and has never had any SEO done to it. I designed it in a way that was SEO friendly, but no one has ever sat down and analyzed keywords and seen if we named categories and things in such a way as to be good for search engines.
We don't rank #1 for many terms, but there are 100 or so pretty frequently used terms that we rank in the top 10 for, and Google does send a good amount of traffic to us - even if it is only pass-through traffic (which is really what it's there for, anyway). It's not the specific pages dealing with a specific issue, though, that rank really well. It's the category listings themselves - the pages that define a specific topic and then proceed to link out to sites related to that topic. As directories go, we allow a lot more descriptive text than most for each of our listings, so that helps with the anchor-to-plain-text ratio, but other than that, there's no good reason for the page to rank in the top 10. We encourage natural-language descriptions (you can use a keyword, but don't repeat it). There are virtually no links on the web to it (other than from our forums). It has no particular content of its own (other than the site descriptions). What it does have is a clear topic and a bunch of links to more information on that topic.
So, though my findings are by no means conclusive, not only does it seem logical for search engines to want to add value to an outbound link, it also appears that there must be some value to the link, or these pages simply wouldn't be ranking so well.
Posted 06 January 2004 - 05:54 PM
Finally got around to reading your report - excellent stuff - well done that man! Will recommend it to any clients who mumble about their Google results.
Posted 06 January 2004 - 07:55 PM
Posted 06 January 2004 - 09:15 PM
I think you have correctly nailed the largest pieces, but I still think there is an overoptimization penalty. The rerankings cannot be overly topic sensitive unless Google has assumed Adirondack is search engine marketing. I have not been involved with SEO for the great length of time that many of you have (and presumably do not know as much about it).
I know that some people think the SEO-related terms (seo, search engine marketing, search engine optimization, website promotion...) are altered to mess with SEOs.
I looked at the top 10 for "search engine marketing" and most of them made sense:
marketleap, search engine guide, search engine watch, bruce clay, search engine blog, pandia...
A few of the websites did not seem strongly connected in the SEO field (meaning not a lot of incoming links from SEO-related websites).
For example, Mannix Marketing ranks well on many search engines (so they obviously must know how to do SEO), but their Open Directory category is Basic Service Web Design. Most of their other inbound links (I know the backlink checker only shows some of them) are from a couple of Adirondack-area websites. They have few links (if any) to other SEO websites, and they link back to their own sites from a whole bunch of Adirondack websites using "search engine optimization by mannix marketing".
I think the spread of the words "search engine" from "marketing" in the links and in the page copy is helping them out...
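To make the "spread" idea concrete: one crude way an engine could measure it is the minimum word distance between the components of a phrase on a page. This is guesswork about the mechanism, not a documented Google signal, and the function and sample copy below are invented for illustration:

```python
# Sketch of a phrase-spread measure: how far apart (in words) the pieces
# of a phrase sit in a page's text. Purely speculative scoring idea.
def min_gap(text, part_a, part_b):
    """Smallest word-distance between any occurrence of part_a and any
    occurrence of part_b. Returns None if either word is missing."""
    words = text.lower().split()
    pos_a = [i for i, w in enumerate(words) if w == part_a]
    pos_b = [i for i, w in enumerate(words) if w == part_b]
    if not pos_a or not pos_b:
        return None
    return min(abs(a - b) for a in pos_a for b in pos_b)

copy = "our search engine placement team handles internet marketing too"
print(min_gap(copy, "engine", "marketing"))  # "engine" at 2, "marketing" at 7 -> 5
```

Under this kind of measure, a page repeating the intact phrase "search engine marketing" would score a gap of 1 between "engine" and "marketing", while the page above, which scatters the words, would score much worse - which matches the observation about Mannix Marketing either way you interpret it.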
Posted 06 January 2004 - 09:51 PM
I think I am about to stir up trouble and get hated, but...
I think you have correctly nailed the largest pieces, but I still think there is an overoptimization penalty. The rerankings can not be overly topic sensitive unless Google has assumed Adirondack is search engine marketing.
I can't hate you because you've linked to my site. We're like blood brothers now. Besides, I think that Black Hat SEO site is a hoot.
Anyway, let's talk about "overoptimization." You might need to define that for me. There are way too many spammy pages still showing up in the search results for me to believe that there's anything new going on. Spam filters aren't new, or very effective. But I suppose it depends on what you mean.
I define an optimized web page as "a well-structured HTML document that clearly conveys the topic." Here is an example:
I have pages that are optimized with targeted search phrases in the <TITLE>, in a big old <H>eading at the top of the page, in the first words of the first sentence of the first <P>aragraph, in a sub<H>eading, in the <BODY> text a couple times, repeated at the end of the page in the fine print, and contained in at least one <A>nchor. Yet these pages still appear prominently in the search results for that search term. They even have the exact search term used in incoming links.
Maybe that's just "optimized" and not "over optimized," though. Maybe if I used the search term in the ALT attribute of a few <IMG> tags, slid it into a hidden layer, came up with 10 different variations of it and used them in separate links to the same page, and that sort of thing, then I'd be over-optimizing. In my opinion, I'd just be spamming at that point, but you get the idea.
It all depends on what you mean.
To really understand where I'm coming from on the topical bias, you have to look at a lot of SERPs, and a lot of the web maps around the top 10-20 sites. Look at the web maps around the top 10 sites for "fruit basket" and "laptop rentals." If you do, you'll soon be able to tell me how "anime" gets related to "fruit basket" and "platypus" becomes relevant for "laptop rentals," and why Google still has some work to do.
Edited by DanThies, 06 January 2004 - 09:57 PM.
Posted 06 January 2004 - 10:12 PM
End of the day, if your keyword density is too high in any of the areas listed above, then the value of those keywords begins to diminish. The higher you are above what Google picked as the "optimal range", the more noticeable the lack of value becomes. It's not a penalty or a filter in the punitive sense; it's just a keyword density check across several levels (overall, link text, emphasized text) with a cap. Get inside that mysterious "optimal range" and it's good. Get too far above it and (as far as ranking goes) you may as well not have put the words on there in the first place.
I've looked at several examples lately where this seems to be the case, though I could be wrong.
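If the cap idea above is right, the scoring shape would look something like this toy sketch. The 8% threshold and the scoring function are pure invention on my part - nobody outside Google knows the real numbers (or whether such a cap exists at all) - but it makes the "may as well not have put the words there" behavior concrete:

```python
# Hypothetical density cap: keyword occurrences earn credit only up to
# some "optimal" band; way past it, the credit collapses to nothing.
# The 8% cutoff is an invented example value, not a known figure.
def capped_density_score(text, keyword, cap=0.08):
    words = text.lower().split()
    if not words:
        return 0.0
    density = words.count(keyword.lower()) / len(words)
    if density > cap:
        return 0.0  # way over the cap: as if the words weren't there
    return density  # inside the band: full credit

# Repeating the keyword past the cap stops helping entirely:
normal = "widgets " * 5 + "filler " * 95    # 5% density -> scores 0.05
stuffed = "widgets " * 40 + "filler " * 60  # 40% density -> scores 0.0
```

The same function could be run separately per level (overall text, link text, emphasized text), which is what the post describes as a "filter across several levels."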
Posted 06 January 2004 - 10:27 PM
I have navigation pages that aren't getting traffic (which is a good thing for my visitors, because they get straight to the content instead), and these tend to have high keyword density because there are so few words on the page. This one used to be in the top ten for "marketing ezine," which never made any sense to me.
Posted 06 January 2004 - 11:06 PM
can't hate you because you've linked to my site. We're like blood brothers now. Besides, I think that Black Hat SEO site is a hoot.
I think you are only the second person here who has told me positive comments about the (un)official black hat SEO directory. I will send stickers to the first dozen... I've got a lot of respect for anyone who likes such a fine resource.
Where I get the overoptimization theory is from some of the specific phrases I was looking at for some people. I think what I term overoptimization Google may term spam, but I think they are assessing a spam penalty related to specific subjects (not just a black-and-white spam/not-spam decision).
The bell curve idea is what I am talking about. If it's way too much, and all the links seem too similar, then it's not a good thing... The biggest problem is that the internet is a massive database and the technology has only recently been spread heavily across the entire medium. Since all of those Adirondack websites link into the SEO site, maybe it is possible that Google parallels the ideas and thinks they are extremely related? Perhaps if they go a bit further, or if Google changes its technology slightly, they will disappear...
The hardest thing to be sure of is whether the short-term changes are due to Google or the changes I make (most likely they are usually Google)... One time a customer added a keyword-heavy "about" text block to his page and he disappeared... I removed it and he showed right back up.
The biggest problem is knowing exactly what is going on, since the technology is so new and we are changing things at the same time Google is...
From the sites I have looked at, I have noticed a heavy pattern of
TONS OF KEYWORD A EVERYWHERE
and then a single occurrence (or maybe a couple) of keyword B on high-ranking pages.
Currently the best SEO advice anyone can give is to gather relevant inbound links for the winter. As search engines get more advanced (and their processor power improves), I think that grading sites based on themes and outbound links will also add enough value to the search product to be part of the equations.
While Google's core technology may have changed a bit, their goal has not... thus legitimate optimization still should not be much different than it was before... it is a lot harder to cheese the system by buying PageRank, which was super simple in the past...
Posted 07 January 2004 - 05:58 AM
'Blue Widgets' in your incoming anchor text
'Blue Widgets' in your title tag, twice
'Blue Widgets' in your description tag, twice if you like
'Blue Widgets' as the only bold text on your page
'Blue Widgets' in your H1, H2 tag etc
'Blue Widgets' as the highest density phrase on your site
That should get you tripping nicely, if that's your gig.
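A checklist like the one above is easy to audit mechanically. Here's a rough sketch that checks a page for those placements; the function name, sample page, and regexes are my own invention, and a real tool should use a proper HTML parser rather than regexes:

```python
# Quick placement audit for a target phrase, following the checklist:
# title, headings, bold text, and the body copy. Sketch only.
import re

def placements(html, phrase):
    """Report which of the on-page spots contain the phrase."""
    h, p = html.lower(), phrase.lower()

    def grab(tag):
        # concatenated text inside every <tag>...</tag> pair
        # (\b keeps <b> from matching <body>)
        return " ".join(re.findall(r"<%s\b[^>]*>(.*?)</%s>" % (tag, tag), h, re.S))

    return {
        "title":    p in grab("title"),
        "headings": p in grab("h1") + " " + grab("h2"),
        "bold":     p in grab("b") + " " + grab("strong"),
        "body":     p in h,
    }

page = """<html><head><title>Blue Widgets | Widget Co</title></head>
<body><h1>Blue Widgets</h1><p>We sell <b>blue widgets</b> of all sizes.</p>
</body></html>"""
print(placements(page, "blue widgets"))
```

Checking anchor text and phrase density would extend this the same way; the point is just that every item on that list is a mechanical check, which is exactly why leaning on all of them at once leaves such an obvious footprint.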
Posted 07 January 2004 - 06:26 AM
Sorry to keep you up so late.
Well, if Barry likes it, that just about clinches the deal. All we need is a vote of confidence from Alan Perkins and I can go to bed!
Yes, I pretty much agree with it (as you already knew). I'm interested in the number of topics you think there needs to be. You are coming at it from the perspective of the number of topics in the ODP. It's interesting to consider the maximum number of topics that an individual SERP will actually support - ten. As long as the SERP contains fair representation for each topic area, I don't think there need to be many topics - that's not to say that there aren't.
Posted 07 January 2004 - 08:45 AM