Keyword Themes And Google
Posted 16 February 2005 - 11:28 AM
I've got a question for the Google experts. At the request of one of my clients, I've been focusing on getting two of his sites listed in Google for his desired keywords. The strategy we employ is on-page optimization for the specific keyword and reciprocal linking. The reciprocal linking is done via directory placement and also email request (e.g., we'll link to you if you link to us).
Something's been bugging me about the on-page optimization. The site isn't huge (maybe 60 pages of content), and if we optimize the home page for "blue widgets," there isn't a lot of content to support the term simply because there aren't many pages on the site. So let's say 18 of the 60 pages contain the term "blue widgets," but none of those 18 pages are truly optimized for it (they just mention it in passing, for example). Would my client's site have even a ghost of a chance of being in Google's top 10 for the term if all the competitor sites had 100 pages or more of content optimized for "blue widgets"?
I tested this theory by doing a brief analysis of the top ten pages that come up for the desired term. I looked at a few of the things we all agree are important (PageRank, backlinks, keyword repeats). However, I also looked at two new variables for the top ten sites for my term:
1) How many pages on the indexed domain contain the actual term (e.g. "blue widgets")
2) How many pages from the domain are listed in Google (with or without the term)
I then compared the results with the page I optimized for my client. The results seem to support the theory that Google places greater weight on the amount of content throughout a domain, as well as the number of times a particular term shows up within that content. Nine of the top ten pages came from domains with more than 100 pages containing the targeted term, and (astoundingly) nine of the top ten results came from domains with THOUSANDS (in some cases hundreds of thousands) of pages listed in Google. There was one exception: the #5 result had much lower numbers (though still higher than the page I optimized), and this bears further examination. My client's site had the greatest number of backlinks of all the top ten sites; some of those sites showed only 1 or 2 backlinks. I assume Google did not show backlinks from within a site's own domain because the pages indexed had low PR.
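For what it's worth, a check like this is easy to script. Here's a minimal sketch (the page texts, the helper name, and the example data are all hypothetical; in practice you'd fetch the pages from a crawl of the domain first) of counting how many pages contain a given term:

```python
# Sketch of the analysis described above: given the text of each indexed
# page on a domain, count how many pages contain the target term.
# The page texts below are made-up stand-ins for crawled pages.

def pages_containing(pages, term):
    """Return how many page texts mention `term` (case-insensitive)."""
    term = term.lower()
    return sum(1 for text in pages if term in text.lower())

# Hypothetical example data for three crawled pages:
crawled = [
    "Welcome to our store. We sell Blue Widgets and more.",
    "About us: a family business since 1987.",
    "Our blue widgets come in three sizes.",
]

print(pages_containing(crawled, "blue widgets"))  # 2 of the 3 pages mention the term
```

The same tally, run against each competing domain, gives you the "pages containing the term" column of the comparison.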
What are your thoughts/experiences with this? Do you think Google places greater weight (where all other factors are equal) on sites that have many more pages indexed and also contain many more instances of the keyword you're targeting? So, sites that are thematically-relevant (in terms of the sheer weight of content) for a specific term or terms?
Bottom Line: Does a site with only a few pages stand a chance of showing up in the top ten if it has high PR, a lot of high-quality backlinks, and decent optimization, but far fewer pages than the top-ranking sites, which are all huge portal/content sites?
Thanks and sorry for the huge post!
Posted 16 February 2005 - 12:03 PM
I think it does. But I'm not a pro, so maybe I'm wrong. You said "pagerank, backlinks, keyword repeats." I think PR is not so important. And you did not mention the pages' titles and the H tags; maybe the key in on-page optimization is there, along with keyword density.
But I'll say it again, I'm not a pro.
Posted 16 February 2005 - 12:16 PM
It certainly wouldn't hurt to have a few more that are focused on it, as you could possibly then have 2 pages show up in the results for the same phrase, but any more than that isn't going to help, in my opinion. (Won't hurt, but it's not necessary.)
Think about it. Most sites whose owners have never heard of SEO may or may not have pages that use the same phrases.
My site happens to be focused mostly on search engine optimization, so it does use that phrase on pretty much every page.
However, an ecommerce site that might want to rank highly for "sporting goods" isn't necessarily going to use that phrase on every page. They will use it on the home page, and then they will target individual sports item phrases on the inner pages.
That is fine. That is natural. That is as it should be. That is not a problem for the engines. Just make sure the links that go back to the home page are focused on sporting goods phrases, and the links to the individual pages are focused on their more specific item phrases.
Posted 16 February 2005 - 01:47 PM
At the end of the day I was left with no alternative but to come back to the conclusion that the search engines simply aren't looking at a whole site much at all. It's just page-by-page ranking, as Jill said above.
I had to look just because it's in my curious nature, and because I really do wish the search engines would have some method of factoring Site Themes into the equation. However when push came to shove I still couldn't get away from the fact that one of my #1 terms appeared on only 16 pages of my site, while it appeared on a minimum of 250 pages and a maximum of 1,100 pages of every other site in the Top 20.
When I started really delving into the details of other markets, some would have the #1 spot occupied by a site that had a lot of pages with a given term, while other markets would be the exact opposite.
The only place I can see the number of pages having any effect is in distributing PageRank or Link Popularity within a site. Since it is quite common for every page on any site to all link back to at least the home page, if not a handful of base-level pages, the sheer size of one site can have an effect if no other sites in the market have nearly as many pages.
That's about it though, and even this portion is not a very big effect. Unless you have 20,000 pages and everybody else has 150.
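To illustrate that point with a toy example: here's a rough sketch of a PageRank-style power iteration (not Google's actual algorithm; the four-page site is entirely made up) showing how internal links that all point back to the home page concentrate score there:

```python
# Toy power iteration over a hypothetical 4-page site where every inner
# page links back to the home page. Illustrative only - this shows how
# internal linking distributes PageRank-like score, not Google's real math.

def pagerank(links, damping=0.85, iters=50):
    """links: dict of page -> list of pages it links to. Returns score per page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            share = rank[p] / len(outs)  # each page splits its score among its outlinks
            for q in outs:
                new[q] += damping * share
        rank = new
    return rank

# Hypothetical site: home links to all inner pages, every inner page links home.
site = {
    "home": ["a", "b", "c"],
    "a": ["home"],
    "b": ["home"],
    "c": ["home"],
}

scores = pagerank(site)
assert scores["home"] > scores["a"]  # the home page ends up with the largest share
```

With every page funneling its score home, the home page accumulates far more than any inner page, which is the internal-distribution effect described above.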
Posted 16 February 2005 - 02:20 PM
Thanks for your responses. It makes sense that the internal links should be optimized to support the keywords you're targeting. However, I did want to mention that 7 of the top 10 sites that were in Google for my client's keyword did not contain the targeted keyword in their link text (on the page that was listed). The number one listing did not contain the keyword in its link text. I only looked at the actual pages that were ranking in the top 10, and not any of the pages (on the same domain) that may have linked to those top-ranking pages.
I think I'm starting to talk in circles here...
What I'm trying to say is this - it seems it doesn't matter (to Google) if your optimized page contains the keyword in the link text. Rather, it is more important that the other pages on your site that link back to your optimized page contain the keyword in the link text. Would you say that's accurate? Thanks again!
Posted 16 February 2005 - 02:52 PM
Cool! Nice to see my common sense and intuition are still serving me well!
Posted 16 February 2005 - 03:31 PM
I'd have to make an additional recommendation - think very carefully about the structure of the site. You only have about 60 pages to work with, but you should still be able to do a certain degree of topic analysis and targeting.
Here's an example:
Home Page (targeted for bicycles)
Bicycles > Mountain > Brands > Models
Now, it sounds like your site might only be about the specific model, so let's use the bicycle example again and I'll show you how I'd architect my data:
Model X123 > X123 Components > X123 Gears > X123 Chains
You can see how there is almost always a way to 'dig' deeper into your subject and create a site that has a real topic focus. These are the types of architectures I expect will succeed more and more in the future, because they are built with the right kind of content and interlinking structure. Keep thinking - Narrow > Specific > Detail and you'll do great.
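As a rough sketch of that broad > narrow idea, here's one way such a hierarchy could map onto URL paths (the hierarchy and path names are hypothetical, just to show each level narrowing the topic):

```python
# Hypothetical topic hierarchy, following the broad > narrow structure
# described above, mapped to URL paths one level at a time.

hierarchy = {
    "bicycles": {
        "mountain": {
            "brand-x": {
                "model-x123": {},
            },
        },
    },
}

def url_paths(tree, prefix=""):
    """Yield a URL path for every node, broad topics before narrow ones."""
    for name, children in tree.items():
        path = f"{prefix}/{name}"
        yield path
        yield from url_paths(children, path)

for path in url_paths(hierarchy):
    print(path)
# /bicycles
# /bicycles/mountain
# /bicycles/mountain/brand-x
# /bicycles/mountain/brand-x/model-x123
```

Each deeper path is a more specific page, and each level naturally links up to its broader parent.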
Posted 17 February 2005 - 02:47 PM
Thanks for the advice, I think it's great in terms of usability, but my main question is - if Google only looks at pages and not entire sites, then (theoretically) it shouldn't matter what an overall site's architecture is, right? It's just the structure of the page that matters.
That said, I just want to clarify your recommendations. Are you recommending the pages be formatted with a "trail of breadcrumbs" set of links at the top (e.g., Bicycles > Mountain > Brand > Model)? And of course the breadcrumbs would reflect the site structure (e.g., a home page about Bicycles, a subsection called Mountain, a further subsection called Brand, and then the specific models within the Brand section)?
Could it be a problem for search engine positioning to have some key content three clicks away from the home page? Sorry for my confusion!
Posted 17 February 2005 - 04:46 PM
I understand your confusion. I was not referring to breadcrumbs, I was talking about site/information architecture. Google very much looks at sites (not just pages) and so do all of the other search engines. That's why you see sites like bankrate.com ranking for so many financial terms and DMOZ ranking well in many queries - these are huge, powerful sites that have tons of global (and local) link popularity.
The reason information architecture is so valuable is that search engine researchers and engineers (this field is typically called IR, Information Retrieval) have been looking for ways to move beyond pure link popularity toward better methods of analyzing the content of a page and the contents of the sites and pages that link to it.
By using this type of structure (the broad > narrow), you can improve the relationships between the pages - think hypernyms > hyponyms - and thus the relevancy in the search engine's eyes. This structure is good for usability too - which is an added bonus!
Ask if you're still confused!
Posted 17 February 2005 - 06:24 PM
Jill and Randy are saying that the search engines look at pages, but Rand is saying that search engines look at overall content. Someone's gotta be wrong here...
Now, Jill, you said to make sure your internal pages all link back to your home page using the relevant keyword text. However, why would this matter if SEs index pages and not sites? Unless of course Google doesn't really give any weight to the fact that the links are coming from the same domain - they're just incoming links and that is good. But even if that's the case, it still supports the fact that larger sites are better sites.
Rand's point that larger sites also tend to have more incoming links from external sources (e.g., people linking to Dmoz.org and About.com) supports this theory as well.
There must be some critical factor that I'm overlooking, because my client's site has more backlinks in Google than any of the top ten sites listed for his primary keyword. I believe this missing link could be sheer volume, since all of the sites listed are enormous. But of course I can't be certain. They are also sites that have undoubtedly been around longer. One other thing that baffles me is that the site used to rank fairly well (at least it was in the top 20-30 results) for its primary keyword, and now it's gone. Goodbye. Seeya later. The main thing my client did was build incoming links - organically, by hand, very legitimately.
Does anyone have a copy of Google's algorithm that I can borrow?
Posted 17 February 2005 - 07:42 PM
Usability is definitely a concern that has to be in the mind of every webmaster. Even if you assume that it's only going to help real human visitors, it's still very, very important.
It's still (also) true that the search engines view things on a page-by-page basis for ranking purposes. Even with that as a starting point the internal links of your site are still very, very important to the search engines. After all, how you link from one page to another is not only going to affect if and how quickly the search engines are able to spider and index your site, but those internal links also count at least somewhat in the ranking of each page.
Confusing, and a bit like walking on a thin piece of wire 100 feet off the ground. With no net.
My personal approach has always been to build navigation strictly with User benefit in mind. Those are the folks who pay me after all, and I've found that building for the users will usually get my keyword phrases in the links naturally anyway. Then make sure I'm not throwing any roadblocks in the way of the search engines.
Posted 17 February 2005 - 09:12 PM
When you say "gone," do you mean that it's no longer even in the top 1000 results? If so, then you may be dealing with something a lot simpler, like an overoptimization filter for stuff on the page, or for those links if they all use the same keywords. It's hard to say anything specific without knowing anything specific, like the URL and the search terms.
I left my copy on the bus, riding back from the Google Dance last summer, but I think it said something about pigeons.
Posted 17 February 2005 - 09:19 PM
this summer's blockbuster film plot: stealing the algorithm from deep inside the googleplex.
Posted 17 February 2005 - 10:35 PM
And you're free to believe whichever one of us (or none of us) that you'd like!
I would highly suggest that you don't believe anyone, but do your own tests to see what you come up with.
I don't understand why one precludes the other. They do index pages, not sites (for the most part), so every page that has a link to your home page with the targeted keyword phrase is a good page, regardless of the domain it's on.
Larger sites can definitely be better sites, assuming that their size is actually a factor of their having more useful content than other sites. And again, the usefulness comes from the fact that you can link back to yourself on all those pages in your large site.
When did it drop? Just a few weeks ago with the last algorithm change?
I imagine since so many sites finally came out of the aging delay, that pushed a lot of other sites down.