July 17, 2002
~~~IN TODAY'S ADVISOR~~~
----> Last Chance for Free SES Conference Pass
*Search Engine Marketing:
----> Keywords Coming Out of My Ears
*This Week's Sponsor:
----> Low-priced Hosting from Presage Designs
*Other SEO News:
----> Overture Auto-bid Feature Creates New Bid War Tricks
*Stuff You Might Like:
----> Follow-up With Mike Grehan
----> It's Not as Easy as It Looks
Hey everyone! I got a lot of requests for the Advisor's free two-day
pass to the Search Engine Strategies conference in San Jose. I'm
gonna take additional requests for the next couple of days, and then
I'll be choosing the winner. So if you are absolutely, positively
sure you'll be able to use it if you win, send your requests to
firstname.lastname@example.org. (Learn more about the conference here:
As usual, I've got some great search engine optimization info for you.
So let's get right to the good stuff! - Jill
~~~Search Engine Marketing Issues~~~
++Keywords Coming Out of My Ears++
From: Walter Loughney [email@example.com]
I really enjoy your newsletter. I get several SEO newsletters a week
and yours consistently has the most useful and relevant information. I
think this comes from your real-life use of the info, rather than just
writing about it, which some seem to do.
I have had good (maybe even great) results in getting my home pages
ranked in the top 10 or top 20, depending on how many thousands (or
tens or hundreds of thousands) of pages are found for various
keywords. But I have not done as well in getting my more
content-specific pages ranked as highly for the same keywords, even
though they have more content and use the keywords. Is there such a
thing as too much specific use of a keyword on a page?
For example, say your site sells shoes (not my product). If you have a
page that talks about shoes and says you have red shoes, brown shoes,
black shoes, large shoes, small shoes, open-toed shoes, work shoes,
etc., you might expect to get a good ranking on shoes. The page is
named shoes.htm, the title is "Shoes in All Sizes - Shoes in All
Colors," and you have keywords on shoes, being careful not to violate
any of the "rules" on how many times you use "shoes" or "open toed" in
the keywords.
You might expect to get a good ranking when someone searches for
shoes. But if you search for shoes and find sites in the top 10 that
do not have the best page name, title, keywords, or use of the
keywords in meaningful content...well, you have to wonder what's going
on.
Any thoughts on this? Is too much of the right things just too much?
In particular I am optimizing for Google, Alltheweb, AltaVista, MSN,
AOL and Yahoo.
Thanks for any comments and I hope I have at least given you some
things to think about.
Glad you enjoy the newsletter!
There are many reasons why your inner content pages may not be ranking
highly in the engines. My first question to you would be, are the
pages actually in the search engines' databases? Obviously, if the
search engines aren't aware of the pages, they can't come up in the
results. So the first thing you must do is check to see which pages
of your site are indexed. Generally, you can figure this out by
typing in your URL at the engines' sites and seeing what pages come
up. If the inner pages aren't showing up, this is probably the reason
they don't rank well!
Of course, if the pages aren't showing up, then you have a major
problem on your hands. All the optimization and great writing that
you do will be for nothing if the search engines can't find or index
your pages for some reason. If you have a dynamically generated site,
it's possible that your design is causing your problems. Certain
dynamic pages are still not crawlable by many of the search engine
spiders. There are ways to get around this, but if you created your
site dynamically with no thought to the search engines, it can be a
lot of work to "fix" things after the fact. I've discussed some of
the workarounds in a few issues of the old Rank Write newsletter
<http://www.rankwrite.com>. (Do a search at Rank Write for the word
"dynamic" and you should find all the articles.)
Another reason that the search engines may not have your inner pages
indexed is because they are simply buried too deep within the site
architecture. Make sure that any important pages that you definitely
want indexed by the search engines are easy for the spiders to find
and crawl. At a minimum, build a sitemap page that has a link on your
home page, and make sure all the important pages of your site are
listed on it.
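To make that sitemap idea concrete, here's a minimal sketch in Python that builds a plain HTML sitemap page from a list of important pages. The page names and titles are hypothetical, purely for illustration:

```python
# A rough sketch: generate a simple HTML sitemap page listing every
# important page, so search engine spiders can reach them all from
# one link on the home page. Page names here are made up.
pages = [
    ("/shoes.htm", "Shoes in All Sizes and Colors"),
    ("/open-toed-shoes.htm", "Open-Toed Shoes"),
    ("/work-shoes.htm", "Work Shoes"),
]

links = "\n".join(
    f'  <li><a href="{url}">{title}</a></li>' for url, title in pages
)
sitemap = (
    "<html><body>\n<h1>Site Map</h1>\n<ul>\n"
    + links
    + "\n</ul>\n</body></html>"
)
print(sitemap)
```

However you generate it, the point is simply that every important page ends up one or two clicks from the root, where the spiders can find it.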
If the search engines *do* have your pages indexed, but they have poor
rankings, then yes, you do want to revisit your copy. There are
definitely problems with the over-use of the word "shoes" in your
example. For one thing, I wouldn't recommend even attempting to rank
highly for the single word "shoes." Yes, in the case of a shoe store,
that may definitely be the word that could bring you the most traffic,
but you've gotta change your mindset when it comes to search engine
optimization.
Instead of trying to take one page and rank highly for "shoes" you've
absolutely got to find more descriptive phrases and optimize whole
bunches of pages specifically geared towards each of them. In other
words, don't try to rank highly for "red shoes," "brown shoes," "black
shoes" and the rest of them all on one page. Use WordTracker and find
the specific phrases people use, and then focus on just a couple of
them for each page. If "open-toed shoes" turns out to be a good one,
have a page that discusses those. You could even write an article
that discusses how open-toed shoes are all the rage right now, or
whatever. Certainly, you could discuss both "open-toed shoes" and
"open-toed sandals" on the same page, but don't stray too far off
course, or you'll make things difficult for yourself.
You don't need (or want) to repeat the word a zillion times over and
over again. The key is in using your phrases where they make sense in
the copy. The copy *must* read well. It absolutely must. The
biggest mistake some SEOs make is sticking their keyword phrases
anywhere and everywhere. Don't do it. Don't be tempted to do it.
Make sure you have enough copy to work with (at least 250 words) and
go from there. My basic rule of thumb is that if it sounds stupid or
sounds like you're overdoing it, most likely you are! As my former
Rank Write partner Heather Lloyd-Martin always says, read the copy
aloud. You can pretty much tell if you've gone overboard when you
hear it out loud, so don't forget this important step in the writing
process.
If you use your keywords naturally, you won't have a problem with the
search engines. If you simply stick them everywhere, it's certainly
possible that your pages could get flagged as some sort of doorway
page, and then be given a lower weight. I'm not saying this is a
certainty, but the engines do seem to give a preference (as they
should) to well written pages. Content is king, guys. Always has
been, and always will be. Keyword repetition is not good content.
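If you want a quick, rough check on whether you're overdoing a phrase, a few lines of Python can count it for you. This is just an informal sanity check, not any official "density" rule:

```python
import re

def phrase_density(copy, phrase):
    """Count occurrences of a phrase in page copy and return a rough
    density: words belonging to the phrase vs. total words."""
    lowered = copy.lower()
    words = re.findall(r"[a-z0-9'-]+", lowered)
    count = len(re.findall(re.escape(phrase.lower()), lowered))
    density = count * len(phrase.split()) / max(len(words), 1)
    return count, density

copy = ("Our open-toed shoes are all the rage this summer. "
        "Browse open-toed shoes in every size and color.")
count, density = phrase_density(copy, "open-toed shoes")
print(count, round(density, 2))  # prints: 2 0.24
```

No number the script spits out replaces the read-it-aloud test; it just flags copy worth a second listen.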
The goal is to give the search engines and your readers what they
want. It's not always easy, but it is most definitely doable. Don't
let anyone tell you otherwise. But you really do have to start with
good writing from a professional copywriter. I cannot stress this
enough. If the underlying copy is good, it's much easier to then get
your keywords into it. If you can start from scratch with your
keywords in mind, you should definitely do that. However, very often
you can find good places for your keywords within your existing
writing. In fact, this is what I discuss in my Search Engine
Strategies presentation, "Writing for the Search Engines." (You're
gonna have to come to the conference to learn all the editing tricks
I've amassed through the years! I'm also hoping to eventually put it
all down in a book, but I just have too many other things on my plate
right now. <sigh> Someday...)
IS YOUR WEB HOSTING COMPANY UNSATISFACTORY?
You can't run your business if your Web site or email is down.
--> Are you sick of all the excuses from your current host?
--> You deserve 24/7 live support and 100% uptime!
See what the expert staff at Presage Designs can do for you, and
check out our low prices at:
High Rankings Subscribers get 2 months FREE with a year's contract!
~~~Other SEO News~~~
++Overture Auto-bid Feature Creates New Bid War Tricks++
I've got a special treat for you today! You know that I'm not big on
pay-per-click (PPC) campaigns, as I consider them a whole other
advertising realm than SEO. However, I understand that many of you
are heavily into PPC through Overture and Google AdWords Select. I've
been reading elsewhere about the major changes in effect at Overture
and knew that it was important to report these to you. Since I don't
have the personal experience to give you the inside scoop on this
stuff, I begged and pleaded with PPC expert Ammon Johns to write about
it for me! Thankfully, Ammon graciously agreed. (I think I actually
had him with "hello"!)
Just so you know...Ammon is an internationally renowned expert in
Internet marketing and search engine optimization techniques. He's
based in the UK, but through the power of the Internet he works for
clients all around the world. In addition to his business interests,
Ammon also maintains the site "Web Marketing and Site Promotion"
<http://www.webmarketingplus.co.uk/>, which he describes as "a hobby
site sharing my love for the subject, along with some golden tips
proven in years of campaigns."
The changes Ammon highlights will certainly affect your Overture ad
campaign, so listen closely to what he has to say. Over to Ammon! -
Jill
Overture Auto-bid Feature Creates New Bid War Tricks
By Ammon Johns
Overture currently boasts that their top listings reach a staggering
85 percent of the entire US online audience through distributing
sponsored search listings to thousands of Web sites across the
Internet, including Lycos, MSN, and Yahoo!. Their revenue for the
Quarter ending 31st March 2002 was $142.8 million USD. Overture is the
originator of "Pay-For-Performance" search listings, and aims to
maintain their position ahead of their many imitators through offering
superior reach, service and features.
However, one of their more recent "enhancements," the new auto-bid
feature, is causing devastation unseen since Alfred Nobel decided to
save the lives of hundreds of explosives experts by inventing
dynamite. Then as now, an innovation that was intended to be helpful
was soon misused to cause far more harm than it was ever devised to
prevent.
The auto-bid tool seems at first glance to be an eminently sensible
innovation. Don't be fooled. This tool allows you to set an amount
that you are prepared to pay as a "maximum bid" to maintain a
position. In the event of bidding changes for your search terms, the
auto-bid feature adjusts the actual cost of your clicks to maintain
the best position possible within your maximum spend, which places you
one cent above the maximum bid of the closest trailing competitor.
The problem here is that the maximum bids are visible. Your competitor
can see exactly what your maximum bid is, and can deliberately set his
maximum bid to one cent lower. This is not the actual cost of his
clicks, but his maximum bid that he'd be prepared to pay in order to
take your position. Since it's lower, his click costs won't actually
adjust (he can't take the position within the limits of his maximum).
Your current bid, however, will automatically adjust to be one cent
higher than his maximum.
The reason it works this way is simple. Imagine it as if it were a
process happening a step at a time. His auto-bid increases his current
bid to one cent above your current bid. Your auto-bid reacts by
increasing your current bid. This cycle continues until his bid
reaches the maximum, at which time that beaten auto-bid settles the
current bid back to the best position it can maintain -- one cent
above the next highest maximum bid.
That isn't actually how it works, but it explains one of the
principles behind it. In actual fact, an auto-bid is a variable bid --
one that changes its cost-per-click to maintain the highest position
possible within the limits of your maximum bid, up to the target
position you set (1st - 5th).
Let me illustrate this to be sure we're all clear. Let's take a look
at the listings and bids for the search term "search engine
positioning" as a real-world example. The top-4 listings at the time
of this writing were as follows:
1. Increase Qualified Traffic to Your Site
(Advertiser's Max Bid: $4.52)
2. [Domain Name Removed] Strategic Marketing
(Advertiser's Max Bid: $4.51)
3. Proven Search Engine Positioning Experts
(Advertiser's Max Bid: $2.07)
4. Search Engine Positioning
(Advertiser's Max Bid: $2.06)
Look at the second position listing. See how the max bid is just one
cent lower than the #1 result? This forces the first listing to pay
the full $4.52 per click maximum. However, the second-place listing is
certainly using auto-bid, because they could have the exact same
position (2nd place) for just $2.08 as a fixed bid.
The second place listing will only be paying the $2.08 necessary to
beat the third position in this scenario -- having their auto-bid
maximum set so high is designed purely to make the number one listing
pay more than double! This is a whole new form of industrial sabotage.
Note that every bid is shown only as the maximum bid amount, not the
actual cost. We had to be intelligent to spot that the first and
second place listings are almost certainly auto-bids -- and to spot
that the #1 listing is being seriously sabotaged by the tactics of the
second place listing.
You can no longer see actual bids/costs, only maximum bids. There can
surely be no explanation for this other than Overture wanting to raise
the bid prices across the site, which means companies are forced to
bid against maximums. It's a bit like going into an auction where
everyone in the bidding already knows what your highest bid will be;
they can force you to go that high without getting lumbered with the
bill if they bid against you. This hiding of actual costs is where
Overture has either made an error in judgment, or as many are
claiming, has deliberately tried to incite higher bidding than is
necessary -- to the detriment of their customers.
Even the Overture tool
<http://www.overture.com/d/USm/search/tools/bidtool/> to view the
current bids is deceptive because it lists only maximum bids, not the
actual current bids. As far as I can tell, it makes no distinction
between a fixed-bid and an auto-bid. Knowing that Overture isn't
disclosing the information accurately and openly gives advertisers
some doubt about their credibility. To many it seems that Overture
has gotten too greedy. Unnecessarily so, considering they were already
set to more than double their revenue this year over last.
What you must do if you are an Overture advertiser is disable
auto-bidding on all your search terms, or else set the maximum bid to
your current bid amount so it could only possibly make costs go down,
never up. Do not set higher maximums than you are actually willing to
pay for every click.
You must also attempt to spot the bid-gaps in order to gain better
return on investment (ROI). While other advertisers may still be using
auto-bids (or at least while you are still unable to check the actual
current bids), look instead at the bid prices below the position you
want. The latest features on third-party bid management tools such as
BidRank <http://www.bidrank.com/> and GoToast
<http://www.gotoast.com/> can be of great help here.
All of the bid management tools are also being fed the maximum bid
data (not the actual bid data), but this still enables them to scan
for bid price gaps. For example, the latest feature in BidRank allows
you to actually select to auto-bid one cent under the maximum bid of
any competitor for any search term. If that competitor has auto-bids
enabled, then they're going to be paying the full maximum.
The most important thing that you must do, more important than
anything else, is make your complaints and dissatisfaction about this
known to Overture. If you don't like the idea that this new feature
means your competitors can force you to pay your maximum price (and
that prices across Overture are already sky-rocketing as a result),
then complain. Overture needs to reconsider whether the extra revenue
is worth the loss of goodwill.
As a long time proponent of Overture, I have to say that I'm bitterly
disappointed in them for acting so firmly against the interests of
their advertisers. Most of all, I'm amazed that a company so creative
and prosperous has made such an incredible gaffe, and is willing to
risk their credibility for the sake of a few faster bucks. It's
especially amazing when you consider the impressive bucks they were
already making.
Web Marketing and Site Promotion
~~~Stuff You Might Like~~~
++Follow-up With Mike Grehan++
As you may recall, last week I reviewed Mike Grehan's excellent (but a
bit technical) "Search Engine Optimization Report." (If you missed
the review you can read it here:
</issue018.htm#stuff>.) Since then, I've
had some follow-up conversations and emails with Mike that I thought
you might be interested in.
Mike is determined to make sure I understand what term vectors are
even if it kills me (which it does!). I do enjoy speaking with him on
the phone, though, as he has this great British accent. (Little does
he know that my eyes are glazing over and I'm just...oh never mind.)
At any rate, I really am starting to understand those term vector
thingees and how they impact search engine optimization campaigns.
(Just don't tell Mike, as I want him to keep calling me!)
One of the most interesting things we've discussed is how he's
systematically "debunked" the notion of themes-based SEO. Many SEOs
who discuss themes seem to believe that every site can be described in
two words. Those two words are supposedly the theme of the site.
This certainly never made sense to me, nor does it make sense to Mike.
Here's what Mike says on the subject (read with proper British
accent):
"It's not too difficult to debunk themes. First and most obvious is
this: the premise works on the fact that a crawler indexes every page
on your site. It then evaluates from the copy what the entire site is
based on in the sum of two words. Now, I have clients that are major
world-leading corporations, to the middle and smaller sized. The major
ones (bearing in mind, one of my clients has over 300 very large sites
to promote online each year) can have up to thousands of pages. The
smaller ones may have only 50-60 - and I don't know one of them that
has every single page of its site indexed.
It's unfortunate that many SEOs have, in fact, taken the idea of
themes to mean that their whole web site should be around the same few
words. A mini site about this, a mini site about that. But the fact of
the matter is search engines return "web pages" at the interface
following a query: not "web sites." Search engines determine the
corpus by the number of pages they have in the database, not the
number of sites.
Just because somebody talks more about a given topic on their web site
than the next guy, doesn't mean that they have greater preference.
However, common sense (as ever) prevails: if you are more focused on
what your topic is, then it's bound to stand a better chance of being
"picked up." And if one of your pages happens to have the most
astounding information on a given topic, something very important,
like Jill Whalen's newsletter, for instance, that so many thousands of
"hubs" point back to it - that's the page that's coming up. Regardless
of how great, or how finely tuned the keyword density is on the other
pages in the site.
It makes sense. If you have a site about a certain chronic illness and
you place useful links to, say, a pharmaceutical web site, that
company may be so large that it has 5,000 or more pages on its site
about all kinds of remedies. Now where would you point your visitor?
To the home page where they have to travel another ten clicks to find
the information? Or to the page with the information on it?"
It certainly makes sense to me. But I hear others asking about newer
engines such as Teoma that group sites as a community. Aren't engines
like that theme-based in many ways?
They're examining pages, not sites. Google, Teoma and Wisenut all owe
more than a drink to Professor Jon Kleinberg who developed the HITS
algorithm back in 1997. It was he who coined the phrase "hubs and
authorities" way back then. His algorithm was developed to identify
"web communities" by linkage data, not page content. It's been further
developed since then (IBM owns the patent). I've covered the entire
development in my book. There's nothing new about what Teoma and
Wisenut are doing.
Themes are about "on topic" pages pointing to "on topic" pages. That
way it gives the search engine a better chance of being able to
classify and categorise them. In one section of my book, Brian
Pinkerton (who developed the web's first full text retrieval search
engine using the vector space model), prioritised and classified pages
which were "hot to crawl," i.e., important by the simple number of
backlinks. It may not have been HITS. But it was about pages and links
even back then. Still is.
I hope this info gives you all a good feel for the sort of information
you'll get in Mike's book. He has thoroughly researched this stuff
and more. He doesn't just tell you what he thinks. He tells you what
he knows. And he's got all this knowledge from speaking at length
with many a search engine programmer. If they don't know the facts,
who does?
If you're interested in learning more about all this stuff from Mike,
you should definitely read his book. You can learn more about it
using my affiliate link here:
Still with me? Seems like a pretty hefty newsletter today. I'm
planning to have more guest articles in the future, as it takes a bit
of the work off of me. It ain't easy putting this sucker together
every Wednesday. It's actually the hardest work I do all week,
believe it or not. I remember when we first started Rank Write it
seemed so easy. What happened?
Please let me know if you have any good ideas for articles, and of
course keep sending those SEO questions in. Even if they don't get
posted, I often use them as ideas for future newsletters, and I do
often answer them personally.
Catch you next time! - Jill