I recently had an inquiry from someone who was looking for some possible SEO consulting with me. He was in the process of a redesign and wanted to be
sure not to make any mistakes along the way, which is super-smart! The time
to be looking at SEO is definitely in the beginning stage of any design or
redesign.
The interesting part of the email was this person’s misconceptions about
what he thought were important factors for the search engines. I’d like to
share those points with you, with my comments following each one:
* Little or no Flash.
This is a huge misconception held by many who are trying to design
search-engine-friendly websites. There’s nothing inherently wrong with
using Flash and no reason to avoid it altogether. What you do need to avoid
is an *all-Flash* site, as well as Flash navigation. But that’s it. And
even if you have those things, there are workarounds.
* All scripts should be called from external files.
This is a great idea to keep file size down and make it easy to update your
pages, but it’s got nothing to do with search engines or how your pages are
ranked within them. Search engines have long known how to ignore code that
is of no use to them. Whether your scripts are right there in the source
code of the page or called up externally will have no bearing on your
rankings or search engine relevance.
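To illustrate the point, here are the two forms side by side. The engines treat them identically; the external version just keeps your page source tidier and easier to maintain (the file name here is only an illustration):

```html
<!-- Inline: the script sits right in the page source -->
<script type="text/javascript">
  function highlightNav() { /* ... */ }
</script>

<!-- External: the same script called from a separate file
     (rollover.js is a hypothetical file name) -->
<script type="text/javascript" src="rollover.js"></script>
```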
* The site should be designed using CSS as extensively as possible.
Another myth. CSS doesn’t have any special properties that search engines
like better than tables or any other HTML code. Again, it may make it
easier for you to update your pages, or to use your content for other
things, but it’s not an SEO technique that will increase rankings or
relevance.
* The CSS should be called from external files.
Same as calling up scripts in external files — nice to do, but not a search
engine issue in the least.
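Again for illustration, both of these are equally fine as far as the engines are concerned (styles.css is just a made-up file name):

```html
<!-- Embedded styles, right in the <head> of the page -->
<style type="text/css">
  h1 { color: #333; }
</style>

<!-- The same rules moved to an external file -->
<link rel="stylesheet" type="text/css" href="styles.css">
```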
* There should be no comments in the code; such notes should be added to an
FAQ or similar page instead.
Why not? I’m not sure where this myth came from, but I suppose if you’re
thinking that file size is going to affect your search engine rankings, you
might also believe this one. It may have also come about because some
people used to think that adding keyword phrases to comment tags would help
search engine rankings, even though it didn’t. Comment tags have long been
ignored by the engines, and because of this, you can use them as much or as
little as you like in your source code. I always comment out bits of
text and code that I no longer wish to use but that I may want to add back
in later. It’s absolutely, positively not a problem!
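A quick sketch of what I mean, with hypothetical content:

```html
<p>Our current special offer.</p>

<!-- Commented out for now; the engines ignore this entirely,
     and I can easily add it back later:
<p>Our holiday special offer.</p>
-->
```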
* A large percentage of the code on each page needs to change from page to
page so that the search engines don’t see the pages as duplicate content.
Nope. You certainly do NOT have to change the code in your pages to avoid
duplicate-content issues! Website templates have code that is exactly the
same from page to page. This is good and normal and certainly fine with the
search engines. One would have to think that the search engineers were
really dumb if they were going to penalize pages because they used the same
design template from page to page! Sure, you don’t want the same exact
*content* on every page of your site, but even that is not generally a
problem if it’s a few sentences here and there. (See my recent article at
Danny’s Search Engine Land site on the Myth of Duplicate Content.)
* All picture links should have text links under the pictures.
No reason for that at all. Image links that make use of the image alt
attribute (aka “alt tags”) have always been followed easily by the search
engines and will always continue to be followed. They’re followed even
without the alt attribute, but the words you place in there tell the search
engines and the site users exactly what they’ll be getting when they follow
the link. It’s essentially the same thing as the anchor text of a text
link.
* Navigation should be simple text links (no JavaScript
mouse-over, and no image map graphical navigation).
This is fairly good advice; however, there are very easy workarounds if you
want scripted or graphical navigation anyway. The footer is a
perfectly legitimate place to recreate your menu for those who (like the
search engines) can’t follow the scripted version, so that sort of menu is
definitely not a problem. I just haven’t gotten around to redesigning my
site with a more crawler-friendly navigation. Certainly these days, a CSS
menu would be a better option. There are also plenty
of crawler-friendly image maps, and like I mentioned previously, graphical
links are A-OK with search engines.
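To make the image-link point concrete, here’s a text link next to an image link with a descriptive alt attribute. The alt text does the same job the anchor text does, so no duplicate text link underneath is needed (the URLs and file names are purely illustrative):

```html
<!-- A text link: the anchor text tells the engines what
     the target page is about -->
<a href="widgets.html">Blue widgets</a>

<!-- An image link: the alt attribute carries that same
     information for the engines and for site users -->
<a href="widgets.html"><img src="widgets-button.gif" alt="Blue widgets"></a>
```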
* All pages must be VALIDATED by an HTML validator and all style sheets need
to be VALIDATED through a CSS validator.
Why? This has nothing to do with search engines. It’s nice to do, though.
* The majority of the site will be static, as static pages are easier for
search engines to crawl and rank properly.
‘Fraid not. Dynamic pages are just as easy to crawl and rank as static
pages. Most websites today are dynamic because they’re simply easier to
maintain. The search engines have figured out how to crawl and rank them
just fine for many, many years now. It’s true that there are specific
things you need to watch out for when creating a dynamic site, but most
developers are aware of the worst of the issues. You certainly should
consult with an SEO if you’re changing content management systems, or if
you’re having problems getting your dynamic URLs spidered and indexed. But
there’s no reason to have only static pages on your site because you’re
worried about the search engines being able to index dynamic pages.
* The site needs to be browser-compatible and screen-resolution-compatible.
This is another thing that’s nice to do for your site visitors, but it has
no bearing on search engine rankings or relevance.
Phew! I hope this helped clear up a lot of misconceptions that anyone else
may have had. Please don’t get me wrong — I do agree that most of the
things listed here are great design tips that can help you to create an
awesome, user-friendly website. I just want to make it very clear that they
have nothing to do with SEO, rankings, spidering, indexing, etc.