
Crazy Idea


This topic has been archived. This means that you cannot reply to this topic.
18 replies to this topic

#16 mcanerin


    HR 7

  • Active Members
  • 2,242 posts

Posted 31 July 2003 - 11:16 AM

I have a suggestion for SEOs. What if a group of us got together and actually started using metadata for the purposes of SEO? I honestly believe that if the top SEOs began to use a useful metatag description, search engines would catch on - if only to compete with each other.

Problems:

1. It has to be easy to use (even for non-programmers) - no scary codes.
2. Spammer-proof (or at least harder to spam), and
3. Actually useful from the get-go (typing in metatags that will be ignored for years to come isn't my idea of time well spent).

I think for my clients, anyway, the best "bang for the buck" would be location data. Why? Because half of searchers type in only one-word searches (Jansen, Spink 2003), and you can bet that word usually isn't location data. Worse, 54% look at only one page of results and then stop looking - ouch! Figure 3 million searches a month: half of those are one-word searches (1.5 million), and 54% of those searchers stop after one page, which works out to up to 810,000 searches per month by people who didn't know enough to put in location data (obviously some searches only require one term and one page, so the number isn't exact, but it's safe to say "a lot").

The second reason for location data is that it's much harder to spam, and there's not as much reason to do it.

Suggestion:

In the spirit of K.I.S.S., how about the following:

<meta name="location" content="us,NV,Las Vegas,89102">

for a physical location, and

<meta name="location" content="web"> for a purely cyberspace entity.

The search engine reads the tag in order, so if it's only interested in US sites it can stop right at the country code. The country code is simple - use the standard TLD codes: us, ca, uk, etc. Then you use local address conventions in descending order. After all, it's just keywords. You can even include common misspellings.

THEN (and this is what makes it useful right away) put the tag in HTML, visible on the page (designer's choice where), like this:

<location = us,nv,las vegas,89102> - now it will show up normally in a search engine.

Now it would be very simple for a search engine to do many things, like a) organise by location in a directory, b) prompt the user for location data (like Google's spelling prompt), or c) use a cookie or user profile to automatically put a higher priority on local info.
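Just to show how little work this would be on the engine's side, here's a minimal parsing sketch in Python (purely illustrative - the "location" tag name and the country-first field order are only the convention proposed in this post, not anything any engine actually supports):

# Minimal sketch of a crawler reading the proposed tag.
# The tag name "location" and the country,region,city,postal order
# are assumptions taken from the proposal above, not a standard.
from html.parser import HTMLParser

class LocationMetaParser(HTMLParser):
    """Collects the content of any <meta name="location"> tag."""
    def __init__(self):
        super().__init__()
        self.location = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "location":
            self.location = attrs.get("content", "")

page = '<meta name="location" content="us,NV,Las Vegas,89102">'
parser = LocationMetaParser()
parser.feed(page)

if parser.location == "web":
    print("purely cyberspace entity")
elif parser.location:
    # Fields run most-significant first, so an engine that only cares
    # about the country can stop after the first field.
    fields = [f.strip().lower() for f in parser.location.split(",")]
    country, rest = fields[0], fields[1:]
    print(country, rest)  # -> us ['nv', 'las vegas', '89102']

An engine interested only in US sites could bail out after reading that first field, which is exactly the "reads the tag in order" behaviour described above.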


What do you think? Comments? Suggestions? Changes?

Yours,

Ian

#17 Jill


    Recovering SEO

  • Admin
  • 32,963 posts

Posted 31 July 2003 - 01:01 PM

Welcome, Ian! :)

Nope, I just don't think anything like that can work, nor do I think it's necessary. For one, you just can't rely on site owners or SEOs to accurately reflect what's on their pages with hidden tags. You just can't. They can't be trusted!

For another, why is it so difficult to tell what a site is about by what it actually says on it? If your site doesn't say what it's about, then is it really about that? (If you know what I mean.)

As long as our visible content accurately describes our sites, we don't need hidden ways of telling the search engines.

:)

Jill

#18 TBroadfoot3rd


    HR 2

  • Active Members
  • 30 posts

Posted 31 July 2003 - 05:14 PM

...over 85% spam submissions, but they have to be looked at to find out they are spam.


Well, maybe the bots of the commercial search engines could take the slag heap of spam rejects from DMOZ - say, by placing all the spam sites into one directory. Then let the search spiders index that directory and apply a devaluing weight to every site found within. Once spammers realize that submitting spam brings diminished returns, they may just stop submitting junk. Of course, that is like whistling past a graveyard... good for the soul, but it helps little to combat the boogeyman :hmm:

But maybe that should be the focus of how to combat spam: figure out ways for the spiders and the directories to work together. Since the directories are still human-powered, they should carry a weight factor with the automatons. It might shift the balance - considering 85% of submissions are spam, just think of how quickly the commercial search engines could clean their indices.
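To make the idea concrete, here's a toy sketch of that devaluation step in Python (everything here is hypothetical - the spam-rejects list, the domain names, and the 0.1 penalty factor are made-up illustrations, not anything DMOZ or any engine actually publishes):

# Toy sketch: scale down the ranking score of any site whose domain
# appears in a shared list of directory spam rejects.
from urllib.parse import urlparse

SPAM_REJECTS = {"spam-widgets.example", "free-pills.example"}  # hypothetical reject feed
PENALTY = 0.1  # assumed devaluation factor

def adjusted_score(url: str, base_score: float) -> float:
    domain = urlparse(url).netloc.lower()
    return base_score * PENALTY if domain in SPAM_REJECTS else base_score

print(adjusted_score("http://spam-widgets.example/buy", 1.0))  # 0.1
print(adjusted_score("http://honest-site.example/", 1.0))      # 1.0

If spammers knew a directory rejection fed straight into a penalty like this, the incentive to flood the submission queues would shrink.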

Ah, to dream about spam-free schemes and dreams. There has to be a way to make the human element a factor, so that the hard work of the editors is rewarded and the backlog shrinks, because spam sites get slowed once they see no value in being put in the spider-purgatory section. But the odds of getting the engines to agree to kill or devalue sites found on a directory's spam list are slim to none - unless, as suggested, a grassroots uprising happens, or a large email campaign gets this issue addressed, so that the overworked editors can be the human gatekeepers against spam and its spread.

Stepping off the soap box.

Oh, and great forum, with some fantastic ideas and concepts to make one ponder and think.

As always YMMV
:)

#19 Jill


    Recovering SEO

  • Admin
  • 32,963 posts

Posted 31 July 2003 - 06:17 PM

Oh, and great forum, with some fantastic ideas and concepts to make one ponder and think.


Thanks, Thomas, and welcome! :hmm:

Jill



