Posted 01 August 2012 - 04:25 AM
I've been building websites since 1997 and pride myself on building sites that meet web standards and comply with the Disability Discrimination Act. Towards this end I make full and proper use of the following:
- Title tag
- Meta tags
- Alt tags on images
- Title tags on links
- H1-H6 tags
- Low ratio of HTML to live text
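For illustration, here's a minimal sketch of how those elements might fit together on a hypothetical 'red widgets' page (all names, URLs and copy are invented for the example):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Title and meta description say what the page actually is -->
  <title>Red Widgets - Acme Widget Co</title>
  <meta name="description" content="Hand-built red widgets in a range of sizes, with UK delivery.">
</head>
<body>
  <h1>Red Widgets</h1>
  <p>Our range of red widgets covers everything from hobbyist to industrial use...</p>

  <h2>Popular models</h2>
  <!-- alt describes the image itself; title on the link describes the destination -->
  <a href="/red-widgets/deluxe" title="Deluxe red widget specifications">
    <img src="/img/deluxe.jpg" alt="Deluxe red widget, side view">
  </a>
</body>
</html>
```

The point is that each element describes what the page or image genuinely is, so the phrase appears where it's relevant rather than being repeated mechanically.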
But I've been reading that employing all these things on a single page results in over-optimisation.
Now I don't do these things in an attempt to scam search engines but simply because it's best practice. However, if I'm advertising 'red widgets' then that phrase (or some variation of it) is going to be present in the above elements several times. Is this considered over-optimisation even if it's legit?
The site I'm working on currently has 300+ pages of unique copy broken down into canonical subsets where 'red widgets' aren't mentioned at most levels other than the main top-levels plus the urls (www.red-widgets.co,/red-widgets/.../... etc).
Can anyone clear this up or give me an example of what is considered as over-optimization?
Posted 01 August 2012 - 08:37 AM
What people are talking about when they use that term, however, is not what you're describing. You're talking about the proper use of the elements of a page to make it clear to humans and search engines what that page is about and how its information is structured. That's what everyone ought to be doing.
But I'm sure you can imagine what it would be like if you took those ideas too far. Your page would be so narrowly targeted to a single word or phrase that reading it would feel like someone was just getting in your face and screaming that phrase at you over and over.
Your title, your <h1>, and probably some of your other headings would be identical, as would the anchor text of every link pointing to the page, including the alt attribute of any image anchoring a link there.
Books! We've got books! Lots of books! More books than you can shake a book at! Need a book? We've got that book! Get that book from us!!!
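To put that in markup terms, here's a purely invented sketch of the kind of page being described, where every element hammers the same word:

```html
<title>Books Books Books | Cheap Books | Buy Books | Best Books</title>
<h1>Books</h1>
<h2>Books Books Books</h2>
<!-- link title, anchor text and alt text all repeat the same keyword -->
<a href="/books" title="books books cheap books">
  <img src="books.jpg" alt="books books cheap books buy books">cheap books
</a>
```

Every signal is technically "optimised", but taken together they read as screaming one phrase, which is exactly the pattern being warned against.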
It's one thing to use your page to send a clear message to readers and consistent signals to search engines. It's quite another to just completely overdo it, and people can tell when you've crossed that line. So can search engines.
Posted 01 August 2012 - 09:17 AM
What you've described sounds more like keyword stuffing, which obviously I'm trying to avoid.
However, let's go back to 'red widgets' and say it has a sub-category called 'blue valves'. Would I be penalised for, e.g.:
<h2>Blue valves for red widgets</h2>?
That's a bit vague without a full page to base it on, but you get the idea...
Posted 01 August 2012 - 09:35 AM
I'm just trying to get people's thoughts on this to nip any potential issues in the bud.
Normally I'd not worry myself, however after reading about over-optimisation (or what was described as over-optimisation on other websites) I thought I'd ask the question.
Posted 01 August 2012 - 09:40 AM
Of course, I'm not Google. But as long as you're using the terms naturally within the "flow" of the content and the heading really is what the page is about, I wouldn't think you'd have any reason to worry.
Posted 01 August 2012 - 11:41 AM
We all have our own standards, but there's some point at which the vast majority of us are going to agree that one page we look at has simply taken things too far. Google's engineers have the job of figuring out an algorithmic way to come to the same conclusion the rest of us would -- to know it when it sees it. But the algo can't say, "that page just leaves me feeling... wrong." Instead, they have to look at some long list of signals and how strong they are, and decide if signals a, b, c are this strong, and signals x, y, and z are this weak, they're going to flag this page as over-cooked.
And that's bound to change over time. Google's going to find that by dialling up the weight of this signal combined with that signal, they're flagging too many pages as not OK that humans would consider OK, so they'll have to make adjustments to tune the way they weigh this signal and that one.
That leaves us in the position of making our pages as good as we can and just hoping Google does a good job of judging them. There's no point in trying to figure out exactly what they're looking for, especially if making our pages match those signals makes them worse for people, because even if you're matching what Google's looking for today, if it's worse for people it's probably going to lead to Google tweaking the algo again, leaving you to try to match the algo again. And really, that's the position we've been in all along.
Posted 01 August 2012 - 01:26 PM
For example, if there were only 3 sites selling 'red widgets' and site 1 had no onsite SEO, site 2 had medium onsite SEO and site 3 had onsite SEO out of its ass, then I'm guessing site 2 would get preference in the SERPs. Now apply the same logic to a niche with 1000s of sites dedicated to it: there's no real way of knowing what amount of onsite SEO gets favoured in the SERPs.
I suppose all I can do is the best job I possibly can using best-practice onsite SEO methods and see where it gets me.
Although in this particular niche the page 1 results on Google have very limited onsite SEO but huge numbers of backlinks. Should be an interesting few weeks.
Posted 02 August 2012 - 06:18 AM
I usually find that if you have to ask yourself "is that OK?", you are getting to, or have already reached, the point of overdoing it.
Like I said - normally I wouldn't be worried, but after reading articles recently and their woolly descriptions of over-optimisation, I thought I'd check.
And yes, I know it's an Alt attribute.
Posted 03 August 2012 - 06:34 PM
Imagine 150 words amongst, say, 5 KB of code, and then place the same 150 words among 5 MB of code.
"there is no 'code to content' ratio that search engines use as a metric"
Is that statement still true?