Posted 14 March 2005 - 09:31 AM
That's it - thanks in advance
Posted 14 March 2005 - 09:34 AM
Sounds like using CSS to deceive to me.
The most common application of CSS to deceive is to create font styles that match the background color of a webpage and fill that space with spammy keywords that people never see.
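For anyone who hasn't seen it, a bare-bones illustration of the trick (hypothetical markup, shown only so the pattern is recognizable):

    <style type="text/css">
    /* text painted the same color as the page background:
       invisible to visitors, readable to a spider parsing the HTML */
    body { background-color: #ffffff; }
    .stuffed { color: #ffffff; }
    </style>

    <p class="stuffed">cheap widgets discount widgets buy widgets</p>

A spider that compares the text color against the background color catches this case trivially, which is why it's such an easy loophole to close.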
Posted 14 March 2005 - 09:36 AM
FWIW, I wouldn't expect that little loophole to be around for too long. It's too easy for the search engines to detect.
Posted 14 March 2005 - 01:34 PM
Just one added question: Can't there sometimes be a totally legitimate reason to hide a layer filled with text? I understand that it should be possible to distinguish spam from the real thing, but it still seems evident that some sites use hidden layers with text legitimately. Would the SE robots check whether the layer can be activated in that case?
Posted 14 March 2005 - 01:37 PM
Posted 14 March 2005 - 02:37 PM
Right. What's so unfortunate about this particular situation is that dynamic layers are such an excellent way for web authors to take advantage of client-side processing to make their applications snappier and richer.
Sort of off-topic, but what's really got me excited right now is XML + SVG to create widgets such as those seen in Urchin 5+. Simply beautiful.
Posted 19 March 2005 - 07:17 PM
Say by setting an <h1> tag to display in size 12 font. Altered tags could technically get their included text treated as something it is not; however, that isn't what I would deem ethical. With a little ingenuity one could pretty easily hide the CSS file itself, but sooner or later the SE companies will have to start looking at the CSS files to determine what's legit.
Posted 20 March 2005 - 09:12 AM
Posted 20 March 2005 - 09:33 AM
Same with display:none|block|inline to change the tag behaviour or visibility:visible|hidden to show or hide features on a click.
I have a collapsible navigation menu that uses display:none|block to toggle sections of the menu on and off; is that deceptive? It's done for the user, so there aren't 200 or 300 visible links in the menu on every page. All that is seen are the top categories; clicking those opens the relevant blocks and closes the others. DevGuru have used the same technique for a couple of years now.
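Something like this, roughly (the IDs and function name here are made up for illustration, not the actual code):

    <script type="text/javascript">
    // hypothetical section IDs: show the clicked one, hide the rest
    var sections = ["products", "services", "support"];
    function showSection(id) {
      for (var i = 0; i < sections.length; i++) {
        document.getElementById(sections[i]).style.display =
            (sections[i] == id) ? "block" : "none";
      }
      return false;  // keep the link from navigating anywhere
    }
    </script>

    <a href="#" onclick="return showSection('products');">Products</a>
    <ul id="products" style="display: none;">
      <li><a href="/widgets.html">Widgets</a></li>
    </ul>

Every link is still right there in the markup for a spider to follow; display:none only changes what the visitor sees at any given moment.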
I certainly am not going to sit and fret about getting banned for using a legitimate design technique. If you are not trying to deceive, why worry?
It's not the technique that is deceptive, it's the intent behind using it, and no SE algo can work that out.
Posted 20 March 2005 - 09:45 AM
I'm thinking of uses like Chris' menu as well as Torka's home page, where the user just has to mouse over or click on an element to reveal the hidden one. Is it reasonable to expect a smart SE programmer to be able to figure that out?
Posted 20 March 2005 - 10:10 AM
Say by setting a <h1> tag to display in size 12 font.
I don't buy this as inherently deceptive (maybe just because I don't want to).
Standard <H#> tags are a DRAG, for a number of reasons. I think using CSS to modify their appearance is totally legit if it serves to make your page more legible and usable. Out of the box you have <H1>, which is enormous; <H2>, which is significantly smaller; and <H3>, another huge reduction in size and not much bigger than default text. The <H#>s below <H3> are so small they are worthless.
The intent is to give emphasis and priority, telling the user (and SE) what is a heading or section title, to organize the page. What I really need for the composition of my page is <H2>, <H3*.67>, <H3*.33> & maybe <H3> .... but to meet the intent behind the <H#> tags, I should use <H1>, <H2> & <H3>. HTML is not a graphics language, and the originators really built in some huge roadblocks to good graphic design. If you are happy w/ an ugly, clumsy page, and your graphics don't matter, then the defaults work fine. But most sites would only be able to use <H2> & <H3> effectively, if that. CSS allows you to meet the intent, while maintaining control over your presentation.
The other BIG issue to me is that <H#> tags are block elements and force a hard return. As Chris pointed out, using "inline" via CSS is the only way to override this and use <H#> tags.
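A sketch of what I mean (sizes picked arbitrarily):

    /* keep the semantic tags, override the clumsy defaults */
    h1 { font-size: 130%; }
    h2 { font-size: 115%; }
    h3 { font-size: 100%; display: inline; }  /* inline: no forced hard return */

The headings still tell the user and the SE what's a title and what's a section, but the presentation stays under my control.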
I opted not to use any because of the limitations, but I'm going to change to a CSS-modified strategy. Otherwise the tags, and their benefits, are totally useless to me.
Posted 20 March 2005 - 12:09 PM
Could they detect hidden layers?
Most certainly. It would be child's play.
Do they?
Not according to anything I've ever seen.
Do I wish they would?
The problem for the search engines is that if they flat out ignore any content that is hidden on the page, they'll end up "penalizing" thousands and thousands of sites that use hidden layers for exactly what they were intended for, all in the name of trying to catch a few bad apples that use them to keyword stuff.
So they first need to be able to detect whether the layer can in fact be made visible by a user's action, which is a taller order than simply detecting whether it's hidden or not.
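To see why, compare two layers that look identical if all you read is the style attribute (hypothetical markup):

    <!-- legitimate: hidden now, but a user action reveals it -->
    <div id="help" style="display: none;">Extra instructions for this form...</div>
    <a href="#" onclick="document.getElementById('help').style.display = 'block'; return false;">More help</a>

    <!-- spam: hidden, and nothing on the page ever reveals it -->
    <div style="display: none;">keywords keywords keywords</div>

Telling those apart means tracing the script back to the element it acts on, not just checking whether the element starts out hidden.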
Posted 20 March 2005 - 12:57 PM
Yeah Randy, but how's that different from sandboxing thousands and thousands of valid sites, all in the name of discouraging the bad apples that game their algorithms? Maybe I'm just impatient, and maybe I'm actually getting more than a little PO'd about this aging filter, but from my limited experience it seems this is par for the course. Call it a penalty or not, it sure as hell feels like one not being allowed to compete naturally on a level playing field, and it's extremely anti-entrepreneurial at that. It's hard enough to get a small business started without this. I don't see fairness being a high priority for those at Google.
For those who would think I'm implying it: no, I'm not saying Google owes me anything, I'm just damn frustrated they see this as a positive move. 10 1/2 months and counting.
Posted 20 March 2005 - 02:10 PM
But that's still not the reason Google would do one and not the other. Truth is that the whole "I'm gonna start a brand new site to replace the one that just got axed by Google and link to it from all of my other sites" concept is a lot more dangerous to Google than hidden layers will ever be.
Truth be known, if they could simply come up with a really good keyword-stuffing/content-spam filter, they would never need to worry about hidden layers, because those employing the tactic the wrong way would already be filtered out.
Posted 20 March 2005 - 02:13 PM
It's not a question of fairness, Arlen.
The only thing Google cares about is creating a searchable database that will fill its users' (the searchers') needs.
Your needs as a webmaster are not a consideration to Google, nor should they be.
If there is a certain technique that they feel they need to nuke because more often than not it will lower the quality of their index, then that's what they will do.
Obviously, they have decided that most new sites (6 mos or less) are more often than not, NOT deserving to show up highly for competitive search phrases. They had to weigh the pros and cons, by looking at how these sites affected the quality of their index, not by whether the owners of those websites would get mad at them or not.
Remember, Google's motivations are completely at odds with our own.
I'm not saying I like this, nor am I happy about it as a Webmaster or SEO, but I completely understand it. Google has got to do what it has got to do, and we've got to do what we've got to do. It's neither a partnership nor a symbiotic relationship.
They show whatever sites they feel are best to show, and we have to figure out how to make our sites the ones they feel are best, according to whatever relevancy factors they say are the right ones.