The Effect Of Split Testing On Rankings
Posted 02 October 2009 - 05:09 AM
The new version may not rank quite as well for one of my busy phrases, but I'm hoping it will convert better on the other busy phrases (which is, of course, the point of doing the test). But what happens if I'm wrong, and the end result is that I rank lower for one of my busy phrases with no improvement in other areas?
Posted 02 October 2009 - 07:17 AM
To see what the G! bot is likely to see, use Firefox with JS disabled and view the page in question and its source; what you see is likely to be what the G! bot sees.
Unless others know different, of course.
Posted 02 October 2009 - 08:38 AM
I knew all those things, I think I just needed someone with a bit of common sense to put it together for me! (forgive me, it's been a long week)
Posted 02 October 2009 - 08:41 AM
We all have those "can't see for looking" moments; it's every day for me.
Posted 02 October 2009 - 09:37 AM
Typically if you're testing conversion alone, you're not really fooling with stuff like page titles. You can test those to see which drives more clickthroughs, but that's kind of another type of testing. Titles just don't drive conversions on the back end, in my experience.
So if you're not changing the <title> and the phrases you're targeting are still in the content in prominent places, I doubt you'll see much if any ranking effect during your test.
FTR, the times I'm running conversion tests on a prominent traffic-driving page are the one occasion I actually track rankings. Not so much for use during the test itself, but so that I know where rankings were before I started testing and where they end up after I've settled on a conversion-test winner. And yes, there have been times where I've tweaked the winner slightly to nudge rankings back towards where they were before.
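For what it's worth, the split-testing setup being discussed usually boils down to assigning each visitor to a variant in a sticky way, so the same person always sees the same version of the page. Here's a minimal sketch of that idea; the function name and the hash are hypothetical stand-ins, not any particular testing tool's implementation:

```javascript
// Minimal sketch (names hypothetical): deterministically bucket a visitor
// into variant "A" or "B" so they see the same page on every visit.
function assignVariant(visitorId) {
  // Simple stable string hash; any deterministic hash works here.
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % 2 === 0 ? "A" : "B";
}

// The same visitor always lands in the same bucket.
console.log(assignVariant("visitor-123") === assignVariant("visitor-123")); // true
```

Because the assignment is deterministic, you can later compare conversion rates per bucket without worrying that returning visitors bounced between variants mid-test.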
Posted 03 October 2009 - 10:50 AM
No clue whether that means they'd be processing off-domain js calls or not. Or if they'd do that and count it as content for the target domain's page. Flip a coin on that question.
Posted 04 October 2009 - 07:26 AM
I have a feeling this is the type of situation they're describing when they say the anchor text can pass value in JS links. Where all of the info necessary for them to extract information is right there in the source code in plain old text.
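The point above — that everything needed sits in the raw source as plain text — can be sketched with a toy extraction. This is illustrative pattern matching only, not Google's actual parser:

```javascript
// Toy sketch: an <a> tag written out via document.write still has its
// href and anchor text sitting in the raw .js source as plain text, so a
// crawler can pull them out with simple pattern matching, no JS execution
// required. (The regex and URL are illustrative, not Google's method.)
const jsSource =
  `document.write('<a href="http://example.com/widgets">blue widgets</a>');`;

const match = jsSource.match(/<a href="([^"]+)"[^>]*>([^<]+)<\/a>/);
console.log(match[1]); // http://example.com/widgets
console.log(match[2]); // blue widgets
```

Contrast that with content that only comes into existence after an XHR response is processed — there's nothing comparable sitting in the source to extract.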
Posted 04 October 2009 - 10:50 PM
The old JS trick of hiding something from the bots isn't a wise idea these days.
Posted 05 October 2009 - 04:14 AM
AJAX is a prime example: if you don't execute the JS and process the response, the page won't contain any content whatsoever other than an empty div.
It's not stealth, it's AJAX used to perform DHTML, and it doesn't necessarily contain a document.write or any other JS-style content command.
myObject.innerHTML = myHTML;
unless executed, will mean absolutely nothing to G! as plain text for indexing.
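A minimal sketch of that situation, with the DOM update simulated as plain strings (the div id and content are hypothetical): the static HTML a crawler fetches holds only an empty container, and the content exists only after the script runs.

```javascript
// What the crawler fetches: an empty container, no content in the markup.
const staticHtml = '<div id="content"></div>';

// What the browser ends up with after the AJAX response is processed
// (simulated with string replacement instead of a real DOM update).
function renderAjaxResponse(html, responseText) {
  // Stand-in for: myObject.innerHTML = myHTML;
  return html.replace(
    '<div id="content"></div>',
    `<div id="content">${responseText}</div>`
  );
}

const rendered = renderAjaxResponse(staticHtml, "Widget prices and specs");

console.log(staticHtml.includes("Widget prices")); // false
console.log(rendered.includes("Widget prices"));   // true
```

A bot that only reads the fetched markup sees the first string; only a client that executes the JS ever sees the second.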
Also, does G! download the JS file and index it as a JS file to be provided in the SERPs? I'm not happy if it does. OK, I know the JS code is available on the WWW, and I know you should use robots.txt to block unwanted files from being indexed.
But there's just no need, and it's wrong to arbitrarily index people's JS files. I've not come across this yet in any of my SERPs, but if G! is now indexing JS, will this start to happen?
Posted 05 October 2009 - 10:14 AM
And you've hit on one of the things I've actually looked at from time to time: AJAX-driven sites. Their content simply doesn't get fully indexed if the AJAX isn't built in such a way as to make the content accessible (via different URLs) to browsers without JS.
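That "accessible via different URLs" approach is often done hijax-style: the link carries a real href a crawler can follow, and script, when present, intercepts the click and loads the same content via AJAX instead. A sketch, with a hypothetical handler name:

```javascript
// Sketch of a crawlable AJAX link (hijax-style). The loadViaAjax handler
// name is hypothetical; "return false" cancels the normal navigation only
// when JS actually runs, so bots and no-JS browsers follow the plain href.
function crawlableAjaxLink(url, anchorText) {
  return `<a href="${url}" onclick="loadViaAjax('${url}'); return false;">${anchorText}</a>`;
}

const link = crawlableAjaxLink("/widgets.html", "Widgets");
console.log(link.includes('href="/widgets.html"')); // true
```

Either way the content lives at a real URL, so it can be fetched and indexed without executing any script.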
Now of course none of this means that Google isn't grabbing those .js files and doing something with them internally that doesn't show up in the SERPs. If you asked me to guess, I'd say they probably are doing exactly that. And doing it for the purpose of catching sneaky stuff that flies in the face of their quality guidelines. After all, that's how most of the really widespread Google SERP Hacks have been implemented over the past few years. And it's also how most badware is getting distributed these days. So it would make sense for them to grab that JS code and analyze it.
Oh, and for the record, I don't believe encrypting the JS does anything to keep you out of harm's way with Google. I've seen plenty of evidence with badware where the JS was encrypted, but Google still picked up the actual site address people were being forwarded to. So encrypting sneaky stuff isn't going to keep it from being discovered.
Posted 05 October 2009 - 11:07 AM
But I'm still not sure G! could index dynamic DOM updates from AJAX calls even if it did trigger the JS.
If you view the source markup after a DOM update, the content isn't there for normal viewing, so I'd guess the same holds for the Google bot!