Posted 19 February 2006 - 12:55 PM
Is it because people simply don't know how to do this sort of A/B testing? Or is it because they don't know how to analyze the results?
Is this the next big thing someone needs to develop scripts or tools to help people with?
Posted 19 February 2006 - 01:58 PM
I think that's it: "People simply don't know how to do this sort of A/B testing." Cline's tool seems pretty cool to me. The other part you mentioned, about analyzing the results, is also true. Most average webmasters spend their time on so many other things.
I've been doing a wide variety of testing recently. Gone are the days for me of worrying solely about rankings. A big part of my focus now, and for some time, has been overall site improvement for increased conversions on my sites and my clients' sites. I've also had some success recently with getting into that little blue shaded area for sponsored links on Google, which I think is semi-cool.
I've had a lot of word-of-mouth referrals send people to my site recently. Those who were referred took action, which led to increased business. Yes, they were referred to the site, but to my amazement, they actually read the information and it caused them to take the next step.
Somehow the message has to get across LOUD AND CLEAR about what you are asking, because as you know, people are still focused on that little green Google bar, on keyword meta tags (especially designers) and on all the other stuff that doesn't really matter. They should be focused on whether the 10, 100 or 1,000 visitors that went to their site did what they wanted them to.
Posted 19 February 2006 - 02:02 PM
Running on faith!
Posted 19 February 2006 - 04:09 PM
I think it's both!
Should I use a script for tracking? How do I install it? Should I use an outside tracking service? Whatever your answers, which script or service?
What am I tracking? Hits? Purchases? Subscriptions? Paths? Everything? Can your script/service even track sales through someone else's shopping cart or affiliate program?
How do you do a split test? Of your home page? Of your newsletter? With Pay-Per-Click?
Have you ever done an A/B split test and received identical (or very similar) results for both pages (or newsletters)?
Have you done the same split test on a different day and received totally different results? If so, what does this mean?
What is the minimum size sample you should be using for your tests?
Despite the complexity, it's important to get answers. Without testing, you will never know for sure what works and what doesn't work. Testing is key to improving profits from your online business.
Do you have any recommendations for a reasonably priced product or service that helps simplify testing and tracking for Internet marketers?
Posted 19 February 2006 - 10:49 PM
I've not yet developed a script to do all of the split testing automatically. It wouldn't be that difficult, though. When I do split testing -- and I must point out here that I'm almost always testing something or other -- I tend to do it in a random but very controlled manner. Controlled, meaning I try to limit the number of variables that could affect the outcome; random, meaning one visitor gets one page and the next visitor gets another.
I've found that this sort of random (though it's not truly random) testing helps to even out some of the time-of-day, day-of-week and week-to-week differences that can creep in.
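The random-but-controlled assignment described above can be sketched in a few lines. This is a hypothetical illustration, not the actual script from the thread: hashing a visitor ID spreads traffic roughly evenly across the two pages while keeping each returning visitor on the page they first saw.

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a visitor to one page variant."""
    # Hashing the ID keeps a returning visitor on the same page,
    # while spreading visitors evenly between the variants.
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Same visitor, same page, every time:
print(assign_variant("visitor-12345") == assign_variant("visitor-12345"))  # True
```

Because assignment is a pure function of the visitor ID, no server-side state is needed to keep the split stable across visits.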
This is going to sound silly to some, but I've even tested stuff like the look of a Buy Now button, or where a statement about how someone's private information will be handled gets placed. But just one thing at a time. So I won't attempt to test the position of a button at the same time I'm testing the appearance of the button. First I'll test one and then the other. Only then will I test the combinations. It's not terribly unusual for me to see a 50% or larger conversion improvement from something as simple as the look and position of a buy button.
I haven't historically done an awful lot of click path testing either, though I should. I'll be doing a lot more of this going forward.
As far as sample size goes... I like to see at least 2,000 uniques on each element I'm testing. I would rather see 10,000 per element, but that's not always possible. And it's not always necessary either, if you see a large jump from one to the other and have at least a decent sample size.
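As a rough illustration of why a floor like 2,000 uniques per page matters, here is a standard two-proportion z-test (a textbook formula, not something from this thread). The same 2% vs. 3% conversion difference clears the 95% significance cutoff at 2,000 visitors per page, but drowns in noise at 200:

```python
import math

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for conversion counts out of n visitors."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 2% vs 3% conversion with 2,000 uniques per page: |z| > 1.96,
# so the lift is significant at the 95% level.
print(z_score(40, 2000, 60, 2000))
# The identical rates with only 200 per page: |z| well under 1.96.
print(z_score(4, 200, 6, 200))
```

This is also why a very large jump can be trusted on a smaller sample: the z-score scales with the size of the difference as well as with the visitor count.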
A script would make it easier for some I think, at least on the implementation side of things. However I think just having the discussion could also have a positive impact for many.
Posted 20 February 2006 - 08:07 PM
I think it boils down to 2 main reasons:
1) They don't know that they should
2) They don't know how, or it's too complex
I use a free script that serves me well for some sites, and another custom one I developed for a couple more "specialized" sites. I've been looking into some commercial solutions that promise more of a "turnkey" solution, but haven't found the perfect one for my needs yet. But there are so many to choose from, so I'm sure it's out there (I'd hate to have to develop another custom one...my time is better spent elsewhere).
I'd be interested in what others are using, and their experiences with them.
Posted 20 February 2006 - 08:29 PM
Not doing testing can cost a lot more $$$ in the long term.
Posted 20 February 2006 - 11:32 PM
We hear it all the time here on the forum. People come in with questions about "What will happen if I add this?" or "What will happen if I rearrange that?" And when we tell them to try it out and see, you can almost hear them quaking in their shoes at the thought of experimenting with their website.
People would rather live with the less-than-optimal situation they know than risk making things worse. And for some reason, it doesn't seem to occur to them that if it does make things worse, all they have to do is change it back. Or if it does occur to them that they can change things back, they're in fear that whatever change they make will still somehow be treated as permanent by the search engines.
Some folks are so much in fear of being "banned" by the search engines that they're afraid to sneeze around their computers for fear of offending the Search Engine Gods.
I think it's going to involve three-fold education. First, we need to teach folks they're not going to irrevocably ruin their livelihood by testing -- and they may well increase their conversion rates and revenue and make things much better for themselves. Second, we need to teach them a few solid methodologies for testing -- nothing fancy, necessarily, but good common-sense stuff like Randy's advice to only try to test one thing at a time. And, of course, some rules of thumb for how to measure and evaluate the results. And third, we need to let people know about the tools that are out there (or techniques they can implement on their own) to make the testing and evaluation process easier and more accurate.
Posted 21 February 2006 - 06:08 AM
John: The split testing software (if you can even call it that) I use is home grown and frankly wouldn't work well for most sites. It's set up for the way I create my sites, so isn't nearly flexible enough.
Out of curiosity, I've spent the last couple of days looking at some of the commercial options available out there. I'll probably grab licenses for a couple of those to do some testing of the testing software. Sort of a side-by-side test to see which performs best and gives the most important information in an easy-to-understand format. If the freebie script you're using is still available, PM it on over and I'll add it to the list for testing.
Posted 21 February 2006 - 09:03 AM
In our case, using 1AutomationWiz, we have the ability to direct visitors to as many as 3 different pages and the system will even track the conversions.
I don't know about your respective businesses, but in our case, we find lots of people start out looking for free information and I am not sure if (for the most part) they have any interest in anything more. Our conversions are in the low single digits and to top it off, we are a very seasonal business. That doesn't make for a good base to start testing.
To add to the testing problem, what we're finding is that people are grabbing the free downloads and then coming back to purchase some other time. I don't know if we've ever sold anybody anything on a 'first hit' basis. Besides, in our case, I am not convinced people come to the site with the intent to purchase in the first place.
So how can you A/B test that?
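One common way to handle delayed purchases (an illustration of the general technique, not how this particular site works) is to record the variant at the first visit and credit the eventual sale back to it, however many visits later. In practice the store below would be a cookie plus a database table; the names here are hypothetical:

```python
import time

# Hypothetical store mapping visitor -> (variant, first_seen_timestamp).
assignments = {}

def record_visit(visitor_id, variant):
    """Remember which page version the visitor saw on their first visit."""
    assignments.setdefault(visitor_id, (variant, time.time()))

def record_sale(visitor_id, results):
    """Credit a later purchase to the originally assigned variant."""
    if visitor_id in assignments:
        variant, _ = assignments[visitor_id]
        results[variant] = results.get(variant, 0) + 1

results = {}
record_visit("v1", "A")
record_sale("v1", results)   # days later, same visitor returns and buys
print(results)               # {'A': 1}
```

With first-visit attribution like this, a seasonal, slow-converting site can still compare variants; the test just needs to run long enough to cover the typical lag between first visit and purchase.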
Posted 21 February 2006 - 05:32 PM
What Randy said is so simple, but so true. You should A/B test, but only one variable at a time. How else can you isolate cause and effect?
I'm far from a usability expert, but subtle changes to a buy button or putting information where users can see it and act on it can make a BIG difference to conversions or sales or whatever it is you want your visitors to do.
This forum is such a great resource. Thanks!
Posted 22 February 2006 - 02:51 PM
1. They do not know how to do it.
This kind of testing isn't taught much. You can learn the methodology in some fields of psychology, and some areas of the sciences. Not many people studied this in school.
2. They do not understand the statistics.
Most people do not have much understanding of statistics. And even if they do, they may not know the types of statistical analysis required where the data of interest are rare within the universe (e.g., sales, responses). They are more likely to know survey statistics, which are not applicable to common marketing problems.
3. Overconfidence
Your typical marketer or business manager is, in practice, wildly overconfident about their own judgement. They don't feel they need to test. They are already convinced they know what is best.
4. Testing is a lot of work, with no guaranteed outcome.
5. Lack of Understanding about Levers
Many people don't have enough experience to gauge what is worth testing and what isn't. They'll want to test trivial stuff which is unlikely to have a significant impact. (Should the background be blue or green? It's so hard to decide! Let's test it!) And they won't want to test other things, like price, offer, or information architecture that -- while they may be a lot of work to set up -- can have a tremendous impact on profitability.
BTW, the reason I created that A/B Test Statistical Calculator was primarily to pretty up a tool I use so that I could point clients to it and let them run the numbers for themselves, as they are sometimes incredulous about test results (see the point above on overconfidence).
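A calculator like that typically boils down to a significance test on the raw counts. Here is a minimal 2x2 chi-square sketch (my own illustration of the standard test, not the actual calculator mentioned above):

```python
def chi_square_2x2(conv_a, n_a, conv_b, n_b):
    """Chi-square statistic for a 2x2 table of conversions vs. non-conversions."""
    # Observed counts: (converted, not converted) for each variant.
    table = [(conv_a, n_a - conv_a), (conv_b, n_b - conv_b)]
    total = n_a + n_b
    conv_total = conv_a + conv_b
    chi2 = 0.0
    for i, row_total in enumerate((n_a, n_b)):
        for j, col_total in enumerate((conv_total, total - conv_total)):
            expected = row_total * col_total / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# 3.84 is the 95% cutoff for one degree of freedom, so this
# 40/2000 vs. 60/2000 result is statistically significant:
print(chi_square_2x2(40, 2000, 60, 2000) > 3.84)  # True
```

Handing a client the raw counts and the statistic side by side is an easy way to show that an apparent winner is (or is not) more than chance.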