
Volume of Traffic Has No Effect on SERP or Outbound Link Power/Popularity


3 replies to this topic

#1 beibei

    HR 1

  • Members
  • 1 posts

Posted 29 June 2009 - 10:30 AM

Hello all.

I would like to suggest something which may be seen as controversial (or maybe not – I’m still quite new at this), and I would appreciate any informed responses you can give, as there seems to be very little about this on the forums:

Volume of traffic has no effect on SERP or Link Power/Popularity/Page Rank.


In theory, two identical sites with identical backlinks, domain age, keywords, etc., where one had a million visitors a day and the other had none, would both achieve the same SERP position / PR.

Apparent counter-evidence:

It is obvious that, in most cases, sites with high traffic have high search engine positions and a lot of link power. But that in no way means traffic is playing a part in SERPs or PR. Obviously, a site with a high SERP position, or with many relevant, authoritative inbound links, attracts a lot of traffic. The traffic is caused by the SERP position / links; the SERP position / links are not caused by the traffic.

How could it be done?

If you disagree, the question you should be asking yourself is: how would Google (for example) fairly measure the traffic that goes to a website?

Possible answers:
• By recording the website data from Google Analytics.
• By recording the browser data from Google Chrome.
• By recording the browser data collected from the Google Toolbar.
• By recording successful click-throughs from the Google search engine (be it organic or paid).

I can think of no other ways (but would be delighted if someone else can). I’ll look at each and explain why I feel that Google cannot use the data.

• By recording the website data from Google Analytics:

The obvious problem here is that a relatively tiny percentage of the internet uses Google Analytics. Why does that matter? Because Google would be giving an unfair advantage to those that do have it, which would skew its own SERP results / link power rankings away from delivering the most relevant results. Any major company using bespoke analytics would be at a disadvantage, as would every gem of a website that doesn’t use analytics at all.

You might argue that for those without GA, Google would ‘assign’ an estimated average value based on... what? Number of pages? Backlinks? But that would mean that half the websites with GA (the ones with below-average traffic) would actually perform better if they didn’t use it. It would be better to use a rival package.
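To make that concrete, here is a toy sketch in Python (entirely made-up numbers and site names; this is not anything Google actually does) of what ‘assign the average to non-GA sites’ would mean in practice:

CODE
# Hypothetical sketch: if non-GA sites were credited with the average
# traffic of GA-reporting sites, every GA site with below-average
# traffic would be better off opting out.

sites = {
    "site_a": {"visits": 10_000, "uses_ga": True},
    "site_b": {"visits": 200,    "uses_ga": True},   # below average
    "site_c": {"visits": 5_000,  "uses_ga": False},
    "site_d": {"visits": 50,     "uses_ga": False},
    "site_e": {"visits": 1_000,  "uses_ga": True},   # below average
}

# The only hard data available comes from the GA-reporting sites.
ga_visits = [s["visits"] for s in sites.values() if s["uses_ga"]]
estimated_avg = sum(ga_visits) / len(ga_visits)  # ~3,733 visits/day

for name, s in sites.items():
    credited = s["visits"] if s["uses_ga"] else estimated_avg
    print(f"{name}: credited with {credited:,.0f} visits/day")

# site_d (50 real visits, no GA) is credited with ~3,733, while
# site_b (200 real visits, GA) is credited with its true 200.
# Dropping GA would raise site_b's credited traffic roughly 18x.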

For me, none of this makes sense. I don’t feel that Google could use traffic data from its own analytics package to accurately or fairly judge a website’s SERP position or PR.

• By recording the browser data from Google Chrome.
• By recording the browser data collected from the Google Toolbar:

These two are quite similar. I must admit, I do not know whether Google does something similar to Alexa and actively collects browser data (i.e. the Alexa traffic rank), but I suspect that doing so would be a poor way to judge SERP position or PR, for many of the same reasons. Primarily, these tools would not show all the traffic that a website attracts, only the visits from people using Chrome or the toolbar. It could only work by assuming the data is representative of all users, and when the percentages involved are so small, the results will be off. Alexa results are notoriously skewed in favour of the geeky sites that are visited by the kind of geeky people (like me) who have used the Alexa Toolbar, or indeed even know what it is. Again, I feel this approach would damage the image of quality results that Google has tried so hard to achieve.
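As a toy illustration of that sampling bias (the adoption rates are invented; this is not Alexa’s or Google’s actual methodology): if the toolbar is installed far more often by one kind of user, panel-based traffic estimates diverge badly from reality:

CODE
import random

random.seed(42)

# Two sites with identical real traffic, but (hypothetical) toolbar
# adoption is much higher among visitors to the "geeky" site.
REAL_VISITS = 100_000
adoption = {"geeky_site": 0.10, "mainstream_site": 0.005}

for site, rate in adoption.items():
    # A visit is observed only if that visitor happens to run the toolbar.
    observed = sum(1 for _ in range(REAL_VISITS) if random.random() < rate)
    # A naive panel estimate assumes a uniform 2% adoption across the web.
    estimated = observed / 0.02
    print(f"{site}: observed={observed:,}, "
          f"estimated={estimated:,.0f}, real={REAL_VISITS:,}")

# geeky_site comes out looking ~5x bigger than it really is;
# mainstream_site looks ~4x smaller. Equal sites, wildly unequal estimates.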

• By recording successful click-throughs from the Google search engine (be it organic or paid):

Again, this would only pick up a percentage of a site’s total traffic. Is that percentage enough? I would argue no. How many people type ‘bbc’, ‘cnn’ or their college/university into Google? How many more type the address straight into the URL bar, come in via links, or have it bookmarked? Again, I would argue that this inconsistency across the internet would make it impossible for Google to accurately estimate total traffic based on its own click-throughs.
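A quick back-of-the-envelope illustration (the numbers and site names are invented): two sites with identical real traffic can look wildly different when judged only by the clicks Google itself can see:

CODE
# Invented numbers: same real traffic, very different search share.
sites = {
    # Navigational brand like 'bbc': mostly direct/bookmark traffic.
    "brand_site":   {"real_daily_visits": 500_000, "search_share": 0.05},
    # Long-tail content site: lives almost entirely off search.
    "content_site": {"real_daily_visits": 500_000, "search_share": 0.70},
}

for name, s in sites.items():
    clicks_google_sees = s["real_daily_visits"] * s["search_share"]
    print(f"{name}: Google-visible clicks = {clicks_google_sees:,.0f} "
          f"of {s['real_daily_visits']:,} real visits")

# Judged by its own click-throughs alone, Google would conclude that
# content_site gets 14x the traffic of brand_site -- yet they are equal.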

So I am arguing that Google cannot accurately gauge the traffic of a website relative to other sites. This inaccuracy means that it cannot use the inconsistent knowledge it does have in any of its ranking algorithms.

Would love some feedback on what feels like quite a counter-intuitive suggestion.






#2 BBCoach

    HR 5

  • Moderator
  • 402 posts

Posted 29 June 2009 - 10:45 AM

First, the world is a lot bigger than Google. Second, how would they be able to accurately measure a site's traffic to give it an effective ranking in the SERPs? They can't, so they crawl the web and build the rankings with their own algos, based on a page's content in relation to the many other secret factors for ranking a page.

#3 Jill

    Recovering SEO

  • Admin
  • 33,244 posts

Posted 29 June 2009 - 11:03 AM

QUOTE
So I am arguing that Google cannot accurately gauge the traffic of a website relative to other sites. This inaccuracy means that it cannot use the inconsistent knowledge it does have in any of its ranking algorithms.


I think most here would agree with you.

Although I imagine some day they may use that data, and might even today to a certain extent. But, like you, I believe its usefulness is limited, so they would take it with the grain of salt that it's worth.

#4 Randy

    Convert Me!

  • Moderator
  • 17,540 posts

Posted 29 June 2009 - 11:44 AM

I would contend that even if they could gain some accuracy, traffic volume would still be a bad indicator to use.

Starting with the fact that just because one site gets more traffic than another doesn't necessarily mean it's a "better" site, let alone a better site for people looking for whatever search terms it targets. I've said it before and I'll say it again... I'd much rather get visits from only 500 people per week if all or most of those people are truly looking for what I have to offer and have the wherewithal to pay for it. I'd much rather have those 500 and those 500 only, as opposed to attracting an additional 4,500 or 49,500 that either can't or won't buy, just to get my stuff in front of the 500 that are buyers. The others just waste my time and bandwidth.

Unfortunately, limiting it to just the 500 buyers is much easier said than done. But if traffic volume were a consideration, I'd have to make a legitimate effort to attract all of those non-interested or not-ready-to-buy visitors, as opposed to focusing on better serving those I can actually help.

So there's that. More traffic doesn't prove anything. And in fact most times more traffic is fairly useless.

And just to put another spin on it, let's say they did start using traffic as some sort of indicator or ranking factor. If they did, I could go out today and spend a couple hundred bucks to buy 100,000 - 1,000,000 hits from popup or pop-under vendors. Talk about your cheap advertising! It's cheap because conversion rates really, really suck. But if traffic volume were a ranking factor, those would suddenly become quite popular.

Or I could simply go purchase a dozen or so old, expired domains and point their traffic at my real domain, thus gaining all sorts of traffic that wasn't really mine to begin with.

The moral of the story being that traffic volume as a ranking factor would be far too easy to game, and would tell the engines basically nothing about the real popularity of any site with its target audience.



