
Analytics Shows 0 Sec. Duration For Most Of The Visits


9 replies to this topic

#1 dcpramachandra2011 (HR 1, Members, 4 posts)

Posted 09 June 2015 - 12:03 PM

Hi everybody!

I have come across a situation for the first time in my SEO career. A friend of mine works on an e-commerce website built on the Shopify platform, and he is facing two problems.

1. When we look at its traffic in Google Analytics, most visits show a duration of 0 seconds. I have not been able to make sense of this for the last two months. Can anybody tell me the main reason for it and how to fix it?

2. Analytics also shows a lot of spam visits in the referral traffic, from sites like social-buttons.com, site34.social-buttons.com, 4webmasters.org, sitevaluation.org, free-social-buttons.com, Get-free-traffic-now.com, etc. Is there a solution for this? How can we prevent these spam visits? We want genuine visits to the website.

Any help with these two problems would be much appreciated.

Thanks & Regards,
Ramachandra.
 



#2 Jill (Recovering SEO, Admin, 33,244 posts)

Posted 09 June 2015 - 01:54 PM

You can ban the referring spam URLs via your .htaccess file.
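A minimal sketch of that kind of .htaccess block, assuming Apache with mod_rewrite enabled (the domain list comes from the original post, and you would extend it as new spam referrers appear):

```apache
# Return 403 Forbidden to requests whose Referer matches known spam domains.
RewriteEngine On
RewriteCond %{HTTP_REFERER} (social-buttons|4webmasters|sitevaluation|get-free-traffic-now)\. [NC]
RewriteRule .* - [F]
```

Note this only stops spammers that actually request your pages; "ghost" referrals sent straight to Google Analytics never touch your server at all, so they still need GA-side filters.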



#3 dcpramachandra2011 (HR 1, Members, 4 posts)

Posted 10 June 2015 - 09:27 AM

You can ban the referring spam URLs via your .htaccess file.

We have no way to ban the referring spam URLs via an .htaccess file, as the site is on the Shopify platform.



#4 Michael Martinez (HR 10, Active Members, 5,325 posts, Location: Georgia)

Posted 10 June 2015 - 10:07 AM

No analytics package can accurately track how long a visitor stays on a Website but a 0-second estimated duration means that the visitor only looked at one page during the timeframe that the analytics software deems to be a "visit".  So if that timeframe is set to 30 minutes and the visitor clicks on a link at 30 minutes and 1 second, the second click counts as an entirely new visit.

 

Once you filter out the fake referral traffic you can start drilling down into the data for real visitors and try to determine if:

 

  • Your analytics code is missing from some pages (including shopping cart and checkout)
  • Visitors are spending a longer time on your pages than you expect (maybe compare those pages to a similar site's)
  • Visitors are leaving after landing (bouncing, which has NO EFFECT on search engine optimization)


Edited by Michael Martinez, 10 June 2015 - 10:07 AM.


#5 Jill (Recovering SEO, Admin, 33,244 posts)

Posted 10 June 2015 - 10:49 AM

We have no choice to ban the referring spam URLs via .htaccess file as it is shopify platform.

Not sure what you mean by that. You should always be able to upload an .htaccess file.



#6 torka (Vintage Babe, Moderator, 4,825 posts, Location: Triangle area, NC, USA, Earth (usually))

Posted 10 June 2015 - 11:18 AM

Shopify is a third-party SaaS solution. I don't believe they give users the ability to upload an .htaccess file.

 

If you can't block the traffic at the server, you can at least set up filters in your Google Analytics to prevent it from skewing your stats. A couple of things to keep in mind: you'll probably want to set up a second view for your filtered results, just so you can still access the "whole" non-filtered data set should you ever need it. And filters only take effect on data received after you set them up, so you're kinda stuck with manually subtracting the offending spam from historical reports. Google's help section on filters has more info if you need it.
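For instance, a custom exclude filter on the Campaign Source field in Google Analytics could use a regex along these lines (domains taken from the original post; GA caps the length of a filter pattern, so a long blacklist may need to be split across several filters):

```
social-buttons\.com|4webmasters\.org|sitevaluation\.org|free-social-buttons\.com|get-free-traffic-now\.com
```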

 

HTH!

 

--Torka :oldfogey:



#7 dcpramachandra2011 (HR 1, Members, 4 posts)

Posted 14 June 2015 - 10:06 AM

Thanks to all of you for your quick response.

 

I have started using filters and the spam problem is solved.

 

But what about the 0 sec. session duration? I am seeing a lot of sessions with a 0 sec. duration even in the direct, referral, social and ORGANIC channels, from different geographic locations. Can anybody guess the problem and a solution for it?



#8 chrishirst (A not so moderate moderator, Moderator, 7,718 posts, Location: Blackpool UK)

Posted 14 June 2015 - 11:51 AM

Can anybody guess the problem and solution for it?

 

Why is there a 'problem'?



#9 torka (Vintage Babe, Moderator, 4,825 posts, Location: Triangle area, NC, USA, Earth (usually))

Posted 15 June 2015 - 09:00 AM

Michael already explained the "zero-second visit" issue, if you'll re-read the first paragraph of his answer:

No analytics package can accurately track how long a visitor stays on a Website but a 0-second estimated duration means that the visitor only looked at one page during the timeframe that the analytics software deems to be a "visit". So if that timeframe is set to 30 minutes and the visitor clicks on a link at 30 minutes and 1 second, the second click counts as an entirely new visit.

In other words, it doesn't mean that people were only staying there for one second, and it doesn't indicate a problem with the analytics package. It just means that people are only looking at one page in your site, without clicking through to another page, during whatever length of time your analytics package deems a "visit duration." They might simply be spending a long time on that one page before clicking on, or they might be landing on the page, deciding it's crap (or at least, not what they were looking for) and leaving right away.

 

The way to "fix" that is to make your site better, and seek more targeted visitors. Make your site more engaging, more useful, more what your visitors are looking for. Include strong calls to action. Be sure your offer is worth their while to start with.

 

And stop worrying about getting a lot of visitors, and start thinking more about how to get the right visitors -- that is, those who are actually interested in what you offer, whatever that may be.

 

You would probably be helped by searching on "conversion optimization" and reading some of the experts in that area (the Eisenberg brothers, Brian Massey and the folks at Marketing Experiments and Psychotactics all spring immediately to mind, but there are others).

 

--Torka :oldfogey:



#10 Michael Martinez (HR 10, Active Members, 5,325 posts, Location: Georgia)

Posted 16 June 2015 - 01:58 AM

There are some crawlers that don't provide referral information.  It could be your site is being hit by those.  These are a mix of SEO tools, social media monitoring services, and business intelligence services that have all started up in the past few years.  Many of them use open-source crawlers with little real knowledge of what they are doing.  They are a constant plague in our server log data.

 

Some of these crawlers execute Javascript and it could be that they are coming in from different IP addresses, pretending to be real users visiting your site.  You would have to look at the raw server logs on your Web hosting provider account.  You may have to download them and uncompress them using non-standard compression tools (not everything is ZIPped).

 

I find these crawlers by searching the raw server log data for words like "crawler", "github", and "bot" (although searching for "bot" also turns up a lot of good robots like "AppleBot", "BingBot", "GoogleBot", and "YandexBot").
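That kind of search can be sketched in shell, assuming a combined-format Apache access log (the access.log path and the sample log line here are made up for illustration):

```shell
# Create one sample combined-format log line for illustration;
# on a real host you would point at your actual (possibly gzipped) log.
printf '%s\n' '203.0.113.9 - - [16/Jun/2015:01:58:00 +0000] "GET / HTTP/1.1" 200 512 "-" "ExampleBot/1.0 (+http://example.com/bot)"' > access.log

# Case-insensitive match on common crawler markers, then pull out the
# user-agent (the sixth double-quote-delimited field) and count hits.
grep -iE 'crawler|spider|github|bot' access.log |
  awk -F'"' '{print $6}' | sort | uniq -c | sort -rn
```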

 

Slurp (Yahoo's crawler) has been shown to act like a mobile user in Google Analytics.  It's helpful to block traffic from Slurp since Yahoo no longer maintains its own search index.  They have never explained why they kept running Slurp all these years after entering into their deal with Microsoft.

 

To block these crawlers you will have to identify IP addresses to include in a firewall.  Some Web hosting accounts don't give you a firewall but they may allow you to create an "IP Deny List".  This list will be maintained in an .htaccess file on an Apache/Unix server.  The Microsoft IIS platform uses a configuration file and I am not sure how that works.
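An "IP Deny List" in .htaccess might look like this sketch, assuming Apache 2.2-style access control (the addresses are documentation-range placeholders, not real offenders):

```apache
# Allow everyone by default, then deny specific crawler IPs and ranges.
Order Allow,Deny
Allow from all
Deny from 198.51.100.23
Deny from 203.0.113.0/24
```

(Apache 2.4 replaced this syntax with Require directives, so check which version your host runs.)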

 

If you have a CPANEL dashboard for your Web hosting account you may be able to access a File Manager tool to find your .htaccess file.  You'll have to carefully research how you update it because any error will cause your server to generate an Error 500 response on every attempt to view your Website.  It's always a good idea to back up your configuration/htaccess file before making any changes.

 

You can also block crawlers by user-agent in the .htaccess file, although some of them change or mask their user-agents, and some operators alter open-source crawler code to do the same.
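A user-agent block can be sketched with SetEnvIfNoCase, again assuming Apache 2.2-style access control (the user-agent strings are made-up placeholders):

```apache
# Tag requests from unwanted user-agents, then deny anything tagged.
SetEnvIfNoCase User-Agent "BadCrawler" block_ua
SetEnvIfNoCase User-Agent "rogue-spider" block_ua
Order Allow,Deny
Allow from all
Deny from env=block_ua
```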

 

If you publish RSS feeds and you see crawlers just fetching your feeds, you can probably leave them alone.

 

Any crawlers that identify themselves as being from search engines or Websites that send you legitimate referral traffic should be left alone as well.

 

Everything else should be blocked, either by IP address or user-agent, or by both.

 

Managing rogue crawlers on a Website is an ongoing task.  It will never end because there are always new crawlers and the more aggressive ones change the IP addresses they use to probe Websites.

 

Some crawlers are used by botnets to probe your Website for vulnerable files that they can exploit.  If you know what filenames are used by Shopify's software you should be able to spot the probes because they try to fetch files that don't exist.

 

There is no guarantee that your 0-second visitors include crawlers that execute Javascript.  You just have to look and see what you find.  On some sites that is the problem.  On other sites it may be a lack of interesting or useful information on the pages.  And on others it could be both factors, or something else altogether.







We are now a read-only forum.
 
No new posts or registrations allowed.