
Slow Page Load


10 replies to this topic

#1 madams

HR 5 · Active Members · 504 posts · Location: Costa Blanca, Spain

Posted 16 January 2010 - 05:38 AM

Hi

I have a rather slow page load, so to see what was going on I installed the Firebug add-on for Firefox.

I ran a speed test and one result said...

QUOTE
This page makes 77 parallelizable requests to www.my-site.com. Increase download parallelization by distributing these requests across multiple hostnames:


...and it shows a list of css, png, and gif files.

Can anyone explain what this means, in simple English, and how it can be fixed?

Thanks

#2 rolf

HR 6 · Active Members · 675 posts · Location: Suffolk, UK

Posted 16 January 2010 - 06:41 AM

As I understand it, this basically means spreading the content over several servers (or hostnames), easing the load on the source(s) and path(s) of the data. I get similar recommendations in Firebug, but after advice here I am under the impression that this would only make sense for a larger organisation with a massive site like eBay or Amazon - although I'm just passing on hearsay and I may have misunderstood.
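To illustrate what the tool is asking for (a rough sketch - the static1/static2 hostnames are made up, not something Firebug prescribes): you would reference your images, CSS and scripts from a couple of extra hostnames so the browser opens more parallel connections.

CODE
<!-- assets split across made-up extra hostnames so the browser
     downloads more of them in parallel -->
<link rel="stylesheet" href="http://static1.my-site.com/styles/main.css">
<img src="http://static2.my-site.com/images/logo.png">
<script src="http://static1.my-site.com/js/site.js"></script>

Both hostnames would normally point at the same physical server; the browser just treats them as separate sites and opens extra connections for each.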

One Firebug recommendation I have acted on is the use of gzip, which has improved my page load times dramatically. From a human perspective my pages already loaded fast, but from Google's perspective my test site was slower than about 80% of other sites. Since implementing gzip (and cutting some old and redundant functions from the JavaScript file) that site is now showing in Google as faster than 58% of other sites, plus I'm using less of my bandwidth as a pure bonus. :-) (Although it does occur to me that as Google and Firebug are recommending this sort of thing to everyone, this will only give me a temporary advantage on the Google front.)

#3 madams

HR 5 · Active Members · 504 posts · Location: Costa Blanca, Spain

Posted 16 January 2010 - 07:00 AM

Hi Rolf

Thanks for that. I am not a multinational co. :D

Yes, I am trying to implement gzip/deflate compression.

This is active on my server. Had a few problems and the host tech guys are looking into it.

I have googled for the .htaccess code to trigger gzip/deflate, and it is confusing: there are so many different opinions and approaches.

I am on Apache 2.2.14

If anyone knows the code or could guide me to the relevant page, please let me know.

#4 rolf

HR 6 · Active Members · 675 posts · Location: Suffolk, UK

Posted 16 January 2010 - 07:44 AM

My .htaccess file says:

CODE
<FilesMatch "\.(php|html|css|js)$">
    SetOutputFilter DEFLATE
</FilesMatch>


I have some issues I'm monitoring regarding the use of implied index files (e.g. just the domain or directory with no file specified) and the use of variables in the URL (e.g. filename.php?variable1=hello&variable2=world). However, there are some anomalies in the data I'm getting from my tracking software that make me think the .htaccess file is fine and it's my reporting software that is having a problem.

That aside, this code is definitely working on some/most/all of my files and has increased the speed Google measures for my page/site loading times.

#5 madams

HR 5 · Active Members · 504 posts · Location: Costa Blanca, Spain

Posted 16 January 2010 - 08:30 AM

Tried that, Rolf - still no luck.

I checked it at this site, which has a tool for testing gzip compression.

By the way, did you put the code in the .htaccess in root or in the folder with the relevant files?

I have 2 .htaccess files.

Anyway, I will have to wait till my host tech guys have a look.

#6 rolf

HR 6 · Active Members · 675 posts · Location: Suffolk, UK

Posted 16 January 2010 - 08:52 AM

QUOTE
By the way, did you put the code in the .htaccess in root or in the folder with the relevant files?


Just in the root folder. As I understand it (and I'm no expert, so I could be wrong), settings/commands in the root .htaccess file will apply to all subfolders unless there is a .htaccess file in a given folder with a contradictory setting/command, in which case the most local .htaccess file gets priority - is that your understanding of it too? There's a little sketch of what I mean below.
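For example (a minimal sketch, with a made-up /downloads folder standing in for any subfolder):

CODE
# /.htaccess (site root) - applies to every folder below it by default
Options -Indexes

# /downloads/.htaccess - the most local file wins for this folder only
Options +Indexes

So directory listings would be switched off everywhere except /downloads, where the local file overrides the root setting.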

#7 madams

HR 5 · Active Members · 504 posts · Location: Costa Blanca, Spain

Posted 16 January 2010 - 09:32 AM

Yes, that's how I understand .htaccess.



#8 Randy

Convert Me! · Moderator · 17,540 posts

Posted 16 January 2010 - 11:21 AM

All of the above is spot on. There are lots of ways to do it, each done a little bit differently and with a slightly different effect. You may want to reference this thread from a couple of weeks ago.

Since you have Apache 2, and I'm going to assume here that it's a standard build of Apache 2, it probably already has mod_deflate compiled into it. That will probably be the easiest way to implement compression.

I do it by the MIME type of the file instead of designating file extensions. Either should work; it's just a personal preference thing.

To do it by the file's MIME type, and for text files only (you don't want to try to compress images or binary files), I use a single-line instruction in the root-level .htaccess. It says:

CODE
AddOutputFilterByType DEFLATE text/html text/plain text/xml


If mod_deflate was compiled into your Apache 2 build --and that was the default-- all of your text-based pages should now be compressed.
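If you want to confirm it for yourself, here is roughly what the exchange looks like when compression is working (a sketch, with www.my-site.com standing in for your domain; any of the header-checker tools will show you the same thing):

CODE
GET / HTTP/1.1
Host: www.my-site.com
Accept-Encoding: gzip, deflate

HTTP/1.1 200 OK
Content-Type: text/html
Content-Encoding: gzip

The Content-Encoding: gzip line in the response is the confirmation that the filter is doing its job.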

#9 madams

HR 5 · Active Members · 504 posts · Location: Costa Blanca, Spain

Posted 17 January 2010 - 05:48 AM

That did it, Randy.

(Or my host tech guys changed something)

Anyhow it works a treat now.

Thanks Randy and Rolf.

One other thing...

The Firebug performance test also suggested enabling caching.

I ran a header test and the results were...

CODE
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache


I entered the following in the .htaccess file...

CODE
<IfModule mod_expires.c>
    <FilesMatch "\.(jpg|gif|png|css|js)$">
        ExpiresActive On
        ExpiresDefault "access plus 1 month"
    </FilesMatch>
</IfModule>


I then ran the header checker again and it was the same: no cache.

Am I missing something?

#10 Randy

Convert Me! · Moderator · 17,540 posts

Posted 17 January 2010 - 12:13 PM

And now we get into one of those areas where I think some tools just get it wrong. :lol: Personal opinion follows, but it's backed up by fact.

I'm going to assume here that you have a site where you're simply not giving the user any cache instructions. Typically this would be done in the HTML code with a <meta> header. I'm also going to assume that Firebug is giving you what I would call a false-positive message that starts out something like "The following elements are missing a cache expiration date." and then probably lists some images, along with possibly some .js and .css files.

Am I correct in my assumptions?

If so, I would encourage you to do exactly nothing.

The Firebug folks make the warning look ominous, but it's a false positive IMHO.

Bottom line as far as I'm concerned is that except in very, very rare situations the choice of when/how a browser caches information should always be up to the end user. Not the webmaster. It is the user's computer after all, so they should have full control over what gets cached on it, for how long and when. By default every browser I've ever tested is set up to automatically cache information correctly.

Additionally, a correctly configured server --and the vast majority are correctly configured out of the box-- will deliver a quite proper 304 header status message for all of these image, CSS and JS files if a page has already been visited and cached by a user. You can quite easily confirm that your server is correctly configured by reviewing your raw log files while you're hitting a page on your site. Look for something in each line like "GET /someimage.jpg HTTP/1.1" 304, where the important part is the 304. That's the status code. If the server is forcing a fresh download it will say 200 instead of 304.

What does 304 mean? In plain English it means Not Modified since the last visit by that browser, which tells the browser there is no need to download the file again if it's already in the browser's cache.
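Here is roughly what that conversation looks like on the wire (a sketch; the file name and date are made up):

CODE
GET /someimage.jpg HTTP/1.1
Host: www.my-site.com
If-Modified-Since: Sat, 16 Jan 2010 10:38:00 GMT

HTTP/1.1 304 Not Modified

The browser says "I already have a copy from this date," and the server answers 304 with no body at all, so the image never travels over the wire a second time.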

So, like I said above, I'd do exactly nothing. You're already perfect and the tool is misreporting information.

#11 madams

HR 5 · Active Members · 504 posts · Location: Costa Blanca, Spain

Posted 18 January 2010 - 02:10 AM

Yes, you are correct in your assumptions, Randy.

Thanks as always for the help





