Leverage Browser Caching for 3rd Party JavaScript

The following article on Leverage Browser Caching for 3rd Party JavaScript will require you to make small changes to your website’s source code and small edits to a script that will run as your cron. If you’re not confident making these changes then please do not proceed. However, the knowledge needed to make them is minimal and I will walk you through all the changes below.

Anybody who works with websites will know the pain of the Leverage Browser Caching warning. It’s a relatively simple task to fix, until you realise you have JavaScript files that are loaded externally. Today I’m going to explain how to fix the issue and get rid of that annoying alert telling you that you need to cache Google Analytics!

How do you do that?

Now this is not as easily fixable as throwing some code into your .htaccess file, I’m afraid, because external scripts are served from someone else’s server and their cache headers are out of your control. What you can do instead is host the JavaScript locally and use a cron job to keep your copy up to date.
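
For your own locally served files, that .htaccess fix is usually just a few caching directives. As a rough sketch (assuming Apache with mod_expires enabled; adjust the lifetime to taste):

<IfModule mod_expires.c>
  ExpiresActive On
  # Tell browsers they may cache JavaScript files for a week
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>

Those headers only ever apply to files your own server delivers, hence the workaround below.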

First of all, you need to download the script that you’re running. I will be using Google Analytics as an example throughout this article (it appears to be the script people complain about most, but you can replicate this for any external scripts you have running on your website).

Look in your code and find the name of the script that you’re running externally; in our case it is google-analytics.com/ga.js (for most sites this will be located at the bottom of your code). Pop this URL into your web browser and it will bring up the source code. Simply make a copy of it and save it as ga.js. You can name it whatever you like, but I advise keeping the same name.
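
If you’d rather grab the file from the command line than copy-paste from the browser, a throwaway PHP snippet does the same job (this assumes allow_url_fopen is enabled in your PHP configuration):

<?php
// One-off download of ga.js into the current directory; run once, then delete.
$source = 'http://www.google-analytics.com/ga.js';
file_put_contents('ga.js', file_get_contents($source));
?>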

Save this newly created JavaScript file onto your webserver, in my case:

- JS
  - ga.js

Next, you will need to update the code on the pages that call your script so that it loads the JavaScript file from its new location. Once again in our case, we will be changing this line of our Google Analytics code:

ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';

to

ga.src = ('https:' == document.location.protocol ? 'https://' : 'http://') + 'www.yoursite.com/js/ga.js';

The change we’ve made is to the host the script is loaded from: instead of google-analytics.com/ga.js it is now yoursite.com/js/ga.js, so simply substitute your own domain name and the path to wherever you’ve stored the JavaScript file. (Note that Google’s original snippet switches between the ssl. and www. subdomains; since the file now lives on your own server, a single host is enough, and a relative path such as /js/ga.js would work just as well.)
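
For context, here is roughly where that line sits in the classic asynchronous Google Analytics snippet after the change (UA-XXXXX-X is a placeholder for your own property ID):

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);
_gaq.push(['_trackPageview']);

(function() {
  var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
  // Point the browser at our local copy instead of google-analytics.com
  ga.src = ('https:' == document.location.protocol ? 'https://' : 'http://') + 'www.yoursite.com/js/ga.js';
  var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
})();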

At this point, your site will run the script locally! However, this means the script will never update unless you repeat this short process by hand every week or so. That is up to you… but I would advise continuing, because the next step automates it.

For this next part you will need access to your hosting account and will be required to upload some code.

This is where the cron job comes into play:

Just about every hosting service will have an option for you to set up cron jobs. On Hostinger it is on your Hosting Panel; on GoDaddy you will find it under the Content option. If you’re unsure, just ask your hosting provider 🙂

Put the following script into your cron; all you need to do is set the variable $localFile to the absolute path of your local file. What the script does is check Google for updates to the ga.js file. You can set how often it checks for updates, anywhere from once every minute to once a month and beyond.

If you’re also doing this for external files other than Google Analytics, then you will need to change the variable $remoteFile as well. $remoteFile is the URL of the external JavaScript file, and $localFile is the path to your new locally stored copy, simple as that!

<?php
// Script to update the local copy of the Google Analytics script.
// Schedule it as a cron job at whatever interval suits you.

// Remote file to download
$remoteFile = 'http://www.google-analytics.com/ga.js';
// Absolute path to your local copy
$localFile = 'ENTER YOUR ABSOLUTE PATH TO THE FILE HERE';
// For cPanel it will be /home/USERNAME/public_html/js/ga.js

// Connection timeout (seconds)
$connTimeout = 10;
$url = parse_url($remoteFile);
$host = $url['host'];
$path = isset($url['path']) ? $url['path'] : '/';

if (isset($url['query'])) {
  $path .= '?' . $url['query'];
}

$port = isset($url['port']) ? (int) $url['port'] : 80;
$fp = @fsockopen($host, $port, $errno, $errstr, $connTimeout);
if (!$fp) {
  // On connection failure, return the cached file (if it exists)
  if (file_exists($localFile)) {
    readfile($localFile);
  }
} else {
  // Send the request headers; Connection: close ensures the read
  // loop below ends when the server has sent everything
  $header = "GET $path HTTP/1.0\r\n";
  $header .= "Host: $host\r\n";
  $header .= "User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6\r\n";
  $header .= "Accept: */*\r\n";
  $header .= "Connection: close\r\n\r\n";
  fputs($fp, $header);
  $response = '';

  // Read the response from the remote server
  while ($line = fread($fp, 4096)) {
    $response .= $line;
  }

  // Close the connection
  fclose($fp);

  // Strip the response headers from the body
  $pos = strpos($response, "\r\n\r\n");
  $body = ($pos === false) ? '' : substr($response, $pos + 4);

  // Return the processed response
  echo $body;

  // Save the body to the local file, but only on a 200 OK response --
  // otherwise an error page would overwrite a good cached copy
  if ($body !== '' && preg_match('#^HTTP/\d\.\d 200#', $response)) {
    if ($fh = @fopen($localFile, 'w')) {
      fwrite($fh, $body);
      fclose($fh);
    }
  }
}
?>

That is it! Add that to your cron with your selected update times and it should fix any issues you’re having with Leverage Browser Caching for third-party scripts.
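
For example, if you saved the update script as update-ga.php (a filename I’ve picked purely for illustration), a crontab entry like this would fetch a fresh copy every Sunday at 3am:

0 3 * * 0 php /home/USERNAME/update-ga.php

Most hosting panels present the same five scheduling fields (minute, hour, day of month, month, day of week) as a form, so you rarely have to write this line by hand.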

If you have any questions regarding the above and require some help then please don’t hesitate to get in contact.

NOTE:

In truth, these files don’t tend to have a great effect on your actual page speed, but I can understand the worry about Google penalising you. That would only happen if you had a large number of these external scripts running.