Speed Up Your Javascript Load Time


Javascript is becoming increasingly popular on websites, from loading dynamic data via AJAX to adding special effects to your page.

Unfortunately, these features come at a price: you must often rely on heavy Javascript libraries that can add dozens or even hundreds of kilobytes to your page.

Users hate waiting, so here are a few techniques you can use to trim down your sites.

(Check out part 2 for downloadable examples.)

Find The Flab

Like any optimization effort, it helps to measure first and find out which parts take the longest. You might find that your images and HTML outweigh your scripts. Here are a few ways to investigate:

1. The Firefox Web Developer toolbar lets you see a breakdown of file sizes for a page (Right Click > Web Developer > Information > View Document Size). Look at the breakdown to see which files are eating the majority of your bandwidth:

(Screenshot: the Web Developer toolbar’s document-size breakdown for yahoo.com)

2. The Firebug Plugin also shows a breakdown of files – just go to the “Net” tab. You can also filter by file type:

(Screenshot: Firebug’s Net tab showing file sizes for yahoo.com)

3. OctaGate SiteTimer gives a clean, online chart of how long each file takes to download:

(Screenshot: OctaGate SiteTimer’s download chart for yahoo.com)

Disgusted by the bloat? Decided your javascript needs to go? Let’s do it.

Compress Your Javascript

First, you can try to make the javascript file itself smaller. There are many utilities that “crunch” your files by removing whitespace and comments.

These tools can be finicky, though, and may make unwanted changes if your code isn’t formatted properly. Here’s what you can do:

1. Run JSLint (online or downloadable version) to analyze your code and make sure it is well-formatted.

2. Use YUI Compressor to compress your javascript from the command line. There are some online packers, but the YUI Compressor (based on Rhino) actually analyzes your source code so it has a low chance of changing it as it compresses, and it is scriptable.

Install the YUI Compressor (it requires Java), then run it from the command-line (x.y.z is the version you downloaded):

java -jar yuicompressor-x.y.z.jar myfile.js -o myfile-min.js

This compresses myfile.js and spits it out into myfile-min.js. The compressor removes whitespace and comments, and shortens variable names where appropriate.

I pack the original javascript with this tool and deploy the packed version to my website.

Debugging Compressed Javascript

Debugging compressed Javascript can be really difficult because the variables are renamed. I suggest creating a “debug” version of your page that references the original files. Once you test it and get the page working, pack it, test the packed version, and then deploy.

If you have a unit testing framework like jsunit, it shouldn’t be hard to test the packed version.

Eliminating Tedium

Because typing these commands over and over can be tedious, you’ll probably want to create a script to run the packing commands. This .bat file will compress every .js file and create .js.packed:

compress_js.bat:
for /F %%F in ('dir /b *.js') do java -jar yuicompressor-x.y.z.jar %%F -o %%F.packed

Of course, you can use a better language like perl or bash to make this suit your needs.
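For instance, a rough bash sketch of the same loop (the jar filename is a placeholder for whichever YUI Compressor version you downloaded):

```shell
#!/bin/sh
# Rough bash equivalent of compress_js.bat: run the YUI Compressor on
# every .js file in the current directory, writing .js.packed versions.
# The jar filename is an assumption; substitute your downloaded version.
JAR="yuicompressor-x.y.z.jar"
for f in *.js; do
  [ -e "$f" ] || continue   # no .js files here: do nothing
  java -jar "$JAR" "$f" -o "$f.packed" || echo "failed on $f"
done
```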

Optimize Javascript Placement

Place your javascript at the end of your HTML file if possible. Notice how Google Analytics and other stat-tracking scripts ask to be placed right before the closing </body> tag.

This allows the majority of page content (like images, tables, text) to be loaded and rendered first. The user sees content loading, so the page looks responsive. At this point, the heavy javascripts can begin loading near the end.

I used to have all my javascript crammed into the <head> section, but this was unnecessary. Only core files that are absolutely needed in the beginning of the page load should be there. The rest, like cool menu effects, transitions, etc. can be loaded later. You want the page to appear responsive (i.e., something is loading) up front.
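Concretely, a trimmed-down page might look like this (the script filenames are invented for illustration):

```html
<html>
<head>
  <title>My Page</title>
  <!-- only truly essential scripts belong up here -->
</head>
<body>
  <!-- text, images and tables load and render first -->

  <!-- heavy, optional scripts go last, just before the closing tag -->
  <script type="text/javascript" src="menu-effects.js"></script>
  <script type="text/javascript" src="stats-tracker.js"></script>
</body>
</html>
```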

Load Javascript On-Demand

A common AJAX pattern is to load javascript dynamically, when the user invokes a feature that requires it. You can load an arbitrary javascript file from any domain using the following import function:

function $import(src){
  var scriptElem = document.createElement('script');
  scriptElem.setAttribute('src',src);
  scriptElem.setAttribute('type','text/javascript');
  document.getElementsByTagName('head')[0].appendChild(scriptElem);
}

// import with a timestamp query parameter to defeat caching
function $importNoCache(src){
  var ms = new Date().getTime().toString();
  var seed = "?" + ms;
  $import(src + seed);
}

The function $import('http://example.com/myfile.js') will add an element to the head of your document, just like including the file directly. The $importNoCache version adds a timestamp to the request to force your browser to get a new copy.

To test whether a file has fully loaded, you can do something like

if (typeof myfunction != 'undefined'){
  // loaded
}
else{ // not loaded yet
  $import('http://www.example.com/myfile.js');
}

There is an AJAX version as well but I prefer this one because it is simpler and works for files in any domain.
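If you need to run code the moment the file arrives, a callback variant of $import can help. This is only a sketch: it relies on the script element’s onload event, and old versions of Internet Explorer need onreadystatechange instead (not shown):

```javascript
// Like $import, but fires a callback when the script finishes loading.
// Sketch only: no error handling, and old-IE quirks are ignored.
function $importOnLoad(src, callback) {
  var scriptElem = document.createElement('script');
  scriptElem.type = 'text/javascript';
  scriptElem.onload = callback;   // attach the handler before setting src
  scriptElem.src = src;
  document.getElementsByTagName('head')[0].appendChild(scriptElem);
}
```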

Delay Your Javascript

Rather than loading your javascript on-demand (which can cause a noticeable gap), load your script in the background, after a delay. Use something like

var delay = 5;
setTimeout(loadExtraFiles, delay * 1000); // pass the function itself, not a string

This will call loadExtraFiles() after 5 seconds, which should load the files you need (using $import). You can even have a function at the end of these imported files that does whatever initialization is needed (or calls an existing function to do the initialization).

The benefit of this is that you still get a fast initial page load, and users don’t have a pause when they want to use advanced features.

In the case of InstaCalc, there are heavy charting libraries that aren’t used that often. I’m currently testing a method to delay chart loading by a few seconds while the core functionality remains available from the beginning.

You may need to refactor your code to deal with delayed loading of components. Some ideas:

  • Use setTimeout to poll the loading status periodically (check for the existence of functions/variables defined in the included script)
  • Call a function at the end of your included script to tell the main program it has been loaded
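The first idea might look like this rough sketch (waitFor and drawChart are invented names, not part of any library):

```javascript
// Poll every 100ms until the delayed script has defined the global
// function we need, then run the supplied initialization callback.
function waitFor(name, onReady) {
  var timer = setInterval(function () {
    if (typeof window[name] === 'function') {
      clearInterval(timer);
      onReady();
    }
  }, 100);
}

// Example: once the charting library defines drawChart, initialize it.
// waitFor('drawChart', function () { drawChart('results'); });
```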

Cache Your Files

Another approach is to explicitly set the browser’s cache expiration. In order to do this, you’ll need access to PHP or Apache’s .htaccess so you can send back certain cache headers (read more on caching).

Rename myfile.js to myfile.js.php and add the following lines to the top:
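A minimal PHP sketch of those lines (the header names are standard HTTP; tune the charset and expiry to your needs):

```php
<?php
// Serve this file as JavaScript and ask browsers to cache it for 3 days.
$offset = 60 * 60 * 24 * 3;  // 3 days, in seconds
header("Content-Type: application/javascript; charset=UTF-8");
header("Cache-Control: max-age=" . $offset);
header("Expires: " . gmdate("D, d M Y H:i:s", time() + $offset) . " GMT");
?>
```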

In this case, the cache will expire in (60 * 60 * 24 * 3) seconds or 3 days. Be careful with using this for your own files, especially if they are under development. I’d suggest caching library files that you won’t change often.

If you accidentally cache something for too long, you can use the $importNoCache trick to add a datestamp like “myfile.js?123456” to your request (which the server ignores). Because the URL is different, the browser will request a new version.

Setting the browser cache doesn’t speed up the initial download, but can help if your site references the same files on multiple pages, or for repeat visitors.

Combine Your Files

A great method I initially forgot is merging several javascript files into one. Your browser can only have so many connections to a website open at a time — given the overhead to set up each connection, it makes sense to combine several small scripts into a larger one.

But you don’t have to combine files manually! Use a script to merge the files — check out part 2 for an example script to do this. Giant files are difficult to edit – it’s nice to break your library into smaller components that can be combined later, just like you break up a C program into smaller modules.
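The merge itself can be as simple as concatenating files in dependency order. Here’s a tiny self-contained shell sketch (the module names are invented) that creates three stand-in files and combines them:

```shell
#!/bin/sh
# Stand-in modules (in practice these are your real library files)
printf 'var core = 1;\n'  > core.js
printf 'var menu = 2;\n'  > menu.js
printf 'var chart = 3;\n' > charts.js

# Concatenate in dependency order into a single download
cat core.js menu.js charts.js > combined.js
```

In practice you’d then run the compressor on combined.js before deploying.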

Should I Gzip It?

You probably should. I originally said no, because some older browsers have problems with compressed content.

But the web is moving forward. Major sites like Google and Yahoo use it, and the problems in the older browsers aren’t widespread.

The benefits of compression, often a 75% or more reduction in file size, are too good to ignore: optimize your site with HTTP compression.

All done? Keep learning.

Once you’ve applied these techniques, recheck your page size using the tools mentioned earlier to see the before-and-after difference.

I’m not an expert on these methods; I’m learning as I go.

Keep your scripts lean, and read part 2 for some working examples.

Other Posts In This Series

  1. How To Optimize Your Site With HTTP Caching
  2. How To Optimize Your Site With GZIP Compression
  3. How To Debug Web Applications With Firefox
  4. Speed Up Your Javascript Load Time
  5. Speed Up Your Javascript, Part 2: Downloadable Examples!
Kalid Azad loves sharing Aha! moments. BetterExplained is dedicated to learning with intuition, not memorization, and is honored to serve 250k readers monthly.


45 Comments

  1. geez man… :)

    instead of plugins [web-developer toolbar]&[OctaGate SiteTimer] you can use an another great fx-plugin. this is firebug!

  2. Probably not. Although some browsers can accept compressed javascript (myfile.js.gz) or files returned with the “gzip” encoding header, this behavior is not consistent between browsers and can be problematic.

    Can you be more specific about the problematic part? Because the great advantage of gzipping javascript is that the code is still readable.

    I haven’t had any problems with it so far, but maybe I’m overlooking something.

  3. @magic.ant: Thanks, I forgot about mentioning firebug. It’s great for debugging, but does it break down page sizes as well? (Update: Firebug does show page sizes on the Net tab, not sure how I missed that one! I’ve updated the article.)

    @Blaise: The ThinkVitamin article goes into more detail — apparently some versions of Netscape and Internet Explorer (4-6) have issues correctly decompressing gzipped content:

    “When loading gzipped JavaScript, Internet Explorer will sometimes incorrectly decompress the resource, or halt compression halfway through, presenting half a file to the client. If you rely on your JavaScript working, you need to avoid sending gzipped content to Internet Explorer.”

    There may be workarounds by detecting the browser and returning different code, but I don’t think it’s worth the risk.

    However, if you do get it working in all browsers, I’d love to hear about it!

  4. Nice article. You can also mention about merging all javascript files into one – this would reduce the no. of HTTP calls (even if the individual files are cached). And if you are pulling JS files from another host, it also makes sense to move those files to your domain, as this would reduce the server lookup time.

  5. i wrote n uploaded a javascript in my website
    that script loads as the page loads
    but i want to enter some 30 seconds delay in javascript to load
    can i do it
    plz help me

  6. Kalid, I don’t agree with your comment on not using gzip. Gzip is required to be supported by HTTP 1.1 protocol, which includes most modern browsers. The bug in IE 6 that is referenced by the ThinkVitamin article affects not just gzip content, but other content as well. People are far better off using gzip than not.

    Also, regarding your comments on caching you need to be careful about what HTTP 1.1 caching commands you use because users could be going through proxies that are HTTP 1.0 resulting in odd/unexpected behavior.

  7. Hi Trevin, thanks for the comments.

    1) Regarding gzip compression, I agree that it is extremely useful (in fact, I’m researching the best way to turn it on for InstaCalc).

    One problem with IE6 is that it says it accepts gzip’d content but may have problems decoding it:

    http://www.google.com/search?q=ie6%20gzip%20problem
    http://support.microsoft.com/default.aspx?scid=kb;en-us;Q312496

    As a result, webmasters serving gzip’d content have to resort to hacks like detecting the browser user-agent and returning regular content to IE6, even if it says it can accept compressed content:

    http://httpd.apache.org/docs/2.0/mod/mod_deflate.html

    These tricks can be done, but may be tough for a newbie. Of course, we can just take a scorched earth approach and let IE6 choke if it can’t render the page, but this is tough stance to take given that IE6 still has a large fraction of browser share.

    However, I agree with you that output compression is extremely valuable. In my tests it shaves over 2/3 of the bandwidth, so I really, really want to enable it (and find a suitable workaround for IE6).

    2) Yes, caching can be tricky as well. As I followed these topics down the rabbit-hole I’m seeing more of the intricacies here.

    In general though, it appears to me that an old HTTP 1.0 proxy won’t cache something a new HTTP 1.1 header is set. You might not get the performance benefit when using an old proxy, but I’m not sure what other impact there would be.

    Both of these are probably topics for a follow-up article :)

  8. I just installed Java..What will it do? I need it to speed up my browser. Will it do that? If not, what?

    Peace

    Dirk

  9. Hi Dirk, Java and Javascript have similar names but actually aren’t related (it’s often a big source of confusion).

    I’ve been using Opera (http://www.opera.com) as my browser — it’s really fast and you may find it fits your needs.

  10. Good point… although I think there may be some other ancient browsers in the “gzip not welcome” list too :)

  11. response to “Should I Gzip It?” paragraph:

    You CANNOT ignore old browsers. It shouldn’t even be an option unless you re making an intranet app for a specific browser. As a (web) developper, it’s YOUR JOB to make it work for everyone. Any other mindset and you might as well just go sell peanuts…

  12. @nnm

    while I agree that efforts should be taken to support older browsers, who elected you to decide what his job is? Do you still support NCSA Mosaic 1.0?

    Mosaic 1.0 is only 15 years old, IE 5 is 9 years old, IE 5.5 is 8 years old. When is something so outdated that you don’t support it anymore? Or is the web developer supposed to shoulder all the burden when the browser vendor doesn’t provide proper patches for their browser or the consumer runs a buggy browser?

  13. Combine Your Files … why on heart no one talk about packed.it? It could combine in ONE file both CSS and JavaScript, merging them in a way that Rhino could not do!

    At the same time, packed.it parses conditional comments as well, there’s JavaScript inside, but of course a JavaScript that Rhino could not understand.

    Did I forget something? Yes, packed.it uses deflate before gzip, about 3% bigger but probably much more safer with old IE too?

    Anyway, good stuff, and Kind Regards

  14. I would also recommend this online free tool: http://Site-Perf.com/

    It measure loading speed of page and it’s requisites (images/js/css) like browsers do and shows nice detailed chart – so you can easily spot bottlenecks.

    Also very useful thing is that this tool is able to verify network quality of your server (packet loss level and ping delays).

  15. This is a rather helpful post. My only concern with compression is debugging. I now decompress and recompress source code to debug it which makes it a living hell for the programmer. Another issue is backwards compatibility. I’ve seen strange behavior of a script of mine when i tested it with IE6 and IE7. IE6 acted weird (as usual?!)

    Thanks.

  16. @JustAnotherVictim: When you debug using gzip compression, you don’t have to worry about keeping two versions around (since it’s automatically decompressed by the browser). But if you do “Crunching” (removing whitespace/renaming variables) then you’re right, that can be a big problem. Backwards compatibility can be really annoying with older browsers — sometimes you have to do tests for them and not serve them compressed content.

  17. @sklepy: Yes, I agree. For that reason I don’t really recommend minifying source code unless absolutely necessary — most of the gains will come from regular gzip compression of the output.

  18. Note that the “Forced IE7 upgrade” is not that forced, and is not for all IE6 users:

    1. IE6 users on Windows 2000 cannot upgrade to IE7 and will not be forced to do so.

    2. IE6 users on Windows XP/2003 will be forced to install IE7 only if they blindly allow the MS update mechanism to install any junk chosen by Microsoft, no questions asked. So IE6 on XP is still possible, although there might be no more security updates offered for it.

    3. Windows Vista users never had the option to run IE versions older than IE7.

    P.S.
    The safe way to handle gzip or deflate compression (the latter should be better than gzip but has more browser bugs) is to let the web server do this based on the headers in the request (“Accept-Encoding” and “User-Agent”). As someone mentioned, the Apache web server standard example configuration does this out of the box. For IIS and other web servers, configuration scripts may be available somewhere.

  19. I really like this post. Thanks for this article, Anyone got any more info about it? I am now your blog’ s rss follower. you are now in my bookmarks.

  20. Thanks alot – another great post- Your gzip html files post was also very helpful – I took your advise and changed the extension to .php and added the coding – it worked great. Thanks

  21. There are two errors:

    in the myscript.js.php bit it says:
    header("Content-type: text/javascript; charset: UTF-8");

    it should be:
    header("Content-Type: application/javascript; charset=UTF-8");

    In the delay your JavaScript bit it says:
    setTimeout(“loadExtraFiles();”, delay * 1000);

    it should be:
    setTimeout(loadExtraFiles, delay * 1000);

    or:
    setTimeout(function(){ loadExtraFiles(); }, delay * 1000);

    otherwise the code gets `eval()`d which is stupid.

  22. Great article but the link to Rhino goes to something called Dojo, and I can’t see any js compressor on their site.

  23. You could certainly see your enthusiasm in the work
    you write. The arena hopes for even more passionate writers such as you who aren’t afraid to mention how they believe.
    At all times go after your heart.

  24. I didn’t read all the way through the comments, but I’d like to point something out:
    var delay = 5;
    setTimeout(“loadExtraFiles();”, delay * 1000);

    Passing a string to setTimeout makes it act like “eval”: it wakes up the JavaScript interpreter, searches up on the prototype chain for the function name. When it finds, calls it, just like calling a function indeed.
    But, however, as functions are objects in JavaScript, you can (and should) pass a function to the setTimeout, like this:

    setTimeout(loadExtraFiles, delay * 1000);

    Or, if it feels to clumsy (or you need to do extra stuff that aren’t stated in your loadExtraFiles function), you can pass an anonymous function that executes your function:

    setTimeout(function () {
    loadExtraFiles();
    }, delay * 1000);
