
I have a 700 KB (uncompressed) JavaScript file that is loaded on every page. I used to have 12 JavaScript files on each page, but to reduce HTTP requests I concatenated them all into one file.

The file is ~130 KB gzipped and is served with gzip compression. However, on the client it still has to be decompressed and parsed on every page. Is this a performance issue?

I've profiled the JavaScript with Firebug's profiler but did not see any issues. The problem (or illusion) I am facing is that the file bundles jQuery libraries that are not always used on the current page.

For example, jQuery DataTables is 200 KB minified and is only used on 2 of my site's pages. jqPlot is another 200 KB.

That leaves 400 KB of excess code that isn't executed on 80% of my pages.

Should I leave everything in one file?

Or should I take out those jQuery libraries and load only the JS relevant to the current page?

3 Answers


If your framework/CMS/whatever has the appropriate functions, you can include the scripts conditionally, as @Michael suggests below, but without the additional library.

Taking your DataTables case, for example, in WordPress you might handle the situation with something like:

<?php // For reference; adapt this to your theme's template files. ?>
<?php if ( is_page( 'whatever' ) ) : ?>
    <script src="/path/to/datatables.js"></script>
    <script>
        // Your DataTables setup here
    </script>
<?php endif; ?>

There's nothing wrong with RequireJS; you just need to evaluate whether the additional complexity it adds (plus learning to use it in the first place) is worth it compared to what more readily available tools can do for you. If you only have the two cases you mentioned above, the conditional include might be the better option. If you've got a lot more going on, then RequireJS might be the better approach overall.

Su'

You can use RequireJS to dynamically load the libraries only on the pages that need them. Then you only have to load RequireJS itself (which is about 14 KB) on every page, saving roughly 385 KB on the pages that don't need those libraries.

Integration is also very easy: just wrap the code you have in a require() call:

require(["jquery", "jquery.alpha", "jquery.beta"], function($) {
    //the jquery.alpha.js and jquery.beta.js plugins have been loaded.
    $(function() {
        $('body').alpha().beta();
    });
})
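
For completeness, each page then only needs a single script tag that loads RequireJS and points it at an entry module via the data-main attribute (the paths here are placeholders):

<script data-main="js/main" src="js/require.js"></script>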
Michael

~700 KB of JavaScript is a performance issue even when compressed. The rules to follow when optimizing the code are:

  1. Minify your JavaScript: compressing and decompressing alone doesn't reduce the amount of code. First of all, run your code through a good JS minifier. Since you have 12 files, minify each file separately before combining them for best performance.

  2. Load JavaScript asynchronously: asynchronous loading gives a very fast load and render of the page. The user impact is strong because good asynchronous loading won't block the rendering process, so the perceived page load time drops considerably; images and other displayed items render as if no JavaScript were being loaded. (A minimal sketch follows after this list.)

  3. Use the Google CDN for jQuery: I think you are loading jQuery from your own website, which is an added disadvantage; load it from the (free) Google CDN instead. Since it is used by roughly every third website, it is often already in the client's cache. (See the fallback snippet after this list.)

  4. Set long Expires headers: give your JS files far-future Expires headers so they aren't downloaded again on every visit, which eliminates the request on repeat views. In my experience, slow load times on second-page views cause more exits than on first-time visits.

  5. Check with PageSpeed: other resources also affect the loading speed of the page, so cross-check and optimize those as well. Every resource you trim frees up a little more time for your JS to load.
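
As a minimal sketch of point 2, you can inject the script tag dynamically so it never blocks rendering; the file name below is a placeholder:

// Inject the script dynamically; the browser fetches it without
// blocking rendering or other downloads. '/js/app.min.js' is a placeholder.
var script = document.createElement('script');
script.src = '/js/app.min.js';
script.async = true;
script.onload = function () {
    // Safe to use code from app.min.js from here on.
};
document.head.appendChild(script);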
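
For point 3, the usual pattern is to load jQuery from the Google CDN with a local fallback in case the CDN is unreachable (the version and fallback path are assumptions):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
<script>
    // If the CDN request failed, fall back to a local copy.
    window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>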

Vineet1982