We have about 30 external JavaScripts on our page. Each is already minified.

To reduce HTTP requests and page load time, we are considering combining them into a single file. This was recommended by the YSlow tool.

Is this wise, or is it better to combine them into, say, two files with 15 scripts each?

Is there an optimal size for the combined JavaScript files?

1 Answer

The fewer the HTTP requests, the better. If you want your page to work on mobile devices as well, keep the total size of each script node under 1 MB (see http://www.yuiblog.com/blog/2010/07/12/mobile-browser-cache-limits-revisited/).

You might also want to check whether any of your scripts can be deferred until after onload fires. You could then make two combined files: one loaded in the page, and the other loaded after page load.
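For example, the split might look something like this (the bundle names are placeholders, and lazy_load() is the helper shown further down):

<!-- Bundle 1: everything needed at or before onload, loaded normally -->
<script type="text/javascript" src="combined-main.js"></script>

<script type="text/javascript">
    // Bundle 2 (hypothetical name): not fetched here; the lazy_load()
    // function below picks this URL up after onload fires.
    var script_url = "combined-deferred.js";
</script>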

The main reason we ask people to reduce HTTP requests is because you pay the price of latency on each request. This is a problem if those requests are run sequentially. If you can make multiple requests in parallel, then this is a much better use of your bandwidth[*], and you pay the price of latency only once. Loading scripts asynchronously is a good way to do this.

To load a script after page load, do something like this:

// This function should be attached to your onload handler.
// It assumes a global variable named script_url exists. You could
// easily extend it to use an array of scripts or discover them some
// other way (see the note below).
function lazy_load() {
    setTimeout(function() {
        // Appending the node to the document starts the download.
        var s = document.createElement("script");
        s.src = script_url;
        document.body.appendChild(s);
    }, 50);
}

This is called from onload and sets a timeout for 50ms later, at which point a new script node is appended to the document's body and the script starts downloading. Since JavaScript is single-threaded, the timeout will only fire after onload has completed, even if onload takes more than 50ms to run.
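To attach it, you could do something like this (using addEventListener, or attachEvent on older IE, so you don't overwrite an existing onload handler):

if (window.addEventListener) {
    window.addEventListener("load", lazy_load, false);
} else if (window.attachEvent) {
    window.attachEvent("onload", lazy_load);
}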

Now, instead of having a global variable named script_url, you could put script nodes at the top of your document with an unrecognised content type, like this:

<script type="text/x-javascript-deferred" src="..."></script>

Then, in your function, you just need to find all script nodes with this content type and load their src URLs.
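A sketch of that extension might look something like this, reusing the text/x-javascript-deferred type from above:

// Attach this to onload instead of the single-URL lazy_load() above.
function lazy_load_deferred() {
    setTimeout(function() {
        var nodes = document.getElementsByTagName("script");
        for (var i = 0; i < nodes.length; i++) {
            if (nodes[i].type === "text/x-javascript-deferred") {
                // Re-add each deferred script as a real script node so
                // the browser downloads and executes it now.
                var s = document.createElement("script");
                s.src = nodes[i].src;
                document.body.appendChild(s);
            }
        }
    }, 50);
}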

Note that some browsers also support a defer attribute for script nodes that will do all this automatically.
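For example (the file name is just a placeholder):

<script type="text/javascript" defer src="combined-deferred.js"></script>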

[*] Due to TCP window size limits, you won't actually use all the bandwidth that you have available on a single download. Multiple parallel downloads can make better use of your bandwidth.

