Zoompf's Web Performance Blog

Should You Use JavaScript Library CDNs?

 Billy Hoffman on January 15, 2010. Category: Uncategorized

The concept is simple. Hundreds of thousands of websites use JavaScript libraries like jQuery or Prototype. Different websites you visit each download another identical copy of these libraries. You probably have a few dozen copies of jQuery in your browser’s cache right now. That’s silly. We should fix that.

How? Well, if there were a 3rd party repository of common JavaScript libraries, websites could simply load their JavaScript files from it. Now imagine the repository implemented caching. SiteA, SiteB, and SiteC all have <SCRIPT SRC> tags that reference http://some-code-respo.com/javascript/jquery.js. When someone visits any one of these sites, jQuery is downloaded and cached. If that same person visits one of the other sites, the browser will not have to download jQuery again. The idea is that sites will load faster because these libraries should rarely have to be re-downloaded. Of course, this only works if a lot of sites use the common repository. If only a few do, then virtually no one benefits, because the library will rarely have been downloaded and cached by a previous website and must be downloaded again.

This is an example of the Network Effect: the more people that use a system, the more valuable the system becomes.

Implementations of this idea of a central shared repository of common JavaScript libraries are called several different things. Google calls their implementation Google AJAX Library API. Yahoo doesn’t have a clear name for their implementation. I’ve seen “Free YUI hosting” or “YUI Dependencies”, or even Yahoo YUI CDN. Microsoft calls their implementation the Microsoft AJAX CDN. To keep things simple, I will collectively refer to these repositories of common JavaScript libraries as JavaScript Library CDNs.

JavaScript Library CDNs seem like a performance no-brainer. Use the service, and your site loads faster and consumes less bandwidth. This post will explore whether, and under what conditions, a JavaScript Library CDN actually improves web performance.

The Choice

Consider this situation. You are a speed-conscious web developer. You have a website that uses jQuery 1.3.2 as well as some additional site-specific JavaScript. Because you value web performance, you know you should concatenate all your JavaScript files into as few files as possible, minify them, and serve them using gzip compression. You have 2 choices:

  1. Serve all your JavaScript locally. You will have a single <SCRIPT SRC> tag that points to a JavaScript file containing jQuery 1.3.2 and your site specific JavaScript.
  2. Serve some of the JavaScript using a JavaScript Library CDN. You will have 2 <SCRIPT SRC> tags. The first tag will point to a single file on your website containing your site-specific JavaScript. The second tag will point to the copy of jQuery 1.3.2 on Google's AJAX Library API.
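In markup, the two options look something like this (the local filenames are hypothetical; the Google URL is the standard path for jQuery 1.3.2 on the AJAX Libraries API):

```html
<!-- Option 1: one concatenated, minified, gzipped file served locally -->
<script src="/js/site-plus-jquery-1.3.2.min.js"></script>

<!-- Option 2: site-specific code served locally, jQuery from Google's CDN -->
<script src="/js/site.min.js"></script>
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
```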

What's the difference? Well, a minified, gzipped copy of jQuery 1.3.2 is 19,763 bytes in length. If you choose option 1, all your users will have to download these 19,763 bytes regardless of what other sites they may have already visited. That's the cost: downloading 19,763 bytes. Notice there is no cost of an additional HTTP request and response or other overhead, because those bytes of jQuery content are included inside the response for the site-specific JavaScript content, which the visitor already has to request. This is important, so I will repeat: the cost of not using a JavaScript Library CDN is only the downloading of JavaScript content, not any additional HTTP requests or overhead.

In the second option, you are going to gamble with a JavaScript Library CDN. You are hoping a visitor has already browsed another website which also uses Google to serve jQuery 1.3.2. If you are right, then that visitor does not need to download 19,763 bytes. If you are wrong, the visitor needs to download 19,763 bytes from Google. That's the gamble in a nutshell. And downloading 19,763 bytes doesn't sound bad! Who cares where it comes from?

The Price of Missing

Unfortunately, an HTTP request to Google's JavaScript Library CDN is more expensive than an HTTP request to your own website! This is because a visitor's browser has to perform a DNS lookup for ajax.googleapis.com and establish a new TCP connection with Google's servers. If the additional request were to your site instead, the visitor's browser would not need to make another DNS lookup, and the HTTP request could be sent over an existing connection.

Unfortunately, this is a stubborn process. DNS lookups and establishing TCP connections involve a small number of very small packets, so having a faster Internet connection will not significantly speed up these operations. Two different runs on WebPageTest showed that it takes 1/3 of a second for a web browser to make a connection to Google's JavaScript Library CDN and start downloading. (And remember, these are CDNs, so where I make the request from should not matter: the CDN makes sure I'm downloading the content from a web server that is geographically near me.)

Let me repeat that: using Google's JavaScript Library CDN comes with a 1/3 of a second tax on missing. (Note that a tax like this applies to opening connections to any new host: JavaScript Library CDNs, advertisers, analytics and visitor tracking, etc. This is why you should try to reduce the number of different hostnames you serve content from.) Even if this number is smaller for other users, say, 100 milliseconds, it is still a tax paid for using a JavaScript Library CDN and missing.

It gets worse because downloading a file over a new TCP connection with Google is slower than downloading a file over an existing TCP connection with your website! This is due to TCP’s slow start and congestion control. Newly created connections transmit data slower than existing connections do. (This is why persistent connections are so important!)

The Odds of Winning

Since JavaScript Library CDNs rely on the Network Effect, they are only valuable if a large number of websites use them. After all, the only way your visitors can "win" the JavaScript Library CDN gamble is if they have already been to a site that uses the same CDN. So, how many sites actually use Google's?

Well, according to the great folks at BuiltWith, only 13% of all websites use some kind of 3rd party CDN. Of those websites using a CDN, 25.56% of them are using Google’s Ajax Library API. So only 3.89% of all websites surveyed are using Google’s AJAX Library API.

I wanted to gather more data than BuiltWith. I also didn't like the way they grouped traditional CDNs (like Akamai) with JavaScript Library CDNs (like Google) and private site-specific CDNs (like Turner's CDN). So I performed my own survey. I visited the top 2000 sites on Alexa and analyzed each one to see who is using Google's JavaScript Library CDN. The result? Only 69 sites out of 2000, or 3.45%, are using Google's JavaScript Library CDN. My data is in line with BuiltWith's, which is good.

Unfortunately, you do not vaguely or abstractly "use a JavaScript Library CDN." You reference a specific URL for a specific JavaScript library and version number. You only get a benefit from the CDN if you reference the specific URL that other websites are referencing. So we have to dig deeper and see what versions of what JavaScript libraries are in use. Below is a table of the JavaScript libraries that Alexa Top 2000 sites serve from Google's AJAX Library API.

JavaScript Library    Number of Alexa Top 2000 sites serving
                      the library from Google's CDN
jQuery                48
jQuery UI             4
Prototype             6
SWFObject             6

We see that 48 sites are using Google's JavaScript Library CDN to serve jQuery, and of those, 36 are using jQuery 1.3.2. That means jQuery 1.3.2 is used by 1.8% of the Alexa 2000 websites. SWFObject and Prototype came in next at 6 sites each, or less than 0.334% of the sites. When you factor in version numbers, their penetration drops to around 0.10%.

So what is the best case here? What are the odds that someone will have jQuery 1.3.2, served from Google's JavaScript Library CDN, sitting in their browser cache? If I have a clear browser cache, and I visit 35 randomly selected websites from the Alexa top 2000, and then I visit your site, there is only a 47% chance that I will have a cached copy of jQuery 1.3.2 ready for you to use. You calculate this by first determining the probability that none of the 35 randomly picked websites serve jQuery 1.3.2, and then subtracting that from 1. The formula is: 1 – ((1 – 0.018) ^ 35).
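The cache-hit calculation above can be sketched in a few lines of JavaScript (the 1.8% figure is the survey result from above; the function name is my own):

```javascript
// Odds that at least one of N randomly visited sites has already primed
// the browser cache with jQuery 1.3.2 from Google's CDN.
// p = fraction of sites serving it (0.018 from the Alexa 2000 survey).
function cacheHitOdds(p, sitesVisited) {
  return 1 - Math.pow(1 - p, sitesVisited);
}

console.log((cacheHitOdds(0.018, 35) * 100).toFixed(0) + '%'); // prints 47%
```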

Those are not very good odds. And they only apply if you are using jQuery 1.3.2; any other library or version has far too little penetration to matter. You should also consider the makeup of the sites on the list. I have probably ever visited only 30 or so of the websites in the Alexa top 2000 list, and I visit only 5-10 with any regularity. We have determined that the odds of "winning" the CDN gamble are fairly small. How small will depend on your site content and your visitors. However, I think it is safe to say that, as of January 2010, the majority of your users will not have visited a site that uses a JavaScript Library CDN for the JavaScript library that you use.

Getting More Data

So maybe the odds aren’t good. But is it still worth it to potentially help some people?

Let's go back to our hypothetical situation where we are deciding whether to use a JavaScript Library CDN. Consider someone with a 768 kilobit per second Internet connection: 768 * 1024 = 786,432 bits downloaded per second. Let's say it operates at only 80% efficiency to account for overhead like IP, TCP, congestion, and packet loss. That's 629,145 bits per second, which gives us 78,643 bytes per second, or 26,214 bytes in 1/3 of a second. A minified and gzipped copy of jQuery 1.3.2 is 19,763 bytes long. This means anyone using a 768 kbps Internet connection can download the contents of jQuery 1.3.2 in 1/3 of a second. In other words, downloading jQuery 1.3.2 on that connection takes the same amount of time as simply connecting to Google's JavaScript Library CDN.
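The back-of-the-envelope bandwidth arithmetic, as a runnable sketch (all the numbers are the article's own; variable names are mine):

```javascript
// How many bytes can a 768 kbps connection download in the 1/3 second
// it takes just to open a connection to the CDN?
const linkBits = 768 * 1024;            // 768 kbps link: 786,432 bits/sec
const usableBits = linkBits * 0.8;      // 80% efficiency: 629,145.6 bits/sec
const bytesPerSecond = usableBits / 8;  // ~78,643 bytes/sec
const bytesPerThirdSecond = bytesPerSecond / 3; // ~26,214 bytes
const jqueryGzipped = 19763;            // minified + gzipped jQuery 1.3.2

// jQuery fits within the 1/3 second the CDN connection alone costs.
console.log(jqueryGzipped <= bytesPerThirdSecond); // true
```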

This simplifies the decision in our hypothetical situation on where to host jQuery. In the locally hosted option, we are asking our visitors to download some amount of content X. X is all our HTML, images, and site-specific JavaScript, and includes the 19,763 bytes of jQuery 1.3.2. In the "use a CDN" option, we still have X amount of content. The only difference is that the CDN serves the 19,763 bytes of jQuery while our site serves X – 19,763 bytes of content. If a visitor does not have a cached copy of the JavaScript library, they still download a total of X bytes of content; it is simply served partly from our website and partly from Google. Under these conditions we are led to the following points:

  1. If you use a CDN and the visitor does not have a cached copy, they download the site 1/3 of a second slower than if they had downloaded all the content from your web server.
  2. If you use a CDN and the visitor does have a cached copy, they download all of the content 1/3 of a second faster than if they had downloaded all the content from your web server.

Or, more simply: if we use Google's JavaScript Library CDN, we are asking the majority of our website visitors (who don't have jQuery already cached) to take a 1/3 of a second penalty (the time to connect to Google's CDN) to potentially save a minority of our website visitors (those who do have a cached copy of jQuery) 1/3 of a second (the time to download jQuery 1.3.2 over a 768 kbps connection).

That does not make sense. It makes even less sense as the download speed of your visitors increases. Trying to avoid serving 20 or 30 kilobytes of content at the cost of connecting to a 3rd party just doesn't make sense.


JavaScript Library CDNs depend on the Network Effect, and our survey of the Alexa 2000 shows that right now there are too few sites in the network to provide any value. Only Google's AJAX Library API has anywhere near the penetration to provide any benefit, and only if you are using a specific version of a single JavaScript library. Even in that remote case, serving jQuery 1.3.2 from Google will slow down the majority of your users for the benefit of a possibly nonexistent minority. Zoompf recommends that the vast majority of websites avoid using JavaScript Library CDNs until they gain more market penetration.

I will discuss the very select group of sites that should use CDNs, as well as some other interesting data discovered while surveying the Alexa 2000 in posts early next week.

Want to see what performance problems you have? Using JavaScript Library CDNs appropriately is just one of the 200+ performance issues Zoompf detects while assessing your web applications for performance. You can sign up for a free mini web performance assessment at Zoompf.com today!


    January 15, 2010 at 2:10 pm

    Overall, a well written and thoughtful article. There are several salient points which are important for people to take into account. But I disagree with the overall conclusions. I want to point out a few things in contrast, because they are important details to factor in which seem to have been left out (or at least not disclosed/addressed):

    1. “You only get a benefit from the CDN if you referencing the specific URL that other websites are referencing.”

    I think this is misleading, because the premise you had was that a disproportionately large penalty is paid in the DNS lookup itself. It's true that you don't get the cache effect if the URL is different, but you DO still get other benefits.

    You do *not*, as a user, have to have downloaded SPECIFICALLY jQuery 1.3.2 (recently) to get the DNS lookup cached. You just have to have visited any site that uses any of the Google Ajax CDN libraries. This is a larger spread than just a specific version of jQuery. There's a larger chance that someone has the DNS cached for the CDN in question, whichever one it is, as long as that CDN serves more than one file. Which they all do.

    2. Serving up jQuery yourself, inlined in the same concat'd file as your own JS content, means you serve a single larger file. This has the following effects:
    * the single larger file will likely load slower (to varying extents, of course) than jQuery as a separate file, because most modern browsers (and even older browsers, if the page uses a smart script loader like LABjs!) will download two files in parallel.
    * the user's cache of jQuery is now irrevocably tied to the rest of your cached code. This means that if you ever tweak even one single byte of your other code (which happens more frequently than most care to admit), all users will have to re-download the entire 19,763 bytes of jQuery unnecessarily. That version of jQuery certainly didn't change, but the browser and server aren't smart enough to re-transmit only the deltas.

    3. Even if you then split out jQuery into its own file on your server, you still lose a theoretical speed-up that most browsers have, which is that they will make more concurrent (parallel) requests if the resources are from different domains. If, all things considered, you have 2 JS files on your domain, versus one file on your domain and one on another domain like the Google CDN, there's a better chance the browser will be able to optimize and parallel-load those files than if they come from the same domain.

    4. Serving content from another domain, like the Google CDN, increases the likelihood that the request does not have unnecessary cookies weighing it down. This is why one of the YSlow/PageSpeed suggestions is to use cookie-less domains. Even using a sub-domain of your main site domain doesn't necessarily protect you from these wasteful cookies being sent with every request, because many cookies (like session cookies and the cookies set by most analytics packages) are set "globally", meaning they cover the domain and *ALL SUBDOMAINS*. The only answer is loading content from another domain entirely. A CDN is one such domain which can accomplish that. Side note: I have a new service for also helping deal with this problem: http://2static.it


    Overall, I think this issue is much more a complicated and artful balancing act than hard-and-fast logic. Sure, there are flaws and drawbacks. But it's a healthy direction to be moving in. Moreover, I think there are ways to mitigate the downsides to CDN usage that you present here. I'm working on just such a project, and I hope it'll be out in the next few months. Maybe soon the arguments against CDN usage will be mooted.

    I think working toward improving CDN behavior and performance, and reducing its drawbacks, is a far more valuable expenditure of time than scaring everyone away with conclusions which aren't entirely big picture.

    January 15, 2010 at 2:47 pm

    There's another important factor to take into consideration with Google's CDN: it serves files with a 1 hour Expires! That reduces the probability of a cache hit significantly, making it almost worthless. It appears Google is more interested in gathering stats than providing a useful CDN service.

    April 16, 2011 at 1:57 pm

    Only if you say you want, say, the latest version in the 1.7.x series. (As it must: a new version might be released.) If you say you want a specific version you get a max-cache of a year.

    January 15, 2010 at 3:05 pm

    Kyle, you really need to stop writing such well thought out and well written comments! It's getting annoying ;-) Actually, they aren't annoying; this is a good discussion. Please let me know if I mischaracterize any of your points as I summarize them.

    A few counter points:

    1) DNS will be cached for *any* request to Google AJAX Library API

    Yes, you are correct. This was an oversight. However, only 4% of the top 2000 sites use any form of Google's AJAX Library API. That's not much better than the 1.8% for jQuery 1.3.2. There still isn't a good chance of someone already having the DNS for ajax.googleapis.com resolved.

    On top of that, Google's DNS server tells your OS/browser to cache the DNS record for only 1 hour! Ouch! You can verify this by clearing your DNS cache and using Wireshark/Ethereal like I just did. Closing the browser, hibernation, reboots, and other things can also clear the DNS cache before the 1 hour limit.

    So yes, you are correct that the DNS lookup aspect can, at times, not happen. However the connection establishment still occurs. And the low caching TTL value doesn’t help matters…

    2) Modern browsers can parallelize script downloads / Small changes mess you up / Browsers don't send deltas

    Well, first of all, combining jQuery into your other JavaScript is actually smaller than jQuery and the other JS served separately! More content means more redundancy, so gzip compresses the JS+jQuery better (by 0.5-1% for me), depending on the other JS. But that's not a big deal.

    Not entirely sure what you mean when you talk about LABjs. Is it that modern browsers can download multiple scripts from the same domain in parallel?

    You are also correct that change control can be a problem. You can't cache content with far-future headers and then make changes every day. In fact, I submitted a talk to Velocity about the processes you need in place to properly do web performance. Having a regular, repeatable way to push changes, and doing it on a schedule instead of haphazardly, is one of them. I sure do wish browsers had support for delta encoding. Silly RFCs.

    3) Browsers parallelize downloads across multiple domains. Multiple domains aren't bad.

    Sure, browsers can parallelize downloads across multiple domains. That's often a big performance boost. Everyone recommends that method to load images and other static content faster. But we don't use 40 hostnames, do we? No, we try not to use more than 2-4. Why? Because there is a cost to talking to a new domain. That cost is DNS, the TCP 3-way handshake, and slow-start. The entire point of this post was to try to figure out whether that delay is worth it to download only a single resource from the new domain. At what size is it not worth the overhead? 500 bytes? 10K? 20K?

    4) There might be cookies if you don't use Google's CDN

    This is pretty weak. You are the only person who controls what cookies are set and sent on your domains. Trust me, while I worked in the web security industry, cross-domain cookie tampering was the holy grail! We got it a few times. Oh, 2-dot rule and ccTLDs, how I love you. However, to suggest that you should use Google's CDN so you can be sloppy is just silly.

    5) You can mitigate these problems. I’m working on a solution. Stop scaring people

    I don't think it's clear that you can mitigate the problem.

    FACT: If you pull content from a 3rd party, there is a cost.
    FACT: DNS caching only undoes some of that cost.
    FACT: Right now not many people use these CDNs, so one advantage (local cache hits) doesn't happen a lot.

    Are any of these facts false?

    I hope you do make something to solve this issue. I hope the browser manufacturers solve this, or implement delta encoding, or Google increases its DNS TTL value. However, right now that's not the case. Hopefully things will change.

    There are two raw advantages of CDN use I didn't touch on. The first is that it's a free, geographically-close CDN. That could help if you are in out-of-the-way places and not currently using another CDN product for your static content. The second is the bandwidth savings of not serving jQuery yourself.

    Also, I'm a little offended by the "scaring people" comment. I didn't say Google's CDN kills children. All I said is that I don't believe, and we don't recommend to our clients, that they should use JS CDNs, except in certain circumstances (no other CDN, and bandwidth savings, are 2 of them). Let's try to keep this civil and classy, ok?

    Thanks for the great feedback, and take care,

    January 15, 2010 at 3:46 pm

    1. Google sucks for making their TTL 1 hr. Plain and simple. I knew it was low, but didn’t know it was that low. We should all publicly shame them for being idiots on this point.

    2. What I meant with the LABjs comment was that modern browsers will "look-ahead" in your markup and parallel-download scripts they find in <script> tags, but ensure they execute in proper order. Of course, this leaves out FF3, IE6/7, Safari 3, etc., which is still a non-trivial segment of the general viewership. So LABjs normalizes the behavior, making parallel script downloading (while still ensuring script execution order) possible for practically *all* browsers, not just the modern ones.

    3. But in any case, I agree that you shouldn’t just arbitrarily use extra DNS lookups carelessly just to get more parallelization. It’s an unfortunate paradox of these performance rules that if you score really well with one tactic (reducing dns lookups), you hurt yourself in another area (parallel loading). Point is, I believe you have to balance. And simply saying “well, I always and only load from one domain” is a little too far from the side of “what about parallel loading”. I just think there should be a balance in the middle. And I was pointing out that a CDN lookup is one such way to do that.

    4. The problem with cookies (which I try to explain more clearly in the 2static.it About page) is that you DON'T really always control cookie setting. If you're only talking about cookies set by server-side processes, yeah, you control those more often. What I found, to my shock, was that something as simple as the long-held practice of using Google Analytics JavaScript to track my sites with multiple domains was setting a "global cookie" (one that covered ".mydomain.tld") that I didn't know about, which then was getting forwarded with ALL my static asset requests (JS, CSS, images), even though I had those on a "static.mydomain.tld" sub-domain that I foolishly assumed was safe and shielded from such cookies.

    I’ve done some snooping around sites and found it to be pretty common for analytics packages to set such global cookies (whether you ask them to or not) because it makes it easier for them to track requests from the “mydomain.tld” and “www.mydomain.tld” forms of your page.

    I also found that for a similar reason (the www/non-www versions of sites), it’s a more common practice for session cookies to get set in the “global” fashion.

    Bottom line, this results in lots of static asset requests carrying unnecessary cookies, even on sites that (like mine) foolishly thought they were stripping them by using a sub-domain. Using an entirely separate domain (like that of a CDN) is one way of mitigating the problem. The exact same request for jquery.js on your domain is likely to forward cookies (and thus be bigger and slower), while a CDN request likely would not have such cookies (unless the CDN provider is misbehaving, in which case, again, they should be shamed publicly).

    I'm not saying this is, in and of itself, justification for CDN usage. But it *is* part of the puzzle and does contribute to the various pros and cons as you weigh this tricky decision.

    Sorry if you think it’s weak. It was my point #4 in my list. I didn’t say it was earth shattering. But it is something that gets less attention than I think it should. That’s why I created 2static.it recently.

    5. All 3 facts are true. But all 3 facts are solved by the solution I’m building. I just don’t have it launched yet. But I’d be happy to discuss it privately in the mean time. :)


    Lastly, let me apologize if I offended with the “scaring” comment. Really, what I’m getting at is, I don’t like blog posts that take a tricky subject, present several (but not all) sides of the issue, and then boil it down to one conclusion. Because then the uninformed masses will take up the banner of that conclusion, not think any further about it, and go tell their boss “hey, we need to stop using the CDN”. I’ve seen that kind of foolishness happen far too many times to count or mention.

    I would rather we spend our time and effort, and choose our words, in such a way to EDUCATE people on the decisions *they need to be making*, rather than making the decisions for them with statements like “In conclusion…” Especially because my opinion is, this is far from a settled issue.

    Again, you have plenty of very valid points. But they are input into a much more complicated equation, one which will not any time soon be boiled down to 1+1 = 2. Keep up the good work in helping spread the word of better web performance. :)


    January 16, 2010 at 6:49 am

    Clearly, browsers should ship with predefined libraries and update them on a regular basis, so devs could take advantage of that. E.g.:

    [script src=”CDN/jquery-14.js” rel=”jquery;1.4″]

    The browser could look at the rel attribute and load the library from the chrome. Yeah, I know it's not standards-based behavior and I'm getting off topic, but wouldn't it be nice? ;-D

    January 16, 2010 at 6:57 am

    […] Should you use JavaScript CDNs is a short review of it really being worth it – it is missing the amount of YUI delivered from yahooapis.com though. […]

    January 16, 2010 at 10:38 am

    I develop different sites for different clients on different hosting services, and that is why I tend to use a CDN for JS libs. Some providers just suck at server configuration, and the delivered js/css/html is _not_ gzipped. So it's ~68kb vs. ~23kb, and depending on the connection the user has (and the number of mobile internet users with slower connections is increasing), I think it is worth using CDNs in some cases.

    Though, when on a good server and using a hand-selected mootools-core, I don't use CDNs.

    January 16, 2010 at 8:34 pm

    The Google CDN has a far-future Expires of 1 year on the files they host, so the low TTL on the domain should have less of an effect.

    Personally, I don't see much benefit in using the Google CDN for jQuery, as it is a single small file. If you add far-future Expires headers and gzip compression yourself, you get the same effect and retain total control and security.

    This is different with the YUI framework and the Yahoo CDN. The combine feature of yahooapis.com, which returns multiple js files as one, is a real time saver for page load times, especially if you are using a few components of YUI2. Plus they still gzip the file and use far-future headers.

    January 18, 2010 at 5:07 am

    I think the whole idea has been oversold somewhat; in my experience, performance has always suffered using JavaScript CDNs because script loading blocks the rest of the page.

    The added pause for the DNS lookup really exaggerates the blocking effect.

    Then there is the issue of the third-party site being down while your server is working fine. Rare when you're using Google's servers, but it can happen.


    January 18, 2010 at 10:15 am

    “Zoompf recommends the vast majority of websites avoid using JavaScript Library CDNs until they gain more market penetration.”

    In order to gain more market penetration shouldn’t Zoompf be encouraging more folks to use JavaScript Library CDNs?

    Chicken or Egg?

    January 18, 2010 at 10:26 am

    Wow. I’m so glad someone is actually writing about this.

    @Pete brings up a very interesting point: What if the script doesn’t load?

    While I think the chances of Google or Yahoo!’s CDN dropping requests are slim, it’s important to consider other causes for external scripts not loading.

    While developing an intranet application for an old client a few years ago, we noticed that JavaScript wasn't loading when we pushed our code to the production server; however, it worked fine in Dev. Yes, our Dev server mirrored our Prod server exactly…

    The problem was actually that the users of the application didn't have internet access, and as a result the external assets on the CDN wouldn't load. While this is an edge case, it alone made me reconsider the tangible value of CDNs for JavaScript libraries.

    January 18, 2010 at 10:31 am

    On second thought, remember that time Google did go down? If I recall correctly, there was some DNS issue to blame.

    Regardless, any site which had AdSense on it essentially stopped working (because they require specific positioning of your <script> tags, since they still use ol' rusty document.write). And yes, that is damn near every site on the net.


    January 18, 2010 at 10:56 am

    I love all the discussion around this! Lots of stuff that I want to comment on or write new blog posts about!

    @Macie: Browsers shipping with some libraries is an interesting idea. There are some security implications here that need to be addressed (imagine evil.com getting the browser to use their version of jQuery for all other sites that want to use jQuery…). Will write more about this soon.

    @Steffen: Remember you shouldn’t try to hack around Hosting Providers that Suck! (http://bit.ly/334kPQ) That being said, sometimes you have to and JavaScript CDNs provide you a way to use a host for some of your files that understands performance.

    @adam: That’s the crux of what this post is about: under what conditions does using a 3rd party to serve content stop making sense? When does the overhead of connecting to and downloading from the 3rd party outweigh the benefits? I believe for small files, say <20K, it might not make sense. This has some interesting implications for analytics packages, advertising networks, etc.

    @Matt Pramschufer: I agree it’s a chicken-and-egg problem and we are stuck in it too. I want JavaScript CDNs to catch on. However, I don’t want to recommend that our clients do something that will negatively affect performance now in the hope that it might one day improve performance.

    @Mike and Pete: Good points. Whenever you include content from a 3rd party, especially script content, you are taking a HUGE risk. Crockford has a great quote: “A Mashup is a self-inflicted Cross-Site Scripting Attack!” You bring up the other side, which is that you are trusting their availability as well. While Google and Yahoo might not have a problem, what about other people, like advertising networks?

    January 18, 2010 at 1:54 pm

    On the issue of CDN reliability, @maboa and I have toyed around with various ways to create “fallback” behavior where a CDN is first tried, then if it fails (or times out), a local copy is used instead. Such behavior is pretty easy with a script loader like LABjs, and would address the concerns you have about Google going down, for instance.

    For instance, check out this snippet I suggested for something of the sort:
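    The snippet itself did not survive in this copy of the page. A minimal sketch of the fallback pattern being described (a hypothetical reconstruction, not the original code; file paths are assumptions) might look like:

```javascript
// Hypothetical sketch of CDN-with-local-fallback loading. After the CDN
// <script> tag has run, test whether the library's global was defined;
// if not, emit a tag pointing at a local copy instead.
function fallbackScriptTag(globalName, localSrc) {
  var w = (typeof window !== 'undefined') ? window : {};
  if (w[globalName]) return ''; // CDN load succeeded, nothing to do
  // Escape the closing tag so this string can live inside an inline script.
  return '<script src="' + localSrc + '"><\/script>';
}

// Typical inline usage, placed immediately after the CDN script tag:
//   document.write(fallbackScriptTag('jQuery', '/js/jquery-1.4.2.min.js'));
```

A script loader can do the same thing asynchronously with a timeout, which avoids blocking the page while waiting for a dead CDN.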


    Jordi Roca
    January 18, 2010 at 3:44 pm

    In your thoughtful article there is something your statistical analysis doesn’t take into account: if, say, 50% of your visitors come from Google, Yahoo, or Bing, and those search engines used their own CDNs to load any of the JS frameworks, then 50% of your users would already have a fresh DNS entry cached for those CDNs.

    It’s really disappointing of Google to use a TTL of 1 hour.

    On the other hand, a properly configured nginx server on a $30/month VPS can serve three thousand gzipped copies of jquery per second.

    Why wouldn’t you use the VPS IP directly and skip the DNS latency?
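    The setup Jordi describes might look roughly like the following nginx fragment (a hypothetical sketch; paths and lifetimes are assumptions, not a tuned configuration):

```nginx
# Serve a pre-gzipped copy of jquery from a small VPS with long-lived caching.
location /js/ {
    gzip_static on;                   # send jquery.min.js.gz when the client accepts gzip
    expires     1y;                   # far-future expiry; version the filename to bust the cache
    add_header  Cache-Control public;
}
```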

    January 18, 2010 at 4:22 pm

    @jordi: Unfortunately Google does not eat their own dog food and does not use their own JavaScript CDN. Neither does Bing.

    January 19, 2010 at 10:22 am

    Yeah, there sure are some security concerns, but it’s far from impossible: fingerprints, signing, certificates, etc. After all, browsers already “trust” third-party sites for malware/phishing detection. Besides, you shouldn’t link to off-site JavaScript unless you totally trust the provider. This could be a big no to public CDNs for some companies.
    Anyway, I’m looking forward to your article on the topic.

    A little addition on the chicken-and-egg problem: we could prefetch. That is, simply load common libraries from CDNs in the background when there’s unused bandwidth (a few seconds after the page load), just to populate the cache. If many websites do this, after some time there is a chance of a higher cache hit ratio, at which point it would become optimal to switch to those CDNs rather than using your own static domain.
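    A sketch of that prefetch idea (hypothetical; the URL list and the delay are assumptions, and note that injected scripts are also executed, so only libraries that are harmless to evaluate on the page should be warmed this way):

```javascript
// Hypothetical cache-warming sketch: a few seconds after onload, inject
// <script> tags for common CDN library URLs purely to get them into the
// visitor's HTTP cache.
var COMMON_CDN_LIBS = [
  'http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js',
  'http://ajax.googleapis.com/ajax/libs/prototype/1.6.1.0/prototype.js'
];

function prefetch(urls, doc) {
  var tags = [];
  for (var i = 0; i < urls.length; i++) {
    var s = doc.createElement('script');
    s.src = urls[i];
    doc.body.appendChild(s); // browser fetches and caches the file
    tags.push(s);
  }
  return tags;
}

// Wait until well after load so the prefetch never competes with real assets:
//   window.onload = function () {
//     setTimeout(function () { prefetch(COMMON_CDN_LIBS, document); }, 3000);
//   };
```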

    January 19, 2010 at 3:22 pm

    @Maciej — Your idea of “preloading” is exactly what I’m implementing in a browser plugin. More details soon.

    January 19, 2010 at 4:52 pm

    I like this idea of using a pre-fetch loader to help your visitors by filling their cache with the CDN files. Kyle’s approach of a plugin is nice: a “make websites faster” plugin that could ultimately be absorbed into the browser.

    I’m working on a pure JS version of the same thing right now. This way all browsers can take advantage.


    January 29, 2010 at 2:41 pm

    > Zoompf recommends the vast majority of websites avoid using JavaScript Library CDNs
    > until they gain more market penetration.

    Thank you for writing this well-researched, thought-out, and informative article. While your article is excellent, I respectfully disagree with your recommendation.

    If many sites accept such a recommendation, then JavaScript Library CDNs won’t gain more market penetration; it becomes a self-fulfilling prophecy. For such things to succeed and benefit everyone, some people need to take the first steps.

    If instead we encourage more people, especially major sites to embrace the use of JavaScript Library CDNs, it will begin to benefit them and many other sites too.

    Also, the article mentions, but doesn’t sufficiently go into, the benefits of using JavaScript Library CDNs: for example, savings in bandwidth costs due to lower bandwidth consumption. However, that may not have been the point of your article, since the benefits are well covered in other articles.

    Despite my disagreement with the conclusion/recommendation, I found this article an excellent resource. Thank you for articulating your points.

    January 29, 2010 at 8:22 pm


    Zoompf is not in the business of helping Google’s initiatives gain market penetration. Zoompf is in the business of making our clients’ websites as fast as possible. Currently we do not believe that using a JavaScript CDN helps our clients. We are not going to tell our clients to reduce the performance of their websites on the off chance that maybe, sometime down the road, Google’s project hits critical mass.

    I understand this sounds cold. Personally I like the idea of a shared global cache of common JavaScript libraries. I wish them success. I understand our recommendation to our clients will not help Google. However Google isn’t paying our employees’ salaries. Our customers are.

    February 1, 2010 at 10:57 am

    What happens with CDN libraries included over SSL? Are they cached if a previous site included the non-SSL version?

    February 1, 2010 at 12:11 pm


    No. HTTP caching is based entirely on the URL. So if I have downloaded and cached a copy of http://ajax.googleapis.com/blah, and then on your website there is a reference to the SSL version (https://ajax.googleapis.com/blah), my browser will re-fetch the file because the URL is different.
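    Since the cache key is the complete URL, the comparison is literal string equality. A toy illustration (the helper function is hypothetical, for illustration only):

```javascript
// HTTP caching keys on the full URL, so these are four distinct cache
// entries even when several of them are byte-identical jQuery downloads:
var refs = [
  'http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js',
  'https://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js', // scheme differs
  'http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.js',      // un-minified path
  'http://ajax.googleapis.com/ajax/libs/jquery/1.3/jquery.min.js'     // "latest 1.3" alias
];

// Hypothetical helper: two pages share a cached copy only on an exact match.
function sharesCacheEntry(a, b) {
  return a === b;
}
```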

    March 29, 2010 at 6:23 pm

    Citing the BuiltWith numbers, it’s important not to omit the overall trend: http://trends.builtwith.com/cdn/AJAX-Libraries-API

    In general, it’s not very useful to focus on aggregate numbers when it comes to priming caches. For example, Stack Overflow uses the Google CDN for jQuery. If I were to avoid the CDN on my .NET focused programming blog, based on these aggregate numbers, I would be misled. A significant portion of my traffic is coming directly from Stack Overflow, and the rest are likely to have been there lately.

    Similarly, an e-commerce site avoiding the CDN based on aggregate numbers is likely making a mistake as well; Woot.com uses it. Online gaming niche? ArmorGames.com uses it. Funny-video/time-waster site? Break.com uses it.

    With the extremely wide reach that top sites have, it only takes a few high-traffic users of the CDN to prime a tremendous number of caches in practice. I believe we’re already beyond that tipping point.

    Billy Hoffman
    March 30, 2010 at 2:38 pm


    Great feedback. Your point about where your traffic sources come from is a good one. Just remember there are four factors that affect whether a JavaScript file from a JavaScript CDN will already be in a visitor’s cache:

    - Do your main traffic sources use the same CDN that you do?
    - Do they use the JavaScript CDN for the same libraries that you do?
    - Do they use the same version of the same libraries as you do?
    - Do they use the same URL to fetch the same library, with the same version, that you do?

    You mentioned Break.com as an example. However, they do a horrible job: they are using jQuery 1.3.2, but are referencing it with a very uncommon URL:


    instead of the much more common:


    So unless your site is also using jQuery 1.3.2 and referencing it with the exact same URL, you get no benefit.

    Because of this, I think your point that it only takes a few high traffic sites to prime a large number of caches is too simplistic. It actually takes high traffic sites using the same libraries, with the same versions, from the same CDN to prime a large number of caches. Based on the trends I’ve seen so far, the latest version of jQuery is the only JavaScript library with the potential to benefit from these CDNs.

    April 2, 2010 at 7:52 pm

    Of the top Alexa sites using the CDN, the overwhelming majority are using Google’s CDN and are explicitly referencing either 1.3.2, 1.4.2, or 1.2.6. There’s very little fragmentation there to worry about. It really only takes a tiny handful of these top-tier sites to prime a tremendous number of caches on a daily basis.

    Also, the implicit “latest” references (e.g. 1.3 and 1.4) will become less and less common as awareness is raised about the caching drawbacks. Since those are served with only a one-hour expires header, that’s detrimental even to the single site referencing them that way. It’s typically the result of migrating to 1.4 (or jQuery UI 1.8 now) early and not realizing it should be referenced as 1.4.0 to get the correct expires header.

    August 26, 2010 at 1:34 pm

    Although I found this a very rich and useful article (comments included), there’s one thing I do not get.
    One of your points is that we do not benefit from these cached versions because too few sites actually use these methods, yet you end by stating:
    “Zoompf recommends the vast majority of websites avoid using JavaScript Library CDNs until they gain more market penetration”.
    But that will never happen if everyone waits for it.
    Should this not be:
    “Zoompf recommends the vast majority of websites start using JavaScript Library CDNs in order to gain more market penetration”? That way everyone gains, right?

    Billy Hoffman
    August 30, 2010 at 1:53 pm


    We are not in the business of helping Microsoft’s or Google’s ideas hit critical mass. We are in the business of helping people make their websites faster. We are not going to tell our customers to do something that would hurt the performance of their website on the off chance that one day it might help their site’s performance. If or when JavaScript Library CDNs provide a clear performance advantage, we will start recommending them.

    Andy Corpes
    October 12, 2010 at 12:09 pm

    Just found this thread while searching for “why can’t i download googles java script libraries”. I’ve played around with Yahoo’s YUI & CI, but the first thing I did was download it and unzip it onto my own server. So why can’t I do the same with Google’s libraries?

    Billy Hoffman
    October 27, 2010 at 4:40 pm


    You absolutely can do that.

    July 14, 2011 at 5:24 am

    Thank you for an informative article. It was just the information I had been looking for. However, you wrote this in Jan 2010 and it is now a year and a half later. The data on builtwith.com shows that Google’s AJAX Libraries API in particular is now at 13.07%, compared with the 3.89% figure you quoted. Clearly this increase in use of the CDN will change the numbers and the balance of your article. Would it be possible for you to do an update of this page?

    July 16, 2011 at 1:23 pm


    We do need to re-examine the data. However, remember it’s not just an increase in Google’s JavaScript CDN usage that matters. It’s CDN usage, library usage, usage of a specific library version, and finally the specific URL used to access that library (/jquery/latest.js vs. /jquery/jquery.1.5.0.js).

    November 28, 2011 at 7:39 pm

    One more thought regarding the DNS request issue. Many larger businesses, ours included, utilize proxy servers and edge routers with caching. Even after clearing my local DNS I cannot get my computer to show the DNS penalty. My conclusion is that the local proxy server is caching the Google CDN lookup. It only has to do this for the first employee; every other employee gets the benefit of the locally cached copy of the DNS record, regardless of whether they do a Ctrl+F5 or ipconfig /flushdns.

    I do not know if any ISPs do this, i.e. run a local caching DNS resolver and thereby speed up lookups.

    I may be wrong, as I haven’t done the research, but it seems to me that the TTL on the DNS resolution would need to be low for mobile users. If you are on the west coast and get a one-year cached copy of the resolution to a CDN resource and then fly to the east coast, your computer would still try to load files from the CDN on the west coast. Again, that’s just my theory.

    Last point: statistics do not always tell the whole story. Even if a very small percentage of sites use the CDN, we must look at the average usage of those sites. For example, the following sites use the Google repository: Break.com, FAIL Blog, Foursquare, Twitter, Posterous, SitePoint, Stack Overflow, Stanford.edu, and even the jQuery site itself. If we can get Facebook on board, that would take care of a lot of people. :)

    One very salient point has to do with your business needs. We have customers all over the world. It’s far better for companies overseas to load these libraries from Google than from our west coast location.

    February 10, 2012 at 9:47 am

    Kyle Simpson says:
    “Bottom line, this results in lots of static asset requests having unnecessary cookies, and even more insidious, even with sites who (like me) foolishly thought they were stripping them by using a sub-domain. Using an entirely separate domain (like that of a CDN) is one such way of mitigating the problem. The exact same request for jquery.js on your domain is likely to forward cookies (and thus be bigger and slower) while a CDN request likely would not have such cookies (unless the CDN provider is misbehaving — again they should be shamed publicly).”

    I have to disagree with your comment with regard to cookies… you should read the documentation before implementing a third-party script such as Google Analytics, and you wouldn’t have those issues…

    The cookie domain is controllable in just about any third-party script I have ever used: Joomla, Invision Power Board, Google Analytics… do I need to keep going?

    Anyway, the point is: read up before “inserting code here” and you will save yourself a headache…

    The Google Analytics code you referenced can be very accurately controlled as to which cookie domain it uses… just be sure to set it correctly :)
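    For instance, with the classic asynchronous Google Analytics snippet of that era the cookie domain is set explicitly (a minimal sketch; the account ID and domain are placeholders):

```javascript
// Queue-style Google Analytics (ga.js) configuration. '_setDomainName'
// pins the GA cookies to one domain so they are not forwarded on requests
// to sub-domains used for static assets.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);      // placeholder account ID
_gaq.push(['_setDomainName', '.example.com']); // scope cookies explicitly
_gaq.push(['_trackPageview']);
```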


    July 10, 2012 at 5:22 pm

    You considered the effect of download speed, but you made no mention of geographical location. Serving your files from a single server may be faster if all of your visitors are relatively local, but if your site has a more global reach, serving those 20 KB from a CDN may be a great advantage.

    July 12, 2012 at 4:05 pm

    Great point, Radu. However, if that is a concern, I suggest a site push all of its assets to a CDN, including JS libraries bundled into its own custom JS code.


Comments are closed.