43

Say I have a web app that uses jQuery. Is it better practice to host the necessary JavaScript files on my own servers along with my website files, or to reference them from jQuery's CDN (example: http://code.jquery.com/jquery-1.7.1.min.js)?

I can see pros for both sides:

  • If it's on my servers, that's one less external dependency; if jQuery went down or changed their hosting structure or something like that, then my app breaks. But I feel like that won't happen often; there must be lots of small-time sites doing this, and the jQuery team will want to avoid breaking them.
  • If it's on my servers, that's one less external reference that someone could call a security issue
  • If it's referenced externally, then I don't have to worry about the bandwidth to serve the files (though I know it's not that much).
  • If it's referenced externally and I'm deploying this web site to lots of servers that need to have their own copies of all the files, then it's one less file I have to remember to copy/update.
Mr. Jefferson
  • The first two points only apply if you're worried about Google either going down or getting hacked. – user16764 Mar 12 '12 at 17:58
  • @user16764 - or they take down their copy of jQuery for some reason (politics, who knows). – Mr. Jefferson Mar 12 '12 at 19:53
  • Another reason not to is privacy. Using a third party's hosted content gives that third party a way of tracking users that they can't easily opt out of. – Ian Newson Jun 12 '12 at 19:36
  • Also, some CDNs, especially Google's, sometimes restrict the files from specific countries. – azerafati Apr 27 '15 at 14:12

7 Answers

57

You should do both:

Start with hosting from a CDN such as Google's because it will likely have a higher up-time than your own site and will be configured for the fastest response time. Additionally, anyone who has visited a page that links to the CDN will use their cached copy of the file, so they won't even have to re-download a copy, making the initial loading even faster.

Then add a fallback reference to your own server in case the CDN happens to be down (not likely, but safe is safe). Fallbacks are relatively easy to understand, but need to be customized to suit the script being used:

<script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.18/jquery-ui.min.js"></script>
<script>
    if (!window.jQuery) document.write('<script src="/path/to/jquery-ver.sion.min.js"><\/script>');
</script>

Make sure you don't write </script> anywhere within a <script> element, as it will close the HTML element and cause the script to fail. The simple fix is to use a backslash as an escape: <\/script>.
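
For example (an illustrative snippet, not part of the original fallback code), the first line below breaks because the HTML parser ends the script element at the literal </script>, while the escaped version works:

document.write('<script src="/path/to/jquery.min.js"></script>');   // broken: the parser closes the block here
document.write('<script src="/path/to/jquery.min.js"><\/script>');  // works: the escaped slash hides it from the parser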


One more reason to do both:

If you pick a popular CDN, it's highly unlikely that it'll ever have any down-time. However, in the far, far future (~18 months from now, given Moore's law), when the hosting format changes, the address is adjusted, the network is placed behind a paywall, or anything else happens, it's possible that your link will no longer work as-is. If you use a fallback, it'll give you a bit of time to adjust to the new hosting arrangement before you have to go back through every website you've ever created and change the CDN links.


Another reason to do both:

Recently I've been hit with a string of internet outages. I was able to keep working locally on projects where I'd linked local copies of script resources, and I quickly found that there were a number of projects that needed to have local copies linked.

zzzzBov
  • +1 Gives you the benefits of the CDN and covers the potential pitfalls, too. – quentin-starin Mar 13 '12 at 06:31
  • What relevance is the uptime? If his site isn't up, having the js online is hardly a benefit :) – Boris Yankov Mar 13 '12 at 19:49
  • @BorisYankov, If the CDN is down and your site is up, **and** you haven't used a local fallback, your site won't work. That's the relevance of uptime. If your site is down, *your site is down* and the local copy won't matter. – zzzzBov Mar 13 '12 at 19:53
  • CDN will also have servers all over the place, so you'll get the resource from the closest node. – hanzolo Apr 08 '14 at 19:42
34

I had the same question, then I read this article and I was sold on the idea of letting Google host my jQuery library.

The article states the main benefits of letting your libraries be hosted by Google's Content Delivery Network (CDN):

  • Decreased Latency - Users not physically near your server will be able to download jQuery faster from Google than if you force them to download it from your arbitrarily located server.
  • Increased Parallelism - Browsers limit the number of connections that can be made simultaneously. Depending on the browser, this limit may be as low as two connections per hostname. Using the Google AJAX Libraries CDN eliminates one request to your site, allowing more of your local content to be downloaded in parallel.
  • Better Caching - Using the Google AJAX Libraries, your users may not need to download jQuery at all. On the other hand, if you’re hosting jQuery locally then your users must download it at least once. Each of your users probably already has dozens of identical copies of jQuery in their browser’s cache, but those copies of jQuery are ignored when they visit your site.

As for the two bullet points you listed as pros for hosting your own library, remember that it's Google hosting the cloud version, and Google knows what they're doing and can be trusted as far as availability and security go. However, @zzzzBov makes a very good point in his answer to this question, where he recommends also storing a local copy of the library and defaulting to that in the unlikely event that the CDN version cannot be accessed for any reason.

CFL_Jeff
17

Personally, I take a cue from http://html5boilerplate.com/

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.6.1/jquery.min.js"></script>
<script>!window.jQuery && document.write(unescape('%3Cscript src="includes/js/libs/jquery-1.6.1.min.js"%3E%3C/script%3E'))</script>

This pulls the main jQuery file from Google, but if it doesn't load for some reason, the next line loads it from your own server.

kevin cline
Adrian J. Moreno
  • This has the added benefit of letting you develop offline. – RSG Mar 12 '12 at 21:56
  • The use of a protocol-relative URL is no longer so useful as it once was (see https://www.paulirish.com/2010/the-protocol-relative-url/). It's reasonable here to just type `https://`. – GKFX Mar 13 '19 at 22:54
5

It is a better practice to use a CDN, and if that CDN happens to be Google, all the better -- as both @CFL_Jeff and @Morons have noted.

I am adding this answer to point out something that often goes overlooked when pointing elsewhere, which is avoiding the mixed content warning. Consider using protocol-less URLs, e.g.:

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.js" type="text/javascript">
</script>

There are still some support issues in using protocol-less URLs, so also take a gander at the answers on Can I change all my http:// links to just //? on SO, but do at least try to handle potential mixed-content warnings in some way.
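
For illustration (a minimal sketch, assuming the page itself is served over https://): a hard-coded http:// reference triggers the mixed-content warning, while a protocol-relative or explicit https:// reference does not.

<!-- Triggers a mixed-content warning on an https:// page: -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.js"></script>

<!-- Inherits the page's protocol (or just write https:// explicitly): -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.js"></script>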

jcmeloni
4

You should reference it from Google's AJAX Libraries API.

The primary reason for this is to speed up your page loads. If your user has already visited another site referencing the same library, it will already be stored in the browser's cache and will not need to be downloaded at all.
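
For the cross-site cache hit to happen, the reference must use the same canonical URL other sites use; for example (using the 1.7.1 release mentioned in the question):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>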

Morons
0

I could be wrong, but calling document.write after the document has finished loading overwrites everything in the DOM (during parsing it simply writes in place), and it's good practice to put JavaScript files at the end of the body.

I propose the following method based on previous answers:

<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
<script type="text/javascript">

if(!window.jQuery)
{
    //Creates the script element
    var script = document.createElement('script'); 
    //Adds the type attribute with "text/javascript" value
        script.setAttribute('type', 'text/javascript'); 
    //Adds the source attribute and populates it
        script.setAttribute('src', 'Put_The_Relative_Path_To_Your_JavaScript_File_Here'); 
    //Adds it to the end of the body, as it is good practice, to prevent render-blocking.  
    document.body.appendChild(script);

}

//Note that there's no need for you to verify with an onload function, since all scripts
//must be loaded before going to the next one! 

</script>
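
A caveat on the snippet above (my note, not part of the original answer): a script inserted with createElement/appendChild loads asynchronously, so code that depends on jQuery should run from the fallback script's load event rather than assume it is already available. A minimal sketch, with the local path left as a placeholder:

<script type="text/javascript">
if (!window.jQuery)
{
    var script = document.createElement('script');
    script.src = 'Put_The_Relative_Path_To_Your_JavaScript_File_Here'; // placeholder path
    // Run jQuery-dependent code only once the fallback has actually loaded.
    script.onload = function () {
        // jQuery is available here.
    };
    document.body.appendChild(script);
}
</script>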
Jose A
0

I would like to add that hosting a local copy is a best practice, because none of the answers above address a strict security posture in which geolocation restrictions and strict whitelisting are imperative. Not hosting the file locally pushes a reduced security posture onto your clients.
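
To make the whitelisting point concrete (a hypothetical illustration, not from the original answer): with a locally hosted copy, a Content-Security-Policy header can restrict scripts to your own origin, whereas relying on a CDN means the CDN host must be whitelisted as well.

Self-hosted jQuery only:

Content-Security-Policy: script-src 'self'

Also allowing the Google CDN copy:

Content-Security-Policy: script-src 'self' https://ajax.googleapis.com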

Mark
  • A security policy that blacklists or otherwise restricts the CDN of either Google or jQuery is going to break _a lot_ of web sites, not only that of the asker. In other words, there are bigger problems to worry about. –  Apr 27 '15 at 12:32
  • @Snowman ... like the Great Firewall in China (which regularly breaks Stack Exchange sites that use a CDN for their scripts)? –  Apr 27 '15 at 13:42