Self-host static assets

Now this may seem counter-intuitive, but stay with me.  Please self-host your static assets.

There was a big trend a few years ago (probably more than a few) to use CDNs, especially for third-party libraries such as jQuery.  And yes, I fell into this trap as well.

The idea behind it was pretty simple, with big potential benefits…

  • Easy to do – you just add a script tag pointing to the resource and off you go; no need to download the right version, unpack the files, upload the right one(s) to your website and then add the script tag yourself (there's a sketch of both approaches after this list).
  • Low-latency – the CDN would have a server much closer to the user than yours possibly could be (at least by the law of averages, assuming you have users from around the globe) and could therefore return the file more quickly thanks to the shorter distance.
  • Pre-cached – because other websites use the same file, there's a greater chance that it's already in the user's cache, even on their first visit to your website.
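
To make that concrete, here's a minimal sketch of the two approaches (the jQuery version, URL and file paths are purely illustrative):

    <!-- The CDN approach: one script tag pointing at the third-party
         host and off you go. -->
    <script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>

    <!-- The self-hosted approach: the same file, downloaded once and
         served from your own domain. -->
    <script src="/js/jquery-3.6.0.min.js"></script>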

Ok, so those are the supposed benefits.  What about the negatives that no one really talked about…

  • Availability – there have been instances where quite popular CDNs have shut down their service, sometimes at very short notice.  Or maybe they're currently suffering a DDoS attack and now this is affecting your website too.  A big player like Google (who have their own CDN) is less likely to go offline like this, but they cannot be reached from China, which may be an issue for some of your users, for example.
  • Security – there have been instances where a third-party library has been tampered with at the source, allowing malicious code to run on every website that loads that resource (try Googling “browsealoud hack”).  Whilst this can be mitigated with Subresource Integrity (SRI) checks, you're still creating a potential attack vector (see the SRI sketch after this list).
  • Connections – I posted a week ago about Zero Round Trip Time Resumption (0-RTT), in which I discussed the upfront, behind-the-scenes work required to fetch a web page (DNS lookup, TCP handshake and TLS handshake).  This has to be done for each and every domain that your site references, so if you're pulling third-party libraries and external resources from a number of different places, you're paying that overhead several times over (see the preconnect sketch after this list).
  • Caching – some third parties like to keep strict control over the assets they allow you to link to, which means they use a very short cache expiry time so that, if they change their file, your users get the latest version very quickly.  Whilst you're referencing the asset on their servers, you have no control over this (see the cache-lifetime sketch after this list).
  • Pre-caching – there is little evidence that this actually works in practice, although it sounds good in theory.  And now major browsers, such as Safari, deliberately block it.  This is to avoid a security risk known as cache poisoning, where an attacker deliberately caches a malicious file so that this version is served from cache instead of a lovely fresh (unmalicious) version being fetched from the server.  I'm sure other browsers will follow suit as well.
  • Low-latency – it's absolutely possible to use a service (such as Cloudflare) to wrap your whole website in a reverse proxy that provides CDN functionality, giving your users a low-latency connection to everything, without any of the drawbacks listed above.
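
On the security point, a Subresource Integrity check is just an extra attribute on the script tag.  Here's a minimal sketch, assuming you still want the third-party copy (the hash below is a placeholder, not a real digest):

    <!-- The integrity attribute carries a base64-encoded hash of the
         expected file contents; if the fetched file doesn't match,
         the browser refuses to execute it.  The value here is a
         placeholder, not a real digest. -->
    <script src="https://code.jquery.com/jquery-3.6.0.min.js"
            integrity="sha384-REPLACE_WITH_THE_REAL_BASE64_HASH"
            crossorigin="anonymous"></script>

Note that SRI only detects tampering; it does nothing for the availability, connection or caching problems above.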
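
On the connections point, if you really can't avoid a third-party origin, a preconnect hint at least lets the browser pay the DNS, TCP and TLS cost up front, in parallel with parsing the page, rather than when it reaches the script tag (the origin below is illustrative):

    <!-- Open the connection (DNS lookup, TCP handshake, TLS handshake)
         to the third-party origin early. -->
    <link rel="preconnect" href="https://code.jquery.com" crossorigin>

Self-hosting sidesteps this entirely: the asset travels over the connection you've already opened to your own domain.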
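
And on the caching point, self-hosting means you choose the cache lifetime.  One common pattern (a suggestion, with illustrative version numbers) is to put the version in the filename, so the URL changes whenever the file does and your server can safely send a far-future header such as Cache-Control: max-age=31536000, immutable:

    <!-- A versioned filename means this URL can be cached "forever",
         because an updated library ships under a new URL. -->
    <script src="/js/jquery-3.6.0.min.js"></script>

    <!-- After an upgrade, the markup points at the new file; users
         fetch it straight away, with no stale copies lingering. -->
    <script src="/js/jquery-3.6.1.min.js"></script>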

All in all, it's just better to fetch any external assets yourself and host them locally instead.  You'll get a better-performing website, and it may even be a bit more secure to boot.