
How we made our page-load optimisations even faster

2018-02-02

6 min read

In 2017 we made two of our web optimisation products - Mirage and Rocket Loader - even faster! Combined, these products speed up around 1.2 billion web pages a week. Both products are around five years old, so there was a big opportunity to update them for the brave new world of highly-tuned browsers, HTTP/2 and modern Javascript tooling. We measured a performance boost that, very roughly, will save visitors to sites on our network between 50 and 700ms. Visitors who see content faster have much higher engagement and lower bounce rates, as shown by studies like Google's. This really adds up, representing a further saving of 380 years of loading time each year and a staggering 1.03 petabytes of data transfer!


Photo by Dimon Blr on Unsplash.

What Mirage and Rocket Loader do

Mirage and Rocket Loader both optimise the loading of a web page by reducing the number of assets the browser needs to request, and deferring those requests, so that it can complete HTML parsing and render content on screen sooner.

Mirage


With Mirage, users on slow mobile connections are quickly shown a full page of content, using low file-size placeholder images that load much faster. Without Mirage, visitors on a slow mobile connection have to wait a long time for high-quality images to download, and will perceive your website as slow:

[Image: page load on a slow connection without Mirage]

With Mirage visitors will see content much faster, will thus perceive that the content is loading quickly, and will be less likely to give up:

Rocket Loader

Browsers will not show content until all the Javascript that might affect it has been loaded and run. This can mean users wait a significant time before seeing any content at all, even if that content is the only reason they're visiting the page!

[Image: page load without Rocket Loader]

Rocket Loader transparently defers all Javascript execution until the rest of the page has loaded. This allows the browser to display the content the visitors are interested in as soon as possible.

[Image: page load with Rocket Loader]

How they work

Both of these products involve a two-step process: first our optimising proxy server rewrites customers' HTML as it's delivered, and then our on-page Javascript attempts to optimise aspects of the page load. For instance, Mirage's server-side rewriter transforms image tags as follows:

<!-- before -->
<img src="/some-image.png">

<!-- after -->
<img data-cfsrc="/some-image.png" style="display:none;visibility:hidden;">

Since browsers don't recognise data-cfsrc, the Mirage Javascript can control the whole process of loading these images. It uses this opportunity to intelligently load placeholder images on slow connections.
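To make that concrete, here's a highly simplified sketch of the client-side idea. The function name, placeholder URL and connection heuristic below are purely illustrative - this is not Mirage's actual implementation:

// Illustrative sketch only - not Mirage's real code.
function loadMirageImages() {
  // Crude connection check; the real logic measures latency and bandwidth.
  var connection = navigator.connection || {};
  var slow = /(^|-)2g$/.test(connection.effectiveType || '');

  var images = document.querySelectorAll('img[data-cfsrc]');
  for (var i = 0; i < images.length; i++) {
    var img = images[i];
    var original = img.getAttribute('data-cfsrc');
    // On slow connections, point at a small placeholder first (hypothetical
    // URL); otherwise load the original image straight away.
    img.src = slow ? '/placeholder?src=' + encodeURIComponent(original) : original;
    img.style.display = '';
    img.style.visibility = '';
  }
}

document.addEventListener('DOMContentLoaded', loadMirageImages);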

Rocket Loader uses a similar approach to de-prioritise Javascript during page load, allowing the browser to show visitors the content of the page sooner.
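The general shape of that deferral looks something like the sketch below. The rewritten type attribute and data-src name are made up for illustration, and a real implementation must also preserve execution order between scripts and handle inline code - details the sketch glosses over:

// Illustrative sketch only. Assume the server has rewritten
//   <script src="/app.js">
// into
//   <script type="text/deferred" data-src="/app.js">
// so the browser ignores it while parsing the page.
window.addEventListener('load', function () {
  var deferred = document.querySelectorAll('script[type="text/deferred"]');
  for (var i = 0; i < deferred.length; i++) {
    var placeholder = deferred[i];
    // Re-create the script with a real src so the browser fetches and runs it
    // only after the page content has been displayed.
    var script = document.createElement('script');
    script.src = placeholder.getAttribute('data-src');
    placeholder.parentNode.replaceChild(script, placeholder);
  }
});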

The problems

The Javascript for both products was written years ago, when ‘rollup’ brought to mind a poor lifestyle choice rather than an excellent build-tool. With the big changes we’ve seen in browsers, protocols, and JS, there were many opportunities to optimise.

Dynamically... slowing things down

Designed for the ecosystem of the time, both products were loaded by Cloudflare’s asynchronous-module-definition (AMD) loader, called CloudflareJS, which also bundled some shared libraries.

This meant the process of loading Mirage or Rocket Loader looked like:

  1. CFJS is inserted in a blocking script tag by the server-side rewriter

  2. CFJS runs, and looks at some on-page config to decide at runtime whether to load Rocket/Mirage via AMD, inserting new script tags

  3. Rocket/Mirage are loaded and run

Fighting browsers

Dynamic loading meant the products could not benefit from optimisations present in modern browsers. Browsers now scan HTML as they receive it instead of waiting for it all to arrive, identifying and loading external resources like script tags as quickly as possible. This process is called preload scanning, and is one of the most important optimisations performed by the browser. Since we used dynamic code inside CFJS to load Mirage and Rocket Loader, we were preventing them from benefitting from the preload scanner.

To make matters worse, Rocket Loader was being dynamically inserted using that villain of the DOM API, document.write - a technique that creates huge performance problems. Understanding exactly why is involved, so I’ve created a diagram. Skim it, and refer back to it as you read the next paragraph:

[Diagram: the old insertion flow, using document.write]

As mentioned, using document.write to insert scripts is particularly damaging to page-load performance. The document.write that inserts the script is invisible to the preload scanner (even if the script is inline, which ours isn't, the preload scanner doesn't attempt to parse Javascript), so at the instant the script is inserted the browser is already busy requesting the resources the scanner found elsewhere in the page (other script tags, images etc). This matters because a browser encountering a script that is neither deferred nor async, like Rocket Loader, must block all further building of the DOM tree until that script is loaded and executed, to give the script a chance to modify the DOM. So Rocket Loader was being inserted at an instant when it was going to be very slow to load, due to the backlog of requests from the preload scan, and therefore caused a very long delay before the DOM parser could resume!
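The contrast is easy to see in code. A static tag is visible to the preload scanner, so its download can start long before the parser reaches it; a tag written by script is only discovered once that script runs (the file name here is just for illustration):

// Visible to the preload scanner - the fetch can start almost immediately:
//   <script src="/some-script.js"></script>

// Invisible to the preload scanner - only discovered at execution time,
// and the parser then blocks until the new script has loaded and run:
document.write('<script src="/some-script.js"><\/script>');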

Aside from this grave performance issue, it became more urgent to remove document.write when Chrome began to intervene against it in version 55, triggering a very interesting discussion. This intervention would sometimes prevent Rocket Loader from being inserted on slow 2G connections, stopping any other Javascript from loading at all!

Clearly, document.write needed to be extirpated!

Unused and over-general code

CFJS was authored as a shared library for Cloudflare client-side code, including the original Cloudflare app store. This meant it had quite a large set of APIs. Although both Mirage and Rocket Loader depended on some of them, the overlap was actually small. And since the launch of the new, shiny Cloudflare Apps, CFJS had no other important products dependent upon it.

A plan of action

Before joining Cloudflare in July 2017, I had been working in TypeScript, a language with all the lovely new syntax of modern Javascript. Taking over multiple AMD, ES5-based projects using Gulp and Grunt was a bit of a shock. I really thought I'd written my last define(['writing', 'very-bug'], function(twice, prone) {}), but here I was in 2017 seeing it again!

So it was very tempting to do a big-bang rewrite and get back to playing with the new ECMAScript 2018 toys. However, I’ve been involved in enough rewrites to know they’re very rarely justified, and instead identified the highest priority changes we’d need to improve performance (though I admit I wrote a few git checkout -b typescript-version branches to vent).

So, the plan was:

  1. identify and inline the parts of CFJS used by Mirage and Rocket Loader

  2. produce a new version of the other dependencies of CFJS (our logo badge widget is actually hardcoded to point at CloudflareJS)

  3. switch from AMD to Rollup (and thus ECMAScript import syntax)

The decision to avoid making a new shared library may be surprising, especially as tree-shaking avoids some of the code-size overhead from unused parts of our dependencies. However, a little duplication seemed the lesser evil compared to cross-project dependencies given that:

  • the overlap in code used was small

  • over-general, library-style functions were part of why CFJS became too big in the first place

  • Rocket Loader has some exciting things in its future...

Sweating kilobytes out of minified + Gzipped Javascript files is a waste of time for most applications. However, in the context of code that'll be run literally millions of times in the time it takes you to read this article, it really pays off. This is a process we'll be continuing in 2018.

Switching out AMD

Switching out Gulp, Grunt and AMD was a fairly mechanical process of replacing syntax like this:

define(['cloudflare/iterator', 'cloudflare/dom'], function(iterator, dom) {
    // ...
    return {
        Mirage: Mirage,
    };
});

with ECMAScript modules, ready for Rollup, like:

import * as iterator from './iterator';
import { isHighLatency } from './connection';

// ...

export { Mirage }
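Each product could then be bundled into a single self-contained script with a Rollup config along these lines - a minimal sketch for a recent Rollup version, with hypothetical paths, and with minification handled separately by a plugin:

// rollup.config.js
export default {
  input: 'src/mirage.js',
  output: {
    file: 'dist/mirage.js',
    format: 'iife', // one self-executing script, no module loader needed
    name: 'mirage',
  },
};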

Post-refactor weigh-in

Once the parts of CFJS used by the projects were inlined into them, both Rocket Loader and Mirage ended up slightly larger - but neither needs the separate 21.6KB CFJS download any more (all numbers minified + GZipped).

So we made a significant file-size saving (about half a jQuery’s worth) vs the original file-size required to completely load either product.

New insertion flow

Our original insertion flow looked something like this:

<!-- on-page embed, injected into customers' pages -->
<script>
  var cloudflare = { rocket: true, mirage: true };
</script>
<script src="/cloudflare.min.js"></script>

Inside cloudflare.min.js we found the dynamic code that, once run, would kick off the requests for Mirage and Rocket Loader:

// cloudflare.min.js
if (cloudflare.rocket) {
    require("cloudflare/rocket");
}

Our approach is now far more browser-friendly, roughly:

<!-- on-page embed -->
<script>
  var cloudflare = { /* some config */ };
</script>
<script src="/mirage.min.js"></script>
<script src="/rocket.min.js"></script>

If you compare the new insertion sequence diagram with the old one, you can see why this is so much better:

[Diagram: the new insertion flow]

Measurement

Theory implied our smaller, browser-friendly strategy should be faster, but only by doing some good old empirical research would we know for sure.

To measure the results, I set up a representative test page (including Bootstrap, custom fonts, some images, text) and calculated the change in the average Lighthouse performance scores out of 100 over a number of runs. The metrics I focussed on were:

  1. Time till first meaningful paint (TTFMP) - FMP is when we first see some useful content, e.g. images and text

  2. Overall - this is Lighthouse's aggregate score for a page - the closer to 100, the better
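For repeatability, this kind of measurement is easy to script. Below is a rough sketch of averaging Lighthouse performance scores over several runs using the lighthouse and chrome-launcher npm modules - the exact options and result shape depend on your Lighthouse version, so treat it as a starting point rather than the exact setup used here:

const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

async function averagePerformanceScore(url, runs) {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  let total = 0;
  try {
    for (let i = 0; i < runs; i++) {
      const result = await lighthouse(url, {
        port: chrome.port,
        onlyCategories: ['performance'],
      });
      // The performance category score is 0-1 in recent versions; scale to 0-100.
      total += result.lhr.categories.performance.score * 100;
    }
  } finally {
    await chrome.kill();
  }
  return total / runs;
}

averagePerformanceScore('https://test-page.example.com/', 5)
  .then(function (score) { console.log('average performance score:', score); });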

Assessment

So, improved metrics across the board! We can see the changes have resulted in solid improvements, e.g. a reduction in our average time till first meaningful paint of 694ms for Rocket Loader and 49ms for Mirage.

Conclusion

The optimisations to Mirage and Rocket Loader have resulted in less bandwidth use, and measurably better performance for visitors to Cloudflare optimised sites.

Footnotes


  1. The following are back-of-the-envelope calculations. Mirage gets 980 million requests a week, TTFMP reduction of 50ms. There are 1000 ms in a second * 60 seconds * 60 minutes * 24 hours * 365 days = 31.5 billion milliseconds in a year. So (980e6 * 50 * 52) / 31.5e9 = in aggregate, 81 years less waiting for first-paint. Rocket gets 270 million requests a week, average TTFMP reduction of 694ms, (270e6 * 694 * 52) / 31.5e9 = in aggregate, 301 years less waiting for first-meaningful-paint. Similarly 980 million savings of 16kb per week for Mirage = 817.60 terabytes per year and 270 million savings of 15.2kb per week for Rocket Loader = 213.79 terabytes per year for a combined total of 1031 terabytes or 1.031 petabytes.

  2. and a tiny 1.5KB file for our web badge - written in TypeScript - which previously was loaded on top of the 21.6KB CFJS

  3. shut it Hume

  4. Thanks to Peter Belesis for doing the initial work of identifying which products depended upon CloudflareJS, and Peter, Matthew Cottingham, Andrew Galloni, Henry Heinemann, Simon Moore and Ivan Nikulin for their wise counsel on this blog post.
