One does not simply … (Turbolinks edition)

The web is a profoundly broken medium in many ways. The network is unreliable, servers are unreliable, and the human beings who write code for dynamic services are most unreliable of all. Nonetheless, amongst all those problems, one that I’ve never really found significant is the milliseconds taken to load and process a page’s JavaScript and CSS—and if that were an issue, I think my solution would be to optimise and minimise the JavaScript and CSS, not write more of it!

The beta of Rails 4.0 is now available. One of the new features is Turbolinks. It’s a Ruby gem that actually contains code written in CoffeeScript, almost as if someone is trying to troll a large number of developers, but let’s leave that to one side. Here’s what it does:

Speed-up the client-side with Turbolinks, which essentially turns your app into a single-page javascript application in terms of speed, but with none of the developmental drawbacks (except, maybe, compatibility issues with some existing JavaScript packages).

I was wary of Turbolinks when first announced, as I said at the time:

I understand the motivation for this kind of stuff, and it’s neat, but I’m wary of it because of the additional complexity it introduces for a relatively small benefit.

I may be misleading myself, but it’s rare (on a desktop browser, at least) that it’s the page rendering time that I really notice: far more significant is usually the latency, or the time taken to transfer the significant proportion of a megabyte of HTML that’s smothering a few kilobytes of text.

On the downside, it replaces something that just works with something that … mostly just works. See elsewhere on this page: “Loads blank white screens in firefox 15” / “This is now fixed”. And that’s the problem: you’ve replaced something that works in every browser with something that you have to (or someone has to) test in every browser, and whose failure modes must all be handled. What happens when you click on a “turbolink” on an overloaded server, for example? My experience so far has been that this kind of enhanced link is usually faster, but the probability of nothing happening in response to a click is not insignificant.
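The overloaded-server case is worth spelling out: a plain link gets a spinner, a timeout, and an error page from the browser for free, while an intercepted link must reimplement all of that feedback itself or the click silently does nothing. A minimal sketch of just the timeout part (the helper name and the 50ms figure are mine, not anything Turbolinks actually does):

```javascript
// Hypothetical sketch: one of the things an intercepted "fast link" must
// handle itself. A plain <a href> gets all of this from the browser for free.
function withTimeout(promise, ms) {
  // Race the navigation request against a timer; without this, a click on
  // an overloaded server produces no response and no feedback at all.
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error('timeout')), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Simulate an overloaded server: a request that never completes.
const neverResolves = new Promise(() => {});

withTimeout(neverResolves, 50)
  .then(() => console.log('navigated'))
  .catch(err => console.log('fallback navigation:', err.message));
```

And that is only the timeout; the fallback branch then has to decide whether to retry, show an error, or give up and do a full-page navigation after all.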

I’m aware that I probably sound like an old grouch.

Others have made similar comments; Yehuda Katz wrote:

At the end of the day, unless Turbolinks can perfectly emulate the browser’s behavior, attempts to use Turbolinks with third-party JavaScript will either fail often or require an ever-growing library that handles more and more targeted edge-cases.

It seems that I wasn’t wrong: there have been 183 closed bugs on Turbolinks to date and five are currently still open. Don’t misinterpret me: it’s good that they’ve fixed these issues, but it reinforces the fact that it’s not just a simple case of swapping in the page body with a bit of JavaScript and an asynchronous request.

One does not simply swap in the page body with a bit of JavaScript
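The swap itself is the easy part; most of those closed bugs are about the cases where an intercepted click must behave exactly like a native one. A hypothetical sketch of just the guard that decides whether a click is safe to intercept at all (the function and its checks are illustrative, not Turbolinks' actual code, and the list is not exhaustive):

```javascript
// Hypothetical guard a body-swapping library needs before intercepting a
// click. Every case missed here is a native browser behaviour silently
// broken. `link` stands in for an anchor's URL properties; `current` is
// the page's own origin and path.
function shouldIntercept(event, link, current) {
  if (event.defaultPrevented) return false;         // another handler claimed it
  if (event.button !== 0) return false;             // middle/right click
  if (event.metaKey || event.ctrlKey ||
      event.shiftKey || event.altKey) return false; // open-in-new-tab, etc.
  if (link.target && link.target !== '_self') return false; // target="_blank"
  if (link.hasDownload) return false;               // download attribute
  if (link.origin !== current.origin) return false; // cross-origin navigation
  if (link.hash && link.pathname === current.pathname) return false; // in-page anchor
  return true;
}
```

And that is before you get to history management, scroll restoration, or re-running third-party scripts that expected a real page load.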

And that’s my objection, really: Turbolinks fixes a non-problem by adding a lot of complexity. That just seems a bit self-indulgent. Like those mobile websites that replace scrolling (just works everywhere) with heavy and broken swipe pagination (hey, we tested it on iOS 6 on WiFi!), it’s not driven by user needs.

Many of the websites I use on a daily basis are too clever for their own good, and break horribly when exposed to network latency, dropped connections, busy servers, or browsers in the wild. It doesn’t save me time if I have to phone up to give my electricity meter reading because the asynchronous form submission just doesn’t work. It doesn’t save me time if I have to log into a website in Firefox to close an item because nothing happens when I press the button in Chrome. Both of those happened yesterday, on the websites of companies worth hundreds of millions of dollars. My user experience would have been better if they’d been plain old CGI scripts! At least then I’d have got a sensible response from the submission: a timeout, or an error, but something.

If you must replace pages with an asynchronous call rather than just GET followed by 200 OK, then please do use Turbolinks, because there’s zero chance that you’ll get it working right on your own. But even then, maybe still don’t bother, because your visitors might actually prefer predictable click behaviour and memory usage, and websites that just work across a wide range of browsers and network connections. You’ll save yourself trouble, too.


  1. Duncan Sample

    Wrote at 2013-02-27 19:52 UTC using Chrome 25.0.1364.97 on Linux:

    I’ve been of this opinion in the past about ‘app’ style sites that don’t work with JavaScript turned off other than displaying a single empty DIV.

    However, having just joined a project that’s doing something similar to this I can see some advantages, but only if you don’t need to fetch each ‘page’ from the server, and instead fetch everything at once and swap them in the DOM using JavaScript.

    Since a lot of sites use AJAX for form validation and other purposes, making the extra requests and downloading a new HTML frame around each step of a form seems a little heavy on both the server and client. If the site is already stored as JavaScript objects, the response ‘feels’ faster to the user. It also helps because the form can be cached and served more efficiently when it doesn’t always need to be processed server-side to implant the personalised data.

    I’m still not keen on sites which don’t work with JavaScript turned off, and yes, there is an overhead in redoing HTTP response code handling, but if you must do it, this feels like a more reasonable approach than still making full requests per page with the added overhead of JS. I think on mobile/low-speed connections it makes even more sense if you consider how much optimisation you can do on what is then a static page & scripts served from a CDN.