Unplanned Obsolescence


Who's Afraid of a Hard Page Load?

July 16, 2024

While I'm not going to settle the Single-Page Web Application (SPA) debate in a blog post, there is one claim about SPAs that routinely goes unchallenged, and it drives me nuts: that users prefer them because of the "modern," responsive feel.

SPAs achieve their signature feel using partial page replacement: adding or removing DOM elements instead of loading a new page. Partial page replacement is a very useful feature (I'm working on an HTML standards proposal for it right now), but SPAs typically use it for everything, including page navigation, which causes a lot of problems.

The way this works is that rather than letting the browser load a new page when the user clicks an <a> tag, SPAs simulate page navigation by fetching the new content with JavaScript, swapping it into the current page, and using the History API to update the browser's URL bar. NextJS and React Router work this way, as does SvelteKit. Even the hypermedia libraries support this paradigm, with htmx's hx-boost and Hotwire's Turbo Drive.
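To make the mechanics concrete, here's a minimal sketch of that pattern in plain JavaScript. This is an illustration of the technique, not any particular library's implementation, and it skips the error handling, caching, and scroll restoration that real routers have to deal with:

```js
// Minimal sketch of SPA-style "soft" navigation (illustrative only).
// Intercept same-origin link clicks, fetch the new page with JavaScript,
// swap its <body> into the current document, and update the URL bar
// with the History API instead of letting the browser navigate.
document.addEventListener("click", async (event) => {
  const link = event.target.closest("a");
  if (!link || link.origin !== location.origin) return; // let external links behave normally

  event.preventDefault(); // stop the "hard" page load
  const response = await fetch(link.href);
  const html = await response.text();

  // Replace the current page's body with the fetched page's body
  const newDocument = new DOMParser().parseFromString(html, "text/html");
  document.body.replaceWith(newDocument.body);

  // Edit the URL bar without triggering a navigation
  history.pushState({}, "", link.href);
});

// Handle the back/forward buttons ourselves, since the browser no longer does
window.addEventListener("popstate", () => location.reload());
```

Note that the back and forward buttons only keep working because the script takes over that job too, which is exactly the kind of browser feature SPAs end up having to rebuild.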

In theory, avoiding "hard" page navigations has a few benefits: no flash of white between page renders, no re-downloading of assets that both pages share, and client-side state that survives across navigations.

What this does is essentially abstract away the concept of a link, and make the web page feel more like an application on your phone. No longer are you navigating web pages, you're moving around an app. I have a number of problems with this, but purely from a UX standpoint, it's a massive disservice to web users.

Managing the network

Every day I ride the New York City Subway. For my carrier, most of the stops have cell service, and most of the tunnels between stops do not. When I read web pages while riding, I am keenly aware that if I click a link while I don't have service, not only will the page fail to load, I will probably also lose access to the one I'm currently reading. Everyone who uses a web browser understands this behavior on some level. So I avoid clicking links until I'm at a stop.

Occasionally though, I'll mis-time it and click a link right as the subway is pulling out of a stop: the page fails to load, and now I'm looking at a blank screen. In that situation, I much prefer to be on a traditional website than an SPA. On a website like Wikipedia, one that uses hard links and full page loads, there's a decent chance that the browser can save me: the back button will usually load the cached version of the page I was just on.

If it's an SPA, however, in all likelihood clicking the back button will take me to a different, mostly blank page, and now I'm just stuck. When the internet comes back, I'll refresh the page and hopefully land in the same place, but maybe not. In fact, my whole attitude towards a website changes if it feels like an SPA. Subconsciously, I know that I have to baby it, and only use it in the most optimal network conditions. The smoothness of a web application is an anti-indicator of its reliability and predictability as a web page.

That anti-indicator holds even in situations without unreliable internet. As a user, I'm always much happier when presented with a form that is entirely on one page, or has a "hard" submit button for each step that takes me to a new page, as opposed to a "seamless" form that exists as a blob of JS state. The former has relatively predictable submit, autocomplete, and back button behavior, while the latter varies widely by implementation.

Maybe you don't ride the subway. But you've probably driven on a highway with spotty service, or had a bad Wi-Fi connection, or gotten on a plane, or been inside a basement with weirdly thick walls. Everyone has had to navigate the web under less-than-ideal network conditions, and you quickly develop an intuition for which websites will be resilient to them.

The web has seams, let them show

Developers are naturally inclined to make their applications feel more responsive, and when they test their SPA, it feels like a more natural experience than a clunky old web page. But this instinct is usually incorrect, because most websites need to hit the network in response to user actions.

When a user clicks a link, they want whatever information was at that link—which their device will have to make a network trip to discover. When a user submits a form, they need to know whether or not that information was saved to the server, which their device will have to make a network trip to accomplish.

I suppose there's a version of the web that pre-fetches every possible page for you, and that might feel pretty instantaneous. But there's no world where that works for user-submitted data, because the only thing I care about as a user is that the data actually got submitted. If I submit a form to a website, and the website optimistically and instantaneously shows me that the submission succeeded, and I later find out that it didn't, I am mad.
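Whatever the interface looks like, an honest submission flow can only report success after the round trip completes. Here's a small JavaScript sketch of that idea; the element IDs and messages are hypothetical, and a plain HTML form with a hard page load gives you the same guarantee for free:

```js
// Sketch: only claim success after the server actually confirms it.
// The form and status element IDs are hypothetical.
const form = document.querySelector("#signup-form");
const status = document.querySelector("#status");

form.addEventListener("submit", async (event) => {
  event.preventDefault();
  status.textContent = "Saving...";

  try {
    const response = await fetch(form.action, {
      method: "POST",
      body: new FormData(form),
    });
    // Wait for the server's answer before telling the user anything
    status.textContent = response.ok ? "Saved." : "Something went wrong. Not saved.";
  } catch (err) {
    // Network failure: say so, instead of pretending it worked
    status.textContent = "No connection. Your data was not saved.";
  }
});
```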

The friction involved with a hard page load doesn't exist because web developers are too lazy to do performance work; it reflects a real, physical limitation in the system that is beyond the ability of one developer, and possibly humanity, to overcome. SPAs not only fail to remove the need for the network call, they diminish the user's ability to manage when that network call is made and to handle the cases where it fails.

Discussions of user agency in software are often very... optimistic about how much users want to exercise that agency. But agency comes in many forms. When I was in 5th grade, I would load up GameFAQs guides for Final Fantasy III on my iPod Touch before a road trip, and in the car I'd make sure not to navigate away from the page, or I'd lose the guide. When I avoid clicking links between subway stops, I'm building on behavior I learned as a child, not as a software engineer.

In the long run, the browser always wins

I suspect that the primary impetus for this smoothness is commerce, or something I call "casino-driven development." As my Papou used to tell me, casinos do not have clocks because clocks remind you that time is passing; the casino would like you to forget that time is passing, because they make more money the longer you remain in the casino. In the ad-based internet attention economy, the website would like to keep you in its casino as long as possible: the less you're reminded that you're on the web, where clicks usually require waiting, the better.

Internet folklore has it that, in the 2000s, Amazon and Google research discovered that for every additional X milliseconds of page load latency, they lost Y customers and therefore Z dollars. I can't find any reliable sources for this, but the logic is sound. Some percentage of people will give up the longer it takes to see a result, and at that scale, that percentage translates into a lot of lost money.

Here's the problem: your team almost certainly doesn't have what it takes to out-engineer the browser. The browser will continuously improve the experience of plain HTML, at no cost to you, using a rendering engine that is orders of magnitude more efficient than JavaScript. To beat that, you need to be continuously investing significant engineering effort into cutting-edge application work.

Some things you have to consider with SPAs: restoring back and forward button behavior, keeping the URL bar in sync with what's on the screen, showing loading indicators, preserving scroll position, and handling requests that fail partway through a navigation.

You can engineer your way out of basically all the problems I've described here, but it takes enormous effort. And maintenance on the pile of libraries required to get back basic browser features like "back button navigation" on your SPA is a new fixed cost, paid for with your time. If you use hard page loads, those things not only work for free, they work forever, and they work in exactly the way the user expects and desires.

At the time of this writing, the NextJS showcase lists Nike's shopping platform as one of their successes. If you are literally Nike, and throwing millions at making your shopping portal slightly more responsive could result in tens of millions of revenue, by all means take a crack at it. I, personally, am dubious that the math typically pencils out, even for Nike, but I concede that it's at least plausible that you will deliver a networked experience that is a hair quicker than what the default HTML can do, and reap the rewards.

Meanwhile, the browser marches on, improving the UX of every website that uses basic HTML semantics. For instance: browsers often don't repaint full pages anymore. Try browsing Wikipedia (or my blog) on a decent internet connection and notice how rarely the common elements flash (I can't find any documentation for this feature, but it definitely exists). And, if the connection isn't fast, then the browser shows a loading bar! It's a win for users, and one of the many ways that sticking with the web primitives rewards developers over time.

So if you're a bank, or a government, or pretty much anyone with engineering resources short of "limitless," you will likely be better served by sticking to hard page loads (and the default HTML capabilities) as much as possible. It's dramatically easier to implement and benefits from browser performance and security improvements over time. For page responsiveness improvements, try tweaking your cache headers, scrutinizing the JavaScript you send to the client, and optimizing your CDN setup. It always pays off in the long run.
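As an example of what that tuning can look like, here's a sketch of cache-header choices for a server-rendered site. The Node.js server, the paths, and the max-age values are all illustrative assumptions, not a prescription:

```js
// Sketch of cache-header tuning for a server-rendered site.
// A hypothetical Node.js server; paths and max-age values are illustrative.
const http = require("node:http");

const server = http.createServer((req, res) => {
  if (req.url.startsWith("/static/")) {
    // Fingerprinted assets (CSS, JS, images) can be cached for a long time,
    // because a content change produces a new URL.
    res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
  } else {
    // HTML: always revalidate, so users get fresh pages without a full
    // re-download when nothing has changed (the server can answer 304).
    res.setHeader("Cache-Control", "no-cache");
  }
  // ...serve the file or render the page here...
  res.end();
});

server.listen(8080);
```

The idea is that static assets get cached essentially forever while HTML revalidates cheaply, so repeat navigations stay fast without any client-side routing.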

Bonus: A Good Use of SPAs

While I was at the Washington Post, I worked on the interactive map that they used for live election night coverage. Watch my former boss, Jeremy Bowers, clicking around it in the livestream. Here's me and Dylan Freedman, in front of an early version:

[Image: Alex Petros and Dylan Freedman in front of a big screen with a gray map on it, at the Washington Post offices]
I had to leave this project after like 6 weeks and Brittany took my place. As you can see, it improved dramatically after I left.

That's a giant SvelteKit app! The map GUI is controlled by a Svelte store, and, if I remember correctly, a websocket updates the vote totals in the background. When you click on a US state, it shows a close-up of that state and all the election info that has come in so far.

This is a great use of a reactive UI framework, because the data stored on the client doesn't update in response to user actions, it updates in response to new election results. The clicking should be instantaneous, and the UI should live entirely on the client, because it can!
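A minimal sketch of that pattern, with a Svelte writable store fed by a WebSocket; the endpoint URL and message shape here are hypothetical, not the Post's actual setup:

```js
// Sketch: a Svelte store fed by a WebSocket, so the UI re-renders when
// new results arrive rather than when the user clicks something.
// The endpoint URL and message shape are hypothetical.
import { writable } from "svelte/store";

export const resultsByState = writable({});

const socket = new WebSocket("wss://example.com/election-results");

socket.addEventListener("message", (event) => {
  const update = JSON.parse(event.data); // e.g. { state: "PA", votes: {...} }
  resultsByState.update((current) => ({ ...current, [update.state]: update.votes }));
});

// In a component, clicking a state just reads from the store, which is
// already on the client, so the close-up view renders instantly.
```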

And it's remarkable that you can compete with very expensive interactive map products using nothing but a browser, open source libraries, and a couple months of engineer-time.

I had so much fun learning Svelte for this that I used it as the basis for AYTA. If I were doing AYTA again though, I would definitely use htmx.

Thanks to Mani Sundararajan for his feedback on a draft of this post.
