title: Why I’m turning JavaScript off by default
url: https://tommorris.org/posts/8677
hash_url: 2e88519b5d
I managed to offend a lot of front-end programmers in the office today by announcing that I was installing NoScript, and enforcing a strict JavaScript whitelisting policy.
I’m quite vocal in my dislike of JavaScript, and they seemed to take this as some kind of slight against the language. I’m not a big fan of JavaScript, but the reason I’m installing NoScript isn’t that I don’t like JavaScript. It’s that I dislike an enormous amount of client-side scripting… which just so happens to be done in JavaScript.
I snark a lot about JavaScript, but I’m of the opinion that most of the web would be improved if there were a lot less JavaScript running on it.
I don’t want web designers redesigning the “experience” of using the web. The unification of the user experience of computing is a positive thing. If you used old software from the early days of computing, everything had a different user experience. If you use Windows or OS X, you’ll know of software that behaves differently from the norm. If you’re a reasonably perceptive user, you’ll notice it, and then you’ll be annoyed by it. The reason I prefer Pixelmator to Photoshop is that it more closely adheres to the way OS X apps are supposed to be designed. When I use Pixelmator, things like file opening, window management and document navigation are consistent with the other applications I use. That makes things more predictable, and thus more usable.
On the web, the controls I’m referring to are things like knowing where I am, and having links and navigation elements behave consistently and sensibly. If I right-click on a link and choose “Open Link in New Tab”, a “proper” link will open in a new tab. If it’s just an anchor element that triggers a blob of JavaScript, it does nothing. Things that look like links behave differently for no discernible reason.
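To make the distinction concrete, here’s a minimal sketch of the two patterns (the `loadArticle` handler and the `/articles/42` URL are invented for illustration):

```html
<!-- A proper link: the browser knows the destination, so
     "Open Link in New Tab", middle-click, bookmarking and
     copying the link address all behave as expected. -->
<a href="/articles/42">Read the article</a>

<!-- A link-shaped JavaScript trigger: the href carries no
     destination, so "Open Link in New Tab" opens nothing useful
     and every other standard link behaviour silently breaks. -->
<a href="#" onclick="loadArticle(42); return false;">Read the article</a>
```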
With the triumph of client-side scripting, of “web.js” in this specific sense, we’re going back to the bad old days of computing, but for the most trivial pieces of software. Why does every newspaper or blog have to behave differently, to modify the experience of using a simple system for the retrieval of rich text documents? It doesn’t. There’s no valid justification for it. It’s a cargo cult: people do it because everyone else is doing it.
The purported justification for it is the creation of “web apps”. As Jeremy points out, a web app seems to be nothing but a web site that requires JavaScript. And the justification for building sites that don’t work without JavaScript is that they’re not web sites, they’re web apps. Needless to say, this is circular.
In the era of web.js, rather than the old-fashioned web, URIs don’t matter. A URI doesn’t identify a resource; it doesn’t really do anything. At best, a blob of unreadable JavaScript might interpret the URL as an instruction to load some blob of JSON and render a stack of semantically meaningless div elements into the document object model at some indiscernible time in the future. In “web.js”, elements that aren’t div exist primarily as a nostalgic throwback to a gentler era.
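As a sketch of that pattern (no particular framework; the `/api/posts` endpoint and the `#app` element are invented), the URL-as-instruction style looks roughly like this:

```javascript
// "web.js" routing sketch: the URL isn't a document address, just
// input to a script that may eventually render something.
window.addEventListener('hashchange', function () {
  var id = window.location.hash.slice(1);      // "#8677" -> "8677"
  fetch('/api/posts/' + id)                    // load a blob of JSON...
    .then(function (res) { return res.json(); })
    .then(function (post) {
      // ...and render it as a stack of semantically meaningless divs,
      // at some indiscernible time after the user "followed" the link.
      document.getElementById('app').innerHTML =
        '<div class="post"><div>' + post.title + '</div>' +
        '<div>' + post.body + '</div></div>';
    });
});
```

Until that script runs, the URL identifies nothing: load it in a browser without JavaScript and you get an empty page.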
Another nostalgic throwback to an earlier era is the idea of progressive enhancement. Thanks to frameworks like Angular and Backbone, you can build applications that contain no data in the HTML document at all. Hypertext without any actual hypertext. What happens if someone views it with JavaScript turned off? Or on an old browser? Well, there’s pretty much one answer to that: they’re fucked.
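What such an application ships as its entire HTML document looks something like this minimal sketch, with `app.js` standing in for whatever framework bundle does the rendering:

```html
<!DOCTYPE html>
<!-- The complete "document" as served: no text, no links,
     no hypertext. Without JavaScript, it's a blank page. -->
<html>
  <head><title>Loading…</title></head>
  <body>
    <div id="app"></div>
    <script src="app.js"></script>
  </body>
</html>
```

Progressive enhancement inverts this: the content lives in the markup, and script only layers behaviour on top, so the page still works when the script doesn’t run.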
This is apparently a good thing for user experience: you change the user experience arbitrarily from one website to the next, and you make sure the content doesn’t degrade gracefully.
What UI innovations does this give us?
Whenever a browser has crashed on me, or allocated so much memory that I’ve needed to restart it, it’s never been because I was reading plain, simple web pages free of JavaScript bloat. It’s been because of monstrously over-engineered beasts.
Turning that shit off is the first step towards sanity.
Obviously, some sites need JavaScript. If I trust them not to fill my browser’s RAM with badly written shit that makes the user experience worse, they’ll get on my personal whitelist. Most sites won’t. If they abuse that trust by making the experience worse for fashion-driven reasons, they get taken back off the whitelist.
Perhaps we could go a step further and share the whitelists and blacklists. A web of trust for client-side code, where the default is “off”.
If we build a community of people keen on having back the old web, before overenthusiastic client-side developers started ruining it, we might be able to save the web from sliding any further down the ruinous path of “every website a web app (even though we’re not quite sure what one of those is)” and other similar follies.
Do I expect you all to do likewise? No, I’m perfectly well aware that I’m likely to be something of a pariah in my crusade. I’m a gay vegetarian: I’m okay with being in the minority. Whatever.
Do I hate JavaScript? Well, it’s not the language I’d want to code in for the rest of time, and I’m not fond of it. But as I said, the language isn’t the issue. If Haskell or Scheme were the language of client-side scripting instead, I’m sure we’d see just as many dumb things done with them.
JavaScript is sometimes a necessity. I use JavaScript on my own site, though not for a great deal. The only person significantly inconvenienced by turning off JavaScript on my site is me, because it’s needed for the login system and the posting UI.
I do think modern web development has gone down a deeply unwise path, and only through exercising our personal choices can we bring it back. We have mostly stopped the web from being a hellhole of shitty punch-the-monkey adverts by blocking the living shit out of them. JavaScript is becoming the new conduit for awfulness, and I like the web too much to endure any more of it than is strictly necessary.