title: The Market for Lemons
url: https://infrequently.org/2023/02/the-market-for-lemons/
hash_url: 5b35e3f363

For most of the past decade, I have spent a considerable fraction of my professional life consulting with teams building on the web.

It is not going well.

Not only are new services being built to a self-defeatingly low UX and performance standard, existing experiences are pervasively re-developed on unspeakably slow, JS-taxed stacks. At a business level, this is a disaster, raising the question: "why are new teams buying into stacks that have failed so often before?"

In other words, "why is this market so inefficient?"

George Akerlof's most famous paper introduced economists to the idea that information asymmetries distort markets and reduce the quality of goods because sellers with more information can pass off low-quality items as more valuable than informed buyers appraise them to be. (PDF, summary)

Customers who can't assess the quality of products pay the wrong amount for them, creating a disincentive for high-quality products to emerge and working against their success when they do. For many years, this effect has dominated the frontend technology market. Partisans for slow, complex frameworks have successfully marketed lemons as the hot new thing, despite the pervasive failures in their wake, crowding out higher-quality options in the process.

These technologies were initially pitched on the back of "better user experiences", but have utterly failed to deliver on that promise outside of the high-management-maturity organisations in which they were born. Transplanted into the wider web, these new stacks have proven to be expensive duds.

The complexity merchants knew their environments weren't typical, but they sold highly specialised tools as though they were generally appropriate. They understood that most websites lack tight latency budgeting, dedicated performance teams, hawkish management reviews, ship gates to prevent regressions, and end-to-end measurements of critical user journeys. They understood that the only way to scale JS-driven frontends is massive investment in controlling complexity, but warned none of their customers.
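
To make "ship gates" less abstract, here is a minimal sketch of the kind of regression gate those environments rely on: a CI step that reads a Lighthouse report and fails the build when a latency budget is blown. The report path, budget values, and audit names below are illustrative assumptions, not a prescription.

```ts
// Hypothetical illustration only: a CI "ship gate" that fails the build when a
// Lighthouse JSON report exceeds a latency budget. Paths, budget numbers, and
// audit ids are assumptions made for this sketch.
import { readFileSync } from "node:fs";

// Budgets in milliseconds; the values are placeholders, not recommendations.
const BUDGETS: Record<string, number> = {
  "largest-contentful-paint": 2500,
  "total-blocking-time": 200,
  "interactive": 3800,
};

const report = JSON.parse(readFileSync("lighthouse-report.json", "utf8"));

let failed = false;
for (const [auditId, budget] of Object.entries(BUDGETS)) {
  const audit = report.audits?.[auditId];
  if (!audit) continue; // audit missing from this report; skip rather than guess
  const value = audit.numericValue as number;
  const ok = value <= budget;
  console.log(`${auditId}: ${Math.round(value)}ms (budget ${budget}ms) ${ok ? "OK" : "OVER"}`);
  if (!ok) failed = true;
}

// A non-zero exit code is what makes the CI step act as a gate on regressions.
process.exit(failed ? 1 : 0);
```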

They also knew that their choices were hard to replicate. Few can afford to build and maintain 3+ versions of the same web app ("desktop", "mobile", and "lite"), and vanishingly few scaled sites feature long sessions and login-gated content.

Armed with all of this background and knowledge, they kept all the caveats to themselves.

What Did They Know And When Did They Know It?

This information asymmetry persists; the worst actors still haven't levelled with their communities about what it takes to operate complex JS stacks at scale. They did not signpost the delicate balance of engineering constraints that allowed their products to adopt this new, slow, and complicated tech. Why? For the same reason used car dealers don't talk up average monthly repair costs.

The market for lemons depends on customers having less information than those selling shoddy products. Some who hyped these stacks early on were earnestly ignorant, which is forgivable when recognition of error leads to changes in behaviour. But that's not what the most popular frameworks of the last decade did. As time passed, and the results continued to underwhelm, an initial lack of clarity was parlayed into intentional acts of omission. These omissions have been material to both users and developers. Given the extensive evidence of these failures provided directly to their marketeers, often by me, at some point the omissions veered into intentional prevarication.

Faced with the dawning realisation that this tech mostly made things worse, not better, the JS-industrial-complex pulled an Exxon. They could have copped to an honest error, admitted that these technologies require vast infrastructure and investment to operate, and that they are unscalable in the hands of all but the most sophisticated teams. They did the opposite, doubling down, breathlessly announcing vapourware year after year to defer critical thinking about the fundamental errors in the design. They also worked behind the scenes to marginalise those who pointed out the disturbing results and extraordinary costs.

Credit where it's due, the complexity merchants have been incredibly effective in one regard: top-shelf marketing discipline. Over the last ten years, they have worked overtime to make frontend an evidence-free zone. The hucksters knew that discussions about performance tradeoffs would not end with teams investing more in their technology, so boosterism and misdirection were aggressively substituted for evidence and debate. Like a curtain of Halon descending to put out the fire of engineering debate, they blanketed the discourse with toxic positivity. Those who dared speak up were branded "negative" and "haters", no matter how much data they lugged in tow.

Sandy Foundations

It was, of course, bullshit.

Astonishingly, gobsmackingly effective bullshit, but nonsense nonetheless. There was a point to it, though. Playing for time allowed the bullshitters to punt on introspection of the always-wrong assumptions they'd built their entire technical edifice on. In time, these misapprehensions would become cursed articles of faith:

  • CPUs get faster every year

    [ narrator: they do not ]

  • Organisations can manage these complex stacks

    [ narrator: they cannot ]

All of this was falsified by 2016, but nobody wanted to turn on the house lights while the JS party was in full swing. Not the developers being showered with shiny tools and boffo praise for replacing "legacy" HTML and CSS that performed fine. Not the scoundrels peddling foul JavaScript elixirs and potions. Not the managers that craved a check to write and a rewrite to take credit for in lieu of critical thinking about user needs and market research.

Consider the narrative Crazy Ivans that led to this point.

It's challenging to summarise a vast discourse over the span of a decade, particularly one as dense with jargon and acronyms as the one that led to today's status quo of overpriced failure. These are not quotes, but vignettes of what I have perceived as distinct epochs of our tortured journey:

  • "SPAs are a better user experience, and managing state is a big problem on the client side. You'll need a tool to help structure that complexity when rendering on the client side, and our framework works at scale"

    [ illustrative example ]

  • "Instead of waiting on the JavaScript that will absolutely deliver a superior SPA experience...someday...why not render on the server as well, so that there's something for the user to look at while they wait for our awesome and totally scalable JavaScript to collect its thoughts?"

    [ an intro to "isomorphic javascript", a.k.a. "Server-Side Rendering", a.k.a. "SSR" ]

  • "SPAs are a better experience, but everyone knows you'll need to do all the work twice because SSR makes that better experience minimally usable. But even with SSR, you might be sending so much JS that things feel bad. So give us credit for a promise of vapourware for delay-loading parts of your JS."

    [ impressive stage management ]

  • "SPAs are a better experience. SSR is vital because SPAs take a long time to start up, and you aren't using our vapourware to split your code effectively. As a result, the main thread is often locked up, which could be bad?

    Anyway, this is totally your fault and not the predictable result of us failing to advise you about the controls and budgets we found necessary to scale JS in our environment. Regardless, we see that you lock up main threads for seconds when using our slow system, so in a few years we'll create a parallel scheduler that will break up the work transparently*"


    [ 2017's beautiful overview of a cursed errand and 2018's breathless re-brand ]

  • "The scheduler isn't ready, but thanks for your patience; here's a new way to spell your component that introduces new timing issues but doesn't address the fact that our system is incredibly slow, built for browsers you no longer support, and that CPUs are not getting faster"

    [ representative pitch ]

  • "Now that you're 'SSR'ing your SPA and have re-spelt all of your components, and given that the scheduler hasn't fixed things and CPUs haven't gotten faster, why not skip SPAs and settle for progressive enhancement of sections of a document?"

    "islands", "server components", etc.  ]

The Steamed Hams of technology pitches.

Like Chalmers, many teams and managers acquiesce to the contradictions embedded in the stacked rationalisations. There are dozens of reasons to look the other way, real and imaginary. But even as the genuinely well-intentioned victims of the complexity merchants meekly recapitulate how trickle-down UX will work this time — if only we try it hard enough! — the evidence mounts that "modern" web development is, in the main, an expensive failure.

The baroque and insular terminology of the in-group is a clue. Its functional purpose (outside of signalling) is to obscure furious plate spinning. This tech isn't working for most adopters, but admitting as much would shrink the market for lemons. You'd be forgiven for thinking the verbiage was designed to obfuscate. Little comfort, then, that folks selling new approaches must now wade through waist-deep jargon excrement to argue for the next increment of complexity.

The most recent turn is as predictable as it is bilious. Today's most successful complexity merchants have never backed down, never apologised, and never come clean about what they knew about the level of expense involved in keeping SPA-oriented technologies in check. But they expect you'll follow them down the next dark alley anyway:

An admission against interest.

And why not? The substitution of heroic developer narratives for user success happened imperceptibly. Admitting it was a mistake would embarrass the good and the great alike. Once the lemon sellers embedded the data-light idea that improved "Developer Experience" ("DX") leads to better user outcomes, improving "DX" became an end unto itself, and many who knew better felt forced to play along. The long lead times in falsifying trickle-down UX were a feature, not a bug; they don't need you to succeed, only to keep buying.

As marketing goes, the "DX" bait-and-switch is brilliant, but the tech isn't delivering for anyone but developers. The goal of the complexity merchants is to put your brand on their marketing page and showcase microsite and to make acqui-hiring your failing startup easier.

Denouement

After more than a decade of JS hot air, the framework-centric pitch is still phrased in speculative terms because there's no there there. The complexity merchants can't cop to the fact that management competence and lower complexity — not baroque technology — are determinative of end-user success.

By turns, the simmering embarrassment of a widespread failure of technology-first approaches has created new pressures that have forced the JS colporteurs into a simulated annealing process. In each iteration, they must accept a smaller and smaller rhetorical lane as their sales grow, but the user outcomes fail to improve.

The excuses are running out.

At long last, the journey has culminated with the rollout of Core Web Vitals. It finally provides an effortless, objective quality measurement prospective customers can use to assess frontend architectures. It's no coincidence the final turn away from the SPA justification has happened just as buyers can see a linkage between the stacks they've bought and the monetary outcomes they already value, namely SEO. The objective buyer, circa 2023, will understand heavy JS stacks as a regrettable legacy, one that teams who have hollowed out their HTML and CSS skill bases will pay dearly for in years to come.

No doubt, many folks who now know their web stacks are slow and outdated will do as Akerlof predicts, and work to obfuscate that reality for managers and customers for as long as possible. The market for lemons is, indeed, mostly a resale market, and the excesses of our lost decade will not be flushed from the ecosystem quickly. Beware tools pitching "100 on Lighthouse" without checking the real-world Core Web Vitals results.
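
For the "check the real-world results" part, a minimal sketch of field measurement, assuming Google's open-source web-vitals package and a hypothetical /analytics endpoint, looks something like this:

```ts
// A sketch of real-user Core Web Vitals collection using the open-source
// `web-vitals` package. The "/analytics" endpoint and payload shape are
// assumptions for illustration, not a real service.
import { onCLS, onINP, onLCP } from "web-vitals";

// Beacon each metric as it finalises so field data, not a lab score,
// answers the "is this stack actually fast for users?" question.
function report(metric: { name: string; value: number; rating: string; id: string }) {
  const body = JSON.stringify({
    name: metric.name,     // "CLS" | "INP" | "LCP"
    value: metric.value,   // ms for LCP/INP, unitless for CLS
    rating: metric.rating, // "good" | "needs-improvement" | "poor"
    id: metric.id,
    url: location.pathname,
  });
  // sendBeacon survives page unload; fall back to fetch with keepalive.
  if (!navigator.sendBeacon?.("/analytics", body)) {
    fetch("/analytics", { method: "POST", body, keepalive: true }).catch(() => {});
  }
}

onCLS(report);
onINP(report);
onLCP(report);
```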

Shrinkage

A subtle aspect of Akerlof's theory is that markets in which lemons dominate eventually shrink. I've warned for years that the mobile web is under threat from within, and the depressing data I've cited about users moving to apps and away from terrible web experiences is in complete alignment with the theory.

More prosaically, when websites feel like worse experiences to those who greenlight digital services, why should anyone expect them to spend a lot to build a website? And when websites stop being where most of the information and services are, who will hire web developers?

The lost decade we've suffered at the hands of lemon purveyors isn't just a local product travesty; it's also an ecosystem-level risk. Forget AI putting web developers out of jobs; JS-heavy web stacks have been shrinking the future market for your services for years.

As Stiglitz memorably quipped:

Adam Smith's invisible hand — the idea that free markets lead to efficiency as if guided by unseen forces — is invisible, at least in part, because it is not there.

But dreams die hard.

I'm already hearing laments from folks who have been responsible citizens of framework-landia lo these many years. Oppressed as they were by the lemon vendors, they worry about babies being thrown out with the bathwater, and I empathise. But for the sake of users, and for the new opportunities for the web that will open up when experiences finally improve, I say "chuck those tubs". Chuck 'em hard, and post the photos of the unrepentant bastards that sold this nonsense behind the cash register.

Anti JavaScript JavaScript Club

We lost a decade to smooth talkers and hollow marketeering; folks who failed the most basic test of intellectual honesty: signposting known unknowns. Instead of engaging honestly with the emerging evidence, they sold lemons and shrunk the market for better solutions. Furiously playing catch-up to stay one step ahead of market rejection, frontend's anguished, belated return to quality has been hindered at every step by those who would stand to lose if their false premises and hollow promises were to be fully re-evaluated.

Toxic mimicry and recalcitrant ignorance must not be rewarded.

Vendors' random walk through frontend choices may eventually lead them to be right twice a day, but that's not a reason to keep following their lead. No, we need to move our attention back to the folks that have been right all along. The people who never gave up on semantic markup, CSS, and progressive enhancement for most sites. The people who, when slinging JS, have treated it as special occasion food. The tools and communities whose culture puts the user ahead of the developer and holds evidence of doing better for users in the highest regard.

It's not healing, and it won't be enough to nurse the web back to health, but tossing the Vercels and the Facebooks out of polite conversation is, at least, a start.