<!doctype html><!-- This is a valid HTML5 document. -->
<!-- Screen readers, SEO, extensions and so on. -->
<html lang="fr">
<!-- Has to be within the first 1024 bytes, hence before the `title` element
See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
<meta charset="utf-8">
<!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
<!-- The viewport meta is quite crowded and we are responsible for that.
See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
<meta name="viewport" content="width=device-width,initial-scale=1">
<!-- Required to make a valid HTML5 document. -->
<title>The Performance Inequality Gap, 2024 (archive) — David Larlet</title>
<meta name="description" content="Publication mise en cache pour en conserver une trace.">
<!-- That good ol' feed, subscribe :). -->
<link rel="alternate" type="application/atom+xml" title="Feed" href="/david/log/">
<!-- Generated from https://realfavicongenerator.net/ such a mess. -->
<link rel="apple-touch-icon" sizes="180x180" href="/static/david/icons2/apple-touch-icon.png">
<link rel="icon" type="image/png" sizes="32x32" href="/static/david/icons2/favicon-32x32.png">
<link rel="icon" type="image/png" sizes="16x16" href="/static/david/icons2/favicon-16x16.png">
<link rel="manifest" href="/static/david/icons2/site.webmanifest">
<link rel="mask-icon" href="/static/david/icons2/safari-pinned-tab.svg" color="#07486c">
<link rel="shortcut icon" href="/static/david/icons2/favicon.ico">
<meta name="msapplication-TileColor" content="#f7f7f7">
<meta name="msapplication-config" content="/static/david/icons2/browserconfig.xml">
<meta name="theme-color" content="#f7f7f7" media="(prefers-color-scheme: light)">
<meta name="theme-color" content="#272727" media="(prefers-color-scheme: dark)">
<!-- Is that even respected? Retrospectively? What a shAItshow…
https://neil-clarke.com/block-the-bots-that-feed-ai-models-by-scraping-your-website/ -->
<meta name="robots" content="noai, noimageai">
<!-- Documented, feel free to shoot an email. -->
<link rel="stylesheet" href="/static/david/css/style_2021-01-20.css">
<!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_regular.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_bold.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_italic.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<script>
function toggleTheme(themeName) {
document.documentElement.classList.toggle(
'forced-dark',
themeName === 'dark'
)
document.documentElement.classList.toggle(
'forced-light',
themeName === 'light'
)
}
const selectedTheme = localStorage.getItem('theme')
if (selectedTheme && selectedTheme !== 'undefined') {
toggleTheme(selectedTheme)
}
</script>

<meta name="robots" content="noindex, nofollow">
<meta content="origin-when-cross-origin" name="referrer">
<!-- Canonical URL for SEO purposes -->
<link rel="canonical" href="https://infrequently.org/2024/01/performance-inequality-gap-2024/">

<body class="remarkdown h1-underline h2-underline h3-underline em-underscore hr-center ul-star pre-tick" data-instant-intensity="viewport-all">


<article>
<header>
<h1>The Performance Inequality Gap, 2024</h1>
</header>
<nav>
<p class="center">
<a href="/david/" title="Aller à l’accueil"><svg class="icon icon-home">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-home"></use>
</svg> Accueil</a> •
<a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/" title="Lien vers le contenu original">Source originale</a>
<br>
Mis en cache le 2024-01-31
</p>
</nav>
<hr>
<p>It's time once again to update our priors regarding the global device and network situation. What's changed since last year? And how much more HTML, CSS, and (particularly) JavaScript can a new project afford?</p>
<h2 id="the-budget%2C-2024">The Budget, 2024 <a class="permalink" href="#the-budget%2C-2024">#</a></h2>
<p>In a departure from previous years, we'll evaluate two sets of baseline numbers for a first load under five seconds on 75<sup>th</sup>-percentile (<abbr>P75</abbr>) devices and networks. First, we'll look at limits for JavaScript-heavy content, and separately we'll enunciate recommendations for markup-centric stacks.</p>
<p>This split decision was available via <a href="https://infrequently.org/2022/12/performance-baseline-2023/">last year's update</a>, but was somewhat buried. Going forward, I'll produce both as top-line guidance. The usual caveats also apply:</p>
<ul>
<li>Performance is a deep and nuanced domain, and much can go wrong beyond content size and composition.</li>
<li>How sites manage resources after load can have a big impact on perceived performance.</li>
<li>Your audience may justify more stringent, or more relaxed, limits.</li>
</ul>
<p>With that stipulated, global baselines matter because many teams have low <a href="https://infrequently.org/2022/05/performance-management-maturity/">performance management maturity</a>, and today's popular frameworks – including some that market performance as a feature – <a href="https://infrequently.org/2023/02/the-market-for-lemons/">fail to ward against catastrophic results</a>.</p>
<p><em>Until and unless teams have better data about their performance, the global baseline budget should be enforced.</em></p>
<p>This isn't charity; it's how teams ensure products stay functional, accessible, and reliable in a market <a href="https://infrequently.org/2023/02/the-market-for-lemons/">awash in bullshit</a>. Limits help teams steer away from complexity and towards tools that generate simpler output that's easier to manage and repair.</p>
<h3 id="javascript-heavy">JavaScript-Heavy <a class="permalink" href="#javascript-heavy">#</a></h3>
<p>Since at least 2015, building JavaScript-first websites has been a predictably terrible idea, yet most of the sites I trace on a daily basis remain mired in script. For these sites, we have to factor in the heavy cost of running JavaScript on the client when describing how much content we can afford. HTML, CSS, images, and fonts can all be parsed and run at near wire speeds on low-end hardware, but JavaScript is at least three times more expensive, byte-for-byte.</p>
<p>Most sites, even those that aspire to be "lived in", feature short median sessions, which means we can't actually justify much in the way of up-front code, and first impressions always matter.</p>
<figure>
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2023/02/the-market-for-lemons/depth-and-frequency-small.png" alt="Most sorts of sites have shallow sessions, making up-front script costs hard to justify." class="preview" decoding="async">
</source></picture>

<figcaption>Most sorts of sites have shallow sessions, making up-front script costs hard to justify.</figcaption>
</figure>
<p>Over the estimated P75 global network, and targeting the slower of our two representative devices — and to hit five seconds to interactivity with only two critical-path network connections — we can afford ~1.3MiB of compressed content, comprised of:</p>
<ul>
<li>650KiB of HTML, CSS, images, and fonts</li>
<li>650KiB of JavaScript</li>
</ul>
<p>If we set the target to a much more reasonable three seconds, our total payload must fit in only ~730KiB, with no more than 365KiB of compressed JavaScript.</p>
<p>Similarly, if we keep the five second target but open five <abbr>TLS</abbr> connections, our budget would be closer to 1MiB. If the target were reset to three seconds with five connections, our total payload falls to ~460KiB, leaving only ~230KiB for scripts.</p>
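<p>A back-of-envelope sketch (not the exact model behind these figures) helps show how the split follows from the "JavaScript is at least three times more expensive, byte-for-byte" observation above: weigh script bytes accordingly and compare the quoted budgets on a single "effective KiB" scale. The 3× multiplier and the example splits come from this post; everything else is illustrative.</p>
<pre><code>// Rough sketch, not the author's exact model: weigh each KiB of JavaScript at
// ~3× the cost of a KiB of HTML/CSS/images/fonts and compare budgets on a
// single "effective KiB" scale.
const JS_COST_MULTIPLIER = 3; // "at least three times more expensive, byte-for-byte"

function effectiveWeightKiB({ markupKiB, jsKiB }) {
  return markupKiB + JS_COST_MULTIPLIER * jsKiB;
}

// The JavaScript-heavy splits quoted above:
console.log(effectiveWeightKiB({ markupKiB: 650, jsKiB: 650 })); // 5s, 2 connections: 2600
console.log(effectiveWeightKiB({ markupKiB: 365, jsKiB: 365 })); // 3s, 2 connections: 1460
</code></pre>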
<h3 id="markup-heavy">Markup-Heavy <a class="permalink" href="#markup-heavy">#</a></h3>
<p>Sites comprised mostly of markup (HTML and CSS) can afford a <em>lot</em> more, although CSS complexity and poorly-loaded fonts can still slow down otherwise quick content. Conservatively, to load in five seconds over, at most, two connections, we should try to keep content under 2.5MiB, including:</p>
<ul>
<li>2.4MiB of HTML, CSS, images, and fonts, and</li>
<li>100KiB of JavaScript.</li>
</ul>
<p>To hit a more reasonable three second first-load target with two connections, we should aim for a max 1.4MiB transfer, made up of:</p>
<ul>
<li>1.325MiB of HTML, CSS, etc., and</li>
<li>75KiB of JavaScript.</li>
</ul>
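<p>Applying the same rough rule from the earlier sketch to these markup-heavy splits lands them on roughly the same effective scale as the JavaScript-heavy budgets, which is why markup-centric stacks can afford so much more wire weight for the same load-time target. The quoted budgets are round-number guidance, so the totals only match approximately.</p>
<pre><code>// Markup-heavy splits, same rough rule as the earlier sketch:
console.log(effectiveWeightKiB({ markupKiB: 2400, jsKiB: 100 })); // 5s, 2 connections: 2700 (vs. 2600 for the JS-heavy split)
console.log(effectiveWeightKiB({ markupKiB: 1325, jsKiB: 75 }));  // 3s, 2 connections: 1550 (vs. 1460 for the JS-heavy split)
</code></pre>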
<p>These are generous targets. The blog you're reading <a href="https://www.webpagetest.org/video/compare.php?tests=240130_AiDcW6_5QC-r%3A1-c%3A0&amp;thumbSize=200&amp;ival=100&amp;end=full">loads over a single connection in ~1.2 seconds on the target device and network profile, consuming 120KiB of critical path resources to become interactive, only 8KiB of which is script</a>.</p>
<h3 id="calculate-your-own">Calculate Your Own <a class="permalink" href="#calculate-your-own">#</a></h3>
<p>As in years past, you can use <a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/chart/index.html">the interactive estimate chart</a> to understand how connections and devices impact budgets. This year the chart has been updated to let you choose between JavaScript-heavy and JavaScript-light content composition, and it incorporates the updated network and device baselines described below.</p>
<figure>
<a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/chart/index.html" alt="&lt;em&gt;Tap to try the interactive version.&lt;/em&gt;" target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png" alt="&lt;em&gt;Tap to try the interactive version.&lt;/em&gt;" decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption><em>Tap to try the interactive version.</em></figcaption>
</figure>
<p>It's straightforward to understand the number of critical path network connections for a site from DevTools and to eyeball the content composition. Armed with that information, it's possible to use this estimator to quickly understand what sort of first-load experience users at the margins can expect. Give it a try!</p>
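<p>One way to eyeball both numbers without leaving the browser is the Resource Timing API. The snippet below is a quick approximation to paste into the DevTools console after a cold load, not a rigorous critical-path analysis: it tallies everything fetched so far and counts distinct origins as a stand-in for connections. Note that <code>transferSize</code> reports 0 for cross-origin responses that don't send <code>Timing-Allow-Origin</code>, so treat the totals as a lower bound.</p>
<pre><code>// Paste into the DevTools console after a cold load: compressed bytes by
// initiator type, plus distinct origins as a rough proxy for connection count.
const entries = [
  ...performance.getEntriesByType('navigation'),
  ...performance.getEntriesByType('resource'),
];
const bytesByType = {};
const origins = new Set();
for (const e of entries) {
  origins.add(new URL(e.name).origin);
  const type = e.initiatorType || 'other';
  bytesByType[type] = (bytesByType[type] || 0) + e.transferSize;
}
console.table(bytesByType); // e.g. navigation, script, css, img, link, …
console.log(`distinct origins contacted: ${origins.size}`);
</code></pre>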
<h2 id="situation-report">Situation Report <a class="permalink" href="#situation-report">#</a></h2>
<p>These recommendations are not context-free, and you may disagree with them in whole or in part. To the extent that other estimates are more grounded, or based on different situational data, they may be more appropriate for specific products and teams. Many critiques are possible, of the target (five seconds for first load), the sample population (worldwide internet users), and the methodology (informed reckons). Regardless, I present the thinking behind them because it can provide teams with informed points of departure, and also because it helps clarify the ritual freakout taking place as <a href="https://web.dev/articles/inp"><abbr>INP</abbr> begins to put a price on JavaScript externalities</a>.</p>
<p>It's clear that developers <a href="https://rviscomi.dev/2023/11/a-faster-web-in-2024/">are out of touch with market ground-truth</a>, but it's not obvious why. Understanding the differences in the experiences of wealthy developers versus working-class users helps to make the diffuse surface of the privilege bubble perceptible.</p>
<p>Engineering is the discipline of designing solutions under specific constraints. For the front end to improve, it must finally learn to operate within the envelope of what's possible on <em>most</em> devices.</p>
<h3 id="mobile">Mobile <a class="permalink" href="#mobile">#</a></h3>
<p>The "i" in iPhone stands for "inequality".</p>
<p>Owing to the chasm of global wealth inequality, premium devices are largely absent in markets with billions of users. India's iOS share has <a href="https://economictimes.indiatimes.com/tech/technology/apple-set-to-end-2023-with-7-market-share-for-iphones-in-android-dominated-india/articleshow/103532336.cms?from=mdr">surged to an all-time high of 7%</a> on the back of last-generation and refurbished devices. That's a market of 1.43 billion people where Apple <a href="https://www.counterpointresearch.com/insights/india-smartphone-share/">doesn't even crack the top five in terms of shipments</a>.</p>
<p>The Latin American (<abbr>LATAM</abbr>) region, home to more than 600 million people and <a href="https://www.statista.com/topics/7195/smartphone-market-in-latin-america/#topicOverview">nearly 200 million smartphones</a> shows a <a href="https://www.canalys.com/newsroom/latam-smartphone-market-q3-2023">similar market composition</a>:</p>
<figure>
<a href="https://www.counterpointresearch.com/research_portal/counterpoint-quarterly-smartphone-q4-2023/" alt="In &lt;abbr&gt;LATAM&lt;/abbr&gt;, iPhones make up less than 6% of total device shipments." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp" alt="In &lt;abbr&gt;LATAM&lt;/abbr&gt;, iPhones make up less than 6% of total device shipments." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>In <abbr>LATAM</abbr>, iPhones make up less than 6% of total device shipments.</figcaption>
</figure>
<p>Everywhere wealth is unequally distributed, the haves <a href="https://www.statista.com/statistics/512863/smartphones-cell-phones-tablets-and-ereaders-brands-owned-by-affluent-americans/">read about it in Apple News over 5G while the have-nots struggle to get reliable 4G coverage for their Androids.</a> In <a href="https://assets.publishing.service.gov.uk/media/62a1cb0b8fa8f50395c0a0e7/Consumer_purchasing_behaviour_in_the_UK_smartphone_market_-_CMA_research_report_new.pdf">country after country (PDF)</a> the embedded inequality of our societies sorts ownership of devices by price, and brand through price segmentation.</p>
<p>This matters because the properties of those devices dominate the experiences we can deliver. In the U.S., the term "smartphone dependence" has been coined to describe folks without other ways to access the increasing fraction of essential services only available through the internet. Unsurprisingly, folks who can't afford other internet-connected devices or a fixed broadband subscription are also most likely to buy less expensive (and therefore slower) smartphones:</p>
<figure>
<a href="https://www.pewresearch.org/internet/fact-sheet/mobile/?tabId=tab-011fca0d-9756-4f48-b352-d58f343696bf" alt="undefined" target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp" alt="Missing alt text" decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption></figcaption>
</figure>
<p>As smartphone ownership and use grow, the front ends we deliver are ever-more mediated by the properties of those devices. The inequality between the high-end and low-end, even in wealthy countries, is only growing. What we choose to do in response defines what it means to practice <abbr>UX</abbr> engineering ethically.</p>
<h4 id="device-performance">Device Performance <a class="permalink" href="#device-performance">#</a></h4>
<p>Extending the <abbr title="system-on-chip">SoC</abbr> performance by price point series with another year's data, the picture remains ugly. The segments are roughly "fastest iPhone", "fastest Android", "budget", and "low-end":</p>
<figure>
<a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Geekbench 5 single-core scores for each mobile price point." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Geekbench 5 single-core scores for each mobile price point." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption><em>Tap for a larger version.</em><br>Geekbench 5 single-core scores for each mobile price point.</figcaption>
</figure>
<p>Not only have fruity phones extended their single-core <abbr>CPU</abbr> performance lead over contemporary high-end Androids to <em>a four-year advantage</em>, but the performance-per-dollar curve also remains unfavourable to Android buyers.</p>
<p>At the time of publication, the cheapest iPhone 15 Pro (the only device with the A17 Pro chip) is $999 <abbr>MSRP</abbr>, while the S23 (using the Snapdragon 8 Gen 2) can be had for $860 from Samsung. This nets out to 2.32 points per dollar for the iPhone, but only 1.6 points per dollar for the S23.</p>
<p>Meanwhile, a Samsung A24 that is $175 new, unlocked, and available on Amazon today scores a more reasonable 3.1 points per dollar on single-core performance, but is more than 4.25× slower than the leading contemporary iPhone.</p>
<p>The delta between the fastest iPhones and moderately priced new devices rose from 1,522 points last year to 1,774 today.</p>
<p>Put another way, the performance gap between what devices the wealthy carry and what budget shoppers carry grew more this year (252 points) than the year-over-year gains from process and architecture at the volume price point (174 points). This is particularly depressing because single-core performance tends to determine the responsiveness of web app workloads.</p>
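<p>The arithmetic behind those figures is worth spelling out. In the sketch below, the single-core scores are back-computed from the quoted prices and points-per-dollar figures (price × points per dollar), so they are illustrative stand-ins rather than fresh benchmark runs; the ratios and deltas then fall out directly.</p>
<pre><code>// Scores back-computed from the quoted points-per-dollar figures; illustrative only.
const devices = [
  { name: 'iPhone 15 Pro (A17 Pro)',         priceUSD: 999, score: 2318 }, // ≈ 999 × 2.32
  { name: 'Galaxy S23 (Snapdragon 8 Gen 2)', priceUSD: 860, score: 1376 }, // ≈ 860 × 1.6
  { name: 'Samsung A24 (budget)',            priceUSD: 175, score: 543 },  // ≈ 175 × 3.1
];
for (const d of devices) {
  console.log(d.name, (d.score / d.priceUSD).toFixed(2), 'points per dollar');
}
console.log((devices[0].score / devices[2].score).toFixed(2) + '× slower');      // ≈ 4.27
console.log(devices[0].score - devices[2].score, 'points of single-core delta'); // ≈ 1775
</code></pre>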
<p>A less pronounced version of the same story continues to play out in multi-core performance:</p>
<figure>
<a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Round and round we go: Android ecosystem &lt;abbr&gt;SoC&lt;/abbr&gt;s are improving, but the Performance Inequality Gap continues to grow. Even the fastest Androids are two-plus years behind iOS-ecosystem devices." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Round and round we go: Android ecosystem &lt;abbr&gt;SoC&lt;/abbr&gt;s are improving, but the Performance Inequality Gap continues to grow. Even the fastest Androids are two-plus years behind iOS-ecosystem devices." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption><em>Tap for a larger version.</em><br>Round and round we go: Android ecosystem <abbr>SoC</abbr>s are improving, but the Performance Inequality Gap continues to grow. Even the fastest Androids are two-plus years behind iOS-ecosystem devices.</figcaption>
</figure>
<p>Recent advances in high-end Android multi-core performance have closed the previous three-year gap to 18 months. Meanwhile, budget-segment devices have finally started to see improvement (<a href="https://infrequently.org/2021/03/the-performance-inequality-gap/#:~:text=The%20good%20news%20is%20that%20this%20will%20change%20rapidly%20in%20the%20next%20few%20years.">as this series predicted</a>), thanks to hand-me-down architecture and process-node improvements. That's where the good news ends.</p>
<p>The multi-core performance gap between i-devices and budget Androids grew considerably, with the score delta rising from 4,318 points last year to 4,936 points in 2023.</p>
<p>Looking forward, we can expect high-end Androids to at least stop falling further behind owing to <a href="https://www.androidauthority.com/snapdragon-8-gen-3-dimensity-9300-benchmarked-3395385/">a new focus on performance by Qualcomm's Snapdragon 8 gen 3 and MediaTek's Dimensity 9300 offerings</a>. This change is long, long overdue and will take years to filter down into positive outcomes for the rest of the ecosystem. Until that happens, the gap in experience for the wealthy versus the rest will not close.</p>
<p>iPhone owners live in a different world than high-end Android buyers, and light-years away from what the bulk of the market experiences. No matter how you slice it, the performance inequality gap is growing for <abbr>CPU</abbr>-bound workloads like JavaScript-heavy web apps.</p>
<h4 id="networks">Networks <a class="permalink" href="#networks">#</a></h4>
<p>As ever, 2023 re-confirmed an essential truth when it comes to user experience: <a href="https://www.opensignal.com/2023/05/11/poor-connectivity-damages-the-mobile-app-business">when things are slow, users engage less often.</a> Doing a good job in an uneven network environment requires thinking about availability and engineering for resilience and a lightweight footprint — it's always better to avoid testing the radio gods than to spend weeks or months appeasing them after the damage is done.</p>
<p>5G network deployment continues apace, but as with the arrival of 4G, it is happening unevenly and in ways and places that exacerbate (rather than lessen) performance inequality.</p>
<p>Data on mobile network evolution is sketchy, and the largest error bars in this series' analysis continue to reside in this section. Regardless, we can look to industry summaries like the <a href="https://www.gsma.com/mobileeconomy/wp-content/uploads/2023/03/270223-The-Mobile-Economy-2023.pdf"><abbr>GSMA</abbr>'s report on "The Mobile Economy 2023" (PDF)</a> for a directional understanding that we can triangulate with other data points to develop a strong intuition.</p>
<p>For instance, <abbr>GSMA</abbr> predicts that 5G will only comprise half of connections by 2030. Meanwhile, McKinsey <a href="https://www.techtarget.com/whatis/feature/5-Predictions-about-5G-Adoption-in-2021-and-Beyond#:~:text=Regardless%20of%20the,the%205G%20revolution.%22">predicts</a> that high-quality 5G (networks that use 6GHz bands) will only cover a quarter of the world's population by 2030. Regulatory roadblocks are <a href="https://economictimes.indiatimes.com/industry/telecom/telecom-news/itu-reaches-agreement-to-open-new-6-ghz-spectrum-band-for-5g-6g/articleshow/106000126.cms">still being cleared</a>.</p>
<p>As we said in 2021, <em>"<a href="https://infrequently.org/2021/03/the-performance-inequality-gap/#oh-em-gee">4G is a miracle, 5G is a mirage</a>."</em></p>
<p>This doesn't mean that 4G is one thing, or that it's deployed evenly, or even that the <a href="https://www.opensignal.com/2023/06/29/more-usable-spectrum-boosts-the-4g-and-5g-experience">available spectrum will remain stable</a> within a single generation of radio technology. For example, India's network environment has continued to evolve since the <a href="https://www.kaiostech.com/reliance-jio-became-worlds-fastest-growing-mobile-network/">Reliance Jio revolution</a> that drove 4G into the mainstream and pushed the price of a mobile megabyte down by ~90% on <em>every</em> subcontinental carrier. But that's not the whole story! <a href="https://www.speedtest.net/global-index/india#mobile">Speedtest.net's data for India shows dramatic gains, for example</a>, and <a href="https://www.financialexpress.com/business/industry/mobile-download-speeds-india-moves-up-72-spots-in-global-ranking/3260813/">analysts credit this to improved infrastructure density, expanded spectrum, and back-haul improvements related to the 5G rollout</a> — all of which is to say that 4G users are getting better experiences than they did last year <em>because of</em> 5G's role in reducing contention.</p>
<figure>
<a href="https://www.speedtest.net/global-index/india#mobile" alt="India's speed test medians are moving quickly, but variance is orders-of-magnitude wide, with 5G penetration below 25% in the most populous areas." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp" alt="India's speed test medians are moving quickly, but variance is orders-of-magnitude wide, with 5G penetration below 25% in the most populous areas." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>India's speed test medians are moving quickly, but variance is orders-of-magnitude wide, with 5G penetration below 25% in the most populous areas.</figcaption>
</figure>
<p>These sorts of gains are easy to miss if we look only at headline "4G vs. 5G" coverage, so it's important to level-set as new data becomes available. Improvements can arrive unevenly, with the "big" story happening slowly, long after the initial buzz of headlines wears off. These effects reward us for looking at P75+, not just means or medians, and intentionally updating priors on a regular basis.</p>
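<p>Percentiles make that concrete: a handful of very fast connections drags the mean up without changing what most users experience, which is why this series anchors on the experience that 75% of users meet or exceed rather than on averages. A minimal sketch with made-up throughput samples:</p>
<pre><code>// Hypothetical throughput samples (Mbps); nearest-rank percentile for illustration.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const index = Math.max(0, Math.ceil((p / 100) * sorted.length) - 1);
  return sorted[index];
}
const mbps = [2, 3, 3, 4, 5, 6, 8, 12, 40, 120];
console.log(mbps.reduce((a, b) => a + b) / mbps.length); // mean ≈ 20.3 (flattering)
console.log(percentile(mbps, 50));                       // median = 5
console.log(percentile(mbps, 25));                       // speed 75% of users meet or exceed = 3
</code></pre>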
<p>Events can turn our intuitions on their heads, too. Japan is famously well connected. I've personally experienced rock-solid 4G through entire Tokyo subway journeys, <a href="https://en.wikipedia.org/wiki/Roppongi_Station">more than 40m underground</a> and with no hiccups. And yet, the network environment has been largely unchanged by the introduction of 5G. Because carriers provisioned more than adequately in the 4G era, there is little pent-up demand for the new technology to satisfy, so it isn't having the same impact. And while Japan's headline speeds aren't the fastest, quality of service is distributed across users in a <em>much</em> more egalitarian way:</p>
<figure>
<a href="https://www.speedtest.net/global-index/japan#mobile" alt="Japan's network environment isn't the fastest, but is much more evenly distributed." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp" alt="Japan's network environment isn't the fastest, but is much more evenly distributed." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Japan's network environment isn't the fastest, but is much more evenly distributed.</figcaption>
</figure>
<p>Fleet device composition has big effects, owing to differences in signal-processing compute availability and spectrum compatibility. At a population level, these influences play out slowly as devices age out, but still have impressively positive impacts:</p>
<figure>
<a href="https://www.opensignal.com/2023/09/25/users-should-upgrade-their-iphone-to-have-the-best-mobile-network-experience" alt="Device impact on network performance is visible in Opensignal's iPhone dataset." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp" alt="Device impact on network performance is visible in Opensignal's iPhone dataset." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Device impact on network performance is visible in Opensignal's iPhone dataset.</figcaption>
</figure>
<p>As inequality grows, <a href="https://www.financialexpress.com/business/industry/mobile-download-speeds-india-moves-up-72-spots-in-global-ranking/3260813/#:~:text=In%20fact%2C%20in,of%20385.50%20Mbps.">averages and "generation" tags can become illusory and misleading</a>. Our own experiences are no guide; we've got to keep our hands in the data to understand the texture of the world.</p>
<p>So, with all of that as prelude, what <em>can</em> we say about where the mobile network baseline should be set? In a departure from years prior, I'm going to use a unified network estimate (see below). You'll have to read on for what it is! But it won't be based on the sort of numbers that folks explicitly running speed tests see; those aren't real life.</p>

<h4 id="market-factors">Market Factors <a class="permalink" href="#market-factors">#</a></h4>
<p>The market forces this series <a href="https://infrequently.org/2017/10/can-you-afford-it-real-world-web-performance-budgets/#global-ground-truth">previewed in 2017</a> have played out in roughly a straight line: smartphone penetration in emerging markets is approaching saturation, ensuring a growing fraction of purchases are made by upgrade shoppers. Those who upgrade see more value in their phones and save to buy better second and third devices. Combined with the <a href="https://en.wikipedia.org/wiki/IPhone_X#:~:text=At%20the%20time%20of%20its,local%20sales%20and%20import%20taxes.">emergence</a> and <a href="https://www.counterpointresearch.com/insights/premium-smartphone-asp-reaches-record-q2-high/">growth of the "ultra premium" segment</a>, average selling prices (<abbr>ASP</abbr>s) have risen.</p>
<p>2022 and 2023 have established an inflection point in this regard, with worldwide average selling prices <a href="https://www.idc.com/getdoc.jsp?containerId=prUS51430223">jumping to more than $430</a>, up from $300-$350 for much of the decade prior. Some price appreciation has been <a href="https://www.technavio.com/report/smartphone-market-industry-analysis">due to transient impacts of the U.S./China trade wars</a>, but most of it appears driven by iOS <abbr>ASP</abbr>s, which peaked above $1,000 for the first time in 2023. Android <abbr>ASP</abbr>s, meanwhile, continued a gradual rise to nearly $300, up from $250 five years ago.</p>
<figure>
<a href="https://www.idc.com/getdoc.jsp?containerId=prUS51430223" alt="undefined" target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png" alt="Missing alt text" decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption></figcaption>
</figure>
<p>A <a href="https://www.counterpointresearch.com/insights/global-smartphone-market-reaches-its-lowest-q3-levels-in-a-decade-apples-share-at-16/">weak market for handsets in 2023</a>, plus stable sales for iOS, had a notable impact on prices. <abbr>IDC</abbr> expects global average prices to fall back below $400 by 2027 as Android volumes increase from an unusually soft 2023.</p>
<figure>
<a href="https://www.counterpointresearch.com/research_portal/counterpoint-quarterly-smartphone-q4-2023/" alt="Counterpoint data shows declining sales in both 2022 and 2023." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp" alt="Counterpoint data shows declining sales in both 2022 and 2023." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Counterpoint data shows declining sales in both 2022 and 2023.</figcaption>
</figure>
<figure>
<a href="https://www.counterpointresearch.com/research_portal/counterpoint-quarterly-smartphone-q4-2023/" alt="Shipment growth in late 2023 and beyond is coming from emerging markets like the Middle East and Africa. Samsung's A-series mid-tier is doing particularly well." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp" alt="Shipment growth in late 2023 and beyond is coming from emerging markets like the Middle East and Africa. Samsung's A-series mid-tier is doing particularly well." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Shipment growth in late 2023 and beyond is coming from emerging markets like the Middle East and Africa. Samsung's A-series mid-tier is doing particularly well.</figcaption>
</figure>
<p>Despite falling sales, distribution of Android versus iOS sales remains largely unchanged:</p>
<figure>
<a href="https://www.counterpointresearch.com/insights/global-smartphone-os-market-share/" alt="Android sales reliably constitute 80-85% of worldwide volume." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp" alt="Android sales reliably constitute 80-85% of worldwide volume." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Android sales reliably constitute 80-85% of worldwide volume.</figcaption>
</figure>
<figure>
<a href="https://www.statista.com/statistics/245191/market-share-of-mobile-operating-systems-for-smartphone-sales-in-australia/" alt="Even in rich nations like Australia and the &lt;a href='https://www.statista.com/statistics/262179/market-share-held-by-mobile-operating-systems-in-the-united-kingdom/'&gt;the U.K.&lt;/a&gt;, iPhones account for less than half of sales. Predictably, they are over-represented in analytics and logs owing to wealth-related factors including superior network access and performance hysteresis." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp" alt="Even in rich nations like Australia and the &lt;a href='https://www.statista.com/statistics/262179/market-share-held-by-mobile-operating-systems-in-the-united-kingdom/'&gt;the U.K.&lt;/a&gt;, iPhones account for less than half of sales. Predictably, they are over-represented in analytics and logs owing to wealth-related factors including superior network access and performance hysteresis." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Even in rich nations like Australia and <a href="https://www.statista.com/statistics/262179/market-share-held-by-mobile-operating-systems-in-the-united-kingdom/">the U.K.</a>, iPhones account for less than half of sales. Predictably, they are over-represented in analytics and logs owing to wealth-related factors, including superior network access and performance hysteresis.</figcaption>
</figure>
<p>Smartphone replacement rates have remained roughly in line with previous years, although we should expect replacement cycles to lengthen in next year's data. <a href="https://www.sellcell.com/blog/how-often-do-people-upgrade-their-phone-2023-statistics/">Survey reports</a> and market analysts continue to estimate average replacement at 3-4 years, depending on segment. Premium devices last longer, and a higher fraction of devices may be older in wealthy geographies. Combined with <a href="https://www2.deloitte.com/us/en/insights/economy/consumer-pulse/state-of-the-us-consumer.html">discretionary spending pressure</a> and <a href="https://www.ons.gov.uk/economy/inflationandpriceindices/articles/costofliving/latestinsights">inflationary impacts on household budgets</a>, consumer intent to spend on electronics has taken a hit, which will be felt as extended device lifetimes until conditions improve. <a href="https://www.counterpointresearch.com/insights/apple-refurbished-smartphone-volumes-grew-16-yoy-globally-in-2022/">Increasing demand for refurbished devices</a> also adds to observable device aging.</p>
<p>The data paints a substantially similar picture to previous years: the web is experienced on devices that are slower and older than those carried by affluent developers and corporate directors whose purchasing decisions are not impacted by transitory inflation.</p>
<p>To serve users effectively, we must do extra work to <a href="https://glazkov.com/2023/07/30/live-as-our-customer/">live as our customers do</a>.</p>
<h4 id="test-device-recommendations">Test Device Recommendations <a class="permalink" href="#test-device-recommendations">#</a></h4>
<p>Re-using <a href="https://infrequently.org/2022/12/performance-baseline-2023/#devices-1">last year's P75 device calculus</a>, our estimate is based on a device sold new, unlocked for the mid-2020 to mid-2021 global <abbr>ASP</abbr> of ~$350-375.</p>
<p>Representative examples from that time period include the <a href="https://www.gsmarena.com/samsung_galaxy_a51-9963.php">Samsung Galaxy A51</a> and the <a href="https://www.gsmarena.com/google_pixel_4a-10123.php">Pixel 4a</a>. Neither model featured 5G, and we cannot expect 5G to play a significant role in worldwide baselines for at least the next several years.</p>
<p>The A51 featured <a href="https://www.gsmarena.com/samsung_galaxy_a51-9963.php#:~:text=Octa%2Dcore%20(4x2.3%20GHz%20Cortex%2DA73%20%26%204x1.7%20GHz%20Cortex%2DA53)">eight slow cores (4x2.3 GHz Cortex-A73 and 4x1.7 GHz Cortex-A53) on a 10nm process</a>:</p>
<figure>
<a href="https://browser.geekbench.com/v6/cpu/compare/350184?baseline=3639070" alt="Geekbench 6 scores for the Galaxy A51 versus today's leading device." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp" alt="Geekbench 6 scores for the Galaxy A51 versus today's leading device." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Geekbench 6 scores for the Galaxy A51 versus today's leading device.</figcaption>
</figure>
<p><a href="https://www.gsmarena.com/google_pixel_4a-10123.php#:~:text=Octa%2Dcore%20(2x2.2%20GHz%20Kryo%20470%20Gold%20%26%206x1.8%20GHz%20Kryo%20470%20Silver)">The Pixel 4a's slow, eight-core big.LITTLE configuration was fabricated on an 8nm process</a>:</p>
<figure>
<a href="https://browser.geekbench.com/v6/cpu/compare/4295850?baseline=3639070" alt="Google spent more on the &lt;abbr&gt;SoC&lt;/abbr&gt; for the Pixel 4a and enjoyed a later launch date, boosting performance relative to the A51." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp" alt="Google spent more on the &lt;abbr&gt;SoC&lt;/abbr&gt; for the Pixel 4a and enjoyed a later launch date, boosting performance relative to the A51." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Google spent more on the <abbr>SoC</abbr> for the Pixel 4a and enjoyed a later launch date, boosting performance relative to the A51.</figcaption>
</figure>
<p><a href="https://www.androidpolice.com/why-google-pixel-phones-hardware-do-not-sell/">Pixels have never sold well,</a> and Google's focus on strong <qabbr>SoC performance per dollar was sadly not replicated across the Android ecosystem, forcing us to use the A51 as our stand-in.</qabbr></p>
<p>Devices within the envelope of our attention are 15-25% as fast as those carried by programmers and their bosses — even in wealthy markets.</p>
<p>The A51 may be <a href="https://browser.geekbench.com/v6/cpu/compare/4301594?baseline=442665">slightly faster</a> than last year's <a href="https://infrequently.org/2022/12/performance-baseline-2023/#:~:text=The%20best%20analogue%20you%20can%20buy%20for%20a%20representative%20P75%20device%20today%20are%20~%24200%20Androids%20from%20the%20last%20year%20or%20two%2C%20such%20as%20the%20Samsung%20Galaxy%20A50%20and%20the%20Nokia%20G11.">recommendation</a> of the <a href="https://www.gsmarena.com/samsung_galaxy_a50-9554.php">Galaxy A50 for testing</a>, but the picture is muddy:</p>
<figure>
<a href="https://browser.geekbench.com/v5/cpu/compare/22080983?baseline=22095605" alt="Geekbench 5 shows almost no improvement between the A50 and the A51." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp" alt="Geekbench 5 shows almost no improvement between the A50 and the A51." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Geekbench 5 shows almost no improvement between the A50 and the A51.</figcaption>
</figure>
<figure>
<a href="https://browser.geekbench.com/v6/cpu/compare/4302358?baseline=4260956" alt="Geekbench 6 shows the same story within the margin of error. The low-end is stagnant, and still &lt;a href='https://www.statista.com/statistics/934471/smartphone-shipments-by-price-category-worldwide/' target='_new'&gt;30% of worldwide volume&lt;/a&gt;." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp" alt="Geekbench 6 shows the same story within the margin of error. The low-end is stagnant, and still &lt;a href='https://www.statista.com/statistics/934471/smartphone-shipments-by-price-category-worldwide/' target='_new'&gt;30% of worldwide volume&lt;/a&gt;." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Geekbench 6 shows the same story within the margin of error. The low-end is stagnant, and still <a href="https://www.statista.com/statistics/934471/smartphone-shipments-by-price-category-worldwide/" target="_new">30% of worldwide volume</a>.</figcaption>
</figure>
<p>If you're building a test lab today, refurbished A51s can be had for ~$150. Even better, the newer <a href="https://browser.geekbench.com/v6/cpu/compare/3826666?baseline=350184">Nokia G100</a> can be had for as little as $100, and it's <a href="https://www.nokia.com/phones/en_us/nokia-g-100?sku=F22CF51022200">faithful</a> to the sluggish original in <a href="https://www.gsmarena.com/compare.php3?idPhone1=9963&amp;idPhone2=12373">nearly every respect</a>.</p>
<p>If your test bench is based on last year's recommended A50 or <a href="https://www.gsmarena.com/nokia_g11-11358.php">Nokia G11</a>, I do not recommend upgrading in 2024. The absolute gains are so slight that the difference will be hard to feel, and bench stability has a value all its own. Looking forward, we can also predict that our bench performance will be stable until 2025.</p>
<p>Claims about how "performant" modern front-end tools are have to be evaluated in this slow, stagnant context.</p>
<h3 id="desktop">Desktop <a class="permalink" href="#desktop">#</a></h3>
<p>It's a bit easier to understand the Desktop situation because the Edge telemetry I have access to provides statistically significant insight into <a href="https://www.statista.com/statistics/576473/united-states-quarterly-pc-shipment-share-apple/">85+% of the market</a>.</p>
<h4 id="device-performance-1">Device Performance <a class="permalink" href="#device-performance-1">#</a></h4>
<p>The <abbr>TL;DR</abbr> for desktop performance is that Edge telemetry puts ~45% of devices in a "low-end" bucket, meaning they have &lt;= 4 cores or &lt;= 4GB of RAM.</p>
<table class="summary">
<thead>
<tr>
<td>Device Tier</td>
<td>Fleet %</td>
<td>Definition</td>
</tr>
</thead>
<tbody>
<tr>
<td>Low-end</td>
<td>45%</td>
<td>Either:<br>&lt;= 4 cores, or<br>&lt;= 4GB RAM</td>
</tr>
<tr>
<td>Medium</td>
<td>48%</td>
<td><abbr>HDD</abbr> (not <abbr>SSD</abbr>), or<br>4-16 GB RAM, or<br>4-8 cores</td>
</tr>
<tr>
<td>High</td>
<td>7%</td>
<td><abbr>SSD</abbr> +<br>&gt; 8 cores +<br>&gt; 16GB RAM</td>
</tr>
</tbody>
</table>
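<p>For illustration only, here is a tiny sketch of the bucketing described in the table. It is not Edge's actual telemetry code; the function name and thresholds are simply lifted from the rows above:</p>
<pre><code>// Sketch of the device-tier bucketing described above (illustrative only).
// Thresholds come from the table; the real telemetry pipeline is not public.
function deviceTier({ cores, ramGB, hasSSD }) {
  if (cores &lt;= 4 || ramGB &lt;= 4) {
    return "low-end";
  }
  if (hasSSD &amp;&amp; cores &gt; 8 &amp;&amp; ramGB &gt; 16) {
    return "high";
  }
  return "medium"; // HDD, 4-16GB RAM, or 4-8 cores
}

deviceTier({ cores: 4, ramGB: 8, hasSSD: true });   // "low-end"
deviceTier({ cores: 8, ramGB: 16, hasSSD: true });  // "medium"
deviceTier({ cores: 12, ramGB: 32, hasSSD: true }); // "high"
</code></pre>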
<p>20% of users are on <abbr>HDD</abbr>s (not <abbr>SSD</abbr>s) and nearly all of those users also have few (and slow) cores.</p>
<p>You might be tempted to dismiss this data because it doesn't include Macs, which are faster than the PC cohort. Recall, however, that the snapshot also excludes ChromeOS.</p>
<p>ChromeOS share has veered wildly in recent years, representing 50%-200% of Mac shipments in a given quarter. In '21 and '22, ChromeOS shipments regularly doubled Mac sales. Despite post-pandemic mean reversion, <a href="https://www.idc.com/getdoc.jsp?containerId=IDC_P36344">according to <abbr>IDC</abbr></a> ChromeOS devices outsold Macs ~5.7M to ~4.7M in 2023 Q2. The trend reversed in Q3, with Macs almost doubling ChromeOS sales, but slow ChromeOS devices aren't going away and, from a population perspective, more than offset Macs today. Analysts also <a href="https://www.idc.com/promo/pcdforecast">predict growth in the low end of the market as educational institutions begin to refresh their past purchases.</a></p>
<h4 id="networks-1">Networks <a class="permalink" href="#networks-1">#</a></h4>
<p>Desktop-attached networks <a href="https://www.fiercetelecom.com/broadband/ookla-global-fixed-download-speeds-nearly-doubled-2022">continue to improve</a>, notably <a href="https://www.allconnect.com/blog/broadband-availability-by-type">in the U.S.</a> Regulatory intervention and subsidies have done much to spur enhancements in access to U.S. fixed broadband, although <a href="https://www.allconnect.com/blog/broadband-availability-by-type#:~:text=Top%20and%20bottom%20states%20for%20speed">disparities in access remain</a> and the gains <a href="https://www.cbsnews.com/news/affordable-internet-service-could-be-lost-fcc-program-to-run-out-of-funds/">may not persist</a>.</p>
<p>This suggests that it's time to also bump our baseline for desktop tests beyond the 5Mbps/1Mbps/28ms configuration that <a href="https://www.webpagetest.org/">WebPageTest.org's "Cable" profile</a> has defaulted to for desktop tests.</p>
<p>How far should we bump it? Publicly available data is unclear, and I've come to find out that Edge's telemetry lacks good network observation statistics (doh!); Windows telemetry doesn't capture a proxy for network quality, I no longer have access to Chrome's data, the <a href="https://developer.chrome.com/docs/crux/api#effective_connection_type">population-level telemetry available from CrUX is unhelpful</a>, and <a href="https://www.fcc.gov/reports-research/reports/measuring-broadband-america/measuring-fixed-broadband-twelfth-report#:~:text=Chart%2016.2%3A%20The%20ratio%20of%2070/70%20consistent%20download%20speed%20to%20advertised%20download%20speed.">telcos li</a>...er...sorry, <em>"market their products in accordance with local laws and advertising standards."</em> All of this makes it difficult to construct an estimate.</p>
<p>One option is to use a population-level assessment of medians from <a href="https://www.speedtest.net/global-index">something like the Speedtest.net data</a> and then construct a histogram from median speeds. This is both time-consuming and error-prone, as population-level data varies widely across the world. Emerging markets with high mobile internet use and dense populations <a href="https://www.fiercewireless.com/wireless/indias-top-2-mobile-carriers-fight-supremacy-fixed-broadband">can feature</a> poor fixed-line broadband penetration <a href="https://www.opensignal.com/2023/11/22/closing-the-gap-fixed-broadbands-role-in-global-progress">compared with Western markets</a>.</p>
<p>Another option is to mathematically hand-wave using the best evidence we can get. This might allow us to reconstruct probable P75 and P90 values if we know something about the historical distribution of connections. From there, we can gut-check using other spot data. To do this, we need to assume some data set is representative, a fraught decision all its own. Biting the bullet, we could start from the Speedtest.net global survey data, which currently fails to provide anything but medians (P50):</p>
<figure>
<a href="https://www.speedtest.net/global-index" alt="Speedtest.net's global median values are unhelpful on their own, both because they represent users who are testing for speed (and not organic throughput) and because they don't give us a fuller understanding of the distribution." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp" alt="Speedtest.net's global median values are unhelpful on their own, both because they represent users who are testing for speed (and not organic throughput) and because they don't give us a fuller understanding of the distribution." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Speedtest.net's global median values are unhelpful on their own, both because they represent users who are testing for speed (and not organic throughput) and because they don't give us a fuller understanding of the distribution.</figcaption>
</figure>

<p>After many attempted Stupid Math Tricks with poorly fitting curves (bandwidth seems to be a funky cousin of log-normal), I've decided to wing it and beg for help: instead of trying to be clever, I'm leaning on <a href="https://radar.cloudflare.com/quality/">Cloudflare Radar's P25/P50/P75 distributions</a> for <a href="https://en.wikipedia.org/wiki/List_of_countries_by_number_of_Internet_users">populous, openly-connected countries with &gt;= ~50M internet users</a>. It's cheeky, but a weighted average of the P75 of download speeds (3/4ths of all connections are faster) should get us in the ballpark. We can then use the usual 5:1 downlink:uplink ratio to come up with an uplink estimate. We can also derive a weighted average for the P75 <abbr>RTT</abbr> from Cloudflare's data. Because Cloudflare doesn't distinguish mobile from desktop connections, this may be an overly conservative estimate, but it's still more permissive than what we had been pegged to in years past:</p>

<table class="summary" id="natspeeds">
<caption>National P75 Downlink and <abbr>RTT</abbr></caption>
<thead>
<tr>
<td>Country</td>
<td>P75 Downlink (Mbps)</td>
<td>P75 <abbr>RTT</abbr> (ms)</td>
</tr>
</thead>
<tbody>
<tr>
<td>India</td>
<td>4</td>
<td>114</td>
</tr>
<tr>
<td>USA</td>
<td>11</td>
<td>58</td>
</tr>
<tr>
<td>Indonesia</td>
<td>5</td>
<td>81</td>
</tr>
<tr>
<td>Brazil</td>
<td>8</td>
<td>71</td>
</tr>
<tr>
<td>Nigeria</td>
<td>3</td>
<td>201</td>
</tr>
<tr>
<td>Pakistan</td>
<td>3</td>
<td>166</td>
</tr>
<tr>
<td>Bangladesh</td>
<td>5</td>
<td>114</td>
</tr>
<tr>
<td>Japan</td>
<td>17</td>
<td>42</td>
</tr>
<tr>
<td>Mexico</td>
<td>7</td>
<td>75</td>
</tr>
<tr>
<td>Egypt</td>
<td>4</td>
<td>100</td>
</tr>
<tr>
<td>Germany</td>
<td>16</td>
<td>36</td>
</tr>
<tr>
<td>Turkey</td>
<td>7</td>
<td>74</td>
</tr>
<tr>
<td>Philippines</td>
<td>7</td>
<td>72</td>
</tr>
<tr>
<td>Vietnam</td>
<td>7</td>
<td>72</td>
</tr>
<tr>
<td>United Kingdom</td>
<td>16</td>
<td>37</td>
</tr>
<tr>
<td>South Korea</td>
<td>24</td>
<td>26</td>
</tr>
<tr>
<td><em>Weighted Avg.</em></td>
<td>7.2</td>
<td>94</td>
</tr>
</tbody>
</table>
<p>We, therefore, update our P75 link estimate to <strong>7.2Mbps down, 1.4Mbps up, and 94ms <abbr>RTT</abbr>.</strong></p>
<p>This is a mild crime against statistics, not least of all because it averages unlike quantities and fails to sift mobile from desktop, but all the other methods available at time of writing are just as bad. Regardless, this new baseline is half again as much link capacity as last year, showing measurable improvement in networks worldwide.</p>
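<p>For anyone who wants to reproduce or adapt this estimate, the arithmetic is just a user-weighted average over the table above. A minimal sketch follows; the per-country user counts shown are illustrative placeholders, not the weights used for the table:</p>
<pre><code>// Weighted-average P75 downlink and RTT, as sketched above.
// The "users" weights here are illustrative; substitute real internet-user counts.
const countries = [
  { name: "India",  users: 900, downlinkMbps: 4,  rttMs: 114 },
  { name: "USA",    users: 300, downlinkMbps: 11, rttMs: 58 },
  { name: "Brazil", users: 180, downlinkMbps: 8,  rttMs: 71 },
  // ...remaining rows from the table above
];

function weightedAverage(rows, key) {
  let total = 0;
  let weightSum = 0;
  for (const row of rows) {
    total += row[key] * row.users;
    weightSum += row.users;
  }
  return total / weightSum;
}

const p75DownMbps = weightedAverage(countries, "downlinkMbps");
const p75UpMbps = p75DownMbps / 5; // assumed 5:1 downlink:uplink ratio
const p75RttMs = weightedAverage(countries, "rttMs");
</code></pre>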
<p>If you or your company are able to generate a credible worldwide latency estimate in the higher percentiles for next year's update, please <a href="https://infrequently.org/about-me/">get in touch</a>.</p>
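<p>To apply this baseline in a local lab, one option (a sketch, not an official tool or profile) is Chrome DevTools Protocol network emulation via Puppeteer, encoding the 7.2Mbps/1.4Mbps/94ms estimate directly:</p>
<pre><code>// Sketch: emulate the estimated P75 desktop network with Puppeteer + CDP.
// 7.2Mbps down / 1.4Mbps up / 94ms RTT, with throughput converted to bytes/sec.
const puppeteer = require("puppeteer");

async function main() {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const client = await page.target().createCDPSession();
  await client.send("Network.emulateNetworkConditions", {
    offline: false,
    latency: 94,                         // ms of round-trip latency
    downloadThroughput: (7.2 * 1e6) / 8, // ~900,000 bytes/sec
    uploadThroughput: (1.4 * 1e6) / 8,   // ~175,000 bytes/sec
  });
  await page.goto("https://example.com/", { waitUntil: "networkidle2" });
  await browser.close();
}

main();
</code></pre>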
<h4 id="market-factors-1">Market Factors <a class="permalink" href="#market-factors-1">#</a></h4>
<p>The forces that shape the PC population have been largely fixed for many years. Since 2010, <a href="https://en.wikipedia.org/wiki/Market_share_of_personal_computer_vendors#Worldwide_(1996%E2%80%932022)">volumes have been on a slow downward glide path</a>, shrinking from ~350MM per year a decade ago to ~260MM in 2018. The pandemic buying spree of 2021 pushed volumes above 300MM per year for the first time in eight years, with the vast majority of those devices being sold at low-end price points — think ~$300 Chromebooks rather than M1 MacBooks.</p>
<p>Lest we assume low-end means "short-lived", <a href="https://blog.google/outreach-initiatives/education/automatic-update-extension-chromebook/">recent announcements regarding software support for these devices</a> will considerably extend their impact. This low-end cohort will filter through the device population for years to come, pulling our performance budgets down, even as renewed process improvement is unlocking improved power efficiency and performance at the high end of the first-sale market. This won't be as pronounced as the diffusion of $100 smartphones has been in emerging markets, but the longer life-span of desktops is already a factor in our model.</p>
<h4 id="test-device-recommendations-1">Test Device Recommendations <a class="permalink" href="#test-device-recommendations-1">#</a></h4>
<p>Per our methodology from last year, which uses the 5-8 year replacement cycle for a PC, we update our target date to late 2017 or early 2018, but leave the average selling price fixed between $600-700. Eventually we'll need to take the past couple of years of gyrations in inflation and supply chains into account when making an estimate, but not this year.</p>
<p>So what did $650, give or take, buy in late 2017 or early 2018?</p>
<p>One option was a <a href="https://www.theverge.com/2017/5/30/15698476/dell-inspiron-gaming-desktop-announced">naff-looking tower from Dell, optimistically pitched at gamers</a>, with a <abbr>CPU</abbr> that scores <a href="https://browser.geekbench.com/v6/cpu/compare/3639070?baseline=253445">poorly versus a modern phone</a>, but which blessedly sports 8GB of RAM.</p>
<p>In laptops (the larger segment), ~$650 bought the <a href="https://www.pcmag.com/reviews/lenovo-yoga-720-12-inch">Lenovo Yoga 720 (12")</a>, with a 2-core (4-thread) <a href="https://ark.intel.com/content/www/us/en/ark/products/95442/intel-core-i3-7100u-processor-3m-cache-2-40-ghz.html">Core i3-7100U</a> and 4GB of RAM. Versions with more RAM and a faster chip were available, but cost considerably more than our budget. This was not a fast box. <a href="https://browser.geekbench.com/v6/cpu/compare/3639070?baseline=4311168">Here's a device with that CPU compared to a modern phone</a>; not pretty:</p>
<figure>
<a href="https://browser.geekbench.com/v6/cpu/compare/3639070?baseline=4311168" alt="The phones of wealthy developers absolutely smoke the baseline PC." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp" alt="The phones of wealthy developers absolutely smoke the baseline PC." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>The phones of wealthy developers absolutely smoke the baseline PC.</figcaption>
</figure>
<p>It's considerably faster than <a href="https://www.bhphotovideo.com/c/product/1704697-REG/hp_6k639ut_aba_probook_fortis_14_n4500.html">some devices still being sold to schools, though</a>.</p>
<p>What does this mean for our target devices? There's wild variation in performance per dollar below $600, which will only increase as inflation-affected cohorts grow to represent a larger fraction of the fleet. Intel's move (finally!) off of 14nm also means that gains are starting to arrive at the low end, but in an uneven way. General advice is therefore hard to issue. That said, we can triangulate based on what we know about the market:</p>

<p>My recommendation, then, to someone setting up a new lab today is not to spend more than $350 on a new test device. Consider laptops with chips like the <a href="https://www.intel.com/content/www/us/en/products/sku/197309/intel-celeron-processor-n4120-4m-cache-up-to-2-60-ghz/specifications.html">N4120</a>, <a href="https://www.intel.com/content/www/us/en/products/sku/212326/intel-celeron-processor-n4500-4m-cache-up-to-2-80-ghz/specifications.html">N4500</a>, or the <a href="https://www.intel.com/content/www/us/en/products/sku/212328/intel-celeron-processor-n5105-4m-cache-up-to-2-90-ghz/specifications.html">N5105</a>. Test devices should also have no more than 8GB of RAM, and preferably 4GB. The <a href="https://a.co/d/98wiGAl">2021 HP 14</a> is a fine proxy. The <a href="https://www.bhphotovideo.com/c/product/1761467-REG/hp_7f424ua_aba_14_14_ep0010nr_laptop_intel.html">updated ~$375 version</a> will do in a pinch, but try to spend less if you can. <a href="https://www.notebookcheck.net/Mobile-Processors-Benchmark-List.2436.0.html?type=&amp;sort=&amp;search=Intel&amp;itemselect_13189=13189&amp;itemselect_13111=13111&amp;itemselect_11533=11533&amp;itemselect_13079=13079&amp;or=0&amp;itemselect_13189=13189&amp;itemselect_13111=13111&amp;itemselect_11533=11533&amp;itemselect_13079=13079&amp;showCount=1&amp;showBars=1&amp;geekbench5_1_single=1&amp;geekbench5_1_multi=1&amp;geekbench6_2_single=1&amp;geekbench6_2_multi=1&amp;octane2=1&amp;speedometer=1&amp;cpu_fullname=1&amp;codename=1&amp;l2cache=1&amp;l3cache=1&amp;tdp=1&amp;mhz=1&amp;turbo_mhz=1&amp;cores=1&amp;threads=1">Test devices should preferably score no higher than 1,000 in single-core Geekbench 6 tests</a>; a line <a href="https://browser.geekbench.com/v6/cpu/4352217">the HP 14's N4120 easily ducks, clocking in at just over 350</a>.</p>
<h2 id="takeaways">Takeaways <a class="permalink" href="#takeaways">#</a></h2>
<p>There's a lot of good news embedded in this year's update. Devices and networks have finally started to get a bit faster (as predicted), pulling budgets upwards.</p>
<p>At the same time, the community remains in solid denial about the disastrous consequences of an over-reliance on JavaScript. This paints a picture of path dependence — front-end isn't moving on from approaches that hurt users, <a href="https://x.com/FredKSchott/status/1744842592905552227?s=20">even as the costs shift back onto teams that have been degrading life for users at the margins</a>.</p>
<p>We can anticipate continued improvement in devices over the next few years, and network pace may level out somewhat as the uneven deployment of 5G lurches forward. Regardless, the gap between the digital haves and have-nots continues to grow. Those least able to afford the fast devices are actively taxed by developers high on their own developer experience (<abbr title="Developer Experience">DX</abbr>).</p>
<p>It's not a mystery why folks who spend every waking hour inside a digital privilege bubble are not building with empathy or humility when nobody calls them to account. What's mysterious is that anybody pays them to do it. The Product Management (<abbr>PM</abbr>) and Engineering Management (<abbr>EM</abbr>) disciplines have utterly failed organisations building on the web, failing to put pro-user and pro-business constraints on the enthusiasms of developers.</p>
<p>Instead of cabining the enthusiasms of the FP crowd, managers meekly repeated bullshit about how <em>"you can't hire for fundamentals"</em> as they waved in busloads of bootcampers whose React-heavy <abbr>CV</abbr> paint jobs had barely dried. They could have run bake-offs. They could have paid for skills that would serve the business over time. They could have facilitated learning anything the business valued. Instead, they abdicated. The kicker is that they didn't even reliably make things better for the class they imagined they were serving.</p>
<p>This post was partially drafted on airplane wifi, and I can assure you that wealthy folks also experience <abbr>RTT</abbr>s north of 500ms and <a href="https://en.wikipedia.org/wiki/Gogo_Inflight_Internet#Technologies">channel capacity in the single-digit Mbps range</a>.</p>
<p>Even the wealthiest users step out of the privilege bubble sometimes. Are these EMs and PMs <em>really</em> happy to lose that business?</p>
<figure>
<a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Wealthy users are going to experience networks with properties that are even worse than the 'bad' networks offered to the Next Billion Users. At an altitude of 40k feet and a ground speed for 580 MPH somewhere over Alberta, CA, your correspondent's bandwidth is scarce, lopsided, and laggy." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Wealthy users are going to experience networks with properties that are even worse than the 'bad' networks offered to the Next Billion Users. At an altitude of 40k feet and a ground speed for 580 MPH somewhere over Alberta, CA, your correspondent's bandwidth is scarce, lopsided, and laggy." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption><em>Tap for a larger version.</em><br>Wealthy users are going to experience networks with properties that are even worse than the 'bad' networks offered to the Next Billion Users. At an altitude of 40k feet and a ground speed of 580 MPH somewhere over Alberta, CA, your correspondent's bandwidth is scarce, lopsided, and laggy.</figcaption>
</figure>
<p>Of course, any trend that can't continue won't, and <abbr>INP</abbr>'s impact is already being felt. The great JavaScript merry-go-round may grind to a stop, but the momentum of consistently bad choices is formidable. Like passengers on a cruise ship ramming a boardwalk at flank speed, JavaScript regret is dawning far too late and interacting very poorly with something we ate. As the good ship Scripting shudders and lists on the remains of the Ferris Wheel, it's not exactly clear how to get off, but the choices that led us here are at least visible, if only by their belated consequences.</p>
<h3 id="the-great-branch-mispredict">The Great Branch Mispredict <a class="permalink" href="#the-great-branch-mispredict">#</a></h3>
<p>We got to a place where performance has been a constant problem in large part because a tribe of programmers convinced themselves that it <em>wasn't</em> and <em>wouldn't be</em>. The circa '13 narrative asserted that:</p>
<ul>
<li>CPUs would keep getting faster (just like they always had).</li>
<li>Networks would get better, or at least not get worse.</li>
<li>Organisations had all learned the lessons of Google and FaceBook's adventures in Ajax.</li>
</ul>
<p>It was all bullshit, <em>and many of us spotted it a mile away</em>.</p>
<p>But tribalism-boosted confirmation bias mixed with JavaScript's toxic positivity culture to precipitate out a Silicon Prosperity Gospel; all resources would go infinite if you just <em>believed</em>. No matter how wrong the premise, we kept executing down the obviously-falsified branch until the buffers drained.</p>
<p>The solutions are social, not technical, because the delusions are social, rather than technical.</p>
<p>The stories that propped up <abbr>IE8</abbr>-focused frameworks like Angular and React in the mobile era have only served as comforting myths to ward off emerging device and network reality. For the past decade, the important question hasn't been if enough good technology existed, but rather how long the delusions would keep hold.</p>
<p>The community wanted to live in a different world than the one we inhabit, so we collectively mis-predicted. A healthy web community will value learning faster.</p>
<p>How deep was the branch? And how many cycles will the fault cost us? If CPUs and networks continue to improve at the rate of the past two years, and <abbr>INP</abbr> finally forces a reckoning, the answer might be as little as a decade. I fear we will not be so lucky; an entire generation has been trained to ignore reality, to prize tribalism rather than engineering rigor, and to devalue fundamentals. Those folks may not find the next couple of years to their liking.</p>
<p>Front-end's hangover from the JavaScript party is gonna <em>suck</em>.</p>
</article>


<hr>

<footer>
<p>
<a href="/david/" title="Aller à l’accueil"><svg class="icon icon-home">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-home"></use>
</svg> Accueil</a> •
<a href="/david/log/" title="Accès au flux RSS"><svg class="icon icon-rss2">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-rss2"></use>
</svg> Suivre</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant><svg class="icon icon-user-tie">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-user-tie"></use>
</svg> Pro</a> •
<a href="mailto:david%40larlet.fr" title="Envoyer un courriel"><svg class="icon icon-mail">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-mail"></use>
</svg> Email</a> •
<abbr class="nowrap" title="Hébergeur : Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340"><svg class="icon icon-hammer2">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-hammer2"></use>
</svg> Légal</abbr>
</p>
<template id="theme-selector">
<form>
<fieldset>
<legend><svg class="icon icon-brightness-contrast">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-brightness-contrast"></use>
</svg> Thème</legend>
<label>
<input type="radio" value="auto" name="chosen-color-scheme" checked> Auto
</label>
<label>
<input type="radio" value="dark" name="chosen-color-scheme"> Foncé
</label>
<label>
<input type="radio" value="light" name="chosen-color-scheme"> Clair
</label>
</fieldset>
</form>
</template>
</footer>
<script src="/static/david/js/instantpage-5.1.0.min.js" type="module"></script>
<script>
function loadThemeForm(templateName) {
const themeSelectorTemplate = document.querySelector(templateName)
const form = themeSelectorTemplate.content.firstElementChild
themeSelectorTemplate.replaceWith(form)

form.addEventListener('change', (e) => {
const chosenColorScheme = e.target.value
localStorage.setItem('theme', chosenColorScheme)
toggleTheme(chosenColorScheme)
})

const selectedTheme = localStorage.getItem('theme')
if (selectedTheme && selectedTheme !== 'undefined') {
form.querySelector(`[value="${selectedTheme}"]`).checked = true
}
}

const prefersColorSchemeDark = '(prefers-color-scheme: dark)'
window.addEventListener('load', () => {
let hasDarkRules = false
for (const styleSheet of Array.from(document.styleSheets)) {
let mediaRules = []
for (const cssRule of styleSheet.cssRules) {
if (cssRule.type !== CSSRule.MEDIA_RULE) {
continue
}
// WARNING: Safari does not support `conditionText`.
if (cssRule.conditionText) {
if (cssRule.conditionText !== prefersColorSchemeDark) {
continue
}
} else {
if (cssRule.cssText.startsWith(prefersColorSchemeDark)) {
continue
}
}
mediaRules = mediaRules.concat(Array.from(cssRule.cssRules))
}

// WARNING: do not try to insert a Rule to a styleSheet you are
// currently iterating on, otherwise the browser will be stuck
// in an infinite loop…
for (const mediaRule of mediaRules) {
styleSheet.insertRule(mediaRule.cssText)
hasDarkRules = true
}
}
if (hasDarkRules) {
loadThemeForm('#theme-selector')
}
})
</script>
</body>
</html>

+ 589
- 0
cache/2024/0676c7ccf1ab2b380641866789366d26/index.md View File

@@ -0,0 +1,589 @@
title: The Performance Inequality Gap, 2024
url: https://infrequently.org/2024/01/performance-inequality-gap-2024/
hash_url: 0676c7ccf1ab2b380641866789366d26
archive_date: 2024-01-31
<p>It's time once again to update our priors regarding the global device and network situation. What's changed since last year? And how much more HTML, CSS, and (particularly) JavaScript can a new project afford?</p>
<p></p>
<h2 id="the-budget%2C-2024">The Budget, 2024 <a class="permalink" href="#the-budget%2C-2024">#</a></h2>
<p>In a departure from previous years, we'll evaluate two sets of baseline numbers for first-load under five seconds on 75<sup>th</sup>-percentile (<abbr>P75</abbr>) devices and networks. First, we'll look at limits for JavaScript-heavy content, and separately we'll enunciate recommendations for markup-centric stacks.</p>
<p>This split decision was available via <a href="https://infrequently.org/2022/12/performance-baseline-2023/">last year's update</a>, but was somewhat buried. Going forward, I'll produce both as top-line guidance. The usual caveats also apply:</p>
<ul>
<li>Performance is a deep and nuanced domain, and much can go wrong beyond content size and composition.</li>
<li>How sites manage resources after-load can have a big impact on perceived performance.</li>
<li>Your audience may justify more stringent, or more relaxed, limits.</li>
</ul>
<p>With that stipulated, global baselines matter because many teams have low <a href="https://infrequently.org/2022/05/performance-management-maturity/">performance management maturity</a>, and today's popular frameworks – including some that market performance as a feature – <a href="https://infrequently.org/2023/02/the-market-for-lemons/">fail to ward against catastrophic results</a>.</p>
<p><em>Until and unless teams have better data about their performance, the global baseline budget should be enforced.</em></p>
<p>This isn't charity; it's how teams ensure products stay functional, accessible, and reliable in a market <a href="https://infrequently.org/2023/02/the-market-for-lemons/">awash in bullshit</a>. Limits help teams steer away from complexity and towards tools that generate simpler output that's easier to manage and repair.</p>
<h3 id="javascript-heavy">JavaScript-Heavy <a class="permalink" href="#javascript-heavy">#</a></h3>
<p>Since at least 2015, building JavaScript-first websites has been a predictably terrible idea, yet most of the sites I trace on a daily basis remain mired in script. For these sites, we have to factor in the heavy cost of running JavaScript on the client when describing how much content we can afford. HTML, CSS, images, and fonts can all be parsed and run at near wire speeds on low-end hardware, but JavaScript is at least three times more expensive, byte-for-byte.</p>
<p>Most sites, even those that aspire to be "lived in", feature short median sessions, which means we can't actually justify much in the way of up-front code, and first impressions always matter.</p>
<figure>
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=3600 2400w,
/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=2400 1600w,
/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=1800 1200w,
/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=1200 800w,
/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=900 600w,
/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=750 500w,
/2023/02/the-market-for-lemons/depth-and-frequency-small.png?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2023/02/the-market-for-lemons/depth-and-frequency-small.png" alt="Most sorts of sites have shallow sessions, making up-front script costs hard to justify." class="preview" decoding="async">
</source></picture>

<figcaption>Most sorts of sites have shallow sessions, making up-front script costs hard to justify.</figcaption>
</figure>
<p>Over the estimated P75 global network, and targeting the slower of our two representative devices — and to hit five seconds to interactivity with only two critical-path network connections — we can afford ~1.3MiB of compressed content, comprised of:</p>
<ul>
<li>650KiB of HTML, CSS, images, and fonts</li>
<li>650KiB of JavaScript</li>
</ul>
<p>If we set the target to a much more reasonable three seconds, our total payload must fit in only ~730KiB, with no more than 365KiB of compressed JavaScript.</p>
<p>Similarly, if we keep the five second target but open five <abbr>TLS</abbr> connections, our budget would be closer to 1MiB. If the target were reset to three seconds with five connections, our total payload falls to ~460KiB, leaving only ~230KiB for scripts.</p>
<h3 id="markup-heavy">Markup-Heavy <a class="permalink" href="#markup-heavy">#</a></h3>
<p>Sites comprised mostly of markup (HTML and CSS) can afford a <em>lot</em> more, although CSS complexity and poorly-loaded fonts can still slow down otherwise quick content. Conservatively, to load in five seconds over, at most, two connections, we should try to keep content under 2.5MiB, including:</p>
<ul>
<li>2.4MiB of HTML, CSS, images, and fonts, and</li>
<li>100KiB of JavaScript.</li>
</ul>
<p>To hit a more reasonable three second first-load target with two connections, we should aim for a max 1.4MiB transfer, made up of:</p>
<ul>
<li>1.325MiB of HTML, CSS, etc., and</li>
<li>75KiB of JavaScript.</li>
</ul>
<p>These are generous targets. The blog you're reading <a href="https://www.webpagetest.org/video/compare.php?tests=240130_AiDcW6_5QC-r%3A1-c%3A0&amp;thumbSize=200&amp;ival=100&amp;end=full">loads over a single connection in ~1.2 seconds on the target device and network profile, consuming 120KiB of critical path resources to become interactive, only 8KiB of which is script</a>.</p>
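<p>To make the arithmetic above concrete, here is a deliberately rough sketch of the kind of back-of-the-envelope model these budgets come from. It is a simplification, not the interactive estimator's code: the helper name is made up, it assumes roughly four round-trips of setup per critical-path connection, treats a byte of JavaScript as about three times as expensive as a byte of markup, and ignores TCP slow-start and device processing time, so it will run optimistic relative to the figures above:</p>
<pre><code>// Rough first-load budget model; a simplification, not the official estimator.
// Assumes ~4 RTTs of setup per critical-path connection (DNS + TCP + TLS + request)
// and that a byte of JS costs ~3x a byte of HTML/CSS/images/fonts to process.
function budgetKiB(opts) {
  const { targetSec, downlinkMbps, rttMs, connections, jsShare } = opts;
  const setupSec = (connections * 4 * rttMs) / 1000;
  const transferSec = Math.max(targetSec - setupSec, 0);
  const wireKiB = (transferSec * downlinkMbps * 1e6) / 8 / 1024;
  // Deflate the wire budget so that JS bytes "count" three times.
  const effectiveKiB = wireKiB / (1 + 2 * jsShare);
  return {
    totalKiB: Math.round(effectiveKiB),
    jsKiB: Math.round(effectiveKiB * jsShare),
    otherKiB: Math.round(effectiveKiB * (1 - jsShare)),
  };
}

// e.g. a 5s target over a 7.2Mbps / 94ms link, 2 connections, half the bytes as JS:
budgetKiB({ targetSec: 5, downlinkMbps: 7.2, rttMs: 94, connections: 2, jsShare: 0.5 });
</code></pre>
<p>Dialing <code>jsShare</code> down toward zero approximates the markup-heavy case; the interactive chart in the next section does this properly.</p>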
<h3 id="calculate-your-own">Calculate Your Own <a class="permalink" href="#calculate-your-own">#</a></h3>
<p>As in years past, you can use <a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/chart/index.html">the interactive estimate chart</a> to understand how connections and devices impact budgets. This year the chart has been updated to also allow you to select from JavaScript-heavy and JavaScript-light content composition, as well as updated network and device baselines (see below).</p>
<figure>
<a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/chart/index.html" alt="&lt;em&gt;Tap to try the interactive version.&lt;/em&gt;" target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/chart.png" alt="&lt;em&gt;Tap to try the interactive version.&lt;/em&gt;" decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption><em>Tap to try the interactive version.</em></figcaption>
</figure>
<p>It's straightforward to understand the number of critical path network connections for a site from DevTools and to eyeball the content composition. Armed with that information, it's possible to use this estimator to quickly understand what sort of first-load experience users at the margins can expect. Give it a try!</p>
<h2 id="situation-report">Situation Report <a class="permalink" href="#situation-report">#</a></h2>
<p>These recommendations are not context-free, and you may disagree with them in whole or in part. To the extent that other estimates are more grounded, or based on different situational data, they may be more appropriate for specific products and teams. Many critiques are possible, of the target (five seconds for first load), of the sample population (worldwide internet users), and of the methodology (informed reckons). Regardless, I present the thinking behind them because it can provide teams with informed points of departure, and also because it helps clarify the ritual freakout taking place as <a href="https://web.dev/articles/inp"><abbr>INP</abbr> begins to put a price on JavaScript externalities</a>.</p>
<p>It's clear that developers <a href="https://rviscomi.dev/2023/11/a-faster-web-in-2024/">are out of touch with market ground-truth</a>, but it's not obvious why. Understanding the differences in the experiences of wealthy developers versus working-class users helps to make the diffuse surface of the privilege bubble perceptible.</p>
<p>Engineering is the discipline of designing solutions under specific constraints. For the front end to improve, it must finally learn to operate within the envelope of what's possible on <em>most</em> devices.</p>
<h3 id="mobile">Mobile <a class="permalink" href="#mobile">#</a></h3>
<p>The "i" in iPhone stands for "inequality".</p>
<p>Owing to the chasm of global wealth inequality, premium devices are largely absent in markets with billions of users. India's iOS share has <a href="https://economictimes.indiatimes.com/tech/technology/apple-set-to-end-2023-with-7-market-share-for-iphones-in-android-dominated-india/articleshow/103532336.cms?from=mdr">surged to an all-time high of 7%</a> on the back of last-generation and refurbished devices. That's a market of 1.43 billion people where Apple <a href="https://www.counterpointresearch.com/insights/india-smartphone-share/">doesn't even crack the top five in terms of shipments</a>.</p>
<p>The Latin American (<abbr>LATAM</abbr>) region, home to more than 600 million people and <a href="https://www.statista.com/topics/7195/smartphone-market-in-latin-america/#topicOverview">nearly 200 million smartphones</a>, shows a <a href="https://www.canalys.com/newsroom/latam-smartphone-market-q3-2023">similar market composition</a>:</p>
<figure>
<a href="https://www.counterpointresearch.com/research_portal/counterpoint-quarterly-smartphone-q4-2023/" alt="In &lt;abbr&gt;LATAM&lt;/abbr&gt;, iPhones make up less than 6% of total device shipments." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/latam_share_yoy_counterpoint.webp" alt="In &lt;abbr&gt;LATAM&lt;/abbr&gt;, iPhones make up less than 6% of total device shipments." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>In <abbr>LATAM</abbr>, iPhones make up less than 6% of total device shipments.</figcaption>
</figure>
<p>Everywhere wealth is unequally distributed, the haves <a href="https://www.statista.com/statistics/512863/smartphones-cell-phones-tablets-and-ereaders-brands-owned-by-affluent-americans/">read about it in Apple News over 5G while the have-nots struggle to get reliable 4G coverage for their Androids.</a> In <a href="https://assets.publishing.service.gov.uk/media/62a1cb0b8fa8f50395c0a0e7/Consumer_purchasing_behaviour_in_the_UK_smartphone_market_-_CMA_research_report_new.pdf">country after country (PDF)</a> the embedded inequality of our societies sorts ownership of devices by price, and brand through price segmentation.</p>
<p>This matters because the properties of those devices dominate the experiences we can deliver. In the U.S., the term "smartphone dependence" has been coined to describe folks without other ways to access the increasing fraction of essential services only available through the internet. Unsurprisingly, folks who can't afford other internet-connected devices or a fixed broadband subscription are also most likely to buy less expensive (and therefore slower) smartphones:</p>
<figure>
<a href="https://www.pewresearch.org/internet/fact-sheet/mobile/?tabId=tab-011fca0d-9756-4f48-b352-d58f343696bf" alt="undefined" target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/us_smartphone_dependence_pew.webp" alt="Missing alt text" decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption></figcaption>
</figure>
<p>As smartphone ownership and use grow, the front ends we deliver are ever-more mediated by the properties of those devices. The inequality between the high-end and low-end, even in wealthy countries, is only growing. What we choose to do in response defines what it means to practice <abbr>UX</abbr> engineering ethically.</p>
<h4 id="device-performance">Device Performance <a class="permalink" href="#device-performance">#</a></h4>
<p>Extending the <abbr title="system-on-chip">SoC</abbr> performance by price point series with another year's data, the picture remains ugly. The segments are roughly "fastest iPhone", "fastest Android", "budget", and "low-end":</p>
<figure>
<a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Geekbench 5 single-core scores for each mobile price point." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/single_core_scores.png" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Geekbench 5 single-core scores for each mobile price point." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption><em>Tap for a larger version.</em><br>Geekbench 5 single-core scores for each mobile price point.</figcaption>
</figure>
<p>Not only have fruity phones extended their single-core <abbr>CPU</abbr> performance lead over contemporary high-end Androids to <em>a four year advantage</em>, the performance-per-dollar curve remains unfavourable to Android buyers.</p>
<p>At the time of publication, the cheapest iPhone 15 Pro (the only device with the A17 Pro chip) is $999 <abbr>MSRP</abbr>, while the S23 (using the Snapdragon 8 gen 2) can be had for $860 from Samsung. This nets out to 2.32 points per dollar for the iPhone, but only 1.6 points per dollar for the S23.</p>
<p>Meanwhile a Samsung A24 that is $175 new, unlocked, and available on Amazon today scores a more reasonable 3.1 points per dollar on single-core performance, but is more than 4.25× slower than the leading contemporary iPhone.</p>
<p>The delta between the fastest iPhones and moderately priced new devices rose from 1,522 points last year to 1,774 today.</p>
<p>Put another way, the performance gap between the devices the wealthy carry and those that budget shoppers carry grew more this year (252 points) than the year-over-year gains from process and architecture at the volume price point (174 points). This is particularly depressing because single-core performance tends to determine the responsiveness of web app workloads.</p>
<p>A less pronounced version of the same story continues to play out in multi-core performance:</p>
<figure>
<a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Round and round we go: Android ecosystem &lt;abbr&gt;SoC&lt;/abbr&gt;s are improving, but the Performance Inequality Gap continues to grow. Even the fastest Androids are two-plus years behind iOS-ecosystem devices." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/multi_core_scores.png" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Round and round we go: Android ecosystem &lt;abbr&gt;SoC&lt;/abbr&gt;s are improving, but the Performance Inequality Gap continues to grow. Even the fastest Androids are two-plus years behind iOS-ecosystem devices." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption><em>Tap for a larger version.</em><br>Round and round we go: Android ecosystem <abbr>SoC</abbr>s are improving, but the Performance Inequality Gap continues to grow. Even the fastest Androids are two-plus years behind iOS-ecosystem devices.</figcaption>
</figure>
<p>Recent advances in high-end Android multi-core performance have closed the previous three-year gap to 18 months. Meanwhile, budget segment devices have finally started to see improvement (<a href="https://infrequently.org/2021/03/the-performance-inequality-gap/#:~:text=The%20good%20news%20is%20that%20this%20will%20change%20rapidly%20in%20the%20next%20few%20years.">as this series predicted</a>), thanks to hand-me-down architecture and process node improvements. That's where the good news ends.</p>
<p>The multi-core performance gap between i-devices and budget Androids grew considerably, with the score delta rising from 4,318 points last year to 4,936 points in 2023.</p>
<p>Looking forward, we can expect high-end Androids to at least stop falling further behind owing to <a href="https://www.androidauthority.com/snapdragon-8-gen-3-dimensity-9300-benchmarked-3395385/">a new focus on performance by Qualcomm's Snapdragon 8 gen 3 and MediaTek's Dimensity 9300 offerings</a>. This change is long, long overdue and will take years to filter down into positive outcomes for the rest of the ecosystem. Until that happens, the gap in experience for the wealthy versus the rest will not close.</p>
<p>iPhone owners live in a different world than high-end Android buyers, and light-years away from what the bulk of the market experiences. No matter how you slice it, the performance inequality gap is growing for <abbr>CPU</abbr>-bound workloads like JavaScript-heavy web apps.</p>
<h4 id="networks">Networks <a class="permalink" href="#networks">#</a></h4>
<p>As ever, 2023 re-confirmed an essential truth when it comes to user experience: <a href="https://www.opensignal.com/2023/05/11/poor-connectivity-damages-the-mobile-app-business">when things are slow, users engage less often.</a> Doing a good job in an uneven network environment requires thinking about availability and engineering for resilience and a lightweight footprint — it's always better to avoid testing the radio gods than to spend weeks or months appeasing them after the damage is done.</p>
<p>5G network deployment continues apace, but as with the arrival of 4G, it is happening unevenly and in ways and places that exacerbate (rather than lessen) performance inequality.</p>
<p>Data on mobile network evolution is sketchy, and the largest error bars in this series' analysis continue to reside in this section. Regardless, we can look to industry summaries like the <a href="https://www.gsma.com/mobileeconomy/wp-content/uploads/2023/03/270223-The-Mobile-Economy-2023.pdf"><abbr>GSMA</abbr>'s report on "The Mobile Economy 2023" (PDF)</a> for a directional understanding that we can triangulate with other data points to develop a strong intuition.</p>
<p>For instance, <abbr>GSMA</abbr> predicts that 5G will only comprise half of connections by 2030. Meanwhile, McKinsey <a href="https://www.techtarget.com/whatis/feature/5-Predictions-about-5G-Adoption-in-2021-and-Beyond#:~:text=Regardless%20of%20the,the%205G%20revolution.%22">predicts</a> that high-quality 5G (networks that use 6GHz bands) will only cover a quarter of the world's population by 2030. Regulatory roadblocks are <a href="https://economictimes.indiatimes.com/industry/telecom/telecom-news/itu-reaches-agreement-to-open-new-6-ghz-spectrum-band-for-5g-6g/articleshow/106000126.cms">still being cleared</a>.</p>
<p>As we said in 2021, <em>"<a href="https://infrequently.org/2021/03/the-performance-inequality-gap/#oh-em-gee">4G is a miracle, 5G is a mirage</a>."</em></p>
<p>This doesn't mean that 4G is one thing, or that it's deployed evenly, or even that the <a href="https://www.opensignal.com/2023/06/29/more-usable-spectrum-boosts-the-4g-and-5g-experience">available spectrum will remain stable</a> within a single generation of radio technology. For example, India's network environment has continued to evolve since the <a href="https://www.kaiostech.com/reliance-jio-became-worlds-fastest-growing-mobile-network/">Reliance Jio revolution</a> that drove 4G into the mainstream and pushed the price of a mobile megabyte down by ~90% on <em>every</em> subcontinental carrier. But that's not the whole story! <a href="https://www.speedtest.net/global-index/india#mobile">Speedtest.net's data for India shows dramatic gains, for example</a>, and <a href="https://www.financialexpress.com/business/industry/mobile-download-speeds-india-moves-up-72-spots-in-global-ranking/3260813/">analysts credit this to improved infrastructure density, expanded spectrum, and back-haul improvements related to the 5G rollout</a> — all of which is to say that 4G users are getting better experiences than they did last year <em>because of</em> 5G's role in reducing contention.</p>
<figure>
<a href="https://www.speedtest.net/global-index/india#mobile" alt="India's speed test medians are moving quickly, but variance is orders-of-magnitude wide, with 5G penetration below 25% in the most populous areas." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/india_mobile_speedtest_data.webp" alt="India's speed test medians are moving quickly, but variance is orders-of-magnitude wide, with 5G penetration below 25% in the most populous areas." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>India's speed test medians are moving quickly, but variance is orders-of-magnitude wide, with 5G penetration below 25% in the most populous areas.</figcaption>
</figure>
<p>These sorts of gains are easy to miss if we look only at headline "4G vs. 5G" coverage, so it's important to level-set as new data becomes available. Improvements can arrive unevenly, with the "big" story happening slowly, long after the initial buzz of headlines wears off. These effects reward us for looking at P75+, not just means or medians, and intentionally updating priors on a regular basis.</p>
<p>Events can turn our intuitions on their heads, too. Japan is famously well connected. I've personally experienced rock-solid 4G through entire Tokyo subway journeys, <a href="https://en.wikipedia.org/wiki/Roppongi_Station">more than 40m underground</a> and with no hiccups. And yet, the network environment has been largely unchanged by the introduction of 5G. Because carriers provisioned more than adequately in the 4G era, the new technology isn't delivering the same pent-up-demand impact. But despite consistent performance, the quality of service for all users is distributed in a <em>much</em> more egalitarian way:</p>
<figure>
<a href="https://www.speedtest.net/global-index/japan#mobile" alt="Japan's network environment isn't the fastest, but is much more evenly distributed." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/japan_mobile_speedtest_data.webp" alt="Japan's network environment isn't the fastest, but is much more evenly distributed." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Japan's network environment isn't the fastest, but is much more evenly distributed.</figcaption>
</figure>
<p>Fleet device composition has big effects, owing to differences in signal-processing compute availability and spectrum compatibility. At a population level, these influences play out slowly as devices age out, but still have impressively positive impacts:</p>
<figure>
<a href="https://www.opensignal.com/2023/09/25/users-should-upgrade-their-iphone-to-have-the-best-mobile-network-experience" alt="Device impact on network performance is visible in Opensignal's iPhone dataset." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/opensignal_iphone_relative_network_speed.webp" alt="Device impact on network performance is visible in Opensignal's iPhone dataset." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Device impact on network performance is visible in Opensignal's iPhone dataset.</figcaption>
</figure>
<p>As inequality grows, <a href="https://www.financialexpress.com/business/industry/mobile-download-speeds-india-moves-up-72-spots-in-global-ranking/3260813/#:~:text=In%20fact%2C%20in,of%20385.50%20Mbps.">averages and "generation" tags can become illusory and misleading</a>. Our own experiences are no guide; we've got to keep our hands in the data to understand the texture of the world.</p>
<p>So, with all of that as prelude, what <em>can</em> we say about where the mobile network baseline should be set? In a departure from years prior, I'm going to use a unified network estimate (see below). You'll have to read on for what it is! But it won't be based on the sort of numbers that folks explicitly running speed tests see; those aren't real life.</p>

<h4 id="market-factors">Market Factors <a class="permalink" href="#market-factors">#</a></h4>
<p>The market forces this series <a href="https://infrequently.org/2017/10/can-you-afford-it-real-world-web-performance-budgets/#global-ground-truth">previewed in 2017</a> have played out in roughly a straight line: smartphone penetration in emerging markets is approaching saturation, ensuring a growing fraction of purchases are made by upgrade shoppers. Those who upgrade see more value in their phones and save to buy better second and third devices. Combined with the <a href="https://en.wikipedia.org/wiki/IPhone_X#:~:text=At%20the%20time%20of%20its,local%20sales%20and%20import%20taxes.">emergence</a> and <a href="https://www.counterpointresearch.com/insights/premium-smartphone-asp-reaches-record-q2-high/">growth of the "ultra premium" segment</a>, average selling prices (<abbr>ASP</abbr>s) have risen.</p>
<p>2022 and 2023 have established an inflection point in this regard, with worldwide average selling prices <a href="https://www.idc.com/getdoc.jsp?containerId=prUS51430223">jumping to more than $430</a>, up from $300-$350 for much of the decade prior. Some price appreciation has been <a href="https://www.technavio.com/report/smartphone-market-industry-analysis">due to transient impacts of the U.S./China trade wars</a>, but most of it appears driven by iOS <abbr>ASP</abbr>s, which peaked above $1,000 for the first time in 2023. Android <abbr>ASP</abbr>s, meanwhile, continued a gradual rise to nearly $300, up from $250 five years ago.</p>
<figure>
<a href="https://www.idc.com/getdoc.jsp?containerId=prUS51430223" alt="undefined" target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/idc_forecast.png" alt="Missing alt text" decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>IDC expects worldwide smartphone average selling prices to fall back below $400 by 2027 as Android volumes recover.</figcaption>
</figure>
<p>A <a href="https://www.counterpointresearch.com/insights/global-smartphone-market-reaches-its-lowest-q3-levels-in-a-decade-apples-share-at-16/">weak market for handsets in 2023</a>, plus stable sales for iOS, had an notable impact on prices. <abbr>IDC</abbr> expects global average prices to fall back below $400 by 2027 as Android volumes increase from an unusually soft 2023.</p>
<figure>
<a href="https://www.counterpointresearch.com/research_portal/counterpoint-quarterly-smartphone-q4-2023/" alt="Counterpoint data shows declining sales in both 2022 and 2023." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/smartphone_shipments_2023.webp" alt="Counterpoint data shows declining sales in both 2022 and 2023." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Counterpoint data shows declining sales in both 2022 and 2023.</figcaption>
</figure>
<figure>
<a href="https://www.counterpointresearch.com/research_portal/counterpoint-quarterly-smartphone-q4-2023/" alt="Shipment growth in late 2023 and beyond is coming from emerging markets like the Middle East and Africa. Samsung's A-series mid-tier is doing particularly well." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/return_to_growth_driven_by_em.webp" alt="Shipment growth in late 2023 and beyond is coming from emerging markets like the Middle East and Africa. Samsung's A-series mid-tier is doing particularly well." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Shipment growth in late 2023 and beyond is coming from emerging markets like the Middle East and Africa. Samsung's A-series mid-tier is doing particularly well.</figcaption>
</figure>
<p>Despite falling volumes, the distribution of Android versus iOS sales remains largely unchanged:</p>
<figure>
<a href="https://www.counterpointresearch.com/insights/global-smartphone-os-market-share/" alt="Android sales reliably constitute 80-85% of worldwide volume." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/counterpoint_smartphone_sales_by_OS_Q3-2023.webp" alt="Android sales reliably constitute 80-85% of worldwide volume." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Android sales reliably constitute 80-85% of worldwide volume.</figcaption>
</figure>
<figure>
<a href="https://www.statista.com/statistics/245191/market-share-of-mobile-operating-systems-for-smartphone-sales-in-australia/" alt="Even in rich nations like Australia and the &lt;a href='https://www.statista.com/statistics/262179/market-share-held-by-mobile-operating-systems-in-the-united-kingdom/'&gt;the U.K.&lt;/a&gt;, iPhones account for less than half of sales. Predictably, they are over-represented in analytics and logs owing to wealth-related factors including superior network access and performance hysteresis." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/au_smartphone_share_by_os.webp" alt="Even in rich nations like Australia and the &lt;a href='https://www.statista.com/statistics/262179/market-share-held-by-mobile-operating-systems-in-the-united-kingdom/'&gt;the U.K.&lt;/a&gt;, iPhones account for less than half of sales. Predictably, they are over-represented in analytics and logs owing to wealth-related factors including superior network access and performance hysteresis." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Even in rich nations like Australia and <a href="https://www.statista.com/statistics/262179/market-share-held-by-mobile-operating-systems-in-the-united-kingdom/">the U.K.</a>, iPhones account for less than half of sales. Predictably, they are over-represented in analytics and logs owing to wealth-related factors including superior network access and performance hysteresis.</figcaption>
</figure>
<p>Smartphone replacement rates have remained roughly in line with previous years, although we should expect replacement cycles to lengthen in next year's data. <a href="https://www.sellcell.com/blog/how-often-do-people-upgrade-their-phone-2023-statistics/">Survey reports</a> and market analysts continue to estimate average replacement at 3-4 years, depending on segment. Premium devices last longer, and a higher fraction of devices may be older in wealthy geographies. Combined with <a href="https://www2.deloitte.com/us/en/insights/economy/consumer-pulse/state-of-the-us-consumer.html">discretionary spending pressure</a> and <a href="https://www.ons.gov.uk/economy/inflationandpriceindices/articles/costofliving/latestinsights">inflationary impacts on household budgets</a>, consumer intent to spend on electronics has taken a hit, which will show up as extended device lifetimes until conditions improve. <a href="https://www.counterpointresearch.com/insights/apple-refurbished-smartphone-volumes-grew-16-yoy-globally-in-2022/">Increasing demand for refurbished devices</a> also adds to observable device aging.</p>
<p>The data paints a substantially similar picture to previous years: the web is experienced on devices that are slower and older than those carried by affluent developers and corporate directors whose purchasing decisions are not impacted by transitory inflation.</p>
<p>To serve users effectively, we must do extra work to <a href="https://glazkov.com/2023/07/30/live-as-our-customer/">live as our customers do</a>.</p>
<h4 id="test-device-recommendations">Test Device Recommendations <a class="permalink" href="#test-device-recommendations">#</a></h4>
<p>Re-using <a href="https://infrequently.org/2022/12/performance-baseline-2023/#devices-1">last year's P75 device calculus</a>, our estimate is based on a device sold new, unlocked for the mid-2020 to mid-2021 global <abbr>ASP</abbr> of ~$350-375.</p>
<p>Representative examples from that time period include the <a href="https://www.gsmarena.com/samsung_galaxy_a51-9963.php">Samsung Galaxy A51</a> and the <a href="https://www.gsmarena.com/google_pixel_4a-10123.php">Pixel 4a</a>. Neither model featured 5G, and we cannot expect 5G to play a significant role in worldwide baselines for at least the next several years.</p>
<p>The A51 featured <a href="https://www.gsmarena.com/samsung_galaxy_a51-9963.php#:~:text=Octa%2Dcore%20(4x2.3%20GHz%20Cortex%2DA73%20%26%204x1.7%20GHz%20Cortex%2DA53)">eight slow cores (4x2.3 GHz Cortex-A73 and 4x1.7 GHz Cortex-A53) on a 10nm process</a>:</p>
<figure>
<a href="https://browser.geekbench.com/v6/cpu/compare/350184?baseline=3639070" alt="Geekbench 6 scores for the Galaxy A51 versus today's leading device." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/a51_vs_iphone_15_pro.webp" alt="Geekbench 6 scores for the Galaxy A51 versus today's leading device." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Geekbench 6 scores for the Galaxy A51 versus today's leading device.</figcaption>
</figure>
<p><a href="https://www.gsmarena.com/google_pixel_4a-10123.php#:~:text=Octa%2Dcore%20(2x2.2%20GHz%20Kryo%20470%20Gold%20%26%206x1.8%20GHz%20Kryo%20470%20Silver)">The Pixel 4a's slow, eight-core big.LITTLE configuration was fabricated on an 8nm process</a>:</p>
<figure>
<a href="https://browser.geekbench.com/v6/cpu/compare/4295850?baseline=3639070" alt="Google spent more on the &lt;abbr&gt;SoC&lt;/abbr&gt; for the Pixel 4a and enjoyed a later launch date, boosting performance relative to the A51." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/pixel_4a_vs_iphone_15_pro.webp" alt="Google spent more on the &lt;abbr&gt;SoC&lt;/abbr&gt; for the Pixel 4a and enjoyed a later launch date, boosting performance relative to the A51." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Google spent more on the <abbr>SoC</abbr> for the Pixel 4a and enjoyed a later launch date, boosting performance relative to the A51.</figcaption>
</figure>
<p><a href="https://www.androidpolice.com/why-google-pixel-phones-hardware-do-not-sell/">Pixels have never sold well,</a> and Google's focus on strong <qabbr>SoC performance per dollar was sadly not replicated across the Android ecosystem, forcing us to use the A51 as our stand-in.</qabbr></p>
<p>Devices within the envelope of our attention are 15-25% as fast as those carried by programmers and their bosses — even in wealthy markets.</p>
<p>The Galaxy may be <a href="https://browser.geekbench.com/v6/cpu/compare/4301594?baseline=442665">slightly faster</a> than last year's <a href="https://infrequently.org/2022/12/performance-baseline-2023/#:~:text=The%20best%20analogue%20you%20can%20buy%20for%20a%20representative%20P75%20device%20today%20are%20~%24200%20Androids%20from%20the%20last%20year%20or%20two%2C%20such%20as%20the%20Samsung%20Galaxy%20A50%20and%20the%20Nokia%20G11.">recommendation</a> of the <a href="https://www.gsmarena.com/samsung_galaxy_a50-9554.php">Galaxy A50 for testing</a>, but the picture is muddy:</p>
<figure>
<a href="https://browser.geekbench.com/v5/cpu/compare/22080983?baseline=22095605" alt="Geekbench 5 shows almost no improvement between the A50 and the A51." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb5.webp" alt="Geekbench 5 shows almost no improvement between the A50 and the A51." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Geekbench 5 shows almost no improvement between the A50 and the A51.</figcaption>
</figure>
<figure>
<a href="https://browser.geekbench.com/v6/cpu/compare/4302358?baseline=4260956" alt="Geekbench 6 shows the same story within the margin of error. The low-end is stagnant, and still &lt;a href='https://www.statista.com/statistics/934471/smartphone-shipments-by-price-category-worldwide/' target='_new'&gt;30% of worldwide volume&lt;/a&gt;." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/a50_vs_a51_gb6.webp" alt="Geekbench 6 shows the same story within the margin of error. The low-end is stagnant, and still &lt;a href='https://www.statista.com/statistics/934471/smartphone-shipments-by-price-category-worldwide/' target='_new'&gt;30% of worldwide volume&lt;/a&gt;." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Geekbench 6 shows the same story within the margin of error. The low-end is stagnant, and still <a href="https://www.statista.com/statistics/934471/smartphone-shipments-by-price-category-worldwide/" target="_new">30% of worldwide volume</a>.</figcaption>
</figure>
<p>If you're building a test lab today, refurbished A51s can be had for ~$150. Even better, the newer <a href="https://browser.geekbench.com/v6/cpu/compare/3826666?baseline=350184">Nokia G100</a> can be had for as little as $100, and it's <a href="https://www.nokia.com/phones/en_us/nokia-g-100?sku=F22CF51022200">faithful</a> to the sluggish original in <a href="https://www.gsmarena.com/compare.php3?idPhone1=9963&amp;idPhone2=12373">nearly every respect</a>.</p>
<p>If your test bench is based on last year's recommended A50 or <a href="https://www.gsmarena.com/nokia_g11-11358.php">Nokia G11</a>, I do not recommend upgrading in 2024. The absolute gains are so slight that the difference will be hard to feel, and bench stability has a value all its own. Looking forward, we can also predict that our bench performance will be stable until 2025.</p>
<p>Claims about how "performant" modern front-end tools are have to be evaluated in this slow, stagnant context.</p>
<h3 id="desktop">Desktop <a class="permalink" href="#desktop">#</a></h3>
<p>It's a bit easier to understand the Desktop situation because the Edge telemetry I have access to provides statistically significant insight into <a href="https://www.statista.com/statistics/576473/united-states-quarterly-pc-shipment-share-apple/">85+% of the market</a>.</p>
<h4 id="device-performance-1">Device Performance <a class="permalink" href="#device-performance-1">#</a></h4>
<p>The <abbr>TL;DR</abbr> for desktop performance is that Edge telemetry puts ~45% of devices in a "low-end" bucket, meaning they have &lt;= 4 cores or &lt;= 4GB of RAM.</p>
<table class="summary">
<thead>
<tr>
<td>Device Tier</td>
<td>Fleet %</td>
<td>Definition</td>
</tr>
</thead>
<tbody>
<tr>
<td>Low-end</td>
<td>45%</td>
<td>Either:<br>&lt;= 4 cores, or<br>&lt;= 4GB RAM</td>
</tr>
<tr>
<td>Medium</td>
<td>48%</td>
<td><abbr>HDD</abbr> (not <abbr>SSD</abbr>), or<br>4-16 GB RAM, or<br>4-8 cores</td>
</tr>
<tr>
<td>High</td>
<td>7%</td>
<td><abbr>SSD</abbr> +<br>&gt; 8 cores +<br>&gt; 16GB RAM</td>
</tr>
</tbody>
</table>
<p>20% of users are on <abbr>HDD</abbr>s (not <abbr>SSD</abbr>s), and nearly all of those users also have low core counts (and slow cores).</p>
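<p>To make the bucketing concrete, here's a minimal TypeScript sketch of the tiering rule described in the table above. The function name and the sample inputs are illustrative assumptions, not the actual telemetry pipeline, and edge cases may be handled differently in the real definitions.</p>
<pre><code>type DeviceTier = "low" | "medium" | "high";

// Bucket a desktop device using the cut-offs from the table above.
function deviceTier(cores: number, ramGB: number, hasSsd: boolean): DeviceTier {
  // High end requires all three: an SSD, more than 8 cores, more than 16GB RAM.
  const isHighEnd = [hasSsd, cores > 8, ramGB > 16].every(Boolean);
  if (isHighEnd) return "high";

  // Low end needs only one weak attribute: 4 cores or fewer, or 4GB RAM or less.
  const isLowEnd = [cores > 4, ramGB > 4].some((ok) => !ok);
  if (isLowEnd) return "low";

  // Everything else (HDD machines, 4-16GB RAM, 4-8 cores) lands in the middle.
  return "medium";
}

// Hypothetical machines, not telemetry records:
console.log(deviceTier(4, 8, true));   // "low"    -- capped by core count
console.log(deviceTier(6, 8, false));  // "medium" -- HDD with mid-range specs
console.log(deviceTier(12, 32, true)); // "high"
</code></pre>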
<p>You might be tempted to dismiss this data because it doesn't include Macs, which are faster than the PC cohort. Recall, however, that the snapshot also excludes ChromeOS.</p>
<p>ChromeOS share has veered wildly in recent years, representing 50%-200% of Mac shipments in a given quarter. In '21 and '22, ChromeOS shipments regularly doubled Mac sales. Despite post-pandemic mean reversion, <a href="https://www.idc.com/getdoc.jsp?containerId=IDC_P36344">according to <abbr>IDC</abbr></a> ChromeOS devices outsold Macs ~5.7M to ~4.7M in 2023 Q2. The trend reversed in Q3, with Macs almost doubling ChromeOS sales, but slow ChromeOS devices aren't going away and, from a population perspective, more than offset Macs today. Analysts also <a href="https://www.idc.com/promo/pcdforecast">predict growth in the low end of the market as educational institutions begin to refresh their past purchases.</a></p>
<h4 id="networks-1">Networks <a class="permalink" href="#networks-1">#</a></h4>
<p>Desktop-attached networks <a href="https://www.fiercetelecom.com/broadband/ookla-global-fixed-download-speeds-nearly-doubled-2022">continue to improve</a>, notably <a href="https://www.allconnect.com/blog/broadband-availability-by-type">in the U.S.</a> Regulatory intervention and subsidies have done much to spur enhancements in access to U.S. fixed broadband, although <a href="https://www.allconnect.com/blog/broadband-availability-by-type#:~:text=Top%20and%20bottom%20states%20for%20speed">disparities in access remain</a> and the gains <a href="https://www.cbsnews.com/news/affordable-internet-service-could-be-lost-fcc-program-to-run-out-of-funds/">may not persist</a>.</p>
<p>This suggests that it's time to also bump our baseline for desktop tests beyond the 5Mbps/1Mbps/28ms configuration that <a href="https://www.webpagetest.org/">WebPageTest.org's "Cable" profile</a> defaults to.</p>
<p>How far should we bump it? Publicly available data is unclear, and I've come to find out that Edge's telemetry lacks good network observation statistics (doh!); Windows telemetry doesn't capture a proxy for network quality, I no longer have access to Chrome's data, the <a href="https://developer.chrome.com/docs/crux/api#effective_connection_type">population-level telemetry available from CrUX is unhelpful</a>, and <a href="https://www.fcc.gov/reports-research/reports/measuring-broadband-america/measuring-fixed-broadband-twelfth-report#:~:text=Chart%2016.2%3A%20The%20ratio%20of%2070/70%20consistent%20download%20speed%20to%20advertised%20download%20speed.">telcos li</a>...er...sorry, <em>"market their products in accordance with local laws and advertising standards."</em> All of this makes it difficult to construct an estimate.</p>
<p>One option is to use a population-level assessment of medians from <a href="https://www.speedtest.net/global-index">something like the Speedtest.net data</a> and then construct a histogram from median speeds. This is both time-consuming and error-prone, as population-level data varies widely across the world. Emerging markets with high mobile internet use and dense populations <a href="https://www.fiercewireless.com/wireless/indias-top-2-mobile-carriers-fight-supremacy-fixed-broadband">can feature</a> poor fixed-line broadband penetration <a href="https://www.opensignal.com/2023/11/22/closing-the-gap-fixed-broadbands-role-in-global-progress">compared with Western markets</a>.</p>
<p>Another option is to mathematically hand-wave using the best evidence we can get. This might allow us to reconstruct probable P75 and P90 values if we know something about the historical distribution of connections. From there, we can gut-check using other spot data. To do this, we need to assume some data set is representative, a fraught decision all its own. Biting the bullet, we could start from the Speedtest.net global survey data, which currently fails to provide anything but medians (P50):</p>
<figure>
<a href="https://www.speedtest.net/global-index" alt="Speedtest.net's global median values are unhelpful on their own, both because they represent users who are testing for speed (and not organic throughput) and because they don't give us a fuller understanding of the distribution." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/global_fixed_speedtest_medians.webp" alt="Speedtest.net's global median values are unhelpful on their own, both because they represent users who are testing for speed (and not organic throughput) and because they don't give us a fuller understanding of the distribution." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>Speedtest.net's global median values are unhelpful on their own, both because they represent users who are testing for speed (and not organic throughput) and because they don't give us a fuller understanding of the distribution.</figcaption>
</figure>

<p>After many attempted Stupid Math Tricks with poorly fitting curves (bandwidth seems to be a funky cousin of log-normal), I've decided to wing it and beg for help: instead of trying to be clever, I'm leaning on <a href="https://radar.cloudflare.com/quality/">Cloudflare Radar's P25/P50/P75 distributions</a> for <a href="https://en.wikipedia.org/wiki/List_of_countries_by_number_of_Internet_users">populous, openly-connected countries with &gt;= ~50M internet users</a>. It's cheeky, but a weighted average of the P75 of download speeds (3/4ths of all connections are faster) should get us in the ballpark. We can then use the usual 5:1 downlink:uplink ratio to come up with an uplink estimate. We can also derive a weighted average for the P75 <abbr>RTT</abbr> from Cloudflare's data. Because Cloudflare doesn't distinguish mobile from desktop connections, this may be an overly conservative estimate, but it's still more permissive than what we had been pegged to in years past:</p>

<table class="summary" id="natspeeds">
<caption>National P75 Downlink and <abbr>RTT</abbr></caption>
<thead>
<tr>
<td>Country</td>
<td>P75 Downlink (Mbps)</td>
<td>P75 <abbr>RTT</abbr> (ms)</td>
</tr>
</thead>
<tbody>
<tr>
<td>India</td>
<td>4</td>
<td>114</td>
</tr>
<tr>
<td>USA</td>
<td>11</td>
<td>58</td>
</tr>
<tr>
<td>Indonesia</td>
<td>5</td>
<td>81</td>
</tr>
<tr>
<td>Brazil</td>
<td>8</td>
<td>71</td>
</tr>
<tr>
<td>Nigeria</td>
<td>3</td>
<td>201</td>
</tr>
<tr>
<td>Pakistan</td>
<td>3</td>
<td>166</td>
</tr>
<tr>
<td>Bangladesh</td>
<td>5</td>
<td>114</td>
</tr>
<tr>
<td>Japan</td>
<td>17</td>
<td>42</td>
</tr>
<tr>
<td>Mexico</td>
<td>7</td>
<td>75</td>
</tr>
<tr>
<td>Egypt</td>
<td>4</td>
<td>100</td>
</tr>
<tr>
<td>Germany</td>
<td>16</td>
<td>36</td>
</tr>
<tr>
<td>Turkey</td>
<td>7</td>
<td>74</td>
</tr>
<tr>
<td>Philippines</td>
<td>7</td>
<td>72</td>
</tr>
<tr>
<td>Vietnam</td>
<td>7</td>
<td>72</td>
</tr>
<tr>
<td>United Kingdom</td>
<td>16</td>
<td>37</td>
</tr>
<tr>
<td>South Korea</td>
<td>24</td>
<td>26</td>
</tr>
<tr>
<td><em>Weighted Avg.</em></td>
<td>7.2</td>
<td>94</td>
</tr>
</tbody>
</table>
<p>We therefore update our P75 link estimate to <strong>7.2Mbps down, 1.4Mbps up, and 94ms <abbr>RTT</abbr></strong>.</p>
<p>This is a mild crime against statistics, not least of all because it averages unlike quantities and fails to sift mobile from desktop, but all the other methods available at time of writing are just as bad. Regardless, this new baseline is half again as much link capacity as last year, showing measurable improvement in networks worldwide.</p>
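<p>Here's a small TypeScript sketch of that back-of-the-envelope calculation: weight each country's P75 figures from the table above by its internet-user count, then derive the uplink from the 5:1 ratio. The user counts in the sample rows are rough placeholders for illustration, not the exact weights behind the table, so treat the output as a sanity check on the method rather than a reproduction of the result.</p>
<pre><code>interface CountryRow {
  name: string;
  internetUsersM: number; // millions of internet users -- placeholder values
  p75DownMbps: number;    // P75 downlink from the table above
  p75RttMs: number;       // P75 RTT from the table above
}

// Only a few rows are shown; the full calculation uses every country listed.
const rows: CountryRow[] = [
  { name: "India", internetUsersM: 880, p75DownMbps: 4,  p75RttMs: 114 },
  { name: "USA",   internetUsersM: 310, p75DownMbps: 11, p75RttMs: 58 },
  { name: "Japan", internetUsersM: 100, p75DownMbps: 17, p75RttMs: 42 },
];

function weightedBaseline(data: CountryRow[]) {
  const users = data.reduce((sum, r) => sum + r.internetUsersM, 0);
  const down  = data.reduce((sum, r) => sum + r.p75DownMbps * r.internetUsersM, 0) / users;
  const rtt   = data.reduce((sum, r) => sum + r.p75RttMs * r.internetUsersM, 0) / users;
  return {
    downMbps: Number(down.toFixed(1)),
    upMbps: Number((down / 5).toFixed(1)), // assume the usual 5:1 downlink:uplink ratio
    rttMs: Math.round(rtt),
  };
}

console.log(weightedBaseline(rows)); // with the full table, this lands near 7.2 / 1.4 / 94
</code></pre>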
<p>If you or your company are able to generate a credible worldwide latency estimate in the higher percentiles for next year's update, please <a href="https://infrequently.org/about-me/">get in touch</a>.</p>
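<p>To fold this baseline into an automated lab, the Chrome DevTools Protocol exposes network and CPU throttling directly. The sketch below assumes Puppeteer; the 4x CPU multiplier and the target URL are placeholders rather than figures from this article, and the small helper converts megabits per second into the bytes per second the protocol expects.</p>
<pre><code>import puppeteer from "puppeteer";

// Megabits per second to bytes per second, as CDP expects.
const mbps = (n: number) => (n * 1e6) / 8;

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const client = await page.target().createCDPSession();

  // Apply the P75 baseline derived above: 7.2Mbps down, 1.4Mbps up, 94ms RTT.
  await client.send("Network.emulateNetworkConditions", {
    offline: false,
    latency: 94,                   // minimum request latency, in ms
    downloadThroughput: mbps(7.2), // bytes per second
    uploadThroughput: mbps(1.4),   // bytes per second
  });

  // Optional: slow the CPU as well. The 4x multiplier is an assumption, not a
  // recommendation from this article.
  await client.send("Emulation.setCPUThrottlingRate", { rate: 4 });

  await page.goto("https://example.com/", { waitUntil: "load" }); // placeholder URL
  await browser.close();
})();
</code></pre>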
<h4 id="market-factors-1">Market Factors <a class="permalink" href="#market-factors-1">#</a></h4>
<p>The forces that shape the PC population have been largely fixed for many years. Since 2010, <a href="https://en.wikipedia.org/wiki/Market_share_of_personal_computer_vendors#Worldwide_(1996%E2%80%932022)">volumes have been on a slow downward glide path</a>, shrinking from ~350MM per year a decade ago to ~260MM in 2018. The pandemic buying spree of 2021 pushed volumes above 300MM per year for the first time in eight years, with the vast majority of those devices being sold at low-end price points — think ~$300 Chromebooks rather than M1 MacBooks.</p>
<p>Lest we assume low-end means "short-lived", <a href="https://blog.google/outreach-initiatives/education/automatic-update-extension-chromebook/">recent announcements regarding software support for these devices</a> will considerably extend their impact. This low-end cohort will filter through the device population for years to come, pulling our performance budgets down, even as renewed process improvement is unlocking improved power efficiency and performance at the high end of the first-sale market. This won't be as pronounced as the diffusion of $100 smartphones has been in emerging markets, but the longer life-span of desktops is already a factor in our model.</p>
<h4 id="test-device-recommendations-1">Test Device Recommendations <a class="permalink" href="#test-device-recommendations-1">#</a></h4>
<p>Per our methodology from last year, which uses the 5-8 year replacement cycle for a PC, we update our target date to late 2017 or early 2018, but leave the average selling price fixed between $600-700. Eventually we'll need to factor the past couple of years of gyrations in inflation and supply chains into our estimate, but not this year.</p>
<p>So what did $650, give or take, buy in late 2017 or early 2018?</p>
<p>One option was a <a href="https://www.theverge.com/2017/5/30/15698476/dell-inspiron-gaming-desktop-announced">naff-looking tower from Dell, optimistically pitched at gamers</a>, with a <abbr>CPU</abbr> that scores <a href="https://browser.geekbench.com/v6/cpu/compare/3639070?baseline=253445">poorly versus a modern phone</a>, but which blessedly sports 8GB of RAM.</p>
<p>In laptops (the larger segment), ~$650 bought the <a href="https://www.pcmag.com/reviews/lenovo-yoga-720-12-inch">Lenovo Yoga 720 (12")</a>, with a 2-core (4-thread) <a href="https://ark.intel.com/content/www/us/en/ark/products/95442/intel-core-i3-7100u-processor-3m-cache-2-40-ghz.html">Core i3-7100U</a> and 4GB of RAM. Versions with more RAM and a faster chip were available, but cost considerably more than our budget. This was not a fast box. <a href="https://browser.geekbench.com/v6/cpu/compare/3639070?baseline=4311168">Here's a device with that CPU compared to a modern phone</a>; not pretty:</p>
<figure>
<a href="https://browser.geekbench.com/v6/cpu/compare/3639070?baseline=4311168" alt="The phones of wealthy developers absolutely smoke the baseline PC." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/i3-7100U_vs_iphone_15_pro.webp" alt="The phones of wealthy developers absolutely smoke the baseline PC." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption>The phones of wealthy developers absolutely smoke the baseline PC.</figcaption>
</figure>
<p>It's considerably faster than <a href="https://www.bhphotovideo.com/c/product/1704697-REG/hp_6k639ut_aba_probook_fortis_14_n4500.html">some devices still being sold to schools, though</a>.</p>
<p>What does this mean for our target devices? There's wild variation in performance per dollar below $600, which will only increase as inflation-affected cohorts grow to represent a larger fraction of the fleet. Intel's move (finally!) off of 14nm also means that gains are starting to arrive at the low end, but in an uneven way. General advice is therefore hard to issue. That said, we can triangulate based on what we know about the market:</p>

<p>My recommendation, then, to someone setting up a new lab today is not to spend more than $350 on a new test device. Consider laptops with chips like the <a href="https://www.intel.com/content/www/us/en/products/sku/197309/intel-celeron-processor-n4120-4m-cache-up-to-2-60-ghz/specifications.html">N4120</a>, <a href="https://www.intel.com/content/www/us/en/products/sku/212326/intel-celeron-processor-n4500-4m-cache-up-to-2-80-ghz/specifications.html">N4500</a>, or the <a href="https://www.intel.com/content/www/us/en/products/sku/212328/intel-celeron-processor-n5105-4m-cache-up-to-2-90-ghz/specifications.html">N5105</a>. Test devices should also have no more than 8GB of RAM, and preferably 4GB. The <a href="https://a.co/d/98wiGAl">2021 HP 14</a> is a fine proxy. The <a href="https://www.bhphotovideo.com/c/product/1761467-REG/hp_7f424ua_aba_14_14_ep0010nr_laptop_intel.html">updated ~$375 version</a> will do in a pinch, but try to spend less if you can. <a href="https://www.notebookcheck.net/Mobile-Processors-Benchmark-List.2436.0.html?type=&amp;sort=&amp;search=Intel&amp;itemselect_13189=13189&amp;itemselect_13111=13111&amp;itemselect_11533=11533&amp;itemselect_13079=13079&amp;or=0&amp;itemselect_13189=13189&amp;itemselect_13111=13111&amp;itemselect_11533=11533&amp;itemselect_13079=13079&amp;showCount=1&amp;showBars=1&amp;geekbench5_1_single=1&amp;geekbench5_1_multi=1&amp;geekbench6_2_single=1&amp;geekbench6_2_multi=1&amp;octane2=1&amp;speedometer=1&amp;cpu_fullname=1&amp;codename=1&amp;l2cache=1&amp;l3cache=1&amp;tdp=1&amp;mhz=1&amp;turbo_mhz=1&amp;cores=1&amp;threads=1">Test devices should preferably score no higher than 1,000 in single-core Geekbench 6 tests</a>; a line <a href="https://browser.geekbench.com/v6/cpu/4352217">the HP 14's N4120 easily ducks, clocking in at just over 350</a>.</p>
<h2 id="takeaways">Takeaways <a class="permalink" href="#takeaways">#</a></h2>
<p>There's a lot of good news embedded in this year's update. Devices and networks have finally started to get a bit faster (as predicted), pulling budgets upwards.</p>
<p>At the same time, the community remains in solid denial about the disastrous consequences of an over-reliance on JavaScript. This paints a picture of path dependence — front-end isn't moving on from approaches that hurt users, <a href="https://x.com/FredKSchott/status/1744842592905552227?s=20">even as the costs shift back onto teams that have been degrading life for users at the margins</a>.</p>
<p>We can anticipate continued improvement in devices over the next few years, and network pace may level out somewhat as the uneven deployment of 5G lurches forward. Regardless, the gap between the digital haves and have-nots continues to grow. Those least able to afford the fast devices are actively taxed by developers high on their own developer experience (<abbr title="Developer Experience">DX</abbr>).</p>
<p>It's not a mystery why folks who spend every waking hour inside a digital privilege bubble are not building with empathy or humility when nobody calls them to account. What's mysterious is that anybody pays them to do it. The Product Management (<abbr>PM</abbr>) and Engineering Management (<abbr>EM</abbr>) disciplines have utterly failed organisations building on the web, failing to put pro-user and pro-business constraints on the enthusiasms of developers.</p>
<p>Instead of cabining the enthusiasms of the FP crowd, managers meekly repeated bullshit about how <em>"you can't hire for fundamentals"</em> as they waved in busloads of bootcampers whose React-heavy <abbr>CV</abbr> paint jobs had barely dried. They could have run bake-offs. They could have paid for skills that would serve the business over time. They could have facilitated learning anything the business valued. Instead, they abdicated. The kicker is that they didn't even reliably make things better for the class they imagined they were serving.</p>
<p>This post was partially drafted on airplane wifi, and I can assure you that wealthy folks also experience <abbr>RTT</abbr>s north of 500ms and <a href="https://en.wikipedia.org/wiki/Gogo_Inflight_Internet#Technologies">channel capacity in the single-digit Mbps range</a>.</p>
<p>Even the wealthiest users step out of the privilege bubble sometimes. Are these EMs and PMs <em>really</em> happy to lose that business?</p>
<figure>
<a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Wealthy users are going to experience networks with properties that are even worse than the 'bad' networks offered to the Next Billion Users. At an altitude of 40k feet and a ground speed for 580 MPH somewhere over Alberta, CA, your correspondent's bandwidth is scarce, lopsided, and laggy." target="_new">
<picture class="preview">
<source sizes="(max-width: 1200px) 70vw, 600px" srcset="https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=3600 2400w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=2400 1600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=1800 1200w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=1200 800w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=900 600w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=750 500w,
https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg?nf_resize=fit&amp;w=600 400w">
<img src="https://infrequently.org/2024/01/performance-inequality-gap-2024/airplane_wifi_speedtest.jpg" alt="&lt;em&gt;Tap for a larger version.&lt;/em&gt;&lt;br&gt;Wealthy users are going to experience networks with properties that are even worse than the 'bad' networks offered to the Next Billion Users. At an altitude of 40k feet and a ground speed for 580 MPH somewhere over Alberta, CA, your correspondent's bandwidth is scarce, lopsided, and laggy." decoding="async" loading="lazy">
</source></picture>
</a>
<figcaption><em>Tap for a larger version.</em><br>Wealthy users are going to experience networks with properties that are even worse than the 'bad' networks offered to the Next Billion Users. At an altitude of 40k feet and a ground speed of 580 MPH somewhere over Alberta, Canada, your correspondent's bandwidth is scarce, lopsided, and laggy.</figcaption>
</figure>
<p>Of course, any trend that can't continue won't, and <abbr>INP</abbr>'s impact is already being felt. The great JavaScript merry-go-round may grind to a stop, but the momentum of consistently bad choices is formidable. Like passengers on a cruise ship ramming a boardwalk at flank speed, JavaScript regret is dawning far too late and interacting very poorly with something we ate. As the good ship Scripting shudders and lists on the remains of the Ferris Wheel, it's not exactly clear how to get off, but the choices that led us here are at least visible, if only by their belated consequences.</p>
<h3 id="the-great-branch-mispredict">The Great Branch Mispredict <a class="permalink" href="#the-great-branch-mispredict">#</a></h3>
<p>We got to a place where performance has been a constant problem in large part because a tribe of programmers convinced themselves that it <em>wasn't</em> and <em>wouldn't be</em>. The circa '13 narrative asserted that:</p>
<ul>
<li>CPUs would keep getting faster (just like they always had).</li>
<li>Networks would get better, or at least not get worse.</li>
<li>Organisations had all learned the lessons of Google and Facebook's adventures in Ajax.</li>
</ul>
<p>It was all bullshit, <em>and many of us spotted it a mile away</em>.</p>
<p>But tribalism-boosted confirmation bias mixed with JavaScript's toxic positivity culture to precipitate out a Silicon Prosperity Gospel; all resources would go infinite if you just <em>believed</em>. No matter how wrong the premise, we kept executing down the obviously-falsified branch until the buffers drained.</p>
<p>The solutions are social, not technical, because the delusions are social, rather than technical.</p>
<p>The stories that propped up <abbr>IE8</abbr>-focused frameworks like Angular and React in the mobile era have only served as comforting myths to ward off emerging device and network reality. For the past decade, the important question hasn't been if enough good technology existed, but rather how long the delusions would keep hold.</p>
<p>The community wanted to live in a different world than the one we inhabit, so we collectively mis-predicted. A healthy web community will value learning faster.</p>
<p>How deep was the branch? And how many cycles will the fault cost us? If CPUs and networks continue to improve at the rate of the past two years, and <abbr>INP</abbr> finally forces a reckoning, the answer might be as little as a decade. I fear we will not be so lucky; an entire generation has been trained to ignore reality, to prize tribalism rather than engineering rigor, and to devalue fundamentals. Those folks may not find the next couple of years to their liking.</p>
<p>Front-end's hangover from the JavaScript party is gonna <em>suck</em>.</p>

+ 235
- 0
cache/2024/1d60fc5548a6fe61da80a4e16892fa0c/index.html View File

@@ -0,0 +1,235 @@
<!doctype html><!-- This is a valid HTML5 document. -->
<!-- Screen readers, SEO, extensions and so on. -->
<html lang="fr">
<!-- Has to be within the first 1024 bytes, hence before the `title` element
See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
<meta charset="utf-8">
<!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
<!-- The viewport meta is quite crowded and we are responsible for that.
See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
<meta name="viewport" content="width=device-width,initial-scale=1">
<!-- Required to make a valid HTML5 document. -->
<title>Deep Democracy - IAPOP (archive) — David Larlet</title>
<meta name="description" content="Publication mise en cache pour en conserver une trace.">
<!-- That good ol' feed, subscribe :). -->
<link rel="alternate" type="application/atom+xml" title="Feed" href="/david/log/">
<!-- Generated from https://realfavicongenerator.net/ such a mess. -->
<link rel="apple-touch-icon" sizes="180x180" href="/static/david/icons2/apple-touch-icon.png">
<link rel="icon" type="image/png" sizes="32x32" href="/static/david/icons2/favicon-32x32.png">
<link rel="icon" type="image/png" sizes="16x16" href="/static/david/icons2/favicon-16x16.png">
<link rel="manifest" href="/static/david/icons2/site.webmanifest">
<link rel="mask-icon" href="/static/david/icons2/safari-pinned-tab.svg" color="#07486c">
<link rel="shortcut icon" href="/static/david/icons2/favicon.ico">
<meta name="msapplication-TileColor" content="#f7f7f7">
<meta name="msapplication-config" content="/static/david/icons2/browserconfig.xml">
<meta name="theme-color" content="#f7f7f7" media="(prefers-color-scheme: light)">
<meta name="theme-color" content="#272727" media="(prefers-color-scheme: dark)">
<!-- Is that even respected? Retrospectively? What a shAItshow…
https://neil-clarke.com/block-the-bots-that-feed-ai-models-by-scraping-your-website/ -->
<meta name="robots" content="noai, noimageai">
<!-- Documented, feel free to shoot an email. -->
<link rel="stylesheet" href="/static/david/css/style_2021-01-20.css">
<!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_regular.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_bold.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_italic.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<script>
function toggleTheme(themeName) {
document.documentElement.classList.toggle(
'forced-dark',
themeName === 'dark'
)
document.documentElement.classList.toggle(
'forced-light',
themeName === 'light'
)
}
const selectedTheme = localStorage.getItem('theme')
if (selectedTheme !== 'undefined') {
toggleTheme(selectedTheme)
}
</script>

<meta name="robots" content="noindex, nofollow">
<meta content="origin-when-cross-origin" name="referrer">
<!-- Canonical URL for SEO purposes -->
<link rel="canonical" href="https://iapop.com/deep-democracy/">

<body class="remarkdown h1-underline h2-underline h3-underline em-underscore hr-center ul-star pre-tick" data-instant-intensity="viewport-all">


<article>
<header>
<h1>Deep Democracy - IAPOP</h1>
</header>
<nav>
<p class="center">
<a href="/david/" title="Aller à l’accueil"><svg class="icon icon-home">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-home"></use>
</svg> Accueil</a> •
<a href="https://iapop.com/deep-democracy/" title="Lien vers le contenu original">Source originale</a>
<br>
Mis en cache le 2024-01-31
</p>
</nav>
<hr>
<h3>Definition of Deep Democracy</h3>
<p>The concept of Deep Democracy was developed by Arnold Mindell. It is defined as an attitude and a principle.</p>
<p class="quote">Attitude: Deep Democracy is an attitude that focuses on the awareness of voices that are both central and marginal. This type of awareness can be focused on groups, organizations, one’s own inner experiences, people in conflict, etc. Allowing oneself to take seriously seemingly unimportant events and feelings can often bring unexpected solutions to both group and inner conflicts.</p>
<p class="quote">Principle: Unlike “classical” democracy, which focuses on majority rule, Deep Democracy suggests that all voices, states of awareness, and frameworks of reality are important. Deep Democracy also suggests that the information carried within these voices, awarenesses, and frameworks are all needed to understand the complete process of the system. The meaning of this information appears, when the various frameworks and voices are relating to each other. Deep Democracy is a process of relationship, not a state-oriented still picture, or a set of policies.</p>
<p>From Deep Democracy principle and attitude, Glossary: [1]</p>
<h3>A Brief History of Deep Democracy</h3>
<p>Deep Democracy is a psycho-social-political paradigm and methodology. The term Deep Democracy was developed by Arny Mindell in 1988 and first appeared in Leader as Martial Artist (Mindell, 1992). Mindell, a physicist and Jungian Analyst, had researched and written extensively on how awareness creates reality and how we perceive it on different levels, creating different frameworks of reality. An example of this is how we perceive time: the measurable reality of the seconds ticking on a clock, the dreamlike “subjective” perception of time as it passes during an encounter with a lover, and the sentient essence of timelessness as we catch the moment of a sunrise that goes beyond time as we know it and replaces, for a moment, the concept of future with hope. Mindell calls his paradigm Processwork, which formulates these principles and demonstrates how they can be used in psychotherapy in many of his books. In the late eighties he started to formulate them as a political principle that he called Deep Democracy. Unlike “classical” democracy, which focuses on majority rule, Deep Democracy suggests that all voices, states of awareness, and frameworks of reality are important. Deep Democracy also suggests that the information carried within these voices, awarenesses, and frameworks is all needed to understand the complete process of the system. Deep Democracy is an attitude that focuses on the awareness of voices that are both central and marginal.</p>
<p>
</p>
<div class="ngg-gallery-singlepic-image ngg-left">
<a href="https://iapop.com/wp-content/gallery/admin/crdlse300.gif" title="" data-src="https://iapop.com/wp-content/gallery/admin/crdlse300.gif" data-thumbnail="https://iapop.com/wp-content/gallery/admin/thumbs/thumbs_crdlse300.gif" data-image-id="26" data-title="CR DL SE outlines" data-description="" target="_self" class="shutterset_f8297252056becfbf675efe5c4f6b8f7">
<img class="ngg-singlepic" src="https://iapop.com/wp-content/gallery/admin/cache/crdlse300.gif-nggid0226-ngg0dyn-0x0x100-00f0w010c010r110f110r010t010.gif" alt="CR DL SE outlines" title="CR DL SE outlines">
</a>
</div>
<p>
This type of awareness can be focused on groups, organizations, one’s own inner experiences, people in conflict, etc. Allowing oneself to take seriously seemingly unimportant events and feelings can often bring unexpected solutions to both group and inner conflicts.
</p>
<p>Although the term and the concepts of Deep Democracy are now being used by various groups in different ways, they share a common denominator that Mindell describes so well: an experience of Deep Democracy as a process of flow in which all actors on the stage are needed to create the play that we are watching.</p>
<p>Numerous attempts to implement Deep Democracy are occurring simultaneously throughout the world. Just as conventional democracy strives to include all people in a political process, Deep Democracy goes further: it strives to foster a deeper level of dialogue and inclusivity, one that not only includes all people in the sense of the right to vote, but also allows space for various and competing views, tensions, feelings, and styles of communication, in a way that supports awareness of relative rank, power, and privilege and of the ways in which these tend to marginalize particular views, individuals, and groups.</p>
<p class="quote">Deep Democracy is our sense that the world is here to help us to become our entire selves, and that we are here to help the world to become whole (Mindell, 1992).</p>
<h3>Roots of Democracy</h3>
<p class="quote">de•moc• ra•cy (di mak’re se) n. [Gr demokratia &lt; demos, the people + kratein, to rule &lt; kratos, strength] 1 government in which the people hold the ruling power either directly or through elected representatives 2 a country, state, etc. with such government 3 majority rule 4 the principle of equality of rights, opportunity, and treatment 5 the common people, esp. as the wielders of political power. (Webster’s, 1983, p. 366)</p>
<p class="quote">We have frequently printed the word Democracy, yet I cannot too often repeat that it is a word the real gist of which still sleeps, quite unawakened, notwithstanding the resonance and the many angry tempests out of which its syllables have come, from pen or tongue. It is a great word, whose history, I suppose, remains unwritten, because that history has yet to be enacted. -Walt Whitman, Democratic Vistas, 1871</p>
<p>Democracy—commonly defined as the free and equal right of every person to participate in a system of government, often practiced by electing representatives of the people—is generally said to have originated in Ancient Greece when the demos organized against their leaders’ abuse of power. But democracy is more than a body of laws and procedures related to the sharing of power. President Carter said that, “Democracy is like the experience of life itself—always changing, infinite in its variety, sometimes turbulent and all the more valuable for having been tested by adversity” (Carter, 1978). How is democracy like life? In what dimensions is it changing and turbulent?</p>
<p>One example of the dynamic turbulence of democracy in the United States is the evolution of freedom of the press and the practical application of the First Amendment rights to free speech. The first American newspaper, Publick Occurrences, Both Foreign and Domestic (Massachusetts Historical Society, 2004), published its first and only issue in Boston on Thursday, September 25th, 1690. Publication was stopped by the governor of Boston who objected to the paper’s negative tone regarding British rule and by the local ministries who were offended by a report that the King of France had had an affair with his son’s wife (Virtual Museum of Printing, 2004).</p>
<h3>A Brief History of Free Speech in the US and its Relationship to Deep Democracy</h3>
<p>For social activists, free speech and the freedom of the press were key issues to fight for. Deep Democracy, however, is a principle that tries to include all experiences. If you speak freely about a political opponent, voicing your opinion while marginalizing the part of you that realizes your opponent is also a person with many dimensions, you have censored yourself and have not exercised a deeper freedom of speech. Free speech and the freedom of the press are important, but without Deep Democracy they can become an abusive and tyrannical force, one that does not relate to the emotional and social realities and total experiences of the people being reported on.</p>
<p>Up until 1919 free speech and freedom of the press in the United States meant “little more than no prior restraint, that is, one could say what one wanted, but then could be prosecuted for it” (Holmes, 1919). There was no protection for the dissemination of ideas. In 1859 John Stuart Mill pointed out the risks involved in suppressing ideas in his essay, On Liberty:</p>
<p class="quote">But the peculiar evil of silencing the expression of an opinion is, that it is robbing the human race; posterity as well as the existing generation; those who dissent from the opinion, still more than those who hold it. If the opinion is right, they are deprived of the opportunity of exchanging error for truth: if wrong, they lose, what is almost as great a benefit, the clearer perception and livelier impression of truth, produced by its collision with error. (1859)</p>
<p>Despite Mill’s impassioned plea and the wide distribution of On Liberty—which had great impact on the public discourse of its day as well as on the course of political philosophy since—the US maintained a very conservative view towards freedom of speech until 1919.</p>
<p>That view changed abruptly in 1919 when Supreme Court Justice Oliver Wendell Holmes entered a dissenting opinion in favor of a group of radical pamphleteers:</p>
<p class="quote">Jacob Abrams and others had been convicted of distributing pamphlets criticizing the Wilson administration for sending troops to Russia in the summer of 1918. Although the government could not prove that the pamphlets had actually hindered the operation of the military, an anti-radical lower court judge had found that they might have done so, and found Abrams and his co-defendants guilty. On appeal, seven members of the Supreme Court had used Holmes’s “clear and present danger” test to sustain the conviction. But Holmes, joined by Louis D. Brandeis, dissented, and it is this dissent that is widely recognized as the starting point in modern judicial concern for free expression. (US Department of State, 1919)</p>
<p>Abrams’ publications seem almost benign by today’s standards: “Workers—Wake Up. . . . Woe unto those who will be in the way of progress. Let solidarity live. . . . German militarism combined with allied capitalism to crush the Russian revolution. . .” and spoke of working class enlightenment (US Department of State, 1919).</p>
<p>Justice Holmes argued in their defense that:</p>
<p class="quote">It is only the present danger of immediate evil or an intent to bring it about that warrants Congress in setting a limit to the expression of opinion where private rights are not concerned. Congress certainly cannot forbid all effort to change the mind of the country. (Holmes, 1919)</p>
<p>In the discussion of free speech, we often marginalize the need for relationship between the parties. Public dialogue allows each side to react to what is going on. Both parties, those who champion free speech and those who champion limitations in the interests of public safety, need to relate more to each other and learn to understand the visions and ideals behind those opinions. In a deeply democratic society, this is considered more sustainable than a seesaw process of forbidding and allowing the publication of certain texts.</p>
<p>In his dissent, Justice Holmes supported the importance of public discourse and freedom of speech with these now widely quoted words: “The best test of truth is the power of the thought to get itself accepted in the competition of the market” (Holmes, 1919). But, after more than twenty-five centuries of development in political philosophy, it is only within the last century that US and European thought has begun to support freedom of speech in a meaningful way. Holmes’s thinking didn’t account for structural forces that tend to repress various ideas in support of special interests.</p>
<p>Joseph Stiglitz, former Chairman of the Council of Economic Advisers under President Clinton and former Chief Economist and Senior Vice President of the World Bank, maintains:</p>
<p class="quote">Secrecy . . . undermines democracy. There can be democratic accountability only if those to whom these public institutions are supposed to be accountable are well informed about what they are doing—including what choices they confronted and how those decisions were made. (Stiglitz, 2003, p. 229)</p>
<h3>Evolution of Deep Democracy</h3>
<p class="quote">The most fundamental forum is your own heart. Both as a facilitator and as a human being, you must learn to hear yourself there. Arnold Mindell, “Sitting in the Fire”, 1995</p>
<p>Deep Democracy threatens to push the envelope of political thinking even further. Deep Democracy has many aspects, many of which relate to philosophical concepts derived from quantum physics. At its deepest manifestation, Deep Democracy refers to an openness not only towards the views of other people and groups, but also towards emotions and personal experiences, which tend to get excluded from conflict and rational public discourse (Mindell, 1992). As R. Buckminster Fuller (1981) said, we need to support the intuitive wisdom and comprehensive informedness of each and every individual to ensure our continued fitness for survival as a species.<br>
Deep Democracy has crossed over into many fields and has been picked up by many authors, some using it as defined by Mindell, others using only particular aspects of it, as is often the case with crossovers. For example, speaking in a circle of women who gathered shortly after 9/11, Susan Collin Marks, of Search for Common Ground, the world’s largest international conflict-resolution NGO, said:</p>
<p class="quote">We need to accommodate the different groups and not have a win-lose [situation] where the winner takes all. In South Africa—having been under apartheid fifty years, and before that under all sorts of authoritarian rule, the British, the Dutch—when we came to our transition we asked ourselves, “What is democracy, what does it mean, what does it mean for us?” A group of people went around the country asking, “What do you think democracy is, and what are we going to call it, and what will our democracy look like?” They came up with the term “deep democracy.” They said, “For us, this is about deep democracy, not just about surface democracy.” (Peace X Peace, 2004)</p>
<p>She intuited a need for a system that is awareness-based and not based only on the distribution of social power. If you follow the Black Economic Empowerment Movement in South Africa, this need for dialogue, for bringing in different frames of reference and discussing what we value at the core of our lives and how we feel about each other, is crucial. If we address the power issues and financial realities of the Middle Eastern conflict and create a political solution, it cannot be sustainable without addressing the Deep Democracy aspects: the feelings of hate and vengeance, the hope for a peaceful life together, and the despair of not having found the acceptance and love that you hoped for.</p>

<div class="ngg-gallery-singlepic-image ngg-right">
<a href="https://iapop.com/wp-content/gallery/admin/super200.gif" title="" data-src="https://iapop.com/wp-content/gallery/admin/super200.gif" data-thumbnail="https://iapop.com/wp-content/gallery/admin/thumbs/thumbs_super200.gif" data-image-id="27" data-title="super200" data-description="" target="_self" class="shutterset_818e752240c0a8313694be806ff94a8b">
<img class="ngg-singlepic" src="https://iapop.com/wp-content/gallery/admin/cache/super200.gif-nggid0227-ngg0dyn-0x0x100-00f0w010c010r110f110r010t010.gif" alt="super200" title="super200">
</a>
</div>

<p>The idea of supporting a deeper dialogue has been around at least since Plato argued for the inclusion of women in public discourse. Athens needed the intelligence of all and couldn’t afford not to accept women as thinkers and leaders. Even if Plato didn’t expand his thinking enough to extend that acceptance to slaves, to other races, or to women outside the upper classes, he planted a cultural seed that needed another twenty-five hundred years to sprout and is only now coming to fruition in culturally creative ways.</p>
<p>Governmental facilitation of protest is challenging because political and bureaucratic inertia prevents it from being open to change from the outside. Suppression of peaceful protest in the name of order invites repression while unrestrained protest invites anarchy. The challenge then is one of balance: to defend the right to freedom of speech and assembly while maintaining public order and countering attempts at intimidation or violence.</p>
<p>This is a difficult balance to maintain. Ultimately, it depends on the commitment of those in power to maintaining the institutions of democracy and the precepts of individual rights as well as the commitment of the mainstream to support these efforts and the commitment of the marginalized groups to self-limit their forms of protest. A US government publication called What is Democracy maintains that, “Democratic societies are capable of enduring the most bitter disagreement among its citizens—except for disagreement about the legitimacy of democracy itself” (US Department of State, 2004). The symbiotic connection between democracy and human development is an aspect of Deep Democracy.</p>
<p>One of the primary concerns of Deep Democracy is the use, maintenance, and awareness of metaskills (Arnold Mindell, 1992, p. 49). The concept of openness to diversity and dialogue between various views doesn’t mean that the facilitator is a pushover—that is only one metaskill (although it often reflects a lack of awareness). Facilitators must also at times practice, embody, and express other metaskills such as toughness, anger, intractability, love, detachment, concern for the well-being of others, and a genuine desire to achieve consensus. Some of the metaskills in that list are organic responses. However, when a facilitator uses her internal organic responses to better inform her intervention, that is a metaskill. This is the reason why the human development—the internal psychological and spiritual growth and inner peace—of the facilitator is so important.</p>
<p>Deep Democracy involves not only openness to other individuals, groups, and diverse views but an openness to experience, which includes feelings, dreams, body symptoms, altered states of consciousness, synchronicities, and an awareness of signals, roles, and the structural dynamics of the interactions between the parties involved.<br>
Repression and exploitation are the two most basic modern forms of structural violence; cardiovascular diseases and cancer are the two basic somatic conditions brought on by modernization. Repression and cardiovascular diseases are similar in that both impede circulation. Exploitation and cancer resemble each other in that a part of the social or human organism lives at the expense of the rest. Peace research and health research are metaphors for each other; each can learn from the other. Similarly, both peace theory and medical science emphasize the role of consciousness and mobilization in healing.</p>
<p>The relationship between somatic experience, altered states of consciousness, and conflict may not be only metaphorical. Ikeda says that Buddhism (and other spiritual traditions) “transcends the dimension on which all phenomena are perceived as interrelated and reveals the dynamism of the universal life on which all interrelations depend.” Similarly, Process Oriented Psychology (also known as Process Work) and its Worldwork theories and practice use experiential phenomena to reveal the deeper underlying universal dynamic and its interrelations on a practical level.</p>
</article>


<hr>

<footer>
<p>
<a href="/david/" title="Aller à l’accueil"><svg class="icon icon-home">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-home"></use>
</svg> Accueil</a> •
<a href="/david/log/" title="Accès au flux RSS"><svg class="icon icon-rss2">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-rss2"></use>
</svg> Suivre</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant><svg class="icon icon-user-tie">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-user-tie"></use>
</svg> Pro</a> •
<a href="mailto:david%40larlet.fr" title="Envoyer un courriel"><svg class="icon icon-mail">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-mail"></use>
</svg> Email</a> •
<abbr class="nowrap" title="Hébergeur : Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340"><svg class="icon icon-hammer2">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-hammer2"></use>
</svg> Légal</abbr>
</p>
<template id="theme-selector">
<form>
<fieldset>
<legend><svg class="icon icon-brightness-contrast">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-brightness-contrast"></use>
</svg> Thème</legend>
<label>
<input type="radio" value="auto" name="chosen-color-scheme" checked> Auto
</label>
<label>
<input type="radio" value="dark" name="chosen-color-scheme"> Foncé
</label>
<label>
<input type="radio" value="light" name="chosen-color-scheme"> Clair
</label>
</fieldset>
</form>
</template>
</footer>
<script src="/static/david/js/instantpage-5.1.0.min.js" type="module"></script>
<script>
function loadThemeForm(templateName) {
const themeSelectorTemplate = document.querySelector(templateName)
const form = themeSelectorTemplate.content.firstElementChild
themeSelectorTemplate.replaceWith(form)

form.addEventListener('change', (e) => {
const chosenColorScheme = e.target.value
localStorage.setItem('theme', chosenColorScheme)
toggleTheme(chosenColorScheme)
})

const selectedTheme = localStorage.getItem('theme')
if (selectedTheme && selectedTheme !== 'undefined') {
form.querySelector(`[value="${selectedTheme}"]`).checked = true
}
}

const prefersColorSchemeDark = '(prefers-color-scheme: dark)'
window.addEventListener('load', () => {
let hasDarkRules = false
for (const styleSheet of Array.from(document.styleSheets)) {
let mediaRules = []
for (const cssRule of styleSheet.cssRules) {
if (cssRule.type !== CSSRule.MEDIA_RULE) {
continue
}
// WARNING: Safari does not support `conditionText`.
if (cssRule.conditionText) {
if (cssRule.conditionText !== prefersColorSchemeDark) {
continue
}
} else {
// Fallback when `conditionText` is unavailable: keep only rules whose
// serialized text contains the dark-scheme media query, mirroring the
// check in the branch above.
if (!cssRule.cssText.includes(prefersColorSchemeDark)) {
continue
}
}
mediaRules = mediaRules.concat(Array.from(cssRule.cssRules))
}

// WARNING: do not try to insert a rule into a styleSheet you are
// currently iterating over, otherwise the browser will get stuck
// in an infinite loop…
for (const mediaRule of mediaRules) {
styleSheet.insertRule(mediaRule.cssText)
hasDarkRules = true
}
}
if (hasDarkRules) {
loadThemeForm('#theme-selector')
}
})
</script>
</body>
</html>

+ 61
- 0
cache/2024/1d60fc5548a6fe61da80a4e16892fa0c/index.md View File

@@ -0,0 +1,61 @@
title: Deep Democracy - IAPOP
url: https://iapop.com/deep-democracy/
hash_url: 1d60fc5548a6fe61da80a4e16892fa0c
archive_date: 2024-01-31

<h3>Definition of Deep Democracy</h3>
<p>The concept of Deep Democracy was developed by Arnold Mindell. It is defined as an attitude and a principle.</p>
<p class="quote">Attitude: Deep Democracy is an attitude that focuses on the awareness of voices that are both central and marginal. This type of awareness can be focused on groups, organizations, one’s own inner experiences, people in conflict, etc. Allowing oneself to take seriously seemingly unimportant events and feelings can often bring unexpected solutions to both group and inner conflicts.</p>
<p class="quote">Principle: Unlike “classical” democracy, which focuses on majority rule, Deep Democracy suggests that all voices, states of awareness, and frameworks of reality are important. Deep Democracy also suggests that the information carried within these voices, awarenesses, and frameworks are all needed to understand the complete process of the system. The meaning of this information appears, when the various frameworks and voices are relating to each other. Deep Democracy is a process of relationship, not a state-oriented still picture, or a set of policies.</p>
<p>From “Deep Democracy principle and attitude”, Glossary [1].</p>
<h3>A Brief History of Deep Democracy</h3>
<p>Deep Democracy is a psycho-social-political paradigm and methodology. The term Deep Democracy was developed by Arny Mindell in 1988 and first appeared in Leader as Martial Artist (Mindell, 1992). Mindell, a physicist and Jungian Analyst, had researched and written extensively on how awareness creates reality and how we perceive it on different levels, creating different frameworks of reality. An example of this is how we perceive time: the measurable reality of the seconds ticking in a clock, the dreamlike “subjective” perception of time as it passes during an encounter with a lover, and the sentient essence of timelessness as we catch the moment of a sunrise that goes beyond time as we know it and replaces, for a moment, the concept of future with hope. Mindell calls his paradigm Processwork; in many of his books he formulates these principles and demonstrates how they can be used in psychotherapy. In the late eighties he started to formulate them as a political principle that he called Deep Democracy. Unlike “classical” democracy, which focuses on majority rule, Deep Democracy suggests that all voices, states of awareness, and frameworks of reality are important. Deep Democracy also suggests that the information carried within these voices, awarenesses, and frameworks are all needed to understand the complete process of the system. Deep Democracy is an attitude that focuses on the awareness of voices that are both central and marginal.</p>
<p>
</p><div class="ngg-gallery-singlepic-image ngg-left">
<a href="https://iapop.com/wp-content/gallery/admin/crdlse300.gif" title="" data-src="https://iapop.com/wp-content/gallery/admin/crdlse300.gif" data-thumbnail="https://iapop.com/wp-content/gallery/admin/thumbs/thumbs_crdlse300.gif" data-image-id="26" data-title="CR DL SE outlines" data-description="" target="_self" class="shutterset_f8297252056becfbf675efe5c4f6b8f7">
<img class="ngg-singlepic" src="https://iapop.com/wp-content/gallery/admin/cache/crdlse300.gif-nggid0226-ngg0dyn-0x0x100-00f0w010c010r110f110r010t010.gif" alt="CR DL SE outlines" title="CR DL SE outlines">
</a>
</div><p>
This type of awareness can be focused on groups, organizations, one’s own inner experiences, people in conflict, etc. Allowing oneself to take seriously seemingly unimportant events and feelings can often bring unexpected solutions to both group and inner conflicts.
</p><p>Although the term and the concepts of Deep Democracy are now being used by various groups in different ways, they share a common denominator that Mindell describes so well: an experience of Deep Democracy as a process of flow in which all actors on the stage are needed to create the play that we are watching.</p>
<p>Numerous attempts to implement Deep Democracy are occurring simultaneously throughout the world. Just as conventional democracy strives to include all people in a political process, Deep Democracy goes further: it strives to foster a deeper level of dialogue and inclusivity, one that not only includes all people in the sense of the right to vote, but also allows space for various and competing views, tensions, feelings, and styles of communication, in a way that supports awareness of relative rank, power, and privilege and of the ways in which these tend to marginalize particular views, individuals, and groups.</p>
<p class="quote">Deep Democracy is our sense that the world is here to help us to become our entire selves, and that we are here to help the world to become whole (Mindell, 1992).</p>
<h3>Roots of Democracy</h3>
<p class="quote">de•moc• ra•cy (di mak’re se) n. [Gr demokratia &lt; demos, the people + kratein, to rule &lt; kratos, strength] 1 government in which the people hold the ruling power either directly or through elected representatives 2 a country, state, etc. with such government 3 majority rule 4 the principle of equality of rights, opportunity, and treatment 5 the common people, esp. as the wielders of political power. (Webster’s, 1983, p. 366)</p>
<p class="quote">We have frequently printed the word Democracy, yet I cannot too often repeat that it is a word the real gist of which still sleeps, quite unawakened, notwithstanding the resonance and the many angry tempests out of which its syllables have come, from pen or tongue. It is a great word, whose history, I suppose, remains unwritten, because that history has yet to be enacted. -Walt Whitman, Democratic Vistas, 1871</p>
<p>Democracy—commonly defined as the free and equal right of every person to participate in a system of government, often practiced by electing representatives of the people—is generally said to have originated in Ancient Greece when the demos organized against their leaders’ abuse of power. But democracy is more than a body of laws and procedures related to the sharing of power. President Carter said that, “Democracy is like the experience of life itself—always changing, infinite in its variety, sometimes turbulent and all the more valuable for having been tested by adversity” (Carter, 1978). How is democracy like life? In what dimensions is it changing and turbulent?</p>
<p>One example of the dynamic turbulence of democracy in the United States is the evolution of freedom of the press and the practical application of the First Amendment rights to free speech. The first American newspaper, Publick Occurrences, Both Foreign and Domestic (Massachusetts Historical Society, 2004), published its first and only issue in Boston on Thursday, September 25th, 1690. Publication was stopped by the governor of Boston who objected to the paper’s negative tone regarding British rule and by the local ministries who were offended by a report that the King of France had had an affair with his son’s wife (Virtual Museum of Printing, 2004).</p>
<h3>A Brief History of Free Speech in the US and its Relationship to Deep Democracy</h3>
<p>For social activists, free speech and the freedom of the press were key issues to fight for. Deep Democracy, however, is a principle that tries to include all experiences. If you speak freely about a political opponent, voicing your opinion while marginalizing the part of you that realizes your opponent is also a person with many dimensions, you have censored yourself and have not exercised a deeper freedom of speech. Free speech and the freedom of the press are important, but without Deep Democracy they can become an abusive and tyrannical force, one that does not relate to the emotional and social realities and total experiences of the people being reported on.</p>
<p>Up until 1919 free speech and freedom of the press in the United States meant “little more than no prior restraint, that is, one could say what one wanted, but then could be prosecuted for it” (Holmes, 1919). There was no protection for the dissemination of ideas. In 1859 John Stuart Mill pointed out the risks involved in suppressing ideas in his essay, On Liberty:</p>
<p class="quote">But the peculiar evil of silencing the expression of an opinion is, that it is robbing the human race; posterity as well as the existing generation; those who dissent from the opinion, still more than those who hold it. If the opinion is right, they are deprived of the opportunity of exchanging error for truth: if wrong, they lose, what is almost as great a benefit, the clearer perception and livelier impression of truth, produced by its collision with error. (1859)</p>
<p>Despite Mill’s impassioned plea and the wide distribution of On Liberty—which had great impact on the public discourse of its day as well as on the course of political philosophy since—the US maintained a very conservative view towards freedom of speech until 1919.</p>
<p>That view changed abruptly in 1919 when Supreme Court Justice Oliver Wendell Holmes entered a dissenting opinion in favor of a group of radical pamphleteers:</p>
<p class="quote">Jacob Abrams and others had been convicted of distributing pamphlets criticizing the Wilson administration for sending troops to Russia in the summer of 1918. Although the government could not prove that the pamphlets had actually hindered the operation of the military, an anti-radical lower court judge had found that they might have done so, and found Abrams and his co-defendants guilty. On appeal, seven members of the Supreme Court had used Holmes’s “clear and present danger” test to sustain the conviction. But Holmes, joined by Louis D. Brandeis, dissented, and it is this dissent that is widely recognized as the starting point in modern judicial concern for free expression. (US Department of State, 1919)</p>
<p>Abrams’ publications seem almost benign by today’s standards: “Workers—Wake Up. . . . Woe unto those who will be in the way of progress. Let solidarity live. . . . German militarism combined with allied capitalism to crush the Russian revolution. . .” and spoke of working class enlightenment (US Department of State, 1919).</p>
<p>Justice Holmes argued in their defense that:</p>
<p class="quote">It is only the present danger of immediate evil or an intent to bring it about that warrants Congress in setting a limit to the expression of opinion where private rights are not concerned. Congress certainly cannot forbid all effort to change the mind of the country. (Holmes, 1919)</p>
<p>In the discussion of free speech, we often marginalize the need for relationship between the parties. Public dialogue allows each side to react to what is going on. Both parties, those who champion free speech and those who champion limitations in the interests of public safety, need to relate more to each other and learn to understand the visions and ideals behind those opinions. In a deeply democratic society, this is considered more sustainable than a seesaw process of forbidding and allowing the publication of certain texts.</p>
<p>In his dissent, Justice Holmes supported the importance of public discourse and freedom of speech with these now widely quoted words: “The best test of truth is the power of the thought to get itself accepted in the competition of the market” (Holmes, 1919). But, after more than twenty-five centuries of development in political philosophy, it is only within the last century that US and European thought has begun to support freedom of speech in a meaningful way. Holmes’s thinking didn’t account for structural forces that tend to repress various ideas in support of special interests.</p>
<p>Joseph Stiglitz, former Chairman of the Council of Economic Advisers under President Clinton and former Chief Economist and Senior Vice President of the World Bank, maintains:</p>
<p class="quote">Secrecy . . . undermines democracy. There can be democratic accountability only if those to whom these public institutions are supposed to be accountable are well informed about what they are doing—including what choices they confronted and how those decisions were made. (Stiglitz, 2003, p. 229)</p>
<h3>Evolution of Deep Democracy</h3>
<p class="quote">The most fundamental forum is your own heart. Both as a facilitator and as a human being, you must learn to hear yourself there. Arnold Mindell, “Sitting in the Fire”, 1995</p>
<p>Deep Democracy threatens to push the envelope of political thinking even further. Deep Democracy has many aspects, many of which relate to philosophical concepts derived from quantum physics. At its deepest manifestation, Deep Democracy refers to an openness not only towards the views of other people and groups, but also towards emotions and personal experiences, which tend to get excluded from conflict and rational public discourse (Mindell, 1992). As R. Buckminster Fuller (1981) said, we need to support the intuitive wisdom and comprehensive informedness of each and every individual to ensure our continued fitness for survival as a species.<br>
Deep Democracy has crossed over into many fields and has been picked up by many authors, some using it as defined by Mindell, others using only particular aspects of it, as is often the case with crossovers. For example, speaking in a circle of women who gathered shortly after 9/11, Susan Collin Marks, of Search for Common Ground, the world’s largest international conflict-resolution NGO, said:</p>
<p class="quote">We need to accommodate the different groups and not have a win-lose [situation] where the winner takes all. In South Africa—having been under apartheid fifty years, and before that under all sorts of authoritarian rule, the British, the Dutch—when we came to our transition we asked ourselves, “What is democracy, what does it mean, what does it mean for us?” A group of people went around the country asking, “What do you think democracy is, and what are we going to call it, and what will our democracy look like?” They came up with the term “deep democracy.” They said, “For us, this is about deep democracy, not just about surface democracy.” (Peace X Peace, 2004)</p>
<p>She intuited a need for a system that is awareness-based and not based only on the distribution of social power. If you follow the Black Economic Empowerment Movement in South Africa, this need for dialogue, for bringing in different frames of reference and discussing what we value at the core of our lives and how we feel about each other, is crucial. If we address the power issues and financial realities of the Middle Eastern conflict and create a political solution, it cannot be sustainable without addressing the Deep Democracy aspects: the feelings of hate and vengeance, the hope for a peaceful life together, and the despair of not having found the acceptance and love that you hoped for.</p>

<div class="ngg-gallery-singlepic-image ngg-right">
<a href="https://iapop.com/wp-content/gallery/admin/super200.gif" title="" data-src="https://iapop.com/wp-content/gallery/admin/super200.gif" data-thumbnail="https://iapop.com/wp-content/gallery/admin/thumbs/thumbs_super200.gif" data-image-id="27" data-title="super200" data-description="" target="_self" class="shutterset_818e752240c0a8313694be806ff94a8b">
<img class="ngg-singlepic" src="https://iapop.com/wp-content/gallery/admin/cache/super200.gif-nggid0227-ngg0dyn-0x0x100-00f0w010c010r110f110r010t010.gif" alt="super200" title="super200">
</a>
</div>

<p>The idea of supporting a deeper dialogue has been around at least since Plato argued for the inclusion of women in public discourse. Athens needed the intelligence of all and couldn’t afford not to accept women as thinkers and leaders. Even if Plato didn’t expand his thinking enough to extend that acceptance to slaves, to other races, or to women outside the upper classes, he planted a cultural seed that needed another twenty-five hundred years to sprout and is only now coming to fruition in culturally creative ways.</p>
<p>Governmental facilitation of protest is challenging because political and bureaucratic inertia prevents it from being open to change from the outside. Suppression of peaceful protest in the name of order invites repression while unrestrained protest invites anarchy. The challenge then is one of balance: to defend the right to freedom of speech and assembly while maintaining public order and countering attempts at intimidation or violence.</p>
<p>This is a difficult balance to maintain. Ultimately, it depends on the commitment of those in power to maintaining the institutions of democracy and the precepts of individual rights as well as the commitment of the mainstream to support these efforts and the commitment of the marginalized groups to self-limit their forms of protest. A US government publication called What is Democracy maintains that, “Democratic societies are capable of enduring the most bitter disagreement among its citizens—except for disagreement about the legitimacy of democracy itself” (US Department of State, 2004). The symbiotic connection between democracy and human development is an aspect of Deep Democracy.</p>
<p>One of the primary concerns of Deep Democracy is the use, maintenance, and awareness of metaskills (Arnold Mindell, 1992, p. 49). The concept of openness to diversity and dialogue between various views doesn’t mean that the facilitator is a pushover—that is only one metaskill (although it often reflects a lack of awareness). Facilitators must also at times practice, embody, and express other metaskills such as toughness, anger, intractability, love, detachment, concern for the well-being of others, and a genuine desire to achieve consensus. Some of the metaskills in that list are organic responses. However, when a facilitator uses her internal organic responses to better inform her intervention, that is a metaskill. This is the reason why the human development—the internal psychological and spiritual growth and inner peace—of the facilitator is so important.</p>
<p>Deep Democracy involves not only openness to other individuals, groups, and diverse views but an openness to experience, which includes feelings, dreams, body symptoms, altered states of consciousness, synchronicities, and an awareness of signals, roles, and the structural dynamics of the interactions between the parties involved.<br>
Repression and exploitation are the two most basic modern forms of structural violence; cardiovascular diseases and cancer are the two basic somatic conditions brought on by modernization. Repression and cardiovascular diseases are similar in that both impede circulation. Exploitation and cancer resemble each other in that a part of the social or human organism lives at the expense of the rest. Peace research and health research are metaphors for each other; each can learn from the other. Similarly, both peace theory and medical science emphasize the role of consciousness and mobilization in healing.</p>
<p>The relationship between somatic experience, altered states of consciousness, and conflict may not be only metaphorical. Ikeda says that Buddhism (and other spiritual traditions) “transcends the dimension on which all phenomena are perceived as interrelated and reveals the dynamism of the universal life on which all interrelations depend.” Similarly, Process Oriented Psychology (also known as Process Work) and its Worldwork theories and practice use experiential phenomena to reveal the deeper underlying universal dynamic and its interrelations on a practical level.</p>

+ 415
- 0
cache/2024/cd9184008ba5d9e4c9be4d0a0eea4f60/index.html View File

@@ -0,0 +1,415 @@
<!doctype html><!-- This is a valid HTML5 document. -->
<!-- Screen readers, SEO, extensions and so on. -->
<html lang="fr">
<!-- Has to be within the first 1024 bytes, hence before the `title` element
See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
<meta charset="utf-8">
<!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
<!-- The viewport meta is quite crowded and we are responsible for that.
See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
<meta name="viewport" content="width=device-width,initial-scale=1">
<!-- Required to make a valid HTML5 document. -->
<title>Daring Fireball: The Vision Pro (archive) — David Larlet</title>
<meta name="description" content="Publication mise en cache pour en conserver une trace.">
<!-- That good ol' feed, subscribe :). -->
<link rel="alternate" type="application/atom+xml" title="Feed" href="/david/log/">
<!-- Generated from https://realfavicongenerator.net/ such a mess. -->
<link rel="apple-touch-icon" sizes="180x180" href="/static/david/icons2/apple-touch-icon.png">
<link rel="icon" type="image/png" sizes="32x32" href="/static/david/icons2/favicon-32x32.png">
<link rel="icon" type="image/png" sizes="16x16" href="/static/david/icons2/favicon-16x16.png">
<link rel="manifest" href="/static/david/icons2/site.webmanifest">
<link rel="mask-icon" href="/static/david/icons2/safari-pinned-tab.svg" color="#07486c">
<link rel="shortcut icon" href="/static/david/icons2/favicon.ico">
<meta name="msapplication-TileColor" content="#f7f7f7">
<meta name="msapplication-config" content="/static/david/icons2/browserconfig.xml">
<meta name="theme-color" content="#f7f7f7" media="(prefers-color-scheme: light)">
<meta name="theme-color" content="#272727" media="(prefers-color-scheme: dark)">
<!-- Is that even respected? Retrospectively? What a shAItshow…
https://neil-clarke.com/block-the-bots-that-feed-ai-models-by-scraping-your-website/ -->
<meta name="robots" content="noai, noimageai">
<!-- Documented, feel free to shoot an email. -->
<link rel="stylesheet" href="/static/david/css/style_2021-01-20.css">
<!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_regular.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_bold.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_italic.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<script>
function toggleTheme(themeName) {
document.documentElement.classList.toggle(
'forced-dark',
themeName === 'dark'
)
document.documentElement.classList.toggle(
'forced-light',
themeName === 'light'
)
}
const selectedTheme = localStorage.getItem('theme')
if (selectedTheme !== 'undefined') {
toggleTheme(selectedTheme)
}
</script>

<meta name="robots" content="noindex, nofollow">
<meta content="origin-when-cross-origin" name="referrer">
<!-- Canonical URL for SEO purposes -->
<link rel="canonical" href="https://daringfireball.net/2024/01/the_vision_pro">

<body class="remarkdown h1-underline h2-underline h3-underline em-underscore hr-center ul-star pre-tick" data-instant-intensity="viewport-all">


<article>
<header>
<h1>Daring Fireball: The Vision Pro</h1>
</header>
<nav>
<p class="center">
<a href="/david/" title="Aller à l’accueil"><svg class="icon icon-home">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-home"></use>
</svg> Accueil</a> •
<a href="https://daringfireball.net/2024/01/the_vision_pro" title="Lien vers le contenu original">Source originale</a>
<br>
Mis en cache le 2024-01-31
</p>
</nav>
<hr>
<p>For the last six days, I’ve been simultaneously testing three entirely new products from Apple. The first is a VR/AR headset with eye-tracking controls. The second is a revolutionary spatial computing productivity platform. The third is a breakthrough personal entertainment device.</p>

<p>A headset, a spatial productivity platform, and a personal entertainment device.</p>

<p>I’m sure you’re already getting it. These are not three separate devices. They’re one: Apple Vision Pro. But if you’ll pardon the shameless homage to Steve Jobs’s famous iPhone introduction, I think these three perspectives are the best way to consider it.</p>

<h2>The Hardware</h2>

<p>Vision Pro comes in a surprisingly big box. I was expecting a package roughly the dimensions of a HomePod box; instead, a Vision Pro retail box is <a href="https://daringfireball.net/misc/2024/01/homepods-v-vision-pro.jpeg">quite a bit larger than <em>two</em> HomePod boxes</a> stacked atop each other. (I own more HomePods than most people.)</p>

<p>There’s a lot inside. The top half of the package contains the Vision Pro headset itself, with the light seal, a light seal cushion, and the default Solo Knit Band already attached. The lower half contains the battery, the charger (30W), the cables, the Dual Loop Band, the Getting Started book (which is beautifully printed in full color, on excellent paper — it feels like a keepsake), the polishing cloth<sup id="fnr1-2024-01-30"><a href="#fn1-2024-01-30">1</a></sup>, and an extra light seal cushion.</p>

<p>To turn Vision Pro on, you connect the external battery pack’s power cable to the Vision Pro’s power connector, and rotate it a quarter turn to lock it into place. There are small dots on the headset’s dime-sized power socket showing how to align the cable connector’s small LED. The LED pulses when Vision Pro turns on. (I miss Apple’s glowing power indicator LEDs — this is a really delightful touch.) When Vision Pro has finished booting and is ready to use, it makes a pleasant welcoming sound.</p>

<p>Then you put Vision Pro on. If you’re using the Solo Knit Band, you tighten and loosen it using a dial on the band behind your right ear. VisionOS directs you to raise or lower the headset appropriately to position it at just the right height on your face relative to your eyes. If Vision Pro thinks your eyes are too close to the displays, it will suggest you switch to the “+” size light seal cushion. You get two light seal cushions, but they’re not the same: mine are labeled “W” and “W+”. The “+” is the same width, to match your light seal, but adds a wee bit more space between your eyes and the displays inside Vision Pro. For me the default (non-“+”) one fits fine.</p>

<p>The software then guides you through a series of screens to calibrate the eye tracking. It’s all very obvious, and kind of fun. It’s almost like a simple game: you stare at a series of dots in a circle, and pinch your index finger and thumb as you stare at each one. You go through this three times, in three different artificial lighting conditions: dark, medium, and bright. Near the end of the first-run experience, you’re prompted to bring your iPhone or iPad nearby, just like when setting up a new iPhone or iPad. This allows your Vision Pro to get your Apple ID credentials and Wi-Fi password without entering any of that manually. It’s a very smooth onboarding process. And then that’s it, you’re in and using Vision Pro. </p>

<p>There’s no getting around some fundamental problems with the Vision Pro hardware.</p>

<p>First is the fact that it uses an external battery pack connected via a power cable. The battery itself is about the width and height of an iPhone 15/15 Pro, but thicker. And the battery is heavy: about 325g, compared to 187g for an iPhone 15 Pro, and 221g for a 15 Pro Max. It’s closer in thickness and weight to <em>two</em> iPhone 15’s than it is to one. And the tethered power cable can be an annoyance. Vision Pro has no built-in reserve battery — disconnect the power cable from the headset and it immediately shuts off. It clicks firmly into place, so there’s no risk of accidentally disconnecting it. But if you buy an <a href="https://www.apple.com/shop/product/MW283LL/A/apple-vision-pro-battery">extra Vision Pro Battery for $200</a>, you can’t hot-swap them — you need to shut down first.</p>

<p>Second is the fact that Vision Pro is <a href="https://daringfireball.net/linked/2024/01/20/vision-pro-weight">heavy</a>. I’ve used it for hours at a time without any discomfort, but fatigue does set in, from the weight alone. You never forget that you’re wearing it. Related to Vision Pro’s weight is the fact that it’s quite large. It’s a big-ass pair of heavy goggles on your face. There’s nothing subtle about it — either from your first-person perspective wearing it, or from the third-person perspective of someone else looking at you while you wear it.</p>

<p>One of Apple’s suggestions for adjusting the position of Vision Pro on your face is to balance the weight/pressure on your face equally between your forehead and your cheeks. I found this to be good advice. My instincts, originally, were to place it slightly too low on my face, which causes the system to advise positioning it higher. It needs to be positioned just right on your face both so that you see well through its displays, and so it can see your eyes for Optic ID, Vision Pro’s equivalent of Face ID for unlocking the device and confirming actions that require authentication (like accessing passwords from your keychain, or purchasing apps). Within a day or two, it became natural for me to put it on without needing to fuss with the fit.</p>

<p>The default stretchy Solo Knit Band not only works well for me, but I prefer it, comfort- and convenience-wise, to the Dual Loop Band. With the Solo Knit Band, you put it on and tighten it by twisting the aforementioned dial. When you take it off, you loosen it first. You want it tight enough on your face that you can’t realistically skip tightening and loosening it each time you put it on and take it off. It’s a bit like getting accustomed to a new watch strap — at first it feels finicky, but you quickly develop muscle memory. In addition to learning how high to place Vision Pro on your face, playing around with how high the Solo Knit Band should go across the <em>back</em> of your head is essential for getting a consistent fit.</p>

<p>Why does Vision Pro come with the Dual Loop Band, which is an altogether different design? Apple’s Getting Started guide describes its purpose with a wonderful euphemism: “Apple Vision Pro also comes with a Dual Loop Band, which is a great option if you want a different fit.” Translation: You should try it if the Solo Knit Band isn’t comfortable. Vision Pro is an extraordinarily personal device. It’s not just on your face, which is incredibly sensitive to feel and touch, but it’s heavy and requires precise alignment with your eyes. You also really want the light seal to, well, seal out light. <a href="https://www.reddit.com/r/VisionPro/comments/19ardw5/all_light_seal_sizes/">This Reddit post</a> suggests there are 28 different sizes for the light seal. 28! (The N’s and W’s, I presume, are for <em>narrow</em> and <em>wide</em>.) Depending on the shape of your face, size of your head, and volume of hair, the Solo Knit Band might not work well. The Dual Loop Band has two velcro straps — one across the back of your head, and one that goes across the top. People who use the Dual Loop Band will probably need to loosen and tighten both straps each time they take Vision Pro on and off.</p>

<p>If it all sounds a little fussy, that’s because it is. But there’s no way around it: it requires a precise fit both for comfort and optical alignment.</p>

<p>Many have noted that for a product from a company that has pushed fitness-related devices (Watch) and services (Fitness+), there is no fitness-related marketing angle for Vision Pro. It’s simply too heavy. No one wants to exert themselves with a 650g device strapped to their face. Someday Apple will make a fitness-suitable Vision headset; this Vision Pro is not it.</p>

<p>Another aspect that takes some getting used to is simply handling Vision Pro. You need to learn to hold it via the aluminum frame around the device itself, not the light seal. Try to hold it or pick it up by the light seal and the seal will pop off. The light seal attaches to Vision Pro magnetically, and the light seal cushion attaches to the seal magnetically. They’re easy to attach and detach, and snap into place automatically — but they will detach if you use the light seal (or the cushion) to pick up the combined unit. (Zeiss lens inserts for glasses-wearers also pop into place magnetically, and are trivial to insert and remove.)</p>

<p>You don’t need to put Vision Pro to sleep before taking it off, nor wake it up when putting it on. You just take it off and put it on, and the system detects whether you’re using it automatically. You can just leave the battery attached permanently.</p>

<p>I suspect the front face of Vision Pro is easily scratched. This is a device that demands to be handled with a degree of care. Apple’s instructions advise putting the cover on each time you’re done using it, like putting the cap back on a bottle of a fizzy beverage when you’re done drinking. I don’t think it’s delicate, per se, but it is most certainly not rugged.</p>

<p>My review kit included <a href="https://www.apple.com/shop/product/MW2F3LL/A/apple-vision-pro-travel-case">Apple’s $200 travel case</a>. As with the Vision Pro retail box, I found it surprisingly large. It will consume much of the internal volume inside most laptop backpacks — and in fact, the travel case all by itself is roughly the size of a small child’s backpack. I very much look forward to using Vision Pro while traveling, but it’s something you’ll need to plan your packing around.<sup id="fnr2-2024-01-30"><a href="#fn2-2024-01-30">2</a></sup></p>

<p>In a short post two weeks ago — before I had this unit to review at home, but after another hands-on demo with Apple in New York — <a href="https://daringfireball.net/linked/2024/01/19/vision-pro-battery">I wrote</a>:</p>

<blockquote>
<p>Almost every first-generation product has things like this — glaring deficiencies dictated by the limits of technology. The
original Mac had far too little RAM (128 KB) and far too little
storage (a single <a href="https://lowendmac.com/2016/floppy-disk-compatibility-and-incompatibility-in-the-mac-world/">400 KB single-sided floppy disk drive</a>).
The original iPhone only supported 2G EDGE cellular networking,
which was <a href="https://en.wikipedia.org/wiki/Enhanced_Data_rates_for_GSM_Evolution#:~:text=EDGE%20can%20carry%20a%20bandwidth,much%20traffic%20as%20standard%20GPRS.">unfathomably slow</a> and didn’t work at all while
you were on a voice call. The original Apple Watch was very slow
and struggled to last a full day on a single charge. The external
battery pack — which only supplies 2 to 2.5 hours of battery life — is that for this first-gen Vision Pro. Also, the Vision Pro
headset itself — without any built-in battery — is still too big
and too heavy. </p>

<p><a href="https://paulgraham.com/really.html">Paul Graham has a wonderful adage</a>: </p>

<blockquote>
<p>Don’t worry what people will say. If your first version is so
impressive that trolls don’t make fun of it, you waited too long
to launch. </p>
</blockquote>
</blockquote>

<p>Vision Pro isn’t even in stores yet and it’s already subject to mockery. (So was the iPhone before it shipped; so was the original Macintosh.) In a few years, after a few product generations, we will <em>all</em> look back at this first Vision device and laugh. We’ll laugh at the external battery, we will laugh at the size and weight of the device, and eventually we will laugh at its price. The knocks against it are all undeniably true: it’s too heavy and too big for everyone, and too expensive for the mass market.</p>

<p>But, like that original iPhone and the original Macintosh before it, this first Vision Pro is no joke.</p>

<h2>The VisionOS Platform</h2>

<p>Back in June, after getting a 30-minute demo of Vision Pro at WWDC, I wrote:</p>

<blockquote>
<p>Apple is <a href="https://twitter.com/tim_cook/status/1665806600261763072">promoting</a> the Vision Pro announcement as the
launch of “the era of spatial computing”. That term feels perfect.
It’s not AR, VR, or XR. It’s spatial computing, and some <em>aspects</em>
of spatial computing are AR or VR. </p>

<p>To me the Macintosh has always felt more like a <em>place</em> than a
<em>thing</em>. Not a place I go physically, but a place my mind goes
intellectually. When I’m working or playing and in the flow, it
has always felt like MacOS is where I <em>am</em>. I’m in the Mac.
Interruptions — say, the doorbell or my phone ringing — are
momentarily disorienting when I’m in the flow on the Mac, because
I’m pulled out of that world and into the physical one. There’s a
similar effect with iOS too, but I’ve always found it less
profound. Partly that’s the nature of iOS, which doesn’t speak to
me, idiomatically, like MacOS does. I think in many ways that
explains why I never feel <em>in the flow</em> on an iPad like I can on a
Mac, even with the same size display. But with the iPhone in
particular screen size is an important factor. I don’t think <em>any</em>
hypothetical phone OS could be as immersive as I find MacOS,
simply because even the largest phone display is so small.
Watching a movie on a phone is a lesser experience than watching
on a big TV set, and watching a movie on even a huge TV is a
lesser experience than watching a movie in a nice theater. We
humans are visual creatures and our field of view affects our
sense of importance. Size matters. </p>

<p>The <em>worlds</em>, as it were, of MacOS and iOS (or Windows, or
Android, or whatever) are defined and limited by the displays on
which they run. If MacOS is a place I go mentally when working,
that place is manifested physically by the Mac’s display. It’s
like the playing field, or the court, in sports — it has very
clear, hard and fast, rectangular bounds. It is of fixed size and
shape, and everything I do in that world takes place in the
confines of those display boundaries. </p>

<p>VisionOS is very much going to be a conceptual place like that for
work. But there is no display. There are no boundaries. The
intellectual “place” where the apps of VisionOS are presented is
the real-world place in which you use the device, or the expansive
virtual environment you choose. The room in which you’re sitting
is the canvas. The whole room. The display on a Mac or iOS device
is to me like a portal, a rectangular window into a well-defined
virtual world. With VisionOS the virtual world is the actual world
around you. </p>

<p>In the same way that the introduction of multitouch with the
iPhone removed a layer of conceptual abstraction — instead of
touching a mouse or trackpad to move an on-screen pointer to an
object on screen, you simply touch the object on screen — VisionOS removes a layer of abstraction spatially. Using a Mac,
you are in a physical place, there is a display in front of you in
that place, and on that display are application windows. Using
VisionOS, there are just application windows in the physical place
in which you are. On Monday I had Safari and Messages and Photos
open, side by side, each in a window that seemed the size of a
movie poster — that is to say, each app in a window that appeared
larger than any actual computer display I’ve ever used. All side
by side. <a href="https://www.apple.com/newsroom/2023/06/introducing-apple-vision-pro/">Some of the videos in Apple’s Newsroom post</a>
introducing Vision Pro illustrate this. But seeing a picture of an
actor in this environment doesn’t do justice to experiencing it
firsthand, because a photo showing this environment itself has
defined rectangular borders. </p>

<p>This is not confusing or complex, but it feels profound. </p>
</blockquote>

<p>That might be the longest blockquote I’ve ever included in an article. But after nearly a week using Vision Pro, I can’t put it any better now than I did then. My inkling after that first 30-minute experience was exactly right.</p>

<p>This first-generation Vision Pro <em>hardware</em> is severely restricted by the current limits of technology. Apple has pushed those limits in numerous ways, but the limits are glaring. This Vision Pro is a stake in the ground, defining the new state of the art in immersive headset technology. But that stake in the ground will recede in the rear view mirror as the years march on. Just like the Mac’s 9-inch monochrome 512 × 342 pixel display. Just like the iPhone’s EDGE cellular modem.</p>

<p>But the conceptual design of VisionOS lays the foundation for an entirely new direction of interaction design. Just like how the basic concepts of the original Mac interface were exactly right, and remain true to this day. Just like how the original iPhone defined the way every phone in the world now works.</p>

<p>There is no practical way to surround yourself with multiple external displays with a Mac or PC to give yourself a workspace canvas the size of the workspace in VisionOS. The VisionOS workspace isn’t <em>infinite</em>, but it feels as close to infinitely large as it could be. It’s the world around you.</p>

<p>And it is very <em>spatial</em>. Windows remain anchored in place in the world around you. Set up a few windows, then stand up and walk away, and when you come back, those windows are exactly where you left them. It is uncanny to walk right through them. (From behind, they look white.) You can do seemingly crazy things like put a VisionOS application window outside a real-world window.</p>

<p>Windows also retain their positions relative to each other. A single press of the digital crown button brings up the VisionOS home view. A long-press of the digital crown button recenters your view according to your current gaze. So, for example, if everything seems a little too low in your view, look up, then press-and-hold the digital crown. All open windows will re-center in your current field of view. But this also works when you stand up and move.</p>

<p>You can start working in, say, your kitchen. Open up Messages, Safari, and Notes. Arrange the three windows around you from left to right. Stand up and walk to your living room. Those windows remain in your kitchen. Sit down on your sofa, and press-and-hold the digital crown. Now those windows move to your living room, re-centered in your current gaze — but exactly in the same positions and sizes relative to each other. It’s like having a multiple display setup that you can easily move to wherever you want to be.</p>

<p>Decades of Mac use has trained me to think that window controls are at the top of a window. In VisionOS they’re at the bottom. This took me a day or two to get accustomed to — when I think “I want to close this window”, my eyes naturally go to the top left corner. VisionOS windows really only have three controls: a close button and a “window bar” underneath, and a resizing indicator at each corner. Once you start using it, it’s easy to see why Apple put the window bar at the bottom instead of the top: if it were at the top, your hand and arm would obscure the window contents as you drag a window around to move it. With the bar at the bottom, window contents aren’t obscured at all while moving them.</p>

<p>Part of what makes the VisionOS workspace seem so expansive is that it’s utterly natural to make use of the Z-axis (depth). While dragging a window, it’s as easy to pull it closer, or push it farther away, as it is to move it left or right. It’s also utterly natural to rotate them, to arrange windows in a semicircle around you. It’s thrillingly science-fiction-like. The Mac, from its inception 40 years ago, has always had the concept of a Z-axis for stacking windows atop each other, but with VisionOS it’s quite different. In VisionOS, windows stay at the depth where you place them, whereas on the Mac windows pop to the front of the stack when you activate them.</p>

<p>Long-tap on a window’s close button and you get the option to close all other open windows, à la the Mac’s “Hide Others” command in the application menu. (Long-tapping is useful throughout VisionOS.)</p>

<p>The pass-through view of the real world around you means you can stand up and walk around while wearing Vision Pro. It doesn’t feel at all unsafe or disorienting. In fact it’s uncannily natural. But for <em>using</em> Vision Pro, it’s clearly intended that you be stationary, sitting or standing in a fixed position. Other than using Vision Pro as a camera, I can’t think of a reason to <em>use</em> it while walking about. Application windows are not fixed in position relative to <em>you</em> — they’re fixed in position relative to the world around you.</p>

<p><a href="https://www.apple.com/apple-vision-pro/guided-tour/">Apple’s Guided Tour video</a> does a better job than any written description could to convey the basic gist of using the platform. The main thing is this: you look at things, and tap your index finger and thumb to “click” or grab the thing you’re looking at. It sounds so simple and obvious, but it’s a breakthrough in interaction design. The Mac gave us “point and click”. The iPhone gave us “tap and slide”. Vision Pro gives us “look and tap”. The one aspect of this interaction model that isn’t instantly intuitive is that — with a few notable exceptions — you don’t reach out and poke the things you see directly. You’ll want to at first, even after reading this. But it doesn’t work for most things, and would quickly grow tiresome even if it did. Instead you really do just keep your hands on your lap or on your table top — wherever they’re most comfortable in a resting position.</p>

<p>Among the notable exceptions are VisionOS’s virtual keyboards. (Keyboards, plural, to include both the QWERTY typing keyboard and the numeric keypad for entering your device passcode, if Optic ID for some reason fails.) VisionOS virtual keyboards work <em>both</em> ways — you can gaze at each key you want to press and tap your finger and thumb to activate them in turn, or you can reach out and poke at the virtual keys. Either method is fine for entering a word or two (or a 6-digit passcode); neither method is good for actually writing. Siri dictation works great for longer-form text entry, but if you want to do any writing at all without dictation, you’ll want to pair a Bluetooth keyboard. The virtual keyboard is better than trying to type on an Apple Watch, but not by much.</p>

<p>One uncanny aspect to using a Bluetooth keyboard is that VisionOS presents a virtual HUD over the keyboard. This HUD contains autocomplete suggestions and also shows you, in a single line, the text you’re typing. This is an affordance for people who can’t type without looking at their keyboard. On a Mac or iPad with a physical keyboard, you really only have to move your eyes to go from looking at your display to your keyboard. On VisionOS, you need to move your head, because windows tend to be further away from you, and higher in space. This little HUD lets you see what you’re typing while keeping your eyes on the physical keyboard. It’s weird but in a good way. One week in and it still brings a smile to my face to have a physical keyboard with tappable autocomplete suggestions.</p>

<p>VisionOS also works great with a Magic Trackpad or Bluetooth mouse. The mouse cursor is a circle, just like the one in iPadOS. You don’t see the cursor in the space between windows; it just appears inside whichever window you’re looking at.</p>

<p>The Mac Virtual Display feature is both useful and almost startlingly intuitive. If you have a MacBook, you just open the MacBook lid and VisionOS will present a popover, just atop the MacBook’s open display, with two buttons: “Connect” and a close button. Tap the Connect button and you get a virtual Studio-display-size-ish Mac display. You can move and resize this window like any other in VisionOS. And while you can’t increase the number of pixels it represents, you can upscale it to be very large. It’s very usable, and while connected, your MacBook’s keyboard and trackpad work in VisionOS apps too. You can also initiate Mac Virtual Display mode manually, using Control Center in VisionOS (which you access by tapping a small chevron at the top of your view, looking up toward the ceiling or sky).</p>

<p>I’ve used this mode quite a bit over the last week, and the only hiccup is that I continually find myself wanting to use VisionOS’s “stare and tap” interaction with elements inside the virtual Mac display. That doesn’t work. So while my hands are on the physical keyboard and trackpad, everything works across both the Mac (inside the virtual Mac display) and VisionOS apps, but when my hands are off the physical trackpad, and I’m finger-to-thumb tapping away in VisionOS application windows, every single time I turn my attention back to a Mac app in the virtual Mac display, I find myself futilely finger-to-thumb tapping to activate or click whatever I’m looking at. This is akin to switching from an iPad to a MacBook and trying to touch the screen, but worse, because with the virtual Mac display in VisionOS, you’re continuously context switching between the Mac environment and VisionOS apps. The trackpad works perfectly in both; look-and-tap only works in VisionOS (or merely to move, resize, or activate the virtual Mac display — not to interact with the Mac UI therein).</p>

<p>VisionOS is already on version 1.0.1 (a software update from version 1.0.0 was already available when I set it up), and has a bunch of 1.0 bugs. I had to force quit apps — including Settings — a few times. (Press and hold both the top button and digital crown button for a few seconds, and you get a MacOS-style Force Quit dialog; keep holding both buttons down and you can force a system restart.)</p>

<p>There are not a lot of native VisionOS apps yet, but iPad apps really do work well. The main difference between native VisionOS apps and iPad apps isn’t that VisionOS apps work better, so much as that they simply look a lot cooler. VisionOS is, to my eyes, the best-looking OS Apple has made since the original skeuomorphic iPhone interface in iOS 1–6. Actual depth and shading — what an idea.</p>

<p>The apps on VisionOS’s home view are not manually organizable — a curious omission even in a 1.0 release. (Especially so given that Apple is bragging about having zillions of compatible iPad apps available.) All iPad apps — including a bunch of built-in apps from Apple itself, such as News, Books, Calendar, and Maps — are put in a “Compatible Apps” folder, and have squircle-shaped icons. Native VisionOS apps are at the root level of the apps view, and have circular icons.</p>

<p>The fundamental interaction model in VisionOS feels like it will be copied by all future VR/AR headsets, in the same way that all desktop computers work like the Mac, and all phones and tablets now work like the iPhone. And when that happens, some will argue that of course they all work that way, because how else could they work? But personal computers didn’t have point-and-click GUIs before the Mac, and phones didn’t have “it’s all just a big touchscreen” interfaces before the iPhone. No other headset today has a “just look at a target, and tap your finger and thumb” interface. I suspect in a few years they all will.</p>

<h2>The Hardware, Again, A/V Edition</h2>

<p>This brings me back to the hardware of Vision Pro. The displays are excellent, but I’m already starting to see how they aren’t good enough. The eye tracking is very good, but it’s not as precise as I’d like it to be. The cameras are good, but they don’t approach the dynamic range of your actual eyesight. There sometimes is color fringing at the periphery of your vision, depending on the lighting. A light source to your side, like a window in daytime, will show the fringing. When you move your head, the illusion of true pass-through is broken — you can tell that you’re looking at displays showing the world via footage from cameras. Just walking around is enough motion to break the illusion of natural pass-through of the real world. In fact, in some ways, the immersive 3D environments — mountaintops, lakesides, the surface of the moon (!) — are more visually realistic than the actual real world, because there’s less latency and shearing as you pan your gaze.</p>

<p>I’ve used the original PlayStation 5 VR headset, HTC Vive Pro, and own a Meta Quest 3. Vision Pro’s display quality makes those headsets seem like they’re from a different era. Vision Pro is in a different ballpark, playing a different game. In terms of resolution, Vision Pro is astonishing. I do not see pixels, ever. I see text as crisply as I do in real life. It’s very comfortable to read. (Although very weird, still, one week in, to have, say, a Safari window that appears 6 or 7 feet tall.) But I can already imagine a <em>better</em> Vision headset display. I can already imagine lower latency between the camera footage and the displays in front of my eyes. I can already imagine greater dynamic range, like when looking out a window during daytime from inside a dim room.</p>

<p>Vision Pro’s displays are amazing, yet also obviously not good enough.</p>

<p>The speakers, on the other hand, are simply amazing. I’ve never experienced anything quite like them. I expected that they’d sound fine, but not as good as AirPods Pro or Max. Instead, I find they sound far better than any headphones I’ve ever worn. That’s because they’re actually speakers, optimally positioned in front of your ears. There’s always a catch, and the catch with Vision Pro’s speakers is that they’re not private at all. Someone sitting next to you can hear what you hear; someone near you can hear most of what you hear. When using Vision Pro in public or near others, you’ll want to wear AirPods both for privacy and courtesy — not audio quality.</p>

<p>The speakers also convey an uncanny sense of spatial reality.</p>

<p>Last, I’ve consistently gotten 3 full hours of battery life using Vision Pro on a full charge. Sometimes a little more. In my experience, Apple’s stated 2–2.5 hour battery life is a floor, not a ceiling. I also have suffered neither physical discomfort nor nausea in long sessions.<sup id="fnr3-2024-01-30"><a href="#fn3-2024-01-30">3</a></sup></p>

<h2>I’m Sorely Tempted, but Shall Resist, Making a ‘Persona Non Grata’ Pun in This Section Heading</h2>

<p><em>Personas</em> are a highlight feature of Vision Pro. Your persona is a digital avatar of your head and shoulders that appears in your stead when making FaceTime calls, or using other video call software that adopts the APIs to support them. At launch that already includes Zoom, Webex, and Microsoft Teams.</p>

<p>Apple is prominently labeling the entire persona feature “beta”, and it doesn’t take more than a moment of seeing one to know why. Personas are weird. They are very deep in the <a href="https://en.wikipedia.org/wiki/Uncanny_valley">uncanny valley</a>. There is no mistaking a persona for the actual person. At times they seem far more like a character from a video game than a photorealistic visage. And at all times they seem <em>somewhat</em> like a video game character. My hair, for example, looks like a shiny plastic Lego hairpiece.</p>

<p>Even capturing your persona is awkward. You have to take Vision Pro off your head and turn it around so the cameras (and lidar sensor?) can see you, but you don’t get a good image of what the cameras are capturing, because the front-facing EyeSight display is relatively low resolution. Your hair might be messed up from the headband, and you’ll need to check yourself in an actual mirror to make sure your shirt is smooth.</p>

<p>I FaceTimed my wife after capturing mine, and her reaction — not really knowing at all what to expect — was “No, no, no — oh my god what is this?” And then she just started laughing. We concluded the test with her telling me, “Don’t ever call me like that again.” She was joking (I think), but personas are so deep in the uncanny valley that the first time anyone sees one, they’re going to want to talk about it.</p>

<p>Apple is in a tough spot with this feature. The feature clearly deserves the prominent “beta” label in its current state. But you can also see why Apple needed to include the feature at launch, no matter how far from “good” it is. You can’t ship a productivity computer today that can’t be used for video conferencing. It’s like email or web browsing: essential. But that leaves only two options: a cartoon-like Memoji avatar, or an attempt at photorealism. Apple almost certainly could have knocked a Memoji avatar out of the park for this purpose, but I think rightly decided that that would be utterly inappropriate in most professional work contexts. You could FaceTime your family and friends as a Memoji and they’d accept it without judgment, but not professional colleagues or clients.</p>

<p>In defense of personas as they exist right now, I’ve found that I do get used to them a few minutes into a call. (I’ve had a 30-minute FaceTime call with Vision-Pro-wearing reps from Apple, as well as calls with fellow reviewers Joanna Stern and Nilay Patel.) But I think they work best persona-to-persona — that is to say, between two (or more) people who are all using a Vision Pro. That’s obviously not going to be the case for most calls. This will get normalized, as more people buy and use Vision headsets, and as Apple races to improve the feature to non-beta quality. But for now, if you use it, expect to talk about it.</p>

<p>Your persona is also used for the presentation of your eyes in the EyeSight feature. EyeSight, in Apple’s product marketing, is a headline feature of Vision Pro. It’s prominently featured on their website and their advertisements. But in practice it’s very subtle. Here’s the best selfie I’ve been able to capture of myself showing EyeSight:</p>

<p><a href="https://daringfireball.net/misc/2024/01/eyesight-in-action.jpeg" class="noborder">
<img src="https://daringfireball.net/misc/2024/01/eyesight-in-action.jpeg" alt="The author, wearing Vision Pro, with EyeSight activated."></a></p>

<p>For the record, my eyes were open when I snapped that photo.</p>

<p>EyeSight is not displayed most of the time that you’re using Vision Pro — it only turns on when Vision Pro detects a person in front of you (including when you look at yourself in a mirror). Most of the time you’re using Vision Pro, the front display shows nothing at all.</p>

<p>EyeSight is not an “<em>Oh my god, I can see your eyes!</em>” feature, but instead more of an “<em>Oh, yes, now that you ask, I guess I can sort of see your eyes</em>” feature. Apple seemingly went to great lengths (and significant expense) to create the EyeSight feature, but so far I’ve found it to be of highly dubious utility, if only because it’s so subtle. It’s like wearing tinted goggles meant to <em>obscure</em> you, not clear goggles meant to show your eyes clearly.</p>

<h2>Personal Entertainment</h2>

<p>I’ve saved the best for last. Vision Pro is simply a phenomenal way to watch movies, and 3D immersive experiences are astonishing. There are 3D immersive experiences in Vision Pro that are more compelling than Disney World attractions that people wait in line for hours to see.</p>

<p>First up are movies using apps that haven’t been updated for Vision Pro natively. I’ve used the iPad apps for services like Paramount+ and Peacock. Watching video in apps like these is a great experience, but not jaw-dropping. You just get a window with the video content that you can make as big as you want, but “big”, for these merely “compatible” apps, is about the size of the biggest wall in your room. This is true too for video in Safari when you go “full screen”. It breaks out of the browser window into a standalone video window. (Netflix.com is OK in VisionOS Safari, but YouTube.com stinks — it’s a minefield of UI targets that are too small for eye-tracking’s precision.)</p>

<p>Where things go to the next level are the Disney+ and Apple TV apps, which have been designed specifically for Vision Pro. Both apps offer immersive 360° viewing environments. <a href="https://www.apple.com/newsroom/2024/01/apple-previews-new-entertainment-experiences-launching-with-apple-vision-pro/">Disney+ has four</a>: “the Disney+ Theater, inspired by the historic El Capitan Theatre in Hollywood; the Scare Floor from Pixar’s <em>Monsters Inc.</em>; Marvel’s Avengers Tower overlooking downtown Manhattan; and the cockpit of Luke Skywalker’s landspeeder, facing a binary sunset on the planet Tatooine from the Star Wars galaxy.” With the TV app, Apple offers a distraction-free virtual theater.</p>

<p>What’s amazing about watching movies in these two apps is that the virtual movie screens look immense, as though you’re really in a movie theater, all by yourself, looking at a 100-foot screen. Apple’s presentation in the TV app is particularly good, giving you options to simulate perspectives from the front, middle, or back of the theater, as well as from either the floor or balcony levels. (Like Siskel and Ebert, I think I prefer the balcony.) The “<em>Holy shit, this screen looks absolutely immense</em>” effect is particularly good in Apple’s TV app. Somehow these postage-stamp-size displays inside Vision Pro are capable of convincing your brain that you’re sitting in the best seat in the house in front of a huge movie theater screen. (As immersive as the Disney+ viewing experience is, after using the TV app, I find myself wishing I could get closer to the big screen in Disney+, or make the screen even bigger.) I have never been a fan of 3D movies in actual movie theaters, but in Vision Pro the depth feels natural, not distracting, and you don’t suffer any loss of brightness from wearing what are effectively sunglasses.</p>

<p>And then there are the 3D immersive experiences. Apple has commissioned <a href="https://www.apple.com/newsroom/2024/01/apple-previews-new-entertainment-experiences-launching-with-apple-vision-pro/">a handful of original titles</a> that are available, free of charge as Vision Pro exclusives, at launch. I’m sure there are more on the way. There aren’t enough titles yet to recommend them as a reason to buy a Vision Pro, but what this handful of titles promises for the future is incredible. (I look forward, too, to watching sports in 3D immersion — but at the moment that’s entirely in the future. But hopefully the near future.)</p>

<p>But I <em>can</em> recommend buying Vision Pro solely for use as a personal theater. I paid $5,000 for my 77-inch LG OLED TV a few years ago. Vision Pro offers a far more compelling experience (including far more compelling spatial surround sound). You’d look at my TV set and almost certainly agree that it’s a nice big TV. But watching movies in the Disney+ and TV apps will make you go “Wow!” These are experiences I never imagined I’d be able to have in my own home (or, say, while flying across the country in an airplane).</p>

<p>The only hitch is that Vision Pro is utterly personal. Putting a headset on is by nature isolating — like headphones but more so, because eye contact is so essential for all primates. If you don’t often watch movies or shows or sports by yourself, it doesn’t make sense to buy a device that only you can see. Just this weekend, I watched most of the first half of the Chiefs-Ravens game in the Paramount+ app on Vision Pro, in a window scaled to the size of my entire living room wall. It was captivating. The image quality was a bit grainy scaled to that size (I believe the telecast was only in 1080p resolution), but it was better than watching on my TV simply because it was so damn big. But I wanted to watch the rest of the game (and the subsequent Lions-49ers game) with my wife, together on the sofa. I was happier overall sharing the experience with her, but damn if my 77-inch TV didn’t suddenly seem way too small.</p>

<p>Spatial computing in VisionOS is the real deal. It’s a legit productivity computing platform right now, and it’s only going to get better. It sounds like hype, but I truly believe this is a landmark breakthrough like the 1984 Macintosh and the 2007 iPhone.</p>

<p>But if you were to try just one thing using Vision Pro — just one thing — it has to be watching a movie in the TV app, in theater mode. Try that, and no matter how skeptical you were beforehand about the Vision Pro’s price tag, your hand will start inching toward your wallet.</p>
</article>


<hr>

<footer>
<p>
<a href="/david/" title="Aller à l’accueil"><svg class="icon icon-home">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-home"></use>
</svg> Accueil</a> •
<a href="/david/log/" title="Accès au flux RSS"><svg class="icon icon-rss2">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-rss2"></use>
</svg> Suivre</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant><svg class="icon icon-user-tie">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-user-tie"></use>
</svg> Pro</a> •
<a href="mailto:david%40larlet.fr" title="Envoyer un courriel"><svg class="icon icon-mail">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-mail"></use>
</svg> Email</a> •
<abbr class="nowrap" title="Hébergeur : Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340"><svg class="icon icon-hammer2">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-hammer2"></use>
</svg> Légal</abbr>
</p>
<template id="theme-selector">
<form>
<fieldset>
<legend><svg class="icon icon-brightness-contrast">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-brightness-contrast"></use>
</svg> Thème</legend>
<label>
<input type="radio" value="auto" name="chosen-color-scheme" checked> Auto
</label>
<label>
<input type="radio" value="dark" name="chosen-color-scheme"> Foncé
</label>
<label>
<input type="radio" value="light" name="chosen-color-scheme"> Clair
</label>
</fieldset>
</form>
</template>
</footer>
<script src="/static/david/js/instantpage-5.1.0.min.js" type="module"></script>
<script>
function loadThemeForm(templateName) {
const themeSelectorTemplate = document.querySelector(templateName)
const form = themeSelectorTemplate.content.firstElementChild
themeSelectorTemplate.replaceWith(form)

form.addEventListener('change', (e) => {
const chosenColorScheme = e.target.value
localStorage.setItem('theme', chosenColorScheme)
toggleTheme(chosenColorScheme)
})

const selectedTheme = localStorage.getItem('theme')
if (selectedTheme && selectedTheme !== 'undefined') {
form.querySelector(`[value="${selectedTheme}"]`).checked = true
}
}

const prefersColorSchemeDark = '(prefers-color-scheme: dark)'
window.addEventListener('load', () => {
let hasDarkRules = false
for (const styleSheet of Array.from(document.styleSheets)) {
let mediaRules = []
for (const cssRule of styleSheet.cssRules) {
if (cssRule.type !== CSSRule.MEDIA_RULE) {
continue
}
// WARNING: Safari does not have/support `conditionText`.
if (cssRule.conditionText) {
if (cssRule.conditionText !== prefersColorSchemeDark) {
continue
}
} else {
if (cssRule.cssText.startsWith(prefersColorSchemeDark)) {
continue
}
}
mediaRules = mediaRules.concat(Array.from(cssRule.cssRules))
}

// WARNING: do not try to insert a Rule into a styleSheet you are
// currently iterating on, otherwise the browser will be stuck
// in an infinite loop…
for (const mediaRule of mediaRules) {
styleSheet.insertRule(mediaRule.cssText)
hasDarkRules = true
}
}
if (hasDarkRules) {
loadThemeForm('#theme-selector')
}
})
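// A minimal, hedged sketch, not part of the original page: when the visitor has
// left the theme on 'auto' (or has never chosen one), re-apply the automatic theme
// if the OS color scheme changes while the page is open. It only reuses what the
// script above already relies on: the 'theme' key in localStorage, the global
// toggleTheme() function, and the prefersColorSchemeDark media query string.
// Note: very old Safari versions expose addListener() rather than addEventListener().
window.matchMedia(prefersColorSchemeDark).addEventListener('change', () => {
  const storedTheme = localStorage.getItem('theme')
  if (!storedTheme || storedTheme === 'undefined' || storedTheme === 'auto') {
    toggleTheme('auto')
  }
})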
</script>
</body>
</html>

+ 244
- 0
cache/2024/cd9184008ba5d9e4c9be4d0a0eea4f60/index.md View File

@@ -0,0 +1,244 @@
title: Daring Fireball: The Vision Pro
url: https://daringfireball.net/2024/01/the_vision_pro
hash_url: cd9184008ba5d9e4c9be4d0a0eea4f60
archive_date: 2024-01-31

<p>For the last six days, I’ve been simultaneously testing three entirely new products from Apple. The first is a VR/AR headset with eye-tracking controls. The second is a revolutionary spatial computing productivity platform. The third is a breakthrough personal entertainment device.</p>

<p>A headset, a spatial productivity platform, and a personal entertainment device.</p>

<p>I’m sure you’re already getting it. These are not three separate devices. They’re one: Apple Vision Pro. But if you’ll pardon the shameless homage to Steve Jobs’s famous iPhone introduction, I think these three perspectives are the best way to consider it.</p>

<h2>The Hardware</h2>

<p>Vision Pro comes in a surprisingly big box. I was expecting a package roughly the dimensions of a HomePod box; instead, a Vision Pro retail box is <a href="https://daringfireball.net/misc/2024/01/homepods-v-vision-pro.jpeg">quite a bit larger than <em>two</em> HomePod boxes</a> stacked atop each other. (I own more HomePods than most people.)</p>

<p>There’s a lot inside. The top half of the package contains the Vision Pro headset itself, with the light seal, a light seal cushion, and the default Solo Knit Band already attached. The lower half contains the battery, the charger (30W), the cables, the Dual Loop Band, the Getting Started book (which is beautifully printed in full color, on excellent paper — it feels like a keepsake), the polishing cloth<sup id="fnr1-2024-01-30"><a href="#fn1-2024-01-30">1</a></sup>, and an extra light seal cushion.</p>

<p>To turn Vision Pro on, you connect the external battery pack’s power cable to the Vision Pro’s power connector, and rotate it a quarter turn to lock it into place. There are small dots on the headset’s dime-sized power socket showing how to align the cable connector’s small LED. The LED pulses when Vision Pro turns on. (I miss Apple’s glowing power indicator LEDs — this is a really delightful touch.) When Vision Pro has finished booting and is ready to use, it makes a pleasant welcoming sound.</p>

<p>Then you put Vision Pro on. If you’re using the Solo Knit Band, you tighten and loosen it using a dial on the band behind your right ear. VisionOS directs you to raise or lower the headset appropriately to position it at just the right height on your face relative to your eyes. If Vision Pro thinks your eyes are too close to the displays, it will suggest you switch to the “+” size light seal cushion. You get two light seal cushions, but they’re not the same: mine are labeled “W” and “W+”. The “+” is the same width, to match your light seal, but adds a wee bit more space between your eyes and the displays inside Vision Pro. For me the default (non-“+”) one fits fine.</p>

<p>The software then guides you through a series of screens to calibrate the eye tracking. It’s all very obvious, and kind of fun. It’s almost like a simple game: you stare at a series of dots in a circle, and pinch your index finger and thumb as you stare at each one. You go through this three times, in three different artificial lighting conditions: dark, medium, and bright. Near the end of the first-run experience, you’re prompted to bring your iPhone or iPad nearby, just like when setting up a new iPhone or iPad. This allows your Vision Pro to get your Apple ID credentials and Wi-Fi password without entering any of that manually. It’s a very smooth onboarding process. And then that’s it, you’re in and using Vision Pro. </p>

<p>There’s no getting around some fundamental problems with the Vision Pro hardware.</p>

<p>First is the fact that it uses an external battery pack connected via a power cable. The battery itself is about the width and height of an iPhone 15/15 Pro, but thicker. And the battery is heavy: about 325g, compared to 187g for an iPhone 15 Pro, and 221g for a 15 Pro Max. It’s closer in thickness and weight to <em>two</em> iPhone 15s than it is to one. And the tethered power cable can be an annoyance. Vision Pro has no built-in reserve battery — disconnect the power cable from the headset and it immediately shuts off. It clicks firmly into place, so there’s no risk of accidentally disconnecting it. But if you buy an <a href="https://www.apple.com/shop/product/MW283LL/A/apple-vision-pro-battery">extra Vision Pro Battery for $200</a>, you can’t hot-swap them — you need to shut down first.</p>

<p>Second is the fact that Vision Pro is <a href="https://daringfireball.net/linked/2024/01/20/vision-pro-weight">heavy</a>. I’ve used it for hours at a time without any discomfort, but fatigue does set in, from the weight alone. You never forget that you’re wearing it. Related to Vision Pro’s weight is the fact that it’s quite large. It’s a big-ass pair of heavy goggles on your face. There’s nothing subtle about it — either from your first-person perspective wearing it, or from the third-person perspective of someone else looking at you while you wear it.</p>

<p>One of Apple’s suggestions for adjusting the position of Vision Pro on your face is to balance the weight/pressure on your face equally between your forehead and your cheeks. I found this to be good advice. My instincts, originally, were to place it slightly too low on my face, which causes the system to advise positioning it higher. It needs to be positioned just right on your face both so that you see well through its displays, and so it can see your eyes for Optic ID, Vision Pro’s equivalent of Face ID for unlocking the device and confirming actions that require authentication (like accessing passwords from your keychain, or purchasing apps). Within a day or two, it became natural for me to put it on without needing to fuss with the fit.</p>

<p>The default stretchy Solo Knit Band not only works well for me, but I prefer it, comfort- and convenience-wise, to the Dual Loop Band. With the Solo Knit Band, you put it on and tighten it by twisting the aforementioned dial. When you take it off, you loosen it first. You want it tight enough on your face that you have to tighten and loosen it each time you put it on and take it off. It’s a bit like getting accustomed to a new watch strap — at first it feels finicky, but you quickly develop muscle memory. In addition to learning how high to place Vision Pro on your face, playing around with how high the Solo Knit Band should go across the <em>back</em> of your head is essential for getting a consistent fit.</p>

<p>Why does Vision Pro come with the Dual Loop Band, which is an altogether different design? Apple’s Getting Started guide describes its purpose with a wonderful euphemism: “Apple Vision Pro also comes with a Dual Loop Band, which is a great option if you want a different fit.” Translation: You should try it if the Solo Knit Band isn’t comfortable. Vision Pro is an extraordinarily personal device. It’s not just on your face, which is incredibly sensitive to feel and touch, but it’s heavy and requires precise alignment with your eyes. You also really want the light seal to, well, seal out light. <a href="https://www.reddit.com/r/VisionPro/comments/19ardw5/all_light_seal_sizes/">This Reddit post</a> suggests there are 28 different sizes for the light seal. 28! (The N’s and W’s, I presume, are for <em>narrow</em> and <em>wide</em>.) Depending on the shape of your face, size of your head, and volume of hair, the Solo Knit Band might not work well. The Dual Loop Band has two velcro straps — one across the back of your head, and one that goes across the top. People who use the Dual Loop Band will probably need to loosen and tighten both straps each time they take Vision Pro on and off.</p>

<p>If it all sounds a little fussy, that’s because it is. But there’s no way around it: it requires a precise fit both for comfort and optical alignment.</p>

<p>Many have noted that for a product from a company that has pushed fitness-related devices (Watch) and services (Fitness+), there is no fitness-related marketing angle for Vision Pro. It’s simply too heavy. No one wants to exert themselves with a 650g device strapped to their face. Someday Apple will make a fitness-suitable Vision headset; this Vision Pro is not it.</p>

<p>Another aspect that takes some getting used to is simply handling Vision Pro. You need to learn to hold it via the aluminum frame around the device itself, not the light seal. Try to hold it or pick it up by the light seal and the seal will pop off. The light seal attaches to Vision Pro magnetically, and the light seal cushion attaches to the seal magnetically. They’re easy to attach and detach, and snap into place automatically — but they will detach if you use the light seal (or the cushion) to pick up the combined unit. (Zeiss lens inserts for glasses-wearers also pop into place magnetically, and are trivial to insert and remove.)</p>

<p>You don’t need to put Vision Pro to sleep before taking it off, nor wake it up when putting it on. You just take it off and put it on, and the system detects whether you’re using it automatically. You can just leave the battery attached permanently.</p>

<p>I suspect the front face of Vision Pro is easily scratched. This is a device that demands to be handled with a degree of care. Apple’s instructions advise putting the cover on each time you’re done using it, like putting the cap back on a bottle of a fizzy beverage when you’re done drinking. I don’t think it’s delicate, per se, but it is most certainly not rugged.</p>

<p>My review kit included <a href="https://www.apple.com/shop/product/MW2F3LL/A/apple-vision-pro-travel-case">Apple’s $200 travel case</a>. As with the Vision Pro retail box, I found it surprisingly large. It will consume much of the internal volume inside most laptop backpacks — and in fact, the travel case all by itself is roughly the size of a small child’s backpack. I very much look forward to using Vision Pro while traveling, but it’s something you’ll need to plan your packing around.<sup id="fnr2-2024-01-30"><a href="#fn2-2024-01-30">2</a></sup></p>

<p>In a short post two weeks ago — before I had this unit to review at home, but after another hands-on demo with Apple in New York — <a href="https://daringfireball.net/linked/2024/01/19/vision-pro-battery">I wrote</a>:</p>

<blockquote>
<p>Almost every first-generation product has things like this — glaring deficiencies dictated by the limits of technology. The
original Mac had far too little RAM (128 KB) and far too little
storage (a single <a href="https://lowendmac.com/2016/floppy-disk-compatibility-and-incompatibility-in-the-mac-world/">400 KB single-sided floppy disk drive</a>).
The original iPhone only supported 2G EDGE cellular networking,
which was <a href="https://en.wikipedia.org/wiki/Enhanced_Data_rates_for_GSM_Evolution#:~:text=EDGE%20can%20carry%20a%20bandwidth,much%20traffic%20as%20standard%20GPRS.">unfathomably slow</a> and didn’t work at all while
you were on a voice call. The original Apple Watch was very slow
and struggled to last a full day on a single charge. The external
battery pack — which only supplies 2 to 2.5 hours of battery life — is that for this first-gen Vision Pro. Also, the Vision Pro
headset itself — without any built-in battery — is still too big
and too heavy. </p>

<p><a href="https://paulgraham.com/really.html">Paul Graham has a wonderful adage</a>: </p>

<blockquote>
<p>Don’t worry what people will say. If your first version is so
impressive that trolls don’t make fun of it, you waited too long
to launch. </p>
</blockquote>
</blockquote>

<p>Vision Pro isn’t even in stores yet and it’s already subject to mockery. (So was the iPhone before it shipped; so was the original Macintosh.) In a few years, after a few product generations, we will <em>all</em> look back at this first Vision device and laugh. We’ll laugh at the external battery, we will laugh at the size and weight of the device, and eventually we will laugh at its price. The knocks against it are all undeniably true: it’s too heavy and too big for everyone, and too expensive for the mass market.</p>

<p>But, like that original iPhone and the original Macintosh before it, this first Vision Pro is no joke.</p>

<h2>The VisionOS Platform</h2>

<p>Back in June, after getting a 30-minute demo of Vision Pro at WWDC, I wrote:</p>

<blockquote>
<p>Apple is <a href="https://twitter.com/tim_cook/status/1665806600261763072">promoting</a> the Vision Pro announcement as the
launch of “the era of spatial computing”. That term feels perfect.
It’s not AR, VR, or XR. It’s spatial computing, and some <em>aspects</em>
of spatial computing are AR or VR. </p>

<p>To me the Macintosh has always felt more like a <em>place</em> than a
<em>thing</em>. Not a place I go physically, but a place my mind goes
intellectually. When I’m working or playing and in the flow, it
has always felt like MacOS is where I <em>am</em>. I’m in the Mac.
Interruptions — say, the doorbell or my phone ringing — are
momentarily disorienting when I’m in the flow on the Mac, because
I’m pulled out of that world and into the physical one. There’s a
similar effect with iOS too, but I’ve always found it less
profound. Partly that’s the nature of iOS, which doesn’t speak to
me, idiomatically, like MacOS does. I think in many ways that
explains why I never feel <em>in the flow</em> on an iPad like I can on a
Mac, even with the same size display. But with the iPhone in
particular screen size is an important factor. I don’t think <em>any</em>
hypothetical phone OS could be as immersive as I find MacOS,
simply because even the largest phone display is so small.
Watching a movie on a phone is a lesser experience than watching
on a big TV set, and watching a movie on even a huge TV is a
lesser experience than watching a movie in a nice theater. We
humans are visual creatures and our field of view affects our
sense of importance. Size matters. </p>

<p>The <em>worlds</em>, as it were, of MacOS and iOS (or Windows, or
Android, or whatever) are defined and limited by the displays on
which they run. If MacOS is a place I go mentally when working,
that place is manifested physically by the Mac’s display. It’s
like the playing field, or the court, in sports — it has very
clear, hard and fast, rectangular bounds. It is of fixed size and
shape, and everything I do in that world takes place in the
confines of those display boundaries. </p>

<p>VisionOS is very much going to be a conceptual place like that for
work. But there is no display. There are no boundaries. The
intellectual “place” where the apps of VisionOS are presented is
the real-world place in which you use the device, or the expansive
virtual environment you choose. The room in which you’re sitting
is the canvas. The whole room. The display on a Mac or iOS device
is to me like a portal, a rectangular window into a well-defined
virtual world. With VisionOS the virtual world is the actual world
around you. </p>

<p>In the same way that the introduction of multitouch with the
iPhone removed a layer of conceptual abstraction — instead of
touching a mouse or trackpad to move an on-screen pointer to an
object on screen, you simply touch the object on screen — VisionOS removes a layer of abstraction spatially. Using a Mac,
you are in a physical place, there is a display in front of you in
that place, and on that display are application windows. Using
VisionOS, there are just application windows in the physical place
in which you are. On Monday I had Safari and Messages and Photos
open, side by side, each in a window that seemed the size of a
movie poster — that is to say, each app in a window that appeared
larger than any actual computer display I’ve ever used. All side
by side. <a href="https://www.apple.com/newsroom/2023/06/introducing-apple-vision-pro/">Some of the videos in Apple’s Newsroom post</a>
introducing Vision Pro illustrate this. But seeing a picture of an
actor in this environment doesn’t do justice to experiencing it
firsthand, because a photo showing this environment itself has
defined rectangular borders. </p>

<p>This is not confusing or complex, but it feels profound. </p>
</blockquote>

<p>That might be the longest blockquote I’ve ever included in an article. But after nearly a week using Vision Pro, I can’t put it any better now than I did then. My inkling after that first 30-minute experience was exactly right.</p>

<p>This first-generation Vision Pro <em>hardware</em> is severely restricted by the current limits of technology. Apple has pushed those limits in numerous ways, but the limits are glaring. This Vision Pro is a stake in the ground, defining the new state of the art in immersive headset technology. But that stake in the ground will recede in the rear view mirror as the years march on. Just like the Mac’s 9-inch monochrome 512 × 342 pixel display. Just like the iPhone’s EDGE cellular modem.</p>

<p>But the conceptual design of VisionOS lays the foundation for an entirely new direction of interaction design. Just like how the basic concepts of the original Mac interface were exactly right, and remain true to this day. Just like how the original iPhone defined the way every phone in the world now works.</p>

<p>There is no practical way to surround yourself with multiple external displays with a Mac or PC to give yourself a workspace canvas the size of the workspace in VisionOS. The VisionOS workspace isn’t <em>infinite</em>, but it feels as close to infinitely large as it could be. It’s the world around you.</p>

<p>And it is very <em>spatial</em>. Windows remain anchored in place in the world around you. Set up a few windows, then stand up and walk away, and when you come back, those windows are exactly where you left them. It is uncanny to walk right through them. (From behind, they look white.) You can do seemingly crazy things like put a VisionOS application window outside a real-world window.</p>

<p>Windows also retain their positions relative to each other. A single press of the digital crown button brings up the VisionOS home view. A long-press of the digital crown button recenters your view according to your current gaze. So, for example, if everything seems a little too low in your view, look up, then press-and-hold the digital crown. All open windows will re-center in your current field of view. But this also works when you stand up and move.</p>

<p>You can start working in, say, your kitchen. Open up Messages, Safari, and Notes. Arrange the three windows around you from left to right. Stand up and walk to your living room. Those windows remain in your kitchen. Sit down on your sofa, and press-and-hold the digital crown. Now those windows move to your living room, re-centered in your current gaze — but exactly in the same positions and sizes relative to each other. It’s like having a multiple display setup that you can easily move to wherever you want to be.</p>

<p>Decades of Mac use has trained me to think that window controls are at the top of a window. In VisionOS they’re at the bottom. This took me a day or two to get accustomed to — when I think “I want to close this window”, my eyes naturally go to the top left corner. VisionOS windows really only have three controls: a close button and a “window bar” underneath, and a resizing indicator at each corner. Once you start using it, it’s easy to see why Apple put the window bar at the bottom instead of the top: if it were at the top, your hand and arm would obscure the window contents as you drag a window around to move it. With the bar at the bottom, window contents aren’t obscured at all while moving them.</p>

<p>Part of what makes the VisionOS workspace seem so expansive is that it’s utterly natural to make use of the Z-axis (depth). While dragging a window, it’s as easy to pull it closer, or push it farther away, as it is to move it left or right. It’s also utterly natural to rotate them, to arrange windows in a semicircle around you. It’s thrillingly science-fiction-like. The Mac, from its inception 40 years ago, has always had the concept of a Z-axis for stacking windows atop each other, but with VisionOS it’s quite different. In VisionOS, windows stay at the depth where you place them, whereas on the Mac windows pop to the front of the stack when you activate them.</p>

<p>Long-tap on a window’s close button and you get the option to close all other open windows, à la the Mac’s “Hide Others” command in the application menu. (Long-tapping is useful throughout VisionOS.)</p>

<p>The pass-through view of the real world around you means you can stand up and walk around while wearing Vision Pro. It doesn’t feel at all unsafe or disorienting. In fact it’s uncannily natural. But for <em>using</em> Vision Pro, it’s clearly intended that you be stationary, sitting or standing in a fixed position. Other than using Vision Pro as a camera, I can’t think of a reason to <em>use</em> it while walking about. Application windows are not fixed in position relative to <em>you</em> — they’re fixed in position relative to the world around you.</p>

<p><a href="https://www.apple.com/apple-vision-pro/guided-tour/">Apple’s Guided Tour video</a> does a better job than any written description could to convey the basic gist of using the platform. The main thing is this: you look at things, and tap your index finger and thumb to “click” or grab the thing you’re looking at. It sounds so simple and obvious, but it’s a breakthrough in interaction design. The Mac gave us “point and click”. The iPhone gave us “tap and slide”. Vision Pro gives us “look and tap”. The one aspect of this interaction model that isn’t instantly intuitive is that — with a few notable exceptions — you don’t reach out and poke the things you see directly. You’ll want to at first, even after reading this. But it doesn’t work for most things, and would quickly grow tiresome even if it did. Instead you really do just keep your hands on your lap or on your table top — wherever they’re most comfortable in a resting position.</p>

<p>Among the notable exceptions are VisionOS’s virtual keyboards. (Keyboards, plural, to include both the QWERTY typing keyboard and the numeric keypad for entering your device passcode, if Optic ID for some reason fails.) VisionOS virtual keyboards work <em>both</em> ways — you can gaze at each key you want to press and tap your finger and thumb to activate them in turn, or you can reach out and poke at the virtual keys. Either method is fine for entering a word or two (or a 6-digit passcode); neither method is good for actually writing. Siri dictation works great for longer-form text entry, but if you want to do any writing at all without dictation, you’ll want to pair a Bluetooth keyboard. The virtual keyboard is better than trying to type on an Apple Watch, but not by much.</p>

<p>One uncanny aspect to using a Bluetooth keyboard is that VisionOS presents a virtual HUD over the keyboard. This HUD contains autocomplete suggestions and also shows you, in a single line, the text you’re typing. This is an affordance for people who can’t type without looking at their keyboard. On a Mac or iPad with a physical keyboard, you really only have to move your eyes to go from looking at your display to your keyboard. On VisionOS, you need to move your head, because windows tend to be further away from you, and higher in space. This little HUD lets you see what you’re typing while keeping your eyes on the physical keyboard. It’s weird but in a good way. One week in and it still brings a smile to my face to have a physical keyboard with tappable autocomplete suggestions.</p>

<p>VisionOS also works great with a Magic Trackpad or Bluetooth mouse. The mouse cursor is a circle, just like the one in iPadOS. You don’t see the cursor in the space between windows; it just appears inside whichever window you’re looking at.</p>

<p>The Mac Virtual Display feature is both useful and almost startlingly intuitive. If you have a MacBook, you just open the MacBook lid and VisionOS will present a popover, just atop the MacBook’s open display, with two buttons: “Connect” and a close button. Tap the Connect button and you get a virtual Studio-display-size-ish Mac display. You can move and resize this window like any other in VisionOS. And while you can’t increase the number of pixels it represents, you can upscale it to be very large. It’s very usable, and while connected, your MacBook’s keyboard and trackpad work in VisionOS apps too. You can also initiate Mac Virtual Display mode manually, using Control Center in VisionOS (which you access by tapping a small chevron at the top of your view, looking up toward the ceiling or sky).</p>

<p>I’ve used this mode quite a bit over the last week, and the only hiccup is that I continually find myself wanting to use VisionOS’s “stare and tap” interaction with elements inside the virtual Mac display. That doesn’t work. So while my hands are on the physical keyboard and trackpad, everything works across both the Mac (inside the virtual Mac display) and VisionOS apps, but when my hands are off the physical trackpad, and I’m finger-to-thumb tapping away in VisionOS application windows, every single time I turn my attention back to a Mac app in the virtual Mac display, I find myself futilely finger-to-thumb tapping to activate or click whatever I’m looking at. This is akin to switching from an iPad to a MacBook and trying to touch the screen, but worse, because with the virtual Mac display in VisionOS, you’re continuously context switching between the Mac environment and VisionOS apps. The trackpad works perfectly in both; look-and-tap only works in VisionOS (or merely to move, resize, or activate the virtual Mac display — not to interact with the Mac UI therein).</p>

<p>VisionOS is already on version 1.0.1 (a software update from version 1.0.0 was already available when I set it up), and has a bunch of 1.0 bugs. I had to force quit apps — including Settings — a few times. (Press and hold both the top button and digital crown button for a few seconds, and you get a MacOS-style Force Quit dialog; keep holding both buttons down and you can force a system restart.)</p>

<p>There are not a lot of native VisionOS apps yet, but iPad apps really do work well. The main difference between native VisionOS apps and iPad apps isn’t that VisionOS apps work better, so much as that they simply look a lot cooler. VisionOS is, to my eyes, the best-looking OS Apple has made since the original skeuomorphic iPhone interface in iOS 1–6. Actual depth and shading — what an idea.</p>

<p>The apps on VisionOS’s home view are not manually organizable — a curious omission even in a 1.0 release. (Especially so given that Apple is bragging about having zillions of compatible iPad apps available.) All iPad apps — including a bunch of built-in apps from Apple itself, including News, Books, Calendar, and Maps — are put in a “Compatible Apps” folder, and have squircle-shaped icons. Native VisionOS apps are at the root level of the apps view, and have circular icons.</p>

<p>The fundamental interaction model in VisionOS feels like it will be copied by all future VR/AR headsets, in the same way that all desktop computers work like the Mac, and all phones and tablets now work like the iPhone. And when that happens, some will argue that of course they all work that way, because how else could they work? But personal computers didn’t have point-and-click GUIs before the Mac, and phones didn’t have “it’s all just a big touchscreen” interfaces before the iPhone. No other headset today has a “just look at a target, and tap your finger and thumb” interface. I suspect in a few years they all will.</p>

<h2>The Hardware, Again, A/V Edition</h2>

<p>This brings me back to the hardware of Vision Pro. The displays are excellent, but I’m already starting to see how they aren’t good enough. The eye tracking is very good, but it’s not as precise as I’d like it to be. The cameras are good, but they don’t approach the dynamic range of your actual eyesight. There sometimes is color fringing at the periphery of your vision, depending on the lighting. A light source to your side, like a window in daytime, will show the fringing. When you move your head, the illusion of true pass-through is broken — you can tell that you’re looking at displays showing the world via footage from cameras. Just walking around is enough motion to break the illusion of natural pass-through of the real world. In fact, in some ways, the immersive 3D environments — mountaintops, lakesides, the surface of the moon (!) — are more visually realistic than the actual real world, because there’s less latency and shearing as you pan your gaze.</p>

<p>I’ve used the original PlayStation 5 VR headset and the HTC Vive Pro, and I own a Meta Quest 3. Vision Pro’s display quality makes all of those headsets seem like they’re from a different era. Vision Pro is in a different ballpark, playing a different game. In terms of resolution, Vision Pro is astonishing. I do not see pixels, ever. I see text as crisply as I do in real life. It’s very comfortable to read. (Although very weird, still, one week in, to have, say, a Safari window that appears 6 or 7 feet tall.) But I can already imagine a <em>better</em> Vision headset display. I can already imagine lower latency between the camera footage and the displays in front of my eyes. I can already imagine greater dynamic range, like when looking out a window during daytime from inside a dim room.</p>

<p>Vision Pro’s displays are amazing, yet also obviously not good enough.</p>

<p>The speakers, on the other hand, are simply amazing. I’ve never experienced anything quite like them. I expected that they’d sound fine, but not as good as AirPods Pro or Max. Instead, I find they sound far better than any headphones I’ve ever worn. That’s because they’re actually speakers, optimally positioned in front of your ears. There’s always a catch, and the catch with Vision Pro’s speakers is that they’re not private at all. Someone sitting next to you can hear what you hear; someone near you can hear most of what you hear. When using Vision Pro in public or near others, you’ll want to wear AirPods both for privacy and courtesy — not audio quality.</p>

<p>The speakers also convey an uncanny sense of spatial reality.</p>

<p>Last, I’ve consistently gotten 3 full hours of battery life using Vision Pro on a full charge. Sometimes a little more. In my experience, Apple’s stated 2–2.5 hour battery life is a floor, not a ceiling. I also have suffered neither physical discomfort nor nausea in long sessions.<sup id="fnr3-2024-01-30"><a href="#fn3-2024-01-30">3</a></sup></p>

<h2>I’m Sorely Tempted, but Shall Resist, Making a ‘Persona Non Grata’ Pun in This Section Heading</h2>

<p><em>Personas</em> are a highlight feature of Vision Pro. Your persona is a digital avatar of your head and shoulders that appears in your stead when you make FaceTime calls, or when you use other video call software that has adopted the APIs for personas. At launch that already includes Zoom, Webex, and Microsoft Teams.</p>

<p>Apple is prominently labeling the entire persona feature “beta”, and it doesn’t take more than a moment of seeing one to know why. Personas are weird. They are very deep in the <a href="https://en.wikipedia.org/wiki/Uncanny_valley">uncanny valley</a>. There is no mistaking a persona for the actual person. At times they seem far more like a character from a video game than a photorealistic visage. And at all times they seem <em>somewhat</em> like a video game character. My hair, for example, looks like a shiny plastic Lego hairpiece.</p>

<p>Even capturing your persona is awkward. You have to take Vision Pro off your head and turn it around so the cameras (and lidar sensor?) can see you, but you don’t get a good image of what the cameras are capturing, because the front-facing EyeSight display is relatively low resolution. Your hair might be messed up from the headband, and you’ll need to check yourself in an actual mirror to make sure your shirt is smooth.</p>

<p>I FaceTimed my wife after capturing mine, and her reaction — not really knowing at all what to expect — was “No, no, no — oh my god what is this?” And then she just started laughing. We concluded the test with her telling me, “Don’t ever call me like that again.” She was joking (I think), but personas are so deep in the uncanny valley that the first time anyone sees one, they’re going to want to talk about it.</p>

<p>Apple is in a tough spot with this feature. The feature clearly deserves the prominent “beta” label in its current state. But you can also see why Apple needed to include the feature at launch, no matter how far from “good” it is. You can’t ship a productivity computer today that can’t be used for video conferencing. It’s like email or web browsing: essential. But that leaves only two options: a cartoon-like Memoji avatar, or an attempt at photorealism. Apple almost certainly could have knocked a Memoji avatar out of the park for this purpose, but I think rightly decided that that would be utterly inappropriate in most professional work contexts. You could FaceTime your family and friends as a Memoji and they’d accept it without judgment, but not professional colleagues or clients.</p>

<p>In defense of personas as they exist right now, I’ve found that I do get used to them a few minutes into a call. (I’ve had a 30-minute FaceTime call with Vision-Pro-wearing reps from Apple, as well as calls with fellow reviewers Joanna Stern and Nilay Patel.) But I think they work best persona-to-persona — that is to say, between two (or more) people who are all using a Vision Pro. That’s obviously not going to be the case for most calls. This will get normalized, as more people buy and use Vision headsets, and as Apple races to improve the feature to non-beta quality. But for now, if you use it, expect to talk about it.</p>

<p>Your persona is also used for the presentation of your eyes in the EyeSight feature. EyeSight, in Apple’s product marketing, is a headline feature of Vision Pro. It’s prominently featured on their website and their advertisements. But in practice it’s very subtle. Here’s the best selfie I’ve been able to capture of myself showing EyeSight:</p>

<p><a href="https://daringfireball.net/misc/2024/01/eyesight-in-action.jpeg" class="noborder">
<img src="https://daringfireball.net/misc/2024/01/eyesight-in-action.jpeg" alt="The author, wearing Vision Pro, with EyeSight activated."></a></p>

<p>For the record, my eyes were open when I snapped that photo.</p>

<p>EyeSight is not displayed most of the time that you’re using Vision Pro — it only turns on when Vision Pro detects a person in front of you (including when you look at yourself in a mirror). Most of the time you’re using Vision Pro, the front display shows nothing at all.</p>

<p>EyeSight is not an “<em>Oh my god, I can see your eyes!</em>” feature, but instead more of an “<em>Oh, yes, now that you ask, I guess I can sort of see your eyes</em>” feature. Apple seemingly went to great lengths (and significant expense) to create the EyeSight feature, but so far I’ve found it to be of highly dubious utility, if only because it’s so subtle. It’s like wearing tinted goggles meant to <em>obscure</em> you, not clear goggles meant to show your eyes clearly.</p>

<h2>Personal Entertainment</h2>

<p>I’ve saved the best for last. Vision Pro is simply a phenomenal way to watch movies, and 3D immersive experiences are astonishing. There are 3D immersive experiences in Vision Pro that are more compelling than Disney World attractions that people wait in line for hours to see.</p>

<p>First up are movies using apps that haven’t been updated for Vision Pro natively. I’ve used the iPad apps for services like Paramount+ and Peacock. Watching video in apps like these is a great experience, but not jaw-dropping. You just get a window with the video content that you can make as big as you want, but “big”, for these merely “compatible” apps, is about the size of the biggest wall in your room. This is true too for video in Safari when you go “full screen”. It breaks out of the browser window into a standalone video window. (Netflix.com is OK in VisionOS Safari, but YouTube.com stinks — it’s a minefield of UI targets that are too small for eye-tracking’s precision.)</p>

<p>Where things go to the next level are the Disney+ and Apple TV apps, which have been designed specifically for Vision Pro. Both apps offer immersive 360° viewing environments. <a href="https://www.apple.com/newsroom/2024/01/apple-previews-new-entertainment-experiences-launching-with-apple-vision-pro/">Disney+ has four</a>: “the Disney+ Theater, inspired by the historic El Capitan Theatre in Hollywood; the Scare Floor from Pixar’s <em>Monsters Inc.</em>; Marvel’s Avengers Tower overlooking downtown Manhattan; and the cockpit of Luke Skywalker’s landspeeder, facing a binary sunset on the planet Tatooine from the Star Wars galaxy.” With the TV app, Apple offers a distraction-free virtual theater.</p>

<p>What’s amazing about watching movies in these two apps is that the virtual movie screens look immense, as though you’re really in a movie theater, all by yourself, looking at a 100-foot screen. Apple’s presentation in the TV app is particularly good, giving you options to simulate perspectives from the front, middle, or back of the theater, as well as from either the floor or balcony levels. (Like Siskel and Ebert, I think I prefer the balcony.) The “<em>Holy shit, this screen looks absolutely immense</em>” effect is particularly good in Apple’s TV app. Somehow these postage-stamp-size displays inside Vision Pro are capable of convincing your brain that you’re sitting in the best seat in the house in front of a huge movie theater screen. (As immersive as the Disney+ viewing experience is, after using the TV app, I find myself wishing I could get closer to the big screen in Disney+, or make the screen even bigger.) I have never been a fan of 3D movies in actual movie theaters, but in Vision Pro the depth feels natural, not distracting, and you don’t suffer any loss of brightness from wearing what are effectively sunglasses.</p>

<p>And then there are the 3D immersive experiences. Apple has commissioned <a href="https://www.apple.com/newsroom/2024/01/apple-previews-new-entertainment-experiences-launching-with-apple-vision-pro/">a handful of original titles</a> that are available at launch, free of charge, as Vision Pro exclusives. I’m sure there are more on the way. There aren’t enough titles yet to recommend them as a reason to buy a Vision Pro, but what this handful of titles promises for the future is incredible. (I look forward, too, to watching sports in 3D immersion — but at the moment that’s entirely in the future, hopefully the near future.)</p>

<p>But I <em>can</em> recommend buying Vision Pro solely for use as a personal theater. I paid $5,000 for my 77-inch LG OLED TV a few years ago. Vision Pro offers a far more compelling experience (including far more compelling spatial surround sound). You’d look at my TV set and almost certainly agree that it’s a nice big TV. But watching movies in the Disney+ and TV apps will make you go “Wow!” These are experiences I never imagined I’d be able to have in my own home (or, say, while flying across the country in an airplane).</p>

<p>The only hitch is that Vision Pro is utterly personal. Putting a headset on is by nature isolating — like headphones but more so, because eye contact is so essential for all primates. If you don’t often watch movies or shows or sports by yourself, it doesn’t make sense to buy a device that only you can see. Just this weekend, I watched most of the first half of the Chiefs-Ravens game in the Paramount+ app on Vision Pro, in a window scaled to the size of my entire living room wall. It was captivating. The image quality was a bit grainy scaled to that size (I believe the telecast was only in 1080p resolution), but it was better than watching on my TV simply because it was so damn big. But I wanted to watch the rest of the game (and the subsequent Lions-49ers game) with my wife, together on the sofa. I was happier overall sharing the experience with her, but damn if my 77-inch TV didn’t suddenly seem way too small.</p>

<p>Spatial computing in VisionOS is the real deal. It’s a legit productivity computing platform right now, and it’s only going to get better. It sounds like hype, but I truly believe this is a landmark breakthrough like the 1984 Macintosh and the 2007 iPhone.</p>

<p>But if you were to try just one thing using Vision Pro — just one thing — it has to be watching a movie in the TV app, in theater mode. Try that, and no matter how skeptical you were beforehand about the Vision Pro’s price tag, your hand will start inching toward your wallet.</p>

+ 267
- 0
cache/2024/f4d2d42eba58062be910407690ae447c/index.html View File

@@ -0,0 +1,267 @@
<!doctype html><!-- This is a valid HTML5 document. -->
<!-- Screen readers, SEO, extensions and so on. -->
<html lang="fr">
<!-- Has to be within the first 1024 bytes, hence before the `title` element
See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
<meta charset="utf-8">
<!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
<!-- The viewport meta is quite crowded and we are responsible for that.
See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
<meta name="viewport" content="width=device-width,initial-scale=1">
<!-- Required to make a valid HTML5 document. -->
<title>The Web Component Success Story (archive) — David Larlet</title>
<meta name="description" content="Publication mise en cache pour en conserver une trace.">
<!-- That good ol' feed, subscribe :). -->
<link rel="alternate" type="application/atom+xml" title="Feed" href="/david/log/">
<!-- Generated from https://realfavicongenerator.net/ such a mess. -->
<link rel="apple-touch-icon" sizes="180x180" href="/static/david/icons2/apple-touch-icon.png">
<link rel="icon" type="image/png" sizes="32x32" href="/static/david/icons2/favicon-32x32.png">
<link rel="icon" type="image/png" sizes="16x16" href="/static/david/icons2/favicon-16x16.png">
<link rel="manifest" href="/static/david/icons2/site.webmanifest">
<link rel="mask-icon" href="/static/david/icons2/safari-pinned-tab.svg" color="#07486c">
<link rel="shortcut icon" href="/static/david/icons2/favicon.ico">
<meta name="msapplication-TileColor" content="#f7f7f7">
<meta name="msapplication-config" content="/static/david/icons2/browserconfig.xml">
<meta name="theme-color" content="#f7f7f7" media="(prefers-color-scheme: light)">
<meta name="theme-color" content="#272727" media="(prefers-color-scheme: dark)">
<!-- Is that even respected? Retrospectively? What a shAItshow…
https://neil-clarke.com/block-the-bots-that-feed-ai-models-by-scraping-your-website/ -->
<meta name="robots" content="noai, noimageai">
<!-- Documented, feel free to shoot an email. -->
<link rel="stylesheet" href="/static/david/css/style_2021-01-20.css">
<!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_regular.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_bold.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_italic.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<script>
function toggleTheme(themeName) {
document.documentElement.classList.toggle(
'forced-dark',
themeName === 'dark'
)
document.documentElement.classList.toggle(
'forced-light',
themeName === 'light'
)
}
const selectedTheme = localStorage.getItem('theme')
if (selectedTheme !== 'undefined') {
toggleTheme(selectedTheme)
}
</script>

<meta name="robots" content="noindex, nofollow">
<meta content="origin-when-cross-origin" name="referrer">
<!-- Canonical URL for SEO purposes -->
<link rel="canonical" href="https://jakelazaroff.com/words/the-web-component-success-story/">

<body class="remarkdown h1-underline h2-underline h3-underline em-underscore hr-center ul-star pre-tick" data-instant-intensity="viewport-all">


<article>
<header>
<h1>The Web Component Success Story</h1>
</header>
<nav>
<p class="center">
<a href="/david/" title="Aller à l’accueil"><svg class="icon icon-home">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-home"></use>
</svg> Accueil</a> •
<a href="https://jakelazaroff.com/words/the-web-component-success-story/" title="Lien vers le contenu original">Source originale</a>
<br>
Mis en cache le 2024-01-31
</p>
</nav>
<hr>
<p>Tom MacWright wrote a short post wondering <a class="link" href="https://macwright.com/2024/01/24/on-web-components" data-astro-cid-bi7aps5f>why we don’t see prominent applications using web components</a><a class="tooltip" data-tooltip href="https://macwright.com/2024/01/24/on-web-components" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>On Web Components</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://macwright.com/css/favicon.png" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>macwright.com/2024/01/24/on-web-components</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>.</p>
<p>That’s a fair question!
It’s easy to see the success of frameworks like React and Rails: just look at the thousands of websites built with them.
What does the web component success story look like?</p>
<p>Contrary to some people, I don’t see web components on their own as a huge productivity boon for individual websites.
Once you’ve bought into a particular set of technologies, it makes sense to use it for as much as you can.
If you have a React app, you’d be justifiably skeptical of introducing a second way to build components!</p>
<p>Rather, the biggest benefits I see are <em>collective</em>, cutting across the industry as a whole.
I think web components can make the entire web more accessible.
They have the potential to unify currently fragmented communities, including various JavaScript frameworks <em>and</em> those who avoid them.</p>
<p>I know that’s an audacious pitch, but bear with me.</p>
<h3 id="javascript-framework-interop">JavaScript Framework Interop</h3>
<p>Whenever I write about web components, I see pushback from people in the JavaScript community who seem to think that I want to replace JavaScript frameworks.</p>
<p>If you’re in that camp, let me assuage those fears: the web component success story emphatically does <em>not</em> involve rewriting every React app using web components.
As I’ll continue to say, web components and JavaScript frameworks are <em>complementary</em> (as opposed to <em>competing</em>) technologies.
In fact, I think JavaScript framework apps will be one of the most common places in which web components are used!</p>
<p>Does that mean we’ll all start writing web components in addition to React components?
Not at all.
When I say that web components will be used in JavaScript framework apps, I’m talking about third-party libraries.</p>
<p>JavaScript frameworks are tools, all tools have tradeoffs, yada yada, let’s skip the preamble.
I want to talk about one specific weakness of JavaScript frameworks: interoperability, or the lack thereof.
Almost without exception, each framework can only render components written for that framework specifically.</p>
<p>As a result, the JavaScript community tends to fragment itself along framework lines.
Switching frameworks has a high cost, especially when moving to a less popular one; it means leaving most of the third-party ecosystem behind.</p>
<p>That switching cost stunts framework innovation by heavily favoring incumbents with large ecosystems.
It’s hard to create new frameworks, because each one has to start its own ecosystem from scratch.
We keep rebuilding the same set of primitives over and over and over again.</p>
<p>There’s a famous Joel Spolsky blog post about <a class="link" href="https://www.joelonsoftware.com/2002/06/12/strategy-letter-v/" data-astro-cid-bi7aps5f>why capitalistic tech companies contribute to open source</a><a class="tooltip" data-tooltip href="https://www.joelonsoftware.com/2002/06/12/strategy-letter-v/" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://i0.wp.com/www.joelonsoftware.com/wp-content/uploads/2016/12/11969842.jpg?fit=400%2C400&amp;ssl=1" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Strategy Letter V</span> <span class="description" data-astro-cid-bi7aps5f>When I was in college I took two intro economics courses: macroeconomics and microeconomics. Macro was full of theories like “low unemployment causes inflation” that never quite stood u…</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://i0.wp.com/www.joelonsoftware.com/wp-content/uploads/2016/12/11969842.jpg?fit=32%2C32&amp;ssl=1" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>www.joelonsoftware.com/2002/06/12/strategy-letter-v/</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>.
Briefly: every product has <em>substitutes</em> (products that can replace it) and <em>complements</em> (products that can be used alongside it).
The big takeaway is that “<strong>smart companies try to commoditize their products’ complements</strong>”.
In other words, they try to make it so that their own product has a proprietary advantage, while the products used alongside it are all cheap and interchangeable.</p>
<p>Back to JavaScript frameworks.
React and Svelte are substitutes, while React and Radix are complements.
As a library author, the way to commoditize your complement is to make it work with as many frameworks as possible.<sup><a class="link" href="#user-content-fn-reactnative" id="user-content-fnref-reactnative" data-footnote-ref="" aria-describedby="footnote-label" data-astro-cid-bi7aps5f>1</a></sup>
And unlike in Native Land — where people have collectively spent billions of dollars over decades developing write-once run-anywhere environments — the web has one built in.</p>
<p>Maybe you’ve heard of it?
It’s called HTML, and it works with <a class="link" href="https://custom-elements-everywhere.com" data-astro-cid-bi7aps5f>every Javascript framework</a><a class="tooltip" data-tooltip href="https://custom-elements-everywhere.com" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://custom-elements-everywhere.com/images/card.jpg" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Custom Elements Everywhere</span> <span class="description" data-astro-cid-bi7aps5f>Making sure frameworks and custom elements can be BFFs 🍻</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://custom-elements-everywhere.com/images/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>custom-elements-everywhere.com</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>.
For all their warts, the fact that web components get this interoperability for free is a <em>ridiculously powerful advantage</em>, and libraries that don’t exploit it are leaving a lot of potential users on the table.</p>
<p>Here’s a concrete example.
<a class="link" href="https://www.xyflow.com" data-astro-cid-bi7aps5f>xyflow</a><a class="tooltip" data-tooltip href="https://www.xyflow.com" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://xyflow.com/img/og/xyflow.jpg" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Node Based UIs for React and Svelte – xyflow</span> <span class="description" data-astro-cid-bi7aps5f>Powerful open source libraries for building node-based UIs with React or Svelte. Ready out-of-the-box and infinitely customizable.</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://xyflow.com/img/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>www.xyflow.com</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a> is an excellent library for making flow charts.
It was originally called React Flow, but the maintainers renamed it when they added Svelte support.
<a class="link" href="https://www.xyflow.com/blog/why-svelte-flow" data-astro-cid-bi7aps5f>They had to put in a ton of work just to support that <em>one</em> extra framework</a><a class="tooltip" data-tooltip href="https://www.xyflow.com/blog/why-svelte-flow" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://xyflow.com/img/og/xyflow.jpg" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Why Svelte Flow? – xyflow</span> <span class="description" data-astro-cid-bi7aps5f>xyflow - Customizable library for rendering workflows, diagrams and node-based UIs.</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://xyflow.com/img/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>www.xyflow.com/blog/why-svelte-flow</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>!
And if you use Vue, Angular, Solid, Qwik or Ember, you’re still out of luck.</p>
<p>React has enjoyed continued success because it has a moat of fantastic third-party libraries: Radix, React Aria, React Three Fiber, Framer Motion and xyflow, among many others.
Web components have the potential to give us that same ecosystem — but for <em>every</em> framework.</p>
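<p>To ground that in code, here is a minimal sketch of a custom element; the tag name, attribute, and rendering logic are invented for illustration, not taken from any particular library. Once the element is defined, a plain HTML page, a React component, or a Svelte component can all render it the same way: by emitting its tag.</p>
<pre><code>// hello-badge.js (hypothetical element, for illustration only)
class HelloBadge extends HTMLElement {
  static observedAttributes = ["name"];

  connectedCallback() {
    this.render();
  }

  attributeChangedCallback() {
    this.render();
  }

  render() {
    // No framework involved: the element owns its own light DOM.
    this.textContent = `Hello, ${this.getAttribute("name") ?? "world"}!`;
  }
}

customElements.define("hello-badge", HelloBadge);

// Usage is the same everywhere the tag can appear:
//   plain HTML:  &lt;hello-badge name="Ada"&gt;&lt;/hello-badge&gt;
//   React JSX:   &lt;hello-badge name={user.name}&gt;&lt;/hello-badge&gt;
//   Svelte:      &lt;hello-badge name={user.name}&gt;&lt;/hello-badge&gt;
</code></pre>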
<h3 id="islands-of-interactivity">Islands of Interactivity</h3>
<p>Of course, plenty of websites don’t use JavaScript frameworks.
Hypermedia-centric approaches (read: how websites were built before circa 2010) are making a resurgence, led by libraries such as htmx.</p>
<p>Many websites like this still incorporate highly dynamic elements.
Often, these take the form of rich widgets that are missing from HTML, like menus and combo boxes.
Sometimes they’re even more complicated, like interactive diagrams in articles.
The modern term for these dynamic regions within an otherwise static page is “islands of interactivity”, but the pattern has existed for a long time.</p>
<p>Embedding these islands within the larger page has always been kinda awkward.
The process remains mostly unchanged from the days of jQuery plugins, relying on a complex choreography of HTML classes, CSS selectors and JavaScript function calls.
The bulk of the setup happens in a separate JavaScript file, far away from the HTML where the component will live on the page.</p>
<p>Web components invert that process.
They allow islands to be instantiated in the same way as any other element: by writing a tag name in the page’s markup.
As I wrote in <a class="link" href="https://jakelazaroff.com/words/the-website-vs-web-app-dichotomy-doesnt-exist/" data-astro-cid-bi7aps5f>The Website vs. Web App Dichotomy Doesn’t Exist</a><a class="tooltip" data-tooltip href="https://jakelazaroff.com/words/the-website-vs-web-app-dichotomy-doesnt-exist/" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://jakelazaroff.com/og/the-website-vs-web-app-dichotomy-doesnt-exist.png" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>The Website vs. Web App Dichotomy Doesn't Exist | jakelazaroff.com</span> <span class="description" data-astro-cid-bi7aps5f>A one-dimensional spectrum can't sufficiently capture the tradeoffs involved in web development.</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://jakelazaroff.com/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>jakelazaroff.com/words/the-website-vs-web-app-dichotomy-doesnt-exist/</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>, web components allow developers to declaratively add dynamic behavior to HTML itself.</p>
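<p>Here is a small sketch of that inversion; the element name and behavior are made up for illustration. Instead of a separate script that queries the page and wires each widget up imperatively, the markup itself instantiates the island, and the element does its own setup when it is attached to the DOM.</p>
<pre><code>// Before: imperative wiring in a separate script, far from the markup it targets.
//   document.querySelectorAll(".js-counter").forEach((el) => initCounter(el));

// After: the island is just an element; dropping the tag into the page is the setup.
class CounterIsland extends HTMLElement {
  connectedCallback() {
    let count = Number(this.getAttribute("start") ?? 0);
    const button = document.createElement("button");
    const label = () => `Clicked ${count} times`;
    button.textContent = label();
    button.addEventListener("click", () => {
      count += 1;
      button.textContent = label();
    });
    this.append(button);
  }
}
customElements.define("counter-island", CounterIsland);

// In the page's markup: &lt;counter-island start="0"&gt;&lt;/counter-island&gt;
</code></pre>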
<p>What makes web components particularly good companions for hypermedia-oriented libraries is the way they interact with other parts of the page.
While JavaScript framework components tend to do so by invoking callback functions, web components instead embrace one of the web’s core idioms: events. <sup><a class="link" href="#user-content-fn-target" id="user-content-fnref-target" data-footnote-ref="" aria-describedby="footnote-label" data-astro-cid-bi7aps5f>2</a></sup>
Indeed, Carson Gross’s essay <a class="link" href="https://htmx.org/essays/hypermedia-friendly-scripting/#events" data-astro-cid-bi7aps5f>Hypermedia-Friendly Scripting</a><a class="tooltip" data-tooltip href="https://htmx.org/essays/hypermedia-friendly-scripting/#events" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>&lt;/&gt; htmx ~ Hypermedia-Friendly Scripting</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://htmx.org/favicon.ico#events" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>htmx.org/essays/hypermedia-friendly-scripting/#events</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a> neatly outlines this use case:</p>
<blockquote>
<p>A JavaScript-based component that triggers events allows for hypermedia-oriented JavaScript libraries, such as htmx, to listen for those events and trigger hypermedia exchanges. This, in turn, makes any JavaScript library a potential hypermedia control, able to drive the Hypermedia-Driven Application via user-selected actions.</p>
</blockquote>
<p>As an example, here’s a TIL I wrote on <a class="link" href="https://til.jakelazaroff.com/htmx/load-modal-content-when-shoelace-dialog-opens/" data-astro-cid-bi7aps5f>using htmx and the Shoelace web component library to load the content of a dialog when it opens</a><a class="tooltip" data-tooltip href="https://til.jakelazaroff.com/htmx/load-modal-content-when-shoelace-dialog-opens/" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://til.jakelazaroff.com/og/htmx/load-modal-content-when-shoelace-dialog-opens.png" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>[htmx] Load modal content when a Shoelace dialog opens | Today I Learned</span> <span class="description" data-astro-cid-bi7aps5f>A collection of useful things I've learned.</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://til.jakelazaroff.com/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>til.jakelazaroff.com/htmx/load-modal-content-when-shoelace-dialog-opens/</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>.
Notice how the whole process — from instantiating the dialog component, to requesting the content when the modal opens, to inserting it into the appropriate place in the DOM — is controlled declaratively via markup.</p>
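<p>Here is a generic sketch of that handshake, not the exact code from the TIL above; the element name, event name, and URL are invented, and the markup assumes htmx’s hx-trigger attribute with its from: modifier. The component only announces what happened; htmx, declared in markup, decides how to respond.</p>
<pre><code>// A hypothetical filter widget that communicates via a bubbling DOM event.
class FilterPicker extends HTMLElement {
  connectedCallback() {
    // A native select inside the element fires "change", which bubbles to the host.
    this.addEventListener("change", () => {
      this.dispatchEvent(new CustomEvent("filter-changed", {
        bubbles: true,   // let the event reach document level, where htmx can hear it
        composed: true,  // so it would also escape a shadow root, if one were added
        detail: { value: this.querySelector("select")?.value },
      }));
    });
  }
}
customElements.define("filter-picker", FilterPicker);

// Markup, assuming htmx: when "filter-changed" bubbles up to &lt;body&gt;,
// htmx issues the GET and swaps the response into the div.
//   &lt;filter-picker&gt;&lt;/filter-picker&gt;
//   &lt;div hx-get="/results" hx-trigger="filter-changed from:body"&gt;&lt;/div&gt;
</code></pre>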
<p>There are also <a class="link" href="https://www.zachleat.com/web/a-taxonomy-of-web-component-types/#html-web-components" data-astro-cid-bi7aps5f>HTML web components</a><a class="tooltip" data-tooltip href="https://www.zachleat.com/web/a-taxonomy-of-web-component-types/#html-web-components" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://v1.screenshot.11ty.dev/https%3A%2F%2Fwww.zachleat.com%2Fopengraph%2Fweb%2Fa-taxonomy-of-web-component-types%2F/opengraph/_x202401_0/" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>An Attempted Taxonomy of Web Components—zachleat.com</span> <span class="description" data-astro-cid-bi7aps5f>A post by Zach Leatherman (zachleat)</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://www.zachleat.com/img/rel-icon-192.jpg#html-web-components" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>www.zachleat.com/web/a-taxonomy-of-web-component-types/#html-web-components</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>, which work by progressively enhancing existing markup rather than by rendering new DOM elements.
Colocating logic in this way, sometimes called <a class="link" href="https://htmx.org/essays/locality-of-behaviour/" data-astro-cid-bi7aps5f>locality of behavior</a><a class="tooltip" data-tooltip href="https://htmx.org/essays/locality-of-behaviour/" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>&lt;/&gt; htmx ~ Locality of Behaviour (LoB)</span> <span class="description" data-astro-cid-bi7aps5f>htmx gives you access to AJAX, CSS Transitions, WebSockets and Server Sent Events directly in HTML, using attributes, so you can build modern user interfaces with the simplicity and power of hypertext

htmx is small (~14k min.gz’d), dependency-free, extendable, IE11 compatible &amp; has reduced code base sizes by 67% when compared with react</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://htmx.org/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>htmx.org/essays/locality-of-behaviour/</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>, is a different lens on <a class="link" href="https://speakerdeck.com/didoo/let-there-be-peace-on-css?slide=62" data-astro-cid-bi7aps5f>a concept with which JavaScript developers should already be familiar</a><a class="tooltip" data-tooltip href="https://speakerdeck.com/didoo/let-there-be-peace-on-css?slide=62" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://files.speakerdeck.com/presentations/ecd310041a6841e0b4680dd85771a6fb/slide_61.jpg?8848622" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Let There Be Peace On CSS</span> <span class="description" data-astro-cid-bi7aps5f>In the last few months there's been a growing friction between those who see CSS as an untouchable layer in the "separation of concerns" paradigm, and those who have simply ignored this golden rule and found different ways to style the UI (typically applying CSS styles via JavaScript).

This debate is getting more and more intense every day, bringing division in a community that used to be immune to this kind of “wars”.

This talk is my attempt to bring peace between the two fronts. To help these two opposite factions to understand and listen to each other, see the counterpart's points of views. To find the good things they have in common, and learn something from that.

## This talk has been presented at London CSS Meetup + Design Exchange Nottingham (DXN) + Front End London (FEL) + State of The Browser ##

Video of the talk: https://www.youtube.com/watch?v=bb_kb6Q2Kdc</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://d1eu30co0ohy4w.cloudfront.net/assets/favicon-bdd5839d46040a50edf189174e6f7aacc8abb3aaecd56a4711cf00d820883f47.png" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>speakerdeck.com/didoo/let-there-be-peace-on-css?slide=62</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>.</p>
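<p>Here is a small sketch of that idea (the tag name and behavior are invented for illustration): the element wraps markup that already works on its own and only layers extra behavior on top in connectedCallback, so nothing breaks if the script never runs.</p>
<pre><code>// Light-DOM markup that is useful even without JavaScript:
//   &lt;copy-snippet&gt;
//     &lt;pre&gt;&lt;code&gt;npm install htmx.org&lt;/code&gt;&lt;/pre&gt;
//   &lt;/copy-snippet&gt;
class CopySnippet extends HTMLElement {
  connectedCallback() {
    // Progressive enhancement: the button only exists when JS is available.
    const button = document.createElement("button");
    button.type = "button";
    button.textContent = "Copy";
    button.addEventListener("click", async () => {
      const code = this.querySelector("code, pre");
      if (code) {
        await navigator.clipboard.writeText(code.textContent);
        button.textContent = "Copied!";
      }
    });
    this.append(button);
  }
}
customElements.define("copy-snippet", CopySnippet);
</code></pre>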
<h3 id="no-more-silos">No More Silos</h3>
<p>These sound like separate problems, but they’re actually two sides of the same coin.
With web components, the library that works in every JavaScript framework <em>also</em> works as an island of interactivity on a static webpage.
Even HTML web components fit into both niches.</p>
<p>Brad Frost has called for <a class="link" href="https://bradfrost.com/blog/post/a-global-design-system/" data-astro-cid-bi7aps5f>a global design system</a><a class="tooltip" data-tooltip href="https://bradfrost.com/blog/post/a-global-design-system/" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://bradfrost.com/wp-content/uploads/2023/11/CleanShot-2023-11-02-at-13.53.24.png" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>A Global Design System</span> <span class="description" data-astro-cid-bi7aps5f>TL;DR: This is a call to action to create a Global Design System that provides the world's web designers &amp; developers a library of common UI components. A Global Design System would improve the quality and accessibility of the world's web experiences, save the world's web designers and developer</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://bradfrost.com/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>bradfrost.com/blog/post/a-global-design-system/</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>: “a common library containing common UI components currently found in most design systems”.
The proposal is to create a cohesive, unstyled, accessible and internationalizable set of components — like <a class="link" href="https://www.radix-ui.com/primitives" data-astro-cid-bi7aps5f>Radix</a><a class="tooltip" data-tooltip href="https://www.radix-ui.com/primitives" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://radix-ui.com/social/primitives.png" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Radix Primitives</span> <span class="description" data-astro-cid-bi7aps5f>Unstyled, accessible, open source React primitives for high-quality web apps and design systems.</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://www.radix-ui.com/favicon.png" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>www.radix-ui.com/primitives</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>, but for the entire web rather than for a single JavaScript framework.
It’s an ambitious goal, and from where I stand web components are by far the best way to achieve it.</p>
<p>Web components won’t take web development by storm, or show us the One True Way to build websites.
They don’t need to dethrone JavaScript frameworks.
We probably won’t even all learn how to write them!</p>
<p>What web components <em>will</em> do — at least, I hope — is let us collectively build a rich ecosystem of dynamic components that work with any web stack.
No more silos.
That’s the web component success story.</p>
</article>


<hr>

<footer>
<p>
<a href="/david/" title="Aller à l’accueil"><svg class="icon icon-home">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-home"></use>
</svg> Accueil</a> •
<a href="/david/log/" title="Accès au flux RSS"><svg class="icon icon-rss2">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-rss2"></use>
</svg> Suivre</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant><svg class="icon icon-user-tie">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-user-tie"></use>
</svg> Pro</a> •
<a href="mailto:david%40larlet.fr" title="Envoyer un courriel"><svg class="icon icon-mail">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-mail"></use>
</svg> Email</a> •
<abbr class="nowrap" title="Hébergeur : Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340"><svg class="icon icon-hammer2">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-hammer2"></use>
</svg> Légal</abbr>
</p>
<template id="theme-selector">
<form>
<fieldset>
<legend><svg class="icon icon-brightness-contrast">
<use xlink:href="/static/david/icons2/symbol-defs-2021-12.svg#icon-brightness-contrast"></use>
</svg> Thème</legend>
<label>
<input type="radio" value="auto" name="chosen-color-scheme" checked> Auto
</label>
<label>
<input type="radio" value="dark" name="chosen-color-scheme"> Foncé
</label>
<label>
<input type="radio" value="light" name="chosen-color-scheme"> Clair
</label>
</fieldset>
</form>
</template>
</footer>
<script src="/static/david/js/instantpage-5.1.0.min.js" type="module"></script>
<script>
function loadThemeForm(templateName) {
const themeSelectorTemplate = document.querySelector(templateName)
const form = themeSelectorTemplate.content.firstElementChild
themeSelectorTemplate.replaceWith(form)

form.addEventListener('change', (e) => {
const chosenColorScheme = e.target.value
localStorage.setItem('theme', chosenColorScheme)
toggleTheme(chosenColorScheme)
})

const selectedTheme = localStorage.getItem('theme')
if (selectedTheme && selectedTheme !== 'undefined') {
form.querySelector(`[value="${selectedTheme}"]`).checked = true
}
}

const prefersColorSchemeDark = '(prefers-color-scheme: dark)'
window.addEventListener('load', () => {
let hasDarkRules = false
for (const styleSheet of Array.from(document.styleSheets)) {
let mediaRules = []
for (const cssRule of styleSheet.cssRules) {
if (cssRule.type !== CSSRule.MEDIA_RULE) {
continue
}
// WARNING: Safari does not have/support `conditionText`.
if (cssRule.conditionText) {
if (cssRule.conditionText !== prefersColorSchemeDark) {
continue
}
} else {
if (cssRule.cssText.startsWith(prefersColorSchemeDark)) {
continue
}
}
mediaRules = mediaRules.concat(Array.from(cssRule.cssRules))
}

// WARNING: do not try to insert a Rule into a styleSheet you are
// currently iterating on, otherwise the browser will be stuck
// in an infinite loop…
for (const mediaRule of mediaRules) {
styleSheet.insertRule(mediaRule.cssText)
hasDarkRules = true
}
}
if (hasDarkRules) {
loadThemeForm('#theme-selector')
}
})
</script>
</body>
</html>

+ 96
- 0
cache/2024/f4d2d42eba58062be910407690ae447c/index.md View File

@@ -0,0 +1,96 @@
title: The Web Component Success Story
url: https://jakelazaroff.com/words/the-web-component-success-story/
hash_url: f4d2d42eba58062be910407690ae447c
archive_date: 2024-01-31

<p>Tom MacWright wrote a short post wondering <a class="link" href="https://macwright.com/2024/01/24/on-web-components" data-astro-cid-bi7aps5f>why we don’t see prominent applications using web components</a><a class="tooltip" data-tooltip href="https://macwright.com/2024/01/24/on-web-components" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>On Web Components</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://macwright.com/css/favicon.png" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>macwright.com/2024/01/24/on-web-components</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>.</p>
<p>That’s a fair question!
It’s easy to see the success of frameworks like React and Rails: just look at the thousands of websites built with them.
What does the web component success story look like?</p>
<p>Contrary to some people, I don’t see web components on their own as a huge productivity boon for individual websites.
Once you’ve bought into a particular set of technologies, it makes sense to use it for as much as you can.
If you have a React app, you’d be justifiably skeptical of introducing a second way to build components!</p>
<p>Rather, the biggest benefits I see are <em>collective</em>, cutting across the industry as a whole.
I think web components can make the entire web more accessible.
They have the potential to unify currently fragmented communities, including various JavaScript frameworks <em>and</em> those who avoid them.</p>
<p>I know that’s an audacious pitch, but bear with me.</p>
<h3 id="javascript-framework-interop">JavaScript Framework Interop</h3>
<p>Whenever I write about web components, I see pushback from people in the JavaScript community who seem to think that I want to replace JavaScript frameworks.</p>
<p>If you’re in that camp, let me assuage those fears: the web component success story emphatically does <em>not</em> involve rewriting every React app using web components.
As I’ll continue to say, web components and JavaScript frameworks are <em>complementary</em> (as opposed to <em>competing</em>) technologies.
In fact, I think JavaScript framework apps will be one of the most common places in which web components are used!</p>
<p>Does that mean we’ll all start writing web components in addition to React components?
Not at all.
When I say that web components will be used in JavaScript framework apps, I’m talking about third-party libraries.</p>
<p>JavaScript frameworks are tools, all tools have tradeoffs, yada yada, let’s skip the preamble.
I want to talk about one specific weakness of JavaScript frameworks: interoperability, or the lack thereof.
Almost without exception, each framework can only render components written for that framework specifically.</p>
<p>As a result, the JavaScript community tends to fragment itself along framework lines.
Switching frameworks has a high cost, especially when moving to a less popular one; it means leaving most of the third-party ecosystem behind.</p>
<p>That switching cost stunts framework innovation by heavily favoring incumbents with large ecosystems.
It’s hard to create new frameworks, because each one has to start its own ecosystem from scratch.
We keep rebuilding the same set of primitives over and over and over again.</p>
<p>There’s a famous Joel Spolsky blog post about <a class="link" href="https://www.joelonsoftware.com/2002/06/12/strategy-letter-v/" data-astro-cid-bi7aps5f>why capitalistic tech companies contribute to open source</a><a class="tooltip" data-tooltip href="https://www.joelonsoftware.com/2002/06/12/strategy-letter-v/" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://i0.wp.com/www.joelonsoftware.com/wp-content/uploads/2016/12/11969842.jpg?fit=400%2C400&amp;ssl=1" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Strategy Letter V</span> <span class="description" data-astro-cid-bi7aps5f>When I was in college I took two intro economics courses: macroeconomics and microeconomics. Macro was full of theories like “low unemployment causes inflation” that never quite stood u…</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://i0.wp.com/www.joelonsoftware.com/wp-content/uploads/2016/12/11969842.jpg?fit=32%2C32&amp;ssl=1" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>www.joelonsoftware.com/2002/06/12/strategy-letter-v/</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>.
Briefly: every product has <em>substitutes</em> (products that can replace it) and <em>complements</em> (products that can be used alongside it).
The big takeaway is that “<strong>smart companies try to commoditize their products’ complements</strong>”.
In other words, they try to make it so that their own product has a proprietary advantage, while the products used alongside it are all cheap and interchangeable.</p>
<p>Back to JavaScript frameworks.
React and Svelte are substitutes, while React and Radix are complements.
As a library author, the way to commoditize your complement is to make it work with as many frameworks as possible.<sup><a class="link" href="#user-content-fn-reactnative" id="user-content-fnref-reactnative" data-footnote-ref="" aria-describedby="footnote-label" data-astro-cid-bi7aps5f>1</a></sup>
And unlike in Native Land — where people have collectively spent billions of dollars over decades developing write-once run-anywhere environments — the web has one built in.</p>
<p>Maybe you’ve heard of it?
It’s called HTML, and it works with <a class="link" href="https://custom-elements-everywhere.com" data-astro-cid-bi7aps5f>every Javascript framework</a><a class="tooltip" data-tooltip href="https://custom-elements-everywhere.com" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://custom-elements-everywhere.com/images/card.jpg" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Custom Elements Everywhere</span> <span class="description" data-astro-cid-bi7aps5f>Making sure frameworks and custom elements can be BFFs 🍻</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://custom-elements-everywhere.com/images/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>custom-elements-everywhere.com</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>.
For all their warts, the fact that web components get this interoperability for free is a <em>ridiculously powerful advantage</em>, and libraries that don’t exploit it are leaving a lot of potential users on the table.</p>
<p>Here’s a concrete example.
<a class="link" href="https://www.xyflow.com" data-astro-cid-bi7aps5f>xyflow</a><a class="tooltip" data-tooltip href="https://www.xyflow.com" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://xyflow.com/img/og/xyflow.jpg" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Node Based UIs for React and Svelte – xyflow</span> <span class="description" data-astro-cid-bi7aps5f>Powerful open source libraries for building node-based UIs with React or Svelte. Ready out-of-the-box and infinitely customizable.</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://xyflow.com/img/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>www.xyflow.com</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a> is an excellent library for making flow charts.
It was originally called React Flow, but the maintainers renamed it when they added Svelte support.
<a class="link" href="https://www.xyflow.com/blog/why-svelte-flow" data-astro-cid-bi7aps5f>They had to put in a ton of work just to support that <em>one</em> extra framework</a><a class="tooltip" data-tooltip href="https://www.xyflow.com/blog/why-svelte-flow" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://xyflow.com/img/og/xyflow.jpg" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Why Svelte Flow? – xyflow</span> <span class="description" data-astro-cid-bi7aps5f>xyflow - Customizable library for rendering workflows, diagrams and node-based UIs.</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://xyflow.com/img/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>www.xyflow.com/blog/why-svelte-flow</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>!
And if you use Vue, Angular, Solid, Qwik or Ember, you’re still out of luck.</p>
<p>React has enjoyed continued success because it has a moat of fantastic third-party libraries: Radix, React Aria, React Three Fiber, Framer Motion and xyflow, among many others.
Web components have the potential to give us that same ecosystem — but for <em>every</em> framework.</p>
<h3 id="islands-of-interactivity">Islands of Interactivity</h3>
<p>Of course, plenty of websites don’t use JavaScript frameworks.
Hypermedia-centric approaches (read: how websites were built before circa 2010) are making a resurgence, led by libraries such as htmx.</p>
<p>Many websites like this still incorporate highly dynamic elements.
Often, these take the form of rich widgets that are missing from HTML, like menus and combo boxes.
Sometimes they’re even more complicated, like interactive diagrams in articles.
The modern term for these dynamic regions within an otherwise static page is “islands of interactivity”, but the pattern has existed for a long time.</p>
<p>Embedding these islands within the larger page has always been kinda awkward.
The process remains mostly unchanged from the days of jQuery plugins, relying on a complex choreography of HTML classes, CSS selectors and JavaScript function calls.
The bulk of the setup happens in a separate JavaScript file, far away from the HTML where the component will live on the page.</p>
<p>Web components invert that process.
They allow islands to be instantiated in the same way as any other element: by writing a tag name in the page’s markup.
As I wrote in <a class="link" href="https://jakelazaroff.com/words/the-website-vs-web-app-dichotomy-doesnt-exist/" data-astro-cid-bi7aps5f>The Website vs. Web App Dichotomy Doesn’t Exist</a><a class="tooltip" data-tooltip href="https://jakelazaroff.com/words/the-website-vs-web-app-dichotomy-doesnt-exist/" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://jakelazaroff.com/og/the-website-vs-web-app-dichotomy-doesnt-exist.png" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>The Website vs. Web App Dichotomy Doesn't Exist | jakelazaroff.com</span> <span class="description" data-astro-cid-bi7aps5f>A one-dimensional spectrum can't sufficiently capture the tradeoffs involved in web development.</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://jakelazaroff.com/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>jakelazaroff.com/words/the-website-vs-web-app-dichotomy-doesnt-exist/</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>, web components allow developers to declaratively add dynamic behavior to HTML itself.</p>
<p>What makes web components particularly good companions for hypermedia-oriented libraries is the way they interact with other parts of the page.
While JavaScript framework components tend to do so by invoking callback functions, web components instead embrace one of the web’s core idioms: events. <sup><a class="link" href="#user-content-fn-target" id="user-content-fnref-target" data-footnote-ref="" aria-describedby="footnote-label" data-astro-cid-bi7aps5f>2</a></sup>
Indeed, Carson Gross’s essay <a class="link" href="https://htmx.org/essays/hypermedia-friendly-scripting/#events" data-astro-cid-bi7aps5f>Hypermedia-Friendly Scripting</a><a class="tooltip" data-tooltip href="https://htmx.org/essays/hypermedia-friendly-scripting/#events" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>&lt;/&gt; htmx ~ Hypermedia-Friendly Scripting</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://htmx.org/favicon.ico#events" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>htmx.org/essays/hypermedia-friendly-scripting/#events</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a> neatly outlines this use case:</p>
<blockquote>
<p>A JavaScript-based component that triggers events allows for hypermedia-oriented JavaScript libraries, such as htmx, to listen for those events and trigger hypermedia exchanges. This, in turn, makes any JavaScript library a potential hypermedia control, able to drive the Hypermedia-Driven Application via user-selected actions.</p>
</blockquote>
<p>As an example, here’s a TIL I wrote on <a class="link" href="https://til.jakelazaroff.com/htmx/load-modal-content-when-shoelace-dialog-opens/" data-astro-cid-bi7aps5f>using htmx and the Shoelace web component library to load the content of a dialog when it opens</a><a class="tooltip" data-tooltip href="https://til.jakelazaroff.com/htmx/load-modal-content-when-shoelace-dialog-opens/" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://til.jakelazaroff.com/og/htmx/load-modal-content-when-shoelace-dialog-opens.png" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>[htmx] Load modal content when a Shoelace dialog opens | Today I Learned</span> <span class="description" data-astro-cid-bi7aps5f>A collection of useful things I've learned.</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://til.jakelazaroff.com/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>til.jakelazaroff.com/htmx/load-modal-content-when-shoelace-dialog-opens/</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>.
Notice how the whole process — from instantiating the dialog component, to requesting the content when the modal opens, to inserting it into the appropriate place in the DOM — is controlled declaratively via markup.</p>
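<p>The shape of that setup is roughly the following (a paraphrase from memory rather than a copy of the TIL, with an invented <code>/fragments/settings</code> URL — the linked post has the authoritative attribute and event names):</p>
<pre><code>&lt;!-- the dialog's own "sl-show" event drives the htmx request --&gt;
&lt;sl-dialog label="Settings" id="settings"
           hx-get="/fragments/settings"
           hx-trigger="sl-show"
           hx-target="find .body"&gt;
  &lt;div class="body"&gt;Loading…&lt;/div&gt;
&lt;/sl-dialog&gt;

&lt;button onclick="document.getElementById('settings').show()"&gt;Open settings&lt;/button&gt;
</code></pre>
<p>When the dialog emits <code>sl-show</code>, htmx issues the GET request and swaps the response into the dialog body — no imperative glue code required.</p>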
<p>There are also <a class="link" href="https://www.zachleat.com/web/a-taxonomy-of-web-component-types/#html-web-components" data-astro-cid-bi7aps5f>HTML web components</a><a class="tooltip" data-tooltip href="https://www.zachleat.com/web/a-taxonomy-of-web-component-types/#html-web-components" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://v1.screenshot.11ty.dev/https%3A%2F%2Fwww.zachleat.com%2Fopengraph%2Fweb%2Fa-taxonomy-of-web-component-types%2F/opengraph/_x202401_0/" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>An Attempted Taxonomy of Web Components—zachleat.com</span> <span class="description" data-astro-cid-bi7aps5f>A post by Zach Leatherman (zachleat)</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://www.zachleat.com/img/rel-icon-192.jpg#html-web-components" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>www.zachleat.com/web/a-taxonomy-of-web-component-types/#html-web-components</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>, which work by progressively enhancing existing markup rather than by rendering new DOM elements.
Colocating logic in this way, sometimes called <a class="link" href="https://htmx.org/essays/locality-of-behaviour/" data-astro-cid-bi7aps5f>locality of behavior</a><a class="tooltip" data-tooltip href="https://htmx.org/essays/locality-of-behaviour/" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>&lt;/&gt; htmx ~ Locality of Behaviour (LoB)</span> <span class="description" data-astro-cid-bi7aps5f>htmx gives you access to AJAX, CSS Transitions, WebSockets and Server Sent Events directly in HTML, using attributes, so you can build modern user interfaces with the simplicity and power of hypertext

htmx is small (~14k min.gz’d), dependency-free, extendable, IE11 compatible &amp; has reduced code base sizes by 67% when compared with react</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://htmx.org/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>htmx.org/essays/locality-of-behaviour/</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>, is a different lens on <a class="link" href="https://speakerdeck.com/didoo/let-there-be-peace-on-css?slide=62" data-astro-cid-bi7aps5f>a concept with which JavaScript developers should already be familiar</a><a class="tooltip" data-tooltip href="https://speakerdeck.com/didoo/let-there-be-peace-on-css?slide=62" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://files.speakerdeck.com/presentations/ecd310041a6841e0b4680dd85771a6fb/slide_61.jpg?8848622" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Let There Be Peace On CSS</span> <span class="description" data-astro-cid-bi7aps5f>In the last few months there's been a growing friction between those who see CSS as an untouchable layer in the "separation of concerns" paradigm, and those who have simply ignored this golden rule and found different ways to style the UI (typically applying CSS styles via JavaScript).

This debate is getting more and more intense every day, bringing division in a community that used to be immune to this kind of “wars”.

This talk is my attempt to bring peace between the two fronts. To help these two opposite factions to understand and listen to each other, see the counterpart's points of views. To find the good things they have in common, and learn something from that.

## This talk has been presented at London CSS Meetup + Design Exchange Nottingham (DXN) + Front End London (FEL) + State of The Browser ##

Video of the talk: https://www.youtube.com/watch?v=bb_kb6Q2Kdc</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://d1eu30co0ohy4w.cloudfront.net/assets/favicon-bdd5839d46040a50edf189174e6f7aacc8abb3aaecd56a4711cf00d820883f47.png" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>speakerdeck.com/didoo/let-there-be-peace-on-css?slide=62</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>.</p>
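<p>A minimal sketch of an HTML web component (the tag name is invented for illustration): the markup inside it works on its own, and the element only layers behavior on top once JavaScript runs.</p>
<pre><code>&lt;copy-button&gt;
  &lt;pre&gt;&lt;code&gt;npm install htmx.org&lt;/code&gt;&lt;/pre&gt;
&lt;/copy-button&gt;

&lt;script type="module"&gt;
  // progressive enhancement: the snippet above renders fine without this;
  // the wrapper only adds a "Copy" button once the element is defined
  customElements.define("copy-button", class extends HTMLElement {
    connectedCallback() {
      const button = document.createElement("button");
      button.textContent = "Copy";
      button.addEventListener("click", () =&gt;
        navigator.clipboard.writeText(this.querySelector("code").textContent)
      );
      this.append(button);
    }
  });
&lt;/script&gt;
</code></pre>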
<h3 id="no-more-silos">No More Silos</h3>
<p>These sound like separate problems, but they’re actually two sides of the same coin.
With web components, the library that works in every JavaScript framework <em>also</em> works as an island of interactivity on a static webpage.
Even HTML web components fit into both niches.</p>
<p>Brad Frost has called for <a class="link" href="https://bradfrost.com/blog/post/a-global-design-system/" data-astro-cid-bi7aps5f>a global design system</a><a class="tooltip" data-tooltip href="https://bradfrost.com/blog/post/a-global-design-system/" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://bradfrost.com/wp-content/uploads/2023/11/CleanShot-2023-11-02-at-13.53.24.png" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>A Global Design System</span> <span class="description" data-astro-cid-bi7aps5f>TL;DR: This is a call to action to create a Global Design System that provides the world's web designers &amp; developers a library of common UI components. A Global Design System would improve the quality and accessibility of the world's web experiences, save the world's web designers and developer</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://bradfrost.com/favicon.ico" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>bradfrost.com/blog/post/a-global-design-system/</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>: “a common library containing common UI components currently found in most design systems”.
The proposal is to create a cohesive, unstyled, accessible and internationalizable set of components — like <a class="link" href="https://www.radix-ui.com/primitives" data-astro-cid-bi7aps5f>Radix</a><a class="tooltip" data-tooltip href="https://www.radix-ui.com/primitives" data-astro-cid-bi7aps5f> <img class="thumbnail" src="https://radix-ui.com/social/primitives.png" alt="" data-astro-cid-bi7aps5f> <span class="title" data-astro-cid-bi7aps5f>Radix Primitives</span> <span class="description" data-astro-cid-bi7aps5f>Unstyled, accessible, open source React primitives for high-quality web apps and design systems.</span> <span class="href" data-astro-cid-bi7aps5f> <img class="favicon" src="https://www.radix-ui.com/favicon.png" alt="" data-astro-cid-bi7aps5f> <span class="url" data-astro-cid-bi7aps5f>www.radix-ui.com/primitives</span> <svg xmlns="http://www.w3.org/2000/svg" class="arrow"> <use href="/icons.svg#share"></use> </svg> </span> </a>, but for the entire web rather than for a single JavaScript framework.
It’s an ambitious goal, and from where I stand web components are by far the best way to achieve it.</p>
<p>Web components won’t take web development by storm, or show us the One True Way to build websites.
They don’t need to dethrone JavaScript frameworks.
We probably won’t even all learn how to write them!</p>
<p>What web components <em>will</em> do — at least, I hope — is let us collectively build a rich ecosystem of dynamic components that work with any web stack.
No more silos.
That’s the web component success story.</p>

+ 8
- 0
cache/2024/index.html View File

@@ -72,6 +72,8 @@
<li><a href="/david/cache/2024/877ad04fd329c26c80113e15dec540df/" title="Accès à l’article dans le cache local : The Walk and Talk: Everything We Know">The Walk and Talk: Everything We Know</a> (<a href="https://craigmod.com/ridgeline/176/" title="Accès à l’article original distant : The Walk and Talk: Everything We Know">original</a>)</li>
<li><a href="/david/cache/2024/1d60fc5548a6fe61da80a4e16892fa0c/" title="Accès à l’article dans le cache local : Deep Democracy - IAPOP">Deep Democracy - IAPOP</a> (<a href="https://iapop.com/deep-democracy/" title="Accès à l’article original distant : Deep Democracy - IAPOP">original</a>)</li>
<li><a href="/david/cache/2024/fd6eda56671045e0c1e2d215e07f1a6f/" title="Accès à l’article dans le cache local : EffVer: Version your code by the effort required to upgrade">EffVer: Version your code by the effort required to upgrade</a> (<a href="https://jacobtomlinson.dev/effver/" title="Accès à l’article original distant : EffVer: Version your code by the effort required to upgrade">original</a>)</li>
<li><a href="/david/cache/2024/d236f33cf82727313d17cb23bf36a395/" title="Accès à l’article dans le cache local : Reconsider your partnership with Brave">Reconsider your partnership with Brave</a> (<a href="https://kagifeedback.org/d/2808-reconsider-your-partnership-with-brave/6" title="Accès à l’article original distant : Reconsider your partnership with Brave">original</a>)</li>
@@ -150,6 +152,8 @@
<li><a href="/david/cache/2024/82b88d48d8043d79425ce8afd8dff42e/" title="Accès à l’article dans le cache local : Echoing Wirth's plea for lean software">Echoing Wirth's plea for lean software</a> (<a href="https://blog.testdouble.com/posts/2024-01-24-plea-for-lean/" title="Accès à l’article original distant : Echoing Wirth's plea for lean software">original</a>)</li>
<li><a href="/david/cache/2024/f4d2d42eba58062be910407690ae447c/" title="Accès à l’article dans le cache local : The Web Component Success Story">The Web Component Success Story</a> (<a href="https://jakelazaroff.com/words/the-web-component-success-story/" title="Accès à l’article original distant : The Web Component Success Story">original</a>)</li>
<li><a href="/david/cache/2024/ea2cfc9aa425a6967d2cacd9f96ceb9e/" title="Accès à l’article dans le cache local : Ask LukeW: New Ways into Web Content">Ask LukeW: New Ways into Web Content</a> (<a href="https://lukew.com/ff/entry.asp?2008" title="Accès à l’article original distant : Ask LukeW: New Ways into Web Content">original</a>)</li>
<li><a href="/david/cache/2024/4a56aa5497e68df0c5bb1d5331203219/" title="Accès à l’article dans le cache local : When “Everything” Becomes Too Much: The npm Package Chaos of 2024">When “Everything” Becomes Too Much: The npm Package Chaos of 2024</a> (<a href="https://socket.dev/blog/when-everything-becomes-too-much" title="Accès à l’article original distant : When “Everything” Becomes Too Much: The npm Package Chaos of 2024">original</a>)</li>
@@ -160,6 +164,8 @@
<li><a href="/david/cache/2024/284205d0f99390dd18d3af12ff53227c/" title="Accès à l’article dans le cache local : Redeployment Part Two">Redeployment Part Two</a> (<a href="https://brr.fyi/posts/redeployment-part-two" title="Accès à l’article original distant : Redeployment Part Two">original</a>)</li>
<li><a href="/david/cache/2024/0676c7ccf1ab2b380641866789366d26/" title="Accès à l’article dans le cache local : The Performance Inequality Gap, 2024">The Performance Inequality Gap, 2024</a> (<a href="https://infrequently.org/2024/01/performance-inequality-gap-2024/" title="Accès à l’article original distant : The Performance Inequality Gap, 2024">original</a>)</li>
<li><a href="/david/cache/2024/e8748af541273328d9aa9f1aeb1087b2/" title="Accès à l’article dans le cache local : Redeployment Part Three">Redeployment Part Three</a> (<a href="https://brr.fyi/posts/redeployment-part-three" title="Accès à l’article original distant : Redeployment Part Three">original</a>)</li>
<li><a href="/david/cache/2024/55477786fc56b6fc37bb97231b634d90/" title="Accès à l’article dans le cache local : Fabrique : concept">Fabrique : concept</a> (<a href="https://www.quaternum.net/2023/06/02/fabrique-concept/" title="Accès à l’article original distant : Fabrique : concept">original</a>)</li>
@@ -172,6 +178,8 @@
<li><a href="/david/cache/2024/09c0739036ea4a8b6c985e127fe7e3c8/" title="Accès à l’article dans le cache local : ☕️ Journal : Carnets">☕️ Journal : Carnets</a> (<a href="https://thom4.net/2023/02/01/carnets/" title="Accès à l’article original distant : ☕️ Journal : Carnets">original</a>)</li>
<li><a href="/david/cache/2024/cd9184008ba5d9e4c9be4d0a0eea4f60/" title="Accès à l’article dans le cache local : Daring Fireball: The Vision Pro">Daring Fireball: The Vision Pro</a> (<a href="https://daringfireball.net/2024/01/the_vision_pro" title="Accès à l’article original distant : Daring Fireball: The Vision Pro">original</a>)</li>
<li><a href="/david/cache/2024/076169df8a4bd9dde9a4637c6b306dff/" title="Accès à l’article dans le cache local : Ma page /now (ou plutôt /en-ce-moment)">Ma page /now (ou plutôt /en-ce-moment)</a> (<a href="https://blog.professeurjoachim.com/billet/2024-01-05-ma-page-now-ou-plutot-en-ce-moment" title="Accès à l’article original distant : Ma page /now (ou plutôt /en-ce-moment)">original</a>)</li>
<li><a href="/david/cache/2024/9b4b5364526390ba1db9c4a651ea8311/" title="Accès à l’article dans le cache local : Teaming is hard because you’re probably not really on a team">Teaming is hard because you’re probably not really on a team</a> (<a href="https://www.strategy-business.com/article/Teaming-is-hard-because-youre-probably-not-really-on-a-team" title="Accès à l’article original distant : Teaming is hard because you’re probably not really on a team">original</a>)</li>
