A place to cache linked articles (think custom and personal wayback machine)

index.html 47KB

  1. <!doctype html><!-- This is a valid HTML5 document. -->
  2. <!-- Screen readers, SEO, extensions and so on. -->
  3. <html lang="en">
  4. <!-- Has to be within the first 1024 bytes, hence before the <title>
  5. See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
  6. <meta charset="utf-8">
  7. <!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
  8. <!-- The viewport meta is quite crowded and we are responsible for that.
  9. See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
  10. <meta name="viewport" content="width=device-width,initial-scale=1">
  11. <!-- Required to make a valid HTML5 document. -->
  12. <title>You Are Now Remotely Controlled (archive) — David Larlet</title>
  13. <!-- Generated from https://realfavicongenerator.net/ such a mess. -->
  14. <link rel="apple-touch-icon" sizes="180x180" href="/static/david/icons2/apple-touch-icon.png">
  15. <link rel="icon" type="image/png" sizes="32x32" href="/static/david/icons2/favicon-32x32.png">
  16. <link rel="icon" type="image/png" sizes="16x16" href="/static/david/icons2/favicon-16x16.png">
  17. <link rel="manifest" href="/static/david/icons2/site.webmanifest">
  18. <link rel="mask-icon" href="/static/david/icons2/safari-pinned-tab.svg" color="#07486c">
  19. <link rel="shortcut icon" href="/static/david/icons2/favicon.ico">
  20. <meta name="msapplication-TileColor" content="#f0f0ea">
  21. <meta name="msapplication-config" content="/static/david/icons2/browserconfig.xml">
  22. <meta name="theme-color" content="#f0f0ea">
  23. <!-- Thank you Florens! -->
  24. <link rel="stylesheet" href="/static/david/css/style_2020-01-24.css">
  25. <!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
  26. <link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" crossorigin>
  27. <link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" crossorigin>
  28. <link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" crossorigin>
  29. <meta name="robots" content="noindex, nofollow">
  30. <meta content="origin-when-cross-origin" name="referrer">
  31. <!-- Canonical URL for SEO purposes -->
  32. <link rel="canonical" href="https://www.nytimes.com/2020/01/24/opinion/sunday/surveillance-capitalism.html">
  33. <body class="remarkdown h1-underline h2-underline hr-center ul-star pre-tick">
  34. <article>
  35. <h1>You Are Now Remotely Controlled</h1>
  36. <h2><a href="https://www.nytimes.com/2020/01/24/opinion/sunday/surveillance-capitalism.html">Original source of the content</a></h2>
  37. <div class="css-53u6y8"><p class="css-exrw3m evys1bk0">The debate on privacy and law at the Federal Trade Commission was unusually heated that day. Tech industry executives “argued that they were capable of regulating themselves and that government intervention would be costly and counterproductive.” Civil libertarians warned that the companies’ data capabilities posed “an unprecedented threat to individual freedom.” One observed, “We have to decide what human beings are in the electronic age. Are we just going to be chattel for commerce?” A commissioner asked, “Where should we draw the line?” The year was <a class="css-1g7m0tk" href="https://www.nytimes.com/1997/06/11/us/ftc-opens-hearings-on-computers-threat-to-privacy-and-liberty.html?searchResultPosition=34" title="">1997</a><em class="css-2fg4z9 e1gzwzxm0">.</em></p>
  38. <p class="css-exrw3m evys1bk0">The line was never drawn, and the executives got their way. Twenty-three years later the evidence is in. The fruit of that victory was a new economic logic that I call “surveillance capitalism.” Its success depends upon one-way-mirror operations engineered for our ignorance and wrapped in a fog of misdirection, euphemism and mendacity. It rooted and flourished in the new spaces of the internet, once celebrated by surveillance capitalists as “<a class="css-1g7m0tk" href="https://www.pbs.org/newshour/science/in-new-digital-age-google-leaders-see-more-possibilities-to-connect-the-worlds-7-billion" title="" rel="noopener noreferrer" target="_blank">the world’s largest ungoverned space.</a>” But power fills a void, and those once wild spaces are no longer ungoverned. Instead, they are owned and operated by private surveillance capital and governed by its iron laws.</p>
  39. <p class="css-exrw3m evys1bk0">The rise of surveillance capitalism over the last two decades went largely unchallenged. “Digital” was fast, we were told, and stragglers would be left behind. It’s not surprising that so many of us rushed to follow the bustling White Rabbit down his tunnel into a promised digital Wonderland where, like Alice, we fell prey to delusion. In Wonderland, we celebrated the new digital services as free, but now we see that the surveillance capitalists behind those services regard us as the free commodity. We thought that we search Google, but now we understand that Google searches us. We assumed that we use social media to connect, but<strong class="css-8qgvsz ebyp5n10"> </strong>we learned that connection is how social media uses us. We barely questioned why our new TV or mattress had a privacy policy, but we’ve begun to understand that “privacy” policies are actually surveillance policies.</p>
  40. <p class="css-exrw3m evys1bk0">And like our forebears who named the automobile “horseless carriage” because they could not reckon with its true dimension, we regarded the internet platforms as “bulletin boards” where anyone could pin a note. Congress cemented this delusion in a statute, <a class="css-1g7m0tk" href="https://www.law.cornell.edu/uscode/text/47/230" title="" rel="noopener noreferrer" target="_blank">Section 230</a> of the 1996 Communications Decency Act, absolving those companies of the obligations that adhere to “publishers” or even to “speakers.”</p></div>
  41. <aside class="css-ew4tgv" aria-label="companion column"></aside>
  42. </div><div class="css-1fanzo5 StoryBodyCompanionColumn"><div class="css-53u6y8"><p class="css-exrw3m evys1bk0">Only repeated crises have taught us that these platforms are not bulletin boards but hyper-velocity global bloodstreams into which anyone may introduce a dangerous virus without a vaccine. This is how Facebook’s chief executive, Mark Zuckerberg, could legally <a class="css-1g7m0tk" href="https://www.washingtonpost.com/technology/2019/05/24/facebook-acknowledges-pelosi-video-is-faked-declines-delete-it/" title="" rel="noopener noreferrer" target="_blank">refuse</a> to remove a faked video of Speaker of the House Nancy Pelosi and later <a class="css-1g7m0tk" href="https://apnews.com/90e5e81f501346f8779cb2f8b8880d9c" title="" rel="noopener noreferrer" target="_blank">double down</a> on this decision, announcing that political advertising would not be subject to fact-checking.</p>
  43. <p class="css-exrw3m evys1bk0">All of these delusions rest on the most treacherous hallucination of them all: the belief that privacy is private. We have imagined that we can choose our degree of privacy with an individual calculation in which a bit of personal information is traded for valued services — a reasonable quid pro quo.<strong class="css-8qgvsz ebyp5n10"> </strong>For example, when Delta Air Lines piloted a biometric data system at the Atlanta airport, the company <a class="css-1g7m0tk" href="https://www.usatoday.com/story/travel/flights/todayinthesky/2018/11/29/delta-usas-first-biometric-terminal-ready-go-atlanta-airport/2145655002/" title="" rel="noopener noreferrer" target="_blank">reported</a> that of nearly 25,000 customers who traveled there each week, 98 percent opted into the process, noting that “the facial recognition option is saving an average of two seconds for each customer at boarding, or nine minutes when boarding a wide body aircraft.”</p>
  44. <p class="css-exrw3m evys1bk0">In fact the rapid development of facial recognition systems reveals the public consequences of this supposedly private choice. Surveillance capitalists have demanded the right to take our faces wherever they appear — on a city street or a Facebook page. The Financial Times <a class="css-1g7m0tk" href="https://www.ft.com/content/7d3e0d6a-87a0-11e9-a028-86cea8523dc2" title="" rel="noopener noreferrer" target="_blank">reported</a> that a Microsoft facial recognition training database of 10 million images plucked from the internet without anyone’s knowledge and supposedly limited to academic research was employed by companies like IBM and state agencies that included the United States and Chinese military. Among these were two Chinese suppliers of equipment to officials in Xinjiang, where members of the Uighur community live in open-air prisons under perpetual surveillance by facial recognition systems.</p>
  45. <p class="css-exrw3m evys1bk0">Privacy is not private, because the effectiveness of <a class="css-1g7m0tk" href="https://www.ft.com/content/cf19b956-60a2-11e9-b285-3acd5d43599e" title="" rel="noopener noreferrer" target="_blank">these</a> and <a class="css-1g7m0tk" href="https://carnegieendowment.org/2019/09/17/global-expansion-of-ai-surveillance-pub-79847" title="" rel="noopener noreferrer" target="_blank">other</a> private or public surveillance and control systems depends upon the pieces of ourselves that we give up — or that are secretly stolen from us.</p>
  46. <p class="css-exrw3m evys1bk0">Our digital century was to have been democracy’s Golden Age. Instead, we enter its third decade marked by a stark new form of social inequality best understood as “epistemic inequality.” It recalls a pre-Gutenberg era of extreme asymmetries of knowledge and the power that accrues to such knowledge, as the tech giants seize control of information and learning itself. The delusion of “privacy as private” was crafted to breed and feed this unanticipated social divide. Surveillance capitalists exploit the widening inequity of knowledge for the sake of profits. They manipulate the economy, our society and even our lives with impunity, endangering not just individual privacy but democracy itself. Distracted by our delusions, we failed to notice this bloodless coup from above.</p></div><aside class="css-ew4tgv" aria-label="companion column"></aside></div><div class="css-1fanzo5 StoryBodyCompanionColumn"><div class="css-53u6y8"><p class="css-exrw3m evys1bk0">The belief that privacy is private has left us careening toward a future that we did not choose, because it failed to reckon with the profound distinction between a society that insists upon sovereign individual rights and one that lives by the social relations of the one-way mirror. The lesson is that <em class="css-2fg4z9 e1gzwzxm0">privacy is public</em> — it is a collective good that is logically and morally inseparable from the values of human autonomy and self-determination upon which privacy depends and without which a democratic society is unimaginable.</p>
  47. <p class="css-exrw3m evys1bk0">Still, the winds appear to have finally shifted. A fragile new awareness is dawning as we claw our way back up the rabbit hole toward home. Surveillance capitalists are fast because they seek neither genuine consent nor consensus. They rely on psychic numbing and messages of inevitability to conjure the helplessness, resignation and confusion that paralyze their prey. Democracy is slow, and that’s a good thing. Its pace reflects the tens of millions of conversations that occur in families, among neighbors, co-workers and friends, within communities, cities and states, gradually stirring the sleeping giant of democracy to action.</p><p class="css-exrw3m evys1bk0">These conversations are occurring now, and there are many indications that lawmakers are ready to join and to lead. This third decade is likely to decide our fate. Will we make the digital future better, or will it make us worse? Will it be a place that we can call home?</p>
  48. <p class="css-exrw3m evys1bk0">Epistemic inequality is not based on what we can earn but rather on what we can learn. It is defined as unequal access to learning imposed by private commercial mechanisms of information capture, production, analysis and sales. It is best exemplified in the fast-growing abyss between what we know and what is known about us.</p>
  49. <p class="css-exrw3m evys1bk0">Twentieth-century industrial society was organized around the “division of labor,” and it followed that the struggle for economic equality would shape the politics of that time. Our digital century shifts society’s coordinates from a division of labor to a “division of learning,” and it follows that the struggle over access to knowledge and the power conferred by such knowledge will shape the politics of<em class="css-2fg4z9 e1gzwzxm0"> </em>our time.</p>
  50. <p class="css-exrw3m evys1bk0">The new centrality of epistemic inequality signals a power shift from the ownership of the means of production, which defined the politics of the 20th century, to the ownership of the production of meaning. The challenges of epistemic justice and epistemic rights in this new era are summarized in three essential questions about knowledge, authority and power: Who knows? Who decides who knows? Who decides who decides who knows?</p>
  51. <p class="css-exrw3m evys1bk0">During the last two decades, the leading surveillance capitalists — Google, later followed by Facebook, Amazon and Microsoft — helped to drive this societal transformation while simultaneously ensuring their ascendance to the pinnacle of the epistemic hierarchy. They operated in the shadows to amass huge knowledge monopolies by taking without asking, a maneuver that every child recognizes as theft. Surveillance capitalism begins by unilaterally staking a claim to private human experience as free raw material for translation into behavioral data. Our lives are rendered as data flows.</p></div><aside class="css-ew4tgv" aria-label="companion column"></aside></div><div class="css-1fanzo5 StoryBodyCompanionColumn"><div class="css-53u6y8"><p class="css-exrw3m evys1bk0">Early on, it was discovered that, unknown to users, even data freely given harbors rich predictive signals, a surplus that is more than what is required for service improvement. It isn’t only what you post online, but whether you use exclamation points or the color saturation of your photos; not just where you walk but the stoop of your shoulders; not just the identity of your face but the emotional states conveyed by your “microexpressions”; not just what you like but the pattern of likes across engagements. Soon this behavioral surplus was secretly hunted and captured, claimed as proprietary data.</p>
  52. <p class="css-exrw3m evys1bk0">The data are conveyed through complex supply chains of devices, tracking and monitoring software, and<strong class="css-8qgvsz ebyp5n10"> </strong><a class="css-1g7m0tk" href="https://www.wsj.com/articles/you-give-apps-sensitive-personal-information-then-they-tell-facebook-11550851636" title="" rel="noopener noreferrer" target="_blank">ecosystems of apps</a> and <a class="css-1g7m0tk" href="https://fil.forbrukerradet.no/wp-content/uploads/2020/01/2020-01-14-out-of-control-final-version.pdf" title="" rel="noopener noreferrer" target="_blank">companies</a> that specialize in niche data flows captured in secret. For example, <a class="css-1g7m0tk" href="https://www.wsj.com/articles/you-give-apps-sensitive-personal-information-then-they-tell-facebook-11550851636" title="" rel="noopener noreferrer" target="_blank">testing by The Wall Street Journal showed</a> that Facebook receives heart rate data from the Instant Heart Rate: HR Monitor, menstrual cycle data from the Flo Period &amp; Ovulation Tracker, and data that reveal interest in real estate properties from Realtor.com — all of it without the user’s knowledge.</p>
  53. <p class="css-exrw3m evys1bk0">These data flows empty into surveillance capitalists’ computational factories, called “artificial intelligence,” where they are manufactured into behavioral predictions that are about us<em class="css-2fg4z9 e1gzwzxm0">,</em> but they are not <em class="css-2fg4z9 e1gzwzxm0">for us</em>. Instead, they are sold to business customers in a new kind of market that trades exclusively in human futures. Certainty in human affairs is the lifeblood of these markets, where surveillance capitalists compete on the quality of their predictions. This is a new form of trade that birthed some of the richest and most powerful companies in history.</p></div><aside class="css-ew4tgv" aria-label="companion column"></aside></div><figure aria-label="media" role="group"><img src="https://static01.nyt.com/images/2020/01/26/opinion/sunday/26zuboff2/26zuboff2-articleLarge.jpg?quality=75&amp;auto=webp&amp;disable=upscale" alt=""><figcaption>Credit: Illustration by Erik Carter; Photograph by Getty Images</figcaption></figure><div class="css-1fanzo5 StoryBodyCompanionColumn"><div class="css-53u6y8"><p class="css-exrw3m evys1bk0">In order to achieve their objectives, the leading surveillance capitalists sought to establish <a class="css-1g7m0tk" href="http://www.martinhilbert.net/Hilbert_Significance_pre-publish.pdf" title="" rel="noopener noreferrer" target="_blank">unrivaled dominance</a> over the <a class="css-1g7m0tk" href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2205145" title="" rel="noopener noreferrer" target="_blank">99.9 percent</a> of the world’s information now rendered in digital formats that they helped to create. Surveillance capital has built most of the world’s <a class="css-1g7m0tk" href="https://data-economy.com/hyperscalers-taking-world-unprecedented-scale/" title="" rel="noopener noreferrer" target="_blank">largest computer networks</a>, data centers, populations of servers, undersea transmission cables, <a class="css-1g7m0tk" href="https://www.wired.com/2017/04/building-ai-chip-saved-google-building-dozen-new-data-centers/" title="" rel="noopener noreferrer" target="_blank">advanced microchips</a>, and frontier machine intelligence, igniting <a class="css-1g7m0tk" href="https://www.nytimes.com/2017/10/22/technology/artificial-intelligence-experts-salaries.html" title="">an arms race for the 10,000</a> or so specialists on the planet who know how to coax knowledge from these vast new data continents.</p>
  54. <p class="css-exrw3m evys1bk0">With Google in the lead, the top surveillance capitalists seek to control labor markets in critical expertise, including data science and <a class="css-1g7m0tk" href="https://www.bloomberg.com/news/features/2019-06-18/apple-google-and-facebook-are-raiding-animal-research-labs" title="" rel="noopener noreferrer" target="_blank">animal research</a>, elbowing out competitors such as start-ups, universities, high schools, municipalities, established corporations in other industries and less wealthy countries. In 2016, 57 percent of American computer science Ph.D. graduates took jobs in industry, while only 11 percent became tenure-track faculty members. It’s not just an American problem. In Britain, university administrators <a class="css-1g7m0tk" href="https://www.theguardian.com/science/2017/nov/02/big-tech-firms-google-ai-hiring-frenzy-brain-drain-uk-universities" title="" rel="noopener noreferrer" target="_blank">contemplate</a> a “missing generation” of data scientists. A Canadian scientist laments, “the power, the expertise, the data are all concentrated in the hands of a few companies.”</p>
  55. <p class="css-exrw3m evys1bk0">Google created the first insanely lucrative markets to trade in human futures, what we now know as online targeted advertising, based on their predictions of which ads users would click. Between 2000, when the new economic logic was just emerging, and 2004, when the company went public, revenues increased by 3,590 percent. This startling number represents the “surveillance dividend.” It quickly reset the bar for investors, eventually driving start-ups, apps developers and established companies to shift their business models toward surveillance capitalism.<strong class="css-8qgvsz ebyp5n10"> </strong>The promise of a fast track to outsized revenues from selling human futures drove this migration first to Facebook, then through the tech sector and now throughout the rest of the economy to industries as disparate as insurance, retail, finance, education, health care, real estate, entertainment and every product that begins with the word “smart” or service touted as “personalized.”</p></div><aside class="css-ew4tgv" aria-label="companion column"></aside></div><div class="css-1fanzo5 StoryBodyCompanionColumn"><div class="css-53u6y8"><p class="css-exrw3m evys1bk0">Even Ford, the birthplace of the 20th-century mass production economy, is on the trail of the surveillance dividend, proposing to meet the challenge of slumping car sales by reimagining Ford vehicles as a “<a class="css-1g7m0tk" href="http://freakonomics.com/podcast/ford/" title="" rel="noopener noreferrer" target="_blank">transportation operating system</a>.” <a class="css-1g7m0tk" href="https://eu.freep.com/story/money/cars/2018/11/13/ford-motor-credit-data-new-revenue/1967077002/" title="" rel="noopener noreferrer" target="_blank">As one analyst put it,</a> Ford “could make a fortune monetizing data. They won’t need engineers, factories or dealers to do it. It’s almost pure profit.”</p>
  56. <p class="css-exrw3m evys1bk0">Surveillance capitalism’s economic imperatives were refined in the competition to sell certainty. Early on it was clear that machine intelligence must feed on volumes of data, compelling economies of scale in data extraction. Eventually it was understood that volume is necessary but not sufficient. The best algorithms also require varieties of data — economies of scope. This realization helped drive the “mobile revolution” sending users into the real world armed with cameras, computers, gyroscopes and microphones packed inside their smart new phones. In the competition for scope, surveillance capitalists want your <a class="css-1g7m0tk" href="https://moniotrlab.ccis.neu.edu/wp-content/uploads/2019/09/ren-imc19.pdf" title="" rel="noopener noreferrer" target="_blank">home</a> and what you say and do within its walls. They want your <a class="css-1g7m0tk" href="https://www.washingtonpost.com/technology/2019/12/17/what-does-your-car-know-about-you-we-hacked-chevy-find-out/" title="" rel="noopener noreferrer" target="_blank">car,</a> your <a class="css-1g7m0tk" href="https://www.nytimes.com/2019/11/11/business/google-ascension-health-data.html" title="">medical conditions</a>, and the <a class="css-1g7m0tk" href="https://tv-watches-you.princeton.edu/tv-tracking-acm-ccs19.pdf" title="" rel="noopener noreferrer" target="_blank">shows</a> you stream; your <a class="css-1g7m0tk" href="https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html" title="">location</a> as well as all the streets and buildings in your path and all the behavior of all the people in your <a class="css-1g7m0tk" href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3390610" title="" rel="noopener noreferrer" target="_blank">city</a>. 
They want your <a class="css-1g7m0tk" href="https://arstechnica.com/gadgets/2019/02/googles-nest-security-system-shipped-with-a-secret-microphone/" title="" rel="noopener noreferrer" target="_blank">voice</a> and what you <a class="css-1g7m0tk" href="https://www.winsightgrocerybusiness.com/retailers/real-time-insights-amazon-prime-whole-foods-integration" title="" rel="noopener noreferrer" target="_blank">eat</a> and what you <a class="css-1g7m0tk" href="https://www.fastcompany.com/90349518/google-keeps-an-eye-on-what-you-buy-and-its-not-alone" title="" rel="noopener noreferrer" target="_blank">buy</a>; your children’s <a class="css-1g7m0tk" href="https://www.cnet.com/news/parents-told-to-destroy-connected-dolls-over-hacking-fears/" title="" rel="noopener noreferrer" target="_blank">play</a> time and their <a class="css-1g7m0tk" href="https://www.wsj.com/articles/one-parent-is-on-a-mission-to-protect-children-from-digital-mistakes-11562762000" title="" rel="noopener noreferrer" target="_blank">schooling</a>; your <a class="css-1g7m0tk" href="https://www.vox.com/2019/8/30/20835137/facebook-zuckerberg-elon-musk-brain-mind-reading-neuroethics" title="" rel="noopener noreferrer" target="_blank">brain waves</a> and your <a class="css-1g7m0tk" href="https://slate.com/technology/2019/09/social-determinants-health-facebook-google.html" title="" rel="noopener noreferrer" target="_blank">bloodstream</a>. <a class="css-1g7m0tk" href="https://www.seattletimes.com/business/amazon/amazon-rolls-out-new-devices-amid-swirl-of-privacy-questions/" title="" rel="noopener noreferrer" target="_blank">Nothing is exempt.</a></p>
  57. <p class="css-exrw3m evys1bk0">Unequal knowledge about us produces unequal power over us, and so epistemic inequality widens to include the distance between what we can do and what can be done to us. Data scientists describe this as the shift from monitoring to actuation, in which a critical mass of knowledge about a machine system enables the remote control of that system. Now people have become targets for remote control, as surveillance capitalists discovered that the most predictive data come from intervening in behavior to tune, herd and modify action in the direction of commercial objectives. This third imperative, “economies of action,” has become an arena of intense experimentation. “We are learning how to write the music,” one scientist said, “and then we let the music make them dance.”</p>
  58. <p class="css-exrw3m evys1bk0">This new power “to make them dance” does not employ soldiers to threaten terror and murder. It arrives carrying a cappuccino, not a gun. It is a new “instrumentarian” power that works its will through the medium of ubiquitous digital instrumentation to manipulate subliminal cues, psychologically target communications, impose default choice architectures, trigger social comparison dynamics and levy rewards and punishments — all of it aimed at remotely tuning, herding and modifying human behavior in the direction of profitable outcomes and always engineered to preserve users’ ignorance.</p>
  59. <p class="css-exrw3m evys1bk0">We saw predictive knowledge morphing into instrumentarian power in Facebook’s contagion experiments <a class="css-1g7m0tk" href="https://www.nature.com/articles/nature11421" title="" rel="noopener noreferrer" target="_blank">published in 2012</a> and <a class="css-1g7m0tk" href="https://www.pnas.org/content/111/24/8788" title="" rel="noopener noreferrer" target="_blank">2014</a>, when it planted subliminal cues and manipulated social comparisons on its pages, first to influence users to vote in midterm elections and later to make people feel sadder or happier. Facebook researchers celebrated the success of these experiments noting two key findings: that it was possible to manipulate online cues to influence real world behavior and feelings, and that this could be accomplished while successfully bypassing users’ awareness.</p>
  60. <p class="css-exrw3m evys1bk0">In 2016, the Google-incubated augmented reality game, Pokémon Go, tested economies of action on the streets. Game players did not know that they were pawns in the real game of behavior modification for profit, as the rewards and punishments of hunting imaginary creatures were used to herd people to the McDonald’s, Starbucks and local pizza joints that were <a class="css-1g7m0tk" href="https://time.com/5602363/george-orwell-1984-anniversary-surveillance-capitalism/" title="" rel="noopener noreferrer" target="_blank">paying the company for “footfall,</a>” in exactly the same way that online advertisers pay for “click through” to their websites.</p>
  61. <p class="css-exrw3m evys1bk0">In 2017, a leaked Facebook <a class="css-1g7m0tk" href="https://www.theguardian.com/technology/2017/may/01/facebook-advertising-data-insecure-teens" title="" rel="noopener noreferrer" target="_blank">document</a> acquired by The Australian exposed the corporation’s interest in applying “psychological insights” from “internal Facebook data” to modify user behavior. The targets were 6.4 million young Australians and New Zealanders. “By monitoring posts, pictures, interactions and internet activity in real time,” the executives wrote, “Facebook can work out when young people feel ‘stressed,’ ‘defeated,’ ‘overwhelmed,’ ‘anxious,’ ‘nervous,’ ‘stupid,’ ‘silly,’ ‘useless’ and a ‘failure.’” This depth of information, they explained, allows Facebook to pinpoint the time frame during which a young person needs a “confidence boost” and is most vulnerable to a specific configuration of subliminal cues and triggers. The data are then used to match each emotional phase with appropriate ad messaging for the maximum probability of guaranteed sales.</p></div><aside class="css-ew4tgv" aria-label="companion column"></aside></div><div class="css-1fanzo5 StoryBodyCompanionColumn"><div class="css-53u6y8"><p class="css-exrw3m evys1bk0">Facebook <a class="css-1g7m0tk" href="https://about.fb.com/news/h/comments-on-research-and-ad-targeting/" title="" rel="noopener noreferrer" target="_blank">denied these practices</a>, though a former product manager <a class="css-1g7m0tk" href="https://www.theguardian.com/technology/2017/may/02/facebook-executive-advertising-data-comment" title="" rel="noopener noreferrer" target="_blank">accused the company</a> of “lying through its teeth.” The fact is that in the absence of corporate transparency and democratic oversight, epistemic inequality rules. They know. They decide who knows. They decide who decides.</p>
<p class="css-exrw3m evys1bk0">The public’s intolerable knowledge disadvantage is deepened by surveillance capitalists’ perfection of mass communications as gaslighting. Two examples are illustrative. On April 30, 2019, Mark Zuckerberg made a dramatic <a class="css-1g7m0tk" href="https://www.theverge.com/2019/4/30/18524188/facebook-f8-keynote-mark-zuckerberg-privacy-future-2019" title="" rel="noopener noreferrer" target="_blank">announcement</a> at the company’s annual developer conference, declaring, “The future is private.” A few weeks later, a Facebook litigator <a class="css-1g7m0tk" href="https://theintercept.com/2019/06/14/facebook-privacy-policy-court/" title="" rel="noopener noreferrer" target="_blank">appeared</a> before a federal district judge in California to thwart a user lawsuit over privacy invasion, arguing that the very act of using Facebook negates any reasonable expectation of privacy “as a matter of law.” In May 2019, Sundar Pichai, chief executive of Google, wrote in The Times of his corporation’s commitment to the principle that “<a class="css-1g7m0tk" href="https://www.nytimes.com/2019/05/07/opinion/google-sundar-pichai-privacy.html" title="">privacy cannot be a luxury good</a>.” Five months later, <a class="css-1g7m0tk" href="https://www.nydailynews.com/news/national/ny-witness-saw-homeless-people-selling-face-scans-google-five-dollars-20191004-j6z2vonllnerpiuakt6wrp6l44-story.html" title="" rel="noopener noreferrer" target="_blank">Google contractors were found offering $5 gift cards</a> to homeless people of color in an Atlanta park in return for a facial scan.</p>
<p class="css-exrw3m evys1bk0">Facebook’s denial invites even more scrutiny in light of another <a class="css-1g7m0tk" href="https://theintercept.com/2018/04/13/facebook-advertising-data-artificial-intelligence-ai/" title="" rel="noopener noreferrer" target="_blank">leaked company document appearing in 2018</a>. The confidential report offers rare insight into the heart of Facebook’s computational factory, where a “prediction engine” runs on a machine intelligence platform that “ingests trillions of data points every day, trains thousands of models” and then “deploys them to the server fleet for live predictions.” Facebook notes that its “prediction service” produces “more than 6 million predictions per second.” But to what purpose?</p>
<p class="css-exrw3m evys1bk0">In its report, the company makes clear that these extraordinary capabilities are dedicated to meeting its corporate customers’ “core business challenges” with procedures that link prediction, microtargeting, intervention and behavior modification. For example, a Facebook service called “loyalty prediction” is touted for its ability to plumb proprietary behavioral surplus to predict which individuals are “at risk” of shifting their brand allegiance and to alert advertisers to intervene promptly with targeted messages designed to stabilize loyalty just in time to alter the course of the future.</p>
<p class="css-exrw3m evys1bk0">That year a young man named Christopher Wylie turned whistle-blower on his former employer, a political consultancy known as Cambridge Analytica. “We exploited Facebook to harvest millions of people’s profiles,” <a class="css-1g7m0tk" href="https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election" title="" rel="noopener noreferrer" target="_blank">Wylie admitted</a>, “and built models to exploit what we knew about them and target their inner demons.” Mr. Wylie characterized those techniques as “<a class="css-1g7m0tk" href="https://www.politico.eu/article/cambridge-analytica-whistleblower-christopher-wylie-warns-of-new-cold-war-online/" title="" rel="noopener noreferrer" target="_blank">information warfare</a>,” correctly assessing that such shadow wars are built on asymmetries of knowledge and the power it affords. Less clear to the public or lawmakers was that the political firm’s strategies of secret invasion and conquest employed surveillance capitalism’s standard operating procedures to which billions of innocent “users” are routinely subjected each day. Mr. Wylie <a class="css-1g7m0tk" href="https://www.youtube.com/watch?time_continue=12&amp;v=FXdYSQ6nu-M&amp;feature=emb_logo" title="" rel="noopener noreferrer" target="_blank">described</a> this mirroring process as he followed a trail that was already cut and marked. Cambridge Analytica’s real innovation was to pivot the whole undertaking from commercial to political objectives.</p>
<p class="css-exrw3m evys1bk0">In other words, Cambridge Analytica was the parasite, and surveillance capitalism was the host. Thanks to its epistemic dominance, surveillance capitalism provided the behavioral data that exposed the <em class="css-2fg4z9 e1gzwzxm0">targets</em> for assault. Its methods of behavioral microtargeting and behavioral modification became the <em class="css-2fg4z9 e1gzwzxm0">weapons</em>. And it was surveillance capitalism’s lack of accountability for content on its platform afforded by Section 230 that provided the <em class="css-2fg4z9 e1gzwzxm0">opportunity</em> for the stealth attacks designed to trigger the inner demons of unsuspecting citizens.</p>
<p class="css-exrw3m evys1bk0">It’s not just that epistemic inequality leaves us utterly vulnerable to the attacks of actors like Cambridge Analytica. The larger and more disturbing point is that surveillance capitalism has turned epistemic inequality into a defining condition of our societies, normalizing information warfare as a chronic feature of our daily reality prosecuted by the very corporations upon which we depend for effective social participation. They have the knowledge, the machines, the science and the scientists, the secrets and the lies. All privacy now rests with them, leaving us with few means of defense from these marauding data invaders. Without law, we scramble to hide in our own lives, while our children debate encryption strategies around the dinner table and students wear masks to public protests as protection from facial recognition systems built with our family photos.</p></div><aside class="css-ew4tgv" aria-label="companion column"></aside></div><div class="css-1fanzo5 StoryBodyCompanionColumn"><div class="css-53u6y8"><p class="css-exrw3m evys1bk0">In the absence of new declarations of epistemic rights and legislation, surveillance capitalism threatens to remake society as it unmakes democracy. From below, it undermines human agency, usurping privacy, diminishing autonomy and depriving individuals of the right to combat. From above, epistemic inequality and injustice are fundamentally incompatible with the aspirations of a democratic people.</p>
<p class="css-exrw3m evys1bk0">We know that surveillance capitalists work in the shadows, but what they do there and the knowledge they accrue are unknown to us. They have the means to know everything about us, but we can know little about them. Their knowledge of us is not for us. Instead, our futures are sold for others’ profits. Since that Federal Trade Commission meeting in 1997, the line was never drawn, and people did become chattel for commerce. Another destructive delusion is that this outcome was inevitable — an unavoidable consequence of convenience-enhancing digital technologies. The truth is that surveillance capitalism hijacked the digital medium. There was nothing inevitable about it.</p>
<p class="css-exrw3m evys1bk0">American lawmakers have been reluctant to take on these challenges for many reasons. One is an unwritten policy of “surveillance exceptionalism” forged in the aftermath of the Sept. 11 terrorist attacks, when the government’s concerns shifted from online privacy protections to a new zeal for “<a class="css-1g7m0tk" href="https://www.nytimes.com/2002/12/15/magazine/the-year-in-ideas-total-information-awareness.html" title="">total information awareness</a>.” In that political environment the fledgling surveillance capabilities emerging from Silicon Valley appeared to hold great promise.</p>
<p class="css-exrw3m evys1bk0">Surveillance capitalists have also defended themselves with lobbying and forms of propaganda intended to undermine and intimidate lawmakers, confounding judgment and freezing action. These have received relatively little scrutiny compared to the damage they do. Consider two examples:</p>
<p class="css-exrw3m evys1bk0">The first is the assertion that democracy threatens prosperity and innovation. Former Google chief executive Eric Schmidt <a class="css-1g7m0tk" href="https://www.washingtonpost.com/national/on-leadership/googles-eric-schmidt-expounds-on-his-senate-testimony/2011/09/30/gIQAPyVgCL_story.html" title="" rel="noopener noreferrer" target="_blank">explained</a> in 2011, “we took the position of ‘hands off the internet.’ You know, leave us alone … The government can make regulatory mistakes that can slow this whole thing down, and we see that and we worry about it.” This propaganda is recycled from the Gilded Age barons, whom we now call “robbers.” They insisted that there was no need for law when one had the “law of survival of the fittest,” the “laws of capital” and the “law of supply and demand.”</p>
<p class="css-exrw3m evys1bk0">Paradoxically, surveillance capital does not appear to drive innovation. A promising new era of economic research <a class="css-1g7m0tk" href="https://www.youtube.com/watch?v=xJgjLfx-Bcs" title="" rel="noopener noreferrer" target="_blank">shows</a> the critical role that government and democratic governance have played in innovation and <a class="css-1g7m0tk" href="http://germangutierrezg.com/GutierrezPhilippon_Fading_Stars_2019.pdf" title="" rel="noopener noreferrer" target="_blank">suggests</a> a lack of innovation in big tech companies like Google. Surveillance capitalism’s information dominance is not dedicated to the urgent challenges of carbon-free energy, eliminating hunger, curing cancers, ridding the oceans of plastic or flooding the world with well-paid, smart, loving teachers and doctors. Instead, we see a frontier operation run by geniuses with vast capital and computational power that is furiously dedicated to the lucrative science and economics of human prediction for profit.</p>
<p class="css-exrw3m evys1bk0">The second form of propaganda is the argument that the success of the leading surveillance capitalist firms reflects the real value they bring to people. But data from the demand side suggest that surveillance capitalism is better understood as a market failure. Instead of a close alignment of supply and demand, people use these services because they have no comparable alternatives and because they are ignorant of surveillance capitalism’s shadow operations and their consequences. Pew Research Center recently <a class="css-1g7m0tk" href="https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/" title="" rel="noopener noreferrer" target="_blank">reported</a> that 81 percent of Americans believe the potential risks of companies’ data collection outweigh the benefits, suggesting that corporate success depends upon coercion and obfuscation rather than meeting people’s real needs.</p>
<p class="css-exrw3m evys1bk0">In his prizewinning history of regulation, the historian Thomas McCraw delivers a warning. Across the centuries regulators failed when they did not frame “strategies appropriate to the particular industries they were regulating.” Existing privacy and antitrust laws are vital but neither will be wholly adequate to the new challenges of reversing epistemic inequality.</p></div><aside class="css-ew4tgv" aria-label="companion column"></aside></div><div class="css-1fanzo5 StoryBodyCompanionColumn"><div class="css-53u6y8"><p class="css-exrw3m evys1bk0">These contests of the 21st century demand a framework of epistemic rights enshrined in law and subject to democratic governance. Such rights would interrupt data supply chains by safeguarding the boundaries of human experience before they come under assault from the forces of datafication. The choice to turn any aspect of one’s life into data must belong to individuals by virtue of their rights in a democratic society. This means, for example, that companies cannot claim the right to your face, or use your face as free raw material for analysis, or own and sell any computational products that derive from your face. The conversation on epistemic rights has already begun, reflected in a pathbreaking <a class="css-1g7m0tk" href="https://amnestyusa.org/wp-content/uploads/2019/11/Surveillance-Giants-Embargo-21-Nov-0001-GMT-FINAL-report.pdf" title="" rel="noopener noreferrer" target="_blank">report</a> from Amnesty International.</p>
<p class="css-exrw3m evys1bk0">On the demand side, we can outlaw human futures markets and thus eliminate the financial incentives that sustain the surveillance dividend. This is not a radical prospect. For example, societies outlaw markets that trade in human organs, babies and slaves. In each case, we recognize that such markets are morally repugnant and produce predictably violent consequences. Human futures markets can be shown to produce equally predictable outcomes that challenge human freedom and undermine democracy. Like subprime mortgages and fossil fuel investments, surveillance assets will become the new toxic assets.</p>
<p class="css-exrw3m evys1bk0">In support of a new competitive landscape, lawmakers will need to champion new forms of collective action, just as nearly a century ago legal protections for the rights to organize, to strike and to bargain collectively united lawmakers and workers in curbing the powers of monopoly capitalists. Lawmakers must seek alliances with <a class="css-1g7m0tk" href="https://www.citylab.com/life/2018/12/bianca-wylie-interview-toronto-quayside-protest-criticism/574477/" title="" rel="noopener noreferrer" target="_blank">citizens who are deeply concerned</a> over the unchecked power of the surveillance capitalists and with workers who seek fair wages and reasonable security in defiance of the precarious employment conditions that define the surveillance economy.</p>
<p class="css-exrw3m evys1bk0">Anything made by humans can be unmade by humans. Surveillance capitalism is young, barely 20 years in the making, but democracy is old, rooted in generations of hope and contest.</p>
<p class="css-exrw3m evys1bk0">Surveillance capitalists are rich and powerful, but they are not invulnerable. They have an Achilles’ heel: fear. They fear lawmakers who do not fear them. They fear citizens who demand a new road forward as they insist on new answers to old questions: Who will know? Who will decide who knows? Who will decide who decides? Who will write the music, and who will dance?</p>
</div>
</article>
<hr>
<footer>
<p>
<a href="/david/" title="Aller à l’accueil">🏠</a> •
<a href="/david/log/" title="Accès au flux RSS">🤖</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant>🇨🇦</a> •
<a href="mailto:david%40larlet.fr" title="Envoyer un courriel">📮</a> •
<abbr title="Hébergeur : Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340">🧚</abbr>
</p>
</footer>
<script src="/static/david/js/instantpage-3.0.0.min.js" type="module" defer></script>
</body>
</html>