<!doctype html><!-- This is a valid HTML5 document. -->
<!-- Screen readers, SEO, extensions and so on. -->
<html lang="fr">
<!-- Has to be within the first 1024 bytes, hence before the `title` element
See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
<meta charset="utf-8">
<!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
<!-- The viewport meta is quite crowded and we are responsible for that.
See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
<meta name="viewport" content="width=device-width,initial-scale=1">
<!-- Required to make a valid HTML5 document. -->
<title>left alone, together (archive) — David Larlet</title>
<meta name="description" content="Publication mise en cache pour en conserver une trace.">
<!-- That good ol' feed, subscribe :). -->
<link rel="alternate" type="application/atom+xml" title="Feed" href="/david/log/">
<!-- Generated from https://realfavicongenerator.net/ such a mess. -->
<link rel="apple-touch-icon" sizes="180x180" href="/static/david/icons2/apple-touch-icon.png">
<link rel="icon" type="image/png" sizes="32x32" href="/static/david/icons2/favicon-32x32.png">
<link rel="icon" type="image/png" sizes="16x16" href="/static/david/icons2/favicon-16x16.png">
<link rel="manifest" href="/static/david/icons2/site.webmanifest">
<link rel="mask-icon" href="/static/david/icons2/safari-pinned-tab.svg" color="#07486c">
<link rel="shortcut icon" href="/static/david/icons2/favicon.ico">
<meta name="msapplication-TileColor" content="#f7f7f7">
<meta name="msapplication-config" content="/static/david/icons2/browserconfig.xml">
<meta name="theme-color" content="#f7f7f7" media="(prefers-color-scheme: light)">
<meta name="theme-color" content="#272727" media="(prefers-color-scheme: dark)">
<!-- Documented, feel free to shoot an email. -->
<link rel="stylesheet" href="/static/david/css/style_2021-01-20.css">
<!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: light), (prefers-color-scheme: no-preference)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_regular.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_bold.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t3_italic.woff2" as="font" type="font/woff2" media="(prefers-color-scheme: dark)" crossorigin>
<script>
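// Apply the theme stored by the footer selector as early as possible
// (this runs in the document head, before the body is rendered).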
function toggleTheme(themeName) {
document.documentElement.classList.toggle(
'forced-dark',
themeName === 'dark'
)
document.documentElement.classList.toggle(
'forced-light',
themeName === 'light'
)
}
const selectedTheme = localStorage.getItem('theme')
if (selectedTheme && selectedTheme !== 'undefined') {
toggleTheme(selectedTheme)
}
</script>

<meta name="robots" content="noindex, nofollow">
<meta content="origin-when-cross-origin" name="referrer">
<!-- Canonical URL for SEO purposes -->
<link rel="canonical" href="https://phirephoenix.com/blog/2021-05-03/privacy">

<body class="remarkdown h1-underline h2-underline h3-underline em-underscore hr-center ul-star pre-tick" data-instant-intensity="viewport-all">


<article>
<header>
<h1>left alone, together</h1>
</header>
<nav>
<p class="center">
<a href="/david/" title="Aller à l’accueil"><svg class="icon icon-home">
<use xlink:href="/static/david/icons2/symbol-defs.svg#icon-home"></use>
</svg> Accueil</a> •
<a href="https://phirephoenix.com/blog/2021-05-03/privacy" title="Lien vers le contenu original">Source originale</a>
</p>
</nav>
<hr>
<p>There’s a depressing sort of symmetry in the fact that our modern paradigms of privacy were developed in response to the proliferation of photography and its exploitation by tabloids. The seminal 1890 Harvard Law Review article <em>The Right to Privacy</em>—which every essay about data privacy is contractually obligated to cite—argued that the right of an individual to object to the publication of photographs ought to be considered part of a general ‘right to be let alone’.</p>

<p>130 years on, privacy is still largely conceived of as an individual thing, wherein we get to make solo decisions about when we want to be left alone and when we’re comfortable being trespassed upon. This principle undergirds the notice-and-consent model of data management, which you might also know as the Pavlovian response of clicking “I agree” on any popup and login screen with little regard for the forty pages of legalese you might be agreeing to.</p>

<p>The thing is, the right to be left alone makes perfect sense when you’re managing information relationships between individuals, where there are generally pretty clear social norms around what constitutes a boundary violation. Reasonable people can and do disagree as to the level of privacy they expect, but if I invite you into my home and you snoop through my bedside table and read my diary, there isn’t much ambiguity about that being an invasion.</p>

<p>But in the age of ✨ networked computing ✨, this individual model of privacy just doesn’t scale anymore. There are too many exponentially intersecting relationships for any of us to keep in our head. It’s no longer just about what we tell a friend or the tax collector or even a journalist. It’s the digital footprint that we often unknowingly leave in our wake every time we interact with something online, and how all of those websites and apps and their shadowy partners talk to each other behind our backs. It’s the cameras in malls tracking our location and sometimes emotions, and it’s the license plate readers compiling a log of our movements.</p>

<p>At a time when governments and companies are increasingly investing in surveillance mechanisms under the guise of security and transparency, that scale is only going to keep growing. Our individual comfort about whether we are left alone is no longer the only, or even the most salient part of the story, and we need to think about privacy as a public good and a collective value.</p>

<p class="divider"></p>

<p>Notice-and-consent fundamentally assumes that we’re always free to choose how our data is used, but that ignores the constraints of the real world contexts in which we make those decisions. At a very basic level, if you work somewhere that uses Gmail, good luck telling your boss you conscientiously object to Google. And if your kid’s friends are all on Instagram, the choice between handing their data over to Facebook or being socially shut out is not exactly a neutral decision.</p>

<p>But even if we did have full control over these decisions, the amount of energy that would be required for all of us to monitor <em>everyone</em> we’ve ever given <em>any</em> data to, <em>forever</em>, is just too much, especially when those privacy relationships are constantly changing, and we are constantly changing too.</p>

<p><a href="https://haveibeenpwned.com/">Have I Been Pwned</a> is a wonderful service that can tell you if your email has been part of a large scale data breach. Every time I look at my personal list (seventeen and counting), one of them always draws my eye because it’s a site I signed up for when I was eleven years old: Neopets. There is absolutely nothing you could’ve done to persuade that 11-year-old kid not to sign up for Neopets, and there’s also nothing I can do now as an adult to undo the harm. Is it my responsibility to have taken steps to delete my accounts on everything I’ve ever stopped using?</p>

<p>Assuming I’m not reusing passwords all over the place, at least the worst thing you could do with my Neopets account is mistreat my virtual pet. Imagine, instead, that you’re a queer kid living in a small town in 1999, and you sign up for Livejournal and use it to find a supportive and loving queer community online. Then in 2007 Livejournal gets sold to a company based in Russia, which in 2013 criminalizes the distribution of pro-LGBTQ content to minors, and in 2017 Livejournal loses the account info of 26 million users. Was it your responsibility to monitor the shifting geopolitical context of your childhood diary for the next two decades?</p>

<p>The impossibility of this burden to individually safeguard our data often reminds me of recycling. Because yes, there are absolutely digital safety practices to lower our risk of exposure, but they don’t address the core issue that there’s too much data, too many data brokers, too many transactions hidden from the user’s view. And yes, we can and should recycle, but it doesn’t change the fact that 71% of global emissions can be traced back to 100 companies, and it certainly doesn’t change the fact that those companies have spent decades lying to us about how effective recycling is so that they can keep churning out plastic. Those are structural problems that we can’t recycle our way out of, just as we can’t notice-and-consent our way into collective privacy.</p>

<p class="divider"></p>

<p>I like thinking about privacy as being collective, because it feels like a more true reflection of the fact that our lives are made up of relationships, and information about our lives is social and contextual by nature. The fact that I have a sister also indicates that my sister has at least one sibling: me. If I took a DNA test through 23andme<sup id="fnref:1"></sup> I’m not just disclosing information about me but also about everyone that I’m related to, none of whom are able to give consent. The privacy implications for familial DNA are pretty broad: this information might be used to sell or withhold products and services, expose family secrets, or implicate a future as-yet-unborn relative in a crime. I could email 23andme and ask them to delete my records, and they might eventually comply in a month or three. But my present and future relatives wouldn’t be able to do that, or even know that their privacy had been compromised at all.</p>

<p>Even with data that’s less fraught than our genome, our decisions about what we expose to the world have externalities for the people around us. I might think nothing of posting a photo of going out with my friends and mentioning the name of the bar, but I’ve just exposed our physical location to the internet. If one of my friends has had to deal with a stalker in their past, I could’ve put their physical safety at risk. Even if I’m careful to make the post friends-only, the people I trust are not the same as the people my friends trust. In an individual model of privacy, we are only as private as our least private friend.</p>

<p>Amidst the global pandemic, this might sound not dissimilar to public health. When I decide whether to wear a mask in public, that’s partially about how much the mask will protect me from airborne droplets. But it’s also—perhaps more significantly—about protecting everyone else <em>from</em> me.</p>

<p>People who refuse to wear a mask because they’re willing to risk getting Covid are often only thinking about their bodies as a thing to defend, whose sanctity depends on the strength of their individual immune system. They’re not thinking about their bodies as a thing that can also attack, that can be the conduit that kills someone else. People who are careless about their own data because they think they’ve done nothing wrong are only thinking of the harms that they might experience, not the harms that they can cause.</p>

<p>The thing about common goods like public health, though, is that there’s only so much individual actions can achieve without a collective response that targets systemic problems. While we owe a duty of care to one another, it’s not enough for all of us to be willing to wear masks if there’s no contact tracing, no paid sick leave, no medical manufacturing and distribution capacity, no international sharing of vaccine research. And it’s not enough for each of us to be individually vigilant about our information if unscrupulous trackers are gathering up data we didn’t even know we were shedding, or if law enforcement is buying up that data on the private market to use for surveillance purposes none of us ever consented to.</p>

<p class="divider"></p>

<p>Data collection isn’t always bad, but it is always risky. Sometimes that’s due to shoddy design and programming or lazy security practices. But even the best engineers often fail to build risk-free systems, by the very nature of systems.</p>

<p>Systems are easier to attack than they are to defend. If you want to defend a system, you have to make sure every part of it is perfectly implemented to guard against any possible vulnerabilities. Oftentimes, trying to defend a system means adding additional components, which just ends up creating more potential weak points. Whereas if you want to attack, all you have to do is find the one weakness that the systems designer missed. (Or, to paraphrase the IRA, you only have to be lucky once.)</p>

<p>This is true of all systems, digital or analog, but the thing that makes computer systems particularly vulnerable is that the same weaknesses can be deployed across millions of devices, in our phones and laptops and watches and toasters and refrigerators and doorbells. When a vulnerability is discovered in one system, an entire class of devices around the world is instantly a potential target, but we still have to go fix them one by one.</p>

<p>This is how the Equifax data leak happened. Equifax used a piece of open source software that had a security flaw in it, the people who work on that software found it and fixed it, and instead of diligently updating their systems Equifax hit the snooze button for four months and let hackers steal hundreds of millions of customer records. And while Equifax is definitely guilty of the aforementioned lazy security practices, this incident also illustrates how fragile computer systems are. From the moment this bug was discovered, every server in the world that ran that software was at risk.</p>

<p>What’s worse, in many cases people weren’t even aware that their data was stored with Equifax. If you’re an adult who has had a job or a phone bill or interacted with a bank in the last seven years, your identifying information is collected by Equifax whether you like it or not. The only way to opt out would have been to be among the small percentage of overwhelmingly young, poor, and racialized people who have no credit histories, which significantly limits the scope of their ability to participate in the economy. How do you notice-and-consent your way out of that?</p>

<p>Software engineer Maciej Ceglowski gave <a href="https://idlewords.com/talks/haunted_by_data.htm">an excellent talk in 2015</a> that compared data to nuclear waste. While the promise of nuclear energy is great, we’ve never <em>quite</em> figured out what to do with nuclear waste, which will outlive all the institutions that generated it. Oftentimes, we shrug, stick it in a big vat underground, put up some scary warning signs and hope for the best. Similarly, because data storage has become so cheap, it’s easy to keep all of it just in case you figure out how to make money from it at some point. This means there’s just petabytes of toxic data on hard drives all around the world, waiting to go off.</p>

<p>As ineffective as recycling might be, at least it’s something we can do to reduce our footprint. But there’s little you or I can do to prevent thousands of tons of radioactive waste from spilling into a river, whether by accident or by corporate design.</p>

<p class="divider"></p>

<p>I could go on and on about the practical reasons to shift away from privacy as an individual phenomenon, but honestly the main thing that motivates me is that I <em>want</em> to live in a society where everyone has a basic right to privacy.</p>

<p>Privacy is essential to human agency and dignity. Denying someone privacy—even when it’s as seemingly small as a parent who won’t let their kid close the door—has a corrosive effect, eroding trust as well as our sense of interiority. When we scale up the individual to a body politic, it is the private sphere that’s crucial for our capacity for democracy and self-determination. As individuals, we need privacy to figure out who we are when we’re no longer performing the self. As a collective, we have to be able to distinguish who we are as individuals hidden from the norms and pressures of the group in order to reason clearly about how we want to shape the group. Elections have secret ballots for a reason.</p>

<p>If we do care about privacy as a collective value, then it cannot be an individual burden. Right now, privacy is essentially a luxury good. If you can afford not to use coupons, you don’t have to let retailers track your shopping habits with loyalty points. If you’re technically savvy, you don’t have to let Gmail see all your emails. Not only does that make access to privacy incredibly inequitable, it also affects our collective understanding of what is a “normal” amount of privacy.</p>

<p>I don’t just mean that in terms of the weird looks checkout clerks give me when I decline to provide my email or postal code or phone number for 10% off next time I shop. If everyone uses insecure texting apps, then having an encrypted chat app like Signal on your phone becomes a red flag to law enforcement. If you use Signal because you’re an activist or you belong to a marginalized group targeted by the state, you are doubly harmed by this norm.</p>

<p>An individual framing of this problem asks questions like, why don’t you want Google to see your email? What have you got to hide? But if you only have the right to privacy when you’re hypervigilant about defending it, you never really had that right to begin with. Instead, at a very minimum the question should be: why does Google deserve to see your email?</p>

<p>And if I can be more ambitious: what values do we as a society want to enshrine in our communication systems? The seriousness with which most legal frameworks treat mail fraud indicates that the capacity for private communication is a pretty important social value. So how can we best design the technical protocols and systems for electronic mail to protect what we care about?</p>

<p class="divider"></p>

<p>There unfortunately isn’t one weird trick to save democracy, but that doesn’t mean there aren’t lessons we can learn from history to figure out how to protect privacy as a public good. The scale and ubiquity of computers may be unprecedented, but so is the scale of our collective knowledge.</p>

<p>For example, we know one of the ways to make people care about negative externalities is to make them pay for it; that’s why carbon pricing is one of the most efficient ways of reducing emissions. There’s no reason why we couldn’t enact a data tax of some kind. We can also take a cautionary tale from pricing externalities, because you have to have the will to enforce it. Western Canada is littered with tens of thousands of orphan wells that oil production companies said they would clean up and haven’t, and now the Canadian government is chipping in billions of dollars to do it for them. This means we must build in enforcement mechanisms at the same time that we’re designing principles for data governance, otherwise it’s little more than ethics-washing.</p>

<p>We also know that while public goods often have a free rider problem, people are actually pretty willing to act for the collective good if they know that others will, too. There are many examples around the world of communities banding together to collectively govern a shared resource, like forestry, grazing grounds, and wells. The same principle can also be used in data governance, using systems like data trusts or a data commons. Nor is that collective action limited to users and consumers. Anyone who works with data can influence the future of privacy by organizing with their coworkers. Our tech overlords are powerful, but they still rely on the labour of their employees. If you work in tech, building collective labour power is one of the most effective things you can do to influence policy and product direction. If unions didn’t work, executives wouldn’t be so terrified of them.</p>

<p>But all of that starts with a shift in how we see privacy and its relationship to data governance.</p>

<p>In 1962, a book called <em>Silent Spring</em> by Rachel Carson documenting the widespread ecological harms caused by synthetic pesticides went off like a metaphorical bomb in the nascent environmental movement. Rachel Carson was far from the first person to criticize DDT, but she captured the public’s horrified imagination by demonstrating that DDT’s systemic impact on the biosphere was something we needed to solve collectively. <em>Silent Spring</em> is credited with the surge of environmental activism in the 60s, a ban on the use of DDT for agriculture in the US, and eventually the creation of the US Environmental Protection Agency.</p>

<p>I know the political landscape and information space are very different today than they were in the 60s, and history is never as clean as a paragraph-long recounting makes it sound. Nevertheless, I find the story of <em>Silent Spring</em> incredibly hopeful, because it’s a story about the power that shifting our frame of focus can have in expanding our vision of what’s possible.</p>

<p>I’m not saying data is like DDT<sup id="fnref:2"></sup>. But right now, we’re still only thinking about privacy protection in incremental terms, by changing the individual data relationships between consumers and corporations. If we think of privacy as a public good, the scale of solutions we can let ourselves imagine becomes so much bigger. It might feel ludicrous to think about drastic measures like banning inferred data, or data expiry by default, or taxing data as an income, but banning DDT also seemed ludicrous until it wasn’t. Nothing about how technology shows up in our lives is predetermined; these are all policy <em>choices</em>. We cannot afford to keep treating the question of privacy as a narrow technocratic or procedural problem. But there’s no telling what expansive solutions might open up to us when we see this fight for privacy for what it really is, a fight for a public good, and our fundamental collective rights.</p>

<p><em>This essay is substantively adapted from a talk I gave at <a href="https://youtu.be/QL7UDBKaF-o?t=1237">Theorizing the Web Presents</a> in October of 2020. My thanks to TtW for giving me the space to work out some of these ideas, as well as Kathy, Sarah, Ellie, Jamie, Jane, and Chris for reading several early drafts and providing valuable feedback.</em></p>
</article>


<hr>

<footer>
<p>
<a href="/david/" title="Aller à l’accueil"><svg class="icon icon-home">
<use xlink:href="/static/david/icons2/symbol-defs.svg#icon-home"></use>
</svg> Accueil</a> •
<a href="/david/log/" title="Accès au flux RSS"><svg class="icon icon-rss2">
<use xlink:href="/static/david/icons2/symbol-defs.svg#icon-rss2"></use>
</svg> Suivre</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant><svg class="icon icon-user-tie">
<use xlink:href="/static/david/icons2/symbol-defs.svg#icon-user-tie"></use>
</svg> Pro</a> •
<a href="mailto:david%40larlet.fr" title="Envoyer un courriel"><svg class="icon icon-mail">
<use xlink:href="/static/david/icons2/symbol-defs.svg#icon-mail"></use>
</svg> Email</a> •
<abbr class="nowrap" title="Hébergeur : Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340"><svg class="icon icon-hammer2">
<use xlink:href="/static/david/icons2/symbol-defs.svg#icon-hammer2"></use>
</svg> Légal</abbr>
</p>
<template id="theme-selector">
<form>
<fieldset>
<legend><svg class="icon icon-brightness-contrast">
<use xlink:href="/static/david/icons2/symbol-defs.svg#icon-brightness-contrast"></use>
</svg> Thème</legend>
<label>
<input type="radio" value="auto" name="chosen-color-scheme" checked> Auto
</label>
<label>
<input type="radio" value="dark" name="chosen-color-scheme"> Foncé
</label>
<label>
<input type="radio" value="light" name="chosen-color-scheme"> Clair
</label>
</fieldset>
</form>
</template>
</footer>
<script src="/static/david/js/instantpage-5.1.0.min.js" type="module"></script>
<script>
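// Theme selector: swap the <template id="theme-selector"> in the footer for a
// live form, persist the chosen scheme to localStorage, and re-apply it via
// toggleTheme() defined in the head script above.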
function loadThemeForm(templateName) {
const themeSelectorTemplate = document.querySelector(templateName)
const form = themeSelectorTemplate.content.firstElementChild
themeSelectorTemplate.replaceWith(form)

form.addEventListener('change', (e) => {
const chosenColorScheme = e.target.value
localStorage.setItem('theme', chosenColorScheme)
toggleTheme(chosenColorScheme)
})

const selectedTheme = localStorage.getItem('theme')
if (selectedTheme && selectedTheme !== 'undefined') {
form.querySelector(`[value="${selectedTheme}"]`).checked = true
}
}

const prefersColorSchemeDark = '(prefers-color-scheme: dark)'
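// On load, copy every rule found under a `prefers-color-scheme: dark` media
// query to the top level of its stylesheet, and only reveal the theme
// selector when such dark-mode rules actually exist.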
window.addEventListener('load', () => {
let hasDarkRules = false
for (const styleSheet of Array.from(document.styleSheets)) {
let mediaRules = []
for (const cssRule of styleSheet.cssRules) {
if (cssRule.type !== CSSRule.MEDIA_RULE) {
continue
}
// WARNING: Safari does not support `conditionText`.
if (cssRule.conditionText) {
if (cssRule.conditionText !== prefersColorSchemeDark) {
continue
}
} else {
if (cssRule.cssText.startsWith(prefersColorSchemeDark)) {
continue
}
}
mediaRules = mediaRules.concat(Array.from(cssRule.cssRules))
}

// WARNING: do not try to insert a rule into a styleSheet you are
// currently iterating over, otherwise the browser will get stuck
// in an infinite loop…
for (const mediaRule of mediaRules) {
styleSheet.insertRule(mediaRule.cssText)
hasDarkRules = true
}
}
if (hasDarkRules) {
loadThemeForm('#theme-selector')
}
})
</script>
</body>
</html>
