<!doctype html><!-- This is a valid HTML5 document. -->
<!-- Screen readers, SEO, extensions and so on. -->
<html lang="fr">
<!-- Has to be within the first 1024 bytes, hence before the <title>
See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
<meta charset="utf-8">
<!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
<!-- The viewport meta is quite crowded and we are responsible for that.
See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
<meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1,shrink-to-fit=no">
<!-- Required to make a valid HTML5 document. -->
<title>Twelve Million Phones, One Dataset, Zero Privacy (archive) — David Larlet</title>
<!-- Lightest blank gif, avoids an extra query to the server. -->
<link rel="icon" href="data:;base64,iVBORw0KGgo=">
<!-- Thank you Florens! -->
<link rel="stylesheet" href="/static/david/css/style_2020-01-09.css">
<!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" crossorigin>

<meta name="robots" content="noindex, nofollow">
<meta content="origin-when-cross-origin" name="referrer">
<!-- Canonical URL for SEO purposes -->
<link rel="canonical" href="https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html">

<body class="remarkdown h1-underline h2-underline hr-center ul-star pre-tick">

<article>
<h1>Twelve Million Phones, One Dataset, Zero Privacy</h1>
<h2><a href="https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html">Source originale du contenu</a></h2>
<p>
<strong>Every minute of every day,</strong> everywhere on the planet, dozens of companies — largely unregulated, little scrutinized — are logging the movements of tens of millions of people with mobile phones and storing the information in gigantic data files. The Times <a href="https://www.nytimes.com/privacy-project">Privacy Project</a> obtained one such file, by far the largest and most sensitive ever to be reviewed by journalists. It holds more than 50 billion location pings from the phones of more than 12 million Americans as they moved through several major cities, including Washington, New York, San Francisco and Los Angeles.
</p>

<p>
Each piece of information in this file represents the precise location of a single smartphone over a period of several months in 2016 and 2017. The data was provided to Times Opinion by sources who asked to remain anonymous because they were not authorized to share it and could face severe penalties for doing so. The sources of the information said they had grown alarmed about how it might be abused and urgently wanted to inform the public and lawmakers.
</p>

<p>
<em>[Related: <a href="https://www.nytimes.com/interactive/2019/12/20/opinion/location-data-national-security.html">How to Track President Trump</a> — Read more about the national security risks found in the data.]</em>
</p>


<p>
After spending months sifting through the data, tracking the movements of people across the country and speaking with dozens of data companies, technologists, lawyers and academics who study this field, we feel the same sense of alarm. In the cities that the data file covers, it tracks people from nearly every neighborhood and block, whether they live in mobile homes in Alexandria, Va., or luxury towers in Manhattan.
</p>

<p>
One search turned up more than a dozen people visiting the Playboy Mansion, some overnight. Without much effort we spotted visitors to the estates of Johnny Depp, Tiger Woods and Arnold Schwarzenegger, connecting the devices’ owners to the residences indefinitely.
</p>

<p>
If you lived in one of the cities the dataset covers and use apps that share your location — anything from weather apps to local news apps to coupon savers — you could be in there, too.
</p>

<p>
If you could see the full trove, you might never use your phone the same way again.
</p>

<p>
<strong>The data reviewed by Times Opinion</strong> didn’t come from a telecom or giant tech company, nor did it come from a governmental surveillance operation. It originated from a location data company, one of dozens quietly collecting precise movements using software slipped onto mobile phone apps. You’ve probably never heard of most of the companies — and yet to anyone who has access to this data, your life is an open book. They can see the places you go every moment of the day, whom you meet with or spend the night with, where you pray, whether you visit a methadone clinic, a psychiatrist’s office or a massage parlor.
</p>

<p>
<a href="https://www.nytimes.com/interactive/2018/12/10/business/location-data-privacy-apps.html">The Times</a> and other news organizations have reported on smartphone tracking in the past. But never with a data set so large. Even still, this file represents just a small slice of what’s collected and sold every day by the location tracking industry — surveillance so omnipresent in our digital lives that it now seems impossible for anyone to avoid.
</p>

<p>
It doesn’t take much imagination to conjure the powers such always-on surveillance can provide an authoritarian regime like China’s. Within America’s own representative democracy, citizens would surely rise up in outrage if the government attempted to mandate that every person above the age of 12 carry a tracking device that revealed their location 24 hours a day. Yet, in the decade since Apple’s App Store was created, Americans have, app by app, consented to just such a system run by private companies. Now, as the decade ends, tens of millions of Americans, including many children, find themselves carrying spies in their pockets during the day and leaving them beside their beds at night — even though the corporations that control their data are far less accountable than the government would be.
</p>

<p>
<em>[Related: <a href="https://www.nytimes.com/interactive/2019/12/21/opinion/pasadena-smartphone-spying.html">Where Even the Children Are Being Tracked</a> — We followed every move of people in one city. Then we went to tell them.]</em>
</p>


<p>
“The seduction of these consumer products is so powerful that it blinds us to the possibility that there is another way to get the benefits of the technology without the invasion of privacy. But there is,” said William Staples, founding director of the Surveillance Studies Research Center at the University of Kansas. “All the companies collecting this location information act as what I have called Tiny Brothers, using a variety of data sponges to engage in everyday surveillance.”
</p>

<p>
In this and subsequent articles we’ll reveal what we’ve found and why it has so shaken us. We’ll ask you to consider the national security risks the existence of this kind of data creates and the specter of what such precise, always-on human tracking might mean in the hands of corporations and the government. We’ll also look at legal and ethical justifications that companies rely on to collect our precise locations and the deceptive techniques they use to lull us into sharing it.
</p>

<p>
Today, it’s perfectly legal to collect and sell all this information. In the United States, as in most of the world, no federal law limits what has become a vast and lucrative trade in human tracking. Only internal company policies and the decency of individual employees prevent those with access to the data from, say, stalking an estranged spouse or selling the evening commute of an intelligence officer to a hostile foreign power.
</p>

<p>
Companies say the data is shared only with vetted partners. As a society, we’re choosing simply to take their word for that, displaying a blithe faith in corporate beneficence that we don’t extend to far less intrusive yet more heavily regulated industries. Even if these companies are acting with the soundest moral code imaginable, there’s ultimately no foolproof way they can secure the data from falling into the hands of a foreign security service. Closer to home, on a smaller yet no less troubling scale, there are often few protections to stop an individual analyst with access to such data from tracking an ex-lover or a victim of abuse.
</p>

<h2 class="g-subhed g-optimize-type " id="">
A DIARY OF YOUR EVERY MOVEMENT
</h2>

<p>
<strong>The companies that collect</strong> all this information on your movements justify their business on the basis of three claims: People consent to be tracked, the data is anonymous and the data is secure.
</p>

<p>
None of those claims hold up, based on the file we’ve obtained and our review of company practices.
</p>

<p>
Yes, the location data contains billions of data points with no identifiable information like names or email addresses. But it’s child’s play to connect real names to the dots that appear on the maps.
</p>

<p>
Here’s what that looks like.
</p>

<p>
<strong>In most cases,</strong> ascertaining a home location and an office location was enough to identify a person. Consider your daily commute: Would any other smartphone travel directly between your house and your office every day?
</p>
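<p>
The re-identification step described above amounts to a simple spatial join: given a known home address and a known office address, look for the one device whose pings cluster at both. The following is a minimal illustrative sketch, not the authors' actual method; every coordinate, device ID and threshold below is invented.
</p>

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

def devices_near(pings, lat, lon, radius_km=0.1):
    """Device IDs with at least one ping within radius_km of (lat, lon)."""
    return {p["device"] for p in pings
            if haversine_km(p["lat"], p["lon"], lat, lon) <= radius_km}

def identify(pings, home, office):
    """A device seen at both a known home and a known office is a strong match."""
    return devices_near(pings, *home) & devices_near(pings, *office)

# Hypothetical "anonymized" pings: only a device ID, coordinates and a timestamp.
pings = [
    {"device": "a1", "lat": 38.8895, "lon": -77.0353, "ts": "2017-01-21T09:00"},
    {"device": "a1", "lat": 38.9072, "lon": -77.0369, "ts": "2017-01-21T19:00"},
    {"device": "b2", "lat": 40.7128, "lon": -74.0060, "ts": "2017-01-21T12:00"},
]

print(identify(pings, home=(38.9072, -77.0369), office=(38.8895, -77.0353)))
```

<p>
With two anchor points the candidate set collapses to a single device, which is why "no names or email addresses" offers so little protection.
</p>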

<p>
Describing location data as anonymous is “a completely false claim” that has been debunked in multiple studies, Paul Ohm, a law professor and privacy researcher at the Georgetown University Law Center, told us. “Really precise, longitudinal geolocation information is absolutely impossible to anonymize.”
</p>

<p>
“D.N.A.,” he added, “is probably the only thing that’s harder to anonymize than precise geolocation information.”
</p>

<p>
<em>[Work in the location tracking industry? Seen an abuse of data? We want to hear from you. Using a non-work phone or computer, contact us on a secure line at 440-295-5934, @charliewarzel on Wire or email <a href="mailto:charlie.warzel@nytimes.com">Charlie Warzel</a> and <a href="mailto:stuart.thompson@nytimes.com">Stuart A. Thompson</a> directly.]</em>
</p>

<p>
Yet companies continue to claim that the data are anonymous. In marketing materials and at trade conferences, anonymity is a major selling point — key to allaying concerns over such invasive monitoring.
</p>

<p>
To evaluate the companies’ claims, we turned most of our attention to identifying people in positions of power. With the help of publicly available information, like home addresses, we easily identified and then tracked scores of notables. We followed military officials with security clearances as they drove home at night. We tracked law enforcement officers as they took their kids to school. We watched high-powered lawyers (and their guests) as they traveled from private jets to vacation properties. We did not name any of the people we identified without their permission.
</p>

<p>
The data set is large enough that it surely points to scandal and crime, but our purpose wasn’t to dig up dirt. We wanted to document the risk of underregulated surveillance.
</p>

<p>
Watching dots move across a map sometimes revealed hints of faltering marriages, evidence of drug addiction, records of visits to psychological facilities.
</p>

<p>
Connecting a sanitized ping to an actual human in time and place could feel like reading someone else’s diary.
</p>

<p>
In one case, we identified Mary Millben, a singer based in Virginia who has performed for three presidents, including President Trump. She was invited to the service at the Washington National Cathedral the morning after the president’s inauguration. That’s where we first found her.
</p>

<p>
She remembers how, surrounded by dignitaries and the first family, she was moved by the music echoing through the recesses of the cathedral while members of both parties joined together in prayer. All the while, the apps on her phone were also monitoring the moment, recording her position and the length of her stay in meticulous detail. For the advertisers who might buy access to the data, the intimate prayer service could well supply some profitable marketing insights.
</p>

<p>
“To know that you have a list of places I have been, and my phone is connected to that, that’s scary,” Ms. Millben told us. “What’s the business of a company benefiting off of knowing where I am? That seems a little dangerous to me.”
</p>

<p>
Like many people we identified in the data, Ms. Millben said she was careful about limiting how she shared her location. Yet like many of them, she also couldn’t name the app that might have collected it. Our privacy is only as secure as the least secure app on our device.
</p>

<p>
“That makes me uncomfortable,” she said. “I’m sure that makes every other person uncomfortable, to know that companies can have free rein to take your data, locations, whatever else they’re using. It is disturbing.”
</p>

<p>
<em>[Related: <a href="https://www.nytimes.com/2019/12/26/reader-center/location-tracking-phones-questions.html">What’s the Worst That Could Happen With My Phone Data?</a> — Our journalists answer your questions about their investigation into how companies track smartphone users.]</em>
</p>

<p>
The inauguration weekend yielded a trove of personal stories and experiences: elite attendees at presidential ceremonies, religious observers at church services, supporters assembling across the National Mall — all surveilled and recorded permanently in rigorous detail.
</p>

<p>
Protesters were tracked just as rigorously. After the pings of Trump supporters, basking in victory, vanished from the National Mall on Friday evening, they were replaced hours later by those of participants in the Women’s March, as a crowd of nearly half a million descended on the capital. Examining just a photo from the event, you might be hard-pressed to tie a face to a name. But in our data, pings at the protest connected to clear trails through the data, documenting the lives of protesters in the months before and after the protest, including where they lived and worked.
</p>

<p>
We spotted a senior official at the Department of Defense walking through the Women’s March, beginning on the National Mall and moving past the Smithsonian National Museum of American History that afternoon. His wife was also on the mall that day, something we discovered after tracking him to his home in Virginia. Her phone was also beaming out location data, along with the phones of several neighbors.
</p>

<p>
The official’s data trail also led to a high school, homes of friends, a visit to Joint Base Andrews, workdays spent in the Pentagon and a ceremony at Joint Base Myer-Henderson Hall with President Barack Obama in 2017 (nearly a dozen more phones were tracked there, too).
</p>

<p>
Inauguration Day weekend was marked by other protests — and riots. Hundreds of protesters, some in black hoods and masks, gathered north of the National Mall that Friday, eventually <a href="https://www.nytimes.com/2017/01/20/us/politics/inauguration-protests.html">setting fire to a limousine</a> near Franklin Square. The data documented those rioters, too. Filtering the data to that precise time and location led us to the doorsteps of some who were there. Police were present as well, many with faces obscured by riot gear. The data led us to the homes of at least two police officers who had been at the scene.
</p>

<p>
As revealing as our searches of Washington were, we were relying on just one slice of data, sourced from one company, focused on one city, covering less than one year. Location data companies collect orders of magnitude more information every day than the totality of what Times Opinion received.
</p>

<p>
Data firms also typically draw on other sources of information that we didn’t use. We lacked the mobile advertising IDs or other identifiers that advertisers often combine with demographic information like home ZIP codes, age, gender, even phone numbers and emails to create detailed audience profiles used in <a href="https://www.nytimes.com/interactive/2019/04/30/opinion/privacy-targeted-advertising.html">targeted advertising</a>. When datasets are combined, privacy risks can be amplified. Whatever protections existed in the location dataset can crumble with the addition of only one or two other sources.
</p>

<p>
There are dozens of companies profiting off such data daily across the world — by collecting it directly from smartphones, creating new technology to better capture the data or creating audience profiles for targeted advertising.
</p>

<p>
The full collection of companies can feel dizzying, as it’s constantly changing and seems impossible to pin down. Many use technical and nuanced language that may be confusing to average smartphone users.
</p>

<p>
While many of them have been involved in the business of tracking us for years, the companies themselves are unfamiliar to most Americans. (Companies can work with data derived from GPS sensors, Bluetooth beacons and other sources. Not all companies in the location data business collect, buy, sell or work with granular location data.)
</p>

<p>
Location data companies generally downplay the risks of collecting such revealing information at scale. Many also say they’re not very concerned about potential regulation or software updates that could make it more difficult to collect location data.
</p>

<p>
“No, it doesn’t really keep us up at night,” Brian Czarny, chief marketing officer at Factual, one such company, said. He added that Factual does not resell detailed data like the information we reviewed. “We don’t feel like anybody should be doing that because it’s a risk to the whole business,” he said.
</p>

<p>
In the absence of a federal privacy law, the industry has largely relied on self-regulation. Several industry groups offer ethical guidelines meant to govern it. Factual joined the <a href="https://www.mmaglobal.com/">Mobile Marketing Association</a>, along with many other location data and marketing companies, in drafting a pledge intended to improve its self-regulation. The pledge is slated to be released next year.
</p>

<p>
States are starting to respond with their own laws. The California Consumer Privacy Act goes into effect next year and adds new protections for residents there, like allowing them to ask companies to delete their data or prevent its sale. But aside from a few new requirements, the law could leave the industry largely unencumbered.
</p>

<p>
“If a private company is legally collecting location data, they’re free to spread it or share it however they want,” said Calli Schroeder, a lawyer for the privacy and data protection company VeraSafe.
</p>

<p>
The companies are required to disclose very little about their data collection. By law, companies need only describe their practices in their privacy policies, which tend to be <a href="https://www.nytimes.com/interactive/2019/06/12/opinion/facebook-google-privacy-policies.html">dense legal documents</a> that few people read and even fewer can truly understand.
</p>

<h2 class="g-subhed g-optimize-type " id="">
EVERYTHING CAN BE HACKED
</h2>

<p>
<strong>Does it really matter</strong> that your information isn’t actually anonymous? Location data companies argue that your data is safe — that it poses no real risk because it’s stored on guarded servers. This assurance has been undermined by the parade of publicly reported data breaches — to say nothing of breaches that don’t make headlines. In truth, sensitive information can be easily transferred or leaked, as evidenced by this very story.
</p>

<p>
We’re constantly shedding data, for example, by surfing the internet or making credit card purchases. But location data is different. Our precise locations are used fleetingly in the moment for a targeted ad or notification, but then repurposed indefinitely for much more profitable ends, like tying your purchases to <a href="https://web.clearchanneloutdoor.com/radar">billboard ads</a> you drove past on the freeway. Many apps that use your location, like weather services, work perfectly well without your precise location — but collecting your location feeds a lucrative secondary business of analyzing, licensing and transferring that information to third parties.
</p>

<p class="g-source ">
<span class="g-caption">The data contains simple information like date, latitude and longitude, making it easy to inspect, download and transfer. Note: Values are randomized to protect sources and device owners.</span>
</p>

<p>
For many Americans, the only real risk they face from having their information exposed would be embarrassment or inconvenience. But for others, like survivors of abuse, the risks could be substantial. And who can say what practices or relationships any given individual might want to keep private, to withhold from friends, family, employers or the government? We found hundreds of pings in mosques and churches, abortion clinics, queer spaces and other sensitive areas.
</p>

<p>
In one case, we observed a change in the regular movements of a Microsoft engineer. He made a visit one Tuesday afternoon to the main Seattle campus of a Microsoft competitor, Amazon. The following month, he started a new job at Amazon. It took minutes to identify him as Ben Broili, a manager now for Amazon Prime Air, a drone delivery service.
</p>

<p>
“I can’t say I’m surprised,” Mr. Broili told us in early December. “But knowing that you all can get ahold of it and comb through and place me to see where I work and live — that’s weird.” That we could so easily discern that Mr. Broili was out on a job interview raises some obvious questions, like: Could the internal location surveillance of executives and employees become standard corporate practice?
</p>

<p>
Mr. Broili wasn’t worried about apps cataloguing his every move, but he said he felt unsure about whether the tradeoff between the services offered by the apps and the sacrifice of privacy was worth it. “It’s an awful lot of data,” he said. “And I really still don’t understand how it’s being used. I’d have to see how the other companies were weaponizing or monetizing it to make that call.”
</p>

<p>
If this kind of location data makes it easy to keep tabs on employees, it makes it just as simple to stalk celebrities. Their private conduct — even in the dead of night, in residences and far from paparazzi — could come under even closer scrutiny.
</p>

<p>
Reporters hoping to evade other forms of surveillance by meeting in person with a source might want to rethink that practice. Every major newsroom covered by the data contained dozens of pings; we easily traced one Washington Post journalist through Arlington, Va.
</p>

<p>
In other cases, there were detours to hotels and late-night visits to the homes of prominent people. One person, plucked from the data in Los Angeles nearly at random, was found traveling to and from roadside motels multiple times, for visits of only a few hours each time.
</p>

<p>
While these pointillist pings don’t in themselves reveal a complete picture, a lot can be gleaned by examining the date, time and length of time at each point.
</p>
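<p>
Inferring a visit from raw pings is mostly a matter of grouping consecutive pings that fall in the same place and measuring how long the device lingered. A rough sketch of that "stop detection", with an entirely made-up trail and a coarse grid cell standing in for a real clustering method:
</p>

```python
from datetime import datetime
from itertools import groupby

def dwell_times(pings, cell=0.001):
    """Group consecutive pings in the same coarse grid cell and report
    how long the device stayed there (a crude stop detector)."""
    key = lambda p: (round(p["lat"] / cell), round(p["lon"] / cell))
    stops = []
    for place, run in groupby(sorted(pings, key=lambda p: p["ts"]), key=key):
        run = list(run)
        start = datetime.fromisoformat(run[0]["ts"])
        end = datetime.fromisoformat(run[-1]["ts"])
        stops.append({"cell": place, "start": run[0]["ts"],
                      "minutes": (end - start).total_seconds() / 60})
    return stops

pings = [  # hypothetical trail: roughly two hours at one spot, then movement
    {"lat": 34.0522, "lon": -118.2437, "ts": "2017-03-04T23:05"},
    {"lat": 34.0522, "lon": -118.2437, "ts": "2017-03-05T01:10"},
    {"lat": 34.1000, "lon": -118.3000, "ts": "2017-03-05T02:00"},
]

for s in dwell_times(pings):
    print(s["start"], f"{s['minutes']:.0f} min")
```

<p>
A late-night stop of 125 minutes at a motel address tells its own story, even though no single ping does.
</p>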

<p>
Large data companies like Foursquare — perhaps the most familiar name in the location data business — say they don’t sell detailed location data like the kind reviewed for this story but rather use it to <a href="https://www.nytimes.com/2019/10/16/opinion/foursquare-privacy-internet.html">inform analysis</a>, such as measuring whether you <a href="https://enterprise.foursquare.com/products/attribution">entered a store</a> after seeing an ad on your mobile phone.
</p>

<p>
But a number of companies do sell the detailed data. Buyers are typically data brokers and advertising companies. But some of them have little to do with consumer advertising, including financial institutions, geospatial analysis companies and real estate investment firms that can process and analyze such large quantities of information. They might pay more than $1 million for a tranche of data, according to a former location data company employee who agreed to speak anonymously.
</p>

<p>
Location data is also collected and shared alongside a mobile advertising ID, a supposedly anonymous identifier about 30 digits long that allows advertisers and other businesses to tie activity together across apps. The ID is also used to combine location trails with other information like your name, home address, email, phone number or even an identifier tied to your Wi-Fi network.
</p>
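<p>
That combination is an ordinary database join: two datasets that are each "anonymous" on their own become identifying the moment they share the advertising ID as a key. A minimal sketch; the ID, the profile fields and the records are all invented for illustration:
</p>

```python
# A "supposedly anonymous" location row becomes personal the moment it is
# joined, on the shared advertising ID, with a profile table another broker
# holds. Every identifier and record below is hypothetical.
location_pings = [
    {"ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d",
     "lat": 38.9072, "lon": -77.0369, "ts": "2017-01-21T10:00"},
]

broker_profiles = {  # keyed by the same mobile advertising ID
    "38400000-8cf0-11bd-b23e-10b96e40000d":
        {"name": "Jane Doe", "home_zip": "20001", "email": "jane@example.com"},
}

def deanonymize(pings, profiles):
    """Attach identity fields to each ping whose ad ID appears in a profile."""
    return [{**p, **profiles[p["ad_id"]]} for p in pings if p["ad_id"] in profiles]

joined = deanonymize(location_pings, broker_profiles)
print(joined[0]["name"], joined[0]["lat"], joined[0]["lon"])
```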

<p>
The data can change hands in almost real time, so fast that your location could be transferred from your smartphone to the app’s servers and exported to third parties in milliseconds. This is how, for example, you might see an ad for a new car some time after walking through a dealership.
</p>

<p>
That data can then be resold, copied, pirated and abused. There’s no way you can ever retrieve it.
</p>

<p>
Location data is about far more than consumers seeing a few more relevant ads. This information provides critical intelligence for big businesses. The Weather Channel app’s parent company, for example, analyzed users’ location data for <a href="https://web.archive.org/web/20180731211011/https://business.weather.com/writable/documents/Financial-Markets/InvestorInsights_SolutionSheet.pdf">hedge funds</a>, according to a <a href="https://www.nytimes.com/2019/01/03/technology/weather-channel-app-lawsuit.html">lawsuit filed in Los Angeles this year</a> that was triggered by Times reporting. And Foursquare received much attention in 2016 after using its data trove to <a href="https://medium.com/foursquare-direct/foursquare-predicts-chipotle-s-q1-sales-down-nearly-30-foot-traffic-reveals-the-start-of-a-mixed-78515b2389af">predict</a> that after an E. coli crisis, Chipotle’s sales would drop by 30 percent in the coming months. Its same-store sales ultimately <a href="https://www.cnbc.com/2016/04/26/chipotle-reports-first-quarter-results.html">fell 29.7 percent</a>.
</p>

<p>
Much of the concern over location data has focused on telecom giants like Verizon and AT&amp;T, which have been <a href="https://www.wyden.senate.gov/imo/media/doc/at&amp;t%20letter%20to%20RW%206.15.pdf">selling location data</a> to third parties for years. Last year, Motherboard, Vice’s technology website, <a href="https://motherboard.vice.com/en_us/article/nepxbz/i-gave-a-bounty-hunter-300-dollars-located-phone-microbilt-zumigo-tmobile">found</a> that once the data was sold, it was being shared to help bounty hunters find specific cellphones in real time. The resulting scandal forced the telecom giants to <a href="https://arstechnica.com/tech-policy/2018/06/verizon-and-att-will-stop-selling-your-phones-location-to-data-brokers/">pledge</a> they would stop selling location movements to data brokers.
</p>

<p>
Yet no law prohibits them from doing so.
</p>

<p>
Location data is transmitted from your phone via software development kits, or S.D.K.s, as they’re known in the trade. The kits are small programs that can be used to build features within an app. They make it easy for app developers to include location-tracking features, a useful component of services like weather apps. Because they’re so useful and easy to use, S.D.K.s are embedded in thousands of apps. Facebook, Google and Amazon, for example, have extremely popular S.D.K.s that allow smaller apps to connect to bigger companies’ ad platforms or help provide web traffic analytics or payment infrastructure.
</p>

<p>
But they could also sit on an app and collect location data while providing no real service back to the app. Location companies may <a href="https://www.xmode.io/app-publishers/">pay</a> the apps to be included — collecting valuable data that can be monetized.
</p>

<p>
“If you have an S.D.K. that’s frequently collecting location data, it is more than likely being resold across the industry,” said Nick Hall, chief executive of the data marketplace company VenPath.
</p>

<h2 class="g-subhed g-optimize-type " id="">
THE ‘HOLY GRAIL’ FOR MARKETERS
</h2>

<p>
<strong>If this information is so sensitive,</strong> why is it collected in the first place?
</p>

<p>
For brands, following someone’s precise movements is key to understanding the “customer journey” — every step of the process from seeing an ad to buying a product. It’s the Holy Grail of advertising, one marketer said, the complete picture that connects all of our interests and online activity with our real-world actions.
</p>

<p>
Once they have the complete customer journey, companies know a lot about what we want, what we buy and what made us buy it. Other groups have begun to find ways to use it too. Political campaigns could analyze the interests and demographics of <a href="https://www.wsj.com/articles/political-campaigns-track-cellphones-to-identify-and-target-individual-voters-11570718889">rally attendees</a> and use that information to shape their messages to try to manipulate particular groups. Governments around the world could have a new tool to identify protesters.
</p>

<p>
Pointillist location data also has some clear benefits to society. Researchers can use the raw data to provide key insights for transportation studies and government planners. The City Council of Portland, Ore., unanimously <a href="https://www.opb.org/news/article/cellphone-location-data-portland-google-privacy/">approved a deal</a> to study traffic and transit by monitoring millions of cellphones. Unicef <a href="https://www.businesswire.com/news/home/20190910005037/en/Cuebiq%E2%80%99s-Data-Good-Program-UNICEF-High-Precision-Human">announced a plan</a> to use aggregated mobile location data to study epidemics, natural disasters and demographics.
</p>

<p>
For individual consumers, the value of constant tracking is less tangible. And the lack of transparency from the advertising and tech industries raises still more concerns.
</p>

<p>
Does a coupon app need to sell second-by-second location data to other companies to be profitable? Does that really justify allowing companies to track millions and potentially expose our private lives?
</p>

<p>
Data companies say users consent to tracking when they agree to share their location. But those consent screens rarely make clear how the data is being packaged and sold. If companies were clearer about what they were doing with the data, would anyone agree to share it?
</p>

<p>
What about data collected years ago, before hacks and leaks made privacy a forefront issue? Should it still be used, or should it be deleted for good?
</p>

<p>
If it’s possible that data stored securely today can easily be hacked, leaked or stolen, is this kind of data worth that risk?
</p>

<p>
Is all of this surveillance and risk worth it merely so that we can be served slightly more relevant ads? Or so that hedge fund managers can get richer?
</p>

<p>
The companies profiting from our every move can’t be expected to voluntarily limit their practices. Congress has to step in to protect Americans’ needs as consumers and rights as citizens.
</p>

<p>
Until then, one thing is certain: We are living in the world’s most advanced surveillance system. This system wasn’t created deliberately. It was built through the interplay of technological advance and the profit motive. It was built to make money. The greatest trick technology companies ever played was persuading society to surveil itself.
</p>
</article>


<hr>

<footer>
<p>
<a href="/david/" title="Aller à l’accueil">🏠</a> •
<a href="/david/log/" title="Accès au flux RSS">🤖</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant>🇨🇦</a> •
<a href="mailto:david%40larlet.fr" title="Envoyer un courriel">📮</a> •
<abbr title="Hébergeur : Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340">🧚</abbr>
</p>
</footer>
<script src="/static/david/js/instantpage-3.0.0.min.js" type="module" defer></script>
</body>
</html>

+ 286
- 0
cache/2020/2390380d879c04ee56baf320b6f7e681/index.md View File

@@ -0,0 +1,286 @@
title: Twelve Million Phones, One Dataset, Zero Privacy
url: https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html
hash_url: 2390380d879c04ee56baf320b6f7e681

<p>
<strong>Every minute of every day,</strong> everywhere on the planet, dozens of companies — largely unregulated, little scrutinized — are logging the movements of tens of millions of people with mobile phones and storing the information in gigantic data files. The Times <a href="https://www.nytimes.com/privacy-project">Privacy Project</a> obtained one such file, by far the largest and most sensitive ever to be reviewed by journalists. It holds more than 50 billion location pings from the phones of more than 12 million Americans as they moved through several major cities, including Washington, New York, San Francisco and Los Angeles.
</p>
<p>
Each piece of information in this file represents the precise location of a single smartphone over a period of several months in 2016 and 2017. The data was provided to Times Opinion by sources who asked to remain anonymous because they were not authorized to share it and could face severe penalties for doing so. The sources of the information said they had grown alarmed about how it might be abused and urgently wanted to inform the public and lawmakers.
</p>
<p>
<em>[Related: <a href="https://www.nytimes.com/interactive/2019/12/20/opinion/location-data-national-security.html">How to Track President Trump</a> — Read more about the national security risks found in the data.]</em>
</p>
<p>
After spending months sifting through the data, tracking the movements of people across the country and speaking with dozens of data companies, technologists, lawyers and academics who study this field, we feel the same sense of alarm. In the cities that the data file covers, it tracks people from nearly every neighborhood and block, whether they live in mobile homes in Alexandria, Va., or luxury towers in Manhattan.
</p>
<p>
One search turned up more than a dozen people visiting the Playboy Mansion, some overnight. Without much effort we spotted visitors to the estates of Johnny Depp, Tiger Woods and Arnold Schwarzenegger, connecting the devices’ owners to the residences indefinitely.
</p>
<p>
If you lived in one of the cities the dataset covers and use apps that share your location — anything from weather apps to local news apps to coupon savers — you could be in there, too.
</p>
<p>
If you could see the full trove, you might never use your phone the same way again.
</p>

<p>
<strong>The data reviewed by Times Opinion</strong> didn’t come from a telecom or giant tech company, nor did it come from a governmental surveillance operation. It originated from a location data company, one of dozens quietly collecting precise movements using software slipped onto mobile phone apps. You’ve probably never heard of most of the companies — and yet to anyone who has access to this data, your life is an open book. They can see the places you go every moment of the day, whom you meet with or spend the night with, where you pray, whether you visit a methadone clinic, a psychiatrist’s office or a massage parlor.
</p>
<p>
<a href="https://www.nytimes.com/interactive/2018/12/10/business/location-data-privacy-apps.html">The Times</a> and other news organizations have reported on smartphone tracking in the past. But never with a data set so large. Even still, this file represents just a small slice of what’s collected and sold every day by the location tracking industry — surveillance so omnipresent in our digital lives that it now seems impossible for anyone to avoid.
</p>

<p>
It doesn’t take much imagination to conjure the powers such always-on surveillance can provide an authoritarian regime like China’s. Within America’s own representative democracy, citizens would surely rise up in outrage if the government attempted to mandate that every person above the age of 12 carry a tracking device that revealed their location 24 hours a day. Yet, in the decade since Apple’s App Store was created, Americans have, app by app, consented to just such a system run by private companies. Now, as the decade ends, tens of millions of Americans, including many children, find themselves carrying spies in their pockets during the day and leaving them beside their beds at night — even though the corporations that control their data are far less accountable than the government would be.
</p>
<p>
<em>[Related: <a href="https://www.nytimes.com/interactive/2019/12/21/opinion/pasadena-smartphone-spying.html">Where Even the Children Are Being Tracked</a> — We followed every move of people in one city. Then we went to tell them.]</em>
</p>
<p>
“The seduction of these consumer products is so powerful that it blinds us to the possibility that there is another way to get the benefits of the technology without the invasion of privacy. But there is,” said William Staples, founding director of the Surveillance Studies Research Center at the University of Kansas. “All the companies collecting this location information act as what I have called Tiny Brothers, using a variety of data sponges to engage in everyday surveillance.”
</p>
<p>
In this and subsequent articles we’ll reveal what we’ve found and why it has so shaken us. We’ll ask you to consider the national security risks the existence of this kind of data creates and the specter of what such precise, always-on human tracking might mean in the hands of corporations and the government. We’ll also look at legal and ethical justifications that companies rely on to collect our precise locations and the deceptive techniques they use to lull us into sharing it.
</p>
<p>
Today, it’s perfectly legal to collect and sell all this information. In the United States, as in most of the world, no federal law limits what has become a vast and lucrative trade in human tracking. Only internal company policies and the decency of individual employees prevent those with access to the data from, say, stalking an estranged spouse or selling the evening commute of an intelligence officer to a hostile foreign power.
</p>
<p>
Companies say the data is shared only with vetted partners. As a society, we’re choosing simply to take their word for that, displaying a blithe faith in corporate beneficence that we don’t extend to far less intrusive yet more heavily regulated industries. Even if these companies are acting with the soundest moral code imaginable, there’s ultimately no foolproof way they can secure the data from falling into the hands of a foreign security service. Closer to home, on a smaller yet no less troubling scale, there are often few protections to stop an individual analyst with access to such data from tracking an ex-lover or a victim of abuse.
</p>
<h2 class="g-subhed g-optimize-type " id="">
A DIARY OF YOUR EVERY MOVEMENT
</h2>
<p>
<strong>The companies that collect</strong> all this information on your movements justify their business on the basis of three claims: People consent to be tracked, the data is anonymous and the data is secure.
</p>
<p>
None of those claims hold up, based on the file we’ve obtained and our review of company practices.
</p>
<p>
Yes, the location data contains billions of data points with no identifiable information like names or email addresses. But it’s child’s play to connect real names to the dots that appear on the maps.
</p>
<p>
Here’s what that looks like.
</p>

<p>
<strong>In most cases,</strong> ascertaining a home location and an office location was enough to identify a person. Consider your daily commute: Would any other smartphone travel directly between your house and your office every day?
</p>
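<p>
The home-and-office matching described above can be sketched in a few lines. This is an illustration only: the ping records, coordinates and the roughly 100-meter matching radius below are invented, not drawn from the reviewed file.
</p>

```python
# Sketch of home/office re-identification. All data below is made up;
# is_near() uses a crude degree-based radius (~100 m) for illustration.

def is_near(lat, lon, place, radius=0.001):
    """True if (lat, lon) falls within a small box around a known place."""
    return abs(lat - place[0]) < radius and abs(lon - place[1]) < radius

def devices_matching(pings, home, office):
    """Return device IDs seen near both a given home and a given office."""
    seen_home, seen_office = set(), set()
    for device_id, lat, lon, ts in pings:
        if is_near(lat, lon, home):
            seen_home.add(device_id)
        if is_near(lat, lon, office):
            seen_office.add(device_id)
    return seen_home & seen_office  # often narrows to a single device

pings = [
    ("a1", 38.8895, -77.0353, "2017-01-20T08:00"),  # near home
    ("a1", 38.9072, -77.0369, "2017-01-20T09:00"),  # near office
    ("b2", 38.8895, -77.0353, "2017-01-20T08:05"),  # near home only
]
print(devices_matching(pings, home=(38.8895, -77.0353),
                       office=(38.9072, -77.0369)))  # → {'a1'}
```

<p>
On a real dataset the candidate set collapses the same way: only one device tends to shuttle between a particular pair of addresses every day.
</p>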
<p>
Describing location data as anonymous is “a completely false claim” that has been debunked in multiple studies, Paul Ohm, a law professor and privacy researcher at the Georgetown University Law Center, told us. “Really precise, longitudinal geolocation information is absolutely impossible to anonymize.”
</p>
<p>
“D.N.A.,” he added, “is probably the only thing that’s harder to anonymize than precise geolocation information.”
</p>
<p>
<em>[Work in the location tracking industry? Seen an abuse of data? We want to hear from you. Using a non-work phone or computer, contact us on a secure line at 440-295-5934, @charliewarzel on Wire or email <a href="mailto:charlie.warzel@nytimes.com">Charlie Warzel</a> and <a href="mailto:stuart.thompson@nytimes.com">Stuart A. Thompson</a> directly.]</em>
</p>

<p>
Yet companies continue to claim that the data are anonymous. In marketing materials and at trade conferences, anonymity is a major selling point — key to allaying concerns over such invasive monitoring.
</p>
<p>
To evaluate the companies’ claims, we turned most of our attention to identifying people in positions of power. With the help of publicly available information, like home addresses, we easily identified and then tracked scores of notables. We followed military officials with security clearances as they drove home at night. We tracked law enforcement officers as they took their kids to school. We watched high-powered lawyers (and their guests) as they traveled from private jets to vacation properties. We did not name any of the people we identified without their permission.
</p>
<p>
The data set is large enough that it surely points to scandal and crime, but our purpose wasn’t to dig up dirt. We wanted to document the risk of underregulated surveillance.
</p>
<p>
Watching dots move across a map sometimes revealed hints of faltering marriages, evidence of drug addiction, records of visits to psychological facilities.
</p>
<p>
Connecting a sanitized ping to an actual human in time and place could feel like reading someone else’s diary.
</p>
<p>
In one case, we identified Mary Millben, a singer based in Virginia who has performed for three presidents, including President Trump. She was invited to the service at the Washington National Cathedral the morning after the president’s inauguration. That’s where we first found her.
</p>

<p>
She remembers how, surrounded by dignitaries and the first family, she was moved by the music echoing through the recesses of the cathedral while members of both parties joined together in prayer. All the while, the apps on her phone were also monitoring the moment, recording her position and the length of her stay in meticulous detail. For the advertisers who might buy access to the data, the intimate prayer service could well supply some profitable marketing insights.
</p>
<p>
“To know that you have a list of places I have been, and my phone is connected to that, that’s scary,” Ms. Millben told us. “What’s the business of a company benefiting off of knowing where I am? That seems a little dangerous to me.”
</p>
<p>
Like many people we identified in the data, Ms. Millben said she was careful about limiting how she shared her location. Yet like many of them, she also couldn’t name the app that might have collected it. Our privacy is only as secure as the least secure app on our device.
</p>
<p>
“That makes me uncomfortable,” she said. “I’m sure that makes every other person uncomfortable, to know that companies can have free rein to take your data, locations, whatever else they’re using. It is disturbing.”
</p>
<p>
<em>[Related: <a href="https://www.nytimes.com/2019/12/26/reader-center/location-tracking-phones-questions.html">What’s the Worst That Could Happen With My Phone Data?</a> — Our journalists answer your questions about their investigation into how companies track smartphone users.]</em>
</p>

<p>
The inauguration weekend yielded a trove of personal stories and experiences: elite attendees at presidential ceremonies, religious observers at church services, supporters assembling across the National Mall — all surveilled and recorded permanently in rigorous detail.
</p>
<p>
Protesters were tracked just as rigorously. After the pings of Trump supporters, basking in victory, vanished from the National Mall on Friday evening, they were replaced hours later by those of participants in the Women’s March, as a crowd of nearly half a million descended on the capital. Examining just a photo from the event, you might be hard-pressed to tie a face to a name. But in our data, pings at the protest connected to clear trails through the data, documenting the lives of protesters in the months before and after the protest, including where they lived and worked.
</p>
<p>
We spotted a senior official at the Department of Defense walking through the Women’s March, beginning on the National Mall and moving past the Smithsonian National Museum of American History that afternoon. His wife was also on the mall that day, something we discovered after tracking him to his home in Virginia. Her phone was also beaming out location data, along with the phones of several neighbors.
</p>

<p>
The official’s data trail also led to a high school, homes of friends, a visit to Joint Base Andrews, workdays spent in the Pentagon and a ceremony at Joint Base Myer-Henderson Hall with President Barack Obama in 2017 (nearly a dozen more phones were tracked there, too).
</p>
<p>
Inauguration Day weekend was marked by other protests — and riots. Hundreds of protesters, some in black hoods and masks, gathered north of the National Mall that Friday, eventually <a href="https://www.nytimes.com/2017/01/20/us/politics/inauguration-protests.html">setting fire to a limousine</a> near Franklin Square. The data documented those rioters, too. Filtering the data to that precise time and location led us to the doorsteps of some who were there. Police were present as well, many with faces obscured by riot gear. The data led us to the homes of at least two police officers who had been at the scene.
</p>
<p>
As revealing as our searches of Washington were, we were relying on just one slice of data, sourced from one company, focused on one city, covering less than one year. Location data companies collect orders of magnitude more information every day than the totality of what Times Opinion received.
</p>
<p>
Data firms also typically draw on other sources of information that we didn’t use. We lacked the mobile advertising IDs or other identifiers that advertisers often combine with demographic information like home ZIP codes, age, gender, even phone numbers and emails to create detailed audience profiles used in <a href="https://www.nytimes.com/interactive/2019/04/30/opinion/privacy-targeted-advertising.html">targeted advertising</a>. When datasets are combined, privacy risks can be amplified. Whatever protections existed in the location dataset can crumble with the addition of only one or two other sources.
</p>
<p>
There are dozens of companies profiting off such data daily across the world — by collecting it directly from smartphones, creating new technology to better capture the data or creating audience profiles for targeted advertising.
</p>
<p>
The full collection of companies can feel dizzying, as it’s constantly changing and seems impossible to pin down. Many use technical and nuanced language that may be confusing to average smartphone users.
</p>
<p>
While many of them have been involved in the business of tracking us for years, the companies themselves are unfamiliar to most Americans. (Companies can work with data derived from GPS sensors, Bluetooth beacons and other sources. Not all companies in the location data business collect, buy, sell or work with granular location data.)
</p>

<p>
Location data companies generally downplay the risks of collecting such revealing information at scale. Many also say they’re not very concerned about potential regulation or software updates that could make it more difficult to collect location data.
</p>
<p>
“No, it doesn’t really keep us up at night,” Brian Czarny, chief marketing officer at Factual, one such company, said. He added that Factual does not resell detailed data like the information we reviewed. “We don’t feel like anybody should be doing that because it’s a risk to the whole business,” he said.
</p>
<p>
In the absence of a federal privacy law, the industry has largely relied on self-regulation. Several industry groups offer ethical guidelines meant to govern it. Factual joined the <a href="https://www.mmaglobal.com/">Mobile Marketing Association</a>, along with many other location data and marketing companies, in drafting a pledge intended to improve its self-regulation. The pledge is slated to be released next year.
</p>
<p>
States are starting to respond with their own laws. The California Consumer Privacy Act goes into effect next year and adds new protections for residents there, like allowing them to ask companies to delete their data or prevent its sale. But aside from a few new requirements, the law could leave the industry largely unencumbered.
</p>
<p>
“If a private company is legally collecting location data, they’re free to spread it or share it however they want,” said Calli Schroeder, a lawyer for the privacy and data protection company VeraSafe.
</p>
<p>
The companies are required to disclose very little about their data collection. By law, companies need only describe their practices in their privacy policies, which tend to be <a href="https://www.nytimes.com/interactive/2019/06/12/opinion/facebook-google-privacy-policies.html">dense legal documents</a> that few people read and even fewer can truly understand.
</p>

<h2 class="g-subhed g-optimize-type " id="">
EVERYTHING CAN BE HACKED
</h2>
<p>
<strong>Does it really matter</strong> that your information isn’t actually anonymous? Location data companies argue that your data is safe — that it poses no real risk because it’s stored on guarded servers. This assurance has been undermined by the parade of publicly reported data breaches — to say nothing of breaches that don’t make headlines. In truth, sensitive information can be easily transferred or leaked, as evidenced by this very story.
</p>
<p>
We’re constantly shedding data, for example, by surfing the internet or making credit card purchases. But location data is different. Our precise locations are used fleetingly in the moment for a targeted ad or notification, but then repurposed indefinitely for much more profitable ends, like tying your purchases to <a href="https://web.clearchanneloutdoor.com/radar">billboard ads</a> you drove past on the freeway. Many apps that use your location, like weather services, work perfectly well without your precise location — but collecting your location feeds a lucrative secondary business of analyzing, licensing and transferring that information to third parties.
</p>

<p class="g-source ">
<span class="g-caption">The data contains simple information like date, latitude and longitude, making it easy to inspect, download and transfer. Note: Values are randomized to protect sources and device owners.</span>
</p>

<p>
For many Americans, the only real risk they face from having their information exposed would be embarrassment or inconvenience. But for others, like survivors of abuse, the risks could be substantial. And who can say what practices or relationships any given individual might want to keep private, to withhold from friends, family, employers or the government? We found hundreds of pings in mosques and churches, abortion clinics, queer spaces and other sensitive areas.
</p>
<p>
In one case, we observed a change in the regular movements of a Microsoft engineer. He made a visit one Tuesday afternoon to the main Seattle campus of a Microsoft competitor, Amazon. The following month, he started a new job at Amazon. It took minutes to identify him as Ben Broili, a manager now for Amazon Prime Air, a drone delivery service.
</p>
<p>
“I can’t say I’m surprised,” Mr. Broili told us in early December. “But knowing that you all can get ahold of it and comb through and place me to see where I work and live — that’s weird.” That we could so easily discern that Mr. Broili was out on a job interview raises some obvious questions, like: Could the internal location surveillance of executives and employees become standard corporate practice?
</p>

<p>
Mr. Broili wasn’t worried about apps cataloguing his every move, but he said he felt unsure about whether the tradeoff between the services offered by the apps and the sacrifice of privacy was worth it. “It’s an awful lot of data,” he said. “And I really still don’t understand how it’s being used. I’d have to see how the other companies were weaponizing or monetizing it to make that call.”
</p>
<p>
If this kind of location data makes it easy to keep tabs on employees, it makes it just as simple to stalk celebrities. Their private conduct — even in the dead of night, in residences and far from paparazzi — could come under even closer scrutiny.
</p>
<p>
Reporters hoping to evade other forms of surveillance by meeting in person with a source might want to rethink that practice. Every major newsroom covered by the data contained dozens of pings; we easily traced one Washington Post journalist through Arlington, Va.
</p>
<p>
In other cases, there were detours to hotels and late-night visits to the homes of prominent people. One person, plucked from the data in Los Angeles nearly at random, was found traveling to and from roadside motels multiple times, for visits of only a few hours each time.
</p>
<p>
While these pointillist pings don’t in themselves reveal a complete picture, a lot can be gleaned by examining the date, time and length of time at each point.
</p>
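<p>
The inference described above — reading date, time and dwell length off a trail of pings — can be sketched as follows. The ping tuples and the roughly 200-meter grouping threshold are hypothetical, chosen only to make the example run.
</p>

```python
# Sketch of turning raw pings for one device into visits with dwell times.
# All coordinates and timestamps are invented for illustration.
from datetime import datetime

def visits(pings, threshold=0.002):
    """Group consecutive same-place pings into (lat, lon, start, minutes)."""
    stays = []  # each entry: [lat, lon, first_seen, last_seen]
    for lat, lon, ts in pings:
        t = datetime.fromisoformat(ts)
        if stays and abs(lat - stays[-1][0]) < threshold \
                 and abs(lon - stays[-1][1]) < threshold:
            stays[-1][3] = t  # still at the same place: extend the stay
        else:
            stays.append([lat, lon, t, t])  # moved: start a new visit
    return [(lat, lon, start, (end - start).seconds // 60)
            for lat, lon, start, end in stays]

pings = [  # one device, sorted by time
    (34.05, -118.24, "2016-07-01T23:10"),
    (34.05, -118.24, "2016-07-02T01:40"),  # same spot, 2.5 hours later
    (34.10, -118.33, "2016-07-02T08:00"),  # somewhere else the next morning
]
for lat, lon, start, minutes in visits(pings):
    print(lat, lon, start, f"{minutes} min")
```

<p>
Even two pings at the same spot are enough to bound a stay: a late-night arrival and an early-morning departure tell their own story.
</p>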
<p>
Large data companies like Foursquare — perhaps the most familiar name in the location data business — say they don’t sell detailed location data like the kind reviewed for this story but rather use it to <a href="https://www.nytimes.com/2019/10/16/opinion/foursquare-privacy-internet.html">inform analysis</a>, such as measuring whether you <a href="https://enterprise.foursquare.com/products/attribution">entered a store</a> after seeing an ad on your mobile phone.
</p>
<p>
But a number of companies do sell the detailed data. Buyers are typically data brokers and advertising companies. But some of them have little to do with consumer advertising, including financial institutions, geospatial analysis companies and real estate investment firms that can process and analyze such large quantities of information. They might pay more than $1 million for a tranche of data, according to a former location data company employee who agreed to speak anonymously.
</p>
<p>
Location data is also collected and shared alongside a mobile advertising ID, a supposedly anonymous identifier about 30 digits long that allows advertisers and other businesses to tie activity together across apps. The ID is also used to combine location trails with other information like your name, home address, email, phone number or even an identifier tied to your Wi-Fi network.
</p>
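<p>
A minimal sketch of that join, with an entirely made-up advertising ID and broker profile — the field names here are illustrative, not an actual broker schema:
</p>

```python
# Sketch: tying a location trail to a person via a shared advertising ID.
# The ID, name and profile fields below are fabricated for illustration.
location_pings = [
    {"ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d",
     "lat": 40.75, "lon": -73.99, "ts": "2017-01-21T14:02"},
]
broker_profiles = [
    {"ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d",
     "name": "Jane Doe", "home_zip": "10001", "email": "jane@example.com"},
]

# Index profiles by advertising ID, then look each ping up against it.
profiles = {p["ad_id"]: p for p in broker_profiles}
for ping in location_pings:
    person = profiles.get(ping["ad_id"])
    if person:
        print(person["name"], "was at", (ping["lat"], ping["lon"]))
```

<p>
One shared key is all it takes: once any dataset pairs the advertising ID with a name, every ping carrying that ID is no longer anonymous.
</p>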
<p>
The data can change hands in almost real time, so fast that your location could be transferred from your smartphone to the app’s servers and exported to third parties in milliseconds. This is how, for example, you might see an ad for a new car some time after walking through a dealership.
</p>
<p>
That data can then be resold, copied, pirated and abused. There’s no way you can ever retrieve it.
</p>
<p>
Location data is about far more than consumers seeing a few more relevant ads. This information provides critical intelligence for big businesses. The Weather Channel app’s parent company, for example, analyzed users’ location data for <a href="https://web.archive.org/web/20180731211011/https://business.weather.com/writable/documents/Financial-Markets/InvestorInsights_SolutionSheet.pdf">hedge funds</a>, according to a <a href="https://www.nytimes.com/2019/01/03/technology/weather-channel-app-lawsuit.html">lawsuit filed in Los Angeles this year</a> that was triggered by Times reporting. And Foursquare received much attention in 2016 after using its data trove to <a href="https://medium.com/foursquare-direct/foursquare-predicts-chipotle-s-q1-sales-down-nearly-30-foot-traffic-reveals-the-start-of-a-mixed-78515b2389af">predict</a> that after an E. coli crisis, Chipotle’s sales would drop by 30 percent in the coming months. Its same-store sales ultimately <a href="https://www.cnbc.com/2016/04/26/chipotle-reports-first-quarter-results.html">fell 29.7 percent</a>.
</p>
<p>
Much of the concern over location data has focused on telecom giants like Verizon and AT&amp;T, which have been <a href="https://www.wyden.senate.gov/imo/media/doc/at&amp;t%20letter%20to%20RW%206.15.pdf">selling location data</a> to third parties for years. Last year, Motherboard, Vice’s technology website, <a href="https://motherboard.vice.com/en_us/article/nepxbz/i-gave-a-bounty-hunter-300-dollars-located-phone-microbilt-zumigo-tmobile">found</a> that once the data was sold, it was being shared to help bounty hunters find specific cellphones in real time. The resulting scandal forced the telecom giants to <a href="https://arstechnica.com/tech-policy/2018/06/verizon-and-att-will-stop-selling-your-phones-location-to-data-brokers/">pledge</a> they would stop selling location movements to data brokers.
</p>
<p>
Yet no law prohibits them from doing so.
</p>
<p>
Location data is transmitted from your phone via software development kits, or S.D.K.s, as they’re known in the trade. The kits are small programs that can be used to build features within an app. They make it easy for app developers to include location-tracking features, a useful component of services like weather apps. Because they’re so useful and easy to use, S.D.K.s are embedded in thousands of apps. Facebook, Google and Amazon, for example, have extremely popular S.D.K.s that allow smaller apps to connect to bigger companies’ ad platforms or help provide web traffic analytics or payment infrastructure.
</p>
<p>
But they could also sit on an app and collect location data while providing no real service back to the app. Location companies may <a href="https://www.xmode.io/app-publishers/">pay</a> the apps to be included — collecting valuable data that can be monetized.
</p>
<p>
“If you have an S.D.K. that’s frequently collecting location data, it is more than likely being resold across the industry,” said Nick Hall, chief executive of the data marketplace company VenPath.
</p>

<h2 class="g-subhed g-optimize-type " id="">
THE ‘HOLY GRAIL’ FOR MARKETERS
</h2>
<p>
<strong>If this information is so sensitive,</strong> why is it collected in the first place?
</p>
<p>
For brands, following someone’s precise movements is key to understanding the “customer journey” — every step of the process from seeing an ad to buying a product. It’s the Holy Grail of advertising, one marketer said, the complete picture that connects all of our interests and online activity with our real-world actions.
</p>
<p>
Once they have the complete customer journey, companies know a lot about what we want, what we buy and what made us buy it. Other groups have begun to find ways to use it too. Political campaigns could analyze the interests and demographics of <a href="https://www.wsj.com/articles/political-campaigns-track-cellphones-to-identify-and-target-individual-voters-11570718889">rally attendees</a> and use that information to shape their messages to try to manipulate particular groups. Governments around the world could have a new tool to identify protesters.
</p>
<p>
Pointillist location data also has some clear benefits to society. Researchers can use the raw data to provide key insights for transportation studies and government planners. The City Council of Portland, Ore., unanimously <a href="https://www.opb.org/news/article/cellphone-location-data-portland-google-privacy/">approved a deal</a> to study traffic and transit by monitoring millions of cellphones. Unicef <a href="https://www.businesswire.com/news/home/20190910005037/en/Cuebiq%E2%80%99s-Data-Good-Program-UNICEF-High-Precision-Human">announced a plan</a> to use aggregated mobile location data to study epidemics, natural disasters and demographics.
</p>
<p>
For individual consumers, the value of constant tracking is less tangible. And the lack of transparency from the advertising and tech industries raises still more concerns.
</p>
<p>
Does a coupon app need to sell second-by-second location data to other companies to be profitable? Does that really justify allowing companies to track millions and potentially expose our private lives?
</p>
<p>
Data companies say users consent to tracking when they agree to share their location. But those consent screens rarely make clear how the data is being packaged and sold. If companies were clearer about what they were doing with the data, would anyone agree to share it?
</p>
<p>
What about data collected years ago, before hacks and leaks made privacy a forefront issue? Should it still be used, or should it be deleted for good?
</p>
<p>
If it’s possible that data stored securely today can easily be hacked, leaked or stolen, is this kind of data worth that risk?
</p>
<p>
Is all of this surveillance and risk worth it merely so that we can be served slightly more relevant ads? Or so that hedge fund managers can get richer?
</p>
<p>
The companies profiting from our every move can’t be expected to voluntarily limit their practices. Congress has to step in to protect Americans’ needs as consumers and rights as citizens.
</p>
<p>
Until then, one thing is certain: We are living in the world’s most advanced surveillance system. This system wasn’t created deliberately. It was built through the interplay of technological advance and the profit motive. It was built to make money. The greatest trick technology companies ever played was persuading society to surveil itself.
</p>

+ 75
- 0
cache/2020/58add7873e65625beba4c859d40a278b/index.html View File

@@ -0,0 +1,75 @@
<!doctype html><!-- This is a valid HTML5 document. -->
<!-- Screen readers, SEO, extensions and so on. -->
<html lang="fr">
<!-- Has to be within the first 1024 bytes, hence before the <title>
See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
<meta charset="utf-8">
<!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
<!-- The viewport meta is quite crowded and we are responsible for that.
See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
<meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1,shrink-to-fit=no">
<!-- Required to make a valid HTML5 document. -->
<title>TikTok and the coming of infinite media (archive) — David Larlet</title>
<!-- Lightest blank gif, avoids an extra query to the server. -->
<link rel="icon" href="data:;base64,iVBORw0KGgo=">
<!-- Thank you Florens! -->
<link rel="stylesheet" href="/static/david/css/style_2020-01-09.css">
<!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" crossorigin>

<meta name="robots" content="noindex, nofollow">
<meta content="origin-when-cross-origin" name="referrer">
<!-- Canonical URL for SEO purposes -->
<link rel="canonical" href="http://www.roughtype.com/?p=8677">

<body class="remarkdown h1-underline h2-underline hr-center ul-star pre-tick">

<article>
<h1>TikTok and the coming of infinite media</h1>
<h2><a href="http://www.roughtype.com/?p=8677">Original source of the content</a></h2>
<figure class="wp-block-image"><img src="http://www.roughtype.com/wp/wp-content/plugins/jetpack/modules/lazy-images/images/1x1.trans.gif" alt="" class="wp-image-8691" data-lazy-src="https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?fit=625%2C294" data-lazy-srcset="https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?w=1656 1656w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=300%2C141 300w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=768%2C361 768w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=1024%2C481 1024w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=624%2C293 624w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?w=1250 1250w" data-lazy-sizes="(max-width: 625px) 100vw, 625px"/><noscript><img src="https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?fit=625%2C294" alt="" class="wp-image-8691" srcset="https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?w=1656 1656w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=300%2C141 300w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=768%2C361 768w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=1024%2C481 1024w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=624%2C293 624w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?w=1250 1250w" sizes="(max-width: 625px) 100vw, 625px"/></noscript></figure>

<p>If Instagram <a href="http://www.roughtype.com/?p=2106">showed</a> us what a world without art looks like, TikTok shows us what a world without shame looks like. The old virtues of restraint — prudence, discretion, tact — are gone. There is only one virtue: to be seen. In TikTok’s world, which more and more is our world, shamelessness has lost its negative connotations and become an asset. You may not get fifteen minutes of fame, but you will get fifteen seconds.</p>

<p>The rise of TikTok heralds something bigger, though: a reconfiguration of media. As mass media defined the twentieth century, so the twenty-first will be defined by infinite media. The media business has always aspired to endlessness, to securing an unbroken hold on the sense organs of the public. TikTok at last achieves it. More than YouTube, more than Facebook, more than Instagram, more than Twitter, TikTok reveals the sticky new atmosphere of our lives.</p>

<p>Infinite media requires endlessness on two fronts: supply and demand. Shamelessness, in this context, is best understood as a supply-side resource, a means of production. To manufacture the unlimited supply of content that an app like TikTok needs, the total productive capacity of the masses needs to be <a href="http://www.roughtype.com/?p=634">mobilized</a>. That requires not just the ready availability of media-production tools (the smartphone’s camera and microphone and its editing software) and the existence of a universal broadcast network (the internet), but also a culture that encourages and celebrates self-exposure and self-promotion. Vanity must go unchecked by modesty. The showoff, once a risible figure, must become an aspirational one.</p>

<p>On the demand side, too, TikTok achieves endlessness. It is endless horizontally, each video an infinitely looping GIF, and it is endless vertically, the videos stacked up in an infinite scroll. There is no exit from TikTok’s cinema. One college student I know, having recently downloaded the app, told me that she now finds herself watching TikToks until her iPhone battery dies. She can’t pull her eyes away from the screen, but she is still able to withstand the temptation to recharge her phone while the app’s running. Electrical failure is the last defense against infinite media.</p>

<p>TikTok’s Chinese owner, ByteDance, specializes in using machine-learning algorithms to tailor content to individual appetites. (With artificial intelligence, there is accounting for taste.) “Personalised information flows will dictate the way,” the company <a href="https://ailab.bytedance.com/">declares</a> in a vaguely Maoish aphorism in its mission statement. It doesn’t need to build exhaustive data profiles of its users as, say, Facebook does. It just watches what you watch, and how you watch it, and then feeds you whatever video has the highest calculated probability of tickling your fancy. You feel the frisson of discovery, but behind the scenes it’s just a machine pumping out widgets. “TikTok deals in the illusion, at least, of revelation,” <em>New York Times</em> critic Amanda Hess <a href="https://www.nytimes.com/interactive/2019/10/10/arts/TIK-TOK.html">writes</a>. Not to mention the illusion, at least, of egalitarianism, of communalism, of joy.</p>
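The mechanism described above, watching what you watch and how you watch it, then serving whichever clip scores the highest predicted appeal, can be sketched as a toy ranking loop. Everything below (the signal names, the weights, the affinity formula) is invented for illustration; it is not ByteDance's actual system.

```python
# Toy sketch (not ByteDance's real recommender): rank candidate videos by
# a per-tag affinity score learned from simple watch signals.

from dataclasses import dataclass

@dataclass
class WatchEvent:
    tag: str           # e.g. "dance", "pets"
    completion: float  # fraction of the clip actually watched (0..1)
    rewatched: bool    # did the viewer let it loop again?

def taste_profile(history: list[WatchEvent]) -> dict[str, float]:
    """Aggregate how you watch into a normalized per-tag affinity score."""
    scores: dict[str, float] = {}
    for ev in history:
        signal = ev.completion + (0.5 if ev.rewatched else 0.0)
        scores[ev.tag] = scores.get(ev.tag, 0.0) + signal
    total = sum(scores.values()) or 1.0
    return {tag: s / total for tag, s in scores.items()}

def next_video(candidates: list[tuple[str, str]],
               profile: dict[str, float]) -> str:
    """Feed whichever (clip_id, tag) pair has the highest affinity."""
    return max(candidates, key=lambda c: profile.get(c[1], 0.0))[0]

history = [
    WatchEvent("pets", 1.0, True),
    WatchEvent("pets", 0.9, False),
    WatchEvent("dance", 0.2, False),
]
profile = taste_profile(history)
print(next_video([("clip_a", "dance"), ("clip_b", "pets")], profile))  # → clip_b
```

No exhaustive demographic profile is needed: the only input is the watch history itself, which is precisely the point Carr is making.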

<blockquote class="wp-block-quote"><p>When I tap the heart on some high school kid’s weird video, I feel a flicker of pride, as if I am supporting him in some way. But all I am really doing is demanding more.</p></blockquote>

<p>TikTok is at once a manifestation and a parody of what Stanford communication professor Fred Turner has <a href="https://www.press.uchicago.edu/ucp/books/book/chicago/D/bo10509859.html">termed</a> the “democratic surround.” From the 1940s through the 1960s, media-minded intellectuals promoted the ideal of a polyphonic multimedia experience that would be created and consumed by the public. The democratic surround would not only free the masses from centrally controlled media, with its authoritarian aura, but would raise the collective consciousness. TikTok gives us the democratic surround, but it turns out to be a pantomime. The central authority is still there, hidden behind a mask of your face.*</p>

<p>Infinite media sucks in all media, from news to entertainment to communication. Look at what’s going on in pop. Each TikTok has a soundtrack, a looping clip spinning on a wee turntable in the corner of the screen. The music business, seeing TikTok’s ability to turn songs into memes, has already developed a craving for the app’s yee yee juice. As Jia Tolentino <a href="https://www.newyorker.com/magazine/2019/09/30/how-tiktok-holds-our-attention">explains</a> in the <em>New Yorker</em>:</p>

<blockquote class="wp-block-quote"><p>Certain musical elements serve as TikTok catnip: bass-heavy transitions that can be used as punch lines; rap songs that are easy to lip-synch or include a narrative-friendly call and response. A twenty-six-year-old Australian producer named Adam Friedman, half of the duo Cookie Cutters, told me that he was now concentrating on lyrics that you could act out with your hands. “I write hooks, and I try it in the mirror—how many hand movements can I fit into fifteen seconds?” he said. “You know, goodbye, call me back, peace out, F you.”</p></blockquote>

<p>The aural hooks amplify the visual hooks, and vice versa, to saturate the sensorium. When it comes to the infinite, more is always better.</p>

<p>Boomers may struggle to make sense of TikTok, but they’ll appreciate its most obvious antecedent: the Ed Sullivan Show. Squeeze old Ed through a wormhole and give him a spin in a Vitamix, and you get TikTok. There’s Liza Minnelli singing “MacArthur Park,” then there’s a guy spinning plates on the ends of sticks, then there’s Señor Wences ventriloquizing through a hand puppet. Except it’s all us. We’re Liza, we’re the plate-spinning guy, we’re Señor Wences, we’re the puppet. We’re even Ed, flicking acts on and off the stage with the capriciousness of a pagan god.</p>

<p>Every Sunday night during the sixties the nation found itself glued to the set, engrossed in a variety show. It was an omen.</p>

<p>___________<br/>*In a recent essay, collected in the book <em>Trump and the Media</em> (reviewed <a href="https://lareviewofbooks.org/article/can-journalism-be-saved">here</a> by me), Turner argues that the democratization of media may paradoxically breed authoritarianism.<br/></p>
</article>


<hr>

<footer>
<p>
<a href="/david/" title="Go to the homepage">🏠</a> •
<a href="/david/log/" title="RSS feed">🤖</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant>🇨🇦</a> •
<a href="mailto:david%40larlet.fr" title="Send an email">📮</a> •
<abbr title="Host: Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340">🧚</abbr>
</p>
</footer>
<script src="/static/david/js/instantpage-3.0.0.min.js" type="module" defer></script>
</body>
</html>

+ 31
- 0
cache/2020/58add7873e65625beba4c859d40a278b/index.md

@@ -0,0 +1,31 @@
title: TikTok and the coming of infinite media
url: http://www.roughtype.com/?p=8677
hash_url: 58add7873e65625beba4c859d40a278b

<figure class="wp-block-image"><img src="http://www.roughtype.com/wp/wp-content/plugins/jetpack/modules/lazy-images/images/1x1.trans.gif" alt="" class="wp-image-8691" data-lazy-src="https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?fit=625%2C294" data-lazy-srcset="https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?w=1656 1656w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=300%2C141 300w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=768%2C361 768w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=1024%2C481 1024w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=624%2C293 624w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?w=1250 1250w" data-lazy-sizes="(max-width: 625px) 100vw, 625px"/><noscript><img src="https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?fit=625%2C294" alt="" class="wp-image-8691" srcset="https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?w=1656 1656w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=300%2C141 300w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=768%2C361 768w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=1024%2C481 1024w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?resize=624%2C293 624w, https://i2.wp.com/www.roughtype.com/wp/wp-content/uploads/2020/01/tiktok.jpg?w=1250 1250w" sizes="(max-width: 625px) 100vw, 625px"/></noscript></figure>

<p>If Instagram <a href="http://www.roughtype.com/?p=2106">showed</a> us what a world without art looks like, TikTok shows us what a world without shame looks like. The old virtues of restraint — prudence, discretion, tact — are gone. There is only one virtue: to be seen. In TikTok’s world, which more and more is our world, shamelessness has lost its negative connotations and become an asset. You may not get fifteen minutes of fame, but you will get fifteen seconds.</p>

<p>The rise of TikTok heralds something bigger, though: a reconfiguration of media. As mass media defined the twentieth century, so the twenty-first will be defined by infinite media. The media business has always aspired to endlessness, to securing an unbroken hold on the sense organs of the public. TikTok at last achieves it. More than YouTube, more than Facebook, more than Instagram, more than Twitter, TikTok reveals the sticky new atmosphere of our lives.</p>

<p>Infinite media requires endlessness on two fronts: supply and demand. Shamelessness, in this context, is best understood as a supply-side resource, a means of production. To manufacture the unlimited supply of content that an app like TikTok needs, the total productive capacity of the masses needs to be <a href="http://www.roughtype.com/?p=634">mobilized</a>. That requires not just the ready availability of media-production tools (the smartphone’s camera and microphone and its editing software) and the existence of a universal broadcast network (the internet), but also a culture that encourages and celebrates self-exposure and self-promotion. Vanity must go unchecked by modesty. The showoff, once a risible figure, must become an aspirational one.</p>

<p>On the demand side, too, TikTok achieves endlessness. It is endless horizontally, each video an infinitely looping GIF, and it is endless vertically, the videos stacked up in an infinite scroll. There is no exit from TikTok’s cinema. One college student I know, having recently downloaded the app, told me that she now finds herself watching TikToks until her iPhone battery dies. She can’t pull her eyes away from the screen, but she is still able to withstand the temptation to recharge her phone while the app’s running. Electrical failure is the last defense against infinite media.</p>

<p>TikTok’s Chinese owner, ByteDance, specializes in using machine-learning algorithms to tailor content to individual appetites. (With artificial intelligence, there is accounting for taste.) “Personalised information flows will dictate the way,” the company <a href="https://ailab.bytedance.com/">declares</a> in a vaguely Maoish aphorism in its mission statement. It doesn’t need to build exhaustive data profiles of its users as, say, Facebook does. It just watches what you watch, and how you watch it, and then feeds you whatever video has the highest calculated probability of tickling your fancy. You feel the frisson of discovery, but behind the scenes it’s just a machine pumping out widgets. “TikTok deals in the illusion, at least, of revelation,” <em>New York Times</em> critic Amanda Hess <a href="https://www.nytimes.com/interactive/2019/10/10/arts/TIK-TOK.html">writes</a>. Not to mention the illusion, at least, of egalitarianism, of communalism, of joy.</p>

<blockquote class="wp-block-quote"><p>When I tap the heart on some high school kid’s weird video, I feel a flicker of pride, as if I am supporting him in some way. But all I am really doing is demanding more.</p></blockquote>

<p>TikTok is at once a manifestation and a parody of what Stanford communication professor Fred Turner has <a href="https://www.press.uchicago.edu/ucp/books/book/chicago/D/bo10509859.html">termed</a> the “democratic surround.” From the 1940s through the 1960s, media-minded intellectuals promoted the ideal of a polyphonic multimedia experience that would be created and consumed by the public. The democratic surround would not only free the masses from centrally controlled media, with its authoritarian aura, but would raise the collective consciousness. TikTok gives us the democratic surround, but it turns out to be a pantomime. The central authority is still there, hidden behind a mask of your face.*</p>

<p>Infinite media sucks in all media, from news to entertainment to communication. Look at what’s going on in pop. Each TikTok has a soundtrack, a looping clip spinning on a wee turntable in the corner of the screen. The music business, seeing TikTok’s ability to turn songs into memes, has already developed a craving for the app’s yee yee juice. As Jia Tolentino <a href="https://www.newyorker.com/magazine/2019/09/30/how-tiktok-holds-our-attention">explains</a> in the <em>New Yorker</em>:</p>

<blockquote class="wp-block-quote"><p>Certain musical elements serve as TikTok catnip: bass-heavy transitions that can be used as punch lines; rap songs that are easy to lip-synch or include a narrative-friendly call and response. A twenty-six-year-old Australian producer named Adam Friedman, half of the duo Cookie Cutters, told me that he was now concentrating on lyrics that you could act out with your hands. “I write hooks, and I try it in the mirror—how many hand movements can I fit into fifteen seconds?” he said. “You know, goodbye, call me back, peace out, F you.”</p></blockquote>

<p>The aural hooks amplify the visual hooks, and vice versa, to saturate the sensorium. When it comes to the infinite, more is always better.</p>

<p>Boomers may struggle to make sense of TikTok, but they’ll appreciate its most obvious antecedent: the Ed Sullivan Show. Squeeze old Ed through a wormhole and give him a spin in a Vitamix, and you get TikTok. There’s Liza Minnelli singing “MacArthur Park,” then there’s a guy spinning plates on the ends of sticks, then there’s Señor Wences ventriloquizing through a hand puppet. Except it’s all us. We’re Liza, we’re the plate-spinning guy, we’re Señor Wences, we’re the puppet. We’re even Ed, flicking acts on and off the stage with the capriciousness of a pagan god.</p>

<p>Every Sunday night during the sixties the nation found itself glued to the set, engrossed in a variety show. It was an omen.</p>

<p>___________<br/>*In a recent essay, collected in the book <em>Trump and the Media</em> (reviewed <a href="https://lareviewofbooks.org/article/can-journalism-be-saved">here</a> by me), Turner argues that the democratization of media may paradoxically breed authoritarianism.<br/></p>

+ 135
- 0
cache/2020/59dac1925636ebf6358c3a598bf834f9/index.html

@@ -0,0 +1,135 @@
<!doctype html><!-- This is a valid HTML5 document. -->
<!-- Screen readers, SEO, extensions and so on. -->
<html lang="fr">
<!-- Has to be within the first 1024 bytes, hence before the <title>
See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
<meta charset="utf-8">
<!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
<!-- The viewport meta is quite crowded and we are responsible for that.
See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
<meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1,shrink-to-fit=no">
<!-- Required to make a valid HTML5 document. -->
<title>A pedophile is an Apple customer like any other. (archive) — David Larlet</title>
<!-- Lightest blank gif, avoids an extra query to the server. -->
<link rel="icon" href="data:;base64,iVBORw0KGgo=">
<!-- Thank you Florens! -->
<link rel="stylesheet" href="/static/david/css/style_2020-01-09.css">
<!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" crossorigin>

<meta name="robots" content="noindex, nofollow">
<meta content="origin-when-cross-origin" name="referrer">
<!-- Canonical URL for SEO purposes -->
<link rel="canonical" href="https://www.affordance.info/mon_weblog/2020/01/pedophile-client-apple.html">

<body class="remarkdown h1-underline h2-underline hr-center ul-star pre-tick">

<article>
<h1>A pedophile is an Apple customer like any other.</h1>
<h2><a href="https://www.affordance.info/mon_weblog/2020/01/pedophile-client-apple.html">Original source of the content</a></h2>
<p>If Gabriel Matzneff no longer knows where to publish the account of his pedophilic exploits, now that two of his longtime publishers have announced the withdrawal of his books, he still has the option of creating an Apple account. Let me explain. </p>

<p>Apple, through Jane Horvath, the company’s vice president in charge of privacy, <a href="https://www.01net.com/actualites/apple-confirme-acceder-aux-photos-de-ses-clients-pour-lutter-contre-la-pedopornographie-1837691.html" rel="noopener" target="_blank">has just announced at CES</a> (the Consumer Electronics Show) <a href="https://www.telegraph.co.uk/technology/2020/01/08/apple-scans-icloud-photos-check-child-abuse/" rel="noopener" target="_blank">that it will now “scan” every user’s iCloud account to detect possible child-abuse photos</a>. And that if it finds such photos, it will close the iCloud accounts in question. But it will not report those accounts to the courts or the police. </p>

<p>This raises several problems, which I will try to address here. </p>

<h2>The private view.</h2>

<p>Apple has always had, as its commercial <em>credo</em> and its customer-facing <em>confiteor</em>, a doctrine of total inviolability of its users’ “privacy” across the devices, services and terminals the firm sells. </p>

<p>At the cost, in recent years, of bruising debates between the advocates of a very rigorist view of privacy and personal data (everything belonging to private life is private at all times and for everyone, including the courts, whatever the circumstances), and others, including me, defending the idea that (in a democracy) there must be no space that is irremediably beyond the reach of a search warrant, <em>a fortiori</em> when it is a private company running a service that lets people stay out of reach of the law (and of legal proceedings) in the name of commercial interests. </p>

<p>I explained all this two years ago in an article entitled “<a href="https://affordance.typepad.com/mon_weblog/2016/02/iphone-le-degre-zero-de-la-privacy.html" rel="noopener" target="_blank">a terrorist is an Apple customer like any other</a>”, about Apple’s refusal to give the FBI access to the data of an iPhone used in a terrorist shooting. To spare you from rereading it in full, I will just copy out the conclusion: </p>

<blockquote>
<p>“(...) a terrorist is an Apple customer like any other. And as such he enjoys the same “rights”, the same “level of protection” as any other Apple customer with respect to the data stored on his iPhone. Calling those rights into question for one individual, even one guilty of acts of terrorism, would amount to calling them into question for every iPhone owner. That, at least, is Tim Cook’s argument. (...)</p>
<p>We all have a private life, and we all have a right to a private life. And yet, under the rule of law, and within a judicial proceeding, if we are accused or found guilty of a criminal act, the elements that make up and document our private life remain accessible through a search warrant (ordered by a judge). The question raised by the unbreakable default encryption of iPhones and by <a href="https://www.apple.com/customer-letter/" rel="noopener" target="_blank">Tim Cook’s letter to Apple customers</a> is whether the space, physical or digital, allotted to the documentary traces of our private life should be unsearchable. Whether it should resist any form of search.</p>
<p>If we merely regard terrorists (or pedophiles or drug dealers or the perpetrators of crimes and offenses of every kind) as so many “customers” who have bought a device offering reasonable guarantees that their privacy will be preserved, then Tim Cook is right: the FBI’s demand is unacceptable. But if we consider that our private life must be an unsearchable space whatever the circumstances, including within a legitimate judicial action carried out under the rule of law, then it is Tim Cook’s stance that becomes unacceptable.”</p>
</blockquote>

<h2>But let us return to the pedophiles.</h2>

<p>For years, Apple explained that it never “looked” in any way at the content deposited, stored and hosted on an iCloud account. Never. Even knowing that there could indeed be illegal content there. </p>

<p>And then, a brutal change of doctrine: <a href="https://www.telegraph.co.uk/technology/2020/01/08/apple-scans-icloud-photos-check-child-abuse/" rel="noopener" target="_blank">Apple now announces that it checks whether iCloud accounts contain child-pornography photos or videos</a>. But it does so only on the basis of the <a href="https://www.zdnet.fr/actualites/microsoft-photodna-identifier-les-contenus-illegaux-sans-meme-les-regarder-39822582.htm" rel="noopener" target="_blank">PhotoDNA technology</a>, developed and made freely available by Microsoft, which makes it possible to <a href="https://www.zdnet.fr/actualites/microsoft-photodna-identifier-les-contenus-illegaux-sans-meme-les-regarder-39822582.htm" rel="noopener" target="_blank">compare photos against databases of known child-abuse images</a>, even when the photos have been retouched or cropped. This technology therefore cannot detect new child-pornography content, only content that already exists. The other way to detect such content is to have it viewed by human moderators, with all the psychological risks we know. </p>
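PhotoDNA itself is proprietary, but the general family it belongs to, robust (perceptual) image hashing matched against a database of known signatures, can be illustrated with a minimal sketch. The difference hash below is a stand-in, not PhotoDNA's algorithm; the hash size and distance threshold are arbitrary choices for illustration.

```python
# Minimal sketch of perceptual hashing (not PhotoDNA): reduce an image to a
# tiny bit signature, then match by Hamming distance instead of exact
# equality, so minor retouching still yields a near-identical hash.

def dhash(pixels, hash_w=8, hash_h=8):
    """Difference hash of a grayscale image given as a 2D list of 0-255 ints."""
    h, w = len(pixels), len(pixels[0])
    # Downscale (nearest-neighbour) to (hash_w + 1) x hash_h.
    small = [[pixels[y * h // hash_h][x * w // (hash_w + 1)]
              for x in range(hash_w + 1)]
             for y in range(hash_h)]
    # One bit per horizontal neighbour comparison.
    bits = 0
    for row in small:
        for a, b in zip(row, row[1:]):
            bits = (bits << 1) | (1 if a < b else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def matches_known(candidate_hash, known_hashes, max_dist=10):
    """Flag the image if its hash is close to any hash in the database."""
    return any(hamming(candidate_hash, k) <= max_dist for k in known_hashes)

# A synthetic 16x16 gradient image and a slightly brightened copy of it.
img = [[(x * 16 + y) % 256 for x in range(16)] for y in range(16)]
edited = [[min(255, p + 5) for p in row] for row in img]
db = {dhash(img)}
print(matches_known(dhash(edited), db))  # → True
```

The hash encodes relative brightness gradients rather than pixel values, which is why a brightened or lightly cropped copy still matches; it is also why only images already present in the reference database can ever be flagged, exactly the limitation the paragraph above describes.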

<p>So Apple will delegate to a technology the task of identifying the sharing of already “identified” child-abuse photos, and if it finds any it will “disable” those accounts. But it will not report them to the police or the courts. </p>

<p>In fact, Apple’s <a href="https://www.apple.com/legal/privacy/en-ww/" rel="noopener" target="_blank">“Privacy Policy”</a> was <a href="https://www.macobserver.com/analysis/apple-scans-uploaded-content/" rel="noopener" target="_blank">amended as follows in May 2019</a>: </p>

<blockquote>
<p><em>"We may also use your personal information for account and network security purposes, including in order to protect our services for the benefit of all our users, and pre-screening or scanning uploaded content for potentially illegal content, including child sexual exploitation material."</em></p>
</blockquote>

<p>Note that the detection of “<em>potentially illegal content, including child sexual exploitation material</em>” serves only to protect “<em>our services</em>” for the benefit “<em>of all our users</em>”. It is therefore explicit that Apple does not intend to treat this problem as a matter of illegal acts falling to the courts, but simply as a way of keeping its ecosystem of services “clean”. </p>

<h2>Watching without seeing, deciding without judging.</h2>

<p>One could hold forth for hours on Apple’s “choice” not to act as an officer of the court when it comes to factual elements (I did not say “evidence”) of pedocriminal activity. The very recent Matzneff affair is, in this respect, full of lessons about the circumstantial arbitrariness of any morality once it seeks to fit within a social horizon of the tolerable or the odious.</p>

<p>The distinction between the status of publisher and that of host is, moreover, consubstantial with the history of the contemporary web. A history in which today’s hosts, the platforms (search engines and social networks), are bound to remove content within a “reasonable” time once it has been reported to them as breaking the law. </p>

<p>But the fact is that there also exists, in (French) law, a “<a href="https://www.legavox.fr/blog/maitre-haddad-sabine/obligation-signalement-devoir-professionnel-citoyen-25518.htm" rel="noopener" target="_blank">duty to report</a>” that applies to any private individual or professional and operates as an exception to professional secrecy: </p>

<blockquote>
<p>“<em>Anyone having knowledge of deprivations, mistreatment, or sexual assaults or abuse inflicted on a minor, or on a person unable to protect themselves because of their age, an illness, an infirmity, a physical or psychological impairment or a state of pregnancy, who fails to inform the judicial or administrative authorities, or continues not to inform those authorities as long as the offenses have not ceased, is punishable by three years’ imprisonment and a fine of 45,000 euros.</em>”</p>
</blockquote>

<p>Since these subjects are more than thorny, precision matters. Having “<em>knowledge of deprivations and mistreatment or of sexual assaults or abuse inflicted on a minor</em>” is not, in law, equivalent to having knowledge of the viewing or storage of images of deprivations and mistreatment or of sexual assaults inflicted on a minor. </p>

<p>Even so, Apple’s decision to simply disable (“<em>disabled</em>”) accounts hosting child-pornography material, without referring them to the judicial authorities, opens a new interstitial space that further complicates and muddies the already thoroughly confused reality of assigning a “digital” behavior to a public, private or intimate sphere. </p>

<h2>The paradox of the cusp.</h2>

<p>We live today in a society of hypermnesia, hyperscopy and hyperacusis. Our habits, our individual “morals” and our “places” (<a href="http://www.marcjahjah.net/2966-portraits-despaces-2-la-querencia" rel="noopener" target="_blank">our “Querencia”, as Marc Jahjah so aptly puts it</a>) are watched, listened to and recorded, most often with our lazy consent. Just as the behaviors, the habits, the morals and the places where we act together and constitute a society are watched, listened to and recorded.</p>

<p>A society in which states, monopolistic or oligopolistic private companies, and collaborationist entities combining the two thus have the capacity to remember everything (hypermnesia), as well as to see everything (hyperscopy) and to hear everything (hyperacusis). </p>

<p>These three super-capacities bear at once on:</p>

<ul>
<li>our behaviors and social engagements in public space (hence the stakes of the debates over video surveillance, which today are drifting even more dangerously toward facial surveillance and recognition),</li>
<li>our behaviors and private engagements in non-public spaces,</li>
<li>but also, quite literally, our intimate biorhythms (sleep phases, heart rate, blood sugar, etc., via connected wristbands and/or other connected “health” apps). </li>
</ul>

<p>Our very corporeality, in its most literally biological sense, has become for these companies a kind of third place where the power of scrutiny we grant them is exercised.</p>

<p>Against all this, against all these regimes of hyper-control and their modalities, devices are being developed that are unscrutinizable and unauditable in law (<a href="https://affordance.typepad.com/mon_weblog/2016/02/iphone-le-degre-zero-de-la-privacy.html" rel="noopener" target="_blank">our phones, at least under the Apple doctrine</a>), a development that is not literally para-doxical (it does not run “against” the legitimizing discourses of the regimes of control) but is on the contrary parallel to them, accompanying their deployment as much as their perversion.</p>

<p>The question is therefore that of the fold, of the cusp. That of a topological theory of digital spaces of publication and archiving, in the sense that topology is the science in which “<em>a coffee cup is identical to an inner tube, because both are surfaces with one hole.</em>” What I mean by this is that we usually think about questions of public space and private life in binary, partitionable terms, as if any form of public life excluded any expression belonging to private life, and vice versa of course. And in my view we are wrong to proceed that way.</p>

<p>The point today is not to partition, split, or isolate private, public, or intimate spaces as if each were intrinsically exclusive of the others, but to think through and understand how forces (economic, social, political) constrain and shape each of these spaces within their own temporalities and, above all, with regard to dedicated, circumstantial uses.</p>

<p><a class="asset-img-link" href="https://www.affordance.info/.a/6a00d8341c622e53ef0240a502220b200b-pi"><img alt="Image053" class="asset asset-image at-xid-6a00d8341c622e53ef0240a502220b200b img-responsive" src="https://www.affordance.info/.a/6a00d8341c622e53ef0240a502220b200b-500wi" title="Image053"/></a><em>Image of the "fronce" (cusp), one of the 7 elementary catastrophes in <a href="https://fr.wikipedia.org/wiki/Th%C3%A9orie_des_catastrophes" rel="noopener" target="_blank">René Thom's eponymous theory</a>.</em></p>

<p>In the image above, for example, if we take the front of the surface to stand for public life and the reverse side for private life, the cusp has the capacity to temporarily mask, change, veil, unveil, or disguise one space as another; to mask (for some) what was previously visible (to others), and vice versa.</p>

<p>Apple's decision regarding the inspection of photos of a pedophilic nature, this (new) will and this capacity to watch without seeing, to inspect without revealing, is a new cusp in the overall regime of our publications and our digital lives. It inaugurates a new temporality in which what was yesterday beyond scrutiny (the contents of iCloud accounts) becomes scrutinized, but outside any regime of disclosure (no referral to the courts). A scrutiny ("scrutation") that leads to no ballot ("scrutin"), in the sense that accounts are merely "<em>disabled</em>" rather than deliberated upon.</p>

<p>Apple knows, without seeing, that certain accounts contain child sexual abuse photos. But Apple does not see why this knowledge (this "ça-voir," the author's pun on <em>savoir</em>, to know, and <em>voir</em>, to see) should be known to the justice system, that is, offered up to the eyes of the justice system. Because Apple, in the end, refuses to know / to see.</p>

<p>A sentence comes back to me: "<a href="https://www.affordance.info/mon_weblog/2019/01/si-cest-pourri-tes-pas-le-bon-produit.html" rel="noopener" target="_blank">what we tolerate is what we truly are.</a>" The right question to put to Apple, and one we must also ask ourselves collectively as users or observers of these practices, is what form of tolerance it is to "simply" close accounts known to contain child sexual abuse material without referring them to the courts, within a state governed by the rule of law.</p>

<p>The religious metaphor was often used (at least back when Steve Jobs reigned supreme over the firm) to characterize the singular bond between the brand and its users. If Apple is a church, then it walks in the footsteps of the Catholic Church, which likewise saw but did not want to know, and which, above all, refused to let the justice system do its work by not letting it see the reality of the facts.</p>

<p>If we do not ask ourselves these questions now, we will wake up tomorrow to hundreds of Matzneff affairs and be appalled to see that they were there, before our eyes, and that we did nothing, said nothing; that we watched without seeing, but not without "ça-voir."</p>
</article>


<hr>

<footer>
<p>
<a href="/david/" title="Aller à l’accueil">🏠</a> •
<a href="/david/log/" title="Accès au flux RSS">🤖</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant>🇨🇦</a> •
<a href="mailto:david%40larlet.fr" title="Envoyer un courriel">📮</a> •
<abbr title="Hébergeur : Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340">🧚</abbr>
</p>
</footer>
<script src="/static/david/js/instantpage-3.0.0.min.js" type="module" defer></script>
</body>
</html>

+ 54
- 0
cache/2020/59dac1925636ebf6358c3a598bf834f9/index.md View File

@@ -0,0 +1,54 @@
title: Un pédophile est un client Apple comme les autres.
url: https://www.affordance.info/mon_weblog/2020/01/pedophile-client-apple.html
hash_url: 59dac1925636ebf6358c3a598bf834f9

<p>If Gabriel Matzneff no longer knows where to publish the account of his pedophilic exploits, now that two of his longtime publishers have announced they are withdrawing his books, he can still create an Apple account. Let me explain.</p>
<p>Apple, through Jane Horvath, the company's vice president in charge of privacy, <a href="https://www.01net.com/actualites/apple-confirme-acceder-aux-photos-de-ses-clients-pour-lutter-contre-la-pedopornographie-1837691.html" rel="noopener" target="_blank">has just announced at CES</a> (Consumer Electronics Show) <a href="https://www.telegraph.co.uk/technology/2020/01/08/apple-scans-icloud-photos-check-child-abuse/" rel="noopener" target="_blank">that it will now "scan" every user's iCloud account to detect possible pedophilic photos</a>. And that if it finds such photos, it will close the iCloud accounts in question. But it will not report those accounts to the courts or the police.</p>
<p>This raises several problems, which I will try to address here.</p>
<h2>The private view.</h2>
<p>Apple has always had, as its commercial <em>credo</em> and its customer-facing <em>confiteor</em>, a doctrine of total inviolability of its users' "privacy" across the devices, services, and terminals the firm sells.</p>
<p>At the cost, in recent years, of bruising debates between the proponents of a very rigorist view of privacy and personal data (everything private is private at all times and for everyone, including the courts, whatever the circumstances), and others, myself included, defending the idea that in a democracy there must be no space irremediably beyond the reach of a search warrant, <em>a fortiori</em> when a private company runs a service that keeps users out of reach of the law (and of legal proceedings) in the name of commercial interests.</p>
<p>I explained all this in an article two years ago, titled "<a href="https://affordance.typepad.com/mon_weblog/2016/02/iphone-le-degre-zero-de-la-privacy.html" rel="noopener" target="_blank">a terrorist is an Apple customer like any other</a>," about Apple's refusal to grant the FBI access to the data on an iPhone used in a terrorist shooting. To spare you rereading the whole thing, here is just the conclusion:</p>
<blockquote>
<p>"(...) a terrorist is an Apple customer like any other. And as such he enjoys the same "rights," the same "level of protection" as any other Apple customer with respect to the data stored on his iPhone. Calling those rights into question for one individual, even one convicted of acts of terrorism, would amount to calling them into question for every iPhone owner. That, at least, is Tim Cook's argument. (...)</p>
<p>We all have a private life, and we all have a right to privacy. Yet within a state governed by the rule of law, and within a judicial proceeding, if we are accused or convicted of a criminal act, the elements that make up and document our private life remain accessible via a search warrant (ordered by a judge). The question raised by the unbreakable default encryption of iPhones and by <a href="https://www.apple.com/customer-letter/" rel="noopener" target="_blank">Tim Cook's letter to Apple customers</a> is whether the space, physical or digital, allotted to the documentary traces of our private life should be unsearchable. Whether it should resist any form of search.</p>
<p>If we merely regard terrorists (or pedophiles, or drug dealers, or the perpetrators of crimes and offenses of every kind) as so many "customers" who bought a device offering reasonable guarantees that their privacy will be preserved, then Tim Cook is right: the FBI's demand is unacceptable. But if we consider that our private life must be an unsearchable space whatever the circumstances, including within a legitimate judicial action carried out in a state governed by the rule of law, then it is Tim Cook's posture that becomes unacceptable."</p>
</blockquote>
<h2>But back to the pedophiles.</h2>
<p>For years, Apple explained that it never "looked," in any way, at the content deposited, stored, and hosted on an iCloud account. Never. Even knowing that there could indeed be illegal content there.</p>
<p>And then an abrupt change of doctrine: <a href="https://www.telegraph.co.uk/technology/2020/01/08/apple-scans-icloud-photos-check-child-abuse/" rel="noopener" target="_blank">Apple now announces that it checks iCloud accounts for child sexual abuse photos or videos</a>. But it does so only on the basis of the <a href="https://www.zdnet.fr/actualites/microsoft-photodna-identifier-les-contenus-illegaux-sans-meme-les-regarder-39822582.htm" rel="noopener" target="_blank">PhotoDNA technology</a>, developed and made freely available by Microsoft, which makes it possible to <a href="https://www.zdnet.fr/actualites/microsoft-photodna-identifier-les-contenus-illegaux-sans-meme-les-regarder-39822582.htm" rel="noopener" target="_blank">compare photos against databases of known child sexual abuse images</a>, even when the photos have been retouched or cropped. This technology therefore cannot detect new abuse material, only material that already exists. The other way to detect such content is to have human moderators view it, with all the psychological risks we know.</p>
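The matching described above can be made concrete. PhotoDNA itself is proprietary and its exact algorithm is not public, so the following is only a minimal illustrative sketch of the general family it belongs to (robust or "perceptual" hashing): reduce an image to a compact fingerprint, then compare fingerprints by Hamming distance against a database of known-image fingerprints, so that small edits such as recompression or slight retouching still produce a match. All names and thresholds here are assumptions for illustration, not Microsoft's implementation.

```python
# Illustrative perceptual-hash sketch (NOT PhotoDNA): fingerprint an image,
# then match it against known fingerprints by Hamming distance.

def average_hash(pixels, size=8):
    """pixels: 2D list of grayscale values at any resolution.
    Returns a 64-bit integer fingerprint (for size=8)."""
    h, w = len(pixels), len(pixels[0])
    # Crude downsample to size x size by picking one pixel per block.
    small = [
        [pixels[r * h // size][c * w // size] for c in range(size)]
        for r in range(size)
    ]
    flat = [v for row in small for v in row]
    mean = sum(flat) / len(flat)
    # One bit per cell: 1 if brighter than the mean, else 0.
    return sum(1 << i for i, v in enumerate(flat) if v > mean)

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_known(fingerprint, database, threshold=10):
    """A small Hamming distance means 'same picture, possibly re-encoded'."""
    return any(hamming(fingerprint, known) <= threshold for known in database)
```

The key property is that the fingerprint depends on coarse brightness structure rather than exact bytes, so uniform brightness shifts or mild recompression leave it unchanged, while a genuinely different image lands far away in Hamming distance.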
<p>So Apple will delegate to a technology the job of identifying the sharing of already "identified" pedophilic photos, and if it finds any it will "disable" those accounts. But it will not report them to the police or the courts.</p>
<p>In fact, <a href="https://www.apple.com/legal/privacy/en-ww/" rel="noopener" target="_blank">Apple's "Privacy Policy"</a> was <a href="https://www.macobserver.com/analysis/apple-scans-uploaded-content/" rel="noopener" target="_blank">amended as follows in May 2019</a>:</p>
<blockquote>
<p><em>"We may also use your personal information for account and network security purposes, including in order to protect our services for the benefit of all our users, and pre-screening or scanning uploaded content for potentially illegal content, including child sexual exploitation material."</em></p>
</blockquote>
<p>Note that the detection of "<em>potentially illegal content, including child sexual exploitation material</em>" serves only to protect "<em>our services</em>" for the benefit "<em>of all our users</em>." It is thus explicit that Apple does not intend to treat this problem as a matter of illegal acts falling to the justice system, but simply as a way of keeping its ecosystem of services "clean."</p>
<h2>Watching without seeing, deciding without judging.</h2>
<p>One could hold forth for hours on Apple's "choice" not to play the role of an auxiliary of justice when faced with factual elements (I did not say "evidence") of child sexual crime. The very recent Matzneff affair is, in this respect, full of lessons about the circumstantial arbitrariness of any morality once it seeks to fit within a social horizon of the tolerable or the odious.</p>
<p>The distinction between the status of publisher and that of host is, moreover, consubstantial with the history of the contemporary web. A history in which today's hosts, the platforms (search engines and social networks), are bound to remove content within a "reasonable" time once it is reported to them as breaking the law.</p>
<p>But the fact is that there also exists, in (French) law, a "<a href="https://www.legavox.fr/blog/maitre-haddad-sabine/obligation-signalement-devoir-professionnel-citoyen-25518.htm" rel="noopener" target="_blank">duty to report</a>" that applies to any individual or professional and operates as an exception to professional secrecy:</p>
<blockquote>
<p>"<em>Whoever has knowledge of deprivations, ill-treatment, or sexual assaults or abuse inflicted on a minor, or on a person unable to protect himself or herself by reason of age, illness, infirmity, physical or psychological impairment, or pregnancy, and fails to inform the judicial or administrative authorities, or continues to withhold that information as long as the offenses have not ceased, shall be punished by three years' imprisonment and a fine of 45,000 euros.</em>"</p>
</blockquote>
<p>Since these subjects are more than thorny, precision matters. Having "<em>knowledge of deprivations and ill-treatment or of sexual assaults or abuse inflicted on a minor</em>" is not, in law, equivalent to having knowledge of the viewing or storage of images of such deprivations, ill-treatment, or sexual assaults inflicted on a minor.</p>
<p>Even so, Apple's decision simply to disable ("<em>disabled</em>") the accounts hosting child sexual abuse material, without referring them to the judicial authorities, opens a new interstitial space that further complicates and muddies the already thoroughly confused question of where a "digital" behavior sits within the public, private, or intimate sphere.</p>
<h2>The paradox of the cusp.</h2>
<p>We now live in a society of hypermnesia, hyperscopy, and hyperacusis. Our habits, our individual "morals," and our "places" (<a href="http://www.marcjahjah.net/2966-portraits-despaces-2-la-querencia" rel="noopener" target="_blank">our "querencias," as Marc Jahjah so aptly puts it</a>) are watched, listened to, and recorded, most often with our lazy consent. Just as the behaviors, habits, morals, and places where we act together and form a society are watched, listened to, and recorded.</p>
<p>A society in which states, monopolistic or oligopolistic private companies, and collaborationist entities combining the two thus have the capacity to remember everything (hypermnesia), to see everything (hyperscopy), and to hear everything (hyperacusis).</p>

+ 145
- 0
cache/2020/5ddeb776b27bade5f581d66e40de4c6c/index.html View File

@@ -0,0 +1,145 @@
<!doctype html><!-- This is a valid HTML5 document. -->
<!-- Screen readers, SEO, extensions and so on. -->
<html lang="fr">
<!-- Has to be within the first 1024 bytes, hence before the <title>
See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
<meta charset="utf-8">
<!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
<!-- The viewport meta is quite crowded and we are responsible for that.
See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
<meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1,shrink-to-fit=no">
<!-- Required to make a valid HTML5 document. -->
<title>Big Mood Machine (archive) — David Larlet</title>
<!-- Lightest blank gif, avoids an extra query to the server. -->
<link rel="icon" href="data:;base64,iVBORw0KGgo=">
<!-- Thank you Florens! -->
<link rel="stylesheet" href="/static/david/css/style_2020-01-09.css">
<!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" crossorigin>

<meta name="robots" content="noindex, nofollow">
<meta content="origin-when-cross-origin" name="referrer">
<!-- Canonical URL for SEO purposes -->
<link rel="canonical" href="https://thebaffler.com/downstream/big-mood-machine-pelly">

<body class="remarkdown h1-underline h2-underline hr-center ul-star pre-tick">

<article>
<h1>Big Mood Machine</h1>
<h2><a href="https://thebaffler.com/downstream/big-mood-machine-pelly">Source originale du contenu</a></h2>
<p><span class="first-selection">Music is emotional</span>, and so our listening often signals something deeply personal and private. Today, this means music streaming platforms are in a unique position within the greater platform economy: they have troves of data related to our emotional states, moods, and feelings. It’s a matter of unprecedented access to our interior lives, which is buffered by the flimsy illusion of privacy. When a user chooses, for example, a “private listening” session on Spotify, the effect is to make them feel that it’s a one-way relation between person and machine. Of course, that personalization process is Spotify’s way of selling users on its product. But, as it turns out, in a move that should not surprise anyone at this point, Spotify has been selling access to that listening data to multinational corporations.</p>

<p>Where other platforms might need to invest more to piece together emotional user profiles, Spotify streamlines the process by providing boxes that users click on to indicate their moods: Happy Hits, Mood Booster, Rage Beats, Life Sucks. All of these are examples of what can now be found on Spotify’s Browse page under the “mood” category, which currently contains eighty-five playlists. If you need a lift in the morning, there’s Wake Up Happy, A Perfect Day, or Ready for the Day. If you’re feeling down, there’s Feeling Down, Sad Vibe, Down in the Dumps, Drifting Apart, Sad Beats, Sad Indie, and Devastating. If you’re grieving, there’s even Coping with Loss, with the tagline: “When someone you love becomes a memory, find solace in these songs.”</p>


<p>Over the years, streaming services have pushed a narrative about these mood playlists, suggesting, through aggressive marketing, that the rise of listening by way of moods and activities was a service to listeners and artists alike—a way to help users navigate infinite choice, to find their way through a vast library of forty million songs. It’s a powerful arm of the industry-crafted mythology of the so-called streaming revolution: platforms celebrating this grand recontextualization of music into mood playlists as an engine of discovery. Spotify is currently running a campaign centered on moods—the company’s Twitter tagline is currently “Music for every mood”—complete with its own influencer campaign. </p>

<p>But a more careful look into Spotify’s history shows that the decision to define audiences by their moods was part of a strategic push to grow Spotify’s advertising business in the years leading up to its IPO—and today, Spotify’s enormous access to mood-based data is a pillar of its value to brands and advertisers, allowing them to target ads on Spotify by moods and emotions. Further, since 2016, Spotify has shared this mood data directly with the world’s biggest marketing and advertising firms.</p>

<p class="parasectionhed">Streaming <span style="text-decoration: line-through;">Intelligence</span> Surveillance</p>

<p>In 2015, Spotify began selling advertisers on the idea of marketing to moods, moments, and activities instead of genres. This was one year after Spotify acquired the “music intelligence” firm Echo Nest. Together they began looking at the 1.5 billion user-generated playlists at Spotify’s disposal. Studying these playlists allowed Spotify to more deeply analyze the contexts in which <em>listening</em> was happening on its platform. And so, right around the time that Spotify realized it had 400,000 user-generated barbecue playlists, Brian Benedik, then Spotify’s North American Vice President of Advertising and Partnerships, noted in an <em>Ad Age</em> interview that the company would focus on moods as a way to grow its advertising mechanism: “This is not something that’s just randomly thrown out there,” Benedik said. “It’s a strategic evolution of the Spotify ads business.” As of May 1, 2015, advertisers would be able to target ads to users of the free ad-supported service based on activities and moods: “Mood categories like happy, chill, and sad will let a brand like Coca-Cola play on its ‘Open Happiness’ campaign when people are listening to mood-boosting music,” the <em>Ad Age </em>article explained.</p>

<blockquote class="pullquote"><p>Spotify’s enormous access to mood-based data is a pillar of its value to brands and advertisers.</p></blockquote>

<p>Four years later, Spotify is the world’s biggest streaming subscription service, with 207 million users in seventy-nine different countries. And as Spotify has grown, its advertising machine has exploded. Of those 207 million users, it claims 96 million are subscribers, meaning that 111 million users rely on the ad-supported version. Spotify top marketing execs have expressed that the company’s ambition is “<a href="https://www.wsj.com/articles/spotify-has-big-ambitions-for-ad-business-1502964001">absolutely</a>” to be the third-largest player in the digital ad market behind Google and Facebook. In turn, since 2015, Spotify’s strategic use of mood and emotion-based targeting has only become even more entrenched in its business model.</p>

<p>“At Spotify we have a personal relationship with over 191 million people who show us their true colors with zero filter,” reads a current advertising deck. “That’s a lot of authentic engagement with our audience: billions of data points every day across devices! This data fuels Spotify’s streaming intelligence—our secret weapon that gives brands the edge to be relevant in real-time moments.” Another brand-facing pitch proclaims: “The most exciting part? This new research is starting to reveal the streaming generation’s offline behaviors through their streaming habits.”</p>

<p>Today, Spotify Ad Studio, a self-service portal automating the ad-purchase process, promises access to “rich and textured datasets,” allowing brands to instantly target their ads by mood and activity categories like “Chill,” “Commute,” “Dinner,” “Focus/Study,” “Girls Night Out,” and more. And across the Spotify for Brands website are a number of “studies” and “insights reports” regarding research that Spotify has undertaken about streaming habits: “You are what you stream,” they reiterate over and over.</p>

<p>In a 2017 package titled <em>Understanding People Through Music—Millennial Edition,</em> Spotify (with help from “youth marketing and millennial research firm” Ypulse) set out to help advertisers better target millennial users by mood, emotion, and activity specifically. Spotify explains that “unlike generations past, millennials aren’t loyal to any specific music genre.” They conflate this with a greater reluctance toward labels and binaries, pointing out the rising number of individuals who identify as gender fluid and the growing demographic of millennials who do not have traditional jobs—and chalk these up to consumer preferences. “This throws a wrench in marketers’ neat audience segmentations,” Spotify commiserates.</p>

<p>For the study, they also gathered six hundred in-depth “day in a life” interviews recorded as “behavioral diaries.” All participants were surveilled by demographics, platform usage, playlist behavior, feature usage, and music tastes—and in the United States (where privacy is taken less seriously), Spotify and Ypulse were able to pair Spotify’s own streaming data with additional third-party information on “broader interests, lifestyle, and shopping behaviors.”</p>

<p>The result is an interactive hub on the Spotify for Brands website detailing eight distinct “key audio streaming moments for marketers to tap into”: Working, Chilling, Cooking, Chores, Gaming, Workout, Partying, and Driving. Spotify also dutifully outlines recommendations for how to use this information to sell shit, alongside success stories from Dunkin’ Donuts, Snickers, Gatorade, Wild Turkey, and BMW.</p>

<p>More startlingly, for each of these “moments” there is an animated trajectory of a typical “emotional journey” claiming to predict the various emotional states users will experience while listening to particular playlists. Listeners who are “working,” for instance, are likely to start out feeling pressured and stressed, before they become more energized and focused and end up feeling fine and accomplished at the end of the playlist queue. If they listen while doing chores, the study claims to know that they start out feeling stressed and lazy, then grow motivated and entertained, and end by feeling similarly good and accomplished.</p>

<p>In Spotify’s world, listening data has become the oil that fuels a monetizable metrics machine, pumping the numbers that lure advertisers to the platform. In a data-driven listening environment, the commodity is no longer <em>music</em>. The commodity is <em>listening</em>. The commodity is <em>users </em>and their <em>moods</em>. The commodity is <em>listening habits</em> as <em>behavioral data</em>. Indeed, what Spotify calls “streaming intelligence” should be understood as surveillance of its users to fuel its own growth and ability to sell mood-and-moment data to brands.</p>

<p class="parasectionhed">A Leviathan of Ads</p>

<p>The potential of music to provide mood-related data useful to marketers has long been studied. In 1990, the <em>Journal of Marketing</em> published an article dubbed “Music, Mood and Marketing” that surveyed some of this history while bemoaning how “despite being a prominent promotional tool, music is not well understood or controlled by marketers.” The text outlines how “marketers are precariously dependent on musicians for their insight into the selection or composition of the ‘right’ music for particular situations.” This view of music as a burdensome means to a marketer’s end is absurd, but it’s also the logic that rules the current era of algorithmic music platforms. Unsurprisingly, this 1990 article aimed to overcome challenges for marketers by figuring out new ways to extract value from music that would be beyond the control of musicians themselves: studying the “behavioral effects” of music with a “special emphasis on music’s emotional expressionism and role as mood influencer” in order to create new forms of power and control.</p>

<p>Today, marketers want mood-related data more than ever, at least in part to fuel automated, personalized ad targeting. In 2016, the world’s largest holding company for advertising and PR agencies, WPP, <a href="https://www.prnewswire.com/news-releases/wpps-data-alliance-and-spotify-announce-global-data-partnership-300362733.html">announced</a> that it had struck a multi-year partnership with Spotify, giving the conglomerate unprecedented access to Spotify’s mood data specifically. The partnership with the WPP, it turns out, was part of Spotify’s plan to ramp up its advertising business in advance of its IPO.</p>

<blockquote class="pullquote"><p>In a data-driven listening environment, the commodity is no longer <em>music</em>. The commodity is <em>listening</em>. The commodity is <em>users</em> and their <em>moods</em>.</p></blockquote>

<p>WPP is the parent company to several of the world’s largest and oldest advertising, PR, and brand consulting firms, including Ogilvy, Grey Global Group, and at least eighteen others. Across their portfolio, WPP owns companies that work with numerous mega-corporations and household brands, helping shill everything from cars, Coca-Cola, and KFC to booze, banks, and Burger King. Over the decades, these companies have worked on campaigns spanning from Merrill Lynch and Lay’s potato chips to Colgate-Palmolive and Ford. Additionally, WPP properties also include tech-focused companies that claim proficiency in automation- and personalization-driven ad sales, all of which would now benefit from Spotify’s mood data.</p>

<p>The 2016 announcement of WPP and Spotify’s global partnership in “data, insights, creative, technology, innovation, programmatic solution, and new growth markets” speaks for itself:</p>

<blockquote>
<p>WPP now has unique listening preferences and behaviors of Spotify&#8217;s 100 million users in 60 countries. The multi-year deal provides differentiating value to WPP and its clients by harnessing insights from the connection between music and audiences&#8217; moods and activities. Music attributes such as tempo and energy are proven to be highly relevant in predicting mood, which enables advertisers to understand their audiences in a new emotional dimension.</p>
</blockquote>

<p>What’s more, WPP-owned advertising agencies could now access the “Wunderman Zipline™ Data Management Platform” to gain direct access to Spotify users’ “mood, listening and playlist behavior, activity and location.” They’d also potentially make use of “Spotify’s data on connected device usage” while the WPP-owned company GroupM specifically would retain access to “an exclusive infusion of Spotify data” into its own platform made for corporate ad targeting. Per the announcement, WPP companies would also serve as launch partners for new types of advertising and new markets unveiled by Spotify, while procuring “visibility into Spotify&#8217;s product roadmap and access to beta testing.”</p>

<p>At the time the partnership was announced, Anas Ghazi, then Managing Director of Global Partnerships at WPP’s Data Alliance, noted that all WPP agencies would be able to “grab these insights. . . . If you think about how music shapes your activity and thoughts, this is a new, unique play for us to find audiences. Mood and moments are the next pieces of understanding audiences.” And Harvey Goldhersz, then CEO of GroupM Data &amp; Analytics, salivated: “The insights we&#8217;ll develop from Spotify’s behavioral data will help our clients realize a material marketplace advantage, aiding delivery of ads that are appropriate to the consumer&#8217;s mood and the device used.”</p>

<p class="parasectionhed">Ongoing Synergies</p>

<p>While this deal was announced via the WPP Data Alliance, visiting that particular organization’s website now auto-directs back to the main WPP website, likely a result of corporate restructuring that WPP underwent over the past year. Currently, the only public-facing evidence of the relationship between WPP and Spotify is <a href="http://www.kantar.com/about/partners">listed</a> online under the WPP-owned data and insights company Kantar, which WPP describes as “the world’s leading marketing data, insight and consultancy company.”</p>

<p>What might Kantar be doing with this user data? The current splash video deck on its website is useful: it claims to be the first agency to use “facial recognition in advertising testing,” and it professes to be exploring new technologies “from biodata and biometrics and healthcare, to capturing human sentiment and even voting intentions by analyzing social media.” And, finally, it admits to “exploiting big data, artificial intelligence and analytics . . . to predict attitudes and behavior.”</p>

<p>When we reached out to see if the relationship between Kantar and Spotify had changed since the initial 2016 announcement, Kantar sent <em>The Baffler</em> this statement:</p>

<blockquote>
<p>The 2016 Spotify collaboration was the first chapter of many-a collaboration and has continued to evolve to meet the dynamic needs of our clients and marketplace. Spotify continues to be a valued partner of larger enterprise and Kantar with on-going synergies.</p>
</blockquote>

<p>One year after the announcement of the partnership, in 2017, Spotify further confirmed its desire to establish direct relationships with the world’s biggest advertising agencies when it hired two executives from WPP: Angela Solk, now Global Head of Agency Partnerships, whose job at Spotify includes teaching WPP and other ad agencies how to best make use of Spotify’s first-party data. (In Solk’s first year at Spotify, she helped create the Smirnoff Equalizer; in a 2018 interview, she reflected on the “beauty” of that branded content campaign and Spotify’s ability to extract listener insight and make it “part of Smirnoff’s DNA.”) Spotify also hired Craig Weingarten as its Head of Industry, Automotive, who now leads Spotify’s Detroit-based auto ad sales team.</p>

<p>According to its own media narrative, Spotify offers data access to brands that competitor platforms do not, and it has gained a reputation for its eagerness to share its first-party data. At advertising conferences and in the ad press, Spotify top ad exec Marco Bertozzi has emphasized how Spotify hopes to widely share first-party data, going so far as to <a href="https://www.thedrum.com/news/2017/03/17/spotify-europe-vp-when-other-walled-gardens-say-no-data-questions-we-say-yes">confess</a>, “When other walled gardens say no to data questions . . . we say yes.” (Bertozzi was also the mind behind an internal Spotify campaign adorably branded “#LoveAds” to combat growing societal disgust with digital advertising. #LoveAds started as a mantra of the advertising team, but as Bertozzi proudly <a href="https://www.campaignlive.co.uk/article/stop-knocking-advertising-learn-loveads/1519385">explained</a> in late 2018, “#LoveAds became a movement within the company.”)</p>

<p>Spotify has spent tremendous energy on its ad team’s proficiency with cross-device advertising options (likely due to the imminent ubiquity of Spotify in the car and the so-called “smart home”), as well as “programmatic advertising,” otherwise understood as the targeted advertising sold through an automated process, often in milliseconds—Spotify seeks to be the most dominant seller of such advertising in the audio space. And there’s also the self-serve platform, direct inventory sales, Spotify’s private marketplace (an invite-only inventory for select advertisers), programmatic guaranteed deals (a guaranteed volume of impressions at a fixed price)—the jargon ad-speak lists could go on and on.</p>

<blockquote class="pullquote"><p>According to its own media narrative, Spotify offers data access to brands that competitor platforms do not.</p></blockquote>

<p>Trying to keep tabs on Spotify’s advertising products and partnerships is dizzying. But what is clear is that the hype surrounding these partnerships has often focused on “moods and moments”-related data Spotify offers brands—not to mention the company’s penchant for allowing brands to combine their own data with Spotify’s. In 2017, Spotify’s Brian Benedik <a href="https://www.thedrum.com/news/2017/06/15/spotify-hits-140-million-monthly-users-it-notes-explosive-growth-automated-audio-ads">told </a><a href="https://www.thedrum.com/news/2017/06/15/spotify-hits-140-million-monthly-users-it-notes-explosive-growth-automated-audio-ads"><em>The Drum</em></a> that Spotify’s access to listening habits and first-party data is “one of the reasons that some of these big multi-national brands like the Samsungs and the Heinekens and the Microsofts and Procter and Gambles of the world are working with us a lot closer than they ever have . . . they don’t see that or get that from any other platform out there.” And it appears that things will only get darker. Julie Clark, Spotify’s Global Head of Automation Sales, said earlier this year in an interview that its targeting capabilities are growing: “There’s deeper first party-data that’s going to become available as well.”</p>

<p class="parasectionhed">Mood-Boosterism</p>

<p>Recently, I tried out a mood-related experiment on Spotify. I created a new account and only listened to the “Coping with Loss” playlist on loop for a few days. I paid particular attention to the advertisements that I was served by Spotify. And while I do not know for sure whether listening to the “Coping with Loss” playlist caused me to receive an unusually nostalgic Home Depot ad about how your carpets contain memories, or an ad for a particularly angsty new album called <em>Amidst the Chaos</em>, the extent to which Spotify is matching moods and emotions with advertisements certainly makes it seem possible. What was clearer: during my time spent listening exclusively to songs about grieving, Spotify was quick to recommend that I brighten my mood. Under the heading “More like Coping With Loss . . .” it recommended playlists themed for Father’s Day and Mother’s Day, and playlists called “Warm Fuzzy Feelings,” “Soundtrack Love Songs,” “90s Love Songs,” “Love Ballads,” and “Acoustic Hits.” Spotify evidently did not want me to sit with my sorrow; it wanted my mood to improve. It wanted me to be happy.</p>

<p>This is because Spotify specifically wants to be seen as a mood-boosting platform. In Spotify for Brands blog posts, the company routinely emphasizes how its own platform distinguishes itself from other streams of digital content, particularly because it gives marketers a chance to reach users through a medium that is widely seen as a “<em>positive enhancer”</em>: a medium they turn to for “music to help them get through the less desirable moments in their day, improve the more positive ones and even discover new things about their personality,” says Spotify.</p>

<p>“We’re quite unique in that we have people’s ears . . . combine that with the psycho-graphic data that we have and that becomes very powerful for brands,” said Jana Jakovljevic <a href="https://www.thedrum.com/news/2015/11/03/spotify-rolling-out-its-programmatic-advertising-offering">in 2015</a>, then Head of Programmatic Solutions; she is now employed by AI ad-tech company <a href="https://cognitiv.ai/">Cognitiv</a>, which claims to be “the first neural network technology that unearths patterns of consumer behavior” using “deep learning” to predict and target consumers.</p>

<blockquote class="pullquote"><p>During my time spent listening exclusively to songs about grieving, Spotify was quick to recommend that I brighten my mood.</p></blockquote>

<p>The fact that experience at Spotify could prepare someone for such a career shift is worth some reflection. But more interestingly, Jakovljevic added that Spotify was using this data in many ways, including to determine exactly what type of music to recommend, which is important to remember: the data that is used to sell advertisers on the platform is also the data driving recommendations. The platform can recommend music in ways that appease advertisers while promising them that <em>mood-boosting</em> ad space. What’s in question here isn’t just how Spotify monitors and mines data on our listening in order to use their “audience segments” as a form of currency—but also how it then creates environments more suitable for advertisers through what it recommends, manipulating future listening on the platform.</p>

<p>In appealing to advertisers, Spotify also celebrates its position as a <em>background experience</em> and in particular how this benefits advertisers and brands. Jorge Espinel, who was Head of Global Business Development at Spotify for five years, once said in an interview: “We love to be a background experience. You&#8217;re competing for consumer attention. Everyone is fighting for the foreground. We have the ability to fight for the background. And really no one is there. You’re doing your email, you’re doing your social network, etcetera.” In other words, it is in advertisers’ best interests that Spotify <em>stays a background experience</em>.</p>

<p>When a platform like Spotify sells advertisers on its mood-boosting, background experience, and then bakes these aims into what it recommends to listeners, a twisted form of behavior manipulation is at play. It’s connected to what Shoshana Zuboff, in <em>The Age of Surveillance Capitalism: The Fight for A Human Future at the New Frontier of Power</em>, calls the “behavioral futures market”—where “many companies are eager to lay their bets on our future behavior.”</p>

<p>Indeed, Spotify seeks not just to <em>monitor and mine</em> our mood, but also to manipulate future behavior. “What we’d ultimately like to do is be able to predict people’s behavior through music,” Les Hollander, the Global Head of Audio and Podcast Monetization, <a href="https://adexchanger.com/ad-exchange-news/podcast-spotify-blazed-trail-audio-ads/">said</a> in 2017. “We know that if you&#8217;re listening to your chill playlist in the morning, you may be doing yoga, you may be meditating . . . so we’d serve a contextually relevant ad with information and tonality and pace to that particular moment.” Very Zen!</p>

<p>Spotify’s treatment of its mood and emotion data as a form of currency in the greater data marketplace should be considered more generally in the context of the tech industry’s rush to quantify our emotions. There is a burgeoning industry surrounding technology that alleges to mine our emotional states in order to feed AI projects; take, for example, car companies that claim they can use facial recognition to read your mood and keep you safer on the road. Or Facebook’s patents on facial recognition software. Or unnerving technologies like Affectiva, which claim to be developing an industry around “emotion AI” and “affective computing” processes that measure human emotions.</p>

<p>It remains to be seen how Spotify could leverage such tech to maintain its reputation as a mood-boosting platform. And yet we should admit that it’s good for business for Spotify to manipulate people’s emotions on the platform toward feelings of chillness, contentment, and happiness. This has immense consequences for music, of course, but what does it mean for news and politics and culture at large, as the platform is set to play a bigger role in mediating all of the above, especially as its podcasting efforts grow?</p>

<p>On the Spotify for Brands blog, the streaming giant explains that its research shows millennials are weary of most social media and news platforms, feeling that these mediums affect them negatively. Spotify is a solution for brands, it explains, because it is a platform where people go to feel <em>good</em>. Of course, in this telling of things, Spotify conveniently ignores why those other forms of media feel so bad. It’s because they are platforms that prioritize their own product and profit above all else. It’s because they are platforms governed by nothing more than surveillance technology and the mechanisms of advertising.</p>
</article>


<hr>

<footer>
<p>
<a href="/david/" title="Aller à l’accueil">🏠</a> •
<a href="/david/log/" title="Accès au flux RSS">🤖</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant>🇨🇦</a> •
<a href="mailto:david%40larlet.fr" title="Envoyer un courriel">📮</a> •
<abbr title="Hébergeur : Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340">🧚</abbr>
</p>
</footer>
<script src="/static/david/js/instantpage-3.0.0.min.js" type="module" defer></script>
</body>
</html>

title: Big Mood Machine
url: https://thebaffler.com/downstream/big-mood-machine-pelly
hash_url: 5ddeb776b27bade5f581d66e40de4c6c

<p><span class="first-selection">Music is emotional</span>, and so our listening often signals something deeply personal and private. Today, this means music streaming platforms are in a unique position within the greater platform economy: they have troves of data related to our emotional states, moods, and feelings. It’s a matter of unprecedented access to our interior lives, which is buffered by the flimsy illusion of privacy. When a user chooses, for example, a “private listening” session on Spotify, the effect is to make them feel that it’s a one-way relation between person and machine. Of course, that personalization process is Spotify’s way of selling users on its product. But, as it turns out, in a move that should not surprise anyone at this point, Spotify has been selling access to that listening data to multinational corporations.</p>
<p>Where other platforms might need to invest more to piece together emotional user profiles, Spotify streamlines the process by providing boxes that users click on to indicate their moods: Happy Hits, Mood Booster, Rage Beats, Life Sucks. All of these are examples of what can now be found on Spotify’s Browse page under the “mood” category, which currently contains eighty-five playlists. If you need a lift in the morning, there’s Wake Up Happy, A Perfect Day, or Ready for the Day. If you’re feeling down, there’s Feeling Down, Sad Vibe, Down in the Dumps, Drifting Apart, Sad Beats, Sad Indie, and Devastating. If you’re grieving, there’s even Coping with Loss, with the tagline: “When someone you love becomes a memory, find solace in these songs.”</p>
<p>Over the years, streaming services have pushed a narrative about these mood playlists, suggesting, through aggressive marketing, that the rise of listening by way of moods and activities was a service to listeners and artists alike—a way to help users navigate infinite choice, to find their way through a vast library of forty million songs. It’s a powerful arm of the industry-crafted mythology of the so-called streaming revolution: platforms celebrating this grand recontextualization of music into mood playlists as an engine of discovery. Spotify is currently running a campaign centered on moods—the company’s Twitter tagline is currently “Music for every mood”—complete with its own influencer campaign. </p>
<p>But a more careful look into Spotify’s history shows that the decision to define audiences by their moods was part of a strategic push to grow Spotify’s advertising business in the years leading up to its IPO—and today, Spotify’s enormous access to mood-based data is a pillar of its value to brands and advertisers, allowing them to target ads on Spotify by moods and emotions. Further, since 2016, Spotify has shared this mood data directly with the world’s biggest marketing and advertising firms.</p>
<p class="parasectionhed">Streaming <span style="text-decoration: line-through;">Intelligence</span> Surveillance</p>
<p>In 2015, Spotify began selling advertisers on the idea of marketing to moods, moments, and activities instead of genres. This was one year after Spotify acquired the “music intelligence” firm Echo Nest. Together they began looking at the 1.5 billion user-generated playlists at Spotify’s disposal. Studying these playlists allowed Spotify to more deeply analyze the contexts in which <em>listening</em> was happening on its platform. And so, right around the time that Spotify realized it had 400,000 user-generated barbecue playlists, Brian Benedik, then Spotify’s North American Vice President of Advertising and Partnerships, noted in an <em>Ad Age</em> interview that the company would focus on moods as a way to grow its advertising mechanism: “This is not something that’s just randomly thrown out there,” Benedik said. “It’s a strategic evolution of the Spotify ads business.” As of May 1, 2015, advertisers would be able to target ads to users of the free ad-supported service based on activities and moods: “Mood categories like happy, chill, and sad will let a brand like Coca-Cola play on its ‘Open Happiness’ campaign when people are listening to mood-boosting music,” the <em>Ad Age </em>article explained.</p>
<blockquote class="pullquote"><p>Spotify’s enormous access to mood-based data is a pillar of its value to brands and advertisers.</p></blockquote>
<p>Four years later, Spotify is the world’s biggest streaming subscription service, with 207 million users in seventy-nine different countries. And as Spotify has grown, its advertising machine has exploded. Of those 207 million users, it claims 96 million are subscribers, meaning that 111 million users rely on the ad-supported version. Spotify top marketing execs have expressed that the company’s ambition is “<a href="https://www.wsj.com/articles/spotify-has-big-ambitions-for-ad-business-1502964001">absolutely</a>” to be the third-largest player in the digital ad market behind Google and Facebook. In turn, since 2015, Spotify’s strategic use of mood and emotion-based targeting has only become even more entrenched in its business model.</p>
<p>“At Spotify we have a personal relationship with over 191 million people who show us their true colors with zero filter,” reads a current advertising deck. “That&#8217;s a lot of authentic engagement with our audience: billions of data points every day across devices! This data fuels Spotify&#8217;s streaming intelligence—our secret weapon that gives brands the edge to be relevant in real-time moments.&#8221; Another brand-facing pitch proclaims: “The most exciting part? This new research is starting to reveal the streaming generation’s offline behaviors through their streaming habits.”</p>
<p>Today, Spotify Ad Studio, a self-service portal automating the ad-purchase process, promises access to “rich and textured datasets,” allowing brands to instantly target their ads by mood and activity categories like “Chill,” “Commute,” “Dinner,” “Focus/Study,” “Girls Night Out,” and more. And across the Spotify for Brands website are a number of “studies” and “insights reports” regarding research that Spotify has undertaken about streaming habits: “You are what you stream,” they reiterate over and over.</p>
<p>In a 2017 package titled <em>Understanding People Through Music—Millennial Edition,</em> Spotify (with help from “youth marketing and millennial research firm” Ypulse) set out to help advertisers better target millennial users by mood, emotion, and activity specifically. Spotify explains that “unlike generations past, millennials aren’t loyal to any specific music genre.” They conflate this with a greater reluctance toward labels and binaries, pointing out the rising number of individuals who identify as gender fluid and the growing demographic of millennials who do not have traditional jobs—and chalk these up to consumer preferences. “This throws a wrench in marketers’ neat audience segmentations,” Spotify commiserates.</p>
<p>For the study, they also gathered six hundred in-depth “day in a life” interviews recorded as “behavioral diaries.” All participants were surveilled by demographics, platform usage, playlist behavior, feature usage, and music tastes—and in the United States (where privacy is taken less seriously), Spotify and Ypulse were able to pair Spotify’s own streaming data with additional third-party information on “broader interests, lifestyle, and shopping behaviors.”</p>
<p>The result is an interactive hub on the Spotify for Brands website detailing seven distinct “key audio streaming moments for marketers to tap into,” including Working, Chilling, Cooking, Chores, Gaming, Workout, Partying, and Driving. Spotify also dutifully outlines recommendations for how to use this information to sell shit, alongside success stories from Dunkin’ Donuts, Snickers, Gatorade, Wild Turkey, and BMW.</p>
<p>More startlingly, for each of these “moments” there is an animated trajectory of a typical “emotional journey” claiming to predict the various emotional states users will experience while listening to particular playlists. Listeners who are “working,” for instance, are likely to start out feeling pressured and stressed, before they become more energized and focused and end up feeling fine and accomplished at the end of the playlist queue. If they listen while doing chores, the study claims to know that they start out feeling stressed and lazy, then grow motivated and entertained, and end by feeling similarly good and accomplished.</p>
<p>In Spotify’s world, listening data has become the oil that fuels a monetizable metrics machine, pumping the numbers that lure advertisers to the platform. In a data-driven listening environment, the commodity is no longer <em>music</em>. The commodity is <em>listening</em>. The commodity is <em>users </em>and their <em>moods</em>. The commodity is <em>listening habits</em> as <em>behavioral data</em>. Indeed, what Spotify calls “streaming intelligence” should be understood as surveillance of its users to fuel its own growth and ability to sell mood-and-moment data to brands.</p>
<p class="parasectionhed">A Leviathan of Ads</p>
<p>The potential of music to provide mood-related data useful to marketers has long been studied. In 1990, the <em>Journal of Marketing</em> published an article dubbed “Music, Mood and Marketing” that surveyed some of this history while bemoaning how “despite being a prominent promotional tool, music is not well understood or controlled by marketers.” The text outlines how “marketers are precariously dependent on musicians for their insight into the selection or composition of the ‘right’ music for particular situations.” This view of music as a burdensome means to a marketer’s end is absurd, but it’s also the logic that rules the current era of algorithmic music platforms. Unsurprisingly, this 1990 article aimed to overcome challenges for marketers by figuring out new ways to extract value from music that would be beyond the control of musicians themselves: studying the “behavioral effects” of music with a “special emphasis on music’s emotional expressionism and role as mood influencer” in order to create new forms of power and control.<em> </em></p>
<p>Today, marketers want mood-related data more than ever, at least in part to fuel automated, personalized ad targeting. In 2016, the world’s largest holding company for advertising and PR agencies, WPP, <a href="https://www.prnewswire.com/news-releases/wpps-data-alliance-and-spotify-announce-global-data-partnership-300362733.html">announced</a> that it had struck a multi-year partnership with Spotify, giving the conglomerate unprecedented access to Spotify’s mood data specifically. The partnership with the WPP, it turns out, was part of Spotify’s plan to ramp up its advertising business in advance of its IPO.</p>
<blockquote class="pullquote"><p>In a data-driven listening environment, the commodity is no longer <em>music</em>. The commodity is <em>listening</em>. The commodity is <em>users</em> and their <em>moods</em>.</p></blockquote>
<p>WPP is the parent company to several of the world’s largest and oldest advertising, PR, and brand consulting firms, including Ogilvy, Grey Global Group, and at least eighteen others. Across their portfolio, WPP owns companies that work with numerous mega-corporations and household brands, helping shill everything from cars, Coca-Cola, and KFC to booze, banks, and Burger King. Over the decades, these companies have worked on campaigns spanning from Merrill Lynch and Lay’s potato chips to Colgate-Palmolive and Ford. Additionally, WPP properties also include tech-focused companies that claim proficiency in automation- and personalization-driven ad sales, all of which would now benefit from Spotify’s mood data.</p>
<p>The 2016 announcement of WPP and Spotify’s global partnership in “data, insights, creative, technology, innovation, programmatic solution, and new growth markets” speaks for itself:</p>
<blockquote>
<p>WPP now has unique listening preferences and behaviors of Spotify&#8217;s 100 million users in 60 countries. The multi-year deal provides differentiating value to WPP and its clients by harnessing insights from the connection between music and audiences&#8217; moods and activities. Music attributes such as tempo and energy are proven to be highly relevant in predicting mood, which enables advertisers to understand their audiences in a new emotional dimension.</p>
</blockquote>
<p>What’s more, WPP-owned advertising agencies could now access the “Wunderman Zipline™ Data Management Platform” to gain direct access to Spotify users’ “mood, listening and playlist behavior, activity and location.” They’d also potentially make use of “Spotify’s data on connected device usage” while the WPP-owned company GroupM specifically would retain access to “an exclusive infusion of Spotify data” into its own platform made for corporate ad targeting. Per the announcement, WPP companies would also serve as launch partners for new types of advertising and new markets unveiled by Spotify, while procuring “visibility into Spotify&#8217;s product roadmap and access to beta testing.”</p>
<p>At the time the partnership was announced, Anas Ghazi, then Managing Director of Global Partnerships at WPP’s Data Alliance, noted that all WPP agencies would be able to “grab these insights. . . . If you think about how music shapes your activity and thoughts, this is a new, unique play for us to find audiences. Mood and moments are the next pieces of understanding audiences.” And Harvey Goldhersz, then CEO of GroupM Data &amp; Analytics, salivated: “The insights we&#8217;ll develop from Spotify’s behavioral data will help our clients realize a material marketplace advantage, aiding delivery of ads that are appropriate to the consumer&#8217;s mood and the device used.”</p>
<p class="parasectionhed">Ongoing Synergies</p>
<p>While this deal was announced via the WPP Data Alliance, visiting that particular organization’s website now auto-directs back to the main WPP website, likely a result of corporate restructuring that WPP underwent over the past year. Currently, the only public-facing evidence of the relationship between WPP and Spotify is <a href="http://www.kantar.com/about/partners">listed</a> online under the WPP-owned data and insights company Kantar, which WPP describes as “the world’s leading marketing data, insight and consultancy company.”</p>
<p>What might Kantar be doing with this user data? The current splash video deck on its website is useful: it claims to be the first agency to use “facial recognition in advertising testing,” and it professes to be exploring new technologies “from biodata and biometrics and healthcare, to capturing human sentiment and even voting intentions by analyzing social media.” And, finally, it admits to “exploiting big data, artificial intelligence and analytics . . . to predict attitudes and behavior.”</p>
<p>When we reached out to see if the relationship between Kantar and Spotify had changed since the initial 2016 announcement, Kantar sent <em>The Baffler</em> this statement:</p>
<blockquote>
<p>The 2016 Spotify collaboration was the first chapter of many, a collaboration that has continued to evolve to meet the dynamic needs of our clients and marketplace. Spotify continues to be a valued partner of larger enterprise and Kantar with on-going synergies.</p>
</blockquote>
<p>One year after the announcement of the partnership, in 2017, Spotify further confirmed its desire to establish direct relationships with the world’s biggest advertising agencies when it hired two executives from WPP: Angela Solk, now Global Head of Agency Partnerships, whose job at Spotify includes teaching WPP and other ad agencies how to best make use of Spotify’s first-party data. (In Solk’s first year at Spotify, she helped create the Smirnoff Equalizer; in a 2018 interview, she reflected on the “beauty” of that branded content campaign and Spotify’s ability to extract listener insight and make it “part of Smirnoff’s DNA.”) Spotify also hired Craig Weingarten as its Head of Industry, Automotive, who now leads Spotify’s Detroit-based auto ad sales team.</p>
<p>According to its own media narrative, Spotify offers data access to brands that competitor platforms do not, and it has gained a reputation for its eagerness to share its first-party data. At advertising conferences and in the ad press, Spotify top ad exec Marco Bertozzi has emphasized how Spotify hopes to widely share first-party data, going so far as to <a href="https://www.thedrum.com/news/2017/03/17/spotify-europe-vp-when-other-walled-gardens-say-no-data-questions-we-say-yes">confess</a>, “When other walled gardens say no to data questions . . . we say yes.” (Bertozzi was also the mind behind an internal Spotify campaign adorably branded “#LoveAds” to combat growing societal disgust with digital advertising. #LoveAds started as a mantra of the advertising team, but as Bertozzi proudly <a href="https://www.campaignlive.co.uk/article/stop-knocking-advertising-learn-loveads/1519385">explained</a> in late 2018, “#LoveAds became a movement within the company.”)</p>
<p>Spotify has spent tremendous energy on its ad team’s proficiency with cross-device advertising options (likely due to the imminent ubiquity of Spotify in the car and the so-called “smart home”), as well as “programmatic advertising,” otherwise understood as the targeted advertising sold through an automated process, often in milliseconds—Spotify seeks to be the most dominant seller of such advertising in the audio space. And there’s also the self-serve platform, direct inventory sales, Spotify’s private marketplace (an invite-only inventory for select advertisers), programmatic guaranteed deals (a guaranteed volume of impressions at a fixed price)—the jargon ad-speak lists could go on and on.</p>
<blockquote class="pullquote"><p>According to its own media narrative, Spotify offers data access to brands that competitor platforms do not.</p></blockquote>
<p>Trying to keep tabs on Spotify’s advertising products and partnerships is dizzying. But what is clear is that the hype surrounding these partnerships has often focused on “moods and moments”-related data Spotify offers brands—not to mention the company’s penchant for allowing brands to combine their own data with Spotify’s. In 2017, Spotify’s Brian Benedik <a href="https://www.thedrum.com/news/2017/06/15/spotify-hits-140-million-monthly-users-it-notes-explosive-growth-automated-audio-ads">told <em>The Drum</em></a> that Spotify’s access to listening habits and first-party data is “one of the reasons that some of these big multi-national brands like the Samsungs and the Heinekens and the Microsofts and Procter and Gambles of the world are working with us a lot closer than they ever have . . . they don’t see that or get that from any other platform out there.” And it appears that things will only get darker. Julie Clark, Spotify’s Global Head of Automation Sales, said earlier this year in an interview that its targeting capabilities are growing: “There’s deeper first party-data that’s going to become available as well.”</p>
<p class="parasectionhed">Mood-Boosterism</p>
<p>Recently, I tried out a mood-related experiment on Spotify. I created a new account and only listened to the “Coping with Loss” playlist on loop for a few days. I paid particular attention to the advertisements that I was served by Spotify. And while I do not know for sure whether listening to the “Coping with Loss” playlist caused me to receive an unusually nostalgic Home Depot ad about how your carpets contain memories, or an ad for a particularly angsty new album called <em>Amidst the Chaos</em>, the extent to which Spotify is matching moods and emotions with advertisements certainly makes it seem possible. What was clearer: during my time spent listening exclusively to songs about grieving, Spotify was quick to recommend that I brighten my mood. Under the heading “More like Coping With Loss . . .” it recommended playlists themed for Father’s Day and Mother’s Day, and playlists called “Warm Fuzzy Feelings,” “Soundtrack Love Songs,” “90s Love Songs,” “Love Ballads,” and “Acoustic Hits.” Spotify evidently did not want me to sit with my sorrow; it wanted my mood to improve. It wanted me to be happy.</p>
<p>This is because Spotify specifically wants to be seen as a mood-boosting platform. In Spotify for Brands blog posts, the company routinely emphasizes how its own platform distinguishes itself from other streams of digital content, particularly because it gives marketers a chance to reach users through a medium that is widely seen as a “<em>positive enhancer</em>”: a medium they turn to for “music to help them get through the less desirable moments in their day, improve the more positive ones and even discover new things about their personality,” says Spotify.</p>
<p>“We’re quite unique in that we have people’s ears . . . combine that with the psycho-graphic data that we have and that becomes very powerful for brands,” said Jana Jakovljevic <a href="https://www.thedrum.com/news/2015/11/03/spotify-rolling-out-its-programmatic-advertising-offering">in 2015</a>, then Head of Programmatic Solutions; she is now employed by AI ad-tech company <a href="https://cognitiv.ai/">Cognitiv</a>, which claims to be “the first neural network technology that unearths patterns of consumer behavior” using “deep learning” to predict and target consumers.</p>
<blockquote class="pullquote"><p>During my time spent listening exclusively to songs about grieving, Spotify was quick to recommend that I brighten my mood.</p></blockquote>
<p>The fact that experience at Spotify could prepare someone for such a career shift is worth some reflection. But more interestingly, Jakovljevic added that Spotify was using this data in many ways, including to determine exactly what type of music to recommend, which is important to remember: the data that is used to sell advertisers on the platform is also the data driving recommendations. The platform can recommend music in ways that appease advertisers while promising them that <em>mood-boosting</em> ad space. What’s in question here isn’t just how Spotify monitors and mines data on our listening in order to use its “audience segments” as a form of currency—but also how it then creates environments more suitable for advertisers through what it recommends, manipulating future listening on the platform.</p>
<p>In appealing to advertisers, Spotify also celebrates its position as a <em>background experience</em> and in particular how this benefits advertisers and brands. Jorge Espinel, who was Head of Global Business Development at Spotify for five years, once said in an interview: “We love to be a background experience. You&#8217;re competing for consumer attention. Everyone is fighting for the foreground. We have the ability to fight for the background. And really no one is there. You’re doing your email, you’re doing your social network, etcetera.” In other words, it is in advertisers’ best interests that Spotify <em>stays a background experience</em>.</p>
<p>When a platform like Spotify sells advertisers on its mood-boosting, background experience, and then bakes these aims into what it recommends to listeners, a twisted form of behavior manipulation is at play. It’s connected to what Shoshana Zuboff, in <em>The Age of Surveillance Capitalism: The Fight for A Human Future at the New Frontier of Power</em>, calls the “behavioral futures market”—where “many companies are eager to lay their bets on our future behavior.”</p>
<p>Indeed, Spotify seeks not just to <em>monitor and mine </em>our mood, but also to manipulate future behavior. “What we’d ultimately like to do is be able to predict people’s behavior through music,” Les Hollander, the Global Head of Audio and Podcast Monetization, <a href="https://adexchanger.com/ad-exchange-news/podcast-spotify-blazed-trail-audio-ads/">said</a> in 2017. “We know that if you&#8217;re listening to your chill playlist in the morning, you may be doing yoga, you may be meditating . . . so we’d serve a contextually relevant ad with information and tonality and pace to that particular moment.” Very Zen!</p>
<p>Spotify’s treatment of its mood and emotion data as a form of currency in the greater data marketplace should be considered more generally in the context of the tech industry’s rush to quantify our emotions. There is a burgeoning industry surrounding technology that alleges to mine our emotional states in order to feed AI projects; take, for example, car companies that claim they can use facial recognition to read your mood and keep you safer on the road. Or Facebook’s patents on facial recognition software. Or unnerving technologies like Affectiva, which claim to be developing an industry around “emotion AI” and “affective computing” processes that measure human emotions.</p>
<p>It remains to be seen how Spotify could leverage such tech to maintain its reputation as a mood-boosting platform. And yet we should admit that it’s good for business for Spotify to manipulate people’s emotions on the platform toward feelings of chillness, contentment, and happiness. This has immense consequences for music, of course, but what does it mean for news and politics and culture at large, as the platform is set to play a bigger role in mediating all of the above, especially as its podcasting efforts grow?</p>
<p>On the Spotify for Brands blog, the streaming giant explains that its research shows millennials are weary of most social media and news platforms, feeling that these mediums affect them negatively. Spotify is a solution for brands, it explains, because it is a platform where people go to feel <em>good</em>. Of course, in this telling of things, Spotify conveniently ignores why those other forms of media feel so bad. It’s because they are platforms that prioritize their own product and profit above all else. It’s because they are platforms governed by nothing more than surveillance technology and the mechanisms of advertising.</p>

<!doctype html><!-- This is a valid HTML5 document. -->
<!-- Screen readers, SEO, extensions and so on. -->
<html lang="fr">
<!-- Has to be within the first 1024 bytes, hence before the <title>
See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
<meta charset="utf-8">
<!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
<!-- The viewport meta is quite crowded and we are responsible for that.
See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
<meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1,shrink-to-fit=no">
<!-- Required to make a valid HTML5 document. -->
<title>Who Listens to the Listeners? (archive) — David Larlet</title>
<!-- Lightest blank gif, avoids an extra query to the server. -->
<link rel="icon" href="data:;base64,iVBORw0KGgo=">
<!-- Thank you Florens! -->
<link rel="stylesheet" href="/static/david/css/style_2020-01-09.css">
<!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" crossorigin>

<meta name="robots" content="noindex, nofollow">
<meta content="origin-when-cross-origin" name="referrer">
<!-- Canonical URL for SEO purposes -->
<link rel="canonical" href="https://librarianshipwreck.wordpress.com/2019/12/06/who-listens-to-the-listeners/">

<body class="remarkdown h1-underline h2-underline hr-center ul-star pre-tick">

<article>
<h1>Who Listens to the Listeners?</h1>
<h2><a href="https://librarianshipwreck.wordpress.com/2019/12/06/who-listens-to-the-listeners/">Source originale du contenu</a></h2>
<p>It can be kind of fun to look through the record collections of your friends. Whether this collection consists of actual records, cassette tapes, compact discs, or just a lengthy “library” list you scroll through – it presents an opportunity to discover new musicians, listen to cool songs you haven’t listened to in a while, and there is always a certain sense that you get to know a person a bit better by getting to know their musical taste. Granted, it has become progressively trickier to sift through your friends’ records as these collections have ceased being an accumulation of physical objects and have increasingly become an unseen database of their streaming preferences.</p>

<p>Luckily, as the year comes to a close, the music streaming platform Spotify provides its users with a tantalizing (and ready to share) glimpse of their listening habits. It’s fun! It’s shareable! And it’s a troubling reminder that being “fun” and “shareable” is one of the primary ways by which rampant surveillance becomes normalized.</p>

<p>Recently, Spotify users found themselves prompted to “see how you listened in 2019,” and if they chose to tap on the prompt they were delivered a colorful tour through their past year in music. Users were informed that “in 2019, your sound changed with the seasons,” were told of the geographical distribution of musicians they listened to, were praised for being “genre fluid” (and told their top music genres), were told how many new artists they discovered this year, were reminded of the musicians and songs they listened to the most in the year, were confronted with the number of total minutes they had spent listening to music on Spotify, and given that the decade is wrapping up Spotify even goes so far as to crown “your artist of the decade.” It’s a brief, colorful tour through the year that was, brought to life with a peppering of clips from the songs a user had listened to most in the previous year. And throughout, the user is continually encouraged to “share this story,” and it is even suggested a listener should tweet a message out to the “one lucky artist” who was their number one for the year.</p>

<p>Spotify Wrapped presents an amusing mix of information that easily blends the nostalgic with the mildly embarrassing, and stirs it all together with an assortment of impressive numbers. It reminds the user what songs they had on loop last Winter (perhaps reminding them, for good or ill, what it was that was prompting them to listen to those songs on loop). Similarly, it can lead to an amused self-deprecating feeling that the top artists and songs weren’t the ones a user had expected. And it is easy to be shocked and impressed by just how many new artists a user discovered (ostensibly thanks to the platform), and the number of total minutes spent listening can be strikingly large. What’s more, the constant prompting to “share this story” pushes the act of privately listening to music into something easy to broadcast, and these are colorful, entertaining graphics made expressly for the purposes of sharing.</p>

<p>But there is also something deeply creepy about Spotify Wrapped. For it is a very clear reminder to Spotify users that when they are listening to music, the platform is listening to them do it. Spotify knows how many times you’ve listened to a particular song, to a particular artist, to a particular genre, and how much total time you’ve spent listening to music on the platform. And though you may have forgotten what song you listened to the most in 2017, Spotify definitely remembers. There are not many occasions when a tech company merrily confronts its users with a sampling of the data it’s gathering on them, but Spotify Wrapped does just that.</p>

<p>The obvious rejoinder to this is that “of course Spotify does this!” And, in fairness, it is not as though Spotify Wrapped reveals something particularly scandalous. After all, this monitoring is easily understood within the context of the platform wherein the number of listens needs to be tracked so that Spotify can continue paying hundredths of cents to musicians, and because Spotify needs to know what a user is listening to in order to better tailor recommendations to them (and to other users). This does not even fully amount to a case in which users could be up in arms about agreeing to something they didn’t understand that was buried in the bowels of a terms of service agreement. Furthermore, it is not as though this is the first time that Spotify has put together such year-end reports, so there isn’t even the shock at something new. And yet, with its year end wrap up, Spotify provides a fairly clear reminder to its users that it really knows an unnerving amount about them.</p>

<p>What Spotify presents its users with is a very appealing bargain: a user can listen to tons of music (choosing to pay a monthly fee if they’d prefer to do so without advertisements), and in exchange the platform will get to gather all of the data about these listening habits. And this is a deal that a user enters into even if they choose to pay for the service. The user gets the music, the platform gets the data, and the musician might get a couple of pennies if enough people are listening. Part of what makes this deal seem so good is that, for the user, it seems like forking over their data will only result in better recommendations – and though they may understand that Spotify reserves the right to sell (or do whatever it wants with) the data, that seems much more abstract than getting to listen to music. The negative potential, and future risks, are hidden behind enough levels of benefits that the users ignore the downsides (<a href="https://librarianshipwreck.wordpress.com/2018/08/02/from-megatechnic-bribe-to-megatechnic-blackmail-mumfords-megamachine-after-the-digital-turn/">this is a perfect example of what the social critic Lewis Mumford called “the megatechnic bribe”</a>). And thus, in the guise of a seemingly innocuous tradeoff (in which the user thinks they’re really getting the benefit), the user accepts being subjected to high-tech corporate surveillance.</p>

<p>Importantly, this is one of the primary ways in which such surveillance gets normalized. Users just take for granted that the price of using a platform (even if they’re actually paying for it) is to let that platform gather their data. To make matters worse, this is eventually turned into something celebratory as users are encouraged to share Spotify’s yearly surveillance report with their friends. As a result, this corporate surveillance becomes not only something one is subjected to, but something that one advertises as being fun. By broadcasting their Spotify Wrapped list, users not only further normalize surveillance, but they make it seem fun and cool – pushing those who have not received similar reports to feel like they’re missing out. Of course, Spotify is hardly the only music streaming platform that is gathering data on the listening habits of its users (they all are), but the blithe way in which it reframes surveillance as a celebration provides an opportunity to pause and consider how much privacy is being surrendered, and the way in which doing that is presented as an amusing game.</p>

<p>Lest there be any doubt, Spotify Wrapped is hardly the top surveillance concern at the moment. Battles over facial recognition are setting themselves up to be major civil rights issues in the years ahead; the avalanche of “smart home” devices raises the specter of how fully private spaces will be taken over by constant corporate surveillance; the Amazon Ring is driving important debates over questions of policing, institutional racism, and panoptic surveillance; and the data practices of companies like Google and Facebook continue to spark fresh anxieties. In comparison to such things, some bright graphics about favorite songs really do seem innocuous.</p>

<p>Yet Spotify Wrapped should concern us, precisely because it seems so banal. High-tech surveillance succeeds by slowly chipping away at the obstacles to its acceptance. It does not start with the total takeover; rather, it begins on a smaller scale, presenting itself as harmless and enjoyable. As people steadily grow accustomed to this sort of surveillance, as they come to see themselves as its beneficiaries instead of as its victims, they become open to a little bit more surveillance, and a little bit more surveillance, and a little bit more. This is the steady wearing down of defenses, the slow transformation of corporate creepiness into cultural complacency, that allows rampant high-tech surveillance to progress. Surveillance is not always the high-tech dystopia kicking down the door, sometimes it’s the pair of wonderful headphones that keep people distracted so they don’t realize that the high-tech dystopia has climbed in through the window in the living room.</p>

<p>It can be fun to look through your friends’ record collections. It can be enjoyable to learn about what your friends have been listening to lately. But the companies that are subjecting you to constant high-tech surveillance aren’t your friends, even if they make brightly colored reports for you.</p>
</article>


<hr>

<footer>
<p>
<a href="/david/" title="Aller à l’accueil">🏠</a> •
<a href="/david/log/" title="Accès au flux RSS">🤖</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant>🇨🇦</a> •
<a href="mailto:david%40larlet.fr" title="Envoyer un courriel">📮</a> •
<abbr title="Hébergeur : Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340">🧚</abbr>
</p>
</footer>
<script src="/static/david/js/instantpage-3.0.0.min.js" type="module" defer></script>
</body>
</html>

title: Who Listens to the Listeners?
url: https://librarianshipwreck.wordpress.com/2019/12/06/who-listens-to-the-listeners/
hash_url: 662f4136a25b828f662a6e822d85575d

<p>It can be kind of fun to look through the record collections of your friends. Whether this collection consists of actual records, cassette tapes, compact discs, or just a lengthy “library” list you scroll through – it presents an opportunity to discover new musicians, listen to cool songs you haven’t listened to in a while, and there is always a certain sense that you get to know a person a bit better by getting to know their musical taste. Granted, it has become progressively trickier to sift through your friend’s records as these collections have ceased being an accumulation of physical objects and have increasingly become an unseen database of their streaming preferences.</p>
<p>Luckily as the year comes to a close, the music streaming platform Spotify provides its users with a tantalizing (and ready to share) glimpse of their listening habits. It’s fun! It’s shareable! And it’s a troubling reminder that being “fun” and “shareable” is one of the primary ways by which rampant surveillance becomes normalized.</p>
<p>Recently, Spotify users found themselves prompted to “see how you listened in 2019,” and if they chose to tap on the prompt they were delivered a colorful tour through their past year in music. Users were informed that “in 2019, your sound changed with the seasons,” were told of the geographical distribution of musicians they listened to, were praised for being “genre fluid” (and told their top music genres), were told how many new artists they discovered this year, were reminded of the musicians and songs they listened to the most in the year, were confronted with the number of total minutes they had spent listening to music on Spotify, and given that the decade is wrapping up Spotify even goes so far as to crown “your artist of the decade.” It’s a brief, colorful tour through the year that was, brought to life with a peppering of clips from the songs a user had listened to most in the previous year. And throughout, the user is continually encouraged to “share this story,” and it is even suggested a listener should tweet a message out to the “one lucky artist” who was their number one for the year.</p>
<p>Spotify Wrapped presents an amusing mix of information that easily blends the nostalgic, with the mildly embarrassing, and stirs it all together with an assortment of impressive numbers. It reminds the user what songs they had on loop last Winter (perhaps reminding them, for good or ill, what it was that was prompting them to listen to those songs on loop). Similarly, it can lead to an amused self-deprecating feeling that the top artists and songs weren’t the ones a user had expected. And it is easy to be shocked and impressed by just how many new artists a user discovered (ostensibly thanks to the platform), and the number of total minutes spent listening can be strikingly large. What’s more the constant prompting to “share this story,” pushes the act of privately listening to music into something easy to broadcast, and what’s more these are colorful entertaining graphics made expressly for the purposes of sharing.</p>
<p>But there is also something deeply creepy about Spotify Wrapped. For it is a very clear reminder to Spotify users that when they are listening to music the platform is listening to them do it. Spotify knows how many times you’ve listened to a particular song, to a particular artist, to a particular genre, and how much total time you’ve spent listening to music on the platform. And though you may have forgotten what song you listened to the most in 2017, Spotify definitely remembers. There are not many occasions when a tech company merrily confronts its users with a sampling of the data its gathering on them, but Spotify Wrapped does just that.</p>
<p>The obvious rejoinder to this is that “of course Spotify does this!” And, in fairness, it is not as though Spotify Wrapped reveals something particularly scandalous. After all, this monitoring is easily understood within the context of the platform wherein the number of listens needs to be tracked so that Spotify can continue paying hundredths of cents to musicians, and because Spotify needs to know what a user is listening to in order to better tailor recommendations to them (and to other users). This does not even fully amount to a case in which users could be up in arms about agreeing to something they didn’t understand that was buried in the bowels of a terms of service agreement. Furthermore, it is not as though this is the first time that Spotify has put together such year-end reports, so there isn’t even the shock at something new. And yet, with its year end wrap up, Spotify provides a fairly clear reminder to its users that it really knows an unnerving amount about them.</p>
<p>What Spotify presents its users with is a very appealing bargain: a user can listen to tons of music (choosing to pay a monthly fee if they’d prefer to do so without advertisements), and in exchange the platform will get to gather all of the data about these listening habits. And this is a deal that a user enters into even if they choose to pay for the service. The user gets the music, the platform gets the data, and the musician might get a couple of pennies if enough people are listening. Part of what makes this deal seem so good is that, for the user, it seems like forking over their data will only result in better recommendations – and though they may understand that Spotify reserves the right to sell (or do whatever it wants) with the data, that seems much more abstract than getting to listen to music. The negative potential, and future risks, are hidden behind enough levels of benefits that the users ignore the downsides (<a href="https://librarianshipwreck.wordpress.com/2018/08/02/from-megatechnic-bribe-to-megatechnic-blackmail-mumfords-megamachine-after-the-digital-turn/">this is a perfect example of what the social critic Lewis Mumford called “the megatechnic bribe</a>).  And thus, in the guise of a seemingly innocuous tradeoff (in which the user thinks they’re really getting the benefit), the user accepts being subjected to high-tech corporate surveillance.</p>
<p>Importantly, this is one of the primary ways in which such surveillance gets normalized. Users just take for granted that the price of using a platform (even if they’re actually paying for it) is to let that platform gather their data. To make matters worse, this is eventually turned into something celebratory as users are encouraged to share Spotify’s yearly surveillance report with their friends. As a result, this corporate surveillance becomes not only something one is subjected to, but something that one advertises as being fun. By broadcasting their Spotify Wrapped list, users not only further normalize surveillance, but they make it seem fun and cool – pushing those who have not received similar reports to have a feeling like they’re missing out. Of course, Spotify is hardly the only music streaming platform that is gathering data on the listening habits of its users (they all are), but the blithe way in which it reframes surveillance as a celebration provides an opportunity to pause and consider how much privacy is being surrendered, and the way in which doing that is presented as an amusing game.</p>
<p>Lest there be any doubt, Spotify Wrapped is hardly the top surveillance concern at the moment. Battles over facial recognition are setting themselves up to be major civil rights issues in the years ahead; the avalanche of “smart home” devices raises the specter of how fully private spaces will be taken over by constant corporate surveillance; the Amazon Ring is driving important debates over questions of policing, institutional racism, and panoptic surveillance; and the data practices of companies like Google and Facebook continue to spark fresh anxieties. In comparison to such things, some bright graphics about favorite songs really do seem innocuous.</p>
<p>Yet Spotify Wrapped should concern us, precisely because it seems so banal. High-tech surveillance succeeds by slowly chipping away at the obstacles to its acceptance. It does not start with the total takeover; rather, it begins on a smaller scale, presenting itself as harmless and enjoyable. As people steadily grow accustomed to this sort of surveillance, as they come to see themselves as its beneficiaries instead of as its victims, they become open to a little bit more surveillance, and a little bit more, and a little bit more. This is the steady wearing down of defenses, the slow transformation of corporate creepiness into cultural complacency, that allows rampant high-tech surveillance to progress. Surveillance is not always the high-tech dystopia kicking down the door; sometimes it’s the pair of wonderful headphones that keeps people distracted so they don’t realize that the high-tech dystopia has climbed in through the window in the living room.</p>
<p>It can be fun to look through your friends’ record collections. It can be enjoyable to learn about what your friends have been listening to lately. But the companies that are subjecting you to constant high-tech surveillance aren’t your friends, even if they make brightly colored reports for you.</p>

+ 137
- 0
cache/2020/77c968588b2e605d5b3050c45af53603/index.html View File

@@ -0,0 +1,137 @@
<!doctype html><!-- This is a valid HTML5 document. -->
<!-- Screen readers, SEO, extensions and so on. -->
<html lang="fr">
<!-- Has to be within the first 1024 bytes, hence before the <title>
See: https://www.w3.org/TR/2012/CR-html5-20121217/document-metadata.html#charset -->
<meta charset="utf-8">
<!-- Why no `X-UA-Compatible` meta: https://stackoverflow.com/a/6771584 -->
<!-- The viewport meta is quite crowded and we are responsible for that.
See: https://codepen.io/tigt/post/meta-viewport-for-2015 -->
<meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1,shrink-to-fit=no">
<!-- Required to make a valid HTML5 document. -->
<title>Driving surveillance: What does your car know about you? We hacked a 2017 Chevy to find out. (archive) — David Larlet</title>
<!-- Lightest blank gif, avoids an extra query to the server. -->
<link rel="icon" href="data:;base64,iVBORw0KGgo=">
<!-- Thank you Florens! -->
<link rel="stylesheet" href="/static/david/css/style_2020-01-09.css">
<!-- See https://www.zachleat.com/web/comprehensive-webfonts/ for the trade-off. -->
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_regular.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_bold.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/static/david/css/fonts/triplicate_t4_poly_italic.woff2" as="font" type="font/woff2" crossorigin>

<meta name="robots" content="noindex, nofollow">
<meta content="origin-when-cross-origin" name="referrer">
<!-- Canonical URL for SEO purposes -->
<link rel="canonical" href="https://www.washingtonpost.com/technology/2019/12/17/what-does-your-car-know-about-you-we-hacked-chevy-find-out/">

<body class="remarkdown h1-underline h2-underline hr-center ul-star pre-tick">

<article>
<h1>Driving surveillance: What does your car know about you? We hacked a 2017 Chevy to find out.</h1>
<h2><a href="https://www.washingtonpost.com/technology/2019/12/17/what-does-your-car-know-about-you-we-hacked-chevy-find-out/">Source originale du contenu</a></h2>
<p>Cars have become the most sophisticated computers many of us own, filled with hundreds of sensors. Even older models know an awful lot about you. Many copy over personal data as soon as you plug in a smartphone.</p>

<p>But for the thousands you spend to buy a car, the data it produces doesn’t belong to you. My Chevy’s dashboard didn’t say what the car was recording. It wasn’t in the owner’s manual. There was no way to download it.</p>

<p>To glimpse my car data, I had to hack my way in.</p>

<p>We’re at a turning point for driving surveillance: In the 2020 model year, most new cars sold in the United States will come with built-in Internet connections, including 100 percent of Fords, GMs and BMWs and all but one model from Toyota and Volkswagen. (This independent cellular service is often included free or sold as an add-on.) Cars are becoming smartphones on wheels, sending and receiving data from apps, insurance firms and pretty much wherever their makers want. Some brands even reserve the right to use the data to track you down if you don’t pay your bills.</p>

<p>When I buy a car, I assume the data I produce is owned by me — or at least is controlled by me. Many automakers do not. They act like how and where we drive, also known as telematics, isn’t personal information.</p>

<p>Cars now run on the new oil: your data. It is fundamental to a future of transportation where vehicles drive themselves and we hop into whatever one is going our way. Data isn’t the enemy. Connected cars already do good things like improve safety and send you service alerts that are much more helpful than a check-engine light in the dash.</p>

<p>But we’ve been down this fraught road before with <a href="https://www.washingtonpost.com/technology/2019/09/18/you-watch-tv-your-tv-watches-back/?tid=lk_inline_manual_15" target="_blank">smart speakers, smart TVs, smartphones</a> and all the other smart things we now realize are playing fast and loose with our personal lives. Once information about our lives gets shared, sold or stolen, we lose control.</p>

<p>There are no federal laws regulating what carmakers can collect or do with our driving data. And carmakers lag in taking steps to protect us and draw lines in the sand. Most hide what they’re collecting and sharing behind privacy policies written in the kind of language only a lawyer’s mother could love.</p>

<p>Car data has a secret life. To find out what a car knows about me, I borrowed some techniques from crime scene investigators.</p>

<h3 class="font--subhead color-gray-darkest ma-0">What your car knows</h3>

<p>Jim Mason hacks into cars for a living, but usually just to better understand crashes and thefts. The Caltech-trained engineer works in Oakland, Calif., <a href="https://arcca.com/our-experts/james-j-mason/">for a firm called ARCCA</a> that helps reconstruct accidents. He agreed to help conduct a forensic analysis of my privacy.</p>

<p>I chose a Chevrolet as our test subject because its maker GM has had longer than any other automaker to figure out data transparency. It began connecting cars with its <a href="https://www.onstar.com/us/en/home/" target="_blank">OnStar service</a> in 1996, initially to summon emergency assistance. Today GM has more than 11 million 4G LTE data-equipped vehicles on the road, including free basic service and extras you pay for. I found a volunteer, Doug, who let us peer inside his two-year-old Chevy Volt.</p>

<p>I met Mason at an empty warehouse, where he began by explaining one important bit of car anatomy. Modern vehicles don’t just have one computer. There are multiple, interconnected brains that can generate up to 25 gigabytes of data per hour from sensors all over the car. Even with Mason’s gear, we could only access some of these systems.</p>

<p>This kind of hacking isn’t a security risk for most of us — it requires hours of physical access to a vehicle. Mason brought a laptop, special software, a box of circuit boards, and dozens of sockets and screwdrivers.</p>

<p>We focused on the computer with the most accessible data: the infotainment system. You might think of it as the car’s touch-screen audio controls, yet many systems interact with it, from navigation to a synced-up smartphone. The only problem? This computer is buried beneath the dashboard.</p>

<p>After an hour of prying and unscrewing, our Chevy’s interior looked like it had been lobotomized. But Mason had extracted the infotainment computer, about the size of a small lunchbox. He clipped it into a circuit board, which fed into his laptop. The data didn’t copy over in our first few attempts. “There is a lot of trial and error,” said Mason.</p>

<p>(Don’t try this at home. Seriously — we had to take the car into a repair shop to get the infotainment computer reset.)</p>

<p>It was worth the trouble when Mason showed me my data. There on a map was the precise location where I’d driven to take apart the Chevy. There were my other destinations, like the hardware store I’d stopped at to buy some tape.</p>

<p>Among the trove of data points were unique identifiers for my and Doug’s phones, and a detailed log of phone calls from the previous week. There was a long list of contacts, right down to people’s addresses, emails and even photos.</p>

<p>For a broader view, Mason also extracted the data from a Chevrolet infotainment computer that I bought used on eBay for $375. It contained enough data to reconstruct the Upstate New York travels and relationships of a total stranger. We know he or she frequently called someone listed as “Sweetie,” whose photo we also have. We could see the exact Gulf station where they bought gas, the restaurant where they ate (called Taste China) and the unique identifiers for their Samsung Galaxy Note phones.</p>

<p>Infotainment systems can collect even more. Mason has hacked into Fords that record locations once every few minutes, even when you don’t use the navigation system. He’s seen German cars with 300-gigabyte hard drives — five times as much as a basic iPhone 11. The Tesla Model 3 can <a href="https://www.washingtonpost.com/technology/2018/08/02/behind-wheel-tesla-model-its-giant-iphone-better-worse/?tid=lk_inline_manual_39" target="_blank">collect video snippets </a>from the car’s many cameras. Coming next: face data, used to personalize the vehicle and track driver attention.</p>

<p>In our Chevy, we probably glimpsed just a fraction of what GM knows. We didn’t see what was uploaded to GM’s computers, because we couldn’t access the live OnStar cellular connection. (Researchers have done those kinds of hacks before to prove connected vehicles <a href="https://www.wired.com/2015/07/hackers-remotely-kill-jeep-highway/" target="_blank">can be remotely controlled</a>.)</p>

<p>My volunteer car owner Doug asked GM to see the data it collected and shared. The automaker just pointed us to an obtuse privacy policy. Doug also (twice) sent GM a formal request <a href="https://epic.org/privacy/profiling/sb27.html">under a 2003 California data law</a> to ask who the company shared his information with. He got no reply.</p>

<p>GM spokesman David Caldwell declined to offer specifics on Doug’s Chevy but said the data GM collects generally falls into three categories: vehicle location, vehicle performance and driver behavior. “Much of this data is highly technical, not linkable to individuals and doesn’t leave the vehicle itself,” he said.</p>

<p>The company, he said, collects real-time data to monitor vehicle performance to improve safety and to help design future products and services.</p>

<p>But there were clues to what more GM knows on its website and app. It offers a Smart Driver score — a measure of good driving — based on how hard you brake and turn and how often you drive late at night. They’ll share that with insurance companies, if you want. With paid OnStar service, I could, on demand, locate the car’s exact location. It also offers in-vehicle WiFi and remote key access for Amazon package deliveries. An OnStar Marketplace connects the vehicle directly with third-party apps for Domino’s, IHOP, Shell and others.</p>

<p>The OnStar <a href="https://www.onstar.com/us/en/privacy_statement/">privacy policy</a>, possibly only ever read by yours truly, grants the company rights to a broad set of personal and driving data without much detail on when and how often it might collect it. It says: “We may keep the information we collect for as long as necessary” to operate, conduct research or satisfy GM’s contractual obligations. Translation: pretty much forever.</p>

<p>It’s likely GM and other automakers keep just a slice of the data cars generate. But think of that as a temporary phenomenon. Coming 5G cellular networks promise to link cars to the Internet with ultra-fast, ultra-high-capacity connections. As wireless connections get cheaper and data becomes more valuable, anything the car knows about you is fair game.</p>

<h3 class="font--subhead color-gray-darkest ma-0">Protecting yourself</h3>

<p>GM’s view, echoed by many other automakers, is that we gave them permission for all of this. “Nothing happens without customer consent,” said GM’s Caldwell.</p>

<p>When my volunteer Doug bought his Chevy, he didn’t even realize OnStar basic service came standard. (I don’t blame him — who really knows what all they’re initialing on a car purchase contract?) There is no button or menu inside the Chevy to shut off OnStar or other data collection, though GM says it has added one to newer vehicles. Customers can press the console OnStar button and ask a representative to remotely disconnect.</p>

<p>What’s the worry? From conversations with industry insiders, I know many automakers haven’t totally figured out what to do with the growing amounts of driving data we generate. But that’s hardly stopping them from collecting it.</p>

<p>Five years ago, 20 automakers <a href="https://autoalliance.org/connected-cars/automotive-privacy/">signed on to voluntary privacy standards</a>, pledging to “provide customers with clear, meaningful information about the types of information collected and how it is used,” as well as “ways for customers to manage their data.” But when I called eight of the largest automakers, not even one offered a dashboard for customers to look at, download and control their data.</p>

<p>Automakers haven’t had a data reckoning yet, but they’re due for one. <a href="https://www.freep.com/story/money/cars/general-motors/2018/10/01/gm-radio-listening-habits-advertising/1424294002/">GM ran an experiment</a> in which it tracked the radio music tastes of 90,000 volunteer drivers to look for patterns in where they traveled. According to the Detroit Free Press, GM told marketers that the data might help them persuade a country music fan who normally stopped at Tim Horton’s to go to McDonald’s instead.</p>

<p>GM would not tell me exactly what data it collected for that program but said “personal information was not involved” because it was anonymized data. (Privacy advocates have warned that location data is personal because it can be re-identified with individuals, since we follow such unique patterns.)</p>

<p>GM’s privacy policy, which the company says it will update before the end of 2019, says it may “use anonymized information or share it with third parties for any legitimate business purpose.” Such as whom? “The details of those third-party relationships are confidential,” said Caldwell.</p>

<p>There are more questions. GM’s privacy policy says it will comply with legal data demands. How often does it share our data with the government? GM doesn’t offer a transparency report like tech companies do.</p>

<p>Automakers say they put data security first. But I suspect they’re just not used to customers demanding transparency. They also probably want to have sole control over the data, given that the industry’s existential threats — self-driving and ride-hailing technologies — are built on it.</p>

<p>But not opening up brings problems, too. <a href="https://safeandsecuredata.org/" target="_blank">Automakers</a> are battling with <a href="https://yourcaryourdata.org/">repair shops </a>in <a href="https://www.wbur.org/bostonomix/2019/08/06/right-to-repair-ballot-measure" target="_blank">Massachusetts</a> about a proposal that would require car companies to grant owners — and mechanics — access to telematics data. The Auto Care Association says locking out independent shops could give consumers fewer choices and make us end up paying more for service. The automakers say it’s a security and privacy risk.</p>

<p>In 2020, the <a href="https://www.washingtonpost.com/technology/2019/09/02/california-adopted-countrys-first-major-consumer-privacy-law-now-silicon-valley-is-trying-rewrite-it/?tid=lk_inline_manual_67" target="_blank">California Consumer Privacy Act</a> will require any company that collects personal data about the state’s residents to provide access to the data and give people the ability to opt out of its sharing. GM said it would comply with the law but didn’t say how.</p>

<p>Are any carmakers better? Among the privacy policies I read, <a href="https://www.toyota.com/privacyvts/">Toyota’s stood out</a> for drawing a few clear lines in the sand about data sharing. It says it won’t share “personal information” with data resellers, social networks or ad networks — but still carves out the right to share what it calls “vehicle data” with business partners.</p>

<p>Until automakers put even a fraction of the effort they put into TV commercials into giving us control over our data, I’d be wary about using in-vehicle apps or signing up for additional data services. At least smartphone apps like Google Maps let you turn off and delete location history.</p>

<p>And Mason’s hack brought home a scary reality: Simply plugging a smartphone into a car could put your data at risk. If you’re selling your car or returning a lease or rental, take the time to delete the data saved on its infotainment system. An <a href="https://www.privacy4cars.com/home/default.aspx">app called Privacy4Cars</a> offers model-by-model directions. Mason gives out gifts of car-lighter USB plugs, which let you charge a phone without connecting it to the car computer. (You can buy inexpensive ones online.)</p>

<p>If you’re buying a new vehicle, tell the dealer you want to know about connected services — and how to turn them off. Few offer an Internet “kill switch,” but they may at least allow you to turn off location tracking.</p>

<p>Or, for now at least, you can just buy an old car. Mason, for one, drives a conspicuously non-connected 1992 Toyota.</p>
</article>


<hr>

<footer>
<p>
<a href="/david/" title="Aller à l’accueil">🏠</a> •
<a href="/david/log/" title="Accès au flux RSS">🤖</a> •
<a href="http://larlet.com" title="Go to my English profile" data-instant>🇨🇦</a> •
<a href="mailto:david%40larlet.fr" title="Envoyer un courriel">📮</a> •
<abbr title="Hébergeur : Alwaysdata, 62 rue Tiquetonne 75002 Paris, +33184162340">🧚</abbr>
</p>
</footer>
<script src="/static/david/js/instantpage-3.0.0.min.js" type="module" defer></script>
</body>
</html>

+ 45
- 0
cache/2020/77c968588b2e605d5b3050c45af53603/index.md View File

@@ -0,0 +1,45 @@
title: Driving surveillance: What does your car know about you? We hacked a 2017 Chevy to find out.
url: https://www.washingtonpost.com/technology/2019/12/17/what-does-your-car-know-about-you-we-hacked-chevy-find-out/
hash_url: 77c968588b2e605d5b3050c45af53603

<p>Cars have become the most sophisticated computers many of us own, filled with hundreds of sensors. Even older models know an awful lot about you. Many copy over personal data as soon as you plug in a smartphone.</p>
<p>But for the thousands you spend to buy a car, the data it produces doesn’t belong to you. My Chevy’s dashboard didn’t say what the car was recording. It wasn’t in the owner’s manual. There was no way to download it.</p>
<p>To glimpse my car data, I had to hack my way in.</p>
<p>We’re at a turning point for driving surveillance: In the 2020 model year, most new cars sold in the United States will come with built-in Internet connections, including 100 percent of Fords, GMs and BMWs and all but one model Toyota and Volkswagen. (This independent cellular service is often included free or sold as an add-on.) Cars are becoming smartphones on wheels, sending and receiving data from apps, insurance firms and pretty much wherever their makers want. Some brands even reserve the right to use the data to track you down if you don’t pay your bills.</p>
<p>When I buy a car, I assume the data I produce is owned by me — or at least is controlled by me. Many automakers do not. They act like how and where we drive, also known as telematics, isn’t personal information.</p>
<p>Cars now run on the new oil: your data. It is fundamental to a future of transportation where vehicles drive themselves and we hop into whatever one is going our way. Data isn’t the enemy. Connected cars already do good things like improve safety and send you service alerts that are much more helpful than a check-engine light in the dash.</p>
<p>But we’ve been down this fraught road before with <a href="https://www.washingtonpost.com/technology/2019/09/18/you-watch-tv-your-tv-watches-back/?tid=lk_inline_manual_15" target="_blank">smart speakers, smart TVs, smartphones</a> and all the other smart things we now realize are playing fast and loose with our personal lives. Once information about our lives gets shared, sold or stolen, we lose control.</p>
<p>There are no federal laws regulating what carmakers can collect or do with our driving data. And carmakers lag in taking steps to protect us and draw lines in the sand. Most hide what they’re collecting and sharing behind privacy policies written in the kind of language only a lawyer’s mother could love.</p>
<p>Car data has a secret life. To find out what a car knows about me, I borrowed some techniques from crime scene investigators.</p><p><h3 class="font--subhead color-gray-darkest ma-0">What your car knows</h3></p><p>Jim Mason hacks into cars for a living, but usually just to better understand crashes and thefts. The Caltech-trained engineer works in Oakland, Calif., <a href="https://arcca.com/our-experts/james-j-mason/">for a firm called ARCCA</a> that helps reconstruct accidents. He agreed to help conduct a forensic analysis of my privacy.</p>
<p>I chose a Chevrolet as our test subject because its maker GM has had the longest of any automaker to figure out data transparency. It began connecting cars with its <a href="https://www.onstar.com/us/en/home/" target="_blank">OnStar service</a> in 1996, initially to summon emergency assistance. Today GM has more than 11 million 4G LTE data-equipped vehicles on the road, including free basic service and extras you pay for. I found a volunteer, Doug, who let us peer inside his two-year-old Chevy Volt.</p>
<p>I met Mason at an empty warehouse, where he began by explaining one important bit of car anatomy. Modern vehicles don’t just have one computer. There are multiple, interconnected brains that can generate up to 25 gigabytes of data per hour from sensors all over the car. Even with Mason’s gear, we could only access some of these systems.</p>
<p>This kind of hacking isn’t a security risk for most of us — it requires hours of physical access to a vehicle. Mason brought a laptop, special software, a box of circuit boards, and dozens of sockets and screwdrivers.</p>
<p>We focused on the computer with the most accessible data: the infotainment system. You might think of it as the car’s touch-screen audio controls, yet many systems interact with it, from navigation to a synced-up smartphone. The only problem? This computer is buried beneath the dashboard.</p>
<p>After an hour of prying and unscrewing, our Chevy’s interior looked like it had been lobotomized. But Mason had extracted the infotainment computer, about the size of a small lunchbox. He clipped it into a circuit board, which fed into his laptop. The data didn’t copy over in our first few attempts. “There is a lot of trial and error,” said Mason.</p>
<p>(Don’t try this at home. Seriously — we had to take the car into a repair shop to get the infotainment computer reset.)</p>
<p>It was worth the trouble when Mason showed me my data. There on a map was the precise location where I’d driven to take apart the Chevy. There were my other destinations, like the hardware store I’d stopped at to buy some tape.</p>
<p>Among the trove of data points were unique identifiers for my and Doug’s phones, and a detailed log of phone calls from the previous week. There was a long list of contacts, right down to people’s address, emails and even photos.</p>
<p>For a broader view, Mason also extracted the data from a Chevrolet infotainment computer that I bought used on eBay for $375. It contained enough data to reconstruct the Upstate New York travels and relationships of a total stranger. We know he or she frequently called someone listed as “Sweetie,” whose photo we also have. We could see the exact Gulf station where they bought gas, the restaurant where they ate (called Taste China) and the unique identifiers for their Samsung Galaxy Note phones.</p>
<p>Infotainment systems can collect even more. Mason has hacked into Fords that record locations once every few minutes, even when you don’t use the navigation system. He’s seen German cars with 300-gigabyte hard drives — five times as much as a basic iPhone 11. The Tesla Model 3 can <a href="https://www.washingtonpost.com/technology/2018/08/02/behind-wheel-tesla-model-its-giant-iphone-better-worse/?tid=lk_inline_manual_39" target="_blank">collect video snippets </a>from the car’s many cameras. Coming next: face data, used to personalize the vehicle and track driver attention.</p>
<p>In our Chevy, we probably glimpsed just a fraction of what GM knows. We didn’t see what was uploaded to GM’s computers, because we couldn’t access the live OnStar cellular connection. (Researchers have done those kinds of hacks before to prove connected vehicles <a href="https://www.wired.com/2015/07/hackers-remotely-kill-jeep-highway/" target="_blank">can be remotely controlled</a>.)</p>
<p>My volunteer car owner Doug asked GM to see the data it collected and shared. The automaker just pointed us to an obtuse privacy policy. Doug also (twice) sent GM a formal request <a href="https://epic.org/privacy/profiling/sb27.html">under a 2003 California data law</a> to ask who the company shared his information with. He got no reply.</p>
<p>GM spokesman David Caldwell declined to offer specifics on Doug’s Chevy but said the data GM collects generally falls into three categories: vehicle location, vehicle performance and driver behavior. “Much of this data is highly technical, not linkable to individuals and doesn’t leave the vehicle itself,” he said.</p>
<p>The company, he said, collects real-time data to monitor vehicle performance to improve safety and to help design future products and services.</p>
<p>But there were clues to what more GM knows on its website and app. It offers a Smart Driver score — a measure of good driving — based on how hard you brake and turn and how often you drive late at night. They’ll share that with insurance companies, if you want. With paid OnStar service, I could, on demand, locate the car’s exact location. It also offers in-vehicle WiFi and remote key access for Amazon package deliveries. An OnStar Marketplace connects the vehicle directly with third-party apps for Domino’s, IHOP, Shell and others.</p>
<p>The OnStar <a href="https://www.onstar.com/us/en/privacy_statement/">privacy policy</a><u>,</u> possibly only ever read by yours truly, grants the company rights to a broad set of personal and driving data without much detail on when and how often it might collect it. It says: “We may keep the information we collect for as long as necessary” to operate, conduct research or satisfy GM’s contractual obligations. Translation: pretty much forever.</p>
<p>It’s likely GM and other automakers keep just a slice of the data cars generate. But think of that as a temporary phenomenon. Coming 5G cellular networks promise to link cars to the Internet with ultra-fast, ultra-high-capacity connections. As wireless connections get cheaper and data becomes more valuable, anything the car knows about you is fair game.</p><p><h3 class="font--subhead color-gray-darkest ma-0">Protecting yourself</h3></p><p>GM’s view, echoed by many other automakers, is that we gave them permission for all of this. “Nothing happens without customer consent,” said GM’s Caldwell.</p>
<p>When my volunteer Doug bought his Chevy, he didn’t even realize OnStar basic service came standard. (I don’t blame him — who really knows what all they’re initialing on a car purchase contract?) There is no button or menu inside the Chevy to shut off OnStar or other data collection, though GM says it has added one to newer vehicles. Customers can press the console OnStar button and ask a representative to remotely disconnect.</p>
<p>What’s the worry? From conversations with industry insiders, I know many automakers haven’t totally figured out what to do with the growing amounts of driving data we generate. But that’s hardly stopping them from collecting it.</p>
<p>Five years ago, 20 automakers <a href="https://autoalliance.org/connected-cars/automotive-privacy/">signed on to volunteer privacy standards</a><u>,</u> pledging to “provide customers with clear, meaningful information about the types of information collected and how it is used,” as well as “ways for customers to manage their data.” But when I called eight of the largest automakers, not even one offered a dashboard for customers to look at, download and control their data.</p>
<p>Automakers haven’t had a data reckoning yet, but they’re due for one. <a href="https://www.freep.com/story/money/cars/general-motors/2018/10/01/gm-radio-listening-habits-advertising/1424294002/">GM ran an experiment</a> in which it tracked the radio music tastes of 90,000 volunteer drivers to look for patterns with where they traveled. According to the Detroit Free Press, GM told marketers that the data might help them persuade a country music fan who normally stopped at Tim Horton’s to go to McDonald’s instead.</p>
<p>GM would not tell me exactly what data it collected for that program but said “personal information was not involved” because it was anonymized data. (Privacy advocates have warned that location data is personal because it can be re-identified with individuals because we follow such unique patterns.)</p>
<p>GM’s privacy policy, which the company says it will update before the end of 2019, says it may “use anonymized information or share it with third parties for any legitimate business purpose.” Such as whom? “The details of those third-party relationships are confidential,” said Caldwell.</p>
<p>There are more questions. GM’s privacy policy says it will comply with legal data demands. How often does it share our data with the government? GM doesn’t offer a transparency report like tech companies do.</p>
<p>Automakers say they put data security first. But I suspect they’re just not used to customers demanding transparency. They also probably want to have sole control over the data, given that the industry’s existential threats — self-driving and ride-hailing technologies — are built on it.</p>
<p>But not opening up brings problems, too. <a href="https://safeandsecuredata.org/" target="_blank">Automakers</a> are battling with <a href="https://yourcaryourdata.org/">repair shops </a>in <a href="https://www.wbur.org/bostonomix/2019/08/06/right-to-repair-ballot-measure" target="_blank">Massachusetts</a> about a proposal that would require car companies to grant owners — and mechanics — access to telematics data. The Auto Care Association says locking out independent shops could give consumers fewer choices and make us end up paying more for service. The automakers say it’s a security and privacy risk.</p>
<p>In 2020, the <a href="https://www.washingtonpost.com/technology/2019/09/02/california-adopted-countrys-first-major-consumer-privacy-law-now-silicon-valley-is-trying-rewrite-it/?tid=lk_inline_manual_67" target="_blank">California Consumer Privacy Act</a> will require any company that collects personal data about the state’s residents to provide access to the data and give people the ability to opt out of its sharing. GM said it would comply with the law but didn’t say how.</p>
<p>Are any carmakers better? Among the privacy policies I read, <a href="https://www.toyota.com/privacyvts/">Toyota’s stood out</a> for drawing a few clear lines in the sand about data sharing. It says it won’t share “personal information” with data resellers, social networks or ad networks — but still carves out the right to share what it calls “vehicle data” with business partners.</p>
<p>Until automakers put even a fraction of the effort they put into TV commercials into giving us control over our data, I’d be wary about using in-vehicle apps or signing up for additional data services. At least smartphone apps like Google Maps let you turn off and delete location history.</p>
<p>And Mason’s hack brought home a scary reality: Simply plugging a smartphone into a car could put your data at risk. If you’re selling your car or returning a lease or rental, take the time to delete the data saved on its infotainment system. An <a href="https://www.privacy4cars.com/home/default.aspx">app called Privacy4Cars</a> offers model-by-model directions. Mason gives out car-lighter USB plugs as gifts; they let you charge a phone without connecting it to the car’s computer. (You can buy inexpensive ones online.)</p>
<p>If you’re buying a new vehicle, tell the dealer you want to know about connected services — and how to turn them off. Few offer an Internet “kill switch,” but they may at least allow you to turn off location tracking.</p>
<p>Or, for now at least, you can just buy an old car. Mason, for one, drives a conspicuously non-connected 1992 Toyota.</p>
