A place to cache linked articles (think custom and personal wayback machine)

title: The Latest Data Privacy Debacle
url: https://www.nytimes.com/2018/01/30/opinion/strava-privacy.html
hash_url: 21879bc7bba82e6781257300c3da60d0
  4. <p class="story-body-text story-content" data-para-count="500" data-total-count="2630" id="story-continues-3">Part of the problem with the ideal of individualized informed consent is that it assumes companies have the ability to inform us about the risks we are consenting to. They don’t. Strava surely did not intend to reveal the GPS coordinates of a possible Central Intelligence Agency annex in Mogadishu, Somalia — but it may have done just that. Even if all technology companies meant well and acted in good faith, they would not be in a position to let you know what exactly you were signing up for.</p><p class="story-body-text story-content" data-para-count="656" data-total-count="3286">Another part of the problem is the increasingly powerful computational methods called machine learning, which can take seemingly inconsequential data about you and, combining them with other data, can discover facts about you that you never intended to reveal. For example, research shows that data as minor as your Facebook “likes” can be used to infer your sexual orientation, whether you use addictive substances, your race and your views on many political issues. This kind of computational statistical inference is not 100 percent accurate, but it can be fairly close — certainly close enough to be used to profile you for a variety of purposes.</p><p class="story-body-text story-content" data-para-count="657" data-total-count="3943">A challenging feature of machine learning is that exactly how a given system works is opaque. Nobody — not even those who have access to the code and data — can tell what piece of data came together with what other piece of data to result in the finding the program made. This further undermines the notion of informed consent, as we do not know which data results in what privacy consequences. What we do know is that these algorithms work better the more data they have. This creates an incentive for companies to collect and store as much data as possible, and to bury the privacy ramifications, either in legalese or by playing dumb and being vague.</p><p class="story-body-text story-content" data-para-count="666" data-total-count="4609">What can be done? There must be strict controls and regulations concerning how all the data about us — not just the obviously sensitive bits — is collected, stored and sold. With the implications of our current data practices unknown, and with future uses of our data unknowable, data storage must move from being the default procedure to a step that is taken only when it is of demonstrable benefit to the user, with explicit consent and with clear warnings about what the company does and does not know. And there should also be significant penalties for data breaches, especially ones that result from underinvestment in secure data practices, as many now do.</p><button class="button comments-button theme-speech-bubble-large" data-skip-to-para-id="">
  5. </button>
  6. <p class="story-body-text story-content" data-para-count="427" data-total-count="5036">Companies often argue that privacy is what we sacrifice for the supercomputers in our pockets and their highly personalized services. This is not true. While a perfect system with no trade-offs may not exist, there are technological avenues that remain underexplored, or even actively resisted by big companies, that could allow many of the advantages of the digital world without this kind of senseless assault on our privacy.</p><p class="story-body-text story-content" data-para-count="239" data-total-count="5275">With luck, stricter regulations and a true consumer backlash will force our technological overlords to take this issue seriously and let us take back what should be ours: true and meaningful informed consent, and the right to be let alone.</p>