A place to cache linked articles (think custom and personal wayback machine)

title: Facebook’s Unknowable Megascale
url: https://daringfireball.net/2020/12/facebook_unknowable_megascale
hash_url: 352d2966ecda7a68cbf97efbc691d017
<p>Adrienne LaFrance, writing for The Atlantic, “<a href="https://www.theatlantic.com/technology/archive/2020/12/facebook-doomsday-machine/617384/">Facebook Is a Doomsday Machine</a>”:</p>
<blockquote>
<p>People tend to complain about Facebook as if something recently curdled. There’s a notion that the social web was once useful, or at least that it could have been good, if only we had pulled a few levers: some moderation and fact-checking here, a bit of regulation there, perhaps a federal antitrust lawsuit. But that’s far too sunny and shortsighted a view. Today’s social networks, Facebook chief among them, were built to encourage the things that make them so harmful. It is in their very architecture.</p>
<p>I’ve been thinking for years about what it would take to make the social web magical in all the right ways — less extreme, less toxic, more true — and I realized only recently that I’ve been thinking far too narrowly about the problem. I’ve long wanted Mark Zuckerberg to admit that <a href="https://twitter.com/AdrienneLaF/status/910493155421822976">Facebook is a media company</a>, to take responsibility for the informational environment he created in the same way that the editor of a magazine would. (I pressed him on this <a href="https://www.theatlantic.com/technology/archive/2018/05/mark-zuckerberg-doesnt-understand-journalism/559424/">once</a> and he laughed.) In recent years, as Facebook’s mistakes have compounded and its reputation has tanked, it has become clear that negligence is only part of the problem. No one, not even Mark Zuckerberg, can control the product he made. I’ve come to realize that Facebook is not a media company. It’s a Doomsday Machine.</p>
</blockquote>
<p>This is a very compelling and cogent essay, and I largely agree with LaFrance. But here I disagree: Zuckerberg clearly <em>can</em> control it. There are dials on the algorithms that control what users see in their feeds. What can’t be controlled is what happens as Facebook pursues <em>engagement</em>. What keeps too many people hooked to Facebook is exactly the sort of worldview-warping toxic content that is damaging society worldwide. To some degree Facebook’s addictiveness and toxicity are directly correlated. This isn’t conjecture or speculation, <a href="https://daringfireball.net/linked/2020/11/24/facebook-sociopaths">we have proof</a>. Plus, we have eyes: in some ways the societal harm from Facebook is as easy for anyone to see as the respiratory problems caused by smoking. I honestly believe Zuckerberg would prefer to reduce the toxicity of Facebook’s social media platforms, but not enough to do so if it reduces Facebook’s addictiveness. Again, likewise, I’m sure tobacco company executives would have loved to invent tobacco products that didn’t cause cancer.</p>
<p>A key insight from LaFrance:</p>
<blockquote>
<p>The website that’s perhaps best known for encouraging mass violence is the image board 4chan — which was followed by 8chan, which then became 8kun. These boards are infamous for being the sites where multiple mass-shooting suspects have shared manifestos before homicide sprees. The few people who are willing to defend these sites unconditionally do so from a position of free-speech absolutism. That argument is worthy of consideration. But there’s something architectural about the site that merits attention, too: There are no algorithms on 8kun, only a community of users who post what they want. People use 8kun to publish abhorrent ideas, but at least the community isn’t pretending to be something it’s not. The biggest social platforms claim to be similarly neutral and pro–free speech when in fact no two people see the same feed. Algorithmically tweaked environments feed on user data and manipulate user experience, and not ultimately for the purpose of serving the user. Evidence of real-world violence can be easily traced back to both Facebook and 8kun. But 8kun doesn’t manipulate its users or the informational environment they’re in. Both sites are harmful. But Facebook might actually be worse for humanity.</p>
</blockquote>
<p>This is <em>the</em> problem we, collectively, have not grasped. How do we regulate — via the law and/or social norms — a form of mass media with amorphous content? When you make a movie or write a book or publish a magazine, the speech that matters is the content of the movie/book/magazine. When you post something to Facebook, the “speech” that matters most isn’t the content of the post but the algorithm that determines who sees it and how. 3 billion users effectively means there are 3 billion different “Facebooks”. <em>That’s</em> the “megascale” which LaFrance equates to the megadeaths of a Strangelovian doomsday device.</p>
<p>A mere “website” — say, Wikipedia — that reaches an audience of billions is like the surface of an ocean: enormously expansive, but visible. Facebook is like the <em>volume</em> of an ocean: not merely massive, but unknowable.</p>
<p>We instinctively think that 8kun is “worse” than Facebook because its users are free to post the worst content imaginable, and because they are terribly imaginative, do. It feels like 8kun must be “worse” because its <em>content</em> is worse — what is permitted, and what actually is posted. But Facebook is in fact far worse, because by its nature we, as a whole, can’t even see what “Facebook” is because everyone’s feed is unique. 8kun, at least, is a knowable product. You could print it out and say, “Here is what 8kun was on December 29, 2020.” How could you ever say what Facebook is at any given <em>moment</em>, let alone for a given day, let alone as an omnipresent daily presence in <em>billions</em> of people’s lives?</p>
<p>A question I’ve pondered these last few post-election weeks: What would have happened if Mark Zuckerberg were all-in on Trump? What if instead of flagging and tamping down on Trump’s utterly false but profoundly destructive “election fraud” anti-democratic power grab, Facebook had done the opposite and pushed the narrative Trump wants? What if Trump owned Facebook? What if Zuckerberg ran for president, lost, and pursued a similar “turn your supporters against democracy” strategy?</p>
<p>Is there any reason to believe that Facebook chose the pre- and post-election course it did because it was the right thing to do — good for the United States, good for the world, good for the principles of democracy and truth — rather than the result of a cold calculus that determined it was the optimal way to keep the most people the most engaged with Facebook?</p>
<p>I, for one, believe Facebook charted a course around this election primarily with Facebook’s continuing addictiveness in mind. But I <em>know</em> that whatever the reasons, they were ultimately determined by one person. That’s quite a thing.</p>