title: Facebook’s Unknowable Megascale
url: https://daringfireball.net/2020/12/facebook_unknowable_megascale
Adrienne LaFrance, writing for The Atlantic, “Facebook Is a Doomsday Machine”:
People tend to complain about Facebook as if something recently curdled. There’s a notion that the social web was once useful, or at least that it could have been good, if only we had pulled a few levers: some moderation and fact-checking here, a bit of regulation there, perhaps a federal antitrust lawsuit. But that’s far too sunny and shortsighted a view. Today’s social networks, Facebook chief among them, were built to encourage the things that make them so harmful. It is in their very architecture.
I’ve been thinking for years about what it would take to make the social web magical in all the right ways — less extreme, less toxic, more true — and I realized only recently that I’ve been thinking far too narrowly about the problem. I’ve long wanted Mark Zuckerberg to admit that Facebook is a media company, to take responsibility for the informational environment he created in the same way that the editor of a magazine would. (I pressed him on this once and he laughed.) In recent years, as Facebook’s mistakes have compounded and its reputation has tanked, it has become clear that negligence is only part of the problem. No one, not even Mark Zuckerberg, can control the product he made. I’ve come to realize that Facebook is not a media company. It’s a Doomsday Machine.
This is a very compelling and cogent essay, and I largely agree with LaFrance. But here I disagree: Zuckerberg clearly can control it. There are dials on the algorithms that control what users see in their feeds. What can’t be controlled is what happens as Facebook pursues engagement. What keeps too many people hooked on Facebook is exactly the sort of worldview-warping toxic content that is damaging society worldwide. To some degree, Facebook’s addictiveness and toxicity are directly correlated. This isn’t conjecture or speculation; we have proof. Plus, we have eyes: in some ways the societal harm from Facebook is as easy for anyone to see as the respiratory problems caused by smoking. I honestly believe Zuckerberg would prefer to reduce the toxicity of Facebook’s social media platforms, but not enough to do so if it reduces Facebook’s addictiveness. Likewise, I’m sure tobacco company executives would have loved to invent tobacco products that didn’t cause cancer.
A key insight from LaFrance:
The website that’s perhaps best known for encouraging mass violence is the image board 4chan — which was followed by 8chan, which then became 8kun. These boards are infamous for being the sites where multiple mass-shooting suspects have shared manifestos before homicide sprees. The few people who are willing to defend these sites unconditionally do so from a position of free-speech absolutism. That argument is worthy of consideration. But there’s something architectural about the site that merits attention, too: There are no algorithms on 8kun, only a community of users who post what they want. People use 8kun to publish abhorrent ideas, but at least the community isn’t pretending to be something it’s not. The biggest social platforms claim to be similarly neutral and pro–free speech when in fact no two people see the same feed. Algorithmically tweaked environments feed on user data and manipulate user experience, and not ultimately for the purpose of serving the user. Evidence of real-world violence can be easily traced back to both Facebook and 8kun. But 8kun doesn’t manipulate its users or the informational environment they’re in. Both sites are harmful. But Facebook might actually be worse for humanity.
This is the problem we, collectively, have not grasped. How do we regulate — via the law and/or social norms — a form of mass media with amorphous content? When you make a movie or write a book or publish a magazine, the speech that matters is the content of the movie/book/magazine. When you post something to Facebook, the “speech” that matters most isn’t the content of the post but the algorithm that determines who sees it and how. 3 billion users effectively means there are 3 billion different “Facebooks”. That’s the “megascale” which LaFrance equates to the megadeaths of a Strangelovian doomsday device.
A mere “website” — say, Wikipedia — that reaches an audience of billions is like the surface of an ocean: enormously expansive, but visible. Facebook is like the volume of an ocean: not merely massive, but unknowable.
We instinctively think that 8kun is “worse” than Facebook because its users are free to post the worst content imaginable, and because they are terribly imaginative, do. It feels like 8kun must be “worse” because its content is worse — what is permitted, and what actually is posted. But Facebook is in fact far worse, because by its nature we, as a whole, can’t even see what “Facebook” is because everyone’s feed is unique. 8kun, at least, is a knowable product. You could print it out and say, “Here is what 8kun was on December 29, 2020.” How could you ever say what Facebook is at any given moment, let alone for a given day, let alone as an omnipresent daily presence in billions of people’s lives?
A question I’ve pondered these last few post-election weeks: What would have happened if Mark Zuckerberg were all-in on Trump? What if instead of flagging and tamping down on Trump’s utterly false but profoundly destructive “election fraud” anti-democratic power grab, Facebook had done the opposite and pushed the narrative Trump wants? What if Trump owned Facebook? What if Zuckerberg ran for president, lost, and pursued a similar “turn your supporters against democracy” strategy?
Is there any reason to believe that Facebook chose the pre- and post-election course it did because it was the right thing to do — good for the United States, good for the world, good for the principles of democracy and truth — rather than the result of a cold calculus that determined it was the optimal way to keep the most people the most engaged with Facebook?
I, for one, believe Facebook charted a course around this election primarily with Facebook’s continuing addictiveness in mind. But I know that whatever the reasons, they were ultimately determined by one person. That’s quite a thing.