Platforms' Election 'Fixes' Are Rooted in Flawed Philosophy

Just 50 days out, Facebook, Google, and Twitter have done little more than roll out small patches—recalling the old coding ethos of "worse is better."

Three and a half years ago, as the country came to understand the outsize role of social media manipulation in electing Donald Trump, you might have imagined that by the next time around the major platforms would have profoundly changed the way politics is conducted online and come to grips with their essential design flaws. Instead, 50 days before Election Day, they have devised what amounts to a series of patches and fixes to a buggy operating system, not vital safeguards for our democracy.

At Facebook, Mark Zuckerberg announced no fewer than 10 election-related fixes, including a ban on campaigns introducing new ads in the week before Election Day and the addition of election officials to a category of “high-risk people” needing protection from online intimidation. Google, for its part, said on Thursday that it would turn off autocomplete suggestions for election-related searches, because those suggestions often repeat false information or reflect political bias. That same day, Twitter announced that it would label, play down, or remove election-related material that fits its categories of distortion and disinformation. YouTube had already rolled out a series of election correctives, including a pledge to remove any material that contains hacked information about a political candidate or material “encouraging others to interfere with democratic processes.”

Senator Elizabeth Warren, among the most prominent critics of Silicon Valley on Capitol Hill, quickly pointed out the inadequacy of Zuckerberg’s proposals in a tweet: “Not enough. Facebook has repeatedly fumbled its responsibility to protect our democracy. Now the stakes are higher than ever—and they need to do more than make small, performative tweaks.”

Warren’s point is clear: We need nothing less than commitments from these platforms that they will rid themselves of bad actors intent on disrupting free and fair elections. But the big tech companies are constitutionally incapable of providing such fundamental guarantees—and this is by design. They are in the business of providing small corrections, quickly assessed and likely quickly corrected yet again. They do this not to evade responsibility, but because this is their time-tested way of adapting to fast-changing circumstances.

This approach represents the “move fast” part of Facebook’s much-derided philosophy of “move fast and break things,” which, despite the company’s public disavowals over the years, remains at the heart of how it and its peers operate. In this moment of political crisis, however, when the algorithms have been shown to shape behavior far beyond the digital realm, the smallness of such a vision is proving a tragic weakness of these celebrated companies. Their broken code may be repairable, but the damage it does to the real world is not.

The notorious motto’s roots go deeper than the walls of Facebook in its early days. Recently, as I was trying to understand why Zuckerberg’s response to election abuse seemed so hyperactive yet so inadequate, a programmer friend sent me to an essay from the early 1990s about coding best practices by Richard P. Gabriel called “Worse Is Better,” which influenced Facebook’s early thinking. Gabriel knew that his title was absurd—and he later recanted it through an essay he wrote under a pen name with the equally absurd title “Worse Is Better Is Worse”—but there is an insight in that original essay that Silicon Valley leaders, often with a background in coding, took to heart, and haven’t seemed to let go of.

In his original essay, Gabriel argues that code, rather than aspiring to be perfect or elegant, should be simple to install and apply universally so that it can spread like a virus, a term that he explicitly uses. The shift from the so-called perfectionism of traditional programming would be subtle. Gabriel advises trying to be “correct in all observable aspects,” but cautions that “it is slightly better to be simple than correct.”
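Gabriel’s own canonical illustration of that tradeoff, which he calls the interrupted-system-call problem, is worth pausing on. The “right thing” would be for the operating system to resume an interrupted call transparently; the worse-is-better design simply fails the call and makes every user program retry. The following C sketch is my illustration of that bargain, not something from this article; the wrapper name read_retrying is hypothetical.

```c
/* Illustrative sketch only (not from the article): the interrupted-system-call
 * example from Gabriel's essay. The worse-is-better design keeps the kernel
 * simple by letting a slow call fail with EINTR when a signal arrives, so
 * every careful caller ends up carrying a retry loop like this one. */
#include <errno.h>
#include <sys/types.h>
#include <unistd.h>

ssize_t read_retrying(int fd, void *buf, size_t count) {
    ssize_t n;
    do {
        n = read(fd, buf, count);         /* may be cut short by a signal */
    } while (n == -1 && errno == EINTR);  /* caller, not kernel, absorbs it */
    return n;
}
```

The correctness lives in user code: the interface is slightly worse, but the implementation is simple enough to spread everywhere, which is exactly the bargain Gabriel was describing.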

That “slightly better” does a lot of work in explaining where we are today.

Gabriel was concerned about the delay and complexity that comes from being right from the start, as opposed to being on the seemingly right path. “The worse-is-better software first will gain acceptance,” Gabriel predicts, “second will condition its users to expect less, and third will be improved to a point that is almost the right thing.” In other words, for much of the time you live with a flawed product, though Gabriel optimistically described this stage as “half of the right thing.” If all goes according to plan, you end up with very nearly the right thing, and a lot faster than trying to accomplish this in one fell swoop.

Evidently, we are still at the maddening stage in the social networks’ evolution in which they are half of the right thing: They provide an amazing ability to connect people across the globe and share information, ideas, and deep emotions, while also allowing rampant harassment, disinformation, and conspiracy theories. It’s a hefty price, and our acceptance of it raises the question: Is worse better?

I emailed Gabriel, nearly 30 years after his essay first spread among programmers via early email, to ask if he saw worse-is-better thinking in the way Facebook approaches problems like the 2020 election. He responded with a link to a keynote speech from 2009 by Facebook’s head of engineering at the time, Robert Johnson, with the title, “Moving Fast at Scale—Lessons Learned at Facebook.”

These were Facebook’s glory days—it had grown to 300 million users, all the while being generally beloved by its audience. As told by Johnson, Facebook’s insistence on moving fast and breaking things came from a place of humility and a drive to make something better than its leaders could even imagine. The key point is not to let anything—certainly not a few broken features in a huge platform—stop you from testing a new idea.

“This notion we should slow down and get it right doesn’t actually make much sense,” Johnson said back in 2009, “because, unless you have an extremely good idea of what right is, slowing down to get something right just means you are going to take longer before you figure out that you are wrong. And, more importantly, why you are wrong, and move on to the next thing.”

The fatal flaw at Facebook and the prominent companies that followed this move-fast-and-correct-quickly philosophy isn’t simply the rapidity or all the broken stuff; it’s that, by involving themselves so deeply in divisive politics, they are applying a programming philosophy to a system that is not nearly as resilient as computer code. A few crashes of the Facebook site, or other user-experience blemishes, may well be the price you pay to discover an important innovation in building the architecture of a global platform. But when it comes to democracy and tolerance, the crashes caused by bad designs can be genuinely catastrophic, acquiring a momentum of their own. In the latter case, fixes sent out from headquarters are likely to be inadequate.

“I didn’t design ‘worse is better’ to be a moral approach to designing and making things,” Gabriel wrote in an email, but rather as a way for a community “to help design and build a thing that works for them.” However, he wrote, “when the ‘test’ includes either directly or indirectly ad revenue, paid content, political considerations, patronage, and corruption, the evolutionary arc of the platform can go haywire.” He said that, had Facebook limited this fast-moving technique to matters of back-end programming, we wouldn’t be in our troubling situation involving free and fair elections and the social networks. Worse Is Better is good for coding code, but not for coding society.

Maybe this separation between code and society is fanciful—after all, Facebook is built on code, and that code reaches far and wide, influencing our thoughts and actions beyond the screen and inevitably changing society. But the least we should insist on is that Silicon Valley coders step back from consciously shaping society—whether that means creating algorithms to surface articles they think will interest you, or anticipating what you might be searching for, or helping groups of like-minded people organize. They have exactly the wrong kind of training and instincts for this important work.

There are, of course, many differences between coding code and coding society. But perhaps most important is that the ultimate failsafe for programmers—reinstalling the system and starting over—just isn’t available for matters of democracy.
