Why Facebook failed its civil rights audit

The new, deeply critical report highlights the tension between free expression and hate speech on the social network.

Mark Zuckerberg testifies about how his company will handle false and misleading information by political leaders during the 2020 campaign before the House Financial Services Committee in October 2019.
Chip Somodevilla / Getty Images

Facebook has failed on civil rights.

On Wednesday, after two years of work, the social media giant finally released the results of its independent audit, a wide-ranging report on the state of civil rights on Facebook, from hate speech to advertising to algorithmic bias. The auditors found that the company simply hasn’t done enough to combat hate and abuse on its platform.

Following up on two previous updates in December 2018 and June 2019, the audit concludes that the company’s handling of civil rights issues is “too reactive and piecemeal,” and ultimately raises doubts about whether Facebook is actually committed to addressing its myriad problems.

That’s especially concerning given that the November 2020 election is just months away.

Former ACLU director Laura W. Murphy, who led the report along with civil rights attorney Megan Cacace, compared Facebook’s work to climbing Mount Everest. She noted that though the social media company had made some progress, Facebook still hadn’t invested enough resources or moved quickly enough to address its many civil rights challenges, creating “legitimate questions about Facebook’s full-throated commitment to reaching the summit.”

The audit, which was commissioned by Facebook at the urging of civil rights leaders and politicians, comes amid a growing advertiser boycott of the platform called Stop Hate for Profit, which is led by civil rights groups including the NAACP, the Anti-Defamation League, and Color of Change, none of which seem to have any plans to halt their campaign. More than 1,000 companies have now signed on, despite CEO Mark Zuckerberg dismissing its impact.

For these leaders of the boycott, who have long tried to work alongside Facebook, the findings of the audit confirm much of what they’ve previously said about the company: that it isn’t taking issues around hate speech, bias, polarization, and diversity seriously enough.

“Ridding the platform of hate and misinformation against Black people only became a priority when there was a PR crisis to endure,” said Rashad Robinson, the president of Color of Change, who hinted that Congress may have a role in protecting civil rights on the ever-embattled platform.

The report is an important one for Facebook’s reputation, but it isn’t binding. Facebook can choose to implement the recommendations in the report or to dismiss them — which is what some advocates like Robinson fear. In a blog post announcing the report’s release on Wednesday, Facebook COO Sheryl Sandberg said that the company “won’t make every change they [auditors] call for,” but that it “will put more of their proposals into practice.”

Regardless of what the company ends up doing, the audit serves as a thorough examination of Facebook’s longstanding struggle to reconcile its stated values around free speech with the history of harm caused by unchecked vitriol and discrimination on its platform. With that overarching theme in mind, here are five key takeaways about Facebook and civil rights from the 89-page report.

1) Holding Trump to a different standard sets a troubling precedent

Facebook has failed to penalize Trump for violating its community guidelines, the auditors say, which stands “to gut policies” that had represented progress for civil rights on the platform. The report specifically highlights a group of Trump’s posts that made misleading claims about voting and the president’s infamous “looting … shooting” post about protesters. Echoing previous concerns from civil rights groups, the auditors say these posts clearly violate Facebook’s community guidelines and that not removing them establishes a concerning precedent for Trump and other politicians.

The voting-related posts by Trump referenced in the report include false claims about mail-in ballots in California, Michigan, and Nevada. Facebook ultimately decided that these posts did not violate its guidelines, arguing in the case of Michigan and Nevada that the language in the posts was merely “challenging the legality of officials.” The auditors explain that they “vehemently expressed” their view that the posts violated policy but were “not afforded an opportunity to speak directly to decision-makers” until after the final decision was made.

Facebook’s decisions, they said, constitute a “tremendous setback for all of the policies that attempt to ban voter suppression on Facebook.”

Trump’s “looting ... shooting” post represents a similar pattern of self-justified inaction. In that post, the president appeared to threaten violence against Black Lives Matter protesters, using language that echoed civil rights-era white segregationists. Though Facebook executives called the White House requesting that Trump change or delete the post, the company ultimately did nothing about it. By contrast, Twitter chose to label an identical post by President Trump on its platform for violating its rules about glorifying violence.

Facebook defended its decision by arguing that threats of state action are allowed on the platform. The auditors say that logic ignored “how such statements, especially when made by those in power and targeted toward an identifiable, minority community, condone vigilantism and legitimize violence against that community.” They added, “Random shooting is not a legitimate state use of force.” Again, the auditors say they were not included in the decision-making process in time. Facebook’s decision about the “looting … shooting” post, which Mark Zuckerberg later defended on a call with employees, prompted criticism from company executives and a virtual employee walkout. It was one of the incidents that inspired the Stop Hate for Profit boycott.

In June, Facebook announced that it would label posts that violate its community guidelines but are left up because they’re deemed newsworthy, meaning their public interest value eclipses the harm they cause. That exception appears to be rarely applied: The audit revealed that over the past year, the company invoked it for politicians only 15 times, and just once in the United States, though it was not immediately clear what those instances were.

Meanwhile, the company still hasn’t taken any action against Trump’s past posts, and the auditors concluded that for many civil rights advocates, “the damage has already been done.” Even if Facebook has policies supporting civil rights, the auditors concluded, the refusal to enforce them against Trump has eroded trust in the company and leaves room for other politicians to follow in Trump’s footsteps.

2) Valuing free speech above all else creates problems

While Facebook’s leadership has repeatedly emphasized the company’s commitment to free expression, the auditors found that this comes at a cost. Facebook systematically chooses to prioritize the speech of politicians over clamping down on harmful and hateful rhetoric, which hurts its users overall. Several times in the report, the auditors cite Zuckerberg’s 2019 speech at Georgetown as a “turning point,” one in which Facebook reaffirmed its commitment to free expression as “a governing principle of the platform.”

Facebook’s choice not to fact-check politicians — and to allow them to sometimes break Facebook’s own rules against posting harmful content because what politicians say is inherently newsworthy — represents another problem. Both steps have significantly hurt the company’s civil rights efforts, the auditors said. Allowing politicians to spread misinformation about voting, which Zuckerberg in his Georgetown speech argued was a form of free expression, particularly undermines Facebook’s commitment to its values. The auditors said they found Facebook’s prioritization of free speech over other values, like nondiscrimination and equality, “deeply troubling.”

By carving out exemptions for politicians’ content, they argue, a “hierarchy of speech is created that privileges certain voices over less powerful voices.”

The report argues that Facebook has so far failed to address the tension between its civil rights promises and its monolithic commitment to free expression, and that the company should instead work to develop a more comprehensive understanding of free speech, one that acknowledges how typical users actually experience the platform.

“For a 21st century American corporation, and for Facebook, a social media company that has so much influence over our daily lives, the lack of clarity about the relationship between those two values is devastating,” lead auditor Laura W. Murphy wrote in the report’s introduction. “It will require hard balancing, but that kind of balancing of rights and interests has been part of the American dialogue since its founding and there is no reason that Facebook cannot harmonize those values, if it really wants to do so.”

3) Hate speech is still a problem for Facebook, and we don’t know how bad it really is

Facebook has long struggled with hateful and violent speech on its platform, including from white nationalists streaming talk shows on Facebook Watch and members of the “boogaloo movement,” which promotes anti-government ideology and has instigated violence at recent racial justice protests.

Facebook’s audit highlights that the company has a long way to go in combating hate speech, particularly around white nationalism. Facebook has made some progress: It says it’s gotten better at identifying hate speech, and it now has a team of 350 people who work exclusively on combating dangerous groups on Facebook. But the auditors say hateful content often stays on the platform for longer than it should or doesn’t get removed in the first place. This is an “especially acute” problem with content targeting African Americans, Jews, and Muslims, according to the audit.

For example, the auditors asked Facebook to ban all content that promotes white nationalist or white separatist ideology, something it has so far failed to do. The company has explicitly banned phrases like “white nationalism” or “white separatism,” but that simplistic approach still allows racist content to continue to spread on the platform, the auditors said.

The audit also criticized Facebook for not taking down hateful events fast enough. The report highlights how in 2019, it took Facebook more than 24 hours to remove an event intended to physically intimidate attendees at the Islamic Society of North America’s annual meeting in Houston, Texas. Facebook has acknowledged its misstep with that incident, but auditors called for the company to fundamentally revise its review process to expedite the removal of such events. Properly moderating events, the report says, is essential “to ensure that people cannot use Facebook to organize calls to arms to harm or intimidate specific groups” during the current nationwide protests.

One thing complicating Facebook’s hate speech problem is the fact that there’s not enough hard data to know how bad it is or how it impacts different groups. The report says “the absence of data for analysis and study seems to undercut efforts to document and define the problem, identify its source, and explore potential mitigation.”

While the audit focused on issues of hate speech, it also touched on a related and even more complex issue that has dogged Facebook for years: whether its platform politically polarizes its users and how this might be connected to the hate speech that spreads on Facebook. A recent Wall Street Journal report found that Facebook’s leadership shut down efforts to make the site less divisive by shelving internal research on whether social media increases polarization. Facebook, and Zuckerberg in particular, has denied these claims and criticized the Journal’s reporting.

Zuckerberg has vehemently disputed the notion that Facebook is polarizing its users, arguing that on the whole the platform brings people together. The auditors questioned that conclusion, saying they “do not believe that Facebook is sufficiently attuned to the depth of concern on the issue of polarization and the way that the algorithms used by Facebook inadvertently fuel extreme and polarizing content.”

Under public pressure after the 2016 election, Facebook adjusted its News Feed algorithm to promote posts from friends and family over news articles. Still, the auditors believe this wasn’t sufficient action and that “Facebook should do everything in its power to prevent its tools and algorithms from driving people toward self-reinforcing echo chambers of extremism, and that the company must recognize that failure to do so can have dangerous (and life-threatening) real-world consequences.”

Facebook can do this, the auditors say, not just by removing hateful content but also by redirecting users “away from (rather than toward) extremist organizations” in the types of recommendations it makes.

4) Covid-19 showed Facebook can effectively police harmful content when it wants to

The Covid-19 pandemic raised the stakes for how the company handles harmful content. Notably, in response to the pandemic, Facebook began to aggressively take down misinformation related to Covid-19, removing hundreds of thousands of false posts that Facebook identified as having the potential to cause imminent physical harm.

This new approach contrasts starkly with how the company combats other types of misinformation, which Facebook has historically chosen not to act on. The report says that “Facebook has no qualms about reining in speech by the proponents of the anti-vaccination movement, or limiting misinformation about COVID-19, but when it comes to voting, Facebook has been far too reluctant to adopt strong rules to limit misinformation and voter suppression.”

Moderating pandemic-related content is also getting more complicated for the platform: As Recode’s Peter Kafka explained in late May, the discussion around Covid-19 has evolved from a public health concern into a rancorous and partisan political debate that encompasses voting rights, state reopening plans, and the politics of wearing (or not wearing) masks. The report notes that the majority of the 100,000 pieces of content taken down between March and May for violating Facebook’s voter interference policies were related to Covid-19.

5) The person Facebook hires to be its new civil rights executive needs real decision-making power

For years, civil rights leaders have pressured Facebook to create a role that would ensure that the company is thinking about whether its products and policies are treating people fairly. With the publication of this report, Facebook announced that it is creating a new senior vice president role for civil rights leadership. But the auditors say that isn’t enough. They want Facebook to create a “civil rights infrastructure.”

The audit recommends that the new vice president of civil rights should manage a team rather than work in a standalone position; they should have a mandatory say in key “decisions with civil rights implications,” such as whether or not to remove controversial posts from a politician. The auditors specifically said the new vice president of civil rights “must be ‘in the room’ (meaning in direct dialogue with decision-makers) when decisions are being made and have direct conversations with leadership.”

Fewer than 10 people weighed in on Zuckerberg’s controversial final decision not to take down Trump’s post referencing “shooting” at protests, according to a transcript of an internal Facebook all-hands meeting Recode reported on in June. Of the people Zuckerberg cited in the meeting, only one was Black, and none had roles dedicated exclusively to civil rights.

In a statement to Recode, Rashad Robinson, the president of Color of Change, said the newly announced position was “an important step” but added that “their office needs to be provided with full resources to be effective.”

“Without this, there is no reason to believe that Facebook will prioritize civil rights protections moving forward,” Robinson said. “All we can count on is Zuckerberg pontificating about free expression, while giving a free pass to politicians to lie, sow discord, and thrive off of hate and political chaos.”

What’s next

For civil rights leaders who have been waiting on the results of this report for two years, the big question is what comes next. Will Facebook enact the many changes in the audit that it has said it’s “considering” or “piloting”?

Facebook COO Sheryl Sandberg, in her blog post announcing the audit’s release on Wednesday, called it the “beginning of the journey — not the end” for Facebook’s handling of hate speech and related issues. But some civil rights organizations are losing patience, and according to the audit, some are considering stopping their work with Facebook altogether. This is an alarming sign, considering how close the November election is.

“I’m not looking only for what the audit recommends, but what Facebook is going to do about it,” Jessica González, president of the civil rights organization Free Press, which has been one of the organizations leading an advertising boycott of Facebook, told Recode.

Advertisers are continuing to sign on to the boycott, with around 125 new ones signing up so far this week alone, González told Recode on Wednesday. Lawmakers are also likely to press Facebook on these issues at a congressional antitrust hearing in July, during which Zuckerberg and other major tech executives are set to testify.

“I know that we can’t snap our fingers and transform a social media network in a day, but [Facebook has] been way too lethargic about this,” said González. “The actions don’t meet the words.”
