In the days after the election, several senior Biden campaign workers talked with me about their public confrontation with Facebook, the world’s biggest social media platform. They described the company as plagued by conflicting desires: to avoid claims of political bias; to avoid being blamed for the election results, as it was in 2016; and to publicize its election integrity efforts.
Facebook saw itself as a neutral referee. But the Biden and Trump campaigns were playing entirely different sports. The result, the Biden camp felt, was paralysis and an inconsistent application of Facebook’s rules that ultimately benefited Trump’s campaign.
Here’s some of what the campaign looked like from the trenches of the disinformation war.
In early September, the Biden campaign met with Facebook’s elections integrity team. With less than two months to go before Election Day, the meeting was an opportunity for Facebook to clarify how it would handle disinformation meant to discourage people from voting and to undermine confidence in the results.
According to multiple Biden staff members in attendance, the Facebook team was unequivocal and reassuring. Under no circumstances, the company’s employees said, would Facebook tolerate the use of falsehoods to discredit mail-in voting. Facebook promised decisive action on voting disinformation, even if it were to come from President Trump himself.
The promise was put to the test shortly afterward, when Trump, on his Facebook page, urged North Carolina voters to show up at polling places even if they had already submitted a mail-in ballot. “Don’t let them illegally take your vote away from you,” the post read.
Trump’s call for his supporters to vote twice was roundly condemned by officials, including North Carolina’s attorney general. But when the Biden campaign asked Facebook to remove the post, it refused, instead appending a small label saying that mail-in voting “has a long history of trustworthiness.” (BuzzFeed News reported that Facebook’s internal data show that its warning labels don’t meaningfully stop the spread of Mr. Trump’s posts.)
For the Biden team, the moment was emblematic of its frustrating yearlong battle with the platform to enforce its own rules. “It was a total reversal,” a senior staff member told me recently. “You have half-baked policies on one hand, and the political reality on the other. And when push comes to shove, they don’t enforce their rules as they describe them.” (Like this staff member, those I interviewed spoke on condition of anonymity for this article for fear of reprisals.)
Facebook, for its part, poured significant resources into election security in 2020. The company helped register more than 4.4 million voters, built an elections hub to push out vetted news and ran an elections operation center that brought together 40 teams inside the company. Its security team, led by its cybersecurity policy chief, Nathaniel Gleicher, took down numerous foreign and domestic influence operations seeking to undermine the election.
But many of the concerns expressed by the Biden campaign revolved around attacks from Republicans, not foreign adversaries. In conversations, Biden staff members rattled off examples of egregious misinformation and disinformation:
Posts on the eve of the Iowa caucuses baselessly alleging suspicious Democratic voter registrations that spread wildly before Facebook fact-checked the claims. Disinformation aimed at Spanish-language speakers before the Nevada caucuses. The constant swirl of accusations around Mr. Biden’s son Hunter and his work in Ukraine.
And then there was the refusal by Facebook’s chief executive, Mark Zuckerberg, to fact-check political ads.
Biden staff members said they repeatedly asked Facebook how it fact-checked content and received few answers in return. “We wanted to know: How many fact checkers do they have? How many requests go to them? How many do they actually end up fact-checking? Standard stuff, really,” one senior campaign worker told me. “We were told weekly that we’d get details on the scope of the program, and it never happened.”
According to campaign officials, when the campaign asked for insight into what political content was performing best on the platform, Facebook promised guidance but never fully followed through.
But the Biden campaign’s own data showed some troubling signs. Workers told me the team attempted to track disinformation about their candidate to compile a weekly report. They were quickly overwhelmed, they said, unable to keep tabs on the vast network of conspiracies and lies.
So they started doing some internal polling, and the results were equally alarming. One poll of white, non-college-educated voters showed that those who used Facebook daily were 33 percent less likely to vote for Biden than those who didn’t.
I expected to hear accounts of heated phone calls between Facebook executives and campaign officials or, perhaps, bromide-filled exchanges between Biden and Zuckerberg. The reality was more mundane.
According to the Biden staff members, top executives rarely dealt with the campaign, even after it publicly bemoaned Facebook’s lackluster enforcement and the rampant spread of political misinformation and disinformation. Some of the campaign’s requests for clarification on policy, they said, were met with short email responses that recycled the same lines about policy enforcement handed out to reporters. In other cases, they said, there was no response at all.
A senior Facebook employee, who worked closely with the Biden campaign during the election, offered a different characterization of events. The employee, who is a Democrat and previously worked in Democratic politics, argued that the company met with the campaign regularly, offering numerous briefings on new policies. Facebook investigated content policy decisions brought up by the campaign, this person said.
The employee described the relationship as productive and even collegial despite tense circumstances. “The fundamental disagreement at the end of the day was around these policy decisions. They wanted us to be more aggressive regarding both Trump-specific content and adjacent posts from his allies,” the employee said. “We took what we think is an aggressive approach and enforced policies that allowed us to label content related to mail-in voting and took down suppression efforts.”
The employee argued that Facebook was responsive to the Biden campaign, though in certain circumstances the company would not divulge internal metrics or other information.
“They weren’t ignoring us,” one senior Biden campaign official told me. “Facebook simply didn’t want to deal with the issues we raised. They didn’t have anything substantive to say and knew we’d call them on their drivel. All we kept asking is, ‘Will you actually exercise the corporate judgment you preach?’ What could they say to that? No?”
So the Biden campaign went public. Repeatedly. In October 2019, the campaign sent a letter to Facebook after the company let a Republican super PAC run a video ad that accused Mr. Biden of blackmailing Ukrainian officials to stop an investigation of Hunter Biden. In June, the campaign issued an open letter urging Facebook to “stop allowing politicians to hide behind paid misinformation.” By late September, the campaign was yet again publicly excoriating the company as “the nation’s foremost propagator of disinformation about the voting process,” according to an email obtained by Axios.
Inside Facebook, some employees were equally frustrated with the company’s approach. One former employee present at company discussions told me recently that proposed engineering changes to Facebook’s political advertising technology were shot down by leadership.
The company rejected numerous proposals for additional transparency regarding political ads, worried that doing so might affect its commercial advertising business, the former employee told me. This person also said that efforts to narrow the criteria by which a candidate could target users were rejected by leaders for fear they might disproportionately affect particular candidates or parties. Managers, the employee said, discussed potential political ad changes with federal election officials and with both Democratic and Republican campaigns.
“If anyone in those constituencies said, ‘We don’t like this idea,’ then Facebook would abandon it,” this former employee told me recently. “They didn’t want to upset anyone with a public political persona.”
Every person I spoke to for this article seemed exhausted and frustrated by fundamental disagreements regarding how the platform moderates and directs attention to political speech. The senior Facebook employee argued that the company was trying to strike “what we believe is a responsible balance to provide as much free speech and as much responsible, authoritative information as possible.” This person stressed that the company’s election protection efforts were not devised with any partisan slant.
I offered that such efforts at a more neutral posture around political campaigns weren’t neutral at all: in an election in which one campaign is actively undermining confidence in the electoral process, Facebook’s reluctance to enforce its rules provided an advantage to the most shameless actors. The senior Facebook employee disagreed.
“That’s not the way the company approaches these issues. It happens to be that the company favors speech and we make every effort to allow for the most speech,” the senior employee told me. “What we balance for is not what Democrats want versus what Republicans want or what shameless versus the most virtuous people want. We’re balancing for providing as much speech as possible and looking for ways to prevent harm from happening.”
How one defines harm is an important and fraught part of this debate. Is the fact that many Americans now believe the election was stolen a preventable harm? Is Facebook’s important election security work meaningfully undermined by the posts from the president and his Republican colleagues that are allowed to stay up on the platform? I’d argue yes.
“What Facebook is not good at is analyzing outcomes of their indifference to things,” the former Facebook employee told me. “Again and again they try not to advantage one side or another, and only in hindsight do they discover the consequences.”
It’s unclear how the Biden campaign staff members’ experience with Facebook will affect the incoming administration’s policies toward the company. But their time on the front lines of the information war has left them gravely concerned that Facebook and other social media platforms are a threat to the electoral process. One Biden staff member described the flood of content suggesting the election was stolen as “unforgivable.”
And it’s hard to blame that person. The campaign is fighting an uphill battle against a president and a contingent of his followers who refuse to accept reality.
I recently visited Mr. Trump’s Facebook page. His most recent post at the time was an all-caps decree: “RIGGED ELECTION. WE WILL WIN!” Underneath it was an unimposing banner, appended by Facebook, offering not a reality check or rebuke of the president’s claim, but a neutral statement: “The U.S. has laws, procedures and established institutions to ensure the integrity of our elections.”
Charlie Warzel, a Montana-based opinion writer at large for The New York Times, covers technology, media, politics and online extremism.