For years, misinformation has flourished on Facebook. Falsehoods, misrepresentations, and outright lies posted on the site have shaped the discourse on everything from national politics to public health.
But despite the platform's role in facilitating communication for billions of people, Facebook executives refused to commit resources to understanding the extent to which COVID-19-related misinformation pervaded it, according to a report in The New York Times.
Early in the pandemic, a group of data scientists at Facebook met with executives to propose a project that would determine how many users saw misleading or false information about COVID. It wasn’t a small task—they estimated that the process could take a year or more to complete—but it would give the company a solid understanding of the extent to which misinformation spread on its platform.
The executives listened to the data scientists’ pitch and then reportedly ghosted them. The data team’s proposal wasn’t approved, and they were never given an explanation for why it was silently dropped.
The revelations come as Facebook has drawn fire from the White House for its role in the spread of misinformation about COVID-19 and the vaccines that prevent it. “They’re killing people,” President Joe Biden said about the role of social networks in the spread of misinformation. “Look, the only pandemic we have is among the unvaccinated. They’re killing people.”
Biden later walked back his comments slightly, but they revealed the administration’s frustration with social media platforms—and with Facebook in particular—over their response to the pandemic. For weeks, the White House pressed Facebook for details on how the company is combating COVID vaccine misinformation. The social network offered some details but gave unsatisfying answers to other requests.
It’s unclear why Facebook isn’t sharing information about its efforts to fight misinformation. The company has surveyed its users about vaccine acceptance—Facebook says 85 percent “have been or want to be vaccinated”—and it says it has taken down 18 million pieces of misinformation related to COVID-19 since the pandemic began. That’s about 40,000 pieces of content per day.
Perhaps Facebook hasn’t shared those details because it’s not confident in its own approach. Without a more comprehensive view of how misinformation spreads on Facebook, it’s probably extraordinarily difficult to devise an effective counteroffensive. Removing 18 million pieces of content isn’t nothing, but it’s likely an insignificant number given that back in 2012, when Facebook had fewer than half the users it has today, the company said it processed 2.5 billion pieces of content per day.
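The scale mismatch above can be checked with some back-of-envelope arithmetic. A minimal sketch, assuming "since the pandemic began" spans roughly 480 days (early 2020 through mid-2021; the exact window isn't stated by Facebook):

```python
# Back-of-envelope check of the removal figures cited above.
# PANDEMIC_DAYS is an assumption, not a figure from Facebook.

TOTAL_REMOVED = 18_000_000           # COVID misinformation items removed
PANDEMIC_DAYS = 480                  # assumed window length
DAILY_CONTENT_2012 = 2_500_000_000   # pieces of content per day in 2012

removals_per_day = TOTAL_REMOVED / PANDEMIC_DAYS
share_of_2012_daily_volume = removals_per_day / DAILY_CONTENT_2012

print(f"{removals_per_day:,.0f} removals per day")
print(f"{share_of_2012_daily_volume:.4%} of 2012 daily content volume")
```

Under that assumption, the removals work out to roughly 37,500 per day—consistent with the "about 40,000" figure—yet only about 0.0015 percent of what Facebook said it processed daily nearly a decade ago, when the platform was much smaller.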
Facebook’s unwillingness or inability to understand the scope of COVID misinformation on its platform was apparent in comments it gave to The New York Times, in which it blamed its nescience on the lack of a “standard definition” for pandemic-related misinformation. “The suggestion we haven’t put resources toward combating COVID misinformation and supporting the vaccine rollout is just not supported by the facts,” said Dani Lever, a Facebook spokeswoman. “With no standard definition for vaccine misinformation, and with both false and even true content (often shared by mainstream media outlets) potentially discouraging vaccine acceptance, we focus on the outcomes—measuring whether people who use Facebook are accepting of COVID-19 vaccines.”
For researchers who study misinformation, that explanation isn’t sufficient. “They need to open up the black box that is their content ranking and content amplification architecture,” Imran Ahmed, chief executive of the Center for Countering Digital Hate, told The New York Times. “Take that black box and open it up for audit by independent researchers and government. We don’t know how many Americans have been infected with misinformation.”