Facebook whistleblower Frances Haugen testified before a Senate panel yesterday, recommending a slate of changes to rein in the company, including a Section 230 overhaul that would hold the social media giant responsible for its algorithms that promote content based on the engagement it receives in users’ news feeds.
“If we had appropriate oversight, or if we reformed [Section] 230 to make Facebook responsible for the consequences of their intentional ranking decisions, I think they would get rid of engagement-based ranking,” Haugen said. “Because it is causing teenagers to be exposed to more anorexia content, it is pulling families apart, and in places like Ethiopia, it’s literally fanning ethnic violence.”
Haugen made sure to distinguish between user-generated content and Facebook’s algorithms, which prioritize the content in news feeds and drive engagement. She suggested that Facebook should not be responsible for content that users post on its platforms but that it should be held liable once its algorithms begin making decisions about which content people see.
That suggestion mirrors a bill, the Protecting Americans from Dangerous Algorithms Act, which has been introduced in the House.
Yet despite the call for Section 230 reform, Haugen also cautioned that reform alone would not be enough to adequately oversee the company’s broad reach. “The severity of this crisis demands that we break out of previous regulatory frames,” she said. “Tweaks to outdated privacy protections or changes to Section 230 will not be sufficient.”
The sweeping hearing was called to address tens of thousands of documents Haugen obtained from Facebook before she left the company in May. The former product manager handed over the files to Congress and the SEC and is seeking whistleblower protection. She has filed at least eight complaints with the SEC, including allegations that Facebook overcharged advertisers and misled investors about the size of its user base. The company, she said, is not being upfront about the nature of the problems it’s facing. “Facebook wants you to believe that the problems we’re talking about are unsolvable. They want you to believe in false choices,” Haugen said.
“They want you to believe that you must choose between a Facebook full of divisive and extreme content or losing one of the most important values our country was founded upon: free speech,” she added. “That you must choose between public oversight of Facebook’s choices and your personal privacy. That to be able to share fun photos of your kids with old friends, you must also be inundated with anger-driven virality. They want you to believe that this is just part of the deal.
“I am here today to tell you that’s not true. These problems are solvable. A safer, free-speech-respecting, more enjoyable social media is possible.”
Haugen was similarly optimistic about Facebook’s mission, despite its myriad problems. “I believe in the potential of Facebook,” she said. “We can have social media we enjoy, that connects us without tearing apart our democracy, putting our children in danger, and sowing ethnic violence around the world. We can do better.”
Time and again, the hearing came back to the problems posed by the prevalence of algorithms and artificial intelligence in Facebook’s operations.
Documents that Haugen collected from Facebook show that engagement-based ranking algorithms prioritize divisive and extreme content on the platform. Facebook’s proposed fix, she said, is to use artificial intelligence to find dangerous content before it spreads. The problem is that “Facebook’s own research says they cannot adequately identify dangerous content. And as a result, those dangerous algorithms that they admit are picking up the extreme sentiments, the division, they can’t protect us from the harms that they know exist in their own system,” Haugen said.
One solution, floated by Sen. Rick Scott (R-Fla.) during the hearing, proposes that users be offered a choice between an algorithmic news feed and a chronological one. Haugen did not think that would be sufficient. “I worry that if Facebook is allowed to give users the choice: do you want an engagement-based news feed or a chronological news feed… that people will choose the more addictive option, that engagement-based ranking, even if it is leading their daughters to eating disorders.”
In an internal message later posted publicly, Facebook CEO Mark Zuckerberg alluded to research the company has done on its platform. “If we’re going to have an informed conversation about the effects of social media on young people, it’s important to start with a full picture. We’re committed to doing more research ourselves and making more research publicly available.”
Despite promising a “full picture,” Zuckerberg did not commit to releasing all of Facebook’s research or opening its platforms to study by researchers not employed or funded by the company. Haugen’s documents show that, as recently as last week, the company selectively released research to paint a rosier picture of what is happening on its platforms.
The company’s lack of transparency has tied regulators’ hands, Haugen said. “This inability to see into Facebook’s actual systems and confirm that they work as communicated is like the Department of Transportation regulating cars by only watching them drive down the highway,” she said. Haugen urged the senators to consider a government agency that would audit social media platforms, giving regulators greater insight into the companies’ inner workings.
“Today, no regulator has a menu of solutions for how to fix Facebook because Facebook didn’t want them to know enough about what’s causing the problems,” Haugen said. “Otherwise, there wouldn’t have been a need for a whistleblower.”