Facebook Whistleblower: The Company Knows It's Harming People and the Buck Stops With Zuckerberg

Frances Haugen, Facebook whistle-blower, speaks during a Senate Commerce, Science and Transportation Subcommittee hearing in Washington, D.C., U.S., on Tuesday, Oct. 5, 2021.
Stefani Reynolds | Bloomberg | Getty Images
  • Facebook whistleblower Frances Haugen testified before a Senate panel Tuesday, telling lawmakers they must intervene to solve the "crisis" created by her former employer's products.
  • Haugen unmasked herself Sunday as the source behind leaked documents at the core of a revealing Wall Street Journal series about Facebook.
  • She was a product manager on Facebook's civic integrity team.

Facebook whistleblower Frances Haugen told a Senate panel Tuesday that Congress must intervene to solve the "crisis" created by her former employer's products.

The former Facebook product manager for civic misinformation told lawmakers that Facebook consistently puts its own profits over users' health and safety, a problem she attributed largely to the design of its algorithms, which steer users toward high-engagement posts that in some cases can be more harmful.

Though she stopped short of accusing top executives of intentionally creating harmful products, she said that ultimately CEO Mark Zuckerberg had to be responsible for the impact of his business.

Haugen also said that Facebook's algorithm could steer young users from something relatively innocuous, such as healthy recipes, to content promoting anorexia in a short period of time. She proposed that Facebook change its algorithms to stop prioritizing posts that generate the most engagement and instead show users a chronological feed, which she said would help Facebook deliver safer content.

Haugen, who unmasked herself Sunday as the source behind leaked documents at the core of a revealing Wall Street Journal series about Facebook, testified before the Senate Commerce subcommittee on consumer protection. Haugen told CBS' "60 Minutes" in an interview aired this weekend that the problems she saw at Facebook were worse than anywhere else she'd worked, which includes Google, Yelp and Pinterest. She told the news program that she copied tens of thousands of pages of internal research that she took with her when she left Facebook in May.

"I saw that Facebook repeatedly encountered conflicts between its own profits and our safety," Haugen said in her written testimony. "Facebook consistently resolved those conflicts in favor of its own profits. The result has been a system that amplifies division, extremism and polarization — and undermining societies around the world."

In her prepared remarks, Haugen said she believes she did the right thing in coming forward but is aware Facebook could use its immense resources to "destroy" her.

"I came forward because I recognized a frightening truth: almost no one outside of Facebook knows what happens inside Facebook," Haugen said in her written remarks. "The company's leadership keeps vital information from the public, the U.S. government, its shareholders, and governments around the world."

Haugen said a turning point that convinced her of the need to bring information outside Facebook was when the company dissolved the civic integrity team after the 2020 U.S. election. Facebook said it would integrate those responsibilities into other parts of the company. But Haugen said that within six months of the reorganization, 75% of her "pod" of seven people who had mostly come from civic integrity left for other parts of the company or left entirely.

"Six months after the reorganization, we had clearly lost faith that those changes were coming," she said.

In a statement after the hearing concluded, a Facebook spokesperson attempted to cast doubt on Haugen's credibility.

"Today, a Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives — and testified more than six times to not working on the subject matter in question," said Lena Pietsch, Facebook's director of policy communications. "We don't agree with her characterization of the many issues she testified about. Despite all this, we agree on one thing; it's time to begin to create standard rules for the internet. It's been 25 years since the rules for the internet have updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act."

Impact on young users

Much of the hearing focused on Facebook's impact on young users. Lawmakers expressed outrage at one of the Journal's reports that said Facebook's internal research had found Instagram created a toxic environment for some teen girls already experiencing negative feelings about their bodies. That prompted even stronger calls from lawmakers for Facebook to end plans to launch a version of Instagram for kids.

Facebook has accused the Journal of cherry-picking data, emphasizing that its research showed a majority of surveyed users found positive effects from using its products, even when a small percentage felt it made their negative feelings worse.

Facebook has said that in a survey, 8 out of 10 teen Instagram users in the U.S. said the platform made them feel better or had no effect on their feelings about themselves. But Haugen testified Tuesday that the remaining 20% is still significant on a platform boasting billions of users worldwide.

"In the case of cigarettes, 'only' about 10% of people who smoke ever get lung cancer," Haugen said. "So the idea that 20% of your users could be facing serious mental health issues and that's not a problem is shocking."

Though Facebook announced a temporary pause on its Instagram for kids plans, Haugen told senators she would be "sincerely surprised" if Facebook stops working on the product.

"Facebook understands that if they want to continue to grow, they have to find new users," Haugen said, adding that means ingraining kids with habits.

Allegations of misrepresentation and understaffing

Along with her disclosures to the U.S. Senate and the Journal, Haugen also filed complaints with the Securities and Exchange Commission, claiming Facebook misled investors and advertisers by omitting or misrepresenting what it knew about how its platforms were being used, such as to spread misinformation, and the measures it was taking to combat that.

Haugen said Tuesday that Facebook gave talking points to advertising staff after the Jan. 6 insurrection at the U.S. Capitol assuring advertisers that Facebook was doing everything it could to make the platform safer, including by taking down all the hate speech it found. Haugen said this was not true.

Though she called on lawmakers to impose regulations on Facebook, she warned in her testimony that "[t]weaks to outdated privacy protections or changes to Section 230 will not be sufficient," referring to the legal shield that protects online platforms from liability for their users' posts. She also said she believes a healthy social media platform is possible to achieve and that Facebook presents "false choices ... between connecting with those you love online and your personal privacy."

"The core of the issue is that no one can understand Facebook's destructive choices better than Facebook, because only Facebook gets to look under the hood," she said in her prepared testimony, saying transparency is the right first step.

She told lawmakers that she consistently saw teams at Facebook understaffed, which prompted "an implicit discouragement from having better detection systems." She said that if Facebook had even a basic detector on the counterespionage team on which she worked, it would be able to pick up on many more cases than it was already handling.

Similarly, she added that Facebook could do "substantially more" to detect children on its platform and should have to publish those processes for Congress. She said Facebook has the ability to detect more underage kids on the platform even if they lie about their ages.

Haugen also said that while she worked there, the counterespionage team tracked Chinese actors on Facebook who were surveilling the Uyghur ethnic minority population. She said the "consistent understaffing" of such teams is a national security concern and that she is speaking with other parts of Congress about it. Sen. Richard Blumenthal, D-Conn., the chairman of the subcommittee, said that topic was ripe for another hearing.

'Big Tobacco moment'

Opening the hearing Tuesday, Blumenthal called on Zuckerberg to come before the committee to explain the company's actions. He called the company "morally bankrupt" for rejecting reforms offered by its own researchers.

Haugen said Zuckerberg's unique position as CEO and founder with a majority of voting shares in the company makes him accountable only to himself.

There are "no similarly powerful companies that are as unilaterally controlled," Haugen said.

Blumenthal said the disclosures by Haugen ushered in a "Big Tobacco moment," a comparison Haugen echoed in her own testimony. Blumenthal recalled his work suing tobacco companies as Connecticut's attorney general, when enforcers similarly learned those companies had conducted research showing the harmful effects of their products.

Sen. Roger Wicker, R-Miss., chairman of the Commerce Committee, called the hearing "part of the process of demystifying Big Tech."

Toward the end of the hearing, Blumenthal told Haugen that he thinks "there are other whistleblowers out there."

"I think you're leading by example," he said. "I think you're showing them that there's a path to make this industry more responsible and more caring about kids."

If you or someone you know is in crisis, call the National Suicide Prevention Lifeline at 800-273-8255.
