Frances Haugen, Facebook whistleblower, speaks during a Senate Commerce, Science and Transportation Subcommittee hearing in Washington, D.C., U.S., on Tuesday, Oct. 5, 2021.
Stefani Reynolds | Bloomberg | Getty Images
Facebook whistleblower Frances Haugen told U.K. lawmakers on Monday that the company’s refusal to take responsibility for its services or incentivize employees to speak up about problematic behavior created the toxic situation that exists today.
“There is an unwillingness at Facebook to acknowledge that they are responsible to anyone,” Haugen said on Monday, testifying at a U.K. Parliament hearing on new legislation aimed at tackling harmful content online.
Haugen appeared in public for the second time since revealing herself as the source behind the numerous internal documents that sparked The Wall Street Journal’s series “The Facebook Files.” Haugen testified before the U.S. Congress earlier this month and has since started sharing her trove of documents with numerous news outlets.
Facebook leadership is centered on growth and has created a culture that focuses on the positive aspects of the company’s services at the expense of dealing with the problems they cause, Haugen said Monday.
“Facebook is overwhelmingly full of conscientious, kind, empathetic people,” she said. “Good people who are embedded in systems with bad incentives are led to bad actions. There is a real pattern of people who are willing to look the other way are promoted more than people who raise alarms.”
Haugen said Facebook hasn’t put in place ways for employees to point out issues that management should consider addressing or that researchers could examine.
“Facebook has shown over and over again not just that they don’t want to release that data but even when they do release that data they often mislead people,” she said.
It’s an attitude lodged in Facebook’s start-up culture and one that won’t change until the company is forced through regulation to alter its incentives, Haugen said.
“When they see a conflict of interest between profits and people, they keep choosing profits,” Haugen said.
A Facebook spokesperson said in an emailed statement that the company agrees on the need for regulation “so that businesses like ours aren’t making these decisions on our own.” The representative also reiterated Facebook’s disputes from recent news stories and said the company has spent “$13 billion and hired 40,000 people to do one job: keep people safe on our apps.”
Here are the highlights from Monday’s hearing:
Facebook Chairman and CEO Mark Zuckerberg.
Erin Scott | Reuters
Is Facebook evil?
John Nicolson, a Member of Parliament, asked Haugen whether Facebook was simply evil.
“What your evidence has shown to us is that Facebook is failing to prevent harm to children, it’s failing to prevent the spread of disinformation, it’s failing to prevent hate speech,” Nicolson said. “It does have the power to deal with these issues, it’s just choosing not to, which makes me wonder whether Facebook is just fundamentally evil. Is Facebook evil?”
Haugen said the word she would use is “negligence.”
“I do believe there is a pattern of inadequacy, that Facebook is unwilling to acknowledge its own power,” she said. “They believe in flatness, and they won’t accept the consequences of their actions. So I think that is negligence and it is ignorance, but I can’t see into their hearts.”
Adam Mosseri, Facebook
Beck Diefenbach | Reuters
Worries about Instagram Kids
The Journal, in its series, highlighted that Facebook was aware that its Instagram service was harmful to teenagers’ mental health.
Public outcry following that report led Facebook to announce last month that it would pause its development of a version of Instagram designed for kids 13 and younger.
That topic came up again during Monday’s hearing.
Haugen said that inside Facebook, addiction to the company’s products is referred to as “problematic use.” Facebook found that problematic use is much worse among young people than among older users, Haugen said.
To meet the bar for problematic use, someone has to be self-aware and honest enough to admit a lack of control over their usage. Haugen said that by the time teenagers have been using Facebook’s products for a year and turn 14, between 5.8% and 8% of them report problematic use.
“That’s a huge problem,” she said. “If that many 14-year-olds are that self-aware and that honest, the real number is probably 15%, 20%. I am deeply concerned about Facebook’s role in hurting the most vulnerable among us.”
Haugen said Facebook’s own reports say that the problem is not only that Instagram is dangerous for teenagers but that it is more harmful than other forms of social media.
“When kids describe their usage of Instagram, Facebook’s own research describes it as an addict’s narrative. The kids say, ‘This makes me unhappy. I feel like I don’t have the ability to control my usage of it, and I feel if I left I’d be ostracized,'” Haugen said. “I am deeply worried that it may not be possible to make Instagram safe for a 14-year-old, and I sincerely doubt it’s possible to make it safe for a 10-year-old.”
‘A novel that is going to be horrific to read’
In the hearing, Haugen referenced one of the Journal’s articles that noted that armed groups used Facebook to incite violence in Ethiopia. The company doesn’t have enough employees who speak the relevant languages to monitor the situation on Facebook’s services, according to the report.
Haugen said such situations are at risk of arising in other vulnerable countries in the global south, which is one of the main reasons she came forward.
“I believe situations like Ethiopia are just part of the opening chapters of a novel that is going to be horrific to read,” Haugen said.
Regulation could be good
Haugen commended the U.K. for considering regulating social media services, and she noted that regulation could help Facebook.
“I think regulation could actually be good for Facebook’s long-term success, because they force Facebook back into a place where it was more pleasant to be on Facebook,” she said.
The Verge on Monday published a report based on Haugen’s documents that showed the number of teenage users of the Facebook app in the U.S. has declined by 13% since 2019, with a projected drop of 45% over the next two years. The number of users between the ages of 20 and 30 was expected to decline by 4% during that time frame, according to the internal documents.
Haugen said that if regulation forced Facebook to change its incentives in a manner that resulted in its apps becoming more pleasant for users, the company could reverse this decline.
“I think if you make Facebook safer and more pleasant, it will be a more profitable company 10 years from now, because the toxic version of Facebook is slowly losing users,” she said.