- Former Facebook product manager Frances Haugen has exposed thousands of internal documents she said showed the social media giant failed to protect users.
- One study Haugen uncovered showed Facebook took action on as little as 3% to 5% of hate speech on its platform, and on less than 1% of content classified under “violence and incitement”.
- Other research showed that using Instagram often makes things worse for young people who suffer from existing mental health problems, such as anxiety or body image issues.
A former Facebook product manager has become one of the company’s highest-profile critics after exposing thousands of internal documents she said showed the social media giant failed to protect users.
Frances Haugen, who tackled misinformation on the platform, turned over internal research to US lawmakers and the Wall Street Journal, which reported the company knew, but didn’t disclose, the negative impact of services like Instagram. She said she was sounding the alarm over the company’s practices after seeing repeated evidence that Facebook prioritises profits over the well-being of its users.
“There were conflicts of interest between what was good for the public and what was good for Facebook,” she told “60 Minutes” in her first public interview on Sunday.
“Facebook over and over again chose to optimise for its own interests like making more money.”
The revelations have ignited a firestorm for Facebook in Washington as lawmakers accuse the platform of covering up internal research about its negative effects. The trove of documents she handed over shed light on internal discussions about the company’s content moderation efforts, how it treats high-profile accounts differently from other users, and the mental impact its photo-sharing app Instagram has on young users.
One study Haugen uncovered showed Facebook took action on as little as 3% to 5% of hate speech on its platform, and on less than 1% of content classified under “violence and incitement,” according to 60 Minutes. Haugen is set to appear on Tuesday before a Senate subcommittee on consumer protection as part of a hearing focused on “protecting kids online.”
Last week, lawmakers questioned Antigone Davis, Facebook’s global head of safety, over documents that showed Instagram can worsen the mental health of teens who are already suffering.
“From her first visit with my office, I have admired her backbone and bravery in revealing terrible truths about one of the world’s most powerful, implacable corporate giants,” Senator Richard Blumenthal, a Connecticut Democrat who chairs the subcommittee holding next week’s hearing, said in a statement.
“We now know about Facebook’s destructive harms to kids — harms that Facebook concealed and knowingly exploited to increase profits — because of documents Frances revealed,” he said.
Facebook spokesperson Lena Pietsch, calling the 60 Minutes segment “misleading,” said in a statement that the company seeks to balance free expression with the need to keep the platform safe.
“We continue to make significant improvements to tackle the spread of misinformation and harmful content,” she said. “To suggest we encourage bad content and do nothing is just not true.”
Haugen started working at Facebook in June 2019 after stints at Google, Yelp and Pinterest, according to her LinkedIn page. The Iowa native was recruited to Facebook to be the lead product manager on the civic misinformation team and later worked on counter-espionage, according to her website. Haugen told 60 Minutes she agreed to take the Facebook job so she could work against misinformation after seeing a friend get wrapped up in online conspiracy theories.
“I never wanted anyone to feel the pain that I had felt,” she told the network. “I had seen how high the stakes were in terms of making sure there was high quality information on Facebook.”
During her time at Facebook, Haugen grew more alarmed by the choices the company was making to prioritize its own growth at the expense of the public, she said.
Included in the trove of documents Haugen shared was a series of internal research slides outlining the impact that Facebook’s photo-sharing app Instagram has on teenagers, reported in September as part of a series of stories by the Wall Street Journal.
The research showed that using Instagram often makes things worse for young people who suffer from existing mental health problems, such as anxiety or body image issues. Her lawyers have also filed at least eight complaints with the US Securities and Exchange Commission, according to the 60 Minutes segment.
It’s clear Haugen left Facebook knowing full well she planned to hand over damning company documents. After resigning from her job in April, Haugen stayed at Facebook an additional month, collecting material on the company she felt proved Facebook had failed to be responsible, the Journal reported.
She expected the company to notice her activity, which included viewing documents unrelated to her job, she added. Facebook can see when employees view certain documents or make specific searches on the company’s internal communication product, called Workplace.
Haugen even left a cryptic message for the company on her last day, the Journal reported, by using the internal search function to type: “I don’t hate Facebook. I love Facebook. I want to save it.”
Haugen grew up attending the Iowa caucuses with her professor parents, which instilled in her “a strong sense of pride in democracy and responsibility for civic participation,” according to her website. Now, she sees herself as “an advocate for public oversight of social media.”
“We can have social media we enjoy that brings out the best in humanity,” she says on her website.
Facebook has pushed back on some of the Journal’s stories, claiming that data was “cherry picked.” Still, the uproar that followed the reports led the company last week to halt plans to roll out a separate version of Instagram for children under 13, citing the need for further consultation with experts, parents and policymakers.
Facebook says it’s not abandoning the idea of building the app entirely.
“I still think building this experience is the right thing to do, but we want to take more time to speak with parents and experts working out how to get this right,” tweeted Instagram head Adam Mosseri.
At a hearing on the topic last week, lawmakers blasted Facebook, arguing that the company has focused on profits ahead of efforts to make its products safer for kids. “We do not trust you,” said Tennessee Senator Marsha Blackburn, the panel’s ranking Republican.