Frances Haugen’s interview with the US news program 60 Minutes contained a litany of damning statements about Facebook. Haugen, a former Facebook employee who had joined the company to help it combat misinformation, told the CBS show that the tech company prioritized profit over safety and was “destroying our societies.”
Haugen will testify in Washington on Tuesday, as political pressure mounts on Facebook. These are some of the key excerpts from Haugen’s interview.
Choosing profit over the public good
Haugen’s sharpest words echoed what is becoming a common refrain from politicians on both sides of the Atlantic: that Facebook puts profit before the well-being of its users and the public.
“What I saw at Facebook over and over again was that there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, time and time again, chose to optimize for its own interests, like making more money.”
She also accused Facebook of endangering public safety by reversing changes to its algorithm once the 2020 presidential election was over, allowing misinformation to spread across the platform again. “And as soon as the election was over, they turned them [the safety systems] back off, or changed the settings back to what they were before, to prioritize growth over safety. And that really feels like a betrayal of democracy to me.”
Facebook’s approach to safety compared with others
In a 15-year career as a tech professional, Haugen, 37, has worked for companies including Google and Pinterest, but said Facebook had the worst approach to restricting harmful content. She said: “I have seen a bunch of social networks and it was substantially worse at Facebook than anything I had seen before.” Referring to Mark Zuckerberg, Facebook’s founder and chief executive, she said: “I have a lot of empathy for Mark. And Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side effects of those choices are that hateful, polarizing content gets more distribution and more reach.”
Instagram and mental health
The document leak that had the biggest impact was a series of research slides showing that Facebook’s Instagram app was harming the mental health and wellbeing of some teenage users, with 30% of teenage girls saying it made their dissatisfaction with their bodies worse.
She said: “And what’s super tragic is that Facebook’s own research says that as these young women start consuming this eating disorder content, they become more and more depressed. And it actually makes them use the app more. And so they end up in this feedback loop where they hate their bodies more and more. Facebook’s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.”
Facebook has described the Wall Street Journal reports on the slides as a “mischaracterization” of its research.
Why Haugen leaked the documents
Haugen said “person after person” had tried to tackle Facebook’s problems, but had been worn down by the effort. “Imagine you know what’s going on inside of Facebook and you know no one on the outside knows. I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves down.”
Having joined the company in 2019, Haugen said she decided to act this year and began copying tens of thousands of documents from Facebook’s internal system, which she believed show that Facebook, despite public statements to the contrary, is failing to make significant progress in the fight against online hate and misinformation. “At some point in 2021, I realized, ‘OK, I’m going to have to do this in a systemic way, and I have to get out enough that no one can question that this is real.’”
Facebook and violence
Haugen said the company had contributed to ethnic violence, a reference to Burma. In 2018, following the army’s massacre of Rohingya Muslims, Facebook admitted that its platform had been used to “foment division and incite offline violence” in the country. Speaking to 60 Minutes, Haugen said: “When we live in an information environment that is full of angry, hateful, polarizing content, it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”
Facebook and the Washington riots
The January 6 riot, when crowds of rightwing protesters stormed the Capitol, came after Facebook disbanded the civic integrity team of which Haugen was a member. The team, which focused on election-related issues around the world, was dispersed to other Facebook units after the US presidential election. “They told us: ‘We’re dissolving civic integrity.’ Like, they basically said: ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of civic integrity now.’ Fast-forward a couple of months, we got the insurrection. And when they got rid of civic integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.’”
The 2018 algorithm change
Facebook changed the algorithm behind its news feed, the platform’s central feature, which serves users a personalized stream of content such as friends’ photos and news stories, to prioritize content that increased user engagement. Haugen said this made divisive content more prominent.
“One of the consequences of how Facebook is picking out that content today is that it is optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions.” She added: “Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on fewer ads, and Facebook will make less money.”
Haugen said European political parties contacted Facebook to say that the change to the news feed was forcing them to take more extreme political positions to win users’ attention. Describing the politicians’ concerns, she said: “You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media.”
In a statement to 60 Minutes, Facebook said: “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”