AppleInsider is supported by your audience and can earn commissions when you shop through our links. These affiliate associations do not influence our editorial content.
A whistleblower who leaked a selection of internal Facebook documents has revealed her identity while criticizing the social network’s track record, stating that “Facebook has shown time and again that it prefers profit over safety.”
A cache of Facebook documents was leaked to The Wall Street Journal by a whistleblower, prompting a wide-ranging investigation into the social network. Following the publication of the initial reports based on the documents, the whistleblower has come forward in an interview, explaining more about how the tech giant works and why she chose to release the data.
That data release led to a number of reports, including one revealing that Facebook allegedly knew Instagram was bad for the well-being of teens. That report led to a hearing on teen mental health before the Senate Commerce Committee's Subcommittee on Consumer Protection.
In an interview with 60 Minutes on Sunday, Frances Haugen discussed the trove of documents, which apparently showed Facebook cared more about its algorithms than about dealing with hate speech and other toxic content.
Haugen was previously a product manager at the company, working in its Civic Integrity Group, but left after the group disbanded. Before her departure, she copied tens of thousands of pages of internal research, which she claims shows Facebook is lying to the public about its progress against hate, violence and misinformation.
"I knew what my future would be like if I continued to stay within Facebook, which is person after person after person has addressed this within Facebook and worn themselves down," Haugen said of her decision to release the documents. "At some point in 2021, I realized, OK, I'm going to have to do this systemically, and I have to get out enough that no one can question that this is real."
According to the data scientist, who had previously worked for Google and Pinterest, "I've seen a ton of social media, and Facebook was substantially worse than anything I'd ever seen before."
The documents included studies conducted internally by Facebook on its services, one of which determined that Facebook had not acted on hateful content. "We estimate that we can act on only between 3% and 5% of hate and around six-tenths of 1% of [violence and incitement] on Facebook, despite being the best in the world at it," the study reads.
Haugen recounts how she was assigned to Civic Integrity, which was intended to work on misinformation risks for the elections, but after the 2020 US elections, it was disbanded.
"They basically said, 'Oh well, we got through the elections. There was no unrest. We can get rid of Civic Integrity now.' Fast forward a couple of months, we hit the insurrection," Haugen recounts. "And when they got rid of Civic Integrity, that was the moment when I thought, I don't trust that they're willing to invest what needs to be invested to keep Facebook from being dangerous."
With its algorithm selecting the content shown to users based on engagement, Haugen says the system is now optimized for engagement, or getting a reaction. "But its own research is showing that content that is hateful, that is divisive, that is polarizing, it is easier to inspire people to anger than to other emotions."
"Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, click on fewer ads, and make less money," she adds, before insisting that the safety systems introduced to reduce misinformation for the 2020 elections were temporary.
"As soon as the elections were over, they turned them off or changed the settings back to what they were before, to prioritize growth over safety," says Haugen. "And that seems to me to be a betrayal of democracy."
Haugen did not release the documents to The Wall Street Journal alone. In September, attorneys working on her behalf filed at least eight complaints with the Securities and Exchange Commission. The submissions were based on the theory that Facebook must not lie to investors or withhold material information, and that the gap between its public claims of progress against hate speech and the reality documented internally may prompt further scrutiny by the regulator.