In a fresh blow to the social media giant, Facebook whistleblower Frances Haugen – a former product manager on Facebook’s civic misinformation team – has accused the platform of prioritizing its own profits over public safety.
Haugen, the source behind the revelations detailing the company’s internal research to The Wall Street Journal, told 60 Minutes she leaked the documents after seeing a conflict of interest at Facebook between ‘what’s good for the company and what’s good for the public’.
“Facebook, over and over again, chose to optimize for its own interests, like making more money,” the data scientist said in the interview. “I knew what my future looked like if I continued to stay inside Facebook, which is person after person has tackled this inside of Facebook and ground themselves to the ground.”
Haugen said she decided to make Facebook’s internal communications public after realizing she would need to do so in a systemic way, releasing enough material that no one could question it was real. She copied and released tens of thousands of pages of documents, and highlighted Facebook’s algorithm as the element that pushes misinformation onto users.
She said the company recognized the risk of misinformation around the 2020 election and added safety systems to reduce that risk, but loosened those measures again once the election was over. “As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety. And that really feels like a betrayal of democracy to me.”
Haugen explained that one consequence of how Facebook selects content today is that it optimizes for content that gets engagement or a reaction. “But its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions.”
Facebook spokesperson Lena Pietsch said in a statement, issued after Haugen revealed her identity, that every day the platform’s teams must balance protecting the ability of billions of people to express themselves openly with the need to keep the platform a safe and positive place. “We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”
The leaked documents revealed, among other findings, that Facebook executives had been aware of the negative impact of the company’s platforms on some young users. The WSJ reported that one internal document found that, among teens reporting suicidal thoughts, 6% of American users traced the urge to kill themselves to Instagram.
Facebook, however, has brushed off the reporting, saying the WSJ ignored potentially positive interpretations of the data, such as the many users who reported positive effects from engaging with its products.