As extremist supporters of Donald Trump stormed the US Capitol on 6 January, battling police and forcing lawmakers into hiding, an insurrection of a different kind was taking place inside the world’s largest social media company.
Thousands of miles away, in California, Facebook engineers were racing to tweak internal controls to slow the spread of misinformation and content likely to incite further violence.
Emergency actions – some of which were rolled back after the 2020 election – included banning Trump, freezing comments in groups with records of hate speech and filtering out the “Stop the Steal” rallying cry of Trump’s campaign to overturn his electoral loss by falsely claiming widespread fraud. Election officials have called it the most secure election in US history.
Actions also included empowering Facebook content moderators to act more assertively by labeling the US a “temporary high risk location” for political violence.
At the same time, frustration inside Facebook erupted over what some saw as the company’s halting and inconsistent response to rising extremism in the US.
“Haven’t we had enough time to figure out how to manage discourse without enabling violence?” one employee wrote on an internal message board at the height of the 6 January turmoil.
“We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.”
It’s a question that still hangs over the company today, as Congress and regulators investigate Facebook’s role in the events.
New internal documents have been provided to a number of media outlets in recent days by the former Facebook employee turned whistleblower Frances Haugen, following her initial disclosures and claims that the platform puts profits before public good, and her testimony to Congress.
The outlets, including the New York Times, the Washington Post and NBC, published reports based on those documents, which offer a deeper look into the spread of misinformation and conspiracy theories on the platform, particularly related to the 2020 US presidential election.
They show that Facebook employees repeatedly flagged concerns before and after the election, as Trump tried to overturn Joe Biden’s victory with false claims of fraud. According to the New York Times, a company data scientist told co-workers a week after the election that 10% of all US views of political content were of posts that falsely claimed the vote was fraudulent. But even as workers flagged these issues and urged the company to act, the company failed or struggled to address the problems, the Times reported.
The internal documents also show Facebook researchers have found the platform’s recommendation tools repeatedly pushed users to extremist groups, prompting internal warnings that some managers and executives ignored, NBC News reported.
In one striking internal study, a Facebook researcher created a fake profile for “Carol Smith”, a conservative female user whose interests included Fox News and Donald Trump. The experiment showed that within two days, Facebook’s algorithm was recommending “Carol” join groups dedicated to QAnon, a baseless internet conspiracy theory.
The documents also provide a rare glimpse into how the company appears to have simply stumbled into the events of 6 January.
It quickly became clear that even after years under the microscope for insufficiently policing its platform, the social network had missed how riot participants spent weeks vowing – by posting on Facebook itself – to stop Congress from certifying Joe Biden’s election victory.
This story is based in part on disclosures Haugen made to the Securities and Exchange Commission (SEC), the US agency that handles regulation to protect investors in publicly traded companies, provided to Congress in redacted form by her legal counsel.
The redacted versions received by Congress were obtained by a consortium of news organizations, including the Associated Press.
What Facebook called “Break the Glass” emergency measures put in place on 6 January were essentially a toolkit of options designed to stem the spread of dangerous or violent content. The social network had first used the system in the run-up to the bitter 2020 election.
As many as 22 of those measures were rolled back at some point after the election, according to an internal spreadsheet analyzing the company’s response.
“As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,” Haugen has said.
An internal Facebook report following 6 January, previously reported by BuzzFeed, faulted the company for a “piecemeal” approach to the rapid growth of “Stop the Steal” pages.
Facebook said the situation was more nuanced and that it carefully calibrates its controls to react quickly to spikes in hateful and violent content. The company said it was not responsible for the actions of the rioters – and that having stricter controls in place prior to that day wouldn’t have helped.
Facebook’s decisions to phase certain safety measures in or out had taken into account signals from the Facebook platform as well as information from law enforcement, said a spokesperson, Dani Lever, saying: “When those signals changed, so did the measures.”
Lever added that some of the measures had stayed in place well into February and others remained active today.
Meanwhile, Facebook is facing mounting pressure after a new whistleblower on Friday accused it of knowingly hosting hate speech and illegal activity.
Allegations by the new whistleblower, who spoke to the Washington Post, were reportedly contained in a complaint to the SEC.
In the complaint, which echoes Haugen’s disclosures, the former employee detailed how Facebook officials frequently declined to enforce safety rules for fear of angering Donald Trump and his allies or offsetting the company’s huge growth. In one alleged incident, Tucker Bounds, a Facebook communications official, dismissed concerns about the platform’s role in 2016 election manipulation.
“It will be a flash in the pan,” Bounds said, according to the affidavit, as reported by the Post. “Some legislators will get pissy. And then in a few weeks they will move on to something else. Meanwhile, we are printing money in the basement, and we are fine.”