Facebook should be punished with substantial fines, potentially running into billions of pounds, if it withholds evidence that its social media platforms harm users, according to the MP leading scrutiny of a new online safety bill.
The social network is under political pressure on both sides of the Atlantic following revelations in the Wall Street Journal that Facebook knew its Instagram photo-sharing app was harming the mental health of teenage girls. Leaked internal documents showed that among teenagers who have had suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram.
Damian Collins, the Conservative chair of the joint committee on the draft online safety bill, which imposes a duty of care on social media companies to protect users from harmful content, said on Wednesday: “If they have important information like this and they kept that information from the regulator then I think they should be punished. There would be fines. The bill creates a duty of care. If there are harms being caused and a company is trying to hide that information from the regulator, then that would be quite a serious breach in duty of care.”
Social media firms are required under the draft bill to submit to Ofcom, the communications watchdog, a “risk assessment” of content that causes harm to users.
The bill proposes fines of up to 10% of a company’s annual turnover, which in the case of Facebook would be around £6bn. One Facebook research slide from 2019, revealed by the WSJ, stated that the app made body image issues worse for one in three girls.
Beeban Kidron, the crossbench peer who sits on the committee and was behind the recent introduction of a children’s privacy code, said the revelations proved “beyond doubt the importance and timeliness of the online safety bill”. She added: “It makes it unequivocal that the bill’s protections must be extended to protect children wherever they are online, whether on social media, in app stores or in a virtual classroom. Facebook’s own research shows how children are sent into a spiral of harmful experiences by features deliberately designed to keep them engaged. The bill has to bring in an era of enforceable minimum standards. What we have now is the tech industry marking its own homework and then hiding the devastating results.”
The children’s charity the NSPCC said it was “appalling” that Facebook had not acted on its own internal evidence that Instagram caused harm to its users.
Andy Burrows, the head of child online safety policy at the NSPCC, said: “Instead of working to make the site safe, they’ve obstructed researchers, regulators, and governments and run a PR and lobbying campaign in an attempt to prove the opposite.”
In Washington, the Senate consumer protection subcommittee said it would investigate the revelations and was in contact with a Facebook whistleblower.
“It is clear that Facebook is incapable of holding itself accountable,” said the US senators Richard Blumenthal and Marsha Blackburn, the chair and ranking member of the committee respectively. “When given the opportunity to come clean to us about their knowledge of Instagram’s impact on young users, Facebook provided evasive answers that were misleading and covered up clear evidence of significant harm.”
Karina Newton, the head of public policy at Instagram, said in a blogpost on Tuesday: “While the story focuses on a limited set of findings and casts them in a negative light, we stand by this research. It demonstrates our commitment to understanding complex and difficult issues young people may struggle with, and informs all the work we do to help those experiencing these issues.”
The furore over Instagram came as the WSJ published further revelations about Facebook and how a change to its News Feed algorithm in 2018 made the platform’s users angrier and more divisive. “Misinformation, toxicity, and violent content are inordinately prevalent among reshares,” said Facebook researchers in internal memos. According to the WSJ, unnamed political parties in Europe warned Facebook that they had become more negative in their campaigning in order to stay on users’ News Feeds. “Many parties, including those that have shifted to the negative, worry about the long-term effects on democracy,” read one internal Facebook report.
Responding to the latest revelations, a Facebook spokesperson said: “Is a ranking change the source of the world’s divisions? No. Research shows certain partisan divisions in our society have been growing for many decades, long before platforms like Facebook even existed.”