Young people must report harmful online content, says UK watchdog

Young people should report harmful online content, the communications watchdog has said, after finding that two-thirds have encountered potential harms on social media but only one in six report it.

Ofcom found that 67% of people aged between 13 and 24 had seen potentially harmful content online, although only 17% report it. The regulator is charged with enforcing measures in the forthcoming online safety bill, which will require social media companies to protect children and adults from online harms.

The most common potential harm encountered online was offensive or bad language (28%), according to respondents in Ofcom’s Online Nation 2022 report, followed by: misinformation (23%); scams, fraud and phishing (22%); unwelcome friend or follow requests (21%); and trolling (17%). A further 14% had experienced bullying, abusive behaviour and threats online.

Ofcom is launching a campaign with TikTok influencer Lewis Leigh, who rose to fame during lockdown by posting videos of himself teaching dance moves to his grandmother. The “Only Nans” campaign will encourage young people to report harmful content they see on social media.

The campaign is also supported by Jo Hemmings, a behavioural psychologist. She said: “People react very differently when they see something harmful in real life – reporting it to the police or asking for help from a friend, parent or guardian – but often take very little action when they see the same thing in the virtual world.”

TikTok removed more than 85m pieces of content in the final three months of last year, with almost 5% of that total coming from user referrals. Instagram removed more than 43m pieces of content over the same period, of which more than 6% came from users reporting or flagging content.

Anna-Sophie Harling, online safety principal at Ofcom, said: “Our campaign is designed to empower young people to report harmful content when they see it, and we stand ready to hold tech firms to account on how effectively they respond.”

The online safety bill is expected to become law by the end of the year. Ofcom will have the power to impose fines of £18m or 10% of a company’s global turnover for breaches of the act, which imposes a duty of care on tech firms to protect people from harmful user-generated content. One of the specific mandates in the bill is ensuring that children are not exposed to harmful or inappropriate content.

Andy Burrows, head of child safety online policy at the NSPCC, which has called for a strengthening of the bill, said: “This report lays bare how young people are at increased risk of coming across harmful content but feel unsupported on social media and either do not know how to report it or feel platforms simply won’t take action when they do.”