Ofcom proposes new measures to protect children online

Social media platforms and other online services operating in the UK have been told they must do more to keep children safe online.

New measures proposed by Ofcom would require tech firms to stop their algorithms from recommending harmful content to children and to put robust age checks in place to keep them safer.

The media regulator’s draft Children’s Safety Codes of Practice cover a wide range of online services, including apps and search engines as well as social media sites. Under the codes, firms must first assess the risk their service poses to children and then implement safety measures to mitigate those risks.

The document sets out more than 40 specific measures to prevent children from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography, and to minimise their exposure to other serious harms, including violent, hateful or abusive material, online bullying and content promoting dangerous challenges.

All user-to-user services will be expected to have content moderation systems and processes that allow them to take quick action on harmful content, while large search services should apply a ‘safe search’ setting for children that cannot be turned off and filters out the most harmful content.

Children’s experiences online are often “blighted by seriously harmful content which they can’t avoid or control”, said Ofcom chief executive Dame Melanie Dawes.

She added that the proposals “firmly place the responsibility for keeping children safer on tech firms”.

“Our measures — which go way beyond current industry standards — will deliver a step-change in online safety for children in the UK. Once they are in force we won’t hesitate to use our full range of enforcement powers to hold platforms to account.”