League tables to show tech firms’ child safety record

Ofcom says reforms in its new online safety code will ensure children are protected - but some campaigners say the measures do not go far enough
Online safety charity Internet Matters’ survey found parents have concerns about children’s digital habits (Alamy/PA)

New league tables will be published to show how well tech companies are protecting children from harmful online content, including suicide guides and pornography, under a new crackdown announced on Wednesday.

Media regulator Ofcom said its new Children’s Safety Code would result in firms which failed to block damaging material being “named and shamed” and potentially barred from having users aged under 18.

It said the new requirements would include better age checks, either via facial recognition or other proof of identity, and changes to algorithms used by tech companies to stop harmful content being directed at children.

The watchdog, which was introducing its new code on Wednesday under provisions set out in the Online Safety Act passed by Parliament last year, said the reforms should ensure that children enjoyed “freedom of expression” with their peers worldwide while being protected from those wanting to harm them.

Some bereaved parents, whose children died as a result of dangerous online material, warned that the new code did not go far enough and called for ministers to strengthen the legislation with measures including a ban on the use of end-to-end encryption for children’s social media.

But Dame Melanie Dawes, the head of Ofcom, said the new league tables and the other changes would deliver significant improvements in child safety.

“This is a big moment. Young people are fed harmful content again and again and this has become normalised, but it needs to change,” she said.

“We are requiring proper robust age checks, changes to the algorithms … so that children are not shown suicide or self-harm material or pornography, and more control for teenagers about what groups they are added to and how they can screen out content.

“We will be publishing league tables so that the public know which companies are implementing the changes and which are not.”

Ruth Moss, whose 13-year-old daughter Sophie Parkinson took her own life in 2014 after viewing suicide websites and self-harm videos on her phone, said she hoped the new code would achieve positive changes but that further legislation was needed to ban the use of end-to-end encryption for children.

“No child should be able to have privately messaged conversations that are end-to-end encrypted because … even the social media companies themselves can’t access those conversations and that means when something goes wrong … that evidence gets lost,” she told BBC Radio 4’s Today programme.

Other parents who have called for stronger legislation include Ian Russell, whose daughter Molly took her own life in 2017 at the age of 14 after viewing self-harm and suicide images online, and Esther Ghey, whose daughter Brianna was murdered in February last year by two teenagers who had watched violent videos on the darknet.
