Social media firms asked to toughen up age checks for under-13s


Laura Cress and Imran Rahman-Jones, Technology reporters


Major technology companies have been asked to bring in more robust age checks for under-13s in the UK, similar to those currently in place for services designed for adults.

The platforms contacted by media regulator Ofcom and data watchdog the Information Commissioner's Office (ICO) are Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox and X.

They have been told they should do more to make sure younger children are kept safe online.

Ofcom Chief Executive Melanie Dawes said services were currently "failing to put children's safety at the heart of their products".

The companies have defended the safeguards they have in place, with YouTube owner Google saying it was surprised by Ofcom's approach, urging it to focus on higher risk services instead.

But both regulators said the social media platforms needed to strengthen their commitment to stopping children under 13 from signing up.

Currently, many platforms rely on people who sign up to self-report their own ages.

"As self-declaration is easily circumvented, this means underage children can easily access services that have not been designed for them," the ICO said in an open letter to social media and video platforms.

Most social media platforms have a minimum age limit of 13, but Ofcom research suggests 86% of children aged 10-12 have their own social media profile.

Ofcom wants firms to use "highly effective age checks", which are currently required by law only for certain services that provide over-18 content, such as pornography.

Implementing similar methods for young children's social media would require the big tech firms to bring in the most robust measures voluntarily.

The ICO's focus is on the handling of young children's data.

"Where services have set a minimum age - such as 13 - they generally have no lawful basis for processing the personal data of children under that age on their service," its letter, from Chief Executive Paul Arnold, said.

Technology Secretary Liz Kendall said no platform would get a "free pass" when it came to protecting children and that Ofcom had her full support in holding the platforms to account.

"No company should need a court order to act responsibly to protect children," she added.

What do the tech companies say?

YouTube said it was surprised by Ofcom's "move away from a risk-based approach, particularly given that we routinely update them and other regulators on our industry-leading work on youth safety.

"We urge them to focus instead on high risk services that are failing to comply with the codes set out in the Online Safety Act."

Meta said it already had many of Ofcom's suggested measures in place, "including using AI to detect users' age based on their activity, and facial age estimation technology".

The company, which owns Facebook and Instagram, added that bringing in age verification for app stores would mean "parents and teens will only need to provide their personal information once."

Snapchat said it was testing age verification tools.

TikTok said it used "enhanced technologies" to help detect and remove underage accounts.

The company also claimed to be the only major platform to transparently publish the number of suspected under-13 accounts it removes - with over 90 million suspected under-13 accounts removed between October 2024 and September 2025.

Roblox said it has additional protections for under-13s and has released 140 new safety features in the past year, including "the introduction of new mandatory age checks that all players must complete in order to access chat features".

A spokesperson added that they "look forward to demonstrating our efforts in our ongoing dialogue with Ofcom".

X was contacted for comment by BBC News but did not respond.

Professor Amy Orben, a digital mental health expert at Cambridge University, welcomed the regulators' action - but said it must be only the beginning of stronger regulation.

"Safety must be built into products by design rather than treated as an afterthought, with regulators showing more strength in holding companies to account," she said.

Social media analyst Matt Navarra said the "real risk" came from algorithms and recommendation systems, another area Ofcom highlighted as needing to be addressed.

"Knowing a user is a child is step one," he said, "but designing a platform that doesn't exploit their attention is the next step - and that step is actually much harder".

Additional reporting by Chris Vallance.
