UK mulls social media curbs for kids — but who’s even asking?


LONDON — British ministers are considering fresh restrictions on social media for children, including requiring parents to provide permission for under-16s to use the platforms.

But the people you might expect to back such a measure — those advocating most fiercely for child online safety — don’t sound keen, and there are questions over just how workable such a policy would be in practice.

“To automatically kick kids off [social media] is not necessarily the right answer,” said crossbench peer and founder of the 5Rights Foundation, Beeban Kidron. Kidron was one of the most active child safety campaigners on the U.K.’s Online Safety Act, which put fresh duties on platforms to police content.

“I think that the government should not keep on introducing new things and [should instead] implement the old,” she added.

The Molly Rose Foundation, another key group calling for child online safety protections, said that while further measures “are necessary to protect children from online risks,” emphasis “should firmly be on strengthening the regulator’s hand to ensure platforms are no longer awash with a set of avoidable dangers.”

An adviser to the foundation, Andy Burrows, took to X to brand the proposal “a nebulous consultation (& kite flying!) that no-one saw coming.”

According to plans first reported by Bloomberg on Thursday night, the government is expected to start reviewing the evidence on how social media can harm children next year, with a formal consultation potentially to follow.

Downing Street was keeping tight-lipped Friday. “Government doesn’t look to ban things for the sake of it — this is just speculation and our focus is more broadly to make sure we continue keeping children safe online,” said Prime Minister Rishi Sunak’s spokesperson.

Science Minister Andrew Griffith told Sky News on Friday that there was “more that could be done” to protect children online.

‘Here we go again’

The tech industry’s reaction to the idea is more or less “here we go again,” said an industry figure granted anonymity to speak freely about the proposals before they are finalized.

They and another industry representative expressed doubts about how such a measure would be enforced, highlighting that under-16s typically have fewer means of proving their age than 18-year-olds, and that the idea leaves platforms with the additional problem of confirming who is a parent.

The TikTok app | Drew Angerer/Getty Images

Such a move wouldn’t be unprecedented. In June, France passed a law requiring social media platforms to verify users’ ages and obtain parental consent for under-15s.

Most social media platforms used by kids bar users under the age of 13. But this is primarily enforced through self-certification, meaning children can simply lie about their date of birth.

A report published on Thursday by U.K. regulator Ofcom found that although TikTok, Twitch and Snapchat make efforts to identify and remove underage accounts and to age-gate adult material, kids are still at risk of encountering inappropriate content.

Britain’s Online Safety Act stipulates that social media platforms must implement highly effective “age estimation” or “age verification” techniques, and that self-declaration will no longer be sufficient. Ofcom is expected to consult on this part of the act early next year.

Quite apart from the new idea being floated, child safety campaigners still see the Online Safety Act as a promising vehicle for change. “I would ask the prime minister to show some confidence in the Online Safety Act and make sure that what comes out of it is robust,” said Kidron.

“They should really be doubling down on resourcing and putting their political will behind regulation so that we can have a digital world that is fit for children.”