RAISE THE AGE FAQS

Raising the Age – Answers to common questions

Doesn’t this push children into darker corners of the internet?

There’s no evidence this would happen.

Current social media platforms are high-risk, highly commercialised spaces. The idea that removing access to mainstream platforms would suddenly drive millions of teenagers to the dark web is not supported by evidence.

Young people are on today’s dominant platforms largely because of network effects. Everyone is there, so they feel compelled to be there.

Clear, enforced age boundaries shift social norms at scale. When expectations change — as they did with smoking or drink-driving — behaviour changes too.

A regulatory reset would also create space for new products designed around child wellbeing, not adult engagement metrics.

Won’t kids just get around age checks?

Some will. That has never been a reason to abandon age limits.

We don’t scrap drinking age laws because some teenagers get served alcohol. Age limits are not about catching every individual — they are about setting clear social norms.

Platforms already use sophisticated AI to target advertising and detect behaviour. They can use similar tools to estimate age and prevent underage access.

If there is collective will, enforcement is achievable.

And regardless of the technical detail, clear legal boundaries give parents something they currently lack: collective backing. It becomes much easier to say “not yet” when the rule applies to everyone – not just your child.

Age limits don’t just restrict access. They reset norms.

What about the educational and communication benefits for young people?

This proposal does not switch off the internet. It does not prevent messaging, research, learning, creativity or communication.

Young people can communicate without algorithmic feeds. They can learn without infinite scroll. They can connect without systems engineered to maximise comparison and compulsive use.

The question is not whether technology has benefits. It clearly does.

The question is whether those benefits require children to spend hours inside commercially driven systems designed to maximise engagement at any cost.

We believe they do not.

Setting clear age boundaries on account creation creates the space to build safer, age-appropriate digital environments that genuinely support learning and communication – without exposing children to predictable harms.

Protect first. Then build better.

Shouldn’t we focus on education rather than regulation?

This is not a choice between education and legislation. That is a false dichotomy. The right answer is yes to both.

Digital literacy, parental support and better education are essential. But education alone cannot counteract systems deliberately engineered to exploit developmental vulnerabilities.

The best solution requires policy change and behaviour change. One is not a substitute for the other.

Clear legal boundaries reset the system. Education helps young people navigate it safely.

We need both.

What about vulnerable children who rely on social media for support?

Commercial social media platforms are not a substitute for real community.

In practice, vulnerable children are often the most harmed – through grooming, exploitation, harmful content and algorithmic amplification.

Social media promises genuine connection and community. But today’s algorithmically amplified, commercially optimised systems – designed to maximise engagement at any cost – do not provide the kind of community children need.

If we want to preserve genuine benefits, we must rethink the design and incentives of modern social media.

Leaving vulnerable children inside harmful systems because some support exists there is not compassion. It is an abdication of responsibility.

Can’t we just enforce existing law?

Full enforcement of the Online Safety Act is essential. It must be properly implemented and robustly enforced.

Raising the age should not replace the OSA – it should build on it. It must be the start of a deeper reset, not the end of reform.

There is still significant work to do to make platforms safer by design. Clear age boundaries and specific design standards are one crucial part of that wider shift, and will encourage child-centric innovation.

The OSA fines companies after harm has occurred. We believe we need to prevent the harms from happening in the first place.

The consequence for failing to meet child safety standards should be loss of access to children – not inconsequential fines absorbed as a cost of doing business.

Isn’t this just a blunt ban?

“Ban” makes it sound indiscriminate. What we are proposing is targeted and proportionate.

Platforms that meet robust child safety standards can serve under-16s. Those that do not should not.

We also need to be clear about language. We don’t call age limits on alcohol or driving “bans.” We recognise that certain environments require maturity.

This is not about banning children from social media forevermore. It is about preventing social media platforms from accessing children until they can redesign their products to meet basic safety standards.

Isn’t age verification just digital ID by the back door?

No.

Most adults will not need to upload ID to continue using social media. Platforms already hold extensive behavioural data and can often estimate age with high accuracy.

Where stronger proof is required, privacy-preserving third-party systems can verify age without storing ID documents or creating centralised databases.

Age assurance is a technical and regulatory challenge — not an unsolvable one. And the onus must be on the tech companies to solve it.

Won’t 16 just create a cliff edge?

The cliff edge already exists – it just happens at 11 or 12.

Sixteen is a more developmentally appropriate transition point. It aligns better with adolescent maturity and other age-based responsibilities.

We do not throw children into the deep end with alcohol or driving in order to “prepare” them. We set age limits because timing and development matter.

Under-16s should only be able to access products that are demonstrably safe for them – not today’s social media with minor tweaks, but systems fundamentally redesigned to remove predictable harms.

Read our manifesto

Read our manifesto to reset social media for the next generation, backed by 250,000 UK families.

Join our live briefing

Understand the consultation, and how you can influence it.

🗓️ Tuesday 10th March 8.30-9.30pm, Zoom