RAISE THE AGE MANIFESTO

Let’s give childhood a few more years

Our manifesto to reset social media for the next generation, backed by 250,000 UK families.

Childhood only happens once.

And the early teenage years are some of the most formative of all.

They’re for figuring out who you are – building confidence, resilience and real-world relationships.

We believe they should be shaped by families, friends and communities, not algorithms.

This proposal is not about banning the internet. It’s about setting a clear boundary for commercially driven social media.

One that gives young people a few more years to grow before stepping into a world that’s designed for profit, not their wellbeing.

OUR POLICY PRINCIPLE

If platforms aren’t safe for children, they shouldn’t have access to our children.

In every other area of childhood, companies must prove their products are safe before children use them.

Social media should be no exception.

Until platforms redesign their products to meet that standard, they should not be allowed to offer accounts to children under 16.

It’s that simple.

OUR MANIFESTO

01. Enough is enough – the public are demanding action

The public have had enough. That’s why 250,000 parents in the Smartphone Free Childhood community wrote to MPs in just five days in January. Parents across the country, of all political persuasions, shared one united message: it’s time to raise the age.

Why now?

Because for more than a decade, governments and regulators have pursued incremental reform, but childhood has not become safer online. Instead, the harms have intensified. Social media has become more immersive, more algorithmic and more addictive.

The status quo is failing children. Ask almost any parent, teacher or teenager – they can name someone struggling because of these systems. And every day we delay taking action, more children are at risk.

Removing platforms’ access to under-16s until their products are safe would not solve every problem overnight – but it would reset the default.

A clear minimum age doesn’t just regulate platforms – it empowers parents to say “not yet” with confidence.

02. The harms are predictable, and preventable

Social media platforms are often described as communities. But they are built for engagement and growth – not belonging.

Rising addiction, mental health challenges, exposure to harmful content and the risk of grooming are foreseeable outcomes of systems designed to maximise attention at all costs.

Algorithms amplify comparison and feed vulnerabilities to drive engagement. Extreme, polarised and sexualised content performs best. Weak moderation allows bullying and exploitation to flourish.

These environments are not designed around children’s developmental needs.

This is not about punishing children. It is about recognising that commercially driven social media ecosystems are not built for them – and giving young people a few more years to grow before entering them.

03. Ban platforms until their products are safe

Current social media platforms are not safe for children.

If toys, medicines or food caused comparable harm, they would be withdrawn from the market until they were made safe.

Social media should meet the same standard.

For too long, platforms have been given access to children first, with regulation attempting to catch up after harm occurs. That model is the wrong way round.

Platforms should have to demonstrate that their products are safe for children before they are allowed to offer accounts to under-16s.

Access must be conditional on safety.

04. Responsibility must sit with platforms, not parents

Child safety cannot depend solely on individual parents trying to manage systems built and scaled by the most powerful companies in the world.

These platforms are built for growth and engagement. Once most children are on them, network effects make opting out almost impossible. Saying no can mean social exclusion.

Ordinary mums and dads can’t be expected to out-parent algorithms built by trillion-dollar companies to capture their kids’ attention.

This is not a private problem. It is a structural one – and structural problems require government action.

Government must reset the terms of access so responsibility for child safety sits where it belongs: with the companies that design and profit from these systems.

05. We need a reset for the next generation

Children will grow up online. That is not in doubt.

The question is what kind of digital world they enter – one built around their wellbeing, or one built around maximising their attention.

A conditional ban on platforms providing accounts to under-16s would change the default. It would require companies to design in safety from the start, rather than trying to contain harm after it occurs. It would set clearer standards and raise expectations across the board.

This reset is not the whole solution. But it is a sensible first step. A boundary consistent with how we protect children everywhere else – and a signal that safety must come before corporate profits.

Done properly, this approach will create space for innovation built around children’s wellbeing.

It is not anti-technology. It is pro-childhood. And it is the beginning of a wider plan to make the digital world safer for the next generation.

Raise the Age FAQs

Got questions about raising the age on social media? Read our answers to common questions and pushbacks.

Join our live briefing

Understand the consultation, and how you can influence it.

🗓️ Tuesday 10th March 8.30-9.30pm, Zoom