If you’re a parent in Britain right now, you’re probably having some version of the same conversation most evenings: how long has that screen been glowing? Who is my child talking to? What are they watching? And what is it all doing to them?
So the government’s decision last week to open a consultation on an under-16s social media ban was not just understandable, it was inevitable. The public mood has fundamentally shifted, with Australia making the first move and Spain signalling it will follow. The pressure on the Government to “do something” is immense.
The debate is already hardening into a familiar binary – ban or don’t ban, protect children or don’t protect children. It’s emotive, it’s headline-friendly, and it feels decisive – and it will probably play very well across the despatch box. But if Labour wants lasting reform rather than a short-term political win, we need to look deeper than the platforms themselves, and look at the people building them.
Addictive design is not some accident. Infinite scroll did not emerge from nature like rainfall. Autoplay was not delivered by stork. These are product decisions, made by teams of engineers, product managers, behavioural scientists, and executives. Age verification systems will be coded and assured by someone. AI chatbots are trained and tested by someone. Recommendation algorithms are tuned, tweaked, and optimised by someone. If we treat technology as an abstract force, something that simply happens to us, we miss the most important lever for change.
In most areas of public life where decisions have the potential to cause real harm, we expect clear standards of professional responsibility. We don’t rely purely on goodwill or corporate culture. Doctors train and practise under strict ethical codes. Lawyers operate within regulated professional frameworks. Teachers are trusted with young people precisely because their profession carries defined duties of care. These structures exist because society recognises that when the stakes are high, professionalism cannot be optional.
Yet the people shaping the digital environments our children inhabit, spaces where they learn, socialise, explore identity and, at times, encounter profound risk, are not consistently subject to the same structured professional accountability. That’s a huge gap.
This isn’t about demonising tech workers – actually quite the opposite. Many of the engineers and safety professionals inside technology companies care deeply about child safety. Many are parents. Many have pushed internally for safer defaults, stronger moderation, less exploitative design. But good intentions without institutional professional backing can be fragile in the face of commercial pressure. If Labour is serious about creating an online world where young people can thrive, not just survive, we must strengthen the professional foundations of the tech workforce itself.
That means clearer standards around ethical design. It means embedding public interest principles into software development practices. It means supporting professional bodies, accreditation routes, and continuing professional development frameworks that make safety, privacy, and wellbeing core competencies rather than optional extras.
Framed this way, the choice isn’t simply between laissez-faire and prohibition. There is a third way here, one that recognises both the enormous benefits of digital connection and the absolute necessity of meaningful guardrails. Not a rhetorical “third way” of triangulation for its own sake, but a practical one: raising professional standards so innovation and protection advance together rather than being seen as in competition.
An Australia-style ban may still form part of the final policy mix. The consultation may conclude that age thresholds are in fact necessary. But even if we introduced one tomorrow, platforms would continue to design for engagement, ready for the moment young people turn 16. AI tools will continue to evolve. Children will migrate to new digital spaces. Without systemic change in how these systems are built, we risk chasing the problem from app to app – perpetually putting out fires, rather than catching the arsonist.
Labour has an opportunity here. Instead of framing this solely as a war against Big Tech billionaires, we can frame it as a public safety agenda; a standards agenda; a professionalisation agenda.
Protecting children online is not just about switching something off. It’s about building something better. And that has to start with the people behind the systems.