If social media technology were owned and controlled by the people who use it, we would not be living with algorithms that amplify hate, nor would we see pornographic or exploitative features bundled behind premium paywalls. These are not accidental design flaws; they are the predictable outcomes of profit-driven platforms optimised for engagement and revenue rather than public good.
Calls for a separate social media environment for children under 16 may sound sensible at first glance. However, history shows that prohibition-style policies rarely eliminate harmful behaviour. More often, they push it underground, reduce transparency and make harms harder to monitor and regulate. Age-segregated platforms risk becoming either ineffective or lightly supervised spaces that replicate the same structural problems.
A better solution lies not in simple restriction but in governance. Social media platforms need democratic, enforceable governance structures that ensure ethical design, accountability and appropriate use.
At present, most major platforms are privately owned, venture-capital-backed corporations whose legal duty is to maximise shareholder value. That incentive structure rewards attention extraction, data harvesting and polarising content. As a result, misinformation, harassment, scams and exploitative material are not anomalies; they are systemic features.
This represents one of the biggest social and democratic threats of our time, and it has emerged with remarkably little serious opposition. We have effectively sleepwalked into a communications infrastructure that shapes public discourse, mental health and political outcomes, yet operates with minimal democratic oversight.
I support Labour’s stance on tackling the use of AI to create illicit images, particularly deepfake abuse. But the problem is far wider than one application of artificial intelligence. What about the millions of people who have been scammed, trolled, harassed or deliberately misinformed through platforms that lack meaningful ethical controls? What about communities targeted by algorithmic amplification of hate, or children nudged towards harmful content by recommendation systems designed to maximise time spent online?
These harms are not simply failures of moderation; they are failures of ownership and control.
An alternative model already exists in the form of platform cooperatives. Platform co-ops are digital services that are owned and governed by their users, workers or communities, rather than external investors. Decisions about data use, moderation, algorithms and monetisation are made democratically, with social value embedded into the platform’s purpose.
In the UK, platform cooperatives and other forms of cooperative technology are beginning to emerge across sectors including social care, creative industries and public services. While still small in scale, largely because of funding constraints, they demonstrate that ethical, inclusive and accountable digital infrastructure is possible when ownership aligns with users rather than advertisers.
If the government is serious about creating a safer digital environment for children and adults alike, it must go beyond regulation alone and actively support alternative ownership models with enabling finance. Labour’s manifesto commitment to doubling the size of the cooperative movement provides a clear opportunity. That commitment should explicitly include digital platforms and technology infrastructure.
Supporting cooperative tech could mean targeted public investment, preferential procurement, access to patient capital, and regulatory frameworks that recognise democratic digital governance as a public good. Without this, we will remain trapped in a system where the harms of social media are treated as unfortunate side effects, rather than the logical consequences of who controls the technology.
Real safety online will not come from bans or cosmetic fixes. It will come from reshaping power, ownership and accountability in the digital public square, and that is a challenge Labour should be bold enough to take on.