Last month George Freeman MP woke up to find a video circulating online that appeared to show him defecting from the Tories to Reform UK. It was slickly produced, believable enough to fool casual viewers, and entirely fake.
Freeman went to the Norfolk Police. They took it seriously and initially treated it as a potential false communications offence under the Online Safety Act 2023. However, they later decided that it did not meet the legal test for a crime.
The wrong perpetrated against Freeman and the public in this case is clear: deliberate deception.
‘Satire makes fun of power. Deceit undermines truth.’
Our law already recognises that impersonation can be an offence. For example, pretending to be a police officer is a criminal offence because society relies on believing that the person in uniform really has the authority they claim. The same logic should apply to politics.
There have long been calls for new rules to govern election content, aiming to prevent misleading communication in our politics. The arrival of generative AI has made it even more urgent to bring forward a new code of conduct for campaigning. The main objection to doing so is that it might curtail free speech.
However, when someone deliberately uses technology to impersonate a political candidate, they’re not exercising free expression; they’re committing a form of democratic fraud. The difference between satire and deceit is as old as politics itself. Satire makes fun of power. Deceit undermines truth.
‘Using AI to impersonate a candidate can cross into deception’
For a time there was plenty of hype about generative AI in politics but little sign of its impact. That has now changed. Candidates and parties are increasingly using AI-generated videos, images and voice clips to shape messaging, to dramatise policy stakes, and to mock opponents. This is the new normal.
For example, in October, Donald Trump reposted an AI-generated video showing himself as a fighter-pilot wearing a crown and dropping sludge-like “faeces” on protesters during the “No Kings” demonstrations. It was clearly fantastical in nature. Nobody thought he was claiming to have actually done it.
And in New York’s mayoral race, Andrew Cuomo’s campaign released an AI-generated video on Halloween depicting his opponent Zohran Mamdani trick-or-treating; among other barbs, it showed Mamdani taking 52% of the candy on offer instead of the customary one piece, a dig at his tax policy. The video was fairly realistic and featured a very accurate voice clone, but the production was still a little rough, the scenario was clearly satirical, and a large disclaimer throughout highlighted the use of AI. Nobody would have mistaken it for genuine.
These examples show that generative AI can depict politicians while staying on the right side of the line. But using AI to impersonate a candidate in a video can cross into deception. In Cuomo’s case, the combination of cloning Mamdani’s image and voice and presenting him doing things he never did meant the Halloween advert came close to that line. Had the disclaimer appeared only at the end, or had the production been more polished, it might have crossed it.
‘AI is enabling those intent on manipulating voters’
The test for whether a piece of content is satire or misrepresentation should be whether the median voter believes the candidate genuinely said or did what is presented. The moment the audience has to wonder, even for a second, whether they are watching drama or reality, there is a problem.
It’s worth noting that AI isn’t a necessary ingredient for misleading claims in election content. For years, our politics has been poisoned by the likes of dodgy leaflets posted through doors, designed to mislead voters into thinking a candidate said something they did not. AI simply enables those intent on manipulating voters to make their misleading claims far more believable.
While we must address deception, it is equally important to recognise why parody must be protected. Britain has a rich tradition of political parody, from Spitting Image to Private Eye to the endless stream of mash-ups that lampoon ministers every week. These aren’t threats to democracy; they’re part of it. Parody signals to the audience that it’s a joke. The humour only works because we know it’s not real. That’s the line any sensible law must hold. Outlawing political impersonation shouldn’t muzzle humour, commentary or criticism. It should simply criminalise the use of artificial intelligence or digital tools to deceive voters about who’s speaking.
‘AI could change an election unless the law catches up’
A deep-fake designed to make a candidate appear to say something they never said, especially during an election, is no different in spirit from distributing a fake ballot paper. It’s a direct assault on the integrity of the democratic process.
The Norfolk Police’s decision not to progress Freeman’s case makes clear that there is currently a significant gap in the law. The Online Safety Act 2023 doesn’t cover it. Section 106 of the Representation of the People Act 1983 makes it illegal to publish false statements about a candidate’s personal character or conduct during an election. That’s a narrow and outdated safeguard. It doesn’t cover impersonation, and it doesn’t apply outside campaign periods. Defamation law is too slow and too costly to help in the middle of a fast-moving online storm. By the time a candidate proves a deep-fake is false, the clip has already gone viral.
New rules, properly enforced, could change this and the upcoming Elections Bill is the perfect opportunity. Within that legislation, as part of a code of conduct on campaigning, we need an offence that focuses on intent and authenticity. A simple principle: it should be illegal to create or distribute digital content that falsely purports to be a political candidate (or claims to be speaking for them), with the intent to deceive voters. Alongside that, a clear exemption for parody, satire or artistic expression. A law like this wouldn’t protect politicians from mockery but it would protect the public from manipulation.
Freeman, speaking at a recent event in the House of Commons on disinformation in campaigning, said: “I dread an election in which videos are put out 24 hours before saying that Justin has a secret habit and everyone goes, ‘Well, I’m not voting for him.’ It could change an election.” He’s right. It could, and soon it will, unless the law catches up.