Below is the full text of Tom Watson’s speech today on fixing the “distorted” digital market, previewed here.
Hello, thanks for having me today, and thanks to Progressive Centre UK, particularly Matthew Laza, for hosting this important event. There has always been propaganda and spin in our politics. It is a side effect, albeit an unpleasant one, of an energetic liberal democracy. But what is happening today is different.
In recent years we’ve seen the organised exploitation of digital media platforms by anti-democratic interests. They are seeking to manipulate public views, to distort our political culture, and to divide our society. And the painful truth is, it’s working.
The former Prime Minister Jim Callaghan once said that a lie can be halfway round the world before the truth has got its boots on. Today, falsehood travels 20 times faster than truth on Twitter. Political ads on Facebook reach more people if you’re willing to include outrageous content because their algorithms are set to reward attention, even if it’s attention paid to lies or distortions. And YouTube’s steady stream of recommended videos will take you from cats to conspiracy theories in a matter of minutes.
All too often, we don’t even notice it’s happening. I’m going to make two statements that may seem to contradict each other: firstly, the internet has enabled great, positive innovations in media and politics; secondly, both the people abusing digital platforms, and the companies that allow them to do so, are a threat to our democracy.
We have seen an illiberal populism sweeping the West. It is fuelled by declining public faith in democratic institutions but is amplified by the Internet. Let’s be clear, fundamentally, this is because governments often fail to deliver on the core tenets of their social contracts: personal freedom, equality under the law, and the chance to make a better future for our children.
Look no further than the people sleeping rough on our streets to see that the system isn’t working. Frustration with the system is turning into a desire for destruction. Chunks of the electorate hold views that are verifiably false. Fact-checks do not sway opinions. And trust in media is falling even further.
Our democracy depends on an open public sphere of fact-driven debate. That base is cracked at its foundations. This is not just a technology problem. It is a democracy problem, exacerbated by technology.
The BBC recently revealed that I was the first person to use the phrase “social media” in the House of Commons. That was back in 2008, when the world looked very different. It was the year Android was launched; the year a new music streaming app called Spotify was released; the year Myspace was overtaken by Facebook.
I was among those who saw the Internet as a great democratising force in our politics. It offered new forms of organisation, innovation, and localism. Our optimism was ascendant. In 2008, like many people, I looked forward to a tech utopia. But a decade on, we find ourselves in a digital dystopia.
New control technologies like tracking tools and micro-targeted advertising have cast a long shadow. A little later, I’ll explain how Pokemon Go is an unlikely example of this kind of development. Parts of the internet have become havens for hate speech, platforms for extremism and election interference. We now know that as many as 126 million Americans saw content posted by fake Russian-backed pages on Facebook during the 2016 Presidential election.
Competition has been replaced by corporate power. Google has bought 215 businesses since 2000. Facebook has bought 69 businesses since 2007. And those who understand how to use the power of the internet to mine data, manipulate political views, and even undermine democracies seem to be on the rise, as seen in the case of Cambridge Analytica, which was exposed by the courageous journalism of Carole Cadwalladr.
It’s easy to look back and think ourselves naive. And it’s fair to say we underestimated the rapidity with which these problems would appear and how intensely they would impact on our politics. But things did not have to end up this way. And they do not have to stay this way.
Technology responds to the desires of its users, the structure of its market, and to the limits of the law. These things can all be changed. People created the internet, wrote the code that makes it work, designed and built Facebook and all the other services. They can all be made differently – if we have the will to change them.
So I think our central task as policy leaders is to steer the power of technology back towards the public interest. We can’t afford a laissez faire approach to regulation any longer. Government has been dazzled by the scale of corporate profit for far too long, without questioning how much of that value reaches the bulk of the population.
We must rebalance the digital ecosystem towards the many, not the few. To do this, we need to address the harms caused by the so-called attention economy. The Internet remains an open space for speech and assembly. But that is not where the money is.
The business model is simple: track everything that everyone does online. Then try to predict their wants, group them into target markets, and sell access to their attention to advertisers. The more people click, read, and watch, the more money is made. And here lies the danger for our democracy.
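That business model can be made concrete with a toy sketch: a few invented posts, ranked purely by predicted clicks. This is an illustration of the incentive, not any platform’s actual code, and every figure in it is made up:

```python
# Toy sketch of an engagement-maximising feed ranker.
# All posts and scores below are invented for illustration;
# no real platform's algorithm is being shown.

def rank_feed(posts):
    """Order posts by predicted engagement alone -
    accuracy plays no part in the ranking."""
    return sorted(posts, key=lambda p: p["predicted_clicks"], reverse=True)

posts = [
    {"title": "Careful fact-check", "predicted_clicks": 0.02},
    {"title": "Outrageous conspiracy", "predicted_clicks": 0.19},
    {"title": "Local news report", "predicted_clicks": 0.05},
]

for post in rank_feed(posts):
    print(post["title"])
```

When attention is the only signal being optimised, the outrageous item rises to the top by construction; that, in miniature, is the danger described above.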
Now merchants are selling worldviews and ideologies as products – just look at Steve Bannon. Conspiracy sells better than truth; and hate sells better than compassion. So, digital platforms are ideally suited to propagandists peddling bigotry and division to the disillusioned. I know Silicon Valley companies didn’t set out to undermine democracy. But they didn’t stop it happening either, and they continue to profit from it.
Let me be clear: this has got to stop. And in these divided times of Brexit, what is uniting the different sides of the Houses of Parliament is the need to address this. And unlike in other important national debates we’re having, parties of both colours understand that we can’t recreate past eras, we don’t want to turn away from the internet and the benefits it offers.
And as we work to address these problems, we must not lose sight of the fact that the power of technology offers enormous opportunity to humanity. This is particularly important to the Labour Party, which at its best is forward-looking and forward thinking. We know that change is inevitable, and with our notion of the empowering state, our challenge is to shape the change for the benefit of all. So – what can be done?
I see three phases to this process. First are the short term tasks to address the abuse of digital platforms. Then, we tackle a second set of challenges that reach deep into the structure of the online market. Lastly, we establish a digital public sphere to rebuild people’s trust in democratic institutions.
Underpinning all three is a Duty of Care for technology companies that have a broad social impact. The harms caused online need to be seen and treated as a public health concern. I’m very pleasantly surprised to see that the Secretary of State for Health understands this and is taking a leadership position.
Companies must recognize and measure those harms and potential harms. They must be transparent about risks and take reasonable measures to address them, according to rules laid down by a competent, agile regulator and monitored by regular audits. This is how we handle other industries that offer great public benefit but also carry risks if left without oversight, from broadcasting to healthcare. Technology markets cannot be exempt.
Firstly, I’ll address those immediate concerns about harms caused to our citizens and our democracy by the abuse of digital platforms. The rise in disinformation shows that the technologies underpinning the digital economy are too easily turned against us, sowing division and bringing extremism from the margins to the mainstream.
This is a matter of national security. What do I mean by that? Organised disinformation can come from foreign agents who wish us ill. But it’s also a matter of democratic integrity because disinformation undermines the quality of public debate. It is dividing our society, it is damaging our faith in the media, and it is distorting electoral outcomes. And it’s the vulnerable groups that are hit the hardest.
Consider the effects of disinformation on our older population. Recent research from the US found that older people may not have the digital skills to distinguish fact from fiction online. Over-65s shared twice as many links to inaccurate news articles on Facebook as those in the second-oldest age group.
The DCMS Select Committee have led the way on this with their Fake News Inquiry, and I look forward to the imminent publication of their report. Their interim conclusions: tech companies control what we see, by their very business model, and once content is seen, it is very difficult to disregard.
The public must have confidence that online attacks on our democracy will not be tolerated from any source, or in any form – be that cyber-attacks, hate speech, harassment or fraud. And when tech giants find that fake news has been spread on their platform they should let those exposed to it know – they should correct the record.
Just as the internet touches all aspects of our lives, its misuse impacts all levels of society. It is simply unacceptable that, even with the multibillion-pound market Facebook has in the UK, appearing before our parliament was not a priority for Mark Zuckerberg.
Even my old friends Rupert and James Murdoch made themselves democratically accountable to our parliament. That’s why we must hold global tech platforms accountable with clear rules of the road, and align innovation with public welfare.
To do this, we will introduce a set of Digital Democracy Guarantees. So let me outline Labour’s promises on security, transparency, and education. We will take immediate steps to further protect our elections from the cyber-attacks and disinformation campaigns of criminal enterprises and foreign actors. We will ensure that tech companies confirm that all online political advertisers targeting UK citizens are physically located in our country. We will bolster transparency, too.
People have the right to know who is trying to influence their views, and how they are trying to do it. Agents of disinformation amplify their lies through targeted digital adverts and social media bots. So all automated accounts on digital platforms should be clearly labelled.
And political advertising will be made more transparent, so consumers are confident they know who placed the advert they are seeing, and understand the broad demographic criteria by which they were targeted.
Too many platforms choose ad sales over accuracy, clickbait over credibility. So we need to establish improved online and media awareness across our education system and we will work with civil society groups to cultivate public knowledge about disinformation to support the next generation of voters.
And we will protect users from those who use digital media platforms to spread illegal material. Labour would establish a new legal duty to remove illegal content, like hate speech, with a judge-led system of checks and balances. This will include fast-track appeals.
We will look to our friends in Germany and the judicial framework they have built. The right to legitimate speech should be balanced against the need for legal protection.
Digital platforms should also enforce higher standards amongst their own user communities. It is abhorrent that Britain First’s Facebook page remained active after posts were linked to multiple attacks. I was shocked to learn that Facebook doesn’t recognise Stephen Yaxley-Lennon – Tommy Robinson – as the hate figure he is and kick him off their platform.
I was horrified to see how my SNP opponent, Stewart MacDonald, and his staff were targeted by Yaxley-Lennon’s intimidation tactics, which were broadcast on Facebook Live. Parliament is united that we must not allow the online world to be a haven for hate.
New regulation must also put the protection of children at the forefront. That’s why Labour will ensure that companies have a legal duty of care in the services they provide to children. I was heartened that the government said yesterday that they want this too, but we need to ensure that the threshold for harm caused is not too high to offer meaningful protection, and breaches of the legal duty must be met by robust penalties.
Under the GDPR companies can be fined up to 4% of global turnover or 20 million euros, whichever is higher, for data breaches. If companies breach health and safety law in this country they are not only fined but forced to pay a victim surcharge to compensate those affected. For the duty of care to be effective, we need penalties that seriously affect companies’ bottom lines.
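The GDPR’s upper-tier fine is the greater of the two figures, which is what gives it teeth against large companies. A quick back-of-the-envelope calculation (the turnover numbers here are invented for illustration):

```python
def gdpr_max_fine(global_turnover_eur: float) -> float:
    """Upper tier of GDPR administrative fines (Article 83(5)):
    up to 20 million euros or 4% of total worldwide annual
    turnover, whichever is higher."""
    return max(20_000_000, 0.04 * global_turnover_eur)

# A hypothetical firm turning over 5 billion euros faces
# a maximum fine of 200 million, not 20 million:
print(gdpr_max_fine(5_000_000_000))   # 200000000.0
# A smaller firm is still exposed to the 20 million floor:
print(gdpr_max_fine(100_000_000))     # 20000000
```

The “whichever is higher” rule means the penalty scales with the size of the company, which is exactly the property a duty-of-care regime would need.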
It is fitting that Safer Internet Day falls within Children’s Mental Health Week because the issues are inextricably linked. According to Ofcom, 8 in 10 children between 5 and 7 years old are online, along with almost all children between 8 and 11 years old. Together, children make up a third of online users, and they are also some of the most vulnerable.
NHS research shows that children with mental health problems are more likely to use social media every day, and will do so for longer periods of time. And time spent online can have tragic consequences.
By now the whole country knows the tragic story of Molly Russell. Last month, her parents bravely said that exposure to harmful content about depression and suicide online, in their own words: “helped to kill her”. And very sadly, Molly’s family is not alone.
Their tragedy is a consequence of an industry that too often chooses to profit from children, rather than protect them. Just look at the children left by Facebook to rack up bills of thousands of dollars through games like Pet Ville and Happy Aquarium.
The Science and Technology Select Committee’s impressive recent report called for a comprehensive regulatory framework to protect children’s health as a matter of urgency. They found that the current patchwork of initiatives does not offer our children the protection they need, leaving them vulnerable to content that is detrimental, and even dangerous, to their wellbeing. I agree with them – our children need more than patchwork protection.
The government urgently needs to announce an industry regulator with a strong Duty of Care with tough sanctions. Because while the government drags its feet, reluctant to regulate a profitable industry, children are left vulnerable. It seems to me that the whole of the tech industry has forgotten that children are still children, online or off.
Beyond these immediate changes, achieving long term progress means addressing the structural issues of the digital market. At the centre of this crisis is an imbalance of power created by data monopolies and a distorted market.
Each year, businesses make billions by extracting and monetising personal data from each and every one of us. And yes, they offer us a service in return, but only worth a fraction of the fortune they gain. This is Surveillance Capitalism.
I’ve been greatly influenced by the work of Professor Shoshana Zuboff. I spent hours playing Pokemon Go with small children, wondering why it so often led us to McDonald’s. In her excellent book, Professor Zuboff explains that Pokemon Go, a Google spin-off, created what she calls behavioural futures markets, so that businesses and corporations could profit from predicted behaviour.
This is one tiny albeit bizarre example of platforms catering to the business interests of a few at the expense of the many. Just look at how YouTube profits from musicians’ work without fair remuneration. YouTube pays creators just £0.00054 per stream.
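To see what that rate means in practice, it is worth doing the arithmetic. Taking the per-stream figure quoted above at face value:

```python
import math

# Per-stream payout figure quoted in the speech, taken at
# face value for this back-of-the-envelope calculation.
PER_STREAM_GBP = 0.00054

def streams_needed(target_income_gbp: float) -> int:
    """Whole number of streams required to earn a given income."""
    return math.ceil(target_income_gbp / PER_STREAM_GBP)

# Roughly 1.85 million streams just to earn 1,000 pounds:
print(streams_needed(1000))
```

At that rate a musician needs nearly two million streams to earn a thousand pounds, which is the imbalance being described.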
They push back at me, and say they’re in a digital streaming market and this is about finding a negotiated price. And I do understand their argument, but I think it’s undermined when you see the amount Google is spending on lobbyists in Brussels to undermine simple copyright reforms for the digital age. No one can justify the fact that last year Amazon made £8.7bn in Britain and paid only £4.5m in tax.
So empowering people to challenge these unethical powerful interests goes to the very heart of a Labour movement born from our trades unions. We will use the tools of government to shape the digital market and create meaningful structural change: this will be a new social contract fit for the digital age.
The power dynamic between monopoly tech platforms and users has long been lopsided. Users need more control over how their personal data is collected and monetised through a Digital Bill of Rights. Customers should benefit from the value of the data they provide and the inferences made from it.
And in a market that offers little consumer choice, industry must reduce the barriers for moving between platforms, meaning greater portability of data across services, and not just the raw data, but the friendship networks, tags and other details that make it valuable.
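What “portability beyond raw data” might look like can be sketched in a few lines. The field names and format here are entirely hypothetical, invented to illustrate the principle; no platform’s real export API is being described:

```python
import json

# Hypothetical portable-profile export. The schema below is
# invented for illustration - the point is that the social
# graph and tags travel with the raw data, in an open format
# another service could import.
def export_portable_profile(user: dict) -> str:
    portable = {
        "posts": user["posts"],
        "friends": user["friends"],  # the network, not just the raw data
        "tags": user["tags"],
    }
    return json.dumps(portable, indent=2)

profile = {
    "posts": ["First post"],
    "friends": ["alice", "bob"],
    "tags": ["photography"],
}
print(export_portable_profile(profile))
```

The design point is the middle two fields: an export that omits the friendship network and tags strips out most of the data’s value, which is why portability rules would need to cover them.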
As well as knowing when our data is traded, we should know when and how our data is subject to automated decisions. So people should have a voice in whether algorithms serve as invisible editors, for example curating what news we see, and have the ability to opt out.
They should know when their personal data is being used to affect the price they are being offered for goods or services. And we will explore alternatives to companies keeping large databases and encourage research into personal data stores which would give us more control over our own data and how it is processed.
Of equal importance is the power dynamic between the companies themselves. Consumers must have meaningful choices in how to find, send and receive information online. One of government’s central roles is to prevent the abuse of market power, from facilitating competitive entry and product differentiation to regulating this new generation of essential services so that public interest comes before private profits.
Competition restrictions and oversight should be modernised to match the digital market. Today, power is consolidated by large companies merging and acquiring smaller competitors, so future competition reviews should consider whether companies are acquiring data and patents that enable monopolisation.
The scale of the largest companies is rightly the subject of scrutiny, with consumers and businesses all subject to the whim of a single overmighty platform provider. We should take seriously the calls to break them up if it is in the public interest. I don’t subscribe to the argument that they’re global companies that can’t be touched.
It’s certainly true that we have to work within existing international structures to regulate monopolies. But these companies also exist within UK markets, and are subject to UK law including competition regulation.
I said at the start of this speech that governments are failing to deliver on the core of their social contracts. This government’s refusal to preempt the challenges presented by machine learning and artificial intelligence is one such failure.
As datasets grow, so do machines’ abilities to learn and make decisions on their own. And people are worried. Over half of working people think AI will make it harder to earn a good life in years to come. So how can we ensure technological advancements work for all of us, not against us?
The government estimates that 9 million jobs will be lost to automation, but I can’t tell you today exactly which jobs will be lost due to the rise of the robots. No one can do that accurately, although the new Institute for the Future of Work is trying.
It’s essential that school curricula help create a resilient workforce with the creativity and emotional intelligence to adapt to the jobs of the future. And artificial intelligence and machine learning should be subject to ethical oversight to ensure an appropriate balance of power, particularly in relation to design bias, discriminatory outcomes, and algorithmic fairness.
Last year, Amazon’s AI recruitment machines were filtering out female candidates’ job applications. Machines were learning the company’s own internal bias. Algorithmic processes need greater oversight and even regular audits, akin to health and safety, to help prevent these patterns repeating.
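The kind of audit this implies can start very simply: compare selection rates across groups. A minimal sketch, using the US “four-fifths rule” as the adverse-impact threshold and entirely invented screening outcomes (this is not Amazon’s data or system):

```python
def selection_rate(outcomes):
    """Fraction of applicants in a group who were selected."""
    return sum(outcomes) / len(outcomes)

def passes_four_fifths(group_a, group_b, threshold=0.8):
    """Adverse-impact check: the lower group's selection rate
    should be at least 80% of the higher group's."""
    low, high = sorted([selection_rate(group_a), selection_rate(group_b)])
    return low / high >= threshold

# Invented screening outcomes (1 = shortlisted, 0 = rejected):
men   = [1, 1, 1, 0, 1, 1, 0, 1]   # 6/8 shortlisted
women = [1, 0, 0, 0, 1, 0, 0, 0]   # 2/8 shortlisted
print(passes_four_fifths(men, women))   # False - the audit flags the model
```

A check like this takes minutes to run, which is why there is no excuse for automated hiring tools going unaudited.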
This new social contract would reach deep into digital market structures, helping to equalise the relationships between consumers and corporations, between smaller and larger companies, and balancing future innovation with social justice.
I want to build public interest into technology business models. Because when the public are seen as meaningful market partners, industry is incentivised to share more of its many benefits.
With that guiding principle, we can begin to build a digital public sphere. This would be an online space that supports civil society, where people can feel safe, where people won’t be surveilled, and if they are advertised to, they are advertised to transparently. I envisage this as a place where people can go for services from our great national collections to local authority services.
Jeremy Corbyn addressed these issues head-on last summer, when he spoke about building a free and democratic media for the digital age. And he’s right that without radical thinking, our public spaces and debates will be taken up by unaccountable tech giants and trust in the media will further suffer.
So we need to be bold and ambitious in a changing media landscape. To counter the damage done by disinformation, and to safeguard our democracy. To provide a safe space for young people online, and a source of trusted information for those less confident using the internet. And to create a place where reasonable debate and discussion can take place without trolls and extremists seeking to damage and undermine people.
One area that a digital public sphere can support is quality journalism that holds the powerful to account. The digital economy has displaced much of traditional journalism. 136 local and regional papers have closed in just six years. And there are 6,000 fewer full-time positions in the industry than in 2007.
Of course, digital platforms differ fundamentally from print: unlike the limits of the page, online advertising space is infinite. And the profits go to the advertising platforms themselves rather than the providers of news and other content that appear on them. This has undercut the profit model of print journalism irreversibly.
But even as the fortunes of commercial media have declined, the public’s need for their services has surged. A digital public sphere will provide a space for journalism in the public interest. But in order to do this the public policy response must be rigorous and open minded.
We could give charitable status to some local, investigative and public interest journalism, allowing outlets to use grants, donations, and tax exemptions to fund the brilliant work they do. Initiatives like a British Digital Corporation, which Jeremy outlined last summer, could serve as a crucial access point to publicly held data, and bring technological advancement and the public good into lockstep once again.
The ongoing Cairncross Review, and the government’s response to it, should commit to creating a digital public sphere, and engage in the creative policy thinking needed to achieve this.
To conclude, as a society we stand at an inflection point. We need an ambitious public policy response to steer technological development back towards serving the wellbeing of our democracy. I’ve spoken today about what we need to do now.
One – Deal with harms, hate and fake news with an enforceable duty of care.
Two – Fix a distorted online market caused by data monopolists and tech acquisitions. This requires a regulator to stop the lobbyists and lawyers slipping through the cracks in law and the gaps in our knowledge.
Three – Encourage and shape a digital public sphere where citizens can absorb credible news and information safe in the knowledge they will not be surveilled or targeted with ads when they do.
There are no single or fast solutions. Only a combination of policies — all of which are necessary for a self-governing democracy and none of which are sufficient alone. But this much is clear. We cannot allow our society to be held hostage to a marketplace of data monopolists who extract far more than they return, and whose business model undermines the integrity of our democracy.
Social justice and the public interest must be built into our market structures. We need:
- More protection from those that wish to do us harm,
- More choice and control over our own lives online,
- More opportunities to benefit from the wonders technology has to offer.
That is the mark of a fair society that only Labour can deliver. And if we get this right, the results could be spectacular.