The labour movement must be prepared for the age of the algorithm

The fiasco over the calculation of A-Level results earlier this month clearly has huge implications for the young people involved, but it also marks an important moment: the public’s first real taste of what ‘rage against the algorithm’ might look like. For those young people now contemplating what their future holds, the bad news is that, whatever happens, algorithms like these are set to play a role in their working lives for years to come. It matters for all of us. The future of work is one in which data will determine how we are managed, what types of jobs there will be, and how we are rewarded, whether through pay, benefits or promotion. The question is whether we can start to tame these beasts and avoid the kind of shambles we have seen over exam results.

Algorithms are hardly a new phenomenon. But as they become more mainstream and are hardwired into the fabric of our economy and society, it is essential that we all become more literate in their use. The world of work was already changing before the pandemic. But the pace of that change has accelerated during Covid, with growing interest in work-based surveillance technology and monitoring software to help manage remote working.

For those of us with an interest in economic justice, data is becoming a fault line in inequality – separating those who have power over technology from the rest of us. It means that the same standards of trust and accountability demanded over the exams fiasco and government use of algorithms must also apply when employers deploy new technology or automated tools.

For trade unionists in particular, questions of algorithms and bias will be our meat and drink over the decades to come – almost equivalent to the role that health and safety played in the previous century. Algorithms are already being used in recruitment, in performance reviews and elsewhere in the workplace, too often with disastrous consequences. There is already evidence that workers are either unaware of, or excluded from, information and decision-making about workplace surveillance. Our own research at Prospect shows that most workers are unsure what data their employers currently collect about them. We have to start educating ourselves, and fast.

The essential issue with any algorithm is that it reflects the biases of the assumptions and data on which it is built. The A-Level algorithm, for example, penalised students in larger class sizes, and it did not take a genius to work out that downgrading would fall disproportionately on state school students. In a workplace context, the obvious example is recruitment algorithms that reflect the racial biases of those who design them and therefore offer interviews only to white applicants. Algorithms are only as good as the data fed into them: put garbage in, and you get garbage out.

We need to start thinking of data as part of our civil and economic rights. At its worst, as the exams fiasco showed, it is about unaccountable power leading to discrimination and injustice. But too often, especially at work, data is assessed in terms of business risks, not people. One challenge is that the foundation of GDPR and our current data rights is individual privacy. Whilst this is important, it is insufficient in itself to tackle systemic bias or problems such as those highlighted by the A-Level results. We need to develop a collective approach to our data rights alongside individual privacy. This is even more important given the contractual relationship at the heart of employment. Inequality and discrimination are about structures as well as individuals.

What do unions need to do to prepare ourselves for the age of the algorithm?

  • We need to make better use of existing legal tools to test and scrutinise surveillance technologies. GDPR says that all new uses of our data should be subject to scrutiny by workers and their unions. But how often does that consultation actually happen? We need to use GDPR and tools such as Data Protection Impact Assessments (DPIAs) to let the sun shine in on how employers are using our data.
  • We need to build our knowledge on what new data, technology and automated processes are coming down the track, and to equip union reps to engage on the issue.
  • We need to make data part of our bargaining agenda, ensuring transparency and making algorithmic decisions and data collection everyday matters for collective bargaining.
  • We need to go further and faster on exploring ways that workers can collect and use their own data, individually and collectively, so they can argue back against unjust algorithms.
  • We need to campaign for improved workplace rights around transparency over the collection and use of employee data, and to establish new rules, such as a Right to Disconnect to challenge the always-on work culture.

Prospect is working on these issues and helping to bring unions and activists together in this debate. For example, we are working with UNI Global Union to test and develop new approaches that give workers more access to, and control over, their own data. And we are learning from partner unions across the world about best practice when it comes to building data rights into collective agreements.

The rage against the algorithm we have seen over the summer only highlights the necessity of this work. This generation of students are the workforce of tomorrow – together we can make sure that their first experience of the injustice of algorithms does not become the norm as they enter the world of work.
