Technology is not neutral. It is always embedded in social, economic, and political relations built on systematic, structural imbalances of power. Welcome to capitalism. Including the Xi Jinping Chinese variant.
Artificial intelligence technologies are not unusual in this respect. But what is true of IT in general is even more so of AI. Such is the argument of Dan McQuillan, author of Resisting AI: an anti-fascist approach to artificial intelligence, and a lecturer in creative and social computing at Goldsmiths, University of London. (Sebastian Klovig Skelton's interview with McQuillan features in the 01-07 August issue of Computer Weekly, and can be read here on the CW website.)
His strong contention is that existing AI tends to lend itself to authoritarian forms of politics that are remorselessly exclusionary, deciding who gets to have agency and who does not. It amplifies existing social injustices and needs to be junked.
So far, so pessimistic. He also, however, argues for direct action against AI as we know it – for example, against new datacentres being built in areas with water access issues – and for participatory social projects to envision alternative uses of the technology. He takes inspiration from the work of the British cybernetician Stafford Beer, who was an advisor to the Salvador Allende government in Chile that was toppled by a fascist coup in 1973.
Elsewhere in the 01-07 August issue of Computer Weekly, we find hackers fighting back against AI, in an analysis by Alex Scroxton. 72% of the ethical hackers who contributed to Bugcrowd’s annual Inside the mind of a hacker study said that AI will never be able to reproduce the creativity of thought of a human hacker. Nevertheless, hackers are using generative AI, with 62% of the report’s respondents saying they are incorporating it into their security workflows.
And younger hackers – Generation Z is now preponderant in the community – will most likely take GenAI for granted as a digital assistant. But they are arguably more likely to be aligned with the emancipatory and participatory politics advocated by McQuillan in Resisting AI. Indeed, 50% of the Bugcrowd constituency are engaged with social or community groups relating to cyber security.
Fleur Doidge’s feature on the impact of the GenAI rocket-boosting of artificial intelligence and machine learning on the datacentre brings out the insight that a datacentre optimised for AI will be a different beast from a conventional one, built for enterprise operations. As elsewhere in the economy, the likely impact of the increased use of generative AI on the datacentre sector is improved productivity and efficiency among staff. The watchword in this area is caution, allied with the application of foresight. And any strains that AI places on power, processing and storage could well be countered by applying AI itself.
However, datacentre operators might have to look out nervously for those protestors.