Nigel Shadbolt on why the UK is well placed to lead on the ethics of AI

Nigel Shadbolt, principal of Jesus College, Oxford and co-founder of the Open Data Institute, underscores the importance of the ethics of artificial intelligence, and says AI will shape the naked ape as tools always have

The UK has a genuine opportunity to take a lead on the ethics of artificial intelligence, says Nigel Shadbolt, principal of Jesus College, Oxford and co-founder of the Open Data Institute (ODI).

Indeed, many of the most important issues thrown up by AI – which Shadbolt began researching in the 1970s and early 1980s – are ethical in nature, he says. And he explores the gamut of these issues in a book he has co-written with Roger Hampson, The Digital Ape: how to live (in peace) with smart machines.

Hampson was chief executive officer at the London Borough of Redbridge for 16 years, and is a non-executive director of the ODI.

“The idea was to write a popular book about AI that says ‘we’ve got agency in this, we have choices’,” said Shadbolt in an interview with Computer Weekly in July at Jesus College, an Elizabethan college with strong links to Wales, where he became principal in 2015. “When presented to a general audience, they get it.”

Shadbolt will take part in a panel, open to the general public, on “augmented humanity”, associated with Computer Weekly, at the Oxford Science and Ideas Festival later this year.

He says the book foreshadows things that have gone wrong since it was written, such as the Cambridge Analytica data ethics controversy and the clunkiness, so far, of voice-recognition assistant devices in the home, which comically get hold of the wrong end of the stick.

“I am an AI guy originally, and can say there is a lot of hyperbole and nonsense being spoken,” he says.

The title of the book is a homage to Desmond Morris’ 1967 book, The Naked Ape. “All this technology, same old ape,” says Shadbolt.

And he does not have in mind just Homo sapiens, but all hominins. “We were making tools for 200,000 generations before even modern human beings appeared on the scene,” says Shadbolt. “That has shaped our neurology, our motor coordination, as well as our social coordination. So it isn’t new that technology is going to change us.”

Does he expect advanced artificial intelligence technologies to change us again?

“They already are, in the form of what is called extended cognition, or extended mind: offloading stuff you used to carry in your head to the device,” he says. But, again, this is not new. “They used to have this argument in classical times with the emergence of writing because that meant people were not carrying the oral tradition in their heads.”


And it will always have limits. “The idea of us waking up with self-aware, sentient machines is science fiction,” says Shadbolt. “Though, along the way, we will build these amplifiers to our own intelligence which will collectively and individually change us. Such brain-machine interfaces already exist with implants to control Parkinson’s or prosthetic limbs.”

However, he does see a real danger in the “concentration of power in data with an elite”, adding: “So we do have to think about how to empower average citizens and consumers to control their information and get a sense of agency. It is not a new idea, this. The old public intellectuals like Bertrand Russell and CP Snow worried about equality in similar ways.”

Shadbolt gives the example of deep British thinking about life sciences – exemplified in the Warnock committee of inquiry into human fertilisation and embryology in the 1980s – as evidence that the UK excels in the kind of reflection demanded by the current upsurge in AI.

“These are broad ethical questions,” he says. “For example, take the digital companions that will be with us in our houses – what happens when you can preserve the voice and characteristics of a deceased partner? What will be the social norms in that sort of world?”

The House of Lords report on artificial intelligence, and its implications for the UK’s economy and society, published earlier this year, advanced the view that the ethics of artificial intelligence could prove a good niche.

Shadbolt agrees. “I think we are well equipped to do that,” he says. “We could have a ‘Warnock commission’ about privacy and data. If we can build regulatory authorities that balance the interests in data exploitation similar to that which was done regarding stem cells in human embryology, that makes for a really credible argument for data as an innovation space for the UK.”

But Brexit could throw a spanner in the works, he says. “We can now access the best European talent and funds and have been very successful there. The government says it will keep us subscribed to those programmes, but we used to shape them, too. And we have to work out how we properly fund our own research in the future.”

Data and AI

Shadbolt underlines the value of bringing together data science on the one hand, and more algorithmic machine learning and artificial intelligence on the other, while also infusing both with insights from other fields. “In my own work, I’ve surrounded the computer science with inputs from other subjects,” he says. “Computer science cannot stand alone: you need the life and physical sciences feeding in the problems.”

And this interdisciplinarity comes over in The Digital Ape, which has arresting sentences, such as: “Let’s look in a little more depth at the Sahara desert ant, several species of Cataglyphis. There is computation built into the morphology of these small creatures.” And the text is leavened with references to philosophers, such as Daniel Dennett and Wittgenstein, and writers as diverse as PG Wodehouse and TS Eliot.

But as well as this multidisciplinary approach, it is important, says Shadbolt, to calibrate data work with AI work correctly. There is a danger that AI on its own has become “too deeply fashionable”, he says. “You then need to think about the architecture very carefully. At the ODI, we talk a lot about data infrastructure, which is just as important as conventional infrastructure, such as railways and power grids.”

Computer science at Oxford

Shadbolt has traversed the range of computer science higher education in the UK. “I've gone from a pure AI department at [the University of] Edinburgh through a department of psychology at Nottingham, which was interested in human problem-solving, to professor of AI at Southampton, which was strongly engineering-focused,” he says. “And now I am here [at Oxford] and I think all those approaches have a lot to learn from one another.”

So is there an “Oxford” approach to the subject? “Historically, it’s been very strong, going back to Christopher Strachey and Tony Hoare, in terms of the formal basics of computing,” he says. “In the fundamentals of logical computation it is very strong, and that is always a good place to be.

“But, over the last decade and a half, it has expanded to the applied topics, too. And it is flanked by very strong mathematics and statistics departments, and the engineering science departments.”

Computer science is still a relatively new subject for Oxford – and it has a remarkably small number of undergraduates. When Shadbolt arrived, in 2015, there were 36 single honours students for what is “one of the strongest computer science departments in the country”.

Shadbolt and his colleagues, including Sir Tim Berners-Lee, who is also a professor of computer science at Oxford, and Michael Wooldridge, head of the department, are “on a mission to get more colleges to accept computer science”, he says. “We’ll be taking our first intake in October [at Jesus]. It is a college-by-college thing.”


Computer science is the most in-demand course across the university, with about 20 applicants for every place. It also has the highest number of students from working-class backgrounds.

In fact, according to the Times Higher, the highest proportion of state school students is to be found on computer science courses. Between 2015 and 2017, 77.8% of UK students enrolled in the subject at Oxford were from a state school.

Moreover, 15.8% of students from the bottom two categories of socio-economic advantage were admitted to computer science at Oxford over the same period.

As Anna McKie reported in the Times Higher: “This compares sharply with the university’s classics courses, which admitted the smallest proportion of students from financially disadvantaged areas, 5.2%, or from state schools, 28.9%.”

“But for all the good statistics around computer science admission data, the problem nationwide, and especially in our research-intensive universities, is gender,” said Shadbolt.

Once these Oxford computer science students graduate, says Shadbolt, “hanging onto the best of them is a problem in the face of interest from private companies and from the US [academy]”. He adds: “It was ever thus, but if we can work out an arrangement whereby they [the companies] can employ our graduates but also allow people to work with a foot in both camps, that would be good.”

Sir Nigel Shadbolt will take part in a panel on “augmented humanity”, associated with Computer Weekly, at the Oxford Science and Ideas Festival on 19 October 2018.

This was last published in August 2018
