Interview: Ruha Benjamin, author, Race After Technology

Why do systems that are supposed to help society seem to have a disproportionately adverse effect on ethnic minorities?

Are software algorithms racist? “The idea of a racist robot is a tongue-in-cheek name for machine bias – but I think about the larger process of discriminatory design,” says Ruha Benjamin, an associate professor of African American studies at Princeton University and author of Race After Technology.

Benjamin has a stark warning for the technology industry, software developers and users of technology: technology has the potential to hide, speed up and even deepen discrimination, while appearing neutral and even benevolent when compared to the racist practices of a previous era.

Publicly, the tech industry appears to hold liberal values. Tim Berners-Lee invented the World Wide Web, giving everyone the ability to share information freely. Facebook and other social media platforms have enabled people to connect and share experiences, while open source has demonstrated the altruistic side of software: programmers dedicate time and effort to maintaining freely available code.

But Benjamin argues that many algorithms in software systems and online services have discriminatory designs that encode inequality by explicitly amplifying racial hierarchies, by ignoring but thereby replicating social divisions, or by aiming to fix racial bias but ultimately doing quite the opposite.

The risk to society is that these hidden algorithmic biases can have a detrimental effect on minorities. “I want people to think about how automation allows the propagation of traditional biases – even if the machine seems neutral,” she says.

In Race After Technology, Benjamin explores algorithmic discrimination and how existing racial biases get inside data science. “There are many different routes,” she says. “Technology does not grow on trees. What kind of seeds are we planting?”

She asks readers to think about how they can design algorithms differently, so they are not predisposed to prejudice.


Because this bias often finds its way into public sector systems responsible for supporting vulnerable members of society, those systems can amplify racial inequality, she says. For Benjamin, society needs to be acutely aware of the risk of bias not only in public sector systems, but also in finance, insurance, healthcare and other sectors where officials rely on computer systems to make decisions that can have an adverse impact on individuals.

“It may seem like a very objective system that relies on training data, but if historic data bears strong bias, we are automating these past biases,” she says.
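Benjamin’s point about automating past biases can be illustrated with a deliberately simple, hypothetical sketch. The records, postcodes and hiring scenario below are entirely fabricated for illustration: a naive model “trained” on biased historical decisions reproduces that bias through a proxy feature (postcode), even though its scoring logic contains no explicit rule about group membership.

```python
# Hypothetical illustration: fabricated historical hiring records of the
# form (postcode, qualified, hired). Past decision-makers hired qualified
# applicants from postcode "A" far more often than equally qualified
# applicants from postcode "B".
from collections import defaultdict

history = [
    ("A", True, True), ("A", True, True), ("A", True, True), ("A", True, False),
    ("B", True, False), ("B", True, False), ("B", True, True), ("B", True, False),
]

# "Training": estimate the historical hire rate per postcode.
counts = defaultdict(lambda: [0, 0])  # postcode -> [hired, total]
for postcode, qualified, hired in history:
    counts[postcode][1] += 1
    if hired:
        counts[postcode][0] += 1

def predicted_hire_rate(postcode):
    hired, total = counts[postcode]
    return hired / total

# Two equally qualified applicants receive very different predicted
# outcomes, purely because the training data encodes a biased pattern.
rate_a = predicted_hire_rate("A")  # 3/4 = 0.75
rate_b = predicted_hire_rate("B")  # 1/4 = 0.25
```

The model never looks at race, yet the postcode acts as a proxy that carries the historical discrimination forward – which is exactly the seemingly neutral automation of past bias Benjamin describes.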

Benjamin would prefer public officials, and those in the private sector responsible for making decisions about individuals, to pull back from their computer screens. “I am trying to resuscitate the human agents behind these systems,” she says.

While people generally recognise their own human bias, for Benjamin, outsourcing decisions to seemingly objective systems built on biased algorithms simply shifts that bias to the machine.

Designing out racial bias

Questioning the underlying value of any given piece of technology should be part of the design process, says Benjamin. This should also be part of good corporate social responsibility and become a normal aspect of product development. She says software developers need to think about both how their systems can enhance society and which communities their software could harm.

However, the pace at which software developers are encouraged to get a product out into the market is incompatible with a more considered approach, where the societal impact is assessed, says Benjamin. “Race After Technology is also about the speed we are incentivised to go, which sidesteps the social dimension. Those who produce the tech are not incentivised to go slower.”

Benjamin adds: “My vision requires different values. Think about whether you are motivated more by economic value or social value.”

Although the tech industry tends to be multicultural and, at least publicly, presents itself as a promoter of gender equality and diversity, Benjamin feels that the raw statistics on demographics, gender and racial diversity represent only one of the facets that need to be considered.

“There is a culture in the technology industry which influences people and appears to override their backgrounds and upbringings,” she says. At some level, Benjamin feels that the background of individuals who work in tech can sometimes take a back seat to how they are motivated.

“You have a narrow area of expertise. You are not necessarily incentivised to think about the broad impact of your work,” she says.


But recent protests by tech workers show that people do feel angst over what projects they are prepared to work on, says Benjamin. “I have seen a growing movement in the technology industry for employees to push their organisations to think about the broader implications of projects for surveillance and military tech,” she says.

For instance, in June 2018, Google decided not to renew its contract with the US military to develop artificial intelligence (AI) technology following a workers’ revolt.

After the Facebook/Cambridge Analytica scandal, Benjamin believes people will demand that the software products they use and support, and the companies that make these products and services, are more ethical.

In fact, Benjamin says bias in the tech industry is similar to other industries and institutions. “The problems raised are not that different,” she adds. “Many aspects of society are in crisis with respect to values.”

But she thinks there is an opportunity for the tech sector to take a lead in ethics. “Who are you accountable to?” she asks. “The bottom line? Shareholders?” For Benjamin, there is an ethos in tech to do good – but this aspiration is not the whole story.

After writing Race After Technology, she says: “I was interested in the many forms bias can take. Is technology for social good, a tech fix for bias? I want to reposition technology from a fix to bypass social problems, to how technology factors in a wider social change.”
