Immigration rights campaigners have filed a judicial review against the Home Office, challenging its use of a selective algorithm in the processing of visa applications.
The Joint Council for the Welfare of Immigrants (JCWI) said the “streaming tool” creates an unfair environment for nationals of certain countries applying for visas.
The migrants’ rights charity wants the court to declare the use of the algorithm unlawful, and to halt the current method of sorting visa applications, calling it a breach of the Equality Act 2010.
But the Home Office maintains it “complies fully” with all relevant legislation.
The case is believed to be Britain’s first court challenge to an algorithmic decision system.
The algorithm first came to public attention when it was shown to a group of lawyers visiting Sheffield’s visa processing centre.
The streaming tool grades each applicant’s risk level as red, yellow or green. Once an application has been flagged by the algorithm, its eligibility is considered on a case-by-case basis.
Sarah Marcus, communications director at JCWI, said many of the individuals being filtered as problematic are likely to be “from African countries”.
The Home Office admitted it holds a list of countries whose nationals are more likely to be deemed a “risk”, but has opted not to publicly identify which these are.
A 2016 report by the Independent Chief Inspector of Borders and Immigration stated that while segmenting applicants “to manage them more efficiently” may be sensible, there is a risk that the algorithms will become “de facto decision-making tools”.
But a Home Office spokesperson said the streaming tool is “only used to allocate applications, not to decide them”. The department said the data is used to indicate whether an application might require more or less scrutiny, allowing visa applications to be “processed as efficiently as possible”.
JCWI’s legal policy director Chai Patel said the streaming tool operates as a “digital hostile environment” which for years has had a “major effect on who has the right to come here to work, study or see loved ones”. The tool “[singles] out some people as ‘suspect’ and others as somehow more trustworthy, just because of where they come from”, he said.
Martha Dark, director of technology justice advocacy group Foxglove, pointed out that algorithms are not neutral, but reflect the assumptions and preferences of the people who build and use them.
“This visa algorithm didn’t suddenly create bias in the Home Office, but because of its feedback loop, it does accelerate and reinforce [bias]. The Home Office should scrap the streaming tool and set up a scheme that’s fair for everyone, regardless of colour or creed,” she said.
The Home Office said it would be “inappropriate to comment” on the current legal proceedings.