
Goldman Sachs probed over alleged Apple Card gender discrimination

US financial services regulator is investigating allegations that Goldman Sachs’ credit decisions for its Apple Card offering show gender bias

A US financial services regulator is investigating the algorithms Goldman Sachs uses to make credit decisions for its Apple Card offering, over allegations of gender discrimination.

This follows an allegation from tech entrepreneur David Heinemeier Hansson that he was given a much higher Apple Card credit limit than his wife, even though she has a better credit score and the couple file joint tax returns. Apple co-founder Steve Wozniak tweeted in response that a similar thing had happened to his wife.

According to US reports, the New York Department of Financial Services (NYDFS) said it is investigating Goldman Sachs to determine whether the algorithms behind its credit decisions treat women less favourably. “The department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally, regardless of sex,” said a spokesman for Linda Lacewell, superintendent of the NYDFS.

“Any algorithm that, intentionally or not, results in discriminatory treatment of women or any other protected class of people violates New York law.”

Apple Card uses the Mastercard network to enable iPhone users to turn their mobile into a credit card, linked to their Apple Pay account, after an application process that takes just minutes. It is this kind of simplicity, combined with Apple’s high security and large customer base, that attracted Goldman Sachs to the partnership.

Goldman Sachs said in a statement: “As with any other individual credit card, your application is evaluated independently. We look at an individual’s income and an individual’s creditworthiness, which includes factors like personal credit scores, how much debt you have, and how that debt has been managed.

“Based on these factors, it is possible for two family members to receive significantly different credit decisions.”

Hansson said that when this became a PR issue for the bank, his wife’s credit limit was raised without her being asked for any additional documentation. According to reports, he said: “My belief isn’t that there was some nefarious person wanting to discriminate. But that doesn’t matter. How do you know there isn’t an issue with the machine-learning algorithm when no one can explain how this decision was made?”


He added: “Goldman and Apple are delegating credit assessment to a black box. It’s not a gender-discrimination intent but it is a gender-discrimination outcome.”

In its recent Let’s get real about AI study, management consultancy OC&C warned that one of the key challenges in using artificial intelligence (AI) is building trust in the answer. AI systems typically learn “the rules” from exposure to outcomes, rather than building up from simple rules such as “if x, then y”, it said.

This means the AI system may not be able to explain why it reached a particular result. In turn, this can cause serious problems with trust in AI systems, rejection by human operators, and/or fundamental problems with conforming to regulations, said OC&C.
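To illustrate the point, the short Python sketch below is a hypothetical example using entirely synthetic data; the features, numbers and model are illustrative assumptions, not Goldman Sachs’ actual system. It trains an off-the-shelf classifier that is never given gender as an input, yet still produces skewed approval rates, because a correlated proxy feature carries the historical bias and the fitted model offers no per-decision explanation.

```python
# Hypothetical sketch: synthetic data and an off-the-shelf classifier,
# not Goldman Sachs' actual model. It shows how a model that never
# sees gender can still produce skewed outcomes via a proxy feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
gender = rng.integers(0, 2, n)               # 0 or 1; never given to the model

# Assumed proxy: reported individual income (in thousands) historically
# skews lower for group 1, e.g. household income attributed to one spouse.
credit_score = rng.normal(700, 50, n)
income = rng.normal(60 - 15 * gender, 10, n)

# Historical approvals were partly income-driven, so they inherit the skew.
noise = rng.normal(0, 1, n)
approved = (0.002 * credit_score + income / 40 + noise) > 3.2

X = np.column_stack([credit_score, income])  # gender deliberately excluded
model = LogisticRegression(max_iter=1000).fit(X, approved)

probs = model.predict_proba(X)[:, 1]
print(f"Mean approval probability, group 0: {probs[gender == 0].mean():.2f}")
print(f"Mean approval probability, group 1: {probs[gender == 1].mean():.2f}")
# The gap persists even though gender was never a model input, and the
# fitted coefficients alone do not explain any individual decision.
```

Under these assumptions, the two groups receive noticeably different approval probabilities despite identical treatment of the features the model can see, which is the “gender-discrimination outcome” without “gender-discrimination intent” that Hansson describes.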

Earlier this month, a Goldman Sachs regulatory filing revealed that the bank had made billions of dollars of credit available through its partnership with Apple as the issuing bank for the tech giant’s credit card.

It said that since Apple Card launched in August, it had extended $10bn in credit lines to Apple Card users, with outstanding loan balances of more than $700m.
