Apple’s tech-oriented credit card is at the heart of a new investigation into alleged gender discrimination.
New York state regulators have announced an investigation into Goldman Sachs, the bank that issues the Apple Card, after a series of viral tweets from a consumer who shared the vastly different credit limits that were issued to him and his wife when they both applied for the card.
The New York State Department of Financial Services (NYSDFS) was first tipped off by a viral Twitter thread from tech entrepreneur David Heinemeier Hansson, begun on Nov. 7. He detailed how his card’s credit limit was 20 times higher than his wife’s, even though she has a higher credit score and they file joint tax returns. Hansson referred to the Apple Card as a “sexist program” and said that its over-reliance on a “biased” algorithm did not excuse discriminatory treatment.
The @AppleCard is such a fucking sexist program. My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does. No appeals work.
— DHH (@dhh) November 7, 2019
After his complaints on Twitter, Hansson said his wife’s credit limit was raised to match his. But his frustration was not only with the credit line itself; it was also with how customer support was apparently trained to handle the accusation of gender bias: blame the algorithm.
Apple has handed the customer experience and their reputation as an inclusive organization over to a biased, sexist algorithm it does not understand, cannot reason with, and is unable to control. When a trillion-dollar company simply accepts the algorithmic overlord like this…
— DHH (@dhh) November 8, 2019
Hansson’s complaints were echoed by Apple co-founder Steve Wozniak, who replied to the thread saying “the same thing happened to us.” Wozniak said his credit limit was 10 times higher than his wife’s, even though they have no separate assets or accounts. In his view, Apple should “share responsibility” for the problem.
I'm a current Apple employee and founder of the company and the same thing happened to us (10x) despite not having any separate assets or accounts. Some say the blame is on Goldman Sachs but the way Apple is attached, they should share responsibility.
— Steve Wozniak (@stevewoz) November 10, 2019
Others shared similar stories…
Just read this thread. My wife has a way better score than me, almost 850, has a higher salary and was given a credit limit 1/3 of mine. We had joked that maybe Apple is just sexist. Seems like it’s not a joke. Beyond f’ed up.
— Carmine Granucci (@whoiscarmine) November 9, 2019
Same here my wife is a doctor, I haven’t worked in 5 years and my credit limit is $20,000 her is $4,800. She wanted to surprise me with the new Apple Watch and phone for both of us, price tag came out to $5,200. She had to put the rest on our AMX.
— shareProud (@shareproud) November 9, 2019
The CEO of Goldman Sachs denied wrongdoing on Monday, stating unequivocally that “we have not and will not make decisions based on factors like gender.” He added that the company would be open to re-evaluating credit limits for those who believe their credit line is lower than their credit history would suggest it should be.
Superintendent of the NYSDFS Linda Lacewell said Sunday in a statement that state law bans discrimination against protected classes of individuals, “which means an algorithm, as with any other method of determining creditworthiness, cannot result in disparate treatment for individuals based on age, creed, race, color, sex, sexual orientation, national origin or other protected characteristics.” She added that this “is not just about looking into one algorithm” but also about working with the tech community more broadly to “make sure consumers nationwide can have confidence that the algorithms that increasingly impact their ability to access financial services do not discriminate.”
Why it’s Hot:
Apple and Goldman Sachs may blame “the algorithm,” but ultimately that algorithm was created by humans – and that excuse doesn’t cut it with customers. As we increasingly rely on algorithms and AI, how do we ensure they’re built without our innate biases?
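One answer is that regulators and auditors test algorithms by their outcomes, not their inputs: even if gender is never fed to the model, vastly different credit limits for otherwise-similar applicants can signal disparate impact. A minimal sketch of one common screening check, the adverse-impact ratio (often judged against the conventional “four-fifths rule”), is below. The group labels and numbers are hypothetical, purely for illustration; this is not how Goldman Sachs evaluates its model.

```python
def adverse_impact_ratio(favorable_a, total_a, favorable_b, total_b):
    """Ratio of favorable-outcome rates between two groups.

    By the conventional four-fifths rule, a ratio below 0.8 is a
    red flag that warrants a closer look for disparate impact.
    """
    rate_a = favorable_a / total_a
    rate_b = favorable_b / total_b
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Hypothetical audit: count applicants in each group who were offered
# a credit limit above some threshold (e.g., $10,000).
ratio = adverse_impact_ratio(favorable_a=420, total_a=1000,
                             favorable_b=180, total_b=1000)
print(f"{ratio:.2f}")  # 0.43 -- well below the 0.8 threshold
```

A check like this says nothing about *why* the gap exists; that requires digging into the features and training data, which is precisely the kind of scrutiny the NYSDFS investigation promises.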