Apple’s tech-oriented credit card is at the heart of a new investigation into alleged gender discrimination.
New York state regulators have announced an investigation into Goldman Sachs, the bank that issues the Apple Card, after a series of viral tweets from a consumer who shared the vastly different credit limits that were issued to him and his wife when they both applied for the card.
The New York State Department of Financial Services (NYSDFS) was first tipped off by a viral Twitter thread from tech entrepreneur David Heinemeier Hansson, begun on Nov. 7. He detailed how his card’s credit limit was 20 times higher than his wife’s, even though she has a higher credit score and they file joint tax returns. Hansson called the Apple Card a “sexist program” and said that its over-reliance on a “biased” algorithm did not excuse discriminatory treatment.
After his complaints on Twitter, Hansson found that his wife’s credit limit was increased to match his. However, his frustration was not only with the credit line itself, but also with how customer support is trained to handle accusations of gender bias: blame the algorithm.
Hansson’s complaints were echoed by Apple co-founder Steve Wozniak, who responded to Hansson’s tweet, saying “the same thing happened to us.” Wozniak said his credit limit was 10 times higher than his wife’s, even though they hold no separate assets or accounts. In his view, Apple should “share responsibility” for the problem.
Others shared similar stories…
The CEO of Goldman Sachs denied wrongdoing on Monday, stating unequivocally that “we have not and will not make decisions based on factors like gender.” He added that the company would be open to re-evaluating credit limits for those who believe their credit line is lower than their credit history would suggest it should be.
Superintendent of the NYSDFS Linda Lacewell said Sunday in a statement that state law bans discrimination against protected classes of individuals, “which means an algorithm, as with any other method of determining creditworthiness, cannot result in disparate treatment for individuals based on age, creed, race, color, sex, sexual orientation, national origin or other protected characteristics.” She added that this “is not just about looking into one algorithm” but also about working with the tech community more broadly to “make sure consumers nationwide can have confidence that the algorithms that increasingly impact their ability to access financial services do not discriminate.”
Why it’s Hot:
Apple and Goldman Sachs may blame “the algorithm,” but that algorithm was ultimately created by humans, and the excuse doesn’t cut it with customers. As we increasingly rely on algorithms and AI, how do we ensure they’re built without our innate biases?
Sources: Time, Mashable