The New York Department of Financial Services is looking into allegations of gender discrimination against users of the Apple Card, which is administered by Goldman Sachs.
The allegations blew up on Twitter Saturday after tech entrepreneur David Heinemeier Hansson wrote that Apple Card offered him a credit limit twenty times that of his wife, even though they share assets and she has a higher credit score. Many other users voiced similar experiences — including Apple co-founder Steve Wozniak.
Wozniak said his credit limit was 10 times that of his wife, despite the fact that they share all assets and accounts.
"Some say the blame is on Goldman Sachs, but the way Apple is attached, they should share responsibility," Wozniak tweeted.
Hansson wrote that after reaching out to Apple in an attempt to rectify the situation, he was told credit limits are determined by an algorithm.
The situation casts a shadow over the Apple Card, which launched with much fanfare in August as a partnership between the tech giant's Apple Pay program and a new retail consumer-focused effort at Goldman Sachs. The companies had boasted that the card would be available to consumers who might otherwise struggle to access credit, including those with no credit history or below-average credit scores. But these allegations highlight the challenges inherent in letting artificial intelligence, which has been shown to be biased in a number of contexts, make decisions such as how much credit to extend to a user.
Linda Lacewell, superintendent of the Department of Financial Services, said on Twitter Saturday the department would "take a look" into the allegations.
In a response to a request for comment on this story, an Apple spokesperson directed CNN Business to Goldman Sachs.
A spokesperson for Goldman Sachs said the company does not consider gender in determining credit limits.
"As with any other individual credit card, your application is evaluated independently," the company said in a statement to CNN Business. "We look at an individual's income and an individual's creditworthiness, which includes factors like personal credit scores, how much debt you have, and how that debt has been managed. Based on these factors, it is possible for two family members to receive significantly different credit decisions."
Hansson, in his Twitter thread, said the program's decision to offer his wife such a low credit limit was so striking that they feared her identity had been stolen; they paid to check her credit score and found it was higher than his. Hansson is the founder and chief technology officer of web development firm Basecamp.
AI-powered algorithms have been problematic in the past, researchers have found. Facial recognition software has trouble identifying women of color, and risk-assessment software used in criminal sentencing has been found to be biased against black Americans.
That could be an issue as AI technology underpins everything from the speech recognition behind voice assistants like Siri to the systems that allow autonomous vehicles to drive themselves — and, in the case of the Apple Card, consumer credit assessments.
"Financial services companies are responsible for ensuring the algorithms they use do not unintentionally discriminate against protected groups," Lacewell said on Twitter.
And it's not the first time a tech company has been accused of facilitating discrimination in access to financial services. Facebook is facing a proposed class action lawsuit alleging that ads for financial services such as loans and insurance coverage on the platform were targeted away from women and older people over the past three years.