Lines of code on a computer screen. Apple cofounder Steve Wozniak said that he had gotten a credit limit 10 times that offered to his wife. (Photo: Shutterstock)

There are laws against the discrimination that costs women, and other minorities, job opportunities, higher salaries and other benefits, even housing. But how can a person fight back when the source of that discrimination is a faceless algorithm rather than a human being?

Apple may be about to find out. Reuters reports that the New York Department of Financial Services will investigate Goldman Sachs' credit card practices after its new titanium Apple Card offered a tech entrepreneur's wife a credit limit one-twentieth the size of the one it offered her husband, despite her having a better credit rating.

A tweet by David Heinemeier Hansson about the disparity prompted a similar one from Apple cofounder Steve Wozniak, who said that he had gotten a credit limit 10 times that offered to his wife.

The cause of the disparate offers apparently lies with an algorithm.

The report quotes a blog post by Linda Lacewell, the superintendent of the New York State Department of Financial Services, who writes, "New York law prohibits discrimination against protected classes of individuals." Lacewell added that the prohibition bars any method of determining creditworthiness, including algorithms, from producing disparate treatment based on individual characteristics such as age, creed, race, color, sex, sexual orientation, national origin or other factors.

Reuters cites DFS as saying, "We know the question of discrimination in algorithmic decisioning also extends to other areas of financial services."

And how ironic that two such prominent women should find themselves the targets of algorithmic discrimination with regard to money.

And it's not the first investigation to be launched over algorithmic discrimination. Bloomberg reports, "NY DFS opened a probe against health care giant UnitedHealth Group Inc. after a study found an algorithm favored white patients over black patients."

Hansson is quoted as saying, "Goldman and Apple are delegating credit assessment to a black box. It's not a gender-discrimination intent but it is a gender-discrimination outcome."

The algorithm problem promises to stay in the news. The Electronic Frontier Foundation has criticized proposed new rules from the Department of Housing and Urban Development "that would effectively insulate landlords, banks, and insurance companies that use algorithmic models from lawsuits that claim their practices have an unjustified discriminatory effect."

Says EFF, "HUD's proposal is flawed, and suggests that the agency doesn't understand how machine learning and other algorithmic tools work in practice. Algorithmic tools are increasingly relied upon to make assessments of tenants' creditworthiness and risk, and HUD's proposed rules will make it all but impossible to enforce the Fair Housing Act into the future."

The proposed rules, says EFF, would make it much tougher for a plaintiff to prove a disparate impact claim, and would create three defenses insulating defendants from such claims even when the algorithms they used produced discriminatory outcomes.

Housing, health and money. What's next?


Marlene Satter

Marlene Y. Satter has worked in and written about the financial industry for decades.