Lines of code on a computer screen. Apple cofounder Steve Wozniak said that he had gotten a credit limit 10 times that offered his wife. (Photo: Shutterstock)

There are laws against the kinds of discrimination that cost women—and other minorities—job opportunities, higher salaries and other benefits, even housing. But how can a person fight back when the origin of the discrimination is a faceless algorithm rather than a human being?

Apple may be about to find out. Reuters reports that the New York Department of Financial Services will investigate Goldman Sachs' credit card practices after its new Apple titanium credit card offered a tech entrepreneur's wife a credit limit one-twentieth the size of her husband's, despite her having a better credit rating.

A tweet by David Heinemeier Hansson about the disparity prompted a similar one from Apple cofounder Steve Wozniak, who said that he had gotten a credit limit 10 times that offered his wife.

The cause of the disparate offers apparently lies with an algorithm.

The report quotes a blog post by Linda Lacewell, the superintendent of the New York State Department of Financial Services, who writes, "New York law prohibits discrimination against protected classes of individuals." Lacewell added that the law bars any method of determining creditworthiness, including an algorithm, from producing disparate treatment based on individual characteristics such as age, creed, race, color, sex, sexual orientation, national origin or other factors.

Reuters cites DFS saying, "We know the question of discrimination in algorithmic decisioning also extends to other areas of financial services."

And how ironic that the wives of two such prominent tech figures should find themselves the targets of algorithmic discrimination when it comes to money.

And it's not the first investigation to be launched with regard to discrimination. Bloomberg reports, "NY DFS opened a probe against health care giant UnitedHealth Group Inc. after a study found an algorithm favored white patients over black patients."

Hansson is quoted as saying, "Goldman and Apple are delegating credit assessment to a black box. It's not a gender-discrimination intent but it is a gender-discrimination outcome."
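
Hansson's point, that a model can produce a discriminatory outcome without any discriminatory intent, is the heart of disparate impact analysis. Below is a minimal illustrative sketch, in Python, of how such an outcome might be flagged from the outside. The numbers are purely hypothetical and the threshold is borrowed, by analogy, from the EEOC "four-fifths" rule for hiring selection rates; nothing here reflects the actual Apple Card or Goldman Sachs model.

    # Hypothetical credit-limit offers; assume the model never saw gender,
    # but the outcomes can still be compared by group after the fact.
    from statistics import median

    offers = [
        ("F", 5_000), ("F", 6_000), ("F", 4_500), ("F", 7_000),
        ("M", 50_000), ("M", 45_000), ("M", 60_000), ("M", 40_000),
    ]

    def median_limit(group):
        """Median credit limit offered to one group."""
        return median(limit for g, limit in offers if g == group)

    # Simple disparity measure: ratio of one group's median offer to the
    # other's. A ratio well below 1.0 signals a disparate outcome,
    # regardless of what the model intended or which inputs it used.
    ratio = median_limit("F") / median_limit("M")
    print(f"Median offer ratio (F/M): {ratio:.2f}")

    if ratio < 0.8:  # threshold echoes the four-fifths rule, used here only as an analogy
        print("Disparate outcome flagged; intent plays no role in this test.")

A regulator or plaintiff applying this kind of outcome test never needs to open the "black box" itself, which is precisely why Hansson's framing matters.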

The algorithm problem promises to stay in the news. The Electronic Frontier Foundation has criticized proposed new rules from the Department of Housing and Urban Development "that would effectively insulate landlords, banks, and insurance companies that use algorithmic models from lawsuits that claim their practices have an unjustified discriminatory effect."

Says EFF, "HUD's proposal is flawed, and suggests that the agency doesn't understand how machine learning and other algorithmic tools work in practice. Algorithmic tools are increasingly relied upon to make assessments of tenants' creditworthiness and risk, and HUD's proposed rules will make it all but impossible to enforce the Fair Housing Act into the future."

The proposed rules, says EFF, would make it much tougher for a plaintiff to prove a disparate impact claim, and would create three defenses insulating defendants from such claims even when their use of algorithms resulted in discriminatory outcomes.

Housing, health and money. What's next?
