“The potential for machine learning systems to amplify discrimination is not going away on its own,” says one expert. “Companies need to actively teach their technology to not discriminate.”

While well-designed machine learning systems can help minimize human bias in decision-making, it is also very possible for such systems to reinforce systemic bias and discrimination, according to the World Economic Forum white paper, “How to Prevent Discriminatory Outcomes in Machine Learning.”

The WEF report cites many real-life examples and “what-if” scenarios in which there is a risk of discrimination when designing algorithms for machine learning systems, including the real potential for exclusionary health insurance systems, Erica Kochi, co-leader of UNICEF’s Innovation Unit, writes in Quartz.
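One common way practitioners screen for the kind of discriminatory outcomes the report describes is to compare a model's approval rates across demographic groups. The sketch below illustrates the widely used "four-fifths rule" check on entirely made-up data; the group labels and decisions are hypothetical, not drawn from the report.

```python
# Illustrative sketch: checking a model's decisions for disparate impact
# using the "four-fifths rule" (an 80% threshold). All data is invented.

def selection_rate(decisions):
    """Fraction of positive (e.g., approved) decisions in a group."""
    return sum(decisions) / len(decisions)

# Hypothetical approval decisions (1 = approved) for two groups.
group_a = [1, 1, 1, 0, 1, 1, 0, 1]   # 6 of 8 approved
group_b = [1, 0, 0, 1, 0, 0, 0, 1]   # 3 of 8 approved

rate_a = selection_rate(group_a)
rate_b = selection_rate(group_b)

# Disparate-impact ratio: a value below 0.8 is a common red flag
# suggesting the system may be treating one group unfavorably.
ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
print(f"ratio = {ratio:.2f}, flagged = {ratio < 0.8}")
```

A check like this is only a first screen; it can surface a disparity but says nothing about why the model produces it or whether the underlying training data encoded historical bias.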


Katie Kuehner-Hebert

Katie Kuehner-Hebert is a freelance writer based in Running Springs, Calif. She has more than three decades of journalism experience, with particular expertise in employee benefits and other human resource topics.


Copyright © 2023 ALM Global, LLC. All Rights Reserved.