ChatGPT's answers to nearly three-fourths of drug-related questions reviewed by pharmacists were incomplete or wrong in a recent study. The research underscores the need for guardrails as artificial intelligence becomes increasingly prevalent in health care.

The U.S. Department of Health and Human Services on Wednesday finalized a broad rule aimed at boosting data interoperability and patient access, including a provision establishing transparency requirements for AI in health software. Under the rule, developers of clinical decision support and predictive tools certified by the agency's Office of the National Coordinator for Health Information Technology must enable a limited set of identified users to access information about those tools.

© 2025 ALM Global, LLC, All Rights Reserved.