What Should GenAI Not Do in Healthcare?

With the advent of generative AI models like Med-PaLM and ChatGPT, providers can now type complex medical questions into a chat box and receive sophisticated (and hopefully accurate) answers. This capability surpasses previous AI applications in its potential to serve patients, but also in its potential to run afoul of laws such as corporate practice of medicine (CPOM) rules, the False Claims Act (FCA), and FDA regulations. These concerns — on top of the risk that a generative AI model will fabricate answers, a phenomenon known as “hallucination” — mean that providers should proceed with extreme caution before integrating generative AI tools into their practices.

Read the full article by Matthew Mousley on the Wharton Healthcare Quarterly website.

Promoting AI Use in Developing Medical Devices

The U.S. Food and Drug Administration (FDA) has issued a draft guidance intended to promote the development of safe and effective medical devices that use a type of artificial intelligence (AI) known as machine learning (ML). The draft guidance further develops FDA’s least burdensome regulatory approach for machine learning-enabled device software functions (ML-DSFs), which aims to increase the pace of innovation while maintaining safety and effectiveness.

Read the full Alert on the Duane Morris website.

© 2009- Duane Morris LLP. Duane Morris is a registered service mark of Duane Morris LLP.

The opinions expressed on this blog are those of the author and are not to be construed as legal advice.
