AI-Generated Medical Advice: GPT and Its Implications for Healthcare

A recent JAMA Viewpoint article by Claudia E. Haupt, JSD, PhD, and Mason Marks, MD, JD, examines the medical applications of AI large language models (LLMs) such as OpenAI's GPT and the risks they may pose to patients and healthcare professionals. The authors discuss potential use cases for these technologies in clinical practice and the legal and ethical implications surrounding their deployment.

Generative pretrained transformers (GPT) are AI tools that produce human-like text, enabling users to interact with the technology as if they were communicating with another person. GPT can answer questions, suggest differential diagnoses, and complete repetitive writing tasks. However, it is prone to errors and biases, raising concerns about its accuracy and reliability.

In healthcare settings, GPT could be employed in research, education, and clinical care. It can help scientists develop study protocols, serve as an interactive medical encyclopedia for students, and ease clinician burnout by taking on repetitive tasks. However, the authors caution that GPT is not currently HIPAA compliant and could compromise patient privacy.

The legal and ethical questions surrounding GPT and other LLMs depend on whether they are used to assist healthcare practitioners or replace them. Existing legal frameworks can assign liability to professionals for any inaccurate advice generated by AI when it is filtered through their judgment. However, patient-facing uses of GPT that substitute for professional judgment may undermine the patient-physician relationship and expose clinicians to legal liability.

In the consumer context, current healthcare laws do not apply, and the responsibilities owed to users are unclear. The Federal Trade Commission (FTC) could potentially frame AI manipulation and misleading AI-generated medical advice as unfair or deceptive business practices. The authors recommend that clinicians educate patients about the risks of using LLMs outside of the patient-physician relationship and advocate for FTC regulation to protect patients from false or misleading AI-generated medical advice.

Read the article here: https://jamanetwork.com/journals/jama/fullarticle/2803077
