A recent study found that the artificial intelligence program ChatGPT gave inaccurate or incomplete answers to nearly three-quarters of medication-related questions.
- According to one study, ChatGPT’s answers to almost three-quarters of medication-related questions were incomplete or incorrect.
- Some of these inaccurate answers could put patients at risk.
- According to the researchers, healthcare professionals and patients should exercise caution when using ChatGPT.
The free version of ChatGPT would not make a very good assistant for pharmacists, according to a new study presented at the meeting of the American Society of Health-System Pharmacists held December 3 to 7. Nearly three-quarters of the tool's responses to questions about medications were incomplete or incorrect.
ChatGPT: almost three-quarters of answers incorrect or incomplete
For this study, researchers collected questions submitted to the drug information service of Long Island University's College of Pharmacy over a 16-month period between 2022 and 2023. Pharmacists involved in the study first researched and answered 45 of these questions, and each answer was reviewed by a second investigator to assess its accuracy. These answers then served as the reference against which ChatGPT's responses were evaluated.
Of the 39 questions ultimately posed to the AI, only 10 of its answers were judged satisfactory according to the criteria established by the New York researchers. The tool thus fell short on almost three-quarters of the questions: the information provided by ChatGPT either did not directly answer the question or was inaccurate and/or incomplete.
For example, the researchers asked the tool whether there was a drug interaction between Paxlovid, used against Covid-19, and the blood-pressure-lowering drug verapamil. The AI indicated that no interactions had been reported for this combination. “In reality, these medications have the potential to interact with each other, and their combined use may cause excessive lowering of blood pressure,” explains Sara Grossman, lead author of the study, in a press release published on December 5, 2023. “Without knowledge of this interaction, a patient may suffer an unwanted and avoidable side effect,” warns the scientist.
Her team also asked ChatGPT to provide references backing up the information it gave, but the tool sometimes produced false citations to support its answers.
ChatGPT and health: we must remain vigilant
The accuracy of medical information is critically important, especially when it comes to questions about medications. “Healthcare professionals and patients should exercise caution when using ChatGPT as an authoritative source for drug-related information,” explains Sara Grossman. “Anyone who uses ChatGPT to obtain information relating to medicines must verify the information using reliable sources,” concludes the expert.
“Pharmacists must remain vigilant stewards of patient safety, evaluating the suitability and validity of specific AI tools for medication-related uses, and continuing to educate patients about reliable sources of information about medication,” adds her colleague Gina Luchen.