Caution urged when using AI to answer health questions

Author: AAP

Flinders University researchers have cautioned against the use of artificial intelligence programs to seek health advice, calling for regulators and health-care experts to be involved in developing quality standards.

In a recent study, the university evaluated the ability of ChatGPT to provide responses to questions commonly asked by patients with cancer on prevalence, prognosis and treatment.

The ChatGPT responses were compared against the answers returned by a general Google search.

"We caution that the ChatGPT responses didn't provide quality references, will produce only answers to some questions, is not currently kept up to date in real-time, and will produce incorrect answers in a confident-sounding manner," researcher Ash Hopkins said.

"The latter is an important required improvement to ensure the virtual assistant can respond with uncertainty when it is not sure of its answers."

However, Dr Hopkins said ChatGPT had a remarkable ability to formulate responses to complex questions about cancer in a way that often appeared less likely to cause alarm.

"Overall, the responses demonstrate that ChatGPT generally produces easily understandable answers which are comparable to Google's feature snippet," he said.

"Notably, the ChatGPT responses often had contextualisations which appeared to minimise the likelihood of alarm, while practical recommendations, such as speaking to your doctor, were also often added."
