Medindia

AI Tool Mimics Doctor's Notes With Precision

by Colleen Fleiss on Dec 4 2023 3:06 PM

A new AI program called GatorTronGPT can craft doctors' notes so convincingly that even two physicians couldn't distinguish them from human-written notes (1).
In this proof-of-concept study, physicians reviewed patient notes, some written by actual medical doctors and others generated by the new AI program, and identified the correct author only 49 percent of the time. A team of 19 researchers from NVIDIA and the University of Florida trained supercomputers to generate medical records using a new model, GatorTronGPT, which functions similarly to ChatGPT.
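A 49 percent success rate is statistically indistinguishable from flipping a coin, which is the point of the finding. A minimal sketch of why, using a two-sided binomial check (the number of notes reviewed here is a hypothetical figure for illustration, not taken from the study):

```python
from math import comb

# Hypothetical setup: reviewers judge 100 notes and get 49 right.
# Under pure guessing (p = 0.5), how surprising is a score this far from 50?
n, correct, p = 100, 49, 0.5

def binom_pmf(k, n, p):
    """Probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Two-sided p-value: probability of a result at least as far
# from the expected value n*p as the one observed.
deviation = abs(correct - n * p)
p_value = sum(binom_pmf(k, n, p)
              for k in range(n + 1)
              if abs(k - n * p) >= deviation)

print(round(p_value, 2))  # -> 0.92, i.e. fully consistent with chance
```

With a p-value this large, 49 correct out of 100 gives no evidence that the reviewers could tell AI-written notes from human-written ones.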

GatorTron AI Tool: Impact and Accessibility in Clinical Research

The free versions of GatorTron models have more than 430,000 downloads from Hugging Face, an open-source AI website. GatorTron models are the site's only models available for clinical research, according to lead author Yonghui Wu, from the University of Florida’s department of health outcomes and biomedical informatics.

"In health care, everyone is talking about these models. GatorTron and GatorTronGPT are unique AI models that can power many aspects of medical research and health care. Yet, they require massive data and extensive computing power to build. We are grateful to have this supercomputer, HiPerGator, from NVIDIA to explore the potential of AI in healthcare," Wu said.

For this research, published in the journal npj Digital Medicine, the team developed a large language model that allows computers to mimic natural human language. These models work well with standard writing or conversations, but medical records bring additional hurdles, such as the need to protect patients' privacy and their highly technical content.

Digital medical records cannot be Googled or shared on Wikipedia. To overcome these obstacles, the researchers used the medical records of two million patients, retaining 82 billion useful medical words. Combining this set with another dataset of 195 billion words, they trained the GatorTronGPT model to analyze the medical data using the GPT-3 (Generative Pre-trained Transformer) architecture, a form of neural network.

Of the many possible uses for a medical GPT, one idea involves replacing the tedium of documentation with notes recorded and transcribed by AI. For an AI tool to reach such parity with human writing, programmers spent weeks training supercomputers on clinical vocabulary and language usage drawn from billions upon billions of words.

Reference:
  1. A Study of Generative Large Language Model for Medical Research and Healthcare - (https://arxiv.org/abs/2305.13523)
Source: IANS

