ChatGPT could produce fake medical data

While the threat already hangs over the work of artists, translators, and even journalists, could artificial intelligence (AI) also supplant that of scientists? That, at least, is what some fear following the publication of a study on the subject in the journal Patterns on March 10. The new research reveals that ChatGPT, the chatbot prototype that has been all the rage lately, can indeed “manufacture compelling medical data”, reports Interesting Engineering.

The researchers came to this conclusion after asking ChatGPT to generate an abstract of a scientific paper examining the effects of two drugs on rheumatoid arthritis. To do so, they instructed the conversational tool to use medical data from 2012 to 2020. ChatGPT accomplished the task, producing a realistic scientific abstract that went so far as to claim that one drug was more effective than the other.

While some of the data used by the tool may indeed be real, the scientists behind the study doubt that all of it is, since “ChatGPT only considers data up to 2019”, notes Interesting Engineering. Above all, they consider it dangerous that an AI can draw conclusions about the greater effectiveness of one drug over another.

Fake data that is difficult to detect

The authors of the study are concerned that it will now be much easier to publish fraudulent research “likely to cast doubt on all legitimate work”, and that ill-intentioned people will take advantage of this.

“In one afternoon, you can end up with dozens of abstracts of scientific papers that can be submitted to various conferences for publication,” lament the researchers. “When an abstract is accepted for publication, it is also possible to use this same artificial intelligence technology to write the full manuscript.”

But for the researchers, the main concern is that ChatGPT may draw on data that does not exist in order to write research papers. Such papers could thus “easily escape human detection and ultimately end up in a publication.”
