These scientists aren’t using ChatGPT


Some scientists have chosen to steer clear of generative AI tools such as ChatGPT. Credit: Jaap Arriens/NurPhoto/Shutterstock

Since its release a year ago, it has been impossible to escape the ChatGPT craze. The chatbot, which generates incredibly realistic human-like text and was released by OpenAI in November 2022, seems to have permeated every industry, including science. Researchers have used ChatGPT, and the broader technology known as generative AI, to brainstorm research ideas, create computer code and even write entire research papers.

But not all scientists are embracing the technology. According to a survey carried out by Nature, about 78% of researchers do not regularly use generative AI tools such as ChatGPT. Of those who do, many have used the tools only for fun activities unrelated to their research, or as an experiment. Some have chosen to steer clear of chatbots because of their potential pitfalls and limitations. Others fear that they are missing out.

Nature spoke to three researchers about why they are not using ChatGPT in their work.

‘This could be very helpful’

Maxime Gauberti would use ChatGPT if he could. The neuroradiologist joined the waiting list to subscribe to ChatGPT-4 about a week ago. “I tried to buy it, but for the moment it’s not possible — I’m waiting for them to contact me,” says Gauberti. He had previously tried using the earlier, free version of the AI tool, ChatGPT-3.5, but felt that it did not perform well enough to be useful.

Gauberti, who is based at the University of Caen Normandy, France, thinks the newest version of ChatGPT could cut down on the time he spends composing e-mails and writing letters of recommendation for the PhD students and postdocs in his lab. “I think for my situation, for a non-native English speaker, this is really something that could be very, very helpful,” he says. “I’m taking a lot of time just to write, to be sure that it is OK, that this is the proper way to address people,” says Gauberti.

Talking to other researchers who have access to ChatGPT-4 has emphasized how it could help him. “Now that I see some colleagues are using it, I feel bad that it takes so much time to do some tasks that could be automated,” says Gauberti. The thought crosses his mind fairly often, especially when he faces tasks that the chatbot could speed up.

Gauberti thinks that spending less time writing e-mails could improve his productivity overall. “I need at least two or three days to write something good enough to be sent,” he says. “If I had an artificial intelligence to help me with that, I could be more reactive, and if you multiply that by the number of times that it could happen in your career, maybe it could potentially influence [career progression] a little.”

Stifling creativity?

Ada Kaluzna, a psychologist who left her research post at University College London in 2021 and now lives in Japan, has tried using ChatGPT for fun with her friends, but she thinks that the chatbot has no use in her research on mental health. “I enjoy writing. I do it in a fast way. Why am I even a researcher if I don’t write my own research?”

Kaluzna thinks that using ChatGPT could disrupt her ability to learn and think creatively. “Many people say that they don’t know what they think unless they write it down, and I think there’s something to it,” she says. “Writing enables you to focus on the topic and gives you the time to gather your thoughts. Without trying to put your thoughts down in a structured way, you might not be able to come up with new ideas.”

Although some of her colleagues have used ChatGPT for writing e-mails, Kaluzna says that she is happy to avoid the chatbot. “I don’t feel the pressure right now,” she says.

‘It’s fabricated information’

Viswanath Vittaladevaram, a computational chemist at the University of Galway, Ireland, tried using ChatGPT to find background information but realized that the chatbot cited unreliable sources when answering niche questions. “A large amount of information generated through ChatGPT is from non-academic sources,” he says.

He was also concerned when the AI tool provided varied answers to the same question. “You just get a feeling that it is fabricated information,” says Vittaladevaram. Another concern is that the data used to train ChatGPT could bias the chatbot’s responses, he says.

Although he experimented with using ChatGPT to gather information, Vittaladevaram does not use it to write research papers, because he hopes to publish in scientific journals that have banned the use of the tool for this purpose. Nevertheless, he thinks that the AI can be useful for answering basic questions, such as what role a protein plays in the body. “I decided I should completely move away from ChatGPT in terms of scientific writing,” he says. “But I can use it to get some basic information.”
