Advances in artificial intelligence mean ‘deadbots’ can ‘haunt’ users, warn AI experts. Here’s what it means


‘Deadbots’, or digital recreations of lost loved ones, are fast becoming a reality, with many companies offering services that can bring back the dead digitally. As amazing as it may sound, AI ethicists have warned that such services could do more harm than good and even “haunt” users.

What are ‘deadbots’?

Deadbots or ‘griefbots’ are AI chatbots trained to simulate the personality and speech patterns of a dead loved one using the digital footprints — messages, etc — they’ve left behind. 

The ethical dilemma

According to University of Cambridge researchers, “Artificial intelligence that allows users to hold text and voice conversations with lost loved ones runs the risk of causing psychological harm and even digitally ‘haunting’ those left behind, without design safety standards.”

Researchers from Cambridge University’s Leverhulme Centre for the Future of Intelligence, in a paper published in the journal Philosophy and Technology, have highlighted how these deadbots can cause psychological harm or be misused.

‘Overwhelming emotional weight’

As per the researchers, such deadbots can distress users, especially children, by insisting their dead parent is still “with you”. They contend that while these digital recreations may initially offer some semblance of comfort, in the long term they can cause psychological distress.

Users may feel drained by daily interactions that become an “overwhelming emotional weight”, argue researchers. They also say that the emotional crutch can impede mourning, which is a natural process of dealing with grief.

An ethical minefield

As per study co-author Dr Katarzyna Nowaczyk-Basińska, “Rapid advancements in generative AI mean that nearly anyone with Internet access and some basic know-how can revive a deceased loved one”, making it an “ethical minefield”.

“It’s important to prioritise the dignity of the deceased, and ensure that this isn’t encroached on by financial motives of digital afterlife services, for example.”

One potential scenario the researchers propose is the misuse of the “data donor” (the digitally recreated dead loved one) to push products and advertising.

“People might develop strong emotional bonds with such simulations, which will make them particularly vulnerable to manipulation,” said co-author Dr Tomasz Hollanek.

They suggest that deadbots should be retired in a “dignified way”.

“Methods and even rituals for retiring deadbots in a dignified way should be considered. This may mean a form of digital funeral, for example, or other types of ceremony depending on the social context.”

“We recommend design protocols that prevent deadbots being utilised in disrespectful ways, such as for advertising or having an active presence on social media,” added Hollanek.

They also call for age restrictions for users and “meaningful transparency” to ensure users are consistently aware that they are interacting with an AI.

“These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating,” they reason.

(With inputs from agencies)
