Deadbot - AI
(Photo by Chip Somodevilla/Getty Images)
Deadbots, an artificial intelligence service, could cause psychological harm to creators and users, or even digitally ‘haunt’ them.

A new artificial intelligence program that allows users to simulate conversations with deceased loved ones is sparking ethical concerns from experts in the industry.

"Deadbots," also known as "griefbots," use artificial intelligence to mimic the langauge and personality of deceased people based on their "digital footprint," according to Techtimes. 

AI ethicists are sounding the alarm about the long-lasting psychological harm the technology could cause, warning that unscrupulous companies and thoughtless business practices could misuse the service to take advantage of the living.

Experts highlight the potential for these services to harm, or even digitally 'haunt', creators and users, explaining that while they are legally permissible, letting users upload their conversations with dead relatives to essentially bring them back as chatbots steps into a danger zone.

Dr. Katarzyna Nowaczyk-Basińska, a co-author of a new study from Cambridge's Leverhulme Centre for the Future of Intelligence, told The Guardian:

"Rapid advancements in generative AI mean that nearly anyone with internet access and some basic know-how can revive a deceased loved one." 

Adding, "This area of AI is an ethical minefield. It's important to prioritize the dignity of the deceased and ensure that this isn't encroached on by the financial motives of digital afterlife services."

Companies are likely to capitalize on digital legacy services through advertising. That creates an uncomfortable situation when users realize they were never consulted on how their data could be shared, for example, if a recreated loved one starts offering daily suggestions or advice.

While deadbots pose a risk to everyone, they raise particular concerns for children. Parents who want to help their children grieve the loss of someone close could cause significant damage by short-circuiting the normal mourning process.

"No re-creation service can prove that allowing children to interact with 'deadbots' is beneficial or, at the very least, does not harm this vulnerable group," the paper cautions.

The researchers have proposed a set of best practices, which may require regulation to enforce, to preserve the dignity of the dead.

Such measures include procedures for sensitively "retiring" deadbots, restricting their interactive features to adults only, and being transparent about how the AI system operates.

The idea of recreating a dead loved one with a ChatGPT-style AI is nothing new.

In 2021, Joshua Barbeau made headlines after using GPT-3 to create a chatbot that spoke in the voice of his deceased girlfriend, according to The Guardian.

Several other programs offer similar services, such as converting text messages from the deceased into chatbots, the approach that led developer Eugenia Kuyda to create the AI companion app Replika.

Another well-known business, the genealogy website MyHeritage, has introduced Deep Nostalgia, a feature that animates still photos of users' ancestors into short videos.

Following the feature's viral spread, the company admitted that some users found it creepy.

"The results can be controversial, and it's hard to stay indifferent to this technology," MyHeritage said at the time. 

The company has since launched DeepStory, giving users the ability to generate talking videos.