Artificial intelligence (AI) chatbots that mimic the language and personalities of dead people risk “digitally haunting” the living, a researcher has warned.
Some companies are already offering to “bring grandma back to life” by offering users the ability to upload the conversations and digital footprint of their deceased loved ones into a chatbot.
Such services could be marketed to terminally ill parents or children, or to people who are still healthy and want to catalog their lives and leave behind a digital legacy.
But researchers at the University of Cambridge say AI chatbots – known as deadbots – are a “high-risk” endeavor that could cause lasting psychological harm to users and fundamentally disrespect the rights of the deceased.
Dr Tomasz Hollanek, an AI researcher at the Leverhulme Centre for the Future of Intelligence, said: “It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulations.
“These services run the risk of causing enormous distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost.”
“The potential psychological effect, particularly in an already difficult time, could be devastating.”
The study, published in the journal Philosophy & Technology, highlights the potential for companies to use deadbots to surreptitiously advertise products to users in the manner of a deceased loved one, or to distress children by insisting that a dead relative is still “with you”.
The researchers say that when the living sign up to be virtually recreated after their death, the resulting chatbots could be used by companies to spam surviving family and friends with unsolicited notifications, reminders and updates about the services they provide – akin to being digitally “stalked by the dead”.
Even those who take initial comfort from a deadbot may become exhausted by daily interactions that turn into a “crushing emotional weight”, the study authors say – yet they may also be powerless to have the AI simulation suspended if their now-deceased loved one signed a lengthy contract with a digital afterlife service.
Study co-author Dr Katarzyna Nowaczyk-Basinska said: “Rapid advances in generative AI mean that almost anyone with internet access and some basic know-how can bring a deceased loved one back to life.
“This area of AI is an ethical minefield.
“It is important to prioritize the dignity of the deceased and ensure that it is not encroached upon by financial motivations linked to digital afterlife services, for example.
“At the same time, a person can leave an AI simulation as a farewell gift to loved ones who are not ready to deal with their grief in this way.
“The rights of data donors and those who interact with AI services after death must be equally protected.”
The researchers note that platforms offering to recreate the dead with AI for a small fee already exist, such as Project December, which started out using GPT models before developing its own systems, and apps such as HereAfter.
Similar services have also begun to emerge in China, according to the study.
Dr Hollanek said people “could develop strong emotional connections to such simulations, making them particularly vulnerable to manipulation”.
He said ways to retire deadbots in a dignified manner should be considered, which “could mean a form of digital funeral”.
“We recommend design protocols that prevent deadbots being utilised in disrespectful ways, such as for advertising or having an active presence on social media,” he added.
The researchers recommend age restrictions for deadbots and also call for “meaningful transparency” to ensure that users are consistently aware that they are interacting with an AI.
They also call on design teams to prioritise opt-out protocols that allow users to end their relationships with deadbots in ways that provide emotional closure.
Dr Nowaczyk-Basinska said: “We need to start thinking now about how to mitigate the social and psychological risks of digital immortality, because the technology is already here.”