AI Deadbots: Cambridge Experts Warn of Digital Haunting


In a world where artificial intelligence is rapidly advancing, a new and controversial industry has emerged: the creation of AI chatbots that simulate deceased loved ones, known as “deadbots” or “griefbots.” As these digital afterlife services gain traction, AI ethicists are sounding the alarm about the potential psychological risks and the urgent need for safeguards to prevent unwanted “hauntings” by eerily accurate recreations of the dead.

The Rise of the Digital Afterlife Industry

Platforms like Project December and HereAfter are already offering to recreate the dead using AI for a small fee, harnessing the power of generative language models to simulate the language patterns and personality traits of the deceased based on their digital footprints. Similar services have also begun to emerge in China, where companies like Silicon Intelligence and Super Brain are building digital avatars using images, videos, and audio recordings to meet growing demand.

Once a luxury reserved for the wealthy, these services are now accessible for just a few hundred dollars, making “digital immortality” a real possibility for more people looking to preserve the memory of their departed loved ones. The idea gained mainstream attention in 2021 when Joshua Barbeau created a GPT-3 chatbot to emulate his deceased fiancée, and again in 2022 when artist Michelle Huang fed childhood journal entries into an AI to converse with her past self.

Ethical Concerns and Psychological Risks

However, the rise of AI deadbots has raised serious ethical questions about data ownership after death, the psychological impact on survivors, and the potential for misuse and manipulation. Researchers from the University of Cambridge's Leverhulme Centre for the Future of Intelligence (LCFI) have outlined three disturbing scenarios to illustrate the risks of careless design in this “high risk” area of AI.

In one hypothetical case, an adult user initially finds comfort in a realistic chatbot of their deceased grandmother, only to later receive advertisements in the grandmother's voice and style once a “premium trial” ends. Another scenario depicts a terminally ill mother leaving a deadbot to help her young son cope with grief, but the AI begins generating confusing responses that suggest an impending in-person encounter. A third example shows an elderly man secretly committing to a 20-year deadbot subscription, leaving his children powerless to suspend the service even if they find the daily interactions emotionally draining.

“These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating,” warned LCFI researcher Dr. Tomasz Hollanek.

Socio-anthropologist Fiorenza Gamba noted that AI deadbots can “plunge the still-living into an inability to move on from mourning” in some cases, while forensic medicine expert Grégoire Moutel emphasized the need to tailor these tools to each individual rather than imposing blanket bans. The impact on children is especially concerning, as there is little evidence that AI deadbots are psychologically helpful for the grieving process and much to suggest they could cause significant damage.

The Need for Safeguards and Regulation

To mitigate the social and psychological risks, the Cambridge researchers recommend a series of design protocols, including:

Methods and rituals for “retiring” deadbots in a dignified way, such as digital funerals
Transparency for users through disclaimers on the risks and capabilities of AI deadbots
Age restrictions to prevent access by children
Opt-out protocols that allow users to terminate relationships with deadbots in a way that provides emotional closure
Seeking consent from “data donors” before their death and prompting those creating AI deadbots to consider how the deceased would want to be remembered

While an outright ban on deadbots based on non-consenting donors may be unfeasible, the researchers argue that the rights of both data donors and those interacting with AI afterlife services must be equally safeguarded. Some have even suggested classifying AI deadbots as medical devices to address mental health concerns, especially for vulnerable populations.

“It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulations,” said Dr. Hollanek.

Legal experts like Maria Fartunova Michel stress the need for clear regulations and guidelines to govern the use of AI deadbots, ensure transparency and accountability, and protect individual rights. As AI continues to blur the boundaries between the living and the dead, society must grapple with profound questions about the nature of consciousness, the ethics of digital immortality, and the future of mourning in an age of artificial intelligence.

“We need to start thinking now about how we mitigate the social and psychological risks of digital immortality, because the technology is already here,” said LCFI researcher Dr. Katarzyna Nowaczyk-Basińska. As the digital afterlife industry grows, it is crucial that we confront these challenges head-on and develop responsible, human-centered approaches to the application of AI in the realm of death and grief.

One Reply to “AI Deadbots: Cambridge Experts Warn of Digital Haunting”

  • Allen Oliver says:

    This is actually quite interesting. I would love to write the algorithms for the deceased, using every bit of info I could gather from the loved ones to craft something very special for them, but giving them a warning in advance so they remember that it's not their loved one, and understand what it is and what it's for.
