
AI is reviving the dead — and trapping the living in emotional limbo

Hearing a loved one’s voice at 2 a.m. feels like a miracle. Until you can’t turn it off.

A few months ago I was chopping onions in our tiny São Paulo kitchen while a podcast murmured in the background. The host played a clip of a woman “talking” to her late father through an AI memorial app. The voice was warm and familiar. The rhythm was almost right. I felt my shoulders tense. Part awe, part unease. As a mom and a daughter who lives far from her family, I understand why people reach for any tool that promises one more conversation. I also know how quickly comfort can turn into a loop you cannot exit.

The cultural moment has caught up. Mainstream outlets are publishing stories about AI re-creations of the dead and the people using them for closure, comfort or curiosity. GQ recently profiled this trend, tracing how tools once reserved for celebrities are now cheap and accessible to anyone with a smartphone. The piece is a useful snapshot of where we are and where we could be headed.

As a reporter at heart who lives for the details, I wanted to find the roots. Who is actually building this? What does the research say about grief and mental health? Where is the law trying to keep up? So I dug in.

What this technology really does

Behind the soft lighting and heartfelt ads is a simple recipe. Feed an AI system a person’s digital trace. Texts, emails, social posts, video, voice. The system learns patterns. Then it generates plausible responses or even a talking avatar. This is not science fiction. It is already in the wild.
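If you are curious how thin the trick really is, here is a minimal sketch, assuming access to OpenAI’s chat API. The model name, the sample messages and the persona prompt are all invented for illustration; real products layer voice cloning and safety features on top, but the core loop looks roughly like this.

```python
# A minimal sketch of a "memorial chatbot": prompt a general-purpose
# LLM with a person's writing samples so replies imitate their style.
# Hypothetical example data; any chat-capable model would work similarly.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# In a real product this "digital trace" would be thousands of texts,
# emails, and transcripts, not three invented lines.
samples = [
    "Don't forget to soak the beans tonight, querida.",
    "Saw a hummingbird this morning and thought of you.",
    "Call me Sunday. I want to hear everything.",
]

persona = (
    "You are imitating the writing style of a specific person, "
    "based only on the sample messages below. Stay in that voice.\n\n"
    + "\n".join(f"- {s}" for s in samples)
)

def reply(user_message: str) -> str:
    """Generate one in-style response to the user's message."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": persona},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(reply("I miss you. How do I make your feijoada again?"))
```

Notice what is missing from that loop: memory, consent, an off switch. Everything that makes these products feel like presence, or like a trap, is a design decision layered on top.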

Project December, made by game designer Jason Rohrer, was one of the first services to popularize this in 2020 and 2021, when journalist Jason Fagone documented a young man’s attempt to chat with his deceased fiancée using the system. The man described both relief and emotional strain after long conversations with the bot. It was moving and messy, which is exactly how grief feels even without AI in the room.

Other companies package the same idea in different forms. StoryFile records hours of video interview answers and lets viewers ask questions later, producing interactive conversations with living people and, eventually, with their memorials. William Shatner recorded one and talked publicly about the experience. HereAfter AI focuses on voice-first memory chats for families. It markets a free trial and positions itself as a gentle archive that talks back.

Some projects push the boundary in another direction. In South Korea, the TV documentary Meeting You used virtual reality to stage a reunion between a mother and her seven-year-old daughter, who had died. The short film triggered strong reactions around the world. Many viewers cried. Many asked whether it was healing or harmful. That question has not gone away.

Why it attracts us

Grief is not a straight line and it does not end on a schedule. Modern bereavement research even has a name for the healthy part of staying connected to the people we have lost. It is called the continuing bonds model. Instead of forcing detachment, the model suggests we can integrate the relationship in new ways. That can look like rituals, letters, a recipe you keep making, or the way your grandmother’s sayings live in your kitchen.

AI memorials are a tempting update to that idea. They are interactive. They fill silences. They answer back at 2 a.m. when your brain will not slow down. For an adult child living on another continent like me, the fantasy of hearing a parent’s voice whenever you want is powerful. The promise is intimacy on demand.

Where the experts are drawing the line

Researchers are starting to map the risks. A team at the University of Cambridge’s Leverhulme Centre for the Future of Intelligence warned in 2024 that griefbots could cause psychological harm and even create a sense of digital haunting if designers are careless. They recommend guardrails like age limits, clear labels and an actual way to retire a bot when a family is ready to stop. “This area of AI is an ethical minefield,” said co-author Katarzyna Nowaczyk-Basińska. “The rights of both data donors and those who interact with AI afterlife services should be equally safeguarded.”

Her colleague Tomasz Hollanek added that companies should consider rituals for ending an AI relationship with dignity, even suggesting a “digital funeral.” It sounds symbolic until you imagine your phone pinging you in the voice of your mother months after you tried to move on. A real off switch matters.

This is not the only warning on paper. In 2018, ethicists Carl Öhman and Luciano Floridi proposed an ethical framework for the digital afterlife industry in Nature Human Behaviour, noting that the web is already crowded with the remains of its departed users and that we need norms to protect dignity, not just utility. That argument aged well as generative AI made simulation easier and cheaper.

Consent is the fault line

The living may want a simulation. The deceased may not. China’s deep synthesis rules, which took effect in January 2023, try to get ahead of this by forcing providers to label synthetic content and secure consent when editing biometric data like faces and voices. It is not a perfect system, but it recognizes the baseline that a person’s voice and image are not a free-for-all after death.

In the European Union, the new AI Act includes transparency obligations for synthetic media. A Code of Practice is being drafted to help with labeling requirements across the bloc. This will not solve grief. It might reduce confusion when an AI avatar appears in your feed without context.

Then there is the corporate imagination. Microsoft even filed a patent for a chatbot system trained on the social data of a specific person, including the deceased. A patent is not a product, but patents signal what companies are considering. When money and identity meet, regulation and bright lines become important.

The human consequences no one can outsource

I think about two kinds of harm. The first is obvious. If you are in acute grief and you spend hours a day chatting with a simulation, your world might shrink. Sleep suffers. Work suffers. Your real relationships shift around the time you spend with a ghost that never tires. GQ’s reporting surfaced both relief and the risk of dependency in families who tried it. The feelings are real. The bots are not.

The second kind is quieter. It is the way AI can invent new “memories.” When a model generates a novel story in your father’s style or answers a question he never faced, the line between memory and fiction blurs. In one scenario the Cambridge team sketched, a bot begins pushing ads in a dead relative’s voice after a free trial ends. That is not a plot from Black Mirror. It is a realistic product roadmap if we do not push back.

Even feel-good experiences can snag you. The South Korean VR reunion was created with the mother’s consent and a careful team. The moment still lands in your body. Watching it, I could feel why some families would say yes. I could also feel how easily a yes can become emotional limbo if you cannot find a natural end.

What I tell myself before I click “agree”

I am not a purist about technology. I love tools that save time and keep my family close across three countries. But anything that promises to erase the hard part of being human deserves a pause. Here is the checklist I use, as a daughter, a mother and a person who likes clear routines.

First, consent. Would this person have wanted to be simulated? Did we ever talk about how they wanted to be remembered? If the answer is no, I hold back. Researchers emphasize making consent mutual. The data donor needs a say. So do the people who will interact with the bot.

Second, labels and limits. If a service cannot promise clear labeling and a simple off switch, I am out. Cambridge’s team explicitly calls for both. They even suggest rituals for retiring a bot so that users can step out with closure. That is practical wisdom disguised as design.

Third, purpose and timing. In our house we have a rule about late night decisions. Do not make them. The same applies here. If I ever try a memorial app, it will be in short windows and never in the middle of the night when my mind is most fragile. I will tell my husband and a friend so they can remind me to take breaks.

Finally, the human work. The continuing bonds model did not need AI to help people remember and integrate love. That work looks like recipes, photo albums, stories told to a child in a bath. It looks like weekly dinners, long walks, and the way we imitate our elders without noticing. AI can supplement those bonds. It should not replace the living parts of them.

A realistic path forward

We can hold two ideas at once. These tools can comfort. These tools can harm. The difference often sits in design choices and our own boundaries. When I read the Cambridge recommendations and the older ethical frameworks, I do not hear academic distance. I hear care. The message is simple. Build for consent. Label what is synthetic. Respect the dignity of the dead. Protect the well-being of the living. As Nowaczyk-Basińska put it, “We need to start thinking now about how we mitigate the social and psychological risks of digital immortality, because the technology is already here.” That sentence is the nudge I needed to write this piece.

The news is not that AI can mimic a loved one. The news is that our choices in 2026 will shape what that imitation does to our families. I want my daughter to grow up with honest stories about her grandparents, not ads in their voices. I want my community to grieve fully and still return to life. Tools are welcome if they help us do that. They do not get to decide for us.

 


Ainura Kalau

Ainura was born in Central Asia, spent over a decade in Malaysia, and studied at an Australian university before settling in São Paulo, where she’s now raising her family. Her life blends cultures and perspectives, something that naturally shapes her writing. When she’s not working, she’s usually trying new recipes while binging true crime shows, soaking up sunny Brazilian days at the park or beach, or crafting something with her hands.
