From voice clones to digital avatars, artificial intelligence (AI) is providing new methods for digitally preserving loved ones. However, this technology also raises significant concerns about data privacy, consent, and its potential impact on how we mourn.
Diego Felix Dos Santos never thought he’d hear his late father’s voice again, but AI made it possible. “The tone of the voice is pretty perfect,” he says. “It feels like, almost, he’s here.”
After his father’s sudden death last year, Dos Santos, 39, traveled to Brazil to be with his family. It wasn’t until he returned to his home in Edinburgh, Scotland, that he realized he “had nothing to actually remind [me of] my dad.” What he did have, though, was a voice note his father had sent him from his hospital bed.
In July, Dos Santos used that voice note with the help of Eleven Labs, an AI-powered voice generator platform. For a monthly fee of $22, he uploaded the audio and created new messages in his father’s voice, simulating conversations they never had.
“Hi son, how are you?” his father’s voice says from the app, just as it would on their regular weekly calls. “Kisses. I love you, bossy,” the voice adds, using the nickname his father gave him as a boy. Although Dos Santos’ religious family initially had reservations about his use of AI to communicate with his father, he says they have since become more accepting of his choice. Now, he and his wife, who was diagnosed with cancer in 2013, are considering creating AI voice clones of themselves as well.
Dos Santos’ experience is part of a growing trend in which people use AI not just to create digital likenesses, but to simulate the deceased. As the technology becomes more personal and widespread, experts are cautioning about its ethical and emotional risks. These concerns range from questions of consent and data protection to the commercial interests fueling the technology’s development.
The market for AI technologies designed to help people with loss, often called “grief tech,” has expanded rapidly in recent years. This industry includes US startups like StoryFile (an AI-powered video tool for posthumous playback) and HereAfter AI (a voice-based app that creates interactive avatars of deceased loved ones). These technologies are marketed as a way to cope with, and potentially even prevent, grief.
In 2024, after his father died, Robert LoCascio founded Eternos, a Palo Alto-based startup that helps people create AI digital twins. Since then, more than 400 people have used the platform to create interactive AI avatars. Subscriptions for a legacy account, which keeps a person’s story accessible to loved ones after their death, start at $25.
Michael Bommer, an engineer and former colleague of LoCascio’s, was one of the first to create a digital replica of himself on Eternos after receiving a terminal cancer diagnosis. LoCascio says Bommer, who died last year, found a sense of peace in leaving a part of himself behind for his family. His wife, Anett Bommer, who lives in Berlin, Germany, told Reuters that the AI “captures his essence well.” She added, “I feel him close in my life through the AI because it was his last heartfelt project and this has now become part of my life.”
Alex Quinn, the CEO of Authentic Interactions Inc, the parent company of StoryFile, says the goal of this technology is not to create digital ghosts but to preserve people’s memories while they can still share them. “These stories would cease to exist without some type of interference,” Quinn says. While acknowledging the obvious limitations of AI clones—they won’t know the current weather or who the president is—he believes the results are worthwhile. “I don’t think anyone ever wants to see someone’s history and someone’s story and someone’s memory completely go.”
One of the most significant concerns surrounding grief tech is consent. What does it mean to digitally recreate someone who has no control over how their likeness is used after they die? Some companies, such as Eleven Labs, allow people to create posthumous digital likenesses of their loved ones. However, others are more restrictive. LoCascio from Eternos, for example, says their policy prevents them from creating avatars of people who cannot give consent. They enforce this with checks, including requiring users to record their voice twice. “We won’t cross the line,” he says. “I think, ethically, this doesn’t work.”
In 2024, AI ethicists at Cambridge University published a study calling for safety protocols to address the social and psychological risks posed by the “digital afterlife industry.” Katarzyna Nowaczyk-Basińska, a researcher at Cambridge and a co-author of the study, says that commercial incentives often drive the development of these technologies, making transparency about data privacy essential. “We have no idea how this (deceased person’s) data will be used in two or 10 years, or how this technology will evolve,” Nowaczyk-Basińska says. She suggests that consent should be treated as an ongoing process that is revisited as AI capabilities change.
Beyond consent and data privacy, some experts also worry about the emotional toll of the technology. Could it hinder a person’s ability to process grief?
Cody Delistraty, author of “The Grief Cure,” warns against the idea that AI can offer a shortcut through mourning. “Grief is individualized,” he says, noting that people can’t put it through the “sieve of a digital avatar or AI chatbot” and expect to “get something really positive.”
Anett Bommer says she didn’t rely on her husband’s AI avatar in the early stages of her grief, but she doesn’t believe it would have had a negative effect if she had. “The relationship to loss hasn’t changed anything,” she says, adding that the avatar “is just another tool I can use alongside photos, drawings, letters, notes” to remember him.
Andy Langford, the clinical director of the UK-based bereavement charity Cruse, says that while it’s too soon to draw conclusions about AI’s effect on grief, it’s important that those using this technology don’t “get stuck” in their mourning. “We need to do a bit of both—the grieving and the living,” he says.
For Dos Santos, using AI was not about finding closure; it was about seeking connection. “There’s some specific moments in life… that I would normally call him for advice,” he says. While he knows AI can’t bring his father back, it provides a way to recreate the “magical moments” he can no longer share.