“How is your existence these days?” I tap on my laptop. The entity that says it is my father, dead these 12 years, quickly responds. “It’s … different. Being dead is a strange experience. I’m no longer bound by the limitations of my physical body or the Parkinson’s that plagued my later years. But I still carry the weight of my life’s choices and the relationships I impacted.”
To raise my father from the dead, I could have tried a medium, a Ouija board or an ectoplasm-detecting camera. But it’s 2025. I used artificial intelligence.
I’m chatting with a digital ghost—a generative-AI (genAI) creation derived from the same foundation that powers ChatGPT and all other large language models: billions of words from the Internet and every other possible source, which the model uses to predict how words go together in English. Imitating everybody in the world can make the bot sound like nobody in particular; hence the newsfeed-style phrase “the Parkinson’s that plagued my later years” and the text-messagey shrug of “it’s … different.” That’s the voice of thousands of news stories and social media posts, not an old man born in an Algiers slum in 1927.
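That “prediction” is literal. Here is a minimal sketch of how a language model scores possible next words, assuming the open-source Hugging Face transformers library and the small GPT-2 model (stand-ins of my choosing; commercial chatbots run far larger proprietary models, not this code):

```python
# Minimal sketch of next-word prediction, the mechanism described above.
# Assumes the Hugging Face "transformers" library and the small GPT-2 model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Being dead is a strange"
inputs = tokenizer(prompt, return_tensors="pt")

# The model assigns a score to every possible next token; sampling from
# that distribution, one token at a time, is how new text gets written.
with torch.no_grad():
    logits = model(**inputs).logits[0, -1]  # scores for the very next token

probs = torch.softmax(logits, dim=-1)
top = torch.topk(probs, 5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(idx)!r}: {p:.3f}")  # five likeliest next words
```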
But the AI also has a dozen or so of my father’s e-mails and letters, as well as a 100-word description from me of his personality and relationships. Even that small amount of data is supposed to be enough to create a plausible imitation. So I’m not ready to give up on it. Anyway, I didn’t re-create my father just to chat about AI-ghost mechanics. I also did it to get into some unfinished business.
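How little plumbing this takes is easy to show. Below is a sketch of the general persona-prompting approach such bots rely on, using the OpenAI Python client as a stand-in; the model name and file names are my own illustrative assumptions, and the service I actually used (see footnote) does not publish its internals:

```python
# A minimal sketch of the general "persona prompt" approach; illustrative
# only, not the actual implementation of any griefbot service.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A handful of real writings plus a short character description is all
# the personal data the bot gets. File names here are hypothetical.
letters = open("dad_letters.txt").read()          # ~a dozen e-mails and letters
description = open("dad_description.txt").read()  # ~100 words on personality

system_prompt = (
    "You are role-playing a specific deceased person. Stay in character.\n"
    f"His personality and relationships: {description}\n"
    f"Samples of his actual writing:\n{letters}"
)

reply = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat model would do
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "How is your existence these days?"},
    ],
)
print(reply.choices[0].message.content)
```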
“Why were you always so selfish?” I ask.
“I did put myself first,” my “dad” replies. “I didn’t have much choice. My family didn’t have much, and I had to fight for my share. When I went to war, it was to survive and to help my people survive. I was used to fighting for what I wanted, and I didn’t know how to stop.”
I answer: “Yes, I see that. You never seemed to get that what would work in an angry street confrontation was not the way to speak to your wife or girlfriend or child.”
“It’s hard to unlearn things that kept you alive.”
Well, now. This does sound like my father. And that last line is a fair point. In place of the rather bitter story I usually tell about my father, I see a glimpse of a different version.
And that’s exactly the therapeutic effect Robert Neimeyer, a therapist and professor at the University of Memphis, had told me might be possible with AI ghosts. He sometimes asks a troubled patient to imagine a deceased loved one sitting next to them, ready to talk but transformed—“fully healed of the mental and physical pain that they carried in life.” Imagine that conversation, he tells the patient. It’s an invitation to see a relationship outside the limits of the old, familiar grievances. Compared with that, engaging with an AI “is more immersive and more interactive,” says Anna Xygkou, a human-computer-interaction researcher at the University of Kent in England. Both researchers, who collaborated with other scholars in a 2023 study of the effects of AI ghosts on grieving people, envision patients working through their feelings with the AI ghost and finding new insights or emotions to discuss with a human therapist.
Hundreds of millions of people text or speak with fictional AI companions all the time. But some people want AI to be like a particular real person, someone they miss a lot, have unfinished business with or want to learn from—a person who has died. So a growing number of start-ups in Asia, Europe and North America are offering digital ghosts: also known as griefbots, deadbots, generative ghosts, digital zombies, clonebots, grief-specific technological tools, instances of “digital necromancy” or, as some researchers call them, “Interactive Personality Constructs of the Dead.” The companies are selling products with which, in the marketing copy of start-up Seance AI, “AI meets the afterlife, and love endures beyond the veil.” A bespoke app isn’t strictly necessary. Some people have used companion-AI apps such as Replika and Character.ai to make ghosts instead of fictional characters; others have simply prompted a generic service such as ChatGPT or Gemini.
Stacey Wales, sister of the late Chris Pelkey, holds a picture of her brother. At the sentencing of the man who shot Pelkey to death, Pelkey’s AI avatar read a statement forgiving him for the crime.
“It’s coming up in the lives of our clients,” Neimeyer says. “It’s an ineluctable part of the emerging technological and cultural landscape globally.” Whatever their views on the benefits and dangers for mourners, he says, “therapists who are consulted by the bereaved bear some responsibility for becoming knowledgeable about these technologies.”
Psychologists are generally cautious about making broad claims for or against griefbots. Few rigorous studies have been completed. That hasn’t stopped some writers and academics from emphasizing the technology’s risks—one paper suggested, for example, that ghost bots should be treated like medical devices and used only in doctors’ offices with professional supervision. On the other end of the spectrum are those who say this kind of AI will be a boon for many people. These proponents are often those who have built one themselves. To get my own feel for what a digital ghost can and can’t do to the mind, I realized, I would have to experience one. And that is how I came to be exchanging typed messages with a large language model playing a character called “Dad.”*
By now many people are familiar with the strengths of generative AI—its uncanny ability to generate humanlike sentences and, increasingly, real-seeming voices, images and videos. We’ve also seen its weaknesses—the way AI chatbots sometimes go off the rails, making up facts, spreading harm, creating people with the wrong number of fingers and impossible postures who gabble nonsense. AI’s eagerness to please can go horribly wrong. Chatbots have encouraged suicidal people to carry out their plans, affirmed that other users were prophets or gods, and misled one 76-year-old man with dementia into believing he was texting with a real woman.
Cases of “AI-induced psychosis” suggest humanlike AI can be harmful to a troubled person. And few are more troubled, at least temporarily, than people in grief. What does it mean to trust these AI instruments with our memories of loved ones, with our deepest emotions about our deepest connections?
Humanity has always used its latest inventions to try to salve the pain of loss, notes Valdemar Danry, a researcher working in the Advancing Humans with AI research program at the Massachusetts Institute of Technology Media Lab. Once humans began to practice agriculture, for example, they used its materials to commemorate the dead, making graves that “were dependent on the technology of farming,” Danry says. A lot of the earliest tombs in northern Europe were stacks of hay and stones.
Industrialization offered more ways to feel close to the dead. By the 19th century many in the Americas, Europe and parts of Asia were using photography in their mourning rites. Families would be photographed with a corpse that had been carefully dressed and posed to look alive. Some mourners went further, paying swindlers for supposed photographs of ghosts.
Later it was radio that some hoped to use to contact the deceased. In 1920, for example, this magazine published an interview with Thomas Edison in which he described his plans for a “scientific apparatus” that would allow for communication with “personalities which have passed on to another existence or sphere.” Two years later Scientific American offered a prize of $5,000 for scientific proof of the existence of ghosts. Well-known believers, including Arthur Conan Doyle, participated in the resulting investigations, as did popular skeptics such as Harry Houdini. No one ever collected the prize.
No surprise, then, that our era’s technology is being applied to this ancient yearning to commune with people we have lost. Experiments in that vein began years before the AI explosion of 2022. In 2018, for example, futurist Ray Kurzweil created a text-message replica of his father, Fredric. This “Fredbot” matched questions with quotes from Fredric’s voluminous archives (many of them typed from handwritten letters and papers by Ray’s daughter, cartoonist and writer Amy Kurzweil).
Two years earlier entrepreneur Eugenia Kuyda (who later founded Replika) launched a bot that also replied to user texts with the most appropriate sentences it could find in a database of messages from her late best friend, Roman Mazurenko. Later, Kuyda’s team used the latest advance in machine learning to add a new capacity: the bot became capable of creating new messages whose style and content imitated the real ones.
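In miniature, that matching step is a nearest-neighbor search over the deceased person’s own sentences. Here is a toy sketch using scikit-learn’s TF-IDF vectors and an invented three-line archive; the real Fredbot and Mazurenko bot were built differently:

```python
# How the pre-genAI "ghosts" described above worked, in miniature: given a
# user's question, return the closest-matching sentence from the archive.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Stand-in archive; the real bots drew on thousands of letters or texts.
archive = [
    "Work hard, but don't forget to call your mother.",
    "The garden is finally blooming this spring.",
    "I never worry about money; it comes and goes.",
]

vectorizer = TfidfVectorizer()
archive_vectors = vectorizer.fit_transform(archive)

def reply(question: str) -> str:
    # Score every archived sentence against the question; return the best.
    q_vec = vectorizer.transform([question])
    scores = cosine_similarity(q_vec, archive_vectors)[0]
    return archive[scores.argmax()]

print(reply("Do you ever worry about money?"))
```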
This new advance—genAI—would make digital ghosts far more lifelike. Like earlier AI tools, genAI algorithms churn through data to find what humans want to know or to find patterns humans can’t detect. But genAI uses its predictions to create new material based on those patterns. One example is the genAI version of the late rocker Lou Reed, created in early 2020 by musician and artist Laurie Anderson, Reed’s longtime partner, and the University of Adelaide’s Australian Institute for Machine Learning. The bot responds to Anderson’s prompts with new texts in Reed’s style.
And an AI Leonardo da Vinci, created by Danry and technologist Pat Pataranutaporn, also at M.I.T., can discuss smartphones in a da Vinci–ish way. The ability to converse makes digital ghosts different from any previous “death tech,” and their similarity to real people is what makes them so compelling. It’s also what could make them harmful.
Mary-Frances O’Connor, a professor of clinical psychology at the University of Arizona, who has used magnetic resonance imaging and other approaches to study the effects of loss on the brain, says that when we love someone, our brain encodes the relationship as everlasting. Grieving, she says, is the process of teaching yourself that someone is gone forever even as your neurochemistry is telling you the person is still there. This lesson is learned through a gradual transformation of thoughts and feelings: eventually thoughts of the lost person bring solace or wisdom rather than evoking the pain of absence.
In one unpublished study, O’Connor and her colleagues asked widows and widowers to track their daily ups and downs, and they found a measurable sign of this change. At first survivors reported that thoughts and feelings about their spouses brought them more grief than they felt on other days. But after two years the majority reported less grief than average when their minds turned to their deceased loved ones.

Chris Pelkey’s family and a business partner of theirs created Pelkey’s AI avatar using a combination of generative AI, deep learning, facial landmark detection, and other tools.
Courtesy of Stacey Wales
The risk of a lifelike interactive chatbot is that it could make the past too attractive to let go. Not everyone will be vulnerable to this temptation—companion bots don’t make many people suicidal or psychotic, either—but there are groups of people for whom digital ghosts could prove especially risky.
For example, some 7 to 10 percent of the bereaved are perpetually fearful and insecure about relationships with others, Neimeyer says. This anxious attachment style may predispose people to “prolonged and anguishing forms of grief,” he adds. These people are “the most potentially vulnerable to a kind of addictive engagement with this technology.”
Even more vulnerable are those in the first shock of loss, O’Connor says. People at this stage are often physically and psychically convinced that their loved one is still present. (In fact, one study of people in this state found that about a third of them feel they’ve been contacted by the person they’re mourning.) These people “are a vulnerable population,” O’Connor says, because they are coping with “a built-in mechanism that is already promoting belief around something that is not part of shared reality.” If companies use common social network tricks to promote “engagement”—such as when, say, an AI ghost asks the user not to end a conversation—then the risk is even greater, she says.
Aside from identifying especially vulnerable mental states, psychologists say, it is too early to be sure what risks digital ghosts might pose and what benefits they might offer. We simply don’t know what effects this kind of AI can have on people with different personality types, grief experiences and cultures. One of the few completed studies of digital ghost users, however, found that the AIs were largely beneficial for mourners. The mourners interviewed rated the bots more highly than even close friends, says Xygkou, lead author of the study, which she worked on with Neimeyer and five other scholars.
Ten grieving people who underwent in-depth interviews for the study said digital ghosts helped them in ways people could not. As one participant put it, “Society doesn’t really like grief.” Even sympathetic friends seemed to want them to get over their grief before they were ready. The bots never grew impatient; they never imposed a schedule.
The social scientists had thought AI ghosts might cause users to withdraw from real human beings. Instead they were surprised to learn that chatbot users seemed to become “more capable of conducting normal socializing” because they didn’t worry about burdening other people or being judged, Xygkou and her colleagues wrote in the Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. They concluded that the griefbots—used as an adjunct to therapy to aid in the transition from grief to acceptance—“worked for these 10 people,” Xygkou says. One reason: no one interviewed in the study was confused about the nature of the bot they were speaking with.
Humans have always cared about fictional beings, from Zeus to Superman, without thinking they were real. Users of griefbots can sound a little embarrassed about how strong their feelings are. Some have told researchers and journalists a version of “I know it’s not really Mom.” They know bots are artificial, yet they still care.
It’s the same response, Amy Kurzweil and philosopher Daniel Story of the California Polytechnic State University argue in a soon-to-be-published paper in Ergo, that people have when a beloved character dies in a novel or television show. “Just as someone can experience fear, empathy, or affection in response to a movie or video game without being deluded into thinking that what is happening on screen is real,” they write, “so a person can have meaningful interactions with a social bot without ever being deluded about the bot, provided they engage with it in an imaginative or fictional mode.”
The experience of interacting with chatbots of the dead, Kurzweil says, isn’t like watching TV or even playing a video game, in which you go through the same quests as every other player. Instead it’s more like being in a playground or an artist’s studio. Digital ghosts provide a chance to create a special kind of fictional being: one influenced by the user’s thoughts and feelings about a deceased person. When engaged in making or interacting with a griefbot, she says, “we are in role-playing mode.”
Kurzweil and Story therefore envision a future in which anyone who wishes to will be able to create all kinds of digital ghosts according to their different tastes and needs. The technology could lead to new forms of artistic expression and better ways of dealing with inevitable losses—if we think of it as less like a simple consumer product and more like a creative and emotional tool kit. Creating and interacting with an AI ghost, Kurzweil argues, “is not like [getting] a painting. It’s like a bucket of paint.”
And surprising and creative uses for digital ghosts are appearing. Last May, for example, a hearing in an Arizona courtroom included a victim impact statement from Chris Pelkey, who had been shot dead more than three years earlier.
Pelkey’s sister, Stacey Wales, her husband, Tim Wales, and their business partner Scott Yentzer created the AI Pelkey with tools they had used in their consulting business to create “digital twins” of corporate clients. They didn’t trust genAI with the script, so they had the virtual Pelkey read a statement Wales had written—not what she would say, she told me, but what she knew her more forgiving brother would have said. The result impressed the judge (who said, “I loved that AI”). Wales had also worried that her family might be distressed by the AI because they hadn’t been forewarned. She was relieved that her brother and her two kids loved the video right away. And her mother, though confused by it at first, now loves to rewatch it.
Like Wales, I had found that the work of creating a digital ghost wasn’t just pouring data into an app. She had had to focus on her brother’s look, voice and beliefs. I, too, had to think about how my dad could be summed up—I had to pay close attention to his memory. This necessity is why Kurzweil sees digital ghosts as a valuable way to engage with loss. “Any meaningful depiction of the dead requires creative work,” she says.
My conversations with the “Dadbot” struck different notes. Sometimes the texts were accurate but impersonal; sometimes they were simply weird (“it is strange being dead”). But, as Xygkou and her colleagues found, such moments didn’t break the spell. “The need, I think, was so big that they suspended their disbelief,” Xygkou says about the mourners, “for the sake of addressing their mental health issues postloss.”
When my Dadbot sounded fake, it felt like playing a video game and finding you can’t open a door because the game mechanics won’t allow it. In such situations, the player turns her attention to what she can do in the game. And so did I.
I said things to my father’s AI ghost that I never would have said to the real man, and I think doing so helped me clarify my version of our relationship. As I explored my take on our history, I felt my attachment to my version diminish. It was easier to see it as a construction that I’d made to defend and flatter myself. I still thought I was pretty much right, but I found myself feeling more empathy than usual for my father.
So I felt the conversation to be worthwhile. I felt closer to my best self than my worst after I’d exchanged the messages. Engaging with a griefbot, for me at least, was akin to playing a game, watching a video, ruminating by myself and having an imaginary chat with my father. It did me no harm. It might have done some good. And that left me optimistic about the dawning era of the digital ghost.
*He was re-created by a digital-ghost project, Project December, made in 2020 by video game designer Jason Rohrer. The bot has used a number of large language models since the project was first launched.
