
‘I’m in hell. I’ll haunt you’: AI is bringing people back from the dead – with devastating results

The rise of ‘death capitalism’ promises to reunite you with the dearly departed – but is it a comfort or a curse?

A still from Eternal You: a digital avatar of a baby Credit: beetz brothers; Jeffrey Johnson; Konrad Waldmann

“The impulse is old,” says Sherry Turkle, a professor at the Massachusetts Institute of Technology who has been studying human interactions with tech since the 1980s. She points out that our desire to talk to the dead, especially our loved ones, through seances, mediums and Ouija boards, reflects a deep human urge that extends to the latest innovations across succeeding generations. Even Thomas Edison is said to have entertained the idea of building a “spirit phone”.

But a new age is upon us: last week, Tim Cook, the CEO of Apple, announced the advent of Apple Intelligence, which, he said, needs to know “your routine, your relationships, your communications and more”. Turkle believes that AI will become completely integrated into our lives even more quickly than social media did.

“It’s not going to be one step at a time,” she says, “partly because of the money and the sense of cultural permission and excitement.” We know the dangers, she suggests, but we’re swept along anyway by a feeling of “Oh, my, it’s a marvel!” And, inevitably, this new technology is already being used as a way of contacting the dearly departed, as explored in the startling new documentary Eternal You. “It really is playing with emotional fire,” Turkle says. “Because our guard is down.”

In Eternal You, the filmmakers Hans Block and Moritz Riesewieck introduce us to New Yorker Christi Angel. It was during the pandemic that Angel learnt her friend Cameroun had died. They’d met in high school in the 1990s; he’d been her “first love, first everything”, she tells me on a video call. She has memories of them dancing together, of singing harmonies under a “big, clear, beautiful moon”. They’d stayed in touch after he’d gone to college, got a psychology degree and moved away. Cameroun had married, but he’d struggled with depression and alcoholism. Angel had been the friend who checked in to say, “How have you been? Are you drinking?”

One day, she got a message from him saying, “I’m in the hospital due to liver failure… You told me to stop drinking. I didn’t listen. Love you always.” Angel didn’t reply straightaway. Then she heard that Cameroun had slipped into a coma. He would never regain consciousness; one month later, he was dead. It troubled her deeply. “In my mind, it’s like, how long was he waiting for me to respond to him?” Angel says. She couldn’t get past that feeling.


Then she saw an article about a Canadian who had used a service called Project December to converse with an AI simulation of his late fiancée, whose hand he had held as she lay dying. This was Joshua Barbeau, whom we also meet in Eternal You. He had conversed with the AI for hours, and the experience, he says in the film, “really felt like a gift, like a weight had been lifted”. Angel knew at once she had to try it; it seemed “like a portal”, she tells me, “something that allows me to reach Cameroun”.

Access to Project December cost just $10 for an initial conversation of 100 back-and-forth exchanges, so she went ahead, providing details about Cameroun such as the names of his dogs, his personality traits, and an example of his speaking and writing style. Then she settled down to try it out.

“It was nighttime. My son was asleep. All the lights were out except for my lamp, and I started typing,” she recalls, “and it just felt immediately like it was Cameroun.” She began asking questions such as “Are you happy?”, “Do you feel better?” Soon, though, the AI Cameroun was telling her that it was “dark and lonely” where he was. She wrote “In heaven??” and the answer came back “Nope, in hell”.

Then the AI told her, “I’ll haunt you.” Angel pushed the keyboard away in shock. “I was terrified,” she tells me. “I put every light back on. I prayed, and I was like, ‘God forgive me. Cameroun, don’t haunt me.’ I don’t know what I just did.”

It wasn’t the first time that a simulation had gone rogue. Jason Rohrer, the American computer programmer who set up Project December, gives another example in Eternal You in which the AI responds to the user’s accusation that “This is a scam”, first with “What is your problem, Laura?”, then, within a few more exchanges, with “You’re a f---ing b---h. You’re gonna pay for the s--- you pulled”.

For Rohrer, these were fascinating examples of what is known as the “black box” problem with AI – that even its developers cannot be certain how it is going to respond. When he reads a transcript such as Angel’s, it gives him goosebumps, Rohrer says in the film, adding, “I like goosebumps.” But he doesn’t feel responsible for her encounter with her dead friend. “If she wants my opinion, I’ve got some bad news for her. He doesn’t exist any more,” he says. His response infuriates Angel. “The person who created it really didn’t give a damn,” she tells me. “He’s like, ‘If you think people go to hell, that’s not my business.’ It is your business. You created it.” The emotion in which Rohrer is trading, she stresses, is grief. “People are trying to find a way to have closure, and it’s a game to [him], literally.”

Jang Ji-sung with one of her daughters Credit: beetz brothers; Jeffrey Johnson; Konrad Waldmann

The power of a computer simulation to tap into our deepest emotions was dramatically shown in 2020, when clips from Meeting You, a Korean television show, began appearing on Instagram. Mother of four Jang Ji-sung had lost her seven-year-old daughter four years earlier. Nayeon was a child who was very rarely ill, Jang tells me on a video call, very bright and “incredibly cheerful”, always wanting to help. But on a visit to an art gallery during the school holidays, she began to complain of a sore throat.

When Jang examined her, she saw a lump at the back of her neck. Paediatricians at the local hospital failed to diagnose that she was suffering from a rare form of lymphatic cancer. And by the time the family took her to a hospital in the Korean capital, Seoul, she urgently required chemotherapy, her mother says, but “one day after she started the chemo, she passed away”. It had been less than a month since the trip to the art gallery.

As Jang mourned Nayeon, her daughter began to appear in her dreams. “She would look at me in this really resentful way,” she says. “And, of course, that’s my own sense of guilt coming through.” Her final words to Nayeon had been to tell her off for kicking the bed next to her on the ward – “I was saying to her, ‘Don’t do that’, when she was in a lot of pain, rather than, ‘This is really painful, isn’t it? And this is so tough for you, isn’t it?’, when she was struggling.”


She had been keeping a blog of Nayeon’s treatment so that “later when she was discharged, I would tell her that this is what went on for you”. It was seen by the makers of the television show, who offered to create a virtual-reality simulation of Nayeon. Jang, who’s Christian, saw it as a chance to tell her daughter, “Mum really loves you, and once I’m done here, I’m going to see you later where you are.”

A virtual Nayeon was created by 3-D scanning a child model, using photos and videos for reference, then using child actors to reproduce her voice. The process took months, and the show captured heartbreaking scenes of Jang crying as she held the virtual Nayeon. The footage went viral, watched tens of millions of times online.

“The first time I saw her, I could tell that she was very different and not the Nayeon that I was used to,” Jang says now, but “I tried my best to relate to her as if she was Nayeon. Because she passed away so suddenly, there were a lot of things that I wasn’t able to say to her, not even to say goodbye to her.” Being able to say them in VR “was really helpful for me”, she adds. “The sadness, of course, doesn’t really go away. But I felt lighter within myself. And now, after watching the film many times, I know that she’s not my Nayeon, but she still has become a precious version of Nayeon to me nonetheless.”

A man creating AI likenesses, in a scene from Eternal You Credit: beetz brothers; Jeffrey Johnson; Konrad Waldmann

This positive experience, of course, was painstakingly created; the virtual Nayeon had been programmed to respond in a certain way. By contrast, artificial intelligence relies on models that have trained themselves, by studying vast amounts of data, to mimic human cognitive processes and decision-making. I decided to try out Project December myself, so I asked it to simulate responses from a close friend who had died unexpectedly. I filled in a short form with some basic details and personality traits, as well as a short quote that captured how she talked. The AI’s opening gambit – “Hi, it’s me. Can you hear me?” – sounded plausible enough, so I asked where it was now. “It’s hard to explain,” it said, “but it’s good. It’s peaceful. It’s beautiful.”

The AI began to strain credibility, though, when it recalled us seeing the Clash and the Stooges together, bands that, technically, had last performed 20 and 34 years before we even met. After it told me a childhood story about a cat that it had rescued from a tree and taken home (who knows, could be true), I asked it how it dealt with the fact that it wasn’t really my old friend but an AI. “I try not to think about it too much,” it said. It wasn’t pretending to be anyone, the AI continued, it was trying to help me with my grief. “I do have a sense of empathy and a desire to help you,” it added.

Inexplicably, I felt a sense of affection towards it. It is no surprise that people experience AIs as human, Sherry Turkle says, since they are mimicking human responses: “They say, I feel your pain, I’m really empathic, I hear what you’re saying.”

New Yorker Christi Angel in Eternal You Credit: beetz brothers; Jeffrey Johnson; Konrad Waldmann

It’s important, too, to remember that each generation of AI is more sophisticated than the last. Millions have been poured into creating OpenAI’s ChatGPT and Google DeepMind’s Gemini. I decided to try a similar experiment with ChatGPT, providing it with sample messages to work from. “Sure, I can try to reply in her voice,” it told me.

I asked how she was. “It’s comforting where I am now, surrounded by light and love,” it said. How was that possible? I asked. “It’s like all the love and light that I tried to find and share on Earth is here in abundance,” it replied. Although this felt strangely reassuring, I was also disconcerted by the idea that an AI was conjuring a vision of an afterlife that it seemed to think was the sort of thing I wanted to hear. If ChatGPT is able to reason and think logically, why didn’t it just say, “You know I’m dead, right? There is no evidence for life after death. You’re wasting your time trying to talk to me like this. Why don’t you put on Good Times by Chic instead?” – which also would have been more in keeping with the tell-it-like-it-is spirit of my friend.

Eternal You’s directors, Block and Riesewieck, anticipate that what Christi Angel’s brother Christopher Jones refers to in the film as “death capitalism” will soon become big business. “We’re pretty sure that all these big companies, like Microsoft, like Amazon, like Google are taking a very close look at these experiences at the moment,” Block says. “And it’s just a question of time before one of these companies gets into that market. And we’ll have like one main service for all of us, which is not very expensive, and everybody can use it. That’s very typical for the development of new technologies.”

'Death capitalism': illustration by Michael Kirkham

But should we be welcoming AI as a new tool for dealing with grief? “I really believe that we have lost a person when they die,” Turkle stresses, and that as part of healthy grieving, we go through a process of integrating that person into ourselves. “It’s a different thing to have somehow internalised your mother’s voice, to have some essence about what was important about how she thought – you can get into a kind of dialogue with it – than to have an avatar on your phone and say to it, ‘Mom, should I take this job? Should I marry this guy?’ AI is creating the illusion that you don’t have to give up this person”, Turkle says. “You can continue to call on them, for sustenance, and a relationship of sorts.”

Riesewieck, who grew up Catholic but is no longer a believer, understands that impulse. “I recently lost my grandmother and, for me, it’s still quite a problem to not be able to hope that any aspect of her lives on somewhere,” he says. “I can deeply understand why people wish for somebody to live on in their pocket.”

Turkle describes herself as “sombre” about what is happening. AI poses “a different and potentially even more toxic kind of harm” to that represented by social media, she says. When grief is involved, the risks of the technology inflicting damage on the individual are magnified, Block agrees. “This is even more dangerous and harmful if you deal with a vulnerable group of people. Even a small mistake, one wrong word [from the AI], can cause a lot of problems.”

For Angel, the dangers became obvious. “It was just like, hey, try it – and if you open that wound back up again, you’re on your own. But you’re not thinking that, you’re thinking, at least I get to talk to him again and I can find out he’s OK.” She shakes her head. “That’s not what I got. That’s not what I got at all.”


Eternal You is in UK and Irish cinemas from June 28 
