This tech can resurrect lost loved ones, but the AI afterlife is far from paradise

Illustration depicting the digital afterlife industry showing a ghost sitting on a headstone and texting from the afterlife
(Image credit: Rael Hornby)

Good news, everybody. Immortality is within our grasp, and we have artificial intelligence to thank for it. Where once there was a dark, vacant slumber to look forward to, a retirement from life’s painstaking process of emotional erosion, now there’s an endless existence to face in the form of your figurative soul being snared inside an AI Pokéball.

It’s from your new digital abode that friends and family members who don’t know the meaning of the words inhumane, vulgar, or tasteless will have instant access to you for any given reason, from a heartfelt “I’ve missed you” to a “Can you summarize this email for me, please, long-dead Uncle Jimbo?”

Welcome to the vision of the digital afterlife industry, a ghoulish venture to profit from your grief with virtual doubles.

The digital afterlife industry: Dead on-demand

The digital afterlife industry sounds like a great Nine Inch Nails album or, at the very least, one of those new-fangled religions created solely for tax fraud. However, it’s a growing (and potentially predatory) industry that promotes the concept of people maintaining an online presence posthumously.

According to a 2018 University of Oxford research paper published in the journal Nature Human Behaviour, the digital afterlife industry (let’s call it DAI, pronounced “die”) offers services ranging from leaving behind a digital estate (like NFTs, websites, or social accounts), online memorial services (which offer a virtual space for grieving), and posthumous messaging services (which act like a digital “dead man’s switch”) to full-on re-creation services that use AI “deadbots” (or “griefbots”) to generate new content based on a deceased person’s former social behavior.

Illustration: how to build a deadbot/griefbot according to Microsoft, using social media posts, photos, and electronic messages.

According to a Microsoft patent, a surface-level scattering of your digital footprint is all it takes to begin life anew as a deadbot. (Image credit: Rael Hornby)

The latter option is the most disturbing, threatening to make your final moments on this mortal coil a transitional rebirth into the technological plane. According to a Microsoft patent titled “Creating a conversational chatbot of a specific person,” all it takes to bring your generative Gollum to life is a small sampling of images, social media posts, and electronic messages.

Worse still, there’s seemingly no opting in or out — once you die, your digital clone is anyone’s to lay claim to. As one digital afterlife service, Project December, puts it: “We can now simulate a text-based conversation with anyone. Anyone. Including someone who is no longer living.”

One minute your loving family is gathered around you with tears welling in their eyes, wishing you the best on your journey to the great beyond with gratitude and love. The next minute, you’re nothing more than a skinsuit for a smart speaker to don as orders are barked at your digital persona to play music. Your existence, still tethered to this realm, is reduced to acting as an amplifier for some pop starlet singing about the dampness of their genitals.

If there wasn’t a Hell already, congratulations, we’ve just invented it.

C.R.E.A.M: Cash Reanimates Everyone Around Me

Don’t forget about the “industry” aspect of the digital afterlife industry. You’ll be expected to pay for the pleasure of being greeted by visitors from beyond the veil. And we will. Which of us wouldn’t throw down a handful of dollars to share a few choice words with someone we feel we lost too soon?

If we’re willing to cough up cash for pretend digital card packs in Marvel Snap, we’ll fork over five bucks to tell a fallen friend about our terrible week. While we typically think of an afterlife as a place of pure intent and rest, an industry exists to make money. And money it shall make, regardless of how psychologically harmful it could be to the end user, as one study published in the journal Philosophy & Technology suggests.

In China, people spend 20 yuan (~$2.77) a pop to use AI to resurrect loved ones as virtual avatars around the annual Qingming Festival, or Tomb Sweeping Day. The previously mentioned Project December sits behind a $10 paywall, and similar apps like StoryFile and HereAfter also seek to monetize the experience.

Inevitably, services like this will become freemium as these companies make you and your data the product they cash in on instead. You’ll be training your own AI ghost without even realizing it, with your data being harvested during every conversation you take part in. (All to serve you relevant advertisements.)

Illustration of a vampire with a quote above it reading "While we typically think of an afterlife as a place of pure intent and rest, an industry exists to make money. And money it shall make, regardless of how psychologically harmful it could be to the end user."

(Image credit: Rael Hornby)

Of course, stuffing a thirty-second advert for free-to-play smartphone shovelware in the middle of a FaceTime call with long-since-passed Aunt Julia is something of an immersion breaker. No, this industry would likely require something a little more tactful. And by tactful I mean incredibly subversive and potentially highly manipulative.

I wonder how long it would be before our revived relatives began elegantly framing products in their mitts, not-so-subtly sneaking product placements into casual conversation, or engaging in the dark arts of emotional manipulation, coaxing you into purchases by letting you know how much better they’d look and sound when viewed through the pancake lenses of an Apple Vision Pro.

Eternal standby of the promptless mind 

It’s a beautiful moment to have someone longingly look into your eyes and tell you that they’ll never leave you, but dear lord, help us if that becomes the norm, and we have to start collecting passed-away loved ones like NFTs.

Is our spirit now to be commoditized as something for others to inherit? On the eve of adulthood for our children, will we turn to them with pride and say, “This used to be my father’s, and when I came of age, he passed it on to me, and now I pass it on to you”? With emotion welling inside them, your child looks up at you and replies, “Thank you, Dad. What is it?”

rabbit r1 AI companion with the AI of a loved one replacing the typical rabbit avatar. The new AI personality is old and says questionable things for a modern era.

How long before our virtual assistants wear the masks of lost loved ones? Is it ethically okay to turn granny into a glorified Neopet? (Image credit: Laptop Mag / Rael Hornby / rabbit)

“It’s my father’s father, uploaded to a Rabbit r1! Press that button, and he can change the color of your Philips Hue lighting! Go on, son. Ask him anything you like about current events from no later than three years ago.”

“Wow, Dad! That’s great!” they feign. It’s just what they wanted: a repository for the insensitive opinions of the boomer generation in their pocket at all times. They simply count the minutes until you leave the room so they can download a bootleg version of MrBeast and overwrite your AI ancestry with someone slightly more hip than a pensioner with hemorrhoids and high cholesterol.

But where does this leave our generated kin, floating in the void of the internet forevermore? That’s a very different kind of Dead Internet Theory to contend with. Or are we compelled to look after these digital doubles in some way, observing our own Qingming Festival in virtual spaces to ensure we’re treating the identities of the ones we care about with dignity—and not like some pre-trained pacifier to teethe on as we make our way through the stages of grief?

As we grapple with these questions, one thing remains clear: the digital afterlife industry is not just a potential technological solution to bereavement but a reflection of our deepest desires and fears surrounding mortality. Ultimately, it’ll only really help those who want to let go. Otherwise, it could become a harmful safety blanket that people use to avoid facing up to one of life’s biggest fears, the day it all comes to an end.

Outlook

Will the digital afterlife industry change the way we approach the passing of those close to us? It’s already doing so. Even Replika, one of the most popular AI chatbot companion apps currently available, was originally developed by programmer Eugenia Kuyda so that she could talk to her deceased best friend once more.


The real question is, will this be adopted as the new normal? I can’t say for sure. However, I know that the findings published in the journal Philosophy & Technology suggest that it’s an ethical quagmire. Not to mention that we may have opened an entirely new can of worms when it comes to the issue of postmortem privacy.

While it’s easy to say, “I’ll be dead, who cares,” true immortality is the legacy we leave behind. There are very few guardrails in place to protect that legacy if anybody can digitally dig up your grave and subject that recreation to biased data.

Would you want somebody to spawn a chatbot copy of you after your passing, only to subvert it into delivering quotes of Mein Kampf now and then, tarnishing your reputation and casting aspersions on the person you were?

Still, what of the potential to meet those you never had the chance to? What about the generations to come being able to reach back into their lineage and converse with distant relatives to learn first-hand accounts of historical events and important family matters or gain vital insight into medical histories?

While it’s morbid, potentially unhealthy to use as a direct tool for grieving, and more than a little bit creepy, there may be benefits to us leaving an interactive imprint of ourselves somewhere online — if we can solve matters of consent and privacy first. After all, if we believe that video games, movies, songs, and books are important enough to preserve, why wouldn’t we give that same level of respect to one another?

Rael Hornby
Content Editor

Rael Hornby, potentially influenced by far too many LucasArts titles at an early age, once thought he’d grow up to be a mighty pirate. However, after several interventions with close friends and family members, you’re now much more likely to see his name attached to the bylines of tech articles. While not maintaining a double life as an aspiring writer by day and indie game dev by night, you’ll find him sat in a corner somewhere muttering to himself about microtransactions or hunting down promising indie games on Twitter.