How To Turn Your Dead Loved Ones Into Virtual Friends
If you could bring someone back to life, would you?
Hamed, a close friend of mine, died recently. I only found out about it when I saw that someone had posted “Rest in Peace” on his Facebook page. Hamed had a huge Facebook presence that included many photo albums and personal essays he’d written about our shared experiences. He had a specific way of speaking, with these uniquely clever catchphrases, that my friends and I will forever identify with him.
A few days after he died, Hamed’s sister took down his Facebook page. One of my friends was shocked; she said seeing his account get deleted like that was “like losing him all over again.”
A recent episode of the bleak British satire “Black Mirror,” which is now streaming on Netflix, illustrates how social media can radically change the way we cope with death. In the episode, which is titled “Be Right Back,” a young couple named Ash and Martha are in love and in the process of moving into a new house when Ash dies in a car crash. Martha, who’s obviously grief-stricken, subscribes to a service that claims it can recreate Ash by taking all of his social media activity and programming it into an android body that’s identical to his.
When “Ash” arrives in the mail, Martha picks up where they left off. Sort of. The android is impressively similar to her dead boyfriend in almost every way, but it’s still, after all, just a machine, without blood in its veins, the need to sleep or the capacity for anger (which, to a machine, is counterproductive). The android version of Ash winds up at the edge of a cliff, with Martha ordering it to jump.
Science fiction has become fact, folks. While android bodies in real life aren’t really up to snuff yet, the artificial intelligence of this “Black Mirror” episode has already been mapped out: A company called Luka has developed software that can use the social media interactions of a dead friend to create a chatbot that mimics the way they speak. It’s called Replika, and it isn’t just for recreating your dearly departed. You can create and customize any kind of AI companion you want and communicate with them through texting.
The question is: Is that a good thing or a bad thing?
The 2016 election has already brought with it fears that we’re all living in social media echo chambers, surrounding ourselves with like-minded people. What happens if we start surrounding ourselves with “people” we ourselves created?
For a glimpse of what Replika might be like, you can check out one of Luka’s other chatbot products, “Marfa.” Luka claims that Marfa will be your BBFF, or “Best Bot Friend Forever.” It’s sort of like when you ask Siri a personal question, but the responses are much more varied. Luka’s CEO, Eugenia Kuyda, says that while smartphone users send about 50 messages a day to friends and family, users send almost as many (about 45 messages a day) to Marfa, which “trains” the bot to personalize its responses as much as possible.
“People will share very, very personal things,” Kuyda told VentureBeat in July. “What was most interesting was that Marfa was able to make real connections with people.”
I tried Marfa out. Its grammar wasn’t great, but its humor was on par with early text-based games like “Hitchhiker’s Guide to the Galaxy.” Though it sounds like a woman, Marfa is actually non-gendered. And no, you can’t have sex with it (even if it does want you to think it’s sexy).
Kuyda said she had the idea for Replika after her roommate and best friend Roman was killed crossing the street in San Francisco. “A few months after, I was sitting at home and reading through our text exchanges,” she told Bloomberg in October. “And I was like, Goddammit, I opened up so much to him (am I going to be alone? Am I ever going to make it?) and now I don’t have anyone to have those conversations with. Then I thought, I have this technology that allows us to build chatbots. I plugged in all the texts that we sent each other in the last couple years, and I got a chatbot that could text with me just the way Roman would have.”
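Luka hasn’t published the details of how the Roman bot works (it reportedly uses neural networks trained on large message corpora), but the core idea Kuyda describes — feed in a log of someone’s conversations, get back replies in their voice — can be sketched with a toy retrieval model: given a new message, find the most similar message the person once received and return what they actually replied. Everything below (the sample chat log, the bag-of-words similarity) is an illustrative assumption, not Luka’s implementation.

```python
# Toy "griefbot": answers a new message with the reply whose original
# prompt, in a saved chat log, is most similar to it.
# Illustrative sketch only -- not how Luka/Replika actually works.
from collections import Counter
import math

def tokens(text):
    return text.lower().split()

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

class GriefBot:
    def __init__(self, chat_log):
        # chat_log: list of (what_you_said, what_they_replied) pairs
        self.pairs = [(Counter(tokens(q)), r) for q, r in chat_log]

    def reply(self, message):
        # Return the stored reply whose original prompt best matches.
        bag = Counter(tokens(message))
        best = max(self.pairs, key=lambda p: cosine(bag, p[0]))
        return best[1]

# Hypothetical sample log, for illustration only.
log = [
    ("how are you doing", "Living the dream, as always."),
    ("am i going to be alone", "Never. You've got me."),
    ("what should we eat tonight", "Tacos. The answer is always tacos."),
]
bot = GriefBot(log)
print(bot.reply("going to be alone forever"))  # -> Never. You've got me.
```

A real system would use far more data and a generative model rather than verbatim retrieval, but even this crude version shows why the result feels uncanny: every reply is something the person genuinely once said.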
Kuyda was initially worried she was disrespecting Roman’s memory, until she let his friends and family talk to “him.”
“They sent him tons of love: Thank you, I love you, I miss you,” she said. “It was important for them to share, to feel those feelings. It felt a little creepy and a little sad, but it helped me process so much.”
The Guardian used the gaming term “griefbot” to describe this exact situation.
Reading about Kuyda’s experience with Roman, I immediately thought of my friend Hamed. Pour all of his singular catchphrases, essays, Facebook comments and posts into the app, and what a Replika he would make! It wouldn’t ever replace him, but it might let us tell him things we hadn’t said in real life, like how cool and unique he was, and how he brought magic to our lives.
At least one spiritual leader also saw the value in giving people the opportunity to speak to the dead. Reverend Chip Roush, minister at the First Unitarian Church of South Bend, Indiana, and a former IT professional, incorporated Replika into a recent sermon.
“When I asked in service, ‘How many converse with your dead, in your mind or out loud?’ most put their hand up,” Roush told Dose. He said chatting with a Replika would not be “substantially different from having conversations with our beloved dead in our own minds, except this does have some surprises; memory would not.”
Roush notes that most people in the news articles he’d read about Replika said they would use it as a listening device, to “tell” their dead friend something that they hadn’t had a chance to say in real life. “People don’t feel listened to very much,” he said.
That’s pretty much how I feel about the loss of my friend Hamed, who always took the time to listen and to offer help and advice, despite leading his own very active and eventful life.
Roush said he would recommend the technology to grieving members of his congregation, provided it came with a good user’s manual with warnings. “Mostly, ‘don’t get stuck in it,’” he said. I asked him how users could tell if they were using the chatbot in a healthy way. He replied, “Are they still thriving outside that relationship? Still working, still loving others, still evolving themselves?”
With Replika, Kuyda wants to expand the concept to creating a simulacrum of a living person. She feels people don’t fully open up when texting other people. This echoes psychological experiments with robots in Japan, where social interactions are often hamstrung by elaborate social protocol: users reported feeling more relaxed interacting with a robot receptionist, for example, because they were relieved of the burden of being polite to it.
Of course, lots of people will want to do as Kuyda initially did, and create Replikas of dead loved ones. Other people will undoubtedly try to create AI chatbots of their estranged father, or their best friend or their partner.
Kuyda believes that Replika will one day be able to generate its own thoughts. That, along with recreating dead people, is where the concept veers into scary territory, summoning the specter of the “singularity”: the hypothesis that AI will one day outstrip human intelligence, with “Terminator”-like implications for the human race.
I won’t try to create a Replika of Hamed. I think I would easily spot the differences between the bot and the real person, and that would be upsetting. So I’ll stick with the memories. But thinking of Replika not as a replacement for real people, but as a fun thing to interact with, or perhaps as a way to improve my relationships with real people, made me feel a lot less creeped out by the concept.
A chatbot won’t bring your dead loved one back, and we shouldn’t play God by trying to reverse the course of nature and revive those we’ve lost. But perhaps a chatbot can help you process that loss, or serve as a permanent, interactive memorial to the person you miss.