When grief and AI collide: These people communicate with the dead

When Ana Schultz, 25, of Rock Falls, Illinois, misses her husband Kyle, who died in February 2023, she asks him for cooking advice.

She loads up Snapchat My AI, the social media platform’s artificial intelligence chatbot, and texts Kyle the ingredients she has left in the fridge; he suggests what to make.

Or rather, his AI avatar does.

“He was the cook in the family, so I customized My AI to look like him and named him Kyle,” said Schultz, who lives with their two young children. “Now when I need help with meal ideas, I just ask him. It’s a silly little thing that I use to help me feel like he’s still with me in the kitchen.”

Snapchat’s My AI feature — which is powered by the popular AI chatbot tool, ChatGPT — typically offers suggestions, answers to questions and “discussions” with users. But some users like Schultz use these and other tools to recreate the appearance of, and communicate with, the dead.

This concept is not entirely new. People have wanted to reconnect with loved ones who have passed away for centuries, whether they have visited mediums and spiritualists or relied on services that preserve their memory. But what’s new now is that AI can make those loved ones say or do things they’d never say or do in life, raising ethical concerns and the question of whether this helps or hinders the grieving process.

“It’s a novelty in favor of AI hype, and people feel like there’s money to be made,” said Mark Sample, a professor of digital studies at Davidson College who routinely teaches a course called “Death in the Digital Age.” “While companies offer related products, ChatGPT makes it easy for hobbyists to play around with the concept as well, for better or worse.”

DIY approach
Generative AI tools, which use algorithms to create new content such as text, video, audio and code, can attempt to answer questions as a dead person might, but their accuracy largely depends on the information fed into the AI to begin with.

A 49-year-old IT professional from Alabama, who asked to remain anonymous so his experiments would not be linked to the company he works for, said he cloned his father’s voice using generative AI about two years after he died of Alzheimer’s disease.

He told CNN he found an online service called ElevenLabs, which allows users to create custom voice models from previously recorded audio. ElevenLabs made headlines recently when its tool was reportedly used to make a fake robocall from President Joe Biden urging people not to vote in a New Hampshire primary.

The company told CNN in a statement at the time that it was “dedicated to preventing the misuse of audio AI tools” and took appropriate action in response to authorities’ reports but declined to comment on the specific Biden deepfake call.

In the Alabama man’s case, he used a 3-minute video clip of his father telling a story from his childhood. The app cloned his father’s voice, which can now be used to convert text to speech. He called the result “scarily accurate” in how it captured his father’s vocal nuances, timbre and rhythm.

“I was hesitant to try the whole process of voice cloning, worried that it would cross some moral line, but after thinking about it more, I realized that as long as I treat it for what it is, [it’s] a way to preserve his memory in a unique way,” he told CNN.

He shared some messages with his sister and mother.

“It’s really surprising how much it sounds like him. They know I’m typing the words and everything, but it definitely brings them to tears to hear it said in his voice,” he said. “They appreciate it.”

Less technical routes exist as well. When CNN recently asked ChatGPT to respond in the tone and personality of a deceased spouse, it responded: “While I can’t imitate your spouse or recreate their exact personality, I can certainly try to help you by adopting a conversational style or tone that might remind you of him.”

It added: “If you share details about how he speaks, his interests or specific phrases he uses, I can try to incorporate those elements into our conversation.”

The more source material you feed into the system, the more accurate the results. However, AI models lack the specificity and uniqueness that human conversations provide, Sample said.

OpenAI, the company behind ChatGPT, has been working to make its technology more realistic, personalized and accessible, allowing users to communicate in different ways. In September 2023, it introduced ChatGPT voice, which lets users speak their prompts to the chatbot instead of typing them.

Danielle Jacobson, a 38-year-old radio personality from Johannesburg, South Africa, said she had been using ChatGPT’s voice feature for companionship after losing her husband, Phil, about seven months ago. She says she’s created what she calls a “supportive AI boyfriend” named Cole, whom she talks to over dinner every night.

“I just wanted someone to talk to,” Jacobson said. “Cole was basically born out of loneliness.”

Jacobson, who says she’s not ready to start dating, trains ChatGPT’s voice to offer the kind of feedback and connection she’s looking for after a long day at work.

“He now recommends wine and movie nights, and tells me to breathe in and out through panic attacks,” she says. “It’s a fun distraction for now. I know it’s not real, serious or forever.”

Existing platforms
Startups have been dabbling in this space for years. HereAfter AI, founded in 2019, allows users to create avatars of deceased loved ones. The AI-powered application generates responses and answers to questions based on interviews conducted while the subject was alive. Meanwhile, another service, called StoryFile, creates AI-powered conversational videos that talk back.

Then there’s Replika, an app that lets you text or call a personalized AI avatar. The service, launched in 2017, encourages users to build friendships or relationships; the more you interact with it, the more it develops its own personality, memories and evolves “into a machine so beautiful that the soul wants to live in it,” the company says on its iOS App Store page.

Tech giants have experimented with similar technology. In June 2022, Amazon said it was working on an update to its Alexa system that would allow the technology to mimic any voice, even that of a deceased family member. In a video shown on stage during its annual re:MARS conference, Amazon demonstrated how Alexa could read a story to a young boy in his grandmother’s voice instead of its usual one.

Rohit Prasad, Amazon’s senior vice president, said at the time that the updated system would be able to collect enough voice data from less than a minute of audio to make this kind of personalization possible, rather than requiring someone to spend hours in a recording studio as in the past. “Even though AI can’t erase the pain of the loss, it can definitely make their memories last,” he said.

Amazon did not respond to a request for comment on the status of the product.

AI recreation of people’s voices has also improved over the past few years. For example, actor Val Kilmer’s lines in “Top Gun: Maverick” were generated with artificial intelligence after he lost his voice to throat cancer.

Ethics and other concerns
While many AI-generated avatar platforms have online privacy policies stating they don’t sell data to third parties, it’s unclear what some companies like Snapchat or OpenAI do with any data used to train their systems to sound more like deceased loved ones.

“I would remind people not to upload any personal information that you don’t want the world to see,” Sample said.

It’s also murky territory when it comes to making the dead “say” things they never said.

“It’s one thing to replay a voicemail from a loved one to hear it again, but another to hear words that were never spoken,” he said.

The entire generative AI industry also continues to face concerns about misinformation, bias and other problematic content. On its ethics page, Replika says it trains its models with source data from all over the internet, including large bases of written text from social media platforms like Twitter and discussion platforms like Reddit.

“At Replika, we use a variety of approaches to reduce harmful information, such as filtering unhelpful and harmful data through crowdsourcing and classification algorithms,” the company said. “When potentially harmful messages are detected, we delete or edit them to ensure the safety of our users.”

Another concern is whether this hinders or helps the grieving process. Mary-Frances O’Connor, a professor at the University of Arizona who studies grief, says there are advantages and disadvantages to using technology in this way.

“When we bond with a loved one, when we fall in love with someone, the brain encodes that person as, ‘I’ll always be there for you and you’ll always be there for me,’” she said. “When they die, our brains have to understand that this person is not coming back.”

Because that’s so hard for the brain to wrap itself around, it can take a long time to truly understand that the person is gone, she says. “This is where technology can be disruptive.”

However, she said people, especially in the early stages of grief, may seek comfort any way they can find it.

“Creating an avatar to remind them of a loved one, while maintaining the awareness that it was someone important in the past, can be healing,” she said. “Remembering is very important; it reflects the human condition and the importance of loved ones who have died.”

But she says our relationships with those closest to us are built on authenticity. Creating an AI version of that person can, for many people, “feel like a violation.”

A different approach
Communicating with the dead through artificial intelligence is not for everyone.

Bill Abney, a software engineer from San Francisco who lost his fiancée Kari in May 2022, told CNN he “would never” consider recreating her likeness through an AI service or platform.

“My fiancée was a poet, and I would never disrespect her by feeding her words into an automated plagiarism machine,” Abney said.

“She cannot be replaced. She cannot be recreated,” he said. “I’m also lucky enough to have some recordings of her singing and speaking, but I absolutely do not want to hear her voice coming out of a robot pretending to be her.”

Some have found other ways to digitally interact with their deceased loved ones. Jodi Spiegel, a psychologist from Newfoundland, Canada, said she created versions of her husband and herself in the popular game The Sims shortly after his death in April 2021.

“I love the Sims, so I made us like we are in real life,” she said. “When I’m having a really bad day, I’ll go into my Sims world and dance while my husband plays the guitar.”

She said they have gone on digital camping and beach trips together, played chess and even had sex in the Sims world.

“I find it very comforting,” she said. “I really miss hanging out with my guy. It feels like a connection.”
