AI app's ability to resurrect lost loved ones sparks fears technology is crossing the fantasy-reality Rubicon

Software creates interactive simulations of deceased loved ones

HereAfter AI is giving mourning families a space to talk with digital replicas of their deceased loved ones in what some are calling an eerie blurring of the lines between fantasy and reality.

The interactive app is the latest venture in the rapidly advancing tech space, allowing mourners to keep the voice and personality of their deceased loved ones alive and chat with them using artificial intelligence.

While the innovation might sound unique and comforting to some, others say the development's moral implications could reduce the significance of a life to a handful of traits meant to stand in for a once-living person.


"The motivation driving this sort of conversation is clear – we want to keep people around who we've lost," Orthodox Catholic philosopher Joe Vukov said Sunday on "Fox & Friends Weekend." He added that the problem lies in ignoring the biological reality that humans are mortal and the moral assumption that people are more complicated than a limited number of duplicable characteristics.

Artificial Intelligence words are seen in this illustration taken March 31, 2023. (REUTERS/Dado Ruvic/Illustration)

Similarly, a video of grieving mother Jang Ji-sung meeting a simulation of the seven-year-old daughter she lost to cancer in 2016 went viral in recent days, showing that, although she was outfitted with goggles and gloves and could see and hear her late child, she still could not reach out and touch her.

As of Sunday morning, the video had amassed over 30 million views on YouTube.

"I think one thing that's going on in the background here is this assumption that what we are and who we are is nothing more than an abstract, disembodied intelligence that could presumably be put into a computer," Vukov continued on the note.

"There's a lot of perspectives, including the traditional Christian one, that says that what we are is not just this abstract intelligence, but is rather our souls and our bodies together."

Fox News anchor Rachel Campos-Duffy raised concerns that the push to encase personalities in a virtual realm could blur the lines between what is and isn't real, a concern Vukov illustrated with an example of fruit that looks realistic but is made from wax.


Mother Jang Ji-sung meets the simulated version of her deceased daughter reproduced by artificial intelligence. (Fox & Friends Weekend/Screengrab/MBClife)

"I think that's the sort of thing that we're getting set up for here," he said. "Maybe we're able to produce something that looks like a human and behaves like a human, maybe even looks and acts like one of our loved ones, but, at the end of the day, it's not the real thing and that's setting us up for a deeper disappointment, but also presenting us with a false reality."


A.I. continues to raise concerns on other fronts as well, including environmental, medical and even existential ones.

Virginia Tech professor Walid Saad, for example, said the algorithms consume massive amounts of energy to operate. He said in an earlier segment Sunday on "Fox & Friends Weekend" that data centers running A.I. systems, including the popular chatbot ChatGPT, are using massive amounts of water to cool their processors.

"For any A.I. algorithm… there are significant computing resources that you need to use," Saad said, adding that the energy and resources needed to sustain the computing power needed for A.I. bots comes not only from the water needed at data centers, but also from the power grid. 

He proposed recycling the water used to cool these platforms, moving toward greener, more resource-efficient A.I. and reducing data center usage to help curb the problem.

Additionally, artificial intelligence has made its way into the medical industry, raising questions about the safety and efficacy of its use.


Fox News contributor Kurt Knutsson said Sunday that concerns are swirling that artificial intelligence could not only override doctors but erode their competence, fueling conversations about the bots' potential dangers.

"Are we about to make doctors complete idiots? Doing the work for them, or a machine thinking it can do the work?" he asked. "I don't think we're quite there yet in terms of knowing when this is going to happen… but, believe me, it's coming."
