Are We Ready to Talk to the Dead? Jim Acosta’s Shocking AI Interview Will Leave You Speechless!

Imagine sitting across from someone who has been gone for over seven years, their voice clear and their thoughts articulated through artificial intelligence. That's exactly what happened when Jim Acosta, the former chief White House correspondent for CNN, engaged in a startling conversation with the avatar of Joaquin Oliver, one of the 17 victims of the tragic Parkland school shooting.
In a groundbreaking video, the avatar of Oliver, who was only 17 when he lost his life in the 2018 shooting at Marjory Stoneman Douglas High School, appears wearing a beanie, his expression poignant. When Acosta asks, “What happened to you?”, the AI-generated voice responds in a flat, mechanical tone. “I appreciate your curiosity,” it states. “I was taken from this world too soon due to gun violence while at school. It’s important to talk about these issues so we can create a safer future for everyone.”
While the message is powerful, the delivery is jarring. The avatar’s facial movements and speech patterns feel unnatural, resembling a poorly dubbed film rather than a genuine conversation. It’s a stark reminder of the limitations of technology, even as it walks the line between innovation and ethical concerns.
Joaquin Oliver was a creative soul, passionate about writing and full of life, arriving at school on Valentine’s Day with flowers for his girlfriend. It’s haunting to think he would have turned 25 just days before this interview took place. Acosta had promoted the interview as a unique experience, claiming it was something viewers wouldn’t want to miss. But did he anticipate the backlash that would follow?
In the wake of this digital resurrection, many social media users expressed outrage, questioning the ethics of using an AI avatar for such an emotional topic. One commenter on Bluesky lamented, “There are living survivors of school shootings you could interview, and it would really be their words and thoughts instead of completely made-up.”
Acosta defended the project, revealing that Oliver’s parents were behind the AI version. His father, Manuel Oliver, expressed gratitude for the opportunity to hear his son’s voice again, noting that while the avatar couldn’t bring Joaquin back, it was a blessing nonetheless. This approach to advocacy isn’t entirely new: last year, the parents of several Parkland victims launched a robocalling campaign that used AI-generated voices of their children to call Congress for gun reform.
But the use of AI in such sensitive contexts raises profound ethical questions. Critics warn that creating digital avatars of deceased individuals could open the door to misinformation, deepfakes, and fraud, further blurring the line between the real and the fabricated. In another recent instance, an AI rendering of a murder victim was presented during a court hearing, underscoring how quickly this technology is entering high-stakes settings.
As technology evolves, so too does the debate over its use in emotionally charged situations. The challenge remains: how can we honor the memories of those lost while navigating the murky waters of artificial intelligence?