AI brought a road rage victim ‘back to life’ in court. Experts say it went too far


May 9, 2025 - 11:29

When Christopher Pelkey was killed in a road rage incident in Arizona, his family was left not only to grieve but also to navigate how to represent him in court. As they prepared to confront his killer, Gabriel Horcasitas, during sentencing, they made an unusual and deeply controversial choice: to have Pelkey appear to speak from beyond the grave.

To do so, they turned to technology: An AI-generated video featuring a re-created voice and likeness of Pelkey was presented as a victim impact statement ahead of sentencing. The video showed a digitally resurrected Pelkey appearing to speak directly to the judge.

Of course, the statement wasn’t truly Pelkey’s. He couldn’t possibly have said those words; he died the day Horcasitas shot him. Yet the judge accepted the AI-generated message, even acknowledging its effect. “You allowed Chris to speak from his heart as you saw it,” the judge said. Horcasitas was sentenced to 10 and a half years in prison.

The extraordinary courtroom moment has sparked widespread discussion, not just for its emotional power but for the precedent it may set. Arizona’s victims’ rights laws allow the families of deceased victims to determine how their impact statements are delivered. But legal and AI experts warn that this precedent is far from harmless.

“I have sympathy for the family members who constructed the video,” says Eerke Boiten, a professor specializing in AI at De Montfort University in the U.K. “Knowing Pelkey, they likely had a preconceived view of how he might have felt, and AI gave them a way of putting that across that many found attractive and convincing.”

Still, Boiten is uneasy about how the video has been interpreted by both the public and possibly the court. “The video should be read as a statement of opinion from the family members, with AI providing a convincing presentation of that,” he explains. Yet public reaction suggests it was taken as something more. “The reactions show that it was taken as an almost factual contribution from Pelkey instead,” Boiten says.

The victims’ rights attorney who represented Pelkey’s family told 404 Media that “at no point did anyone try to pass it off as Chris’s own words.” Yet the emotionally charged format of presenting a deepfaked version of the deceased gives those words far more weight than if they had simply been read aloud. And it’s worth emphasizing: Pelkey could never have written them himself.

“It’s an inappropriate use of AI which has no relevance and should have no role at sentencing,” says Julian Roberts, emeritus professor of criminology at the University of Oxford and executive director of the Sentencing Academy. Data protection specialist Jon Baines of the firm Mishcon de Reya adds that the incident is “profoundly troubling from an ethical standpoint.”

Roberts argues that using an AI-generated likeness of a victim oversteps the purpose of a victim impact statement. “The victim statement should inform the court about the impact of the crime on the victim and advise of any possible impact of the imposition of a community order, et cetera,” he says. “It is not an exercise in memorializing the victim.” In his view, that’s exactly what the Pelkey video did.

Roberts also criticized the content of the statement itself: “The statement should contain information, not opinion or invention—human or AI-derived.”

Still, a precedent has now been set—at least in Arizona. One that blurs the line between mourning and manipulation. One that allows people to “speak” from beyond the grave—and could, in the future, influence the length of prison sentences in ways that justice systems may not yet be prepared to handle.