AI-Generated Victim Impact Statement Makes History in Arizona Courtroom

In a groundbreaking moment for legal proceedings, an Arizona courtroom witnessed the use of artificial intelligence to deliver a victim impact statement during a sentencing hearing on May 1, 2025. The family of Christopher Pelkey, a 37-year-old US Army veteran killed in a 2021 road rage incident in Chandler, Arizona, created an AI-generated video of Pelkey to address his killer, Gabriel Paul Horcasitas, in court.

Pelkey’s sister, Stacey Wales, spent two years crafting a traditional victim impact statement but felt it couldn’t fully capture her brother’s essence. With the help of Arizona Voice for Crime Victims, the family used AI technology to recreate Pelkey’s voice and likeness based on existing images and recordings. The AI video, shown during Horcasitas’ sentencing, featured Pelkey’s digital avatar extending forgiveness and reflecting on their fateful encounter. Horcasitas, who was found guilty of manslaughter and endangerment, received a 10½-year sentence.

This unprecedented use of AI in a US courtroom has sparked both praise and concern. Legal experts noted the emotional impact of the presentation, with the judge acknowledging its influence.

However, Horcasitas’ attorney, Jason Lamm, criticized the move as inflammatory, arguing it may have swayed the judge’s decision and could provide grounds for an appeal. The defense was not informed about the AI video in advance, raising questions about courtroom protocol.

Ethical debates are emerging as AI’s role in the justice system grows. Supporters, including Arizona State University law professor Gary Marchant, commended the family’s restraint, noting that the statement avoided vindictiveness. Critics, like AI ethics expert Robert Leben, worry about future uses of the technology, questioning whether AI-generated statements will always align with a victim’s true wishes.

As AI continues to evolve, this case highlights its potential to reshape legal processes while underscoring the need for clear guidelines that balance innovation with fairness. The Arizona courtroom’s experiment may set a precedent, prompting courts nationwide to grapple with the implications of AI in delivering justice.