'I Loved That AI:' Judge Moved by AI-Generated Avatar of Man Killed in Road Rage Incident

An AI avatar made to look and sound like a man who was killed in a road rage incident addressed the court and the man who killed him: “To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI avatar of Christopher Pelkey said. “In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do.”
It was the first time an AI avatar of a victim, in this case a dead man, had addressed a court, and it raises many questions about the use of this type of technology in future court proceedings.
The avatar was made by Pelkey’s sister, Stacey Wales. Wales tells 404 Media that her husband, Pelkey’s brother-in-law, recoiled when she told him about the idea. “He told me, ‘Stacey, you’re asking a lot.’”
Gabriel Horcasitas killed Christopher Pelkey in 2021 during a road rage incident. Horcasitas was found guilty in March and faced a sentencing hearing earlier this month. As part of the sentencing, Pelkey’s friends and family filed statements about how his death affected them. In a first, the Arizona court accepted an AI-generated video statement in which an avatar made to look and sound like Pelkey spoke.
“Hello, just to be clear for everyone seeing this, I am a version of Chris Pelkey recreated through AI that uses my picture and my voice profile,” the stilted avatar says. “I was able to be digitally regenerated to share with you today. Here is insight into who I actually was in real life.”
The video then cuts to real footage of Pelkey filmed while he was alive, in which he talks about his time in the Army and his belief in God. It then returns to the AI avatar. “I would like to make my own impact statement,” the avatar says. “I can’t tell you how humbled I am for those that spoke up for me, everyone who flew in, took off from work, and for everyone who has supported me and my loved ones through three-and-a-half years and two trials, I wish I could be with you all today.” That’s when the AI avatar directly addressed Horcasitas.
“To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,” the AI Pelkey says. “In another life we probably could have been friends. I believe in forgiveness, in God who forgives, I always have. And I still do.”
Notably, the judge in the case, Todd Lang, said he was moved by the video.
“I loved that AI, and thank you for that. As angry as you are, and as justifiably angry as the family is, I heard the forgiveness, and I know Mr. Horcasitas could appreciate it, but so did I,” Lang said immediately before sentencing Horcasitas. “I love the beauty in what Christopher, and I call him Christopher—I always call people by their last names, it’s a formality of the court—but I feel like calling him Christopher as we’ve gotten to know him today. I feel that that was genuine, because obviously the forgiveness of Mr. Horcasitas reflects the character I heard about today. But it also says something about the family, because you told me how angry you were, and you demanded the maximum sentence. And even though that’s what you wanted, you allowed Chris to speak from his heart as you saw it. I didn’t hear him asking for the maximum sentence.”
Horcasitas’s lawyer also referenced the avatar when asking Lang for mercy: “As you heard the likeness of Mr. Pelkey say, they could have been friends,” the lawyer said. “And I will tell you, having known Mr. Horcasitas for about three and a half years and reviewing various content of Mr. Pelkey’s phone, I think they had a lot of similar interests, and I think Mr. Pelkey was right, they could have been friends.”
Wales told 404 Media that she struggled to write her own victim impact statement for months before landing on the idea of using AI to have Pelkey give his own. She and her husband, Tim Wales, both work in tech, and Tim has used AI tools before.
“We talked about it and he says, ‘You know you have to be careful with this stuff. In the wrong hands it can send the wrong message,’” Stacey told 404 Media. “He says, ‘Because without the right script, this will fall short. It will be flat and hokey and I’m not going to let it go out if it’s not authentic.’”
Stacey said she believes she knew what Christopher would say. “I…know the power of something like this. It needs to be a blanket statement of love, because that’s what Chris would stand for,” she said. “I can’t use it selfishly, and I’m already aware of that.”
According to Stacey, Tim used Stable Diffusion fine-tuned with a Low-Rank Adaptation (LoRA) to craft the video. “And then we used a generative AI and deep learning processes to create a voice clone from his original voice,” she said.
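Stable Diffusion paired with a LoRA is a common way to generate images of a specific person from a relatively small set of photos. As a rough, hypothetical sketch of that kind of setup, not the family’s actual pipeline, here is how it might look using the Hugging Face diffusers library; the base model, the LoRA path, and the prompt below are placeholders, and the separate voice-cloning step Stacey describes is not shown:

```python
# A minimal sketch, not the family's actual code: a Stable Diffusion base model
# plus a LoRA fine-tuned on photos of one person, used to render stills of that
# person's likeness. Model ID, LoRA path, and prompt are all placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed base checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# Attach LoRA weights trained on the subject's photos (hypothetical path).
pipe.load_lora_weights("./subject_lora")

# Generate a portrait frame in the likeness the LoRA was trained on.
image = pipe(
    "portrait photo of the subject, neutral background, soft lighting",
    num_inference_steps=30,
    guidance_scale=7.5,
).images[0]
image.save("avatar_frame.png")
```

Stills generated this way would still need to be animated and synced to the cloned voice with separate tools before they could appear in a video like the one played in court.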
Using an AI-generated video to have a dead victim deliver “their own” impact statement is unprecedented. An AI avatar is obviously not the real person, and what it says must either be scripted by someone else or generated by an LLM, neither of which is the victim. In this case, the video was used to help determine the prison sentence of a living person. The video that Pelkey’s family played contained several minutes of footage of Pelkey from when he was alive, but everything the AI avatar said was scripted by his sister.
There were videos shown during the trial that Stacey said were deeply difficult to sit through. “Videos of Chris literally being blown away with a bullet through his chest, going in the street, falling backward. We saw these items over and over and over,” she said. “And we were instructed: don’t you gasp and don’t you cry and do not make a scene, because that can cause a mistrial.”
Stacey said the sentencing hearing was the first time she and her family were able to talk to the court about how his death affected them. “This is the first time we had control, that we could speak up and that we could gasp aloud and let our tears flow and feel emotion in this courtroom. And we controlled the narrative,” she told 404 Media.
Jessica Gattuso, the victims’ rights attorney who worked with Pelkey’s family, told 404 Media that Arizona’s laws made the AI testimony possible. “We have a victim’s bill of rights,” she said. “[Victims] have the discretion to pick what format they’d like to give the statement. So I didn’t see any issues with the AI and there was no objection. I don’t believe anyone thought there was an issue with it.”
AI in the courtroom is a controversial topic, and one that both lawyers and judges have gotten frustrated with in the past. In January, a judge in Wyoming chastised a lawyer for citing non-existent cases that AI had hallucinated as part of a lawsuit. In March, a different group of attorneys was caught citing AI-hallucinated cases and ordered to pay $15,000.
Gattuso said she understood the concerns, but felt that Pelkey’s AI avatar was handled deftly. “Stacey was up front and the video itself…said it was AI generated. We were very careful to make sure it was clear that these were the words that the family believed Christopher would have to say,” she said. “At no point did anyone try to pass it off as Chris’ own words.”
The prosecution against Horcasitas was seeking only nine years for the killing; the maximum was 10 and a half years. Stacey had asked the judge for the full sentence during her own impact statement. The judge granted her request, something Stacey credits, in part, to the AI video.
“Our goal was to make the judge cry. Our goal was to bring Chris to life and to humanize him,” she said.