AI used to create a video of the deceased victim addressing the court : NPR

A screenshot from the AI-generated video of Christopher Pelkey, produced by Tim Wales and Scott Yentzer in consultation with Stacey Wales. (YouTube)
For two years, Stacey Wales kept a running list of everything she wanted to say at the sentencing hearing for the man who killed her brother in a road rage incident in Chandler, Ariz.

But when she finally sat down to write her statement, Wales was stuck. She struggled to find the right words, but one voice came through clearly: her brother's.

"I couldn't help but hear his voice in my head and what he would say," Wales told NPR.

Then the idea came to her: use artificial intelligence to create a video of her late brother, Christopher Pelkey, addressing the courtroom, and especially the man who shot him at a red light in 2021.

On Thursday, Wales stood before the court and played the video, in what AI experts say is likely the first time in the U.S. that the technology has been used to deliver a victim impact statement read by an AI rendering of the deceased victim.

https://www.youtube.com/watch?v=cms-_8etnts

A sister searching for the right words

Wales had been thinking about her victim impact statement since the first trial in 2023. The case was retried in 2025 because of procedural problems with the first attempt.

The chance to speak in court meant a great deal to Wales, who had held back her feelings through both trials to avoid influencing the jury.

"You're told that you cannot react, you can't emote, you can't cry," she said. "We were looking forward to [sentencing] because we could finally react."

Wales' lawyer told her to humanize Pelkey and offer a full picture of who he was.

So Wales made it her mission. She said she contacted as many people from Pelkey's life as she could, from his elementary school teacher to his high school prom date to the soldiers he served alongside in Iraq and Afghanistan.

A photo of Chris Pelkey walking down the aisle at her wedding.

In total, Wales collected 48 victim impact statements, not counting her own. When it came time to write hers, she was torn between how she really felt and what she believed the judge would want to hear.

"I didn't want to get up there and say, 'I forgive you,' because I'm not there, I'm not there yet," she said. "And the dichotomy was that I could hear Chris' voice in my head, and he's saying, 'I forgive him.'"

According to Wales, Pelkey's mantra was always to believe in God and to love others. He was the kind of man who would give you the shirt off his back, she said. While she struggled to find the right words for herself, Wales said writing from his perspective came naturally.

"I knew what he stood for, and it was just very clear to me what he would say," she added.

A digitally trimmed beard and an inserted laugh

That night, Wales turned to her husband, Tim, who has experience working with AI.

"He doesn't get a say. He doesn't get a chance to speak," Wales said, referring to her brother. "We can't let that happen. We have to give him a voice."

Tim and his business partner Scott Yentzer had only a few days to produce the video. The challenge: there is no single program for such a project. They also needed a long, clear audio clip of Pelkey's voice and a photo of him looking directly at the camera, neither of which Wales had.

Using several AI tools, Wales' husband and Yentzer managed to create a convincing 4.5-minute video of Pelkey from his funeral photo and a script Wales had prepared. They digitally removed the sunglasses perched on Pelkey's hat and trimmed his beard, which had caused technical problems.

Wales, who was closely involved in making sure the video felt true to life, said her brother's laugh was especially hard to recreate because most clips of Pelkey were filled with background noise.

The experience made Wales think about her own mortality. One evening, she stepped into her closet and recorded a nine-minute video of herself talking and laughing, in case her family ever needs a clear recording of her voice one day.

"It was a strange, out-of-body experience to think about my own mortality, but you never know when you won't be here," she said.

The night before the sentencing, Wales called her victims' rights attorney, Jessica Gattuso, to tell her about the video. Gattuso told NPR that she initially hesitated at the idea because she had never heard of it being done in an Arizona court. She was also concerned that the video might not be well received. But after watching it, she felt compelled to present it in court.

"I knew it would have an impact on everyone, including the shooter, because it was a message of forgiveness," Gattuso said.

The AI-generated video helped with healing, sister says

Ten people spoke at the hearing in support of Pelkey. The AI-generated video of him went last.

"Hello. Just to be clear for everyone seeing this, I am a version of Chris Pelkey recreated through AI that uses my picture and my voice profile," the AI avatar said.

In the video, Pelkey thanks everyone in his life who contributed an impact statement and attended the hearing. Then the video addressed his shooter, Gabriel Paul Horcasitas.

"It is a shame we encountered each other under those circumstances. In another life, we probably could have been friends. I believe in forgiveness, and in God who forgives. I always have and I still do," the video said.

The video ended with the avatar encouraging everyone to love one another and live life to the fullest. "Well, I'm going to go fishing now. I love you all. See you on the other side," it concluded.

Neither the defense nor the judge pushed back. Later in the hearing, Judge Todd Lang said: "I loved that AI. Thank you for that."

A photo of Christopher Pelkey. (Stacey Wales)

He added: "It says something about the family, because you told me how angry you were, and you demanded the maximum sentence. And even though that's what you wanted, you allowed Chris to speak from his heart as you saw it. I didn't hear him asking for the maximum sentence." Horcasitas received 10.5 years for manslaughter.

Wales said she didn't realize how deeply the video would affect her and her family. It gave her teenage son a chance to say goodbye to his uncle. For Wales, it gave her the strength to finally look back at photos of her brother.

"This process of figuring out the AI, and how he would sound, and his beard, and inserting the laugh, and all these other things, it was very cathartic, and it was part of the healing process," she said.

What AI and legal experts say

In recent years, there have been more and more cases testing the role of AI in the courtroom.

For example, President Trump's former lawyer Michael Cohen sent his attorney fake, AI-generated legal citations in 2023. More recently, a man tried to use an AI avatar as his lawyer in court last month, an effort the judge quickly shut down.

According to Maura Grossman, a professor at the University of Waterloo who has studied AI's applications in criminal and civil cases, this use appears to be new. She added that she did not see any legal or ethical problems with Pelkey's case.

"Because this was in front of a judge, not a jury, and because the video wasn't submitted as evidence per se, its impact is more limited," she told NPR via email.

Some experts, including Grossman, predict that generative AI will appear more often in the legal system, but say it raises a range of legal and ethical questions. Chief among the concerns are consent, fairness, and whether the content was made in good faith.

"Victims like this, who genuinely try to represent the dead victim's voice, are probably the least objectionable use of AI to create false videos or statements," wrote Gary Marchant, professor of law, ethics and emerging technologies at Arizona State University's Sandra Day O'Connor College of Law.

"Many attempted uses of AI to create deepfakes will be much more malicious," he added.

Wales herself warns anyone who might follow in her footsteps to act with integrity and not be driven by selfish motives. "I could have been very selfish with it," she said. "But it was important not to give closure to one person or group at the expense of someone else."
