An AI program voiced Darth Vader in ‘Obi-Wan Kenobi’ so James Earl Jones could finally retire

After 45 years playing one of the most iconic characters in cinema history, James Earl Jones has said goodbye to Darth Vader. At 91, the legendary actor recently told Disney he was "considering ending this particular character," which left the company wondering how it could ever replace him. The answer Disney ultimately chose, with the actor's consent, involved an AI program.

If you’ve seen any of the recent Star Wars shows, you’ve heard Respeecher’s work. It’s a Ukrainian startup that uses archival recordings and a “proprietary AI algorithm” to create new dialogue featuring the voices of “performers from long ago.” In Jones’ case, the company worked with Lucasfilm to recreate his voice as it sounded when movie audiences first heard Darth Vader in 1977.

According to the original report, Jones signed off on Disney using archival recordings of his voice and Respeecher's software to "keep Vader alive." Lucasfilm veteran Matthew Wood told the outlet that Jones guided the Sith Lord's performance in Obi-Wan Kenobi, acting as "a benevolent godfather," but it was ultimately the AI that gave Vader his voice in many scenes.

While there's something to be said for preserving Vader's voice, Disney's decision to use an AI to do so is likely to fuel disagreements over how this technology should be used in creative fields. Getty Images, for example, recently banned AI-generated artwork from its platform over copyright concerns. As for Jones, it's possible we could hear him voice Vader long after his death.


Ryan H. Bowman