
May 20, 2025 - 12:00
ILM is embracing AI like any other tool, and you should too

The year: 1993. A rudimentary computer-generated T. rex—a reptilian skin stretched over a wire frame—played on a loop on a computer screen at Industrial Light & Magic in California. Three film legends—VFX supervisor Dennis Muren, animator Phil Tippett, and director Steven Spielberg—watched silently as the implications sank in. “Cinema history changed,” Rob Bredow recounts in his April 2023 TED Talk, which has just been published on YouTube. Tippett, a stop-motion pioneer, dryly told Spielberg, “I feel like I’m going extinct.” As most movie buffs know, that line landed in Jurassic Park. Tippett’s fear, however, turned out to be unfounded.

The legendary effects company fused Tippett’s stop-motion puppetry with nascent CGI, using a “dinosaur input device”—a rigged armature with motion encoders—to digitize frame-by-frame animation. The result? A seismic shift that expanded artists’ tool kits worldwide and opened a new era in filmmaking.

Bredow was only 19 when that happened. Now, as SVP of creative innovation for Lucasfilm and chief creative officer of ILM, he sees a direct parallel to today’s artificial intelligence debates. “Headlines say, ‘AI is coming for our jobs,’” he says in the TED Talk. From the Dykstraflex—the computer-controlled motion camera that enabled Star Wars’s iconic dogfights—to StageCraft—the curved, 270-degree LED wall that projects hyperrealistic 3D environments for The Mandalorian and, now, many other shows and movies—Bredow argues that ILM’s 50-year history demonstrates how technological leaps redefine, rather than replace, artistry.

Stop-motion transformed and merged with 3D effects, as did physical models, full-size sets, and matte paintings. The same is happening now with AI. “Innovation thrives when old and new technologies are blended,” Bredow argues. 

ILM was late to the AI game. This became painfully obvious when the effects company created a rejuvenated Mark Hamill for the Season 2 finale of The Mandalorian. Fans cheered Luke Skywalker’s return to the screen, but the effect—done with traditional computer face tracking and 3D models, the same technique used to re-create Peter Cushing as Grand Moff Tarkin and a young Princess Leia—was slammed as unrealistic. Then a Star Wars fan and AI aficionado known as Shamook re-created the scene using AI. The former took weeks; the latter, hours. There was no doubt about which one looked more realistic.

The difference was so obvious that the company realized it had to act: ILM hired Shamook days after the deepfake remake was released. He worked on Indiana Jones and the Dial of Destiny, where ILM merged generative AI, trained on Harrison Ford’s past performances, with a meticulously handcrafted CG model to de-age the actor. The AI captured Ford’s micro-expressions; artists fine-tuned subtleties like eye moisture and skin texture. Ford himself said it was pretty good and really felt like him. Because it did.

[Screenshot: ILM]

AI is just another tool in the toolbox

The ethos that now guides ILM’s AI integration has been in the company’s DNA since its origin. It’s what drove George Lucas to pair engineers with artists to solve visual storytelling challenges. “We’re designed to be creative beings,” Bredow says. “We love seeing tech and creativity work together.”

Bredow hinted at ILM’s embrace of AI tools in a Fast Company interview back in August 2024: “I do see a path forward with some combination of the algorithmic tools that we’ve had and some machine learning-based tools that we either already have or can imagine developing, that are really going to help accelerate artist workflows.” Now he has made clear that AI has reached a point at which it is just another tool in ILM’s toolbox. His stance on the technology is one I have been seeing more and more since independent filmmaker Paul Trillo, one of the pioneers in using generative AI for his shorts, told me the same thing years ago. Trillo thinks that AI will enable indie projects to achieve blockbuster-grade VFX: “It is just a powerful tool in a creative’s arsenal.”

It’s just too bad that the example Bredow presented in the TED Talk was so underwhelming: a video of uninspired sci-fi animals that looked like Photoshop-made images quickly turned into motion with Kling, a commercial AI video generation tool developed by Chinese tech company Kuaishou. He described it as ILM’s “moving mood board,” but it falls short of what you would expect from the mother of all VFX houses.

But his points and the lesson from ILM’s half-century of visual innovation stand. The Dykstraflex didn’t kill cinematography—it birthed a new process and visual language. CGI dinosaurs didn’t erase animators—they just demanded hybrid skills. Now, as AI reshapes VFX, Bredow says we are witnessing another T. rex moment: one where artists, armed with generative tools, push storytelling beyond current limits.

Adaptation is nonnegotiable. New tools should be embraced as long as they don’t unethically exploit other people’s artwork. “The next game changer,” Bredow said, “will light up screens worldwide.” The credits won’t fade on human creativity; names will keep rolling. At least for a few more years to come.