Animators also employed slow-motion photography in Transformers to control the speed and motion of some of the characters' transformations from vehicles to robots (or vice versa). "Sometimes, when the action got too fast, it was hard to tell what was happening," Benza says. "Dramatically, slow motion helped us to sell the action."
In one scene, as protagonist leader Optimus Prime (an 18-wheel Peterbilt truck that morphs into a two-story robot) slugs it out with an enemy Transformer known as Bonecrusher, animators slowed the scene to as little as one-quarter of normal speed so that the audience could better take in the spectacle of two enormous robots crashing over a city bridge. While slow motion is typically achieved by filming scenes with a high-speed camera, Benza and his team were able to achieve the same effect by bringing the footage into Apple's Shake digital-compositing software, then retiming it using a Shake plug-in called Furnace Kronos.
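Kronos uses sophisticated optical-flow analysis to synthesize new in-between frames; as a much cruder illustration of the same retiming idea, the sketch below stretches a clip by blending linearly between neighboring source frames. The function name, the toy two-frame clip, and the stretch factor are all hypothetical, chosen only to show the concept.

```python
import numpy as np

def retime(frames, factor):
    """Stretch a clip by `factor` (e.g. 4 for quarter speed) by
    linearly blending between neighboring source frames. `frames`
    is a list of images (H x W x 3 float arrays). Real retiming
    tools such as Kronos track pixel motion instead of blending."""
    n_out = int((len(frames) - 1) * factor) + 1
    out = []
    for i in range(n_out):
        t = i / factor            # fractional source-frame position
        lo = int(np.floor(t))
        hi = min(lo + 1, len(frames) - 1)
        a = t - lo                # blend weight toward the later frame
        out.append((1 - a) * frames[lo] + a * frames[hi])
    return out

# Two solid-color frames stand in for live footage.
clip = [np.full((2, 2, 3), v, dtype=float) for v in (0.0, 1.0)]
slow = retime(clip, 4)
print(len(slow))         # 5 output frames for 2 inputs at 4x stretch
print(slow[2][0, 0, 0])  # midpoint frame blends to 0.5
```

Simple blending produces ghosting on fast action, which is exactly why production plug-ins estimate per-pixel motion vectors instead.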
One of Benza's priorities was to make the Transformers' movements as authentic as possible. Animators used Zeno rigid-body solver software, written by ILM developers, to animate a major scene early in the movie in which an enemy Transformer called Scorponok attacks a U.S. military outpost in the desert. As its name indicates, Scorponok was designed to be a large mechanical attack scorpion. To improve the authenticity of the character's movements, animators used the software to calculate how Scorponok would move through its scenes. "We were leveraging the computer's power to help us create motion, because the computer is really good at figuring out physics," Benza says.
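The core idea of letting a physics solver drive motion, rather than keyframing it by hand, can be sketched in a few lines. This is not ILM's Zeno solver, just a minimal semi-implicit Euler integration of a point mass under gravity with a simple ground bounce; the step size, drop height, and restitution value are illustrative assumptions.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])  # m/s^2

def step(pos, vel, dt, ground_y=0.0, restitution=0.3):
    """One semi-implicit Euler step for a point-mass body:
    apply gravity to velocity, advance position, then apply a
    simple energy-losing bounce at the ground plane. A production
    rigid-body solver also handles rotation, contacts between
    bodies, and articulated joints."""
    vel = vel + GRAVITY * dt
    pos = pos + vel * dt
    if pos[1] < ground_y:
        pos[1] = ground_y
        vel[1] = -vel[1] * restitution  # lose energy on impact
    return pos, vel

pos = np.array([0.0, 5.0, 0.0])  # hypothetical body dropped from 5 m
vel = np.zeros(3)
for _ in range(240):             # simulate 1 second in 240 substeps
    pos, vel = step(pos, vel, 1 / 240)
print(pos[1])                    # close to the analytic 5 - g/2 ≈ 0.1 m
```

The appeal for animators is that the computer produces physically plausible weight and timing automatically; the artist then art-directs on top of the simulation rather than inventing the physics by eye.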
Benza and his team had access to sophisticated software while creating the film's action, but he says they are constantly on the lookout for new technologies to use in future movies, including a Transformers sequel. He says one way to upgrade results would be to use simulation software on par with the "finite element" applications utilized by automobile companies when running virtual crash safety tests.
These applications are complex and difficult to set up and run because they simulate car crashes under very specific conditions, accounting for factors such as vehicle type, road conditions and speed. "We want to be able to leverage that technology in creating visual effects in a way that's as believable as tests done in the auto industry," Benza says. "They care a lot about getting the simulation 100 percent correct because lives are at stake." But these applications require a level of computing expertise to set up and run that is not typically found in animation departments. "We would need a simplified version," he acknowledges.
Another area where visual effects could be improved, according to Benza, is the simulation of natural phenomena such as smoke, fire and water. "I would love to see these become even more realistic," he says. "But it's very expensive and time consuming to do on our computers." This is, in part, because of the massive calculations that must be performed to, say, create an image of flowing water that looks like an actual stream. As it is, ILM used 5,500 computer processors and 220 terabytes of storage to hold all the models, animation, background plates, textures, reference materials and artwork for the film. Benza is counting on the further development of computer processors that take advantage of multithreading throughput, among other technologies, to continue his quest for lifelike animation.
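A hint of why fluid effects are so expensive: grid-based solvers must update every cell of a volume many times per frame, so cost explodes with resolution. The toy example below (my own illustration, not any ILM technique) runs one piece of such a solver, an explicit diffusion step, on a small 2D grid; a film-quality 3D simulation would add advection and pressure projection and use millions of cells.

```python
import numpy as np

def diffuse(field, rate, dt):
    """One explicit diffusion step on a periodic 2D grid: each cell
    relaxes toward the average of its four neighbors. This is only
    one term of a real fluid solver, which also advects the field
    and projects velocities to keep the flow incompressible."""
    up    = np.roll(field,  1, axis=0)
    down  = np.roll(field, -1, axis=0)
    left  = np.roll(field,  1, axis=1)
    right = np.roll(field, -1, axis=1)
    return field + rate * dt * (up + down + left + right - 4 * field)

density = np.zeros((64, 64))
density[32, 32] = 1.0            # a single puff of smoke
for _ in range(100):             # 100 solver steps on a tiny grid
    density = diffuse(density, rate=10.0, dt=0.01)
print(density.sum())             # the stencil conserves total mass
```

Even this trivial stencil touches every cell each step; doubling 3D resolution multiplies the work by roughly eight (and usually requires smaller time steps too), which is why Benza looks to multithreaded hardware for headroom.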