
And the Oscar Goes to… a Robot

Industrial Light & Magic animators push the limits of computer animation technology to create lifelike shape-shifters in the Transformers movie



Courtesy of Industrial Light & Magic

It is unlikely that this year's Oscar ceremony will include an award for best animated actor in a film. But that has not stopped movie companies from pushing the boundaries of animation to make their synthetic characters seem as real as possible—even if those characters happen to be shape-shifting megaton robots, as in last summer's Transformers special effects extravaganza.

The prospect of turning a lineup of toy action figures into a live-action film that kids would want to see (and their parents would want to take them to) was daunting, admits Industrial Light & Magic's (ILM) Scott Benza, the animation supervisor for Transformers. He adds that "we all scratched our heads" when the team first heard about the project.

In November 2005 Transformers director Michael Bay and a team of designers provided Benza with 3-D computer images of the animated characters that Benza and more than 30 animators would eventually bring to life. These "animatics," or rough animations for the film, tell the movie's story, but they do not have any clearly defined style of movement.

"My job was to bring character to these computer animations," says Benza, who previously worked with Bay on 2005's The Island and 2001's Pearl Harbor. Indeed, animators are tasked with bringing computer-animated characters to life, particularly those with key "acting" roles. Lead animators work closely with directors to ensure they get the best performances out of their animated characters.

Even though he has more than a dozen films to his credit, including work on the 2003 film version of The Hulk that garnered him nominations for two awards from the Visual Effects Society, Benza says it was a challenge to create highly athletic performances from bulky, animated characters that appeared to be the size of small buildings and weigh several tons. "In some moments you have to sell the weight of the Transformers, in other moments you have to sell their athleticism," Benza says.

The key to making the Transformer characters interact well with the human actors and the sets was giving these mammoth machines a sense of weight and a fluidity of motion. "Most of what we had to achieve was possible, but not easily possible," Benza says. "A lot of the technology we used was in the early stages and had to be dramatically developed for us to use it in the film."

ILM animators employed a technique they refer to as "virtual background pipeline" to make sure that the animated characters had plenty of room to move in any given scene, whether they were flying, fighting or racing through an intersection. Virtual background pipeline starts by taking a large number of digital photographs of a scene or location using a tripod with a robotic head. ILM then used its custom-created Zeno software, as well as third-party software, to stitch the images together and re-create a seamless digital background. "These photos can also be used to create textured 3-D geometry, with a process called photomodeling—again, inside Zeno," Benza says.
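The core of that stitching step is combining overlapping photographs into one continuous image. As a rough sketch of the idea (the function name `stitch` and the use of a known overlap are illustrative assumptions; production tools estimate the overlap automatically by matching features between photos):

```python
def stitch(left, right, overlap):
    """Join two image strips (lists of pixel rows) that share `overlap`
    columns, averaging pixels in the shared region so the seam blends.
    Real pipelines estimate the overlap via feature matching; here it
    is supplied directly to keep the sketch minimal."""
    stitched = []
    for lrow, rrow in zip(left, right):
        # average the columns both strips photographed
        blend = [(a + b) / 2 for a, b in zip(lrow[-overlap:], rrow[:overlap])]
        stitched.append(lrow[:-overlap] + blend + rrow[overlap:])
    return stitched
```

Repeating this pairwise over dozens of photos from the robotic tripod head yields one seamless panoramic background.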

Using the new seamless background combined with the textured 3-D geometry, artists at ILM, a subsidiary of Lucasfilm, Ltd., had the flexibility to alter camera moves that were filmed on location, or to create new ones. "Think of it as a projected image," Benza says. If Bay shot footage of an intersection, the animators would integrate computer imagery into Bay's background plate so they could better control the action of the animated characters moving through the scene. Virtual background pipeline was developed for The Island and was also utilized on Mission: Impossible III, he says.
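What makes new camera moves possible is that once the scene exists as 3-D geometry, any point in it can be re-projected through a virtual camera placed anywhere. A minimal sketch of that projection, assuming a simple pinhole camera looking down the z axis (the function `project` and its parameters are illustrative, not ILM's actual interface):

```python
def project(point, focal, cam_pos):
    """Project a 3-D world point into 2-D image coordinates with a
    pinhole camera at cam_pos looking along +z. Moving cam_pos between
    frames produces a new 'camera move' over the same geometry."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    if z <= 0:
        return None  # point is behind the camera
    return (focal * x / z, focal * y / z)
```

Sliding `cam_pos` a little each frame re-renders the photographed intersection from viewpoints the physical camera never occupied.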

Animators also employed slow-motion photography in Transformers to control the speed and motion of some of the characters' transformations from vehicles to robots (or vice versa). "Sometimes, when the action got too fast, it was hard to tell what was happening," Benza says. "Dramatically, slow motion helped us to sell the action."

In one scene, as heroic Autobot leader Optimus Prime (an 18-wheel Peterbilt truck that morphs into a two-story robot) slugs it out with an enemy Transformer known as Bonecrusher, animators slowed the scene to as little as one quarter of normal speed so that the audience could better take in the spectacle of two enormous robots crashing over a city bridge. While slow motion is typically achieved by filming scenes with a high-speed camera, Benza and his team were able to achieve the same effect by bringing the footage into Apple's Shake digital-compositing software, then retiming it using a Shake plug-in called Furnace Kronos.
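Retiming works by synthesizing new in-between frames so that, say, one second of footage plays over four. A toy sketch of the idea (Kronos actually warps pixels along estimated motion vectors; the plain linear blending below, and the `retime` function itself, are simplifying assumptions):

```python
def retime(frames, factor):
    """Slow footage by `factor` by inserting linearly blended
    in-between frames. Each frame is a flat list of pixel values.
    Optical-flow retimers move pixels along motion paths instead
    of cross-fading, which avoids ghosting on fast action."""
    out = []
    for a, b in zip(frames, frames[1:]):
        for i in range(factor):
            t = i / factor  # blend weight from frame a toward frame b
            out.append([(1 - t) * pa + t * pb for pa, pb in zip(a, b)])
    out.append(frames[-1])
    return out
```

A 24-frame shot retimed with `factor=4` becomes roughly 96 frames, stretching one second of robot combat into four.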

One of Benza's priorities was to make the Transformers' movements as authentic as possible. Animators used Zeno rigid-body solver software written by ILM developers to animate a major scene early in the movie in which an enemy Transformer called Scorponok attacks a U.S. military outpost in the desert. As its name indicates, Scorponok was designed to be a large mechanical attack scorpion. To improve the authenticity of the character's movements, animators used the software to calculate how Scorponok would move through its scenes. "We were leveraging the computer's power to help us create motion, because the computer is really good at figuring out physics," Benza says.
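At its core, a rigid-body solver advances each object's position and velocity through small timesteps under forces like gravity, resolving contacts as they occur. A drastically simplified one-dimensional sketch (the `step` function is an illustration, not Zeno's API; a production solver tracks full 3-D position, rotation, and contact between many bodies):

```python
GRAVITY = -9.81  # m/s^2

def step(pos, vel, mass, dt, force=0.0):
    """Advance one body one timestep with semi-implicit Euler
    integration: update velocity from forces, then position from
    the new velocity. Only the vertical component is tracked here."""
    accel = GRAVITY + force / mass
    vel += accel * dt
    pos += vel * dt
    if pos < 0.0:  # crude ground contact: the body stops at the floor
        pos, vel = 0.0, 0.0
    return pos, vel
```

Running thousands of such steps per second of screen time is what lets the computer, rather than a keyframing animator, "figure out the physics" of a multi-ton scorpion churning through sand.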

Benza and his team had access to sophisticated software while creating the film's action, but he says they are constantly on the lookout for new technologies to use in future movies, including a Transformers sequel. He says one way to upgrade results would be to use simulation software on par with the "finite element" applications utilized by automobile companies when running virtual crash safety tests.

These applications are complex and difficult to set up and run because they render simulated car crashes under very specific conditions, taking into account factors such as the type of vehicle, road conditions and speed. "We want to be able to leverage that technology in creating visual effects in a way that's as believable as tests done in the auto industry," Benza says. "They care a lot about getting the simulation 100 percent correct because lives are at stake." But these applications require a level of computing expertise to set up and run that is not typically found in animation departments. "We would need a simplified version," he acknowledges.
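To see why those input factors matter so much, even a toy energy-balance model, far simpler than real finite-element crash codes, shows how vehicle mass, impact speed and structural stiffness combine (the function `crash_deceleration` and the ideal-spring barrier are illustrative assumptions, not how crash software actually works):

```python
import math

def crash_deceleration(mass, speed, stiffness):
    """Peak deceleration when a mass hits an ideal spring barrier.
    Energy balance: (1/2) m v^2 = (1/2) k x^2, so peak compression
    x = v * sqrt(m / k) and peak force F = k * x. Returns F / m in
    m/s^2. Real FEM codes solve millions of coupled elements instead
    of one spring."""
    x = speed * math.sqrt(mass / stiffness)
    return stiffness * x / mass
```

Halving the stiffness or doubling the mass softens the peak deceleration, which is exactly the kind of parameter sensitivity that makes these simulations both believable and laborious to set up.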

Another area where visual effects could be improved, according to Benza, is the simulation of natural phenomena such as smoke, fire and water. "I would love to see these become even more realistic," he says. "But it's very expensive and time consuming to do on our computers." This is, in part, because of the massive calculations that must be performed to, say, create an image of flowing water that looks like an actual stream. As it is, ILM used 5,500 computer processors and 220 terabytes of storage to store all the models, animation, background plates, textures, reference materials and artwork for the film. Benza is counting on the further development of computer processors that take advantage of multithreading throughput, among other technologies, to continue his quest for lifelike animation.
