Already cutting a fiery swath through the box office, DreamWorks Animation’s How to Train Your Dragon provides an object lesson in just how far stereoscopic 3D animated films have progressed. This progress is largely due to the advances in platform technology that greatly extend the creative range and artistic vision of content producers.
With many of the scenes containing multiple dragons, the complex, unprecedented level of character animation in this film demonstrates the value of multi-core processing solutions from Intel in scaling the production workflow and raising visual richness to new heights.
The collaboration between DreamWorks Animation and Intel has resulted in substantial software improvements within the animation pipeline and better access to available processing resources in the complex, deeply integrated hardware platform based on Intel® architecture.
Animating a Screen Full of Dragons
The original Shrek from DreamWorks Animation, which was by many standards a landmark work in its own right, featured a single dragon. Even that one character pushed the limits of the existing technology in the late 1990s through the year 2000.
Kate Swanborg, technology executive at DreamWorks Animation, noted, “For the technology available at the time, the dragon in the original Shrek movie proved extremely challenging to rig and animate. The dragon in Shrek was so complex and so heavy from a technical and compute standpoint that we were confined to using her only in a handful of shots. We simply could not have her in more shots because we would not have been able to get them through the pipeline. When you fast forward eight or nine years from Shrek to when we decided a few years ago to make an entire film about dragons [How to Train Your Dragon], we knew that we were going to have to do something different.”
The creative team for How to Train Your Dragon presented a storyline with scenes that often contained dozens of dragons. “We knew that we could handle those shots with multiple dragons,” Swanborg continued, “but we also knew that those dragons had to fly, walk, breathe fire, and emote in a way that audiences had never seen. In order to have both the volume and richness our artists demanded, we provided our animators with state-of-the-art, proprietary animation tools—tools that had been optimized through our partnership with Intel.”
The collaboration brought together a team focused on meeting the artistic requirements of the animators, giving them the tools to craft fluid, believable character movements and convey emotions with intricate facial expressions and body postures. Enabling the artistic vision also required a high degree of behind-the-scenes expertise, including the technical directors building the characters, the R&D engineers at DreamWorks Animation constructing and deploying software components throughout the animation pipeline, and Intel application engineers helping tune and optimize components to run efficiently on the hardware platform.
Lincoln Wallen, the head of Research and Development at DreamWorks Animation, saw the work as a collaboration that identified opportunities for optimizing the processing of the dragons. “The creative demand,” Wallen said, “had driven the complexity of those characters to a point where the speed of their execution within the animation tool was inhibiting our ability to animate.”
“Different than the original Shrek, the dragons are primary characters in this movie,” Wallen continued, “and because of the wings and the legs—with size and quality considerations—we increased the control points from a typical 500 to over 3,000. The dragons go into a process in which the animators cannot afford to compromise the interactivity for better control—they needed both.”
“We had to suppress that increase in complexity by improving the execution speed for those characters,” Wallen said. “That involved detailed performance analysis. The Intel teams were critical participants in that area. Their experience optimizing complex systems gave us the knowledge to get to where we needed to go in the given time frame.”
Advancing the Agile Pipeline
In “Rethinking the Pipeline,” published in issue 5 of Visual Adrenaline magazine, Lincoln Wallen shared his vision of the agile pipeline, which is characterized by richer, more elaborate feedback loops at points throughout the pipeline, to better inform the artist and to guide creative decisions on lighting, animation, effects, and simulations. The re-engineering work to fully realize this vision is a long-running project, which Wallen sees as a three-phase process.
The first phase concentrated on the data architecture and representation, allowing distributed processes to operate over the data in a multi-processing fashion. “This year has really been a multi-core year. We did a lot of work on the core representations that were not thread-safe,” Wallen said, “assuming a single-core execution model and local memory access. We recast the way in which we’re representing data so that it didn’t presuppose single-process access and didn’t impose synchronization blocks.”
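Wallen’s point about representations that presuppose single-process access can be made concrete with a small sketch. This is not DreamWorks’ code — the function names and data layout are hypothetical — but it shows the pattern: each worker is handed a disjoint slice of a character’s control points and returns a fresh result, so no locks or synchronization blocks are needed.

```python
# Hypothetical sketch: re-casting a character's control data so that
# parallel workers never share mutable state. Names are illustrative,
# not DreamWorks' actual data structures.
from concurrent.futures import ThreadPoolExecutor

def evaluate_controls(points, offset):
    # Pure function: reads its own slice, builds a fresh list,
    # so workers never write to shared state.
    return [(x + offset, y + offset, z + offset) for (x, y, z) in points]

def evaluate_character(control_points, offset, workers=4):
    # Partition the control points into disjoint slices;
    # each worker owns one slice outright.
    n = len(control_points)
    step = (n + workers - 1) // workers
    slices = [control_points[i:i + step] for i in range(0, n, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(evaluate_controls, slices, [offset] * len(slices))
    merged = []
    for part in results:
        merged.extend(part)
    return merged

# Roughly the scale of one dragon rig: ~3,000 control points.
points = [(float(i), 0.0, 0.0) for i in range(3000)]
posed = evaluate_character(points, offset=1.0)
```

With control counts up from a typical 500 to over 3,000 per dragon, this kind of lock-free partitioning is what lets an evaluation scale across cores instead of serializing on shared data.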
The second phase, Wallen explained, is actually building those engines in a way that is inherently multi-processing capable. This is being done in two main areas: one is rendering, which also includes shading, and the other is character evaluation.
The third phase focuses on delivering fully on the promise of parallelism. From the animator’s perspective, this phase must include the capability of handling a deforming character—as complex as those featured in How to Train Your Dragon—in an interactive manner.
Moving Beyond Fixed Geometry Characters
Providing real-time interactive animation of deforming characters constitutes one of the largest, most complex challenges faced across the pipeline. Unlike the fixed-geometry characters in video games, which must render in real time to support interactive gameplay, the geometry of deforming characters can be modified from frame to frame.
In a typical video game, the geometry is loaded into memory and then transformed, rotated, and used in ways that do not modify that geometry significantly.
“That’s why you see limited use of surface physics,” Wallen explained, “such as cloth and clothing in games. As soon as you start modifying the geometry—frame by frame—your geometry generation becomes a massive bottleneck and you can’t sustain both quality and interactive performance.”
“Today our animators animate stand-ins,” Wallen continued, “that have fixed geometry; they don’t deform, but they do move. They move much like game characters, only they are more complex. The artists animate with something that is more rigid and then they do an offline process that basically calculates the deformations; that’s not a real-time process.”
The step that DreamWorks Animation is taking this year—the middle year of the re-engineering effort—involves rebuilding the execution engine so a multi-core workstation can sustain a deforming character of the complexity of the dragons at interactive speeds. Animators will have the capability of animating with the geometry that actually goes into the final frame.
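The difference Wallen draws between moving a fixed mesh and regenerating a deforming one can be sketched in miniature. The following is an illustrative linear-blend deformation, not the studio’s rig: the point is that every vertex must be recomputed from the rig’s controls on every frame, which is exactly the work the rebuilt engine must sustain at interactive rates.

```python
# Minimal, illustrative sketch of why deforming geometry is costly:
# every frame, every vertex is recomputed from the rig's controls,
# whereas a fixed-geometry game character is merely transformed whole.
def deform(vertices, weights, joint_offsets):
    # Linear blend: each vertex mixes the offsets of the joints
    # that influence it. Real rigs blend full transform matrices.
    out = []
    for (vx, vy, vz), w in zip(vertices, weights):
        dx = sum(wj * ox for wj, (ox, _, _) in zip(w, joint_offsets))
        dy = sum(wj * oy for wj, (_, oy, _) in zip(w, joint_offsets))
        dz = sum(wj * oz for wj, (_, _, oz) in zip(w, joint_offsets))
        out.append((vx + dx, vy + dy, vz + dz))
    return out

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
wts = [(1.0, 0.0), (0.5, 0.5)]                 # per-vertex joint weights
offsets = [(0.0, 2.0, 0.0), (0.0, 0.0, 0.0)]   # this frame's joint motion
frame = deform(verts, wts, offsets)
```

At film scale this inner loop runs over many thousands of control-driven vertices per character per frame — the work that historically ran offline, and that a multi-core engine must bring into the interactive session.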
“Once this phase is complete,” Wallen said, “and we have engines that are multi-processing, why do we need to recognize the machine boundaries? After all, other machines are just other sources of memory and processes, just like the relationship between a CPU and a GPU. The data transport does have latencies—it’s not necessarily coherent memory and so on—but once we have software architectures that can scale across multiple processes, why not move some of those processes off-machine into a cluster? This brings us to the point of making high-performance computing an interactive tool rather than a batch processing tool.”
This next phase of the project, Wallen stated, will bring the compute farm and the servers into the interactive workload, creating a situation where processing power can be very quickly applied to operations that are beyond the real-time capabilities of the individual workstation.
“In a sense,” Wallen said, “an artist will then be able to gain access to a suitable number of machines so that we can bring frames that are close to final into the interactive realm.”
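Wallen’s idea of treating other machines as just more compute can be sketched as follows. This is illustrative only — the function names are hypothetical, and a local thread pool stands in for off-machine farm nodes that would really be reached over the network.

```python
# Illustrative only: a worker pool stands in for off-machine farm nodes.
# In a real deployment each job would be dispatched over the network,
# with the transport latencies Wallen mentions, not to local threads.
from concurrent.futures import ThreadPoolExecutor

def evaluate_frame(frame_id):
    # Placeholder for a near-final-quality frame evaluation that is
    # too heavy for one workstation to keep interactive.
    return frame_id, sum(range(frame_id * 1000))

def interactive_burst(frame_ids, nodes=4):
    # Fan the frames out across the "farm" and gather results as a dict.
    with ThreadPoolExecutor(max_workers=nodes) as farm:
        return dict(farm.map(evaluate_frame, frame_ids))

results = interactive_burst(range(8))
```

The design point is that once the engines scale across processes, the boundary between workstation cores and farm nodes becomes a scheduling detail rather than an architectural one.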
Stereoscopic 3D Production
Leading the industry, DreamWorks Animation announced that all upcoming DreamWorks Animation films will be produced and distributed in stereoscopic InTru™ 3D. Equipping the animation pipeline to process stereoscopic 3D content presents another level of complexity and adds to the processing demands—on top of the demands imposed by effects, character movements, simulations, and so on.
Derek Chan, the head of Digital Operations for DreamWorks Animation, commented, “Stereoscopic 3D is a significant driver in terms of processing demand. When you look at the number of pixels that are being calculated, we aren’t only creating one movie, like we used to, but we are creating two, in fact: one for your left eye and one for your right eye. So, we are doubling the amount of final images that we are creating in a stereoscopic movie. You can imagine how much more compute we had to provide.”
“Just taking that big step, moving to stereoscopic 3D,” Chan continued, “and having the necessary compute power while still effectively living within the same data center space: that shows how much more efficient we had to be—fitting within relatively the same footprint as we had before our move to 3D.”
Melding Animation Artistry with Technological Wizardry
The collaborative engineering work accomplished by DreamWorks Animation and Intel to enhance the animation pipeline demonstrates that software optimization has a tremendous effect on production efficiency and performance.
By building software components that use the available multi-core processing resources efficiently and take advantage of the parallelism and performance of next-generation Intel® processors, DreamWorks Animation will be able to expand its output of quality animated works—with three feature films planned for completion within this year.
Real-time visualization of animated scenes is another important area of improvement. Tuned, well-threaded software and parallel processing unlocks real-time and near real-time potential for envisioning, visualizing, and animating scenes and characters.
The type of strongly integrated production work being done on the re-engineered DreamWorks Animation hardware infrastructure strengthens the capabilities of creative teams and energizes imaginative storytelling. Intel has worked closely with DreamWorks Animation to support pipeline improvements that advance real-time visualization.
With a commitment to produce all new animated works using InTru 3D technology, DreamWorks Animation has embarked on a path at the leading edge of the technology where it’s critical to tap into every available core and consider every processor cycle to maintain efficiency. At the end of the day, however, the storytelling is what really matters. DreamWorks Animation and Intel combine creative talents and technological expertise to bring storytelling, CG animation, and stereoscopic 3D into a new and exciting realm.
Article from Intel Visual Adrenaline magazine issue 7.
All pictures from Rex Features.