Interview: The IT powering animated movie-making

Video special effects require similar processing to oil and gas, but lack the budget, says head of technology at Illumination Mac Guff, Bruno Mahe

From an IT perspective, video special effects require similar processing to geospatial analysis in oil and gas, but lack the same budget, says Illumination Mac Guff head of technology Bruno Mahe.

Illumination Mac Guff is the video effects company behind movies such as Despicable Me.

Over the past 10 years, since Mahe began working there, IT has changed radically. So too has the world of film animation – a world that relies heavily on high-performance computing.

"Oil and gas uses the same infrastructure as us but has much more money," says Mahe.

This infrastructure supports ray-tracing and rendering applications that turn the 3D artists’ creations into memorable animated characters and scenes for the big screen.

Rendering is the process of painting texture onto the 3D wireframe models of the characters, props and sets, which heavily involves the central processing unit (CPU).

"When we did our first film – Dragon Hunters, in 2007 – we used a 155 four-core node high-performance computer (HPC)," says Mahe. So it took 620 cores to render the film. Despicable Me 2 – released in 2013 – used 17,000 cores, he says.

HPC can now be easily purchased from the likes of Amazon Web Services (AWS), but that does not necessarily make it the best option, even for a relatively small company like Illumination Mac Guff. 

True, the user only pays for what is used. However, Mahe says: "It is still cheaper for us to buy our own boxes because we use them all of the time." In other words, the economics of cloud quickly diminish if CPU load is constantly maxed out.
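The break-even logic Mahe describes can be sketched in a few lines. All prices, amortisation periods and utilisation figures below are illustrative assumptions, not Illumination Mac Guff's or AWS's actual numbers:

```python
# Sketch of the cloud-versus-owned cost comparison: cloud billing scales with
# hours used, while owned hardware is a fixed cost however hard it is driven.
# Every figure here is a made-up assumption for illustration.

def monthly_core_cost_cloud(cores, hours_used, price_per_core_hour=0.05):
    """Cloud: pay only for the core-hours actually consumed."""
    return cores * hours_used * price_per_core_hour

def monthly_core_cost_owned(cores, capex_per_core=300.0,
                            amortisation_months=36, opex_per_core_month=2.0):
    """Owned hardware: amortised purchase cost plus running cost, fixed per month."""
    return cores * (capex_per_core / amortisation_months + opex_per_core_month)

cores = 1000
for hours in (100, 400, 720):  # 720 hours is roughly a fully loaded month
    cloud = monthly_core_cost_cloud(cores, hours)
    owned = monthly_core_cost_owned(cores)
    print(f"{hours:>3} h/month  cloud ${cloud:>8,.0f}  owned ${owned:>8,.0f}")
```

Under these assumptions the cloud wins at low utilisation, but a render farm running flat out around the clock is cheaper to own, which matches Mahe's point.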

There is also the question of security. Putting a movie that is still in development on the public cloud is a risk Mahe and Illumination Mac Guff are not prepared to take. "We can’t use storage in the public cloud," he says. "We keep all our data on premise, but we offload some rendering to the public cloud."

To reduce the risk of exposing data, he says Illumination Mac Guff runs a virtual private network (VPN) into AWS. Mahe admits there is a huge amount of latency, but his team splits the workload to make sure the rendering tasks run in the cloud are not data-intensive. "We put workloads in the cloud that do not require a lot of data," he says.
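The workload split described above amounts to a simple placement rule: jobs with a small data footprint can tolerate the VPN's latency and go to the cloud, while data-heavy jobs stay on premise. A minimal sketch, in which the job fields and the 2GB threshold are invented for illustration:

```python
# Hypothetical placement rule: route a render job by its input data size.
# The threshold and job structure are assumptions, not the studio's actual setup.

CLOUD_DATA_LIMIT_GB = 2.0

def place_job(job):
    """Return 'cloud' or 'on-prem' for a job dict with an 'input_gb' field."""
    return "cloud" if job["input_gb"] <= CLOUD_DATA_LIMIT_GB else "on-prem"

jobs = [
    {"name": "light-probe-bake", "input_gb": 0.4},
    {"name": "final-frame-render", "input_gb": 85.0},
]
for job in jobs:
    print(job["name"], "->", place_job(job))
```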

Significantly, the data used in the rendering process is transient, which means Illumination Mac Guff does not risk putting its intellectual property in the public cloud. Moreover, Mahe says the approach he has taken avoids the need to get the company’s legal experts involved.

Increased storage required to make movies

And it is not just CPU load that Mahe needs to maximise to support animation. In the decade since he joined the company there has been an explosion in its storage requirements. 

Since Dragon Hunters, which required 12TB of storage for the whole system, the company has seen more than a 50-fold increase in the storage needed to make an animation movie. "We used 680TB of peak useful storage for Despicable Me 2," says Mahe.

Besides the sheer amount of storage, Mahe says he also needs to ensure artists have fast access to their models. "The artists were using fewer resources storage-wise than the rendering farm, but they were more sensitive to resource contention," he says. "Waiting half a second to access the storage array would create a huge productivity issue."

The company deployed FXT Edge tiered storage from Avere Systems, a hybrid cloud storage company, Mahe explains. The filer uses built-in RAM, solid-state drives (SSDs) and high-speed hard disk drives to accelerate input/output (I/O) throughput. Rather than buying more disks to increase I/O throughput, Avere is designed to move frequently accessed data into flash memory, while low-frequency data is pushed back to disk.
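The tiering behaviour described here can be sketched as a frequency-based cache: files accessed often enough are promoted to flash, and the coldest resident file is demoted to make room. This is a simplified illustration of the general technique, not Avere's actual algorithm:

```python
# Minimal frequency-based tiering sketch: promote hot files to flash,
# demote the least-accessed resident file back to disk when flash is full.
# Thresholds and slot counts are illustrative assumptions.

from collections import Counter

class TieredCache:
    def __init__(self, flash_slots=2, promote_after=3):
        self.flash = set()        # file names currently held in flash
        self.hits = Counter()     # access count per file
        self.flash_slots = flash_slots
        self.promote_after = promote_after

    def access(self, name):
        """Record an access and report which tier serves the file."""
        self.hits[name] += 1
        if self.hits[name] >= self.promote_after and name not in self.flash:
            if len(self.flash) >= self.flash_slots:
                # evict the least-accessed file in flash back to disk
                coldest = min(self.flash, key=lambda f: self.hits[f])
                self.flash.discard(coldest)
            self.flash.add(name)
        return "flash" if name in self.flash else "disk"

cache = TieredCache()
for _ in range(3):
    tier = cache.access("character_rig.ma")  # repeated access promotes the file
print(tier)  # prints "flash"
```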

An eye on future innovation

The movie industry is constantly evolving. In 2013 it was the advent of 4K ultra-high-definition resolution. For Mahe, the biggest change in recent times is the move to 60 frames per second (fps) from the current standard of 24fps. This will have a major impact on storage and CPU, since two-and-a-half times more data is needed per second of footage.
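The two-and-a-half-times figure follows directly from the frame rates: 60/24 = 2.5. A quick back-of-the-envelope check, using an uncompressed 4K RGB frame (3840×2160, 24-bit) purely as an illustrative frame size:

```python
# Sanity check on the data-rate claim: at a fixed frame size, data per second
# scales linearly with frame rate. The 4K frame size is an assumption used
# only to make the numbers concrete.

frame_bytes = 3840 * 2160 * 3  # one uncompressed 4K RGB frame, 3 bytes/pixel

for fps in (24, 60):
    print(f"{fps} fps -> {fps * frame_bytes / 1e6:,.0f} MB/s uncompressed")

print("ratio:", 60 / 24)  # prints 2.5
```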


Illumination Mac Guff has a three-year pipeline, so it is working on a number of films at different stages of development. Given the amount of work being taken onboard, Mahe needs to keep an eye on future technology innovation, because this affects how fast the rendering farm can operate, as well as the projected storage and I/O requirements.

"One of the most exciting areas of development is real-time rendering," he says. But this requires much higher performance than is currently possible. Rendering is a CPU and memory-intensive workload. While graphics processing units (GPUs) are gaining in popularity – due to the high performance a graphics chip can offer, and the hundreds of cores a typical GPU can integrate on a single board – they lack sufficient system memory, according to Mahe. 

So, while GPUs can realistically be used to render a 30-second animated TV commercial, Mahe says they fall short for a full-length feature film.
