A real-time animation production pipeline

© Red Kite Animation 2014

...or How to save a fortune when making an animation.

(Originally written on 13 October 2014)

While we were still developing the alpha version of RenderDigimania, I was asked by Red Kite Animation and Super Umami to help them with a deadline...

They needed to take the pilot episode of their new animation series, “Bradley and Bee”, to MIP Junior. The problem was that they didn’t have enough time to do it the ‘traditional’ way; it was May and the episode had to be ready by mid-September. Moreover, it was 11 minutes long, had over a dozen characters, two action-packed chase scenes and ambitions to be “Indiana Jones for kids”.

Red Kite and Garry Marshall (Super Umami) knew we were working on a game-engine based renderer and wondered if it could help them with the deadline. The project was secret; I never did find out how they knew about it! Anyway, this is the story of that process.

Super Umami’s Garry Marshall and I shared a similar belief: that the ‘traditional’ way of making cartoons is bust, slow, long-winded and, ultimately, expensive. He had ideas around an efficient production setup and a clever 3D After Effects pipeline, and I had the prototype version of a game-engine renderer. But that’s twice I’ve referred to a ‘traditional’ way of making animation. It’s worth examining that concept before seeing how we ripped it up.

Animation is difficult, slow and expensive. Much of this is due to the high level of craftsmanship required: modelling, animating, designing, composing and so on. But, equally, there’s a lot of waiting and paying for computers while they render frames, transfer files and save images. No one on an animation production likes waiting for things; some people hate paying to wait as well. A simple process of fixing an object’s colour and re-rendering can take days. So, this is the traditional way of doing things: craft your animation in, say, Maya, send it to a render farm and wait an extraordinary amount of time to see results. More often than not, some of the frames will come back broken or glitchy, so you’re forced to iterate on this expensive cycle until time (or money) runs out.

RenderDigimania was a prototype app back then, but I knew that we could push 3D data through it for lightning-fast rendering. Garry had a novel After Effects pipeline for 3D planar backgrounds. The plan was to do foregrounds in RenderDigimania (the game engine) and backgrounds in After Effects. The thinking continued: so long as we shared Maya cameras, the frames from RenderDigimania would exactly match the 3D world inside After Effects.
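
To make the camera hand-off concrete, here is a minimal Maya Python sketch of the kind of per-frame camera bake that lets two packages agree on the same move. The function name, the CSV format and the idea of going via a text file are illustrative assumptions on my part; this is not the actual tooling we used on the production.

```python
# Hypothetical sketch: bake a Maya camera to a per-frame text file so the same
# move can be reconstructed in another package. Names are illustrative only.
import maya.cmds as cmds

def export_camera_keys(camera_transform, out_path):
    """Sample world-space translation, rotation and focal length per frame."""
    shape = cmds.listRelatives(camera_transform, shapes=True, type="camera")[0]
    start = int(cmds.playbackOptions(query=True, minTime=True))
    end = int(cmds.playbackOptions(query=True, maxTime=True))

    with open(out_path, "w") as handle:
        handle.write("frame,tx,ty,tz,rx,ry,rz,focal_length\n")
        for frame in range(start, end + 1):
            cmds.currentTime(frame, edit=True)
            tx, ty, tz = cmds.xform(camera_transform, query=True,
                                    worldSpace=True, translation=True)
            rx, ry, rz = cmds.xform(camera_transform, query=True,
                                    worldSpace=True, rotation=True)
            focal = cmds.getAttr(shape + ".focalLength")
            handle.write("%d,%f,%f,%f,%f,%f,%f,%f\n"
                         % (frame, tx, ty, tz, rx, ry, rz, focal))

# e.g. export_camera_keys("shotCam", "shot010_camera.csv")
```

The point is simply that the camera is sampled once, in Maya, and every downstream tool consumes the same numbers, which is what lets the foreground and background layers line up.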

The full process, described by pictures and diagrams, can be found here.

It was interesting to note that the episode was ‘book-ended’ by two scenes that were rendered with Mental Ray. In other words, we had a lovely comparison set up between the game-engine renders and Mental Ray’s.

So, how did it go? Well, the good news was that, yes, you could make an animation with a game-engine renderer. Here are the headlines:

  • The visual quality was great; no one spotted that it was rendered with a game engine
  • We didn’t once hit an engine-imposed limit on polygons or textures
  • We could render the entire 9.5-minute game-engine section in 3.5 hours on one PC (rough throughput numbers below)
  • RenderDigimania was spitting out 5 frames per second
  • RenderDigimania could render faster than After Effects
  • Broadcasters have asked for a 52-episode series
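
For a rough sense of scale, here is a back-of-the-envelope throughput calculation. It assumes a 25 fps delivery rate, which isn’t stated above, so treat the figures as illustrative rather than measured.

```python
# Back-of-the-envelope throughput check. The 25 fps delivery rate is an
# assumption; the durations come from the bullet list above.
FPS_DELIVERY = 25.0      # assumed playback frame rate
SECTION_MINUTES = 9.5    # length of the game-engine rendered section
RENDER_HOURS = 3.5       # quoted single-PC render time

total_frames = SECTION_MINUTES * 60 * FPS_DELIVERY   # = 14,250 frames
average_fps = total_frames / (RENDER_HOURS * 3600)   # ~ 1.1 frames per second

print("%d frames, %.2f frames/s end-to-end" % (total_frames, average_fps))
```

By that arithmetic the end-to-end average works out at roughly 1.1 frames per second, so the 5 frames per second figure is best read as the engine’s output while it is actually rendering, with scene loading and file writes swallowing the rest of the 3.5 hours.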

But what about the creative process itself? How did the presence of a real-time app such as RenderDigimania affect the actual production?

Well, the actual 3D draughtsmanship didn’t change much. I suppose we had to make sure our assets were “game-compliant” but, as I said above, that didn’t seem to affect anyone except the Rigger. And all he had to do was make sure the animation rig could bake down to a game rig at export time. The lighting was all dynamic, so ambient occlusion was pre-calculated in Maya and baked into the diffuse maps and, er, that’s it, I think.
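
To put a little flesh on the “bake down to a game rig at export time” point, here is a hedged sketch of what that export step can look like in Maya Python, using the FBX plug-in’s bake-complex-animation option. The node names and file path are invented for illustration; this isn’t a reproduction of the production’s actual export scripts.

```python
# Hypothetical sketch of a bake-on-export step: select the game skeleton and
# skinned meshes, then let the FBX plug-in bake the animation rig's motion
# down onto plain joint keys. Node names and paths are illustrative only.
import maya.cmds as cmds
import maya.mel as mel

def export_game_fbx(export_roots, out_path):
    cmds.loadPlugin("fbxmaya", quiet=True)       # make sure the FBX plug-in is loaded
    cmds.select(export_roots, replace=True)

    mel.eval("FBXResetExport")                          # start from default settings
    mel.eval("FBXExportBakeComplexAnimation -v true")   # bake rig animation to joints
    mel.eval('FBXExport -f "%s" -s' % out_path.replace("\\", "/"))  # -s = selection only

# e.g. export_game_fbx(["bee_game_skeleton", "bee_geo_grp"], "D:/exports/bee_shot010.fbx")
```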

The presence of real time software meant that we didn’t have to worry so much about post-production. The traditional animation process is to deconstruct a storyboard into shots and layers. Your intention is to split up each shot so it’s quick to render and affords you the chance to tweak things in post. Well, the real-time software meant that we could make decisions before rendering. If we wanted to change the colour of a light or the intensity of a shadow we’d do it in real-time before rendering. It meant that the Director could leave creative decisions until the very last minute. It was also easier for him to see the full episode in a single consistent render pass. Changes, to put it simply, were very quick to push across the whole episode and re-render.

It wasn’t all roses, mind you. We had a particularly fragile FBX pipeline and slippery daily builds of prototype software. There were also a lot of hand-knitted processes in desperate need of automation. Also, the power to defer decision-making - while useful for the Director - was horrible for producers who were used to seeing the steady and regular production of frames as a marker of progress. May I also complain that the frequency of fresh renders only increased the number of producer change-lists that were circulated. Changes for the sake of changes, some may say. I’m sure I’ll get into trouble for that. Here’s a screenshot of the comments for one scene. I’ve zoomed out so you can take in their entire beauty.

But, in the end, we’d ripped up the usual way of making animation. Our render “farm” was a £700 PC from Dell, and we could do the whole episode on that one machine in 3.5 hours. You can scale this saving upwards to the 52-episode series - it’s a saving of thousands (thousands!) of pounds (commercial sensitivity prohibits me from revealing the true amount!).

Since then our software - RenderDigimania - has been polished, improved and released to the public.