Bradley and Bee

Excerpt from Bradley and Bee, episode 1. © Red Kite Animation Ltd 2014

What is Bradley and Bee?

"Bradley and Bee" is a 52-episode TV series centred on seven-year-old Bradley and his five-year-old sister, Bee. They share an amazing secret… a magical book which, when opened, transports them into a wonderful world of adventure.

The series is unique - it's been designed around a real-time pipeline with game engine-based rendering. If you're looking for a headline, it's this:

We are making an animated TV show without a render farm - just six PCs from Dell. The savings are enormous.

My role

As Producer and CG Lead I've devised the real-time pipeline and built the animation studio around it.

More info

Case Study: Making of the pilot episode

Bradley and Bee website

Red Kite Animation


"Bradley and Bee" episode 1: A real-time animation production pipeline

(Excerpt, full article is here)

Well, the good news is that, yes, you can make an animation with a game-engine renderer. Here are the headlines:

  • The visual quality was great; no one spotted that it was rendered in a game engine
  • We didn’t once hit an engine-imposed limit on polygons or textures
  • We could render the entire 9.5 minute game-engine section in 3.5 hours on one PC
  • RenderDigimania was spitting out 5 frames per second
  • RenderDigimania could render faster than After Effects
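For a sense of scale, the figures above can be sanity-checked. Assuming a 25 fps broadcast frame rate (an assumption - the article doesn't state the episode's frame rate), the 9.5-minute section is roughly 14,000 frames, and at the quoted 5 fps the raw rendering alone takes under an hour; the remainder of the 3.5-hour wall-clock figure presumably covers export, scene loading and re-renders.

```python
# Back-of-envelope check of the render figures quoted above.
# ASSUMPTION: 25 fps playback (typical for UK broadcast); the article
# does not state the episode's actual frame rate.

PLAYBACK_FPS = 25        # assumed broadcast frame rate
SECTION_MINUTES = 9.5    # length of the game-engine section
RENDER_FPS = 5           # quoted RenderDigimania throughput

total_frames = SECTION_MINUTES * 60 * PLAYBACK_FPS  # frames to render
render_seconds = total_frames / RENDER_FPS          # raw rendering time

print(f"{total_frames:.0f} frames")
print(f"{render_seconds / 60:.1f} minutes of raw rendering at {RENDER_FPS} fps")
```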

But what about the creative process itself? How did the presence of a real-time app such as RenderDigimania affect the actual production?

Good bits and bad bits

Well, the actual 3D draughtsmanship didn’t change much. I suppose we had to make sure our assets were “game-compliant” but, as I said above, that didn’t seem to affect anyone except the Rigger. And all he had to do was make sure the animation rig could bake down to a game rig at export time.

The presence of real time software meant that we didn’t have to worry so much about post-production. The traditional animation process is to deconstruct a storyboard into shots and layers. Your intention is to split up each shot so it’s quick to render and affords you the chance to tweak things in post. Well, the real-time software meant that we could make decisions before rendering. If we wanted to change the colour of a light or the intensity of a shadow we’d do it in real-time before rendering. It meant that the Director could leave creative decisions until the very last minute. It was also easier for him to see the full episode in a single consistent render pass. Changes, to put it simply, were very quick to push across the whole episode and re-render.

It wasn’t all roses, mind you. We had a particularly fragile FBX pipeline and slippery daily builds of prototype software. There were also a lot of hand-knitted processes in desperate need of automation. And the power to defer decision-making - while useful for the Director - was horrible for producers, who were used to seeing the steady, regular production of frames as a marker of progress. The frequency of fresh renders only increased the number of producer change-lists in circulation.


But, in the end, we’d ripped up the usual way of making animation. Our render “farm” was a £700 PC from Dell, and we could render the whole episode on it in 3.5 hours. Scale that saving up to the 52-episode series and it’s a saving of thousands (thousands!) of pounds (commercial sensitivity prohibits me from revealing the true amount!)

Since then, the software - RenderDigimania - has been polished, improved and released to the public.