A more complete version of this sequence is here.
A lot has happened in the last month, one thing being that we've been sponsored by DNAsoft, developers of the RenderMan-based 3Delight renderer. The character in the rough test above is rendered with 3Delight, which we (myself, Aske Dørge, and Nicolai Slothuus) spent a week or so working with. I've included some images below showing the various stages of the process. We took extensive set measurements to determine the camera position (although I think we'll try out some image-based modeling methods for the next test), and shot chrome spheres, matte balls, and foreground bluescreen elements. As always, there's a fair bit of compositing in Shake, as well as some sound mixing in Final Cut Pro. Most of the sounds in this clip were downloaded from the great online resource freesound.org.
For the renders, we used 3Delight's point cloud rendering methods, which meant that at small HD resolution we could output our character with motion blur, displacement, depth of field, and occlusion (along with a range of other secondary passes, or arbitrary output variables) at under a minute a frame. Our next test is to come up with a global illumination process using our set survey data, with light-emitting surfaces baked into a point cloud and rendered using some custom shaders. One of the great features of RenderMan-based renderers is the simple shading language (RSL), which in 3Delight we can access through the Maya interface. This means we can test and write custom shaders in the interface before converting them to standard .sl files, which we then compile with 3Delight's shader compile utility.
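To give a feel for that last step, here's a sketch of what a minimal .sl surface shader with an extra arbitrary output variable might look like. The shader name and AOV name are made up for illustration; 3Delight's compile utility is invoked from the command line as `shaderdl matteAOV.sl`.

```rsl
/* matteAOV.sl - illustrative surface shader with one extra
   arbitrary output variable, like the secondary passes above. */
surface matteAOV(
    float Kd = 1;
    output varying color aov_diffuse = 0; ) /* the AOV */
{
    normal Nf = faceforward(normalize(N), I);

    /* write the diffuse term out as its own pass... */
    aov_diffuse = Kd * diffuse(Nf);

    /* ...and also use it in the beauty output */
    Oi = Os;
    Ci = Oi * Cs * aov_diffuse;
}
```

The same AOV then just needs to be listed as a secondary display output in the render globals to come out as its own image.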
For the animation pipeline, we decided to rely on Maya's geometry cache features, which let us isolate the animation and lighting pipelines from each other. This means the lighting scene references only the models (no rigs) and the layout, while the geometry cache imports all of the animation data. As the animation updates, so do the lighting scenes.
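In MEL terms, the export side of that setup boils down to something like the sketch below. The shape and directory names are placeholders, and the flags are from Maya's `cacheFile` command, so check the docs before leaning on them.

```mel
// Export a geometry cache for the character mesh from the
// animation scene (names here are illustrative).
string $shape = "characterBodyShape";
string $dir   = "/jobs/test02/cache";

// One cache covering the playback range, sampled every frame.
cacheFile -fileName "characterBody"
          -directory $dir
          -startTime `playbackOptions -q -min`
          -endTime   `playbackOptions -q -max`
          -points $shape;

// In the lighting scene, which references the same (rig-free)
// model, the cache is attached to the matching shape via
// Geometry Cache > Import Cache, so re-exported animation
// flows through without touching the lighting setup.
```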
We also implemented some custom spotlights with falloff regions and on-screen visualisation. Since we used spotlights to mimic all of our direct and indirect illumination in this test, the falloff regions gave us more granular control over attenuation. At some point I may look into a linear workflow, at which point Maya's standard decay types might be more useful (or not).
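The idea behind a falloff region, as opposed to a fixed decay exponent, can be sketched as an RSL light shader: full intensity inside a start distance, a smooth ramp to zero at an end distance, and both distances exposed as parameters. This is an illustrative shader, not our production one, and the parameter names are made up.

```rsl
/* falloffSpot.sl - illustrative spotlight with a falloff region
   between falloffstart and falloffend, instead of 1/d^2 decay. */
light falloffSpot(
    float intensity = 1;
    color lightcolor = 1;
    point from = point "shader" (0, 0, 0);
    point to   = point "shader" (0, 0, 1);
    float coneangle     = radians(40);
    float penumbraangle = radians(5);
    float falloffstart = 5;
    float falloffend   = 20; )
{
    uniform vector axis = normalize(to - from);

    illuminate(from, axis, coneangle) {
        /* soften the cone edge across the penumbra */
        float cosangle = (L . axis) / length(L);
        float cone = smoothstep(cos(coneangle),
                                cos(coneangle - penumbraangle),
                                cosangle);

        /* falloff region: 1 inside start, 0 beyond end */
        float atten = 1 - smoothstep(falloffstart, falloffend,
                                     length(L));

        Cl = intensity * lightcolor * cone * atten;
    }
}
```

Because both distances are animatable per light, this gives finer control than a global decay exponent, which is exactly what we wanted while faking bounce light with spots.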