Archive for the 'pipeline' Category

The Danish Film Institute and other news


So, this blog has been pretty quiet for a couple of months. Hydralab has been busy with a number of other projects (which we hope to put up soon), and also with developing our pipeline tools.
At the same time, for the last 6 months, we’ve been working with the Danish Film Institute, through their New Danish Screen initiative, and spending some more time with the script. The focus of the story has sharpened considerably, and we’re now entering our second development phase, where I hope to test out some of my ideas.
We’ve also stepped back to rethink the production process. I’ll post more on this later, but our new process puts a considerable emphasis on a robust previs, with a skeleton crew, before any production begins.

Mercurial, file browsers, and the pipeline


So, we’ve more or less finished the first version of our file browser – a custom Qt/PyQt interface written for the project primarily by Mikkel Jans.  Above is the version view component of the browser.  The browser is the starting point for managing our pipeline data processes – publishing, versioning (I’ll get into that below), launching files, and connecting (at the moment through socket interfaces) to our pipeline applications.
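As a sketch of the socket side of this, here’s roughly how a browser action can be pushed to a listening application (Maya, for instance, can open a command port from within a session).  The default port number and the newline-terminated, UTF-8 string protocol below are illustrative assumptions, not our actual wire format:

```python
import socket

def send_pipeline_command(command, host="127.0.0.1", port=7002):
    """Send one command string to a pipeline application listening on a
    socket and return its reply.  The default port and the
    newline-terminated protocol are illustrative assumptions."""
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(command.encode("utf-8") + b"\n")
        return sock.recv(4096).decode("utf-8").strip()
```

The browser only needs to know a host/port per application, which keeps the launcher and the DCC applications loosely coupled.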


We’ve chosen to use Mercurial, a distributed version control system, for our file versioning.  A user might typically save, say, a Maya file dozens of times during a day, but she’ll commit a new revision only a handful of those times.  Each commit requires a comment describing the pertinent information about that commit, and the user can revert to any commit in the history, or restore a commit as a separate file in a local folder.  Branching is also possible, which makes it relatively simple to ‘try out’ ideas in your work files.  In a typical programming environment, you might then merge that branch back into your main branch, but merging Maya or Shake files through a text editor is considerably harder (so we’re not addressing that aspect yet).
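The browser drives Mercurial through its command line.  As a hedged sketch – the helper names here are hypothetical, though the hg subcommands and flags are standard Mercurial – the commit and revert calls it builds look something like:

```python
def hg_commit_args(message, user=None):
    """Argument list for committing a new revision.  A non-empty
    comment is enforced, mirroring our requirement that every commit
    describes what changed."""
    if not message.strip():
        raise ValueError("every commit requires a descriptive comment")
    args = ["hg", "commit", "-m", message]
    if user:
        args += ["-u", user]
    return args

def hg_revert_args(rev):
    """Argument list for restoring the working copy to an earlier
    revision in the file's history."""
    return ["hg", "revert", "--all", "-r", str(rev)]
```

Building the argument lists in one place also makes it easy to log exactly what the browser ran on behalf of a user.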

We’ve also written a standard view into our browser which displays the revision history on a selected file, with a visual graph that displays the DAG (directed acyclic graph) of the file history.
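Conceptually, the graph view only needs parent links per changeset; everything else – branch points, open heads, drawing columns – falls out of the DAG.  A minimal sketch, using made-up revision numbers rather than Mercurial’s real changelog data:

```python
def history_dag(changesets):
    """Given (rev, parents) pairs, return child links and the heads
    (revisions with no children) -- the open branch tips a graph
    view draws at the top of the history."""
    children = {rev: [] for rev, _ in changesets}
    for rev, parents in changesets:
        for parent in parents:
            children[parent].append(rev)
    heads = [rev for rev, kids in children.items() if not kids]
    return children, heads
```

A linear history that branches at revision 1, for instance, yields two heads – the main line and the ‘try it out’ branch.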

What this means is that we don’t add user names or version increments to our naming convention – each file’s history is saved in a Mercurial repository, along with the comments, usernames, and other tags that we include in the changesets.  Since Mercurial saves only file differences, and compresses that file history into a binary format, the entire Mercurial repository for a file, including dozens of commits, is usually considerably smaller than the actual file (at least for most of our work files, which are saved in ASCII formats).

Mercurial forms one of the components of our pipeline.  It allows us to easily roll back assets, keep track of asset relationships (dev -> publish), monitor user activity on work files, and maintain an overview of the different iterations a file goes through.  It’s relatively lightweight, cross-platform, and integrates well with our primarily Python-based setup.

3Delight & rendering


A more complete version of this sequence is here.

A lot of things have happened in the last month – one of which is that we’ve been sponsored by DNAsoft, developers of the RenderMan-based 3Delight renderer.  The character in the rough test above is rendered with 3Delight, which we – myself, Aske Dørge, and Nicolai Slothuus – spent a bit over a week working with.  I’ve included some images below of the various stages of the process.  We took extensive set measurements to determine the camera position (although I think we’ll be trying out some image-based modeling methods for the next test), and shot chrome spheres, matte balls, and foreground bluescreen elements.  As always, there’s a fair bit of compositing in Shake, as well as some sound mixing in Final Cut Pro.  Most of the sounds in this clip are downloaded from the great online resource

For the renders, we used 3Delight’s point cloud rendering methods – which meant that at a small HD resolution, we could output our character with motion blur, displacements, depth-of-field, and occlusion (along with a range of other secondary passes, or arbitrary output variables) at under a minute a frame.  Our next test is to come up with a global illumination process, using our set survey data and light-emitting surfaces baked into a point cloud, rendered with some custom shaders.  One of the great features of RenderMan-based renderers is the shading language (RSL), which, in 3Delight, we can access through the Maya interface.  This means we can test and write custom shaders in the interface before converting them to standard .sl files, which we then compile through 3Delight’s shader compile utility.
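The last step of that shader round-trip is a plain command-line compile.  As a hypothetical sketch: the binary name `shaderdl` is 3Delight’s shader compiler, but I’m leaving out any flags (output directory, include paths), so check the 3Delight documentation for the real invocation:

```python
import subprocess

def compile_shader(sl_path, dry_run=True):
    """Compile a RenderMan .sl source with 3Delight's shader compiler.
    Flags are omitted here; with dry_run the command is returned
    instead of executed."""
    if not sl_path.endswith(".sl"):
        raise ValueError("expected a .sl RenderMan shader source")
    cmd = ["shaderdl", sl_path]
    if dry_run:
        return cmd
    subprocess.run(cmd, check=True)
    return cmd
```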

For the animation pipeline, we decided to rely on Maya’s geometry cache features, which allow us to isolate the animation and lighting pipelines from each other.  This means that the lighting scene references only the models (no rigs) and the layout, and the geometry cache imports all of the animation information.  As the animation updates, so do the lighting scenes.
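The isolation works because both sides agree on where a cache lives: the lighting scene resolves a cache location from shot, asset, and version, and never touches the rigs (Maya’s geometry cache writes an .xml descriptor alongside its data files).  The directory layout below is a made-up convention, purely to illustrate the idea:

```python
from pathlib import PurePosixPath

def cache_descriptor(shot, asset, version, root="/proj/cache"):
    """Resolve the geometry-cache .xml a lighting scene should attach
    for one asset in one shot.  The layout under `root` is a
    hypothetical convention for illustration."""
    return str(PurePosixPath(root) / shot / asset
               / "v{0:03d}".format(version) / (asset + ".xml"))
```

As animation publishes a new cache version, the lighting scene just repoints to the new path – no rig data ever crosses over.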

We also implemented some custom spotlights, with falloff regions, and on-screen visualisation.  For this test, since we used spotlights to mimic all of our direct and indirect illumination, the falloff regions gave us more granular control over attenuation.  At some point, I may look into a linear workflow, at which point Maya’s standard decay types might be more useful (or not).
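A falloff region boils down to a clamped ramp between two distances.  A minimal linear version – our actual on-screen controls may shape the curve differently, so treat this as a stand-in:

```python
def spot_attenuation(distance, falloff_start, falloff_end):
    """Full intensity inside falloff_start, zero beyond falloff_end,
    and a linear ramp in between -- a simplified stand-in for a
    spotlight falloff region."""
    if falloff_end <= falloff_start:
        raise ValueError("falloff_end must be beyond falloff_start")
    if distance <= falloff_start:
        return 1.0
    if distance >= falloff_end:
        return 0.0
    return 1.0 - (distance - falloff_start) / (falloff_end - falloff_start)
```

The appeal over a pure decay exponent is that the two distances can be placed directly against set geometry, which is what made the control feel granular.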

Bang & Olufsen commercial


This is a recent HD commercial I directed for Bang & Olufsen, produced at Mark Film.  Thanks to the great team I worked with: Tore Rex, Jesper Bentzen, Jimmy Levinsky, our producer Claus Toksvig, and the really fantastic guys at Mark Film.  Thanks also to Lawrence Marvit, who worked on the matte paintings and background design.  We had about two actual weeks of shot production with the full crew, and some additional time for pre-production, asset building, sound and finishing with a much smaller crew.

I got to try out a number of production ideas on this spot – which proved to be a good test bed for the film workflow.  I also locked the storyboards well before the crew came on board, and we relied heavily on the 2d animatic to plan out the schedule, focusing our time on only what the camera would see.

At some point I’ll try to put up a “making of” and some additional images.

Reclining man: ZBrush test


This is a test for some of my process ideas.  I’ve recently gotten back into using ZBrush, and I thought I’d see how far I could get starting from just a polygonal cube – above is the result of about a day and a half of working with the mesh.  In production, I’d expect to have a posed (and possibly higher-resolution) base mesh with a nicer topological layout, but this was an exercise in figuring out how much mesh resolution I really needed to start with.

I expect that the majority of secondary characters in the film will have very little movement on a shot-by-shot basis, and rather than fully rigging/modeling/texturing these characters, I expect to work predominantly from the camera’s perspective.  It’s an idea I tried out on a recent commercial project I directed and supervised (which at some point I’ll put up here).  The storyboards and 2d animatic were locked down (after many revisions) well before production, which allowed us to spend production time only on what we would see.

Below are some progression shots.


In preparation for the shoot, I’ve installed Earth from Rising Sun Pictures, which keeps track of disk usage.  It’s browser based, and runs off a PostgreSQL database and Ruby on Rails.  I think I’ll be expanding the server soon – since the pipeline is based around 16-bit float images (RAW/EXR), and the motion control shots will likely have more than one pass, our data for the shoot will take at least a terabyte of space.  The production, I’m anticipating, will easily eclipse that number.
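The terabyte figure is easy to sanity-check: an uncompressed 1920x1080, 4-channel half-float frame is roughly 16.6 MB, so frame counts multiply quickly.  The frame and pass counts below are placeholders, and real EXRs compress, so treat this as an upper bound:

```python
def shoot_storage_gib(frames, passes, width=1920, height=1080,
                      channels=4, bytes_per_channel=2):
    """Uncompressed ceiling, in GiB, for 16-bit float (half) footage:
    frames x passes x (width x height x channels x 2 bytes)."""
    frame_bytes = width * height * channels * bytes_per_channel
    return frames * passes * frame_bytes / 1024.0 ** 3
```

At, say, 30,000 frames with two passes each, that’s already in the region of a terabyte before any compression.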

Along with showing usage statistics, Earth also has a neat radial display: