Tag Archive for 'movie'

Color, texture, and a making-of

ma_tp_edit_v003_sor3

MAKING OF:

At the top is a version of the test with some color/texture and simple shaders.  I’m also posting a “making-of” so people can follow some of the integration process.  These are both roughs – there are comp, animation, and render errors – but they’re nonetheless interesting for us.

While I think the test got the crew used to the general pipeline/workflow, aesthetically we’re still a bit off.  At the moment, this hits closer to something from Monster House or The Polar Express.  I’d like to move more towards stop-motion, and we hope to get in some animation studies over the next few weeks, spending time with some shots from the fantastic Madame Tutli-Putli.

At the moment, the textures are mostly without detail (both in the diffuse and specular), so we’ll be working to increase some of the sophistication.  We might also spend some time with the shaders, although I’m not yet convinced we need anything more than a blinn, some fresnel, and hi-detail textures.
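For reference, the fresnel I have in mind is nothing exotic – something like Schlick’s approximation, F(θ) ≈ F0 + (1 − F0)(1 − cos θ)^5, where F0 is the reflectance at normal incidence and θ is the angle between the view direction and the surface normal, layered over the blinn.  That’s an assumption about where we’ll land, not a decision.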

3Delight & rendering

ma_tp020_comp_v007_spg0057

A more complete version of this sequence is here.

A lot of things have happened in the last month – one of which is that we’ve been sponsored by DNAsoft, developers of the RenderMan-based 3Delight renderer.  The character in the rough test above is rendered with 3Delight, which we – myself, Aske Dørge, and Nicolai Slothuus – spent about a week and some change working with.  I’ve included some images below of the various stages of the process.  We took extensive set measurements to determine the camera position (although I think we’ll be trying out some image-based modeling methods for the next test), and shot chrome spheres, matte balls, and foreground bluescreen elements.  As always, there’s a fair bit of compositing in Shake, as well as some sound mixing in Final Cut Pro.  Most of the sounds in this clip are downloaded from the great online resource freesound.org.

For the renders, we used 3Delight’s point cloud rendering methods – which meant that at small HD resolution, we could output our character with motion blur, displacements, depth-of-field, and occlusion (along with a range of other secondary passes, or arbitrary output variables) at under 1 minute a frame.  Our next test is to come up with a global illumination process, using our set survey data and light-emitting surfaces baked into a point cloud, rendered with some custom shaders.  One of the great features of RenderMan-based renderers is the RenderMan Shading Language (RSL), which, in 3Delight, we can access through the Maya interface.  This means we can test and write custom shaders in the interface before converting them to standard .sl files, which we then compile with 3Delight’s shader compile utility.

For the animation pipeline, we decided to rely on Maya’s geometry cache features, which allow us to isolate the animation and lighting pipelines from each other.  This means that the lighting scene references only the models (no rigs) and the layout, and the geometry cache imports all of the animation information.  As the animation updates, so do the lighting scenes.

We also implemented some custom spotlights, with falloff regions, and on-screen visualisation.  For this test, since we used spotlights to mimic all of our direct and indirect illumination, the falloff regions gave us more granular control over attenuation.  At some point, I may look into a linear workflow, at which point Maya’s standard decay types might be more useful (or not).
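To make the falloff regions concrete (as I think of them, anyway): intensity holds at full strength out to a start distance d0, then ramps smoothly down to zero at an end distance d1 – so attenuation is a(d) = 1 for d ≤ d0, a(d) = 0 for d ≥ d1, with a smooth blend in between.  Compare that with Maya’s fixed decay types, where linear falls off proportional to 1/d and quadratic to 1/d², with no handle on where the light actually dies out.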

The last stages of building

img_3326

Above is a timelapse of the last day of official work (the video is a bit low quality).  We shipped the sets the day after and have set them up in the studio.  Karin (and possibly Nancy, another builder) will be coming here in about a week to finish some of the houses and survey the setup.

Below are some of the stills.

Concepting heads

lrn_caricature_movie

I’m trying to work out a visual vocabulary for the level of caricature in the film.  This is one of the tests – above is the turnaround of the high-detail version, and below are some shots of variations.  One of the concepts in the film is that the characters are imprinted with some of their personal histories – kind of like scars.  It’s not clear here, but I’ll keep posting examples of what that means in later posts.  These examples are also a bit conservative – I’ll be trying to push a bit more in the next couple of iterations.

Again, these are done with ZBrush.

Bang & Olufsen commercial

bo_leaves_large

This is a recent HD commercial I directed for Bang & Olufsen, produced at Mark Film.  Thanks to the great team I worked with: Tore Rex, Jesper Bentzen, Jimmy Levinsky, our producer Claus Toksvig, and the really fantastic guys at Mark Film.  Thanks also to Lawrence Marvit, who worked on the matte paintings and background design.  We had about two weeks of actual shot production with the full crew, and some additional time for pre-production, asset building, sound, and finishing with a much smaller crew.

I got to try out a number of production ideas on this spot – which proved to be a good test bed for the film workflow.  I also locked the storyboards well before the crew came on board, and we relied heavily on the 2d animatic to plan out the schedule, focusing our time on only what the camera would see.

At some point I’ll try to put up a “making of” and some additional images.

Loco 2nd Generation, stable and shipped

soi_3g_timelapse

So we shipped out our 2nd generation LEGO rig to a studio in Germany, which will be working with it for the next year on a combination stop-motion/CG project.  Above is a quick timelapse test I did this morning – the camera is running through the middle of people, dogs, movement, and general unrest in my squeaky wood-floored studio.  The rig got bumped a bit on occasion, but the move came out incredibly stable – what you see above is directly from the camera.

All of the pictures you see below are with the top mount.  There is also a 1.5 meter bottom mount to hang the camera.  Notice how the rig has gone black – that took some laborious hours of sanding, washing, and spraying the rails and base plate.  For the next rig, we’ll try to hire someone else to cover some of these areas.

I’ve also implemented the iCommand NXT library, a command-based (not VM-based) offshoot of leJOS.  I’m having much better luck with setting rotation limits and stepping down the power as we approach the actual rotation target.  I’ve also implemented some parity and gear-lag compensation in the software package, and the data structures are a bit cleaner.  It looks as though I’ll be writing a keyframe interface as well in the next month.
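For the curious, the stepping-down logic is simple enough to sketch.  Below is a rough version in plain Java – the Motor interface is a hypothetical stand-in for the iCommand motor calls, and the power thresholds and backlash constant are invented numbers for illustration, not what’s actually in our package:

    // "Motor" is a hypothetical stand-in for the iCommand motor class –
    // method names, thresholds, and the backlash constant are all invented.
    interface Motor {
        int getTachoCount();          // current rotation in degrees
        void setPower(int percent);   // 0-100
        void forward();
        void backward();
        void stop();
    }

    class SeekController {
        // Degrees of slop in the gear train on a direction change (made-up value).
        static final int BACKLASH_DEG = 12;

        private int lastDirection = 0;  // -1, 0, or +1

        // Drive toward targetDeg, stepping the power down as the error shrinks.
        void seekTo(Motor m, int targetDeg) throws InterruptedException {
            int direction = Integer.signum(targetDeg - m.getTachoCount());
            if (direction == 0) return;

            // Gear-lag compensation: extend the target when reversing, so the
            // motor takes up the slop before the load actually moves.
            int target = targetDeg;
            if (lastDirection != 0 && direction != lastDirection) {
                target += direction * BACKLASH_DEG;
            }
            lastDirection = direction;

            m.setPower(75);                                // full travel speed
            if (direction > 0) m.forward(); else m.backward();

            int remaining;
            while ((remaining = direction * (target - m.getTachoCount())) > 2) {
                if (remaining < 30)       m.setPower(10);  // crawl in
                else if (remaining < 120) m.setPower(30);  // approach
                Thread.sleep(5);                           // poll, don't spin
            }
            m.stop();
        }
    }

Our actual package is messier than this, and the parity handling sits on top, but the shape is the same: full power to travel, stepped-down power to land.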

I’ll be flying down to Germany on Sunday to set up and test the rig on the set there.  If I can (I’m not sure about the NDA restrictions), I’ll post some pictures.

As for the future – as you can see, parts of the rig are still Mindstorms-driven and LEGO-built.  There are benefits (modular construction, easy to refactor) and some looming disadvantages (plastic parts at the top of that list), and at some point – not the next generation, nor probably the one after – we’ll consider looking into other microprocessor boards and sensors, and designing the rig in a slightly different way.  I think LEGO will continue to play a big part in the development, and certainly the feedback sensors, which are our biggest focus for the next version(s), will continue to be in the design and prototype process.

The LEGO rig on set

Goutte d’Or is now running with the LEGO rig daily on set.  Here are some images from around the shoot and of the new lift.  Above is one of the latest camera moves.

New modules and Goutte d’Or

goutte_d_or

Above is a test clip we shot yesterday for Goutte d’Or, a stop motion film by my friend Christophe Peladan, which is using the LEGO rig.  I’m rebuilding the lift, and using a more modular construction for it.  I’ve also organized a large part of the LEGO collection, which you can see below.

There have been some great replies on the nxstasy forum to my questions about minimizing slop in the gear train for our rig.  With any luck, this new version will add some more stability and user friendliness.

The software has also been updated to deal with both the Canon EOS 40D and the Canon EOS 400D.

LEGO Motion Control v2

soi_proto9_pan_stabilized0125

soi_proto9_lift_stabilized0149

We’re building the next major iteration of the motion control system, and are thinking of supplementing the LEGO with some more stable parts.  The major one is the baseplate of the rig, for which we’ll use MDF to start with, and see how much stability it brings.

Tomorrow, the 1st iteration rig also starts getting use on La Goutte D’or, a stop motion puppet film in production in one of the other buildings here.  With the new seek algorithm, I’ve sped up the incremental moves by 300%, which means we can shoot about 8–12 seconds of automated footage per hour.
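To put that rate in perspective: assuming we conform at 25 fps (my assumption here), 8–12 seconds of footage is 200–300 frames, so the rig is finishing a full move-settle-expose cycle roughly every 12–18 seconds.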

The movies above are moves run with the new software.  I’ve started consulting my handy American Cinematographer Manual to look at the pan and lift speeds.  Both moves are processed with the “smoothcam” node in Shake.  As far as I can tell, the pixel shifting that happens when smoothing has a minimal effect on the plate – I don’t see much introduced blur, especially since our processing happens at 4K (roughly) and then gets downsized to 2K.  I haven’t yet tracked a “smoothed” shot, so I might discover some issues there.  The jitter in the original plate is also pretty minimal, and hopefully with the next iteration of the rig we can bring it down even further.

Below are some images of the 1st iteration.

Realism test

ma_realism_w_refplate_v001_sm

This is a simple test with a still image from the miniature shoot last year.  The idea is to see how much I get from the plate, and how much I need to add in order to hit a high level of detail.  The doll was our character stand-in – the animation is just some image warping.

Most of the work is done in Shake (with a bit of Photoshop).  I’ve added live elements, light balancing, a lens flare, defocus, a background replacement (with a simple key), animation, and a bit of paint-over to cover up some odd lighting in the original plate.

Below is the original image:

Proto6

ma_rnd_proto6_bg

So we’ve gone back and forth with various modifications on the rig.  We added some bogie wheels, and then realized that dynamic distribution of weight actually adds another factor of unpredictability to the motion.  The new rig is, however, more stable than the previous couple, and we’ve also attached a small Manfrotto head mount to the bottom of the lift, so the camera is much easier to mount.  The first movie below is the actual test, while the second is a stabilized version, run through Shake.  The movie above is roughly color corrected, stabilized, and has the background replaced.  The reflections in the mirror are Benny and me moving around as the move is running.

original:

stabilized:

Proto4 test shoot2

ma_rnd_proto4_test3_hi

This is a clip from a rough shoot of one of our buildings.  In total we’ll have about 9 buildings, many of which will have interchangeable parts in order to add variation.

Our lift stopped working in the middle of the shot – a gear driving another gear on one of the four rack gears slipped out of place, but because of the differential (which aligns the four rack gears), I didn’t notice the problem until too far in.  Our next version should deal with some of the workflow complications we noticed through this shoot.

I’ve added some different coloring to one of the front walls in this shot – just to see how that might look.  To do that, I first applied a lens undistort (you can see the distortion at the edges of the frame), then color corrected the plate with a traveling matte that followed the camera move, and then redistorted the plate.  This way, I can create a matte for the wall that has straight lines, and then, when I redistort, get the lens barrelling back into the image.
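For anyone picturing the undistort step: the usual assumption is a radial model around the lens center, something like r' = r(1 + k1·r² + k2·r⁴), where r is the distance from the center and k1, k2 are the lens coefficients.  Undistorting inverts that mapping so straight edges in the set stay straight for the matte work; redistorting applies it again so the corrected patch picks up the same barrelling as the rest of the plate.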

Below are some shots of the buildings and the shoot in progress.

Proto4 test shoot

So we shot some footage on our miniature set, which is being built about 400 km away.  Over the next few days, I’ll be adding some more shots.  Above is one of the clips (the flicker is from the sunlight in the workspace – we’ll be shooting in a closed-off studio for the actual film shoot), with some work done in Shake to average the bluescreen, replace the background, and add the foreground flicker to the background (with a multiplier).

This is shot on our new set of rails (which are 2 meters long).  The rig will need some modification, since the rails are a bit tighter.  The lift is also (still) a bit unstable, but we think we’re on the right track – using 3 differentials driving 4 24-tooth gears along 4 rack gears.  We should, hopefully, be able to stabilize the motion completely.

Below are some of the photos of the shoot.

Proto4, PID, worm gears

So we’ve redesigned the lift mechanism a couple of times.  The lift you see here has about six worm gears (inside a gear housing) that drive four gears along four rack gears – one gear on each corner of the lift rod.  There are differentials between each set of gears and the central driving gear (three differential gears in all).  This is an improvement over some of our last models, which had dozens of gears – our theory is that at this small scale, the more plastic LEGO gears you introduce, the greater the possibility of error.  So less is more.

We’ve also been visually logging our degree variance in the Processing sketch I’m writing.  It’s pretty primitive, but it gives a good idea of how close we are to a target rotation count.  At the moment, using a pseudo-PID (with multiple levels of bi-directional adjustment once the degree count is within certain thresholds, as well as fail-safe conditions in case of runaway motors), we get an average of about 5 degrees of variance.  That’s not bad when we’ve got gear factors of 1,728 and 576 (track & lift).  To the left you can see a screen grab of what that visual graph looks like.
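To put the 5 degrees in perspective – reading our gear factors as motor-to-output reduction ratios – the error at the output is the motor error divided by the factor: 5/1728 ≈ 0.003 degrees on the track axis and 5/576 ≈ 0.009 degrees on the lift.  That’s why a seemingly large motor variance still lands the camera where we want it.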


Proto2 testing

Here’s a first test from the Proto2 rig.  As you can see, there’s a fair bit of work here isolating the shake.  The actual gear motion is relatively accurate, but the lift on the rig introduces some instabilities.  We’ll take a look at these in the next couple of days.

The second test is an overlay of the camera move we set in 3D (with a stand-in object) and the actual footage.  The general motion is fine, but it’s easy to see the stability problems.  Because we’ll be shooting close up on the miniatures, I expect to have a lot of parallax in the shots – which means that fixing stability in post (e.g. re-projecting the plates onto 3D objects and creating baked texture maps and semi-stabilized plates) is a complex problem.  So we’ll need to solve the stability issues at the rig level.

Prototype 2

So this is our second prototype rig.  That’s Benny Bondesgaard in the picture, who’s doing most of the building.  We’re shooting tests this weekend in preparation for doing a test shoot with the miniatures next week.  Our process goes a bit like this:

  1. Set a move in Autodesk Maya using a 3D version of the LEGO rig – built to scale, and with moving parts.
  2. Export the move in .move format (an ASCII format, where each axis is a column and frames are line-separated – see the parser sketch after this list).
  3. Set the rig to a default position (we’re working on a gear release, so that we can manually move the lift/rotation).
  4. Start Processing and EOS Utility (we’re using a Canon EOS 40D).
  5. Run our sketch.
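
For a sense of how simple the hand-off from Maya is, here’s a rough .move reader in plain Java – hypothetical code, assuming whitespace-delimited columns and no header row (the actual exporter may differ):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.ArrayList;
    import java.util.List;

    // A minimal .move reader: one line per frame, one whitespace-separated
    // column per axis, no header row. A sketch of the idea, not pipeline code.
    class MoveFile {
        static List<double[]> read(String path) throws IOException {
            List<double[]> frames = new ArrayList<double[]>();
            for (String line : Files.readAllLines(Paths.get(path))) {
                line = line.trim();
                if (line.isEmpty()) continue;        // skip blank lines
                String[] cols = line.split("\\s+");  // one column per axis
                double[] axes = new double[cols.length];
                for (int i = 0; i < cols.length; i++) {
                    axes[i] = Double.parseDouble(cols[i]);
                }
                frames.add(axes);                    // frames are line-separated
            }
            return frames;
        }
    }

Each double[] then holds one value per axis for a single frame, which we step through as incremental motor moves.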

I found a way to trigger the remote shoot in the EOS Utility using a simple AppleScript, which sorts through UI elements and finds the right button.  Not even close to a good long-term solution, but one that works fine for the moment.  I spent some time mucking around with gphoto and a few other utilities (as well as trying to wrap my head around the Canon EDSDK), and this is definitely the easiest to implement.  This way, we can create a minimal UI for the actual Processing application and still be able to use all of the functions Canon has built into the EOS Utility.

In the Processing sketch, we set each incremental move by setting a motorForwardLimit (which is based on the rotation of the servo).  Often enough, the motor overshoots the limit, so I set an adjustment move based on a threshold of error.  This gets us closer – still not quite exact, but we’re working on it.  I may end up going back to just turning on the motor, and gradually lowering the speed as I approach the right rotation count.  Or perhaps setting a limit for the first step, and then gradually working with the speed for the adjustment step.

After the camera trigger fires, I also run a loop that goes for five seconds (it tests against the millis() function in Processing, which is a millisecond count from when the sketch was initialized).  This gives the AppleScript enough time to open and click the release button in the EOS Utility.



