Archive for the 'motion control' Category

Loco filmed set dress

This is a quick timelapse filmed on our own loco motion control system. Here we see (mostly) Charlene Barre setting up props and buildings for one of our city sets. Very rough car animation by me – Sunit Parekh-Gaihede.

Eventually, we’d like to show the full evolution of a shot (from set dress to final grade), but we decided to put up this test sooner rather than later.

Loco and the city

Above is a recent motion control clip of one of our city sets (shot on our loco rig).  There’s a bit of compositing on the movie above, and we have a test running with 15 CG characters in the scene to see how much animation we need to put in for the scene to read believably.  If we get around to lighting that scene before production starts, I’ll put it up here.

Below are some more of the reference plates from the shoot, as well as shots from other city sets in progress.  The stand-in figures are built by the talented Hanna Habermann who will be joining the project again in January for the rest of the shoot.

Sets, lighting, and motion control

ma_move_test_v0010001

We’ve now got our own motion control rig set up.  It’s based on the last rig, which we shipped down to Studio SOI, who are using it for some exciting projects.  The test above is from a demonstration we gave to the Danish Film Institute earlier today.  There is a bit of subtle shaking in the shot, which comes from us handling the rig while the move was in progress.

Below are some recent tests of the trees Bente Laurenz Jacobsen (who is pictured below) is building for the project.  We discovered that ambient daylight is difficult to recreate indoors, so we built a large rig (like a flash umbrella) that we’ll stretch cloth over and hang above the sets for the day shots.  The shots below represent both the indoor light tests and some outdoor shots.  The environment around the workshop provides an interesting backdrop.

There are also some shots with Nancy Munford and Karin Ørum who came up for a weekend in February to finish work on the street sets.

The stairs on the house were some of the last items that got painted – hence they’re still foam in the pictures.

Loco 2nd Generation, stable and shipped

soi_3g_timelapse

So we shipped out our 2nd generation LEGO rig to a studio in Germany, who’ll be working with it for the next year on a combination stop-motion/CG project.  Above is a quick timelapse test I did this morning – the camera is running through the middle of people, dogs, movement, and general unrest in my squeaky, wood-floored studio.  The rig got pushed a bit on occasion, but the move came out incredibly stable – what you see above is straight from the camera.

All of the pictures you see below are with the top mount.  There is also a 1.5 meter bottom mount to hang the camera from.  Notice how the rig transitions to black – that took some laborious hours of sanding, washing, and spraying the rails and base plate.  For the next rig, we’ll try to hire someone else to cover some of these areas.

I’ve also implemented the iCommand NXT library, a command-based (rather than VM-based) project from leJOS.  I’m having much better luck with setting rotation limits and stepping down the power as we approach the actual rotation target.  I’ve also implemented parity compensation, as well as gear-lag compensation, in the software package, and the data structures are a bit cleaner.  It looks as though I’ll be writing a keyframe interface as well in the next month.
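The gear-lag compensation can be sketched as a simple backlash pad: when the commanded direction reverses, the gear train has to take up its slack before the axis actually moves, so the command gets padded by the measured lag. This is an illustrative reconstruction, not the actual iCommand-based code; the class name and lag value are hypothetical.

```java
class BacklashCompensator {
    private final int lagDegrees;   // measured slack in the gear train, in motor degrees
    private int lastDirection = 0;  // -1, 0, or +1; 0 means no move yet

    BacklashCompensator(int lagDegrees) {
        this.lagDegrees = lagDegrees;
    }

    // Given the next requested move in signed motor degrees, return the
    // move to actually command, padding direction changes by the lag.
    int compensate(int requestedDegrees) {
        int direction = Integer.signum(requestedDegrees);
        int commanded = requestedDegrees;
        if (direction != 0 && lastDirection != 0 && direction != lastDirection) {
            commanded += direction * lagDegrees;  // take up the slack first
        }
        if (direction != 0) {
            lastDirection = direction;
        }
        return commanded;
    }
}
```

Only a direction change pays the lag penalty; consecutive moves in the same direction pass through unchanged.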

I’ll be flying down to Germany on Sunday to set up and test the rig on the set there.  If I can (I’m not sure about the NDA restrictions), I’ll post some pictures.

As for the future: as you can see, parts of the rig are still Mindstorms-driven and LEGO-built.  There are benefits (modular construction, easy to refactor) and some looming disadvantages (plastic parts top that list).  At some point, not in the next generation, nor probably the one after, we’ll consider looking into other microprocessor boards and sensors, and designing the rig in a slightly different way.  I think LEGO will continue to play a big part in the development, and the feedback sensors, which are our biggest focus for the next version(s), will certainly continue to be in the design and prototype process.

The LEGO rig on set

Goutte d’Or is now running with the LEGO rig daily on set.  Here are some images around the ship and of the new lift.  Above is one of the latest camera moves.

New modules and Goutte d’Or

goutte_d_or

Above is a test clip we shot yesterday for Goutte d’Or, a stop motion film by my friend Christophe Peladan, which is using the LEGO rig.  I’m rebuilding the lift, and using a more modular construction for it.  I’ve also organized a large part of the LEGO collection, which you can see below.

There have been some great replies on the nxstasy forum to my questions about minimizing slop in the gear train for our rig.  With any luck, this new version will add some more stability and user friendliness.

The software has also been updated to deal with both the Canon EOS 40D and the Canon EOS 400D.

LEGO Motion Control v2

soi_proto9_pan_stabilized0125

soi_proto9_lift_stabilized0149

We’re building the next major iteration of the motion control system, and are thinking of supplementing the LEGO with some more stable parts.  The biggest of these is the base plate of the rig, which we’ll make from MDF to start with, to see how much stability it brings.

Tomorrow, the 1st iteration rig also starts getting use on La Goutte D’or, a stop motion puppet film in production in one of the other buildings here.  With the new seek algorithm, I’ve sped up the incremental moves by 300%, which means we can shoot about 8-12 seconds automated per hour.

The movies above represent moves run with the new software.  I’ve started consulting my handy American Cinematographers Manual to look at the pan and lift speeds.  Both moves are processed with the “smoothcam” node in Shake.  As far as I can tell, the amount of pixel shifting that happens when smoothing has a minimal effect on the plate – I don’t see much introduced blur, especially since our processing happens at 4K (roughly), and then gets downsized to 2K.  I haven’t yet tracked a “smoothed” shot, so I might discover some issues there.  The jitter in the original plate is also pretty minimal, and hopefully with the next iteration of the rig, we can bring it even further down.

Below are some images of the 1st iteration.

Proto6

ma_rnd_proto6_bg

So we’ve gone back and forth with various modifications on the rig.  We added some bogie wheels, and then realized that dynamic distribution of weight actually adds another factor of unpredictability to the motion.  The new rig is, however, more stable than the previous couple, and we’ve also attached a small Manfrotto head mount to the bottom of the lift, so the camera is much easier to mount.  The first movie below is the actual test, while the second is a stabilized version, run through Shake.  The movie above is roughly color corrected and stabilized, with the background replaced.  The reflections in the mirror are Benny and me moving around as the move is running.

original:

stabilized:

Proto4 test shoot2

ma_rnd_proto4_test3_hi

This is a clip from a rough shoot of one of our buildings.  In total we’ll have about 9 buildings, many of which will have interchangeable parts to add variation.

Our lift stopped working in the middle of the shot – a gear driving one of the four rack gears slipped out of place, but because of the differential (which aligns the four rack gears), I didn’t notice the problem until too far in.  Our next version should deal with some of the workflow complications we noticed through this shoot.

I’ve added some different coloring to one of the front walls in this shot – just to see how that might look.  In order to do that, I first applied a lens undistort (you can see the distortion at the edges of the frame), then color-corrected the plate, with a traveling matte that followed the camera move, and then redistorted the plate.  This way, I can create a matte for the wall that has straight lines, and then, when I redistort, get the lens barrelling back into the image.
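The round trip works because radial lens distortion is an invertible mapping on the radius, so a matte drawn in undistorted (straight-line) space can be pushed back through the lens model. Below is a minimal sketch with a one-term radial (Brown) model; the actual lens model in Shake’s warp nodes may differ, and the k1 value is illustrative.

```java
class LensModel {
    final double k1;  // single radial distortion coefficient (illustrative)

    LensModel(double k1) {
        this.k1 = k1;
    }

    // undistorted radius -> distorted radius (one-term Brown model)
    double distort(double rU) {
        return rU * (1 + k1 * rU * rU);
    }

    // inverse by fixed-point iteration; converges for small k1
    double undistort(double rD) {
        double rU = rD;
        for (int i = 0; i < 20; i++) {
            rU = rD / (1 + k1 * rU * rU);
        }
        return rU;
    }
}
```

Because `undistort` inverts `distort` to high precision, the color-corrected matte lines back up with the barrelled plate after the redistort.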

Below are some shots of the buildings and the shoot in progress.

Proto4 test shoot

So we shot some footage on our miniature set, which is being built about 400 km away.  Over the next few days, I’ll be adding some more shots.  Above is one of the clips (the flicker is from the abundant sunlight in the workspace – we’ll be shooting in a closed-off studio for the actual film shoot), with some work done in Shake to average the bluescreen, replace the background, and add the foreground flicker to the background (with a multiplier).
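The flicker transfer can be sketched as a per-frame multiplier: scale each replaced-background pixel by the ratio of that frame’s bluescreen luminance to the averaged luminance, so the new background inherits the foreground’s flicker. This is an illustrative reconstruction of the idea, not the actual Shake script.

```java
class FlickerMatch {
    // Scale a replaced-background pixel value by the frame's flicker
    // ratio: frame luminance over the averaged (deflickered) luminance.
    static double matchedBackground(double bgValue, double frameLuma, double averageLuma) {
        return bgValue * (frameLuma / averageLuma);
    }
}
```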

This is shot on our new set of rails (which are 2 meters long).  The rig will need some modification, since the rails are a bit tighter.  The lift is also (still) a bit unstable, but we think we’re on the right track – using 3 differentials driving 4 24-tooth gears along 4 rack gears.  We should, hopefully, be able to stabilize the motion completely.

Below are some of the photos of the shoot.

Proto4, PID, worm gears

So we’ve redesigned the lift mechanism a couple of times.  The lift you see here has about six worm gears (inside a gear housing) that drive four gears along four rack gears – one gear on each corner of the lift rod.  There are differentials between each set of gears and the central driving gear (three differential gears in all).  This is an improvement over some of our last models, which had dozens of gears; our theory is that at this small scale, the more plastic LEGO gears you introduce, the more possibility of error.  So less is more.

We’ve also been visually logging our degree variance in the Processing sketch I’m writing.  It’s pretty primitive, but it gives you a good idea of how close we are to a target rotation count.  At the moment, using a pseudo PID (with multiple levels of bi-directional adjustment once the degree count is within certain thresholds, as well as fail-safe conditions in case of runaway motors), we get an average of about 5 degrees of variance.  That’s not bad when we’ve got gear factors of 1,728 and 576 (track & lift).  To the left you can see a screen grab of what that visual graph looks like.
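A minimal sketch of that pseudo-PID idea: power bands keyed to error thresholds, nudging in either direction, with an iteration cap standing in for the runaway fail-safe. The band sizes and burst model are illustrative, not the actual sketch’s values.

```java
class PseudoPid {
    // Power bands keyed to the absolute error in degrees (illustrative).
    static int powerFor(int absError) {
        if (absError >= 360) return 60;
        if (absError >= 90) return 30;
        return 10;
    }

    // One settling pass toward `target`. maxIterations is the fail-safe
    // cap against a runaway loop. A burst at power p is modeled as
    // moving p / 2 degrees toward the target.
    static int settle(int position, int target, int maxIterations) {
        for (int i = 0; i < maxIterations; i++) {
            int error = target - position;
            if (Math.abs(error) <= 5) break;   // within ~5 degrees: done
            int power = powerFor(Math.abs(error));
            position += Integer.signum(error) * Math.max(1, power / 2);
        }
        return position;
    }
}
```

The signed nudge means an overshoot simply gets corrected from the other side on the next burst.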


Proto2 testing

Here’s a first test from the Proto2 rig.  As you can see, there’s a fair bit of work here isolating the shake.  The actual gear motion is relatively accurate, but the lift on the rig introduces some instabilities.  We’ll take a look at these in the next couple of days.

The second test is an overlay of the camera move we set in 3D (with a stand-in object) and the actual footage.  The general motion is fine, but it’s easy to see the stability problems.  Because we’ll be shooting close up on the miniatures, I expect a lot of parallax in the shots, which means that fixing stability in post (e.g. re-projecting the plates onto 3D objects and creating baked texture maps and semi-stabilized plates) is a complex problem.  So we’ll need to solve the stability issues at the rig level.

Prototype 2

So this is our second prototype rig.  That’s Benny Bondesgaard in the picture, who’s doing most of the building.  We’re shooting tests this weekend in preparation for doing a test shoot with the miniatures next week.  Our process goes a bit like this:

  1. Set a move in Autodesk Maya using a 3D version of the LEGO rig – built to scale, and with moving parts.
  2. Export the move in .move format (an ASCII format where each axis is a column and each frame is a line)
  3. Set the rig to a default position (we’re working on a gear release, so that we can manually move the lift/rotation).
  4. Start Processing and EOS Utility (we’re using a Canon EOS 40D).
  5. Run our sketch.
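The .move files from step 2 are simple to read back. A minimal parser, assuming whitespace-separated columns (the real exporter’s delimiter may differ):

```java
import java.util.ArrayList;
import java.util.List;

class MoveParser {
    // Parse .move text: one line per frame, one column per axis.
    // Returns a double[] of axis values for each frame.
    static List<double[]> parse(String text) {
        List<double[]> frames = new ArrayList<>();
        for (String line : text.split("\\r?\\n")) {
            line = line.trim();
            if (line.isEmpty()) continue;        // skip blank lines
            String[] cols = line.split("\\s+");  // axes are columns
            double[] axes = new double[cols.length];
            for (int i = 0; i < cols.length; i++) {
                axes[i] = Double.parseDouble(cols[i]);
            }
            frames.add(axes);
        }
        return frames;
    }
}
```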

I found a way to trigger the remote shoot in the EOS Utility using a simple AppleScript that sorts through UI elements and finds the right button.  Not even close to a good long-term solution, but one that works fine for the moment.  I spent some time mucking around with gphoto and a few other utilities (as well as trying to wrap my head around the Canon ED-SDK), and this is definitely the easiest to implement.  This way, we can keep the UI of the actual Processing application minimal and still use all of the functions Canon has built into the EOS Utility.

In the Processing sketch, we set each incremental move by setting a motorForwardLimit (which is based on the rotation of the servo).  Often enough, the motor overshoots the limit, so I set an adjustment move based on a threshold of error.  This gets us closer – still not quite exact, but we’re working on it.  I may end up going back to just turning on the motor, and gradually lowering the speed as I approach the right rotation count.  Or perhaps setting a limit for the first step, and then gradually working with the speed for the adjustment step.
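The limit-plus-adjustment scheme can be sketched like this: command the forward limit, read back the actual rotation, and issue a correction move when the error exceeds the threshold. The motor here is a toy simulation standing in for the Mindstorms servo, and all the numbers are illustrative.

```java
class LimitSeek {
    static final int THRESHOLD = 10;  // degrees of acceptable error

    // Toy motor: overshoots large moves by ~5%, is exact on small ones.
    static int runToLimit(int position, int limitDelta) {
        int overshoot = Math.abs(limitDelta) > 100 ? limitDelta / 20 : 0;
        return position + limitDelta + overshoot;
    }

    // Main move toward the target, then one adjustment pass if the
    // remaining error is over the threshold.
    static int moveTo(int position, int target) {
        position = runToLimit(position, target - position);   // main move
        int error = target - position;
        if (Math.abs(error) > THRESHOLD) {
            position = runToLimit(position, error);           // adjustment
        }
        return position;
    }
}
```

In this toy model the small correction move lands exactly; on the real rig the adjustment just shrinks the error, which is why the post describes it as closer but not quite exact.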

After the camera trigger clicks, I also run a loop that goes for five seconds (it tests against the millis() function in Processing, which counts milliseconds from when the sketch was initialized).  This gives the AppleScript enough time to open and click the release button in the EOS Utility.

Home Brew Rigs

This is one of the first tests on our homemade motion control system.

Contrast this with another test we made on a professional motion control unit last year:

The homemade rig is as yet pretty unstable, but the project is in progress (sponsored, no less, by LEGO).  Following a suggestion from Saschka Unseld over at Studio Soi, we discovered that by stabilizing just one point in the dolly shot and cropping in, you can fake the illusion of a track/pan.

Stability test 2

Ok, so this is our second test – using a more manageable gear factor of 576 (24 ^ 2), and with some increased speed on the motor.  Unfortunately, we’ve got some issues here:

  1. Our software is not resetting the rig back to the same position
  2. The motion is more inconsistent between passes (than the last test)
  3. The camera looks like it’s shifted a bit (could be focal length, could be focus) in the first frame – we’ll need to lock it down a bit more for the next tests.

Below you see two passes that have edge detection applied screened together.  The jitter you see (where the red and green jump around each other) represents the motion instability.
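“Screened together” here is the standard screen blend, result = 1 - (1 - a)(1 - b) applied per channel, which keeps both edge-detected passes visible without clipping either one:

```java
class ScreenBlend {
    // screen(a, b) = 1 - (1 - a) * (1 - b), with values in 0..1.
    // Symmetric, never darker than either input, never exceeds 1.
    static double screen(double a, double b) {
        return 1 - (1 - a) * (1 - b);
    }
}
```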

red_green

First stability test

Here are two separate passes we shot with our stability rig.  Our gear factor is an incredible 1:13824, and the motors take a couple of minutes to move 0.1 cm.  While that precision is great, it’s a bit unwieldy unless we get some higher-power motors.  So we’ll be working on finding the sweet spot, where we can hit the same precision but get the rig to move a bit quicker (click on the images for the movies).
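To put the reduction in perspective, the arithmetic: with a gear factor R, one full motor revolution yields only 360/R degrees of output, which is where the precision (and the slowness) comes from. The motor rpm below is an illustrative number, not a measured one.

```java
class GearFactor {
    // Output degrees produced by one full motor revolution.
    static double outputDegreesPerMotorRev(double reduction) {
        return 360.0 / reduction;
    }

    // Minutes of motor running needed for a given output rotation,
    // at a given motor speed in rpm (illustrative values).
    static double minutesToRotate(double outputDegrees, double reduction, double motorRpm) {
        double motorRevs = outputDegrees * reduction / 360.0;
        return motorRevs / motorRpm;
    }
}
```

At 1:13824, a single output degree costs 38.4 motor revolutions, so even a fast motor crawls; the 576 factor in the later test trades some of that resolution for speed.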

This move is driven by Processing and Maya.  We’re mostly just using Maya to set some simple moves and use its curve editors, until we get our own flavor of 3D interface up and running in Processing.  The moves are exported as .move files, which I then parse in Processing and translate into motion for the rig.

This test is to find out whether we can accurately repeat the same motion.  Below is the combination clip, where blue represents one layer, and red represents the other layer.  For the next few tests, I’ll shoot each pass with the same lighting conditions and do some difference composites (instead of this red/blue).

While the test is interesting, I’ve had to slightly hand-transform one pass (a few pixels), animated over the length of the shot, in order to match the other.  Our task will be to narrow down where this discrepancy is happening (the rig, the motors, or the software).

This test is only 25 frames, but we’ll have some longer tests up soon as well.

Servo Motors

Regarding the Prototype Test: while the movement can be stabilized in post, we can’t yet recreate the same motion over multiple passes.  So that will be our first task – testing the usability of the Mindstorms servos.

Our software is based on Processing and the NXT Java library.  Processing is a great environment to sketch in, and we’ll be writing our interfaces in it.

Prototype Test

Here’s a test we made with our prototype system a couple of months ago.  The movement was stabilized in post, and one of the challenges with the actual build will be creating stable motion in camera.

The lighting for this shot was a simple office style lamp and some ambient fill from the windows (which I then deflickered in Shake).  One of the nice things about shooting with a stills camera (rather than a moving image one) is that our exposure can be extremely long, and our lighting can be relatively low.

Legos have arrived!

These are some of the photos we took when the LEGO first arrived.


Images from the shoot

Thanks to all for the shoot. Here’s a quick comp of one of the motion control moves (click on image for movie):

sh015_test_mid.jpg

and another one:

sh002_test_mid.jpg

Both of these movies need a fair bit of comp work (dust busting, stabilizing, flicker removal, etc.).  At the moment they’re also comped at 8-bit, but I’ll soon be moving over to our 16-bit pipeline (thanks to the RAW format we shot in).

Some images (I’ll put up a few more images on this post in the next couple of days):

comp.jpg comp1.jpg img_0003_1.jpg



