Miniature molds

These are some pictures of the molds we use for the miniatures. Notice the use of LEGOs to frame the two-component plastic we cast in. The LEGOs are easy to pull off and reassemble, rather than having to build a new frame for each cast.

Proto4, PID, worm gears

So we’ve redesigned the lift mechanism a couple of times. The lift you see here has about six worm gears (inside a gear housing) that drive four gears around four rack gears – one gear on each corner of the lift rod. There are differentials between each set of gears and the central driving gear (three differential gears in all). This is an improvement over some of our earlier models, which had dozens of gears – our theory is that at this small scale, the more plastic LEGO gears you introduce, the more opportunity for error. So less is more.

We’ve also been visually logging our degree variance in the Processing sketch I’m writing. It’s pretty primitive, but it gives you a good idea of how close we are to a target rotation count. At the moment, using a pseudo PID (with multiple levels of bi-directional adjustment once the degree count is within certain thresholds, as well as fail-safe conditions in case of runaway motors), we get an average of about 5 degrees of variance. That’s not bad when we’ve got gear factors of 1,728 and 576 (track & lift). To the left you can see a screen grab of what that visual graph looks like.
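The shape of that threshold-based adjustment is roughly the following – a simplified Java sketch rather than our actual Processing code, with hypothetical threshold and speed values standing in for the real tuning:

```java
// Simplified sketch of the threshold-based "pseudo PID" described above.
// The threshold/speed values and the runaway time limit are hypothetical
// stand-ins, not the rig's real tuning.
public class PseudoPid {

    // Pick a correction speed from how far we are off the target rotation.
    // The sign gives direction (bi-directional adjustment); zero means
    // "inside tolerance, stop correcting".
    static int correctionSpeed(int targetDegrees, int currentDegrees) {
        int error = targetDegrees - currentDegrees;
        int magnitude;
        if (Math.abs(error) > 90)      magnitude = 50; // far off: move fast
        else if (Math.abs(error) > 20) magnitude = 15; // closer: slow down
        else if (Math.abs(error) > 5)  magnitude = 5;  // fine adjustment
        else                           magnitude = 0;  // within tolerance
        return (error >= 0) ? magnitude : -magnitude;
    }

    // Fail-safe: if a correction has been running too long, assume a
    // runaway motor and cut power.
    static boolean runaway(long elapsedMillis, long limitMillis) {
        return elapsedMillis > limitMillis;
    }

    public static void main(String[] args) {
        System.out.println(correctionSpeed(1000, 600));  // 400 deg short -> 50
        System.out.println(correctionSpeed(1000, 1015)); // 15 deg past -> -5
        System.out.println(correctionSpeed(1000, 1003)); // inside tolerance -> 0
    }
}
```

Each pass through the loop re-reads the rotation count and re-picks a speed, so the motor steps down through the thresholds as it closes in on the target.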



Proto2 testing

Here’s a first test from the Proto2 rig.  As you can see, there’s a fair bit of work here isolating the shake.  The actual gear motion is relatively accurate, but the lift on the rig introduces some instabilities.  We’ll take a look at these in the next couple of days.

The second test is an overlay of the camera move we set in 3D (with a stand-in object) and the actual footage. The general motion is fine, but it’s easy to see the stability problems. Because we’ll be shooting close up on the miniatures, I expect to have a lot of parallax in the shots – which means that fixing stability in post (e.g. re-projecting the plates onto 3D objects, creating baked texture maps, and semi-stabilized plates) is a complex problem. So we’ll need to solve the stability issues at the rig level.

Prototype 2

So this is our second prototype rig.  That’s Benny Bondesgaard in the picture, who’s doing most of the building.  We’re shooting tests this weekend in preparation for doing a test shoot with the miniatures next week.  Our process goes a bit like this:

  1. Set a move in Autodesk Maya using a 3D version of the LEGO rig – built to scale, and with moving parts.
  2. Export the move in .move format (an ASCII format where each axis is a column and frames are line-separated).
  3. Set the rig to a default position (we’re working on a gear release, so that we can manually move the lift/rotation).
  4. Start Processing and EOS Utility (we’re using a Canon EOS 40D).
  5. Run our sketch.
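The .move parsing in step 2 can be sketched like this – assuming whitespace-separated columns, which (along with the axis order) is an assumption of this sketch rather than a spec:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal parser for the .move format described above: plain ASCII,
// one line per frame, one column per axis. The whitespace separator
// is an assumption for this sketch.
public class MoveParser {

    static List<double[]> parse(String text) {
        List<double[]> frames = new ArrayList<>();
        for (String line : text.split("\n")) {
            line = line.trim();
            if (line.isEmpty()) continue;       // skip blank lines
            String[] cols = line.split("\\s+"); // one column per axis
            double[] axes = new double[cols.length];
            for (int i = 0; i < cols.length; i++) {
                axes[i] = Double.parseDouble(cols[i]);
            }
            frames.add(axes);
        }
        return frames;
    }

    public static void main(String[] args) {
        // Two frames, three axes (e.g. track, lift, rotation)
        String move = "0.0 0.0 0.0\n0.5 0.1 2.0\n";
        List<double[]> frames = parse(move);
        System.out.println(frames.size());    // 2
        System.out.println(frames.get(1)[2]); // 2.0
    }
}
```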

I found a way to trigger the remote shoot in the EOS Utility using a simple AppleScript which sorts through UI elements and finds the right button. Not even close to a good long-term solution, but one that works fine for the moment. I spent some time mucking around with gphoto and a few other utilities (as well as trying to wrap my head around the Canon EDSDK), and this is definitely the easiest to implement. This way, we can create a minimal UI for the actual Processing application and still be able to use all of the functions Canon has built into the EOS Utility.
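From the Java/Processing side, the trigger amounts to shelling out to osascript, roughly like this sketch (the script path is a hypothetical placeholder; the real AppleScript is what walks the EOS Utility’s UI elements and clicks the release button):

```java
// Rough sketch of invoking an AppleScript from Java, the way our
// Processing sketch kicks off the EOS Utility release. The script
// path below is a hypothetical placeholder.
public class CameraTrigger {

    // Build the osascript command line for a given compiled script.
    static String[] osascriptCommand(String scriptPath) {
        return new String[] { "osascript", scriptPath };
    }

    // Launch the script; we don't wait here, since the sketch's own
    // delay loop covers the time the script needs to click the button.
    static void trigger(String scriptPath) throws Exception {
        new ProcessBuilder(osascriptCommand(scriptPath)).start();
    }

    public static void main(String[] args) {
        String[] cmd = osascriptCommand("/Users/me/trigger_eos.scpt");
        System.out.println(String.join(" ", cmd));
    }
}
```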

In the Processing sketch, we set each incremental move by setting a motorForwardLimit (which is based on the rotation of the servo). Often enough, the motor overshoots the limit, so I set an adjustment move based on an error threshold. This gets us closer – still not quite exact, but we’re working on it. I may end up going back to just turning on the motor and gradually lowering the speed as I approach the right rotation count. Or perhaps setting a limit for the first step, and then gradually working with the speed for the adjustment step.
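The "gradually lowering the speed" alternative could look something like this sketch, where speed is interpolated down over a ramp zone near the target (the speed range and zone size are hypothetical values, not tested tuning):

```java
// Sketch of the ramp-down alternative mentioned above: motor speed
// falls off linearly inside a "ramp zone" before the target, so the
// motor coasts in instead of slamming into a hard forward limit.
// maxSpeed/minSpeed/rampZone are hypothetical values.
public class SpeedRamp {

    static int rampedSpeed(int remainingDegrees, int maxSpeed,
                           int minSpeed, int rampZone) {
        if (remainingDegrees <= 0) return 0;                // at/past target
        if (remainingDegrees >= rampZone) return maxSpeed;  // far away
        // Inside the ramp zone: interpolate down toward minSpeed
        return minSpeed
            + (maxSpeed - minSpeed) * remainingDegrees / rampZone;
    }

    public static void main(String[] args) {
        System.out.println(rampedSpeed(500, 75, 5, 360)); // outside zone: 75
        System.out.println(rampedSpeed(180, 75, 5, 360)); // halfway in: 40
        System.out.println(rampedSpeed(0,   75, 5, 360)); // at target: 0
    }
}
```

Keeping minSpeed above zero matters: the NXT motors need some minimum power to move at all under load, so the ramp bottoms out at a crawl rather than stalling short of the target.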

After the camera trigger clicks, I also run a loop that goes for five seconds (it tests against the millis() function in Processing, which is a millisecond count from when the sketch was initialized).  This is so that the Applescript has enough time to open and click on the release button in the EOS Utility.
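That delay loop is simple enough to sketch – here in plain Java, with System.currentTimeMillis() standing in for Processing's millis(), and the duration parameterized so the camera delay is just waitFor(5000):

```java
// The post-trigger delay loop, with System.currentTimeMillis()
// standing in for Processing's millis().
public class TriggerDelay {

    static void waitFor(long durationMillis) {
        long start = System.currentTimeMillis();
        // Spin until the duration has elapsed; in the sketch this gives
        // the AppleScript time to open and click the release button.
        while (System.currentTimeMillis() - start < durationMillis) {
            // intentionally empty
        }
    }

    public static void main(String[] args) {
        long start = System.currentTimeMillis();
        waitFor(50); // short wait for demonstration; the rig uses 5000
        System.out.println(System.currentTimeMillis() - start >= 50); // true
    }
}
```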


This is the start of one of our window frames (for the 1:24 scale set). We’ll be refining this and then using it as a master for, hopefully, a couple of the buildings.

Home Brew Rigs

This is one of the first tests of our home-made motion control system.

Contrast this with another test we made on a professional motion control unit last year:

The home-made rig is as yet pretty unstable, but the project is in progress (sponsored, no less, by LEGO). Following a suggestion from Saschka Unseld over at Studio Soi, we discovered that by stabilizing just one point in the dolly shot and cropping in, you can create the illusion of a track/pan.

Stability test 2

Ok, so this is our second test – using a more manageable gear factor of 576 (24 ^ 2), and with some increased speed on the motor.  Unfortunately, we’ve got some issues here:

  1. Our software is not resetting the rig back to the same position.
  2. The motion is less consistent between passes than in the last test.
  3. The camera looks like it shifted a bit (could be focal length, could be focus) in the first frame – we’ll need to lock it down more for the next tests.

Below you see two passes that have edge detection applied screened together.  The jitter you see (where the red and green jump around each other) represents the motion instability.


First stability test

Here are two separate passes we shot with our stability rig. Our gear factor is an incredible 1:13,824 and the motors take a couple of minutes to move 0.1 cm. While that precision is great, it’s a bit unwieldy unless we get some higher-powered motors. So we’ll be working on finding the sweet spot, where we can hit the same precision but get the rig to move a bit quicker (click on the images for the movies).

This move is driven by Processing and Maya. We’re mostly just using Maya to set some simple moves and use its curve editors, until we get our own flavor of 3D interface up and running in Processing. The moves are exported as .move files, which I then parse in Processing and translate into motion for the rig.
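The last step – turning a parsed translation value into motor rotation – might look roughly like this. The centimeters-per-output-rotation value is a hypothetical placeholder (it depends on the rack pitch), and a gear factor of 576 is used only as an example from our setups:

```java
// Sketch of converting a .move translation target into motor degrees.
// With a gear factor of 576, the motor turns 576 degrees for every
// degree of output rotation. cmPerOutputRotation is a hypothetical
// placeholder that depends on the rack pitch.
public class RigTranslation {

    static double motorDegrees(double targetCm,
                               double cmPerOutputRotation,
                               double gearFactor) {
        double outputRotations = targetCm / cmPerOutputRotation;
        return outputRotations * 360.0 * gearFactor;
    }

    public static void main(String[] args) {
        // Hypothetical: 1 cm of track travel per output rotation,
        // example gear factor of 576
        System.out.println(motorDegrees(0.5, 1.0, 576)); // 103680.0
    }
}
```

The large numbers are the point: at these gear factors, a fraction of a centimeter maps to tens of thousands of motor degrees, which is why the rotation count gives us fine-grained control over such small moves.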

This test is to find out whether we can accurately repeat the same motion.  Below is the combination clip, where blue represents one layer, and red represents the other layer.  For the next few tests, I’ll shoot each pass with the same lighting conditions and do some difference composites (instead of this red/blue).

While the test is interesting, I’ve had to slightly hand-transform one pass (by a few pixels, animated over the length of the shot) in order to match the other. Our task will be to narrow down where this discrepancy is happening (our rig, the motors, or the software).

This test is only 25 frames, but we’ll have some longer tests up soon as well.

Servo Motors

Regarding the Prototype Test, while the movement can be stabilized in post, we can’t yet recreate the same motion over multiple passes. So that will be our first task – testing the usability of the Mindstorms servos.

Our software is based on Processing and the NXT Java library. Processing is a great environment to sketch in, and we’ll be writing our interfaces in it.

Prototype Test

Here’s a test we made with our prototype system a couple of months ago. The movement was stabilized in post, and one of our challenges with the actual build will be whether we can create equally stable motion with the rig itself.

The lighting for this shot was a simple office-style lamp and some ambient fill from the windows (which I then deflickered in Shake). One of the nice things about shooting with a stills camera (rather than a moving-image one) is that our exposures can be extremely long, and our lighting can be relatively low.

Legos have arrived!

These are some of the photos we took when the LEGOs first arrived.

Images from the shoot

Thanks to all for the shoot. Here’s a quick comp of one of the motion control moves (click on image for movie):


and another one:


Both of these movies need a fair bit of comp work (dust busting, stabilizing, flicker removal, etc.). At the moment, they’re also comped at 8-bit, but I’ll soon be moving over to our 16-bit pipeline (thanks to the RAW format we shot in).

Some images (I’ll put up a few more images on this post in the next couple of days):



Some miscellaneous photos and the set on location:

Some recent images. The walls will be painted lighter; the doors, floors, and panels are all undergoing treatment; and the furniture and appliances are in progress, all thanks to Karin. Additional thanks to Christine Bechameil and Ursula Nielsen for advice, paint, and furniture wear & tear.

miniature setup

Here are some more setups of the set. From here the dirty work begins: making these pieces look more real.

miniature build

Here are some photos from the miniature build with Karin Ørum Pedersen. Thanks to the guys at Samsoe & Fribo for the space in Avedøre:

All the walls are removable to allow for camera access, and the base is cut into three separate pieces.

miniature test

A movie from the miniature test today, shot with my consumer Panasonic NV-GS400 cam. About 19 takes with different f-stops, shutter speeds, and moves. Click on the image for the movie:


Below – pictures with my Fujifilm FinePix S7000 (with a little bit of Photoshop work). Click on the images for larger versions:
