Monday, March 19, 2012

Sometimes it is amazing how upgrading to a new version can make a problem just magically go away. It also helps to make a useful mistake along the way.

I work with three 3D camera trackers: PFHoePro (The Pixel Farm), SynthEyes (Andersson Technologies), and Camera Tracker for After Effects (The Foundry). I just started a new project that involved some deceptively simple green screen elements. The lens used for shooting these objects, however (a fairly long one, 75mm), was the kind you can shift into different positions to offset the perspective. Doing camera or object tracks with footage shot through such a lens can be a little tricky, especially with a long lens.

On set in a supervisory capacity, I was tempted to upset the whole arrangement by asking to shoot with a different (that is, non-shiftable) lens. But I am loath to do too much of that kind of thing, mostly because they had already decided that the look of the objects was "right". I'm more likely to ask for slight adjustments to the shoot that take maybe two minutes and save hours (or, hopefully, days) in post. In this case, the good folks at AutoFuss were good enough to humor me. They used a longer lens than the one they started with (but still shifted), and added to the rig some very high tech accessories I had brought with me: marked-up clothespins (they look sort of like little "Egyptian Mau"s).

The longer lens was to get more of the rig in the shot; the clothespins were quickly clamped onto the rig (away from the important object) to make it more "track-able". We were shooting with the Red One, so there was plenty of resolution to allow for this. The first tests I did showed that I was going to have problems. All attempts at tracking either flat-out failed or created bad cameras infected with "inverted perspective", where the software "sees" a detail moving across the screen and interprets that movement as background when it should be foreground, or vice versa. This can happen when background details are fuzzy and jump around, appearing to move "faster" (if they weren't jumping, they wouldn't appear to be moving faster, you see), so the algorithms decide that such movement must be closer to camera. This doesn't happen often, but it happens. Inverted perspective can also happen when the perspective is so compressed, as with a very long lens, that the software gets fooled and chooses a screwy "solution". In this case, half the move was correct; then the solved camera decided to do a 180, as it were, and started moving in the opposite direction, even though the original camera continued along the "same" path.

Anyway, the shifted lens was complicating things, and probably adding to the perspective problems. When I first saw the camera and lens set-up, I suspected this might lead to weirdness, so just in case, when they were about to wrap, I had them shoot a couple of reference frames of the lens in its shifted and "unshifted" positions.

So fast forward to the next day, when all the footage was in, and I was playing around like crazy trying to get my trackers to work. I didn't have too much faith in PFHoePro, although it happens to give the best results when it works, and I have done some pretty nutty things to footage to get it to work in PFHoePro when it seemed like there was no way such an "auto-pilot" tool would handle less-than-ideal footage. SynthEyes seemed like the best bet, but I figured I'd still have to pull some tricks out of my bag to get it to behave. One of the things I tried was simply rotating the footage -90 degrees. Interestingly, the tracker behaved differently, but still failed.

Many hours later, I had tried a number of things, but felt I needed to come up with something really wacky. Then, after a lot of playing around and meditating, I thought "hey, I just got paid; let's upgrade SynthEyes" (I was still using the '08 version).

After upgrading to the newer '11 version, my intention was to first plug in the original footage to see how it would do. But that isn't what I did. I accidentally plugged in the rotated-only version and was amazed to see a perfect track. When I eventually realized my "mistake" of using the rotated footage, I thought, oh, I'll just plug in the original non-rotated footage. But guess what? FAIL. Inverted perspective.

So it seems that SynthEyes understands rotations around the Y axis better than the Z (given X=left-right, Y=up-down, Z=backward-forward). Go figure.
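If you want to try the same rotate-before-tracking trick, the mechanics are cheap: rotate every frame 90 degrees before solving, then undo it afterward by applying the opposite roll to the solved camera. A rough sketch with numpy (the frame here is a fake array, and the roll fix-up is shown as a plain rotation matrix about the camera's view axis; this is not how any particular tracker stores its solve):

```python
import numpy as np

def rotate_frame_cw(frame):
    """Rotate an image array 90 degrees clockwise (i.e. by -90)."""
    return np.rot90(frame, k=-1)

def roll_matrix(degrees):
    """Rotation about the camera's view (Z) axis, for un-rotating the solve."""
    r = np.radians(degrees)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

frame = np.zeros((1080, 1920, 3))   # fake 1920x1080 frame
rotated = rotate_frame_cw(frame)    # now portrait: 1080 wide, 1920 tall
print(rotated.shape)                # (1920, 1080, 3)

# After tracking the rotated plates, apply the opposite roll (+90)
# to each solved camera orientation matrix R_solved:
# R_fixed = roll_matrix(90) @ R_solved
```

The two rotations cancel exactly, so nothing about the original move is lost; you're just handing the solver the footage in an orientation it happens to digest better.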

Sunday, February 19, 2012

I wanted to get something on the blog, but wasn't ready to post something too long, and most of my drafts are pretty monstrous. So, here's a nice image that makes a dang fine desktop picture. I made it in Cheetah 3D. Steal it if you want -- click on "Shiny Chain" to get it in its largest size (of course, larger versions are available upon request).

Shiny Chain

Friday, January 20, 2012

Recently I was alerted to the fact that the newest version of the SynthEyes 3D camera tracker uses a new scheme to export for After Effects (I am one version behind). In addition to the old ".ma" route, SE can now also generate a .jsx (JavaScript) file that can be run from AE to build the appropriate objects (camera and nulls). This is a better approach, but it exposed the glaring inferiority of version 3.3 of my 3D_TrackedCamera_Preflight script. Thanks to Julian Herrera for bringing this to my attention. I have updated the script to correct the problem, improved the code in other ways, and added a few cosmetic touches. If you are one of the millions of adoring fans of this script (Mr. Herrera's feedback was the first I had ever received), please download the latest version (3.5) and replace the old one.

Tuesday, January 17, 2012

I sort of wish I could add about fifty posts or so and pretend I've been doing this for years. But to be honest, this is post #1. Soon I'll be posting announcements about the various activities and interests that keep me occupied, or preoccupied, not to mention (what an odd convention ... I'm mentioning it, aren't I?) whatever tricks of fate life decides to throw at my cringing face.
For the time being, if you read Japanese, or you like to see a few English words surrounded at spearpoint by Hiragana and Katakana, feel free to check out this mention (thanks to Hiroyuki Sato) of a couple of my recent geeky experiments: a hack of a fun script for Adobe Illustrator, and my linb UI Builder hack (which I, for some pretentious reason, decided to call "Boethos").
I'm working on a couple of explanatory movies for these offerings. Here are a couple of paragraphs to give you the gist:
  • Tree-likeUI.js is a tree shape-making script. The original script (by Sato-san) was so cool that I decided to make it more fun, easy, and efficient to use. My hack adds a user interface and makes the script "remember" previous settings, so a user can bang out a whole forest in no time. It also gave me an excuse to create a logarithmic slider control. If that sounds like something you don't care about, by all means, ignore it.

  • If you develop scripts for CS apps (After Effects, Photoshop, Illustrator, &c), Boethos hacks (or really, utilizes) the CrossUI UI Builder (formerly the jsLinb UI Builder) so that you can spit out a user interface in very little time (this supersedes and far outdoes my Comp_To_UI Script for After Effects). My first official use of Boethos was for ... guess what? The Tree-likeUI script. A somewhat detailed description of how it works can be read via the link above. A short tutorial is on Hiroyuki's blog (link above). My own tutorial is rather large and I'm only about half done. Mr. Sato shamed me by making a tutorial in a matter of hours.
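Since the logarithmic slider keeps coming up: the idea is just to treat the slider's linear position as an exponent, so equal slider travel gives equal ratios instead of equal differences. A toy version of the mapping in Python (the actual script is ExtendScript, and the UI part is omitted; the range values here are made up):

```python
import math

def slider_to_value(t, lo, hi):
    """Map a linear slider position t in [0, 1] to a log-spaced value in [lo, hi]."""
    return lo * (hi / lo) ** t

def value_to_slider(v, lo, hi):
    """Inverse mapping, for initializing the slider from a stored value."""
    return math.log(v / lo) / math.log(hi / lo)

# On a 0.01-to-100 range, the midpoint of the slider lands on 1.0,
# so tiny values get as much slider travel as huge ones.
print(slider_to_value(0.5, 0.01, 100.0))   # 1.0
print(value_to_slider(10.0, 0.01, 100.0))  # 0.75
```

Handy for parameters like branch scale that feel "right" in ratios rather than increments.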