Sometimes it is amazing how upgrading to a new version can make a problem just magically go away. It also helps to make a useful mistake along the way.
I work with three 3D camera trackers: PFHoePro (The Pixel Farm), SynthEyes (Andersson Technologies), and Camera Tracker for After Effects (The Foundry). I'd just started a new project that involved some deceptively simple green screen elements. The lens used for shooting these objects, however (a fairly long one, 75mm), was the kind that you can shift into different positions so as to offset the perspective. Doing camera or object tracks with footage shot with such a lens can be a little tricky, especially with a long lens.
On set in a supervisory capacity, I was tempted to upset the whole arrangement by asking to shoot with a different (that is, non-shiftable) lens. But I am loath to do too much of that kind of thing, mostly because they had already decided that the look of the objects was "right". I'm more likely to ask for slight adjustments to the shoot that take maybe 2 minutes and save hours (or days, hopefully) in post. In this case, the good folks at AutoFuss were good enough to humor me and use a longer lens than the one they started with (but still shifted), and to add to the rig some very high-tech additions I had brought with me: marked-up clothespins (they look sort of like little "Egyptian Mau"s).
The longer lens was to get more of the rig in the shot; the clothespins were quickly clamped onto the rig (away from the important object) to make it more "track-able". We were shooting with the Red One, so there was plenty of resolution to allow for this. The first tests I did showed that I was going to have problems. All attempts at tracking either flat-out failed or created bad cameras infected with "inverted perspective" -- where the software "sees" a detail moving across the screen and interprets that movement as background when it should be foreground, or vice versa. This can happen when background details are fuzzy and jump around, and so appear to be moving "faster" (if they weren't jumping, they wouldn't seem to be moving faster, you see), so the algorithms decide that such movement must be closer to camera. This doesn't happen often, but it happens. Inverted perspective can also happen when the perspective is so compressed, as with a very long lens, that the software gets fooled and chooses a screwy "solution" (as in this case: half the move was correct, then the solved camera decided to do a 180, as it were, and started moving in the opposite direction, even though the original camera continued along the same path).
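To make the "faster means closer" intuition concrete: under perspective projection, a point at depth Z lands on the image at x = f·X/Z, so for a sideways camera move its image-space displacement falls off as 1/Z. A minimal sketch of that relationship (my own illustration, not code from any of the trackers):

```python
# Hypothetical illustration: for a camera translating sideways by t,
# a point at depth Z projects at x = f * X / Z, so its per-frame
# image shift is roughly f * t / Z -- closer points move more.
def image_shift(f_mm, t_m, depth_m):
    """Approximate image shift (mm on the sensor) of a point at
    depth depth_m, for a lateral camera translation of t_m."""
    return f_mm * t_m / depth_m

near = image_shift(75.0, 0.01, 1.0)   # point 1 m from camera
far = image_shift(75.0, 0.01, 10.0)   # point 10 m from camera
print(near, far)  # the near point moves 10x faster across the frame
```

This is why jittery background detail that "reads" as fast motion can get solved as foreground: the solver's depth estimate is tangled up with how fast a feature appears to move.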
Anyway, the shifted lens was complicating things, and probably adding to the perspective problems. When I first saw the camera and lens set-up, I suspected this might lead to weirdness, so just in case, when they were about to wrap, I had them shoot a couple of reference frames of the lens in its shifted and "unshifted" positions.
So fast forward to the next day, when all the footage was in, and I was playing around like crazy trying to get my trackers to work. I didn't have too much faith in PFHoePro, although it happens to give the best results when it works, and I have done some pretty nutty things to footage to get it to work in PFHoePro when it seemed like there was no way such an "auto-pilot" tool would handle less-than-ideal footage. SynthEyes seemed like the best bet, but I figured I'd still have to pull some tricks out of my bag to get it to behave. One of the things I tried was simply rotating the footage -90 degrees. Interestingly, the tracker behaved differently, but still failed.
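For reference, the rotate-the-footage trick is just a 90-degree image-plane rotation applied to every frame before tracking. A minimal numpy sketch of the idea (the real footage went through a compositing app, not this code):

```python
import numpy as np

# Stand-in for one frame of footage: a 3-row by 4-column "image".
frame = np.arange(12).reshape(3, 4)

# Rotate -90 degrees (clockwise): np.rot90 with k=-1.
# Width and height swap, which is the whole point of the trick.
rotated = np.rot90(frame, k=-1)
print(frame.shape, rotated.shape)  # (3, 4) (4, 3)
```

After tracking, the solved camera has to be counter-rotated by the same amount to line back up with the original plates.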
Many hours later, I had tried a number of things, but felt I needed to come up with something really wacky. Then, after a lot of playing around and meditating, I thought "hey, I just got paid; let's upgrade SynthEyes" (I was still using the '08 version).
After upgrading to the newer '11 version, my intention was to first plug in the original footage to see how it would do. But this wasn't what I did. I accidentally plugged in the rotated-only version and was amazed to see a perfect track. When I eventually realized my "mistake" of using the rotated footage, I thought, oh, I'll just plug in the original non-rotated footage. But guess what? FAIL. Inverted perspective.
So it seems that SynthEyes understands rotations around the Y axis better than the Z (given X=left-right, Y=up-down, Z=backward-forward). Go figure.
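To pin down which axis is which in the convention above: rotating the footage in the image plane is a rotation about Z, while a camera pan is a rotation about Y. A small sketch of the two rotation matrices (purely illustrative; nothing here comes from SynthEyes itself):

```python
import numpy as np

def rot_y(a):
    """Rotation about the up-down (Y) axis -- a camera pan."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def rot_z(a):
    """Rotation about the backward-forward (Z) axis -- an
    image-plane rotation, like rotating footage 90 degrees."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0],
                     [s, c, 0.0],
                     [0.0, 0.0, 1.0]])

x = np.array([1.0, 0.0, 0.0])  # a left-right direction
# A 90-degree Z rotation sends "left-right" to "up-down"...
print(np.round(rot_z(np.pi / 2) @ x, 6))
# ...while a 90-degree Y rotation sends it out of the image plane.
print(np.round(rot_y(np.pi / 2) @ x, 6))
```

Rotating the plates 90 degrees effectively swaps which of these axes the camera's motion maps onto, which may be why the solver behaved so differently on the rotated footage.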