Gyro data for camera tracking

I’m a newbie in Blender but I do a lot with Arduino and the like. I was wondering if gyro data could help improve camera tracking. I’m thinking of a small box with 1 or 2 MEMS gyros, a GPS (mainly for timecode), and an Arduino or Raspberry Pi which writes the gyro rotation values x times per second to a file on an SD card. Put everything in a small waterproof metal box with a 1/4" thread to mount it to a camera.
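If the box ends up being a Raspberry Pi, the recording loop could be as simple as this Python sketch. `read_gyro()` here is just a stand-in for whatever sensor driver you actually use (an MPU-6050 over I2C, for example), and the GPS timecode column is left out to keep it short:

```python
import csv
import time

def read_gyro():
    """Placeholder for the real sensor driver (e.g. an MPU-6050 library).
    Returns angular rates (x, y, z) in degrees per second."""
    return (0.0, 0.0, 0.0)

def log_gyro(path, samples, rate_hz):
    """Write timestamped gyro samples to a CSV file on the SD card."""
    interval = 1.0 / rate_hz
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "gx", "gy", "gz"])  # header row
        t0 = time.monotonic()
        for _ in range(samples):
            gx, gy, gz = read_gyro()
            # timestamp relative to the start of the recording
            writer.writerow([round(time.monotonic() - t0, 4), gx, gy, gz])
            time.sleep(interval)
```

The CSV-with-timestamps format matters more than the loop itself: it's what lets you line the sensor data up with the video frames later.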

Could that improve camera tracking? Opinions?

Based on what I understand, there are three approaches to camera tracking:

1. 3D motion tracking, which is best suited to tracking sensors in 3D space (position and rotation).
2. Gyro tracking, which is better for rotations.
3. Camera tracking, where the track is derived from the footage itself.

Camera tracking is important for compositing because it can provide pixel-perfect stabilization.

But camera tracking by itself has limitations, because it works best for planar tracking. In more complex shots with rotation or movement, the reconstructed 3D markers are only estimates, so they won’t be very accurate.

So the best use for gyros is to support the camera rotation solve: at the very least you get a valid measurement of the shot, so you can figure out what the camera is doing. The more data you have, the better off you’ll be.
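As a minimal sketch of that idea: if the logger records angular rates (deg/s) at a fixed sample rate, integrating the rates gives the camera's rotation over the shot. This ignores gyro drift, which grows over time, so treat it as an illustration rather than a finished solver:

```python
def integrate_rates(rates_dps, sample_rate_hz):
    """Integrate per-axis angular rates (deg/s) into cumulative angles (deg).

    rates_dps: list of (x, y, z) rate samples taken at a fixed sample rate.
    Returns one (x, y, z) angle tuple per sample. Simple rectangular
    integration -- fine for a sketch, but a real IMU needs drift correction.
    """
    dt = 1.0 / sample_rate_hz
    ax = ay = az = 0.0
    angles = []
    for gx, gy, gz in rates_dps:
        ax += gx * dt
        ay += gy * dt
        az += gz * dt
        angles.append((ax, ay, az))
    return angles
```

For example, one second of a constant 10 deg/s roll sampled at 100 Hz integrates to a 10-degree roll at the end of the shot.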

My ideal choice would be a 3D motion tracker but it’s extremely expensive investment, so gyros and Arduino are perhaps the most accessible technology to use. :slight_smile:

Well, I also play with UAVs, so… Add a few accelerometers and maybe an optical flow sensor and you should get a pretty good 3D camera tracker for about $300-400
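Fusing the two sensors is the standard trick from the UAV world: a complementary filter trusts the gyro short-term and the accelerometer long-term, which cancels gyro drift on pitch and roll. A minimal single-axis sketch (the 0.98 blend factor is a typical starting value, not a tuned one):

```python
def complementary_filter(gyro_rates_dps, accel_angles_deg, sample_rate_hz,
                         alpha=0.98):
    """Fuse gyro rate (deg/s) with accelerometer-derived angle (deg) on one axis.

    The gyro term tracks fast motion; the accelerometer term slowly pulls
    the estimate back toward the gravity reference, cancelling gyro drift.
    Returns the angle estimate after each sample.
    """
    dt = 1.0 / sample_rate_hz
    angle = 0.0  # start from zero; the filter converges to the true angle
    estimates = []
    for rate, accel_angle in zip(gyro_rates_dps, accel_angles_deg):
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * accel_angle
        estimates.append(angle)
    return estimates
```

With a stationary sensor (gyro reads zero, accelerometer reads a constant tilt), the estimate converges to the accelerometer angle over a few seconds, which is exactly the drift-cancelling behaviour you want.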


But I was just wondering if the gyro data could be somehow usefully integrated into Blender’s tracking algos.


@sgofferj, I like the idea; using a gyro or even a 3-axis accelerometer would work. You would be able to get the pitch, yaw, and roll of the camera. With a few IR LEDs and a detector you might also be able to measure how far your object is from the camera with rather good precision, if you can align and focus it correctly. Digital cameras will pick up IR light and reflections, though, so try an ultrasonic distance sensor if the IR light affects your photos/video.

I think Blender’s tracking code won’t change, because it’s a third-party library (https://code.google.com/p/libmv/) whose purpose is to work specifically on video footage.

However, any alternative way of tracking can be added with Python. I guess you won’t be doing real-time tracking or complex calculations, so Python’s execution speed won’t bother you. And attaching the sensors to the camera so they capture data while you record your video will give you huge benefits.
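A sketch of what that Python glue might look like: resample the logged angles (recorded at the gyro's sample rate) onto the video's frame times by linear interpolation, so each frame gets one rotation value. Inside Blender, the per-frame result could then be keyframed onto the camera via `bpy` (`camera.rotation_euler` plus `keyframe_insert`); that part is left out here so the sketch stays self-contained:

```python
def angles_per_frame(angles, sample_rate_hz, fps, n_frames):
    """Linearly interpolate gyro-derived angles (one float per sample)
    onto video frame times, giving one value per video frame."""
    out = []
    for frame in range(n_frames):
        t = frame / fps            # time of this video frame in seconds
        pos = t * sample_rate_hz   # fractional index into the sample list
        i = int(pos)
        if i >= len(angles) - 1:   # past the end of the recording: clamp
            out.append(angles[-1])
            continue
        frac = pos - i
        out.append(angles[i] * (1.0 - frac) + angles[i + 1] * frac)
    return out
```

At 100 Hz gyro samples and 25 fps video, every fourth sample lands exactly on a frame; at 24 fps the interpolation does the fractional blending between samples.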

Just a side note: there are camera-arm robots (motion-control rigs) which measure the angles of all the joints in their arm very precisely, so they always know the camera’s position relative to their own base.