Tracking of 360deg video footage

Hello Blenderheads,

has anybody tried to track points in 360deg video footage?

does the Blender tracking system behave differently or strangely when the footage is distorted?

Thanks for your suggestions, cheers!

The “mathematical challenge” here would be how to correctly model, in Blender’s 3D world, the spatial relationship between the various objects seen by the camera and the position and orientation of the camera itself.

If it were me, I would look for pragmatic ways to simplify (avoid …) the problem. For one thing, when I watch movies where the camera gets too crazy … especially if the damn thing starts “spinning around,” I get dizzy to the point of nausea. (Bleah…) Therefore, some creative editing should give a clear-enough visual impression of “it’s ‘in the round,’” while keeping the problem of each individual shot manageable using traditional tracking techniques. Please don’t take your audience on a “merry-go-round” ride unless you also provide … little white bags.

I assume you mean videos in equirectangular projection?
Tracking itself should work fine, but solving the camera motion won’t, since the solving code is based on a mathematical model of a perspective camera.
However, the solving step shouldn’t be much harder to implement for equirect videos. For example, you can skip the perspective transform, but the resulting system of equations won’t be linear anymore…
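To illustrate what “skipping the perspective transform” means: instead of projecting through a pinhole model, each equirectangular pixel maps directly to a ray direction on the sphere. Here is a minimal Python sketch (the function name and the longitude/latitude convention are my own assumptions, not Blender’s code):

```python
import math

def equirect_to_ray(x, y, width, height):
    """Map an equirectangular pixel to a unit ray direction.

    Assumed convention: longitude spans [-pi, pi] across the width,
    latitude spans [pi/2, -pi/2] down the height.
    """
    lon = (x / width - 0.5) * 2.0 * math.pi
    lat = (0.5 - y / height) * math.pi
    return (
        math.cos(lat) * math.sin(lon),  # right
        math.sin(lat),                  # up
        math.cos(lat) * math.cos(lon),  # forward
    )
```

A solver would then fit camera rotation and translation to these ray directions directly; as noted above, the resulting equations are no longer linear in the unknowns.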

I’m looking around with the same question; searching the web, it seems there aren’t many apps that do this.
Here are two example videos, but they don’t explain what they use:


even without an appropriate camera solver, I think it’s probably possible to reproduce this SkyBox (After Effects) technique in Blender, but I’m not that smart :smiley:

Thank you everybody,

I’ve forwarded your suggestions to the programmer who developed the 360deg video stitcher; I will post any news.

One way to get the camera movement from 360 video is to convert it to wide-angle footage and track that. You will not get 3D locations for features that are outside the field of view, but you will get the camera motion.

In Blender you could add the 360 video as a sky texture and render it out using a static camera. Orient the camera or the sky texture so that it faces the part which you want to be the front in the solved camera. Then track and solve the rendered footage.

There can be problems with this kind of rough solution, but you will at least get something. The biggest problem will probably be the misalignment of features to the “sides” and “behind” you, because you can’t track them and the camera is solved from the visible features only.
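The “convert to wide-angle footage” idea above boils down to reprojecting the equirectangular frame through a virtual perspective camera. A minimal Python sketch of the per-pixel mapping (hypothetical names; pixel sampling and camera rotation are left out for brevity):

```python
import math

def persp_to_equirect(px, py, out_w, out_h, fov_deg, src_w, src_h):
    """For a pixel in a virtual perspective (wide-angle) image, return
    the source pixel in the equirectangular frame to sample from.

    The virtual camera looks down +Z; fov_deg is the horizontal FOV.
    """
    f = (out_w / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    # Ray through the pixel in camera space
    dx = px - out_w / 2.0
    dy = out_h / 2.0 - py
    dz = f
    n = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / n, dy / n, dz / n
    # Direction -> longitude/latitude -> equirectangular pixel
    lon = math.atan2(dx, dz)
    lat = math.asin(dy)
    sx = (lon / (2.0 * math.pi) + 0.5) * src_w
    sy = (0.5 - lat / math.pi) * src_h
    return sx, sy
```

Running this for every output pixel (with bilinear sampling) gives an ordinary perspective frame that Blender’s solver can handle, which is effectively what the sky-texture render trick does.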

I once had to track footage shot with a fisheye lens (180 degrees). It was impossible to track like that in Blender. So I opened it in After Effects and applied the Optics Compensation filter to undistort the footage and get the lines straight again. That I could track in Blender and get a decent solve. Then I could use the straight lines as reference to build a proper scene. All I had to do then was give the Blender camera a fisheye lens, which re-distorted the CG elements, which I could then comp onto the original footage.
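For anyone wanting to script that undistortion step rather than use After Effects, it amounts to mapping each pixel of the straightened output back into the fisheye frame. A sketch assuming an equidistant 180-degree fisheye model (my own assumption; not necessarily what Optics Compensation implements):

```python
import math

def rectilinear_to_fisheye(px, py, out_w, out_h, fov_deg, fish_w, fish_h):
    """Map a pixel in the undistorted (rectilinear) output back to the
    source pixel in an equidistant 180-degree fisheye frame.
    """
    f_rect = (out_w / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    dx = px - out_w / 2.0
    dy = py - out_h / 2.0
    r_rect = math.hypot(dx, dy)
    theta = math.atan2(r_rect, f_rect)  # angle from the optical axis
    # Equidistant model: image radius grows linearly with theta,
    # with theta = pi/2 landing on the edge of the fisheye circle.
    r_fish = (theta / (math.pi / 2.0)) * (min(fish_w, fish_h) / 2.0)
    if r_rect > 0.0:
        sx = fish_w / 2.0 + r_fish * dx / r_rect
        sy = fish_h / 2.0 + r_fish * dy / r_rect
    else:
        sx, sy = fish_w / 2.0, fish_h / 2.0
    return sx, sy
```

The reverse mapping (rectilinear back to fisheye) is what you would use to re-distort the CG elements for compositing, which mirrors what the panoramic fisheye camera in Blender’s Cycles renderer does at render time.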