Blender Tracking Engine update

The tracking engine work Sergey was talking about in his presentation at bconf14 has just been pushed to trunk.

http://lists.blender.org/pipermail/bf-blender-cvs/2014-October/069226.html – Libmv: Initial commit of unfinished AutoTrack API
http://lists.blender.org/pipermail/bf-blender-cvs/2014-October/069227.html – Add autotrack API to the C-API
http://lists.blender.org/pipermail/bf-blender-cvs/2014-October/069228.html – Replace region tracker with autotracker in Blender

I am sure that this is the first of many improvements to come!

I will be testing once the buildbots have pushed a Windows version.

I hope they will work in a variable focal length solver for shots with zoom. This is actually a big limitation in the Blender tracking system.

Please show me a real movie where zoom is actually used. Not counting the horror-movie dolly-back zoom effect :wink:

These improvements are very welcome; thanks to Sergey.


I very much hope they can get stereoscopic tracking happening, along with witness cameras and other general refinements left, right, and center. Not sure how effective the autotrack button will be, but I am sure it could give a great base to work from.

Any movie that shows a close-up on something while never moving the actual camera.

I know people tend to confuse “dolly” and “zoom” but this isn’t one of those times :wink:

While I’m here, can someone explain to me what could hypothetically be possible in Blender now as a result of this tracking update? I’m NOT well versed in this portion of Blender, but my gut tells me this is kind of a big deal, and I want to know what I should be excited about, because I think this is really awesome and I don’t know why. lol

The autotracker will be able to detect features from frame to frame that most people would not see over a certain range of frames. You can do your manual, supervised tracking, lock those tracks, and then autotrack the rest when more predictions are needed. You need at least 8 trackers for a solve, but 16 is better. You will notice the color difference in the top-right window of the Clip Editor where 8 and 16 tracks overlap in the buffered range.
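The "at least 8 trackers per frame" rule can be sanity-checked with a rough sketch like the one below. This is not Blender's actual bpy API; the track data layout here is hypothetical (each track is just a dict mapping frame number to a 2D marker position):

```python
# Sketch: check that every frame in a range has enough active trackers
# for a camera solve. Data layout is hypothetical, not Blender's API:
# each track is a dict {frame: (x, y)}.

MIN_TRACKS = 8    # hard minimum for a solve
GOOD_TRACKS = 16  # comfortable margin

def coverage(tracks, frame_start, frame_end):
    """Return {frame: number of tracks that have a marker on that frame}."""
    counts = {}
    for frame in range(frame_start, frame_end + 1):
        counts[frame] = sum(1 for markers in tracks if frame in markers)
    return counts

def weak_frames(tracks, frame_start, frame_end, minimum=MIN_TRACKS):
    """Frames whose tracker count falls below the minimum needed to solve."""
    counts = coverage(tracks, frame_start, frame_end)
    return [f for f, n in counts.items() if n < minimum]
```

With something like this you could find exactly where in the buffered range supervised tracks need to be supplemented by autotracked ones.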

It is the start of many improvements… the main one being multi-camera solving (important for stereoscopic tracking, witness-camera solving, and optical mocap).

Why are you thinking only of movies? Sometimes you need to solve tracking for other kinds of jobs. Ask the clients: they usually provide you with all kinds of footage, so we should have the tools to work in any possible scenario. It's like saying: why have custom resolutions in the render panel when most of the time only 720p, 1080p, or 4K are used? You see what I mean? It's not about how YOU work with Blender; it's about making Blender flexible and capable enough for most users.

Finally, I just want to point out that many professional tracking applications provide a variable focal length solver precisely to be able to handle zooms, for example.
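To illustrate why zoom matters for the solver, here is a toy pinhole-projection sketch (not libmv's actual camera model): with a fixed lens the focal length f is one shared unknown across the whole shot, but with a zoom f becomes a separate unknown per frame, and changing f alone moves every tracked point even though the camera is static.

```python
# Toy pinhole projection: a camera-space 3D point projects to the image
# plane scaled by the focal length f. Illustrative only, not libmv's model.

def project(point3d, f):
    """Project a camera-space point (X, Y, Z) with focal length f."""
    X, Y, Z = point3d
    return (f * X / Z, f * Y / Z)

# The same static point "moves outward" as f increases, which is exactly
# a zoom-in with a stationary camera:
# project((1.0, 0.5, 10.0), 35)  -> (3.5, 1.75)
# project((1.0, 0.5, 10.0), 70)  -> (7.0, 3.5)
```

A solver that assumes f is constant has no way to explain that motion except by inventing camera translation, which is why zoom shots break a fixed-lens solve.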