Vector blur with external motion data - how?

I love the vector blur in the composite nodes. But I notice the Speed input doesn't really work with anything but Blender's own rendered speed pass.

I notice the speed pass has a huge range of values, with RGB components sometimes in the hundreds and negative hundreds. I also notice its value is typically set at 0.

I tried converting RGB vector motion data (from other software like Mental Ray, GPU renderers, etc.) to simulate it, but I can't make anything compatible.

I notice Blackmagic Fusion and Nuke both have motion blur and vector motion blur nodes (mostly in the paid versions) that can pull the motion data, create it, or use motion data from other software, including strictly clamped RGB motion data.

But Blender's motion data is not clamped, far from it, and I have no idea how to replicate it or import it from external image sequences created elsewhere.

Any pointers?

Node group I made back in the day – http://www.blenderartists.org/forum/showthread.php?265887-Functionality-expansi-Custom-motion-vector-export-capability-willing-to-pay-donate&highlight=

Let me know if you need any help!

That will come in handy! I just had a look at what you did, and it might actually be what I need. I just need to do it in reverse (to build the Blender Speed vector for the Vector Blur node in the compositor).

I tried to quickly invert it (taking an RGBA motion vector) to use with Blender's Vector Blur node, but didn't have much luck. I'm trying to study up on it, but some pointers would be greatly appreciated…

As long as the renderer isn't normalizing the pass (from experience, Mental Ray and Arnold won't by default), it's pretty simple. You just need to correct for the weird two-part channel layout the Vector Blur node expects: RG = motion toward the past, BA = motion toward the future. That's just copying R into B and G into A. Then hook up a depth pass to Z. Here's an example with a file from Arnold (just the Z and motionvector AOVs enabled, saved to a multilayer EXR):
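Here's a minimal bpy sketch of that setup. The pass names ("beauty", "Z", "motionvector"), the file path, and the 2.7x-era node idnames are all assumptions; match them to whatever your EXR actually contains:

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
nodes = scene.node_tree.nodes
links = scene.node_tree.links

# Load the external multilayer EXR (path and pass names are placeholders).
img = nodes.new("CompositorNodeImage")
img.image = bpy.data.images.load("/path/to/render.exr")

sep = nodes.new("CompositorNodeSepRGBA")    # split the external vector pass
comb = nodes.new("CompositorNodeCombRGBA")  # rebuild as a Blender-style speed
blur = nodes.new("CompositorNodeVecBlur")
comp = nodes.new("CompositorNodeComposite")

# External pass: R = X motion, G = Y motion. Vector Blur expects
# RG = motion to the previous frame and BA = motion to the next frame,
# so copy R into B and G into A.
links.new(img.outputs["motionvector"], sep.inputs["Image"])
links.new(sep.outputs["R"], comb.inputs["R"])
links.new(sep.outputs["G"], comb.inputs["G"])
links.new(sep.outputs["R"], comb.inputs["B"])
links.new(sep.outputs["G"], comb.inputs["A"])

links.new(img.outputs["beauty"], blur.inputs["Image"])
links.new(img.outputs["Z"], blur.inputs["Z"])
links.new(comb.outputs["Image"], blur.inputs["Speed"])
links.new(blur.outputs["Image"], comp.inputs["Image"])
```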


HAH!

It was simpler than I thought.

1. Separate RGBA from your 8-bit (or 16-bit) motion vector data.
2. Add R to B, and add G to Alpha.
3. Multiply each channel's Add node by some factor (say 127, 256, or 2048).
4. Combine RGBA just as before (one Multiply into R and B, the other Multiply into G and A).
5. Feed the result into the Vector Blur node's Speed input (sketched below).
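A sketch of those steps in bpy, extending the node tree from the earlier example (sep, comb, blur, nodes, and links come from there). The R/B = ±X, G/A = ±Y channel encoding and the scale factor are assumptions about the source data:

```python
# Add opposite channels together, then rescale out of the 0-1 range.
add_x = nodes.new("CompositorNodeMath")
add_x.operation = 'ADD'
add_y = nodes.new("CompositorNodeMath")
add_y.operation = 'ADD'
mul_x = nodes.new("CompositorNodeMath")
mul_x.operation = 'MULTIPLY'
mul_y = nodes.new("CompositorNodeMath")
mul_y.operation = 'MULTIPLY'
mul_x.inputs[1].default_value = 127.0  # guess; depends on the source's normalization
mul_y.inputs[1].default_value = 127.0

links.new(sep.outputs["R"], add_x.inputs[0])
links.new(sep.outputs["B"], add_x.inputs[1])
links.new(sep.outputs["G"], add_y.inputs[0])
links.new(sep.outputs["A"], add_y.inputs[1])
links.new(add_x.outputs["Value"], mul_x.inputs[0])
links.new(add_y.outputs["Value"], mul_y.inputs[0])

# Rebuild the speed vector: X multiply into R and B, Y multiply into G and A.
links.new(mul_x.outputs["Value"], comb.inputs["R"])
links.new(mul_x.outputs["Value"], comb.inputs["B"])
links.new(mul_y.outputs["Value"], comb.inputs["G"])
links.new(mul_y.outputs["Value"], comb.inputs["A"])
links.new(comb.outputs["Image"], blur.inputs["Speed"])
```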

Make sure your motion vector data's "zero movement" color (or alpha) is actually set to zero, by using a Subtract or a Minimum (any value below the background color becomes 0).
Also make sure your depth comes in as white = far (external depth is typically inverted).
To avoid artifacts (streaks), use a Filter node set to "Soften" and raise your samples. Also set your maximum speed to 64 or so.
And one last detail: make sure your main beauty image has 1.0 alpha (see the sketch below).
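A sketch of those cleanup steps in bpy, continuing the same tree. Whether the Soften filter belongs on the speed pass or the image isn't spelled out above; softening the speed pass is one reading, and the zero-level Subtract/Minimum would sit on the vector channels before the Multiply nodes:

```python
# Vector Blur quality settings: more samples, clamp the maximum speed.
blur.samples = 32
blur.speed_max = 64

# Soften to reduce streaking (applied to the speed pass here; an assumption).
soften = nodes.new("CompositorNodeFilter")
soften.filter_type = 'SOFTEN'
links.new(comb.outputs["Image"], soften.inputs["Image"])
links.new(soften.outputs["Image"], blur.inputs["Speed"])

# External depth often comes in as white = near; invert so white = far.
invert = nodes.new("CompositorNodeInvert")
links.new(img.outputs["Z"], invert.inputs["Color"])
links.new(invert.outputs["Color"], blur.inputs["Z"])

# Force the beauty pass to 1.0 alpha before it hits the Vector Blur.
set_alpha = nodes.new("CompositorNodeSetAlpha")
set_alpha.inputs["Alpha"].default_value = 1.0
links.new(img.outputs["beauty"], set_alpha.inputs["Image"])
links.new(set_alpha.outputs["Image"], blur.inputs["Image"])
```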

Thanks for the clarification. The multiplying wouldn't be necessary with 16-bit images… Yeah, I was using 8-bit depth and 8-bit motion vector data (normalized).

I was rendering out of Mach Studio Pro 2 - a dead GPU renderer that was ahead of its time (2011) because it had realtime volume AO (not SSAO) and Global Illumination (screen-space and volume).

I’d recommend you never do that again. :wink: Between the tiny value range and trying to figure out if a gamma curve got applied to your speed vectors somewhere (ick), you’re in for all kinds of headaches. Disk space is cheap, use EXRs.