Would it take a whole new build of Blender to have it render several mist passes with different mist settings, or can it be done with a script?
I have this complicated scene which needs a lot of BVH calculation and it is kind of sad to have to make a new identical scene and have it render that to get two different mist passes.
I cannot use z-depth for what I’m making because of the anti-aliasing problems, plus depth of field, etc.
Also, a global mist pass that I shape to what I need in post isn’t what I want, since even at 32 bits the gradients in the alphas aren’t smooth enough after remapping.
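To answer the script part of the question: a minimal sketch of what such a script could look like, assuming the scene uses Blender’s world mist settings and that rendering the variants from one session keeps the BVH in memory. The function names (`mist_variants`, `render_mist_passes`) and the output pattern are my own, not an existing API.

```python
# Sketch: render the same scene several times with different mist
# settings from a single Blender session, so the scene data is built
# once instead of being duplicated across linked scenes.
# The bpy calls are untested outside Blender; double-check them there.

def mist_variants(near, far, count):
    """Pure helper: split [near, far] into `count` consecutive
    (start, depth) mist ranges of equal size."""
    step = (far - near) / count
    return [(near + i * step, step) for i in range(count)]

def render_mist_passes(scene, variants, out_pattern):
    """Apply each mist range to the scene's world and render a still.
    Blender-only: relies on bpy, so it is imported lazily here."""
    import bpy
    mist = scene.world.mist_settings
    for i, (start, depth) in enumerate(variants):
        mist.start = start
        mist.depth = depth
        scene.render.filepath = out_pattern % i
        bpy.ops.render.render(write_still=True)

# Example (run inside Blender): three depth bands between 0 and 30 units.
# render_mist_passes(bpy.context.scene,
#                    mist_variants(0.0, 30.0, 3),
#                    "//mist_%02d.png")
```

The idea is just to mutate `world.mist_settings` between renders rather than maintaining a second identical scene, so the heavy BVH build is paid once per frame, not once per mist setup.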
I guess that is the most modern technique for rendering these depth maps, but wouldn’t it need an immense amount of render time in the present version of Blender, given the volumetrics we have now? And even with high sample counts the artifacts would still be too much, I think.
This is what I meant by a global mist pass: it’s just a 32-bit grayscale image, and if I start doing math operations on it post-render to fit my needs, the gradients get too rough to use.
Example:
Oh wow, I never thought of this! I haven’t used it before. I’ve heard it only works on static scenes where just the lights and camera move around. Can it cache the BVH for sharing between different render layers on the same frame, or maybe cache the BVH for individual objects?
I’m not really sure how it works
mStuff, you can do a second render pass with an anti-aliased z-pass material that overrides all objects.
It’s even better than the mist pass because, as long as you render it in 32 bits, you’ll have all the values in one single render.
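A sketch of how that override could be set up with a script, assuming a Blender version with per-view-layer material overrides (`ViewLayer.material_override`, 2.8x-era API) and a Camera Data node driving an emission shader. The node and socket names are from memory and should be verified against your Blender version; the function name is my own.

```python
# Sketch of the override idea: one emission material whose brightness
# is the per-pixel camera depth, assigned as a view-layer material
# override so every object renders as an anti-aliased depth image.
# Blender-only (bpy is imported lazily); treat names as assumptions.

def make_depth_override_material(name="Z_Override"):
    import bpy
    mat = bpy.data.materials.new(name)
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links
    nodes.clear()
    cam = nodes.new("ShaderNodeCameraData")       # has a "View Z Depth" output
    emit = nodes.new("ShaderNodeEmission")
    out = nodes.new("ShaderNodeOutputMaterial")
    links.new(cam.outputs["View Z Depth"], emit.inputs["Strength"])
    links.new(emit.outputs["Emission"], out.inputs["Surface"])
    return mat

# Inside Blender, assign it as the override for the active view layer:
# bpy.context.view_layer.material_override = make_depth_override_material()
```

Because the depth is shaded like any other emission surface, it gets the renderer’s normal anti-aliasing, unlike the raw Z pass; remember to save the result to a 32-bit float format such as OpenEXR so no depth precision is lost.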
And you’re still adding another render layer, which brings back the BVH problem. At that point you might as well just make a linked copy of the scene with a different mist pass.