Cycles: What pass information is needed to make compositing feel complete?

It’s not an ID pass. An ID pass will always have jaggy edges by definition, because you cannot blend or average IDs.

There are many ways to get a matte, a simple one would be this:

  1. Create a new scene and link into it all the objects you need to affect the matte, as well as the camera
  2. Create a material that emits white (1.0,1.0,1.0) at strength 1 and set it as override material in the render layer
  3. Set the world background to black
  4. (optional) Move any objects you want to obstruct the matte with to another layer and use that layer in “Mask Layer”
  5. Add a “Render Layers” node into the compositor and select the new scene. Rendering with F12 will now take the extra scene into account.

Even though you need to render the scene (or parts of it) twice that way, you don’t need a lot of samples for this and you can dial down the render features to the bare minimum.
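In compositor terms, the white-emission render is just a per-pixel coverage image. Here is a minimal sketch of how such a matte gets used in a composite (plain Python for illustration, not Blender API; all names and values are made up):

```python
# Sketch (not Blender API): the white-emission render acts as a per-pixel matte.
def apply_matte(graded, original, matte):
    """Blend the graded image over the original, weighted per pixel
    by the matte's coverage value (0.0 = untouched, 1.0 = fully graded)."""
    return [g * m + o * (1.0 - m) for g, o, m in zip(graded, original, matte)]

original = [0.2, 0.4, 0.6]   # one scanline of the beauty render
graded   = [0.8, 0.8, 0.8]   # e.g. a brightened version of it
matte    = [0.0, 0.5, 1.0]   # black outside the object, grey on an AA'd edge

result = [round(v, 3) for v in apply_matte(graded, original, matte)]
print(result)  # [0.2, 0.6, 0.8]
```

The grey 0.5 value on the edge is exactly what an ID pass can never give you, which is what the rest of this thread is about.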

I’ve never actually used the ID output in Blender. I do it like you said, except that I create three emission materials, red, green and blue, so I can distinguish between different objects inside the pass.

As far as I know, Blender’s ID passes have jaggy edges mainly because of the way it stores them in the rendered file (EXR). Every ID pass is stored in one single channel with different value ranges for each of them.

An ID pass is a matte pass by definition, whatever tech is used to get one, because it’s a matte of some objects or objects with some materials. And any matte pass with jagged edges is not usable, by definition. Flickering and artifacts… and no antialiasing filter can help.

You described an interesting way of making mattes, and I have used it quite often when I needed 2 or 3 mattes. But in all the other production renderers, you can get a matte pass with 2-3 clicks along with the main beauty render. You can have 50 or 100 different matte passes of different objects in one render, though render time will be affected.

Now imagine making 50+ matte passes with the workflow you described…

This is completely wrong.

An ID pass stores, for every pixel, a number identifying the object at the first intersection; identifiers for all objects go into a single image. You cannot antialias it, because that requires averaging multiple samples. What is the average of four ID samples (2, 29, 521, 8) supposed to be? 140? That would be a completely different object, of course. It’s logically impossible.

A matte, on the other hand, stores the coverage of a single object (or group of objects), which can be any value between 0 and 1; it can be blended, averaged, etc. It requires a separate image for every object (or object group).
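The difference is easy to see numerically. A quick illustration in plain Python, using the sample values from the example above:

```python
# Why ID samples cannot be averaged, while coverage samples can.
# Four hypothetical sub-pixel samples landing on different objects:
id_samples = [2, 29, 521, 8]

avg_id = sum(id_samples) / len(id_samples)
print(avg_id)  # 140.0 -- an object ID that exists nowhere in the scene

# A matte for object 29 instead stores per-sample coverage (0 or 1),
# and averaging those gives a meaningful fractional coverage:
coverage_samples = [1.0 if i == 29 else 0.0 for i in id_samples]
avg_coverage = sum(coverage_samples) / len(coverage_samples)
print(avg_coverage)  # 0.25 -- the pixel is 25% covered by object 29
```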

Now, what you can do when using the ID layer for masking is render at a higher resolution. You don’t have to render everything at higher resolution; just the ID pass should be fine. After the ID has been turned into a coverage value (white/black), it can be averaged. You can also just render the entire image at a higher resolution and then downsample. BI also has a “full-sample” AA mode that does this for you by rendering the frame multiple times at sub-pixel offsets and then averaging, precisely for cases where compositing results in aliasing.
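That supersampling approach can be sketched in a few lines of plain Python (the pass values and resolutions here are made up for illustration):

```python
# Sketch: anti-aliasing a matte derived from an ID pass by supersampling.
# Render the ID pass at 2x resolution, turn it into binary coverage for
# one object, then box-filter each 2x2 block down to one coverage value.
TARGET_ID = 7

# A hypothetical 4x4 ID pass rendered at 2x the final 2x2 resolution:
id_pass_2x = [
    [7, 7, 3, 3],
    [7, 7, 7, 3],
    [7, 7, 7, 7],
    [7, 7, 7, 7],
]

# Thresholding IDs into coverage happens BEFORE averaging -- that's the key.
binary = [[1.0 if v == TARGET_ID else 0.0 for v in row] for row in id_pass_2x]

def downsample_2x(img):
    """Average each 2x2 block into a single pixel."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

print(downsample_2x(binary))  # [[1.0, 0.25], [1.0, 1.0]]
```

The 0.25 in the corner pixel is exactly the fractional edge coverage a native-resolution ID pass cannot express.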

Now imagine making 50+ matte passes with the workflow you described…

I can’t imagine ever needing that, but if I did, I’d be much more concerned about working with all those passes in the compositor than about setting up the workaround.

Anyway, I’m not saying matte passes aren’t a useful feature to have. You can work around not having them, that’s all.

VRayMultiMatte assigns an antialiased 0-255 value to each channel (RGB). It’s an elegant and highly functional way to do it at native resolution. Of course, you could just render it as a separate pass manually and even use more than 3 hues. You could also render the object ID at higher res, as BeerBarron said, or apply a coverage pass to the object ID. Does Blender have coverage?
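The channel-packing idea can be sketched like this (plain Python with illustrative values; this is not the actual V-Ray implementation, just the principle of packing three anti-aliased mattes into one RGB image):

```python
# Sketch of the multi-matte idea: three anti-aliased single-channel mattes
# packed into the R, G and B channels of one image.
def pack_mattes(matte_r, matte_g, matte_b):
    """Combine three single-channel mattes into one RGB image (pixel tuples)."""
    return list(zip(matte_r, matte_g, matte_b))

def extract_channel(rgb_image, channel):
    """Pull one object's matte back out by channel index (0=R, 1=G, 2=B)."""
    return [px[channel] for px in rgb_image]

# One scanline per object; fractional values are anti-aliased edges:
red_obj   = [1.0, 0.5, 0.0, 0.0]
green_obj = [0.0, 0.5, 1.0, 0.0]
blue_obj  = [0.0, 0.0, 0.0, 1.0]

packed = pack_mattes(red_obj, green_obj, blue_obj)
print(extract_channel(packed, 1))  # [0.0, 0.5, 1.0, 0.0]
```

Because each object owns a whole channel, the fractional edge values survive, unlike IDs crammed into one channel.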

Man, you just said the same thing I said, and you’re still saying that I’m “completely wrong”. That’s interesting, at least.

I didn’t say the ID pass should or must be antialiased. I said that the reason one needs an ID pass is the same reason one needs a matte pass: to affect a part of an image where one or another object can be seen. That’s the exact same reason matte passes and, as far as I know, ID passes are needed. You don’t need an ID pass to grow flowers in the garden, do you?

In my experience in VFX, there were at least a dozen examples where I needed 20+ mattes from a single render, and at least a few with 50+.

Don’t get me wrong, there is a workaround for everything, and I honestly very much like how Blender and Cycles are evolving. My last few advertising projects were done completely in Blender, and I’m completely happy with it. I just want to say that now is a good time to think about upgrading the render pass system a bit.

This is not true for animations with motion blur.

IDs are stored as pure integer values. Maybe it would be possible to store the whole range (antialiasing, motion blur, transparency) between each step: from 0 to 1, 1 to 2, 2 to 3… you have all the float values that can also be used. Is this hard to implement? I have no idea how difficult it is.
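If I understand the suggestion, it could be sketched like this (plain Python, purely illustrative; this is not how Blender actually stores IDs):

```python
import math

# Sketch of the suggestion above: store object N's pixel as a float in
# [N, N+1), where the fractional part carries that object's coverage.
def encode(object_id, coverage):
    # clamp coverage just below 1.0 so the integer part stays the ID
    return object_id + min(coverage, 0.999)

def decode(value):
    object_id = math.floor(value)
    coverage = value - object_id
    return object_id, coverage

obj_id, cov = decode(encode(3, 0.5))
print(obj_id, cov)  # 3 0.5
```

The catch is that a pixel where objects 2 and 3 share an edge can still only store one of the two, so this encoding doesn’t really solve the antialiasing problem at boundaries between objects.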

Hm, at the moment I am writing an image filter in a separate program, not in Blender, as I lack knowledge of Blender’s internals and I prefer C# over C++ for quicker coding.

The reason for doing so is that although Blender does allow for some filtering, the options are too general, not really specific to the noise we see in renders. I’m not implementing a neural-network denoise filter (as was recently in the news), so my filters won’t be the best. In fact I’m losing detail in some textures, but despite that, the extra composite output layers wouldn’t simply allow for better denoising; something new would be needed here. For the moment I think the Neat Image filter is pretty good, but it could be a lot better.

The reason is that render output noise doesn’t have the RGB variance we see in camera photos. The noise is only related to the light calculations (the randomness we get with the seed value in Cycles), so potentially a lot more colors are already correct; only the reflections on them contain some randomness. If that could be averaged out a bit, I think we could do with lower sample rates (depending on filter type, at the cost of some of the fireflies people actually want, or so).

Oh, how hard it is to get “experts” to admit to themselves they said something wrong…

I didn’t say the same thing. Here’s what you said:

“As far as I know, Blender’s ID passes have jaggy edges mainly because of the way it stores them in the rendered file (EXR).”

This is wrong.

“Every ID pass is stored in one single channel with different value ranges for each of them.”

This is also wrong.

“An ID pass is a matte pass by definition, whatever tech is used to get one, because it’s a matte of some objects or objects with some materials.”

Wrong.

“And any matte pass with jagged edges is not usable, by definition.”

Arguably correct, but since ID passes are not matte passes, the corollary is still wrong.

“Flickering and artifacts… and no antialiasing filter can help.”

Anti-aliasing through supersampling is something you can always do in post, and it’s commonly used for issues like these.

I didn’t say the ID pass should or must be antialiased. I said that the reason one needs an ID pass is the same reason one needs a matte pass: to affect a part of an image where one or another object can be seen.

You didn’t actually say that. You said that ID passes are matte passes and that anti-aliased matte passes are not usable. You can create a matte from an ID pass, but obviously that doesn’t work too well, because ID passes cannot be anti-aliased.

That’s the exact same reason matte passes and, as far as I know, ID passes are needed. You don’t need an ID pass to grow flowers in the garden, do you?

The primary use for ID passes is identifying objects (useful for picking, or for verification, e.g. in computer vision). They’re not actually that good for matting, because you can’t easily group IDs, you can’t control them, etc. They’re also dead simple to implement, so they’re good to just have around, even if there aren’t that many really good uses for them.

edit:

For motion blur, semi-transparent objects, etc., ID passes don’t really work in general. For those, use the matte-scene workaround I described.

IDs are stored as pure integer values. Maybe it would be possible to store the whole range (antialiasing, motion blur, transparency) between each step: from 0 to 1, 1 to 2, 2 to 3… you have all the float values that can also be used. Is this hard to implement? I have no idea how difficult it is.

I’m not sure what this is supposed to mean. What you can do is have multiple samples per pixel. The only way to do this with the current compositor is to render at a higher resolution (and then downsample). This doesn’t solve the problems requiring multiple depth samples; for that you really do need deep compositing.


Here is a normalized ID pass from Blender, containing only the integer values 1, 2 and 3 for each of the 3 IDs in a single channel.

A matte is an area of an image where a certain object or objects appear.


Here is an isolated ID of one of the objects from the same ID pass. Now if this is not a matte, then what is it? A normal pass? Or a world position pass?
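For illustration, isolating one ID from such a pass looks like this (plain Python, made-up scanline values), including the normalization step:

```python
# Sketch: isolating one ID from a single-channel pass holding values 1, 2, 3,
# and normalizing the result into a proper 0/1 matte.
id_pass = [1, 1, 2, 2, 3, 3, 2, 1]  # one illustrative scanline

# Naive isolation keeps the raw ID value, so the "matte" peaks at 2, not 1:
isolated = [v if v == 2 else 0 for v in id_pass]
print(isolated)  # [0, 0, 2, 2, 0, 0, 2, 0]

# Normalizing (comparing instead of copying) gives a usable black/white matte:
matte = [1.0 if v == 2 else 0.0 for v in id_pass]
print(matte)     # [0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0]
```

Either way the result is hard-edged (only 0s and 1s), which is the aliasing being argued about in this thread.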

As you can see, I said that a matte pass with jagged edges is not usable. I have not said that an antialiased matte pass is not usable. A matte pass should be antialiased, desirably with the same sample pattern and filter width that the main beauty render has, to avoid edge artifacts in compositing. And such antialiasing is impossible in post.

ID passes may be very useful for computer vision, but I’m pretty sure nobody who uses Blender and Cycles has ever used ID passes for that reason (computer vision). In Blender their primary, and I think only, use is as a matte pass, e.g. grading only some objects and not the whole scene. In compositing you don’t need object verification, you need a matte which matches the rendered image exactly, whether motion blurred or in DOF.

And speaking of “experts”

That’s exactly how they behave

sigh

To clarify:
“As far as I know, Blender’s ID passes have jaggy edges mainly because of the way it stores them in the rendered file (EXR)”

ID passes have jaggy edges because they’re ID passes and you cannot antialias ID passes. Not for any other reason, including “the way they are stored”.

“Every ID pass is stored in one single channel with different value ranges for each of them.”

There aren’t any value ranges; there’s a single number for each object.

Here is an isolated ID of one of the objects from the same ID pass. Now if this is not a matte, then what is it? A normal pass? Or a world position pass?

You created a matte out of an ID pass (not a good one, since you forgot to normalize it). Does that mean an ID pass “is a matte pass by definition”? No. By that logic, anything you can turn into a matte is a matte, and there are a lot of things you can turn into a matte by some process.

As you can see, I said that a matte pass with jagged edges is not usable. I have not said that an antialiased matte pass is not usable.

I’m sorry, I said “anti-aliased” where I should’ve said “aliased”. Of course you want your matte to be anti-aliased. There, I said something wrong!

A matte pass should be antialiased, desirably with the same sample pattern and filter width that the main beauty render has, to avoid edge artifacts in compositing. And such antialiasing is impossible in post.

Will supersampling completely eliminate any artifacts? Probably not. Does it help with “Flickering and Artifacts”? It sure does.

In compositing you don’t need object verification, you need a matte which matches the rendered image exactly, whether motion blurred or in DOF.

Maybe, but you can’t do DOF/motion blur with a matte from an ID pass, either. I understand that you think the purpose of ID passes must be matting. I maintain that it’s a “low-hanging fruit” kind of feature without any one defined purpose.

And speaking of “experts”… That’s exactly how they behave

Hey, I’ll readily admit it’s not easy to get me to admit I’ve said something wrong. But it shouldn’t be that way when you’re actually wrong. If you still can’t see how you made even a single wrong statement, fine. We can leave it at that.

You’re catching words here and there but somehow fail to see the main message I’m trying to convey.

In Blender, the ID mask pass is used exclusively as a matte pass. It may not strictly be one, but it is still Blender’s “matte pass” by definition, because it is used only for that and has no other purpose in Blender. It may be outdated and old, but in Blender and Cycles it is used only for that purpose: masking objects, as mattes are. Aliased, the ID pass could be used with a post antialiasing filter, but together with the effects Cycles is widely used for (DOF, motion blur, fur, etc.) neither a post antialiasing filter nor the ID pass itself works at all. Which leaves the ID pass system without its sole purpose: masking of objects. That is why we need separate scenes with red, green and blue shaders assigned to objects, a workflow which just does not scale to large projects. That’s what I’m trying to say.

I could have been wrong in some statements (values instead of value ranges…), but I don’t see why the main message I’m trying to convey is wrong.

By the way, being relatively new to Blender (more than 2 years, actually :slight_smile: ), the workflow you described with linking objects to a new scene is very much appreciated. Thanks, seriously.

Would it be possible to have some sort of multisampling filter node for the compositor?

This would make it possible to have, say, anti-aliased Z and ID passes, since it would be operating on 2D data.