Cycles: Stop sampling a pixel when color doesn't change by a given amount

First of all, I've never coded for Blender and don't know if I ever will, so this is a suggestion for other developers, if that's alright in this forum section. I only wish to request a simple optimization for the Cycles rendering engine, which came to mind as I once again spent hours trying to reduce noise without huge render times. If this little feature were added, I believe render times could be cut drastically with barely any extra noise introduced. If the idea can't be accepted in mainstream Cycles for any reason, perhaps it could be attempted as an addon.

The problem I see is this: for both the branched and non-branched path tracer, you specify a number of samples, and Cycles will take that many samples for each pixel or branch regardless of whether they still improve quality. So while 100 samples in a bright area lit by a few lights will each reduce noise, sending 100 samples into an area which is barely (or not at all) lit is a waste of resources: the first 10 samples already define the final result, and the remaining 90 don't visibly improve anything.

My suggestion is to allow a threshold based on which sampling of a pixel stops once it no longer improves quality. This can be detected by checking how much that pixel's color changed over a given number of samples: if more than X samples have been run but the color hasn't changed by more than Y, sampling for that pixel ends. The two optional settings to introduce would be the number of samples between checks, and the color difference below which to stop.

A simplified example: the total sample count is 100, while the threshold is a 0.1 color difference over the course of 5 samples. Cycles sends the first ray for that pixel and gets an RGB color of 0.5, 0.5, 0.5. It then does 5 more samples and checks again; this time the color is 0.75, 0.5, 0.5 (more red), which means there's a 0.25 difference on one channel, so it keeps going. It again does 5 samples in a row and this time gets 0.875, 0.5, 0.5, a difference of 0.125, so it keeps going again. 5 more samples are executed and the color is now 0.9, 0.5, 0.5. That's a 0.025 difference, which is finally below 0.1, so sampling stops and the current result remains the final color of that pixel.
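The stopping rule described above can be sketched in a few lines of Python. To be clear, this is a hypothetical illustration, not Cycles code: `sample_pixel`, `converged`, and `render_sample` are all invented names, and `render_sample()` stands in for whatever would accumulate one more sample into the pixel's running average.

```python
def converged(prev, curr, threshold):
    """True if no RGB channel changed by more than the threshold."""
    return max(abs(c - p) for p, c in zip(prev, curr)) <= threshold

def sample_pixel(render_sample, max_samples=100, interval=5, threshold=0.1):
    """Sample a pixel until max_samples is reached, or until the running
    color stops changing by more than `threshold` over `interval` samples."""
    color = render_sample()          # first sample
    done = 1
    while done < max_samples:
        prev = color
        for _ in range(min(interval, max_samples - done)):
            color = render_sample()  # accumulate one more sample
            done += 1
        if converged(prev, color, threshold):
            break                    # pixel stopped early
    return color, done

# Reproduce the worked example: the running average settles
# 0.5 -> 0.75 -> 0.875 -> 0.9 on the red channel (values are made up).
series = [(0.5, 0.5, 0.5)] + [(0.75, 0.5, 0.5)] * 5 \
       + [(0.875, 0.5, 0.5)] * 5 + [(0.9, 0.5, 0.5)] * 90
it = iter(series)
color, used = sample_pixel(lambda: next(it))
print(color, used)   # (0.9, 0.5, 0.5) 16 -- stopped 84 samples early
```

In this toy run the pixel is declared converged after 16 of the 100 allowed samples, which is exactly the saving the proposal is after.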

I thought of this after noticing how, as samples accumulate, the image improves more and more slowly: it starts out very grainy, quickly becomes acceptable, soon becomes pretty clean, and from that point on you only notice the ongoing improvement if you look very closely at the screen. But the process differs for each part of the image, depending on the lighting and surface shaders; any part can take a different number of samples to reach the same level of clarity. If we could harness this gradual per-pixel improvement to decide where sampling should stop, I think it would be an amazing optimization! How could this be done, please?

There was already a patch for this which simply stops ("no-ops") a tile whenever the change over N samples falls below a certain amount, but it didn't really pass muster with Sergey, because he was concerned over how it would hold up in animation.

In a lot of more difficult scenes, the noise is already even enough that it would be a long while before even the first tiles are stopped (due to things like importance sampling improvements). However, I can see how it would bring a handsome speedup to simpler and outdoor scenes. Another caveat is that such a system can still stop a tile whose area has mostly converged except for one slowly converging corner, which leads to noise being retained there.

The usual assumption does seem to be that quality improves everywhere in the image at the same rate. That isn't always the case, however, especially when certain surfaces use a Diffuse BSDF and others a Glossy BSDF, or when some areas are lit differently than others. For example, if from the same viewpoint you can see both an outdoor area in broad daylight (where noise disappears quickly) and an enclosed room through an opening (where light enters much more slowly, so it takes more samples to remove the noise), why not use just 10 samples for surfaces lit by the sun and the full 100 for the inside of the room where light struggles to reach?

In my case I realized the problem while rendering a scene where some areas are pitch black and light only shines on parts of the world. Entire tiles were never anything but pure black, yet still took up the same sampling effort. Seeing a CPU thread busy for a minute just on a black square that never changes does give the feeling that something can seriously be improved :)

I'm not sure how doing it per tile would work, although I can imagine different implementations. I think the simplest way is to stop sampling when the color of a pixel no longer changes beyond a certain amount, which means further sampling is unlikely to improve anything. Tile size can vary and would only focus the improvement on some parts of the image, whereas this would guarantee that strictly the necessary amount of effort is spent on each pixel, and stopped at the right moment.

Your best bet is to follow this thread (DON'T COMMENT IN THE THREAD UNLESS YOU ARE A DEVELOPER — there has already been too much noise from artists, not developers, on this topic): https://developer.blender.org/D808

Totally subscribed to that, thank you! I really hope it happens, and that the patch makes it into mainstream Blender someday soon.

Reading the description, I sort of understand what it's trying to do with a per-tile noise level estimation. I'm still worried, however, that with this approach the optimization won't be as good as it could be: unless you use really small tiles (which can occasionally yield slightly lower performance), a tile may cover multiple areas subject to different circumstances.

For example, what if a tile intersects the edge of a wall, and on the left side you have a zone which isn't lit, while on the right half there's a brightly lit zone? If it takes 50 samples to get rid of noise in the lit area, 50 samples would also be used for the black zone, where they're pointless… unless I'm missing something. Of course no method can be perfect, and this happening per tile is WAY better than it happening for the entire image, so I certainly can't complain.
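The waste in that wall-edge scenario is easy to put numbers on. Here is a toy back-of-the-envelope sketch (invented numbers, not measured from Cycles): a 64-pixel tile where half the pixels are pitch black and converge after a single sample, while the other half needs 50. A per-tile stop can only end when its slowest region has converged, so every pixel pays the worst-case cost; a per-pixel stop only pays what each pixel actually needs.

```python
# Per-pixel convergence points for a hypothetical 64-pixel tile:
# 32 black pixels converge immediately, 32 lit pixels need 50 samples.
samples_needed = [1] * 32 + [50] * 32

# Per-tile stopping: the whole tile keeps sampling until its worst
# region converges, so every pixel receives the maximum count.
per_tile = max(samples_needed) * len(samples_needed)

# Per-pixel stopping: each pixel stops as soon as it converges.
per_pixel = sum(samples_needed)

print(per_tile, per_pixel)   # 3200 1632 -- roughly half the work
```

Under these made-up numbers the per-pixel rule does about half the work of the per-tile rule, which is the kind of saving the black-zone argument above is pointing at.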

Otherwise I'd like to understand how the noise level estimation is done. Is it by evaluating how pixel colors change, or by checking the overall roughness of the tile? Hopefully it can tell the difference between noise caused by insufficient samples and roughness caused intentionally by a grainy texture or hard, detailed geometry.