Gooseberry Cycles render times

Thanks for the explanation! Glad to see the improvement in render time. I’m still a fan of just using post, even if you think Cycles shouldn’t have passes as a strength (which I kinda disagree with, to be honest; there’s lots of useful stuff one can do in post, especially when it comes to quick fixes :P)

And also thanks for the further explanation, Hoverkraft; I can see the issue you’re talking about.

True, even if post is not as accurate, it is still indispensable in every pipeline. As such, I hope that getting raw data out of the renderer will be improved upon in the future (deep compositing) and not treated as a stepchild. I get that path tracing and splitting scenes into passes are somewhat conflicting philosophies, but to make a photography analogy, it sounds like saying you want the camera to do the image manipulation because raw files are just huge and unwieldy.

That’s great to hear that the motion blur slowdown issues are being looked at. I, for one, hate vector blur, as it very rarely looks as good as 3D blur and is difficult to get right in post, especially when combined with DoF, so I have my fingers crossed that you guys can find a solution for a good 3D motion blur pipeline with hair and volumes.

Hi @ndy,

Thanks for giving us some technical details. I was wondering, do you use branched path tracing for your ‘farm’ renders? As branched path tracing gives you more control over the samples, you can tweak settings in more depth and optimize your render times this way.

From my tests, branched path tracing is slower on GPU, but with CPU it is definitely faster; it is worth trying!
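For anyone who wants to script the comparison, a minimal bpy sketch (assuming the 2.7x-era Cycles Python API; property names may differ in other versions):

```python
import bpy

scene = bpy.context.scene

# Compare the two integrators on the same scene.
scene.cycles.device = 'CPU'                  # branched tends to win on CPU
scene.cycles.progressive = 'BRANCHED_PATH'   # or 'PATH' for regular path tracing

# Branched path tracing exposes per-ray-type sample counts,
# which is where the extra control comes from.
scene.cycles.aa_samples = 16
```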

Cheers

If you have complex materials, though (which are often needed for truly convincing imagery), branched path tracing can become as much as 3 times slower than regular path tracing, especially if there are refractive surfaces involved.

What is needed is a way to dramatically reduce the additional overhead introduced by branched path tracing; then there would be almost no reason to keep the regular path tracing option around, as it would be pretty much universally slower.

I have to concur. My experience is that every single time I am adjusting settings for a final render, I find plain path tracing just so much quicker to clear up that I don’t bother with branched tracing any more. As soon as you are using 4-5 of the ray types, the times go through the roof, especially with SSS and/or volumetrics and refraction, which I have to use a lot. Add DoF and motion blur and the situation is hopeless for branched tracing.

This is true. On the other hand, BSDFs such as SSS are much noisier with regular path tracing in comparison with branched path tracing.

I find that’s only true if you are rendering just SSS and maybe a ground plane, so you can give all the samples to SSS. Let’s say you’ve got an interior scene with a bowl of fruit in the foreground, so you have to crank SSS, but then there’s that translucent curtain, that glossy floor, a couple of big window lights, and that will kill the branched path tracer.

My approach is to leave all ray samples at 1 and raise AA until everything looks clean, so I know the maximum sample count needed. Then I dial AA down to 4-8 and raise all the ray samples to match the required sample count, and finally dial down individual ray types that can do with fewer samples. I would be happy to learn that I am doing it wrong :slight_smile:
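A hedged bpy sketch of that workflow (2.7x-era property names; the sample counts are just illustrative):

```python
import bpy

cycles = bpy.context.scene.cycles
cycles.progressive = 'BRANCHED_PATH'

# Step 1: every per-ray-type sample count at 1, then raise AA until the
# image clears up; that reveals the total sample count the scene needs.
for prop in ('diffuse_samples', 'glossy_samples', 'transmission_samples',
             'ao_samples', 'mesh_light_samples', 'subsurface_samples',
             'volume_samples'):
    setattr(cycles, prop, 1)
cycles.aa_samples = 64  # illustrative: raise until clean

# Step 2: dial AA down to 4-8 and move the samples into the ray types.
cycles.aa_samples = 8
cycles.diffuse_samples = 8       # 8 AA * 8 diffuse = 64 effective
cycles.glossy_samples = 8
cycles.transmission_samples = 8

# Step 3: reduce individual ray types that can do with fewer samples.
cycles.ao_samples = 2
```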

But as Ace mentioned, as soon as you have a couple of ray types which all have to be sampled anyway, the overhead becomes too much.

Do you guys compare branched path tracing with “sample all lights” direct and indirect checked? That makes quite a difference in time and sampling, because indirect sampling of lights is turned off for regular path tracing.
The workflow for branched path tracing is to first find a minimum number of AA samples which looks good (mostly edges and hair, where aliasing is visible in the alpha channel, motion-blurred parts, or out-of-focus parts). I find usually 32-64 AA samples is okay for shots without MB or strong DoF.
Once this looks OK, leave AA alone and play with the other sample types. Again, these can be checked separately in their respective passes.
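For reference, those “sample all lights” toggles in bpy (2.7x-era property names):

```python
import bpy

cycles = bpy.context.scene.cycles

# With branched path tracing, every lamp can be sampled at each hit,
# for direct and/or indirect bounces.
cycles.sample_all_lights_direct = True
cycles.sample_all_lights_indirect = True  # costs time but reduces indirect noise
```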

I’ve got “sample all indirect” off and “sample all direct” on. I didn’t notice that much of a visual difference in the scenes I tried it on, but the speed hit of “sample all indirect” was very noticeable.

Edit: The kinds of scenes I render most often are:

  • Mostly direct light
  • Simple scenes but heavy geometry (product viz)
  • Complex shaders, translucency, SSS, volumetrics, everything mashed together (medical and technical equipment)
  • Rather high bounce depth (12-24) because of refraction (see the sketch after this list)
  • Most of the time animation, so I can’t push samples too high, and sometimes I do DoF and/or MB in-render
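For scenes like these, the bounce settings might look roughly like this in bpy (a sketch with illustrative values, 2.7x-era property names):

```python
import bpy

cycles = bpy.context.scene.cycles

# Refraction needs enough transmission bounces for rays to pass through
# all the glass interfaces; 12-24 total as in the scenes described above.
cycles.max_bounces = 24
cycles.transmission_bounces = 24
cycles.diffuse_bounces = 4    # direct-light-dominated scenes need few
cycles.glossy_bounces = 8
```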

If someone knows a way to set a glass exit color, that would be helpful too (sorry for the OT).

I have a question, if any developer reads this: will there be noise filtering in path tracing? Something like the radiance filtering method works better for DoF and motion blur.

There’s a fairly hidden optimization notes page for the Cycles engine (http://wiki.blender.org/index.php/Dev:2.6/Source/Render/Cycles/Optimization/Notes) that has some decent information on the best uses for the branched path tracer vs. the normal one. According to that page, scenes with numerous lights, complex materials, or a high bounce count usually render faster with the normal path tracer (and this is what you appear to have found as well). The branched path tracer is designed for simpler lighting setups that are dominated by direct lighting and use relatively low bounce counts (like a single diffuse bounce and maybe 2 to 3 glossy bounces).


@jdent02 Thank you for bringing the above to my attention, most interesting.

However, I’m a really new newbie here, only for the past month in fact; not exactly a newbie to 3D, but most certainly to Blender & Cycles.

So, I had a bit of a struggle rendering this. Bearing in mind that I am new to Blender et al., my question is: are there any ridiculous settings you can see in this screenshot? I did try many different combinations before settling on these. Much obliged if anyone responds, but I understand if you do not.

:eek: Arrgh! Sorry, ignore the glass image; it should have been the screenshot. :o

Attachments: [render settings screenshot]


Your samples are wayyyyy too high. Look at the total sample count (the samples get squared because you have “Square Samples” ticked): 900 AA samples (AA = antialiasing, basically how crisp the edges get) needn’t be more than 16, 32, maybe 64 for this kind of image. And you can set diffuse, subsurface and volume to 1 because you are not using any of those shaders in the scene (I don’t know if this even makes a difference then).

Check the samples on the lamps, maybe 2-4? Check MIS (multiple importance sampling) on the lamps. If you start using a world background, enable MIS there and bump the samples; you shouldn’t need much for clear glass, but once you have something diffuse you will need to up them.

But the main thing is: set AA to something sensible like 4 (4 squared = 16 AA samples); the other samples are scaled off of that accordingly (keep an eye on the total sample count). You will be rendering the same (or imperceptibly different) image quality in a fraction of the time.
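The same advice in bpy form, as a sketch (2.7x-era API; the counts are illustrative):

```python
import bpy

scene = bpy.context.scene
cycles = scene.cycles

cycles.use_square_samples = True
cycles.aa_samples = 4        # squared -> 16 effective AA samples
cycles.diffuse_samples = 1   # unused shader types can stay at 1
cycles.subsurface_samples = 1
cycles.volume_samples = 1

# Per-lamp samples and MIS (2.7x lamp data API).
for lamp in bpy.data.lamps:
    lamp.cycles.samples = 2
    lamp.cycles.use_multiple_importance_sampling = True

# World background: enable MIS there too if you use one.
if scene.world:
    scene.world.cycles.sample_as_light = True
    scene.world.cycles.samples = 4
```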

Regarding glass exit color, this seems to work OK:
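In node terms, one common approach is to mix the glass with a transparent BSDF once the Light Path node’s Ray Depth output passes a threshold, so deep rays exit as a chosen color instead of terminating black. A minimal bpy sketch of that idea (the material name and threshold value below are placeholders; assuming a 2.7x-era API):

```python
import bpy

# Hypothetical material name; the node setup is the point.
mat = bpy.data.materials.new("GlassWithExit")
mat.use_nodes = True
nt = mat.node_tree
nt.nodes.clear()

glass = nt.nodes.new('ShaderNodeBsdfGlass')
transp = nt.nodes.new('ShaderNodeBsdfTransparent')
lp = nt.nodes.new('ShaderNodeLightPath')
gt = nt.nodes.new('ShaderNodeMath')
mix = nt.nodes.new('ShaderNodeMixShader')
out = nt.nodes.new('ShaderNodeOutputMaterial')

# Once the ray depth passes the threshold, switch from glass to a
# transparent "exit color" instead of letting the ray go black.
gt.operation = 'GREATER_THAN'
gt.inputs[1].default_value = 8.0  # placeholder: just below your transmission bounce limit

transp.inputs['Color'].default_value = (1.0, 1.0, 1.0, 1.0)  # the exit color

nt.links.new(lp.outputs['Ray Depth'], gt.inputs[0])
nt.links.new(gt.outputs['Value'], mix.inputs['Fac'])
nt.links.new(glass.outputs['BSDF'], mix.inputs[1])   # below threshold
nt.links.new(transp.outputs['BSDF'], mix.inputs[2])  # past threshold
nt.links.new(mix.outputs['Shader'], out.inputs['Surface'])
```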


@Hoverkraft, thank you very much for sharing your experience. Early on I did render it with the default AA and couldn’t believe how bad the jaggies looked. Although at the time I was still experimenting with knife-cutting my box model, and the subdivision was naff to say the least. So, as I improved the model, I gradually increased the AA settings.

I tried reducing the volume settings, and the result was a change in the world colour where the glass is, to the extent that I lost all transparency. So I gradually ramped the volume setting back up until the colour matched and the transparency returned.

Off the top of my head I’m pretty sure I have already checked the MIS, certainly regarding the area light. I will have to check the other three lights and the world settings.

As to the noodles; well though I use nodes in World Machine and Terragen 3 I’ve not yet ventured into Blender’s node system although I have studied it with the Mark I eyeball.

I must confess that I’m intending to link Blender to the Thea renderer in the fullness of time, once I feel more comfortable modelling with Blender. So, in closing, again many thanks for your help, which has been most useful.

A gentleman and a scholar. Cheers.

Attachments: [glass render]


I use Thea myself for some cases; a scene like this, with caustics etc., is especially suited to Thea, so I can recommend it. Your glass geometry looks wonky, though. Take a look at this tutorial: http://adaptivesamples.com/2013/10/19/fluid-in-a-glass/

The volume settings shouldn’t be doing anything!? Are you using a volume shader somewhere, in the world or in the glass? You shouldn’t need one, at least not until you want to add some absorption, either in the glass (colored glass) or in the liquid. But leave that for later. Check your world and shaders for volume; there shouldn’t be anything connected to the Volume socket in the node tree.

Keep going; having used Terragen, you know that this learning curve is steep and tedious. Don’t be scared of the node system, once you get the hang of it you’ll love it.

As a side note: make sure that the UV/Image Editor’s left-hand “T-key” panel is never open while you render.


In some cases, updating those graphs seems to cause so much overhead that the render performance virtually plummets - which could very much explain why your glass took 15 hours(?!?) to render…


In addition to that:
Clamping values far below 1 will totally kill many of your lighting efforts. Clamping means that all brightness values are cut off at the value you specify. With values lower than 1, you’re even cutting away parts of the meagre brightness range of an LDR image…
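As a toy illustration of what clamping does to each sample (not the actual Cycles kernel code):

```python
def clamp_sample(value, clamp_max):
    """Cut off any sample brighter than clamp_max before it is accumulated."""
    return min(value, clamp_max)

# A bright highlight sample of 5.0 with a clamp of 0.98:
print(clamp_sample(5.0, 0.98))  # -> 0.98, the highlight is flattened
# With a clamp >= 1.0, only super-bright "fireflies" are affected:
print(clamp_sample(5.0, 3.0))   # -> 3.0
print(clamp_sample(0.7, 3.0))   # -> 0.7, unchanged
```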

Have you logged a bug about the histogram behaviour? That’s actually pretty bad. Though I believe that in this case, with 32400 glossy samples, the render time would have been too long anyway.

Good catch about the clamp settings!

Well, if you know about it, then it’s not a big deal. If you don’t, it is one of the things that could make someone abandon Blender.

Should I log a bug or will you? :slight_smile: (Not that I am lazy, I just have to put the kids to bed, but I can log the bug later.)