YafaRay 0.1.5 for Blender 2.67 released

It also reports version 0.1.5, and it works with Blender v2.67.1 r56680.

Material preview is not available.

Could someone explain to me what the main benefits of using YafaRay are? Is it meant to be used for something specific?

That’s what I’d like to figure out too.

After hours of testing, the pathtracer engine isn’t really faster than Cycles; it can take a long time to get noise-free results.
The photon map engines produce a lot of artefacts; maybe I just can’t find the right parameters, but I can’t get a clean render with them.
The bidirectional engine doesn’t seem to work at all, at least when using the addon. I guess there must be some features under the hood?
That’s why I can’t wait for a decent light cache for pathtracing.

Thanks, Alvaro!

The megasoft builds are actually 0.1.5 versions; I see they are updated with the latest changes from the YafaRay trunk. The versioning info just hasn’t been updated, that’s all.

For those wondering about the premultiply option: premultiply is now enabled automatically by the exporter when rendering in Blender, so that option has been removed from the UI.

lucky: photon mapping causes the most problems with artifacts and noise when photon density is too low in the area being rendered, usually due to an inefficient lighting setup. Maybe the photon mapping settings could be improved too, but without a specific example I couldn’t say for sure what your problem is.
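A rough back-of-envelope way to see what “photon density too low” means in practice: if the stored photons are spread roughly evenly over the scene’s surfaces, only a tiny fraction falls inside any single gather disc, so the totals climb fast. The numbers and the helper below are purely illustrative, not YafaRay settings or code:

```python
import math

# Rough estimate of how many photons need to be *stored* on the scene's
# surfaces so that a gather disc of radius r contains about k photons.
# All numbers are made-up examples, not YafaRay defaults.

def photons_needed(surface_area_m2, gather_radius_m, photons_per_estimate):
    disc_area = math.pi * gather_radius_m ** 2
    return photons_per_estimate * surface_area_m2 / disc_area

# Example: a small room with ~100 m^2 of interior surface, a 10 cm search
# radius and ~200 photons per radiance estimate.
stored = photons_needed(surface_area_m2=100.0,
                        gather_radius_m=0.10,
                        photons_per_estimate=200)
print(f"~{stored:,.0f} stored photons")  # roughly 640,000

# Photons that escape through the windows or get absorbed are never stored,
# so the emitted count has to be noticeably higher than this, which is why
# interior scenes easily end up in the millions.
```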

I’ve tested a simple interior scene, a room with two windows. I’ve tried to tune the photon parameters, up to 4 million photons, increasing and decreasing “search count” and “radius”, and tested different things such as area lights and BG portals for the windows… But nothing really worked; I always get a lot of artefacts.

@lucky:
Use only geometry with real depth, not just thin faces. I find it’s good to use photon emitters pointing only into the interior (area lights in the windows); this way you don’t get light leaks. Portal lights help quite a bit too. Tune your photon map using “Show radiance map”: there shouldn’t be any light leaks and it should provide enough detail, but sometimes you can get away with an imperfect map because it’ll get smoothed by FG. And don’t be afraid to go really high with photons (sometimes I use like 25 million and it takes about a minute on my machine).

But I agree that photon mapping is obsolete. One of the first GI techniques. The best render engines like V-Ray or finalRender use a light cache for secondary bounces.
For bi-dir you need to set many AA samples and the AA threshold to 0.0 to sample the whole image. But I noticed a strange bug in the 0.1.5 release: it doesn’t sample the very top-left bucket and it stays grainy.

The addon was upgraded without a version number change because we forgot to include some stuff when compiling. The builds posted by megasoft on GraphicAll.org on May 11th are the good ones.

Cycles path tracing on GPU is a superior solution to anything YafaRay has got… except when path tracing is not the best solution for the job. In general, you should use YafaRay:

If you don’t own a CUDA card
If Cycles cannot cope with Monte Carlo noise in an efficient manner
If your scene needs lots of caustic work. Example: http://www.titanic3d.ca/

Photon mapping is really easy to configure once you understand how it works.

But I agree that photon mapping is obsolete. One of the first GI techniques.

Path tracing is older than photon mapping and much older than stochastic progressive photon mapping. I believe that photon mapping is the only solution that can reproduce all GI lighting scenarios with enough efficiency and consistency: from the same photon emission you can do not only diffuse interreflection and caustic paths, you can do all kinds of volumetric stuff and SSS. No need to use different GI algorithms for each job.

I believe that in the end path tracing is going to be beaten by non-brute-force solutions in animation too, the same way it was beaten for indoor and caustic work in the past. Intelligent brute force. SPPM is only the first step in the right direction.
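For what it’s worth, the step that makes SPPM “intelligent brute force” is its progressive radius reduction: every photon pass shrinks the gather radius a little while the accumulated photon count keeps growing, which is where its consistency comes from. A minimal sketch of that update rule, following the published progressive photon mapping formulation (the alpha value and numbers are just illustrative; this is not YafaRay code):

```python
def shrink_radius(radius, accumulated, new_photons, alpha=0.7):
    """One progressive photon mapping update after a photon pass.

    radius      -- current gather radius at a measurement point
    accumulated -- photons kept from all previous passes (N)
    new_photons -- photons found inside the radius this pass (M)
    alpha       -- fraction of new photons to keep, 0 < alpha < 1
    """
    if new_photons == 0:
        return radius, accumulated
    kept = accumulated + alpha * new_photons
    new_radius = radius * (kept / (accumulated + new_photons)) ** 0.5
    return new_radius, kept

# Toy run: the radius shrinks (bias goes away) while the kept photon count
# keeps growing (variance goes away) -- that is the consistency argument.
r, n = 0.1, 0.0
for i in range(5):
    r, n = shrink_radius(r, n, new_photons=1000)
    print(f"pass {i}: radius = {r:.4f}, photons = {n:.0f}")
```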

Thank you for the information.

Alvaro,

I’d love to be pointed in the direction of a photon mapper (especially a progressive one) that can handle homogeneous and heterogeneous volumes. I’ve never encountered one, and I know that in the past it’s been one of the hardest problems to solve.

Both RenderMan and Mental Ray implement it.
Photon mapping for volume rendering was published in ’98, so it’s not a particularly new trick:
http://graphics.ucsd.edu/~henrik/papers/sig98.html
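For reference, the core of that paper’s volume trick is compact: march along the eye ray through the medium and, at each step, estimate the in-scattered light from the photons stored inside the volume. A very reduced sketch, assuming a homogeneous medium, an isotropic phase function and a brute-force range query instead of a kd-tree; the names here are illustrative, not RenderMan, Mental Ray or YafaRay internals:

```python
import math
from dataclasses import dataclass

@dataclass
class Photon:
    position: tuple   # (x, y, z) inside the medium
    power: float      # stored flux carried by the photon

def photons_within(photons, point, radius):
    """Brute-force range query; a real implementation uses a kd-tree."""
    r2 = radius * radius
    return [p for p in photons
            if sum((a - b) ** 2 for a, b in zip(p.position, point)) <= r2]

def in_scattered(point, photons, radius):
    """In-scattered radiance per unit length: photon power gathered in a
    sphere around the point, with an isotropic phase function (1 / 4*pi)."""
    sphere_volume = (4.0 / 3.0) * math.pi * radius ** 3
    phase = 1.0 / (4.0 * math.pi)
    power = sum(p.power for p in photons_within(photons, point, radius))
    return phase * power / sphere_volume

def march(origin, direction, photons, sigma_t, step=0.05, radius=0.1, t_max=5.0):
    """Accumulate in-scattering along the ray with plain ray marching."""
    radiance, transmittance, t = 0.0, 1.0, 0.0
    while t < t_max:
        mid = tuple(o + d * (t + 0.5 * step) for o, d in zip(origin, direction))
        radiance += transmittance * in_scattered(mid, photons, radius) * step
        transmittance *= math.exp(-sigma_t * step)  # Beer-Lambert attenuation
        t += step
    return radiance

# Toy usage: a handful of photons sitting in front of the camera.
cloud = [Photon(position=(0.0, 0.0, z), power=0.01) for z in (1.0, 1.5, 2.0)]
print(march(origin=(0.0, 0.0, 0.0), direction=(0.0, 0.0, 1.0),
            photons=cloud, sigma_t=0.2))
```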

Thanks for the new release! And for those who doubt Yaf(a)ray: point me to Cycles interiors that look better than Yaf(a)ray ones. I cannot explain it, but Yaf(a)ray’s light looks much better, much more photorealistic, than what Cycles offers.

There is a lot of research about volumetric photon mapping, but I have never seen a progressive photon mapper doing volumetric work and SSS yet. We need more research not only on that, but also on improving SPPM convergence in diffuse interreflection, so that with lots of photon passes we can make it converge with the consistency of a path tracer. I believe it can be done. SPPM is still too focused on caustic problems. On the other hand, consistency is the only factor that keeps pathtracing in use nowadays, but we pay a high price for it, sometimes in a literal way.

Anyway, my point is that with photon mapping you can really have a global GI algorithm with light paths working in all media and changing from one to another without problems, something observer-dependent algorithms have never achieved. There is the undeniable fact that some GI problems are just better solved from the light source’s viewpoint.

I’d love to be pointed in the direction of a photon mapper (especially a progressive one) that can handle homogeneous and heterogeneous volumes.

Actually, I think you can render volumes in SPPM mode with LuxRender. Never tried heterogeneous, though.

point me to Cycles interiors that look better than Yaf(a)ray ones. I cannot explain it, but Yaf(a)ray’s light looks much better, much more photorealistic, than what Cycles offers.

That’s a totally subjective view; you can check these comparison renders: http://www.blendernation.com/wp-content/uploads/2012/06/ea85ae3f09a2cb67a4ae0085760dbfc4.jpeg

Maybe you can notice slight differences between the two pathtracers (light leaks, approximations), but a pathtracer is a pathtracer anyway.

No, you can’t. You can set the scene up, but the photons won’t “catch” in the volume; they just pass right through, and the volume is only picked up on the eye pass.

The link does not work for me.

Yep, it seems the direct link to the pic can’t be used from an external page.