[Arnold] Render Tests

Hi All,

I have cobbled together a basic mesh exporter for the Arnold render engine. It renders directly from Blender and displays the result in the Image viewer for further compositing. The current exporter only supports object types of MESH, CURVE, FONT, LAMP, and CAMERA. The camera is DOF-enabled. I have routed some minor color options from the material panel as well. Right now all surfaces are considered globally smooth; I am not sure how to specify smoothing for individual faces yet.

Here are a few test scenes I pulled from Blendswap.

I have routed the HEMI light to the Arnold skydome_light node. This offers an easy way to add general lighting to a scene with soft shadows.

That’s impressive. Keep at it.

Thanks for the encouragement. (I checked out your slogobox…neat)

After thinking about it, however, I realized that the skydome_light node really belongs in the Environment Lighting section of the World context, so I have moved it to that location in the panels. I am fetching values from the fields highlighted in yellow to offer a bit more control over the skydome_light node.
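
For the curious, the values come from the stock Environment Lighting fields. A simplified sketch of how they could feed the skydome_light block (illustrative only, not the exporter's exact code):

    import bpy

    # Illustrative only: read the World's Environment Lighting settings and
    # emit a skydome_light block for the .ass file.
    def skydome_block(world):
        env = world.light_settings
        if not env.use_environment_light:
            return ""
        r, g, b = world.horizon_color
        return (
            "skydome_light\n"
            "{\n"
            " name sky_environment\n"
            " intensity %f\n"
            " color %f %f %f\n"
            " samples %d\n"
            "}\n" % (env.environment_energy, r, g, b, env.samples)
        )

    print(skydome_block(bpy.context.scene.world))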


I have reverted the HEMI lamp to being a plain old non-shadow ambient_light.


Atom, nice process, can you say something about performance compared to Cycles?

Cheers, mib

Great news Atom, really nice to see something with Arnold on Blender! :slight_smile: I’m totally ignorant on the matter, but I heard that with Alembic support the export process to Arnold would be much easier… is that correct?

Anyway, it is really great to see something like this! I would also be interested in some comparisons!

Keep up the good work! :slight_smile:

As far as speed comparisons go, it would not be a fair comparison at this time. Overall my exporter will always be slower than Cycles because it is written in Python, which runs on a single core. At a glance it is easy to say that Cycles will render faster than Arnold on Blender right now, but Arnold renders faster than Pixie, that is for sure.

There are two ways to program Arnold to render your image. One way is to import the arnold python module.

import arnold

As a programmer, when you have access to the arnold Python module you kind of “turn on” the Arnold engine and put it in standby while your code creates nodes in memory that represent your scene or the tasks you want to complete. This is how MtoA and StoA seem to work. However, when I installed Arnold using the Maya 2015 installer the arnold Python module was not registered with my system, so when I try to import arnold I get nothing. I could probably locate the file, set up an environment variable pointing to it, and proceed as expected, but I did not want to have to explain that to anyone using the exporter, so I opted to program Arnold using the second method.
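
For reference, here is a minimal sketch of that first approach using the standard Arnold API calls. This is only an illustration, not code from the exporter: the sphere scene and names are made up, and the transforms and output driver setup are omitted.

    import arnold

    arnold.AiBegin()                                # "turn on" the Arnold engine

    sphere = arnold.AiNode("sphere")                # create nodes in memory
    arnold.AiNodeSetStr(sphere, "name", "my_sphere")
    arnold.AiNodeSetFlt(sphere, "radius", 1.0)

    camera = arnold.AiNode("persp_camera")
    arnold.AiNodeSetStr(camera, "name", "my_camera")

    options = arnold.AiNodeLookUpByName("options")  # global render options
    arnold.AiNodeSetPtr(options, "camera", camera)

    arnold.AiRender(arnold.AI_RENDER_MODE_CAMERA)   # render (no driver set up here)
    arnold.AiEnd()                                  # shut the engine down again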

The second method for rendering with Arnold is to generate an .ass file. This is the equivalent of a RIB file, if you are familiar with RenderMan render engines. The .ass file is an ASCII representation of the scene. When I examine the log displayed by the kick renderer it clearly tells me that 50%-90% of the time spent “rendering” is really spent decoding this ASCII text file, and it urges me to optimize my file. This is another reason why a speed comparison would not be valid at this time. I am a new user of Arnold and have no idea how to optimize the .ass file or what are considered “bad” techniques when it comes to scene definition. As I learn more about the system I may be able to produce more efficient .ass files, but at this time I am just trying to implement features.
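
As a rough illustration of that second path, the exporter's job boils down to writing node blocks as plain text and handing the file to kick. The tiny options-only file and the filenames below are just an example:

    import subprocess

    # A minimal .ass options block written as plain text. A real scene file
    # also needs camera, light and polymesh blocks; they are omitted here.
    ass_text = (
        "options\n"
        "{\n"
        " AA_samples 3\n"
        " xres 960\n"
        " yres 540\n"
        "}\n"
    )

    with open("scene.ass", "w") as ass_file:
        ass_file.write(ass_text)

    # Hand the file to the kick command-line renderer that ships with Arnold.
    subprocess.call(["kick", "scene.ass"])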

Overall I like the look of the resulting images. I feel that they have less grain than Cycles images which is nice.

SolidAngle provides OK documentation at this time, but there are a few questions I have that I cannot seem to get answered. The first question is about how to link a series of shaders to a polymesh so I can support multiple materials per mesh. Shaders can be an array and Arnold supports up to 256 materials per mesh (Blender Internal supports up to 65,535 per mesh…I believe), but I have yet to see an example .ass file that shows how to set this up.

If you would like to help and have Maya or Softimage installed on your system, please export a single cube with a unique material on each side as an .ass file. Place the .ass file in a ZIP and attach it to the thread. If I could see a “working” example I could probably get it to work.

check inbox

@titipuchal: Thanks for the file, that was just what I needed.

Multiple materials are now working. I have also routed a few more parameters to the shader itself so it now supports…diffuse, specular, reflection, emission, transparency, translucency, IOR for reflection, Fresnel, mirror fade to sky, and subsurface scattering (as supported by the standard Arnold shader). There is another SSS shader node in Arnold that is completely dedicated to subsurface scattering.
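
For anyone curious how the per-face assignment works: an Arnold polymesh carries a shader array plus a per-face shidxs index array, and on the Blender side the per-face index is simply each polygon's material_index. A simplified sketch of gathering those values (illustrative only, not the exporter's actual code):

    import bpy

    # Illustrative only: gather the data needed for a multi-material polymesh.
    # One shader entry per material slot, one shidxs entry per face.
    def gather_shader_assignments(obj):
        shader_names = [slot.material.name for slot in obj.material_slots if slot.material]
        shidxs = [poly.material_index for poly in obj.data.polygons]
        return shader_names, shidxs

    names, shidxs = gather_shader_assignments(bpy.context.active_object)
    print(names)
    print(shidxs)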


If you turn on Wire for the material you get a wireframe render of your object, in quads, with the diffuse color controlling the stroke color and the specular color controlling the fill color of the object. You can set the wire width via the Edge Threshold setting under the Render context.
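
A node like this can be expressed with Arnold's wireframe shader. Here is a rough sketch of the mapping described above; the helper and the .ass formatting are simplified and only illustrative, not the exporter's real output:

    import bpy

    # Illustrative mapping: a Blender material set to Wire becomes an Arnold
    # wireframe shader block, with diffuse -> line_color, specular -> fill_color
    # and the Render context's Edge Threshold -> line_width.
    def wire_shader_block(mat, scene):
        lr, lg, lb = mat.diffuse_color
        fr, fg, fb = mat.specular_color
        width = scene.render.edge_threshold
        return (
            "wireframe\n"
            "{\n"
            ' name "%s_wire"\n'
            " line_color %f %f %f\n"
            " fill_color %f %f %f\n"
            " line_width %f\n"
            "}\n" % (mat.name, lr, lg, lb, fr, fg, fb, width)
        )

    mat = bpy.context.active_object.active_material
    print(wire_shader_block(mat, bpy.context.scene))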


wow!!! that was fast. amazing job!
what about render time displacement? That was the reason for my interest in arnold, because cycles displacement seems to be so far away.

I plan to take a look at displacement and subdivision at the same time. Currently the Arnold exporter does render the current state of the Blender modifier stack, so if you use modifiers you can just render right now as a polymesh. But my thought is, instead of applying the modifiers to the polymesh on export, to transfer the Displace and Subsurf modifier parameters to the built-in functions that Arnold has. Then it might be possible to really crank up the Render subdivision of the modifier and let Arnold handle that at render time. The exporter wouldn’t have to generate the subdivided mesh values either.
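
A small sketch of that idea; the Arnold parameter names are the polymesh's built-in subdiv_type, subdiv_iterations, disp_height and disp_zero_value, and the Blender side just reads the modifier settings instead of applying them (illustrative only):

    import bpy

    # Illustrative only: translate Subsurf/Displace modifier settings into the
    # polymesh's render-time subdivision and displacement parameters instead
    # of applying the modifiers during export.
    def subdiv_and_disp_params(obj):
        lines = []
        for mod in obj.modifiers:
            if mod.type == 'SUBSURF':
                lines.append(" subdiv_type catclark")
                lines.append(" subdiv_iterations %d" % mod.render_levels)
            elif mod.type == 'DISPLACE':
                lines.append(" disp_height %f" % mod.strength)
                lines.append(" disp_zero_value %f" % mod.mid_level)
        return "\n".join(lines)

    print(subdiv_and_disp_params(bpy.context.active_object))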

I still need to get up to speed on textures, both procedural and image based. This means tackling UVs as well. So there is still a lot to do.

I played around with some better default sample/bounce options. Here are a couple of speed tests.


    AA_samples 3
    xres 960
    yres 540
    GI_single_scatter_samples 3
    GI_diffuse_samples 3
    GI_diffuse_depth 5
    GI_glossy_samples 3
    GI_glossy_depth 5
    GI_reflection_samples 3
    GI_reflection_depth 5
    GI_refraction_samples 3
    GI_refraction_depth 5
    GI_total_depth 3
    low_light_threshold 0.1

Above settings with the Environment skydome enabled @ 5 samples = 1:41 on an AMD 6-core @ 2.7GHz.


Above settings with the Environment skydome disabled = 1:08 on an AMD 6-core @ 2.7GHz.


NOTE: GI_total_depth is set to 3. I think this means that even though the other depths are set to 5, the maximum any of them can ever reach is GI_total_depth. GI_total_depth can be thought of as the big ol’ speed vs. quality knob.

I have routed the Cycles Samples and Light Paths panels into the Arnold render settings. Right now the interface is looking a little Frankenstein, with some panels from Blender Internal and others from Cycles, but it does allow us to work with parameters that we are already familiar with. Arnold uses squared samples by default, so you have to be careful when setting the master Render samples: 5 = 25, 10 = 100 and so on.

NOTE: The Preview samples value is re-used to control all of the other sample settings.
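
Roughly, the mapping looks like this, assuming the values are passed through unchanged (the fields read here are the stock Cycles ones):

    import bpy

    # The master Render samples value becomes AA_samples, which Arnold squares
    # per pixel (5 -> 25, 10 -> 100). The Preview samples value is re-used for
    # the various GI_*_samples settings.
    scene = bpy.context.scene
    aa = scene.cycles.samples
    gi = scene.cycles.preview_samples
    print("AA_samples %d  (%d camera samples per pixel)" % (aa, aa * aa))
    print("GI_diffuse_samples %d" % gi)
    print("GI_glossy_samples %d" % gi)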

With the Environment skydome enabled @ 3 samples = 4:49 on an AMD 6-core @ 2.7GHz.


Wow, this is one of the most interesting Blender-related projects I’ve seen in a long time. Kudos Atom, amazing job. Hopefully Blender will upgrade some of its features to make it really worthwhile to have Arnold. If your script gets completed, the bottleneck would be Blender for sure, as it is quite bad at handling big projects at the moment, which is actually the strong point of Arnold Render.

Thanks for the encouragement, Sam. :smiley:

I conducted a bone deformation test and the exporter seems to work with this low-poly dragon test rig by zoltan miklosi.

With the Environment skydome enabled @ 3 samples = 5:49 on an AMD 6-core @ 2.7GHz.


Render time 0:06


@titipuchal: Thanks for sending along the physical_sky example .ass file! I now have it incorporated into the Arnold renderer. You can activate it using the Real Sky checkbox of the World context. I have also added a simple fog shader, activated by the Mist checkbox.
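
For reference, here are illustrative .ass fragments for the two new options. The node and parameter names are those of Arnold's physical_sky and fog shaders, but the values and the hook-up are just an example:

    # Illustrative only: the kind of blocks the two checkboxes add to the
    # .ass file. "real_sky" would be linked in as the sky/background color,
    # and "mist" pointed to by the options block's atmosphere parameter.
    real_sky = (
        "physical_sky\n"
        "{\n"
        " name real_sky\n"
        " elevation 45\n"
        " azimuth 90\n"
        " intensity 1\n"
        "}\n"
    )
    mist = (
        "fog\n"
        "{\n"
        " name mist\n"
        " color 0.8 0.8 0.9\n"
        " distance 0.02\n"
        "}\n"
    )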


The emission through the fog may be causing the fireflies? These are the first I have seen.

I have duplivert and dupliface working. Here is a low-poly sphere as the child of a softbody icosphere that has fallen to the ground. I have also set up the framework for handling particle and group participants the same way.
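
One way to expand duplis on export is to let Blender bake them into a temporary list; a minimal sketch of that idea (illustrative only, not necessarily the exporter's actual loop):

    import bpy

    # Illustrative only: Blender can expand duplis (verts, faces, particles,
    # groups) into a temporary list where every entry carries the source
    # object plus a world matrix for that copy.
    scene = bpy.context.scene
    emitter = bpy.context.active_object

    emitter.dupli_list_create(scene)
    for dup in emitter.dupli_list:
        print(dup.object.name, dup.matrix)   # source object + per-copy transform
    emitter.dupli_list_clear()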

Dupli Face


Dupli Vert


I like what I see Atom! You are working fast! :slight_smile:

The fireflies are probably secondary reflections of the sun (blurred glossy floor > sharp glossy sphere > sun). I’ve run into that issue with sun/sky in Arnold quite a bit. SA even has an entire wiki page on it: https://support.solidangle.com/display/mayatut/Fireflies+-+Boat+Scene

And great work, btw. :slight_smile: Especially great to see duplis up and running, one of the better ways to handle high poly counts in Blender. Blender can chew through higher poly counts than some people give it credit for, you just need to be careful about how they’re handled. Speaking of polycounts, do you plan to add standin support? Once you have standins, who cares how few polys Blender can handle at once. (I can generate a set of .ass files for a standin setup if you need them. I haven’t looked into Arnold’s scene format too much, so I’m not sure how standins are dealt with.)

I have implemented dupligroup support for Empties. The exporter can detect any of the supported object types inside a group, including lamps. Groups inside groups are not supported at this time.


Speaking of polycounts, do you plan to add standin support?

@J_theNinja: As I implemented this dupligroup feature I read the description of the ginstance node. I think I may have to refactor my scene generation to support instances first. Currently, this exporter can easily generate a 50MB-100MB .ass file from a fairly simple Blender scene, so trimming that file size is starting to become more of a priority.

While instances are not the same as standins, they would be a step in the right direction towards making any part of the puzzle replaceable with another element.
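
For reference, a sketch of what the ginstance route could look like in the .ass file: the heavy polymesh is written once and every copy becomes a small instance block that points back at it. The node and parameter names are Arnold's ginstance; the names and values here are made up and the per-instance matrix is left out:

    # Illustrative only: one master polymesh, many lightweight instances.
    # Each ginstance block references the master by name and would normally
    # also carry its own matrix parameter for the per-copy transform.
    def ginstance_block(instance_name, master_name):
        return (
            "ginstance\n"
            "{\n"
            ' name "%s"\n'
            ' node "%s"\n'
            " inherit_xform off\n"
            "}\n" % (instance_name, master_name)
        )

    print(ginstance_block("cube_instance_001", "cube_master"))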

This is fantastic! While I’d love to see deep integration at some point (my personal tests writing directly to .ass hit the same speed bumps you are hitting), this is an amazing start! As a long-time Arnold user, I’ve wanted to see proper support in Blender for a while now. I’ll be keeping a very close eye on this project :smiley: