netrender still image?

Trying to speed up render times.
Is there a way to network-render large single-frame (photorealistic) images with Cycles, for example with netrender?

Would like to add a question to this.
Is there a way to divide tasks over the network for still images? Meaning, can I divide an image into 10 strips and have 2 computers render 5 strips each, like Backburner does for Max?
As far as I understand, netrender only works for animations, and the chunks option refers to chunks of frames rather than chunks of a single frame.

Thanks

When you are connected as a client, have you tried clicking “Send current frame job”? It should send the current frame, and only that frame, to the server. However, I don’t think you can split that frame up to render across multiple computers.

Of course, we want to render one still image with multiple computers and multiple GPU cards for a faster Cycles render.

I think there is a script floating around just for this task. The basic logic is that it creates a series of border renders for different sections of the image on each render machine (you can set this up manually too). Then you have to put all the pieces together in Photoshop or GIMP.
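To give an idea of how such a script could work, here is a minimal sketch (not the script mentioned above) that slices the frame into horizontal strips with the Render Border and renders only the strips assigned to the current machine; the strip count, assignment and output path are all placeholders:

```python
# Minimal sketch: render a subset of horizontal strips of the current frame
# using Blender's Render Border, so each machine handles its own slices.
# TOTAL_STRIPS, MY_STRIPS and the output path are hypothetical.
import bpy

TOTAL_STRIPS = 10            # how many slices the frame is cut into
MY_STRIPS = [0, 1, 2, 3, 4]  # slices THIS machine should render (strip 0 = bottom)

scene = bpy.context.scene
scene.render.use_border = True
scene.render.use_crop_to_border = True   # save only the strip, not a padded full frame

for i in MY_STRIPS:
    # Border coordinates are normalized (0..1), measured from the bottom-left corner.
    scene.render.border_min_x = 0.0
    scene.render.border_max_x = 1.0
    scene.render.border_min_y = i / float(TOTAL_STRIPS)
    scene.render.border_max_y = (i + 1) / float(TOTAL_STRIPS)
    scene.render.filepath = "//strips/strip_%02d" % i
    bpy.ops.render.render(write_still=True)
```

The saved strips then still have to be reassembled by hand (or by another script), exactly as described above.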

Hi antonvdh,

Maybe this:

Rendering a Single Frame Faster with Multiple Computers

OK, but cumbersome.

Interesting, thanks.

The Blender network rendering add-on desperately needs automated image merging. Maxwell Render automatically gathers and merges all of the images rendered on each separate machine for still-image renderings; this would be ideal. I do mainly single-image arch-viz renderings and I am migrating Cycles into my workflow, but I cannot take full advantage of my render farm. I know there are some workaround options, but they are not the most convenient. RenderStreet is a good option, but it would not always be needed if I could take full advantage of my local farm in a convenient manner.

Thanks for mentioning us, Aaron. We do indeed have our own solution for rendering a still image on multiple servers. This is useful for large images that would take too long to render on one machine, or would not render at all (because of Blender crashes). The system works for Blender Internal, Cycles, and V-Ray for Blender. The only disadvantage of this kind of usage is that compositing is disabled (including the file output nodes).

@RenderStreet,

Will there, in future development, be an implementation for rendering a single image on multiple computers where compositing and file output nodes are supported?

We would really love to do it, as it would also simplify the processes on our side. Unfortunately, there are compositing effects that are not computed the same when you only render part of the file (a blur or glare node, for example, needs pixels from outside the rendered region). This makes the stitched file unusable.

We’re looking into ways to work around this issue and will definitely implement it if possible. This is likely to take a while, though, if it is possible at all.

I really do not need the compositing computed on the render farm. I just need the multilayer .exr file with all of the associated render layers. For a single image, it is easy enough to set up the compositing and render the effects locally on my computer.

Unfortunately, we haven’t found a way to properly stitch the multilayer EXR files. We’re still looking into this and are open to suggestions :slight_smile:
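For what it’s worth, the naive version of the stitch is easy enough to sketch: read every channel of every strip and concatenate them vertically. The sketch below assumes single-part multilayer files, strips rendered with Crop to Border, identical channel layouts, 32-bit float data, and hypothetical file names listed top to bottom; mismatched data windows, per-layer metadata and half-float passes are exactly where “properly” becomes the hard part.

```python
# Naive per-channel stitch of horizontal EXR strips (sketch only).
# Assumes: single-part multilayer files, identical channel sets, float pixels,
# equal widths, and strip_paths ordered from the TOP strip to the BOTTOM strip.
import numpy as np
import OpenEXR
import Imath

strip_paths = ["strip_top.exr", "strip_bottom.exr"]   # hypothetical names
FLOAT = Imath.PixelType(Imath.PixelType.FLOAT)

channel_rows = {}   # channel name -> list of per-strip pixel arrays
width = None

for path in strip_paths:
    exr = OpenEXR.InputFile(path)
    header = exr.header()
    dw = header["dataWindow"]
    w = dw.max.x - dw.min.x + 1
    h = dw.max.y - dw.min.y + 1
    width = w if width is None else width
    for name in header["channels"]:
        raw = exr.channel(name, FLOAT)   # raw scanlines for this strip
        pixels = np.frombuffer(raw, dtype=np.float32).reshape(h, w)
        channel_rows.setdefault(name, []).append(pixels)
    exr.close()

# Stack each channel's strips vertically and write a single output file.
stitched = {name: np.concatenate(rows, axis=0) for name, rows in channel_rows.items()}
total_height = next(iter(stitched.values())).shape[0]

out_header = OpenEXR.Header(width, total_height)
out_header["channels"] = {name: Imath.Channel(FLOAT) for name in stitched}
out = OpenEXR.OutputFile("stitched.exr", out_header)
out.writePixels({name: arr.tobytes() for name, arr in stitched.items()})
out.close()
```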

Here is an idea.

This is basically how Maxwell Render works too. It distributes the job so that each computer renders with a different seed, then it merges the images automatically. I do not know if this works with .exr files, though. This may be similar to what you are already doing; I do not know.

In the video tutorial that I have attached, maybe this could be done: when the images are finished on each machine, they are somehow merged through the node setup shown in the video.
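Just to illustrate the merge step I mean: if every machine renders the same frame with the same sample count but a different Cycles seed (scene.cycles.seed), averaging the results is roughly equivalent to one render with the combined sample count. A rough sketch with hypothetical file names, ignoring that the averaging should really be done on linear data rather than on the saved PNGs:

```python
# Rough sketch: average several renders of the same frame, each made with a
# different Cycles seed, to approximate a single higher-sample render.
# File names are placeholders; averaging 8-bit PNGs is an approximation,
# since strictly the averaging should happen on linear (pre-display) data.
import numpy as np
from PIL import Image

seed_renders = ["seed_00.png", "seed_01.png", "seed_02.png", "seed_03.png"]

accum = None
for path in seed_renders:
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    accum = img if accum is None else accum + img

merged = (accum / len(seed_renders)).round().astype(np.uint8)
Image.fromarray(merged).save("merged.png")
```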

Yes, we are aware of this technique. It should work, though it will not address the multilayer issue. As far as I know, it works for PNG files only; it will not work for multilayer files, as the merge would have to be applied to every layer in the file.