How does Blender's Cycles compare to other rendering engines?

@piotr,
i guess what i really mean to ask is why don’t i have the option to save the results? i’m able to play test the scene within blender at high frame rates, so why shouldn’t i be able to save it out to a file?

thanks for the opengl tip. it will surely come in handy

@m9
i don’t mind using a rasterizer, i’ve been using BI all this while and i’m more than happy with what it gives me

But you can save it to a file. The same way you save any render.

You can indeed save it to a file, but you won’t render at 30fps. It’s largely slowed down by writing each frame. It’s still way faster than rendering out of BI, but don’t expect it to render in real-time.
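For anyone wanting to script that instead of clicking through the UI: a minimal sketch, assuming a 2.7x-era build. `bpy.ops.render.opengl` is the operator behind the viewport-render buttons; the output path here is a placeholder, and this only runs inside Blender's Python console.

```python
import bpy

scene = bpy.context.scene
scene.render.filepath = "//opengl_preview_"   # "//" means relative to the .blend
scene.render.image_settings.file_format = 'PNG'

# Renders the animation with the OpenGL/viewport engine and writes each
# frame to disk -- the disk I/O is what keeps this below real-time speed.
bpy.ops.render.opengl(animation=True)
```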

No, it’s not. If you want to render photorealistic animations on a single PC, that’s just not going to happen with any renderer. Rasterization-based renderers like BI are fast by default, but take know-how and work to look good. Path tracers like Cycles are the opposite, they look good by default, but take work and know-how to be fast.

Cycles has plenty of room for optimizations, but some people seem to have the idea it’s going to be 100x faster than it is now. That’s not going to happen, and people like Brecht and DingTo would have known from the beginning it’s not going to happen. It’s the nature of the beast.
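The “nature of the beast” here is Monte Carlo convergence: noise falls off as 1/sqrt(samples), so halving the noise costs 4x the samples, and a 100x speedup at equal quality would need vastly fewer samples than the math allows. A toy illustration in plain Python, using a pi estimate as a stand-in for any Monte Carlo integral:

```python
import random
import statistics

def estimate_pi(n_samples, rng):
    """Monte Carlo estimate of pi by sampling points in the unit square."""
    hits = sum(1 for _ in range(n_samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n_samples

def noise(n_samples, trials=200, seed=42):
    """Standard deviation of the estimate across independent trials."""
    rng = random.Random(seed)
    return statistics.stdev(estimate_pi(n_samples, rng) for _ in range(trials))

# Quadrupling the sample count only roughly halves the noise (error ~ 1/sqrt(N)).
coarse = noise(100)
fine = noise(400)
print(coarse / fine)  # close to 2, not 4
```

The same scaling is why a Cycles render at 5000 samples is only about 7x cleaner than one at 100 samples, not 50x.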

On top of that, people don’t even use the optimizations that are there now. Time and time again you see people (not singling out you or anyone else here) complaining about Cycles render times, then you look at their settings and they’re using progressive mode with 5000 samples, full caustics, no clamping, filter glossy disabled, transparent shadows everywhere, mixing multiple BSDFs instead of colors, recessed light fixtures, and so on. No, people won’t know that sort of thing out of the box, but that’s the way raytracers are. Pretty comes standard, fast takes work.
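For reference, most of the knobs listed above live in one place. A sketch of the kind of settings being described, assuming a 2.7x-era Cycles build; the property names come from the Python tooltips in the UI, and the values are illustrative, not universal:

```python
import bpy

cy = bpy.context.scene.cycles

cy.progressive = 'PATH'          # plain path tracing, not branched/progressive refine
cy.samples = 250                 # often plenty with the settings below, vs. 5000
cy.sample_clamp_indirect = 3.0   # tame fireflies from indirect bounces
cy.blur_glossy = 1.0             # "Filter Glossy" -- blurs noisy near-caustic paths
cy.caustics_reflective = False   # skip full caustics unless the scene needs them
cy.caustics_refractive = False
cy.transparent_max_bounces = 8   # keep transparent shadows from exploding
```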

Also, Cycles’ OpenCL mode works fine, AMD’s driver is just a piece of shit and can’t compile it.

huh??? you mean with f12? the result i get is identical to using BI. same speed too
what’s more, there’s no output panel in the render settings

@khalibloo:

It will use whatever output settings are set in the output panel, just like normal rendering.

alright, cool! thanks again for all your help!
i’ll certainly put it to good use

My renders were photorealistic enough for my needs (and my clients’ needs), and faster, 10 years ago with V-Ray. For an image with GI I achieved noise-free results in minutes or hours, depending on the hurry and the needed quality, using AMD or Intel, and all the RAM on my PC. I’d like to get the same using BI or Cycles (I’d like, not expecting!)

That’s because the early GI engines such as V-Ray and Mental Ray employed a lot of shortcuts and cache/pre-processing tricks. This allowed for fast renders, but at the cost of having to tweak a lot of different settings depending on the scene. I believe one person here said that Mental Ray historically required a lot of skill to tune everything for a scene, to the point where universities ran entire sessions on which setting does what.

The Cycles paradigm, by contrast, is to create an engine that’s easy to get good results in, at the cost of render time; essentially you spend more time on the actual rendering and less time on the setup. You don’t need bounce lights, and you don’t run into issues such as the photon map being too coarse to render some small details with accurate lighting (not to mention other things like light leaks if your walls aren’t thick enough).

Then… use VRay? Comparing ancient final gather methods to true traced GI is your problem. If you want fast and biased AND modern, pick up Thea and learn to use their Field Mapping renderer.

I like Thea very much, and I’m learning it indeed. It’s so versatile and relatively cheap… AND, even more important, I’m not depending on the NVIDIA monopoly.

Octane renderer just got support for Unreal Engine 4. http://vimeo.com/m/93246664

Now everything works in real time. Instead of waiting 2 hours or using expensive render farms for Cycles, I'd better render my animations with this solution. It's a lot faster.
I think that in time classic renderers will be deprecated due to advancements in tech.

Autodesk is also willing to integrate game engines into their software, so we could expect them to make Mental Ray render in real time, a lot faster.

Are you sure you didn’t misunderstand what that video represents? It looks like a straightforward comparison between Octane render and Unreal 4. It might as well have been titled Cycles GPU versus Unreal 4. The Octane renders were not done in real time, but the Unreal 4 ones are.

You should be able to get similar render times between Octane and Cycles (CUDA). They are similar beasts: physically based path tracers, though Octane goes a bit further down that road in being spectrum-based.

I don’t see the point of comparing off-line render engines to real-time ones. The real-time ones will always win, because their aim is to put out 60 FPS; the off-line renderers aim to put out the highest quality image they possibly can.

Personally, I very-much like the OpenGL Preview renders, which are entirely OpenGL, and I have found various ways to turn-off some of the “extra garbage” (guidelines and so on) that typically show up in them.

The BGE is very powerful in this regard, and there are certain “tricks” that you can use to store its outputs in a file.
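One such trick, for anyone curious: the BGE exposes a screenshot call. A minimal sketch, assuming it runs from a Python controller inside a running game; the path is a placeholder, and as far as I recall the `#` characters are expanded to the current frame number:

```python
# Inside a Python controller in the BGE:
import bge

# Writes the current framebuffer contents to disk each time it fires.
bge.render.makeScreenshot("//frames/shot_####.png")
```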

I’ve said for many years now that what I desperately want is a set of rendering-nodes which employ OpenGL alone to produce their outputs. I want access to the BGE without having to define a “game.” I want ready access to OpenGL-generated outputs, which I can (and do …) use as a “foundation layer of paint” upon which other layers can be added using other, more computationally-expensive techniques. And, I want to do all of it “in Blender.”

I’m fairly sure that “Python-based nodes” could be the glue that gets me what I want, but I really don’t have the visual-programming “chops” to do it myself, nor necessarily the time.

It’s great that we’re using the GPU, through Cycles, to do something that it was not designed-for. Why, indeed, don’t we have effortless access to what it was designed-for? :spin:

I already “render in passes,” because I’ve never really had super-souped-up hardware. OpenGL is excellent at producing, in near-real time(!), “85% or more of what I need,” such that the remaining work is basically “spices not the main-course.” I want that process to be easier … much easier.

Octane is also integrated in UE4 http://render.otoy.com/newsblog/?p=600

Well… it’s not really integrated. You can do viewport rendering using the data from the UE4 viewport. It’s not like it’s going to be replacing the raster engine of UE4 any time soon. Even with a bank of Titans at your disposal, it’s unusable in real time for gaming purposes.

@Darksider, reread that post carefully: Octane is acting like an external render engine for Unreal, but as m9105826 said, it’s not real time. You could possibly do the same with standalone Cycles, but the only app I saw with an exporter in development is Rhino. Personally I think it would be cool if you could see standalone Cycles in other programs like Maya, Max, or Cinema 4D.

Hey, just lurking in this thread. I’m just at the start of learning the intricacies of Cycles. Is there a good tutorial on optimising Cycles for different render situations? I understand most of the optimization issues listed above, but not sure what is meant by ‘recessed light fixtures’?
Back in my [highly stressful] Mental Ray days, there was an option to have ‘portal’ geometry: an emissive shader that you’d use for a window or doorway in an interior scene that stopped the renderer spraying useless photons around the outside of your scene. Is this along the lines of what you’re talking about when you mention ‘recessed light fixtures’? If so, what do you do in Cycles to stop a whole HDR environment being taken into account, when you only see a bit of it through a small window?

Bliz,

I did a course on identifying and optimizing for noise.

Should get you up and running.

As far as portals, they don’t exist in Cycles yet. Unidirectional path tracers aren’t great for interior scenes lit by external light sources in general though.
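A quick way to see why interiors are hard for a unidirectional path tracer: from a point inside a room, only a tiny fraction of uniformly sampled bounce directions ever find a small window, so most paths contribute nothing. A toy estimate in plain Python, using a cone of directions as a stand-in for the window:

```python
import math
import random

def escape_fraction(window_half_angle_deg, n_rays=100_000, seed=1):
    """Fraction of uniformly sampled directions that fall inside a cone
    of the given half-angle -- a stand-in for a small window as seen
    from a point inside a room."""
    rng = random.Random(seed)
    cos_limit = math.cos(math.radians(window_half_angle_deg))
    hits = 0
    for _ in range(n_rays):
        # Uniform direction on the sphere: cos(theta) is uniform in [-1, 1].
        cos_theta = rng.uniform(-1.0, 1.0)
        if cos_theta > cos_limit:
            hits += 1
    return hits / n_rays

# A 10-degree "window" catches well under 1% of random bounce directions,
# which is why such scenes converge so slowly without portals.
print(escape_fraction(10.0))
```

Portals fix this by telling the sampler where that opening is, so rays can be aimed at it instead of found by luck.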

Back in my [highly stressful] Mental Ray days, there was an option to have ‘portal’ geometry: an emissive shader that you’d use for a window or doorway in an interior scene that stopped the renderer spraying useless photons around the outside of your scene. Is this along the lines of what you’re talking about when you mention ‘recessed light fixtures’?

Try this; it’s not 100% accurate, but the result is still nice.