Cycles for Blender 2.8 - User feedback.

Yes, OpenCL for AMD and CUDA for Nvidia. Using OpenCL for Nvidia is slower than CUDA, even though it does work.

Maybe something to improve the speed of animation rendering; I’d love to see some optimization across frames:

check that objects haven’t moved
check that lights haven’t moved
check that the camera hasn’t moved
And if the materials are non-glossy, we can reuse the pixels rendered for the previous frame.

Perhaps a slightly smarter filter for glossy, since glossy materials can reflect strong lights too. A rough sketch of the idea follows below.
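
Purely as an illustration, the loop could look something like this in Blender’s Python (a hypothetical sketch, not an existing Cycles feature; the output path is made up):

```python
# Hypothetical sketch of the frame-reuse idea above -- NOT an existing
# Cycles feature. Runs inside Blender's Python; the output path is made up.
import shutil
import bpy

scene = bpy.context.scene

def snapshot(scene):
    # Record every object's world transform (lights and the camera are
    # objects too, so they are covered by the same check).
    return {ob.name: ob.matrix_world.copy() for ob in scene.objects}

prev_state = None
prev_path = None

for frame in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(frame)
    state = snapshot(scene)
    path = "/tmp/frame_%04d.png" % frame
    scene.render.filepath = path
    if prev_state is not None and state == prev_state:
        # Nothing moved: reuse the previous frame's pixels.
        # (A real implementation would also check for glossy materials,
        # animated textures, etc.)
        shutil.copyfile(prev_path, path)
    else:
        bpy.ops.render.render(write_still=True)
    prev_state, prev_path = state, path
```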

That’s something I’d like to debate:
Where should they be hosted online?
Who’s going to create those materials? And (speaking of photorealism) how can anyone tell that material X looks the way X does in real life?
No software can do that. Any pre-made materials you find are just someone’s approximation.

@Isscp @sono2000

Does this belong in the Cycles discussion at all? I think that discussion should be part of asset management: having libraries with common objects and example materials.

Will Cycles take advantage of the 16-bit floating-point precision in Nvidia’s new Pascal architecture?

Hi LordOdin.
Have you seen this report?
https://developer.blender.org/T43310

Don’t rely on the memory usage reported by Blender; use an external resource monitor instead: “watch -n 2 nvidia-smi” on Linux, GPU-Z on Windows.
With the BMW27 scene, Blender 2.76 even uses slightly more vRAM than 2.73a in my tests (about 780 MB in 2.76, 745 MB in 2.73a).
In that report the developers hope this will improve with the implementation of the CUDA split kernel.
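
If you want to log it from a script instead, something like this should be equivalent to the watch command (assuming nvidia-smi is on the PATH):

```python
# Poll GPU memory from a script, equivalent to `watch -n 2 nvidia-smi`.
import subprocess
import time

while True:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"])
    print(out.decode().strip())   # e.g. "780 MiB, 6144 MiB"
    time.sleep(2)
```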

Pascal cards don’t actually exist yet, so it’s quite hard to test on them. :wink: The kernel split needs to be done first; then we can think about optimizations for specific cards.

edit: Uh. So RAM usage is even worse now… Anyone want this kernel thing as much as I do? :stuck_out_tongue:

This has been answered. I didn’t know any better.

Original comment:

A MUST for me, and frankly speaking, I think I speak on behalf of most Blender users: please, for the sake of render time, make it possible to render in REAL real-time, like Unreal Engine. Unreal Engine 4 simply makes near-photorealism possible in real-time rendering, which essentially means that if you’ve got a 10-second animation, the render time is… 10 seconds :slight_smile:

Today, with the current Cycles rendering, I can only make this partly possible by spending time on baking. And even then, the render times are too slow (I’m using a Titan GPU).

I know you will have to make compromises on realism, lighting, etc., but just make it a feature the user can turn on/off, and at least make Cycles compete with Unreal Engine.

Please.

I don’t want to use external software for rendering my animations; I want to use Blender only. Cycles with actual real-time rendering would make exactly that possible.

https://www.youtube.com/watch?v=DRqMbHgBIyY

I want THAT possible in Cycles. Please.

Srsly? Do you understand the difference between Unreal Engine (or any game engine) and a ray tracer (Cycles)?

I do not understand the technicalities that separate a ray tracer from a game engine, but neither do I understand why the Blender Game Engine cannot have Cycles materials, or convert Cycles materials. What I essentially want is for Cycles to have a game engine implemented, to render faster for those of us who don’t want to spend time rendering yet want semi-realistic animations.

To me, the animation in the link above is nearly as good as any architectural render done in Cycles. Technical discussions are irrelevant in that regard, as the picture quality is as good, or at least nearly as good, as a ray tracer’s.

But the render times are hugely different. Why can’t I have this in Blender?

Unfortunately, I do not think it will ever happen, at least at Unreal’s level (at a low level it is possible now). The BGE had its chance to offer a complete pipeline inside Blender (and missed it); now there is an enormous gap between the popular real-time engines and the BGE.

On the other hand, you have free access to the market-leading solutions (which is not true for market-leading modeling/rendering apps), so it simply makes no sense to waste resources on the BGE anymore, although I completely understand your points.

This is the thing you don’t understand… They are completely different technologies aimed at doing different things (in very different ways). What you might want is for the Blender Game Engine (Internal) to be improved to Unreal Engine levels of realism; however, that is a huge task (guess how many developers are working on Unreal). But having Cycles materials in the Game Engine is nonsense, as those are ray-tracer shaders, and game/real-time engines just don’t work that way.

Anyway, IMHO, in the Unreal demo you provided I still expect a lot of the lighting etc. to be pre-baked, which you can also do in Blender: e.g. render in Cycles -> bake -> use in the Game Engine.
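
For reference, a minimal sketch of that bake step with Blender’s Python API might look like this (the image name, size, and path are placeholders I made up; it assumes the active object is UV-unwrapped and its material uses nodes):

```python
# Minimal sketch: bake Cycles lighting into a texture for real-time use.
import bpy

obj = bpy.context.active_object
img = bpy.data.images.new("baked_lighting", width=1024, height=1024)

nodes = obj.active_material.node_tree.nodes
tex_node = nodes.new("ShaderNodeTexImage")
tex_node.image = img
nodes.active = tex_node               # Cycles bakes into the active image node

bpy.context.scene.render.engine = 'CYCLES'
bpy.ops.object.bake(type='COMBINED')  # bake full lighting into the texture

img.filepath_raw = "/tmp/baked_lighting.png"
img.file_format = 'PNG'
img.save()
```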

If you look at this report you can see that I am well aware xD

I will put my idea on the table for Cycles. Maybe it’s trivial to implement, maybe it’s too huge a task: allow adding more samples to a smaller part of the image after rendering (i.e., resume the render of a region). Many times I’ve pressed render before going to sleep (a four-hour render) only to find that just a small part of the image (where there are caustics or extra reflections) needs a few more samples. Now I have to increase the sample count and re-render the whole image from the beginning, which seems like such a waste.
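
For what it’s worth, the closest thing I know of with the current API is the render border; a rough sketch, with made-up coordinates and paths (merging the crop back into the original image would still have to happen in the compositor, and true sample resuming would need Cycles to keep its sample buffers):

```python
# Approximate "more samples for just a region": set a render border
# around the noisy area and re-render only that crop with more samples.
import bpy

scene = bpy.context.scene
render = scene.render

render.use_border = True
render.use_crop_to_border = True
# Border coordinates are fractions of the frame (0.0 - 1.0).
render.border_min_x, render.border_max_x = 0.6, 0.9
render.border_min_y, render.border_max_y = 0.1, 0.4

scene.cycles.samples = 2000           # extra samples, but only for the crop
render.filepath = "/tmp/noisy_region_fix.png"
bpy.ops.render.render(write_still=True)
```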

I know they are different technologies, but why can’t you at least make a converter that turns Cycles materials into BGE materials (essentially Internal materials), or make a BGE version with the same node-based material workflow as Cycles, and improve the quality of the BGE to match Unreal Engine 4, simulating Cycles (edit: I now understand this is not possible)… but perhaps that is for another thread, sorry.

edit:
Btw, thanks for clarifying. I didn’t know the shadows and lighting were pre-baked in Unreal Engine 4.

Technically, the ‘unified material pipeline’ is possible, with its limitations; I wrote about it many, many years ago. Today Substance offers a solution for this, and node-based conversions are also possible.

Anyway, I would suggest you really take a look at the difference between rasterisation (real-time 3D) and ray tracing, and get a basic idea of how ray tracing works. It will greatly help you in whatever you are doing, trust me, and it will also make clear why Cycles/ray tracing cannot be used for real-time graphics (considering the current speed of hardware).

Yeah, of course you can use node setup conversion (e.g. you won’t need to set up textures again, etc.), but you will not get a 1:1 visual result (Cycles : BGE); that’s just nonsense.

So the conclusion to my original question/wish for Cycles would be: not possible, because they are two different technologies.
A BGE with the same visual quality as Unreal/Cycles is also not possible, due to technological differences.

Thanks for answering the question that has troubled me for many years :slight_smile:

Btw, I know I sounded stupid, but I am not into coding, technology, etc., especially not how computer graphics works. I am just an “artist”.

Yep, we agree. Node setup conversion is possible (at least with an ‘ubershader’ approach; I’m not sure it is as easy with Cycles), and ‘color remapping’ is also possible.
Honestly, even two ray tracers differ in output, depending on settings and internal processing, and I do not think that is the purpose (I mean, expecting identical output). But raising the BGE to Unreal’s level… that is mission impossible with the current resources (and it also makes no sense).
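
Just to make the ‘color remapping’ idea concrete, a toy sketch of such a conversion could look like this (the node-to-property mapping is my own assumption, and it is an approximation only, never a 1:1 visual match):

```python
# Toy 'color remapping': copy the base color of each Cycles Diffuse BSDF
# into the matching Blender Internal (BGE) material setting.
import bpy

for mat in bpy.data.materials:
    if not (mat.use_nodes and mat.node_tree):
        continue
    for node in mat.node_tree.nodes:
        if node.type == 'BSDF_DIFFUSE':
            # Cycles colors are RGBA; Internal diffuse_color is RGB.
            mat.diffuse_color = node.inputs['Color'].default_value[:3]
            break
```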