GPU viewport render but not F12

I’m just getting going with this new Cycles thing. I’ve enabled my GPU as a compute device in the user preferences and set the scene’s render device to GPU Compute. The rendered viewport shading uses the GPU just fine, but when I do an F12 render it still seems to be using the CPU.

Clearly I’m missing something obvious, but a bit of searching hasn’t got me very far. Can anyone tell me what it might be?

Thanks,
Steve.

What makes you say that? Is it that the F12 render doesn’t do a “progressive refine” pass over the entire image, but instead renders in tiles? That’s the intended default behaviour and has nothing to do with GPU vs. CPU…

I was actually watching CPU usage, which was much higher during the F12 render than during the viewport render, and the F12 render was also taking a lot longer at the same sample count. It turned out the default tiles were so small that the overhead of dispatching them and stitching them back together took longer than the actual rendering. Increasing the tile size gave me the performance I was looking for :slight_smile:
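For anyone who wants to script this instead of clicking through the UI, here’s a minimal sketch. It assumes an older (pre-2.83) Blender Python API where tile size is a per-scene render setting (`tile_x`/`tile_y`); newer versions removed manual tiles, so check your version’s docs. It must be run inside Blender, not a standalone Python interpreter.

```python
# Sketch only: set Cycles to GPU rendering and use larger tiles.
# Assumes a pre-2.83 Blender where tile size lives on scene.render;
# run this from Blender's Python console or a text-editor script.
import bpy

scene = bpy.context.scene

# Tell Cycles to render on the GPU. The device itself must also be
# enabled in the user preferences (System > Compute Device).
scene.cycles.device = 'GPU'

# Larger tiles reduce the per-tile dispatch/stitching overhead that
# was slowing down the F12 render; 256x256 is a common GPU starting
# point, versus the small CPU-oriented default.
scene.render.tile_x = 256
scene.render.tile_y = 256
```

Viewport rendering ignores tiles (it refines progressively), which is why only the F12 render was affected.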