Cycles NVIDIA Maxwell Benchmarks

Hi there,

The NVIDIA GeForce GTX 750 and GTX 750 Ti are out, so let's see what we can expect from the Maxwell architecture. Please post any benchmarks of upcoming NVIDIA Maxwell family cards that you find or run yourself, in Cycles or in any other CUDA-based benchmark.

Thanks in advance.

Kind regards,

Jozef

Mike Pan BMW Benchmark

Based on my own build, revision 61305,
built with VS2008, SCons, and the NVIDIA CUDA Toolkit 6.0 RC.

System:
CPU: Intel i7-920 @ 3.5 GHz
GPU1: NVIDIA GTX 570 @ 732/950/1464
GPU2: NVIDIA GTX 750 Ti @ (1085)~1228/1375
OS: Windows 7 64-bit | Blender 2.69.11

GTX570:
Default settings: 83.14 s
128x128 tiles: 63.77 s
256x256 tiles: 48.84 s
512x512 tiles: 49.92 s

GTX750ti:
Default settings: 68.87 s
128x128 tiles: 62.34 s
256x256 tiles: 55.25 s
512x512 tiles: 64.19 s

GTX570+GTX750ti:
Default settings: 39.27 s
128x128 tiles: 32.28 s
256x256 tiles: 27.29 s
512x512 tiles: 31.12 s
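In case anyone wants to reproduce the tile-size sweep above, here is a minimal Python sketch, assuming the 2.69-era property names (render.tile_x/tile_y, cycles.device); open the BMW scene, enable your CUDA device(s) in User Preferences > System, then run it from the text editor or via blender -b <file.blend> -P <script.py>:

```python
# Tile-size benchmark sketch (Blender 2.69-era API assumed; check the
# property names in your build). Times one full render per tile size.
import time
import bpy

scene = bpy.context.scene
scene.cycles.device = 'GPU'      # render with the CUDA device(s) chosen in the preferences

for size in (128, 256, 512):
    scene.render.tile_x = size
    scene.render.tile_y = size
    start = time.time()
    bpy.ops.render.render()      # blocks until the frame is finished
    print("%dx%d tiles: %.2f s" % (size, size, time.time() - start))
```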

I can't test the GTX 750 Ti with the official 2.68a build because the GTX 750 Ti (Maxwell architecture) has CUDA compute capability 5.0, which isn't supported at the moment.
My build is based on the official Blender source, r61305.
I compiled it with default settings; the only change is that I added support for compute capability 5.0 and compiled with the CUDA Toolkit v6.0 RC (official builds are based on v5.0 or v5.5), which is necessary to build the kernel_sm_50.cubin for the Maxwell architecture.
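For anyone wanting to try the same thing: in the 2.69-era build scripts the precompiled CUDA kernels are controlled by a list of target architectures. A rough sketch of that kind of change, assuming the SCons options are named as in the stock user-config.py (verify the names in your own checkout), not Rolf's exact diff:

```python
# user-config.py (SCons) -- a sketch only; option names assumed from the
# 2.69-era build scripts, so double-check them against your source tree.

WITH_BF_CYCLES_CUDA_BINARIES = True   # precompile the Cycles CUDA kernels (.cubin)

# Adding sm_50 makes nvcc from the CUDA Toolkit 6.0 RC also produce
# kernel_sm_50.cubin for Maxwell (compute capability 5.0).
BF_CYCLES_CUDA_BINARIES_ARCH = ['sm_20', 'sm_21', 'sm_30', 'sm_35', 'sm_50']
```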

The only option is for me to compile the latest Blender 2.68a source with the CUDA Toolkit v6.0 RC and test it.

Sorry for my bad English…

Quick test:
Cycles Cornell bench
Default settings
GTX750ti:
2:09.67


Great, thanks for the explanation; it will be helpful for other users. How about uploading your CUDA 6 RC build to http://www.graphicall.org/?

I don't have an account there yet.

More Tests.
Blender 2.69.11 Revision 61310 Hash eb4f2b4

Pabellon_Barcelona_v1.3 Scene

GTX750ti
Time: 2:36.47


GTX570
Time: 2:19.37


GTX570+GTX750ti
1:16.10


So judging by Rolf's tests, it seems that high-end Maxwell cards could be really great for Cycles. The 750 Ti is decent already.

I just got a 750 Ti OC card, but I'm not sure how to get a version of Blender that supports it. I'd be happy to do some benchmarks. Help.

Rolf, it's really cool. But I still don't know how to compile for the older CUDA compute capability 1.0. I have a GT8600 DDR3, which has compute capability 1.0; is it possible to build a v1.0 cubin so I can use the 'Experimental' GPU compute setting? I'd appreciate it if you could show us how and make a test build for v1.0 CUDA cards too. Big thanks.

ROLF, thanks for your posts, I appreciate them very much.

I don't think it is possible to run it on compute capability 1.0; as far as I know, those cards lack some instructions/functions that Cycles uses. Time for a little upgrade.
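If you want to check what compute capability a given card actually reports, here is a small standalone Python sketch (my own, not from this thread) that queries the CUDA driver API directly through ctypes, so no extra packages are needed:

```python
# Print each CUDA device's name and compute capability via the driver API.
import ctypes
import sys

for libname in ('nvcuda.dll', 'libcuda.so', 'libcuda.dylib'):
    try:
        cuda = ctypes.CDLL(libname)
        break
    except OSError:
        continue
else:
    sys.exit("CUDA driver library not found")

cuda.cuInit(0)
count = ctypes.c_int()
cuda.cuDeviceGetCount(ctypes.byref(count))

for i in range(count.value):
    dev = ctypes.c_int()
    cuda.cuDeviceGet(ctypes.byref(dev), i)
    name = ctypes.create_string_buffer(100)
    cuda.cuDeviceGetName(name, 100, dev)
    major, minor = ctypes.c_int(), ctypes.c_int()
    cuda.cuDeviceComputeCapability(ctypes.byref(major), ctypes.byref(minor), dev)
    print("%s: compute capability %d.%d" % (name.value.decode(), major.value, minor.value))
```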

The 750 Ti looks like a great card. I recently purchased a used 570 2.5 GB for 170 euros, and I feel I paid too much for a very old card with a 220 watt TDP…

The GTX 750 Ti looks really nice.

Here are some tech specs and prices for the GTX 750 and GTX 750 Ti (in German).

Power Consumption:

Make CUDA aware of COMPUTE_50 (sm_50, Maxwell)
http://www.miikahweb.com/en/blender/git-logs/commit/2d26d1dbbd0e91c2fd7b6dd147ab54dbd7f73111

Rolf is the best. Thank you thank you thank you. I downloaded your build and it works great with my GTX 750 ti! I’m in California. If I’m ever in Germany I gotta bring you a prize.

Wow, the GTX 750 Ti with 2 GB seems really great: nearly as fast as a GTX 580 (~200 W) but it uses only 60 W, and it costs 140 euros. So you can nearly buy two GTX 750 Tis for the price of one used GTX 580 with 3 GB, and you still consume half the power at even greater rendering speed. Of course, it is only 2 GB.

Looking forward to the higher-end Maxwell graphics cards.

Maybe off topic, but two questions came to my mind:
1. Will Cycles (or any other GPU renderer, if one can already) be able to make use of regular system memory? Or is this impossible?
2. And why is VRAM so much more expensive than regular RAM?

Rolf, thanks for all the info. I've learned that the GTX 750 Ti is not SLI-ready. How do two cards work together, then? Or is SLI not necessary for this? (Sorry, I don't have much knowledge about hardware yet.)

Thx :smiley:
Can you post some benchmark results from your PC?

I forgot to mention that the GTX 750 Ti is not SLI-ready.
But SLI is not necessary to render with two cards, and unlike with SLI, in Cycles two cards of different series can work together.
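If you prefer to set that up from a script rather than the UI, here is a rough sketch against the 2.69-era Python API; the property names and the 'CUDA_MULTI_2' identifier are my assumptions, so check what User Preferences > System actually lists in your build:

```python
# Sketch only: property names and the combined-device identifier are assumed,
# not taken from this thread; verify against your Blender build.
import bpy

# Select CUDA as the compute backend (same setting as User Preferences > System).
prefs = bpy.context.user_preferences.system
prefs.compute_device_type = 'CUDA'

# With two CUDA cards installed, the device list normally offers a combined
# entry next to the single-card ones; the identifier below is hypothetical.
prefs.compute_device = 'CUDA_MULTI_2'

# Cycles switches between CPU and GPU per scene.
bpy.context.scene.cycles.device = 'GPU'
```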

Hey, I've got some questions for you! :smiley:

Sorry, I'm terrible with spreadsheets (from the other thread); can you please tell me how much faster the 750 Ti is than a 580?

I've read that the 750 Ti can't be SLI'd, but does that mean it wouldn't be possible to use two at once? My motherboard supports SLI/CrossFire.

I'm also new to NVIDIA cards. Which brand is good? I've been really happy with MSI, but that was with ATI.

Here you go.

SLI actually slows down Cycles; you don't need SLI for rendering.

Hi guys. Blender noob here and this is my first post since joining in September of last year.

Here is a short video demonstrating the CUDA compute capability of Nvidia’s Maxwell chip on my new video card, as it renders Mike Pan’s popular BMW scene using the Blender build uploaded by Rolf. Enjoy. :slight_smile: