Nvidia unveils Pascal: ushering in the era of slimmed-down GPU units

This isn’t just “evil marketing”. As a “consumer”-level customer, you have benefited tremendously from the way NVIDIA designs its product portfolio. Case in point: you can have a 5-teraflop (single-precision) general-purpose parallel processor for about $500.

Other companies like Intel do not offer such devices for the consumer market at all. Instead, they charge thousands of dollars for devices only aimed at the professional market (like their 15-core CPUs or their Xeon Phi accelerator boards). They can charge that much, because that’s what these devices are worth in the professional market.

It would be idiotic for NVIDIA not to charge similar market prices for their professional-level devices. If NVIDIA designed completely different hardware just for high-performance computing, you simply wouldn’t be able to afford it. Since they try to serve both markets with essentially the same design, they of course need some artificial differentiation to protect the professional line (such as limiting double-precision performance on the consumer cards). From a GPGPU rendering perspective, those differences are unimportant, so you get a tremendous price/performance advantage from using consumer devices. You can consider this a happy accident.
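To make that advantage concrete, here’s a back-of-the-envelope sketch in Python. The $500 / 5 TFLOP figure comes from the post above; the professional-card numbers are placeholder assumptions purely for illustration, not quoted market prices.

```python
# Back-of-the-envelope price/performance comparison.
# The consumer figure is from the post above; the pro-card numbers
# are placeholder assumptions for illustration only.
cards = {
    "consumer GPU": {"price_usd": 500, "sp_tflops": 5.0},
    "pro GPU (assumed)": {"price_usd": 2500, "sp_tflops": 5.0},  # ~5x price, similar chip
}

for name, card in cards.items():
    dollars_per_tflop = card["price_usd"] / card["sp_tflops"]
    print(f"{name}: ${dollars_per_tflop:.0f} per SP teraflop")

# consumer GPU: $100 per SP teraflop
# pro GPU (assumed): $500 per SP teraflop
```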

“Anyone in their right mind knows that instead of buying 1 Quadro you can afford to break 5 Titans… And the best Quadro has the same amount of RAM as the Titan X.”

If you own a render farm full of Quadros… I mean 4 Quadros, because that’s all you’ll be able to afford, then you’ve wasted all the money you had to build a real render farm, lol.

If I were setting up a render farm, I’d choose 20 Titans over 4 Quadros… You don’t even have a farm with 4 Quadros. xD

Now, if you needed double precision, you’d have to pay $20k for Quadros that still suck at double precision.

If one Quadro lasts 3–5 years of constant use and the Titan keeps breaking after a month of constant use, then you save money in the long term with the Quadro. Studios generally run render farms in different ways; some save money by having their workstations double as render-farm nodes during off hours. I’m not talking about a sad farm of twenty Titans, either.

I wasn’t really thinking of small studios; I was thinking more of the 500+ sized ones.

If I’m on a deadline, I don’t want my GPU to overheat. I could lose thousands of dollars, a good client, and my reputation because I thought, “Hey, if it breaks I’ll just swap in a new one… after I come in the next morning to discover my render failed. I’m saving money!”

If the entire success of a project depends on a render not failing overnight, your planning is wrong. It’s much more likely for the render to fail due to a software fault or user error than due to hardware failure. You still have to plan for either case.

If reliability were your primary concern, you wouldn’t have gone with GPUs in the first place. The price/performance advantage of these professional GPUs for GPGPU rendering is so small (if it exists at all) that you’d be better off just using CPUs.

Also: you can underclock or power-limit GPUs yourself for thermal reasons. You don’t have to spend extra money on a professional GPU that has lower clocks as a “feature”.
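For anyone wanting to try this, here’s a minimal sketch using NVIDIA’s NVML bindings (the third-party pynvml package). Power capping is the driver-supported way to keep a card cooler under sustained load; the 80% target below is an arbitrary example, and setting the limit requires root/admin rights.

```python
# Minimal sketch: cap a GPU's power draw via NVML (pip install pynvml).
# Values are in milliwatts; setting the limit requires root/admin.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"supported power limit range: {min_mw // 1000}-{max_mw // 1000} W")

target_mw = max(min_mw, int(max_mw * 0.8))  # e.g. cap at ~80% of the board limit
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"power limit set to {target_mw // 1000} W")

pynvml.nvmlShutdown()
```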

There’s anecdotal data available on the lifetime of consumer GPUs running 24/7 for mining cryptocurrency. It appears that consumer GPUs can indeed run for years and the most likely cause of failure is the fan.

“I wasn’t really thinking of small studios; I was thinking more of the 500+ sized ones.”

I don’t know what studios you are thinking of (500+ is basically the size of Pixar), but I haven’t heard of any major studio using GPU render farms. Final rendering at Pixar is all CPU-based; when they do use GPU acceleration, it’s at the workstation level, for previz.

For the record, I have a Quadro in my laptop and a GeForce in my desktop. We have a mix of Quadros and GeForce cards in our office. I find the GeForce perfectly adequate for our needs and haven’t had problems with either. All I’m saying is that just because you don’t think Quadros are worth it doesn’t mean “no one in their right mind” should want one. They have advantages in demanding environments.

What you’re saying is that Quadros are built better than GeForces, and that isn’t true at all. Just buy a PNY Quadro with its crappy components and you’ll wish you’d bought any premium GeForce. If you are so concerned about your components overheating, just set up a liquid-cooling system (or have someone do it for you if you’re not into that).

For a data point: my parents’ PC is rocking an HD 5850 that has been running almost 24/7 for 5 years, because they forget to turn it off most of the time. Same goes for its Noctua NH-12, which has been running without issues in that PC for 7 frigging years, original fan, mind you.

There are lots of companies doing this to make a profit and they’re not evil at all. They offer their software for free for learning and hobbyist purposes and charge for it if you want to make money out of it.

Yeah… Titans don’t break after 1 month of use… But I guess in your twisted world it would be a better idea to go with Quadros :stuck_out_tongue:

Any decent 3D artist knows their GPU’s average peak temperature while rendering :smiley: I recommend looking into it, and maybe turning the GPU’s fan speed up…
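If you want to check those numbers yourself, here’s a minimal monitoring sketch using the same pynvml bindings as above; the sample count and interval are arbitrary choices.

```python
# Minimal sketch: log GPU temperature and fan speed while a render runs
# (pip install pynvml). Sample count and interval are arbitrary.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

peak = 0
for _ in range(60):  # ~5 minutes at one sample every 5 s
    temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    fan = pynvml.nvmlDeviceGetFanSpeed(handle)  # percent of max speed
    peak = max(peak, temp)
    print(f"{temp} C, fan {fan}%")
    time.sleep(5)

print(f"peak temperature: {peak} C")
pynvml.nvmlShutdown()
```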

I’m not sure if you’ve noticed that consumer GPUs tend to have cooling solutions as good as, if not superior to, their Quadro equivalents, and of course liquid cooling is a route to go too if you’re running something like 3–4 Titans. If your case, PSU, and fan setup is crap to begin with, you’ll trip up even a Quadro. Otherwise, a Titan setup will have no problem running for months straight.

Stability also isn’t a major concern in CGI unless you’re running an unstable overclock to begin with. At stock speeds, stability is a non-issue in Blender. In CAD workloads, the additional stability is useful, since hidden errors in simulations can impact real-world practical applications where absolutely zero faults can be accepted. In CGI workloads, that level of stability is not necessary, as any hidden, undetectable errors that (by some small chance) do occur are most likely invisible in the final render anyway.
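To put a rough number on “most likely invisible”, here’s a contrived illustration of the best case, where a rare memory error lands in a low-order bit of a pixel’s 8-bit color channel (an assumption made for the sake of the example, not a claim about how errors actually distribute):

```python
# Contrived illustration: a single bit flip in the low-order bit of an
# 8-bit color channel changes that channel by less than 0.4% -- far
# below anything visible in a final frame.
value = 180                      # some 8-bit channel value
corrupted = value ^ 0b00000001   # flip the least significant bit
delta = abs(corrupted - value) / 255
print(f"{delta:.2%} change in that channel")  # prints: 0.39% change
```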

For Blender, a GeForce and a Quadro are equally well suited and shouldn’t be far apart speed-wise (the edge going to the GeForce for its more aggressive clocks). When costs are considered, though, you’d honestly have to be very naive (and apparently have cash to burn) to consider Quadros in a workstation whose only workload is Blender.