Nvidia, stop being a DICK (YouTube video): what do you think?

I recently watched this video on YouTube and found it very interesting.
What do you think about it? And how does this affect Blender development and, more importantly, Blender remaining as open source as possible?

I haven’t bought an Nvidia card since 2006 and have been an AMD user ever since. Back then it was simply a minor preference, but nowadays, when I use Linux and the free Mesa video driver (which works great on AMD but poorly on Nvidia), the preference certainly remains. After seeing this video, which further explains what a junk company Nvidia is, I sure am glad of my choice.

In my experience, and as mentioned in the linked video, AMD’s technologies under-perform compared to Nvidia’s. I’d hardly call choosing not to cripple your technology by going with an inferior one “being a dick”.

Look at how long it took to get Cycles to run on AMD cards. AMD had to put a team of engineers to work on Cycles to make it run on their hardware, and it still isn’t at parity with the CUDA implementation. CUDA worked out of the box, day 1. It’s great that AMD contributed to the code base, but it’s ridiculous that they had to.
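Incidentally, all of this is visible right in Blender’s Python API. Here’s a minimal sketch of picking the Cycles compute backend (2.7x-era property names; the exact path has moved around between Blender versions, so treat it as illustrative):

```python
import bpy

# Pick the Cycles compute backend (Blender 2.7x-era API; the exact
# property path differs between versions, so this is just a sketch).
bpy.context.user_preferences.system.compute_device_type = 'CUDA'  # or 'OPENCL'

# Render the current scene on the GPU instead of the CPU.
bpy.context.scene.cycles.device = 'GPU'
```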

I’m all for open standards, but it’s really difficult to choose an inferior option simply on principle. Especially as a business decision.

Yep, I agree: it’s not an Nvidia problem, it’s AMD dropping the ball, again and again. I view it as a problem with AMD management; they just don’t spend the money they need to on R&D. Open source is great, but if you don’t support it, it’s not going to go very far. Just look at the difference between CUDA and OpenCL. If AMD had put the same effort into OpenCL that Nvidia has put into CUDA, things would be much different now.

I might be a bit evil for doing this, and possibly conflating two different issues, but if anyone wants a great example of why openness at the cost of facing extra challenges beats closed technology with the benefit of instant support, take a look at Windows 10. It’s infested with spyware and privacy-invasive mechanisms that even hackers have a hard time disabling… and I’ve also heard you can no longer turn off Windows updates, meaning Microsoft can forcibly put whatever they want on your computer without your permission! The majority of users who went with Windows over the years now have a hard time departing and are stuck in Microsoft’s hands… meanwhile, we Linux users sigh in relief at having avoided this scary fate :slight_smile:

AMD might write shitty drivers, but NVIDIA sells you 4 GB of fast VRAM when it really only has 3.5 GB.

Problem is, there is no third party; like in US politics, it’s either Dems or Reps, with no real alternative :wink:

One issue is that ATI was a Microsoft-ONLY!!! shop for so long that the open ATI Linux driver has been in development much longer than the Nouveau open driver for Nvidia.

But in 15 years I have had ZERO issues with Nvidia supporting OpenGL, and I have seen a TON of problems with ATI supporting OpenGL.
(Microsoft is still trying to kill off OpenGL.)

And a binary blob, whether the Nvidia .run or the AMD .bin, is a necessary evil.

MS was (not sure how it is now) so terrible that they even influenced chip design to favor their own DirectX system.

Writing drivers is not easy, to be honest, but one has to seriously ask: can you really not write a decent driver for your own hardware?

But then you see that the high-end cards have better drivers, so I assume the situation is different for the gaming devices than for the pro cards.

I saw that video, and while I’m all for openness and fair competition, I just want a decent graphics card with a driver I can rely on not to crash or glitch. Nvidia provides that, and I can model, render and get on with my life.

I tried AMD a long time ago, and I recently tried the 290X. There has been minimal improvement in driver stability; it’s still miles away from Nvidia’s. If you provide an alternative and it hinders my workflow, that’s a deal breaker for me.

Nvidia will compete in any way it can, just like any other company, which is a good thing, since competition is the reason we are as technologically advanced as we are today. While it is true that large companies with a near-monopoly on a market will occasionally engage in anti-competitive behavior, it rarely results in any long-term negative effects.

So, do you want chocolate, strawberry, or vanilla ice cream? Because those are your choices, and you’re free to pick any combination you like. :stuck_out_tongue:

You most certainly can control when and how you get updates. MS has not made it as simple as before, but it can still be done. The only complaint I have is that they do not give you as much information about what an update covers as they did in the past.

And using the term spyware is a bit over the top in my opinion, as most of the reporting is there to ensure that you stay secure. You can turn off all the location information as well as the other feedback reporting options.

I do not like the MS Store apps, though. Even the media player is slow and clunky on my extremely old laptop, so I use a different player for video and audio playback. I suspect it is related to the DRM stuff, but I have no proof of that.

Also, I do not feel comfortable allowing them control over my mail or calendar, and I handle my backups personally and never use cloud services to store anything I would like to keep private.

Simple solution: make your next graphics card NVidia.

Signed: NVidia user since 1996.

Linus Torvalds was right…

Slightly OT, but in accordance with some comments, for Windows 10 users, this is worth a watch:

Personally, I’m exclusively Linux, and happy to be so, but I still found it illuminating.

I very much agree with the statement that Nvidia is being a dick. The funny thing is that under Windows you can freely go with AMD and its open-source-friendly policies, unless you suffer from some kind of 1 FPS addiction.
With Linux you’re forced to make an ethical decision, because Nvidia’s blob not only performs so much better than AMD’s, it also has a faster development cycle and a better-working UI (can I change my desktop resolution without logging out, AMD? It’s 2015, thank you)… it’s just better value for money.
For my last PC I went AMD again, but I practically donated money to the cause, because I could have had the same performance and a better user experience with a cheaper Nvidia card.
IMHO, because Nvidia is making money out of being a dick, things won’t change unless more consumers become masochists or some public authority opens a proceeding for abuse of dominant position. Both are unlikely to happen.

Youtubers, stop being FOOLS.

NVIDIA is a for-profit company, not a charity. They provide distinguishing features for their products, some of which are proprietary technologies like GameWorks or partially proprietary runtimes like CUDA.

If developers choose to optimize their games only for certain hardware and choose to use proprietary middleware, the blame should be on them, not on NVIDIA, which is just offering those tools.

The video also blames NVIDIA for being good at tessellation and developers for having tessellation enabled in their “ultra settings”. Hey, I’ve got a solution for AMD: stop sucking at tessellation. By the way, the “excessive” tessellation shown in the video is actually good for performance from a rasterization point of view, because GPUs reach peak efficiency at around 8x8 pixels per triangle (large on-screen triangles are actually bad).
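To put rough numbers on that claim (my own back-of-the-envelope arithmetic, not figures from the video):

```python
# Back-of-the-envelope triangle counts for a 1080p frame at different
# average triangle sizes (illustrative numbers, not from the video).
WIDTH, HEIGHT = 1920, 1080

for side in (64, 8, 1):  # average triangle footprint: side x side pixels
    count = (WIDTH * HEIGHT) // (side * side)
    print(f"{side}x{side} px triangles -> ~{count:,} triangles on screen")

# 64x64 -> ~506, 8x8 -> ~32,400, 1x1 -> ~2,073,600: tessellation that
# looks "insane" in a wireframe can still be in a sane range for the GPU.
```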

AMD also doesn’t “open” their technology out of the kindness of their hearts, but because that’s one of the few distinguishing features they can provide. Back when NVIDIA wasn’t the uncontested leader in GPGPU, AMD had their own proprietary GPGPU framework that nobody even remembers. After that, they went with OpenCL, but their implementation was so terrible that it couldn’t hold a candle to CUDA. Now they are trying once again by providing semi-automated CUDA transpilation (which sounds like another terrible idea to me). If AMD really were all about “open technology”, they’d open up their proprietary drivers. They’re not doing that, because it would put them at a competitive disadvantage.

Also, being “open” doesn’t necessarily mean anyone else can use it. Mantle was designed specifically for AMD GCN hardware, so even if NVIDIA wanted to, it wouldn’t necessarily make sense to support it. Similarly, it is theoretically possible to run unmodified PTX (compiled CUDA) on AMD hardware. The CUDA compiler is also open-source software, by the way.

Lastly, the claim that AMD can’t optimize drivers for GameWorks titles because they don’t have the source code is utter bullshit. It’s no secret that both vendors hack and replace the shaders of popular titles through their drivers in order to improve performance. They don’t have (or need) the source code for those either; they just need capable engineers.

I don’t like the idea of NVIDIA (or Intel) gaining an overwhelming market share over AMD, because that’s bad for vital market competition. However, AMD needs to stop playing the victim of “unfair business practices”, banking on the emotions of those gamers who have an irrational attachment to a microprocessor brand. Instead, AMD needs to provide better technology and make better business decisions.

It isn’t good for anything when the tessellation is down to 1x1 pixel, as in the video example. The proof is the slab being solid red in wireframe mode.

The amount of tessellation needed to bring an AMD card to its knees is far beyond overkill. Most games use tessellation nowadays, but only the ones with insane amounts of it are used for these comparisons.

What a bunch of nonsense. Intel already has open source drivers for their GPUs and AMD’s are on their way¹.

¹ And this can take over a decade at this rate.

I’ve been using NVidia on Windows and Linux for 3D and gaming, and I can’t complain. At the same time AMD, while having great hardware, had such crappy drivers (OpenGL at least) that it was only good for playing DX games on Windows. The latest drivers have gotten much better at OpenGL for gaming (on Windows), but I’m not sure about 3D work or Linux.

The slab being red isn’t “proof” that the triangles are 1x1 pixel-sized, because it’s a wireframe view: what fills the pixels are the triangle edges, not the triangles themselves.

If you look at this image from the original article this complaint is based on, you can see that most triangles would be decently sized for an 8x8 pixel target. Of course, the shader can’t pick the optimal tessellation for every triangle; it still depends on the topology of the object and on the fixed-function features of the tessellation stage.

Anyway, my point is not that “optimization” is the real reason why the concrete slab is so tessellated when it doesn’t need to be; it’s that having so many little triangles just isn’t as bad as it looks. The most likely reason is that nobody went in and tested “optimal” tessellation levels for every single asset.

> The amount of tessellation needed to bring an AMD card to its knees is far beyond overkill. Most games use tessellation nowadays, but only the ones with insane amounts of it are used for these comparisons.

I don’t think this amount of tessellation is “overkill” for good displacement mapping.
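Here’s a toy sketch of why (made-up numbers, just the usual vertex-displacement idea): displacement samples the height map only at vertices, so any detail finer than the tessellation is simply lost.

```python
import numpy as np

# Toy displacement mapping: vertices of a flat grid are pushed along
# their normal (+Z here) by a height function sampled at each vertex.
# Detail finer than the vertex spacing disappears, which is why
# displacement needs dense tessellation to look right.
def displaced_grid(n, height):
    """n x n vertex grid over the unit square, displaced by height(x, y)."""
    x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
    return np.stack([x, y, height(x, y)], axis=-1)  # (n, n, 3) positions

bumps = lambda x, y: 0.05 * np.sin(2 * np.pi * 7 * x)  # 7 ridges across

coarse = displaced_grid(8, bumps)    # samples land on the zero crossings
fine = displaced_grid(256, bumps)    # dense enough to capture the ridges
print(np.ptp(coarse[..., 2]))        # ~0.0: the surface renders flat
print(np.ptp(fine[..., 2]))          # ~0.1: full ridge height preserved
```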

> What a bunch of nonsense. Intel already has open source drivers for their GPUs and AMD’s are on their way.

Neither company has open-source drivers on Windows (i.e. the platform that actually matters for games), nor have they announced any such plans. I’m sure game developers would love to be able to properly debug their applications with open-source drivers, though.

You tend to be very categorical, and you know what you said. Making excuses and air-quoting “optimization” and “optimal” doesn’t make it any better. The amount of tessellation in Crysis 2 was overkill, and everyone acknowledges it. It wasn’t only the insane amounts of tessellation on objects that didn’t need it; it was the frigging tessellated ocean below the ground, too. Crytek either dropped the ball hardcore there or partnered with Nvidia to showcase a totally pointless strength.

Well, you know, that’s like, uh, your opinion, man.

I’m not going to argue with that. Still, Intel and AMD have open source drivers.