Viewport FX

Speaking of things handed down from on high, I thought I heard that colored wireframes were something that had been ruled out for some reason. I was thinking I’d end up sneaking that in, because in a programmable system there is no reason not to allow it, even if it required somebody to write a custom script.

The priority should definitely be the performance. It is way slower at drawing than similar apps for some reason :slight_smile:

What else would be fun is making the 3D view beautiful :slight_smile:
Gradient background, the ability to use semi-transparent wires, anti-aliasing turned on in preferences, anything else you could think of…
Posted this image a while ago:
http://img441.imageshack.us/img441/940/blendadacute.png
May inspire.
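A gradient background in particular should be cheap to add. Just a rough sketch of the idea (purely illustrative, not Blender’s actual code, and assuming the legacy fixed-function OpenGL the viewport currently uses): draw a full-screen quad with per-vertex colors before the scene, with depth testing off.

```c
#include <GL/gl.h>

/* Illustrative only: full-screen gradient drawn behind the scene. */
void draw_gradient_background(void)
{
    glDisable(GL_DEPTH_TEST);     /* the background must not occlude anything */
    glMatrixMode(GL_PROJECTION);  glPushMatrix(); glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);   glPushMatrix(); glLoadIdentity();

    glBegin(GL_QUADS);
    glColor3f(0.20f, 0.20f, 0.25f);   /* bottom color */
    glVertex2f(-1.0f, -1.0f);
    glVertex2f( 1.0f, -1.0f);
    glColor3f(0.45f, 0.50f, 0.60f);   /* top color */
    glVertex2f( 1.0f,  1.0f);
    glVertex2f(-1.0f,  1.0f);
    glEnd();

    glMatrixMode(GL_PROJECTION);  glPopMatrix();
    glMatrixMode(GL_MODELVIEW);   glPopMatrix();
    glEnable(GL_DEPTH_TEST);
}
```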

Thank you so much for doing this jwilkins, the viewport performance has been neglected for far too long already :slight_smile:

In my humble opinion, I do not mind you going ‘overboard’ on the performance side; I would love to be able to navigate a Mango scene at 200 fps :wink:

I think that if you just include the possibility of the eye-candy stuff in the API, it can always be added later; performance comes first.

Also, about the ‘legacy hardware’ discussion: I might be a little harsh, but I would rather see hobbyists running ancient hardware be unable to use Blender than have that hardware be a hindrance to those who are trying to use it professionally. I have worked with Autodesk products for years, and despite what they say, their software runs just fine on consumer cards.

Shading and effects sound fun, but it’s probably best to focus only on performance now to make it the best it can be, and leave the other stuff for some other time.

Regarding “handed down from on high”: custom colored wireframes, along with some other things like pre-selection highlighting, are legendary as features that apparently won’t get considered because Ton doesn’t like them. I’ve never read what the reasons are, though.

I can’t imagine a single reason why Blender shouldn’t have pre-selection highlighting. Anyone?

You’re right, it’s a bit harsh, but I completely agree with you :smiley:

Btw, good luck Jason. I hope Blender will end up with viewport performance that attracts professionals.

Maybe a patent problem?

Btw @FreeMind I’m pretty sure you can already turn on anti-aliasing in the preferences?

But yeah, a background gradient would be great to have. A wireframe overlay is something I’ve missed since I started using Blender (it would also be great if the wires did not get drawn in the outline color when the object is selected in Object Mode).

Only for UI text, not the 3d viewport.

By the way, overall I feel that viewport anti-aliasing should not be a simple on/off checkbox. Both the quality and the speed can vary greatly between different implementations and circumstances. Perhaps it should cycle through all the sample counts the video card allows? But then, if that is too complex or adds unnecessary UI clutter, maybe it would be better to leave it as it is currently (that is, let the user decide through the video driver’s control panel).
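To illustrate what “the settings allowed by the video card” could mean (just a sketch, not Blender code, and the helper name is made up): the driver reports its maximum MSAA sample count, so a preference could list the real options instead of a single checkbox.

```c
#include <stdio.h>
#include <GL/gl.h>

/* GL_MAX_SAMPLES comes with OpenGL 3.0 / ARB_framebuffer_object;
 * define it here in case the system header is older. */
#ifndef GL_MAX_SAMPLES
#  define GL_MAX_SAMPLES 0x8D57
#endif

/* Hypothetical helper: list the MSAA sample counts the current GPU
 * supports. Assumes an OpenGL context is already current. */
void print_msaa_options(void)
{
    GLint max_samples = 0;
    glGetIntegerv(GL_MAX_SAMPLES, &max_samples);

    printf("MSAA levels this GPU could expose in the preferences:\n");
    for (GLint s = 0; s <= max_samples; s = (s == 0) ? 2 : s * 2) {
        printf("  %dx\n", (int)s);
    }
    /* The chosen count would then be used when creating the window's
     * framebuffer, e.g. via glRenderbufferStorageMultisample(). */
}
```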

No.
From what I heard, Ton doesn’t like it, or something like that.

I don’t know if this has been brought up, but if you find the time, I would love to see a solid-with-wire draw mode and random wire colors for objects (yes, like 3ds Max… whatever/shrug). It’s somewhat outside your proposal and somehow part of it at the same time.
Another thing I would like to see is pre-selection highlighting. I think it was brought up before; I don’t see why we do not have it, at least as an option.
EDIT:
I just noticed this was already brought up… and the fact that Ton may, or may not, be against it is just ridiculous. I am curious what the reasoning for this is, as I see it as just being stubborn to the point of a hindrance… IF that is the case.

Although I’d love to know the reasons for some of these Ton-specific decisions, I don’t mind that we have a single person as a “benevolent dictator”. Without a person to ultimately say “yea” or “nay” and put their stamp on things, we’d have a program designed by committee, which would suck.

Considering Blender is a professional tool, and not a toy that needs to support ancient hardware to make more sales, it would be appropriate IMO to require at least OpenGL 3.3 level hardware. Considering OpenGL 4 has been out for a while and DX 12 is coming soon, OpenGL 3 is already a legacy API. GPUs that support it are dirt cheap, and anyone who wants to use Blender is better off investing $30 than trying to work with it on a museum piece. On the nVidia side, all GPUs six years old and newer (the 8000 series from 2006 onward) support OpenGL 3.3. All AMD GPUs since the R600 in 2006 also support it, and even the crappy Intel integrated chips have supported it since 2008 or so. The benefit would of course be cleaner APIs and shaders taking advantage of the performance and possibilities of GLSL 3.3. There’s really no good reason to stay with the ancient legacy APIs, except of course not being able to overhaul it all at once.
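To make “require at least OpenGL 3.3” concrete, a startup check could simply try to create a 3.3 core-profile context and refuse to continue if the driver can’t provide one. This is only a sketch using GLFW for convenience; Blender’s own window/context layer is GHOST, so this is not how Blender would actually do it.

```c
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* Request a 3.3 core profile context; creation fails on hardware
     * or drivers that cannot provide it. */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    GLFWwindow *win = glfwCreateWindow(640, 480, "GL 3.3 check", NULL, NULL);
    if (!win) {
        fprintf(stderr, "This GPU/driver does not provide OpenGL 3.3.\n");
        glfwTerminate();
        return 1;
    }

    glfwMakeContextCurrent(win);
    printf("OpenGL 3.3 core context created successfully.\n");

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```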

Thanks for doing this, I think it will improve blender a lot!

I am looking forward to your work!

@jwilkins

Great Project!
I really want to see a performance boost AND new visualization methods: for sculpting (AO, cavity, matcaps, shadows, configurable normal-based fake lights), for retopo (opacity, overlay), and even Phong shading for modeling :slight_smile: I find Gouraud shading so useless when modeling… I only use it when editing a scene; when modeling assets for a scene I prefer GLSL mode with a matcap to get that nice Phong shading without extra lamps :/, but the way it is handled now is neither comfortable nor fast.
I know that, for now, you only plan a performance boost, but maybe optional Phong shading for solid mode? Pretty please ^^’
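Just to illustrate what I mean by the difference (a toy sketch, not Blender’s shaders; all the names are made up): with Gouraud the lighting is computed per vertex and only the resulting color is interpolated, while with Phong the normal is interpolated and the lighting is computed per fragment, which is what keeps highlights smooth without adding lamps.

```c
/* Illustrative GLSL, embedded as C string literals. */

/* Gouraud: light per vertex, interpolate the color. Specular highlights
 * get washed out on low-poly meshes because they fall between vertices. */
static const char *gouraud_vert =
    "#version 330 core\n"
    "in vec3 pos; in vec3 nor;\n"
    "uniform mat4 mvp; uniform mat3 normal_mat; uniform vec3 light_dir;\n"
    "out vec3 lit_color;\n"
    "void main() {\n"
    "  vec3 n = normalize(normal_mat * nor);\n"
    "  lit_color = vec3(max(dot(n, light_dir), 0.0));\n"
    "  gl_Position = mvp * vec4(pos, 1.0);\n"
    "}\n";

/* Phong: the matching vertex shader just passes the transformed normal
 * through; lighting happens per fragment on the interpolated normal. */
static const char *phong_frag =
    "#version 330 core\n"
    "in vec3 interp_normal;\n"
    "uniform vec3 light_dir;\n"
    "out vec4 frag_color;\n"
    "void main() {\n"
    "  vec3 n = normalize(interp_normal);\n"
    "  frag_color = vec4(vec3(max(dot(n, light_dir), 0.0)), 1.0);\n"
    "}\n";
```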

I also think there should be more care about OpenGL 3 and 4 than about the legacy versions. Old cards should just work so people can discover and learn Blender, but a real hobbyist or professional will find the money for better hardware. Creating bottlenecks just to support old cards is actually useless.

For all the talk about OpenGL 3 or 4: for ages now the Intel graphics chips they put into laptops and whatnot have had only OpenGL 1.4 support through Mesa on Linux. Supposedly this will change in the next release for some chips through the Gallium driver, but who knows if/when they’ll all be supported.

And since these Intel chips are by far the most popular embedded chipset…

While it’d be nice to pop over to the store with $30 and buy a new video card, I simply don’t have the $500–$600 it would take to buy the rest of the hardware to replace my poor little netbook.

I’m just happy that Google semi-adopted the particular chipset in this thing (they apparently use it in their tablets) and pushed a bunch of changes up to the Mesa folks, so in a week or so when I upgrade to Fedora 17 I should get some improved graphics.

+1 pre-selection highlighting
+1 wire color
+1 background gradient

I do not know what kind of optimization Jason will implement, but Softimage can reportedly handle up to 40 billion polygons in the viewport with its gigapolygon core technology.

What is the minimum graphics card requirement here?