Oculus Rift for viewport navigation; Maya is already doing it.

What I mean by that: when it comes to the viewport, Maya is once again doing what BlenDon’t.
http://www.creativecrash.com/maya/plugin/moculus

Would be interesting to see in Blender (as well as the Blender game engine), but first we need to make sure people won’t get a choppy experience with fast head movement in a scene that’s little more than a few heavily subsurfed Suzanne heads. I’m afraid that will disqualify Blender from making this useful until the Viewport FX project from Jwilkens is completed and optimized (remember that one benchmark article in which the Blender bench looked like a gag thrown in for humor).

But just think of the possibilities once the big Oculus Rift unit is improved and slimmed down and allows you to sculpt in Blender without any use of a virtual clay apparatus :slight_smile:

Maybe it’s just my failure of imagination, but how exactly would the Oculus Rift help with sculpting? 99% of the sculpting that I do requires rapid orbiting/rotation, which the OR can’t do.

It might help to visualize bigger scenes, but otherwise it looks like a bit of a gimmick.

“BlenDon’t” - this is very good! I like it! Blender is a very bad program.
Maya is very good, 3ds Max is very good, etc.
Blender is - BlenDon’t.

Hail to the amateurs!

I think you’re too fixated on using the head tracking for orbiting/rotating.
Even using an OR I’d still sit at my desk with mouse/keyboard/tablet, not run around the model in the room.
The major navigation should still be done with “classical” input devices, with the OR working on top of them for a quick pan/tilt rather than replacing them.

And being immersed in modelling is just great. Around 5 years back I had the opportunity to work with a 720p HMD and a Sensable Phantom Omni (which is pretty much a “3D tablet with force feedback” - fun with physical properties, since your “clay” has resistance and objects have weight).

It’d also eliminate the “need” for matcaps, for instance, because you’d have real depth feedback when sculpting.

The biggest challenge to me it seems would be the UI.
It has to be neatly integrated.

One problem I see (might only be mine): when I have to look at a menu in Blender, I don’t just move my eyes, I turn my head slightly - probably because I wear glasses at the PC. If you wear glasses you start turning your head instead of moving your eyes, unless you have huuuugeass glasses :D.
Anyway. Those slight head-turns to look at a menu obviously should not move the UI - it has to be relative to your head orientation - but they would always move the viewport.
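One way to soften that conflict would be a deadzone: small head-turns (glancing at a menu) don’t touch the viewport, and only larger turns start orbiting. A minimal sketch in plain Python, with the function name and the 10° threshold entirely made up for illustration:

```python
import math

# Hypothetical deadzone filter: head turns below the threshold are
# treated as "looking at the UI" and leave the viewport untouched.
DEADZONE_DEG = 10.0  # assumed threshold, would need tuning in practice

def viewport_yaw_delta(head_yaw_deg: float) -> float:
    """Yaw (degrees) to apply to the viewport for a given head yaw.

    Inside the deadzone: 0.0 (UI glance). Beyond it: the viewport
    follows the head, minus the deadzone so motion ramps in smoothly
    instead of jumping.
    """
    if abs(head_yaw_deg) <= DEADZONE_DEG:
        return 0.0
    sign = math.copysign(1.0, head_yaw_deg)
    return head_yaw_deg - sign * DEADZONE_DEG
```

A real implementation would likely do this per-axis and with some hysteresis, but the idea is just that the UI stays reachable with small glances while bigger turns drive the view.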

All in all I think it would be beneficial for sculpting and modelling, and an interesting challenge for the developer.
But if I am not mistaken one of our forum members has an OR devkit and is tinkering with this already?
Same goes for Leap Motion IIRC.

Did not think of that. That could actually be pretty useful.

I think that’s a no-brainer actually. We naturally map the viewport and the GUI to certain parts of the view in front of us. It would be like a giant curved display, except that the 3D viewport would have depth vision. That way you could turn your head to see the GUI. You could even have multiple 3D viewports on this large “screen”, so you could view the model from different angles just by turning your head.

That would also work wonders for node editing, since you’d have all the screen space you ever wanted.

Yea, I changed my mind. Rift support would be awesome :smiley: (if done like this).

Can anyone inform me whether the Oculus Development Kit’s head tracker already supports head translation (XYZ)? If so, I might be tempted to get one and start tinkering with Blender code to integrate the Rift with the Blender viewport (if no one is doing this already). If not, it’s too early to start work on this, because the head tracker would be too limited for a truly immersive sculpting/modeling session. Information on the internet about this is extremely unclear.

EDIT: Seems not, from what I’m reading… So I believe it’s too early for such an implementation.

It’s actually not so much of a no-brainer and not so trivial.

You’ve got to keep in mind that those are two different kinds of navigation.
On one hand you have to simulate looking at the control panels on your physical screen in the virtual world.
That’s easily done, by mapping the UI just like you said.
On the other hand, you’ve got to control the viewport with the OR. Actually you could skip that part and use the OR just for 3D perception and immersion, but it would be nice to be able, for instance, to pan your view so it feels like your sculpt is standing in front of you on the table. That way you could sculpt digitally just as you would clay on your table and look around it - and additionally rotate it with the mouse to place it as you like, since there’s no gravity and your clay just floats midair.
But if you now track the head movement to orbit around your mesh - it would feel natural to do “rotate around selected” or “rotate around 3D cursor” - then you move the UI at the same time.

It’s a very theoretical discussion, and I guess one has to implement it first and then see what a good option is - or take “option” by its name and let the user choose whether UI and viewport are absolute, relative or mixed.
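The “rotate around selected” idea above boils down to orbiting the view camera about a pivot point by the head’s yaw delta. A minimal sketch, deliberately simplified to a 2D ground-plane rotation in plain Python (no Blender API; the names and the flat simplification are mine):

```python
import math

# Hypothetical "rotate around selected": move the camera position on a
# circle around the pivot (the selection / 3D cursor) by the head yaw.
def orbit_camera(cam_xy, pivot_xy, yaw_rad):
    """Rotate the camera position around the pivot in the ground plane."""
    dx = cam_xy[0] - pivot_xy[0]
    dy = cam_xy[1] - pivot_xy[1]
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    # Standard 2D rotation of the camera's offset from the pivot.
    return (pivot_xy[0] + c * dx - s * dy,
            pivot_xy[1] + s * dx + c * dy)
```

In the full 3D case you would rotate the offset vector with a quaternion built from the head pose instead, and re-aim the camera at the pivot afterwards; the principle is the same.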

I have a feeling I’m misunderstanding what you’re saying. First you say it’s not an easy problem, and then you immediately propose a perfectly sensible solution :). With regular mouse/keyboard controls this is not a hard problem, and I would do it myself if I had the time and money. The only non-trivial part is that Blender’s GUI is not very easy to extend. Might be easier with the Qt extension.

The 3D tracking stuff is an entirely separate problem and could be done even without the OR. The 3Gear system seems perfectly capable of doing that out of the box (assuming someone implements full 3D sculpting). The Leap Motion gadget might suffice for simpler things, but it doesn’t seem anywhere near accurate enough for sculpting. You won’t get haptics with either of those, though.

Their newest version uses absolute optical head tracking, but I don’t think it’s in production yet. It’s coming though.

The fact they’re charging nearly the price of an oculus devkit for this plugin is hilarious.

Time for this thread to pop up again.

Oculus Rift DK2 is here :wink:

What’s new?
6DOF, meaning it’s now possible to track translation as well.
1080p OLED display for crisp’n’clear images. No smearing, no (well, less) headache. (DK1 was 720p.)
Improved lenses.
Refresh rate cranked up to 75Hz to further counter motion sickness, with plans to go to 90Hz in the next version.
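To see what 6DOF buys you over DK1’s orientation-only tracking: the view transform now includes the head *position* too. A toy sketch in plain Python (yaw-only rotation, made-up names, no actual Rift SDK involved):

```python
import math

# Hypothetical 6DOF view transform: express a world point in head-local
# space by subtracting the head position, then undoing the head yaw.
# With DK1-style 3DOF, head_pos would be a fixed constant - leaning in
# or peeking around the model would do nothing.
def world_to_head(point, head_pos, head_yaw_rad):
    """Transform a world-space (x, y, z) point into head-local space."""
    px = point[0] - head_pos[0]  # translation: new with 6DOF
    py = point[1] - head_pos[1]
    pz = point[2] - head_pos[2]
    c, s = math.cos(-head_yaw_rad), math.sin(-head_yaw_rad)
    # Inverse yaw rotation about the vertical axis.
    return (c * px - s * py, s * px + c * py, pz)
```

Leaning 10cm closer simply changes head_pos, and everything in view shifts accordingly, which is exactly the parallax that makes a sculpt feel like it’s physically on the table.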

Aaaaand a little vid:

For only $350 I get veeeery tempted to order one for myself just to play with, even though I know that the final consumer version will be much better. But I am not sure I can wait much longer! :smiley: