3D Razer Hydra Controller in Blender 3D Sculpting

I was able to hack together a proof of concept script for integrating the Razer Hydra with Blender 3D.

Latest Download here:
/uploads/default/original/4X/2/7/d/27d5679d56a843505f0abe52da8b791edadba0b8.zip?d=1400716680

The video explains pretty much everything except how easy it is to use. It feels like you’re holding your 3D object in your hand, and it takes almost no effort or learning to pick up.

1:1 control of objects in your scene feels incredibly natural, and it will be even better when Blender’s stereoscopic viewport is released. Once this is fully built as a proper plugin, it’s going to be a game changer for 3D design.

Right now it’s really basic and only operates on the default cube, but it’s like nothing Blender has ever seen before, so I had to share it! It should end up as a plugin with lots of features, and be easy to use.

I haven’t released the script because I only made it yesterday, and it isn’t very well built. I’m new to Python and have never written any plugins, so I’m open to help and suggestions.

(also, the dubstep bit in the song was created live using a razer hydra and midi keyboard)

Nice one! How much does one of those puppies run for?

$100 - http://www.razerzone.com/store/razer-hydra. I got mine for music and games, and hopefully for making cool games too. =)

Or $165 if you’re Australian. Yay Aussie tax. :slight_smile:

I was looking at one of these to use with an Oculus Rift. If you’re making a Blender implementation too, that clinches it.

Very nice demo.
I’ve had a Hydra myself since yesterday.
How did you set up your system to work with Blender?
You wrote about a simple “proof of concept script”. Is that a patch or a Python script?
Would you share the current version of your script?

@quollism, my Rift is ready for shipping. That was the reason to buy the Hydra, too. Rift and Hydra seem to be a good combination.

Would pressure sensitivity be available for this? And if it is, how would it work?

Very impressive, I hope this makes it to trunk! :slight_smile:

This will end up as an add-on, so it can be changed by anybody. It could do a lot without any deep integration into Blender, but if Blender were extended so you could use it to dynamically draw meshes and such, that would be awesome.

Right now it’s just rotating a cube. You can still use your tablet for sculpting pressure, but making the Hydra act as a tablet for sculpting would require some fancy coding.

Congrats on getting a Hydra! It’s a Python script sitting in my addons folder. It runs with the classic --python [script location\script.py] launch command.
If you PM me your e-mail I can send it your way with full instructions.
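For anyone wondering what that launch looks like in practice, here’s a tiny sketch. The binary name and script path are placeholders, not the actual ones from the script:

```python
import subprocess

# Hypothetical names/paths -- point these at your real Blender binary
# and wherever the proof-of-concept script actually lives.
blender = "blender"               # e.g. a full path on Windows
script = "addons/hydra_hack.py"   # placeholder script name

# Blender's --python flag runs the given script on startup.
cmd = [blender, "--python", script]
print(" ".join(cmd))
# subprocess.run(cmd)   # uncomment to actually launch Blender
```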

Hot damn. This is marvelous. I don’t think £90 has left my wallet as fast in my life.

My Oculus Rift is in the post too (allegedly), but being able to use something like the Hydra as a simple motion controller would make it so much easier for me to animate - I’ve come from 3DS Max, which let you use the mouse as a simple motion controller, but I haven’t found a way to do it in Blender yet. If I have scenes with lots of little characters in them, it’s such a timesaver to just record head movements with the mouse. You get quite good at it after a while.

So your demo: if you can control that mesh like that, it’s got to just be a matter of sorting out some Python to play the animation (perhaps with a few seconds of pre-roll), then record your motions as you move and tilt the Hydra.

Hmm.

Just a thought - if you have that Auto Keyframing record button switched on, then play your animation, does Blender record what you’re doing with the Hydra?

Right now it doesn’t record animations, because the script changes the object’s position directly rather than its transform channels, so there’s nothing for auto-keyframing to pick up.

As soon as this is changed (or rather, as soon as there’s a sweet tool panel for it with options) you’ll be able to use it to animate right in the viewport.

You’ll even be able to set up marionette-style rigs using rigid body physics. It’s going to be outstanding.
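The recording side is easy to sketch outside Blender. Here’s a rough, hypothetical recorder (plain Python, no bpy; all names are invented) showing the idea of turning per-frame controller samples into a keyframe list:

```python
# Sketch: sample the controller each frame, then emit keyframes.
# record_motion is an invented name, not part of the actual script.

def record_motion(samples):
    """Turn per-frame (frame, location, rotation) samples into a
    keyframe list, dropping consecutive duplicates so the F-Curves
    stay clean."""
    keyframes = []
    last = None
    for frame, loc, rot in samples:
        if (loc, rot) != last:
            keyframes.append({"frame": frame, "location": loc, "rotation": rot})
            last = (loc, rot)
    return keyframes

samples = [
    (1, (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)),
    (2, (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)),  # hand held still -> skipped
    (3, (0.1, 0.0, 0.0), (0.0, 0.0, 5.0)),
]
print(record_motion(samples))
```

In a real add-on, each entry would become a keyframe on the object’s location/rotation channels during playback.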

Got my Hydra on Monday afternoon. It is an odd but definitely nifty little device. Mystical black and green glowing orb of mystery indeed…

Even just with that beta MCE editor and sculpting-specific key bindings, it’s really usable. Having the P1R joystick button re-centre the mouse cursor is dead useful. Can’t figure out how to bind F or Shift-F yet to alter brush size/strength, but I’ll keep playing.

Hey! I’m probably going to be actually linking this up with blender sooner or later, and I’ve got help.

I’m pretty sure we can also push it all the way to full-freedom (and full-freedom-scale) mesh editing and viewport camera control using an MCE binding and Python scripting.

I can see it being very useful for modeling trees or curves in 3D by simply dragging your arm through space and clicking whenever you need to add a new point / extrusion.
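That click-to-add-a-point workflow could be sketched like this (plain Python, invented names; in a real add-on the points would come from the Hydra’s position and feed a curve object):

```python
def add_point(points, candidate, min_spacing=0.05):
    """Append a new control point only if it's far enough from the
    last one, so small hand jitter doesn't flood the curve."""
    if points:
        last = points[-1]
        d2 = sum((a - b) ** 2 for a, b in zip(candidate, last))
        if d2 < min_spacing ** 2:
            return False
    points.append(candidate)
    return True

curve = []
for p in [(0, 0, 0), (0.0, 0.0, 0.01), (0, 0, 0.2), (0, 0, 0.5)]:
    add_point(curve, p)
print(curve)  # the jittery second sample gets filtered out
```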

Sixense itself wants to build a VR modeler called “MakeVR”.
http://sixense.com/makevr
http://www.roadtovr.com/2013/04/03/gdc-2013-sixense-makevr-virtual-reality-cad-kickstarter-4495

Any news about the Blender integration?

Right now Razer Hydras are 50% off… http://www.razerzone.com/VRpromo

This project is still underway. It will probably happen. Your posts and feedback (and video hits) are my current inspiration to not put this off for too much longer. =)

Sixense’s MakeVR isn’t aimed at professionals. The only way I’ll make this is if I can use it professionally. The other guy helping isn’t working on it right now because he’s busy using Blender professionally on a big project.

I really want to get this done, because then I’ll be able to spend more time (more efficiently) making cool puppets in Blender using rigid body physics and live animation.

I’m also going to be presenting about it at a cool VR / HCI group that just started, so I’d best have some cool demos soon.

For the sake of completeness I’ll repost what I said on YouTube before writing more here:

Why not move the camera with one hand and sculpt with the other?

Oh, and for zooming (instead of dollying), just grab the space with both hands and stretch.

Edit: BTW, about the camera motion, I mean in that grabbing-and-releasing style, like in those early tech demos and the more recent videos about Sixense’s boolean modelling thingy.
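The two-handed stretch-zoom is really just a ratio of hand distances. A minimal sketch (plain Python, invented names):

```python
import math

def stretch_zoom(left0, right0, left1, right1, min_dist=0.01):
    """Zoom factor from the change in distance between the two
    controllers: grab space with both hands, pull them apart to zoom
    in, bring them together to zoom out. min_dist avoids dividing by
    (near) zero when the hands touch."""
    def dist(a, b):
        return math.sqrt(sum((ax - bx) ** 2 for ax, bx in zip(a, b)))
    d0 = max(dist(left0, right0), min_dist)   # distance at grab time
    d1 = max(dist(left1, right1), min_dist)   # distance now
    return d1 / d0

# Hands start 20 cm apart, end 40 cm apart -> 2x zoom in.
print(stretch_zoom((0, 0, 0), (0.2, 0, 0), (-0.1, 0, 0), (0.3, 0, 0)))
```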

Here’re some additional ideas/requests:

  • It would be great to have it work both in and out of sculpt mode.
  • With an easy way to use the interface without having to put down the controllers.
  • A 3d selection method with a spherical “brush” to paint things you wanna select.
  • The possibility of 6dof sculpting, as well as locking individual axes.
  • The analog trigger, as well as the analog stick axes, as inputs like the “pressure” thing from tablets.
  • A “2d” tablet-like mode where it just hooks into the tablet functions and uses the proximity to an imaginary plane as the “pressure”.
  • A pie menu called by pressing down the stick: highlight an option with the stick and let go to select.
  • Stamping/carving with arbitrary meshes.
  • Object manipulation with both hands (for things like stretching, twisting, bending, etc.).

I guess that is it for the moment…
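The “imaginary plane as pressure” item above is simple to prototype. A hedged sketch (plain Python, invented names, distances in metres):

```python
def plane_pressure(tip_z, plane_z=0.0, full_depth=0.02):
    """Map the controller tip's depth past an imaginary plane to a
    0..1 'pressure' value, like tablet pressure: touching the plane
    gives 0, pressing full_depth metres past it gives 1."""
    depth = plane_z - tip_z          # positive once the tip crosses the plane
    return min(max(depth / full_depth, 0.0), 1.0)

print(plane_pressure(0.01))   # still above the plane -> no pressure
print(plane_pressure(-0.01))  # halfway through -> half pressure
print(plane_pressure(-0.05))  # well past -> clamped to full pressure
```

The resulting value could then be fed wherever Blender expects tablet pressure for the brush.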

Here’s a demo using the hydra + blender for projector mapping.

I’ve updated the script. It no longer crashes, and it runs inside Blender’s own thread rather than a separate one.
It’s working as an add-on.

Things are hardwired for now, but I’m working on making it easy to do motion capture.

I’ll send the script to anyone who wants it, but I’m not going to release it globally yet.

Awesome work. I pm’d you my email. I would love to collaborate one day.

I’ve got a rough demo working well enough for anybody to try.

It doesn’t have animation in it, but it’s fun for holding a sculpting object, or smashing rigid bodies around with more rigid bodies or force fields.

Hydra Motion Capture 001.zip (338 KB)

I tried to follow the instructions from the ReadMe file, but got the following on the console in that scene after I click “reload as trusted”:

Error in Driver: The following Python expression failed:
'GetLocals(locals())'

Traceback (most recent call last):
  File "<bpy driver>", line 1, in <module>
NameError: name 'GetLocals' is not defined

Seems I didn’t follow the instructions as well as I thought I had; got it working now. Sorry about that, my bad.