Perception Neuron + Blender?

Hey, I was just wondering. I feel like I've heard that Perception Neuron works with Blender, but I actually can't find anyone who has used it with Blender.

Does anyone know of someone who has tried it with Blender, or whether it's worth getting for Blender?

I brought this up before it was released, but I think a force-feedback-style suit would have more potential.

I have a Perception Neuron on order (should be here next week according to my tracking number)… Anyway, from memory, the software it natively talks to (Axis Neuron) can export to the .bvh file format, which is a standard for mocap data and which Blender can import natively.
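(For reference, Blender can also do that import from a script via its built-in BVH importer; a minimal sketch, with the file path as a placeholder:)

```python
import bpy

# Import a BVH exported from Axis Neuron using Blender's built-in importer.
# The file path is a placeholder -- point it at your own export.
bpy.ops.import_anim.bvh(
    filepath="/path/to/capture.bvh",
    global_scale=1.0,       # lower this if the rig imports far too large
    frame_start=1,
    use_fps_scale=False,
    rotate_mode='NATIVE',   # keep the rotation order stored in the file
)
```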

@BPR, how would a force-feedback-style suit have more potential for Blender? Both systems have their own pros and cons.

To be able to act in real time, feel force feedback in the game engine, and record the performance;

this way an actor can feel a doorknob, or pick up a cigarette, that doesn't exist.

I can see people 'playing' a scene to record it in a thousandth of the time it takes to animate by hand,

Mocap with visual data is OK, but having an actor use AR seems like the next logical step.

Also, with force feedback you could sculpt by hand and feel the substance yield like clay.

(I don't just see them for Blender either: for mocap, driving a surrogate, playing games, etc.)

An overpowered version could help the disabled walk.

I think someone will 3D print them soon.

(Will it be you?)

Ah, of course. Yeah, I don't use the game engine at all, so there's no point in force feedback for my situation.

> Also, with force feedback you could sculpt by hand and feel the substance yield like clay.

Sure, but maybe it would be more efficient, workflow-wise, to build tools that aren't physically possible instead of trying to mimic reality?

I will receive mine too (soon, I hope ^^).

Did you try the Motion Capture Addon made by Benjy Cook?

I already got my Perception Neuron.
I thought about doing an unboxing and some videos about experimenting with it, but I have to study for exams, so sadly I don't have much time at the moment. I did some tests, and yeah, I tried Benjy's addon, but not that successfully.
I will post some results as soon as possible! :slight_smile:

I have a Perception Neuron. Kickstarter, yay. I'm only planning to use it for offline motion recording, not real-time control.

Thanks for the info about mocap tools!

Looking forward to some reviews of the product, it looks really promising! :slight_smile:

Not really; a real sculptor could do quite well in 3D if all their skills translated.

Imagine picking up a paintbrush and feeling it slide over the medium.

Imagine making a sword in the viewport, picking it up in your hand, clicking 'record', and swinging it,

then throwing the animation into Unity, etc.

Exporting to .bvh and using MakeWalk to load it onto the armature works great! The only issue is that I can't make a .bvh exported from a raw file in Axis Neuron work for the fingers in a Blender metarig (from Rigify).

What is MakeWalk?
What is your exact workflow?
I have experimented a lot now and am planning to do a tutorial series on Axis Neuron -> Blender -> UE4 if people are interested in it.

MakeWalk is the addon I use to load .bvh files exported from Axis Neuron. Were you successful in making the fingers work in the Blender armature?

This is the whole process I’m doing:

1- Open a .raw sample file (downloadable from the Perception Neuron website) in the Axis Neuron software;
2- Export the animation from that .raw file to a .bvh file;
3- Open Blender and add the human metarig armature (from Rigify); don't generate the rig;
4- Assuming you have the MakeWalk addon installed, open the left panel by pressing "T" in Blender;
5- Under the "Misc" tab (it will only be available if you have MakeWalk), click "Load and Retarget" to open the .bvh file dialog (important: it will only load a .bvh if your armature's scale is 1, otherwise it shows an error message; see the sketch after this list);
6- Switch to the Animation view and click the "Play" button;
7- It should work great; for me it does. The only issue, again, is the fingers: although my .bvh contains finger animation, it doesn't load onto the Blender armature.

Any tips are welcome, thanks.
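Regarding the scale-1 requirement in step 5: if the metarig has been scaled, applying its scale resets it to 1 without changing its visual size. A minimal sketch for recent Blender versions (the object name assumes Rigify's default):

```python
import bpy

# MakeWalk's "Load and Retarget" refuses armatures whose scale isn't 1.
# Applying the scale bakes it into the mesh data and resets it to (1, 1, 1).
rig = bpy.data.objects["metarig"]   # Rigify's default name for the human metarig
bpy.ops.object.select_all(action='DESELECT')
rig.select_set(True)
bpy.context.view_layer.objects.active = rig
bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)
```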

> MakeWalk is the addon I use to load .bvh files exported from Axis Neuron. Were you successful in making the fingers work in the Blender armature?

Hey, cool addon! I tried it myself, but it doesn't seem to work with fingers. If you look in the Source Armature and Target Armature panels, you'll see there are no entries for the fingers.
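One quick way to check whether the finger joints even exist in the imported BVH rig is to print its bone names; a small sketch (the object name is a placeholder):

```python
import bpy

# List every bone of the imported BVH armature, so you can see whether
# Axis Neuron's finger joints made it through the export at all.
bvh_rig = bpy.data.objects["capture"]   # placeholder: your imported BVH object
for bone in bvh_rig.data.bones:
    print(bone.name)   # finger joints typically contain "Thumb", "Index", ...
```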

I did all my retargeting manually with constraints. It's a lot of work, but it worked well, and you only have to do it once.

Hi guys, I also have a few questions to discuss :slight_smile:

I'm a total beginner; I've never done a human rig or retargeting, so there's a lot to learn :slight_smile:
For now I've only tried the default BVH import, but I have two questions:

  • Even though I only did a right-hand test motion, so the rest of the armature should be in the default position, the right leg is shorter in Axis Neuron.
    (In Axis I didn't notice it at first, only after importing to Blender, where I could line it up in a side view.)
    Can anybody confirm that?

  • Is there any benefit to exporting the data in FBX format for importing into Blender?

Thank you for the help.

Here is one of my first tests of the right hand. My first impression is very good :slight_smile: There are errors, but at the price I could get it for on Kickstarter it seems usable (at the full price I would not buy it at this stage). Also, if I have to pay for Axis Pro, it will be a pain. My opinion is not entirely fair, because I have never seen "raw" data from any other mocap system, and I don't know how much time they spend cleaning the data.

Yesterday, after a few calibrations to find a finger position that puts the neurons in a good spot, I was able to touch my thumb with my forefinger. Today, zero success. One issue I can't confirm yet: it seems longer captures give more errors.

https://vimeo.com/145382509


Hello, I'm looking for users of Perception Neuron before ordering for my serious-games company. Did you order and receive your model?
Does it work as well as in the videos?
Thank you

I'm probably not the right person for you; I want to use it for mocap, not for games. But I can say I finally got my order. As a Kickstarter backer I was supposed to get it in February, so that's more than half a year of delay. They had manufacturing issues and whatever else to solve. So if I'm only getting it now, try to imagine the whole line of people who ordered after the Kickstarter campaign. But maybe they are in better shape for mass production now. It's probably better to ask them; my experience may not mean much since I was waiting so long despite all their schedules, but I understand it's not an easy process to bring these projects to life.

What I can say is that the final factory product is very well made. Sometimes it's a problem to unplug a neuron, but the materials and the precision of the manufactured pieces are of very good quality.

Another thing is the quality of the output data. In some cases the results are brilliant; in other cases they are very inaccurate.
I have used mocap data a few times before, but just for fun, and the data I used had already been cleaned.
I have never seen raw data from any motion capture system, so I'm not qualified to speak about the quality of PN.
(By "raw" I mean motion data that I don't have to touch. What you see in the video is that kind of data; it is not truly raw, since several algorithms "clean" the data: contact constraints, smoothing of f-curves.) The smoothing function works great; their first released data was shaky, and when I tried to polish that data myself I had no such success.
I just can't say what they will be able to do with the raw data as development continues, or what could be done better with this kind of technology (gyro, accelerometer, magnetometer).
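(For anyone wanting to experiment with smoothing inside Blender instead, a naive moving-average pass over the imported action's f-curves might look like the sketch below. This is a rough stand-in, not Axis Neuron's actual algorithm, and it assumes the mocap armature is the active object.)

```python
import bpy

# Naive 3-frame moving average over every f-curve of the active action --
# a crude stand-in for Axis Neuron's smoothing, for comparison only.
obj = bpy.context.object
action = obj.animation_data.action

for fcu in action.fcurves:
    values = [kp.co[1] for kp in fcu.keyframe_points]
    for i in range(1, len(values) - 1):
        # average each key with its two neighbours
        fcu.keyframe_points[i].co[1] = (values[i - 1] + values[i] + values[i + 1]) / 3.0
    fcu.update()
```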

If it helps, here is my first full-body mocap test; I just pressed the record button:

During this test I had to calibrate several times. I only used it for half an hour, so it's just a feeling, but over longer captures the sensors sometimes seemed to become more inaccurate. And sometimes a very bad start position ended with a much better result.
Good calibration is "key", like they say, but it's hard to say whether it is the only main factor.

Note: I'm quite surprised how little information there is from end users. Totally zero feedback. Most of the videos are from Noitom, or unboxings that say nothing at all. They have been shipping the product, slowly, for several months already, yet there is nothing to see. Do users throw it straight in the trash, or is it so unremarkable to use that nobody has any issues to discuss or even results to share?

With a Perception Neuron + walking ragdoll, you could actually interact with the environment; however, the feedback would need to be much, much stronger.

I think full-body 3D-printed exoskeleton tech will help animate the future, from helping the disabled walk to true telepresence.

Wow! Thank you so much, this video is VERY useful. It shows the true potential of PN, errors included, so it's good for knowing what to expect. My workplace is waiting for a package; hopefully we can do this kind of testing as well.

Thanks!

@vklidu: You've exceeded your PM folder limit! :wink:

vklidu asked me how I used my BVH files.

I retargeted it manually within Blender. I have a rig with IK and constrained the IK bones to the bones of the BVH rig.
I mainly used Copy Rotation, Copy Location, Child Of, and some Transformation constraints.
It's a bit of trial and error if you're not that familiar with constraints, and the result is not perfect.
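If anyone wants to script part of that manual setup, adding one such constraint looks roughly like this (all object and bone names are placeholders for your own rigs):

```python
import bpy

# Constrain one bone of the IK rig to the matching bone of the imported
# BVH armature -- the scripted equivalent of one step of the manual setup.
ik_rig = bpy.data.objects["MyRig"]      # placeholder: your IK rig
bvh_rig = bpy.data.objects["capture"]   # placeholder: the imported BVH rig

bone = ik_rig.pose.bones["hand_ik.R"]   # e.g. a right-hand IK bone
con = bone.constraints.new('COPY_ROTATION')
con.target = bvh_rig
con.subtarget = "RightHand"             # the matching bone in the BVH rig
# Repeat with 'COPY_LOCATION', 'CHILD_OF' or 'TRANSFORM' for other bones.
```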