I hope I’m writing in the right place this time.
I’m doing realtime motion capture with a Kinect v2 and Blender. I managed to send rotation data as quaternions from my Kinect application to Blender, but my 3D model in Blender gets totally wrecked.
It seems I must do something to those quaternions before applying them to the bones.
Any idea?
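In case it helps frame the question: a common cause of a "wrecked" mesh is applying world-space sensor quaternions directly to `pose.bones[...].rotation_quaternion`, which Blender interprets in each bone's local space. Below is a minimal, hedged sketch (plain Python, no `bpy`, so the function names `kinect_to_blender` and `world_to_bone_local` are my own, and the axis remap is an assumption that depends on your rig) of the two steps that are usually needed: remapping the Kinect coordinate system (Y up, Z toward the sensor) to Blender's (Z up), and converting a world rotation into a bone-local one by multiplying with the inverse of the parent's world rotation.

```python
# Quaternions as (w, x, y, z) tuples.

def q_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_conj(q):
    """Conjugate; equals the inverse for unit quaternions."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def kinect_to_blender(q):
    """Hypothetical axis remap: Kinect (Y up, Z toward sensor)
    to Blender (Z up). The exact remap/signs depend on your
    sensor orientation and rig -- verify against your setup."""
    w, x, y, z = q
    return (w, x, -z, y)

def world_to_bone_local(q_world, q_parent_world):
    """local = inverse(parent world rotation) * world rotation."""
    return q_mul(q_conj(q_parent_world), q_world)
```

Inside Blender you would then assign the result to `bone.rotation_quaternion` (with the bone's rotation mode set to quaternion); `mathutils.Quaternion` can replace these helpers there. You may additionally need to account for each bone's rest-pose orientation, which this sketch does not cover.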
My C# program sends rotation data for all joints, but the Blender script from the link receives mostly the SpineBase and SpineMid bones and only very rarely some of the others. When I test with a different pyosc script, it receives many different bones.