Gesture Recognition for BGE

Hello,

I am working on a project right now. I need gesture recognition for it.

Basically through gestures a Blender object should move. If anyone could post any tutorials or any explanation that would be of help, it would be great.

Thank you! :slight_smile:

By ā€œgesturesā€, do you mean ā€œanimationsā€ or ā€œactionsā€? Otherwise, I have no idea what you mean, sorry. =P

I think he means multi-touch trackpad gestures like the ones you use on a MacBook trackpad or other similar devices.
He could also mean "air" gestures like the ones you use with a Kinect, Leap Motion, and so on…

Just wondering, did you just receive the Myo also? :slight_smile:

Anyhow, you can use Python to receive and send data. Since multithreading and multiprocessing in the BGE are tricky, you could instead poll the device every frame, either by reading the data directly or by using some form of IPC where you read from a socket and have another program/process relaying the information.
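To make that concrete, here's a minimal sketch of the socket-polling idea. It assumes the external process sends gesture names ("left", "right", "up", "down") as plain-text UDP datagrams to port 9999; the port, message format, axis mapping, and step size are all my own placeholders, not anything the BGE defines.

```python
import socket

PORT = 9999  # assumed port; the external sender must use the same one

def make_gesture_socket(port=PORT):
    # Non-blocking UDP socket: recvfrom returns immediately when there is
    # no data, so polling it once per frame never stalls the game loop.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    sock.setblocking(False)
    return sock

def poll_gesture(sock):
    # Return the latest gesture string, or None if nothing arrived this frame.
    try:
        data, _addr = sock.recvfrom(64)
    except BlockingIOError:
        return None
    return data.decode("utf-8").strip()

def gesture_to_vector(gesture, step=0.1):
    # Map a gesture name to a local-space movement vector; the axes and
    # step size are placeholders to adapt to your scene.
    return {
        "left":  (-step, 0.0, 0.0),
        "right": (step, 0.0, 0.0),
        "up":    (0.0, 0.0, step),
        "down":  (0.0, 0.0, -step),
    }.get(gesture)
```

Inside the BGE you would call this from a module-mode Python controller wired to an Always sensor with true-level triggering, cache the socket somewhere persistent (e.g. on `bge.logic`) so it survives across frames, and apply the resulting vector to the object with `own.applyMovement(vec, True)`.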

Hi, I mean air gestures, for example swipe left, right, up, and down.
My gestures are being recognized using OpenCV, but I can't catch the keyboard input in Blender.
For example, when I swipe towards the left, the code emits 'w' as a keyboard input, so in Notepad a 'w' appears when I swipe left.
But when I assign the 'w' key to the player in Blender for moving forward, Blender isn't catching the keyboard input.

Can anyone help me with this, or with anything else that will move objects in Blender using gestures?
P.S.: The gestures are captured using a webcam.

Thank You.
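One way around the keystroke problem: instead of emulating a keypress from the OpenCV script, send the gesture itself over a socket to the game, along the lines of the IPC suggestion above. A minimal sketch of the sender side, assuming the same invented convention (plain-text gesture names as UDP datagrams to port 9999, which the BGE side would bind and poll):

```python
import socket

GAME_ADDR = ("127.0.0.1", 9999)  # assumed address; the BGE side must bind this port

def send_gesture(gesture, sock=None):
    # Fire-and-forget: one short UDP datagram per detected gesture.
    own_sock = sock or socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    own_sock.sendto(gesture.encode("utf-8"), GAME_ADDR)
    if sock is None:
        own_sock.close()

# In the OpenCV detection loop, replace the keystroke emulation with e.g.:
#     if direction == "left":   # 'direction' is whatever your detector yields
#         send_gesture("left")
```

This sidesteps keyboard focus entirely: the embedded player never needs to be the foreground window to receive the data, which is the usual reason synthesized keystrokes don't reach it.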