Hello everyone,
A while back I made an interactive installation with Wiimotes, Blender and Pure Data. Pure Data read the Wiimote's data and sent it to Blender over OSC, so I could use any Wiimote input inside Blender to get the results I wanted. Thankfully I only had to write a few lines myself (being an artist, not a coder), because all the data-transfer code already existed; I just had to wire it to the right outputs.
I am wondering if there is anything similar for Android's or iOS's sensors, something that would let me read accelerometer, gyroscope, compass and other sensor data and use it in the BGE to create the visual output I want.
I could still use all the existing work and code with the Wiimote, the Wii MotionPlus for the gyroscope and so on. But I want to teach this to the general public with an artistic focus on the installations' concepts, and it is more common to find people with smartphones than with Wiimotes.
I did search for this, but could only find people trying to code a bridge between Blender and the sensors back in 2012. I hope something has been done since then.
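To be concrete about what I am after: assuming the phone runs an app that sends its sensor readings as plain OSC messages over UDP (TouchOSC, for example, can do this), the Blender side would only need to receive UDP packets and decode them. Below is a minimal, untested sketch of the decoding part in Python, using only the standard library; the `/accel` address is just a placeholder for whatever address the phone app actually uses.

```python
import struct

def _read_padded_string(data, offset):
    # OSC strings are ASCII, null-terminated, and padded to a 4-byte boundary.
    end = data.index(b'\x00', offset)
    text = data[offset:end].decode('ascii')
    offset = end + 1
    offset += (-offset) % 4  # skip padding
    return text, offset

def parse_osc_message(data):
    """Parse one OSC message (address pattern + int/float args) from raw bytes."""
    address, offset = _read_padded_string(data, 0)
    typetags, offset = _read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(','):
        if tag == 'f':  # 32-bit big-endian float
            (value,) = struct.unpack_from('>f', data, offset)
            args.append(value)
            offset += 4
        elif tag == 'i':  # 32-bit big-endian int
            (value,) = struct.unpack_from('>i', data, offset)
            args.append(value)
            offset += 4
    return address, args
```

Inside the BGE, a script run by an Always sensor could then bind a non-blocking UDP socket on the port the phone sends to, call `parse_osc_message()` on each received packet, and map values from an address like `/accel` onto object properties, in much the same way the Pure Data patch mapped the Wiimote outputs.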
Thanks in advance.