Real-time cellphone sensor data transfer to BGE (NOT BGE on cellphones)

Hello everyone,

A while back I made an interactive installation with Wiimotes, Blender and Pure Data. Pure Data read the Wiimote's data and transferred it to Blender via OSC, and inside Blender I could use any Wiimote input I wanted to get my desired results. Thankfully I only had to code a few lines (being an artist, not a coder), because all the data-transfer code already existed; I just had to wire it up to the right outputs.
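
For reference, the Blender side of that setup was basically just an OSC listener, and something similar should work for phone sensors too. Here is a minimal sketch of a BGE module controller, assuming the python-osc package is installed into Blender's bundled Python, and assuming a made-up "/accel" address and port 9000 (whatever app actually runs on the phone would define the real ones):

```python
# osc_receiver.py -- a minimal sketch, not a tested implementation.
# Assumes python-osc is installed into Blender's bundled Python, and
# that the sender delivers floats to a made-up "/accel" address on
# port 9000.

import threading
import mathutils

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import ThreadingOSCUDPServer

latest = {}        # address -> most recent argument tuple
_started = False   # guard so only one server thread is spawned

def _store(address, *args):
    # Runs on the server thread; just remember the newest values.
    latest[address] = args

def _start_server():
    disp = Dispatcher()
    disp.set_default_handler(_store)  # catch every incoming address
    server = ThreadingOSCUDPServer(("0.0.0.0", 9000), disp)
    # Daemon thread, so the game loop never blocks waiting for packets.
    threading.Thread(target=server.serve_forever, daemon=True).start()

def main(cont):
    # Wire this up as a module controller (osc_receiver.main) driven by
    # an Always sensor with true-level pulsing enabled.
    global _started
    if not _started:
        _start_server()
        _started = True
    accel = latest.get("/accel")  # e.g. (x, y, z) floats, if any arrived
    if accel:
        # Crude example mapping: tilt the object from the x/y components.
        eul = mathutils.Euler((accel[1] * 0.1, accel[0] * 0.1, 0.0))
        cont.owner.localOrientation = eul.to_matrix()
```

The server lives on its own daemon thread and the game loop only ever reads the last values it stored, so the frame rate never waits on the network.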

I am wondering if there is anything similar for Android's or iOS's sensors: something that would let me read accelerometer, gyroscope, compass and other sensor data and use it in the BGE to create the visual output I want.
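
In case it helps anyone trying this, the Blender side can also be tested without any phone at all by faking the sensor stream from another script. A minimal sender sketch, under the same assumptions as above (python-osc installed, port 9000, and the made-up "/accel" address):

```python
# fake_phone.py -- feeds made-up accelerometer values to the BGE
# listener so the Blender side can be tested without a phone.
# Same assumptions as the receiver sketch: python-osc installed,
# port 9000, made-up "/accel" address.

import math
import time

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # the machine running BGE

t = 0.0
while True:
    # A slow wobble standing in for real tilt data.
    client.send_message("/accel", [math.sin(t), math.cos(t), 9.81])
    t += 0.05
    time.sleep(1 / 30)  # roughly 30 updates per second
```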

I could keep using all the existing Wiimote work and code, with the Wii MotionPlus for the gyroscope and so on. But I want to teach this to the general public with an artistic focus on the installations' concepts, and it is far more common to find people with smartphones than with Wiimotes.

I did try to search for this, but could only find people trying to code the bridge between Blender and phone sensors back in 2012. I hope it has been done since then.

Thanks in advance.

Unfortunately, "bridge" is a funny choice of words: the only platform debugger, the Android Debug Bridge (adb), is useless for sensor output unless you can get root. As far as the C API goes, you have to use the libraries from the hardware manufacturer, which are enigmatic at best. See the SDK for the Java implementation to send sensor data over the network or through a pipe (latency-ridden over Bluetooth). Your best bet is to get a cheap RF transmitter, plug it into the headset jack, and use latches to filter out timed commands/channels.

Oh, and I finally found a C API resource for ALSA audio in/out: the pjsip library implements it for a VoIP phone. Now if only I could find NativeActivity real-time management of ALSA audio. As I understand it (limitedly), iOS is just as bad.