OSC Input for 2.5

When 2.5 started its feature-request wiki, I opened a discussion about setting up a sort of “verse” that did nothing more than transmit these new “universal handler” commands over a network.

More recently I discovered OSC, which does a fantastic job of sending UDP messages from one thing to another.

As you can guess, I’m running into plenty of problems. Since the thread is new, I’m just going to post my checklist of issues below. As we smash through my stranger problems, I’ll keep developing this area as I have been for the past two months, in hopes of making something that actually works.

To catch you up, there are two different OSC modules for Python:

  • pyKit. The popular thing to say about it is that it is “no longer developed”. A shame, because it works very well. Sadly it doesn’t come with any example files, so the learning curve is steep.
import socket
import OSC

def osc_data(host, port):
    # listen for one UDP datagram and decode it as an OSC message
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind((host, port))
    raw, addr = s.recvfrom(1024)
    values = OSC.decodeOSC(raw)
    return values

# in this one line I imported 16 analog inputs from a PS3 controller, and the
# 6 axes from a SpaceNavigator, all packed into one signal
gx, gy, dx, dy, x1, y1, x2, y2, r1, r2, l1, l2, x, c, q, t, l3, r3, sx, sy, sz, srx, sry, srz = osc_data('localhost', 4950)
  • SimpleOSC is maintained, but it’s deceptive in that it is built on pyKit, yet the pyKit examples fail to execute. Its code is messier to set up, and it is unusably slow. Frankly I hate it, but I can’t figure out how to “install” pyKit.
import osc

# initialize the module only once, even if the script is re-run
try:
    osc.state
except AttributeError:
    osc.init()
    osc.inPort = osc.createListener("127.0.0.1", 9001)
    osc.sendMsg('/init', ['done init osc'], "127.0.0.1", 9001)
    osc.state = 1

def testF(*msg):
    print 'msg: ', msg
    print 'msg value is: ', msg[0][2]

# bind the function to an OSC tag; when a message tagged /test arrives, testF fires
osc.bind(testF, '/test')
osc.getOSC(osc.inPort)

I’ve been doing a lot of Python research the past couple of months, and I’ve figured out how to read in long lists of information via OSC. Most of my working files implement code that was done with the game engine, but the question remains: how do I record the values as IPOs?

Well, I figure the first step in that direction is to figure out how to have a Python script loop when you run it (not in the game engine) so its results show up when you press play. I realize this is a sort of “2.5 knows how to do this, but in 2.4 it will just be a headache” situation, so I’m looking for 2.5-specific solutions.

Feel free to jump in with solutions to any of these problems that have had me tearing my hair out.

  • I can’t figure out how to “install” pyKit so that IDLE or Blender can see it on OS X. Windows has been a little more forgiving. Linux I haven’t tried yet, but there are instructions available.

Obviously, as I’m known to do, I’ll keep updating this first post as I make more and more advances.

Oh, and one more thing. It seems that practically everyone who ever got far in this line of research has fallen off the face of the earth!

My solution proposal might be a little sloppy right now, but Ton boasted of 2.5’s support for many multiple controllers when he was interviewed by Bart nearly a year ago, and since then there has been little to no work that I can see on handling analog input binding. Even if OSC isn’t a fix-all, it’s an open-source project that works well and should be explored more. It’s a LOT better than MIDI, that’s for sure.

I figured out why the site-packages folder wasn’t working in OS X. There’s two different ones, and the documentation tells you about the wrong one!

http://code.djangoproject.com/ticket/4331

OSC is very nice and promising; in music production it is intended to replace the MIDI protocol, which became outdated decades ago. Unfortunately MIDI is so widespread that this replacement will take a very long time.

Yep. That about sums it up.

OSC has recently become popular among hobbyist programmers, “makers”, and experimental input enthusiasts because it is so fast. The real tragedy is that most of the software I have encountered over the past few weeks has one of the following problems:

  • can’t send information as a bundled package (Osculator, Multicontrol, Quartz Composer)
  • slows down if information isn’t bundled (pyKit / SimpleOSC)
  • can’t receive information with different tags (pyKit)
  • can’t receive information at all (GlovePIE)

and having just one of these breaks the whole two-link chain. It’s like OSC software developers aren’t even reading how OSC works before putting something out.

An efficient OSC packet looks like this

/controller1 ,ffis  440.0,23.667,1,watermelon
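To make the layout of that packet concrete, here is a rough sketch of how it would be encoded on the wire, per my understanding of OSC 1.0 (the `pad4` and `encode_message` names are mine, not from any library):

```python
import struct

def pad4(b):
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def encode_message(address, typetags, *args):
    # address string, then "," + type tags, then the big-endian arguments
    packet = pad4(address.encode()) + pad4(("," + typetags).encode())
    for tag, arg in zip(typetags, args):
        if tag == "f":
            packet += struct.pack(">f", arg)   # 32-bit float
        elif tag == "i":
            packet += struct.pack(">i", arg)   # 32-bit int
        elif tag == "s":
            packet += pad4(arg.encode())       # padded string
    return packet

msg = encode_message("/controller1", "ffis", 440.0, 23.667, 1, "watermelon")
print(len(msg))  # 48
```

Every field lands on a 4-byte boundary, which is why short addresses and bundled arguments keep packets small.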

If I am not mistaken, there are several extremely serious apps that use OSC already. Ableton Live comes to mind, and I am sure the Lemur, the king of MIDI controllers, uses it as well and is largely based on it.

If OSC can be supported inside Blender (and I see no reason why it can’t), then MIDI controllers using OSC could be used to control 3D graphics in real time, the same way a composer shapes and sculpts sound and music in real time.

This would be revolutionary: the status quo of mouse-driven, offline 3D graphics production could change to online (real-time), OSC-controlled 3D graphics production.

Yes! You’re getting the idea. The mouse is a sort of single-button input that switches software into another state, reading the x/y movement for one thing at a time. The thing that makes MIDI controllers wonderful inside music programs is that you can bind any controller you want to any parameter you want. The Animato system is very conducive to this concept, and Ton mentioned in his interview last year that 2.5 was going to be this sort of beast for taking in joystick, MIDI, or sensor inputs. So far no one has jumped on it, which is a bad thing. The farther along the core of 2.5 gets without any thought as to how multiple controller inputs will be handled, the harder it will be to wedge it in later, I think.

Recording inputs, for example, is something I haven’t been able to wrap my head around. Since animation playback can run independently of the GUI, I worry a little that “time” and “control” will be a little too disconnected. And calibrating inputs is another thing that has to be made easy to do. Most controllers have a 0…1 or -1…1 range, MIDI is 0…127, and OSC can be just about anything.
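Calibration mostly reduces to mapping each device’s native range onto a common one. A minimal sketch (the function names are my own):

```python
def normalize(value, lo, hi):
    # map a raw controller value from its native range onto 0.0..1.0
    return (value - lo) / float(hi - lo)

def denormalize(t, lo, hi):
    # map a 0.0..1.0 value back onto a target parameter's range
    return lo + t * (hi - lo)

# MIDI CC (0..127), a joystick axis (-1..1): both land in the same space
print(normalize(127, 0, 127))     # 1.0
print(normalize(0.0, -1.0, 1.0))  # 0.5
```

With everything normalized, binding any controller to any Blender parameter becomes a single `denormalize` call per target.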

More later…

I was able to use OSC.py directly in the BGE, but its error handling is pathetic. When it’s receiving data it all works nicely, but when the OSC sender stops, the script starts to error out and the framerate drops to 2-3. :s (Actually I use a “try” so I don’t see the errors, but it’s still slow.)

I guess I have to modify some function in OSC.py itself to handle not receiving data better.
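One workaround, rather than patching OSC.py, is to poll the socket in non-blocking mode so a silent sender returns nothing instead of raising and stalling the frame loop. A sketch, assuming Python 3 (the `osc_poll` name is mine):

```python
import socket

def osc_poll(sock, bufsize=1024):
    # non-blocking read: return a waiting datagram, or None immediately,
    # so the game loop keeps its pace when the sender goes quiet
    sock.setblocking(False)
    try:
        data, addr = sock.recvfrom(bufsize)
        return data
    except BlockingIOError:
        return None

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))  # port 0 = let the OS pick a free port
print(osc_poll(receiver))  # no sender yet -> None
```

Calling this once per frame costs almost nothing when no data is waiting, which is exactly the case that tanks the framerate otherwise.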

Actually, that is theoretical: MIDI is 0…127, but it does not always capture the full range. Sometimes it’s an on/off situation, which means 0 or 127, and sometimes only a fraction of the 0-127 values is used, so it makes perfect sense for OSC to be just about anything, since most synthesizers do not have uniformity in their feature parameters.

Of course, because Blender allows for full Python integration, theoretically any Python library could be used for parameter automation, MIDI or OSC. I know there are several MIDI libraries.

My research into OSC has nearly come to a close, and a few people sort of beat me to the punch, if you’ve been following BlenderNation’s links about OSC. However, if you watch this guy’s videos of his cellphone’s tilt interacting with the game engine, you’ll notice that it’s running incredibly slow and choppy. There are also hundreds of lines of code. No good. As far as interacting with the game engine goes, I figured that out months ago, and with a good framerate. The trick is simply packaging your signals into one packet and trimming your numbers’ decimals, or converting your floats to times-a-thousand integers.
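The float-to-integer trick, as I understand it, looks something like this (the scale factor and function names are arbitrary choices of mine):

```python
def pack_floats(values, scale=1000):
    # send floats as times-a-thousand integers to keep packets small;
    # 0.44214987 becomes 442
    return [int(round(v * scale)) for v in values]

def unpack_ints(values, scale=1000):
    # restore the (truncated) floats on the receiving side
    return [v / float(scale) for v in values]

print(pack_floats([0.44214987, -0.25, 1.0]))  # [442, -250, 1000]
```

You lose precision past three decimal places, but for controller axes that resolution is usually more than enough.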

OSC serializes every value in its sequence into bytes, and the fewer bytes there are, the faster the signal can be sent. Therefore, the fewer channels there are, the fewer times it has to write out which channel a signal is coming from. Likewise, you don’t want to use too many different ports either, since you run into the same problem.
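To put rough numbers on that: OSC pads each string to a 4-byte boundary, so bundling everything under one short address is measurably smaller than sending many single-value messages (the addresses here are made up for illustration):

```python
def padded(n):
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return n + 4 - (n % 4)

# 24 floats in one message "/all" with a typetag of "," plus 24 "f"s
bundled = padded(len("/all")) + padded(len("," + "f" * 24)) + 4 * 24
# versus 24 separate messages like "/c01", each carrying one float (",f")
separate = 24 * (padded(len("/c01")) + padded(len(",f")) + 4)
print(bundled, separate)  # 132 384
```

Nearly three times the bytes for the same data, before you even count the per-datagram UDP overhead.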

The guy in the videos said he was using Pure Data, which I can’t seem to wrap my head around. It seemed like the best option for OSC routing since it runs on every platform, but its interface is crap compared to related software like vvvv or especially Max/MSP and Quartz Composer.

The bottom line I came to is that Blender will need to incorporate some sort of similar “patching” system for ease of use. In other words, “controller mapping nodes”, which could take advantage of the already existing Nodes look and feel and simply serve the purpose of taking an input from a controller (from OSC, from MIDI, from keyboard) and connecting it to Animato. Having magical automatic support for Wii remotes can come later. Wii remotes, contrary to popular belief, aren’t even a very good 1-to-1 tracking device, as they lack a true Z-axis accelerometer (something I’m learning the hard way).

So maybe I should dedicate some sort of webpage of my own to this topic. Maybe it’s too big of an idea for just a forum post. This is the sort of thing I would need everybody in the whole community to hear before the right person would agree. I wish I knew anything about Blender’s source. :frowning:

Lots of people use liblo, and there is a python wrapper for it.

http://liblo.sourceforge.net/
http://das.nasophon.de/pyliblo/

Kudos to the effort indeed… Having IPO-to-MIDI, or MIDI-to-IPO, ability in Blender, whether through OSC or otherwise, is something I have long been dreaming of. Of special interest for me is being able to import and export a standard MIDI file to work with IPOs.

Please don’t give up the dreams!

Hi bmud,

there are some examples out there on how to use osc with blender.

Here are some links for further reading:

julian oliver’s blog:
http://selectparks.net/~julian/blog.php?entry=entry050908-212057

one blenderartists thread (i´m aware of):
http://blenderartists.org/forum/showthread.php?t=79914&highlight=osc

It’s a bit aged, but I still have some of the old files I experimented with in my archive. If you’re interested, I’m glad to share them.

Concerning the use of Pd or the like… I think these tools are great for rapid prototyping and for a lab or art piece/performance situation.
The real problem starts when you plan to distribute your development as an app or for Blenderplayer usage.
I don’t see an easy way out of that problem currently.
And I don’t think that, even if wanted, it makes sense to incorporate that functionality into Blender.
The general event loop of the Blender GE, in which all functions are executed, isn’t really a base for real-time DSP of any kind.
Even in the existing solutions using Pd/Max and Blender via OSC, you may notice the problem of dropped messages, because the Blender GE does not run synchronized to any external input, but independently on its own event loop.
To guarantee that every message is processed in the current environment, you would need something more like TIBCO Rendezvous, where you have an internal cache and a queue for incoming messages, so you can be sure every message reaches the client.
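The cache-and-queue idea can be approximated in pure Python with a thread-safe queue: a receiver thread enqueues every datagram, and the GE logic drains the queue once per frame, so nothing is lost between ticks. A sketch under those assumptions (names are mine):

```python
import queue
import threading

inbox = queue.Queue()

def receiver(sock):
    # producer thread: push every incoming datagram into the queue
    while True:
        data, _ = sock.recvfrom(1024)
        inbox.put(data)

def drain():
    # called once per game-engine frame: consume everything queued so far
    msgs = []
    while True:
        try:
            msgs.append(inbox.get_nowait())
        except queue.Empty:
            return msgs

inbox.put(b"/controller1")
print(drain())  # [b'/controller1']
```

A real version would also bound the queue, so a flood of messages can’t outrun the frame rate indefinitely.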

If interested, I can dig up the old examples and try to explain them.

I must confess that, given my lack of coding skills (I haven’t gotten far past “Hello, World!”), this all seems a gargantuan challenge to solve!

It is encouraging nonetheless to know that there are others with much interest in this area, and examples and work shared- past and present (and, hopefully, in future, also)- to study and observe.

I wonder (hopefully I’m not going too far off-topic for this thread) whether working with just MIDI data and Blender somehow, eliminating OSC, Pd, the GE, etc., would be more efficient?

Cheers!

Hello all. I know this thread is a little old, but I’m excited to report that I’ll be presenting my work with OSC at the Blender Conference on Friday, October 23rd, at 2:00pm Amsterdam time.

Since I started this research I’ve made a lot more advances, and I’m pumped to share them with the big shots and get some real feedback. Who knows, maybe I’ll sway a developer into collaborating with me to get true MIDI/OSC input in 2.5. A guy can dream… :slight_smile:

@bmud,
I’m looking forward to hearing further developments of OSC / MIDI.

I did not get to see your presentation at the Blender Conference last year. How did you solve your OSC issues in Blender 2.5?

Thanks!


import bpy

import time
import threading

import socketserver

class MyUDPHandler(socketserver.BaseRequestHandler):
    """
    This class works similar to the TCP handler class, except that
    self.request consists of a pair of data and client socket, and since
    there is no connection the client address must be given explicitly
    when sending data back via sendto().
    """

    def handle(self):
        data = self.request[0].strip()
        socket = self.request[1]
        print("%s wrote:" % self.client_address[0])
        print(data)
        socket.sendto(data.upper(), self.client_address)

HOST, PORT = "localhost", 8004
server = socketserver.UDPServer((HOST, PORT), MyUDPHandler)

class Timer(threading.Thread):
    def __init__(self, seconds):
        self.runTime = seconds
        threading.Thread.__init__(self)

    def run(self):
        # loop instead of calling run() recursively; the recursion
        # would eventually overflow the stack
        while True:
            time.sleep(self.runTime)
            server.handle_request()
  
t = Timer(0.1)
t.start()

This is as far as I ended up getting. Recently I met a guy who’s a better programmer than me, and he said liblo is the way to go. So I guess I’m going to start over. This was a good start, though: it takes care of running the OSC reader in a separate thread. The OSC just isn’t being interpreted, processed, or turned into controls for anything. I’m kind of ashamed to post half-finished work, but I lose steam on this darn project every time I start. It’s hard. :no:

I realize this project is six years old now, but I’m bumping it because there was one poor guy at the ask-a-dev-anything event at the Blender Conference yesterday who said “what about OSC?”, and Ton replied “I only know of one developer doing that”. If that person is me… wow.

Anyway, it got me fired up. And I have an idea of how to make it. It’s a lot simpler than I thought. As I’ve learned over the years, graphics applications that accept OSC input make it easy on themselves by having the controller speak directly to whatever it wants to control, simply interpreting the OSC path as a route through their API. And guess what? Blender already has an API.

So the pseudocode looks like this:

import OSC, bpy
start OSC server
on message received:
    loop through the control messages:
        does this message’s name match an object in the Python API? if yes:
            apply the message’s value to that Python object
Yeah? Sound good? I think we (I with help from you) should try building it and see what happens.

The first thing I want to build with this is a jog-and-shuttle wheel, because I’m fed up with scrubbing the timeline and I’m feeling fancy. :slight_smile: