Bassic Dave wrote:
forge wrote:
how is this useful to people that don't have controllers that need OSC or clip names etc?
I have a clunky old UC33e and that's all.
Is there any big fun thing you can do with it that I couldn't before?
This is my question too. Something in my gut tells me this is really, really good, but I can't see what I could do with OSC or any of this that I couldn't do with standard MIDI. I know I'm missing something; Live being open source has to give us something extra special, I just can't see what.
Can anyone give an explanation for those of us that don't get it yet?
Thanks in advance! =]
1A MIDI can only send 2 data bytes per message, and each of those is limited to 7 bits.
1B OSC can send arbitrary blobs, floats, ints, and strings -- as well as arrays of those values. Want to trigger track 0, sample 5? Use the address /ableton/clip and send it the values (0, 5).
2A MIDI has a limited set of 'addresses': it can send to 16 channels, and within those 16 channels only a pretty small set of message types: CC, Note-on/Note-off, Sysex, etc.
2B OSC has an open-ended addressing system that lets you direct any message to any callback you wish.
3A MIDI is primarily a wire-based protocol; some wireless solutions exist, but to the best of my knowledge all are proprietary and none are standard.
3B OSC can be used over UDP or TCP to allow applications to speak over a network. It has a published spec and uses network protocols that virtually every desktop in the world (and in space) uses. Imagine you're performing with Live and you want to engage your audience: you could publish 'participant' OSC addresses and allow devices to modify those settings (within specified ranges), or you could tie envelopes or device settings to lighting, video triggers, cellphones speaking Bluetooth, PSPs, microphones, etc. Sure, you can do that with MIDI, but it'd probably take a little soldering or a lot of glue code.
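To make 1B and 3B concrete, here's a minimal sketch of how an OSC message is laid out on the wire, following the published OSC 1.0 spec. The /ableton/clip address is the example from above; osc_message and osc_pad are just illustrative helper names, not part of any real library:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Pad to a 4-byte boundary, as the OSC spec requires."""
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, *args: int) -> bytes:
    """Encode an OSC message carrying int32 arguments."""
    # Address pattern: ASCII string, null-terminated, padded to 4 bytes.
    packet = osc_pad(address.encode() + b"\x00")
    # Type tag string: ',' followed by one 'i' per int argument.
    packet += osc_pad(("," + "i" * len(args)).encode() + b"\x00")
    # Arguments: big-endian 32-bit ints.
    for a in args:
        packet += struct.pack(">i", a)
    return packet

# Trigger track 0, clip slot 5 -- the (0, 5) example from above.
pkt = osc_message("/ableton/clip", 0, 5)
```

The resulting bytes can be handed straight to a UDP socket's sendto(), which is all it takes to drive a listening application over the network.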
The trick here is to detach yourself from thinking MIDI: think combos, think interactions, think progressive organic behavior that you control using an object-oriented language, rather than being limited to the 'on/off' of a note or 1-127 on a CC.
And finally, OSC isn't really 'the big deal' about LiveAPI -- the Python API access is. OSC is basically an example of how powerful this interface can be: we shimmed in a network transport layer to allow remote control. Just imagine what you'll be able to do.
On the ho-hum side of things, you could recolor your clips automatically if you wanted: all bass immediately becomes green, all leads red, etc. You could do name matching on that type of stuff. Or you could 'tie' knobs together: turn one dial from 1-127 and another dial goes from 127-1, inversely. If that kind of stuff doesn't turn you on and you don't do performance stuff, then this probably isn't a big deal for you -- yet.
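A rough sketch of those two ideas in plain Python. COLOR_RULES, color_for_clip, and invert_cc are made-up names for illustration only; wiring them up to actual Live clips and CCs is what the API itself is for:

```python
COLOR_RULES = {"bass": "green", "lead": "red"}  # name fragment -> clip color

def color_for_clip(name):
    """Name-match a clip to a color, e.g. 'Sub Bass 2' -> 'green'."""
    lowered = name.lower()
    for keyword, color in COLOR_RULES.items():
        if keyword in lowered:
            return color
    return None  # no rule matched; leave the clip color alone

def invert_cc(value):
    """Tie two dials inversely: as one sweeps 1 -> 127, the other sweeps 127 -> 1."""
    return max(0, min(127, 128 - value))
```

The point isn't these ten lines; it's that this kind of logic used to require a hardware mod or a MIDI translator app, and now it's a few lines of script.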
I've been dreaming up ideas since the release that have nothing to do with playing music, but are more about automating processes within Ableton to save me time. But that's just what _I_ want. You've got the power to make what _you_ want.
That's the big deal.
Once this post-release rush of communication and documentation is over, we'll have some examples that'll blow your socks off. Right now I can understand how it's only the gear-heads getting excited, but pretty soon you're going to be seeing a lot of practical applications that'll get the average user excited!
-Nate