So, I’ve been experimenting with ways to connect Ableton Live to visual programs, to get tight integration of audio and visuals. Live is great because of the flexibility it offers, and the way it can take all your samples and tracks and time-stretch them so everything stays beat-matched. I wanted to use that power, but Live is audio only – you can’t use it for movie clips. So the ideal setup is to use Ableton for the audio and something else for the visual side of things. In this example I have hooked it up to Modul8.
My first thought was to take the playhead position information from Ableton and pass it across to whichever software was controlling the visual clips, and I waited for Max for Live to be released to try to realise this. Unfortunately I just haven’t found a way of doing it – I’m still convinced there is something deep in the Max for Live API that will allow it, but I haven’t found it yet. It would also require a fairly complicated playback method within the visual program, and I think it will be much more applicable once I move across into something like openFrameworks for this.
What I ended up with works really well, and you can see it in this video I recorded of the process. It’s really simple (which is generally the best way, it seems) and triggers clips when needed. It can’t keep each individual movie clip in time with the audio, but that isn’t necessarily a problem as long as clips are triggered at the right time, i.e. on a breakdown or a certain bar.
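The triggering idea boils down to a simple mapping: a MIDI note sent from Live selects a clip slot in the visual program. A minimal sketch of that mapping, assuming notes from C1 (MIDI note 36) upwards correspond to clip slots (the base note and the one-note-per-clip scheme are illustrative, not what Modul8 requires):

```python
# Hypothetical mapping from MIDI note numbers to visual clip slots.
# Assumes a dedicated MIDI track in Live fires note-ons, with C1
# (note 36) triggering the first clip -- these numbers are a sketch.

BASE_NOTE = 36  # C1 in the common Live convention

def note_to_clip(note):
    """Return the clip slot index for a note-on, or None if below range."""
    if note < BASE_NOTE:
        return None
    return note - BASE_NOTE
```

Because the clips are launched rather than scrubbed, any drift inside a clip is tolerated; the next trigger simply resynchronises things at the bar or breakdown where it fires.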
It can also be expanded vastly, and I’ll be looking to produce a small Max patch which will convert the MIDI note into OSC, as I’ll be able to pack more data into the OSC messages passed to the visual program. Combining this with the generative visual power of oF, and adding audio reactivity, I think I have the basis for a strong AV performance.
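For anyone who would rather prototype the MIDI-to-OSC translation outside Max, the same idea can be sketched in Python. This is only a stand-in for the planned Max patch: the `mido` and `python-osc` libraries and the `/clip/trigger` address scheme are my assumptions, not anything the setup above actually uses.

```python
def note_on_to_osc(msg_type, note, velocity):
    """Translate a MIDI note-on into an (address, args) OSC packet.

    The /clip/trigger address is made up for this sketch; a real patch
    would use whatever address scheme the visual program listens for.
    Note-offs and zero-velocity note-ons are ignored.
    """
    if msg_type == "note_on" and velocity > 0:
        return "/clip/trigger", [note, velocity]
    return None

def run_bridge(osc_host="127.0.0.1", osc_port=9000):
    """Listen on the default MIDI input and forward notes as OSC."""
    # Third-party deps imported here so the translation above stays pure.
    import mido                                   # assumed MIDI library
    from pythonosc.udp_client import SimpleUDPClient  # assumed OSC library

    client = SimpleUDPClient(osc_host, osc_port)
    with mido.open_input() as port:
        for msg in port:
            packet = note_on_to_osc(msg.type, getattr(msg, "note", 0),
                                    getattr(msg, "velocity", 0))
            if packet:
                address, args = packet
                client.send_message(address, args)
```

The nice thing about going through OSC is exactly what the paragraph above suggests: once the bridge exists, extra arguments (bar number, velocity as an intensity value, and so on) can ride along in the same message.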