christian - thanks for making me smile with that last post.
i just wanted to add another possible benefit of this 'workaround' (it is getting more groundbreaking all the time) - it might solve this:
Quote:Yeah, it has always been like this. Many tears have been shed over this issue. It is essentially this what keeps folks from using Logic as an absolute killer VSTi host for realtime on-stage work.
because now you *might* finally be able to use logic live as a killer VSTi host.
just load up multiple songs and layer the parts you want via the same channel in each song (so, if you want to play ES-2 with Atmosphere and Reaktor, just load ES-2 into AI #1 of song 1, Atmosphere into AI #1 of song 2, and Reaktor into AI #1 of song 3). you can then play them together from one master keyboard and, possibly (probably, i think), with no latency.
the reason i say possibly is that i think it works without latency, but i'm not at the office to test. when i was playing multiple AIs in different songs at once yesterday (and trying, as i mentioned above, to *not* play them at the same time), i didn't hear any latency like i used to with the I/O plug-in trick. even without playback engaged, it seemed to me that all selected AI channels played at the same time. at least that is how i remember it. i will test it when i get back to the studio.
so for live performance, just create many 'multi instruments' across several songs and use something like the logic control to switch between AI channels (thus switching between multi-instruments) in real time. you could have the above ES-2, atmosphere and reaktor on the three AI #1s, then reaktor, stylus and EXS-24 on the AI #2s, and vanguard, FM-7 and EXS-24 on the AI #3s, etc.
and you could assign each song its own midi channel (all AIs in song 1 respond to midi channel 1, all AIs in song 2 respond to midi channel 2, etc.), so your master controller could play them all together by sending midi out on every channel, or play them individually by sending out only a specific channel.
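just to make the channel idea concrete, here's a rough sketch in python using the mido library - purely illustrative, nothing logic-specific, and the port name is made up - showing how a master controller could either layer every song by sending the same note on all channels, or address one song by sending on its channel only:

# illustrative only: route one keyboard to several 'songs' by midi channel
# assumes python + the mido library; the output port name is hypothetical
import mido

out = mido.open_output('IAC Driver Bus 1')  # hypothetical virtual midi port

SONG_CHANNELS = [0, 1, 2]  # song 1 -> ch 1, song 2 -> ch 2, song 3 -> ch 3 (0-based in mido)

def play_layered(note, velocity=100):
    # 'layer' mode: the same note goes out on every song's channel at once
    for ch in SONG_CHANNELS:
        out.send(mido.Message('note_on', channel=ch, note=note, velocity=velocity))

def play_single(note, song_index, velocity=100):
    # 'solo' mode: only the chosen song's channel gets the note
    out.send(mido.Message('note_on', channel=SONG_CHANNELS[song_index], note=note, velocity=velocity))

play_layered(60)    # all three songs sound middle C together
play_single(60, 1)  # only song 2's instruments respond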
of course, the potential also exists to use multiple live inputs on the same audio channel in different songs - a guitar could be routed 'live' to input 1 of audio channel 1 in song 1, a bass to audio channel 1 of song 2, drums to audio channel 1 of song 3, etc., and each would have access to its own tempo-synced effects.
again, i haven't tested this 'live' potential and i'm only free-thinking here, but the potential is there to finally use logic (multiple logics, actually) for true live performance.
of course, this will require a lot of RAM and CPU. now where are those G5 powerbooks?
all in all, very exciting, i think.
cheers