Thread: [Bluemusic-users] Realtime and blue
Brought to you by:
kunstmusik
From: Ken <ren...@ea...> - 2006-10-08 17:12:50
Hi Steven,

This is not so much a feature request as a question about the direction you are taking with blue. I have recently come from the Windows world, where I played with most of the audio software out there, but because of my present beliefs I can no longer support the commercial model, having moved to Linux now, once again, and hopefully for good. In using that software I found the features that I like and that work for me as a musician. I wish to focus on using Csound as my main synthesis engine, and am now contemplating whether to build my own composition application using the API, use blue, or use another environment like Common Music, eventually creating a GUI for the two. So I have a few questions about blue and where you are taking it, to see if it fits into my plans and desires. I really like all you have done and would create something similar in terms of the GUI, instrument GUIs, and the like.

- Recently there was talk of MIDI recording. Does this mean MIDI-controlled instruments, display, and recording of input MIDI streams?

- Would it be possible to have Python code that generates MIDI note streams, which in turn route to MIDI instruments? I am thinking that incoming MIDI would act as a toggle for processes (on, off), or as pitch input to live processes.

- Would it be possible to have, say, Python code affect the incoming MIDI streams like the note processors, but acting in realtime on live incoming streams?

- What about MIDI control of the various GUI components, like the mixer or synth parameters, with a MIDI-learn facility?

My two favorite apps in the Windows world were Ableton Live, for the live improvisatory feel, and Kontakt 2. Kontakt had an interesting scripting engine that would affect incoming MIDI as well as generate MIDI, with arpeggiators etc. I am looking to create the same, and am definitely interested in realtime, algorithmic, improvisatory, exploratory composition methods. I am interested in your thoughts and plans on any of this.
Thanks, Ken
From: Steven Y. <ste...@gm...> - 2006-10-08 21:43:11
Hi Ken,

Thanks for your questions; there's a lot here to discuss, so I'll reply inline:

> - Recently there was talk of MIDI recording. Does this mean MIDI-controlled
> instruments, display, and recording of input MIDI streams?

I was actually looking into MIDI recording today! It was strangely quick: with just a bit of work I have a test program that records the input from my MIDI keyboard, looking at note-ons and note-offs and writing Csound SCO text. So recording the information is doable. What I'm going to look at next is how to handle this in terms of Csound. I can record the information, create SCO statements, and then pass them on to a running instance of Csound to synthesize. I think I even know what to do with fractional instrument numbers (one for each note in MIDI (0-127), plus a safe buffer so as not to interfere with existing fractional instrument numbers) and a special blue instrument to turn off those instruments using a turnoff2 statement (so what Csound gets and what blue "records" for the user is slightly different, to handle Csound performing score statements in realtime). This has complications in terms of p3-dependent information, so it may require some programming guidelines for instruments in that case.

There's also implementing a mono mode to think of; I can do that fairly easily and create SCO that will be all tied notes until all keys are off, but how that works with getting Csound to play it is another story. Also, there is the biggest issue: how to have a running Csound instance synthesize realtime notes while another Csound instance performs the score from the timeline, and whether they should be the same instance. If not, it's a mess, as always-on instruments can be taxing on CPU, and if you have two copies of your project running, one to synthesize realtime and one to synthesize the score, then it's a bit of a waste. If they're in the same instance, there are issues too.
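To make the recording step concrete, here is a minimal sketch of turning note-on/note-off pairs into Csound SCO text with one fractional instrument number per MIDI key, as described above. It is purely illustrative: note events are assumed to arrive as simple (time, kind, key, velocity) tuples rather than through any real MIDI API, and the function name, the offset buffer, and the divide-by-1000 fraction scheme are hypothetical choices, not blue's actual implementation.

```python
def record_sco(events, base_instr=1, offset=100):
    """Convert MIDI note events into Csound SCO 'i' statements.

    events: iterable of (time_sec, kind, key, velocity) tuples,
    where kind is 'on' or 'off'. Each MIDI key (0-127) gets its own
    fractional instrument number, base_instr + (key + offset)/1000,
    so key 60 becomes e.g. i1.160; the offset is a safety buffer to
    avoid colliding with fractional instruments already in a project.
    """
    starts = {}   # key -> (start_time, velocity) for currently-held notes
    lines = []
    for t, kind, key, vel in sorted(events):
        instr = base_instr + (key + offset) / 1000.0
        if kind == 'on':
            starts[key] = (t, vel)
        elif kind == 'off' and key in starts:
            t0, v0 = starts.pop(key)
            # i <instr> <start> <duration> <key> <velocity>
            lines.append("i%.3f %g %g %d %d" % (instr, t0, t - t0, key, v0))
    return "\n".join(lines)
```

For realtime use, Steven's scheme would instead emit indefinite notes (negative p3) at note-on and a turnoff2 event at note-off; the fixed-duration form above is what would be "recorded" for the user.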
Some of this comes down to a mismatch between what Csound is now and what it isn't. As it is now, it's not a program set up for realtime modification of its internal data, so you couldn't, say, change instruments or modify the parsed score in a running instance of Csound. Technically, that is what you would need to match the experience in MIDI-based environments (Cubase, Logic, etc.). So, in terms of all of this, there might be some complications or limitations to accept. Now, if we do something like make blueLive a live laboratory for music making, have that be where you record MIDI, work with soundObjects in realtime, and do filtering/modifications to score, and we *don't* have it also synced up to the main timeline, then I don't think that would be a problem at all. After today's experiments, I am thinking that might be the easiest way to go, and that in doing so I can make it fairly full-featured. Whether to have it involved with the main timeline recording as well, I don't know yet. I still have to think it through.

> - Would it be possible to have Python code that generates MIDI note
> streams, which in turn route to MIDI instruments? I am thinking that
> incoming MIDI would act as a toggle for processes (on, off), or as pitch
> input to live processes.

I think this could be done. I am not sure yet how to approach it, as for me live work has mostly been not an end but a means to experiment, with a fixed score ultimately being what I work with. Live music production has not been a focus for blue and is not a compositional concern of mine, so it would require others to help me determine what the desirable features are.

> - Would it be possible to have, say, Python code affect the incoming MIDI
> streams like the note processors, but acting in realtime on live
> incoming streams?

Yes; we'd need to create some guidelines as to when Python code runs (on note-on, on note-off) and whether threads are going to be used, but I think it could work.
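The note-processor idea for live streams could look something like the following sketch: a chain of small Python functions, each applied to an incoming note before it reaches an instrument. All names here are hypothetical, invented for illustration; nothing below is blue's actual note-processor API.

```python
def transpose(semitones):
    """Return a filter that shifts the MIDI key by a fixed interval."""
    def f(key, vel):
        return key + semitones, vel
    return f

def clamp_velocity(lo, hi):
    """Return a filter that limits velocity to the range [lo, hi]."""
    def f(key, vel):
        return key, max(lo, min(hi, vel))
    return f

def process(chain, key, vel):
    """Run one incoming (key, velocity) note-on through every filter in order."""
    for f in chain:
        key, vel = f(key, vel)
    return key, vel

# Example chain: up an octave, then tame extreme velocities.
chain = [transpose(12), clamp_velocity(20, 100)]
```

The open questions Steven raises (when the code runs, threading) sit outside this sketch: in a real system each `process` call would have to happen on the MIDI input callback or a dedicated thread, with latency and ordering guarantees agreed on first.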
There might be complications, but the rules to work those out can be determined here on the list collaboratively.

> - What about MIDI control of the various GUI components, like the mixer
> or synth parameters, with a MIDI-learn facility?

Tricky; it's technically possible, but then again come the issues of how to work with Csound instances and whatnot. This also touches on parameter automation, which I've wanted to look into for a while. What I have considered lightly in the past is adding some meta-information to instruments in blue. One piece would be a MIDI note template to use for MIDI note data. The other would be the ability to query automatable parameters. With the latter, we could set things up such that if blueLive is running, we record incoming control data and modify the parameters in the interface as well. Changing a parameter live is no problem, but recording continuous data would be tricky. Again there is the problem of multiple instances and what to do with existing data from the score timeline.

> My two favorite apps in the Windows world were Ableton Live, for the live
> improvisatory feel, and Kontakt 2. Kontakt had an interesting
> scripting engine that would affect incoming MIDI as well as generate
> MIDI, with arpeggiators etc. I am looking to create the same, and am
> definitely interested in realtime, algorithmic, improvisatory,
> exploratory composition methods.

Well, as mentioned before, my own focus for blue thus far has mostly not been realtime work as an end in itself. I am becoming more interested in developing blue's realtime capabilities, but I am not sure how far that will go, or even what all the possibilities are. It's a slightly different music-making space and it has a number of technical complications. Also, there's another issue at hand, which is that I have thus far resisted using the Csound API.
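The core of a MIDI-learn facility is small enough to sketch: arm a parameter, bind it to the next控 incoming CC message, and scale later CC values into the parameter's range. The class and method names below are hypothetical, and real integration would have to solve exactly the Csound-instance and recording problems discussed above; this only shows the binding logic.

```python
class MidiLearn:
    """Minimal MIDI-learn: bind the next incoming CC to an armed parameter."""

    def __init__(self):
        self.bindings = {}  # (channel, cc_number) -> (param_name, lo, hi)
        self.armed = None   # parameter waiting for its first CC message

    def arm(self, param_name, lo, hi):
        """User clicked 'learn' on a parameter with range [lo, hi]."""
        self.armed = (param_name, lo, hi)

    def on_cc(self, channel, cc, value):
        """Handle an incoming control-change message (value 0-127).

        If a parameter is armed, bind it to this (channel, cc) source.
        Returns (param_name, scaled_value) for a bound source, else None.
        """
        src = (channel, cc)
        if self.armed is not None:
            self.bindings[src] = self.armed
            self.armed = None
        if src in self.bindings:
            name, lo, hi = self.bindings[src]
            return name, lo + (hi - lo) * value / 127.0
        return None
```

Unbound CC messages are simply ignored, which matches how most MIDI-learn implementations behave before any mapping exists.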
Some of the things mentioned here would be easier to implement using the API, but it also introduces a number of complications (packaging, installation, stability). Also, blueLive does not work on Windows, as line events do not work on that platform. Perhaps I should start using the API as a project to get line events working on Windows (which I know would work) and then gradually move on to using the API more. (Which reminds me that I need to experiment with using the API optionally at runtime...)

It's a lot to think about. It's something that I would want to make sure is done right the first time and will work in as many ways and contexts as possible. Realistically, there are a lot of general ideas here that will need to be ironed out into concrete details and broken up into smaller, palatable mini-projects for me to be able to focus on them. I'd love to hear ideas from everyone, in as much detail as possible, on how MIDI should work within blue, so we can come up with a set of requirements to work from.

That's a long reply with a lot of information, so hopefully it makes some kind of sense and explains what's going on in my mind regarding all of this.

Thanks,
steven
From: Ken <ren...@ea...> - 2006-10-11 13:56:19
Thanks for the detailed reply, Steven. It'll be interesting to see what replies come in. I will stay in touch about all this. I realize that blue is designed primarily for non-realtime composition rather than MIDI. I have been playing around with Common Music quite a bit, and will continue to try to push ahead with what I've been doing there, as well as staying in touch with blue and its development. Of course, at the present moment all of this would be on your head, as I have no Java skills, nor Python, though I'd be willing to dig in if I decide to go that route. Primarily I wanted to check in and see what your future plans are. All in all, it's a lot to think about, as you said.

Ken

Steven Yi wrote:
> Hi Ken,
>
> Thanks for your questions; there's a lot here to discuss, so I'll reply inline:
>
>> - Recently there was talk of MIDI recording. Does this mean MIDI-controlled
>> instruments, display, and recording of input MIDI streams?
>
> I was actually looking into MIDI recording today! It was
> strangely quick: with just a bit of work I have a test program that
> records the input from my MIDI keyboard, looking at note-ons and
> note-offs and writing Csound SCO text.
From: Steven Y. <ste...@gm...> - 2006-10-11 14:05:29
Hi Ken,

Thanks for your response. If you find yourself interested in getting into blue and Java programming, I'd be more than happy to help out. Also, feel free to ask any questions regarding ideas you may have, as I feel I have a pretty good grasp of Csound and how it can be used, as well as of what's already been done, so I can point to other projects that might have working code.

Best of luck in your work!
steven

On 10/11/06, Ken <ren...@ea...> wrote:
> Thanks for the detailed reply, Steven. It'll be interesting to see what
> replies come in. I will stay in touch about all this. I realize that
> blue is designed primarily for non-realtime composition rather than
> MIDI. I have been playing around with Common Music quite a bit, and
> will continue to try to push ahead with what I've been doing there, as
> well as staying in touch with blue and its development. Of course, at
> the present moment all of this would be on your head, as I have no Java
> skills, nor Python, though I'd be willing to dig in if I decide to go
> that route. Primarily I wanted to check in and see what your future
> plans are. All in all, it's a lot to think about, as you said.
> Ken
>
> _______________________________________________
> Bluemusic-users mailing list
> Blu...@li...
> https://lists.sourceforge.net/lists/listinfo/bluemusic-users