From: Josh G. <jg...@us...> - 2002-11-06 17:33:23
On Wed, 2002-11-06 at 06:18, Steve Harris wrote:
> On Tue, Nov 05, 2002 at 11:35:24PM -0800, Josh Green wrote:
> > - Using GObject, a C based object system (GStreamer uses this). This
> > gives us an object system with C as the lowest common denominator
> > programming language. GObject also has an easy Python binding code
> > generator that is part of the pygtk package.
>
> Cool. Does GObject have C++ bindings for the C++ nazis here ;)

GObject is part of glib 2.0 and is used by GTK+ 2.0. Therefore gtkmm (the
C++ bindings for GTK) contains GObject bindings as well. I don't know the
details of this, though, or whether the GObject code can be used on its own
without the GTK dependency.

> > - How well would the object system libInstPatch uses fit with a real
> > time synthesizer? What kind of system could be used to associate
> > synthesizer time critical data with patch objects?
>
> It would probably have to be abstracted in some way. The deal with RT-ish
> audio software is you get handed a buffer big enough for 64 samples (for
> example), and you have <<1 ms to fill it and then return. This means no
> malloc, no file I/O, no serious data traversal.

I am also pretty familiar with the requirements of real time synthesis;
that is why I was posing that question. Since patch parameters and objects
are multi-thread locked for various operations, they might not be very
friendly to real time use. That is why I was trying to think of some sort
of parameter caching system, where each active preset has its own parameter
space that the synth has direct access to and can define its own values in
(it would need this for its internal variables anyway). This could then be
synchronized with the object system. I see it as a trade-off between patch
parameter flexibility and speed; creating two systems that synchronize with
each other could give us the best of both worlds.
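To make the parameter cache idea concrete, here is a minimal C sketch (names and structure are hypothetical, not libInstPatch or iiwusynth API) of one way to do it: the object system edits a shadow copy of a preset's flattened parameters under its usual locks, then publishes it to the synth with an atomic pointer swap, so the RT callback reads parameters without taking a mutex or calling malloc. For brevity the sketch ignores the reclamation race where a reader could still be touching the old buffer while it is being rewritten; a real implementation would need a safe-reclamation scheme.

```c
#include <stdatomic.h>
#include <string.h>

#define N_PARAMS 64

/* Immutable-once-published snapshot of one preset's synthesis parameters. */
typedef struct {
    float values[N_PARAMS];
} ParamSnapshot;

typedef struct {
    _Atomic(ParamSnapshot *) active; /* read lock-free by the RT thread */
    ParamSnapshot bufs[2];           /* preallocated: no malloc after init */
    int edit;                        /* index of the non-active (shadow) buffer */
} ParamCache;

void param_cache_init(ParamCache *c)
{
    memset(c->bufs, 0, sizeof(c->bufs));
    atomic_store(&c->active, &c->bufs[0]);
    c->edit = 1;
}

/* Object-system thread: copy new values into the shadow buffer, then
 * publish it; the previously active buffer becomes the next shadow. */
void param_cache_publish(ParamCache *c, const float *new_values)
{
    ParamSnapshot *shadow = &c->bufs[c->edit];
    memcpy(shadow->values, new_values, sizeof(shadow->values));
    ParamSnapshot *old = atomic_exchange(&c->active, shadow);
    c->edit = (old == &c->bufs[0]) ? 0 : 1;
}

/* RT thread: lock-free, allocation-free read of one parameter. */
float param_cache_get(ParamCache *c, int i)
{
    return atomic_load(&c->active)->values[i];
}
```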
> I would imagine that most engines will pre-prepare large linear sample
> buffers of the audio that they're most likely to be required to play.
> Then when their RT callback gets called they can just copy from the
> prepared buffers (maybe applying post processing effects if they have RT
> controls) and then return.

Streaming from disk is, I think, the answer. Of course, small samples could
be cached in RAM (Linux does this for us to some extent anyway). What I'm
concerned with currently is the patch information and parameters and the
real time response in querying them (which is what the synth engine would
be doing).

> If the post processing only contains effects that are controlled by
> time (static envelopes, retriggered LFOs, etc.), not by user interaction
> (MIDI, sliders, dynamic envelopes, whatever), then they could be applied
> ahead of time. But maybe that is too special a case.

Well, I suppose some optimization could occur when there aren't any real
time controls connected to a synthesis parameter. SoundFont uses modulators
to connect parameters to MIDI controls.

> - Steve

BTW, if anyone has yet to try my program Swami
(http://swami.sourceforge.net), you may want to do so :) It currently uses
iiwusynth as its software synthesizer, which has a lot of the features we
are talking about, but uses SoundFont files as its base synthesis format.
Modulator support is one of the things already working in iiwusynth (even
loop points can now be modulated, though only in CVS of Swami and
iiwusynth). The underlying architecture of Swami is all GObject and, in my
opinion, very flexible and powerful (plugins, wavetable objects, etc.). I
have many plans to create instrument patch oriented applications with it
(Python bindings, an online web patch database, multi-peer jamming over
LAN/internet). Of course, much of this may fit well with the goals of
LinuxSampler.
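For those unfamiliar with SoundFont modulators, the gist is: a source (e.g. a MIDI continuous controller), an amount, and a destination synthesis parameter. A rough C sketch follows (illustrative names only, not the SoundFont spec's exact transform/curve machinery nor any real engine's API): the controller value is normalized to 0..1, scaled by the amount, and added to the parameter's base value. Evaluating this per voice per audio block is cheap enough to be RT-friendly.

```c
/* Which synthesis parameter a modulator offsets (illustrative subset). */
typedef enum { DEST_ATTENUATION, DEST_FILTER_CUTOFF, DEST_LOOP_START } ModDest;

typedef struct {
    int     midi_cc;  /* source: MIDI continuous controller number (0-127) */
    float   amount;   /* offset, in destination units, at full deflection */
    ModDest dest;     /* destination synthesis parameter */
} Modulator;

/* Sum the contributions of all modulators targeting 'dest' onto the
 * parameter's base value, given the current MIDI controller state. */
float mod_apply(const Modulator *mods, int n_mods, ModDest dest,
                float base_value, const unsigned char cc_values[128])
{
    float value = base_value;
    for (int i = 0; i < n_mods; i++) {
        if (mods[i].dest == dest)
            value += mods[i].amount * (cc_values[mods[i].midi_cc] / 127.0f);
    }
    return value;
}
```

With this shape, "loop points can be modulated" just means DEST_LOOP_START is a legal destination like any other; the engine re-reads the modulated value when the voice loops.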
iiwusynth (now called FluidSynth) may also have much to offer, although I'm
not sure if the authors know about this project yet. I'll shoot an email
over to the iiwusynth devel list. It would be a shame for all of us to
re-invent the wheel and then find that the wheels don't even work together :)

Cheers,
	Josh Green