|
From: Josh G. <jg...@us...> - 2002-11-06 07:32:58
|
Went on a trip for a few days and came back to find my email box full of discussions about a patch loading library. What follows are details concerning my libInstPatch library and its design goals. If you haven't had a look at my API for libInstPatch (http://swami.sourceforge.net), here are my current plans for it. I am very curious how well this project could fit in with the LinuxSampler project (actually I'm also wondering if Swami could become a GUI for it, since I'm planning on fully objectifying the GUI as well; it pretty much is already, just not the interface).

- Using GObject, a C based object system (GStreamer uses this). This gives us an object system with C as the lowest common denominator programming language. GObject also has an easy Python binding code generator that is part of the pygtk package.
- Not attempting to make a unified patch format (sounds like others agree).
- Taking advantage of GObject's typed property system for setting parameters on patches. This provides a somewhat unified API between patch formats. Example of setting parameters on a SoundFont preset in C:

  g_object_set (preset, "name", "Preset Foo", "bank", 0, "psetnum", 4, NULL);

- Patch formats are broken out into individual GObjects (IPatchSFont, IPatchSFPreset, IPatchSFInst, IPatchSFZone and IPatchSFSample for SoundFont banks, for example) and organized in a parent/child tree structure where appropriate.
- Multi-threaded patch objects.
- A sample store object that provides pluggable methods of storing sample data (RAM, swap file, etc.).

I believe this design will provide a rather flexible patch file library, and other formats can be easily added to it. Some things to think about:

- My new libInstPatch actually needs a bit of testing and debugging, as I just recently completed the API and have not brought the new development version of Swami up to date to actually use it.
- The loading/saving API is not flexible enough (it hasn't been re-written yet for libInstPatch, which is now no longer SoundFont centric).
- How well would the object system libInstPatch uses fit with a real time synthesizer? What kind of system could be used to associate synthesizer time-critical data with patch objects?
- Multi-threaded objects: while they make server/multiple-client architectures possible, they also add excess locking requirements. For example, all object lists must be locked before iterating over them in libInstPatch, unless you use the user iterator routines, which make a copy of object lists.
- I decided not to deal with converting audio data in libInstPatch, although I'm sure this will be necessary as more formats are added. Perhaps a sample library like libsndfile could be used for such things; I know the author is interested in this stuff.

My library has pretty much been designed around the idea of maximum flexibility in the patch file realm, acting like a patch server that can have multiple clients editing the same patches (think distributed patch editing sessions, between programs or computers). I have not really thought about the real time constraints required by soft synths. It may be that my design cannot incorporate what's required by LinuxSampler, but I would really like it to. This project sounds really exciting. I'm just hoping I can keep up with the massive amount of email these discussions involve (and actually get some programming done at the same time :) Cheers. Josh Green |
|
From: Steve H. <S.W...@ec...> - 2002-11-06 14:18:58
|
On Tue, Nov 05, 2002 at 11:35:24PM -0800, Josh Green wrote:
> - Using GObject, a C based object system (GStreamer uses this). This
> gives us an object system with C as the lowest common denominator
> programming language. GObject also has an easy Python binding code
> generator that is part of the pygtk package.

Cool. Does GObject have C++ bindings for the C++ purists here ;)

> - How well would the object system libInstPatch uses fit with a real
> time synthesizer? What kind of system could be used to associate
> synthesizer time-critical data with patch objects?

It would probably have to be abstracted in some way. The deal with RT-ish audio software is that you get handed a buffer big enough for 64 samples (for example), and you have <<1 ms to fill it and then return. This means: no malloc, no file I/O, no serious data traversal. I would imagine that most engines will pre-prepare large linear buffers of the audio they're most likely to be required to play; then, when their RT callback gets called, they can just copy from the prepared buffers (maybe applying post-processing effects if they have RT controls) and then return. If the post-processing only contains effects that are controlled by time (static envelopes, retriggered LFOs, etc.), not by user interaction (MIDI, sliders, dynamic envelopes, whatever), then they could be applied ahead of time. But maybe that is too special a case. - Steve |
|
From: Josh G. <jg...@us...> - 2002-11-06 17:33:23
|
On Wed, 2002-11-06 at 06:18, Steve Harris wrote:
> Cool. Does GObject have C++ bindings for the C++ purists here ;)

GObject is part of glib 2.0 and is used by GTK+ 2.0. Therefore gtkmm (the C++ bindings for GTK) contains GObject bindings as well. I don't know the details of this, though, or whether the GObject stuff can be used by itself without the GTK dependency.

> It would probably have to be abstracted in some way. The deal with RT-ish
> audio software is that you get handed a buffer big enough for 64 samples
> (for example), and you have <<1 ms to fill it and then return. This means:
> no malloc, no file I/O, no serious data traversal.

I am also pretty familiar with the requirements of real time synthesis; that is why I was posing the question. Since patch parameters and objects are multi-thread locked for various operations, they might not be very friendly to real time situations. That is why I was trying to think of some sort of parameter caching system, where each active preset has its own parameter space that the synth has direct access to and can define its own values in (it would need this for its internal variables anyway). This could then be synchronized with the object system. I see it as a trade-off between patch parameter flexibility and speed. Creating two systems that synchronize with each other could give us the best of both worlds.
> I would imagine that most engines will pre-prepare large linear sample
> buffers of the audio they're most likely to be required to play, then
> when their RT callback gets called they can just copy from the prepared
> buffers (maybe apply post-processing effects if they have RT controls)
> and then return.

Streaming from disk is, I think, the answer. Of course small samples could be cached in RAM (Linux does this for us to some extent anyway). What I'm concerned with currently is the patch information and parameters, and the real time response of querying them (which is what the synth engine would be doing).

> If the post-processing only contains effects that are controlled by
> time (static envelopes, retriggered LFOs etc.), not by user interaction
> (MIDI, sliders, dynamic envelopes, whatever) then they could be applied
> ahead of time. But maybe that is too special a case.

Well, I suppose there might be some optimization that could occur when there aren't any real time controls connected to a synthesis parameter. SoundFont uses modulators to connect parameters to MIDI controls.

BTW, if anyone has yet to try my program Swami (http://swami.sourceforge.net) you may want to do so :) It currently uses iiwusynth as its software synthesizer, which has a lot of the features we are talking about but uses SoundFont files as its base synthesis format. Modulator support is one of the things already working in iiwusynth (even loop points can now be modulated, though only in CVS of Swami and iiwusynth). The underlying architecture of Swami is all GObject and, in my opinion, very flexible and powerful (plugins, wavetable objects, etc.). I have many plans to create instrument patch oriented applications with it (Python binding, online web patch database, multi-peer jamming on LAN/internet). Of course much of this may fit well with the goals of LinuxSampler.
iiwusynth (now called FluidSynth) may also have much to offer, although I'm not sure if the authors know about this project yet. I'll shoot an email over to the iiwusynth devel list. It would be a shame for all of us to re-invent the wheel and then find that the wheels don't even work together :) Cheers. Josh Green |
|
From: Steve H. <S.W...@ec...> - 2002-11-06 17:44:18
|
On Wed, Nov 06, 2002 at 09:35:33AM -0800, Josh Green wrote:
> peer jamming on LAN/internet). Of course much of this may fit well with
> the goals of LinuxSampler. iiwusynth (now called FluidSynth) may also
> have much to offer, although I'm not sure if the authors know about
> this project yet. I'll shoot an email over to the iiwusynth devel list.
> It would be a shame for all of us to re-invent the wheel and then find
> that the wheels don't even work together :) Cheers.

Interesting. Maybe the question should be whether it's easier to add disk streaming to FluidSynth. I played with iiwusynth a few months back and it seemed very impressive. - Steve |
|
From: Josh G. <jg...@us...> - 2002-11-07 05:59:46
|
On Wed, 2002-11-06 at 09:44, Steve Harris wrote:
> Interesting. Maybe the question should be whether it's easier to add disk
> streaming to FluidSynth. I played with iiwusynth a few months back and it
> seemed very impressive.
>
> - Steve

FluidSynth has been a little quiet for a while (as has my own project), but things seem to be picking up again. One of the developers, Markus Nentwig, is doing lots of optimization work. We were just recently making a list of future plans, one of them being sample streaming support (including disk streaming). I really think that Swami/FluidSynth are quite nice and that a lot of people are missing out by not trying them :) Cheers. Josh Green |