From: Tom B. (Tehom) <te...@pa...> - 2012-01-18 19:59:04
I've gotten a fair bit further in logical instruments. First, I've put the major things back right: preview notes play again, percussion is treated right, and the metronome plays on the right instrument. But there are any number of smaller odds and ends I still have to change over. The major thing still to do is to give instruments optional fixed channels, per earlier discussion with Michael.

****** Pre-existing bugs squashed

Serendipitously, I also found and squashed several pre-existing bugs.

******* The non-playing trigger bug

You know that subtle bug where sometimes a segment with triggered segments wouldn't play until it was moved? I found it and squashed it. I will spare you the gory details and the blow-by-blow of how I found it, but it was a tough one.

Boiled down: SegmentMapper::dump (which is InternalSegmentMapper::dump in the branch I'm working on) resizes the buffer when it gets triggered segments, but doesn't set the fill size (setBufferFill) until it's done. That means that resizeBuffer copies too little. This smashes the buffer, and instead of the real events you get a spate of events at time 0.0 playing on NO_TRACK. Fixed.

******* Instrument being deleted

Instruments could be deleted while objects still pointed to them. That includes MatrixWidget, InstrumentParameterPanel, InstrumentAliasButton, RoseXmlHandler, AudioRouteMenu, and my ControllerSearch and ChannelManager. This can cause segmentation faults.

I added two signals, "destroyed" (implied by QObject) and "wholeDeviceDestroyed", and connected them to the various affected classes, except that I left ControllerSearch and RoseXmlHandler mostly alone. Most classes were already checking whether their pointers were NULL; they just held on to deleted Instrument* pointers. I know there's the InstrumentId lookup method too, but Instrument* itself was held by some objects and I figured it'd take a much bigger rewrite to change them to the InstrumentId way. I also considered Qt's weak pointers, but that seemed like a much bigger change too, and heavier.

******* CompositionMapper created redundant SegmentMappers

CompositionMapper used to create redundant SegmentMappers when mapSegment was called. By my tracing, it seemed to create everything twice. That used to do little harm, but in logical-instruments it made me run out of channels. Fixed.

****** Question: Are sendNRPN and sendRPN worth updating?

StudioControl::sendNRPN and sendRPN would be affected by this, but it looks like they aren't ever called anywhere. Are they used in some subtle way that escapes me? I don't want to leave them inconsistent with the rest of RG to bite some unwary future maintainer, but I don't want to spend time and effort on them if they don't matter.

Tom Breton (Tehom)
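To make the dump ordering concrete, here is a minimal sketch of that bug. resizeBuffer and setBufferFill are the names from the message; everything else, including the Buffer class itself, is hypothetical scaffolding:

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    struct Event { int data; };

    // Hypothetical stand-in for the mapper's event buffer.
    class Buffer {
    public:
        explicit Buffer(std::size_t cap)
            : m_data(new Event[cap]), m_capacity(cap), m_fill(0) {}
        ~Buffer() { delete [] m_data; }

        std::size_t capacity() const { return m_capacity; }
        void setBufferFill(std::size_t n) { m_fill = n; }
        void write(std::size_t i, const Event &e) { m_data[i] = e; }

        // Preserves only the *filled* prefix when growing.
        void resizeBuffer(std::size_t newCap) {
            Event *bigger = new Event[newCap];
            std::copy(m_data, m_data + m_fill, bigger); // copies m_fill events
            delete [] m_data;
            m_data = bigger;
            m_capacity = newCap;
        }

    private:
        Event *m_data;
        std::size_t m_capacity;
        std::size_t m_fill;
    };

    // The bug in miniature: dump() resizes mid-write but defers
    // setBufferFill() to the very end, so the resize copies nothing
    // (m_fill is still 0) and everything written so far is lost.
    void dump(Buffer &buf, const std::vector<Event> &events)
    {
        std::size_t i = 0;
        for (std::size_t n = 0; n < events.size(); ++n) {
            if (i == buf.capacity()) {
                // The fix: call buf.setBufferFill(i) here, before resizing,
                // so resizeBuffer() knows how much to copy.
                buf.resizeBuffer(buf.capacity() * 2);
            }
            buf.write(i++, events[n]);
        }
        buf.setBufferFill(i); // too late for the resize above
    }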
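And a sketch of the dangling-pointer fix, assuming Instrument derives from QObject (which the "implied by QObject" remark suggests); the holder class and slot name here are hypothetical:

    #include <QObject>

    // Assumption: Instrument is a QObject, so it emits destroyed().
    class Instrument : public QObject { Q_OBJECT /* ... */ };

    // Hypothetical holder of a raw Instrument*, standing in for classes
    // like InstrumentParameterPanel or AudioRouteMenu.
    class Panel : public QObject
    {
        Q_OBJECT
    public:
        Panel() : m_instrument(0) {}

        void setInstrument(Instrument *instrument)
        {
            m_instrument = instrument;
            // QObject emits destroyed() from its destructor; use it to
            // drop our pointer before it can dangle.
            connect(instrument, SIGNAL(destroyed()),
                    this, SLOT(slotInstrumentGone()));
        }

    private slots:
        void slotInstrumentGone() { m_instrument = 0; }

    private:
        Instrument *m_instrument; // checked against 0 before every use
    };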
From: Chris C. <ca...@al...> - 2012-01-18 20:48:21
Tom -- this looks like good stuff, thank you.

On 18 January 2012 19:58, Tom Breton (Tehom) <te...@pa...> wrote:
> StudioControl::sendNRPN and sendRPN would be affected by this, but it
> looks like they aren't ever called anywhere. Are they used in some subtle
> way that escapes me?

I have a feeling they were never completed and hooked up -- Richard Bown might remember with more certainty. In which case, if they're now actively wrong, it would probably be better to remove them.

Chris
From: Richard B. <ric...@fe...> - 2012-01-18 21:33:26
On 18 Jan 2012, at 21:21, Chris Cannam <ca...@al...> wrote:
> I have a feeling they were never completed and hooked up -- Richard
> Bown might remember with more certainty.

Too long ago. But I did have a reaction to NRPN and remember they were a function of (primarily for me) the QY-70 sequencer... So the upshot is: no idea, but probably get rid of them if they're not used.

R
From: Tom B. (Tehom) <te...@pa...> - 2012-01-18 21:55:11
> On 18 Jan 2012, at 21:21, Chris Cannam <ca...@al...> wrote:
>> I have a feeling they were never completed and hooked up -- Richard
>> Bown might remember with more certainty.
>
> Too long ago. But I did have a reaction to NRPN and remember they were a
> function of (primarily for me) the QY-70 sequencer...
>
> So the upshot is: no idea, but probably get rid of them if they're not used.

Thanks.

Tom Breton (Tehom)
From: D. M. M. <mic...@ro...> - 2012-01-19 10:04:38
On Wednesday, January 18, 2012, Tom Breton (Tehom) wrote:
> I don't want to leave them inconsistent with the rest of RG to bite some
> unwary future maintainer, but I don't want to spend time and effort on
> them if they don't matter.

That's really thoughtful of you. There have been quite a few times when I got waist deep into tinkering with the guts of something, only to realize that the whole thing was some piece of vestigial cruft. It's much better to just dump stuff like that. I appreciate it.

I still haven't managed to give your branch a test drive yet, as last weekend was my wedding anniversary. I hope to get there soon.

--
D. Michael McIntyre
From: Tom B. (Tehom) <te...@pa...> - 2012-01-21 22:02:15
> That's really thoughtful of you. There have been quite a few times when I
> got waist deep into tinkering with the guts of something, only to realize
> that the whole thing was some piece of vestigial cruft. It's much better
> to just dump stuff like that. I appreciate it.

Thanks for the kind words. I just committed a new update which allows instruments to be fixed to specific channels. Now I mostly have to clean up loose ends - I mean, vestigial cruft.

> I still haven't managed to give your branch a test drive yet, as last
> weekend was my wedding anniversary. I hope to get there soon.

Happy anniversary! Hope you had a good one.

Tom Breton (Tehom)
From: Tom B. (Tehom) <te...@pa...> - 2012-01-22 02:22:10
The logical instruments branch now saves and restores the "fixedness" of instruments' channels. It works correctly on my test files. I was able to make it forward and backward compatible, so I just bumped FILE_FORMAT_VERSION_POINT up to 1.

I thought the best thing to do was to group "fixed" with the "channel" attribute it affects, so I made it an attribute of "instrument" too. The affected XML looks like:

    <instrument id="2000" channel="0" fixed="false" type="midi">

Oddly, channel means something different for each type of Instrument. Someone please correct me if I'm wrong; I'm going by the comment in RoseXmlHandler.cpp that "Synth and Audio instruments always have the channel set to 2", implying that (as I suspected) SoftSynth instruments are not playing on distinct channels the way MIDI instruments are. I also see that Audio uses it to mean "mono or stereo".

I removed the handler.channelsWereRemapped message in RosegardenDocument::xmlParse, since it no longer applies and now can never fire.

I also added another example file with a fixed instrument, this one saved with the new code. It loads in old versions but triggers the channelsWereRemapped message because of the fixed instrument.

Tom Breton (Tehom)
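The backward compatibility here comes down to defaulting: an old file simply lacks the new attribute, so the parser falls back to the old behavior. A sketch, assuming Qt's SAX QXmlAttributes as used by RoseXmlHandler in this era; the setter name is hypothetical:

    // Sketch: inside the <instrument> element handler. An absent "fixed"
    // attribute (i.e. any pre-existing file) reads back as an empty
    // string, which falls through to false -- the old, auto behavior.
    QString fixed = atts.value("fixed");
    instrument->setFixedChannel(fixed.toLower() == "true"); // hypothetical setter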
From: D. M. M. <mic...@ro...> - 2012-01-22 10:36:45
On Saturday, January 21, 2012, Tom Breton (Tehom) wrote:
> I removed the handler.channelsWereRemapped message in
> RosegardenDocument::xmlParse, since it no longer applies and now can
> never fire.

I hadn't even thought about that. Interesting.

I'm off tomorrow night, and hope I can finally get some time to sit down and tinker with this. I appreciate that you seem to be keeping a pretty good eye on the big picture here, and have already addressed details I hadn't thought to wonder about yet.

--
D. Michael McIntyre
From: D. M. M. <mic...@ro...> - 2012-01-24 08:26:24
On Saturday, January 21, 2012, Tom Breton (Tehom) wrote:
> channel set to 2." implying that (as I suspected) SoftSynth are not
> playing on distinct channels as MIDI are.

Each one is basically a channel unto itself. I think you can safely just ignore plugin instruments for your purposes, and save some headaches.

> I also added another example file with a fixed instrument, this one saved
> with the new code.

Not finding the new example files reminded me to run scripts/rebuild-qrc manually. On doing so, I found I've apparently been adding things for a long time without bothering to follow up and see that they showed up as expected. That embarrasses me. Oh well. It's fixed now.

Some reactions...

First, I'm not sure what cool thing the provided example files are supposed to be showing me. It isn't that apparent, since I don't know what controllers or whatever to look for, to watch what's changing. I'm too lazy to dig around in the event list to try to figure it out.

I set up an example of my own, with volume and pan controllers doing opposing things in segments on the same track. It works fine. I haven't really been able to break it, but then I haven't really been able to think up inventive ways to break it either. It's sort of an invisible feature that quietly does a whole lot of things to make something cool but subtle possible.

I did try dinking around with some pitch bendy compositions and seeing what would happen when I jumped playback around randomly. I couldn't seem to shake it, but when I compared with standard Rosegarden, I didn't really notice any obvious difference, so I'm not convinced I actually accomplished anything with those tests. (The last time I thought your new stuff was bullet proof, it turned out you hadn't even committed it yet, and I was seeing differences that weren't even there! My powers of observation are, therefore, highly suspect!)

I do see a big can of worms when it comes to percussion though. Try loading "Stormy Riders" or "Bogus Surf Jam" and the drums come out as piano. I eventually managed to get percussion working by checking the percussion checkbox. I kind of hate to force everyone to go fiddle with that checkbox in order to get old files to play. Instrument #10 should probably default to being fixed on channel 10, and that would probably take care of it. Probably.

As far as fixing channels, I think I'd favor just having a simple checkbox here, and while in fixed mode, the instrument uses whatever channel its instrument number is. (Actually, I think I'd do a combo box, with something like "Channel: [auto | fixed]", instead of a checkbox.)

Rosegarden used to have independent channel controls per instrument, and you could set instrument #3 to channel 6 if you wanted. This caused no end of confusion, and several recurring themes in the way of bug reports. It literally took me years to figure out the common denominator behind all of those problems, and that's why I removed the channel controls, imposed a fixed relationship between the instrument number and the channel number, and put in that whole "channels were remapped" thing that is now deprecated.

I still think that was the right call. It would probably complicate your allocation logic a little bit, but I think you can probably deal with it. When this box is checked on instrument #4, find whatever instrument is using channel 4, switch it over to some other available channel, and switch instrument #4 to channel 4. Life just seems less complicated that way, and I can't now, as then, think of any reason why people should ever _have_ to have instrument #3 play on channel 12. It shouldn't constrain anybody's ability to achieve something.

All in all though, it seems pretty solid, Tom. Nothing really jumps out as being different, you can move segments full of pitch bends and so forth around randomly without causing any obvious strange things to happen, and it doesn't seem possible to get the whole thing out of whack by jumping around. I don't notice some crazy amount of processing overhead either. I don't think I have a problem with the idea of merging this into mainline Rosegarden whenever you think it's had long enough to bake.

--
D. Michael McIntyre
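Michael's allocation rule is simple enough to sketch. Everything below is hypothetical scaffolding; the point is just the eviction-then-claim order:

    // Sketch of "fix instrument #n to its own channel": whoever currently
    // holds channel n gets evicted to a free channel first, so two
    // instruments never end up sharing a channel.
    struct Instrument {
        int number;      // instrument # within its device
        int channel;     // MIDI channel currently allocated
        bool fixed;
    };

    Instrument *findInstrumentOnChannel(int channel); // assumed lookup
    int findFreeChannel();                            // assumed allocator

    void fixToOwnChannel(Instrument *inst)
    {
        int wanted = inst->number;                    // instrument #4 -> channel 4
        Instrument *holder = findInstrumentOnChannel(wanted);
        if (holder && holder != inst)
            holder->channel = findFreeChannel();      // evict the auto instrument
        inst->channel = wanted;
        inst->fixed = true;
    }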
From: Tom B. (Tehom) <te...@pa...> - 2012-01-24 18:49:47
> On Saturday, January 21, 2012, Tom Breton (Tehom) wrote:
>> channel set to 2." implying that (as I suspected) SoftSynth are not
>> playing on distinct channels as MIDI are.
>
> Each one is basically a channel unto itself. I think you can safely just
> ignore plugin instruments for your purposes, and save some headaches.

Thanks for the information. One question: What's the status of MidiFile? That's the one I still need to fix. It seemed like it was already ignoring repeats, triggered segments, etc.

> [...]
> First, I'm not sure what cool thing the provided example files are
> supposed to be showing me. It isn't that apparent, since I don't know
> what controllers or whatever to look for, to watch what's changing. I'm
> too lazy to dig around in the event list to try to figure it out.

OK, I guess it wasn't as clear as I thought. The example is meant to show:

* The 4 piano segments are alternately playing in unison and playing separately. The old way, they would (as a group) alternately play 1 note and 4 notes; ie, the sonority would change for unisons. It's more obvious with voices or wind instruments; maybe I should have used those instead. Now when they play two or more notes in unison, they really sound that many notes in unison. Measure 2 contrasts this with an actual 1-vs-4 section. If you can hear the difference when it switches back, it's working.

* The "Fixed instrument" segments demonstrate that fixed instruments are in fact working the old way; when they change from thirds to unison, the sonority should change (be cut in half), just like old times.

* For the trumpet segments, it's pretty much what you demonstrated with your own example: one is crescendoing while the other is simultaneously diminuendoing. May not seem striking, but if you'd heard them fighting it out when I was developing this, the difference would be striking. Also, if you jump back in the trumpet bit, it will demo another "invisible feature" - they get the right controllers for that point in time. The old way, if you jumped back it would start with a big burp as controllers first got their static values, which were very different from the progressing values they would have at that time.

* The metronome track demonstrates that the metronome works on its proper instrument. It's on slightly obnoxious keys because before it worked, it played on piano and I got tired of hearing minor seconds clashing, so I moved it to a C/G fifth.

> [...]
> ways to break it either. It's sort of an invisible feature that quietly
> does a whole lot of things to make something cool but subtle possible.

Thanks. That's what it's supposed to be.

> I do see a big can of worms when it comes to percussion though. Try
> loading "Stormy Riders" or "Bogus Surf Jam" and the drums come out as
> piano. I eventually managed to get percussion working by checking the
> percussion checkbox. I kind of hate to force everyone to go fiddle with
> that checkbox in order to get old files to play. Instrument #10 should
> probably default to being fixed on channel 10, and that would probably
> take care of it. Probably.

That is a good idea. I force percussion to channel 10 (one-based), but I missed doing it in the opposite direction.

> [...]
> I still think that was the right call. It would probably complicate your
> allocation logic a little bit, but I think you can probably deal with it.
> When this box is checked on instrument #4, find whatever instrument is
> using channel 4, switch it over to some other available channel, and
> switch instrument #4 to channel 4. Life just seems less complicated that
> way, and I can't now, as then, think of any reason why people should ever
> _have_ to have instrument #3 play on channel 12. It shouldn't constrain
> anybody's ability to achieve something.

Ah, the voice of experience. That is a very good idea. Wish I'd thought of that! Would've saved me a fair bit of coding. But I learned more about Qt, so the time wasn't wasted.

OK, so fixed instruments will take their track number as the channel. I think it's worth the tradeoffs:

* Adding or deleting tracks above the fixed instruments changes the channel.

* It won't handle multiple MIDI devices neatly. For instance, fixing 2 tracks on different devices becomes impossible.

But maybe we can improve on even that. I want to run this by your voice of experience first. Suppose there were 3 options:

* auto
* fixed to track number
* custom - fix to a specified channel on a specified MIDI device, no matter where the track is.

In your opinion, is that clear enough to forestall user confusion and bug reports? And where "custom fixed" and "fixed to track number" compete for the same channel, "custom" would win because it's more of a deliberate user choice. And "fixed to track" will clone instruments to give each fixed track a dedicated instrument, so there's no question of the same instrument being fixed to two tracks or playing oddly on an "auto" track.

> All in all though, it seems pretty solid, Tom. Nothing really jumps out
> as being different, you can move segments full of pitch bends and so
> forth around randomly without causing any obvious strange things to
> happen, and it doesn't seem possible to get the whole thing out of whack
> by jumping around.

Thanks.

> I don't notice some crazy amount of processing overhead either.

Thanks. I actually am rewriting the controller context code for speed. When I came to understand segment mappers, I realized that I could just look for triggered events on the temporary segment. Wish I'd realized earlier.

> I don't think I have a problem with the idea of merging this into mainline
> Rosegarden whenever you think it's had long enough to bake.

OK.

> --
> D. Michael McIntyre

Tom Breton (Tehom)
From: D. M. M. <mic...@ro...> - 2012-01-25 02:54:34
On Tuesday, January 24, 2012, Tom Breton (Tehom) wrote:
> One question: What's the status of MidiFile? That's the one I still need
> to fix. It seemed like it was already ignoring repeats, triggered
> segments, etc.

When MidiFile is used to convert a composition into a standard MIDI file, it's home to a laundry list of bugs of omission that really impose some pretty serious limitations on anybody who wants to export a native composition to MIDI intact. There is plenty that could be fixed, and if you think something seems broken, you're almost certainly right.

> instead. Now when they play two or more notes in unison, they really
> sound that many notes in unison.

Ahhhhh. Very subtle, but obvious once you know what you're listening for.

> Also if you jump back in the trumpet bit, it will demo another
> "invisible feature" - they get the right controllers for that point in
> time.

This bit is especially striking if you try it with stock Rosegarden for contrast.

> * It won't handle multiple MIDI devices neatly. For instance, fixing 2
> tracks on different devices becomes impossible.

I'm not sure I follow you.

> But maybe we can improve on even that. I want to run this by your voice
> of experience first. Suppose there were 3 options:
> * auto
> * fixed to track number

Fixed to *instrument* number. Multiple tracks can use the same instrument.

> * custom - fix to a specified channel on a specified MIDI device, no
> matter where the track is.

I think you're confused about what owns what. From the Instrument Parameters box containing the controls for "General MIDI Device #2" you can't change the device. This instrument, #2, is contained inside of "General MIDI Device", and just by arriving here we have already established that "General MIDI Device" is the device. There can be no other. As such, all we could do for a third option is have a manual option to fix to some arbitrary channel of the user's choosing, instead of the channel corresponding with this instrument number.

> In your opinion, is that clear enough to forestall user confusion and bug
> reports?

The problem with confusion and bug reports is that people forget that they set instrument #3 to play on channel 12, and they also set instrument #12 to play on channel 12. They change controls for #3, not realizing it affects channel 12, and therefore #12 as well. Which set of controls should take precedence is really undefined behavior, and this causes all kinds of bizarre things to happen.

I suppose in your scheme if you fixed #3 to 12, it would automatically force #12 to allocate to some other channel, so that would solve the biggest part of the problem. However, I still have yet to see a defined use case where it is necessary to use some arbitrary channel, instead of the channel corresponding with the instrument number.

People find Rosegarden's overall schema enormously confusing, and providing the ability to change channels manually just makes Rosegarden look that much more like MusE or Cakewalk or something, and encourages that much more confusion as users come to grips with the fact that Rosegarden's whole model for everything is quite distinctly non-traditional and different. Moreover, everything you're doing with automatic allocation of channels and so on just takes Rosegarden even farther along the path of being quite distinctly non-traditional and different.

I don't think that's a bad thing at all, but in order to be consistent with its own reality, I don't think trying to offer more traditional controls on top of a radically non-traditional schema really accomplishes anything beneficial to users trying to get their heads around how everything fits together.

I say all of this as somebody who spent years trying to get his head around how everything fits together. Even through about the first three drafts of my book, I still didn't quite get it. It's VERY confusing to someone who grew up on Cakewalk, although it's all fine and well thought out once you get into the same head space as the people who designed it.

> And "fixed to track" will clone instruments to give each fixed track a
> dedicated instrument, so there's no question of the same instrument being
> fixed to two tracks or playing oddly on an "auto" track.

Think about everything I just said, and see if you still have some thoughts along the above lines, after filtering them through a fresh look at how everything fits together.

> Thanks. I actually am rewriting the controller context code for speed.

I imagine the overhead is crazy, and I'm surprised at how well it works in its initial form.

--
D. Michael McIntyre
From: Tom B. (Tehom) <te...@pa...> - 2012-01-25 21:19:39
> On Tuesday, January 24, 2012, Tom Breton (Tehom) wrote:
>> One question: What's the status of MidiFile? That's the one I still
>> need to fix. It seemed like it was already ignoring repeats, triggered
>> segments, etc.
>
> When MidiFile is used to convert a composition into a standard MIDI
> file, it's home to a laundry list of bugs of omission that really impose
> some pretty serious limitations on anybody who wants to export a native
> composition to MIDI intact.
>
> There is plenty that could be fixed, and if you think something seems
> broken, you're almost certainly right.

Thanks for the info.

> Fixed to *instrument* number. Multiple tracks can use the same
> instrument.

Ah! For some reason I assumed you meant track number. Perhaps because I have been so busy removing the assumption that Instrument's m_channel is its channel that I mentally ruled it out here. That's what I was confused about. So ignore most of what I said.

So I'll use Instrument's old m_channel as its channel just when it's fixed, which is just when it's not auto. And bring the remapping check and warning back in, and save the device-relative channel despite anything auto does.

I'm glad you said something, because that will be much easier than what I was thinking. For some strange reason I was doing it the hard way.

> Think about everything I just said, and see if you still have some
> thoughts along the above lines, after filtering them through a fresh
> look at how everything fits together.

I did. That fresh look is more like a face-palm, but thank you!

Tom Breton (Tehom)
From: D. M. M. <mic...@ro...> - 2012-01-26 07:56:12
On Wednesday, January 25, 2012, Tom Breton (Tehom) wrote:
> So ignore most of what I said. So I'll use Instrument's old m_channel as
> its channel just when it's fixed, which is just when it's not auto. And
> bring the remapping check and warning back in, and save the
> device-relative channel despite anything auto does.

I wonder about the remapping check. It _should_ only come into play for old compositions. What would have happened before is Rosegarden would just remap the channels, and if you didn't like it, too bad. Whether that's the right approach or not isn't worth debating; that's just what you're up against for baseline behavior.

As such, I think it would be entirely appropriate just to default to auto for everything, and ignore whatever m_channel was completely. There was no way to specify fixed vs. auto in old versions, so there's no way to guess if the user might prefer the channels fixed for some reason. In most cases, these old compositions will probably load and work just fine with automatically-allocated channels, and if there is some need for tweaking, the controls are available.

I think that's the way I'd go. Future compositions that need fixed channels for some reason will carry the information we need, but so long as the channel and instrument number continue to correspond, there won't be any future need for remapping. I think it's still safe to dump that stuff. This is pretty much the whole thought process that went through my head the other day when I saw you were going to dump this, and I said, "Interesting."

> I'm glad you said something, because that will be much easier than what I
> was thinking. For some strange reason I was doing it the hard way.

That happens frequently around here. No one is immune to that, I'm afraid.

--
D. Michael McIntyre
From: Tom B. (Tehom) <te...@pa...> - 2012-01-26 17:47:29
> On Wednesday, January 25, 2012, Tom Breton (Tehom) wrote:
>> So ignore most of what I said. So I'll use Instrument's old m_channel
>> as its channel just when it's fixed, which is just when it's not auto.
>> And bring the remapping check and warning back in, and save the
>> device-relative channel despite anything auto does.
>
> I wonder about the remapping check. It _should_ only come into play for
> old compositions.

OK, it's back out. That's easy right now.

For old compositions, I still have to make percussion default to fixed. Otherwise, while new drums wouldn't come out as piano, old ones would. (Done now. I'm back.)

With that, I think the only loose end is MidiFile.cpp. I've given that some thought. Hopefully I haven't designed it the hard way again; I will run it by you.

What I'd like to do is make MidiFile use part of the normal Events-to-output logic (the metaiterator and the segment mappers). MidiFile would write the MIDI headers etc., then create a MappedBufMetaIterator and use it to write the MIDI.

In one way, it's roundabout: I'll turn Events into MappedEvents and then into MIDI. It'll turn TimeT into RealTime and then back into TimeT. But it may have to be that way to handle segments' RealTime delay. On the plus side, it wouldn't miss anything, and the sorting is already there.

Tom Breton (Tehom)
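The shape of that plan, in a sketch. MappedBufMetaIterator, Composition, timeT, and getElapsedTimeForRealTime are real names from the thread, but every other method, type, and signature below is an assumed stand-in to show the flow, not the branch's actual API:

    // Sketch: export by driving the normal playback machinery and
    // serializing what comes out, instead of walking Segments directly.
    void MidiFile::writeComposition(Composition &comp, MidiStream &out)
    {
        out.writeHeader(comp);                    // format, division, tempo track

        MappedBufMetaIterator iter;               // fed by the segment mappers,
        attachSegmentMappers(iter, comp);         // which already flatten repeats,
                                                  // triggered segments, delays...

        RealTime t = comp.getStartRealTime();     // assumed accessors
        RealTime end = comp.getEndRealTime();
        RealTime slice(1, 0);                     // fetch one second at a time

        while (t < end) {
            MappedEventList events;
            iter.fetchEvents(events, t, t + slice);   // assumed signature
            for (MappedEventList::iterator i = events.begin();
                 i != events.end(); ++i) {
                // Round-trip the time: RealTime back to musical time,
                // then to MIDI ticks, via the composition's tempo map.
                timeT when = comp.getElapsedTimeForRealTime((*i)->getEventTime());
                out.writeEvent(ticksFor(when), **i);
            }
            t = t + slice;
        }
        out.writeTrackEnds();
    }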
From: D. M. M. <mic...@ro...> - 2012-01-26 20:16:22
On Thursday, January 26, 2012, Tom Breton (Tehom) wrote:
> For old compositions, I still have to make percussion default to fixed.
> Otherwise, while new drums wouldn't come out as piano, old ones would.

Just to be clear, we're still talking about having instrument #10 default to being fixed, and fixing to channel 10, right?

> What I'd like to do is make MidiFile use part of the normal
> Events-to-output logic (the metaiterator and the segment mappers).
> MidiFile would write the MIDI headers etc., then create a
> MappedBufMetaIterator and use it to write the MIDI.

That's a really novel approach. Does this end up handling tempo ramps? I'm not sure about that, but it seems like it should be able to handle everything else handily.

I'd like to see you have a go at that, and see how it turns out. I'm thinking even if there are some problems, it's likely possible to massage around them. I can't think of anything I expect to be a complete show stopper with that approach.

Other opinions welcome. It's a pretty big thing we're discussing.

--
D. Michael McIntyre
From: Richard B. <ric...@fe...> - 2012-01-26 20:37:18
> Other opinions welcome. It's a pretty big thing we're discussing.

Best of luck!

Rich
From: Tom B. (Tehom) <te...@pa...> - 2012-01-26 21:19:55
> On Thursday, January 26, 2012, Tom Breton (Tehom) wrote:
>> For old compositions, I still have to make percussion default to fixed.
>> Otherwise, while new drums wouldn't come out as piano, old ones would.
>
> Just to be clear, we're still talking about having instrument #10
> default to being fixed, and fixing to channel 10, right?

Yes. That's what I meant.

>> What I'd like to do is make MidiFile use part of the normal
>> Events-to-output logic (the metaiterator and the segment mappers).
>> MidiFile would write the MIDI headers etc., then create a
>> MappedBufMetaIterator and use it to write the MIDI.
>
> That's a really novel approach. Does this end up handling tempo ramps?
> I'm not sure about that, but it seems like it should be able to handle
> everything else handily.

My guess is that it handles tempo ramps awkwardly but correctly, translating each event's TimeT to RealTime and then looking it up and getting the original TimeT back. It depends on how exact getElapsedTimeForRealTime is, but it looks like it takes ramping into account.

> I'd like to see you have a go at that, and see how it turns out. I'm
> thinking even if there are some problems, it's likely possible to
> massage around them. I can't think of anything I expect to be a complete
> show stopper with that approach.

I will try it.

Tom Breton (Tehom)
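That round trip in sketch form, given a MappedEvent *event and the Composition &comp. getElapsedTimeForRealTime is named above; its forward counterpart here, getElapsedRealTime, is an assumption:

    // Sketch: under a tempo ramp the timeT <-> RealTime mapping is
    // nonlinear, but as long as the two lookups are consistent inverses,
    // the round trip lands back on (nearly) the original tick.
    timeT original = event->getAbsoluteTime();
    RealTime real  = comp.getElapsedRealTime(original);     // assumed forward lookup
    timeT back     = comp.getElapsedTimeForRealTime(real);  // named in the thread
    // back == original, up to RealTime's rounding -- i.e. "awkwardly
    // but correctly".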
From: D. M. M. <mic...@ro...> - 2012-01-28 08:51:41
On Thursday, January 26, 2012, Tom Breton (Tehom) wrote:
> I will try it.

I look forward to seeing what results. If I fully have my head around what you intend, it's a very different approach that I expect will probably work out extremely well.

--
D. Michael McIntyre
From: Tom B. (Tehom) <te...@pa...> - 2012-01-28 22:27:48
> On Thursday, January 26, 2012, Tom Breton (Tehom) wrote:
>> I will try it.
>
> I look forward to seeing what results. If I fully have my head around
> what you intend, it's a very different approach that I expect will
> probably work out extremely well.

I certainly hope so. Just to let you know where I am, I have written most of the major pieces and I'm about to try it "live" for the first time.

Noteoffs aren't handled yet, though. That's the one thing I can't just pull from the MappedEvent stream. It seems like adding noteoffs in the mappers is probably easiest, but I'm not sure if that messes anything up. They'd create MappedEvent::MidiNotes with velocity zero, which AlsaDriver already handles for some preview notes. If that's a problem, the second choice is to associate noteoff queues with output tracks.

It also looks like to get certain types of output (markers and text) I'll need two more mapper types and a new MappedEvent enum - nothing tricky.

So far it's gone fairly smoothly, except for one hitch when makedepend didn't understand header includes recursively.

Tom Breton (Tehom)
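The velocity-zero idea, sketched. MappedEvent::MidiNote is from the message above; the accessor and setter names are assumptions:

    // Sketch: derive a note-off from a note-on by copying it and zeroing
    // the velocity (MIDI allows note-on with velocity 0 to stand in for
    // note-off), scheduled at on-time + duration.
    MappedEvent *makeNoteOff(const MappedEvent *noteOn)
    {
        MappedEvent *off = new MappedEvent(*noteOn);   // keeps channel, pitch, track
        off->setType(MappedEvent::MidiNote);
        off->setVelocity(0);                           // velocity 0 == note-off
        off->setEventTime(noteOn->getEventTime() + noteOn->getDuration());
        off->setDuration(RealTime::zeroTime);
        return off;
    }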