From: Michael R. <mr...@us...> - 2005-10-31 12:20:43

Hi,

>> Alien, it is not a matter of having a crossfading parameter or not.
>> The problem is that I don't know how the crossfading would be
>> implemented.
>>
>> It might sound like a simple extension of my patch, but it is not.
>> The gapless support is basically just disabling a couple of things in
>> the engine to avoid buffers being dropped, the audio port being
>> closed, and metronom being reset. So the output layers see it as if a
>> continuous stream of bytes were provided, no matter whether it came
>> from different files.
>>
>> Crossfading would require some sort of buffering that knows how to
>> mix two sources, and a metronom mapping that goes into the past when
>> the new stream is started.
>>
>> Conceptually it sounds simple, but I don't know how to implement it
>> cleanly. So it is still an open problem, unless I'm missing something
>> obvious (this is how it worked with the gapless idea ;-)
>
> I understand.
>
> A small note, however: doesn't gapless also require the audio port not
> closing and similar things?
>
> I just hope it isn't so that when crossfading is implemented later, we
> have a new gapless method on our hands... (crossfading being 0).

The gapless stuff is definitely a requirement for crossfading.

I think for small crossfade intervals, we could use a post plugin, which
simply buffers up enough of the audio stream to calculate the fade in
the buffer before sending the data on to the output. Since crossfading
is typically used for audio-only playback, and audio decoding should be
fast enough to fill such a buffer in an acceptable time, I think this
should work.

Michael