When using the tempo effect with a factor >= 1.0 and an in-memory output buffer (created by the sox_open_memstream_write function), the output buffer is read incorrectly by the sox_read function: it returns an incorrect buffer length.
I've done some investigation; the problem appears in the sox_read function:
```c
if (ft->signal.length != SOX_UNSPEC)
  len = min(len, ft->signal.length - ft->olength);
actual = ft->handler.read? (*ft->handler.read)(ft, buf, len) : 0;
```
After applying the tempo effect with factor 1.0, ft->signal.length becomes equal to ft->olength, so len becomes 0 and the buffer is not read at all. With a tempo factor > 1.0, the reported length is smaller than the buffer's real length.
The issue does not appear when the output is written to a file, only with in-memory buffers. Here is this problem's description on Stack Overflow
The minimal working example is included.
Sorry for the bug in the minimal working example: it didn't update the buffer length counter.
Here is the fixed version.
https://codeberg.org/sox_ng/sox_ng/issues/241
Hi! I've had another look at this, and the problem may be that the code calls sox_open_memstream_write(), writes a load of data into it, and then calls sox_read() on the same sox_format_t that was opened for writing. What it should do instead is sox_close() the output and then read the data directly from the memory buffer pointer and length, which are updated by stdio only when the file is flushed or closed.
I suspect that it "works" when using a disk file because either SoX or stdio keeps separate read and write pointers to the file.
I've updated https://codeberg.org/sox_ng/sox_ng/issues/241 with more details that led me to this way of thinking. I suspect that the real defect is that libsox lets you call "read" on a descriptor opened for writing without complaining, and that the documentation for libsox is appallingly scarce or nonexistent.