I'm sorry if this is the wrong forum; I'm new to sourceforge and don't really know my way around yet. I scanned the docs, but didn't find the answer I'm looking for.
I'm trying to read SCF files into my own program, and the documentation on Sample Points concerned me. The samples array is declared as unsigned, but there is a lot of subtracting happening, and I expect that this generates a lot of underflow errors.
My guess is that this is OK, because it all gets added back in on the return trip, but I'd welcome comments from the developers about this. I'm trying to read these files in a language (VB) that doesn't support unsigned types, and I'm encountering errors as I try to reconstitute the delta_delta'ed data.
Thanks for any comments, explanations, and/or advice.
It's magic :-)
You can probably simulate this by using larger signed datatypes and then ANDing with 0xff (or 0xffff) to get back an 8-bit (or 16-bit) value, depending on the sample precision.
E.g. with values of 20, 200, 20 we get deltas of +20 (from 0), +180, -180. This gives the impression of needing a full range of -255 to +255 to store the difference between any two 8-bit values. However, as the original data is in the range 0 to 255, we can simply ignore overflows and underflows and let the deltas wrap around to fit in this range.
For 8-bit SCF samples, we would compute a single-level delta via "d = a - b" and recompute b from a and d via "b = a - d". This is using unsigned chars.
To simulate this with signed ints, use "d = (a - b) & 0xff" and "b = (a - d) & 0xff".
Obviously 0xffff is needed for the 16-bit samples instead.
Thanks; I have it working now. Both the unsigned-ness and the byte order were causing me trouble. I appreciate that you have this so well documented; otherwise it would have been a great deal harder. Having already dealt with the undocumented ABI format, this is quite a relief.