Well, the whole point of this project is that higher-resolution
decoding increases quality, correct? Unless there's something
more complicated to it than that, why not also support even
higher-resolution decoding? First,
supposedly even 24-bit output is better quality when
dithered down to 16 bits, so I imagine 32-bit wouldn't be
so bad dithered down to 24. Second, there are a
few soundcards that can play back audio at 32 bits
(such as mine), which would likely allow some people to
get even better quality still. It occurred to me that even if it
weren't better quality when dithered to 24-bit, it
might still work well if the decoder automatically used the
appropriate bit depth based on which setting was chosen.
E.g., if the user chooses 24, decode at 24; if the user
chooses 32, decode at 32. Really, I imagine the main
thing holding such a feature back is the extra CPU power
it would take. Most people these days have better
PCs than my mere Athlon 1800+, and I believe mine could
easily handle that with quite a lot of room to spare.
I know this isn't terribly important, but it seems to me like
it couldn't exactly hurt anything to try, and might actually help.
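
Just to show what I mean by "use the appropriate bit depth based on the
setting", here's a rough sketch of dithering a 32-bit decoded sample down
to whatever depth the user picked, using plain TPDF dither. None of the
names here come from the actual decoder; it's only an illustration of the
idea.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Rough sketch only -- the names here are made up, not the project's
 * real decoder API.  Takes a sample decoded at 32-bit precision and
 * reduces it to whatever depth the user picked (16, 24, or 32),
 * applying simple TPDF dither before truncating. */
static int32_t dither_to_depth(int32_t sample32, int out_bits)
{
    int shift = 32 - out_bits;              /* bits that get discarded */
    if (shift <= 0)
        return sample32;                    /* already at 32 bits */

    /* TPDF dither: two uniform random values summed, spanning roughly
     * +/- 1 LSB of the *output* format.  rand() is only good enough
     * for illustration; a real decoder would use a proper PRNG. */
    int64_t lsb    = (int64_t)1 << shift;
    int64_t dither = (rand() % lsb) + (rand() % lsb) - lsb;
    int64_t mixed  = (int64_t)sample32 + dither;

    /* Keep the dithered value inside the 32-bit range. */
    if (mixed > INT32_MAX) mixed = INT32_MAX;
    if (mixed < INT32_MIN) mixed = INT32_MIN;

    /* Drop the low bits; the result stays left-justified at 32 bits. */
    return (int32_t)(mixed & ~(lsb - 1));
}

int main(void)
{
    int out_bits = 24;                      /* whatever the user chose */
    int32_t decoded = 0x12345678;           /* pretend 32-bit sample   */
    printf("%d-bit output sample: 0x%08x\n",
           out_bits, (unsigned)dither_to_depth(decoded, out_bits));
    return 0;
}
```

The per-sample cost is basically two random numbers and a 64-bit add,
which is the kind of extra CPU work I was talking about above.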