When compiling libFLAC for x86 machines that lack SSE2 (pre-Pentium 4 / pre-Athlon 64 CPUs), the configure script incorrectly assumes that because the compiler accepts -msse2, the flag should be used. This produces a binary that is unusable on the Pentium 3, the AMD K7, and older CPUs.
Kars /var/tmp/portage/media-libs/flac-1.3.3-r1/work/flac-1.3.3 # grep msse2 . -r
./configure: --disable-sse Disable passing of -msse2 to the compiler
./configure: { $as_echo "$as_me:${as_lineno-$LINENO}: checking if $CC accepts -msse2" >&5
./configure:$as_echo_n "checking if $CC accepts -msse2... " >&6; }
./configure: CFLAGS="-msse2"
./configure: CFLAGS="$ac_add_cflags__old_cflags -msse2"
./configure.ac:AC_HELP_STRING([--disable-sse], [Disable passing of -msse2 to the compiler]),
./configure.ac: XIPH_ADD_CFLAGS([-msse2])
Perhaps a better approach would be to use the CPUID opcode at runtime to determine whether SSE2 is available; this opcode has been present in all x86 CPUs since the late i486 models (e.g. the i486SL). Better yet, it is really strange that merely enabling SSE implies SSE2 must be present. That assumption is not safe even when cross-compiling: we could be cross-compiling for a particularly old machine and still hit this bug. Maybe each SSE level should have its own --enable flag?
This was done on purpose. Support for older CPUs (at the expense of a slower binary on newer CPUs) is enabled with --disable-sse, as you noticed. This is nothing strange: many programs (browsers, for example) and OSes (Windows 8.1 and up) require SSE2. FLAC, however, still lets you compile your own version without this requirement, but SSE2 is the default.
Well, this just leaves a portion of CPUs served worse than others. Non-SSE CPUs get their non-SSE binary, and only SSE2 CPUs get the SSE binary. My point is that the middle ground between these two CPU classes should be covered as well. I could write a patch when I find the time.
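Such a patch might separate the flags in configure.ac roughly as follows. This is only a hypothetical sketch of the per-level --enable flags mentioned above, reusing the XIPH_ADD_CFLAGS macro already present in the build system; the option names are assumptions, not actual FLAC code:

```
dnl Hypothetical sketch: one flag per SSE level instead of a single --disable-sse
AC_ARG_ENABLE([sse],
  AS_HELP_STRING([--disable-sse], [Disable passing of -msse to the compiler]))
AC_ARG_ENABLE([sse2],
  AS_HELP_STRING([--disable-sse2], [Disable passing of -msse2 to the compiler]))
AS_IF([test "x$enable_sse" != "xno"], [XIPH_ADD_CFLAGS([-msse])])
AS_IF([test "x$enable_sse2" != "xno"], [XIPH_ADD_CFLAGS([-msse2])])
```

With this split, a Pentium 3 / K7 build could pass --disable-sse2 alone and still get an SSE1-optimized binary.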
On 1.03.2022 at 08:37, Martijn van Beurden wrote:
Related
Bugs: #479
SSE2 was introduced in 2000 and is mandatory on all 64-bit x86 CPUs. Is it really such a big problem that those ancient CPUs do not get the best possible binary?