Decoding error in libvorbis when compiled on Mac OS X in i386 architecture
I encountered a subtle and odd error in libvorbis that shows up on Mac OS X when you compile for the i386 architecture.
This is for the released library libvorbis 1.3.2.
It doesn't show up when compiling for x86_64 architecture.
The symptom: ogg123 produces noise when compiled for i386 but plays the file quite normally when compiled for x86_64. After much tinkering, I finally decoded the .ogg to raw with oggdec in both i386 and x86_64 builds and compared the output with od:
s007% od -t xS -N 32 /tmp/oggdec_i386.out
0000000 c2b5 8000 c4ba 8000 c252 8000 c28b 8000
0000020 c7f0 8000 cb4b 8000 ce95 8000 d114 8000
s007% od -t xS -N 32 /tmp/oggdec_x86-64.out
0000000 c2b5 d253 c4ba d4a9 c252 d4b1 c28b d7d3
0000020 c7f0 ddfe cb4b df58 ce95 de6c d114 dc79
Notice that in the i386 decoding, every other short is nonsense (0x8000).
I traced the problem to the float-to-int conversion code defined in lib/os.h. IIRC, Mac OS X relies entirely on the SSE/SSE2/SSE3 instruction set to perform floating-point math, and may not initialize the x87 FPU the way the i386 conversion code expects.
The problem is solved neatly by relying on the SSE2 code path (which is what the x86_64 build already uses). With that change, ogg123 plays the file correctly, and the decoded output matches between the i386 and x86_64 builds. This fix should be safe on Mac OS X; every Intel machine Apple has shipped supports the SSE2 instruction set.
I've attached a diff of my changes, which are localized to lib/os.h.