commit d95d2d7051 by Simon McVittie
SDLTest_CompareSurfaces: Decode pixels correctly on big-endian platforms

Previously, if acting on a surface with fewer than 32 bits per pixel,
this code placed the pixel value from the surface in the first few
bytes of the Uint32 to be decoded, and unrelated data from a
subsequent pixel in the remaining bytes.

Because SDL_GetRGBA takes the bits to be decoded from the
least-significant bits of the given value, ignoring the higher-order
bits if any, this happened to be correct on little-endian platforms,
where the first few bytes store the least-significant bits of an
integer.

However, it was incorrect on big-endian, where the first few bytes are
the most-significant bits of an integer.

The previous implementation also assumed that unaligned access to a
32-bit quantity is possible, which is not the case on all CPUs (but
happens to be true on x86).
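
As an illustration only, a helper along the following lines would read
the pixel portably: it copies just the pixel's bytes into the
least-significant bytes of a Uint32, so SDL_GetRGBA() sees the same
value regardless of byte order, and it never performs an unaligned
32-bit load. This is a minimal sketch, not necessarily the code in this
commit; the helper name GetSurfacePixelValue is made up for the example,
while Uint32, SDL_memcpy, SDL_assert, SDL_BYTEORDER and SDL_BIG_ENDIAN
are SDL's usual facilities.

    #include "SDL.h"         /* Uint8, Uint32, SDL_memcpy, SDL_assert */
    #include "SDL_endian.h"  /* SDL_BYTEORDER, SDL_BIG_ENDIAN */

    /* Sketch: read a pixel of bytes_per_pixel bytes into the low-order
     * bits of a Uint32 without an unaligned or out-of-bounds access. */
    static Uint32 GetSurfacePixelValue(const Uint8 *p, size_t bytes_per_pixel)
    {
        Uint32 value = 0;

        SDL_assert(bytes_per_pixel <= sizeof(value));

    #if SDL_BYTEORDER == SDL_BIG_ENDIAN
        /* On big-endian, the least-significant bytes of value are the
         * last bytes in memory, so copy into the end of the integer. */
        SDL_memcpy((Uint8 *)&value + (sizeof(value) - bytes_per_pixel),
                   p, bytes_per_pixel);
    #else
        /* On little-endian, the least-significant bytes come first. */
        SDL_memcpy(&value, p, bytes_per_pixel);
    #endif

        return value;
    }

The comparison loop would then pass the result of such a helper to
SDL_GetRGBA() together with the surface's format, instead of
dereferencing the pixel pointer as a Uint32.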

These issues were not discovered until now because
SDLTest_CompareSurfaces() is only used in testautomation, which until
recently was not run routinely at build time: it contained other
assumptions that can fail in an autobuilder or CI environment.

Resolves: https://github.com/libsdl-org/SDL/issues/8315
Signed-off-by: Simon McVittie <smcv@collabora.com>
Date: 2023-09-29 06:55:17 -07:00