| Summary: | QZ_SetGammaRamp: Out-of-bounds memory access | ||
|---|---|---|---|
| Product: | SDL | Reporter: | Seth Willits <seth> |
| Component: | video | Assignee: | Sam Lantinga <slouken> |
| Status: | RESOLVED FIXED | QA Contact: | Sam Lantinga <slouken> |
| Severity: | critical | ||
| Priority: | P2 | CC: | icculus |
| Version: | 1.2.15 | ||
| Hardware: | x86 | ||
| OS: | Mac OS X (All) | ||
| Attachments: | Bad gamma curves caused by this bug | ||
This was fixed on April 17th, 2013, here...
https://hg.libsdl.org/SDL/rev/f7fd5c3951b9
...but has not made it into an official 1.2 release (and later patches have completely disabled QZ_SetGammaRamp() as it uses a deprecated OS X API that would crash some drivers on modern Macs).
--ryan.
Created attachment 1509 [details]
Bad gamma curves caused by this bug

QZ_SetGammaRamp() creates gamma LUTs that are only *255* entries large, but then sets the 256th element. This has obvious negative side effects. Simply changing tableSize to 256 fixes the potential crash and fixes problems where incorrect gamma curves are set.

```c
int QZ_SetGammaRamp (_THIS, Uint16 *ramp)
{
    const uint32_t tableSize = 256; /* CRITICAL FIX -- it was 255 */
    CGGammaValue redTable[tableSize];
    CGGammaValue greenTable[tableSize];
    CGGammaValue blueTable[tableSize];
    /* ... */
```