| Summary: | X11_SetGammaRamp doesn't check the return value of XStoreColors | | |
|---|---|---|---|
| Product: | SDL | Reporter: | Sven Hesse <drmccoy> |
| Component: | video | Assignee: | Sam Lantinga <slouken> |
| Status: | RESOLVED ENDOFLIFE | QA Contact: | Sam Lantinga <slouken> |
| Severity: | normal | | |
| Priority: | P2 | CC: | drmccoy, sezeroz, stevenaaus |
| Version: | 1.2.15 | | |
| Hardware: | x86 | | |
| OS: | Linux | | |
| Attachments: | Trivial patch to check the XStoreColors return value | | |
Fixed, thanks! http://hg.libsdl.org/SDL/rev/3665bc284271

Hi, this patch (3665bc284271), tested via hg SDL-1.2, breaks brightness with Quakespasm on my Nvidia 260.19.29, Fedora 14 amd64 box. To reproduce, run Quakespasm in windowed mode, set in-game brightness to around 80%, and change the video resolution by hitting Alt-Enter or via the in-game menus. The game and the X display immediately get very bright, and remain that way until the app is closed. cheers, Steven

Interesting. For the record: where I originally noticed this bug was with Neverwinter Nights. Without the patch, the brightness controls in NWN do nothing on the RV350 laptop. With the patch, they work.

This bug escaped my attention for a while because at low gamma settings it is not very evident. You have to set the gamma to around 80% for it to be really obvious. It also seems to make our gamma slider non-linear: move the brightness slider up and down, and the brightness is kind of all over the place.

The thing is, Quakespasm is calling SDL_SetColors, not SDL_SetGammaRamp. If the SetGammaRamp function of SDL's driver (in this case X11_SetGammaRamp) fails, SDL_SetColors calls the SetColors function of its driver (in this case X11_SetColors): I guess that's where the problem is.

OK, here's the thing: I noticed that with SDL-1.2.15 and older my brightness does actually work, and with SDL-1.2-hg my display gets almost twice as bright.
So I added an fprintf(), like:
```diff
--- SDL_x11video.c~
+++ SDL_x11video.c
@@ -1481,8 +1481,9 @@ int X11_SetGammaRamp(_THIS, Uint16 *ramp
     xcmap[i].blue = ramp[2*256+c];
     xcmap[i].flags = (DoRed|DoGreen|DoBlue);
 }
-if ( XStoreColors(GFX_Display, SDL_XColorMap, xcmap, ncolors) != 0 ) {
+if ( (i = XStoreColors(GFX_Display, SDL_XColorMap, xcmap, ncolors)) != 0 ) {
     SDL_SetError("Setting gamma correction failed");
+    fprintf(stderr, "XStoreColors returned %d\n", i);
     return(-1);
 }
 XSync(GFX_Display, False);
```
... and I saw on my terminal "XStoreColors returned 1", i.e. BadRequest.
On the other hand, my man page for XStoreColors() says that only
BadAccess (10), BadColor (12), BadName (15), or BadValue (2) would be
generated.
Unless the above information is outdated or wrong, this is possibly a driver
error (mine is an AMD Radeon HD 7700 series, fglrx 13.4 driver, I think),
and SDL is ending up applying the gamma value twice.
And my laptop generates a BadRequest while /not/ applying the gamma value?

(In reply to Sven Hesse from comment #7)
> And my laptop generates a BadRequest while /not/ applying the gamma value?

Oh, I noticed that you also noticed the returned error code. (Damn Bugzilla not wrapping the long lines..) I surely cannot speak for your setup, but I described what mine does. The brightness does actually work, even though I may want to set my gamma value to rather small numbers such as 0.7 or 0.6. I don't have an NWN setup ready to test what it does. Besides, the man page clearly documents what error codes can be returned, and 1 (BadRequest) is not one of them. So I don't know what the good solution would be.

FWIW, Fedora is using a different patch, changing SDL_SetGamma() to skip SetGammaRamp() and directly use SetGamma(), since their SDL-1.2.15-5 (Jan. 2013):
http://pkgs.fedoraproject.org/cgit/SDL.git/log/
http://pkgs.fedoraproject.org/cgit/SDL.git/commit/?id=acadfda846514cf441f83f208e89532a0580a552
.. with reference to https://bugs.freedesktop.org/show_bug.cgi?id=27222
Perhaps this is a better solution.

(In reply to Ozkan Sezer from comment #8)
> (In reply to Sven Hesse from comment #7)
> > And my laptop generates a BadRequest while /not/ applying the gamma value?
>
> [...] I surely cannot speak for your setup, but I described what mine does. [...]

Clarification: What I described was with X server 1.5.2.

Went onto a laptop with X server 1.11.4 + X.org radeon drivers; brightness changing indeed does not work there without a patched SDL.

Hello, and sorry if you're getting several copies of this message by email, since we are closing many bugs at once here. We have decided to mark all SDL 1.2-related bugs as RESOLVED ENDOFLIFE, as we don't intend to work on SDL 1.2 any further, but didn't want to mark a large quantity of bugs as RESOLVED WONTFIX; this clearly shows what was left unattended to and makes it easily searchable. Our current focus is on SDL 2.0.

If you are still having problems with an ENDOFLIFE bug, your absolute best option is to move your program to SDL2, as it will likely fix the problem by default, and give you access to modern platforms and tons of super-cool new features. Failing that, we _will_ accept small patches to fix these issues, and put them in revision control, although we do not intend to do any further official 1.2 releases. Failing that, please feel free to contact me directly by email (icculus@icculus.org) and we'll try to find some way to help you out of your situation. Thank you, --ryan.
Created attachment 1243 [details] Trivial patch to check the XStoreColors return value

Here on my laptop with an ATI Mobility Radeon 9600 PRO Turbo (RV350), the free radeon drivers, and X.org 1.12.4, SDL_SetGamma() fails because it tries to use SDL_SetGammaRamp() if available. SDL_SetGammaRamp() uses XStoreColors(), and that call always fails with BadRequest (for whatever reason, I don't know). The SDL_Visual class check goes through, though.

Now, XStoreColors() failing is maybe another bug in SDL, or in X.org, or in the radeon drivers, or anywhere else, but I think SDL should still check that XStoreColors() doesn't error as well. For extra clarity and convenience, I'll attach a patch, but it's quite trivial.