| Summary: | SDL_SetVideoMode() does not return correct bpp information for 16bpp fullscreen contexts | | |
|---|---|---|---|
| Product: | SDL | Reporter: | Jonas Maebe <jonas.bugzilla> |
| Component: | video | Assignee: | Sam Lantinga <slouken> |
| Status: | RESOLVED WONTFIX | QA Contact: | Sam Lantinga <slouken> |
| Severity: | major | | |
| Priority: | P2 | CC: | jonas.bugzilla |
| Version: | 1.2.14 | | |
| Hardware: | x86 | | |
| OS: | Mac OS X 10.6 | | |
| Attachments: | Test program that requests a 16 bit context, but gets a 15 bit context masquerading as a 16 bit context instead<br>Correctly set the bpp info for requested 16 bit quartz contexts | | |
**Description** (Jonas Maebe, 2011-06-22 15:04:59 UTC)

Created attachment 640 [details]
Correctly set the bpp info for requested 16 bit quartz contexts
I've attached a patch that fixes the problem. It's a one-liner: it sets the bpp reported for "16 bit" contexts returned by Quartz to 15, since they only contain 15 bits of colour information (5 bits each for red, green, and blue).
As a result, when requesting a 16 bit surface from Quartz you no longer get a hardware surface back: the SDL_HWSURFACE flag (= 0x1) is no longer set. That is correct, since the hardware does not support 16 bit surfaces.
I've verified that it fixes the DOSBox problem. The test program now prints:
```
setting video mode 640x480x16, flags: 0x80000001
-> flags==0x80000000, bpp==[2;16], r==f800:11:3, g==7e0:5:2, b==1f:0:3, a==0:0:0
```
So it still returns a 16 bit context, but it no longer claims to be a hardware surface, and SDL now properly translates between the 16 bit surface and the 15 bit Quartz context (the latter part is not demonstrated by the test program, but it is demonstrated by DOSBox).
Comment on attachment 640 [details]
Correctly set the bpp info for requested 16 bit quartz contexts

Fixing mimetype on attachment.

Comment on attachment 632 [details]
Test program that requests a 16 bit context, but gets a 15 bit context masquerading as a 16 bit context instead

Fixing mimetype on attachment.
I'm hesitant to change the return value for existing SDL 1.2 programs. If I remember right, 16-bit x555 is actually a valid pixel format, and many programs can't correctly handle a bpp value of 15.

(In reply to comment #4)
> I'm hesitant to change the return value for existing SDL 1.2 programs. If I
> remember right 16-bit x555 is actually a valid pixel format, and many programs
> can't correctly handle a bpp value of 15.

I just wanted to add that my patch should not result in apps suddenly getting 15 bpp contexts. In fact, they will now get a 565 16 bit context if they ask for a 16 bit context, rather than an x555 one (which, I agree, is indeed a 16 bit pixel format, but with only 15 bits of colour information). It simply won't be a hardware surface anymore.

Yeah, in the original design there were no assumptions about mask and channel ordering when specifying a specific depth in SDL_SetVideoMode(). In fact, the driver was intended to pick the fastest one and have the application convert assets as necessary to match (RGBA vs ABGR, etc.), and 555 vs 565 fell into this category. I certainly agree that your change is a good one, I just really don't want to change 15-year-old behavior. :) SDL 1.3 is the place for updating these kinds of things.