
SDL_SetVideoMode() does not return correct bpp information for 16bpp fullscreen contexts #628

Closed
SDLBugzilla opened this issue Feb 10, 2021 · 0 comments
Labels: wontfix (This will not be worked on)

Comments


This bug report was migrated from our old Bugzilla tracker.

These attachments are available in the static archive:

Reported in version: 1.2.14
Reported for operating system, platform: Mac OS X 10.6, x86

Comments on the original bug report:

On 2011-06-22 15:04:59 +0000, Jonas Maebe wrote:

Created attachment 632
Test program that requests a 16-bit context but gets a 15-bit context masquerading as a 16-bit context instead

If you request a 16 bpp fullscreen context on Mac OS X 10.6, SDL allocates a 15 bpp context even if SDL_ANYFORMAT is not specified on input to SDL_SetVideoMode(). This in itself is not really a big problem (and the flag does magically end up in the flags of the surface returned by SDL_SetVideoMode()), but the returned surface claims to be a 16-bit context, which is a real problem. Only by looking at the color masks/shift counts/"loss" bits can you determine that it is actually a 15-bit context.
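The attached program itself is not reproduced in this archive; what follows is a minimal sketch of such a test under SDL 1.2 (the structure and output format are approximations, not the attached testfs.c):

/* Hedged reconstruction of the kind of test described above: request a
 * 16 bpp fullscreen mode and dump the format SDL claims to have given
 * back. Build like the gcc command below. */
#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    SDL_Surface *s;
    Uint32 flags = SDL_FULLSCREEN | SDL_HWSURFACE;  /* 0x80000001 */

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    printf("setting video mode 640x480x16, flags: 0x%x\n", flags);
    s = SDL_SetVideoMode(640, 480, 16, flags);
    if (s == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    /* mask:shift:loss per channel, in the same shape as the output below */
    printf("-> flags==0x%x, bpp==[%d;%d], r==%x:%d:%d, g==%x:%d:%d, b==%x:%d:%d, a==%x:%d:%d\n",
           s->flags, s->format->BytesPerPixel, s->format->BitsPerPixel,
           s->format->Rmask, s->format->Rshift, s->format->Rloss,
           s->format->Gmask, s->format->Gshift, s->format->Gloss,
           s->format->Bmask, s->format->Bshift, s->format->Bloss,
           s->format->Amask, s->format->Ashift, s->format->Aloss);

    SDL_Quit();
    return 0;
}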

I compile the attached program via

gcc -arch i386 -I/sw/include/SDL -L/sw/lib -lSDL -lSDLmain -framework Cocoa -o testfs testfs.c

(SDL 1.2.14-6 is installed via fink, but it also happens with "regular" SDL as evidenced by http://sourceforge.net/tracker/?func=detail&aid=2999634&group_id=52551&atid=467232)

When executing it, I get this output:

setting video mode 640x480x16, flags: 0x80000001
-> flags==0x81000001, bpp==[2;16], r==7c00:10:3, g==3e0:5:3, b==1f:0:3, a==0:0:8

As you can see, the surface says it's 16 bit, but the r/g/b masks clearly show that it's only 15 bit. And when pixels are written to it, they are also clearly interpreted as 5/5/5 rather than 5/6/5 (see the DOSBox bug mentioned above).
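To read those mask:shift:loss triples, the significant bits per channel can be counted directly from the mask (a small sketch, not part of the original report):

/* Sketch: count the significant bits in a channel mask. For the
 * output above, mask_bits(0x7c00) == 5, so red has 5 bits; with
 * 5 + 5 + 5 == 15, the mode is x555 no matter what BitsPerPixel
 * claims. Equivalently, bits per channel == 8 - loss. */
static int mask_bits(unsigned int mask)
{
    int bits = 0;
    while (mask != 0) {
        bits += (int)(mask & 1u);
        mask >>= 1;
    }
    return bits;
}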

I'm running this on a 13" MacBook Pro with an NVIDIA GeForce 320M and Mac OS X 10.6.7.

PS: I set the severity to "major" because it affects DOSBox quite severely.
PPS: the DOSBox bug report referred to above also contains information about a problem that is actually a DOSBox bug (namely that it can't deal with the BGRA format of windowed 32 bpp contexts).

On 2011-07-01 12:06:03 +0000, Jonas Maebe wrote:

Created attachment 640
Correctly set the bpp info for requested 16 bit quartz contexts

I've attached a patch that fixes the problem. It's a one-liner: it sets the bpp info for "16 bit" contexts returned by Quartz to 15 bits, since they contain only 15 bits of colour information (5 bits each of red, green, and blue).
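The attachment isn't reproduced here; in spirit, the change would look something like the following (the variable name and surrounding context are assumptions about the Quartz backend, not the actual patch):

/* Hedged sketch of the described one-liner; the real change is
 * attachment 640, and the name device_bpp here is an assumption,
 * not a quote from SDL's Quartz code: */
if (device_bpp == 16) {
    device_bpp = 15;  /* Quartz "16 bit" modes carry only 5/5/5 colour */
}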

As a result, when requesting a 16-bit surface from Quartz you no longer get a hardware surface back: the SDL_HWSURFACE flag (= 0x1) is no longer set. But that is correct, since the hardware does not support 16-bit surfaces.

I've verified that it fixes the DOSBox problem. The test program now prints:

setting video mode 640x480x16, flags: 0x80000001
-> flags==0x80000000, bpp==[2;16], r==f800:11:3, g==7e0:5:2, b==1f:0:3, a==0:0:0

So it still returns a 16-bit context, but it no longer claims to be a hardware surface, and SDL now properly translates between the 16-bit surface and the 15-bit Quartz context (the latter part is not demonstrated by the test program, but it is by DOSBox).

On 2011-08-21 08:58:40 +0000, Ryan C. Gordon wrote:

Comment on attachment 640
Correctly set the bpp info for requested 16 bit quartz contexts

Fixing mimetype on attachment.

On 2011-08-21 08:59:24 +0000, Ryan C. Gordon wrote:

Comment on attachment 632
Test program that requests a 16-bit context but gets a 15-bit context masquerading as a 16-bit context instead

Fixing mimetype on attachment.

On 2011-12-29 01:42:39 +0000, Sam Lantinga wrote:

I'm hesitant to change the return value for existing SDL 1.2 programs. If I remember right, 16-bit x555 is actually a valid pixel format, and many programs can't correctly handle a bpp value of 15.

On 2011-12-29 03:02:02 +0000, Jonas Maebe wrote:

(In reply to comment #4)

> I'm hesitant to change the return value for existing SDL 1.2 programs. If I remember right, 16-bit x555 is actually a valid pixel format, and many programs can't correctly handle a bpp value of 15.

I just wanted to add that my patch should not result in apps suddenly getting 15 bpp contexts. In fact, they will now get a 565 16-bit context if they ask for a 16-bit context, rather than an x555 one (which, I agree, is indeed a 16-bit pixel format, but one with only 15 bits of colour information). It simply won't be a hardware surface anymore.
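For reference, the two 16-bit layouts differ only in the green channel; a worked sketch (not from the report) of packing the same 8-bit-per-channel colour both ways:

/* Sketch: packing a colour as x555 vs 565. Only green differs: 565
 * keeps one extra green bit. The masks match the outputs quoted
 * above (7c00/3e0/1f vs f800/7e0/1f). */
#include <stdint.h>

static uint16_t pack_x555(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3));
}

static uint16_t pack_565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}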

On 2011-12-29 12:16:37 +0000, Sam Lantinga wrote:

Yeah, in the original design there were no assumptions about mask and channel ordering when specifying a specific depth in SDL_SetVideoMode(). In fact, the driver was intended to pick the fastest one and have the application convert assets as necessary to match (RGBA vs. ABGR, etc.), and 555 vs. 565 fell into this category.
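Under that design, portable SDL 1.2 code avoids baking in 555 vs 565 by mapping colours through the surface's own format; a small sketch (not from the thread):

#include "SDL.h"

/* Sketch: write one pixel to a 16 bpp surface without assuming 555 vs
 * 565; SDL_MapRGB packs the colour into whatever format the driver
 * actually picked. The caller must hold the surface lock first if
 * SDL_MUSTLOCK(s) is true. */
static void put_pixel16(SDL_Surface *s, int x, int y, Uint8 r, Uint8 g, Uint8 b)
{
    Uint16 mapped = (Uint16)SDL_MapRGB(s->format, r, g, b);
    Uint16 *row = (Uint16 *)((Uint8 *)s->pixels + y * s->pitch);
    row[x] = mapped;
}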

I certainly agree that your change is a good one; I just really don't want to change 15-year-old behavior. :)

SDL 1.3 is the place for updating these kinds of things.

SDLBugzilla added the bug and wontfix (This will not be worked on) labels on Feb 10, 2021.