
Bug 1235 - SDL_SetVideoMode() does not return correct bpp information for 16bpp fullscreen contexts
Summary: SDL_SetVideoMode() does not return correct bpp information for 16bpp fullscreen contexts
Status: RESOLVED WONTFIX
Alias: None
Product: SDL
Classification: Unclassified
Component: video
Version: 1.2.14
Hardware: x86 Mac OS X 10.6
Importance: P2 major
Assignee: Sam Lantinga
QA Contact: Sam Lantinga
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2011-06-22 15:04 UTC by Jonas Maebe
Modified: 2011-12-29 12:16 UTC

See Also:


Attachments
Test program that requests a 16 bit context, but gets a 15 bit context masquerading as a 16 bit context instead (1.59 KB, text/plain)
2011-06-22 15:04 UTC, Jonas Maebe
Correctly set the bpp info for requested 16 bit quartz contexts (390 bytes, patch)
2011-07-01 12:06 UTC, Jonas Maebe

Description Jonas Maebe 2011-06-22 15:04:59 UTC
Created attachment 632 [details]
Test program that requests a 16 bit context, but gets a 15 bit context masquerading as a 16 bit context instead

If you request a 16 bpp fullscreen context on Mac OS X 10.6, SDL allocates a 15 bpp context even if SDL_ANYFORMAT is not specified on input to SDL_SetVideoMode(). This in itself is not really a big problem (and the flag magically does end up in the flags of the surface returned by SDL_SetVideoMode), but the returned surface indicates that it actually is a 16 bit context, which *is* a real problem. Only by looking at the color masks/shift counts/"lost" bits can you determine that it is actually a 15 bit context.

I compile the attached program via

gcc -arch i386 -I/sw/include/SDL -L/sw/lib -lSDL -lSDLmain -framework Cocoa -o testfs testfs.c

(SDL 1.2.14-6 is installed via fink, but it also happens with "regular" SDL as evidenced by http://sourceforge.net/tracker/?func=detail&aid=2999634&group_id=52551&atid=467232)

When executing it, I get this output:

setting video mode 640x480x16, flags: 0x80000001
-> flags==0x81000001, bpp==[2;16], r==7c00:10:3, g==3e0:5:3, b==1f:0:3, a==0:0:8

As you can see, the surface says it's 16 bit, but the r/g/b masks clearly show that it's only 15 bit. And when putting pixels on it, they are also clearly interpreted as 5/5/5 rather than 5/6/5 (see the DOSBox bug mentioned above).

I'm running this on a 13" MacBook Pro with an NVIDIA GeForce 320M and Mac OS X 10.6.7.

PS: I set the severity to "major" because it affects DOSBox quite severely.
PPS: the DOSBox bug report referred to above also contains information about a problem that is actually a DOSBox bug (the fact that it can't deal with the BGRA format of windowed 32 bpp contexts).
Comment 1 Jonas Maebe 2011-07-01 12:06:03 UTC
Created attachment 640 [details]
Correctly set the bpp info for requested 16 bit quartz contexts

I've attached a patch that fixes the problem. It's a one-liner: it sets the bpp info for "16 bit" contexts returned by Quartz to 15 bit, since there is only 15 bit colour information (5 bits each of red, green, and blue).

As a result, when requesting a 16 bit surface from Quartz you no longer get a hardware surface back: the flag SDL_HWSURFACE (= 0x1) is no longer set. But that is correct, since the hardware does not support 16 bit surfaces.

I've verified that it fixes the DOSBox problem. The test program now prints:


setting video mode 640x480x16, flags: 0x80000001
-> flags==0x80000000, bpp==[2;16], r==f800:11:3, g==7e0:5:2, b==1f:0:3, a==0:0:0

So it still returns a 16 bit context, but no longer claims it's a hardware surface, and SDL now properly translates between the 16 bit surface and the 15 bit Quartz context (although the latter part is not demonstrated by the test program, it is demonstrated by DOSBox).
Comment 2 Ryan C. Gordon 2011-08-21 08:58:40 UTC
Comment on attachment 640 [details]
Correctly set the bpp info for requested 16 bit quartz contexts

Fixing mimetype on attachment.
Comment 3 Ryan C. Gordon 2011-08-21 08:59:24 UTC
Comment on attachment 632 [details]
Test program that requests a 16 bit context, but gets a 15 bit context masquerading as a 16 bit context instead


Fixing mimetype on attachment.
Comment 4 Sam Lantinga 2011-12-29 01:42:39 UTC
I'm hesitant to change the return value for existing SDL 1.2 programs.  If I remember right 16-bit x555 is actually a valid pixel format, and many programs can't correctly handle a bpp value of 15.
Comment 5 Jonas Maebe 2011-12-29 03:02:02 UTC
(In reply to comment #4)
> I'm hesitant to change the return value for existing SDL 1.2 programs.  If I
> remember right 16-bit x555 is actually a valid pixel format, and many programs
> can't correctly handle a bpp value of 15.

I just wanted to add that my patch should not result in apps suddenly getting 15 bpp contexts. In fact, they will now get a 565 16 bit context if they ask for a 16 bit context, rather than an x555 one (which, I agree, is indeed a 16 bit pixel format, but with 15 bit color information). It simply won't be a hardware surface anymore.
Comment 6 Sam Lantinga 2011-12-29 12:16:37 UTC
Yeah, in the original design there were no assumptions about mask and channel ordering when specifying a specific depth in SDL_SetVideoMode().  In fact the driver was intended to pick the fastest one and have the application convert assets as necessary to match (RGBA vs ABGR, etc), and 555 vs 565 fell in this category.

I certainly agree that your change is a good one, I just really don't want to change 15 year old behavior. :)

SDL 1.3 is the place for updating these kinds of things.