
Bug 4994

Summary: SDL_MasksToPixelFormatEnum does not work for 30bit (R10G10B10) xorg setup
Product: SDL
Component: video
Status: NEW
Severity: normal
Priority: P2
Version: 2.0.10
Hardware: x86_64
OS: Linux
Reporter: johannes hanika <hanatos>
Assignee: Sam Lantinga <slouken>
QA Contact: Sam Lantinga <slouken>

Description johannes hanika 2020-02-18 09:19:31 UTC
hi,

i'm trying to run SDL2/opengl on xorg configured with

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    30
    Option         "AllowEmptyInitialConfiguration" "true"
    SubSection     "Display"
        Depth       30
    EndSubSection
EndSection

(10 bits per colour channel). this works fine if i set

      SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 10);
      SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 10);
      SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 10);
      SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 2);

prior to creating the GL context. unfortunately i don't want to hardcode this, since other machines don't run 30bit X. so i'm trying to detect the situation at runtime by testing:

if(SDL_PIXELFORMAT_ARGB2101010 == SDL_GetWindowPixelFormat(Priv::window))

which unfortunately always returns

SDL_PIXELFORMAT_UNKNOWN

on said machine. this is because of what happens internally in video/SDL_pixels.c: if i patch

Uint32
SDL_MasksToPixelFormatEnum(int bpp, Uint32 Rmask, Uint32 Gmask, Uint32 Bmask,
                           Uint32 Amask)
{
+    fprintf(stderr, "bpp %d masks %x %x %x %x\n", bpp, Rmask, Gmask, Bmask, Amask);

it will output

bpp 30 masks 3ff ffc00 3ff00000 0

i.e. the masks report only 30 bpp and no alpha. in a way this is probably more precise than assuming 2 bit alpha, since X is indeed set to 30bit, not 32. but writing client code that assumes UNKNOWN means 30bit sounds brittle to me; i'd rather have the pixel format return something more explicit.

in my case i would be fine with an extra case statement that accepts 30bits and still returns SDL_PIXELFORMAT_ARGB2101010, but i don't know enough about the SDL internals to propose a fix that is aligned with the rest of the code here.

not sure about the priority; it seems that simply not setting the SDL_GL_RED_SIZE attribute defaults to a working state in both cases (8-bit and 10-bit).