| Summary: | SDL can't set video mode on Linux with MSAA more than 4x | | |
|---|---|---|---|
| Product: | SDL | Reporter: | Michael Kurinnoy <viewizard> |
| Component: | video | Assignee: | Sam Lantinga <slouken> |
| Status: | RESOLVED FIXED | QA Contact: | Sam Lantinga <slouken> |
| Severity: | normal | | |
| Priority: | P2 | CC: | icculus, matthias.bentrup |
| Version: | 1.2.15 | | |
| Hardware: | x86_64 | | |
| OS: | Linux | | |
Description
Michael Kurinnoy
2012-10-05 06:53:17 UTC
Forgot to say, the SDL error is "Couldn't find matching GLX visual".

---

I looked around a little more; this bug seems connected to the glXChooseVisual implementation in the NVIDIA drivers, not to libSDL directly. I really see nothing in SDL_x11gl.c that could be wrong with the initialization. I just tested an FBO with the GL_EXT_framebuffer_multisample extension, and everything works like a charm at 8 and 16 samples.

---

I guess this behaviour is caused by setting SDL_GL_ACCELERATED_VISUAL to 1. Unfortunately, the NVIDIA drivers set the "non-conformant" caveat on all MSAA visuals, and SDL requests a visual with no caveats when an accelerated visual is required (to disallow contexts with the "slow" caveat). If you set SDL_GL_ACCELERATED_VISUAL to -1, the driver can return an MSAA visual.

---

I checked this issue again on Linux with the 325.15 NVIDIA driver. It looks like the issue was fixed by NVIDIA in the drivers: I can't reproduce it now on the same hardware, with the same libSDL 1.2.15, and with the same test code that I used a year ago.

---

It sounds like this is fixed by NVIDIA.

---

(In reply to Sam Lantinga from comment #5)
> It sounds like this is fixed by NVIDIA.

I'm wondering if we should just ignore the SDL_GL_ACCELERATED_VISUAL flag for SDL 2.1 (or... 2.0.2?). That thing has caused problems like this before, and... who doesn't want an accelerated visual? It seems like a silly thing to make the app specify.

--ryan.

---

Not a bad idea... I think at this point it's only used to prevent an accelerated visual, but that's silly too. :)
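For reference, a minimal sketch of the scenario discussed above: requesting an 8x MSAA visual through SDL 1.2 and falling back when the driver offers no matching GLX visual. This is an assumption-laden reconstruction, not the reporter's actual test code; window size, bit depth, and the fallback strategy are illustrative.

```c
/* Sketch: requesting an 8x MSAA GLX visual with SDL 1.2.
 * On affected NVIDIA drivers, adding
 *   SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);
 * filtered out every MSAA visual (they all carry the "non-conformant"
 * caveat), so SDL_SetVideoMode failed with
 * "Couldn't find matching GLX visual". Leaving the attribute unset
 * (or setting it to -1, per the comment above) avoided the filter. */
#include <stdio.h>
#include "SDL.h"

int main(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 8);

    SDL_Surface *screen = SDL_SetVideoMode(640, 480, 32, SDL_OPENGL);
    if (screen == NULL) {
        /* Illustrative fallback: retry without multisampling rather
         * than failing outright. */
        fprintf(stderr, "8x MSAA visual unavailable: %s\n", SDL_GetError());
        SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 0);
        SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 0);
        screen = SDL_SetVideoMode(640, 480, 32, SDL_OPENGL);
    }

    SDL_Quit();
    return screen != NULL ? 0 : 1;
}
```

As the thread notes, the GL_EXT_framebuffer_multisample route (rendering into a multisampled FBO and resolving) sidesteps visual selection entirely, which is why it worked at 8 and 16 samples even when the windowed MSAA visual could not be found.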