| Summary: | OpenGL 3.3 on OS X Mavericks | | |
|---|---|---|---|
| Product: | SDL | Reporter: | jbates |
| Component: | video | Assignee: | Ryan C. Gordon <icculus> |
| Status: | RESOLVED FIXED | QA Contact: | Sam Lantinga <slouken> |
| Severity: | trivial | | |
| Priority: | P2 | CC: | amaranth72, JvsqfI8ZY2hr |
| Version: | HG 2.0 | | |
| Hardware: | x86_64 | | |
| OS: | Mac OS X (All) | | |
| Attachments: | Patch to fix requesting GL3.3, 4.0 and 4.1 contexts | | |
| | Simplify requesting OpenGL context versions on OSX | | |
Forget what I said about 3.1. Its version of a "core profile" is defined very differently from later versions. Accepting 4.0 would make sense, though.

Created attachment 1421 [details]
Patch to fix requesting GL3.3, 4.0 and 4.1 contexts

On my system (a MacBook Pro with an Intel HD 3000 and an AMD Radeon 6750M), after the commit that added code for requesting a GL 4.1 context (https://hg.libsdl.org/SDL/rev/bb624b1348da), requesting GL 4.1 now forces an Apple Software Renderer context, and fails completely if SDL_GL_ACCELERATED_VISUAL is requested. Requesting a GL 3.2 context returns GL 4.1 for my AMD 6750M, as expected.

I dug into SDL's code, and it looks like the kCGLOGLPVersion_GL4_Core pixel format attribute (used as of the above-linked commit) simply does not work properly: even if I create a CGL pixel format object and use that instead of an NSOpenGL one, it still never recognizes my AMD 6750M and always tries to fall back to the software renderer.

I created a patch which does two things: it no longer uses the kCGLOGLPVersion_GL4_Core constant when requesting 4.x (requesting a core context returns the highest version supported regardless), and it dynamically compares the requested OpenGL version to the actual returned one to determine whether the context should be returned successfully. This means requesting a GL 3.3 core profile context on Mavericks now works as expected, as does requesting GL 4.0 and 4.1. It should also continue to work for later GL versions when OS X is updated to support them, without recompiling SDL.

Created attachment 1574 [details]
Simplify requesting OpenGL context versions on OSX
This patch simplifies and future-proofs OpenGL context version selection on OSX.
The logic is:
- OpenGL ES is not supported
- core profile is only supported on 10.7 and up, so the OS version check is necessary
- only compatibility or core profile can be requested; OS X gives you the latest version it supports regardless of what version is requested
- the version it gives must be greater or equal to the version requested
- on success, update SDL's GL attributes with the context version number
I can only test on 10.9.1 with hardware that supports 3.3, so it would be good to test on other OS versions and hardware.
This approach makes me nervous, but it's not any worse than our current programmatic situation, and it's certainly yielding better results, so I'm applying it... but I suspect we'll be revisiting this problem in the future. I'll file a bug in Apple's Radar about it if getting the software renderer for 4.1 turns out to actually be their bug.

Daniel's patch is now https://hg.libsdl.org/SDL/rev/9bd65b58278e

Thanks, everyone, for working through this bug. :)

--ryan.
I was just looking at "Updated GL version tests for Mac OS X 10.9 ('Mavericks')" (http://hg.libsdl.org/SDL/rev/bb624b1348da). It adds support for OpenGL 4.1 but not OpenGL 3.3. According to https://developer.apple.com/graphicsimaging/opengl/capabilities/, some GPUs on OS X Mavericks support OpenGL 3.3 but not OpenGL 4.1. You should probably add 3.3 as an acceptable version when osversion >= 0x1090. Also, if the user requests a 3.1 or 4.0 core context, would it not make sense to accept those and silently return a higher-version context?