| Summary: | OpenGL renderer chosen in Linux when not specified in SDL_CreateWindowAndRenderer | | |
|---|---|---|---|
| Product: | SDL | Reporter: | Ryan H <rhajdaj> |
| Component: | video | Assignee: | Ryan C. Gordon <icculus> |
| Status: | RESOLVED INVALID | QA Contact: | Sam Lantinga <slouken> |
| Severity: | normal | | |
| Priority: | P2 | CC: | amaranth72, icculus |
| Version: | 2.0.3 | | |
| Hardware: | x86 | | |
| OS: | Linux | | |
You can choose which of the available renderers to use via SDL_CreateRenderer() after SDL_CreateWindow().

(In reply to Ryan H from comment #0)
> I'd like to be able to select the x11 (non-GL) driver in Linux environments
> for performance purposes (for 2D app).

The OpenGL renderer is hardware-accelerated; the software renderer is not. You will get much better performance using hardware acceleration.

Also keep in mind that there is a difference between the video subsystem backend (the "SDL_Video driver") and the renderer backend (the "SDL_Render driver"). The former deals only with interactions with the window manager (e.g. X11 versus Wayland); the latter deals only with rendering (e.g. OpenGL versus OpenGL ES versus software rendering).

Also, I don't know what version of Mesa you have, but you should make sure you have the latest (version 10). Recent versions of Mesa include llvmpipe, a software rendering backend for OpenGL with much better performance than swrast, which is legacy technology by now.

Thanks for the response. Can you please specify which flag to use in the call to SDL_CreateRenderer() to disable OpenGL? Here are the flags I see on the SDL wiki:

- SDL_RENDERER_SOFTWARE
- SDL_RENDERER_ACCELERATED
- SDL_RENDERER_PRESENTVSYNC
- SDL_RENDERER_TARGETTEXTURE

Also, I take it I'll need to call SDL_CreateWindow() and SDL_CreateRenderer() separately (instead of calling SDL_CreateWindowAndRenderer()) to accomplish this?

(In reply to Ryan H from comment #3)
> Thanks for the response. Can you please specify which flag to use in the
> call to SDL_CreateRenderer to disable OpenGL? Here are the flags I see from
> the SDL Wiki:

There are two ways to do this. If you don't specify a renderer (either by passing -1 to SDL_CreateRenderer(), or by using SDL_CreateWindowAndRenderer()), SDL will choose a default. You can force the default with a hint:

```c
SDL_SetHint(SDL_HINT_RENDER_DRIVER, "software");
```

Alternately, you can specify a renderer by index with SDL_CreateRenderer():
```c
int getRendererIndex(const char *rendererName)
{
    const int total = SDL_GetNumRenderDrivers();
    for (int i = 0; i < total; i++) {
        SDL_RendererInfo info;
        if (SDL_GetRenderDriverInfo(i, &info) == 0) {
            if (SDL_strcmp(info.name, rendererName) == 0) {
                return i;
            }
        }
    }
    return -1;  /* let SDL pick one. */
}

SDL_CreateRenderer(window, getRendererIndex("software"), 0);
```

...We really should add a way to do this where you just give it the name, but that's not how it works at the moment, so you need to roll your own getRendererIndex() like the above.

Anyhow, those are the options. That being said, be sure you really want to force a software renderer. It's almost certainly not what you should do, even if you plan to do software rendering yourself (you can still get benefits from using OpenGL to push an otherwise software-rendered game to the screen).

Going to mark this bug as INVALID, because I think we've decided this isn't a bug in SDL and you have options to get what you want done... if I'm wrong about that, please feel free to reopen this bug.

--ryan.

Thanks for the response. To implement your solution, I'll probably need to use some "#ifdef UNIX"s, since Windows works well out of the box, but that's no problem.

As far as the documentation goes, I see that for SDL_CreateWindowAndRenderer() the flags argument lists the following:

- SDL_WINDOW_OPENGL: window usable with OpenGL context

Based on what you've said, it appears to me that SDL ignores this flag. If that is the case, I suggest eliminating the flag; if it is not, I'd suggest updating the docs to make it more obvious how SDL uses it. Anyway, just a suggestion. But yes, as far as this bug is concerned, I'm good.

Thanks,
- Ryan
In a 32-bit Linux environment (using Linux From Scratch version 6.8, kernel version 2.6.37), an OpenGL renderer is used by SDL_RenderCopy() even though SDL_WINDOW_OPENGL is not passed as a flag to SDL_CreateWindowAndRenderer(). For example, the following call:

```c
SDL_CreateWindowAndRenderer(960, 720, 0, &pWindow, &pRenderer);
```

results in the following backtrace from SDL_RenderCopy():

```
#0  0xb68d79da in fetch_vector4 () from /usr/lib/dri/swrast_dri.so
#1  0xb68d92c9 in _mesa_execute_program () from /usr/lib/dri/swrast_dri.so
#2  0xb69960a2 in _swrast_exec_fragment_program () from /usr/lib/dri/swrast_dri.so
#3  0xb68f26a9 in _swrast_write_rgba_span () from /usr/lib/dri/swrast_dri.so
#4  0xb690d7b0 in general_triangle () from /usr/lib/dri/swrast_dri.so
#5  0xb68e7c50 in _swrast_validate_triangle () from /usr/lib/dri/swrast_dri.so
#6  0xb68e7dc2 in _swrast_Triangle () from /usr/lib/dri/swrast_dri.so
#7  0xb691976e in triangle_rgba () from /usr/lib/dri/swrast_dri.so
#8  0xb68bb613 in _tnl_render_tri_strip_verts () from /usr/lib/dri/swrast_dri.so
#9  0xb68bccce in run_render () from /usr/lib/dri/swrast_dri.so
#10 0xb68b1cee in _tnl_run_pipeline () from /usr/lib/dri/swrast_dri.so
#11 0xb68b2b56 in _tnl_draw_prims () from /usr/lib/dri/swrast_dri.so
#12 0xb68b2e59 in _tnl_vbo_draw_prims () from /usr/lib/dri/swrast_dri.so
#13 0xb68aafdd in vbo_exec_vtx_flush () from /usr/lib/dri/swrast_dri.so
#14 0xb68a922a in vbo_exec_FlushVertices_internal () from /usr/lib/dri/swrast_dri.so
#15 0xb68a93c2 in vbo_exec_FlushVertices () from /usr/lib/dri/swrast_dri.so
#16 0xb683963e in enable_texture () from /usr/lib/dri/swrast_dri.so
#17 0xb6839bc7 in _mesa_set_enable () from /usr/lib/dri/swrast_dri.so
#18 0xb683bb15 in _mesa_Disable () from /usr/lib/dri/swrast_dri.so
#19 0xb7793d06 in GL_RenderCopy (renderer=0x96c0e68, texture=0x9cc17d0, srcrect=0xbfc5ea40,
    dstrect=0xbfc5ea20) at /home/fred/SDL2-2.0.3/src/render/opengl/SDL_render_gl.c:1234
#20 0xb778bfa1 in SDL_RenderCopy_REAL (renderer=0x96c0e68, texture=0x9cc17d0, srcrect=0x0,
    dstrect=0x0) at /home/fred/SDL2-2.0.3/src/render/SDL_render.c:1689
#21 0xb77749f9 in SDL_RenderCopy (a=0x96c0e68, b=0x9cc17d0, c=0x0, d=0x0)
    at /home/fred/SDL2-2.0.3/src/dynapi/SDL_dynapi_procs.h:378
```

My build configuration is as follows:

```
Enabled modules : atomic audio video render events joystick haptic power
                  filesystem threads timers file loadso cpuinfo assembly
Assembly Math   : mmx 3dnow sse
Audio drivers   : disk dummy oss alsa(dynamic) arts(dynamic)
Video drivers   : dummy x11(dynamic) opengl
X11 libraries   : xcursor xinerama xinput2 xrandr xscrnsaver xshape xvidmode
Input drivers   : linuxev linuxkd
Using libudev   : YES
Using dbus      : NO
```

I'd like to be able to select the x11 (non-GL) driver in Linux environments for performance purposes (for a 2D app).

Thanks,
- Ryan