| Summary: | Render targets lose scale quality after minimizing a fullscreen window | | |
|---|---|---|---|
| Product: | SDL | Reporter: | Olli-Samuli Lehmus <ollisamuli.lehmus> |
| Component: | render | Assignee: | Sam Lantinga <slouken> |
| Status: | RESOLVED FIXED | QA Contact: | Sam Lantinga <slouken> |
| Severity: | normal | | |
| Priority: | P2 | CC: | ollisamuli.lehmus |
| Version: | 2.0.8 | | |
| Hardware: | x86 | | |
| OS: | Windows 10 | | |
| Attachments: | A sample program that reproduces the issue<br>proposed new SDL_Render_d3d.c | | |
All right, I managed to fix this for the DirectX 9 renderer. I'm unable to test the DirectX 11 renderer, but I assume it has the same issue. I looked into applying a similar patch there too, but the implementation looks a little different, so I wasn't able to.

If you're more familiar with DirectX this probably won't be new to you, but anyway: DirectX requires render targets to be recreated on device loss (which happens, for example, when a window loses focus). SDL destroys the render targets first and then recreates them. However, on recreation the global scale hint is looked up again, which results in the scale type changing. This is understandable, since there's otherwise no way to figure out what scale type the target texture was using.

To patch this, I had to stuff additional data into the SDL_Texture struct, which I'm not super happy about, but it's the cleanest approach I came up with:

- Added a `void *persistentdata` member to SDL_Texture in SDL_sysrender.h.
- Introduced a `D3D_DestroyTextureDeviceLoss` function in SDL_render_d3d.c that does not clean up the persistent data; it is invoked on device loss.
- `D3D_CreateTexture` now handles creating the persistent data and assigning the correct scale mode to the texture.

I'm not really sure what format you prefer for proposed patches, so I've just attached the complete changed SDL_render_d3d.c file. It needs the `persistentdata` member added to SDL_Texture as described in order to work. I'm also not really sure if this is the way you even want this to be patched.

Created attachment 3241 [details]
proposed new SDL_Render_d3d.c
Okay, I fixed this a different way, for all renderers: https://hg.libsdl.org/SDL/rev/d7582d7286aa

Thanks!
Created attachment 3220 [details]
A sample program that reproduces the issue

If one creates a window with the SDL_WINDOW_FULLSCREEN_DESKTOP flag, creates a render target while SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "linear") is in effect, and afterwards sets SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "nearest"), then after minimizing the window the scale quality hint is lost on the render target. Regular textures, however, do keep their interpolation modes. On a (maybe?) related note, with SDL_WINDOW_FULLSCREEN the render target just renders black after the minimize.

Attached is a program that I'm able to reproduce the issue with. You can press Enter to see how the rendering result changes after minimizing and restoring the window (note that this also happens if the user manually clicks out of the window and then opens it back up). There is a render target on the left side of the screen and a regular texture on the right side. This does not happen when SDL_Renderer is using OpenGL.
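The repro steps above can be sketched roughly as follows. This is a minimal sketch assuming SDL 2.0.x, not the attached sample program: the window size, texture format, red fill, and Enter-to-minimize handling are my own simplifications, and error checking is omitted.

```c
/* Build (assuming pkg-config and SDL2 are installed):
 *   cc repro.c $(pkg-config --cflags --libs sdl2) -o repro */
#include <SDL.h>

int main(int argc, char *argv[]) {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("repro",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        0, 0, SDL_WINDOW_FULLSCREEN_DESKTOP);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, 0);

    /* Render target created while the hint is "linear"... */
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "linear");
    SDL_Texture *target = SDL_CreateTexture(ren, SDL_PIXELFORMAT_RGBA8888,
                                            SDL_TEXTUREACCESS_TARGET, 64, 64);
    /* ...and the hint is changed afterwards. */
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "nearest");

    for (SDL_Event e;;) {
        while (SDL_PollEvent(&e)) {
            if (e.type == SDL_QUIT) goto done;
            if (e.type == SDL_KEYDOWN && e.key.keysym.sym == SDLK_RETURN)
                SDL_MinimizeWindow(win);   /* triggers D3D device loss */
        }
        /* Draw something into the render target... */
        SDL_SetRenderTarget(ren, target);
        SDL_SetRenderDrawColor(ren, 255, 0, 0, 255);
        SDL_RenderClear(ren);
        SDL_SetRenderTarget(ren, NULL);

        /* ...and blit it scaled up: the target was created "linear",
         * but after a minimize/restore cycle the D3D9 backend
         * recreates it using the current ("nearest") hint. */
        SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);
        SDL_RenderClear(ren);
        SDL_Rect dst = { 0, 0, 512, 512 };
        SDL_RenderCopy(ren, target, NULL, &dst);
        SDL_RenderPresent(ren);
    }
done:
    SDL_Quit();
    return 0;
}
```

With the D3D9 renderer before the fix, the upscaled target switches from smooth (linear) to blocky (nearest) after pressing Enter and restoring the window; with the OpenGL renderer it stays linear.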