| Summary: | xxx_RenderReadPixels - incorrect behaviour in certain conditions | | |
|---|---|---|---|
| Product: | SDL | Reporter: | PoopiSan <poopisan> |
| Component: | video | Assignee: | Sam Lantinga <slouken> |
| Status: | RESOLVED FIXED | QA Contact: | Sam Lantinga <slouken> |
| Severity: | normal | | |
| Priority: | P2 | | |
| Version: | HG 2.0 | | |
| Hardware: | All | | |
| OS: | Other | | |
Can you double-check this in the current snapshot? http://www.libsdl.org/tmp/SDL-2.0.zip

I think this was fixed at some point, since the viewport is actually used now:

```c
real_rect.x = renderer->viewport.x;
real_rect.y = renderer->viewport.y;
real_rect.w = renderer->viewport.w;
real_rect.h = renderer->viewport.h;
```

Actually, I just ran into this bug myself and fixed it. Thanks!

http://hg.libsdl.org/SDL/rev/fe82b639c4d6
GLES2_RenderReadPixels, GLES_RenderReadPixels, GL_RenderReadPixels, and possibly other backends are implemented incorrectly. If the current target's viewport differs from the window size, the function reads garbage, even though according to its documentation it should work with any rendering target ("Read pixels from the current rendering target."). This seems to be caused by this line:

```c
SDL_GetWindowSize(window, &w, &h);
```

A quick fix, changing it to

```c
if (renderer->target == NULL) {
    SDL_GetWindowSize(window, &w, &h);
} else {
    w = rect->w;
    h = rect->h;
}
```

seems to solve my case (the viewport is equal to the texture size, no scaling). After this change all textures are read correctly. This function is very useful for debugging texture rendering.
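The quick fix above can be sketched in isolation: pick the extent to clamp against from the window only when no render target is bound, and from the caller's rect otherwise. `Renderer`, `Texture`, and `Rect` below are hypothetical stand-ins written for this example (the `window_w`/`window_h` fields stand in for `SDL_GetWindowSize`); they are not SDL's real structs.

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical stand-ins, just enough to model the quick fix. */
typedef struct { int x, y, w, h; } Rect;
typedef struct { int w, h; } Texture;
typedef struct {
    Texture *target;          /* NULL when rendering to the window */
    int window_w, window_h;   /* stand-in for SDL_GetWindowSize() */
} Renderer;

/* The extent a RenderReadPixels implementation should use:
 * the window size when no target is set (the original code used
 * this unconditionally, causing the bug), otherwise the requested
 * rect's size, as in the reporter's quick fix. */
void read_pixels_extent(const Renderer *r, const Rect *rect,
                        int *w, int *h)
{
    if (r->target == NULL) {
        *w = r->window_w;
        *h = r->window_h;
    } else {
        *w = rect->w;
        *h = rect->h;
    }
}
```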