| Summary: | Pixel missing in SDL_RenderDrawLines | | |
|---|---|---|---|
| Product: | SDL | Reporter: | Sean McKean <smckean83> |
| Component: | render | Assignee: | Sam Lantinga <slouken> |
| Status: | RESOLVED WONTFIX | QA Contact: | Sam Lantinga <slouken> |
| Severity: | minor | | |
| Priority: | P2 | | |
| Version: | 2.0.0 | | |
| Hardware: | x86 | | |
| OS: | Linux | | |
Attachments:

- output of glxinfo
- Changes made to GL_RenderDrawLines
- new SDL_render_gl.c file
- Refactored SDL_render_gl.c
- SDL_render_gl.c: GL_RenderDrawLines() with glDrawArrays
- SDL_glfuncs.h: Added support for glEnable/DisableClientState, glVertexPointer and glDrawArrays
- Draws all points in lines with glBegin() on Linux
Description

Sean McKean
2013-10-17 04:00:04 UTC

What video card do you have? What is the output of glxinfo? Different drivers treat line drawing differently. Take a look at the code in GL_RenderDrawLines() and see the hacks in there regarding ending the lines. Please feel free to play around with it and let me know what works for you. Cheers!

Created attachment 1372 [details]
output of glxinfo
```
OpenGL vendor string: Tungsten Graphics, Inc
OpenGL renderer string: Mesa DRI Intel(R) IGD x86/MMX/SSE2
```

Onboard Intel graphics for a netbook.
Created attachment 1375 [details]
Changes made to GL_RenderDrawLines
Made some changes to ensure the right-most and bottom-most pixels are explicitly rendered on Linux. I haven't tested this on Windows/OSX yet, but it should work as it did before on those platforms.
The new section draws two extra pixels only if 'count > 2', meaning there is more than one line segment (either a strip or a loop). Feel free to make changes to it for readability or optimization.
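A minimal sketch of the idea behind this patch (helper name is hypothetical; the real change lives inside GL_RenderDrawLines() in SDL_render_gl.c): scan the vertex list for the right-most and bottom-most vertices, which are the pixels some drivers drop, so they can be drawn explicitly as points afterwards.

```c
#include <stddef.h>

typedef struct { float x, y; } FPoint;  /* stands in for SDL_FPoint */

/* Hypothetical helper: find the indices of the right-most and
 * bottom-most vertices, the pixels some Linux drivers fail to
 * rasterize at the ends of line strips/loops. */
static void find_extreme_vertices(const FPoint *points, size_t count,
                                  size_t *rightmost, size_t *bottommost)
{
    size_t r = 0, b = 0;
    for (size_t i = 1; i < count; ++i) {
        if (points[i].x > points[r].x) r = i;
        if (points[i].y > points[b].y) b = i;
    }
    *rightmost = r;
    *bottommost = b;
}

/* The renderer would then emit those two vertices again as points:
 *   glBegin(GL_POINTS);
 *   glVertex2f(points[rightmost].x, points[rightmost].y);
 *   glVertex2f(points[bottommost].x, points[bottommost].y);
 *   glEnd();
 */
```

The GL calls are shown as comments only; the selection logic above is the portion that can run without a GL context.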
This patch applied somewhere odd in the code. Can you attach the whole file and explain what your changes are intended to do? Thanks!

Created attachment 1377 [details]
new SDL_render_gl.c file
Attached is the SDL2-2.0.0 file that was changed. All changes are in function GL_RenderDrawLines.
I reorganized some of the code and made sure the new code only runs when neither __APPLE__ nor __WIN32__ is defined (meaning we're on Linux). The first section tests whether there are more than two points; if so, it finds and draws the two points that are furthest right and furthest down. The line strip/loop is then drawn as before. The last change draws the final vertex when not on Linux (as before); on Linux it runs the previous point-drawing code only when there are exactly two vertices, since the first section already covers the needed pixels when more than two vertices are present.

Hope this helps. BTW, the first change is on line 1009.

Created attachment 1378 [details]
Refactored SDL_render_gl.c
I refactored it to simplify the logic a little bit. Can you see if this works?
It works for line strips only, not continuous line loops. Perhaps take the GL_POINTS drawing code out of the if (count > 2 ...) block to make sure it executes whether drawing a LOOP or a STRIP.

Created attachment 1379 [details]
SDL_render_gl.c: GL_RenderDrawLines() with glDrawArrays
Created attachment 1380 [details]
SDL_glfuncs.h: Added support for glEnable/DisableClientState, glVertexPointer and glDrawArrays
After playing around with some test cases, it looks like more than two points can be missing for a set of lines: for instance, a circle composed of many line segments, or a spirograph-like set. I found that a good solution is to enable glVertexPointer and draw all points and lines through glDrawArrays. It might be overkill for a single line segment, but specifying lines and points one vertex at a time seemed to require too many iterative loops. I also added a pair of glTranslatef calls: glTranslatef(0.5f, 0.5f, 0.0f) at the beginning of the function, undone at the end, for proper rendering.
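A sketch of the vertex-array approach described above (helper name is hypothetical): pack the point list into the flat x,y float array that glVertexPointer consumes. For illustration, the half-pixel offset that the patch applies with glTranslatef(0.5f, 0.5f, 0.0f) is baked into the array here instead.

```c
#include <stddef.h>

typedef struct { float x, y; } FPoint;  /* stands in for SDL_FPoint */

/* Hypothetical helper: flatten the point list into the interleaved
 * x,y array that glVertexPointer(2, GL_FLOAT, 0, verts) expects,
 * applying the 0.5f half-pixel offset so line endpoints land on
 * pixel centers. 'verts' must hold 2*count floats. */
static void pack_vertices(const FPoint *points, size_t count, float *verts)
{
    for (size_t i = 0; i < count; ++i) {
        verts[2 * i]     = points[i].x + 0.5f;
        verts[2 * i + 1] = points[i].y + 0.5f;
    }
}

/* The draw call then becomes roughly:
 *   glEnableClientState(GL_VERTEX_ARRAY);
 *   glVertexPointer(2, GL_FLOAT, 0, verts);
 *   glDrawArrays(GL_LINE_STRIP, 0, (GLsizei)count);
 *   glDrawArrays(GL_POINTS, 0, (GLsizei)count);  // fill in dropped pixels
 *   glDisableClientState(GL_VERTEX_ARRAY);
 */
```

Drawing all vertices once more as GL_POINTS after the strip is what covers the pixels the driver drops, at the cost of the extra point pass the later comments discuss.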
Please let me know how well this works
Looks good, thanks! http://hg.libsdl.org/SDL/rev/909b0d7fe4dd

Whoops, edited the wrong bug.

Looks good, thanks! I massaged it a bit and committed it here: http://hg.libsdl.org/SDL/rev/7e9cd472a889

Using glDrawArrays() ended up causing performance problems on Windows. Can you verify that this change fixes the original bug here? https://hg.libsdl.org/SDL/rev/b339af982b62 Thanks!

Created attachment 1409 [details]
Draws all points in lines with glBegin() on Linux
Some gaps were still present on Linux; this patch draws all vertex points, mostly because I'm not sure which points will go missing for more complicated graphs.
That's a pretty significant performance difference, and according to the spec you shouldn't need it for line loops. Are you sure you need all the points in all cases? Can you construct simple test cases to show the problem? If you want, you can even add them to the automated tests in test/testautomation_render.c so we can see if this ever gets broken again.

I tried reproducing the regression on a different Ubuntu desktop with an ATI Radeon. It looked as it was supposed to, so I'm guessing the problem is specific to my Intel GPU. I'll look more closely at the behavior and see if I can spot which pixels are not being drawn, but for now I'll just add SDL_RenderDrawPoints() to my code where it's needed.

After studying the problem in more depth, I am fairly sure it is not worth solving, especially for just one make of integrated graphics chip, and only on Linux. Anywhere from one or two end/middle points to nearly half of them can be missing, depending on the graph. I also tried updating the Intel drivers but got the same result. Thank you for responding to my case; I suppose I'll just have to make do with an extra call to SDL_RenderDrawPoints in some cases.
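The kind of simple test case asked for above can be sketched without a GL context: rasterize a polyline with a plain Bresenham walk (which, unlike the buggy driver path, always covers both endpoints) and assert that every vertex is covered. Against a real renderer the same check would be done by reading back the framebuffer; all names here are hypothetical.

```c
#include <stdlib.h>

#define W 32
#define H 32

/* Hypothetical reference rasterizer: a plain Bresenham walk that
 * always plots both endpoints of each segment. */
static void draw_line(unsigned char grid[H][W], int x0, int y0, int x1, int y1)
{
    int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;
    for (;;) {
        grid[y0][x0] = 1;
        if (x0 == x1 && y0 == y1) break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }
        if (e2 <= dx) { err += dx; y0 += sy; }
    }
}

/* Check that every vertex of a polyline was covered -- the property
 * the Intel driver was violating for SDL_RenderDrawLines(). */
static int all_vertices_covered(unsigned char grid[H][W],
                                int (*pts)[2], size_t count)
{
    for (size_t i = 0; i < count; ++i)
        if (!grid[pts[i][1]][pts[i][0]])
            return 0;
    return 1;
}
```

In an SDL automated test, all_vertices_covered() would instead sample the pixels at each vertex after calling SDL_RenderDrawLines() on a software or read-back target.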