
Bug 753

Summary: SDL_GL_SetSwapInterval doesn't work with nvidia drivers
Product: SDL
Component: video
Version: 1.2.13
Hardware: x86
OS: Linux
Status: RESOLVED FIXED
Severity: normal
Priority: P2
Keywords: target-1.2.14
Reporter: Alessandro <laynor>
Assignee: Ryan C. Gordon <icculus>
QA Contact: Sam Lantinga <slouken>
CC: icculus, renesd, sezeroz, simon
Attachments: Cache swap control value for both Mesa and SGI glX extensions.

Description Alessandro 2009-06-10 17:15:25 UTC
Setting the interval to 1 has no effect with the nvidia drivers.
It seems to be tied to the Mesa code path, which is not supported by the nvidia drivers. glXSetSwapIntervalSGI works.
Comment 1 Ryan C. Gordon 2009-09-12 21:39:14 UTC
It doesn't look like we ever load the glX extension functions... we check whether we have the extension, but never set the function pointers.

--ryan.
Comment 2 Ryan C. Gordon 2009-09-13 16:33:39 UTC
Tagging this bug with "target-1.2.14" so we can try to resolve it for SDL 1.2.14.

Please note that we may choose to resolve it as WONTFIX. This tag is largely so we have a comprehensive wishlist of bugs to examine for 1.2.14 (and so we can close bugs that we'll never fix, rather than have them live forever in Bugzilla).

--ryan.
Comment 3 Rene Dudfield 2009-09-22 12:17:44 UTC
hi,

Also confirmed at this bug report: http://bugzilla.libsdl.org/show_bug.cgi?id=697

You'll note Simon from that report uploaded his testgl results, and it's an nvidia driver too.

The sgi swap control extension is documented here:
http://www.opengl.org/registry/specs/SGI/swap_control.txt

The function is: "int glXSwapIntervalSGI(int interval)" not glXSetSwapIntervalSGI.


cheers,
Comment 4 Simon Williams 2009-09-22 15:00:47 UTC
If this doesn't work, then how is it that if I don't call SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, 1) in my application, I get very noticeable tearing?

Is there another swap control method being used there?
Comment 5 Rene Dudfield 2009-09-23 03:20:26 UTC
hi,

I'm not sure why it's working for you... puzzling!  I guess you'd need to put some print statements in there to see if it finds the function... or use the 'glxinfo' program to see whether it lists the extension.


The GLFW library uses glXSwapIntervalSGI internally.

Here's another report that it works for a nvidia user where the sdl function didn't:
http://www.gamedev.net/community/forums/topic.asp?topic_id=482099


cheers,
Comment 6 Sam Lantinga 2009-09-24 00:33:19 UTC
Ryan, can you look into this?
Comment 7 Simon Williams 2009-09-26 04:02:23 UTC
(In reply to comment #5)
> I'm not sure why it's working for you... puzzling!  I guess you'd need to put
> some print statements in there to see if it's checking the function... or use
> the 'glxinfo' program to see if it's printing out the extension.

> The GLFW library uses glXSwapIntervalSGI internally.

glxinfo says GLX_SGI_swap_control is present in both client and server. No mention of swap interval anywhere.
Comment 8 Ryan C. Gordon 2009-09-28 23:49:04 UTC
(In reply to comment #1)
> It doesn't look like we ever load the glX extension functions...we check if we
> have the extension, but never get the function pointers set.

(This is incorrect, we DO look up the function).

--ryan.
Comment 9 Ryan C. Gordon 2009-09-29 06:30:37 UTC
We appear to be using this correctly. I'll have to test on a Linux system with an Nvidia GPU later today, but perhaps this was a driver issue?

Also: this was definitely broken on Nvidia cards before SDL 1.2.12 (one of the people on the gamedev.net link mentioned 1.2.10), as we were looking for the wrong extension name, so we would think you lacked GLX_SGI_swap_control even if you had it.

--ryan.
Comment 10 Rene Dudfield 2009-09-29 07:48:38 UTC
Hi,


It looks like the interval parameter is slightly different.  However, SDL passes the same value in to both of them.

"glXSwapIntervalMESA returns GLX_BAD_VALUE if parameter <interval> is less than zero."

"glXSwapIntervalSGI returns GLX_BAD_VALUE if parameter <interval> is less than or equal to zero."

The code should be changed so it never passes zero to glXSwapIntervalSGI, since the spec defines that as an error.  The extension seems underspecified with regard to turning synchronization off entirely.  Unless the spec is wrong, and the code works?

Also it looks like SGI defaults to 1, and mesa defaults to 0.

Mesa defines glXSwapIntervalSGI like this:

/*** GLX_SGI_swap_control ***/

int glXSwapIntervalSGI(int interval)
{
   struct _glxapi_table *t;
   Display *dpy = glXGetCurrentDisplay();
   GET_DISPATCH(dpy, t);
   if (!t)
      return 0;
   return (t->SwapIntervalSGI)(interval);
}

where, if interval is zero, it passes it through anyway.  So maybe passing in zero is allowed after all... not sure.


There doesn't seem to be a way to query the value set by glXSwapIntervalSGI.  This means it would appear to have no effect on nvidia drivers (judging by the value returned from SDL), but it might still have an effect by actually making the swap interval work.

This might explain why some people report it's not working, while Simon Williams reports that it does in fact sync on his nvidia card.

** To fix this, X11_GL_GetAttribute should be changed so that it returns the value last sent to glXSwapIntervalSGI.  It seems the SDL_GL_GetAttribute call is the one with the problem here, potentially leading people to think the swap interval is not working.

wxGTK and ClanLib also use glXSwapIntervalSGI rather than the Mesa version, along with GLFW.


Another potential issue: the swap interval does not take effect until the next buffer swap.  The case where you call X11_GL_CreateContext and then immediately query the value may be undefined; it might give you back 0 even if you requested 1, since there has been no buffer swap yet.

** To fix this potential problem, swap the buffers in the create visual call?  That might create more issues than it's worth.



Sorry I can't test this out, as I lack an nvidia card... but maybe this research is helpful.

cheers,
Comment 11 Ryan C. Gordon 2009-10-09 22:34:55 UTC
(In reply to comment #10)
> Sorry I can't test out, as lacking a nvidia card... but maybe this research is
> helpful.

This research is definitely helpful!

> "glXSwapIntervalSGI returns GLX_BAD_VALUE if parameter <interval> is less than
> or equal to zero."
> 
> The code should change to not pass glXSwapIntervalSGI a zero - since that is
> defined as an error.  The extension seems a bit undefined with regard to how to
> not worry about synchronizing.  Unless the spec is wrong, and the code works?
> 
> Also it looks like SGI defaults to 1, and mesa defaults to 0.

It looks like the nvidia drivers default to 0 (no sync). If I explicitly set it to 1 with the SGI extension, testgl locks to 60fps. If I don't set it, it rolls in around 10,000fps.  :)

My guess is that the SGI extension docs are either out of date, or people realized that not being able to turn off vsync was a bad idea.

Without a doubt, though, this will fail on Irix, if we care about that, as SGI's docs explicitly say it will:

http://techpubs.sgi.com/library/tpl/cgi-bin/getdoc.cgi?coll=0650&db=man&fname=/usr/share/catman/g_man/cat3/OpenGL/glxswapintervalsgi.z


> There doesn't seem to be a way to query the glXSwapIntervalSGI set value.  This
> means it would seem to have no effect on nvidia drivers(via looking at the
> value returned from SDL) - but still might have an effect by actually causing
> swap interval to work.

Svn revision #4984 correctly caches the value we set for the SGI extension.

Svn revision #4983 sets the default value reported by SDL_GL_GetAttribute() to "-1" ... I'm not sure this is ideal, but it will have to do given that we can't know the default: some glX implementations seem to default to 0 and some to 1 (and Nvidia's __GL_SWAP_TO_VSYNC environment variable changes the default per-run, unless the app overrides it), and there's no way to query it. Once the user successfully sets a swap interval, GetAttribute will report it correctly.

> ** To fix this X11_GL_GetAttribute should be changed so that it returns the
> value last sent by glXSwapIntervalSGI.  It seems the SDL_GL_GetAttribute call
> is the one with the problem here.  Potentially leading to people thinking the
> swap interval is not working.

SDL_GL_GetAttribute() now returns success (per Bug #697), but might tell you the attribute is set to -1.

> ** to fix this potential problem swap the buffer in the create visual call? 
> That might create more issues than it's worth fixing.

We should probably just cache the value with the Mesa codepath, too. I'm attaching a patch to this bug report for Sam to make a decision on.

--ryan.
Comment 12 Ryan C. Gordon 2009-10-09 22:41:42 UTC
Created attachment 405 [details]
Cache swap control value for both Mesa and SGI glX extensions.


Attached patch for Sam to review.

--ryan.
Comment 13 Ryan C. Gordon 2009-10-09 22:42:17 UTC
Tossing bug to Sam.

--ryan.
Comment 14 Ryan C. Gordon 2009-10-09 23:45:27 UTC
Sam said patch is okay, so I committed it, svn revision #4987.

I believe that resolves this bug, so I'm marking it FIXED.

--ryan.