
SDL_CreateWindow (on Xorg with multiple X screens) only creates windows on the first screen #1176

Closed
SDLBugzilla opened this issue Feb 10, 2021 · 0 comments

Comments


This bug report was migrated from our old Bugzilla tracker.

These attachments are available in the static archive:

Reported in version: 2.0.1
Reported for operating system, platform: Linux, x86

Comments on the original bug report:

On 2013-10-28 17:19:58 +0000, Raymond wrote:

Hello,

I have a Linux (Slackware) setup with two independent X screens on an nVidia graphics card (running in so-called "Zaphod Mode", if that's the right term).
I compiled SDL 2.0.1 from the release source tarball (SDL2-2.0.1.tar.gz, md5 0eb97039488bf463e775295f7b18b227).

The example provided on http://wiki.libsdl.org/SDL_CreateWindow always opens its window on my first screen, even if I started it on my second screen.

The DISPLAY environment variable that targets my first screen is ':0.0', and my second screen is ':0.1'. I currently don't know how I would be able to open the SDL2 window on the 2nd screen, but I'd guess that the "default behavior" would always be to adhere to the $DISPLAY environment variable... that is, it should open on the screen that I expect, like any other X program (including SDL 1.2 programs)?

Thanks for your time and efforts,

Greetings,

Raymond Dresens.

On 2013-10-30 04:37:24 +0000, Alex Szpakowski wrote:

Try SDL_WINDOWPOS_UNDEFINED_DISPLAY(index) for the x and y arguments in SDL_CreateWindow.

On 2013-10-30 07:45:38 +0000, Raymond wrote:

Hello Alex,

It seems that the parameter does not have any effect on my system. I tried the following:

windows[i] = SDL_CreateWindow
    ("An SDL2 window",
     SDL_WINDOWPOS_UNDEFINED_DISPLAY(1),
     SDL_WINDOWPOS_UNDEFINED_DISPLAY(1),
     640, 480, SDL_WINDOW_OPENGL);

...and I tried display 0 as well, just to be sure. The following code btw...

int displays = SDL_GetNumVideoDisplays();

if (displays < 1)
    return 1;

printf("%d displays\n", displays);

SDL_Rect r;
int i;

for (i = 0; i < displays; i++)
{
    if (SDL_GetDisplayBounds(i, &r) < 0)
        return 1;

    printf("%d: %dx%d\n", i, r.w, r.h);
}

Outputs...

2 displays                                  
0: 1920x1200
1: 2560x1600

Greetings,

Raymond.

On 2013-10-31 18:02:17 +0000, Tony Rogvall wrote:

Hi!

I found the cause of this problem after a couple of days of poking around.
I made a local fix, but I am sure that the real fix is better handled
by the author(s) of SDL.

The problem is that SDL_CreateWindow (video/SDL_video.c) updates
window->x and window->y after calculating the screen, centering, and so on.
Later, X11_CreateWindow (video/x11/SDL_x11window.c) calls
SDL_GetDisplayForWindow/SDL_GetWindowDisplayIndex to find the display and
screen information; however, this is calculated with the updated
window->x and window->y (which no longer contain the screen info),
so the window will always end up on screen 0.

My fix caches the displayIndex in the window structure and only
resets it when moving windows, etc.

On 2013-10-31 18:48:52 +0000, Gabriel Jacobo wrote:

What behaviour do you get when running...

testrendercopyex --display 0 --fullscreen-desktop
testrendercopyex --display 1 --fullscreen-desktop

Thanks!

On 2013-11-02 09:19:25 +0000, Raymond wrote:

Hello Gabriel,

I ran the program from the "test/"-directory from the source tarball:

$ ./testrendercopyex --display 0 --fullscreen-desktop
Window requested size 640x480, got 1920x1200
INFO: 2750.61 frames per second

$ ./testrendercopyex --display 1--fullscreendesktop
Window requested size 640x480, got 1920x1200
INFO: 1607.83 frames per second

The second invocation, however, did display on my 2nd screen (though the reported size, 1920x1200, seems to be from the 1st screen?).

...so the library is capable of displaying something on my 2nd screen after all, albeit you have to force it to somehow.

I wonder why my test program can't be forced to run on the 2nd screen? I'll attempt to figure that out, this weekend.

Thanks,

Greetings,

Raymond.

On 2013-11-02 12:19:42 +0000, Tony Rogvall wrote:

Raymond, did you see my description of the problem?

/Tony

On 2013-11-03 08:45:05 +0000, Raymond wrote:

(In reply to Tony Rogvall from comment #6)

Hello Tony,

I would like to try your local fix, though I was not able to recreate it myself this morning. (I did attempt a crude 'fix' that involved creating my own X display (with XOpenDisplay) and using the default root window (with XDefaultRootWindow), but I ended up with an X11_CreateWindow function that yields an "X Error of failed request" ;)

Is it perhaps possible to make a patch, or am I able to clone a repository where this fix resides?

Thanks,

Greetings,

Raymond.

On 2013-11-03 10:23:28 +0000, Raymond wrote:

Hello Gabriel,

To add (and to fix) to my previous post:

$ ./testrendercopyex --display 1--fullscreendesktop

... should have been ...

$ ./testrendercopyex --display 1 --fullscreen-desktop

... which I also ran when I reported the results (I was sloppy during copy/pasting or something like that). The following command, however ...

$ ./testrendercopyex --display 1

... will create a window on display 0 rather than display 1.

The following call ...

windows[i] = SDL_CreateWindow
    (titleBuffer,
     SDL_WINDOWPOS_UNDEFINED_DISPLAY(1),
     SDL_WINDOWPOS_UNDEFINED_DISPLAY(1), 640, 480,
     SDL_WINDOW_FULLSCREEN_DESKTOP);

... will work as intended. Leaving out the FULLSCREEN_DESKTOP bit will not. The following call ...

windows[i] = SDL_CreateWindow
    (titleBuffer,
     SDL_WINDOWPOS_UNDEFINED,
     SDL_WINDOWPOS_UNDEFINED, 640, 480,
     SDL_WINDOW_FULLSCREEN_DESKTOP);

... will open the window on "display 0", regardless of the terminal (e.g. the DISPLAY environment variable) which is used to start the program in the first place (this would perhaps be the most concise description of the issue that I have reported).

Perhaps Tony's local fix will remedy both issues?

There is another peculiarity, by the way: on my system, when I try to implement a program that creates two "desktop full screen" windows, one for each display, only one fullscreen window can be visible at a time. Explicitly focusing one will hide the other. Perhaps this is a window management issue on my end (in my case, fluxbox)? It seems to be some kind of window/display focus-loss issue... but there's a twist: I only need to click once on the 'window 2' representation in my window bar in order to have it back. For 'window 1', I need to click it twice to get it back...

Greetings,

Raymond.

On 2013-11-03 11:51:55 +0000, Raymond wrote:

Hello again ;)

I'm not well versed in the internals of SDL, of course, but I decided to take a shot at studying the source code in SDL_video.c while attempting to implement Tony's suggestion myself. I now wonder (too?) why the SDL_Video structure (defined in SDL_sysvideo.h) doesn't retain a "display" attribute/int. Such an integer could be 0xffff (or something similar), meaning "SDL_DISPLAY_DEFAULT" (which would mean that the X11 code adheres to the DISPLAY environment variable)? It could perhaps remove the need for the current SDL_GetWindowDisplayIndex code?

Indeed: the display number is "smuggled" into the SDL_CreateWindow function inside the x (and y) coordinate of the window position, and it gets lost when the code that actually computes this position gets executed: it seems only possible to specify a display when you don't provide window coordinates?

This is of course just a guessing game on my part ;)

Greetings,

Raymond.

On 2013-11-03 12:38:24 +0000, Tony Rogvall wrote:

Hi Raymond.

Yes, you got it.
Hopefully this will be fixed in the next release. In my application I want to display Oculus Rift video output using a separate graphics card.
I do not have a good understanding of where to make the fixes without interfering with the other platforms (Windows/Mac, etc.).
If this is not fixed in the next release, I promise to make the effort of generating a patch. :-)

Regards

/Tony

On 2013-11-03 14:53:32 +0000, Raymond wrote:

Created attachment 1402
A patch

Hello,

I suppose that I have a patch now.

I did this more or less for fun (I haven't done such a thing before), but it's somewhat rancid: I introduced a new environment variable called SDL_DISPLAY (because I don't know how to do it properly). The 'API mechanics' of my change may be of interest though; that is, if I haven't made a silly mistake. It can be applied like this:

tar -xvf SDL2-2.0.1.tar.gz; cd SDL2-2.0.1; patch -p1 < ~/my-sdl.patch

The following commands do what I "expect" now:

$ ./testrendercopyex --display 0
warning: you're running SDL2-2.0.1-mine
warning: don't know how to determine default display; you could set SDL_DISPLAY instead.
(...)

$ ./testrendercopyex --display 1
warning: you're running SDL2-2.0.1-mine
warning: don't know how to determine default display; you could set SDL_DISPLAY instead.
(...)

$ ./testrendercopyex --display 0 --fullscreen-desktop
warning: you're running SDL2-2.0.1-mine
warning: don't know how to determine default display; you could set SDL_DISPLAY instead.
(...)
warning: you're running SDL2-2.0.1-mine
Window requested size 640x480, got 1920x1200
INFO: 2216.34 frames per second

$ ./testrendercopyex --display 1 --fullscreen-desktop
warning: you're running SDL2-2.0.1-mine
warning: don't know how to determine default display; you could set SDL_DISPLAY instead.
(...)
warning: you're running SDL2-2.0.1-mine
Window requested size 640x480, got 2560x1600
INFO: 1547.58 frames per second

This still doesn't work when started on the 2nd screen ...

$ ./testrendercopyex

... but this does now :) (and it doesn't give any warnings) ...

$ SDL_DISPLAY=1 ./testrendercopyex

... and btw ...

$ SDL_DISPLAY=2 ./testrendercopyex
warning: you're running SDL2-2.0.1-mine
bad: SDL_DISPLAY points to nonexistent display
(...)

Valgrind doesn't seem to complain either!

My changes are...

  • The SDL_Window struct has a 'display' property now.
    (That change is likely not ABI-technically wise: perhaps I should have
    moved my new property to the 'bottom' of the struct?)

  • The SDL_WINDOWPOS_UNDEFINED_DISPLAY(X) and
    SDL_WINDOWPOS_CENTERED_DISPLAY(X) macros now add an extra bit to the
    window position (defined in SDL_WINDOWPOS_PROVIDED_DISPLAY), which is used
    as an indicator that the programmer/user of the SDL_CreateWindow function
    actually requested a display explicitly.

    Implicitly requested displays are now handled differently; when no
    display is explicitly requested, the
    "SDL_WINDOWPOS_DISPLAY_DEFAULT display" will be used instead.

  • I have no idea how to determine the default display (I'd guess that
    it is platform-specific; the X11 initialization code should likely be
    responsible, but Xlib still somewhat eludes me), so I added a hack that
    allows you to set an "SDL_DISPLAY" environment
    variable (analogous to "DISPLAY").

Greetings,

Raymond.

On 2013-11-04 10:40:51 +0000, Tony Rogvall wrote:

Hi Raymond!

Do you handle SDL_SetWindowPosition?
Is this way of working how SDL development is usually done?

Regards

/Tony

On 2013-11-04 12:43:55 +0000, Gabriel Jacobo wrote:

Thanks for the patch. If you can modify it so it properly reads the DISPLAY environment variable (which is something we don't seem to be honoring), I'll review it for incorporation into mainline if Sam agrees.
Take into account that the "displays" SDL uses do not exactly match the "displays" X11 uses (an SDL display is more closely related to a monitor output).

See this for a reference: http://gerardnico.com/wiki/linux/display

(I think keeping it simple, with a host=localhost and display=0 would be a nice first step).

Finally, consider using SDL_getenv instead of getenv (it's basically the same thing, but who knows what the future holds!)

On 2013-11-04 18:54:02 +0000, Raymond wrote:

Hello Tony,

The SDL_SetWindowPosition implementation is currently untouched/not explored by me. I think, by the way, that actually changing the display of an already-created X11 window (which would be permitted, API-wise) with this function will be quite difficult/problematic (due to the necessary tear-down and re-initialization involved).

Hello Gabriel,

I suspect that my patch will likely break things for non-X11 systems in a horrible way.

The original way the display of a window is detected (as far as I grasp it: by iterating over all displays and doing some kind of comparison based on the dimensions of the screen) is currently cut out.

Next weekend, I'll be able to spend some time researching this further, and I'll attempt to improve my solution.

I hope to devise a solution that involves Xlib (I believe that the default display can be requested somewhere during the initialization code, in video/x11/SDL_x11video.c)... but I should try to get my head around the _this structure passing/casting in the code first.

To be continued...

On 2013-11-10 08:44:02 +0000, Raymond wrote:

Hello,

I did some SDL code tinkering this weekend, in order to come up with a
solution and I'm not quite sure how to proceed...

...I currently have a solution in mind that can be implemented like
this: the SDL_DisplayData structure will have one or two members
added (preferably one, I think, but it depends on where
things have to be initialized/queried/processed):

  • an X-specific "default_Xscreen". It could be used to store the
    return value of the DefaultScreen(display) X11 call (see
    http://tronche.com/gui/x/xlib/display/display-macros.html). It
    could (or should?) be done in X11_CreateDevice, right after
    the X11_XOpenDisplay call.

  • the addition to the SDL_DisplayData structure would be a
    generic (applies for all video backends) "default_screen", which
    would be an index to the "displays"-array.

    The X11_CreateDevice function (src/video/x11/SDL_x11video.c) is
    (for X) solely responsible for calling SDL_AddVideoDisplay; it
    increments the "num_displays" member of the (statically declared
    single) SDL_DisplayData structure: the num_displays serves as
    a bound for the "displays" structure, and this loop could set the
    "default_screen" as well, during the iteration over all displays.

This would yield an X11 SDL video backend which is aware of its
DISPLAY environment variable in the way that Xlib developers intended.
Then it would also be easy to add an SDL_GetDefaultDisplay API call,
which likely must work for all video backends. This call would then
simplify and vastly improve my previously submitted patch.

Spending more time into this would be a waste if I'm on the totally
wrong track, and if this solution is not wanted/accepted. Perhaps I
should discuss this with core developers?

How can I proceed, assuming that it is wise to proceed?

On 2013-11-10 22:24:14 +0000, Sam Lantinga wrote:

If SDL can create the window on either display, but passing --display 1 makes it show on the first monitor, this is just a logic bug somewhere in SDL.

It's been a while since I've tested multi-monitor mode in X11. How do you set up "Zaphod Mode" so I can try to reproduce this here?

On 2013-11-10 23:02:20 +0000, Tony Rogvall wrote:

Hi Sam!

Please have a look at comment 3. This describes the bug and how to fix it.

Thanks

On 2013-11-10 23:20:16 +0000, Gabriel Jacobo wrote:

@sam: By Zaphod I think he means "not Twinviewed", that is separate X screens. There seems to be a bug in our logic since he runs testrendercopyex --display 1 and SDL picks up the first screen resolution instead of the second one.

On 2013-11-10 23:41:57 +0000, Sam Lantinga wrote:

Ah, in that case, can you try the latest snapshot? I fixed a bug that might be related:
http://www.libsdl.org/tmp/SDL-2.0.zip

Thanks!

On 2013-11-11 17:15:04 +0000, Raymond wrote:

Created attachment 1416
xorg.conf of my "Zaphod" setup

Hello Sam,

I attached my xorg.conf to this post. Indeed: I do not use TwinView;
my screens operate 'independently'; you likely could start two different
window managers on both screens. Applications cannot be moved from
screen to screen, and applications are guaranteed to start on the screen
where you started them.

I'll try the latest snapshot as soon as possible,

Greetings,

Raymond.

On 2013-11-11 17:25:51 +0000, Raymond wrote:

The results are...

$ ./testrendercopyex --display 0

(( displays on screen 0: good ))

$ ./testrendercopyex --display 1

(( displays on screen 0: bad ))

$ ./testrendercopyex --display 0 --fullscreen-desktop
Window requested size 640x480, got 1920x1200

(( displays on screen 0: good ))

$ ./testrendercopyex --display 1 --fullscreen-desktop
Window requested size 640x480, got 1920x1200

(( displays on screen 1: good, but resolution on STDOUT is not correct ))

Greetings,

Raymond.

On 2013-11-11 19:02:55 +0000, Gabriel Jacobo wrote:

More data points, I just tried this in a dual head Ubuntu 13.04 VM...

DISPLAY=:0 xrandr -q
Screen 0: minimum 64 x 64, current 1704 x 704, maximum 32000 x 32000
VBOX0 connected primary 852x704+0+0 0mm x 0mm
   852x704        60.0*+
   1280x960       60.0
   1024x768       60.0
   800x600        60.0
   640x480        60.0
VBOX1 connected 852x704+852+0 0mm x 0mm
   852x704        60.0*+
   1280x960       60.0
   1024x768       60.0
   800x600        60.0
   640x480        60.0

Five of the six cases work fine, which is great for my self-esteem but makes it harder to pin down the problem :)

testrendercopyex --display 0
testrendercopyex --display 1
testrendercopyex --display 0 --fullscreen-desktop
testrendercopyex --display 1 --fullscreen-desktop
testrendercopyex --display 1 --fullscreen

This one fails:
testrendercopyex --display 0 --fullscreen

X Error of failed request: BadMatch (invalid parameter attributes)
Major opcode of failed request: 140 (RANDR)
Minor opcode of failed request: 21 (RRSetCrtcConfig)
Serial number of failed request: 220
Current serial number in output stream: 224

On 2013-11-11 19:27:17 +0000, Gabriel Jacobo wrote:

It seems the Xrandr error is related to VirtualBox quirks, restarting the X server makes it go away. So, all six cases actually work with this setup.

On 2013-11-11 19:34:21 +0000, Raymond wrote:

On my setup, I will not see both screens when I use the 'xrandr -q' command.
Depending on the DISPLAY variable (e.g. the screen where the terminal is
started), the commands provide...

Screen 0: minimum 8 x 8, current 1920 x 1200, maximum 16384 x 16384
DVI-I-0 disconnected (normal left inverted right x axis y axis)
DVI-I-1 disconnected (normal left inverted right x axis y axis)
DVI-I-2 connected 1920x1200+0+0 (normal left inverted right x axis y axis)
518mm x 324mm
   1920x1200      60.0*+
   1920x1080      60.0     50.0  
   1680x1050      60.0  
   1600x900       60.0  
   1440x900       59.9  
   1280x1024      60.0  
   1280x800       59.8  
   1280x720       60.0     50.0  
   1024x768       60.0  
   800x600        60.3     56.2  
   720x576        50.0  
   720x480        59.9  
   640x480        59.9  
HDMI-0 disconnected (normal left inverted right x axis y axis)
DP-0 disconnected (normal left inverted right x axis y axis)
DP-1 disconnected (normal left inverted right x axis y axis)

...and when run on screen 1, it provides...

Screen 1: minimum 8 x 8, current 2560 x 1600, maximum 16384 x 16384
DVI-I-3 connected 2560x1600+0+0 (normal left inverted right x axis y axis)
641mm x 400mm
   2560x1600      59.9*+
   1920x1440      60.0  
   1920x1200      60.0     59.9  
   1600x1200      60.0  
   1280x1024      75.0     60.0  
   1280x800       59.8  
   1152x864       75.0  
   1024x768       75.0     60.0  
   800x600        75.0     60.3  
   640x480        75.0     59.9  

Greetings,

Raymond.

On 2013-11-11 20:57:33 +0000, Gabriel Jacobo wrote:

@Raymond, a stab in the dark, but can you try running the tests with SDL_HINT_VIDEO_X11_XINERAMA=0 ?

On 2013-12-07 06:22:04 +0000, Jameson Ernst wrote:

The project I've been on for the last 6 weeks or so is a digital kiosk that uses a "Zaphod"-style X11 setup so one app can drive 2 displays that v-sync independently, and we were bitten hard by this bug. I can confirm that comment #3 describes the problem correctly, and the local patch I wrote for it is basically the same as the one presented here. I was in a rush, so I didn't add support for the DISPLAY env var, but a proper fix should definitely have that, which it sounds like it will.

What we found after we implemented our version of this fix, though, is that the underlying issue at play here extends beyond just window creation. When attempting to bind a GL context of version > 2.1 for a window on the secondary display, it would fail. Digging in, we discovered that SDL_x11opengl.c:657 uses the X11 DefaultScreen macro rather than the actual screen number of the window, which causes the context to fail when made current. We patched this also, and things are working well now for us, but other such cases could be lurking elsewhere outside of our particular use case.

On 2014-04-06 14:26:31 +0000, James Le Cuirot wrote:

Created attachment 1604
xf86-video-ati Zaphod xorg.conf

For those wanting to test the issue on Radeon hardware, here is my xorg.conf. If you're using xf86-video-ati, you may need to use the ZaphodHeads option. When Alex Deucher created this option, I argued that it shouldn't be necessary if you named and ordered the monitor sections appropriately, but he didn't want to break existing setups or something. It made sense at the time. :) If you're using fglrx, it definitely does support this kind of setup, but it doesn't use the ZaphodHeads option.

If you're having trouble working out when this alternative logic should be applied, maybe you could check for the presence of a dot in the DISPLAY variable. I think values like :0.0 and :0.1 only apply to Zaphod mode. It's worth pointing out that if I start a terminal on either display, the DISPLAY variable magically gets set to the correct value for that session.

On 2014-04-06 15:32:14 +0000, James Le Cuirot wrote:

Created attachment 1606
sdl2-zaphod.patch

I decided to have a stab at this myself based on what I just said about DISPLAY having a dot. It worked first time. :) If the variable somehow resolves to a number outside num_displays then it falls back to the other methods.

I'm not sure whether any of these other methods make sense under Zaphod mode. If they do then they should probably come first. I don't know which of the methods was actually being used before.

Hopefully you'll be able to use this patch one way or another.

On 2014-05-03 22:27:15 +0000, James Le Cuirot wrote:

Anyone care to take a look at my patch? :(

Just noting a few things about focus. If the pointer is on the game's screen when the game starts, it is locked to that screen. If it is on another screen, it only becomes locked once you move it to the game screen. However, if you click on anything outside the game, the game window minimises. This is a shame, because it would be useful to be able to continue working while someone else controls the game with a joystick, though there are probably also issues around keyboard focus there. You can alt-tab in and out of the game, but only if there is another window on that screen to alt-tab to. I'm not sure how much of this is specific to Zaphod mode; a lot of it is probably specific to XFCE.

I also seem to lack vsync. Haven't yet worked out whether this is specific to SDL2 but it's probably a driver issue.

On 2014-09-05 11:12:38 +0000, wrote:

I am using 2.0.3 and this behavior basically breaks all SDL2 applications for me, because I have so many monitors and :0.0 is small and ugly and far away on my left where I can barely see it...

People virtually always order screens from :0.0 on the left to the right, and more often than not you have to keep a certain order to avoid driver issues, especially across multiple graphics cards. But it is theoretically possible, with most WMs and decent monocultural graphics cards, to display :0.0 in the middle without any issues (except if you need to avoid tearing due to vsync issues with a particular monitor; you can't argue with vsync, it will always sync to the first screen). Especially with SDL/OpenGL you will mostly find games, and those should reasonably display in the middle of your screens and not somewhere on the far left. So basically, screen order is not a matter of choice, to be realistic. It is a rant; I could go on and on, sorry.

I can only urge you to support some kind of flag, SDL_NO_MULTIHEAD_EFFORT, to disable the new kind of assumptions SDL seems to make. I am not sure what the problem is; all I know is that eventually you have to explicitly override the X window placement at the lowest level to deviate from what you would naturally see from the DISPLAY variable. If it was just about displaying a window on the screen it was opened on, you would never have to index or check anything in the API in the first place.

Please, developers have been stuck with this, unable to change a thing about it, for a year or longer, except by reverting to SDL 1.2. I remember that e.g. SPICE and the NoMachine client have suffered from the same bug for far longer than that, and now I wonder if it is because they use SDL2. I come here from ioquake3; they fully implemented SDL2 only to realize afterwards that it was never usable on multihead setups.

On 2014-09-05 11:25:52 +0000, James Le Cuirot wrote:

Please remember that Zaphod mode (which you do appear to have) is a specialised setup and most people use xrandr instead, which does not feature :0.0 and :0.1, etc.

If you are in a position to try my patch, please do. It is still working for me. The vsync issue turned out to be something else.

On 2014-09-05 11:42:48 +0000, wrote:

@le Cuirot:

Well, firstly, having separate screens is the original, or rather 'usual', way to get a multihead setup with X11, and it has nothing to do with ZaphodHeads, since ZaphodHeads is plainly a driver option to tell the driver to use a particular output of one card, and that output only, for an X11 screen (e.g. :0.0), to provide unambiguous separation of outputs, basically as if you were using multiple cards.

In the old days, before DVI outputs and such, you would simply add multiple PCI graphics cards to your AGP one to get your multihead setup and short of super expensive multi-output cards that was the only way.

Xrandr is functionality that allows you to dynamically combine all the outputs of one graphics card (and one card only) into (if not otherwise specified in advance) one giant virtual screen that may even span multiple monitors. This has limitations and is per se not designed for full multihead support; it is only a kind of trick to get it with limited outputs and a single card. On older laptops, for example, using xrandr you can't exceed some absolute 4096x4096 virtual resolution, and in any case on most cards you can't use more than two monitors at a time, thus not more than two with xrandr alone. Since Xinerama never really worked without some kind of critical malfunction and arbitrary unreliability on updates (like ridiculous CPU usage or a complete inability to work with anything but card monocultures), you almost always have to use more than just xrandr (i.e. real separate X11 screens) if you are using merely 3 monitors, which I don't think is unreasonably uncommon.

Though admittedly, xrandr gained popularity once multi-output cards became standard, and especially for dynamically adding a separate screen on laptops.

On 2014-09-05 11:47:30 +0000, wrote:

One thing I forgot, however: since USB 2.0 and cheap 2D accelerators, USB graphics cards have become more popular too. You can't use those with xrandr either. Also, if you rotate your monitors, it used to be entirely impossible to do that with X11 and the nouveau driver without disrupting the pointer transition. There are many reasonable reasons why people cannot use xrandr alone, even if they happen to use fewer than 3 monitors.

On 2014-10-09 00:24:17 +0000, Steaphan Greene wrote:

Created attachment 1891
Respect default screen from DISPLAY variable, using standard X11 function calls.

I believe this patch is a simpler solution to the problem of SDL2 ignoring the screen in the DISPLAY environment variable in X11.

From looking at the SDL2 video code, it seems to be completely broken for using anything but screen 0, unless the x/y offsets actually do not overlap. With multiple screens starting at (0,0), only the first screen can be accessed.

This patch does not fix that problem, but it does make sure that this first screen, and thus the default screen that is used when none is selected, is the correct default screen, and all windows are no longer forced over to screen 0.

And this patch should not interfere with fixing the overall problem with multiple screens that are not offset from (0,0).

On 2014-10-15 21:36:49 +0000, Gabriel Jacobo wrote:

(In reply to Steaphan Greene from comment #34)

Created attachment 1891 [details]
Respect default screen from DISPLAY variable, using standard X11 function
calls.

Applied here: https://hg.libsdl.org/SDL/rev/3d2c0f659ad3

I tested with Xvfb in a dual-screen config and it seems to work fine. Marking as fixed for now, but feel free to re-open (or open a new bug if there are other details to take care of) if anyone finds an issue.

Thanks!

On 2014-10-15 23:04:33 +0000, Steaphan Greene wrote:

Awesome, thanks.

Yes, there is a separate problem that is still not fixed here: the fact that a window cannot be opened on a non-default screen. But the patch that was applied fixes the simpler problem of ignoring the default screen, which covers the vast majority of the cases where this will be a problem.

And I agree the rest should be part of a separate bug report. I'll open a new one for the remaining part of the problem this weekend if nobody else beats me to it.

On 2014-10-20 06:43:09 +0000, wrote:

Thanks, I pulled the version with the patch and updated the PKGBUILD for Arch Linux ( https://drive.google.com/file/d/0B9Rf4rFQvxzCOGhja0ZwX2l3bFk/view?usp=sharing , pacman -U sdl2-2.0.3-1-x86_64.pkg.tar.xz ). Windows would not open at all, though, unless I configured with --dbus-disable, because of some unable-to-connect-to-dbus error; I suppose that is a different issue. I don't know what is wrong with spicec and x2go; they still open on :0.0.

On 2014-10-20 16:28:52 +0000, Steaphan Greene wrote:

What is fixed here is the recognition of the correct default screen.

If an app is trying to create a window on a specific screen, or if it is trying to handle the interpretation and selection of a default screen manually, that still won't work, due to the other problems mentioned here.

I still plan to open a new bug with the details of those problems... I just haven't had the time to do that yet.

No idea about your dbus issues, they don't seem related.

On 2017-05-18 16:13:01 +0000, Werner Almesberger wrote:

Note that this problem still occurs if SDL uses X RandR.
As a work-around, disabling X RandR makes SDL fall back to the corrected display selection:
SDL_SetHint(SDL_HINT_VIDEO_X11_XRANDR, "0");

I've observed the issue and tested correct operation of the work-around with
SDL2 2.0.4 (Ubuntu yakkety)
SDL2 2.0.5 (Ubuntu zesty)
SDL2 built from hg tip (11007:e75416b7aba1)
