| Summary: | Touch event coordinates not consistent between Android/iOS and X11 | | |
|---|---|---|---|
| Product: | SDL | Reporter: | Sylvain <sylvain.becker> |
| Component: | *don't know* | Assignee: | Sam Lantinga <slouken> |
| Status: | RESOLVED FIXED | QA Contact: | Sam Lantinga <slouken> |
| Severity: | normal | | |
| Priority: | P2 | CC: | amaranth72, gabomdq |
| Version: | HG 2.1 | | |
| Hardware: | x86_64 | | |
| OS: | Linux | | |
| Attachments: | patch to normalize coordinates of touch event<br>patch logical size touch event | | |
Description
Sylvain
2013-12-14 22:38:40 UTC
XIDeviceEvent has a "Window event" member; doesn't that one match the SDL_WindowData.xwindow member? You'd probably also need to crop coordinates in addition to scaling them (subtract window.x and window.y from event_x and event_y).

Also: http://www.x.org/releases/X11R7.7/doc/inputproto/XI2proto.txt
The section "Touch device modes" (roughly, absolute vs. relative coordinates) should be taken into account; I don't think it is right now.

Created attachment 1498 [details]
patch to normalize coordinates of touch event

Indeed, the "window event" member did the trick! So here's a (temporary) patch to normalize the coordinates. I am having a look at the "Touch device modes". You are right, the "mode" field is not used by SDL; I can write something for that!

> I think you also need to consider the window offset (x, y coordinates) when normalizing the touch coordinates.

No, the (event_x, event_y) coordinates are relative to the upper-left corner of the window, so (0, 0) *is* the upper-left corner, which agrees with the documentation you pointed to:
---------
[[events-deviceevent]]
...
event_x
event_y
The position of the pointer in screen coordinates relative to the
event window (16.16 fixed point).
---------
Notice that the event may belong to a window yet be outside of it (for instance, when you start touching inside a window and then move outside of it).
This leads to coordinates < 0 or > 1 when the touch is outside the window.
That doesn't match the behaviour of some other operating systems. In OS X, the positions of the touch events are in the coordinate space of the touch device, rather than the SDL window. That means the coordinates are always within [0; 1] no matter what, and if for example the touch device is a trackpad, (0, 0) will be sent if I touch the top-left corner of the trackpad.

I am fine with any solution as long as it is generic/uniform for all platforms :)

What's your trackpad? I have a laptop with a trackpad and it behaves exactly like a mouse on Linux; I see no difference from the mouse. But maybe that is because I don't explicitly open the touch device!

Also, what I think of the issue: coordinates could be expressed in at least three different reference frames.

Raw frame, for example:
- coordinates of the mouse within the screen (screens?) (like the global mouse state?)
- coordinates of the finger within the screen(s)
- coordinates within a hardware tablet
- ...

Window frame:
- coordinates translated to the window (if any!)

Renderer (logical) frame:
- coordinates converted to the renderer frame (also if any; maybe several renderers could be attached to the same window)

Also: for each frame, the coordinates could be relative or not. I mean scaled to be expressed within [0; 1], so that [0; 1] × [0; 1] matches the frame boundaries.

Also: in some (corner) cases, it seems possible to have focus on one window while the finger goes outside that window without losing focus. The window is still capturing the event (which is fine), so events should not be truncated to the window/renderer boundaries.

My point of view is that only the renderer frame matters :). But that is because I am using the SDL renderer, and I know it is possible not to use it, and also not to use a window at all.

Maybe we should only provide raw coordinates? Have a function (or hint) for always converting the coordinates to the window frame? Have a function (or hint) for always converting the coordinates to the renderer frame? Or make this implicit: if we have a window, convert to the window frame, and if we have a renderer, convert to the renderer frame (this seems to be the case currently?). Maybe a boolean inside the event structure to tell whether the coordinates are within [0; 1]? Maybe an enum to tell whether the coordinates are in the raw, window, or renderer frame?

Created attachment 1779 [details]
patch logical size touch event
In addition, here's a patch to have the finger touch events scaled to the renderer's logical size.
I believe the most sensible option (and the one which matches the behaviour of SDL on other operating systems) is to make touchscreen events local to the window (i.e. (0, 0) at the top left of the window's inner frame), and touch device events which don't have a screen local to the device ((0, 0) at the top left of the device). Touchscreen events shouldn't be sent if the finger is outside the SDL window. Whether touch events in general should be affected by an SDL_Renderer is outside the scope of this issue, I think. You could make a new one for that!

- With this patch the events are local to the window! (0, 0) is the top-left corner.
- They are within [0; 1] (that's the point of the patch, which only affects X11/XInput2!). (But they can still be outside the window, as was possible before.)

Sorry, but I am not sure I understand your issues. Also, what is the "trackpad" you are talking about?

Duplicate of bug 3871.

The touch coordinate normalization is in: https://hg.libsdl.org/SDL/rev/d94abaa07d8e

Thanks!