
Bug 2307 - Touch event coordinates not consistent between Android/IOS and X11
Status: RESOLVED FIXED
Alias: None
Product: SDL
Classification: Unclassified
Component: *don't know*
Version: HG 2.1
Hardware: x86_64 Linux
Importance: P2 normal
Assignee: Sam Lantinga
QA Contact: Sam Lantinga
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2013-12-14 22:38 UTC by Sylvain
Modified: 2017-10-12 15:38 UTC
CC List: 2 users

See Also:


Attachments
patch to normalize coordinates of touch event (2.70 KB, patch)
2013-12-16 21:03 UTC, Sylvain
patch logical size touch event (1.12 KB, patch)
2014-07-20 11:03 UTC, Sylvain

Description Sylvain 2013-12-14 22:38:40 UTC
Hello,

On Android and iOS, touch event coordinates appear to be normalized to [0; 1],
whereas with X11 they appear to be in pixels, with (0, 0) at the upper-left corner of the window.

It would be nice to have a single convention across platforms (whichever convention it is!)

I wanted to submit a patch modifying the X11 code, but:

In X11_HandleXinput2Event, the "xevent.xany.xwindow" has no match in the "videodata->windowlist[i]" list.

Something like this is OK with exactly one window:

int window_w, window_h;
SDL_WindowData *data = videodata->windowlist[0];
SDL_GetWindowSize(data->window, &window_w, &window_h);

then dividing when sending the events:

xev->event_x / (float) window_w
xev->event_y / (float) window_h


Some open questions:
- computing a rect intersection may be too much processing for touch events alone...
- what would happen with two or more windows?
- what would happen with zero windows? :)
Comment 1 Gabriel Jacobo 2013-12-16 13:43:46 UTC
XIDeviceEvent has a "Window event" member, doesn't that one match the SDL_WindowData.xwindow member?

You'd also probably need to crop the coordinates in addition to scaling them (subtract window.x and window.y from event_x and event_y).

Also: http://www.x.org/releases/X11R7.7/doc/inputproto/XI2proto.txt

The "Touch device modes" section (roughly, absolute vs. relative coordinates) should be taken into account; I don't think it is right now.
Comment 2 Sylvain 2013-12-16 21:03:27 UTC
Created attachment 1498 [details]
patch to normalize coordinates of touch event
Comment 3 Sylvain 2013-12-16 21:10:53 UTC
Indeed, the "window event" member did the trick!
So here's a (temporary) patch to normalize the coordinates.


I am having a look at the "Touch device modes". You are right, the "mode" field is not used by SDL. I can write something for that !
Comment 4 Gabriel Jacobo 2013-12-17 01:37:23 UTC
I think you also need to consider the window offset (x,y coordinates) when normalizing the touch coordinates.
Comment 5 Sylvain 2013-12-17 07:35:54 UTC
No, the (event_x, event_y) coordinates are relative to the upper left corner of the window. So (0, 0) *is* the upper left corner.


which matches the documentation you pointed to:
---------
[[events-deviceevent]]
    ...
    event_x
    event_y
        The position of the pointer in screen coordinates relative to the
        event window (16.16 fixed point).
---------


Note that an event may belong to a window while its coordinates lie outside of it (for instance, when you start touching inside a window and then move outside of it).
This leads to coordinates < 0 or > 1 when the touch is outside the window.
Comment 6 Alex Szpakowski 2014-07-11 01:43:44 UTC
That doesn't match the behaviour of some other operating systems.
In OS X, the positions of the touch events are in the coordinate space of the touch device, rather than the SDL window. That means the coordinates are always within [0; 1] no matter what, and if for example the touch device is a trackpad, (0, 0) will be sent if I touch the top-left corner of the trackpad.
Comment 7 Sylvain 2014-07-16 06:32:21 UTC
I am fine with any solution as long as it is generic/uniform across all platforms :)


What's your trackpad? I have a laptop with a trackpad, and on Linux it behaves exactly like a mouse; I see no difference from the mouse. But maybe that is because I don't explicitly open the touch device!


Also, the way I see the issue, coordinates could be expressed in at least three different reference frames.

Raw frame, for example:
- coordinates of the mouse within the screen(s) (like the global mouse state?)
- coordinates of the finger within the screen(s)
- coordinates within a hardware tablet
...

Window frame:
- coordinates translated to the window (if any!).

Renderer (logical) frame:
- coordinates converted to the renderer frame (also if any; and maybe several renderers could be attached to the same window).


Also:
For each frame, the coordinates could be relative or absolute. By relative I mean scaled to [0; 1], so that [0; 1] x [0; 1] matches the frame boundaries.

Also:
In some corner cases, it seems possible to have the focus on one window and move the finger outside that window without losing focus. The window is still capturing the event (which is fine). So events should not be truncated to the window/renderer boundaries.


My point of view is that only the renderer frame matters :). But that is because I am using the SDL renderer, and I know it is possible not to use it, and even not to use a window at all.

Maybe:
- we should only provide raw coordinates?
- add a function (or hint) to always convert the coordinates to the window frame?
- add a function (or hint) to always convert the coordinates to the renderer frame?

Or make this implicit: if we have a window, we convert to the window frame, and if we have a renderer, we convert to the renderer frame. (It seems this is the case currently?)

Maybe a boolean inside the event structure to tell whether the coordinates are within [0; 1]?
Maybe an enum to tell whether the coordinates are in the raw, window, or renderer frame?
Comment 8 Sylvain 2014-07-20 11:03:28 UTC
Created attachment 1779 [details]
patch logical size touch event

In addition, here's a patch to have the finger touch events scaled to the renderer's logical size.
Comment 9 Alex Szpakowski 2014-07-24 04:21:26 UTC
I believe the most sensible option (and the one which matches the behaviour of SDL on other operating systems) is to make touchscreen events local to the window (i.e. (0, 0) at the top left of the window's inner frame), and touch device events which don't have a screen local to the device ((0, 0) at the top left of the device.) Touch screen events shouldn't be sent if the finger is outside of the SDL window.

Whether touch events in general should be affected by an SDL_Renderer is outside the scope of this issue, I think. You could open a new one for that!
Comment 10 Sylvain 2014-07-24 06:36:27 UTC
- With this patch the events are local to the window! (0, 0) is the top-left corner.
- They are within [0; 1] (that's the point of the patch, which only affects X11/XInput2!).

(But they can still be outside the window, as was possible before.)


Sorry, but I am not sure I understand your concerns.
Also, what is the "trackpad" you are talking about?
Comment 11 Sylvain 2017-10-12 07:08:43 UTC
Duplicate of bug 3871.
Comment 12 Sam Lantinga 2017-10-12 15:38:26 UTC
The touch coordinate normalization is in:
https://hg.libsdl.org/SDL/rev/d94abaa07d8e

Thanks!