| Summary: | SDL touch "up" events not always recognized | ||
|---|---|---|---|
| Product: | SDL | Reporter: | Edward Rudd <urkle> |
| Component: | events | Assignee: | Sam Lantinga <slouken> |
| Status: | RESOLVED FIXED | QA Contact: | Sam Lantinga <slouken> |
| Severity: | normal | ||
| Priority: | P2 | CC: | amaranth72, sylvain.becker |
| Version: | HG 2.0 | ||
| Hardware: | x86 | ||
| OS: | Mac OS X (All) | ||
I don't have a touch capable Mac, so tested patches are welcome!

---

In OS X, if you have some fingers on a trackpad with an SDL window in the foreground, then use those fingers to do a system-wide gesture (like switching Spaces or opening Mission Control), and then remove the fingers from the trackpad, Cocoa won't always trigger touchesCancelled or touchesEnded for all (or sometimes any) of the fingers. That leaves SDL with "ghost fingers" which sometimes never get removed. I've been trying to figure that one out for a while now, and I suspect it might be a bug in OS X itself, but I don't know for sure. That would explain the other cases where I've been seeing this behavior.

---

Now here is something interesting. Looking at the actual event methods (touches*WithEvent:), touchesBeganWithEvent: is NOT always called when the window is not active. It seems only ONE began OR moved is called, and then an END, but not for both fingers (at least when they are released at the same time). So maybe we should IGNORE touch events when the window is not active. And maybe we need some way of "resetting" and removing the ghost fingers? Maybe in touchesBeganWithEvent:, check for ANY fingers, and if the count of "all" touches equals the count of began touches while SDL's tracked touches > 0, reset them all?

---

Yeah, maybe we should just always reset touches when the SDL application loses focus?

---

Hi. Just letting you know of this solved bug: https://bugzilla.libsdl.org/show_bug.cgi?id=2558

There was a missing finger_up on Android and also on Mir (because Mir is based on the Android stack), because the abort gesture event was not handled.

Maybe there is some kind of "abort_gesture" also for OSX.

---

(In reply to Sylvain from comment #5)
> Hi. Just let know of this solved bug :
> https://bugzilla.libsdl.org/show_bug.cgi?id=2558
>
> There was missing finger_up on Android and also on Mir (because Mir is based
> on android stack), because the abort gesture event was not handled.
>
> Maybe there is some kind of "abort_gesture" also for OSX.

There is an abort (cancel) touch event and it is being handled. The issue is that OS X isn't calling it in certain (rare) circumstances.

(In reply to Sam Lantinga from comment #4)
> Yeah, maybe we should just always reset touches when the SDL application
> loses focus?

I tested this out, and unfortunately it's not enough. Some Cocoa touches still get "lost" (never removed, at least until I switch focus out to another window) if I do any system gesture, even one that doesn't change the window focus: for example, if I swipe 4 fingers to partially switch to another Space, but lift them up before it "commits" to switching to that Space.

I have a fix for this that I'll be pushing shortly. Comments on it are very welcome.

---

The solution I came up with seems to work with all the scenarios of "lost touches". However, it only "fixes" them up when the user touches again. Thus, on a timeline (T = touch, L = the actual "lost" release, R = when the release is sent to the app):

`---T----L-------TR-----R`

The release for the first touch will only be seen when the second touch occurs after the "lost" one. However, it is FAR better than what is happening now, where those touches are lost and the app goes on thinking there are 8 or more fingers on the touch pad when there is only one. :)

---

Edward, can you attach a patch for the fix you were describing? Thanks!

---

Sam, this actually was pushed back in November 2014... https://hg.libsdl.org/SDL/rev/a845edf98a80

---

Oh great, thanks!
When a window is NOT the active window, the touch "finger up" event is not always captured by SDL, leaving "lingering fingers" which mess up MULTIGESTURE events.

Steps to reproduce:

1. Modify testgesture.c to enable verbose messages.
2. Build and run testgesture.
3. Put testgesture in the background (e.g. put Xcode above it), but so you can still partially see the window.
4. Put 2 fingers down and move them while over the testgesture window. Note that the two down events are recognized.
5. Lift ONE finger. Note that the up event is recognized.
6. Lift the other finger. Note that NO up event is recognized.
7. Make testgesture the active window by clicking its title bar.
8. Move ONE finger around and notice that MULTIGESTURE events are generated for 2 fingers.

This is fairly reliably reproducible.