Detect gesture and fingers on a trackpad

Hi,

I would like to implement some gesture detection in a Xojo app. I know that part of it is possible using the MBS plug-in: one can detect the beginning and end of gesture events, and rotation- and magnify-type gestures, using CustomNSViewMBS under Cocoa.

However, I do not know how to detect the type of gesture (number of touches) or its direction.
Is there an MBS method or a declare that one can use to do so? Ideally, I would like to know how many fingers are touching the trackpad and what their coordinates are.

Thanks,
Franck

PS: I guess some critical information can be found in Apple's Handling Trackpad Events documentation, but I am far from being at ease with ObjC…

My ezgl framework does this. It’s in a state of refactoring now but the touch/gesture stuff has been separated so it can be composed with controls other than an OpenGLSurface. There’s no documentation yet and all my examples are OpenGL but I can make a quick Canvas demo later tonight.

If all you want is the touch points, this can do that. If you want to make custom gestures from that data, there are some aspects that don't work so well. Basically there's no "I'm handling the touch data" switch to prevent the OS from always making its changes.

I found a project that was easy to strip down to a simple demo as MainWindow0 (MW0). In it, the window is blue until you click; then it turns white, touches are shown, and the cursor is pinned. Click again to turn it blue and be able to move the cursor again. This lets you touch all around without the cursor getting in the way (but, as I alluded to before, certain touch patterns slip through).

Realizing you probably aren't making a custom gesture, there's a simpler MainWindow1; its code is below. All it does is turn on receiving touches, store them, and draw them.

[code]Private lastTouches() As TouchableView.TouchData //for storing the most recent touches received

Sub Open(byref useMagnify As boolean, byref useRotate As boolean, byref useTouch As boolean)
  useTouch = true //must turn on the feature in the Open event
End Sub

Function TouchDown(touches() As TouchableView.TouchData, x As integer, y As integer) As boolean
  lastTouches = touches //store
  me.Invalidate //redraw
  return false //don't pin the cursor and begin TouchDrag
End Function

Sub Paint(g As Graphics, areas() As REALbasic.Rect)
  g.ForeColor = &cFFFFFF //fill
  g.FillRect(0, 0, g.Width, g.Height)

  g.ForeColor = &c00FF00
  dim t As TouchableView.NSPoint
  for i As integer = 0 to lastTouches.Ubound //for each touch received
    t = lastTouches(i).pos //get its xy point, normalized 0..1 with y = 0 at the bottom
    g.FillRect(t.x * g.Width - 20, g.Height * (1 - t.y) - 20, 40, 40) //draw it centered, 40x40
  next

  g.ForeColor = &c808080 //outline
  g.DrawRect(0, 0, g.Width, g.Height)
End Sub[/code]

MainWindow2 adds the Magnify gesture with touch drawing. Sorry there are no docs; I hope it's simple enough to be self-explanatory.

http://home.comcast.net/~trochoid/code/TouchDemo.xojo_binary_project.zip

Thanks Will. I just tested your demo and it is great. I will look in more detail at the implementation, but your modules do indeed seem very clear and easy to understand.
Thank you so much for making this available,

Best,
Franck

PS: I have no need at the moment, but would the same method allow implementing multitouch in iOS in the near future?

With some changes the touch part might work, but gestures are handled differently in iOS. In fact, iOS has a much better system that allows creating true custom gestures. OSX doesn't have this, and my code does some gymnastics to feign a custom gesture, which you wouldn't want to carry over.

My guess is that if the iOS target has declares, a UIGestureRecognizer with all the touch info and control you need won't take long to appear (if it isn't built in).

I've written a tutorial on gestures on Windows and OSX on my blog. It's a complete framework for building your own gestures. Maybe you can use some of the code for multitouch?

https://alwaysbusycorner.wordpress.com/2014/04/02/xojo-mouse-touch-gestures-engine

Thanks.
Yes, merging your gesture framework with the multitouch implementation would be great. Beyond my needs (and my skills…) at the moment, but very tempting.

Being in development, I rechecked my code to make sure it's on the up and up, and found an issue you should fix. A cyclic link is made for dispatching events to instances, and for some reason I've left the method that breaks this link empty. So in TouchableView.eventClose, which is already there, add this code to do that…

[code]Sub eventClose()
  for i As integer = 0 to dispatchList.Ubound
    if dispatchList(i) = self then
      dispatchList.Remove(i)
      exit
    end if
  next
End Sub[/code]

CanvasTouch already calls that in its Close event so everything cleans up.

Another issue is that I do weak error checking of declare calls, usually just something like "if result = nil then break" and nothing else. The assumption is that those functions are always there and always work. So far I haven't had problems with this, but it's something to know. In time I'll think about how to gracefully handle these potential errors, unless someone beats me to it :)

[quote=90396:@Will Shank]…

http://home.comcast.net/~trochoid/code/TouchDemo.xojo_binary_project.zip[/quote]

So nice! I can make good use of this. I have an app where the input screen is user-definable, and now I can add gesture support for resizing. And also for resizing photos…

Big help, thanks a lot!

Hi,
I have tried to intercept a swiping movement (TouchDrag), but this event seems not to fire. Is there something I did not get?
Thanks.

What do you mean by intercept swiping? I was going off your original post where you said “I would like to know how many fingers are touching the trackpad and what are their coordinates.” If all you need are the touch coordinates then use only TouchDown by returning false. The name TouchDown is somewhat of a misnomer here. The entire touch sequence will be sent to it as long as you return false, and for getting the data it’s all you need.

To sort of create a custom gesture, you return true and use TouchDrag/TouchUp. In this scenario, TouchDown is implemented to detect the gesture starting and returns true. The touch stream is then sent to TouchDrag, where you implement the behavior of your gesture.

OSX actually doesn't provide for making custom gestures. You can get the touch points, but the OS will always process them to do its thing: moving the cursor, sending wheel events, switching spaces, etc. When you return true from TouchDown and enter custom gesturing mode, these extraneous events are received but discarded, and some trickery is done to keep the cursor in position. Another trick is done to filter out possible wheel momentum scrolling that can carry on after all touches are up. But I haven't found a way to filter out all OS effects; for example, I have four-finger swipe set to switch spaces, and that can still happen while a 'custom gesture' is running.

In the project, MainWindow0 comes from an audio-synthesizing app I'm working on where all touches, including a single touch, are processed as sound. To allow a single touch to move freely on the trackpad, true is returned right away in TouchDown so the cursor doesn't hit hot corners or the dock. But then the cursor can't ever be moved. So a single-finger click is used to toggle between normal (blue background) and synth input (white) modes.

The only other time I've used the custom gesture feature was to create a two-finger simultaneous scale/rotate/translate gesture. It predates the refactoring and is quite involved, mostly in mapping the touches to the previous event by id, since their array order isn't guaranteed. This can be folded into the class, but I'm still working out how I want to implement custom gestures, maybe something like UIGestureRecognizer for iOS. If only Apple made an NSGestureRecognizer, this would be so much easier :)
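For illustration, the id matching can be as simple as keeping a Dictionary keyed on the previous event's touch ids, something like the sketch below. The id field name on TouchData and the MapTouches helper are my assumptions; only pos comes from the demo code.

[code]Private prevPos As Dictionary //touch id -> Pair(x, y) from the previous event

Sub MapTouches(touches() As TouchableView.TouchData)
  if prevPos = nil then prevPos = new Dictionary
  dim currPos As new Dictionary
  for i As integer = 0 to touches.Ubound
    dim t As TouchableView.TouchData = touches(i)
    dim p As Pair = t.pos.x : t.pos.y
    currPos.Value(t.id) = p //remember this finger for the next event
    if prevPos.HasKey(t.id) then //same finger as last time, regardless of array order
      dim last As Pair = prevPos.Value(t.id)
      dim dx As double = t.pos.x - last.Left.DoubleValue
      dim dy As double = t.pos.y - last.Right.DoubleValue
      //dx/dy is this finger's motion since the previous event
    end if
  next
  prevPos = currPos //ids that lifted simply drop out
End Sub[/code]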

Thanks Will.
It all started because I played around with the "Gesture" example from the MBS plug-in. I could detect a gesture starting but not the coordinates of my fingers. With your approach I could beautifully get access to the coordinates, but not to the information that a swiping gesture had started. I would like to make a window appear when swiping from the border of my trackpad.
I will thus look at TouchDrag. I also thought that since I have access to a touch-down event and a touch-up event, I should be able to compute the displacement between these two events and determine the direction of an "easy" straight gesture.
I could also mix it with the event detection provided by CustomNSViewMBS from MonkeyBread Software.

It should be easy to add the swipe gesture when I get time. Another possibility is to compose TouchableView with CustomNSViewMBS, like how it's done in CanvasTouch. Also, if CustomNSViewMBS's swipe event provides the actual OS event pointer, I think you can pull the touch data from that. It also may work to just call NSApplication.currentEvent() and get the touches there; I will need to experiment.

I recommend not using the custom-gesture TouchDrag/TouchUp path unless you're really trying to make a custom gesture, because of the other stuff it does.
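That said, the displacement idea only needs TouchDown. Here's a rough sketch, not from the framework: it assumes the TouchableView.TouchData type from the demo, a single finger, normalized 0..1 coordinates, and that TouchDown is also called with an empty touches() array when the last finger lifts (if it isn't, the reset has to live elsewhere). SwipeDetected and the 0.3 threshold are made up.

[code]Private startX As double //where the first touch began (normalized)
Private startY As double
Private tracking As boolean

Function TouchDown(touches() As TouchableView.TouchData, x As integer, y As integer) As boolean
  if touches.Ubound < 0 then //no fingers left: sequence over, reset
    tracking = false
    return false
  end if

  dim t As TouchableView.NSPoint = touches(0).pos
  if not tracking then //first event of this sequence
    startX = t.x
    startY = t.y
    tracking = true
  else
    dim dx As double = t.x - startX
    dim dy As double = t.y - startY
    if abs(dx) > 0.3 or abs(dy) > 0.3 then //moved far enough to classify
      if abs(dx) > abs(dy) then
        if dx > 0 then
          SwipeDetected("right")
        else
          SwipeDetected("left")
        end if
      else
        if dy > 0 then
          SwipeDetected("up")
        else
          SwipeDetected("down")
        end if
      end if
      tracking = false //classify once per sequence
    end if
  end if
  return false //leave normal cursor behavior intact
End Function[/code]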

Another question, Will (sorry…):
Do you think it would be possible to provide multitouch detection at the level of the Application itself and not only for a Canvas? I have tried, but I have yet to understand the way you implemented it…

No apologies, I love to learn.

What are you picturing for this Application level detection?

Looking at the docs I see the touch callbacks are part of NSResponder, which is the super of NSView, NSWindow and NSApplication. NSResponders let you chain how an event flows, which starts at a target NSView and travels up the chain to NSApplication.

I thought maybe the touch events could be implemented in a 'nextResponder' that's added to NSApplication; then all events would flow up to it. I haven't monkeyed with the responder chain until now, but it looks… I don't know how it looks :)
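As a first poke, the chain itself can at least be walked with a declare; nextResponder is real NSResponder API. Actually handling touch events in an injected responder would need a custom ObjC subclass, which plain declares can't create, so this is just for inspecting what's there (me being whatever control you start from):

[code]#if TargetCocoa then
  declare function nextResponder lib "Cocoa" selector "nextResponder" (obj As Integer) As Integer

  dim r As Integer = me.Handle //start at this control's NSView
  while r <> 0 //view -> superviews -> window -> (eventually) NSApplication
    r = nextResponder(r)
  wend
#endif[/code]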

NSApplication and NSWindow, being NSResponder subclasses, have these touch event handlers, but they don't have setAcceptsTouchEvents, which allows the events to be sent in the first place; only NSView does. So I think it could possibly be done, a single app-level handling of touch events, but you'll have to go through all NSViews and set acceptsTouchEvents to true for the touches to flow up there.
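For reference, that per-view switch is a one-line declare; setAcceptsTouchEvents: is the actual NSView method (OS X 10.6+), and Handle is the control's NSView on Cocoa builds:

[code]Sub EnableTouches(extends c As RectControl) //module method
  #if TargetCocoa then
    declare sub setAcceptsTouchEvents lib "Cocoa" selector "setAcceptsTouchEvents:" (view As Integer, accept As Boolean)
    setAcceptsTouchEvents(c.Handle, true)
  #endif
End Sub[/code]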

The other thing is that touch events only flow to the target NSView that was under the cursor when a touch sequence starts. So at most you can only get touches when the cursor is inside one of your windows (unless maybe a fullscreen transparent window above all others can intercept events).

Maybe just Window level detection would be cleaner but I don’t know how safe it is to relink that part of the chain.

Oh, an event monitor may be the ticket.
“The AppKit framework allows you to install an event monitor, an object that looks for user-input events of a certain type (or types) as an application dispatches them in its sendEvent: method. For example, a monitor could look for mouse-up, key-down, or swipe-gesture events, or even all events.”

And this may allow me to make real custom gestures without the blocking hacks. This will require a plugin from Joe Ranieri for ObjC-Blocks, or I think MBS has it too.

I'm finding even more ways that might get you what you want, depending on what it is you want exactly :)

Is this supposed to be like a new key? I mean, at any time, anywhere, you swipe right, and if the fingers hit the right edge of the trackpad then a new window comes up? Or is it just swiping from a certain window or control area?

I thought about a helper application that can be shown or hidden when necessary.
A system-wide shortcut could be used for this, but using the trackpad could also be very useful. Sliding up from the bottom edge would show the application; the opposite movement would hide it.
Similar to the swiping movement toward the right edge used to show the list of notifications.
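In the demo's normalized coordinates, y = 0 is the bottom of the trackpad, so the edge condition could sit on top of the displacement sketch above; something like this, where the 0.1 and 0.3 thresholds are guesses to tune:

[code]Function IsBottomEdgeSwipeUp(startY As double, endY As double) As boolean
  //startY/endY are normalized touch y coordinates, 0 = bottom edge
  return startY < 0.1 and (endY - startY) > 0.3 //began at the edge, moved up
End Function[/code]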