An Overview of iOS 17 Multitouch, Taps, and Gestures

In terms of physical points of interaction between the device and the user, the iPhone and iPad provide four buttons (three in the case of the iPhone X), a switch, and a touch screen. Without question, the user will spend far more time using the touch screen than any other aspect of the device. Therefore, any app must be able to handle gestures (touches, multitouches, taps, swipes, pinches, etc.) performed by the user's fingers on the touch screen.

Before we write code to handle these gestures, this chapter will spend some time exploring the responder chain in relation to touch screen events and then delve a little deeper into the types of gestures an iOS app is likely to encounter.

The Responder Chain

In the chapter entitled Understanding iOS 17 Views, Windows, and the View Hierarchy, we discussed the view hierarchy of an app’s user interface and how that hierarchy also defined part of the app’s responder chain. However, to fully understand the concepts behind handling touchscreen gestures, it is first necessary to spend a little more time learning about the responder chain.

When the user interacts with the touch screen of an iPhone or iPad, the hardware detects the physical contact and notifies the operating system. The operating system creates an event for the interaction and places it in the event queue of the currently active app, where it is picked up by the event loop and passed to the current first responder object. The first responder is the object with which the user was interacting when the event was triggered (for example, a UIButton or UIView object). If the first responder has been programmed to handle the type of event received, it does so (a button, for example, may have an action defined to call a particular method when it receives a touch event). Having handled the event, the responder then has the option of discarding it or passing it up to the next responder in the responder chain (defined by the object's next property) for further processing, and so on up the chain. If the first responder cannot handle the event, it passes it to the next responder in the chain, and so on, until the event either reaches a responder that handles it or reaches the end of the chain (the UIApplication object), where it will either be handled or discarded.

Take, for example, a UIView containing a UIButton subview. If the user touches the screen over the button, then the button, as the first responder, receives the event. If the button cannot handle the event, it will need to be passed up to the view object. If the view is also unable to handle the event, it can then be passed on to the view controller, and so on up the chain.

When working with the responder chain, it is important to note that the passing of an event from one responder to the next in the chain does not happen automatically. If an event needs to be passed to the next responder, code must be written to make it happen.

Forwarding an Event to the Next Responder

To pass an event to the next responder in the chain, a reference to the next responder object must first be obtained. This can be achieved by accessing the next property of the current responder. Once the next responder has been identified, the event-handling method that was triggered is called on that object, passing through any relevant event data.

Take, for example, a situation where the current responder object cannot handle a touchesBegan event. To pass this to the next responder, the touchesBegan method of the current responder will need to make a call as follows:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Forward the touch event to the next responder in the chain
    self.next?.touchesBegan(touches, with: event)
}

In this case, the touchesBegan method is called on the next responder and passes the original touches and event parameters.

Gestures

Gesture is an umbrella term encompassing any interaction between the touch screen and the user, beginning at the point that the screen is touched (by one or more fingers) and ending when the last finger leaves the screen's surface. Swipes, pinches, stretches, and flicks are all forms of gesture.

Taps

As the name suggests, a tap occurs when the user touches the screen with a single finger and immediately lifts it from the screen. Taps can be single-taps or multiple-taps, and the event will contain information about the number of times a user tapped on the screen.

Touches

A touch occurs when a finger establishes contact with the screen. When more than one finger touches the screen, each finger registers as a touch, up to a maximum of five on an iPhone (iPad devices can track a greater number of simultaneous touches).

Touch Notification Methods

Touch screen events cause one of four methods on the first responder object to be called. The method that gets called for a specific event will depend on the nature of the interaction. To handle events, therefore, it is important to ensure that the appropriate methods from those outlined below are implemented within your responder chain. These methods will be used in the worked example in the book’s An Example iOS 17 Touch, Multitouch, and Tap App and Detecting iOS 17 Touch Screen Gesture Motions chapters.

touchesBegan method

The touchesBegan method is called when the user first touches the screen. Passed to this method is an argument called touches, a Set of UITouch objects, together with the corresponding UIEvent object. The touches set contains a UITouch object for each finger in contact with the screen. The tapCount property of any of the UITouch objects within the touches set can be read to identify the number of taps, if any, performed by the user. Similarly, the coordinates of an individual touch can be obtained from the UITouch object, either relative to the enclosing window or within the local view itself.
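
The following is a minimal sketch of how this information might be accessed, assuming a hypothetical UIView subclass named TouchView:

import UIKit

class TouchView: UIView {

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }

        // Number of taps associated with this touch (1 for a single tap)
        let taps = touch.tapCount

        // Coordinates relative to this view and to the enclosing window
        let viewLocation = touch.location(in: self)
        let windowLocation = touch.location(in: nil)

        print("Taps: \(taps), view: \(viewLocation), window: \(windowLocation)")
    }
}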

touchesMoved method

The touchesMoved method is called when one or more fingers move across the screen. As fingers move, this method gets called multiple times, allowing the app to track the new coordinates and touch count at regular intervals. As with the touchesBegan method, this method is provided with a UIEvent object and a Set containing a UITouch object for each finger on the screen.
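
By way of illustration, a touchesMoved implementation within a UIView subclass might log each finger's current position as it moves (a sketch, not code from the later example chapters):

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Called repeatedly as fingers move; report each touch's new position
    for touch in touches {
        let location = touch.location(in: self)
        print("Touch moved to \(location)")
    }
}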

touchesEnded method

This method is called when the user lifts one or more fingers from the screen. As with the previous methods, touchesEnded is provided with the event object and the set of UITouch objects.

touchesCancelled method

When a gesture is interrupted due to a high-level interrupt, such as the phone detecting an incoming call, the touchesCancelled method is called.
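
Because a cancelled gesture still leaves the app holding any state accumulated during the touch sequence, a common pattern is to route both methods through the same cleanup code. The sketch below assumes a hypothetical resetTracking() helper that clears that state:

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Normal completion: the user lifted one or more fingers
    resetTracking()
}

override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Interrupted by the system (for example, an incoming call);
    // clean up in the same way as a normal completion
    resetTracking()
}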

Touch Prediction

A feature introduced as part of the iOS 9 SDK is touch prediction. Each time the system updates the current coordinates of a touch on the screen, a set of algorithms attempts to predict the coordinates of the next location. For example, a finger sweeping across the screen will trigger multiple calls to the touchesMoved method, passing through the current touch coordinates. Also passed through to the method is a UIEvent object on which a method named predictedTouches(for:) may be called, passing through the touch object representing the current location. In return, the method will provide an array of UITouch objects that predict the next few locations of the touch motion. This information can then be used to improve the app's performance and responsiveness to the user's touch behavior.
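
As a rough sketch, a drawing app's touchesMoved method might extend a line ahead of the finger using the predicted locations (drawLine(to:) is a hypothetical drawing helper, not a UIKit method):

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }

    // Draw up to the finger's current confirmed location
    drawLine(to: touch.location(in: self))

    // Tentatively extend the line to the predicted locations, if any
    if let predictedTouches = event?.predictedTouches(for: touch) {
        for predictedTouch in predictedTouches {
            drawLine(to: predictedTouch.location(in: self))
        }
    }
}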

Touch Coalescing

iOS devices are categorized by two metrics known as the touch sample rate and touch delivery rate. The touch sample rate is the frequency with which the screen scans for the current position of touches on the screen. The touch delivery rate, on the other hand, is the frequency with which that information is passed to the currently running app.

On most devices (including all recent iPhone models except the iPhone X), the sample and delivery rates run at 60 Hz (60 times a second). On other device models, however, the sample and delivery frequencies do not match. The iPhone X, for example, samples at 120 Hz but delivers at a slower 60 Hz, while an iPad Pro with an Apple Pencil samples at an impressive 240 Hz while delivering at only 120 Hz.

To avoid the loss of touch information caused by the gap in sampling and delivery frequencies, UIKit uses a system referred to as touch coalescing to deliver the additional touch data generated by the higher sampling frequencies.

With touch coalescing, the same touch notification methods are called and passed the same UITouch objects, which are referred to as the main touches. Also passed through to each method is a UIEvent object on which the coalescedTouches(for:) method may be called, passing through the current main touch object as an argument. When called within an app running on a device where the sampling rate exceeds the delivery rate, the method will return an array of touch objects consisting of both a copy of the current main touch and the intermediate touch activity registered since the previous main touch. These intermediate touch objects are referred to as coalesced touches. On iOS devices with matching rates, no coalesced touch objects will be returned by this method call.
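
A minimal sketch of this technique within touchesMoved might look as follows, where processTouch(at:) is a hypothetical helper that handles a single touch location:

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }

    // On devices that sample faster than they deliver, the array contains
    // the intermediate (coalesced) touches plus a copy of the main touch
    if let coalescedTouches = event?.coalescedTouches(for: touch),
       !coalescedTouches.isEmpty {
        for coalescedTouch in coalescedTouches {
            processTouch(at: coalescedTouch.location(in: self))
        }
    } else {
        // Matching sample and delivery rates; just use the main touch
        processTouch(at: touch.location(in: self))
    }
}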

Summary

To fully appreciate the mechanisms for handling touchscreen events within an iOS app, it is first important to understand both the responder chain and the methods that are called on a responder depending on the type of interaction. We have covered these basics in this chapter. In the next chapter, entitled An Example iOS 17 Touch, Multitouch, and Tap App, we will use these concepts to create an example app demonstrating touch screen event handling.

