
12. Touch Events and UIResponder

For the next three chapters, you are going to step away from Homepwner and build a new application named TouchTracker to learn more about touch events and gestures, as well as debugging applications.

In this chapter, you will create a view that lets the user draw lines by dragging across the view (Figure 12.1). Using multi-touch, the user will be able to draw more than one line at a time.

Figure 12.1 A drawing program


Touch Events

As a subclass of UIResponder, a UIView can override four methods to handle the four distinct touch events:

· a finger or fingers touches the screen

- (void)touchesBegan:(NSSet *)touches
           withEvent:(UIEvent *)event;

· a finger or fingers moves across the screen (this message is sent repeatedly as a finger moves)

- (void)touchesMoved:(NSSet *)touches
           withEvent:(UIEvent *)event;

· a finger or fingers is removed from the screen

- (void)touchesEnded:(NSSet *)touches
           withEvent:(UIEvent *)event;

· a system event, like an incoming phone call, interrupts a touch before it ends

- (void)touchesCancelled:(NSSet *)touches
               withEvent:(UIEvent *)event;

When a finger touches the screen, an instance of UITouch is created. The UIView that this finger touched is sent the message touchesBegan:withEvent: and the UITouch is in the NSSet of touches.

As that finger moves around the screen, the touch object is updated to contain the current location of the finger on the screen. Then, the same UIView that the touch began on is sent the message touchesMoved:withEvent:. The NSSet that is passed as an argument to this method contains the same UITouch that was originally created when the finger it represents touched the screen.

When a finger is removed from the screen, the touch object is updated one last time to contain the current location of the finger, and the view that the touch began on is sent the message touchesEnded:withEvent:. After that method finishes executing, the UITouch object is destroyed.

From this information, we can draw a few conclusions about how touch objects work:

· One UITouch corresponds to one finger on the screen. This touch object lives as long as the finger is on the screen and always contains the current position of the finger on the screen.

· The view that the finger started on will receive every touch event message for that finger no matter what. If the finger moves outside the frame of the UIView it began on, that view still receives the touchesMoved:withEvent: and touchesEnded:withEvent: messages. Thus, if a touch begins on a view, then that view owns the touch for the life of the touch.

· You do not have to – nor should you ever – keep a reference to a UITouch object. The application will give you access to a touch object when it changes state.
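
To see this lifecycle for yourself, you could temporarily override the touch methods in a UIView subclass and log each touch's address and location. The sketch below is not part of TouchTracker, and the class name TouchLoggingView is made up for illustration. Watching the logs, you will see the same address appear from touchesBegan:withEvent: through touchesEnded:withEvent: for a given finger.

#import <UIKit/UIKit.h>

// A throwaway view that logs the address and location of each touch,
// so you can watch one UITouch object persist for the life of a finger.
@interface TouchLoggingView : UIView
@end

@implementation TouchLoggingView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *t in touches) {
        // The same address will appear in the moved/ended logs for this finger
        NSLog(@"began %p at %@", t, NSStringFromCGPoint([t locationInView:self]));
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *t in touches) {
        NSLog(@"moved %p to %@", t, NSStringFromCGPoint([t locationInView:self]));
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *t in touches) {
        NSLog(@"ended %p at %@", t, NSStringFromCGPoint([t locationInView:self]));
    }
}

@end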

Every time a touch begins, moves, or ends, a touch event is added to a queue of events that the UIApplication object manages. In practice, the queue rarely fills up, and events are delivered immediately. The delivery of these touch events involves sending one of the UIResponder messages to the view that owns the touch. (If your touches are sluggish, then one of your methods is hogging the CPU, and events are waiting in line to be delivered. Chapter 14 will show you how to catch these problems.)

What about multiple touches? If multiple fingers do the same thing at the exact same time to the same view, all of these touch events are delivered at once. Each touch object – one for each finger – is included in the NSSet passed as an argument in the UIResponder messages. However, the window of opportunity for the “exact same time” is fairly short. So, instead of one responder message with all of the touches, there are usually multiple responder messages with one or more of the touches.

Creating the TouchTracker Application

Now let’s get started with your application. In Xcode, create a new Empty Application iPhone project and name it TouchTracker. The class prefix should be the same as the other projects, BNR (Figure 12.2).

Figure 12.2 Creating TouchTracker


First, you will need a model object that describes a line. Create a new subclass of NSObject and name it BNRLine. In BNRLine.h, declare two CGPoint properties:

#import <Foundation/Foundation.h>

@interface BNRLine : NSObject

@property (nonatomic) CGPoint begin;
@property (nonatomic) CGPoint end;

@end

Next, create a new NSObject subclass called BNRDrawView. In BNRDrawView.h, change the superclass to UIView.

#import <Foundation/Foundation.h>

@interface BNRDrawView : UIView

@end

Now you need a view controller to manage an instance of BNRDrawView in TouchTracker. Create a new NSObject subclass named BNRDrawViewController. In BNRDrawViewController.h, change the superclass to UIViewController.

@interface BNRDrawViewController : UIViewController

In BNRDrawViewController.m, override loadView to set up an instance of BNRDrawView as BNRDrawViewController’s view. Make sure to import the header file for BNRDrawView at the top of this file.

#import "BNRDrawViewController.h"

#import "BNRDrawView.h"

@implementation BNRDrawViewController

- (void)loadView

{

self.view = [[BNRDrawView alloc] initWithFrame:CGRectZero];

}

@end

In BNRAppDelegate.m, create an instance of BNRDrawViewController and set it as the rootViewController of the window. Do not forget to import the header file for BNRDrawViewController in this file.

#import "BNRAppDelegate.h"

#import "BNRDrawViewController.h"

@implementation BNRAppDelegate

- (BOOL)application:(UIApplication *)application

didFinishLaunchingWithOptions:(NSDictionary *)launchOptions

{

self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

// Override point for customization after application launch

BNRDrawViewController *dvc = [[BNRDrawViewController alloc] init];

self.window.rootViewController = dvc;

self.window.backgroundColor = [UIColor whiteColor];

[self.window makeKeyAndVisible];

return YES;

}

Figure 12.3 Object diagram for TouchTracker


The major objects you have just set up for TouchTracker are shown in Figure 12.3.

Drawing with BNRDrawView

BNRDrawView will keep track of all of the lines that have been drawn and the line that is currently being drawn. In BNRDrawView.m, create two instance variables in the class extension that will hold the lines in their two states. Make sure to import BNRLine.h and implement initWithFrame:.

#import "BNRDrawView.h"

#import "BNRLine.h"

@interface BNRDrawView ()

@property (nonatomic, strong) BNRLine *currentLine;

@property (nonatomic, strong) NSMutableArray *finishedLines;

@end

@implementation BNRDrawView

- (instancetype)initWithFrame:(CGRect)r

{

self = [super initWithFrame:r];

if (self) {

self.finishedLines = [[NSMutableArray alloc] init];

self.backgroundColor = [UIColor grayColor];

}

return self;

}

We will get to how lines are created in a moment, but in order to test that the line creation code is written correctly, you need the BNRDrawView to be able to draw lines.

In BNRDrawView.m, implement drawRect: to draw the current and finished lines.

- (void)strokeLine:(BNRLine *)line
{
    UIBezierPath *bp = [UIBezierPath bezierPath];
    bp.lineWidth = 10;
    bp.lineCapStyle = kCGLineCapRound;

    [bp moveToPoint:line.begin];
    [bp addLineToPoint:line.end];
    [bp stroke];
}

- (void)drawRect:(CGRect)rect
{
    // Draw finished lines in black
    [[UIColor blackColor] set];
    for (BNRLine *line in self.finishedLines) {
        [self strokeLine:line];
    }

    if (self.currentLine) {
        // If there is a line currently being drawn, do it in red
        [[UIColor redColor] set];
        [self strokeLine:self.currentLine];
    }
}

Turning Touches into Lines

A line is defined by two points. Your BNRLine stores these points as properties named begin and end. When a touch begins, you will create a line and set both begin and end to the point where the touch began. When the touch moves, you will update end. When the touch ends, you will have your complete line.

In BNRDrawView.m, implement touchesBegan:withEvent: to create a new line.

- (void)touchesBegan:(NSSet *)touches
           withEvent:(UIEvent *)event
{
    UITouch *t = [touches anyObject];

    // Get location of the touch in view's coordinate system
    CGPoint location = [t locationInView:self];

    self.currentLine = [[BNRLine alloc] init];
    self.currentLine.begin = location;
    self.currentLine.end = location;

    [self setNeedsDisplay];
}

Then, in BNRDrawView.m, implement touchesMoved:withEvent: so that it updates the end of the currentLine.

- (void)touchesMoved:(NSSet *)touches
           withEvent:(UIEvent *)event
{
    UITouch *t = [touches anyObject];
    CGPoint location = [t locationInView:self];

    self.currentLine.end = location;

    [self setNeedsDisplay];
}

Finally, in BNRDrawView.m, add the currentLine to the finishedLines when the touch ends.

- (void)touchesEnded:(NSSet *)touches
           withEvent:(UIEvent *)event
{
    [self.finishedLines addObject:self.currentLine];
    self.currentLine = nil;

    [self setNeedsDisplay];
}

Build and run the application and draw some lines on the screen. While you are drawing, the lines will appear in red and once finished, they will appear in black.

Handling multiple touches

When drawing lines, you may have noticed that having more than one finger on the screen does not do anything – that is, you can only draw one line at a time. Let’s update BNRDrawView so that you can draw as many lines as you can fit fingers on the screen.

By default, a view will only accept one touch at a time. If one finger has already triggered touchesBegan:withEvent: but has not finished – and therefore has not triggered touchesEnded:withEvent: – subsequent touches are ignored. In this context, “ignore” means that the BNRDrawView will not be sent touchesBegan:withEvent: or any other UIResponder messages related to the extra touches.

In BNRDrawView.m, enable BNRDrawView instances to accept multiple touches.

- (instancetype)initWithFrame:(CGRect)r
{
    self = [super initWithFrame:r];
    if (self) {
        self.finishedLines = [[NSMutableArray alloc] init];
        self.backgroundColor = [UIColor grayColor];
        self.multipleTouchEnabled = YES;
    }
    return self;
}

Now that BNRDrawView will accept multiple touches, each time a finger touches the screen, moves, or is removed from the screen, the view will receive the appropriate UIResponder message. However, this now presents a problem: your UIResponder code assumes there will only be one touch active and one line being drawn at a time.

Notice, first, that each touch-handling method you have already implemented sends the message anyObject to the NSSet of touches it receives. In a single-touch view, there will only ever be one object in the set, so asking for any object always gives you the touch that triggered the event. In a multiple-touch view, that set could contain more than one touch.

Then, notice that there is only one property (currentLine) that hangs on to a line in progress. Obviously, you will need to hold as many lines as there are touches currently on the screen. While you could create a few more properties, like currentLine1 and currentLine2, you would have to go to considerable lengths to manage which instance variable corresponds to which touch.

Instead of the multiple property approach, you can use an NSMutableDictionary to hang on to each BNRLine in progress. The key to store the line in the dictionary will be derived from the UITouch object that the line corresponds to. As more touch events occur, you can use the same algorithm to derive the key from the UITouch that triggered the event and use it to look up the appropriate BNRLine in the dictionary.

In BNRDrawView.m, replace the currentLine property with a new linesInProgress property in the class extension and instantiate it in initWithFrame:.

@interface BNRDrawView ()

@property (nonatomic, strong) NSMutableDictionary *linesInProgress;
@property (nonatomic, strong) NSMutableArray *finishedLines;

@end

@implementation BNRDrawView

- (instancetype)initWithFrame:(CGRect)r
{
    self = [super initWithFrame:r];
    if (self) {
        self.linesInProgress = [[NSMutableDictionary alloc] init];
        self.finishedLines = [[NSMutableArray alloc] init];
        self.backgroundColor = [UIColor grayColor];
        self.multipleTouchEnabled = YES;
    }
    return self;
}

Now you need to update the UIResponder methods to add lines that are currently being drawn to this dictionary. In BNRDrawView.m, update the code in touchesBegan:withEvent:.

- (void)touchesBegan:(NSSet *)touches
           withEvent:(UIEvent *)event
{
    // Let's put in a log statement to see the order of events
    NSLog(@"%@", NSStringFromSelector(_cmd));

    for (UITouch *t in touches) {
        CGPoint location = [t locationInView:self];

        BNRLine *line = [[BNRLine alloc] init];
        line.begin = location;
        line.end = location;

        NSValue *key = [NSValue valueWithNonretainedObject:t];
        self.linesInProgress[key] = line;
    }

    [self setNeedsDisplay];
}

First, notice that you use fast enumeration to loop over all of the touches that began, because it is possible for more than one touch to begin at the same time. (Typically, though, touches begin at different times, and BNRDrawView will receive a separate touchesBegan:withEvent: message for each touch.)

Next, notice the use of valueWithNonretainedObject: to derive the key to store the BNRLine. This method creates an NSValue instance that holds on to the address of the UITouch object that will be associated with this line. Since a UITouch is created when a touch begins, updated throughout its lifetime, and destroyed when the touch ends, the address of that object will be constant through each touch event message.

Figure 12.4 Object diagram for Multitouch TouchTracker


Update touchesMoved:withEvent: in BNRDrawView.m so that it can look up the right BNRLine.

- (void)touchesMoved:(NSSet *)touches
           withEvent:(UIEvent *)event
{
    // Let's put in a log statement to see the order of events
    NSLog(@"%@", NSStringFromSelector(_cmd));

    for (UITouch *t in touches) {
        NSValue *key = [NSValue valueWithNonretainedObject:t];
        BNRLine *line = self.linesInProgress[key];

        line.end = [t locationInView:self];
    }

    [self setNeedsDisplay];
}

Then, update touchesEnded:withEvent: to move any finished lines into the _finishedLines array.

- (void)touchesEnded:(NSSet *)touches
           withEvent:(UIEvent *)event
{
    // Let's put in a log statement to see the order of events
    NSLog(@"%@", NSStringFromSelector(_cmd));

    for (UITouch *t in touches) {
        NSValue *key = [NSValue valueWithNonretainedObject:t];
        BNRLine *line = self.linesInProgress[key];

        [self.finishedLines addObject:line];
        [self.linesInProgress removeObjectForKey:key];
    }

    [self setNeedsDisplay];
}

Finally, update drawRect: to draw each line in _linesInProgress.

- (void)drawRect:(CGRect)rect
{
    // Draw finished lines in black
    [[UIColor blackColor] set];
    for (BNRLine *line in self.finishedLines) {
        [self strokeLine:line];
    }

    // Draw lines in progress in red
    [[UIColor redColor] set];
    for (NSValue *key in self.linesInProgress) {
        [self strokeLine:self.linesInProgress[key]];
    }
}

Build and run the application and start drawing lines with multiple fingers. (You can simulate multiple fingers on the simulator by holding down the option key as you drag.)

You may be wondering: why not use the UITouch itself as the key? Why go through the hoop of creating an NSValue? Objects used as keys in an NSDictionary must conform to the NSCopying protocol, which allows them to be copied by sending the message copy. UITouch instances do not conform to this protocol because it does not make sense for them to be copied. Thus, the NSValue instances hold the address of the UITouch, so that an equal NSValue can be created later from the same UITouch and used to look up the line.
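
To convince yourself that this key scheme works, here is a small sketch, separate from TouchTracker (the dictionary and the stand-in object are made up for illustration). It shows that two NSValue instances wrapping the same pointer compare equal, which is what lets a lookup made during touchesMoved:withEvent: find the entry stored during touchesBegan:withEvent:.

// Two NSValue wrappers around the same pointer compare equal, so a lookup
// made in a later touch event finds the entry stored in an earlier one.
NSObject *standInForTouch = [[NSObject alloc] init]; // stand-in for a UITouch

NSValue *keyAtBegan = [NSValue valueWithNonretainedObject:standInForTouch];
NSValue *keyAtMoved = [NSValue valueWithNonretainedObject:standInForTouch];

NSMutableDictionary *lines = [[NSMutableDictionary alloc] init];
lines[keyAtBegan] = @"a line in progress";

NSLog(@"%d", [keyAtBegan isEqual:keyAtMoved]); // 1
NSLog(@"%@", lines[keyAtMoved]);               // a line in progress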

Also, you should know that when a UIResponder message like touchesMoved:withEvent: is sent to a view, only the touches that have moved will be in the NSSet of touches. Thus, it is possible for three touches to be on a view, but only one touch inside the set of touches passed into one of these methods if the other two did not move. Additionally, once a UITouch begins on a view, all touch event messages are sent to that same view over the touch’s lifetime, even if that touch moves off of the view it began on.

The last thing left for the basics of TouchTracker is to handle what happens when a touch is cancelled. A touch can be cancelled when the operating system interrupts an application (for example, when a phone call comes in) while a touch is on the screen. When a touch is cancelled, any state it set up should be reverted. In this case, you should remove any lines in progress.

In BNRDrawView.m, implement touchesCancelled:withEvent:.

- (void)touchesCancelled:(NSSet *)touches
               withEvent:(UIEvent *)event
{
    // Let's put in a log statement to see the order of events
    NSLog(@"%@", NSStringFromSelector(_cmd));

    for (UITouch *t in touches) {
        NSValue *key = [NSValue valueWithNonretainedObject:t];
        [self.linesInProgress removeObjectForKey:key];
    }

    [self setNeedsDisplay];
}

Bronze Challenge: Saving and Loading

Save the lines when the application terminates. Reload them when the application resumes.

Silver Challenge: Colors

Make it so the angle at which a line is drawn dictates its color once it has been added to _finishedLines.

Gold Challenge: Circles

Use two fingers to draw circles. Try having each finger represent one corner of the bounding box around the circle. You can simulate two fingers on the simulator by holding down the Option key. (Hint: This is much easier if you track touches that are working on a circle in a separate dictionary.)

For the More Curious: The Responder Chain

In Chapter 7, we talked briefly about UIResponder and the first responder. A UIResponder can receive touch events. UIView is one example of a UIResponder subclass, but there are many others, including UIViewController, UIApplication, and UIWindow. You are probably thinking, “But you can’t touch a UIViewController. It’s not an on-screen object.” You are right – you cannot send a touch event directly to a UIViewController, but view controllers can receive events through the responder chain.

Every UIResponder has a pointer called nextResponder, and together these objects make up the responder chain (Figure 12.5). A touch event starts at the view that was touched. The nextResponder of a view is typically its UIViewController (if it has one) or its superview (if it does not). The nextResponder of a view controller is typically its view's superview. The top-most superview is the window. The window's nextResponder is the singleton instance of UIApplication.
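
If you want to see the chain for a particular view, one approach (a sketch, not something the chapter asks you to add) is to walk the nextResponder pointers from that view and log each object's class. You could drop a method like this into BNRDrawView and call it from one of the touch methods:

// A sketch: walk up the responder chain from this view and log each
// responder's class. For TouchTracker you would expect something like
// BNRDrawView, BNRDrawViewController, UIWindow, UIApplication
// (the exact chain depends on the view hierarchy).
- (void)logResponderChain
{
    UIResponder *responder = self;
    while (responder) {
        NSLog(@"%@", [responder class]);
        responder = [responder nextResponder];
    }
}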

Figure 12.5 Responder chain


How does a UIResponder not handle an event? It forwards the same message to its nextResponder. That is what the default implementations of methods like touchesBegan:withEvent: do. So if a view does not override one of these methods, its next responder will attempt to handle the touch event. If the application (the last object in the responder chain) does not handle the event, it is discarded.

You can explicitly send a message to a next responder, too. Let’s say there is a view that tracks touches, but if a double tap occurs, its next responder should handle it. The code would look like this:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (touch.tapCount == 2) {
        [[self nextResponder] touchesBegan:touches withEvent:event];
        return;
    }
    ... Go on to handle touches that are not double taps
}

For the More Curious: UIControl

The class UIControl is the superclass for several classes in Cocoa Touch, including UIButton and UISlider. You have seen how to set the targets and actions for these controls. Now we can take a closer look at how UIControl overrides the same UIResponder methods you implemented in this chapter.

In UIControl, each possible control event is associated with a constant. Buttons, for example, typically send action messages on the UIControlEventTouchUpInside control event. A target registered for this control event will only receive its action message if the user touches the control and then lifts the finger off the screen inside the frame of the control. Essentially, it is a tap.

For a button, however, you can have actions on other event types. For example, you might want to trigger a method whether the user lifts the finger inside or outside the frame. Assigning the target and action programmatically would look like this:

[rButton addTarget:tempController
            action:@selector(resetTemperature:)
  forControlEvents:UIControlEventTouchUpInside | UIControlEventTouchUpOutside];

Now consider how UIControl handles UIControlEventTouchUpInside.

// Not the exact code. There is a bit more going on!
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Reference to the touch that is ending
    UITouch *touch = [touches anyObject];

    // Location of that point in this control's coordinate system
    CGPoint touchLocation = [touch locationInView:self];

    // Is that point still in my viewing bounds?
    if (CGRectContainsPoint(self.bounds, touchLocation)) {
        // Send out action messages to all targets registered for this event!
        [self sendActionsForControlEvents:UIControlEventTouchUpInside];
    } else {
        // The touch ended outside the bounds, different control event
        [self sendActionsForControlEvents:UIControlEventTouchUpOutside];
    }
}

So how do these actions get sent to the right target? At the end of the UIResponder method implementations, the control sends the message sendActionsForControlEvents: to itself. This method looks at all of the target-action pairs the control has, and if any of them are registered for the control event passed as the argument, those targets are sent an action message.

However, a control never sends a message directly to its targets. Instead, it routes these messages through the UIApplication object. Why not have controls send the action messages directly to the targets? Because controls can also have nil-targeted actions. If a UIControl's target is nil, the UIApplication finds the first responder of its UIWindow and sends the action message to it.
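
For example, here is a sketch of registering a nil-targeted action. The button is hypothetical; cut: is one of the standard editing actions that UIResponder subclasses can implement. When the button is tapped, the application routes cut: to the window's first responder instead of to a fixed target.

// A sketch of a nil-targeted action. When the button is tapped, the
// action message goes to UIApplication, which forwards it to the
// window's first responder rather than to a specific target.
UIButton *cutButton = [UIButton buttonWithType:UIButtonTypeSystem];
[cutButton setTitle:@"Cut" forState:UIControlStateNormal];
[cutButton addTarget:nil
              action:@selector(cut:)
    forControlEvents:UIControlEventTouchUpInside];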