Chapter 10. Using Sensors

Mobile phones aren’t just for making phone calls anymore. The iPhone, like a lot of high-end smartphones these days, comes with a number of sensors: camera, accelerometer, GPS module, and digital compass. We’re entering a period of change: more and more users expect these sensors to be integrated into the “application experience.” If your application can make use of them, it probably should.

Hardware Support

While the iPhone is almost unique among mobile platforms in guaranteeing that your code will run on all of the current devices, there is some variation in available hardware between the various models.

Determining Available Hardware Support

Table 10-1 lists the hardware differences between the devices. Because your app will likely support multiple devices, you’ll need to write code to check which features are supported and adjust your application’s behavior as appropriate.

Table 10-1. Hardware support in various iPhone and iPod touch models

Hardware feature  | Original iPhone | iPhone 3G | iPhone 3GS | First-generation iPod touch | Second-generation iPod touch | Third-generation iPod touch
Cellular          | x               | x         | x          |                             |                              |
WiFi              | x               | x         | x          | x                           | x                            | x
Bluetooth         | x               | x         | x          |                             | x                            | x
Speaker           | x               | x         | x          |                             | x                            | x
Audio-in          | x               | x         | x          |                             | x                            | x
Accelerometer     | x               | x         | x          | x                           | x                            | x
Magnetometer      |                 |           | x          |                             |                              |
GPS               |                 | x         | x          |                             |                              |
Proximity sensor  | x               | x         | x          |                             |                              |
Camera            | x               | x         | x          |                             |                              |
Video capture     |                 |           | x          |                             |                              |
Vibration         | x               | x         | x          |                             |                              |

Network availability

We covered Apple’s Reachability code in detail in Apple’s Reachability Class in Chapter 7. We can easily determine whether the network is reachable, and whether we are using the wireless or WWAN interface:

Reachability *reach =
    [[Reachability reachabilityForInternetConnection] retain];
NetworkStatus status = [reach currentReachabilityStatus]; 1

1. This call will return a network status: NotReachable, ReachableViaWiFi, or ReachableViaWWAN.
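For example, a minimal sketch of acting on that status (assuming Apple's Reachability class is already in your project, as in Chapter 7) might look like this:

switch (status) {
    case NotReachable:
        NSLog(@"No network connection available");
        break;
    case ReachableViaWiFi:
        NSLog(@"Connected via WiFi");
        break;
    case ReachableViaWWAN:
        NSLog(@"Connected via the cellular (WWAN) network");
        break;
}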

Camera availability

We cover the camera in detail later in this chapter. However, it is simple to determine whether a camera is present in the device:

BOOL available = [UIImagePickerController
    isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera];

Once you have determined that a camera is present, you can inquire whether it supports video by making a call to determine the available media types the camera supports:

NSArray *media = [UIImagePickerController availableMediaTypesForSourceType:
    UIImagePickerControllerSourceTypeCamera];

If the kUTTypeMovie media type is returned as part of the array, the camera will support video recording.
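Here's a minimal sketch of that check, using the media array from the previous snippet. Note that kUTTypeMovie is declared in the MobileCoreServices framework, so you'll need to add that framework to your project and import <MobileCoreServices/MobileCoreServices.h>:

// kUTTypeMovie is a CFStringRef, so cast it when comparing against NSString objects
if ([media containsObject:(NSString *)kUTTypeMovie]) {
    NSLog(@"This camera supports video recording");
}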

Audio input availability

You can poll whether audio input is available using the AVAudioSession singleton class by checking the inputIsAvailable property of the shared instance:

AVAudioSession *audioSession = [AVAudioSession sharedInstance];
BOOL audioAvailable = audioSession.inputIsAvailable;

Note

You will need to add the AVFoundation.Framework (right-click or Ctrl-click on the Frameworks folder in Xcode and choose Add→Existing Frameworks). You’ll also need to import the header (put this in your declaration if you plan to implement the AVAudioSessionDelegate protocol, discussed shortly):

#import <AVFoundation/AVFoundation.h>

You can also be notified of any changes in the availability of audio input (e.g., a second-generation iPod touch user has plugged in headphones with microphone capabilities). First, nominate your class as a delegate:

audioSession.delegate = self;

Declare it as implementing the AVAudioSessionDelegate protocol in the declaration:

@interface YourAppDelegate : NSObject <UIApplicationDelegate,
    AVAudioSessionDelegate>

Then implement inputIsAvailableChanged: in the implementation:

- (void)inputIsAvailableChanged:(BOOL)audioAvailable {
    NSLog(@"Audio availability has changed");
}

GPS availability

I’m going to cover the Core Location framework, and GPS, later in the chapter. However, the short answer to a fairly commonly asked question is that, unfortunately, the Core Location framework does not provide any way to get direct information about the availability of specific hardware.

While you cannot check for the availability of GPS using Core Location, you can require the presence of GPS hardware for your application to load. I will discuss this in the next section.

Setting Required Hardware Capabilities

If your application requires specific hardware features in order to run, you can add a list of required capabilities to your application’s Info.plist file. Your application will not start unless those capabilities are present on the device.

Later in the chapter we’ll modify the Weather application to make use of the Core Location framework to determine current position, so let’s modify it now to make sure this capability is available.

Note

You may want to make a copy of the Weather application before modifying, as we have done previously. Navigate to where you saved the project and make a copy of the project folder, and then rename it. Then open the new (duplicate) project inside Xcode and use the Project→Rename tool to rename the project.

Open the Weather application in Xcode, open the Weather-Info.plist file in the Xcode editor, and click on the bottommost entry. A button with a plus sign (+) on it will appear to the righthand side of the key-value pair table. Click on this button to add a new row to the table; then scroll down the list of possible options and select “Required device capabilities” (the UIRequiredDeviceCapabilities key), as shown in Figure 10-1. This will add an (empty) array to the .plist file. If you add “location-services” (see Figure 10-2) as Item 0 of this array (some versions of Xcode may label the first item in the array Item 1), your application will no longer start if such services are unavailable. If you want to add further entries, select Item 0 and click the plus button to the righthand side of the table.

Figure 10-1. Setting the “Required device capabilities” key

The allowed values for the keys are telephony, sms, still-camera, auto-focus-camera, video-camera, wifi, accelerometer, location-services, gps, magnetometer, microphone, opengles-1, opengles-2, armv6, armv7, and peer-peer. A full description of the possible keys is available in the Device Support section of the iPhone Application Programming Guide available from the iPhone Dev Center.

Figure 10-2. Adding the location-services item to “Required device capabilities”
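If you open Weather-Info.plist in a text editor rather than the Xcode property list editor, the new entry should look something like this:

<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>location-services</string>
</array>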

Using the Camera

We looked at the image picker view controller in Chapter 6, where we used it to add pictures to our City Guide application using our AddCityController class. We have to change only one line in our code from Chapter 6 to make our City Guide application use the camera instead of the saved photo album.

If you open the CityGuide project in Xcode and look at the viewDidLoad: method in the AddCityController class, you’ll see that we set the source of the image picker controller to be the photo album:

pickerController.sourceType =
    UIImagePickerControllerSourceTypeSavedPhotosAlbum;

Changing the source to UIImagePickerControllerSourceTypeCamera will mean that when you call presentModalViewController:, which presents the UIImagePickerController, the camera interface rather than the photo album will be presented to the user, allowing him to take a new picture.

If you want to enable video, you need to add the relevant media type to the array indicating the media types to be accessed by the picker. By default, this array contains only the image media type. The following code should determine whether your device supports a camera, and if it does, it will add all of the available media types (including video on the iPhone 3GS) to the media types array. If there is no camera present, the source will be set to the photo album as before:

if ([UIImagePickerController
     isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
{
    pickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
    NSArray *mediaTypes =
        [UIImagePickerController availableMediaTypesForSourceType:
            UIImagePickerControllerSourceTypeCamera];
    pickerController.mediaTypes = mediaTypes;
} else {
    pickerController.sourceType =
        UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    pickerController.allowsEditing = YES;
}

The Core Location Framework

The Core Location framework is an abstraction layer in front of several different methods to find the user’s location (and, by extrapolation, her speed and course). It can provide the latitude, longitude, and altitude of the device (along with the level of accuracy to which this is known). There are three levels of accuracy:

§ The least accurate level uses the cell network to locate the user (the process is similar to triangulation, but more complex). This can quickly provide a position to around 12 km accuracy, which can be reduced to 1–3 km after some time depending on the tower density at your current location.

§ The next accuracy level is obtained by utilizing Skyhook Wireless’s WiFi-based positioning system. This is much more precise, giving a position to approximately 100 m. However, it depends on the user being in range of a known wireless hotspot.

§ The highest level of accuracy is obtained by using GPS hardware, which should provide a position to less than 40 m.

Warning

On the iPod touch, the user’s location is derived solely from WiFi positioning. The original iPhone will use WiFi and cell tower triangulation, and on the iPhone 3G and 3GS it will also make use of the built-in GPS hardware.

The actual method used to determine the user’s location is abstracted away from both the user and the developer. The only control the developer has over the method chosen is to request a certain level of accuracy, although the accuracy actually achieved is not guaranteed. Further, the battery power consumed and the time taken to calculate the position both increase with increasing accuracy.

Warning

Some users may choose to explicitly disable reporting of their position. You should therefore always check to see whether location services are enabled before attempting to turn on these services. This will avoid unnecessary prompting from your application.

The Core Location framework is implemented using the CLLocationManager class. The following code will create an instance of this class, and from then on will send location update messages to the designated delegate class:

CLLocationManager *locationManager = [[CLLocationManager alloc] init];
locationManager.delegate = self;

if( locationManager.locationServicesEnabled ) {
    [locationManager startUpdatingLocation];
} else {
    NSLog(@"Location services not enabled.");
}

Note

To use this code, you will need to add the Core Location framework. In Groups & Files, right-click or Ctrl-click on Frameworks and select Add→Existing Frameworks. Add CoreLocation. You will also need to declare your class as implementing the CLLocationManagerDelegate protocol and import CoreLocation in your declaration or implementation with the following code:

#import <CoreLocation/CoreLocation.h>

We can filter these location update messages based on a distance filter. Changes in position of less than this amount will not generate an update message to the delegate:

locationManager.distanceFilter = 1000; // 1km

We can also set a desired level of accuracy; this will determine the location method(s) used by the Core Location framework to determine the user’s location:

locationManager.desiredAccuracy = kCLLocationAccuracyKilometer;
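The accuracy constants defined by Core Location range from the most precise (and, broadly, the most power-hungry) down to the coarsest; for instance:

// Available constants, roughly from most to least precise:
// kCLLocationAccuracyBest, kCLLocationAccuracyNearestTenMeters,
// kCLLocationAccuracyHundredMeters, kCLLocationAccuracyKilometer,
// kCLLocationAccuracyThreeKilometers
locationManager.desiredAccuracy = kCLLocationAccuracyHundredMeters;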

The CLLocationManagerDelegate protocol offers two methods. The first is called when a location update occurs:

- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
    fromLocation:(CLLocation *)oldLocation
{
    NSLog(@"Moved from %@ to %@", oldLocation, newLocation);
}

The second is called when an error occurs:

- (void)locationManager:(CLLocationManager *)manager
    didFailWithError:(NSError *)error
{
    NSLog(@"Received Core Location error %@", error);
    [manager stopUpdatingLocation];
}

If the location manager is not able to ascertain the user’s location immediately, it reports a kCLErrorLocationUnknown error and keeps trying. In most cases, you can choose to ignore the error and wait for a new event. However, if the user denies your application access to the location service, the manager will report a kCLErrorDenied error. Upon receiving such an error, you should stop the location manager.
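A slightly fuller sketch of the didFailWithError: delegate method that makes this distinction might look like the following:

- (void)locationManager:(CLLocationManager *)manager
    didFailWithError:(NSError *)error
{
    if ([error code] == kCLErrorDenied) {
        // The user has denied access to location services; stop for good
        [manager stopUpdatingLocation];
    } else if ([error code] == kCLErrorLocationUnknown) {
        // Transient failure; Core Location keeps trying, so just wait
    } else {
        NSLog(@"Received Core Location error %@", error);
    }
}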

Location-Dependent Weather

In Chapter 7 we built a simple Weather application, but it would be much better if the application gave us weather information for our current location. We can use the Core Location framework to retrieve the user’s latitude and longitude. However, the Google Weather Service, which we used to back our Weather application, takes only city names, not latitude or longitude arguments.

There are several ways around this problem. For instance, the MapKit framework, which we’ll meet later in the book, offers reverse geocoding capabilities (which turn coordinates into postal addresses). However, for this example, I’m going to make use of one of the many web services offered by the GeoNames.org site to carry out reverse geocoding and retrieve the nearest city from the latitude and longitude returned by the Core Location framework.

Using the GeoNames reverse geocoding service

One of the RESTful web services offered by GeoNames.org will return an XML or JSON document listing the nearest populated place using reverse geocoding. Requests to the service take the form http://ws.geonames.org/findNearbyPlaceName?lat=<XX.X>&lng=<XX.X> if you want an XML document returned, or http://ws.geonames.org/findNearbyPlaceNameJSON?lat=<XX.X>&lng=<XX.X> if you prefer a JSON document. There are several optional parameters: radius (in km), max (maximum number of rows returned), and style (SHORT, MEDIUM, LONG, and FULL).
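For example, a request for the place name nearest to the coordinates iPhone Simulator reports (see later in this chapter) would look something like this:

http://ws.geonames.org/findNearbyPlaceNameJSON?lat=37.331689&lng=-122.030731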

Passing the longitude and latitude of Cupertino, California, which is the location returned by Core Location in all cases for iPhone Simulator, the JSON service would return the following JSON document:

{
    "geonames":[
        {
            "countryName":"United States",
            "adminCode1":"CA",
            "fclName":"city, village,...",
            "countryCode":"US",
            "lng":-122.0321823,
            "fcodeName":"populated place",
            "distance":"0.9749",
            "fcl":"P",
            "name":"Cupertino",
            "fcode":"PPL",
            "geonameId":5341145,
            "lat":37.3229978,
            "population":50934,
            "adminName1":"California"
        }
    ]
}

Modifying the Weather application

Let’s modify our Weather application to make use of Core Location and (optionally) give us the weather where we are, rather than just for a hardwired single location. Open the Weather project in Xcode and click on the WeatherAppDelegate.h interface file to open it in the Xcode editor.

We’re going to use the application delegate to manage the CLLocationManager. I’ve highlighted the changes you need to make to this file in bold:

#import <CoreLocation/CoreLocation.h>

@class MainViewController;

@interface WeatherAppDelegate : NSObject
    <UIApplicationDelegate, CLLocationManagerDelegate> 1
{
    UIWindow *window;
    MainViewController *mainViewController;

    BOOL updateLocation; 2
    CLLocationManager *locationManager; 3
}

@property (nonatomic, retain) IBOutlet UIWindow *window;
@property (nonatomic, retain) MainViewController *mainViewController;
@property (nonatomic) BOOL updateLocation;
@property (nonatomic, retain) CLLocationManager *locationManager;

@end

1. We declare that the application delegate is also a CLLocationManager delegate.
2. We declare a Boolean variable to indicate whether we’re currently supposed to be monitoring the device’s location.
3. We declare an instance of the CLLocationManager.

You will also need to add the Core Location framework to the project. In Groups & Files, right-click or Ctrl-click on Frameworks and select Add→Existing Frameworks. Select CoreLocation and click Add.

In the corresponding implementation file (WeatherAppDelegate.m), we first need to synthesize the new variables we declared in the interface file:

@synthesize updateLocation;
@synthesize locationManager;

After that, add the code shown in bold to the applicationDidFinishLaunching: method. This creates an instance of the CLLocationManager class and sets the delegate for the class to be the current class (the application delegate).

- (void)applicationDidFinishLaunching:(UIApplication *)application {

    // Create instance of Main View controller
    MainViewController *aController =
        [[MainViewController alloc]
            initWithNibName:@"MainView" bundle:nil];
    self.mainViewController = aController;
    [aController release];

    // Create instance of LocationManager object
    self.locationManager =
        [[[CLLocationManager alloc] init] autorelease]; 1
    self.locationManager.delegate = self; 2

    // Create instance of WeatherForecast object
    WeatherForecast *forecast = [[WeatherForecast alloc] init];
    self.mainViewController.forecast = forecast;
    [forecast release];

    // Set the main view
    mainViewController.view.frame = [UIScreen mainScreen].applicationFrame;
    [window addSubview:[mainViewController view]];
    [window makeKeyAndVisible];
}

1. This creates the CLLocationManager instance.
2. This sets the delegate for the instance to the current class.

Finally, we have to make sure the CLLocationManager instance is released in the dealloc: method, and implement the two CLLocationManagerDelegate methods we’re going to need. Make the changes shown in bold:

- (void)dealloc {
    [locationManager release];
    [mainViewController release];
    [window release];
    [super dealloc];
}

#pragma mark CLLocationManager Methods

- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
    fromLocation:(CLLocation *)oldLocation { 1

    NSLog(@"Location: %@", [newLocation description]);
    if ( newLocation != oldLocation ) {
        // Add code here
    }
}

- (void)locationManager:(CLLocationManager *)manager
    didFailWithError:(NSError *)error { 2

    NSLog(@"Error: %@", [error description]);
}

1. This is the delegate method to handle changes in location.
2. This is the delegate method to handle any errors that occur.

We’re going to modify the (currently unused) flip side of the Weather application and add a switch (UISwitch), similar to our Battery Monitor application from Chapter 6. This will toggle whether our application should be updating its location. However, let’s modify the FlipsideViewController interface file before we go to the NIB file, adding both a switch and a switchThrown: Interface Builder action that we’ll connect to the switch. I’ve also added a reference to the application delegate. Make the changes shown in bold to FlipsideViewController.h:

@protocol FlipsideViewControllerDelegate;
@class WeatherAppDelegate;

@interface FlipsideViewController : UIViewController {
    id <FlipsideViewControllerDelegate> delegate;

    IBOutlet UISwitch *toggleSwitch;
    WeatherAppDelegate *appDelegate;
}

@property (nonatomic, assign) id <FlipsideViewControllerDelegate> delegate;

- (IBAction)done;
- (IBAction)switchThrown;

@end

In the corresponding implementation (FlipsideViewController.m), import both the Core Location framework and the application delegate interface file:

#import <CoreLocation/CoreLocation.h>
#import "WeatherAppDelegate.h"

Then in the viewDidLoad: method, we need to populate the reference to the application delegate and use the value of the updateLocation Boolean declared earlier to set the state of the UISwitch. Add the lines shown in bold:

- (void)viewDidLoad {
    [super viewDidLoad];
    self.view.backgroundColor = [UIColor viewFlipsideBackgroundColor];

    appDelegate = (WeatherAppDelegate *)
        [[UIApplication sharedApplication] delegate];
    toggleSwitch.on = appDelegate.updateLocation;
}

In the done: method, which is called when the user clicks the Done button to close the flipside view, we set the updateLocation Boolean in the application delegate to the state of the switch. If the user has changed the switch state on the flip side, it will now be reflected in the application delegate. Add the line shown in bold:

- (IBAction)done {
    appDelegate.updateLocation = toggleSwitch.on;
    [self.delegate flipsideViewControllerDidFinish:self];
}

Next, provide an implementation of the switchThrown: method that you’ll attach to the UISwitch in Interface Builder:

-(IBAction)switchThrown {
    NSLog(@"Switch thrown");
    if ( toggleSwitch.on ) {
        [appDelegate.locationManager startUpdatingLocation];
    } else {
        [appDelegate.locationManager stopUpdatingLocation];
    }
}

Finally, remember to release the toggleSwitch inside the dealloc: method:

- (void)dealloc {
    [toggleSwitch release];
    [super dealloc];
}

Now let’s add that switch to the flipside view. Make sure you’ve saved all your changes and then double-click on the FlipsideView.xib file to open it in Interface Builder. Drag and drop a label (UILabel) and a switch (UISwitch) element from the Library window into the Flipside View window. Position them and adjust the attributes (⌘-1) of the label so that your layout looks like Figure 10-3.

Figure 10-3. Adding the UISwitch to the FlipsideView controller

Click File’s Owner, open the Connections Inspector (⌘-2), and connect the toggleSwitch outlet to the UISwitch. Then connect the switchThrown: action to the UISwitch’s Value Changed event. While you’re here, double-click on the navigation bar title and change the text to “Preferences”. Save your changes; we’re done here.

We’ve reached a natural point to take a break and test the application. Save FlipsideView.xib and return to Xcode. Then click the Build and Run button in the Xcode toolbar to compile and deploy the Weather application into the simulator. Once it’s running, click the Info button to go to the flip side of the application and toggle the switch. If you look at the Debugger Console (Run→Console in the Xcode menu bar), you should (after a small amount of time) see something that looks a lot like Figure 10-4.

Figure 10-4. The Weather application reporting the current location (of iPhone Simulator) when the flipside switch is thrown

iPhone Simulator will always report its location as being at Lat. +37.33168900, Long. –122.03073100, corresponding to 1 Infinite Loop, Cupertino, CA.

Quit the simulator. Back in Xcode, click on the MainViewController.h interface file to open it in the editor. Since we’re now going to have multiple locations, we need somewhere to store the name of the location that we’ll get back from the reverse geocoder. So, add an NSString to MainViewController.h (somewhere inside the opening and closing curly braces after the @interface directive) to store the location:

NSString *location;

Then expose this and the UIActivityIndicator (we’re going to use that shortly) as properties. Add the following just before the @end directive:

@property (nonatomic, retain) UIActivityIndicatorView *loadingActivityIndicator;
@property (nonatomic, retain) NSString *location;

Since we’ve declared location and loadingActivityIndicator as properties, go back to the implementation file (MainViewController.m) and add these lines to synthesize those properties:

@synthesize loadingActivityIndicator;
@synthesize location;

Then in the viewDidLoad: method, initialize the location string:

- (void)viewDidLoad {
    [super viewDidLoad];
    location = [[NSString alloc] init];
    [self refreshView:self];
}

Make sure it is released in the dealloc: method:

- (void)dealloc {
    [location release];

    ... rest of the method not shown ...
}

Next, in the refreshView: method, check whether the app is monitoring the device’s location so that you know whether to query the Google Weather Service with the default location (London, UK) or with the current location:

- (IBAction)refreshView:(id)sender {
    [loadingActivityIndicator startAnimating];

    WeatherAppDelegate *appDelegate =
        (WeatherAppDelegate *)[[UIApplication sharedApplication] delegate];

    if( appDelegate.updateLocation ) {
        NSLog( @"updating for location = %@", self.location );
        [forecast queryService:self.location withParent:self];
    } else {
        [forecast queryService:@"London,UK" withParent:self];
    }
}

Since we’ve made use of the application delegate, we need to make sure we import it into the MainViewController implementation. Add this line to the top of the file:

#import "WeatherAppDelegate.h"

Now we’re done with the view controller.

What’s left to do? First, we need to build a class to query the GeoNames reverse geocoder service, and then we need to pass the latitude and longitude to the reverse geocoder service from the CLLocationManager delegate method locationManager:didUpdateToLocation:fromLocation: in the application delegate.

Note

Since we’re going to make use of the JSON service, we need to add the JSON parser to our project in the same way we did in Chapter 8 for the Twitter Trends application. See Parsing JSON in Chapter 8 for details on how to add the json-framework library to your project.

Right-click on the Other Sources group in the Groups & Files pane of the Xcode interface and select Add→New Files. In the New File pop up, make sure Cocoa Touch Class (under iPhone OS) is selected. Next, choose “Objective-C class”, a subclass of NSObject, and click the Next button. Name the new class “FindNearbyPlace” when prompted and click Finish.

Click on the FindNearbyPlace.h interface file and modify the template so that it looks like the following code:

#import <Foundation/Foundation.h>

@class WeatherAppDelegate;

@interface FindNearbyPlace : NSObject {
    WeatherAppDelegate *appDelegate;

    NSMutableData *responseData;
    NSURL *theURL;
}

- (void)queryServiceWithLat:(NSString *)latitude
    andLong:(NSString *)longitude;

@end

Modify the FindNearbyPlace.m implementation file so that it looks like the following code. You may recognize this code from Chapter 8; apart from the connectionDidFinishLoading: method, it’s almost identical to the Trends API code we wrote for the Twitter Trends application:

#import "WeatherAppDelegate.h"

#import "MainViewController.h"

#import "FindNearbyPlace.h"

#import "JSON/JSON.h"

@implementation FindNearbyPlace

- (void)queryServiceWithLat:(NSString *)latitude

andLong:(NSString *)longitude

{

appDelegate = (WeatherAppDelegate *)

[[UIApplication sharedApplication] delegate];

responseData = [[NSMutableData data] retain];

NSString *url = [NSString stringWithFormat:

@"http://ws.geonames.org/findNearbyPlaceNameJSON?lat=%@&lng=%@",

latitude, longitude];

theURL = [[NSURL URLWithString:url] retain];

NSURLRequest *request = [NSURLRequest requestWithURL:theURL];

[[NSURLConnection alloc] initWithRequest:request delegate:self];

}

- (NSURLRequest *)connection:(NSURLConnection *)connection

willSendRequest:(NSURLRequest *)request

redirectResponse:(NSURLResponse *)redirectResponse

{

[theURL autorelease];

theURL = [[request URL] retain];

return request;

}

- (void)connection:(NSURLConnection *)connection

didReceiveResponse:(NSURLResponse *)response

{

[responseData setLength:0];

}

- (void)connection:(NSURLConnection *)connection

didReceiveData:(NSData *)data

{

[responseData appendData:data];

}

- (void)connection:(NSURLConnection *)connection

didFailWithError:(NSError *)error

{

// Handle Error

}

- (void)connectionDidFinishLoading:(NSURLConnection *)connection {

NSString *content =

[[NSString alloc] initWithBytes:[responseData bytes]

length:[responseData length]

encoding:NSUTF8StringEncoding];

NSLog(@"Content = %@", content);

SBJSON *parser = [[SBJSON alloc] init];

NSDictionary *json = [parser objectWithString:content];

NSArray *geonames = [json objectForKey:@"geonames"];

NSString *city = [[NSString alloc] init];

NSString *state = [[NSString alloc] init];

NSString *country = [[NSString alloc] init];

for (NSDictionary *name in geonames) {

city = [name objectForKey:@"name"];

state = [name objectForKey:@"adminCode1"];

country = [name objectForKey:@"countryName"];

}

[parser release];

NSLog( @"Location = %@, %@, %@", city, state, country );

NSString *string = [NSString stringWithFormat:@"%@,%@", city, state];

appDelegate.mainViewController.location = string;1

[appDelegate.mainViewController.loadingActivityIndicator

stopAnimating];2

[appDelegate.mainViewController refreshView: self];3

}

-(void)dealloc {

[appDelegate release];

[responseData release];

[theURL release];

[super dealloc];

}

@end

1. This sets the location string in our MainViewController class.
2. This stops the loading indicator spinning in the MainViewController class.
3. This refreshes the main view managed by the MainViewController class.

Now we have the class to query and parse the reverse geocoder service; we just need to write the code in the locationManager:didUpdateToLocation:fromLocation: delegate method.

Click on the application delegate implementation file (WeatherAppDelegate.m) to open it in the Xcode editor and import the geocoder class by adding this line at the top:

#import "FindNearbyPlace.h"

Next, in the didUpdateToLocation: method, add the code shown in bold:

- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
    fromLocation:(CLLocation *)oldLocation
{
    NSLog(@"Location: %@", [newLocation description]);
    if ( newLocation != oldLocation ) {
        [self.mainViewController.loadingActivityIndicator
            startAnimating]; 1

        FindNearbyPlace *find = [[FindNearbyPlace alloc] init];
        NSString *latitude = [NSString stringWithFormat:@"%f",
            newLocation.coordinate.latitude];
        NSString *longitude = [NSString stringWithFormat:@"%f",
            newLocation.coordinate.longitude];
        [find queryServiceWithLat:latitude andLong:longitude];
    }
}

1. This starts the activity indicator spinning. We’ll stop it when we’ve parsed the JSON returned by the GeoNames service and we’re ready to refresh the view in the connectionDidFinishLoading: method of the FindNearbyPlace class.

Here we simply retrieve the latitude and longitude from the CLLocation object, and we pass them to our FindNearbyPlace class to resolve. There the connectionDidFinishLoading: method takes care of updating the main view controller.

We’re done. Save your changes and click Build and Run to compile and deploy the application in iPhone Simulator. Once it’s running, click the Info button to go to the flip side of the application and toggle the switch. Click the Done button and return to the main view. After a little while the activity indicator in the top-righthand corner should start spinning and the weather information should change from being for London to being for Cupertino, California.

Tidying up

Don’t be fooled. The application has many dangling loose ends to clean up before it can be considered “ready for release.” For instance, in the FindNearbyPlace class we concatenate the city and state to create the location we pass to the Google Weather Service:

city = [name objectForKey:@"name"];

state = [name objectForKey:@"adminCode1"];

NSString *string = [NSString stringWithFormat:@"%@,%@", city, state];

appDelegate.mainViewController.location = string;

While this works for U.S. locations (Cupertino, CA), it fails for British locations where you end up with a string of the form London,ENG, which the Weather service can’t understand.
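One possible patch, sketched here, is to fall back to the country name when the result isn’t in the United States; whether the weather service accepts every “city,country” combination is something you’d need to verify:

// Hypothetical tweak to connectionDidFinishLoading: -- use the state
// abbreviation only for U.S. results, and the country name otherwise
NSString *region = [country isEqualToString:@"United States"] ? state : country;
NSString *string = [NSString stringWithFormat:@"%@,%@", city, region];
appDelegate.mainViewController.location = string;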

However, as it stands, it’s a nice starting point for integrating multiple web services into a single application.

Using the Accelerometer

The iPhone’s accelerometer measures the linear acceleration of the device so that it can report its roll and pitch, but not its yaw.

Note

Yaw, pitch, and roll refer to the rotation of the device in three axes. If you think about an aircraft in the sky, pushing the nose down or pulling it up modifies the pitch angle of the aircraft. However, if you keep the nose pointing straight ahead, you can also modify the roll of the aircraft using the ailerons; one wing will come up and the other will go down. Finally, keeping the wings level, you can use the rudder to change the heading (or yaw) of the aircraft, rotating it in a 2D plane.

If you are dealing with an iPhone 3GS, which has a digital compass, you can combine the accelerometer and magnetometer readings to have roll, pitch, and yaw measurements (see the following section for details on how to access the magnetometer).

The accelerometer reports three figures: X, Y, and Z (see Figure 10-5). Acceleration values for each axis are reported directly by the hardware as G-force values. Therefore, a value of 1.0 represents a load of approximately 1-gravity (Earth’s gravity). X corresponds to roll, Y to pitch, and Z to whether the device is front side up or front side down, with a value of 0.0 being reported when the iPhone is edge-on.

Figure 10-5. The iPhone accelerometer axes

When dealing with acceleration measurements, you must keep in mind that the accelerometer is measuring just that: the linear acceleration of the device. When at rest (in whatever orientation), the figures represent the force of gravity acting on the device, and correspond to the roll and pitch of the device (in the X and Y directions at least). But while in motion, the figures represent the acceleration due to gravity, plus the acceleration of the device itself relative to its rest frame.
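As a rough sketch, given the UIAcceleration object delivered to the delegate method we’ll implement below, and assuming the device is roughly at rest (so the readings are dominated by gravity), you can estimate roll and pitch angles like this:

// Estimate roll and pitch in degrees from the raw G-force readings;
// only meaningful while the device is approximately at rest
double roll  = atan2(acceleration.x, acceleration.z) * 180.0 / M_PI;
double pitch = atan2(acceleration.y, acceleration.z) * 180.0 / M_PI;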

Writing an Accelerometer Application

Note

You can follow along while I build this application in a screencast available on the book’s website.

Let’s implement a simple view-based application to illustrate how to approach the accelerometer. Open Xcode and start a new iPhone project, select a View-based Application template, and name the project “Accelerometer” when prompted for a name.

Before jumping back into Xcode to show you how to use the accelerometer, we’re going to build the UI for the application. Double-click on the AccelerometerViewController.xib NIB file to open it in Interface Builder.

We’re going to both report the raw figures from the accelerometer and display them using a UIProgressView element. So, drag and drop three progress bars along with labels for those bars into the View window. After you do that, it should look something like Figure 10-6. I’ve used two labels for each progress bar: one to hold the X, Y, or Z and the other to hold the accelerometer measurements.

Figure 10-6. The Accelerometer application UI

Make sure you’ve saved your changes, and close Interface Builder and return to Xcode. Click on the AccelerometerViewController.h interface file to open it in the Xcode editor. We’re going to declare three UILabel and three UIProgressView variables as IBOutlets. Since they aren’t going to be used outside the class, there isn’t much point in declaring them as class properties. We’ll also declare a UIAccelerometer instance. Here’s how the AccelerometerViewController.h interface file should look when you are done:

#import <UIKit/UIKit.h>

@interface AccelerometerViewController :
    UIViewController <UIAccelerometerDelegate> { 1

    IBOutlet UILabel *xLabel;
    IBOutlet UILabel *yLabel;
    IBOutlet UILabel *zLabel;

    IBOutlet UIProgressView *xBar;
    IBOutlet UIProgressView *yBar;
    IBOutlet UIProgressView *zBar;

    UIAccelerometer *accelerometer;
}

@end

1. Here we declare that the class implements the UIAccelerometer delegate protocol.

Make sure you’ve saved your changes and click on the corresponding AccelerometerViewController.m implementation file to open it in the Xcode editor. We don’t actually have to do very much here, as Interface Builder is going to handle most of the heavy lifting. Here’s what the file should look like when you are done:

#import "AccelerometerViewController.h"

@implementation AccelerometerViewController

- (void)viewDidLoad {

accelerometer = [UIAccelerometer sharedAccelerometer];1

accelerometer.updateInterval = 0.1;2

accelerometer.delegate = self;3

[super viewDidLoad];

}

- (void)didReceiveMemoryWarning {

[super didReceiveMemoryWarning];

}

- (void)dealloc {

[xLabel release];

[yLabel release];

[zLabel release];

[xBar release];

[yBar release];

[zBar release];

accelerometer.delegate = nil;

[accelerometer release];

[super dealloc];

}

#pragma mark UIAccelerometerDelegate Methods

- (void)accelerometer:(UIAccelerometer *)meter

didAccelerate:(UIAcceleration *)acceleration4

{

xLabel.text = [NSString stringWithFormat:@"%f", acceleration.x];

xBar.progress = ABS(acceleration.x);

yLabel.text = [NSString stringWithFormat:@"%f", acceleration.y];

yBar.progress = ABS(acceleration.y);

zLabel.text = [NSString stringWithFormat:@"%f", acceleration.z];

zBar.progress = ABS(acceleration.z);

}

@end

1. The UIAccelerometer is a singleton object, so we grab a reference to the singleton rather than allocate and initialize a new instance of the class.
2. We set the update interval to 0.1 s, hence the accelerometer:didAccelerate: method will be called 10 times every second.
3. We declare that this class is the delegate for the UIAccelerometer.
4. We implement the accelerometer:didAccelerate: delegate method and use it to set the X, Y, and Z labels to the raw accelerometer readings, and the progress bar values to the absolute value (the value without regard to sign) of the accelerometer reading, each time it is called.

All we need to do now is connect the outlets to the UI elements we created earlier and we’re done. Make sure you’ve saved your changes to the code and double-click on the AccelerometerViewController.xib file to go back into Interface Builder.

Click on File’s Owner, go to the Connections Inspector (⌘-2), and connect the xLabel, yLabel, and zLabel outlets to the appropriate UILabel elements in the View window. Then connect the xBar, yBar, and zBar outlets to the corresponding UIProgressView elements, as shown in Figure 10-7.

OK, we’re done. Save the NIB and return to Xcode. Before you click the Build and Run button, make sure you’ve configured the project to deploy onto your iPhone or iPod touch to test it. Since this application makes use of the accelerometer, and iPhone Simulator doesn’t have one, we’re going to have to test it directly on the device. We covered deploying applications onto your iPhone or iPod touch at the end of Chapter 3.

If all goes well, you should see something that looks a lot like Figure 10-8.

Figure 10-7. Connecting the outlets to the UI elements

Figure 10-8. The Accelerometer application running on an iPod touch sitting face-up on my desk, measuring a 1-gravity acceleration straight down

Using the Digital Compass

In addition to the accelerometer, the iPhone 3GS has a magnetometer that acts as a digital compass. Combining the heading (yaw) information (see Figure 10-9) returned by this device with the roll and pitch information returned by the accelerometer will let you determine the true orientation of the iPhone in real time.

Figure 10-9. Using the magnetometer (a.k.a. the digital compass) in the iPhone 3GS, you can determine the heading (yaw) of the device

You should be aware that the magnetometer is measuring the strength of the magnetic field surrounding the device. In the absence of any strong local fields, these measurements will be of Earth’s ambient magnetic field, allowing the device to determine its “heading” with respect to the geomagnetic North Pole. The geomagnetic heading and true heading, relative to the geographical North Pole, can vary widely (by several tens of degrees depending on your location).

As well as reporting the current location, the CLLocationManager class can, in the case where the device’s hardware supports it, report the current heading of the device. The following code will create an instance of the class, and will send both location and heading update messages to the designated delegate class:

CLLocationManager *locationManager = [[CLLocationManager alloc] init];
locationManager.delegate = self;

if( locationManager.locationServicesEnabled &&
    locationManager.headingAvailable ) 1
{
    [locationManager startUpdatingLocation];
    [locationManager startUpdatingHeading];
} else {
    NSLog(@"Can't report heading");
}

1. It’s even more important to check whether heading information is available than it is to check whether location services are available, as the availability of heading information is currently restricted to iPhone 3GS devices only.

We can filter these update messages based on an angular filter. Changes in heading of less than this amount will not generate an update message to the delegate:

locationManager.headingFilter = 5; // 5 degrees

The default value of this property is kCLHeadingFilterNone. Use this value if you want to be notified of all heading updates.

The CLLocationManagerDelegate protocol offers a method that is called when the heading is updated:

- (void)locationManager:(CLLocationManager *)manager
    didUpdateHeading:(CLHeading *)newHeading
{
    // If the accuracy is valid, process the event.
    if (newHeading.headingAccuracy > 0)
    {
        CLLocationDirection theHeading = newHeading.magneticHeading;
        // Do something with the event data.
    }
}

If location updates are also enabled, the location manager returns both true heading and magnetic heading values. If location updates are not enabled, the location manager returns only the magnetic heading value:

CLLocationDirection trueHeading = newHeading.trueHeading;

As I mentioned previously, the magnetometer readings will be affected by local magnetic fields, so the CLLocationManager will attempt to calibrate its heading readings by displaying a heading calibration panel before it starts to issue update messages. However, before it does so, it will call the locationManagerShouldDisplayHeadingCalibration: delegate method:

- (BOOL)locationManagerShouldDisplayHeadingCalibration:
    (CLLocationManager *)manager {
    ... code not shown ...
}

If you return YES from this method, the CLLocationManager will proceed to display the device calibration panel on top of the current window. The calibration panel prompts the user to move the device in a figure-eight pattern so that Core Location can distinguish between Earth’s magnetic field and any local magnetic fields. The panel will remain visible until calibration is complete or until you dismiss it by calling the dismissHeadingCalibrationDisplay: method in the CLLocationManager class.

Accessing the Proximity Sensor

The proximity and ambient light sensors are two separate sensors. The ambient light sensor is used to change the brightness level of the device’s screen automatically, while the proximity sensor is used by the device to turn the screen off when you put the phone to your ear to make a call. Although it does have an ambient light sensor, the iPod touch does not have a proximity sensor.

Unfortunately, there is no way to access the ambient light sensor in the official SDK. However, developers can access the proximity sensor via the UIDevice class. This sensor is an infrared LED emitter/detector pair positioned near the earpiece, as shown in Figure 10-10. It measures the return reflection of the transmitted infrared beam to detect (large) objects near the phone.

Figure 10-10. The IR LED of the proximity sensor is located near the earpiece

You can enable the sensor in your application by toggling the proximityMonitoringEnabled Boolean:

UIDevice *device = [UIDevice currentDevice];
device.proximityMonitoringEnabled = YES;

You can query whether the proximity sensor is close to the user:

BOOL state = device.proximityState;

If proximity monitoring is enabled, a UIDeviceProximityStateDidChangeNotification notification will be posted by the UIDevice when the state of the proximity sensor changes; you can ask that your application is notified when this occurs by registering your class as an observer with the notification center:

[[NSNotificationCenter defaultCenter]
    addObserver:self selector:@selector(proximityChanged:)
    name:@"UIDeviceProximityStateDidChangeNotification" object:nil];

Notifications would then get received by the proximityChanged: method:

- (void)proximityChanged:(NSNotification *)note {
    UIDevice *device = [note object];
    NSLog(@"In proximity: %i", device.proximityState);
}
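When you no longer need the sensor, it’s good practice to unregister your observer and switch monitoring back off; a minimal sketch:

// Stop observing proximity changes and turn the sensor back off
[[NSNotificationCenter defaultCenter] removeObserver:self
    name:@"UIDeviceProximityStateDidChangeNotification" object:nil];
[UIDevice currentDevice].proximityMonitoringEnabled = NO;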

Using Vibration

Note

The motor that controls vibration is not a sensor; technically, it’s an actuator. Because sensors and actuators generally go hand in hand, we’ll look at the capability here.

Making the iPhone vibrate is a simple system call. You first need to add the AudioToolbox framework to your project (right- or Ctrl-click on Frameworks, then use the Add Existing Frameworks option), and then import the AudioToolbox headers into the class where you intend to trigger the vibration:

#import <AudioToolbox/AudioToolbox.h>

At this point, you can make the device produce a short buzz by calling the following method:

AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);

Unfortunately, despite the fact that the underlying (private) Telephony framework offers relatively subtle levels of control over the vibration pattern, the official support in the SDK is limited to this single call.

Warning

You need to be careful about using the vibration feature. Using continuous vibration, or using a timer to maintain the vibration, is a reason for rejection during the App Store review process.