Sweet, Sweet Music - Learn iOS 8 App Development, Second Edition (2014)

Chapter 9. Sweet, Sweet Music

Choosing and playing music from your iPod library is a great way to add some toe-tapping fun to your app. You can also add your own music and audio effects to actions and games. Both are relatively easy to do, and I’ll get to those straightaway. But don’t stop reading this chapter at that point. Sound in iOS apps exists in a larger world of competing audio sources, real-world events, and an ever-changing configuration of hardware. Making audio work nicely in this demanding, and sometimes complex, environment is the real test of your iOS development skills. This chapter covers the following:

· Choosing tracks from the iPod music library

· Playing music in the iPod music library

· Obtaining the details (title, artist, album, artwork) of a track

· Playing sound files

· Configuring the behavior of audio in your app

· Mixing music with other sounds

· Responding to interruptions

· Responding to hardware changes

Along the way, you’ll pick up some timesaving Xcode tricks, manage view objects without an outlet connection, and learn some mad constraint skills. Are you ready to make some noise?

Note The app you’re about to create will run in the simulator, but the simulator does not have any music in its iPod library. If you want to pick a song and play music, you’ll need a provisioned iOS device.

Making Your Own iPod

The two most common sources for prerecorded sounds in an iOS app are audio resource files and audio files in the user’s iPod library. The app you’ll develop in this chapter plays both—at the same time! It’s a dubbing app that lets you play a track from your iPod’s music library and then spontaneously add your own percussive instrument sounds. So if you’ve ever felt that Delibes’ Flower Duet (Lakmé, Act 1) would sound so much better with a tambourine, this is the app you’ve been waiting for.

Design

Your app design is a simple, one-screen affair that I’ve named DrumDub. At the bottom are controls for choosing a track from your music library and for pausing and resuming playback. At the top you’ll find information about the track that’s playing. In the middle are buttons to add percussive sounds, all shown in Figure 9-1.

image

Figure 9-1. DrumDub rough sketch

You’ll start by building the iPod music playback. Later you’ll add the album artwork, and finally you’ll mix in the percussion sounds. As always, start by creating a new Xcode project.

1. Use the Single View Application template.

2. Name the project DrumDub.

3. Set the language to Swift.

4. Set the device to Universal.

5. Save the project.

6. In the project’s supported interface orientations, find the iPhone/iPod section and turn off landscape left and right, leaving only portrait enabled. (The iPad version can run in any orientation.)

Adding a Music Picker

The first step is to create an interface so the user can choose a song, or songs, from their iPod music library. After Chapter 7 (where you used the photo library picker), you shouldn’t be surprised to learn that iOS provides a ready-made music picker interface. All you have to do is configure it and present it to the user.

You’ll present the music picker interface when the user taps the Song button in the interface. For that you’ll need an action. Start by adding this stub function to your ViewController.swift file:

@IBAction func selectTrack(sender: AnyObject!) {
}

Switch to the Main.storyboard Interface Builder file. In the object library, find the Toolbar object. Drag a toolbar into your interface, positioning it at the bottom of the view. The toolbar already includes a bar button item. Select it and change its Title property to Song. Connect its sent action (Control+right-drag) to the view controller’s selectTrack: action, as shown in Figure 9-2.

image

Figure 9-2. Connecting Song action

Switch back to the ViewController.swift file and finish your selectTrack(_:) function (new code in bold).

@IBAction func selectTrack(sender: AnyObject!) {
let picker = MPMediaPickerController(mediaTypes: .AnyAudio)
picker.delegate = self
picker.allowsPickingMultipleItems = false
picker.prompt = "Choose a song"
presentViewController(picker, animated: true, completion: nil)
}

This code creates a new MPMediaPickerController object that will let the user choose any audio type. The media picker is rather flexible and can be set up to present various types of audio and/or video content on the device. The categories for audio content are as follows:

· Music (MPMediaType.Music)

· Podcasts (MPMediaType.Podcast)

· Audiobooks (MPMediaType.AudioBook)

· iTunes U (MPMediaType.AudioITunesU)

By combining these binary values, you can configure your media picker to present any combination of those categories you desire. The constant MPMediaType.AnyAudio includes all categories, allowing the user to choose any audio item in their library. A similar set of flags allows video content to be selected.

Tip Some parameters, like the mediaTypes parameter in MPMediaPickerController(mediaTypes:), are interpreted not as a single integer value but as a collection of bits or flags. The raw value of each individual MPMediaType constant is a power of 2—a single 1 bit in the integer. You can combine them by logically ORing the values together to form arbitrary combinations, such as (.Music | .AudioBook). The resulting value would present all music tracks and audiobooks but would not let the user pick podcasts or iTunes U content. The convenient .AnyAudio constant is just all-possible audio flags ORed together.
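To make the flag arithmetic concrete, here is a minimal pure-Swift sketch of the same idea. The flag names and raw values are illustrative only; they are not the real MPMediaType raw values.

```swift
// Illustrative flag values only -- not the real MPMediaType raw values.
// Each flag is a distinct power of 2, so each occupies one bit.
let music: UInt     = 1 << 0   // 0b0001
let podcast: UInt   = 1 << 1   // 0b0010
let audioBook: UInt = 1 << 2   // 0b0100
let iTunesU: UInt   = 1 << 3   // 0b1000

// Combine flags with bitwise OR to build an arbitrary set.
let musicAndBooks = music | audioBook                  // 0b0101
let anyAudio = music | podcast | audioBook | iTunesU   // 0b1111

// Test membership with bitwise AND.
func hasFlag(_ set: UInt, _ flag: UInt) -> Bool {
    return set & flag != 0
}
```

Because each flag is a single bit, `hasFlag(musicAndBooks, music)` is true while `hasFlag(musicAndBooks, podcast)` is false. The real `.AnyAudio` constant works the same way: every audio flag ORed together.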

You then make your ViewController object the picker’s delegate. For that to work, your view controller needs to conform to the MPMediaPickerControllerDelegate protocol. Add that to your class declaration now (new code in bold).

class ViewController: UIViewController, MPMediaPickerControllerDelegate {

Next, the option to allow picking multiple tracks at once is disabled. The user will be able to choose only one song at a time. You also set a prompt, or title, so the user knows what you’re asking them to do.

Finally, the controller is presented, allowing it to take over the interface and choose a song. This is enough code to see it working, so give it a try. Set the project’s scheme to your iOS device and click the Run button, as shown on the left in Figure 9-3. The toolbar appears, and you can tap the Song button to bring up the music picker, browse your audio library, and choose a song, as shown on the right in Figure 9-3. If you run the app in the simulator, the picker will work, but there won’t be anything to pick (as shown in the middle of Figure 9-3).

image

Figure 9-3. Testing the audio picker

QUERYING THE IPOD MUSIC LIBRARY

You don’t have to use the media picker to choose items from the user’s iPod library. It’s just the most convenient method.

It’s possible to create your own interface or not have an interface at all. The Media Player framework provides classes that allow your app to explore and search the user’s media collection as if it was a database. (Come to think of it, it is a database, so that description is literally true.)

You do this by creating a query object that defines what you’re searching for. This can be as simple as “all R&B songs” or more nuanced, such as “all tracks longer than 2 minutes, belonging to the ‘dance’ genre, with a BPM tag between 110 and 120.” The result is a list of media items matching that description, which you can present any way you like (cough—table—cough).
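As a rough sketch of the idea (not the real MPMediaQuery/MPMediaPropertyPredicate API), here is how that second, more nuanced query might look if the library were modeled as a plain Swift array. The Track type and its contents are invented for illustration.

```swift
// A simplified stand-in for a media item; the real class is MPMediaItem.
struct Track {
    let title: String
    let genre: String
    let seconds: Int
    let bpm: Int
}

// Hypothetical library contents, for illustration only.
let library = [
    Track(title: "Night Drive", genre: "dance", seconds: 245, bpm: 118),
    Track(title: "Slow Burn",   genre: "dance", seconds: 95,  bpm: 112),
    Track(title: "Back Porch",  genre: "folk",  seconds: 180, bpm: 88),
]

// "All tracks longer than 2 minutes, belonging to the 'dance' genre,
// with a BPM tag between 110 and 120."
let matches = library.filter { track in
    track.seconds > 120 && track.genre == "dance" &&
        track.bpm >= 110 && track.bpm <= 120
}
```

Here only "Night Drive" satisfies all three criteria. A real media query works the same way conceptually: you describe the criteria, and the framework hands back the matching items.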

You can read more about this in the iPod Library Access Programming Guide that you will find in Xcode’s Documentation and API Reference. Read the section “Getting Media Items Programmatically” to get started.

Using a Music Player

What happens next is, well, nothing happens next. When the user picks a track or taps the Cancel button, one of these delegate functions is called:

mediaPicker(_:didPickMediaItems:)
mediaPickerDidCancel(_:)

Nothing happened because you haven’t written either. Start by writing mediaPicker(_:didPickMediaItems:). This function will retrieve the audio track the user picked and start it playing using an MPMusicPlayerController object.

Add the first delegate method to your ViewController class.

func mediaPicker(mediaPicker: MPMediaPickerController!, didPickMediaItems mediaItemCollection: MPMediaItemCollection!) {
if let songChoices = mediaItemCollection {
if songChoices.count != 0 {
musicPlayer.setQueueWithItemCollection(songChoices)
musicPlayer.play()
}
}
dismissViewControllerAnimated(true, completion: nil)
}

The mediaItemCollection parameter contains the list of tracks, books, or videos the user picked. Remember that the picker can be used to choose multiple items at once. Since you set the allowsPickingMultipleItems property to false, your picker will always return a single item.

We double-check to see that at least one track was chosen (just to be sure) and then use the collection to set the music player’s playback queue. The playback queue is a list of tracks to play and works just like a playlist. In this case, it’s a playlist of one. The next statement starts the music playing. It’s that simple.

Note While the music player’s playback queue works just like a playlist, it isn’t an iPod playlist. It won’t appear in the iPod interface as a playlist, and iOS won’t save it for you. If you want this functionality in your app, you can do it yourself. Using what you learned in Chapter 5, present the items in the media collection as a table, allowing the user to reorder, delete, or add new items (using the media picker again) as they like. Call the music player’s setQueueWithItemCollection(_:) function again with the updated collection.

So, what’s the problem with this code? The problem is there is no musicPlayer property yet! Write a read-only computed property for musicPlayer that lazily creates the object.

var musicPlayer: MPMusicPlayerController {
if musicPlayer_Lazy == nil {
musicPlayer_Lazy = MPMusicPlayerController()
musicPlayer_Lazy!.shuffleMode = .Off
musicPlayer_Lazy!.repeatMode = .None
}
return musicPlayer_Lazy!
}
private var musicPlayer_Lazy: MPMusicPlayerController?

Note This code follows two well-used design patterns: singleton and lazy initialization. The code implements a computed musicPlayer property; any code that requests that property (myController.musicPlayer) invokes this code. The code checks to see whether an MPMusicPlayerController object—stored in musicPlayer_Lazy—has already been created. If not, it creates one, configures it, and saves it in the musicPlayer_Lazy instance variable. This happens only once. All subsequent requests to get musicPlayer see that the musicPlayer_Lazy variable is already set and immediately return the (single) object.
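Here is a self-contained sketch of the same lazy-singleton pattern, with a plain class standing in for MPMusicPlayerController so each piece is visible in isolation. All names and the configuration shown are illustrative.

```swift
// Stand-in for MPMusicPlayerController; the settings are illustrative.
class Player {
    var shuffle = true
    var repeats = true
}

class Controller {
    // Backing storage starts empty; nothing is created until first use.
    private var player_Lazy: Player?

    // Read-only computed property: creates and configures the Player
    // exactly once, then returns the same instance ever after.
    var player: Player {
        if player_Lazy == nil {
            let p = Player()
            p.shuffle = false   // mirror turning off shuffleMode
            p.repeats = false   // mirror turning off repeatMode
            player_Lazy = p
        }
        return player_Lazy!
    }
}

let controller = Controller()
let first = controller.player    // creates and configures the object
let second = controller.player   // returns the same object again
```

Because `first` and `second` are the identical instance (`first === second`), the expensive setup runs exactly once, no matter how many times the property is read.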

When you construct an application music player (see the “Application and iPod Music Players” sidebar), the player inherits the current iPod playback settings for things such as shuffle and repeat modes. You don’t want any of that, so you turn them off.

APPLICATION AND IPOD MUSIC PLAYERS

Your app has access to two different music player objects. The application music player belongs to your app. Its current playlist and settings exist only in your app, and it stops playing when your app stops.

You can also request the system music player object, using MPMusicPlayerController.systemMusicPlayer(). The system music player object is a direct connection to the iPod player in the device. It reflects the current state of music playing in the iPod app. Any changes you make (such as pausing playback or altering shuffle mode) change the iPod app. Music playback continues after your app stops.

There’s only one quirk. The system music player object won’t report information about media that’s being streamed, say via home sharing. But other than that, the system music player object is a transparent extension of the built-in iPod app and allows your app to participate in, and integrate with, the user’s current music activity.

Only one music player can be playing at a time. If your app starts an application music player, it takes over the music playback service, causing the built-in iPod player to stop. Likewise, if your application music player is playing and the user starts the system player, your music player is stopped.

Now toss in a delegate function to handle the case where the user declines to choose a track.

func mediaPickerDidCancel(mediaPicker: MPMediaPickerController!) {
dismissViewControllerAnimated(true, completion: nil)
}

Your basic playback code is now complete. Run your app, choose a track, and enjoy the music.

The MPMusicPlayerController object is self-contained. It takes care of all the standard iPod behavior for you. It will, for example, automatically fade out if interrupted by an alarm or incoming call or stop playback when the user unplugs their headphones. I’ll talk a lot more about these events later in this chapter.

That’s not to say you can’t influence the music player. In fact, you have a remarkable amount of control over it. You can start and stop the player, adjust the volume, skip forward or backward in the playlist, set shuffle and repeat modes, change the playback rate, and more. The player will also tell you a lot about what it’s doing and playing. Using these properties and methods, you could create your own, full-featured music player.

For this app, you don’t need a full-featured music player. But it would be nice to at least know what’s playing and be able to pause it. Get ready to add that next.

Adding Playback Control

Start by adding some buttons to pause and play the current song. These buttons will need actions, so add these two methods to your ViewController.swift file:

@IBAction func play(sender: AnyObject!) {
musicPlayer.play()
}

@IBAction func pause(sender: AnyObject!) {
musicPlayer.pause()
}

You’ll also need to update the state of the play and pause buttons, so add some connections for that.

@IBOutlet var playButton: UIBarButtonItem!
@IBOutlet var pauseButton: UIBarButtonItem!

Switch to your Main.storyboard file and add the following objects to the toolbar, inserting them to the left of the Song button, in order, as shown in Figure 9-4:

1. A Flexible Space Bar Button Item

2. A Bar Button Item, changing its style to Plain, changing its identifier to Play, and unchecking Enabled

3. A Bar Button Item, changing its style to Plain, changing its identifier to Pause, and unchecking Enabled

4. A Flexible Space Bar Button Item

image

Figure 9-4. Adding controls to the toolbar

Finally, set all of the connections. Control+right-click the play button and connect its action to the play: action (in the View Controller) and connect the pause button to the pause: action. Select the View Controller object and use the connections inspector to connect the playButton outlet to the play button and to connect the pauseButton outlet to the pause button.

With the interface objects created and connected, consider for a moment how these buttons should work. You want the following:

· The play button to be active (tappable) when the music player is not currently playing

· The play button’s action to start the music playing

· The pause button to be active when the music player is playing

· The pause button’s action to pause the music player

The button’s actions will start and stop the music player. You’ll need to update the enabled state of the buttons whenever the player starts or stops playing. The first part you’ve already done, in the play(_:) and pause(_:) functions. The second half is updating the button states (enabling or disabling them) at the appropriate times, and for that you’ll need to get some information from the music player.

Receiving Music Player Notifications

The music player runs in a background thread. Normally, it plays tracks in its playlist until it runs out and stops. It can also pause in response to external events: the user presses the pause button on their headphone cable, or they unplug the iPod from a dock. How do you think your app will learn about these events?

If you said, “From delegate functions or notifications,” give yourself a big round of applause! Reading the documentation for the MPMusicPlayerController class, you discover that the music player will optionally send notifications whenever important changes occur, which happen to include when it starts or stops playing. To be notified of those events, you’ll need to register your controller object to receive them. As you remember from Chapter 5, to receive notifications you must do the following:

1. Create a notification function.

2. Register with the notification center to become an observer for the notification.
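The two steps above can be modeled with a toy, pure-Swift observer registry. TinyNotificationCenter and its methods are invented for illustration; the real API is NSNotificationCenter, which you’ll use in a moment.

```swift
// A toy notification center -- a simplified model of the observe/post
// flow, not the real NSNotificationCenter API.
class TinyNotificationCenter {
    private var observers: [String: [(String) -> Void]] = [:]

    // Step 2: register interest in a named notification.
    func addObserver(name: String, handler: @escaping (String) -> Void) {
        observers[name, default: []].append(handler)
    }

    // Deliver the notification to every registered observer.
    func post(name: String) {
        for handler in observers[name] ?? [] {
            handler(name)
        }
    }
}

// Step 1: the handler that reacts to the event.
var lastEvent = ""
let center = TinyNotificationCenter()
center.addObserver(name: "PlaybackStateDidChange") { name in
    lastEvent = name
}
center.post(name: "PlaybackStateDidChange")
```

Until the observer is registered, posting the notification does nothing for your object; that’s exactly why both steps are required.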

Start by adding this notification function to your ViewController.swift implementation:

func playbackStateDidChange(notification: NSNotification) {
let playing = ( musicPlayer.playbackState == .Playing )
playButton!.enabled = !playing
pauseButton!.enabled = playing
}

Your notification handler examines the current playbackState of your music player. The player’s playback state will be one of stopped, playing, paused, interrupted, seeking forward, or seeking backward. In this implementation, the only likely states are playing, stopped, interrupted, and paused.

If the player is playing, the pause button is enabled, and the play button is disabled. If it’s not playing, the opposite occurs. This presents the play button as an option whenever the player is not playing and presents the pause button when it is.
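That enabling logic boils down to a pure function of the playback state. This sketch uses a hypothetical PlaybackState enum standing in for MPMusicPlaybackState.

```swift
// Stand-in for MPMusicPlaybackState (only the states this app expects).
enum PlaybackState {
    case stopped, playing, paused, interrupted
}

// Play is tappable whenever the player is NOT playing;
// pause is tappable only while it is.
func buttonStates(for state: PlaybackState) -> (play: Bool, pause: Bool) {
    let playing = (state == .playing)
    return (play: !playing, pause: playing)
}
```

For example, `buttonStates(for: .playing)` enables only pause, while `.paused`, `.stopped`, and `.interrupted` all enable only play, which matches the behavior of the notification handler above.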

Your controller won’t receive these notifications until two additional steps are taken. First, you must register to receive these notifications. In the musicPlayer getter block, add this immediately after the player object is created and configured (new code in bold):

musicPlayer_Lazy = MPMusicPlayerController()
musicPlayer_Lazy!.shuffleMode = .Off
musicPlayer_Lazy!.repeatMode = .None
let center = NSNotificationCenter.defaultCenter()
center.addObserver( self,
selector: "playbackStateDidChange:",
name: MPMusicPlayerControllerPlaybackStateDidChangeNotification,
object: musicPlayer_Lazy)

The second step is to enable the music player’s notifications. MPMusicPlayerController does not, by default, send these notifications. You must explicitly request that it does. Immediately after the previous code, add one more line.

musicPlayer_Lazy!.beginGeneratingPlaybackNotifications()

Your playback controls are now finished. Run your app and see that they work, as shown in Figure 9-5.

image

Figure 9-5. Working playback controls

Both buttons start out disabled. When you choose a track to play, the pause button becomes active (the middle of Figure 9-5). If you pause the song or let it finish playing, the play button becomes active (on the right in Figure 9-5).

MVC AT WORK

You’re watching the model-view-controller design pattern at work—again. In this scenario, your music player (despite the fact it’s called a “music controller”) is your data model. It contains the state of the music playback. Whenever that state changes, your controller receives a notification and updates the relevant views—in this case, the play and pause buttons.

You didn’t write any code to update the play or pause button when you start or stop the player. Those requests are just sent to the music player. If one of those requests results in a state change, the music player posts the appropriate notifications, and the affected views are updated.

While functional, your app lacks a certain je ne sais quoi. Oh, who are we kidding? This interface is as dull as dishwater! Let’s spruce it up a bit.

Adding Media Metadata

A colorful aspect of the music player object is its nowPlayingItem property. This property returns an object containing metadata about the song that’s playing. The object works like a dictionary, revealing all kinds of interesting tidbits about the current song. This includes information such as its title, the artist, the track number, the musical genre, any album artwork, and much more.

Note Metadata is “data about data.” A file, like a document, contains data. The name of that file, when it was created, and so on, is its metadata—it’s data that describes the data in the file. A waveform stored in a song file is data. The name of the song, the artist, and its genre are all metadata.

For your app, you’ll add an image view to display the album’s cover and text fields to show the song’s title, the album it came from, and the artist. Start by adding new interface objects to Main.storyboard.

Creating a Metadata View

You’re going to add an image view and a few label views to the interface. The image view will display the song’s album artwork, while the label views will show the song, artist, and album currently playing (see Figure 9-1). But this isn’t a one-size-fits-all layout. The image and label views that would fit on an iPhone would look odd and puny on an iPad. And the layout that would look good on an iPad would look strange on an iPhone. So, what do you do?

The answer is in adaptive constraints, which are new in iOS 8. As you remember from Chapter 2, the view controller is associated with a size class. A size class is a broad indication of the space available for your interface. There are only two size classes: Regular (there’s plenty of room for your interface to spread out) and Compact (your interface needs to be tight). These aren’t indications of actual device sizes—although you can get that information if your app needs it. They’re intended to make it easy to create alternate layouts that work in a variety of device sizes and orientations, without getting tangled up in the specifics.

Your interface has two size classes, one for horizontal and one for vertical. For example, if you’re using an iPhone 5 in portrait orientation, your view’s size class will be Compact/Regular. This means the horizontal size class is Compact, and the vertical size class is Regular.

The constraint sets you’re about to create are the most complex ones in this book. But you’ll also see that, with a little planning, it’s not difficult to create sophisticated sets of layout constraints that intelligently adapt to different devices and orientations, without writing a single line of code.

Adding the Image View

Select the Main.storyboard file. Using the object library, find the Image View object and add one to the interface. Position it (roughly) in the upper-left corner of the view, as shown in Figure 9-6.

image

Figure 9-6. Adding the album view

The size and position of the image view will be determined by its constraints. There’s only one rule for constraints; you must add enough constraints so that iOS can unambiguously determine the view’s position (horizontal and vertical) and its size (height and width). And you can’t add constraints that conflict. OK, that’s two rules.

Coming up with constraints for a particular layout or device size is pretty easy. What’s fun is coming up with sets of constraints so your interface will lay out nicely on all devices, in all supported orientations. For DrumDub, you need only two layouts:

· For the iPhone/iPod:

o Artwork is smaller and nestled in the upper-left corner.

o The song info labels fill the space to the right.

· For the iPad:

o Artwork view is larger.

o The artwork image and song info labels split the screen, so the image is to the left of center, and the labels are to the right.

This requires two sets of constraints, one for a Compact/Regular (iPhone) interface and one for a Regular/Regular (iPad) interface. Furthermore, there are some constraints that are common to both interfaces. All of the needed constraints are shown in Table 9-1.

Table 9-1. Artwork Image View Constraint Sets

· Any/Any:

o Top of image is just below the top layout guide

· Compact/Any:

o Left edge is against the left edge of the superview

o Size is 160x160

· Regular/Any:

o Right edge is against the horizontal center of the superview

o Size is 300x300

Constraints you add in Interface Builder form a hierarchy. The constraints you add to the Any/Any category will always be applied to your interface. Constraints that you add to the Compact/Any category will be applied to your interface only when it appears in a Compact/Regular or Compact/Compact environment. Likewise, constraints you add to the Regular/Any category will be applied only when your interface appears in a Regular/Regular environment. You can also get very specific, adding constraints that are active only when the interface is Compact/Regular or Compact/Compact. Let’s get started.

Adding the Universal Constraints

You need one constraint—the top edge position—to be applied in all cases. Still in the Main.storyboard file, make sure the size class category at the bottom of the Interface Builder canvas is set to wAny/hAny (any width, any height). Select the UIImageView object and click the Pin Constraints control, as shown in Figure 9-7. Select the top constraint and set its value to Use Standard Value. Add the constraint.

image

Figure 9-7. Setting top constraint for all size classes

Adding Compact/Any Constraints

The next step is to add the constraints that apply only when the horizontal size is compact. Click the size class control at the bottom of the canvas and drag until the matrix says Compact Width | Any Height, as shown on the left in Figure 9-8. Notice that the view controller canvas gets a little narrower, suggesting a more compact device.

image

Figure 9-8. Adding Compact/Any constraints

Any constraints you add now will be applied only when the interface’s horizontal size class is compact. Select the image view object, click the pin constraints control, add a leading edge constraint set to Use Standard Value, and add both Height and Width constraints set to 160 pixels, as shown on the right in Figure 9-8.

When the horizontal size class is compact (iPhone or iPod in any orientation), the image view will be positioned in the upper-left corner and be 160 pixels high and wide. Now let’s move on to the iPad layout.

Adding Regular/Any Constraints

Repeat the steps you took for the iPhone interface. Change the size class control to wRegular/hAny, as shown on the left in Figure 9-9. Select the image view, click the pin constraints control, and add Height and Width constraints both set to 300 pixels (as shown in the middle in Figure 9-9). Click the align constraint control and add a Horizontal Center in Container constraint with a value of 150 (as shown on the right in Figure 9-9).

image

Figure 9-9. Adding Regular|Any constraints

When the horizontal size class is regular (iPad), the image view will be positioned to the left of center and be 300 x 300 pixels. If the value of the Horizontal Center in Container constraint was 0, the center of the view would be centered in the superview. Adding an offset of half the width positions the right edge of the view at the center of the superview instead. With the image view constraints for all possible size classes finished, turn your attention to the label views.

Adding the Song Labels

You’re going to add three labels. You want those labels to align with the album art view and fill the right side of the interface, for all sizes and orientations.

Before adding the label objects, switch back to the wAny/hAny size class, as shown on the left in Figure 9-10. Drag in a label object, align it with the top edge of the image view, and stretch it so it fills the space to the right, as shown on the right in Figure 9-10. Set its font size to System 14.0.

image

Figure 9-10. Adding the first song label object

Caution Make sure the size class is set to wAny/hAny before adding new objects to your interface, or you’ll be adding those views to the layout only for that size class. Interface Builder allows you to add objects that will appear only in specific size classes, but that’s not what you want here.

Make two duplicates of the first label—hold down the Option key and drag a copy to a new location—so you have three, as shown on the left in Figure 9-11. Select the top label and the image view, click the align constraints control, and add a Top Edges constraint, as shown on the right in Figure 9-11. This will vertically position the label view so its top edge is the same as the image view’s top edge.

image

Figure 9-11. Duplicating label and top alignment

Select all three labels. Click the Pin Constraints control and add leading edge, trailing edge, and height constraints for all three label objects, as shown on the left in Figure 9-12. Notice that the button says that you’re about to add nine constraints—three constraints for three label objects.

image

Figure 9-12. Setting label constraints

Now add vertical spacing constraints between the top label and the next two. You could do this using the constraint controls, as you have been doing, but here’s another way. If you want to establish a constraint between two specific views, Control+click one view (the top label) and drag it to the other (the middle label), as shown on the right in Figure 9-12. When you release the mouse button, a pop-up menu will appear. Choose the constraint you want to establish. In this case, choose Vertical Spacing. Repeat, creating another vertical spacing constraint between the second and third labels.

Tip The Control+drag method of adding constraints is quick and specific, but you can’t modify the value of the constraint at the same time. If you need to change its value after the fact, select the constraint and edit its value using the attributes inspector.

Your constraints are now complete. The constraints you added for the label objects are the same for all size classes. The beautiful thing is that those constraints are relative to the position of the image view, and that will change because it has a different set of constraints for compact and regular-width devices.

Previewing Your Layouts

It can sometimes be difficult to predict how constraints will affect your layout, and the effects of conditional constraints in different size classes can make your head start to spin. Fortunately, Xcode is here to help.

Switch to the assistant view. The right pane should display the ViewController.swift file. At the top of the editing pane on the right, choose Preview from the navigation menu, as shown at the top in Figure 9-13. The right pane will now show you how your layouts, as interpreted by Interface Builder, should appear on specific devices.

image

Figure 9-13. Previewing layouts for different devices

Note These are live previews. If your layouts are not looking the way they should, continue to edit your views and constraints in the left pane, and the effects will immediately appear on the right.

Click the + button in the lower-left corner of the pane to add new devices to the preview. In Figure 9-13, I’m previewing this layout on both a 4-inch iPhone and an iPad simultaneously. You can rotate a preview to check landscape orientation or even add both portrait and landscape previews simultaneously—assuming you have a really big monitor. Click a preview and press the Delete key to remove it.

Finishing the Album Interface

For the final touch, select the three label views and use the attributes inspector to change their font color to White. Select the root view object and change its background color to Black.

While still in the assistant editor, use the navigation ribbon to switch back to the Automatic view. The ViewController.swift file will reappear in the right-hand pane. Add these four outlets:

@IBOutlet var albumView: UIImageView!
@IBOutlet var songLabel: UILabel!
@IBOutlet var albumLabel: UILabel!
@IBOutlet var artistLabel: UILabel!

Now connect them to the image and label views, as shown in Figure 9-14.

image

Figure 9-14. Connecting the album views

You’ve done a lot of cool work in this section. You’ve created a layout that uses adaptive constraints to adjust itself, automatically, for different device sizes and orientations. Switch back to the standard editor (View ➤ Standard Editor ➤ Show Standard Editor) and select the ViewController.swift file. It’s time to write the code to update these new interface objects.

Observing the Playing Item

The music player object also sends notifications when the item being played changes. This occurs when a new song starts playing or one finishes playing. The notification is different from the one your controller is currently observing, so you’ll need to create another notification handler and register to observe it.

Near the playbackStateDidChange(_:) function, add your new notification handler function.

func playingItemDidChange(notification: NSNotification) {
    let nowPlaying = musicPlayer.nowPlayingItem

    var albumImage: UIImage!
    if let artwork = nowPlaying?.valueForProperty(MPMediaItemPropertyArtwork)
                                                     as? MPMediaItemArtwork {
        albumImage = artwork.imageWithSize(albumView.bounds.size)
    }
    if albumImage == nil {
        albumImage = UIImage(named: "noartwork")
    }
    albumView.image = albumImage

    songLabel.text =
        nowPlaying?.valueForProperty(MPMediaItemPropertyTitle) as? NSString
    albumLabel.text =
        nowPlaying?.valueForProperty(MPMediaItemPropertyAlbumTitle) as? NSString
    artistLabel.text =
        nowPlaying?.valueForProperty(MPMediaItemPropertyArtist) as? NSString
}

The method gets the nowPlayingItem property object. Rather than have a bunch of fixed properties (like typical objects), the MPMediaItem object contains a variable number of property values that you request via a key. A key is a fixed value—typically a string—that identifies the value you’re interested in.

The first thing you ask for is the MPMediaItemPropertyArtwork value. This value will be a MPMediaItemArtwork object that encapsulates the album artwork for the song. You then request a UIImage object, optimized for the size of your image view.

Tip MPMediaItemArtwork objects may store multiple versions of the item’s artwork, at different sizes and resolutions. When requesting a UIImage of the artwork, specify a size as close as possible to the size you plan on displaying the image, so the media item object can return the best possible image for that size.

The thing to remember about media metadata is that there are no guarantees. Any song in the iPod library might have values for title, artist, and artwork. Or, it might not have any of those values. Or, it might have a title and artist but no artwork, or artwork and no title. The bottom line is, be prepared for the case where something you ask for isn’t available.

In this app, you test to see whether MPMediaItemArtwork declined to return a displayable image (albumImage==nil). In that case, replace the image with a resource image named “noartwork.”

For that statement to work, you’ll need to add the noartwork.png and noartwork@2x.png files to your project. Select the Images.xcassets item in the navigator. Find the Learn iOS Development Projects > Ch 9 > DrumDub (Resources) folder and drag the noartwork.png and noartwork@2x.png files into the asset catalog.

The last three statements repeat this process, obtaining the title, album title, and artist name for the item. In this code you don’t have to worry about missing values. If an item doesn’t have an album name (when requesting MPMediaItemPropertyAlbumTitle), the media item will return nothing. It just so happens that setting a UILabel’s text property to nothing blanks the view—exactly what you want to happen if there’s no album name.

The last step is observing the item changed notifications. Find the musicPlayer property getter code. Find the code that observes the playback state changes and insert this new statement:

center.addObserver( self,
    selector: "playingItemDidChange:",
    name: MPMusicPlayerControllerNowPlayingItemDidChangeNotification,
    object: musicPlayer_Lazy)

Now whenever a new song starts playing, your controller will receive a “playing item did change” notification and display that information to the user. Give it a try.

Run your app, select a song, and start it playing. The song information and artwork display, as shown in Figure 9-15. If you let the song play to its end, the information disappears again.

image

Figure 9-15. Album artwork and song metadata

The only thing I don’t like about this interface is that the artwork view and the three label views are filled with the placeholder information when the app launches. Fix that in the Main.storyboard file by clearing the text property of the three label objects and setting the image view’s initial image to noartwork.png.

Make Some Noise

So far, you’ve essentially created a (minimal) iPod app. That’s an impressive feat, but it isn’t the only way to add sound to your app. You may want to add sound effects to actions or play music files that you’ve bundled. Maybe you want to play live audio streams from a network data source. Those are all easy to do, even easier than playing songs from the iPod library—which was pretty easy.

I’ll get the easy part out of the way first. To play and control almost any kind of audio data your app has access to, follow these steps:

1. Create an AVAudioPlayer object.

2. Initialize the player with the source of the audio data, typically a URL to a resource file.

3. Call its play() function.

And just like the MPMusicPlayerController, the AVAudioPlayer object takes care of all of the details, including notifying your delegate when it’s done.
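Those three steps translate into just a few lines of code. Here’s a minimal sketch, assuming a bundled sound file named tap.m4a; the file name and the SoundMaker class are illustrative, not part of DrumDub:

```swift
import AVFoundation

class SoundMaker: NSObject {
    // Keep a strong reference to the player; a local variable would be
    // deallocated before the sound finished playing.
    var player: AVAudioPlayer?

    func playTapSound() {
        // Steps 1 and 2: create the player from a resource file URL
        if let soundURL = NSBundle.mainBundle().URLForResource( "tap",
                                                withExtension: "m4a") {
            player = AVAudioPlayer(contentsOfURL: soundURL, error: nil)
            // Step 3: start (asynchronous) playback
            player?.play()
        }
    }
}
```

Notice that the player object is stored in a property. If you created it in a local variable, it would be destroyed when the function returned, cutting the sound off before it played.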

So, you might be thinking that it won’t take more than a dozen lines of code and some buttons to finish this app, but you would be mistaken.

Living in a Larger World

What makes playing audio in this app complicated is not the code to play your sounds. The complication lies in the nature of iOS devices and the environment they exist in.

Consider an iPhone. It’s a telephone and a videophone; audio is used to indicate incoming calls and play the audio stream from the caller. It’s a music player; you can play your favorite music or audiobook, or stream Internet radio, even while using other apps. It’s an alarm clock; timers can remind you of things to do any time of the day or night. It’s a game console; games are full of sounds, sound effects, and ambient music. It’s a pager; messages, notifications, and alerts can occur for countless reasons, interrupting your work (or play) at a moment’s notice. It’s also a video player, TV, answering machine, GPS navigator, movie editor, Dictaphone, and digital assistant.

All of these audio sources share a single output. To do that effectively—creating a pleasant experience for the user—all of these competing audio sources have to cooperate. Game sounds and music playback have to stop when a telephone call arrives. Background music needs to temporarily lower its volume if the user is expected to hear a reminder or recorded message. iOS refers to these as interruptions.

Adding to this complexity, iOS devices have many different ways of producing sound. Consider the built-in speakers, the headphone jack, wireless Bluetooth devices, AirPlay, and the dock connecter; iOS calls these audio routes. Audio can be directed to any one of these and switched to a different one at any time (called a route change). Audio playback must be aware of this, and your app may need to react to those changes. For example, Apple recommends that unplugging the headphones should cause music playback to pause, but game sound effects should continue playing.

And just to add one more dash of complication, most iOS devices have a ring/silence switch. Audio that’s intended as an alert, alarm, embellishment, or sound effect should play only when the ring switch is in its normal position. More deliberate audio, such as movies and audiobooks, should play normally, even when the silence switch is engaged.

Taken together, your app needs to do the following:

· Decide the intent and purpose of each source of audio in your app

· Declare this purpose so iOS can adjust its behavior to accommodate your audio

· Observe interruptions and audio route changes and take appropriate action

The good news is that not every audio-endowed app you write has to do all of these things. In fact, if you only use the iPod music player or only play incidental sounds using AVAudioPlayer objects, you probably don’t have to do anything at all. Both of these classes will “do the right thing.”

For an app like DrumDub, however, that wants to manage its own music playback while mixing in additional sound effects, all of these steps need to be taken. So, before you start adding sound effects to your app, lay some of the groundwork.

Configuring Your Audio Session

You communicate your intent—describe the kinds of sounds your app will make and how those will affect other audio sources—to iOS through an audio session. Every iOS app gets a generic audio session, preconfigured with a basic set of behaviors. That’s why if you play music only through a music player controller, you don’t have to do anything special; the default audio session is just fine.

DrumDub needs to both play back and mix audio. This is unusual, so it will need to reconfigure its audio session. Apps that only play audio can typically configure their audio session once and leave it.

Note Apps that record audio, or record and playback audio, are more complicated. They must repeatedly reconfigure their audio session as they switch between recording, playback, and processing.

In your AppDelegate.swift file, you’ll find the code for your app’s delegate object. One of the functions in your app’s delegate is the application(_:,didFinishLaunchingWithOptions:) function. As the name implies, it’s called immediately after your app has loaded and initialized and is about to start running. It’s the perfect place to put code that needs to run just once and run before anything else gets underway. Add the following code (in bold) to the beginning of that function:

func application(application: UIApplication!, didFinishLaunchingWithOptions
                 launchOptions: NSDictionary!) -> Bool {
    let audioSession = AVAudioSession.sharedInstance()
    audioSession.setCategory( AVAudioSessionCategoryPlayback,
                              withOptions: .MixWithOthers,
                              error: nil)
    return true
}

An audio session has a category and a set of options. There are seven different categories to choose from, as listed in Table 9-2.

Table 9-2. Audio Session Categories

AVAudioSessionCategoryAmbient: Plays background audio or nonessential sound effects. The app will work just fine without them. App audio mixes with other audio (like your iPod) playing at the same time. The ring/silence switch silences the app’s audio.

AVAudioSessionCategorySoloAmbient: Plays nonessential audio that does not mix with other audio; other audio sources are silenced when the app plays its audio. The ring/silence switch silences the app’s audio. This is the default category.

AVAudioSessionCategoryPlayback: Plays music or other essential sounds. In other words, audio is the principal purpose of the app, and it wouldn’t work without it. The ring/silence switch does not silence its audio.

AVAudioSessionCategoryRecord: Records audio.

AVAudioSessionCategoryPlayAndRecord: Plays and records audio.

AVAudioSessionCategoryAudioProcessing: Performs audio processing (using the hardware audio codecs), while neither playing nor recording.

AVAudioSessionCategoryMultiRoute: Needs to output audio to multiple routes simultaneously. A slideshow app might play music through a dock connector while simultaneously sending audio prompts through the headphones.

The default category is AVAudioSessionCategorySoloAmbient. For DrumDub, you’ve decided that audio is its raison d'être—its reason to exist—so you call the setCategory(_:,withOptions:,error:) function to change its category to AVAudioSessionCategoryPlayback. Now your app’s audio won’t be silenced by the ring/silence switch.

You can also fine-tune the category with a number of category-specific options. The only option for this playback category is AVAudioSessionCategoryOptionMixWithOthers. If set, this option allows audio played with your AVAudioPlayer objects to “mix” with other audio playing at the same time. This is exactly what you want for DrumDub. Without this option, playing a sound would stop playback of the song.

The code you just added is being flagged with errors. That’s because all of these symbols are defined in the AVFoundation framework, so you’ll need to import those definitions to use them. Add this statement before all the other import statements in AppDelegate.swift:

import AVFoundation

See, that wasn’t too hard. In fact, there was a lot more explanation than code. With your audio session correctly configured, you can now add (mix in) sound effects with your music.

Playing Audio Files

You’re finally at the heart of your app’s design: playing sounds. You’re going to have four buttons, each playing a different sound. To implement this, you will need the following:

· Four button objects

· Four images

· Four AVAudioPlayer objects

· Four sampled sound files

· An action function to play a sound

It will be easier to build the interface once you have added the resources and defined the action function, so start there. Find your Learn iOS Development Projects > Ch 9 > DrumDub (Resources) folder and locate the 12 files in this table:

Sound Sample      Button Image      Retina Display Image

snare.m4a         snare.png         snare@2x.png
bass.m4a          bass.png          bass@2x.png
tambourine.m4a    tambourine.png    tambourine@2x.png
maraca.m4a        maraca.png        maraca@2x.png

Begin by adding the button image files. Select the Images.xcassets asset catalog item. In it, you’ll see the noartwork resource you added earlier. Drag the eight instrument image files (two each of snare, bass, tambourine, and maraca) into the asset catalog’s group list, as shown in Figure 9-16.

image

Figure 9-16. Adding image resources

While you’re here, select the AppIcon group of the asset catalog and drag the app icon image files into it, as you’ve done for earlier projects.

The four sound files (bass.m4a, maraca.m4a, snare.m4a, and tambourine.m4a) will also become resource files, but they’re not the kind of resources managed by an asset catalog. You can add any kind of file you want directly to a project and have that file included as a resource in your app’s bundle.

For the sake of neatness, begin by creating a new group for these resource files. Control+click/right-click the DrumDub group (not the project) in the navigator and choose the New Group command, as shown on the left in Figure 9-17.

image

Figure 9-17. Adding nonimage resources

Name the group Sounds, as shown in the middle of Figure 9-17. Locate the four sound sample files in the Finder and drag them into the group, as shown on the right of Figure 9-17. If you miss and add the items to the DrumDub group instead, select them in the navigator and drag them into the Sounds group. You can always reorganize your project items as you please.

After dropping your items into the navigator, Xcode presents some options that determine how the items will be added to your project, as shown in Figure 9-18. Make sure the Copy items into destination group’s folder (if needed) option is checked. This option copies the new items into your app’s project folder. The second option (Create groups for any added folders) applies only when adding folders full of resource files.

image

Figure 9-18. Add project file options

Caution If you fail to check the Copy items into destination group’s folder (if needed) option, Xcode will add only a reference to the original item, which is still outside your project’s folder. This works fine, until you rename one of the original files, move your project, or copy it to another system—then your project suddenly stops building. Save yourself some grief and keep all of your project’s resources inside your project folder.

Finally, make sure the DrumDub target is checked, as shown in Figure 9-18. This option makes these items members of the DrumDub app target, which means they’ll be included as resource files in your finished app. (If you forget to check this, you can later change the target membership of any item using the file inspector.) Click Finish, and Xcode will copy the sound sample files into your project folder, add them to the project navigator, and include them in the DrumDub app target. These files are now ready to be used in your app.

Creating AVAudioPlayer Objects

You’ll play the sound sample files using AVAudioPlayer objects. You’ll need four. Rather than creating four AVAudioPlayer variables and writing four play actions, create one array to hold all of the objects and one function to play any of them. Start with the AVAudioPlayer objects. Add these statements to your ViewController.swift file:

let soundNames = [ "snare", "bass", "tambourine", "maraca" ]
var players = [AVAudioPlayer]()

The first statement declares a constant array of strings that contains the names of the sound resource files. The players variable is an array of AVAudioPlayer objects, initialized to an empty array. The syntax looks a little weird, but it’s just creating a new object. The new object is an array containing AVAudioPlayer objects ([AVAudioPlayer]), created using the default object initializer (()).

Do those compiler errors look familiar? Just as you did for the AppDelegate class, add this import statement to the beginning of your ViewController.swift file:

import AVFoundation

The functions createAudioPlayers() and destroyAudioPlayers() will create and destroy all four audio player objects at once. Add them now.

func createAudioPlayers() {
    destroyAudioPlayers()
    for soundName in soundNames {
        if let soundURL = NSBundle.mainBundle().URLForResource( soundName,
                                                withExtension: "m4a") {
            let player = AVAudioPlayer(contentsOfURL: soundURL, error: nil)
            player.prepareToPlay()
            players.append(player)
        }
    }
}

func destroyAudioPlayers() {
    players = []
}

createAudioPlayers() loops through the array of sound name constants (soundNames) and uses that to create a URL that refers to the m4a sound resource file that you added earlier. This URL is used to create and initialize a new AVAudioPlayer object that will play that sound file.

Some optimization is then applied. The prepareToPlay() function is called on each sound player. This preps the player object so that it is immediately ready to play its sound. Finally, the new player object is appended to the players array. When the loop is finished, you’ll have an array of AVAudioPlayer objects, each configured to play the corresponding sound file in the soundNames array.

Note Normally, player objects prepare themselves lazily, waiting until you request them to play before actually reading the sound sample data file, allocating their buffers, configuring hardware codecs, and so on. All of this takes time. When your user taps a sound button, they don’t want to wait for the sound to play; they want it to play immediately. The prepareToPlay() function eliminates that initial delay.

The destroyAudioPlayers() function is self-explanatory, and you don’t need it yet. It will come into “play” later.

Next up are the buttons to play these sounds and the action function to make that happen. Start by adding a stub bang(_:) action.

@IBAction func bang(sender: AnyObject!) {
}

Now you’re ready to design the interface.

Adding the Sound Buttons

Return to your Main.storyboard Interface Builder file. Drag in a new UIButton object. Select it and do the following:

1. Use the attributes inspector to

a. Set its type property to Custom

b. Clear its title text property (deleting Button)

c. Set its image property to snare

d. Scroll down to its tag property and change it from 0 to 1

2. Select the button and use the Pin Constraints control to add both a height and a width constraint, both set to 100 points.

3. Use the connections inspector to connect its Touch Down event to the new bang: action of the View Controller object (see Figure 9-19).

image

Figure 9-19. Creating the first bang button

There are a couple of noteworthy aspects to this button’s configuration. First, you’ve connected the Touch Down event, instead of the more common Touch Up Inside event. That’s because you want to call the bang(_:) action function the instant the user touches the button. Normally, buttons don’t send their action message until the user touches them and releases again, with their finger still inside the button—thus, the name Touch Up Inside.

Second, you didn’t create an outlet to connect to this button. You’re going to identify, and access, the object via its tag property. All UIView objects have an integer tag property. It exists solely for your use in identifying views; iOS doesn’t use it for anything else. You’re going to use the tag to determine which sound to play and later to obtain the UIButton object in the interface.
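As a quick sketch of the technique, here’s how a tag recovers a view later, with no outlet involved (tag 1 is the snare button you just configured):

```swift
// Somewhere in the view controller: find the subview whose tag is 1
if let snareButton = view.viewWithTag(1) as? UIButton {
    snareButton.enabled = false   // then disable it, swap its image, and so on
}
```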

Duplicate the new button three times, to create four buttons in all. You can do this either by using the clipboard or by holding down the Option key and dragging out new copies of the button.

All of the buttons have the same type, image, tag, constraints, and action connection. Use the attributes inspector to change the image and tag properties of the three duplicates, using the following table:

Image

Tag

bass

2

tambourine

3

maraca

4

Your interface should now look like the one in Figure 9-20.

image

Figure 9-20. Configured sound buttons

Now you need to add constraints to position the buttons in the interface. The buttons are too wide to be in a single row on compact devices, like an iPhone. On the other hand, you wouldn’t want them all bunched together in a tiny group on a big iPad. If only there were some way the layout could adapt itself to different devices and orientations….

Of course, I’m joking. You’ll use the same technique you used earlier with the album artwork and song labels. This time, you’ll create a completely different set of constraints that will fundamentally change how the buttons are laid out on different devices. Let’s get started.

Button Layout for Compact Interfaces

You’ve already added height and width constraints for all four buttons in the wAny/hAny size class. Therefore, the buttons will be 100x100 points in all layouts. All you have to add now are the constraints to position them. Switch to the wCompact/hAny size class and add these constraints:

1. Select the button with tag 1 (the snare drum).

a. Add a Horizontal Center in Container constraint with a value of 60.

b. Add a Vertical Center in Container constraint with a value of -8 (left image in Figure 9-21).

image

Figure 9-21. Button constraints for the Compact/Any size class

These constraints are sufficient to position the first button just to the left, and slightly below, the center point of the interface. This will be your “anchor” button. All of the remaining constraints will be relative to that button.

2. Drag the button with tag 2 (the bass drum) so it’s to the right of the first button.

3. Select both (or Control+drag between them) and add a Top Edges alignment constraint (middle of Figure 9-21) with a value of 0.

4. Select the right button and add a Leading Edge space constraint with a value of 20, as shown on the right in Figure 9-21. Make sure the leading edge is relative to the other button and not the container view.

You’ve positioned the second button 20 pixels to the right, and at the same vertical position, as the first button. Keep adding constraints to position the remaining two buttons:

5. Drag the third button below the first.

6. Select the first and third buttons and add a Leading Edge alignment constraint with a value of 0.

7. Add a Vertical Spacing constraint between the first and third buttons, with a value of 20.

8. Drag the last button in to fill out the square.

9. Add a Top (or Bottom, your choice) Edges alignment constraint with the third button with a value of 0.

10.Add a Trailing (or Leading, your choice) Edges alignment constraint with the second button with a value of 0.

Tip Having trouble selecting a button? When the Interface Builder canvas is resized, view objects can “fall off” the edges where you can’t see or select them. Don’t worry. All of the scene’s objects are listed to the left. Double-click the missing button object in the view hierarchy; this will select it in the canvas (even if you can’t see it). Use the arrow keys to move the view (hold down the Shift key to move faster) or use the size inspector to change its origin so it’s visible in the canvas again.

These final two constraints position the last button relative to the second (horizontally) and the third (vertically), completing the grid. The finished layout is shown in Figure 9-22.

image

Figure 9-22. Finished compact button layout

Button Layout for Regular Interfaces

Now perform these steps again for regular width environments. For iPads, the buttons don’t need to be grouped tightly together. Let’s create a different set of constraints that let them breathe a little.

Switch to the wRegular/hAny size class. All of the constraints you just added disappear. This time start with the button with tag 2 (the bass drum). This will be the “anchor” for this layout.

1. Select the second button (the bass drum).

a. Add a Horizontal Center in Container constraint with a value of 70.

b. Add a Vertical Center in Container constraint with a value of -160 (see Figure 9-23).

image

Figure 9-23. Button constraints for Regular/Any size class

Just as you did before, you’ve positioned this button to the left, and below, the relative center of the display. This time the distances are a little more generous because the display is bigger.

2. Drag the other buttons so they make a rough line.

3. Select all of the buttons and add a Top Edges alignment constraint with a value of 0.

This tells iOS to position the other three buttons at the same (vertical) position as the second button. The only thing left is to assign horizontal positions to the remaining three buttons.

4. Add a Horizontal Spacing constraint between the first and second button with a value of 40.

5. Do the same between the second and third button.

6. Repeat, adding a horizontal spacing constraint between the third and final buttons.

Check your work using the assistant editor and the preview, as shown in Figure 9-24. If you did everything right, you should see pleasing button layouts for 4-inch iPhones, 3.5-inch iPhones, and iPads.

image

Figure 9-24. Previewing button layout on multiple devices

Your app can seamlessly morph between substantially different interface designs, accomplished simply by providing alternate sets of constraints. This was all very educational, but I’m sure you really want to get back to making those sound buttons work.

Making Noise

Return again to the regular editor and the ViewController.swift file. Finish writing your bang(_:) function (new code in bold).

@IBAction func bang(sender: AnyObject!) {
    if let button = sender as? UIButton {
        let index = button.tag - 1
        if index >= 0 && index < players.count {
            let player = players[index]
            player.pause()
            player.currentTime = 0.0
            player.play()
        }
    }
}

All four buttons send the same action. You determine which button sent the message using its tag property. Your four buttons have tag values between 1 and 4, which you use as an index (0 through 3) to obtain that button’s AVAudioPlayer object.

Once you have the button’s AVAudioPlayer, you first call its pause() function. This will suspend playback of the sound if it’s currently playing. If not, it does nothing.

Then the currentTime property is set to 0. This property is the player’s logical “playhead,” indicating the position (in seconds) where the player is currently playing or will begin playing. Setting it to 0 “rewinds” the sound so it plays from the beginning.

Finally, the play() function starts the sound playing. The play() function is asynchronous; it starts a background task to play and manage the sound and then returns immediately.
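Because play() returns immediately, your code isn’t told when the sound ends. If you needed to know (DrumDub doesn’t), you could adopt the AVAudioPlayerDelegate protocol. This is just a sketch; it assumes a controller that has set itself as each player’s delegate property:

```swift
extension ViewController: AVAudioPlayerDelegate {
    func audioPlayerDidFinishPlaying(player: AVAudioPlayer!,
                                     successfully flag: Bool) {
        // Called when the player's sound finishes; update the
        // interface or queue up another sound here.
    }
}
```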

There are just two more details to take care of before your sounds will play.

Activating Your Audio Session

It’s not strictly required, but the documentation for the AVAudioSession class recommends that your app activate the audio session when it starts and again whenever your audio session is interrupted. You’ll take this opportunity to prepare the audio player objects at the same time. You’ll do that in an activateAudioSession() function, which you’ll add in a moment. Call it once when the view first loads. Find the viewDidLoad() function and add that call (new line in bold).

override func viewDidLoad() {
    super.viewDidLoad()
    activateAudioSession()
}

Now write the activateAudioSession() function.

func activateAudioSession() {
    let active = AVAudioSession.sharedInstance().setActive(true, error: nil)
    if active {
        if players.count == 0 {
            createAudioPlayers()
        }
    } else {
        destroyAudioPlayers()
    }
    for i in 0..<soundNames.count {
        if let button = view.viewWithTag(i+1) as? UIButton {
            button.enabled = active
        }
    }
}

The first line obtains your app’s audio session object (the same one you configured back in application(_:,didFinishLaunchingWithOptions:)). You call its setActive(_:,error:) function to activate, or reactivate, the audio session.

The setActive(_:,error:) function returns true if the audio session is now active. There are a few obscure situations where this will fail (returning false), and your app should deal with that situation gracefully.

In this app, you look to see whether the session was activated and call createAudioPlayers() to prepare the AVAudioPlayer objects for playback. If the session couldn’t be activated (which means your app can’t use any audio), then you destroy any AVAudioPlayer objects you previously created and disable all of the sound effect buttons in the interface.

Since you don’t have an outlet connected to those buttons, you’ll get them using their tag. The viewWithTag(_:) function searches the hierarchy of a view object and returns the first subview matching that tag. Your bang buttons are the only views with tag values of 1, 2, 3, and 4. The loop obtains each button view and enables, or disables, it.

Tip Tags are a convenient way to manage a group of view objects, without requiring you to create an outlet for each one.

The functional portion of your app is now finished. By functional, I mean you can run your app, play music, and annoy anyone else in the room with cheesy percussion noises, as shown in Figure 9-25.

image

Figure 9-25. Working DrumDub app

Interruptions and Detours

In the “Living in a Larger World” section, I described the multitude of events and situations that conspire to complicate your app’s use of audio. Most people hate interruptions or being forced to take a detour, and I suspect app developers are no different. But dealing with these events gracefully is the hallmark of a finely crafted iOS app. First up are interruptions.

Dealing with Interruptions

An interruption occurs when another app or service needs to activate its audio session. The most common sources of interruptions are incoming phone calls and alerts (triggered by alarms, messages, notification, and reminders).

Most of the work of handling interruptions is done for you. When your app’s audio session is interrupted, iOS fades out your audio and deactivates your session. The usurping session then takes over and begins playing the user’s ring tone or alert sound. Your app’s audio and music player delegates then receive “begin interruption” messages.

Your app should do whatever is appropriate to respond to the interruption. Often, this isn’t much. You might update the interface to indicate that you’re no longer playing music. Mostly, your app should just make a note of what it was doing so it can resume when the interruption ends.

Interruptions can be short, such as a few seconds for alarms. Or they can be very (very) long, such as an hour or more, if you accept that incoming phone call from chatty aunt May. Don’t make any assumptions on how long the interruption will last; just wait for iOS to notify your app when it’s over.

When the interruption is over, your app will receive “end interruption” messages. This is where the work begins. First, your app should explicitly reactivate its audio session. This isn’t a strict requirement, but it’s recommended. It gives your app a chance to catch the (very rare) situation where your audio session can’t be reactivated.

Then you need to resume playback, reload audio objects, update your interface, or whatever else your app needs to do so it is once again running, exactly as it was before the interruption occurred. In DrumDub, there’s surprisingly little work to do because most of the default music and audio player behavior is exactly what you want. Nevertheless, there’s still some rudimentary interruption handling you need to add.

Adding Your Interruption Handlers

Interruption notifications can be received in a number of different ways. Your app needs to observe only the ones that are relevant and convenient; there’s no need to observe them all. Begin and end interruption messages are sent to the following:

· Any observer of the audio session’s interruption notification (AVAudioSessionInterruptionNotification)

· All audio player delegates (AVAudioPlayerDelegate)

· Any observer of music player state change notifications (MPMusicPlayerControllerPlaybackStateDidChangeNotification)

Decide how you want your app to respond to interruptions and then implement the handlers that conveniently let you do that. When something interrupts DrumDub, you want to do the following:

· Pause the playback of the music

· Stop any percussion sound that’s playing (so it doesn’t resume when the interruption is over)

When the interruption ends, you want DrumDub to do the following:

· Reactivate the audio session and check for problems

· Resume playback of the music

Pausing and resuming the music player requires no code. The MPMusicPlayerController class does this automatically in response to interruptions. You don’t even need to add any code to update your interface. When the music player is interrupted, its playbackState changes to MPMusicPlaybackStateInterrupted, and your controller gets a playbackStateDidChangeNotification(_:) call, which updates your play and pause buttons. When the interruption ends, the music player resumes playing and sends another state change notification.

So, DrumDub’s only nonstandard behavior is to silence any playing percussion sounds when an interruption arrives. That’s so the “tail end” of the sound bite doesn’t start playing again when the interruption is over (which is the default behavior). Start writing your audioInterruption(_:) function and handle that in its first case.

func audioInterruption(notification: NSNotification) {
    if let typeValue =
        notification.userInfo?[AVAudioSessionInterruptionTypeKey] as? NSNumber {
        if let type = AVAudioSessionInterruptionType.fromRaw(
            typeValue.unsignedLongValue) {
            switch type {
            case .Began:
                for player in players {
                    player.pause()
                }

The last task on the list is to reactivate the audio session when the interruption is over. You already wrote the code to do that in activateAudioSession(); you just need to call it. Do that in the second case.

            case .Ended:
                activateAudioSession()
            }
        }
    }
}

To get these notifications, add your view controller as an observer in your viewDidLoad() function (new code in bold).

override func viewDidLoad() {
    super.viewDidLoad()
    activateAudioSession()
    let center = NSNotificationCenter.defaultCenter()
    center.addObserver( self,
        selector: "audioInterruption:",
        name: AVAudioSessionInterruptionNotification,
        object: nil)

With the tricky business of interruptions taken care of, it’s time to deal with detours (route changes).

Dealing with Audio Route Changes

An audio route is the path that data takes to get to the eardrum of the listener. Your iPhone might be paired to the speakers in your car. When you get out of your car, your iPhone switches to its built-in speakers. When you plug in some headphones, it stops playing through its speaker and begins playing through your headphones. Each of these events is an audio route change.

You deal with audio route changes exactly the way you deal with interruptions: decide what your app should do in each situation and then write handlers to observe those events and implement your policies. For DrumDub, you want to implement Apple’s recommended behavior of stopping music playback when the user unplugs their headphones or disconnects from external speakers. If these were sound effects in a game, or something similar, it would be appropriate to let them continue playing. But DrumDub’s music will stop playing when the headphones are unplugged, so the instrument sounds should stop too.

Audio route notifications are posted by the AVAudioSession object; all you have to do is observe them. Begin by requesting that your ViewController object receive audio route change notifications. At the end of the viewDidLoad() function, add this code:

center.addObserver( self,
    selector: "audioRouteChange:",
    name: AVAudioSessionRouteChangeNotification,
    object: nil)

Now add your route change handler function.

func audioRouteChange(notification: NSNotification) {
    if let reasonValue =
        notification.userInfo?[AVAudioSessionRouteChangeReasonKey] as? NSNumber {
        if reasonValue.unsignedLongValue ==
            AVAudioSessionRouteChangeReason.OldDeviceUnavailable.toRaw() {
            for player in players {
                player.pause()
            }
        }
    }
}

The function begins by examining the reason for the audio route change. It gets this information from the notification’s userInfo dictionary. If the value associated with the AVAudioSessionRouteChangeReasonKey is AVAudioSessionRouteChangeReasonOldDeviceUnavailable, it indicates that a previously active audio route is no longer available. This happens when headphones are unplugged, the device is removed from a dock connector, a wireless speaker system is disconnected, and so on. If that’s the case, it pauses playback of all four audio players.

That wraps up this app! Go ahead and run it again to make sure everything is working. You’ll want to test your interruption and audio route change logic by doing things like the following:

· Setting an alarm to interrupt playback

· Calling your iPhone from another phone

· Plugging and unplugging headphones

While you do this, set breakpoints in your audioInterruption(_:) and audioRouteChange(_:) functions to verify that they are being called. Testing your app under as many situations as you can devise is an important part of app development.

Other Audio Topics

This chapter didn’t even begin to approach the subjects of audio recording or signal processing. To get started with these, and similar topics, start with the Multimedia Programming Guide. It provides an overview and road map for playing, recording, and manipulating both audio and video in iOS.

If you need to perform advanced or low-level audio tasks (such as analyzing or encoding audio), refer to the Core Audio Overview. All of these documents can be found in Xcode’s Documentation and API Reference.

Here’s something else to look at: if you need to present audio or video in a view, want your app to play music in the background (that is, when your app is no longer the frontmost app), or need to handle remote events, take a look at the AVPlayer and AVPlayerLayer classes. The first is a near-universal media player for both audio and video, similar to MPMusicPlayerController and AVAudioPlayer. It’s a little more complicated but also more capable. It will work in conjunction with an AVPlayerLayer object to present visual content (a movie) in a view, so you can create your own YouTube-style video player.

Summary

Sound adds a rich dimension to your app. You’ve learned how to play and control audio from the iPod library as well as resource files bundled in your app. You understand the importance of configuring your audio session and intelligently handling interruptions and audio route changes. “Playing nice” with other audio sources creates the kind of experience that users enjoy and will want to use again and again.

But is there more to iOS interfaces than labels, buttons, and image views? Join me in the next chapter to find out.

EXERCISE

Blend DrumDub further into the iOS experience by using the system music player, instead of an application music player. This will require a couple of subtle changes, listed here:

· Obtain the music player object by calling systemMusicPlayer(), instead of applicationMusicPlayer().

· Create and initialize the music player as soon as the view loads, rather than doing it lazily when the user chooses a song.

· Don’t arbitrarily change the player’s settings (like the shuffle or repeat modes). Remember that you’re changing the user’s iPod settings; most people won’t like your app fiddling with their iPod.
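The first two changes might be sketched like this (a hedged outline, not the exercise solution; the musicPlayer property name mirrors the chapter’s code but is an assumption):

```swift
// Create the system music player eagerly in viewDidLoad(), rather than
// lazily when the user picks a song. The system player shares state with
// the built-in Music app, so leave its shuffle and repeat modes alone.
override func viewDidLoad() {
    super.viewDidLoad()
    activateAudioSession()
    musicPlayer = MPMusicPlayerController.systemMusicPlayer()
    musicPlayer.beginGeneratingPlaybackNotifications()
}
```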

When you’re done, DrumDub will be “plugged in” to the user’s iPod app. If their iPod music is playing when they launch DrumDub, the song will appear the moment your app launches. If the user starts a song playing and quits DrumDub, the music plays on.

You’ll find my solution to this exercise in the Learn iOS Development Projects ➤ Ch 9 ➤ DrumDub E1 folder.

____________________

1Temporarily lowering the volume of one audio source so you can hear a second audio source is called ducking.
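As a minimal sketch, an app could request ducking through its audio session category options, roughly like this (Swift 1.0-era API; error handling omitted, and the choice of the Ambient category is an assumption):

```swift
// Ask iOS to lower ("duck") other apps' audio while this session is
// active, instead of silencing or mixing with it at full volume.
let session = AVAudioSession.sharedInstance()
session.setCategory(AVAudioSessionCategoryAmbient,
    withOptions: .DuckOthers,
    error: nil)
session.setActive(true, error: nil)
```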