Chapter 15. Audio, Video, and Using the Camera

What's in this Chapter?

· Playing audio and video with the Media Player

· Handling audio focus and media button presses

· Using the Remote Control Client

· Applying audio and video effects

· Recording audio and video with the Media Recorder

· Recording video and taking pictures using Intents

· Previewing recorded video and displaying live camera streams

· Taking pictures and controlling the camera

· Manipulating raw audio

· Using face and feature recognition

The increasing popularity of cloud-based music players, combined with the ubiquity of modern phones with ever-increasing storage capacities, is leading to mobile devices becoming the de facto portable digital media player.

This chapter introduces you to the Android APIs for controlling audio and video playback, controlling the audio focus of the device, and reacting appropriately when other applications take focus or the output channel is changed (for example, when headphones are unplugged).

You'll also learn how to use the Remote Control Client, introduced in Android 4.0. It provides a mechanism for showing users details on the media they're playing and allows them to control the playback from the device's lock screen.

With many devices now including two high-resolution cameras, mobiles have also begun to take the place of non-SLR digital cameras. You'll learn to use the Android camera APIs to take photos and record video using any camera available to the device, as well as displaying the live camera feed. New media effects APIs provide a way to modify and enhance video images in real time from within your applications.

Android's open platform and provider-agnostic philosophy ensures that it offers a multimedia API capable of playing and recording a wide range of image, audio, and video formats, both locally and streamed.

You'll also learn how to manipulate raw audio files using the Audio Track and Audio Record classes, to create a Sound Pool, and to add newly recorded media files to the Media Store.

Playing Audio and Video

Android 4.0.3 (API level 15) supports the following multimedia formats for playback as part of the base framework. Note that some devices may support playback of additional file formats:

· Audio

· AAC LC/LTP

· HE-AACv1 (AAC+)

· HE-AACv2 (Enhanced AAC+)

· AMR-NB

· AMR-WB

· MP3

· MIDI

· Ogg Vorbis

· PCM/WAVE

· FLAC (on devices running Android 3.1 or above)

· Image

· JPEG

· PNG

· WEBP (on devices running Android 4.0 or above)

· GIF

· BMP

· Video

· H.263

· H.264 AVC

· MPEG-4 SP

· VP8 (on devices running Android 2.3.3 or above)

The following network protocols are supported for streaming media:

· RTSP (RTP, SDP)

· HTTP/HTTPS progressive streaming

· HTTP/HTTPS live streaming (on devices running Android 3.0 or above)

For full details on the currently supported media formats and recommendations for video encoding and audio streaming, see the Android Dev Guide, at http://developer.android.com/guide/appendix/media-formats.html.

Introducing the Media Player

The playback of audio and video within Android applications is handled primarily through the MediaPlayer class. Using the Media Player, you can play media stored in application resources, local files, Content Providers, or streamed from a network URL.

The Media Player's management of audio and video files and streams is handled as a state machine. In the simplest terms, transitions through the state machine can be described as follows:

1. Initialize the Media Player with media to play.

2. Prepare the Media Player for playback.

3. Start the playback.

4. Pause or stop the playback prior to its completing.

5. The playback is complete.

A more detailed and thorough description of the Media Player state machine is provided at the Android developer site, at http://developer.android.com/reference/android/media/MediaPlayer.html#StateDiagram.

To play a media resource, you need to create a new MediaPlayer instance, initialize it with a media source, and prepare it for playback.

The following section describes how to initialize and prepare the Media Player. After that, you'll learn to control the playback to start, pause, stop, or seek the prepared media.

To stream Internet media using the Media Player, your application must include the INTERNET permission:

<uses-permission android:name="android.permission.INTERNET"/>

Android supports a limited number of simultaneous Media Player objects; not releasing them can cause runtime exceptions when the system runs out. When you finish playback, call release on your Media Player object to free the associated resources:

mediaPlayer.release();

Preparing Audio for Playback

There are a number of ways you can play audio content through the Media Player. You can include it as an application resource, play it from local files or Content Providers, or stream it from a remote URL.

To include audio content as an application resource, add it to the res/raw folder of your resources hierarchy. Raw resources are not compressed or manipulated in any way when packaged into your application, making them an ideal way to store precompressed files such as audio files.

Initializing Audio Content for Playback

To play back audio content using a Media Player, you need to create a new Media Player object and set the data source of the audio in question. You can do this by using the static create method, passing in the Activity Context and any one of the following audio sources:

· A resource identifier (typically for an audio file stored in the res/raw resource folder)

· A URI to a local file (using the file:// schema)

· A URI to an online audio resource (as a URL)

· A URI to a row within a Content Provider that returns an audio file

// Load an audio resource from a package resource.
MediaPlayer resourcePlayer = 
  MediaPlayer.create(this, R.raw.my_audio);
 
// Load an audio resource from a local file.
MediaPlayer filePlayer = MediaPlayer.create(this,
  Uri.parse("file:///sdcard/localfile.mp3"));
 
// Load an audio resource from an online resource.
MediaPlayer urlPlayer = MediaPlayer.create(this,
  Uri.parse("http://site.com/audio/audio.mp3"));
 
// Load an audio resource from a Content Provider.
MediaPlayer contentPlayer = MediaPlayer.create(this, 
  Settings.System.DEFAULT_RINGTONE_URI);

The Media Player objects returned by these create methods have already had prepare called. It's important that you do not call it again.

Alternatively, you can use the setDataSource method on an existing Media Player instance, as shown in Listing 15.1. This method accepts a file path, Content Provider URI, streaming media URL path, or File Descriptor. When using the setDataSource method, it is vital that you call prepare on the Media Player before you begin playback.

Listing 15.1: Audio playback using the Media Player

MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setDataSource("/sdcard/mydopetunes.mp3");
mediaPlayer.prepare();

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java

Preparing Video for Playback

Playback of video content is slightly more involved than audio. To play a video, you first must have a Surface on which to show it.

There are two alternatives for the playback of video content. The first technique, using the VideoView class, encapsulates the creation of a Surface and allocation and preparation of video content using a Media Player.

The second technique allows you to specify your own Surface and manipulate the underlying Media Player instance directly.

Playing Video Using the Video View

The simplest way to play back video is to use the Video View. The Video View includes a Surface on which the video is displayed and encapsulates and manages a Media Player instance that handles the playback.

After placing the Video View within the UI, get a reference to it within your code. You can then assign a video to play by calling its setVideoPath or setVideoURI methods to specify the path to a local file, or the URI of either a Content Provider or remote video stream:

final VideoView videoView = (VideoView)findViewById(R.id.videoView);
 
// Assign a local file to play
videoView.setVideoPath("/sdcard/mycatvideo.3gp");
 
// Assign a URL of a remote video stream
videoView.setVideoURI(myAwesomeStreamingSource);

When the video is initialized, you can control its playback using the start, stopPlayback, pause, and seekTo methods. The Video View also includes the setKeepScreenOn method to apply a screen Wake Lock that will prevent the screen from being dimmed while playback is in progress without requiring a special permission.
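For example, the following sketch begins playback as soon as the video source has been prepared, assuming videoView has been assigned a source as shown above:

// Start playback automatically once the video source is prepared.
videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
  public void onPrepared(MediaPlayer mediaPlayer) {
    videoView.start();
  }
});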

Listing 15.2 shows the skeleton code used to assign a video to a Video View. It uses a Media Controller to control playback, as described in the section “Controlling Media Player Playback.”

Listing 15.2: Video playback using a Video View

// Get a reference to the Video View.
final VideoView videoView = (VideoView)findViewById(R.id.videoView);
 
// Configure the video view and assign a source video.
videoView.setKeepScreenOn(true);
videoView.setVideoPath("/sdcard/mycatvideo.3gp");
 
// Attach a Media Controller
MediaController mediaController = new MediaController(this);
videoView.setMediaController(mediaController); 

code snippet PA4AD_Ch15_Media_Player/src/VideoViewActivity.java

Creating a Surface for Video Playback

The first step to using the Media Player directly to view video content is to prepare a Surface onto which the video will be displayed.

This is generally handled using a SurfaceView object. The Surface View class manages an underlying Surface, used to support visual updates from background threads, and exposes it through a Surface Holder object.

The Media Player uses a SurfaceHolder object to display video content, assigned using the setDisplay method. To include a Surface Holder in your UI layout, use the SurfaceView class, as shown in the sample layout XML in Listing 15.3.

Listing 15.3: Sample layout using a Surface View

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout   
  xmlns:android="http://schemas.android.com/apk/res/android"
  android:layout_width="match_parent"
  android:layout_height="match_parent"
  android:orientation="vertical" >
  <SurfaceView
    android:id="@+id/surfaceView"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:layout_weight="30"
  />
  <LinearLayout
    android:id="@+id/linearLayout1"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:layout_weight="1">
    <Button
      android:id="@+id/buttonPlay"
      android:layout_width="wrap_content"
      android:layout_height="wrap_content"
      android:text="Play" 
    />
    <Button
      android:id="@+id/buttonPause"
      android:layout_width="wrap_content"
      android:layout_height="wrap_content"
      android:text="Pause" 
    />
    <Button
      android:id="@+id/buttonSkip"
      android:layout_width="wrap_content"
      android:layout_height="wrap_content"
      android:text="Skip"
    />
  </LinearLayout>
</LinearLayout>

code snippet PA4AD_Ch15_Media_Player/res/layout/surfaceviewvideoviewer.xml

Surfaces are created asynchronously, so you must implement the SurfaceHolder.Callback interface and wait until its surfaceCreated handler has fired before assigning the Surface Holder object to the Media Player.

After creating and assigning the Surface Holder to your Media Player, use the setDataSource method to specify the path, URL, or Content Provider URI of the video resource to play.

After you select your media source, call prepare to initialize the Media Player in preparation for playback. Listing 15.4 shows the skeleton code used to initialize a Surface View within your Activity and assigns it as a display target for a Media Player.

Listing 15.4: Initializing and assigning a Surface View to a Media Player

import java.io.IOException;
import android.app.Activity;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
 
public class SurfaceViewVideoViewActivity extends Activity 
  implements SurfaceHolder.Callback {
 
  static final String TAG = "SurfaceViewVideoViewActivity";
  
  private MediaPlayer mediaPlayer;
 
  public void surfaceCreated(SurfaceHolder holder) { 
    try {
      // When the surface is created, assign it as the
      // display surface and assign and prepare a data 
      // source.
      mediaPlayer.setDisplay(holder);
      mediaPlayer.setDataSource("/sdcard/test2.3gp");
      mediaPlayer.prepare();
    } catch (IllegalArgumentException e) {
      Log.e(TAG, "Illegal Argument Exception", e);
    } catch (IllegalStateException e) {
      Log.e(TAG, "Illegal State Exception", e);
    } catch (SecurityException e) {
      Log.e(TAG, "Security Exception", e);
    } catch (IOException e) {      
      Log.e(TAG, "IO Exception", e);
    }
  }
 
  public void surfaceDestroyed(SurfaceHolder holder) {
    mediaPlayer.release();
  }  
 
  public void surfaceChanged(SurfaceHolder holder,
                             int format, int width, int height) { }
 
  
  @Override
  public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    
    setContentView(R.layout.surfaceviewvideoviewer);
    
    // Create a new Media Player.
    mediaPlayer = new MediaPlayer();
 
    // Get a reference to the Surface View.
    final SurfaceView surfaceView =
      (SurfaceView)findViewById(R.id.surfaceView);
 
    // Configure the Surface View.
    surfaceView.setKeepScreenOn(true);
 
    // Configure the Surface Holder and register the callback.
    SurfaceHolder holder = surfaceView.getHolder();
    holder.addCallback(this);
    // setType is ignored on Android 3.0 and above, but is still
    // required for video playback on earlier platform versions.
    holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    holder.setFixedSize(400, 300);
    
    // Connect a play button.
    Button playButton = (Button)findViewById(R.id.buttonPlay);
    playButton.setOnClickListener(new OnClickListener() {
      public void onClick(View v) {
        mediaPlayer.start();        
      }
    });
     
    // Connect a pause button.
    Button pauseButton = (Button)findViewById(R.id.buttonPause);
    pauseButton.setOnClickListener(new OnClickListener() {
      public void onClick(View v) {
        mediaPlayer.pause();        
      }
    });
    
    // Add a skip button.
    Button skipButton = (Button)findViewById(R.id.buttonSkip);    
    skipButton.setOnClickListener(new OnClickListener() {
      public void onClick(View v) {
        mediaPlayer.seekTo(mediaPlayer.getDuration()/2);
      }
    });
  }
}

code snippet PA4AD_Ch15_Media_Player/src/SurfaceViewVideoViewActivity.java

Controlling Media Player Playback

When a Media Player is prepared, call start to begin playback of the associated media:

mediaPlayer.start();

Use the stop and pause methods to stop or pause playback, respectively.

The Media Player also provides the getDuration method to find the length of the media being played and the getCurrentPosition method to find the playback position. Use the seekTo method to jump to a specific position in the media (refer to Listing 15.4).
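For example, the following sketch, assuming a prepared Media Player, skips forward ten seconds without seeking past the end of the media:

// Skip forward 10 seconds, clamped to the track duration.
int target = Math.min(mediaPlayer.getCurrentPosition() + 10000,
                      mediaPlayer.getDuration());
mediaPlayer.seekTo(target);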

To ensure a consistent media control experience, Android includes the MediaController—a standard control that provides the common media control buttons, as shown in Figure 15.1.

Figure 15.1

If you are using the Media Controller to control video playback, it's good practice to instantiate it in code and associate it with the video playback View, rather than including it within your layout. When created this way, the Media Controller is visible only when you explicitly show it, when the user touches its host Video View, or while the user is interacting with it.

If you're using a Video View to display your video content, you can use the Media Controller simply by calling the Video View's setMediaController method:

// Attach a Media Controller
MediaController mediaController = new MediaController(this);
videoView.setMediaController(mediaController); 

You can use a Media Controller to control any Media Player and associate it with any View in your UI.

To directly control a Media Player, or another audio or video source, you need to implement a new MediaController.MediaPlayerControl, as shown in Listing 15.5.

Listing 15.5: Controlling playback using the Media Controller

MediaController mediaController = new MediaController(this);
mediaController.setMediaPlayer(new MediaPlayerControl() {
 
  public boolean canPause() {
    return true;
  }
 
  public boolean canSeekBackward() {
    return true;
  }
 
  public boolean canSeekForward() {
    return true;
  }
 
  public int getBufferPercentage() {
    return 0;
  }
 
  public int getCurrentPosition() {
    return mediaPlayer.getCurrentPosition();
  }
 
  public int getDuration() {
    return mediaPlayer.getDuration();
  }
 
  public boolean isPlaying() {
    return mediaPlayer.isPlaying();
  }
 
  public void pause() {
    mediaPlayer.pause();
  }
 
  public void seekTo(int pos) {
    mediaPlayer.seekTo(pos);
  }
 
  public void start() {
    mediaPlayer.start();
  }
  
});

code snippet PA4AD_Ch15_Media_Player/src/SurfaceViewVideoViewActivity.java

Use the setAnchorView method to determine which view should anchor the Media Controller when it's visible, and call show or hide to show or hide the controller, respectively:

mediaController.setAnchorView(myView);
mediaController.show();

Note that you must associate a Media Player Control before attempting to display the Media Controller.

Managing Media Playback Output

The Media Player provides methods to control the volume of the output, lock the screen brightness during playback, and set the looping status.

You can control the volume for each channel during playback using the setVolume method. It takes a scalar float value between 0 and 1 for both the left and right channels (where 0 is silent and 1 is maximum volume).

mediaPlayer.setVolume(0.5f, 0.5f);

To force the screen to stay on during video playback, use the setScreenOnWhilePlaying method:

mediaPlayer.setScreenOnWhilePlaying(true);

This is preferred to using Wake Locks because it doesn't require any additional permissions. Wake Locks are described in more detail in Chapter 18, “Advanced Android Development.”

Use the isLooping method to determine the current loop status, and the setLooping method to specify if the media being played should loop when it completes:

if (!mediaPlayer.isLooping())
  mediaPlayer.setLooping(true);

It is currently not possible to play audio into a phone conversation; the Media Player always plays audio using the standard output device—the speaker or connected headset.

Responding to the Volume Controls

To ensure a consistent user experience, it's important that your application correctly handles users pressing the volume and any attached media playback control keys.

By default, pressing the volume keys on either the device or an attached headset changes the volume of whichever audio stream is currently playing. If no stream is active, the volume keys alter the ringtone volume.

If your Activity is expected to play audio for a significant proportion of its visible lifetime (for example, a music player or game with a soundtrack and audio effects), it's reasonable for users to expect that the volume keys will alter the music volume even if no music is currently playing.

Using the Activity's setVolumeControlStream method—typically within its onCreate method, as shown in Listing 15.6—allows you to specify which audio stream should be controlled by the volume keys while the current Activity is active.

You can specify any of the available audio streams, but when using the Media Player, you should specify the STREAM_MUSIC stream to make it the focus of the volume keys.

Listing 15.6: Setting the volume control stream for an Activity

@Override
public void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);
  setContentView(R.layout.audioplayer);
  
  setVolumeControlStream(AudioManager.STREAM_MUSIC);
}

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java

Although it's also possible to listen for volume key presses directly, this is generally considered poor practice. There are several ways a user can modify the audio volume, including the hardware buttons as well as software controls. Triggering volume changes manually based only on the hardware buttons is likely to make your application respond unexpectedly.

Responding to the Media Playback Controls

If your application plays audio and/or video in a way users would associate with a media player, it should respond predictably to media button presses.

Some devices, as well as attached or Bluetooth headsets, feature play, stop, pause, skip, and previous media playback keys. When users press these keys, the system broadcasts an Intent with the ACTION_MEDIA_BUTTON action. To receive this broadcast, you must have a Broadcast Receiver declared in your manifest that listens for this action, as shown in Listing 15.7.

Listing 15.7: Media button press Broadcast Receiver manifest declaration

<receiver android:name=".MediaControlReceiver">
  <intent-filter>
    <action android:name="android.intent.action.MEDIA_BUTTON"/>
  </intent-filter>
</receiver>

code snippet PA4AD_Ch15_Media_Player/AndroidManifest.xml

In Listing 15.8 this Broadcast Receiver is implemented such that when it receives the media button key presses, it simply creates a new Intent that includes the same extras and broadcasts it to the Activity playing the audio.

Listing 15.8: A media button press Manifest Broadcast Receiver implementation

public class MediaControlReceiver extends BroadcastReceiver {
  
  public static final String ACTION_MEDIA_BUTTON =
    "com.paad.ACTION_MEDIA_BUTTON";
  
  @Override
  public void onReceive(Context context, Intent intent) {
    if (Intent.ACTION_MEDIA_BUTTON.equals(intent.getAction())) {
      Intent internalIntent = new Intent(ACTION_MEDIA_BUTTON);
      internalIntent.putExtras(intent.getExtras());
      context.sendBroadcast(internalIntent);
    }
  }
}

code snippet PA4AD_Ch15_Media_Player/src/MediaControlReceiver.java

The key code of the media button pressed is stored in the EXTRA_KEY_EVENT extra of the received Intent, as shown in Listing 15.9.

Listing 15.9: Media button press Broadcast Receiver implementation

public class ActivityMediaControlReceiver extends BroadcastReceiver {
  @Override
  public void onReceive(Context context, Intent intent) {
    if (MediaControlReceiver.ACTION_MEDIA_BUTTON.equals(
        intent.getAction())) {
      KeyEvent event =
        (KeyEvent)intent.getParcelableExtra(Intent.EXTRA_KEY_EVENT);
    
      switch (event.getKeyCode()) {
        case (KeyEvent.KEYCODE_MEDIA_PLAY_PAUSE) : 
          if (mediaPlayer.isPlaying())
            pause();
          else
            play();
          break;
        case (KeyEvent.KEYCODE_MEDIA_PLAY) : 
          play(); break;
        case (KeyEvent.KEYCODE_MEDIA_PAUSE) : 
          pause(); break;
        case (KeyEvent.KEYCODE_MEDIA_NEXT) : 
          skip(); break;
        case (KeyEvent.KEYCODE_MEDIA_PREVIOUS) : 
          previous(); break;
        case (KeyEvent.KEYCODE_MEDIA_STOP) : 
          stop(); break;
        default: break;
      }
    }
  }
}

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java

If your application intends to continue playing audio in the background when the Activity isn't visible, a good approach is to keep your Media Player within a Service, controlling the media playback using Intents.
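The following skeleton sketches that pattern. It is not the book's sample code; the ACTION_PLAY_PAUSE action string is a hypothetical placeholder, and R.raw.my_audio stands in for your own audio resource:

public class AudioPlaybackService extends Service {

  // Hypothetical action string used to toggle playback via Intents.
  public static final String ACTION_PLAY_PAUSE =
    "com.paad.ACTION_PLAY_PAUSE";

  private MediaPlayer mediaPlayer;

  @Override
  public void onCreate() {
    super.onCreate();
    mediaPlayer = MediaPlayer.create(this, R.raw.my_audio);
  }

  @Override
  public int onStartCommand(Intent intent, int flags, int startId) {
    if (intent != null && ACTION_PLAY_PAUSE.equals(intent.getAction())) {
      if (mediaPlayer.isPlaying())
        mediaPlayer.pause();
      else
        mediaPlayer.start();
    }
    return Service.START_STICKY;
  }

  @Override
  public void onDestroy() {
    // Free the Media Player's resources when the Service is stopped.
    mediaPlayer.release();
    super.onDestroy();
  }

  @Override
  public IBinder onBind(Intent intent) {
    return null;
  }
}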

Multiple applications might be installed on a given device, each configured to receive media button presses. To ensure yours receives them, you must also use the Audio Manager's registerMediaButtonEventReceiver method to register your Receiver as the exclusive handler of media button presses. Listing 15.10 registers both the media button event Receiver declared in your manifest and the local Broadcast Receiver that interprets the Intent when it's passed through to the Activity.

Listing 15.10: Registering media button press Receivers

// Register the Media Button Event Receiver to 
// listen for media button presses.
AudioManager am =
  (AudioManager)getSystemService(Context.AUDIO_SERVICE);
ComponentName component = 
  new ComponentName(this, MediaControlReceiver.class);
 
am.registerMediaButtonEventReceiver(component);
 
// Register a local Intent Receiver that receives media button
// presses from the Receiver registered in the manifest.
activityMediaControlReceiver = new ActivityMediaControlReceiver();
IntentFilter filter = 
  new IntentFilter(MediaControlReceiver.ACTION_MEDIA_BUTTON);
 
registerReceiver(activityMediaControlReceiver, filter);

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java

Calls to registerMediaButtonEventReceiver are respected in the order in which they're received, so it's good practice to register and unregister your Receiver based on when you have (and lose) audio focus, as described in the next section.

Requesting and Managing Audio Focus

In some cases (particularly for media players) your application should continue to respond to media buttons when it isn't visible or active. Users may have multiple media players on their devices, so it's important that your application pause playback and cede control of the media buttons when another media application takes focus.

Similarly, when your application becomes active, it should notify other audio playback applications that they should pause playback and allow it to become the focus for media button clicks. Such delegation is handled through the audio focus, a set of APIs introduced in Android 2.2 (API level 8).

To request audio focus before beginning playback, use the Audio Manager's requestAudioFocus method. When requesting the audio focus, you can specify which stream you require (typically STREAM_MUSIC), and for how long you expect to require focus—either permanently (such as when playing music) or transiently (such as when providing navigation instructions). In the latter case you can also specify if your transient interruption can be handled by the currently focused application “ducking” (lowering its volume) until your interruption is complete.

Specifying the nature of the audio focus you require allows other applications to better react to their own loss of audio focus, as described later in this section.

Listing 15.11 shows the skeleton code for an Activity that requests permanent audio focus for the music stream. You must also specify an Audio Focus Change Listener. This lets you monitor for loss of audio focus and respond accordingly (and is described in more detail later in this section).

Listing 15.11: Requesting the audio focus

AudioManager am = (AudioManager)getSystemService(Context.AUDIO_SERVICE);
 
// Request audio focus for playback
int result = am.requestAudioFocus(focusChangeListener,
               // Use the music stream.
               AudioManager.STREAM_MUSIC,
               // Request permanent focus.
               AudioManager.AUDIOFOCUS_GAIN);
   
if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
  mediaPlayer.start();
}

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java
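If your application needs the focus only briefly, such as when playing a short navigation prompt, you can instead request transient focus and permit the current holder to duck. A sketch using the same Audio Manager and listener:

// Request transient focus, allowing the current focus
// holder to lower its volume rather than pause.
int result = am.requestAudioFocus(focusChangeListener,
               AudioManager.STREAM_MUSIC,
               AudioManager.AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK);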

Audio focus is assigned in turn to each application that requests it. This means that if another application requests audio focus, your application will lose it. You will be notified of the loss of audio focus through the onAudioFocusChange handler of the Audio Focus Change Listener you registered when requesting the audio focus, as shown in Listing 15.12.

The focusChange parameter indicates the nature of the focus loss—either transient or permanent—and whether ducking is permitted.

It's best practice to pause your media playback whenever you lose audio focus, or, in the case of a transient loss that supports ducking, to lower the volume of your audio output.

In the case of a transient focus loss, you will be notified when you have regained focus, at which point you can return to playing your audio at the previous volume.

For a permanent focus loss, you should stop playback and restart it only through a user interaction (such as pressing the play button within your UI). In such circumstances you should also take this opportunity to unregister the media button Receiver.

Listing 15.12: Responding to the loss of audio focus

private OnAudioFocusChangeListener focusChangeListener = 
  new OnAudioFocusChangeListener() {
  
  public void onAudioFocusChange(int focusChange) {
    AudioManager am = 
      (AudioManager)getSystemService(Context.AUDIO_SERVICE);
    
    switch (focusChange) {
      case (AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK) :
        // Lower the volume while ducking.
        mediaPlayer.setVolume(0.2f, 0.2f);
        break;
 
      case (AudioManager.AUDIOFOCUS_LOSS_TRANSIENT) : 
        pause();
        break;
 
      case (AudioManager.AUDIOFOCUS_LOSS) :
        stop();
        ComponentName component = 
          new ComponentName(AudioPlayerActivity.this,
            MediaControlReceiver.class);
        am.unregisterMediaButtonEventReceiver(component);
        break;
 
      case (AudioManager.AUDIOFOCUS_GAIN) : 
        // Return the volume to normal and resume if paused.
        mediaPlayer.setVolume(1f, 1f);
        mediaPlayer.start();
        break;
 
      default: break;
    }
  }
};

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java

When you have completed your audio playback, you may choose to abandon the audio focus, as shown in Listing 15.13.

Listing 15.13: Abandoning audio focus

AudioManager am = 
  (AudioManager)getSystemService(Context.AUDIO_SERVICE);
 
am.abandonAudioFocus(focusChangeListener);

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java

Typically, this is only necessary when your application takes only transient audio focus. In the case of a media player, it is reasonable to maintain the audio focus whenever the music is playing or your Activity is in the foreground.

Pausing Playback When the Output Changes

If audio is currently being played through an attached headset, disconnecting it will result in the system automatically switching output to the device's speaker. It's considered good practice to pause (or reduce the volume of) your audio output in these circumstances.

To do so, create a Broadcast Receiver that listens for the AudioManager.ACTION_AUDIO_BECOMING_NOISY broadcast and pauses your playback, as shown in Listing 15.14.

Listing 15.14: Pausing output when the headset is disconnected

private class NoisyAudioStreamReceiver extends BroadcastReceiver {
  @Override
  public void onReceive(Context context, Intent intent) {
    if (AudioManager.ACTION_AUDIO_BECOMING_NOISY.equals
      (intent.getAction())) {
      pause();
    }
  }
}

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java
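Note that Listing 15.14 shows only the Receiver itself; you must also register it, typically when playback begins, and unregister it when playback stops. A minimal sketch:

// Register the Receiver while audio is playing. Pass the same
// instance to unregisterReceiver when playback stops.
NoisyAudioStreamReceiver noisyReceiver = new NoisyAudioStreamReceiver();
IntentFilter noisyFilter =
  new IntentFilter(AudioManager.ACTION_AUDIO_BECOMING_NOISY);

registerReceiver(noisyReceiver, noisyFilter);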

Introducing the Remote Control Client

Android 4.0 (API level 14) introduced the Remote Control Client. Using the Remote Control Client, your application can provide data to, and respond to, remote controls capable of displaying metadata, artwork, and media transport control buttons, such as the lock screen on Android 4.0 devices, as shown in Figure 15.2.

Figure 15.2

To add support for the Remote Control Client in your application, you must have a Receiver implementation that has been registered as a Media Button Event Receiver, as described earlier in the “Responding to the Media Playback Controls” section.

Create a Pending Intent containing the ACTION_MEDIA_BUTTON action that is targeted at your Receiver, and use it to create a new Remote Control Client. You register it with the Audio Manager using the registerRemoteControlClient method, as shown in Listing 15.15.

Listing 15.15: Registering a Remote Control Client

AudioManager am = 
  (AudioManager)getSystemService(Context.AUDIO_SERVICE);
 
// Create a Pending Intent that will broadcast the 
// media button press action. Set the target component 
// to your Broadcast Receiver. 
Intent mediaButtonIntent = new Intent(Intent.ACTION_MEDIA_BUTTON);
ComponentName component = 
  new ComponentName(this, MediaControlReceiver.class);
 
mediaButtonIntent.setComponent(component);
PendingIntent mediaPendingIntent =
   PendingIntent.getBroadcast(getApplicationContext(), 0,    
                              mediaButtonIntent, 0);
 
// Create a new Remote Control Client using the
// Pending Intent and register it with the
// Audio Manager
myRemoteControlClient = 
  new RemoteControlClient(mediaPendingIntent);
 
am.registerRemoteControlClient(myRemoteControlClient);

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java

In this example, the Remote Control Client button presses will be received by the Media Control Receiver, which, in turn, will broadcast them to the Receiver registered within the Activity.

After registering your Remote Control Client, you can use it to modify the metadata displayed on the associated display.

Use the setTransportControlFlags method to define which playback controls your application supports, as shown in Listing 15.16.

Listing 15.16: Configuring the Remote Control Client playback controls

myRemoteControlClient.setTransportControlFlags(
  RemoteControlClient.FLAG_KEY_MEDIA_PLAY_PAUSE|
  RemoteControlClient.FLAG_KEY_MEDIA_STOP);

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java

It's also possible to use the setPlaybackState method to update the current state of playback by using one of the RemoteControlClient.PLAYSTATE_* constants:

myRemoteControlClient.setPlaybackState(RemoteControlClient.PLAYSTATE_PLAYING);

You can supply a bitmap, text string, and numeric value associated with the currently playing audio—typically the album artwork, track name, and track duration, respectively. To do so, use the MetadataEditor, accessible from the Remote Control Client's editMetadata method, as shown in Listing 15.17.

Using the putBitmap method on the MetadataEditor object, you can specify an associated bitmap using the MetadataEditor.BITMAP_KEY_ARTWORK key.

Using the putLong method you can add the track number, CD number, year of recording, and track duration using the MediaMetadataRetriever.METADATA_KEY_* constants.

Similarly, the putString method lets you specify the album, album artist, artist, author, compilation, composer, date, genre, title, and writer of the current audio, specifying null where no such data is available.

To apply changes to the displayed metadata, call the apply method.

Listing 15.17: Applying changes to the Remote Control Client metadata

MetadataEditor editor = myRemoteControlClient.editMetadata(false);
 
editor.putBitmap(MetadataEditor.BITMAP_KEY_ARTWORK, artwork);
editor.putString(MediaMetadataRetriever.METADATA_KEY_ALBUM, album);
editor.putString(MediaMetadataRetriever.METADATA_KEY_ARTIST, artist);
editor.putLong(MediaMetadataRetriever.METADATA_KEY_CD_TRACK_NUMBER, 
               trackNumber);
 
editor.apply();

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java

Manipulating Raw Audio

The AudioTrack and AudioRecord classes let you record audio directly from the audio input hardware and stream PCM audio buffers directly to the audio hardware for playback.

Using Audio Track streaming, you can process and play back incoming audio in near real time, letting you manipulate incoming or outgoing audio and perform signal processing on raw audio.

Although a detailed account of raw audio processing and manipulation is beyond the scope of this book, the following sections offer an introduction to recording and playing back raw PCM data.

Recording Sound with Audio Record

Use the AudioRecord class to record audio directly from the hardware buffers. Create a new Audio Record object, specifying the source, frequency, channel configuration, audio encoding, and buffer size:

int bufferSize = AudioRecord.getMinBufferSize(frequency,
                                              channelConfiguration,
                                              audioEncoding);
 
AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                                          frequency, channelConfiguration,
                                          audioEncoding, bufferSize); 

The frequency, audio encoding, and channel configuration values will affect the size and quality of the recorded audio. Note that none of this metadata is associated with the recorded file, so you will need to use the same values when playing it back.

For privacy reasons, Android requires that the RECORD_AUDIO permission be included in your manifest:

<uses-permission android:name="android.permission.RECORD_AUDIO"/>

When your Audio Record object is initialized, call the startRecording method to begin asynchronous recording, and use the read method to pull raw audio data from the hardware into your buffer:

audioRecord.startRecording(); 
while (isRecording) {
  int bufferReadResult = audioRecord.read(buffer, 0, bufferSize);
  [ ... process the buffer ... ]
}

Listing 15.18 records raw audio from a microphone to a file stored on an SD card. The next section shows you how to use an Audio Track to play this audio.

Listing 15.18: Recording raw audio with Audio Record

int frequency = 11025;
int channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
 
File file = 
  new File(Environment.getExternalStorageDirectory(), "raw.pcm");
 
// Create the new file.
try {
  file.createNewFile();
} catch (IOException e) {
  Log.d(TAG, "IO Exception", e);
}
 
try {
  OutputStream os = new FileOutputStream(file);
  BufferedOutputStream bos = new BufferedOutputStream(os);
  DataOutputStream dos = new DataOutputStream(bos);
 
  int bufferSize = AudioRecord.getMinBufferSize(frequency,
                                                channelConfiguration,
                                                audioEncoding);
  short[] buffer = new short[bufferSize];
 
  // Create a new AudioRecord object to record the audio.
  AudioRecord audioRecord =
    new AudioRecord(MediaRecorder.AudioSource.MIC,
                    frequency,
                    channelConfiguration,
                    audioEncoding, bufferSize);
  audioRecord.startRecording();
 
  while (isRecording) {
    int bufferReadResult = audioRecord.read(buffer, 0, bufferSize);
    for (int i = 0; i < bufferReadResult; i++)
      dos.writeShort(buffer[i]);
  }
 
  audioRecord.stop();
  dos.close();
} catch (Throwable t) {
  Log.d(TAG, "An error occurred during recording", t);
}

code snippet PA4AD_Ch15_Raw_Audio/src/RawAudioActivity.java

Playing Sound with Audio Track

Use the AudioTrack class to play raw audio directly into the hardware buffers. Create a new Audio Track object, specifying the streaming mode, frequency, channel configuration, and the audio encoding type and length of the audio to play back:

  AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                                         frequency,
                                         channelConfiguration,
                                         audioEncoding,
                                         audioLength,
                                         AudioTrack.MODE_STREAM);

Because this is raw audio, no metadata is associated with the recorded files, so it's important to correctly set the audio data properties to the same values as those used when recording the file.

After initializing your Audio Track, run the play method to begin asynchronous playback, and use the write method to add raw audio data into the playback buffer:

  audioTrack.play(); 
  audioTrack.write(audio, 0, audioLength);

You can write audio into the Audio Track buffer either before or after play has been called. In the former case, playback will commence as soon as play is called; in the latter case, playback will begin as soon as you write data to the Audio Track buffer.

Listing 15.19 plays back the raw audio recorded in Listing 15.18, but does so at half speed (and a correspondingly lower pitch) by halving the frequency reported to the Audio Track.

Listing 15.19: Playing raw audio with Audio Track

int frequency = 11025/2;
int channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
 
File file = 
  new File(Environment.getExternalStorageDirectory(), "raw.pcm");
 
// Short array to store audio track (16 bit so 2 bytes per short)
int audioLength = (int)(file.length()/2);
short[] audio = new short[audioLength];
 
try {
  InputStream is = new FileInputStream(file);
  BufferedInputStream bis = new BufferedInputStream(is);
  DataInputStream dis = new DataInputStream(bis);
 
  int i = 0;
  while (dis.available() > 0 && i < audioLength) {
    audio[i] = dis.readShort();
    i++;
  }
 
  // Close the input streams.
  dis.close();
 
  // Create and play a new AudioTrack object
  AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                                         frequency,
                                         channelConfiguration,
                                         audioEncoding,
                                         audioLength,
                                         AudioTrack.MODE_STREAM);
  audioTrack.play();
  audioTrack.write(audio, 0, audioLength);
} catch (Throwable t) {
  Log.d(TAG, "An error occurred during playback", t);
}

code snippet PA4AD_Ch15_Raw_Audio/src/RawAudioActivity.java

Creating a Sound Pool

You can use the SoundPool class to manage audio when your application requires low audio latency and/or will be playing multiple audio streams simultaneously (such as a game with multiple sound effects and background music).

Creating a Sound Pool preloads the audio tracks used by your application, such as each level within a game, and optimizes their resource management.

As you add each track to the Sound Pool, it is decompressed and decoded into raw 16-bit PCM streams, allowing you to package compressed audio resources without suffering from the latency and CPU effects of decompression during playback.

When creating a Sound Pool, you can specify the maximum number of concurrent streams to play, allowing it to minimize the effect of the audio mixing by automatically stopping the oldest, lowest priority stream within the pool when the limit is reached.

When creating a new Sound Pool, you specify the target stream (almost always STREAM_MUSIC) and the maximum number of simultaneous streams that should be played concurrently as shown in Listing 15.20.

The Sound Pool supports loading audio resources from an Asset File Descriptor, package resource, file path, or File Descriptor, using a series of overloaded load methods. Loading a new audio resource returns an integer that is used to uniquely identify that sample and must be used to alter its playback settings or initiate or pause playback, as shown in Listing 15.21.

Listing 15.20: Creating a Sound Pool

int maxStreams = 10;
SoundPool sp = new SoundPool(maxStreams, AudioManager.STREAM_MUSIC, 0);
 
int track1 = sp.load(this, R.raw.track1, 0);
int track2 = sp.load(this, R.raw.track2, 0);
int track3 = sp.load(this, R.raw.track3, 0);

code snippet PA4AD_Ch15_Media_Player/src/SoundPoolActivity.java

Use the play, pause, resume, and stop methods to control the playback of each audio stream. When your audio samples are playing, you can use the setLoop method to alter the number of times the specified sample should repeat, the setRate method to modify the playback frequency, and the setVolume method to alter the playback volume. Some of these playback controls are shown in Listing 15.21.

Listing 15.21: Controlling playback of Sound Pool audio

track1Button.setOnClickListener(new OnClickListener() {
  public void onClick(View v) {
    sp.play(track1, 1, 1, 0, -1, 1);
  }
});
 
track2Button.setOnClickListener(new OnClickListener() {
  public void onClick(View v) {
    sp.play(track2, 1, 1, 0, 0, 1);
  }
});
 
track3Button.setOnClickListener(new OnClickListener() {
  public void onClick(View v) {
    sp.play(track3, 1, 1, 0, 0, 0.5f);
  }
});
 
stopButton.setOnClickListener(new OnClickListener() {
  public void onClick(View v) {
    sp.stop(track1);
    sp.stop(track2);
    sp.stop(track3);
  }
});
 
chipmunkButton.setOnClickListener(new OnClickListener() {
  public void onClick(View v) {
    sp.setRate(track1, 2f);
  }
});

code snippet PA4AD_Ch15_Media_Player/src/SoundPoolActivity.java

Android 2.2 (API level 8) introduced two convenience methods, autoPause and autoResume, that will pause and resume, respectively, all the active audio streams.

If you are creating a game, or other application that should play audio only when visible, it's good practice to pause all the active audio unless your application is active and visible, and restart it only after the user has begun interacting with it again—typically by touching the screen.
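A minimal sketch of this pattern, assuming sp is the Sound Pool created in Listing 15.20:

@Override
protected void onPause() {
  super.onPause();
  // Pause every active stream when the Activity loses the foreground.
  sp.autoPause();
}

@Override
public boolean onTouchEvent(MotionEvent event) {
  // Resume the paused streams when the user interacts with the screen.
  sp.autoResume();
  return super.onTouchEvent(event);
}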

When you no longer require the audio collected within a Sound Pool, call its release method to free the resources:

soundPool.release();

Using Audio Effects

Android 2.3 (API level 9) introduced a suite of audio effects that can be applied to the audio output of any Audio Track or Media Player. After applying the effects, you can modify the effect settings and parameters to alter how they affect the audio being output within your application.

As of Android 4.0.3, the following five AudioEffect subclasses are available:

· Equalizer—Lets you modify the frequency response of your audio output. Use the setBandLevel method to assign a gain value to a specific frequency band.

· Virtualizer—Makes audio appear to be more three-dimensional. Its implementation will vary depending on the configuration of the output device. Use the setStrength method to set the strength of the effect between 0 and 1000.

· BassBoost—Boosts the low frequencies of your audio output. Use the setStrength method to set the strength of the effect between 0 and 1000.

· PresetReverb—Allows you to specify one of a number of reverb presets, designed to make your audio sound as though it were being played in one of the specified room types. Use the setPreset method to apply reverb equivalent to a medium or large hall, or small, medium, or large room using a PresetReverb.PRESET_* constant.

· EnvironmentalReverb—Like the Preset Reverb, the Environmental Reverb allows you to control the audio output to simulate the effect of a different environment. Unlike the Preset Reverb, this subclass lets you specify each of the reverb parameters yourself to create a custom effect.

To apply one of these effects to your Audio Track or Media Player, find its unique audio session ID using the getAudioSessionId method on either object. Use the value to construct a new instance of the Audio Effect subclass you want to use, modify its settings as desired, and enable it, as shown in Listing 15.22.

Listing 15.22: Applying audio effects

int sessionId = mediaPlayer.getAudioSessionId();
short boostStrength = 500;
int priority = 0;
 
BassBoost bassBoost = new BassBoost (priority, sessionId);
bassBoost.setStrength(boostStrength);
bassBoost.setEnabled(true);

code snippet PA4AD_Ch15_Media_Player/src/AudioPlayerActivity.java
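The other effects are applied in the same way. As a further sketch, the following applies an Equalizer to the same audio session, querying the supported gain range before assigning a level, because band counts and level ranges vary by device:

Equalizer equalizer = new Equalizer(priority, sessionId);

// The upper bound of the supported gain range, in millibels.
short maxLevel = equalizer.getBandLevelRange()[1];

// Boost the lowest frequency band to the maximum supported gain.
if (equalizer.getNumberOfBands() > 0)
  equalizer.setBandLevel((short)0, maxLevel);

equalizer.setEnabled(true);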

Using the Camera for Taking Pictures

The T-Mobile G1 was released in 2008 with a 3.2-megapixel camera. Today, most devices feature at least a 5-megapixel camera, with some models sporting 8.1-megapixel cameras. The ubiquity of smartphones featuring increasingly high-quality cameras has made camera applications popular additions to Google Play.

The following sections demonstrate the mechanisms you can use to control the camera and take photos within your applications.

Using Intents to Take Pictures

The easiest way to take a picture from within your application is to fire an Intent using the MediaStore.ACTION_IMAGE_CAPTURE action:

startActivityForResult(
  new Intent(MediaStore.ACTION_IMAGE_CAPTURE), TAKE_PICTURE);

This launches a Camera application to take the photo, providing your users with the full suite of camera functionality without you having to rewrite the native Camera application.

Once users are satisfied with the image, the result is returned to your application within the Intent received by the onActivityResult handler.

By default, the picture taken will be returned as a thumbnail, available as a raw bitmap in the data extra of the returned Intent.

To obtain a full image, you must specify a target file in which to store it, encoded as a URI passed in using the MediaStore.EXTRA_OUTPUT extra in the launch Intent, as shown in Listing 15.23.

Listing 15.23: Requesting a full-size picture using an Intent

// Create an output file.
File file = new File(Environment.getExternalStorageDirectory(),
                     "test.jpg");
Uri outputFileUri = Uri.fromFile(file);
 
// Generate the Intent.
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
intent.putExtra(MediaStore.EXTRA_OUTPUT, outputFileUri);
 
// Launch the camera app.
startActivityForResult(intent, TAKE_PICTURE);

code snippet PA4AD_Ch15_Intent_Camera/src/CameraActivity.java

The full-size image taken by the camera will then be saved to the specified location. No thumbnail will be returned in the Activity result callback, and the received Intent's data will be null.

Listing 15.24 shows how to use getParcelableExtra to extract a thumbnail where one is returned, or to decode the saved file when a full-size image is taken.

Listing 15.24: Receiving pictures from an Intent

@Override
protected void onActivityResult(int requestCode,
                                int resultCode, Intent data) {
  if (requestCode == TAKE_PICTURE) {
    // Check if the result includes a thumbnail Bitmap
    if (data != null) {
      if (data.hasExtra("data")) {
        Bitmap thumbnail = data.getParcelableExtra("data");
        imageView.setImageBitmap(thumbnail);
      }
    } else {
      // If there is no thumbnail image data, the image
      // will have been stored in the target output URI.
 
      // Resize the full image to fit in our image view.
      int width = imageView.getWidth();
      int height = imageView.getHeight();
      
      BitmapFactory.Options factoryOptions = new 
        BitmapFactory.Options();
 
      factoryOptions.inJustDecodeBounds = true;
      BitmapFactory.decodeFile(outputFileUri.getPath(), 
                              factoryOptions);
        
      int imageWidth = factoryOptions.outWidth;
      int imageHeight = factoryOptions.outHeight;
      
      // Determine how much to scale down the image
      int scaleFactor = Math.min(imageWidth/width, 
                                 imageHeight/height);
      
      // Decode the image file into a Bitmap sized to fill the View
      factoryOptions.inJustDecodeBounds = false;
      factoryOptions.inSampleSize = scaleFactor;
      factoryOptions.inPurgeable = true;
      
      Bitmap bitmap = 
        BitmapFactory.decodeFile(outputFileUri.getPath(),
                                 factoryOptions);
        
      imageView.setImageBitmap(bitmap); 
    }
  }
}

code snippet PA4AD_Ch15_Intent_Camera/src/CameraActivity.java

To make photos you save available to other applications, including the native Gallery app, it's good practice to add them to the Media Store, as described in the section “Adding Media to the Media Store.”

Controlling the Camera Directly

To access the camera hardware directly, you need to add the CAMERA permission to your application manifest:

<uses-permission android:name="android.permission.CAMERA"/>

Use the Camera class to adjust camera settings, specify image preferences, and take pictures. To access the Camera, use the static open method on the Camera class:

Camera camera = Camera.open();

When you are finished with the Camera, remember to relinquish your hold on it by calling release:

camera.release();

The Camera.open method will turn on and initialize the Camera. At this point it is ready for you to modify settings, configure the preview surface, and take pictures, as shown in the following sections.
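Because only one application can use the camera hardware at a time, a common pattern, sketched here assuming camera is stored as an instance field, is to open the Camera when your Activity is resumed and release it when the Activity is paused:

@Override
protected void onResume() {
  super.onResume();
  camera = Camera.open();
}

@Override
protected void onPause() {
  super.onPause();
  // Release the camera so other applications can use it.
  if (camera != null) {
    camera.release();
    camera = null;
  }
}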

Camera Properties

The Camera settings are stored using a Camera.Parameters object, accessible by calling the getParameters method on the Camera object:

Camera.Parameters parameters = camera.getParameters();

Using the Camera Parameters you can find many of the properties of the Camera and currently focused scene; the parameters available depend on the platform version.

You can find the focal length and related horizontal and vertical angle of view using the getFocalLength and get[Horizontal/Vertical]ViewAngle methods, respectively, introduced in Android 2.2 (API level 8).

Android 2.3 (API level 9) introduced the getFocusDistances method, which you can use to estimate the distance between the lens and the objects currently believed to be in focus. Rather than returning a value, this method populates an array of floats corresponding to the near, far, and optimal focus distances, as shown in Listing 15.25. The object most sharply focused will be at the optimal distance.

Listing 15.25: Finding the distance to focused objects

float[] focusDistances = new float[3];
 
parameters.getFocusDistances(focusDistances);
 
float near =
  focusDistances[Camera.Parameters.FOCUS_DISTANCE_NEAR_INDEX];
float far = 
  focusDistances[Camera.Parameters.FOCUS_DISTANCE_FAR_INDEX];
float optimal = 
  focusDistances[Camera.Parameters.FOCUS_DISTANCE_OPTIMAL_INDEX];

code snippet PA4AD_Ch15_Camera/src/CameraActivity.java

Camera Settings and Image Parameters

To change the Camera settings, use the set* methods to modify the Parameters object. Android 2.0 (API level 5) introduced a wide range of Camera Parameters, each with a setter and getter. Before attempting to modify any camera parameter, it's important to confirm that the Camera implementation on the host device supports the change.

After modifying the Parameters, pass them back into the Camera, using its setParameters method to apply the changes:

camera.setParameters(parameters);

Most of the following parameters are useful primarily if you are replacing the native Camera application. That said, they can also be useful for adjusting how the live preview is displayed, allowing you to tailor the live stream for augmented reality applications.

· [get/set]SceneMode—Returns/sets the type of scene being photographed using one of several SCENE_MODE_* static constants. Each scene mode optimally configures the Camera parameters (flash, white balance, focus mode, and so on) for a particular scene type (party, beach, sunset, and so on).

· [get/set]FlashMode—Returns/sets the current flash mode (typically one of on, off, red-eye reduction, or flashlight mode) using the FLASH_MODE_* static constants. Before attempting to set the flash mode, use the getSupportedFlashModes method to confirm which modes are available.

· [get/set]WhiteBalance—Returns/sets the white balance correction used to correct the scene, using one of the WHITE_BALANCE_* static constants. Before setting the white balance, use the getSupportedWhiteBalance method to confirm which settings are available.

· [get/set]AutoWhiteBalanceLock—Introduced in Android 4.0 (API level 14). When using automatic white balancing, enabling the auto white balance lock will pause the color correction algorithm, ensuring that multiple sequential photos use the same color balance settings. This is particularly effective when taking panoramic images or exposure bracketing for high dynamic range images. Use the isAutoWhiteBalanceLockSupported method to confirm this functionality is available on the host device.

· [get/set]ColorEffect—Returns/sets any special color effects to apply to the image using an EFFECT_* static constant. The color effects available (including sepia tone, posterize, and blackboard effects) vary by device and platform version. Use the getSupportedColorEffects method to find which color effects are available.

· [get/set]FocusMode—Returns/sets how the camera should attempt to focus using a FOCUS_MODE_* static constant. The available focus modes vary depending on the platform version. (For example, continuous autofocus was introduced in Android 4.0.) Use the getSupportedFocusModes method to find which modes are available.

· [get/set]Antibanding—Returns/sets the screen refresh frequency that should be used to reduce banding effects using an ANTIBANDING_* static constant. Use the getSupportedAntibanding method to find which frequencies are available.

You can also use Camera Parameters to read or specify size, quality, and format parameters for the image, thumbnail, and camera preview. The following list explains how to set some of these values:

· JPEG and thumbnail quality—Use the setJpegQuality and setJpegThumbnailQuality methods, respectively, passing in an integer value between 0 and 100, where 100 is the best quality.

· Image, preview, and thumbnail sizes—Use setPictureSize, setPreviewSize, and setJpegThumbnailSize to specify a height and width for the image, preview, and thumbnail, respectively. In each case, you should use the corresponding getSupportedPictureSizes, getSupportedPreviewSizes, and getSupportedJpegThumbnailSizes methods to determine valid values. Each method returns a List of Camera.Size objects that specify valid height/width combinations.

· Image and preview pixel format—Use setPictureFormat and setPreviewFormat to set the image format using a static constant from the ImageFormat class. Use the getSupportedPictureFormats and getSupportedPreviewFormats methods to return a list of the supported formats before using either of these setters.

· Preview frame rate—The setPreviewFpsRange method replaces the setPreviewFrameRate method that was deprecated in Android 2.3 (API level 9). Use it to specify your preferred frame rate range to use for previews. Use the getSupportedPreviewFpsRange method to find the minimum and maximum supported frame rate. Both methods represent the frame rate as an integer multiplied by 1000, so a range of 24 to 30 FPS becomes 24000 to 30000.

Checking for supported parameter values is particularly important when selecting valid preview or image sizes, as each device's camera will potentially support a different subset.
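
For example, the following sketch (a minimal illustration; the camera variable and the targetWidth value are assumed to exist) walks the supported preview sizes and selects the one whose width is closest to the desired width:

Camera.Parameters parameters = camera.getParameters();
 
// Find the supported preview size closest to the target width.
Camera.Size best = null;
for (Camera.Size size : parameters.getSupportedPreviewSizes()) {
  if (best == null ||
      Math.abs(size.width - targetWidth) <
      Math.abs(best.width - targetWidth))
    best = size;
}
 
parameters.setPreviewSize(best.width, best.height);
camera.setParameters(parameters);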

Controlling Auto Focus, Focus Areas, and Metering Areas

If the host Camera supports auto focus, you can specify the focus mode using the setFocusMode method, passing in one of the Camera.Parameters.FOCUS_MODE_* constants. The available focus modes will depend on the capabilities of the hardware and the version of the Android platform it runs. Use the getSupportedFocusModes method to find which modes are available.

To be notified when the auto focus operation has completed, initiate auto focus using the autoFocus method, specifying an AutoFocusCallback implementation:

Camera.Parameters parameters = camera.getParameters();
if (parameters.getSupportedFocusModes().contains(
  Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
  parameters.setFocusMode(
    Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
  
  camera.autoFocus(new AutoFocusCallback() {
    public void onAutoFocus(boolean success, Camera camera) {
      Log.d(TAG, "AutoFocus: " + (success ? "Succeeded" : "Failed"));
    }
  });
}

Android 4.0 (API level 14) introduced two additional focus APIs that enable you to specify the focus areas and metering areas to use when focusing your pictures or determining the white balance and brightness of your scene.

Not all devices support defining focus areas. To confirm that this is available on the host device, use the getMaxNumFocusAreas method on the Camera Parameters:

int focusAreaCount = camera.getParameters().getMaxNumFocusAreas();

This will return the maximum number of focus areas the device camera is capable of detecting. If the result is 0, focus area specification is not supported.

Specifying Focus Areas allows you to instruct the camera driver as to the relative importance of different areas of the scene when attempting to focus the image. This is typically used to focus on faces or to allow users to manually select a focal point.

To define your focus areas, use the setFocusAreas method, passing in a List of Camera.Area objects. Each Camera Area consists of a rectangle that defines the boundary of that focus area relative to the currently visible scene, using coordinates from –1000 to 1000 where (–1000, –1000) is the upper-left corner of the frame, together with the relative weight of that area. The camera driver multiplies the area of each focus rectangle by its weight to calculate the relative importance of each area when attempting to focus the scene.

You can use the same approach to set the metering areas using setMeteringAreas. As with focus areas, not all devices support multiple metering areas; use the getMaxNumMeteringAreas method to determine whether the host camera supports one or more metering areas.
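
Putting both together, the following sketch (the rectangle and weight are arbitrary illustrative values) asks the driver to favor the center of the scene when focusing and metering, after first confirming support for each feature:

Camera.Parameters parameters = camera.getParameters();
 
// A 200x200 area centered in the scene, with the maximum weight (1000).
Camera.Area centerArea =
  new Camera.Area(new Rect(-100, -100, 100, 100), 1000);
 
List<Camera.Area> areas = new ArrayList<Camera.Area>();
areas.add(centerArea);
 
if (parameters.getMaxNumFocusAreas() > 0)
  parameters.setFocusAreas(areas);
 
if (parameters.getMaxNumMeteringAreas() > 0)
  parameters.setMeteringAreas(areas);
 
camera.setParameters(parameters);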

Using the Camera Preview

If you are implementing your own camera, you will need to display a preview of what's being captured by the camera to allow users to compose their photos. It's not possible to take a picture using the Camera object without first displaying a preview.

Being able to display the camera's streaming video also means that you can incorporate live video into your applications, such as implementing augmented reality (the process of overlaying dynamic contextual data—such as details for landmarks or points of interest—on top of a live camera feed).

The camera preview is displayed using a SurfaceHolder, so to view the live camera stream within your application, you must include a Surface View within your UI hierarchy. Implement a SurfaceHolder.Callback to listen for the construction of a valid surface before passing it in to the setPreviewDisplay method of your Camera object.

A call to startPreview will begin the streaming, and stopPreview will end it, as shown in Listing 15.26.

Listing 15.26: Previewing a real-time camera stream

public class CameraActivity extends Activity implements 
  SurfaceHolder.Callback {
  
  private static final String TAG = "CameraActivity";
 
  private Camera camera;
  
  @Override
  public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    
    SurfaceView surface = (SurfaceView)findViewById(R.id.surfaceView);
    SurfaceHolder holder = surface.getHolder();
    holder.addCallback(this);
    holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    holder.setFixedSize(400, 300);
  }
  
  public void surfaceCreated(SurfaceHolder holder) { 
    try {
      camera.setPreviewDisplay(holder);
      camera.startPreview();
      // TODO Draw over the preview if required.
    } catch (IOException e) {
      Log.d(TAG, "IO Exception", e);
    }
  }
 
  public void surfaceDestroyed(SurfaceHolder holder) {
    // The Camera may already have been released within onPause.
    if (camera != null)
      camera.stopPreview();
  }
  
  public void surfaceChanged(SurfaceHolder holder, int format, 
                             int width, int height) {
  }
  
  @Override
  protected void onPause() {
    super.onPause();
    camera.stopPreview();
    camera.release();
    camera = null;
  }
 
  @Override
  protected void onResume() {
    super.onResume();
    camera = Camera.open();        
  }
}

code snippet PA4AD_Ch15_Camera/src/CameraActivity.java

The Android SDK includes an excellent example of using a SurfaceView to display the camera preview in real time. It can be found at http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/graphics/CameraPreview.html.

You can also assign a PreviewCallback to be fired for each preview frame, allowing you to manipulate or analyze each frame in real time. Call the setPreviewCallback method on the Camera object, passing in a new PreviewCallback implementation overriding the onPreviewFrame method.

Each frame will be received by the onPreviewFrame handler, with the image data passed in as a byte array encoded in the current preview format (NV21 by default, as assumed by the following snippet):

camera.setPreviewCallback(new PreviewCallback() {
  public void onPreviewFrame(byte[] data, Camera camera) {
    int quality = 60;
  
    Size previewSize = camera.getParameters().getPreviewSize();
    YuvImage image = new YuvImage(data, ImageFormat.NV21, 
      previewSize.width, previewSize.height, null);
    ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
    
    image.compressToJpeg(
      new Rect(0, 0, previewSize.width, previewSize.height),
      quality, outputStream);
 
    // TODO Do something with the preview image.
  }
});

Detecting Faces and Facial Features

Android 4.0 (API level 14) introduced APIs that you can use to detect faces and facial features within a scene. This feature is most useful for tweaking the focus areas and metering areas, and for determining white balance, when taking photos featuring people, but it can also be used creatively when applying effects.

Face detection is not necessarily available on every device, even those running Android 4.0 or above. To confirm that face detection is available on the host device, use the getMaxNumDetectedFaces method on the Camera Parameters:

int facesDetectable = camera.getParameters().getMaxNumDetectedFaces();

This will return the maximum number of faces the device camera is capable of detecting. If the result is 0, face detection is not supported.

Before you begin monitoring a camera for faces, you need to assign a new FaceDetectionListener, overriding the onFaceDetection method. You will receive an array containing a Face object for each face detected within the scene (up to the maximum supported number).

Each Face object includes a unique identifier that can be used to track each face while it remains in the scene, a confidence score between 1 and 100 that indicates the likelihood that what's been detected is actually a face, the bounding rectangle containing the face, and the coordinates of each eye and the mouth:

camera.setFaceDetectionListener(new FaceDetectionListener() {  
  public void onFaceDetection(Face[] faces, Camera camera) {
    if (faces.length > 0) {
      Log.d("FaceDetection", "Faces detected: " + faces.length +
            " Face 1 location X: " + faces[0].rect.centerX() +
            " Y: " + faces[0].rect.centerY());
    }
  }
});

To begin detecting and tracking faces, call the Camera's startFaceDetection method. This must be called each time you start (or restart) the Camera preview, as it will be automatically stopped whenever the preview ends.

public void surfaceCreated(SurfaceHolder holder) { 
  try {
    camera.setPreviewDisplay(holder);
    camera.startPreview();
    camera.startFaceDetection();
    // TODO Draw over the preview if required.
  } catch (IOException e) {
    Log.d(TAG, "IO Exception", e);
  }
}

You can stop face detection by calling stopFaceDetection:

public void surfaceDestroyed(SurfaceHolder holder) {
  camera.stopFaceDetection();
  camera.stopPreview();
}

Taking a Picture

After you have configured the camera settings, and a preview is active, you can take a picture by calling takePicture on the Camera object and passing in a ShutterCallback and two PictureCallback implementations (one for the RAW and one for JPEG-encoded images). Each picture callback will receive a byte array representing the image in the appropriate format, while the shutter callback is triggered immediately after the shutter is closed.

Listing 15.27 shows the skeleton code for taking a picture and saving the JPEG image to an SD card.

Listing 15.27: Taking a picture

private void takePicture() {
  camera.takePicture(shutterCallback, rawCallback, jpegCallback);
}
 
ShutterCallback shutterCallback = new ShutterCallback() {
  public void onShutter() {
    // TODO Do something when the shutter closes.
  }
};
 
PictureCallback rawCallback = new PictureCallback() {
  public void onPictureTaken(byte[] data, Camera camera) {
    // TODO Do something with the image RAW data.
  }
};
 
PictureCallback jpegCallback = new PictureCallback() {
  public void onPictureTaken(byte[] data, Camera camera) {
    // Save the image JPEG data to the SD card
    FileOutputStream outStream = null;
    try {
      String path = Environment.getExternalStorageDirectory() + 
                    "/test.jpg";
 
      outStream = new FileOutputStream(path);
      outStream.write(data);
      outStream.close();
    } catch (FileNotFoundException e) {
      Log.e(TAG, "File Note Found", e);
    } catch (IOException e) {
      Log.e(TAG, "IO Exception", e);
    }
  }
};

code snippet PA4AD_Ch15_Camera/src/CameraActivity.java

Reading and Writing JPEG EXIF Image Details

The ExifInterface class provides mechanisms for you to read and modify the Exchangeable Image File Format (EXIF) meta data stored within a JPEG file. Create a new ExifInterface instance by passing the full filename of the target JPEG in to the constructor:

  ExifInterface exif = new ExifInterface(jpegfilename);

EXIF data is used to store a wide range of meta data on photographs, including date and time, camera settings (such as make and model), and image settings (such as aperture and shutter speed), as well as image descriptions and locations.

To read an EXIF attribute, call getAttribute on the ExifInterface object, passing in the name of the attribute to read. The ExifInterface class includes a number of static TAG_* constants that can be used to access common EXIF meta data. To modify an EXIF attribute, use setAttribute, passing in the name of the attribute to modify and the value to apply. Changes are made in memory, so call saveAttributes to write them back to the JPEG file.

Listing 15.28 shows how to read the camera model from a file stored on an SD card, before modifying the camera manufacturer details and saving the changes back to the file.

Listing 15.28: Reading and modifying EXIF data

File file = new File(Environment.getExternalStorageDirectory(),
                     "test.jpg");
 
try {
  ExifInterface exif = new ExifInterface(file.getCanonicalPath());
  // Read the camera model attribute.
  String model = exif.getAttribute(ExifInterface.TAG_MODEL);
  Log.d(TAG, "Model: " + model);
  // Set the camera make.
  exif.setAttribute(ExifInterface.TAG_MAKE, "My Phone");
  // Write the modified attributes back to the JPEG file.
  exif.saveAttributes();
} catch (IOException e) {
  Log.e(TAG, "IO Exception", e);
}

code snippet PA4AD_Ch15_Camera/src/CameraActivity.java

Recording Video

Android offers two options for recording video within your application.

The simplest technique is to use Intents to launch the video camera application. This option lets you specify the output location and video recording quality, while letting the native video recording application handle the user experience and error handling. This is the best practice approach and should be used in most circumstances, unless you are building your own replacement video recorder.

In cases where you want to replace the native application or simply need more fine-grained control over the video capture UI or recording settings, you can use the Media Recorder class.

Using Intents to Record Video

The easiest, and best practice, way to initiate video recording is using the MediaStore.ACTION_VIDEO_CAPTURE action Intent.

Starting a new Activity with this Intent launches the native video recorder, allowing users to start, stop, review, and retake their video. When they're satisfied, a URI to the recorded video is provided to your Activity as the data parameter of the returned Intent.

The video capture action Intent can contain the following three optional extras, which can be applied as shown in the sketch following this list:

· MediaStore.EXTRA_OUTPUT—By default, the video recorded by the video capture action will be stored in the default Media Store. If you want to record it elsewhere, you can specify an alternative URI using this extra.

· MediaStore.EXTRA_VIDEO_QUALITY—The video capture action allows you to specify a video quality using an integer value. There are currently two possible values: 0 for low (MMS) quality videos, or 1 for high (full resolution) videos. By default, the high-resolution mode is used.

· MediaStore.EXTRA_DURATION_LIMIT—The maximum length of the recorded video (in seconds).
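
As a brief sketch of how these extras might be applied (the output filename here is purely illustrative), you can configure the Intent before launching it:

Intent intent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
 
// Request full-resolution recording.
intent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 1);
 
// Limit the recording to 30 seconds.
intent.putExtra(MediaStore.EXTRA_DURATION_LIMIT, 30);
 
// Record to a custom location (an illustrative path only).
File outputFile =
  new File(Environment.getExternalStorageDirectory(), "myvideo.mp4");
intent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(outputFile));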

Listing 15.29 shows how to use the video capture action to record a new video.

Listing 15.29: Recording video using an Intent

private static final int RECORD_VIDEO = 0;
  
private void startRecording() {
  // Generate the Intent.
  Intent intent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
  
  // Launch the camera app.
  startActivityForResult(intent, RECORD_VIDEO);
}
 
@Override
protected void onActivityResult(int requestCode,
                                int resultCode, Intent data) {
  if (requestCode == RECORD_VIDEO && resultCode == RESULT_OK) {
    VideoView videoView = (VideoView)findViewById(R.id.videoView);
    videoView.setVideoURI(data.getData()); 
    videoView.start();
  }
}

code snippet PA4AD_Ch15_Intent_Video_Camera/src/VideoCameraActivity.java

Using the Media Recorder to Record Video

You can use the MediaRecorder class to record audio and/or video files that can be used in your own applications or added to the Media Store.

To record any media in Android, your application needs the RECORD_AUDIO permission for audio capture and the CAMERA permission when recording from the device camera:

<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.CAMERA"/>

The Media Recorder lets you specify the audio and video source, the output file format, and the audio and video encoders to use when recording your file. Android 2.2 (API level 8) introduced the concept of profiles, which can be used to apply a predefined set of Media Recorder configurations.

Much like the Media Player, the Media Recorder manages recording as a state machine. This means that the order in which you configure and manage the Media Recorder is important. In the simplest terms, the transitions through the state machine can be described as follows, with a compact code sketch after the list:

1. Create a new Media Recorder.

2. Unlock the Camera and assign it to the Media Recorder.

3. Specify the input sources to record from.

4. Select a profile to use for Android 2.2 and above, or define the output format and specify the audio and video encoder, frame rate, and output size.

5. Select an output file.

6. Assign a preview Surface.

7. Prepare the Media Recorder for recording.

8. Record.

9. End the recording.
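
The following compact sketch shows these steps in their required order (it assumes camera, holder, and an outputPath string already exist, and omits the exception handling that prepare requires in practice); Listings 15.30 through 15.32 examine each stage in more detail:

MediaRecorder mediaRecorder = new MediaRecorder();  // 1. Create.
 
camera.unlock();                                    // 2. Unlock the Camera
mediaRecorder.setCamera(camera);                    //    and assign it.
 
mediaRecorder.setAudioSource(                       // 3. Input sources.
  MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
 
mediaRecorder.setProfile(                           // 4. Recording profile.
  CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
 
mediaRecorder.setOutputFile(outputPath);            // 5. Output file.
 
mediaRecorder.setPreviewDisplay(                    // 6. Preview Surface.
  holder.getSurface());
 
mediaRecorder.prepare();                            // 7. Prepare.
mediaRecorder.start();                              // 8. Record.
 
// ... later, when the recording should end:
mediaRecorder.stop();                               // 9. End the recording.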

A more detailed and thorough description of the Media Recorder state machine is provided at the Android developer site, at http://developer.android.com/reference/android/media/MediaRecorder.html.

When you finish recording your media, call release on your Media Recorder object to free the associated resources:

mediaRecorder.release();

Configuring the Video Recorder

As described in the preceding section, before recording you must allocate the camera to use, specify the input sources, choose a profile (or output format, audio, and video encoder), and assign an output file—in that order.

Start by unlocking the Camera and assigning it to the Media Recorder using the setCamera method.

The setAudioSource and setVideoSource methods let you specify a MediaRecorder.AudioSource.* and MediaRecorder.VideoSource.* static constant that define the audio and video source, respectively.

After selecting your input sources, you need to specify the recording profile to use. Android 2.2 (API level 8) introduced the setProfile method, which uses a profile created using the CamcorderProfile class's get method, specifying a quality profile using the CamcorderProfile.QUALITY_* constants. Not all profiles are supported on every device, so use the CamcorderProfile.hasProfile method to confirm the availability of a profile before applying it to your Media Recorder:

if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_1080P)) {
  CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_1080P);
  mediaRecorder.setProfile(profile);
}

Alternatively, you can specify the recording profile manually by selecting the output format, using the setOutputFormat method to specify a MediaRecorder.OutputFormat constant and using the set[audio/video]Encoder methods to specify an audio or video encoder constant from the MediaRecorder.[Audio/Video]Encoder class. Take this opportunity to set the frame rate or video output size, if desired.

Finally, assign a file to store the recorded media using the setOutputFile method before allocating a preview surface and calling prepare.

Listing 15.30 shows how to configure a Media Recorder to record audio and video from the microphone and camera, using the highest available quality profile (starting at 1080p), to a file on the device's external storage.

Listing 15.30: Preparing to record audio and video using the Media Recorder

// Unlock the Camera to allow the Media Recorder to own it.
camera.unlock();
 
// Assign the Camera to the Media Recorder.
mediaRecorder.setCamera(camera);
 
// Configure the input sources.
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
 
// Set the recording profile.
CamcorderProfile profile = null;
 
if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_1080P))
  profile = CamcorderProfile.get(CamcorderProfile.QUALITY_1080P);
else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P))
  profile = CamcorderProfile.get(CamcorderProfile.QUALITY_720P);
else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_480P))
  profile = CamcorderProfile.get(CamcorderProfile.QUALITY_480P);
else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_HIGH))
  profile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
  
if (profile != null)
  mediaRecorder.setProfile(profile); 
 
// Specify the output file
mediaRecorder.setOutputFile("/sdcard/myvideorecording.mp4");
 
// Prepare to record
mediaRecorder.prepare();

code snippet PA4AD_Ch15_Intent_Video_Camera/src/VideoCameraActivity.java

The setOutputFile method must be called before prepare and after setOutputFormat; otherwise, it will throw an Illegal State Exception.

Android 4.0 (API level 14) introduced a technique to improve the performance of the Media Recorder by reducing startup time. When your Activity is being used only to record audio/video (rather than to take still pictures), you can use the Camera.Parameters.setRecordingHint method to tell the Camera you only want to record audio/video, as shown in Listing 15.31.

Listing 15.31: Using the Camera recording hint

Camera.Parameters parameters = camera.getParameters();
parameters.setRecordingHint(true);
camera.setParameters(parameters);

code snippet PA4AD_Ch15_Intent_Video_Camera/src/VideoCameraActivity.java

Previewing the Video Stream

When recording video, it's considered good practice to display a preview of the recorded video in real time. As with the Camera preview, you can assign a Surface to display the video stream, using the setPreviewDisplay method on your Media Recorder object. The preview display will be hosted within a SurfaceView that must be initialized within a SurfaceHolder.Callback interface implementation.

After creating the Surface Holder, assign it to the Media Recorder using the setPreviewDisplay method—after specifying the recording sources and output file but before calling prepare:

mediaRecorder.setPreviewDisplay(holder.getSurface());

The live video preview stream will begin as soon as you make a call to prepare:

mediaRecorder.prepare();

Controlling the Recording

After configuring the Media Recorder and setting up the preview, you can begin recording at any time by calling the start method:

mediaRecorder.start();

When you finish recording, call stop to end the recording, followed by reset and release to free the Media Recorder resources, as shown in Listing 15.32. At this point, you should also lock the camera.

Listing 15.32: Stopping a video recording

mediaRecorder.stop();
    
// Reset and release the media recorder.
mediaRecorder.reset();
mediaRecorder.release();
camera.lock();

code snippet PA4AD_Ch15_Intent_Video_Camera/src/VideoCameraActivity.java

Android 4.0.3 (API level 15) introduced the ability to apply image stabilization to your video recordings. To toggle image stabilization, modify the Camera parameters using the setVideoStabilization method, as shown in Listing 15.33. Not all camera hardware will support image stabilization, so be sure to check that it's available using the isVideoStabilizationSupported method.

Listing 15.33: Image stabilization

Camera.Parameters parameters = camera.getParameters();
if (parameters.isVideoStabilizationSupported())
  parameters.setVideoStabilization(true);
camera.setParameters(parameters);

code snippet PA4AD_Ch15_Intent_Video_Camera/src/VideoCameraActivity.java

Creating a Time-Lapse Video

Android 2.2 (API level 8) enhanced the Media Recorder to provide support for creating time-lapse videos. To configure a Media Recorder object to create a time-lapse effect, use the setCaptureRate method to set the required frame capture rate:

// Capture one frame every 30 seconds (1/30 frames per second).
mediaRecorder.setCaptureRate(1.0 / 30); 

The Media Recorder must also be configured with one of a number of predefined profiles optimized for time-lapse video capture. Use the setProfile method to apply one of the QUALITY_TIME_LAPSE_* profiles:

CamcorderProfile profile =
  CamcorderProfile.get(CamcorderProfile.QUALITY_TIME_LAPSE_HIGH);
 
mediaRecorder.setProfile(profile);

Using Media Effects

Android 4.0 (API level 14) introduced a new media effects API that can be used to apply a number of real-time visual effects to video content using the GPU via OpenGL textures.

You can apply media effects to bitmaps, videos, or the live camera previews, provided that the source images are bound to a GL_TEXTURE_2D texture image and contain at least one mipmap level.

Although a full examination of how to use these media effects is outside the scope of this book, generally speaking, to apply an effect to an image or video frame, you need to create a new EffectContext, using the EffectContext.createWithCurrentGlContext method from within an OpenGL ES 2.0 context.

Effects are created using an EffectFactory, obtained by calling getFactory on the returned EffectContext. To create a particular effect, call createEffect, passing in one of the EffectFactory.EFFECT_* constants. Each Effect supports different parameters, which you can configure by calling setParameter and passing the name of the setting to change and the value to apply.

More than 25 effects are currently supported. The full list, including the parameters they support, is available at http://developer.android.com/reference/android/media/effect/EffectFactory.html.

After configuring the effect you want to apply, use its apply method, passing in the input texture, its dimensions, and the target texture to which the effect should be applied.
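
A minimal sketch of that sequence follows (the texture identifiers and dimensions are assumed to describe images already bound as GL_TEXTURE_2D with at least one mipmap level, within a valid OpenGL ES 2.0 context):

// Must be called from a thread with a current OpenGL ES 2.0 context.
EffectContext effectContext = EffectContext.createWithCurrentGlContext();
EffectFactory effectFactory = effectContext.getFactory();
 
// Create a brightness effect and configure its parameter.
Effect effect = effectFactory.createEffect(EffectFactory.EFFECT_BRIGHTNESS);
effect.setParameter("brightness", 1.5f);
 
// Apply the effect to the input texture, writing to the output texture.
effect.apply(inputTextureId, width, height, outputTextureId);
 
// Release the effect when it's no longer needed.
effect.release();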

Adding Media to the Media Store

By default, media files created by your application that are stored in private application folders will be unavailable to other applications. To make them visible, you need to insert them into the Media Store. Android provides two options for this: the preferred approach is to use the Media Scanner to interpret your file and insert it automatically; alternatively, you can manually insert a new record in the appropriate Content Provider. Using the Media Scanner is almost always the better approach.

Inserting Media Using the Media Scanner

If you have recorded new media of any kind, the MediaScannerConnection class provides the scanFile method as a simple way for you to add it to the Media Store without needing to construct the full record for the Media Store Content Provider.

Before you can use the scanFile method to initiate a content scan on your file, you must call connect and wait for the connection to the Media Scanner to complete. This call is asynchronous, so you will need to implement a MediaScannerConnectionClient to notify you when the connection has been made. You can use this same class to notify you when the scan is complete, at which point you can disconnect your Media Scanner Connection.

This sounds more complex than it is. Listing 15.34 shows the skeleton code for creating a new MediaScannerConnectionClient that defines a MediaScannerConnection, which is used to add a new file to the Media Store.

Listing 15.34: Adding files to the Media Store using the Media Scanner

private void mediaScan(final String filePath) {
  
  MediaScannerConnectionClient mediaScannerClient = new
    MediaScannerConnectionClient() {
  
    private MediaScannerConnection msc = null;
 
    // Instance initializer: create and connect the Media
    // Scanner Connection as soon as this client is constructed.
    {
      msc = new MediaScannerConnection(
        VideoCameraActivity.this, this); 
      msc.connect();
    }
 
    public void onMediaScannerConnected() {
      // Optionally specify a MIME Type, or
      // have the Media Scanner imply one based
      // on the filename.
      String mimeType = null;
      msc.scanFile(filePath, mimeType);
    }
 
    public void onScanCompleted(String path, Uri uri) {
      msc.disconnect();
      Log.d(TAG, "File Added at: " + uri.toString());
    }
  };
}

code snippet PA4AD_Ch15_Intent_Video_Camera/src/VideoCameraActivity.java

Inserting Media Manually

Rather than relying on the Media Scanner, you can add new media to the Media Store directly by creating a new ContentValues object and inserting it into the appropriate Media Store Content Provider yourself.

The meta data you specify here can include the title, timestamp, and geocoding information for your new media file:

ContentValues content = new ContentValues(3);
content.put(Audio.AudioColumns.TITLE, "TheSoundandtheFury");
content.put(Audio.AudioColumns.DATE_ADDED,
            System.currentTimeMillis() / 1000);
content.put(Audio.Media.MIME_TYPE, "audio/amr");

You must also specify the absolute path of the media file being added:

content.put(MediaStore.Audio.Media.DATA, "/sdcard/myoutputfile.amr");

Get access to the application's ContentResolver, and use it to insert this new row into the Media Store:

ContentResolver resolver = getContentResolver();
Uri uri = resolver.insert(MediaStore.Audio.Media.EXTERNAL_CONTENT_URI,
                          content);

After inserting the media file into the Media Store, you should announce its availability using a Broadcast Intent, as follows:

sendBroadcast(new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE, uri));