Multimedia Techniques - The Android Developer’s Cookbook: Building Applications with the Android SDK, Second Edition (2013)

Chapter 8. Multimedia Techniques

The Android platform provides comprehensive multimedia functionality. This chapter introduces techniques to manipulate images, record and play back audio, and record and play back video. Android supports most common decoders for reading multimedia, but only a subset of encoders for creating it. Basic media framework support in Android 4.1 is summarized in Table 8.1. Vendor-specific versions of Android are known to support additional formats; this is especially true of Google TV devices.

Table 8.1 Supported Media Types in Android 4.1 for Reading and Writing

An application that records any type of media requires setting the appropriate permission in the AndroidManifest.xml file (one or both of the following):

<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.RECORD_VIDEO"/>

Images

Images local to an application are usually put in the res/drawable/ directory, as discussed in Chapter 5, “User Interface Layout,” and are packaged with the application. They can be accessed with the appropriate resource identifier, such as R.drawable.my_picture. Images on the Android device filesystem can be accessed using the normal Java classes, such as an InputStream. However, the preferred method in Android to read an image into memory for manipulation is to use the built-in class BitmapFactory.

BitmapFactory creates bitmap objects from files, streams, or byte arrays. A resource or a file can be loaded like this:

Bitmap myBitmap1 = BitmapFactory.decodeResource(getResources(),
R.drawable.my_picture);
Bitmap myBitmap2 = BitmapFactory.decodeFile(filePath);

After the image is in memory, it can be manipulated using the bitmap methods, such as getPixel() and setPixel(). However, most images are too large to manipulate at full scale on an embedded device. Instead, consider scaling the image down:

Bitmap bm = Bitmap.createScaledBitmap(myBitmap2, 480, 320, false);

This helps avoid OutOfMemoryError crashes at run time. The following recipe shows an optimized method for loading large images.
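To see why full-size images exhaust memory, note that a decoded ARGB_8888 bitmap costs 4 bytes per pixel. The helper below is an illustrative sketch of that arithmetic (the class and method names are hypothetical, not part of the Android API):

```java
class BitmapMemory {
    //Approximate heap cost of a decoded ARGB_8888 bitmap: 4 bytes per pixel
    static long argb8888Bytes(int width, int height) {
        return (long) width * height * 4;
    }
}
```

A 2592x1936 camera image decodes to roughly 20MB, while a 480x320 scaled copy needs about 600KB, which is why scaling before manipulation matters.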

Recipe: Loading and Displaying an Image for Manipulation

This recipe shows an example of an image cut into four pieces and scrambled before being displayed to the screen. It also shows how to create a selectable list of images.

When a picture is taken on a device, it is put in the DCIM/Camera/ directory, which is used as the example image directory in this recipe. The image directory is passed to the ListFiles activity, which lists all its files and returns the one chosen by the user. The ListFiles activity is shown in Listing 8.1.

Listing 8.1. ListFiles.java


public class ListFiles extends ListActivity {
    private List<String> directoryEntries = new ArrayList<String>();

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        Intent i = getIntent();
        File directory = new File(i.getStringExtra("directory"));

        if (directory.isDirectory()) {
            File[] files = directory.listFiles();

            //Sort in descending date order (newest first)
            Arrays.sort(files, new Comparator<File>() {
                public int compare(File f1, File f2) {
                    return Long.valueOf(
                            f2.lastModified()).compareTo(f1.lastModified());
                }
            });

            //Fill list with files
            this.directoryEntries.clear();
            for (File file : files) {
                this.directoryEntries.add(file.getPath());
            }

            ArrayAdapter<String> directoryList = new ArrayAdapter<String>(
                    this, R.layout.file_row, this.directoryEntries);

            //Alphabetize entries
            //directoryList.sort(null);
            this.setListAdapter(directoryList);
        }
    }

    @Override
    protected void onListItemClick(ListView l, View v,
            int position, long id) {
        File clickedFile = new File(this.directoryEntries.get(position));
        Intent i = getIntent();
        i.putExtra("clickedFile", clickedFile.toString());
        setResult(RESULT_OK, i);
        finish();
    }
}


A File object is created from the directory string passed to the activity. If it is a directory, the files are sorted in reverse chronological order with a compare() method based on the files' lastModified() timestamps.
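The descending comparison can be checked in isolation. This sketch applies the same compare() logic to bare timestamps (the class and method names are illustrative): comparing t2 to t1 puts larger, that is newer, values first.

```java
import java.util.Arrays;
import java.util.Comparator;

class NewestFirst {
    //Same descending comparison as the recipe, on bare timestamps
    static Long[] sortDescending(Long[] times) {
        Arrays.sort(times, new Comparator<Long>() {
            public int compare(Long t1, Long t2) {
                return t2.compareTo(t1); //reversed order = newest first
            }
        });
        return times;
    }
}
```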

If instead an alphabetical list is desired, the sort() method can be used. (This is in the ListFiles activity, too, but commented out.) The list is then built and displayed on the screen using a separate layout file R.layout.file_row, which is shown in Listing 8.2.

Listing 8.2. res/layout/file_row.xml


<?xml version="1.0" encoding="utf-8"?>
<TextView
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:textSize="20sp"
    android:padding="3pt" />


ListFiles returns the path of the selected file to the calling activity, which can read it from the bundle in its onActivityResult() method.

The chosen picture is then loaded into memory for manipulation. If the file is too large, it can be subsampled as it is loaded to save memory; just replace the BitmapFactory.decodeFile() call in onActivityResult() of Listing 8.3 with the following:

BitmapFactory.Options options = new BitmapFactory.Options();
options.inSampleSize = 4;
Bitmap imageToChange = BitmapFactory.decodeFile(tmp, options);

An inSampleSize of four creates an image 1/16 the size of the original (four times smaller in each of the pixel dimensions). The limit can be adaptive based on the original image size.
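The adaptive limit can be computed before decoding by reading only the image bounds (set BitmapFactory.Options.inJustDecodeBounds to true, decode once, then read outWidth/outHeight) and picking a power-of-two inSampleSize. The pure-arithmetic helper below is an illustrative sketch; the class and method names are hypothetical:

```java
class SampleSize {
    //Largest power-of-two inSampleSize that still keeps the decoded
    //image at least as large as the requested target dimensions
    static int choose(int srcWidth, int srcHeight,
            int targetWidth, int targetHeight) {
        int sample = 1;
        while (srcWidth / (sample * 2) >= targetWidth
                && srcHeight / (sample * 2) >= targetHeight) {
            sample *= 2;
        }
        return sample;
    }
}
```

For a 2592x1936 source and a 480x320 target, this picks an inSampleSize of 4; decoding then yields a 648x484 image that can be scaled the rest of the way with createScaledBitmap().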

Another method to save memory is to resize the bitmap in memory before manipulations. This is done using the createScaledBitmap() method, as shown in this recipe. Listing 8.3 shows the main activity.

Listing 8.3. ImageManipulation.java


package cc.dividebyzero.android.cookbook.chapter8.image;

import cc.dividebyzero.android.cookbook.chapter8.ListFiles;
import cc.dividebyzero.android.cookbook.chapter8.R;
import android.app.Activity;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.os.Bundle;
import android.os.Environment;
import android.widget.ImageView;

public class ImageManipulation extends Activity {
    static final String CAMERA_PIC_DIR = "/DCIM/Camera/";
    ImageView iv;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.image_manipulation);
        iv = (ImageView) findViewById(R.id.my_image);

        String imageDir =
                Environment.getExternalStorageDirectory().getAbsolutePath()
                + CAMERA_PIC_DIR;

        Intent i = new Intent(this, ListFiles.class);
        i.putExtra("directory", imageDir);
        startActivityForResult(i, 0);
    }

    @Override
    protected void onActivityResult(int requestCode,
            int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == 0 && resultCode == RESULT_OK) {
            String tmp = data.getExtras().getString("clickedFile");
            Bitmap imageToChange = BitmapFactory.decodeFile(tmp);
            process_image(imageToChange);
        }
    }

    void process_image(Bitmap image) {
        Bitmap bm = Bitmap.createScaledBitmap(image, 480, 320, false);
        int width = bm.getWidth();
        int height = bm.getHeight();
        int x = width >> 1;
        int y = height >> 1;
        int[] pixels1 = new int[(width * height)];
        int[] pixels2 = new int[(width * height)];
        int[] pixels3 = new int[(width * height)];
        int[] pixels4 = new int[(width * height)];
        //Read the four quadrants of the image
        bm.getPixels(pixels1, 0, width, 0, 0, width >> 1, height >> 1);
        bm.getPixels(pixels2, 0, width, x, 0, width >> 1, height >> 1);
        bm.getPixels(pixels3, 0, width, 0, y, width >> 1, height >> 1);
        bm.getPixels(pixels4, 0, width, x, y, width >> 1, height >> 1);
        //Write the quadrants back in scrambled positions
        if (bm.isMutable()) {
            bm.setPixels(pixels2, 0, width, 0, 0, width >> 1, height >> 1);
            bm.setPixels(pixels4, 0, width, x, 0, width >> 1, height >> 1);
            bm.setPixels(pixels1, 0, width, 0, y, width >> 1, height >> 1);
            bm.setPixels(pixels3, 0, width, x, y, width >> 1, height >> 1);
        }
        iv.setImageBitmap(bm);
    }
}


The associated main layout is shown in Listing 8.4.

Listing 8.4. image_manipulation.xml


<?xml version="1.0" encoding="utf-8"?>
<LinearLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="match_parent"
    android:layout_height="match_parent">
    <TextView android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:textSize="30sp"
        android:text="Scrambled Picture" />
    <ImageView android:id="@+id/my_image"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />
</LinearLayout>


The AndroidManifest.xml file must declare both activities used in this recipe, as shown in Listing 8.5 (which also declares the activities used by the rest of the chapter). An example of the output is shown in Figure 8.1.

Figure 8.1 Scrambled image

Listing 8.5. AndroidManifest.xml


<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="cc.dividebyzero.android.cookbook.chapter8"
    android:versionCode="1"
    android:versionName="1.0">
    <uses-sdk android:minSdkVersion="8" android:targetSdkVersion="15" />
    <application android:label="@string/app_name"
        android:icon="@drawable/ic_launcher"
        android:theme="@style/AppTheme">
        <activity android:name=".Chapter8">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
        <activity android:name=".ListFiles">
            <intent-filter>
                <action android:name="android.intent.action.PICK" />
                <category android:name="android.intent.category.DEFAULT" />
            </intent-filter>
        </activity>
        <activity android:name=".audio.AudioPlayback" />
        <activity android:name=".audio.AudioRecording" />
        <activity android:name=".audio.AudioSoundPool" />

        <activity android:name=".video.VideoViewActivity" />
        <activity android:name=".video.VideoPlayback" />
        <activity android:name=".image.ImageManipulation" />
    </application>
</manifest>


Audio

There are two distinct frameworks for recording and playing audio. The choice of which to use depends on the application:

- MediaPlayer/MediaRecorder—This is the standard way to manipulate audio, but the data must be file- or stream-based. The framework creates its own thread for processing. SoundPool uses this framework.

- AudioTrack/AudioRecord—This pair provides direct access to raw audio and is useful for manipulating audio in memory, writing to the buffer while it is already playing, or any other usage that does not require a file or stream. It does not create its own thread for processing.

These methods are used in the following recipes.

Recipe: Choosing and Playing Back Audio Files

The MediaRecorder and MediaPlayer classes are used to record and play back either audio or video. This recipe focuses on audio, and the usage is straightforward. For playback, the steps are as follows:

1. Create an instance of the MediaPlayer:

MediaPlayer m_mediaPlayer = new MediaPlayer();

2. Specify the source of media. It can be created from a raw resource:

m_mediaPlayer = MediaPlayer.create(this, R.raw.my_music);

Another option is to select a file from the filesystem (which then also needs a prepare statement):

m_mediaPlayer.setDataSource(path);
m_mediaPlayer.prepare();

In any case, these statements need to be surrounded by a try-catch block because the specified resource might not exist.

3. Start playback of the audio:

m_mediaPlayer.start();

4. When the playback is done, stop the MediaPlayer and release the instance to free up resources:

m_mediaPlayer.stop();
m_mediaPlayer.release();

This recipe uses the same ListFiles activity shown in Listings 8.1 and 8.2 to create a selectable list of audio files for playback. It is assumed that audio files are in the /sdcard/music/ directory of the Android device, but this is configurable.

When the ListFiles activity returns a file, it is initialized as the MediaPlayer media source, and the method startMP() is called. This starts the MediaPlayer and sets the button text to show “Pause.” Similarly, the pauseMP() method pauses the MediaPlayer and sets the button text to show “Play.” At any time, the user can click the button to pause or continue the playback of the music.

In general, the MediaPlayer creates its own background thread and does not pause when the main activity pauses. This is reasonable behavior for a music player, but in general, the developer might want control over this. Therefore, for illustration purposes, in this recipe the music playback is paused and resumed along with the main activity by overriding the onPause() and onResume() methods. This is shown in Listing 8.6.

Listing 8.6. AudioPlayback.java


package cc.dividebyzero.android.cookbook.chapter8.audio;

import cc.dividebyzero.android.cookbook.chapter8.ListFiles;
import cc.dividebyzero.android.cookbook.chapter8.R;
import android.app.Activity;
import android.content.Intent;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.os.Environment;
import android.view.View;
import android.widget.Button;

public class AudioPlayback extends Activity {
    static final String MUSIC_DIR = "/music/";
    Button playPauseButton;

    private MediaPlayer m_mediaPlayer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        setContentView(R.layout.audio_playback);
        playPauseButton = (Button) findViewById(R.id.play_pause);

        m_mediaPlayer = new MediaPlayer();

        String musicDir = Environment.getExternalStorageDirectory()
                .getAbsolutePath() + MUSIC_DIR;

        //Show a list of music files to choose
        Intent i = new Intent(this, ListFiles.class);
        i.putExtra("directory", musicDir);
        startActivityForResult(i, 0);

        playPauseButton.setOnClickListener(new View.OnClickListener() {
            public void onClick(View view) {
                if (m_mediaPlayer.isPlaying()) {
                    //Pause and give option to resume
                    pauseMP();
                } else {
                    startMP();
                }
            }
        });
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode,
            Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == 0 && resultCode == RESULT_OK) {
            String tmp = data.getExtras().getString("clickedFile");

            try {
                m_mediaPlayer.setDataSource(tmp);
                m_mediaPlayer.prepare();
            } catch (Exception e) {
                e.printStackTrace();
            }
            startMP();
        }
    }

    void pauseMP() {
        playPauseButton.setText("Play");
        m_mediaPlayer.pause();
    }

    void startMP() {
        m_mediaPlayer.start();
        playPauseButton.setText("Pause");
    }

    boolean needToResume = false;

    @Override
    protected void onPause() {
        if (m_mediaPlayer != null && m_mediaPlayer.isPlaying()) {
            needToResume = true;
            pauseMP();
        }
        super.onPause();
    }

    @Override
    protected void onResume() {
        super.onResume();
        if (needToResume && m_mediaPlayer != null) {
            startMP();
        }
    }
}


The associated main XML layout with the Play/Pause button is shown in Listing 8.7.

Listing 8.7. res/layout/audio_playback.xml


<?xml version="1.0" encoding="utf-8"?>
<LinearLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="match_parent"
    android:layout_height="match_parent">
    <Button android:id="@+id/play_pause"
        android:text="Play"
        android:textSize="20sp"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />
</LinearLayout>


Recipe: Recording Audio Files

Recording audio using MediaRecorder is similar to playback from the previous recipe, except a few more things need to be specified (DEFAULT can also be used and is the same as the first choice in these lists):

- MediaRecorder.AudioSource:

  - MIC—Built-in microphone

  - VOICE_UPLINK—Transmitted audio during a voice call

  - VOICE_DOWNLINK—Received audio during a voice call

  - VOICE_CALL—Both uplink and downlink audio during a voice call

  - CAMCORDER—Microphone associated with a camera, if available

  - VOICE_RECOGNITION—Microphone tuned for voice recognition, if available

- MediaRecorder.OutputFormat:

  - THREE_GPP—3GPP media file format

  - MPEG_4—MPEG-4 media file format

  - AMR_NB—Adaptive multirate narrowband file format

- MediaRecorder.AudioEncoder:

  - AMR_NB—Adaptive multirate narrowband vocoder

The steps to record audio are as follows:

1. Create an instance of the MediaRecorder:

MediaRecorder m_Recorder = new MediaRecorder();

2. Specify the source of media, for example, the microphone:

m_Recorder.setAudioSource(MediaRecorder.AudioSource.MIC);

3. Set the output file format and encoding, such as:

m_Recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
m_Recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);

4. Set the path for the file to be saved:

m_Recorder.setOutputFile(path);

5. Prepare and start the recording:

m_Recorder.prepare();
m_Recorder.start();

These recording steps can be wired into an activity in the same way as the playback steps in the previous recipe.
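The five steps above can be sketched end to end. This is a minimal sketch, not the book's listing: the outputPath variable is an assumed writable file path, and because prepare() can throw an IOException, it is wrapped in a try-catch:

```java
//Sketch: record microphone audio to a 3GPP/AMR-NB file at outputPath
MediaRecorder m_Recorder = new MediaRecorder();
m_Recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
m_Recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
m_Recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
m_Recorder.setOutputFile(outputPath); //assumed path, e.g., on external storage
try {
    m_Recorder.prepare();
    m_Recorder.start();
} catch (IOException e) {
    e.printStackTrace();
}
//...later, when recording should end:
m_Recorder.stop();
m_Recorder.release();
```

Note that the setters must be called in this order (source, format, encoder, output file) before prepare().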

Recipe: Manipulating Raw Audio

The MediaRecorder/MediaPlayer framework is useful for most audio uses, but to manipulate raw audio straight from the microphone, process it without saving to a file, and/or play back raw audio, use AudioRecord/AudioTrack instead. First, set the permission in the AndroidManifest.xml file:

<uses-permission android:name="android.permission.RECORD_AUDIO" />

Then, the steps to record are the following:

1. Create an AudioRecord instance, specifying the following to the constructor:

- Audio source—Use one of the MediaRecorder.AudioSource choices described in the previous recipe; for example, use MediaRecorder.AudioSource.MIC.

- Sampling frequency in hertz—Use 44100 for CD-quality audio, or half-rates such as 22050 or 11025 (which are sufficient for voice and are the only sampling frequencies guaranteed to be supported).

- Channel configuration—Use AudioFormat.CHANNEL_IN_STEREO to record stereo sound and CHANNEL_IN_MONO to record mono sound.

- Audio encoding—Use either AudioFormat.ENCODING_PCM_8BIT for 8-bit quantization or AudioFormat.ENCODING_PCM_16BIT for 16-bit quantization.

- Buffer size in bytes—This is the total size of allotted memory in static mode or the size of chunks used in streaming mode. It must be at least getMinBufferSize() bytes.

2. Start recording from the AudioRecord instance.

3. Read audio data to memory audioData[] using one of the following methods:

read(short[] audioData, int offsetInShorts, int sizeInShorts)
read(byte[] audioData, int offsetInBytes, int sizeInBytes)
read(ByteBuffer audioData, int sizeInBytes)

4. Stop recording.

For example, the following is suitable for recording voice from the built-in microphone into a memory buffer myRecordedAudio, declared as a short[] (16 bits per sample). Using a short[] avoids having to worry about byte ordering when reassembling byte values into shorts. Note that at 11,025 samples per second, a buffer of 10,000 samples holds a little less than a second of audio:

short[] myRecordedAudio = new short[10000];
AudioRecord audioRecord = new AudioRecord(
        MediaRecorder.AudioSource.MIC, 11025,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        20000); //buffer size in bytes: 10,000 16-bit samples
audioRecord.startRecording();
audioRecord.read(myRecordedAudio, 0, 10000);
audioRecord.stop();
audioRecord.release();
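As a sanity check on the numbers above: duration is samples divided by the sample rate, and 16-bit mono PCM occupies 2 bytes per sample. The helper below is an illustrative sketch (the class and method names are hypothetical):

```java
class AudioMath {
    //Recording length in seconds for a given sample count and rate
    static double seconds(int samples, int sampleRateHz) {
        return (double) samples / sampleRateHz;
    }
    //16-bit mono PCM occupies 2 bytes per sample
    static int pcm16MonoBytes(int samples) {
        return samples * 2;
    }
}
```

So the 10,000-sample buffer above holds about 0.91 seconds of audio and occupies 20,000 bytes, which is the byte count passed to the AudioRecord constructor.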

Then, the steps to play back the audio are as follows:

1. Create an AudioTrack instance, specifying the following to the constructor:

- Stream type—Use AudioManager.STREAM_MUSIC for music playback to the speaker. Other choices are STREAM_VOICE_CALL, STREAM_SYSTEM, STREAM_RING, and STREAM_ALARM.

- Sampling frequency in hertz—This has the same meaning as during recording.

- Channel configuration—Use AudioFormat.CHANNEL_OUT_STEREO to play back stereo sound. There are many other choices, such as CHANNEL_OUT_MONO and CHANNEL_OUT_5POINT1 (for surround sound).

- Audio encoding—This has the same meaning as for recording.

- Buffer size in bytes—This is the size of the chunks of data played at a time.

- Buffer mode—Use AudioTrack.MODE_STATIC for short sounds that can fit fully in memory, avoiding transfer overhead. Otherwise, use AudioTrack.MODE_STREAM to write data to the hardware in buffer-sized chunks.

2. Start playback from the AudioTrack instance.

3. Write memory audioData[] to hardware using one of the following methods:

write(short[] audioData, int offsetInShorts, int sizeInShorts)
write(byte[] audioData, int offsetInBytes, int sizeInBytes)

4. Stop playback (optional).

For example, the following is suitable to play back the voice data in the previous record example:

AudioTrack audioTrack = new AudioTrack(
AudioManager.STREAM_MUSIC, 11025,
AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT, 4096,
AudioTrack.MODE_STREAM);
audioTrack.play();
audioTrack.write(myRecordedAudio, 0, 10000);
audioTrack.stop();
audioTrack.release();

This recipe uses these two options to record audio to memory and play it back. The layout specifies two buttons on the screen: one to record audio and another to play back that recorded audio, as declared in the main layout file shown in Listing 8.8.

Listing 8.8. audio_recording.xml


<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="match_parent"
    android:layout_height="match_parent">
    <TextView android:id="@+id/status"
        android:text="Ready" android:textSize="20sp"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />
    <Button android:id="@+id/record"
        android:text="Record for 5 seconds"
        android:textSize="20sp" android:layout_width="wrap_content"
        android:layout_height="wrap_content" />
    <Button android:id="@+id/play"
        android:text="Play" android:textSize="20sp"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />
</LinearLayout>


The main activity shown in Listing 8.9 first creates an OnClickListener for these buttons to record or play back the in-memory audio buffer. The onClick() callback method creates the appropriate background thread because neither AudioTrack nor AudioRecord should be run in the UI thread. For illustration, two different methods of creating the thread are shown: The record_thread() has a local thread with the UI updated through a Handler, and the play thread uses the main activity’s run() method.

The buffer is kept in memory. For illustration, the recording is kept to 5 seconds.

Listing 8.9. AudioRecording.java


package cc.dividebyzero.android.cookbook.chapter8.audio;

import cc.dividebyzero.android.cookbook.chapter8.R;
import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.Handler;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;

public class AudioRecording extends Activity implements Runnable {
    private TextView statusText;

    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.audio_recording);

        statusText = (TextView) findViewById(R.id.status);

        Button actionButton = (Button) findViewById(R.id.record);
        actionButton.setOnClickListener(new View.OnClickListener() {
            public void onClick(View view) {
                record_thread();
            }
        });

        Button replayButton = (Button) findViewById(R.id.play);
        replayButton.setOnClickListener(new View.OnClickListener() {
            public void onClick(View view) {
                Thread thread = new Thread(AudioRecording.this);
                thread.start();
            }
        });
    }

    String text_string;
    final Handler mHandler = new Handler();
    //Create runnable for posting status updates back to the UI thread
    final Runnable mUpdateResults = new Runnable() {
        public void run() {
            updateResultsInUi(text_string);
        }
    };

    private void updateResultsInUi(String update_txt) {
        statusText.setText(update_txt);
    }

    private void record_thread() {
        Thread thread = new Thread(new Runnable() {
            public void run() {
                text_string = "Starting";
                mHandler.post(mUpdateResults);

                record();

                text_string = "Done";
                mHandler.post(mUpdateResults);
            }
        });
        thread.start();
    }

    private int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
    int frequency = 11025; //hertz
    //getMinBufferSize() returns bytes; 50x gives roughly 5 seconds of audio
    int bufferSize = 50 * AudioTrack.getMinBufferSize(
            frequency,
            AudioFormat.CHANNEL_OUT_MONO,
            audioEncoding);
    //Create new AudioRecord object to record the audio
    public AudioRecord audioRecord = new AudioRecord(
            MediaRecorder.AudioSource.MIC,
            frequency,
            AudioFormat.CHANNEL_IN_MONO,
            audioEncoding,
            bufferSize);
    //Create new AudioTrack object w/same parameters as AudioRecord obj
    public AudioTrack audioTrack = new AudioTrack(
            AudioManager.STREAM_MUSIC,
            frequency,
            AudioFormat.CHANNEL_OUT_MONO,
            audioEncoding,
            4096,
            AudioTrack.MODE_STREAM);
    short[] buffer = new short[bufferSize];

    public void record() {
        try {
            audioRecord.startRecording();
            audioRecord.read(buffer, 0, bufferSize); //blocks until full
            audioRecord.stop();
            audioRecord.release();
        } catch (Throwable t) {
            Log.e("AudioExamplesRaw", "Recording Failed", t);
        }
    }

    public void run() { //Play audio using runnable activity
        audioTrack.play();
        int i = 0;
        while (i < bufferSize) {
            audioTrack.write(buffer, i++, 1); //stream one sample at a time
        }
    }

    @Override
    protected void onPause() {
        if (audioTrack != null
                && audioTrack.getPlayState() == AudioTrack.PLAYSTATE_PLAYING) {
            audioTrack.pause();
        }
        super.onPause();
    }
}


Recipe: Using Sound Resources Efficiently

The SoundPool class combines the smaller memory requirements of compressed audio files with the lower-latency playback of raw audio. It uses the MediaPlayer service to decode audio and provides methods to repeat sound buffers and to speed them up or slow them down.

Usage is similar to the other sound methods described in previous recipes: initialize, load a resource, play, and release. However, note that the SoundPool launches a background thread, so a play() right after a load() might not produce sound if the resource does not have time to load. Similarly, a release() called right after a play() releases the resource before it can be played. Therefore, it is best to tie SoundPool resources to activity lifecycle events (such as onCreate and onPause) and tie the playback of SoundPool resources to a user-generated event (such as a button press or advancement in a game).
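One way to avoid the play-before-load race is SoundPool's setOnLoadCompleteListener() (available since API level 8). This is a hedged sketch, not the recipe's code: the soundLoaded flag is an assumed boolean field of the enclosing activity.

```java
final SoundPool sp = new SoundPool(10, AudioManager.STREAM_MUSIC, 0);
sp.setOnLoadCompleteListener(new SoundPool.OnLoadCompleteListener() {
    public void onLoadComplete(SoundPool pool, int sampleId, int status) {
        soundLoaded = (status == 0); //0 means the sample decoded successfully
    }
});
final int soundId = sp.load(this, R.raw.drum_beat, 1);
//Later, in a button handler:
if (soundLoaded) {
    sp.play(soundId, 1f, 1f, 1, 0, 1f);
}
```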

Using a layout like the one in Listing 8.7 (saved here as audio_soundpool.xml), the main activity of this recipe is shown in Listing 8.10. A button press triggers the SoundPool to repeat a drumbeat eight times (the initial play plus seven repeats). The rate also alternates between half speed and double speed on successive button presses. Up to ten streams can play at once, which means that ten quick button presses can launch ten drumbeats playing simultaneously.

Listing 8.10. AudioSoundPool.java


package cc.dividebyzero.android.cookbook.chapter8.audio;

import cc.dividebyzero.android.cookbook.chapter8.R;
import android.app.Activity;
import android.media.AudioManager;
import android.media.SoundPool;
import android.os.Bundle;
import android.view.View;
import android.widget.Button;

public class AudioSoundPool extends Activity {
    static float rate = 0.5f;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        setContentView(R.layout.audio_soundpool);
        Button playDrumButton = (Button) findViewById(R.id.play_pause);

        //Up to 10 simultaneous streams on the music stream type
        final SoundPool mySP = new SoundPool(
                10,
                AudioManager.STREAM_MUSIC,
                0);
        final int soundId = mySP.load(this, R.raw.drum_beat, 1);

        playDrumButton.setOnClickListener(new View.OnClickListener() {
            public void onClick(View view) {
                rate = 1 / rate; //alternate between half and double speed
                mySP.play(soundId, 1f, 1f, 1, 7, rate); //7 repeats = 8 plays
            }
        });
    }
}


Recipe: Adding Media and Updating Paths

After an application creates a newly recorded audio file, that file can be registered with the system as available for use. This is done using the MediaStore class. For example, Listing 8.11 shows how to register a saved audio file myFile as a possible ringtone, notification, and alarm, but not to be seen by an MP3 player (because IS_MUSIC is false).

Listing 8.11. Example of Registering an Audio File to the System


//Reload MediaScanner to search for media and update paths
sendBroadcast(new Intent(Intent.ACTION_MEDIA_MOUNTED,
        Uri.parse("file://"
                + Environment.getExternalStorageDirectory())));
ContentValues values = new ContentValues();
values.put(MediaStore.MediaColumns.DATA, myFile.getAbsolutePath());
values.put(MediaStore.MediaColumns.TITLE, myFile.getName());
values.put(MediaStore.MediaColumns.DATE_ADDED,
        System.currentTimeMillis() / 1000); //seconds since the epoch
values.put(MediaStore.MediaColumns.MIME_TYPE,
        "audio/3gpp"); //MIME type of the recorded file
values.put(MediaStore.Audio.Media.ARTIST, SOME_ARTIST_HERE);
values.put(MediaStore.Audio.Media.IS_RINGTONE, true);
values.put(MediaStore.Audio.Media.IS_NOTIFICATION, true);
values.put(MediaStore.Audio.Media.IS_ALARM, true);
values.put(MediaStore.Audio.Media.IS_MUSIC, false);
ContentResolver contentResolver = getContentResolver();
Uri base = MediaStore.Audio.Media.INTERNAL_CONTENT_URI;
Uri newUri = contentResolver.insert(base, values);


Here, ContentValues is used to declare standard properties of the file, such as its TITLE and MIME_TYPE, and ContentResolver is used to create an entry in the MediaStore content database, which registers the file's path with the system.

Video

There are two different ways to display video. One uses the MediaPlayer framework, similar to the audio examples discussed previously. The other uses the VideoView class, which takes care of most of the work and is recommended for simpler use cases.

Recipe: Using the VideoView

Using a VideoView is very easy. Once it is declared in the XML layout and the layout has been loaded, all that needs to be done is to give the VideoView the URL of a video, and playback starts immediately. The view even shows an error dialog if the video format is not supported by the framework or another error occurs.

To make things easier for the user, there is a helper class called MediaController. It adds Play/Pause, Forward, and Rewind buttons and a seek bar. All that needs to be done is to hook the MediaController up to the VideoView as its anchor with setAnchorView(). This way, a full video player can be built with just a few lines of code, as can be seen in Listing 8.12.

Listing 8.12. VideoViewActivity.java


public class VideoViewActivity extends Activity {

    private static final String VIDEO_DIR =
            File.separator + "DCIM" + File.separator + "Camera";
    private VideoView videoView;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        setContentView(R.layout.video_view);
        videoView = (VideoView) findViewById(R.id.videoView1);
        MediaController controller = new MediaController(this);
        controller.setMediaPlayer(videoView);
        controller.setAnchorView(videoView);

        videoView.setMediaController(controller);

        String videoDir = Environment.getExternalStorageDirectory()
                .getAbsolutePath() + VIDEO_DIR;

        //Show a list of video files to choose
        Intent i = new Intent(this, ListFiles.class);
        i.putExtra("directory", videoDir);
        startActivityForResult(i, 0);
    }

    @Override
    protected void onActivityResult(int requestCode,
            int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == 0 && resultCode == RESULT_OK) {
            String path = data.getExtras().getString("clickedFile");

            videoView.setVideoPath(path);
            videoView.start();
        }
    }
}


The accompanying layout is seen in Listing 8.13.

Listing 8.13. video_view.xml


<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical" >

<VideoView
android:id="@+id/videoView1"
android:layout_width="match_parent"
android:layout_height="wrap_content"
/>

</LinearLayout>


Recipe: Video Playback Using the MediaPlayer

The MediaPlayer framework can also be used to play videos. The main difference from playing audio is that a surface for rendering the video frames must be provided. This is done using the SurfaceView class, which is added to the layout just beneath the Play/Pause button in Listing 8.14.

Listing 8.14. video_playback.xml


<?xml version="1.0" encoding="utf-8"?>
<LinearLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="match_parent"
    android:layout_height="match_parent">
    <Button android:id="@+id/play_pause"
        android:text="Play"
        android:textSize="20sp"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />
    <SurfaceView
        android:id="@+id/surface"
        android:layout_width="match_parent"
        android:layout_height="0dip"
        android:layout_weight="1"
        android:visibility="visible" />
</LinearLayout>


Creating the surface can take some time. Because of this, the SurfaceHolder.Callback interface is used to set the MediaPlayer's display after the surface has been created. Once that is done, videos can be played. If a video is started without attaching a display, or if setDisplay() is called with a null argument, only the audio track of the video is played.
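A minimal sketch of that hookup, assuming mediaPlayer is a MediaPlayer field of the activity that has already been given a data source:

```java
SurfaceView surface = (SurfaceView) findViewById(R.id.surface);
surface.getHolder().addCallback(new SurfaceHolder.Callback() {
    public void surfaceCreated(SurfaceHolder holder) {
        mediaPlayer.setDisplay(holder); //attach video output once the surface exists
    }
    public void surfaceChanged(SurfaceHolder holder,
            int format, int width, int height) { }
    public void surfaceDestroyed(SurfaceHolder holder) {
        mediaPlayer.setDisplay(null); //without a display, only audio plays
    }
});
```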