Media with JavaFX - Java 8 Recipes, 2nd Edition (2014)

CHAPTER 16. Media with JavaFX

JavaFX provides a media-rich API capable of playing audio and video. The Media API allows developers to incorporate audio and video into their Rich Client Applications. One of the main benefits of the Media API is its cross-platform abilities when distributing media content via the web. With a range of devices (tablets, music players, TVs, and so on) that need to play multimedia content, the need for a cross-platform API is essential.

Imagine a not-so-distant future where your TV or wall is capable of interacting with you in ways that you’ve never dreamed possible. For instance, while viewing a movie you could select items or clothing used in the movie to be immediately purchased, all from the comfort of your home. With this future in mind, developers seek to enhance the interactive qualities of their media-based applications.

In this chapter you will learn how to play audio and video in an interactive way. Find your seats for Act III of JavaFX as audio and video take center stage—as depicted in Figure 16-1.

9781430268277_Fig16-01.jpg

Figure 16-1. Audio and video

16-1. Playing Audio

Problem

You want to play music accompanied by a graphical visualization.

Solution

Create an MP3 player by utilizing the following classes:

· javafx.scene.media.Media

· javafx.scene.media.MediaPlayer

· javafx.scene.media.AudioSpectrumListener

The following source code is an implementation of a simple MP3 player:

package org.java8recipes.chapter16.recipe16_01;

import java.io.File;
import java.util.Random;
import javafx.application.Application;
import javafx.application.Platform;
import javafx.geometry.Point2D;
import javafx.scene.Group;
import javafx.scene.Node;
import javafx.scene.Scene;
import javafx.scene.input.DragEvent;
import javafx.scene.input.Dragboard;
import javafx.scene.input.MouseEvent;
import javafx.scene.input.TransferMode;
import javafx.scene.media.AudioSpectrumListener;
import javafx.scene.media.Media;
import javafx.scene.media.MediaPlayer;
import javafx.scene.paint.Color;
import javafx.scene.shape.Arc;
import javafx.scene.shape.ArcType;
import javafx.scene.shape.Circle;
import javafx.scene.shape.Line;
import javafx.scene.shape.Rectangle;
import javafx.scene.text.Text;
import javafx.stage.Stage;
import javafx.stage.StageStyle;

public class PlayingAudio extends Application {

private MediaPlayer mediaPlayer;
private Point2D anchorPt;
private Point2D previousLocation;

/**
* @param args the command line arguments
*/
public static void main(String[] args) {
Application.launch(args);
}

@Override
public void start(final Stage primaryStage) {
primaryStage.setTitle("Chapter 16-1 Playing Audio");
primaryStage.centerOnScreen();
primaryStage.initStyle(StageStyle.TRANSPARENT);

Group root = new Group();
Scene scene = new Scene(root, 551, 270, Color.rgb(0, 0, 0, 0));

// application area
Rectangle applicationArea = new Rectangle();
applicationArea.setArcWidth(20);
applicationArea.setArcHeight(20);
applicationArea.setFill(Color.rgb(0, 0, 0, .80));
applicationArea.setX(0);
applicationArea.setY(0);
applicationArea.setStrokeWidth(2);
applicationArea.setStroke(Color.rgb(255, 255, 255, .70));

root.getChildren().add(applicationArea);
applicationArea.widthProperty().bind(scene.widthProperty());
applicationArea.heightProperty().bind(scene.heightProperty());

final Group phaseNodes = new Group();
root.getChildren().add(phaseNodes);

// starting initial anchor point
scene.setOnMousePressed((MouseEvent event) -> {
anchorPt = new Point2D(event.getScreenX(), event.getScreenY());
});

// dragging the entire stage
scene.setOnMouseDragged((MouseEvent event) -> {
if (anchorPt != null && previousLocation != null) {
primaryStage.setX(previousLocation.getX() + event.getScreenX() - anchorPt.getX());
primaryStage.setY(previousLocation.getY() + event.getScreenY() - anchorPt.getY());
}
});

// set the current location
scene.setOnMouseReleased((MouseEvent event) -> {
previousLocation = new Point2D(primaryStage.getX(), primaryStage.getY());
});

// Dragging over surface
scene.setOnDragOver((DragEvent event) -> {
Dragboard db = event.getDragboard();
if (db.hasFiles()) {
event.acceptTransferModes(TransferMode.COPY);
} else {
event.consume();
}
});

// Dropping over surface
scene.setOnDragDropped((DragEvent event) -> {
Dragboard db = event.getDragboard();
boolean success = false;
if (db.hasFiles()) {
success = true;
String filePath = null;
for (File file : db.getFiles()) {
filePath = file.getAbsolutePath();
System.out.println(filePath);
}
// play file
Media media = new Media(new File(filePath).toURI().toString());

if (mediaPlayer != null) {
mediaPlayer.stop();
}

mediaPlayer = new MediaPlayer(media);

// Maintained Inner Class for Tutorial, could be changed to lambda
mediaPlayer.setAudioSpectrumListener(new AudioSpectrumListener() {
@Override
public void spectrumDataUpdate(double timestamp, double duration,
float[] magnitudes, float[] phases) {
phaseNodes.getChildren().clear();
int i = 0;
int x = 10;
int y = 150;
final Random rand = new Random(System.currentTimeMillis());
for (float phase : phases) {
int red = rand.nextInt(255);
int green = rand.nextInt(255);
int blue = rand.nextInt(255);

Circle circle = new Circle(10);
circle.setCenterX(x + i);
circle.setCenterY(y + (phase * 100));
circle.setFill(Color.rgb(red, green, blue, .70));
phaseNodes.getChildren().add(circle);
i += 5;
}
}
});

mediaPlayer.setOnReady(mediaPlayer::play);
}

event.setDropCompleted(success);
event.consume();
});

// create slide controls
final Group buttonGroup = new Group();

// rounded rect
Rectangle buttonArea = new Rectangle();
buttonArea.setArcWidth(15);
buttonArea.setArcHeight(20);
buttonArea.setFill(new Color(0, 0, 0, .55));
buttonArea.setX(0);
buttonArea.setY(0);
buttonArea.setWidth(60);
buttonArea.setHeight(30);
buttonArea.setStroke(Color.rgb(255, 255, 255, .70));

buttonGroup.getChildren().add(buttonArea);
// stop audio control
Rectangle stopButton = new Rectangle();
stopButton.setArcWidth(5);
stopButton.setArcHeight(5);
stopButton.setFill(Color.rgb(255, 255, 255, .80));
stopButton.setX(0);
stopButton.setY(0);
stopButton.setWidth(10);
stopButton.setHeight(10);
stopButton.setTranslateX(15);
stopButton.setTranslateY(10);
stopButton.setStroke(Color.rgb(255, 255, 255, .70));

stopButton.setOnMousePressed((MouseEvent me) -> {
if (mediaPlayer != null) {
mediaPlayer.stop();
}
});
buttonGroup.getChildren().add(stopButton);

// play control
final Arc playButton = new Arc();
playButton.setType(ArcType.ROUND);
playButton.setCenterX(12);
playButton.setCenterY(16);
playButton.setRadiusX(15);
playButton.setRadiusY(15);
playButton.setStartAngle(180 - 30);
playButton.setLength(60);
playButton.setFill(new Color(1, 1, 1, .90));
playButton.setTranslateX(40);

playButton.setOnMousePressed((MouseEvent me) -> {
mediaPlayer.play();
});

// pause control
final Group pause = new Group();
final Circle pauseButton = new Circle();
pauseButton.setCenterX(12);
pauseButton.setCenterY(16);
pauseButton.setRadius(10);
pauseButton.setStroke(new Color(1, 1, 1, .90));
pauseButton.setTranslateX(30);

final Line firstLine = new Line();
firstLine.setStartX(6);
firstLine.setStartY(16 - 10);
firstLine.setEndX(6);
firstLine.setEndY(16 - 2);
firstLine.setStrokeWidth(3);
firstLine.setTranslateX(34);
firstLine.setTranslateY(6);
firstLine.setStroke(new Color(1, 1, 1, .90));

final Line secondLine = new Line();
secondLine.setStartX(6);
secondLine.setStartY(16 - 10);
secondLine.setEndX(6);
secondLine.setEndY(16 - 2);
secondLine.setStrokeWidth(3);
secondLine.setTranslateX(38);
secondLine.setTranslateY(6);
secondLine.setStroke(new Color(1, 1, 1, .90));

pause.getChildren().addAll(pauseButton, firstLine, secondLine);

pause.setOnMousePressed((MouseEvent me) -> {
if (mediaPlayer != null) {
buttonGroup.getChildren().remove(pause);
buttonGroup.getChildren().add(playButton);
mediaPlayer.pause();
}
});

playButton.setOnMousePressed((MouseEvent me) -> {
if (mediaPlayer != null) {
buttonGroup.getChildren().remove(playButton);
buttonGroup.getChildren().add(pause);
mediaPlayer.play();
}
});

buttonGroup.getChildren().add(pause);
// move button group when scene is resized
buttonGroup.translateXProperty().bind(scene.widthProperty()
.subtract(buttonArea.getWidth() + 6));
buttonGroup.translateYProperty().bind(scene.heightProperty()
.subtract(buttonArea.getHeight() + 6));
root.getChildren().add(buttonGroup);

// close button
final Group closeApp = new Group();
Circle closeButton = new Circle();
closeButton.setCenterX(5);
closeButton.setCenterY(0);
closeButton.setRadius(7);
closeButton.setFill(Color.rgb(255, 255, 255, .80));

Node closeXmark = new Text(2, 4, "X");
closeApp.translateXProperty().bind(scene.widthProperty().subtract(15));
closeApp.setTranslateY(10);
closeApp.getChildren().addAll(closeButton, closeXmark);
closeApp.setOnMouseClicked((MouseEvent event) -> {
Platform.exit();
});

root.getChildren().add(closeApp);

primaryStage.setScene(scene);
primaryStage.show();
previousLocation = new Point2D(primaryStage.getX(), primaryStage.getY());

}
}

Figure 16-2 shows a JavaFX MP3 player with visualizations.

9781430268277_Fig16-02.jpg

Figure 16-2. JavaFX MP3 player

How It Works

Before diving into the code, here is how to operate the MP3 player. Users can drag and drop an audio file into the application area to play it. Located on the lower right of the application are buttons to stop, pause, and resume play of audio media. (The button controls are shown in Figure 16-2.) As the music plays, the user will also notice randomly colored balls bouncing around to the music. Once users are done listening to the music, they can quit the application by clicking the white rounded close button located in the upper-right corner.

This recipe is similar to Recipe 15-1, in which you learned how to use the drag-and-drop desktop metaphor to load files into a JavaFX application. Instead of image files, however, the user drops audio files. JavaFX currently supports the following audio file formats: .mp3, .wav, and .aiff.
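Because the drag-and-drop handler receives whatever file the user drops, it can be worth rejecting unsupported extensions up front rather than letting Media fail at load time. The following is a minimal sketch of such a check (the class and method names are hypothetical, not part of the recipe's source); the extension list mirrors the formats named above:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Locale;

public class AudioFormatCheck {

    // Audio extensions supported by JavaFX, per the text above
    private static final List<String> SUPPORTED =
            Arrays.asList(".mp3", ".wav", ".aiff");

    /** Returns true if the file name ends with a supported audio extension. */
    public static boolean isSupportedAudio(String fileName) {
        String lower = fileName.toLowerCase(Locale.ROOT);
        return SUPPORTED.stream().anyMatch(lower::endsWith);
    }

    public static void main(String[] args) {
        System.out.println(isSupportedAudio("song.MP3")); // true
        System.out.println(isSupportedAudio("clip.ogg")); // false
    }
}
```

A check like this could run inside the drag-over handler, so unsupported files are never accepted for transfer in the first place.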

Following the same look and feel, you will use the same style as Recipe 15-1. In this recipe, you modify the button controls to resemble buttons, similar to many media player applications. When the pause button is pressed, it will pause the audio media from playing and toggle to the play button control, thus allowing the users to resume. As an added bonus, the MP3 player will appear as an irregular shaped, semitransparent window without borders that can also be dragged around the desktop using the mouse. Now that you know how the music player will operate, let’s walk through the code.

First, you need to create instance variables that will maintain state information for the lifetime of the application. Table 16-1 describes all the instance variables used in this music player application. The first variable is a reference to a media player (MediaPlayer) object that will be created in conjunction with a Media object containing an audio file. Next, you create an anchorPt variable used to save the starting coordinate of a mouse press when the users begin to drag the window across the screen. When calculating the upper-left bounds of the application window during a mouse-dragged operation, the previousLocation variable will contain the previous window’s screen X and Y coordinates.

Table 16-1. MP3 Player Application Instance Variables

Table16-1.jpg

In previous chapters relating to GUIs, you saw that GUI applications normally contain a title bar and windowed borders surrounding the scene. Here, I wanted to raise the bar a little by showing you how to create irregularly shaped, semitransparent windows, thus making things look more hip or modern. As you begin to create the media player, you'll notice in the start() method that you prepare the Stage object by initializing its style to StageStyle.TRANSPARENT. After you initialize the style to StageStyle.TRANSPARENT, the window is undecorated, with the entire window area's opacity set to zero (invisible). The following code shows you how to create a transparent window without a title bar or windowed borders:

primaryStage.initStyle(StageStyle.TRANSPARENT);

With the invisible stage, you create a rounded rectangular region that will be the application's surface or main content area. Next, notice that the rectangle's width and height are bound to the scene object in case the window is resized. Because the window isn't going to be resized, the bind isn't strictly necessary (it will be needed, however, in Recipe 16-2, when you get a chance to enlarge your video screen to take on a full-screen mode).

After creating a black, semitransparent, rounded rectangular area (applicationArea), you'll create a simple Group object to hold all the randomly colored Circle nodes that will show off graphical visualizations while the audio is being played. Later, you will see how the phaseNodes (Group) variable is updated based on sound information using an AudioSpectrumListener.

Next, you add EventHandler<MouseEvent> instances to the Scene object (the example uses lambda expressions) to monitor mouse events as the user drags the window around the screen. The first event in this scenario is a mouse press, which will save the cursor’s current (X, Y) coordinates to the variable anchorPt. The following code is adding an EventHandler to the mouse-press property of the Scene:

// starting initial anchor point
scene.setOnMousePressed((MouseEvent event) -> {
anchorPt = new Point2D(event.getScreenX(), event.getScreenY());
});

After implementing the mouse-press event handler, you can create an EventHandler to the Scene’s mouse-drag property. The mouse–drag event handler will update and position the application window (Stage) dynamically, based on the previous window’s location (upper-left corner) along with the anchorPt variable. Shown here is an event handler responsible for the mouse-drag event on the Scene object:

// dragging the entire stage
scene.setOnMouseDragged((MouseEvent event) -> {
if (anchorPt != null && previousLocation != null) {
primaryStage.setX(previousLocation.getX() + event.getScreenX() - anchorPt.getX());
primaryStage.setY(previousLocation.getY() + event.getScreenY() - anchorPt.getY());
}
});
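The drag arithmetic above can be stated in plain Java (no JavaFX types, so the sketch runs anywhere): the new stage origin is the window's last resting position offset by how far the cursor has traveled since the initial press. The names below are illustrative only.

```java
public class DragMath {

    /**
     * Computes one coordinate of the new window origin:
     * last resting position plus the cursor's travel since the press.
     */
    public static double newOrigin(double previousLocation,
                                   double screenCursor,
                                   double anchor) {
        return previousLocation + screenCursor - anchor;
    }

    public static void main(String[] args) {
        // Window last sat at x=100; press at screen x=400, drag to x=425.
        System.out.println(newOrigin(100, 425, 400)); // 125.0
    }
}
```

The same formula is applied independently to the X and Y coordinates in the mouse-dragged handler.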

You will also want to handle the mouse-release event. Once the mouse is released, the event handler updates the previousLocation variable for subsequent mouse-drag events to move the application window about the screen. The following code snippet updates the previousLocation variable:

// set the current location
scene.setOnMouseReleased((MouseEvent event) -> {
previousLocation = new Point2D(primaryStage.getX(), primaryStage.getY());
});

Next, you will implement the drag-and-drop scenario to load an audio file from the file system (using the File Manager). The handling is similar to Recipe 15-1, in which you created an EventHandler to handle DragEvents; instead of loading image files, however, you'll be loading audio files from the host file system. For brevity, I simply mention the code lines of the drag-dropped event handler. Once the audio file is available, you will create a Media object by passing in the file as a URI. The following code snippet shows how to create a Media object:

Media media = new Media(new File(filePath).toURI().toString());
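The reason for the toURI().toString() dance is that Media accepts a URI string (such as "file:/home/user/music/track.mp3"), not a bare file-system path. A tiny sketch of just that conversion, runnable without JavaFX (the path shown is hypothetical):

```java
import java.io.File;

public class MediaUriDemo {

    /** Converts a file to the URI string form that Media expects. */
    public static String toMediaUri(File file) {
        return file.toURI().toString();
    }

    public static void main(String[] args) {
        String uri = toMediaUri(new File("music/track.mp3"));
        System.out.println(uri.startsWith("file:"));   // true
        System.out.println(uri.endsWith("track.mp3")); // true
    }
}
```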

Once you have created a Media object you will have to create an instance of a MediaPlayer in order to play the sound file. Both the Media and MediaPlayer objects are immutable, which is why new instances of each will be created every time the user drags a file into the application. Next, you will check the instance variable mediaPlayer for a previous instance to make sure it is stopped before creating a new MediaPlayer instance. The following code checks for a prior media player to be stopped:

if (mediaPlayer != null) {
mediaPlayer.stop();
}

So, here is where you create a MediaPlayer instance. A MediaPlayer object is responsible for controlling the playing of media objects. Notice that a MediaPlayer treats sound and video media the same in terms of playing, pausing, and stopping. When creating a media player, you specify the media and an AudioSpectrumListener. (Setting the media player's autoPlay attribute to true would play the audio immediately after it loads; this example instead starts playback from the onReady handler.) So, what exactly is this type of listener, you say? Well, according to the Javadoc, it is an observer receiving periodic updates of the audio spectrum. In layman's terms, it is the audio media's sound data, such as volume, tempo, and so on. To create an instance of an AudioSpectrumListener, you create an inner class that overrides the method spectrumDataUpdate(). You could also have used a lambda expression here; the example uses the inner class to provide better insight into the functionality. Table 16-2 lists all the inbound parameters for the audio spectrum listener's method. For more details, refer to the Javadoc at http://docs.oracle.com/javase/8/javafx/api/javafx/scene/media/AudioSpectrumListener.html.

Table 16-2. The AudioSpectrumListener's Method spectrumDataUpdate() Inbound Parameters

Table16-2.jpg

In the example, randomly colored circle nodes are created, positioned, and placed on the scene based on the variable phases (array of floats). To draw each colored circle, the circle’s center X is incremented by five pixels and the circle’s center Y is added with each phase value multiplied by 100. Shown here is the code snippet that plots each randomly colored circle:

circle.setCenterX(x + i);
circle.setCenterY(y + (phase * 100));
... // setting the circle
i+=5;

Here is an inner class implementation of an AudioSpectrumListener:

new AudioSpectrumListener() {
@Override
public void spectrumDataUpdate(double timestamp, double duration,
float[] magnitudes, float[] phases) {

phaseNodes.getChildren().clear();
int i = 0;
int x = 10;
int y = 150;
final Random rand = new Random(System.currentTimeMillis());
for(float phase:phases) {
int red = rand.nextInt(255);
int green = rand.nextInt(255);
int blue = rand.nextInt(255);

Circle circle = new Circle(10);
circle.setCenterX(x + i);
circle.setCenterY(y + (phase * 100));
circle.setFill(Color.rgb(red, green, blue, .70));
phaseNodes.getChildren().add(circle);
i+=5;
}

}
};

Once the media player is created, you set a java.lang.Runnable on the onReady attribute; it is invoked when the media reaches the ready state. Once the ready event occurs, the run() method calls the media player object's play() method to begin the audio. With the drag-and-drop sequence completed, you appropriately notify the drag-and-drop system by invoking the event's setDropCompleted() method with a value of true. The following code snippet demonstrates how to begin the media player as soon as it is in a ready state, using a method reference in place of an explicit Runnable:

mediaPlayer.setOnReady(mediaPlayer::play);

Finally, you create buttons with JavaFX shapes to represent the stop, play, pause, and close buttons. When creating shapes or custom nodes, you can add event handlers to nodes in order to respond to mouse clicks. Although there are advanced ways to build custom controls in JavaFX, this example uses custom-built button icons made from simple rectangles, arcs, circles, and lines. To see more advanced ways to create custom controls, refer to the Javadoc on the Skinnable API or to Recipe 16-5. To attach an event handler for a mouse press, simply call the setOnMousePressed() method, passing in an EventHandler<MouseEvent> instance. The following code demonstrates adding an EventHandler to respond to a mouse press on the stopButton node:

stopButton.setOnMousePressed((MouseEvent me) -> {
if (mediaPlayer != null) {
mediaPlayer.stop();
}
});

Because all the buttons use the same code snippet, only the method calls that each button will perform on the media player are listed. The last button, Close, isn’t related to the media player, but it provides a way to exit the MP3 player application. The following actions are responsible for stopping, pausing, playing, and exiting the MP3 player application:

Stop - mediaPlayer.stop();
Pause - mediaPlayer.pause();
Play - mediaPlayer.play();
Close - Platform.exit();
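Each button above boils down to a single call, so the wiring can be summed up as a small dispatch table. The following sketch is not from the recipe's source; a tiny hand-rolled Player interface stands in for MediaPlayer so the idea can be shown (and run) without JavaFX on the classpath:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ButtonActions {

    /** Minimal stand-in for the subset of MediaPlayer the buttons use. */
    interface Player {
        void stop();
        void pause();
        void play();
    }

    /** Maps each button name to the single action it performs. */
    public static Map<String, Runnable> wire(Player player, Runnable exit) {
        Map<String, Runnable> actions = new LinkedHashMap<>();
        actions.put("Stop", player::stop);
        actions.put("Pause", player::pause);
        actions.put("Play", player::play);
        actions.put("Close", exit);
        return actions;
    }

    public static void main(String[] args) {
        StringBuilder log = new StringBuilder();
        Player fake = new Player() {
            public void stop()  { log.append("stop;"); }
            public void pause() { log.append("pause;"); }
            public void play()  { log.append("play;"); }
        };
        Map<String, Runnable> actions = wire(fake, () -> log.append("exit;"));
        actions.get("Play").run();
        actions.get("Pause").run();
        System.out.println(log); // play;pause;
    }
}
```

In the real application, each Runnable's body is simply the statement shown in the list above, installed via setOnMousePressed().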

16-2. Playing Video

Problem

You want to view a video file complete with controls to play, pause, stop, and seek.

Solution

Create a video media player by utilizing the following classes:

· javafx.scene.media.Media

· javafx.scene.media.MediaPlayer

· javafx.scene.media.MediaView

The following code is an implementation of a JavaFX basic video player:

public void start(final Stage primaryStage) {
primaryStage.setTitle("Chapter 16-2 Playing Video");
primaryStage.centerOnScreen();
primaryStage.initStyle(StageStyle.TRANSPARENT);

final Group root = new Group();
final Scene scene = new Scene(root, 540, 300, Color.rgb(0, 0, 0, 0));

// rounded rectangle with slightly transparent
Node applicationArea = createBackground(scene);
root.getChildren().add(applicationArea);

// allow the user to drag window on the desktop
attachMouseEvents(scene, primaryStage);

// allow the user to see the progress of the video playing
progressSlider = createSlider(scene);
root.getChildren().add(progressSlider);

// Dragging over surface
scene.setOnDragOver((DragEvent event) -> {
Dragboard db = event.getDragboard();
if (db.hasFiles() || db.hasUrl() || db.hasString()) {
event.acceptTransferModes(TransferMode.COPY);
if (mediaPlayer != null) {
mediaPlayer.stop();
}
} else {
event.consume();
}
});

// update slider as video is progressing (later removal)
progressListener = (ObservableValue<? extends Duration> observable,
Duration oldValue, Duration newValue) -> {
progressSlider.setValue(newValue.toSeconds());
};

// Dropping over surface
scene.setOnDragDropped((DragEvent event) -> {
Dragboard db = event.getDragboard();
boolean success = false;
URI resourceUrlOrFile = null;

// dragged from web browser address line?
if (db.hasContent(DataFormat.URL)) {
try {
resourceUrlOrFile = new URI(db.getUrl());
} catch (URISyntaxException ex) {
ex.printStackTrace();
}
} else if (db.hasFiles()) {
// dragged from the file system
String filePath = null;
for (File file:db.getFiles()) {
filePath = file.getAbsolutePath();
}
resourceUrlOrFile = new File(filePath).toURI();
success = true;
}
// load media
Media media = new Media(resourceUrlOrFile.toString());

// stop previous media player and clean up
if (mediaPlayer != null) {
mediaPlayer.stop();
mediaPlayer.currentTimeProperty().removeListener(progressListener);
mediaPlayer.setOnPaused(null);
mediaPlayer.setOnPlaying(null);
mediaPlayer.setOnReady(null);
}

// create a new media player
mediaPlayer = new MediaPlayer(media);

// as the media is playing move the slider for progress
mediaPlayer.currentTimeProperty().addListener(progressListener);

// play video when ready status
mediaPlayer.setOnReady(() -> {
progressSlider.setValue(1);
progressSlider.setMax(mediaPlayer.getMedia().getDuration().toMillis()/1000);
mediaPlayer.play();
});

// Lazy init media viewer
if (mediaView == null) {
mediaView = new MediaView();
mediaView.setMediaPlayer(mediaPlayer);
mediaView.setX(4);
mediaView.setY(4);
mediaView.setPreserveRatio(true);
mediaView.setOpacity(.85);
mediaView.setSmooth(true);

mediaView.fitWidthProperty().bind(scene.widthProperty().subtract(220));
mediaView.fitHeightProperty().bind(scene.heightProperty().subtract(30));

// make media view as the second node on the scene.
root.getChildren().add(1, mediaView);
}

// sometimes loading errors occur, print error when this happens
mediaView.setOnError((MediaErrorEvent event1) -> {
event1.getMediaError().printStackTrace();
});

mediaView.setMediaPlayer(mediaPlayer);

event.setDropCompleted(success);
event.consume();
});

// rectangular area holding buttons
final Group buttonArea = createButtonArea(scene);

// stop button will stop and rewind the media
Node stopButton = createStopControl();

// play button can resume or start a media
final Node playButton = createPlayControl();

// pause media play
final Node pauseButton = createPauseControl();

stopButton.setOnMousePressed((MouseEvent me) -> {
if (mediaPlayer!= null) {
buttonArea.getChildren().removeAll(pauseButton, playButton);
buttonArea.getChildren().add(playButton);
mediaPlayer.stop();
}
});
// pause media and swap button with play button
pauseButton.setOnMousePressed((MouseEvent me) -> {
if (mediaPlayer!=null) {
buttonArea.getChildren().removeAll(pauseButton, playButton);
buttonArea.getChildren().add(playButton);
mediaPlayer.pause();
paused = true;
}
});

// play media and swap button with pause button
playButton.setOnMousePressed((MouseEvent me) -> {
if (mediaPlayer != null) {
buttonArea.getChildren().removeAll(pauseButton, playButton);
buttonArea.getChildren().add(pauseButton);
paused = false;
mediaPlayer.play();
}
});

// add stop button to button area
buttonArea.getChildren().add(stopButton);

// set pause button as default
buttonArea.getChildren().add(pauseButton);

// add buttons
root.getChildren().add(buttonArea);

// create a close button
Node closeButton= createCloseButton(scene);
root.getChildren().add(closeButton);

primaryStage.setOnShown((WindowEvent we) -> {
previousLocation = new Point2D(primaryStage.getX(), primaryStage.getY());
});

primaryStage.setScene(scene);
primaryStage.show();

}

Following is the attachMouseEvents() method, which adds an EventHandler to the scene so the video player can enter full-screen mode.

private void attachMouseEvents(Scene scene, final Stage primaryStage) {

// Full screen toggle
scene.setOnMouseClicked((MouseEvent event) -> {
if (event.getClickCount() == 2) {
primaryStage.setFullScreen(!primaryStage.isFullScreen());
}
});
... // the rest of the EventHandlers
}

The following method creates a slider control with a ChangeListener to enable the users to search backward and forward through the video:

private Slider createSlider(Scene scene) {
Slider slider = new Slider();
slider.setMin(0);
slider.setMax(100);
slider.setValue(1);
slider.setShowTickLabels(true);
slider.setShowTickMarks(true);

slider.valueProperty().addListener((ObservableValue<? extends Number> observable,
Number oldValue, Number newValue) -> {
if (paused) {
long dur = newValue.intValue() * 1000;
mediaPlayer.seek(new Duration(dur));
}
});

slider.translateYProperty().bind(scene.heightProperty().subtract(30));
return slider;
}

Figure 16-3 depicts the JavaFX basic video player with a slider control.

9781430268277_Fig16-03.jpg

Figure 16-3. JavaFX basic video player

How It Works

To create a video player, you will model the application similar to the example in Recipe 16-1 by reusing the same application features such as drag-and-drop files, media button controls, and so on. For the sake of clarity, I took the previous recipe and moved much of the UI code into convenience functions so you will be able to focus on the Media APIs without getting lost in the UI code. The rest of the recipes in this chapter consist of adding simple features to the JavaFX basic media player created in this recipe. This being said, the code snippets in the following recipes will be brief, consisting only of the necessary code for each new desired feature.

It is important to note that the JavaFX media player supports various media formats. The supported formats are as follows:

· AIFF

· FXM, FLV

· HLS (*)

· MP3

· MP4

· WAV

For a complete summary of the supported media types, see the online documentation at http://docs.oracle.com/javase/8/javafx/api/javafx/scene/media/package-summary.html.

Just like the audio player created in the last recipe, the JavaFX basic video player has the same basic media controls, including stop, pause, and play. In addition to these simple controls, you’ve added new capabilities such as seeking and full-screen mode.

When playing a video you’ll need a view area (javafx.scene.media.MediaView) to show it. You also create a slider control to monitor the progress of the video, which is located at the lower-left portion of the application shown in Figure 16-3. The slider control allows the users to seek backward and forward through the video. One last bonus feature is enabling the video to become full screen by double-clicking the application window. To restore the window, users repeat the double-click or press Escape.

To quickly get started, let's jump into the code. After setting the stage in the start() method, you create a black semitransparent background by calling the createBackground() method (applicationArea). Next, the attachMouseEvents() method is invoked to set up the EventHandlers that enable the user to drag the application window around the desktop. Another EventHandler attached to the scene allows the user to switch to full-screen mode. A conditional checks for a double-click in the application window; once the double-click is detected, the Stage's setFullScreen() method is invoked with a Boolean value opposite the currently set value. Shown here is the code needed to make a window go to full-screen mode:

// Full screen toggle
scene.setOnMouseClicked((MouseEvent event) -> {
if (event.getClickCount() == 2) {
primaryStage.setFullScreen(!primaryStage.isFullScreen());
}
});

As you continue the steps inside the start() method, a slider control is created by calling the convenience method createSlider(). The createSlider() method instantiates a Slider control and adds a ChangeListener to move the slider as the video is playing. The ChangeListener's changed() method is invoked any time the slider's value changes. Once the changed() method is invoked, you have an opportunity to see the old and new values. The following code creates a ChangeListener to update the slider as the video is being played:

// update slider as video is progressing (later removal)
progressListener = (ObservableValue<? extends Duration> observable,
Duration oldValue, Duration newValue) -> {
progressSlider.setValue(newValue.toSeconds());
};

After creating the progress listener (progressListener), the dragged-dropped EventHandler for the scene is created.

The goal is to seek only when the user is actually repositioning the slider, not when the progress listener advances it during normal playback. Once that condition is confirmed, you obtain the new value and convert it to milliseconds. The dur variable is used to make the mediaPlayer seek to that position in the video as the user slides the control left or right. The ChangeListener's changed() method is invoked any time the slider's value changes. The following code is responsible for moving the seek position in the video based on the user moving the slider:

slider.valueProperty().addListener((ObservableValue<? extends Number> observable,
Number oldValue, Number newValue) -> {
if (slider.isPressed()) {
long dur = newValue.intValue() * 1000;
mediaPlayer.seek(new Duration(dur));
}
});

Moving right along, you next implement a drag-dropped EventHandler to handle the media file being dropped into the application window area. Here the example first checks to see whether there was a previous mediaPlayer. If there was, the previous mediaPlayer object is stopped and cleanup is performed:

// stop previous media player and clean up
if (mediaPlayer != null) {
mediaPlayer.stop();
mediaPlayer.currentTimeProperty().removeListener(progressListener);
mediaPlayer.setOnPaused(null);
mediaPlayer.setOnPlaying(null);
mediaPlayer.setOnReady(null);
}
...

// play video when ready status
mediaPlayer.setOnReady(() -> {
progressSlider.setValue(1);
progressSlider.setMax(mediaPlayer.getMedia().getDuration().toMillis() / 1000);
mediaPlayer.play();
});// setOnReady()

As with the audio player, you create a Runnable instance to be run when the media player is in a ready state. You’ll notice also that the progressSlider control uses values in seconds.
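The unit conversions above can be made explicit: the slider's maximum is set from the media duration in seconds (toMillis() / 1000), and a seek converts the slider's seconds value back to milliseconds. A plain-Java sketch of just that arithmetic (illustrative names, no JavaFX types):

```java
public class SliderUnits {

    /** Slider maximum, in seconds, from a media duration in milliseconds. */
    public static double sliderMax(double durationMillis) {
        return durationMillis / 1000;
    }

    /** Seek target in milliseconds from the slider's value in seconds. */
    public static long seekMillis(int sliderSeconds) {
        return sliderSeconds * 1000L;
    }

    public static void main(String[] args) {
        System.out.println(sliderMax(215_000)); // 215.0 (a 3:35 video)
        System.out.println(seekMillis(90));     // 90000 (seek to 1:30)
    }
}
```

In the recipe, sliderMax corresponds to the setOnReady handler's setMax() call, and seekMillis corresponds to the Duration passed to mediaPlayer.seek().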

Once the media player object is in a ready state, a MediaView instance is created to display the media. The following code creates a MediaView object to be placed into the scene graph to display video content:

// Lazy init media viewer
if (mediaView == null) {
mediaView = new MediaView();
mediaView.setMediaPlayer(mediaPlayer);
mediaView.setX(4);
mediaView.setY(4);
mediaView.setPreserveRatio(true);
mediaView.setOpacity(.85);
mediaView.setSmooth(true);

mediaView.fitWidthProperty().bind(scene.widthProperty().subtract(220));
mediaView.fitHeightProperty().bind(scene.heightProperty().subtract(30));

// make media view as the second node on the scene.
root.getChildren().add(1, mediaView);
}

// sometimes loading errors occur, print error when this happens
mediaView.setOnError((MediaErrorEvent event1) -> {
event1.getMediaError().printStackTrace();
});

mediaView.setMediaPlayer(mediaPlayer);

event.setDropCompleted(success);
event.consume();
});

Whew! You are finally finished with the scene’s drag-dropped EventHandler. Up next is pretty much the rest of the media button controls, which are similar to the code at the end of Recipe 16-1. The only difference is a single instance variable named paused of type Boolean that denotes whether the video was paused. The following code shows the pauseButton and playButton controlling the mediaPlayer object and setting the paused flag accordingly:

// pause media and swap button with play button
pauseButton.setOnMousePressed((MouseEvent me) -> {
if (mediaPlayer != null) {
buttonArea.getChildren().removeAll(pauseButton, playButton);
buttonArea.getChildren().add(playButton);
mediaPlayer.pause();
paused = true;
}
});

// play media and swap button with pause button
playButton.setOnMousePressed((MouseEvent me) -> {
if (mediaPlayer != null) {
buttonArea.getChildren().removeAll(pauseButton, playButton);
buttonArea.getChildren().add(pauseButton);
paused = false;
mediaPlayer.play();
}
});

That is how you create a video media player. In the next recipe, you learn how to listen to media events and invoke actions.

16-3. Controlling Media Actions and Events

Problem

You want the media player to provide feedback in response to certain events, such as displaying the text “Paused” on the screen when the media player’s paused event is triggered.

Solution

You can use one or more of the media event handler methods. Shown in Table 16-3 are all the possible media events that are raised to allow developers to attach EventHandlers or Runnables.

Table 16-3. Media Events

Table16-3.jpg

The following code presents the “Paused” text to the user, along with the current duration in milliseconds. This text is overlaid on top of the video when the user clicks the pause button (see Figure 16-4).

9781430268277_Fig16-04.jpg

Figure 16-4. Paused event


// when paused event display pause message
mediaPlayer.setOnPaused(() -> {
pauseMessage.setText("Paused \nDuration: " +
mediaPlayer.currentTimeProperty().getValue().toMillis());
pauseMessage.setOpacity(.90);
});

How It Works

Event-driven architecture (EDA) is a prominent architectural pattern used to model loosely coupled components and services that pass messages asynchronously. The JavaFX team designed the Media API to be event-driven, and this recipe demonstrates how to implement it in response to media events.

With event-based programming you will encounter nonblocking, callback-style behavior. In this recipe, rather than tying code directly to the pause button via an EventHandler, you implement code that responds to the media player's onPaused event being triggered. When responding to media events, you implement java.lang.Runnables.

You’ll be happy to know that you’ve been using event properties and implementing Runnables all along, albeit usually in the form of lambda expressions. Hopefully you noticed this in the other recipes in this chapter. When the media player is in a ready state, the Runnable code is invoked. Why does this matter? When the media player has finished loading the media, the onReady property is notified, so at that point you can be sure it is safe to invoke the MediaPlayer’s play() method. I trust that you will get used to this event style of programming. The following code snippet demonstrates setting a Runnable instance into a media player object’s onReady property using a lambda expression:

mediaPlayer.setOnReady(() -> {
mediaPlayer.play();
});

So that you can see the difference between the newer Java 8 style of programming versus the older style, here is the same code implemented without using a lambda expression:

mediaPlayer.setOnReady(new Runnable() {
@Override
public void run() {
mediaPlayer.play();
}
});

See how many lines of code you got rid of by using lambdas? How much are you enjoying Java 8 now? You take steps similar to those for the onReady property. Once a Paused event has been triggered, the run() method is invoked to present the user a message containing a Text node with the word Paused and a duration showing the time in milliseconds into the video. Once the text is displayed, you might want to write down the durations to use as markers (as you’ll learn in Recipe 16-4). The following code snippet shows an attached Runnable instance, which is responsible for displaying a paused message and the duration in milliseconds at the point at which the video was paused:

// when paused event display pause message
mediaPlayer.setOnPaused(() -> {
pauseMessage.setText("Paused \nDuration: " +
mediaPlayer.currentTimeProperty().getValue().toMillis());
pauseMessage.setOpacity(.90);
});
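The raw toMillis() value makes for an awkward on-screen message. As an optional refinement (not part of the recipe's code; the helper name is my own), you could format the milliseconds as minutes and seconds before displaying them:

```java
public class PauseFormat {
    // Render a millisecond position as m:ss for a friendlier pause overlay;
    // millis mirrors mediaPlayer.currentTimeProperty().getValue().toMillis().
    static String asMinutesSeconds(double millis) {
        long totalSeconds = (long) (millis / 1000);
        long minutes = totalSeconds / 60;
        long seconds = totalSeconds % 60;
        return String.format("%d:%02d", minutes, seconds);
    }

    public static void main(String[] args) {
        System.out.println(asMinutesSeconds(125_500)); // 2:05
    }
}
```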

16-4. Marking a Position in a Video

Problem

You want to provide closed caption text while playing a video in the media player.

Solution

Begin by applying the solution in Recipe 16-3. Using the marked durations (in milliseconds) noted in the previous recipe, you create media marker events at points in the video. With each media marker you associate text that will be displayed as a closed caption. When a marker is reached, the associated text is shown on the upper-right side.

The following code snippet demonstrates media marker events being handled in the onDragDropped event property of the Scene object:

... // inside the start() method

final VBox messageArea = createClosedCaptionArea(scene);
root.getChildren().add(messageArea);

// Dropping over surface
scene.setOnDragDropped((DragEvent event) -> {
Dragboard db = event.getDragboard();
boolean success = false;
URI resourceUrlOrFile = null;

// dragged from web browser address line?
if (db.hasContent(DataFormat.URL)) {
try {
resourceUrlOrFile = new URI(db.getUrl());
success = true;
} catch (URISyntaxException ex) {
ex.printStackTrace();
}
} else if (db.hasFiles()) {
// dragged from the file system
String filePath = null;
for (File file:db.getFiles()) {
filePath = file.getAbsolutePath();
}
resourceUrlOrFile = new File(filePath).toURI();
success = true;
}
// load media
Media media = new Media(resourceUrlOrFile.toString());

// stop previous media player and clean up
if (mediaPlayer != null) {
mediaPlayer.stop();
mediaPlayer.currentTimeProperty().removeListener(progressListener);
mediaPlayer.setOnPaused(null);
mediaPlayer.setOnPlaying(null);
mediaPlayer.setOnReady(null);
}

// create a new media player
mediaPlayer = new MediaPlayer(media);

// as the media is playing move the slider for progress
mediaPlayer.currentTimeProperty().addListener(progressListener);

// when paused event display pause message
mediaPlayer.setOnPaused(() -> {
pauseMessage.setOpacity(.90);
});

// when playing make pause text invisible
mediaPlayer.setOnPlaying(() -> {
pauseMessage.setOpacity(0);
});

// play video when ready status
mediaPlayer.setOnReady(() -> {
progressSlider.setValue(1);
progressSlider.setMax(mediaPlayer.getMedia().getDuration().toMillis()/1000);
mediaPlayer.play();
});

// Lazy init media viewer
if (mediaView == null) {
mediaView = new MediaView(mediaPlayer);
mediaView.setX(4);
mediaView.setY(4);
mediaView.setPreserveRatio(true);
mediaView.setOpacity(.85);
mediaView.setSmooth(true);

mediaView.fitWidthProperty().bind(scene.widthProperty()
.subtract(messageArea.widthProperty().add(70)));
mediaView.fitHeightProperty().bind(scene.heightProperty().subtract(30));

// make media view as the second node on the scene.
root.getChildren().add(1, mediaView);
}

// sometimes loading errors occur
mediaView.setOnError((MediaErrorEvent event1) -> {
event1.getMediaError().printStackTrace();
});

mediaView.setMediaPlayer(mediaPlayer);

media.getMarkers().put("First marker", Duration.millis(10000));
media.getMarkers().put("Second marker", Duration.millis(20000));
media.getMarkers().put("Last one...", Duration.millis(30000));

// display closed caption
mediaPlayer.setOnMarker((MediaMarkerEvent event1) -> {
closedCaption.setText(event1.getMarker().getKey());
});

event.setDropCompleted(success);
event.consume();
}); // end of setOnDragDropped

The following code shows a factory method that returns an area that will contain the closed caption to be displayed to the right of the video:

private VBox createClosedCaptionArea(final Scene scene) {
// create message area
final VBox messageArea = new VBox(3);
messageArea.setTranslateY(20);
messageArea.translateXProperty().bind(scene.widthProperty().subtract(152));
closedCaption = new Text();
closedCaption.setStroke(Color.WHITE);
closedCaption.setFill(Color.YELLOW);
closedCaption.setFont(new Font(15));

messageArea.getChildren().add(closedCaption);
return messageArea;
}

Figure 16-5 depicts the video media player displaying the closed caption text.

9781430268277_Fig16-05.jpg

Figure 16-5. Closed caption text

How It Works

The Media API has many event properties to which the developer can attach EventHandler or Runnable instances so they can respond when events are triggered. This recipe focuses on the OnMarker event property, which is responsible for receiving marker events (MediaMarkerEvent).

Let’s begin by adding markers to the Media object. It has a method getMarkers() that returns a javafx.collections.ObservableMap<String, Duration>. With this observable map, you can add key/value pairs representing each marker: each key should be a unique identifier, and each value is an instance of Duration. For simplicity, this example uses the closed caption text as the key for each media marker. The marker durations are those you wrote down while pressing the pause button at points in the video in Recipe 16-3. Be advised that using the caption text as the key is not recommended for production-quality code; you may want to keep the captions in a separate Map keyed by a proper identifier instead.

After adding markers, you set an EventHandler into the MediaPlayer object’s OnMarker property using the setOnMarker() method. You implement the EventHandler as a lambda expression that handles MediaMarkerEvents as they are raised. Once an event is received, you obtain the key representing the text to be used in the closed caption. The instance variable closedCaption (a javafx.scene.text.Text node) then displays it via a call to setText() with the key string associated with the marker.
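In plain Java terms, the marker bookkeeping amounts to a map from timestamps to caption strings. The following standalone sketch (no JavaFX required; it uses a TreeMap keyed on milliseconds instead of JavaFX's ObservableMap<String, Duration>, and the class name is my own) shows the kind of lookup the player performs internally: finding the most recent marker at or before the current playback position.

```java
import java.util.Map;
import java.util.TreeMap;

public class MarkerLookup {
    public static void main(String[] args) {
        // Keyed by milliseconds so the map stays ordered by time;
        // the values are the closed-caption strings from the recipe.
        TreeMap<Long, String> markers = new TreeMap<>();
        markers.put(10_000L, "First marker");
        markers.put(20_000L, "Second marker");
        markers.put(30_000L, "Last one...");

        // Most recent marker at or before 25 seconds into the video.
        Map.Entry<Long, String> hit = markers.floorEntry(25_000L);
        System.out.println(hit.getValue()); // Second marker
    }
}
```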

That’s it for media markers. As you can see, they make it easy to coordinate special effects, animations, and more with specific points in a video.

16-5. Synchronizing Animation and Media

Problem

You want to incorporate animated effects in your media display, such as scrolling the text "The End" after the video is finished playing.

Solution

You simply use Recipe 16-3 together with Recipe 16-2. Recipe 16-3 shows how to respond to media events and Recipe 16-2 demonstrates how to use the translate transition to animate text.

The following code demonstrates an attached action when the end of a media event is triggered:

mediaPlayer.setOnEndOfMedia(() -> {
closedCaption.setText("");
animateTheEnd.getNode().setOpacity(.90);
animateTheEnd.playFromStart();
});

The following method creates a translateTransition of a Text node containing the string “The End” that appears after an end of media event is triggered:

public TranslateTransition createTheEnd(Scene scene) {
Text theEnd = new Text("The End");
theEnd.setFont(new Font(40));
theEnd.setStrokeWidth(3);
theEnd.setFill(Color.WHITE);
theEnd.setStroke(Color.WHITE);
theEnd.setX(75);

TranslateTransition scrollUp = new TranslateTransition();
scrollUp.setNode(theEnd);
scrollUp.setDuration(Duration.seconds(1));
scrollUp.setInterpolator(Interpolator.EASE_IN);
scrollUp.setFromY(scene.getHeight() + 40);
scrollUp.setToY(scene.getHeight()/2);

return scrollUp;
}

Figure 16-6 depicts the "The End" text node scrolling along after the OnEndOfMedia event is triggered.

9781430268277_Fig16-06.jpg

Figure 16-6. Animating “The End”

How It Works

This recipe showcases how to synchronize events with animated effects. In the code example, when the video reaches its end, the Runnable set on the OnEndOfMedia property is invoked. That Runnable starts a TranslateTransition animation that scrolls a Text node containing the string "The End" upward.

Let’s take a look at the setOnEndOfMedia() method associated with the MediaPlayer object. Just like in Recipe 16-3, you simply call the setOnEndOfMedia() method by passing in a lambda expression implementing Runnable, which contains the code that will invoke an animation. If you don’t know how the animation works, refer to Recipe 16-2. Once the event occurs, you will see the text scroll upward. The following code snippet is from inside the scene.setOnDragDropped() method:

mediaPlayer.setOnEndOfMedia(() -> {
closedCaption.setText("");
animateTheEnd.getNode().setOpacity(.90);
animateTheEnd.playFromStart();
});

For the sake of space, I trust you know where the code block would reside. If not, refer to Recipe 16-3, in which you will notice other OnXXX properties methods. To see the entire code listing and download the source code, visit the book’s website.

To animate "The End" you create a convenience createTheEnd() method to create an instance of a Text node and return a TranslateTransition object to the caller. The TranslateTransition that’s returned does the following: it waits a second before playing the video. Next is the interpolator in which you used the Interpolator.EASE_IN to move the Text node by easing in before a full stop. Last is setting the Y property of the node to move from the bottom to the center of the viewing area.

The following code creates an animation that scrolls a node in an upward motion:

TranslateTransition scrollUp = new TranslateTransition();
scrollUp.setNode(theEnd);
scrollUp.setDuration(Duration.seconds(1));
scrollUp.setInterpolator(Interpolator.EASE_IN);
scrollUp.setFromY(scene.getHeight() + 40);
scrollUp.setToY(scene.getHeight()/2);

Summary

JavaFX has been a venue for the development of media-based applications since its beginning. The JavaFX Media API enables developers to easily add media and media-based controls to any application. In previous versions of JavaFX, the supported video and audio types were more limited, and Java 8's lambda expressions now make it even easier to implement media controls.

This chapter provided a brief overview of some of the JavaFX Media API's capabilities; we haven't even scratched the surface of the possibilities. For more information regarding the JavaFX Media API, see the online documentation at http://docs.oracle.com/javase/8/javafx/api/javafx/scene/media/package-summary.html.