1 # Android tutorial 4: A basic media player
7 Enough testing with synthetic images and audio tones! This tutorial
8 finally plays actual media, streamed directly from the Internet, in your
9 Android device. It shows:
11 - How to keep the User Interface regularly updated with the current
12 playback position and duration
13 - How to implement a [Seek
14 Bar](http://developer.android.com/reference/android/widget/SeekBar.html)
15 - How to report the media size to adapt the display surface
17 It also uses the knowledge gathered in the [](tutorials/basic/index.md) regarding:
19 - How to use `playbin` to play any kind of media
20 - How to handle network resilience problems
24 From the previous tutorials, we already have almost all necessary pieces
25 to build a media player. The most complex part is assembling a pipeline
26 which retrieves, decodes and displays the media, but we already know
27 that the `playbin` element can take care of all that for us. We only
28 need to replace the manual pipeline we used in
29 [](tutorials/android/video.md) with a single-element
30 `playbin` pipeline and we are good to go!
32 However, we can do better than. We will add a [Seek
33 Bar](http://developer.android.com/reference/android/widget/SeekBar.html),
34 with a moving thumb that will advance as our current position in the
35 media advances. We will also allow the user to drag the thumb, to jump
36 (or *seek*) to a different position.
38 And finally, we will make the video surface adapt to the media size, so
39 the video sink is not forced to draw black borders around the clip.
40 This also allows the Android layout to adapt more nicely to the actual
41 media content. You can still force the video surface to have a specific
42 size if you really want to.
44 ## A basic media player \[Java code\]
46 **src/com/gst\_sdk\_tutorials/tutorial\_4/Tutorial4.java**
49 package com.gst_sdk_tutorials.tutorial_4;
51 import java.text.SimpleDateFormat;
52 import java.util.Date;
53 import java.util.TimeZone;
55 import android.app.Activity;
56 import android.os.Bundle;
57 import android.util.Log;
58 import android.view.SurfaceHolder;
59 import android.view.SurfaceView;
60 import android.view.View;
61 import android.view.View.OnClickListener;
62 import android.widget.ImageButton;
63 import android.widget.SeekBar;
64 import android.widget.SeekBar.OnSeekBarChangeListener;
65 import android.widget.TextView;
66 import android.widget.Toast;
68 import org.freedesktop.gstreamer.GStreamer;
70 public class Tutorial4 extends Activity implements SurfaceHolder.Callback, OnSeekBarChangeListener {
71 private native void nativeInit(); // Initialize native code, build pipeline, etc
72 private native void nativeFinalize(); // Destroy pipeline and shutdown native code
73 private native void nativeSetUri(String uri); // Set the URI of the media to play
74 private native void nativePlay(); // Set pipeline to PLAYING
75 private native void nativeSetPosition(int milliseconds); // Seek to the indicated position, in milliseconds
76 private native void nativePause(); // Set pipeline to PAUSED
77 private static native boolean nativeClassInit(); // Initialize native class: cache Method IDs for callbacks
78 private native void nativeSurfaceInit(Object surface); // A new surface is available
79 private native void nativeSurfaceFinalize(); // Surface about to be destroyed
80 private long native_custom_data; // Native code will use this to keep private data
82 private boolean is_playing_desired; // Whether the user asked to go to PLAYING
83 private int position; // Current position, reported by native code
84 private int duration; // Current clip duration, reported by native code
85 private boolean is_local_media; // Whether this clip is stored locally or is being streamed
86 private int desired_position; // Position where the users wants to seek to
87 private String mediaUri; // URI of the clip being played
89 private final String defaultMediaUri = "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-368p.ogv";
91 // Called when the activity is first created.
93 public void onCreate(Bundle savedInstanceState)
95 super.onCreate(savedInstanceState);
97 // Initialize GStreamer and warn if it fails
100 } catch (Exception e) {
101 Toast.makeText(this, e.getMessage(), Toast.LENGTH_LONG).show();
106 setContentView(R.layout.main);
108 ImageButton play = (ImageButton) this.findViewById(R.id.button_play);
109 play.setOnClickListener(new OnClickListener() {
110 public void onClick(View v) {
111 is_playing_desired = true;
116 ImageButton pause = (ImageButton) this.findViewById(R.id.button_stop);
117 pause.setOnClickListener(new OnClickListener() {
118 public void onClick(View v) {
119 is_playing_desired = false;
124 SurfaceView sv = (SurfaceView) this.findViewById(R.id.surface_video);
125 SurfaceHolder sh = sv.getHolder();
126 sh.addCallback(this);
128 SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
129 sb.setOnSeekBarChangeListener(this);
131 // Retrieve our previous state, or initialize it to default values
132 if (savedInstanceState != null) {
133 is_playing_desired = savedInstanceState.getBoolean("playing");
134 position = savedInstanceState.getInt("position");
135 duration = savedInstanceState.getInt("duration");
136 mediaUri = savedInstanceState.getString("mediaUri");
137 Log.i ("GStreamer", "Activity created with saved state:");
139 is_playing_desired = false;
140 position = duration = 0;
141 mediaUri = defaultMediaUri;
142 Log.i ("GStreamer", "Activity created with no saved state:");
144 is_local_media = false;
145 Log.i ("GStreamer", " playing:" + is_playing_desired + " position:" + position +
146 " duration: " + duration + " uri: " + mediaUri);
148 // Start with disabled buttons, until native code is initialized
149 this.findViewById(R.id.button_play).setEnabled(false);
150 this.findViewById(R.id.button_stop).setEnabled(false);
155 protected void onSaveInstanceState (Bundle outState) {
156 Log.d ("GStreamer", "Saving state, playing:" + is_playing_desired + " position:" + position +
157 " duration: " + duration + " uri: " + mediaUri);
158 outState.putBoolean("playing", is_playing_desired);
159 outState.putInt("position", position);
160 outState.putInt("duration", duration);
161 outState.putString("mediaUri", mediaUri);
164 protected void onDestroy() {
169 // Called from native code. This sets the content of the TextView from the UI thread.
170 private void setMessage(final String message) {
171 final TextView tv = (TextView) this.findViewById(R.id.textview_message);
172 runOnUiThread (new Runnable() {
179 // Set the URI to play, and record whether it is a local or remote file
180 private void setMediaUri() {
181 nativeSetUri (mediaUri);
182 is_local_media = mediaUri.startsWith("file://");
185 // Called from native code. Native code calls this once it has created its pipeline and
186 // the main loop is running, so it is ready to accept commands.
187 private void onGStreamerInitialized () {
188 Log.i ("GStreamer", "GStreamer initialized:");
189 Log.i ("GStreamer", " playing:" + is_playing_desired + " position:" + position + " uri: " + mediaUri);
191 // Restore previous playing state
193 nativeSetPosition (position);
194 if (is_playing_desired) {
200 // Re-enable buttons, now that GStreamer is initialized
201 final Activity activity = this;
202 runOnUiThread(new Runnable() {
204 activity.findViewById(R.id.button_play).setEnabled(true);
205 activity.findViewById(R.id.button_stop).setEnabled(true);
210 // The text widget acts as an slave for the seek bar, so it reflects what the seek bar shows, whether
211 // it is an actual pipeline position or the position the user is currently dragging to.
212 private void updateTimeWidget () {
213 final TextView tv = (TextView) this.findViewById(R.id.textview_time);
214 final SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
215 final int pos = sb.getProgress();
217 SimpleDateFormat df = new SimpleDateFormat("HH:mm:ss");
218 df.setTimeZone(TimeZone.getTimeZone("UTC"));
219 final String message = df.format(new Date (pos)) + " / " + df.format(new Date (duration));
223 // Called from native code
224 private void setCurrentPosition(final int position, final int duration) {
225 final SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
227 // Ignore position messages from the pipeline if the seek bar is being dragged
228 if (sb.isPressed()) return;
230 runOnUiThread (new Runnable() {
233 sb.setProgress(position);
237 this.position = position;
238 this.duration = duration;
242 System.loadLibrary("gstreamer_android");
243 System.loadLibrary("tutorial-4");
247 public void surfaceChanged(SurfaceHolder holder, int format, int width,
249 Log.d("GStreamer", "Surface changed to format " + format + " width "
250 + width + " height " + height);
251 nativeSurfaceInit (holder.getSurface());
254 public void surfaceCreated(SurfaceHolder holder) {
255 Log.d("GStreamer", "Surface created: " + holder.getSurface());
258 public void surfaceDestroyed(SurfaceHolder holder) {
259 Log.d("GStreamer", "Surface destroyed");
260 nativeSurfaceFinalize ();
263 // Called from native code when the size of the media changes or is first detected.
264 // Inform the video surface about the new size and recalculate the layout.
265 private void onMediaSizeChanged (int width, int height) {
266 Log.i ("GStreamer", "Media size changed to " + width + "x" + height);
267 final GStreamerSurfaceView gsv = (GStreamerSurfaceView) this.findViewById(R.id.surface_video);
268 gsv.media_width = width;
269 gsv.media_height = height;
270 runOnUiThread(new Runnable() {
277 // The Seek Bar thumb has moved, either because the user dragged it or we have called setProgress()
278 public void onProgressChanged(SeekBar sb, int progress, boolean fromUser) {
279 if (fromUser == false) return;
280 desired_position = progress;
281 // If this is a local file, allow scrub seeking, this is, seek as soon as the slider is moved.
282 if (is_local_media) nativeSetPosition(desired_position);
286 // The user started dragging the Seek Bar thumb
287 public void onStartTrackingTouch(SeekBar sb) {
291 // The user released the Seek Bar thumb
292 public void onStopTrackingTouch(SeekBar sb) {
293 // If this is a remote file, scrub seeking is probably not going to work smoothly enough.
294 // Therefore, perform only the seek when the slider is released.
295 if (!is_local_media) nativeSetPosition(desired_position);
296 if (is_playing_desired) nativePlay();
301 ### Supporting arbitrary media URIs
303 The C code provides the `nativeSetUri()` method so we can indicate the
304 URI of the media to play. Since `playbin` will be taking care of
305 retrieving the media, we can use local or remote URIs indistinctly
306 (`file://` or `http://`, for example). From Java, though, we want to
307 keep track of whether the file is local or remote, because we will not
308 offer the same functionalities. We keep track of this in the
309 `is_local_media` variable, and update it every time we change the media
313 private void setMediaUri() {
314 nativeSetUri (mediaUri);
315 is_local_media = mediaUri.startsWith("file://");
319 We call `setMediaUri()` in the `onGStreamerInitialized()` callback, once
320 the pipeline is ready to accept commands.
322 ### Reporting media size
324 Every time the size of the media changes (which could happen mid-stream,
325 for some kind of streams), or when it is first detected, C code calls
326 our `onMediaSizeChanged()` callback:
329 private void onMediaSizeChanged (int width, int height) {
330 Log.i ("GStreamer", "Media size changed to " + width + "x" + height);
331 final GStreamerSurfaceView gsv = (GStreamerSurfaceView) this.findViewById(R.id.surface_video);
332 gsv.media_width = width;
333 gsv.media_height = height;
334 runOnUiThread(new Runnable() {
342 Here we simply pass the new size onto the `GStreamerSurfaceView` in
343 charge of displaying the media, and ask the Android layout to be
344 recalculated. Eventually, the `onMeasure()` method in
345 GStreamerSurfaceView will be called and the new size will be taken
346 into account. As we have already seen in
347 [](tutorials/android/a-running-pipeline.md), methods which change
348 the UI must be called from the main thread, and we are now in a
349 callback from some GStreamer internal thread. Hence, the usage of
350 [runOnUiThread()](http://developer.android.com/reference/android/app/Activity.html#runOnUiThread\(java.lang.Runnable\)).
352 ### Refreshing the Seek Bar
354 [](tutorials/basic/toolkit-integration.md)
355 has already shown how to implement a [Seek
356 Bar](http://developer.android.com/reference/android/widget/SeekBar.html) using
357 the GTK+ toolkit. The implementation on Android is very similar.
359 The Seek Bar accomplishes to functions: First, it moves on its own to
360 reflect the current playback position in the media. Second, it can be
361 dragged by the user to seek to a different position.
363 To realize the first function, C code will periodically call our
364 `setCurrentPosition()` method so we can update the position of the thumb
365 in the Seek Bar. Again we do so from the UI thread, using
369 private void setCurrentPosition(final int position, final int duration) {
370 final SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
372 // Ignore position messages from the pipeline if the seek bar is being dragged
373 if (sb.isPressed()) return;
375 runOnUiThread (new Runnable() {
378 sb.setProgress(position);
382 this.position = position;
383 this.duration = duration;
387 To the left of the Seek Bar (refer to the screenshot at the top of this
389 [TextView](http://developer.android.com/reference/android/widget/TextView.html)
390 widget which we will use to display the current position and duration in
391 `HH:mm:ss / HH:mm:ss` textual format. The `updateTimeWidget()` method
392 takes care of it, and must be called every time the Seek Bar is updated:
395 private void updateTimeWidget () {
396 final TextView tv = (TextView) this.findViewById(R.id.textview_time);
397 final SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
398 final int pos = sb.getProgress();
400 SimpleDateFormat df = new SimpleDateFormat("HH:mm:ss");
401 df.setTimeZone(TimeZone.getTimeZone("UTC"));
402 final String message = df.format(new Date (pos)) + " / " + df.format(new Date (duration));
407 ### Seeking with the Seek Bar
409 To perform the second function of the [Seek
410 Bar](http://developer.android.com/reference/android/widget/SeekBar.html) (allowing
411 the user to seek by dragging the thumb), we implement the
412 [OnSeekBarChangeListener](http://developer.android.com/reference/android/widget/SeekBar.OnSeekBarChangeListener.html)
417 public class Tutorial4 extends Activity implements SurfaceHolder.Callback, OnSeekBarChangeListener {
420 And we register the Activity as the listener for the [Seek
421 Bar](http://developer.android.com/reference/android/widget/SeekBar.html)’s
422 events in the `onCreate()` method:
425 SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
426 sb.setOnSeekBarChangeListener(this);
429 We will now be notified of three events: When the user starts dragging
430 the thumb, every time the thumb moves and when the thumb is released by
434 public void onStartTrackingTouch(SeekBar sb) {
439 [onStartTrackingTouch()](http://developer.android.com/reference/android/widget/SeekBar.OnSeekBarChangeListener.html#onStartTrackingTouch\(android.widget.SeekBar\))
440 is called when the user starts dragging, and the only thing we do is
441 pause the pipeline. If the user is searching for a particular scene, we
442 do not want it to keep
446 public void onProgressChanged(SeekBar sb, int progress, boolean fromUser) {
447 if (fromUser == false) return;
448 desired_position = progress;
449 // If this is a local file, allow scrub seeking, this is, seek soon as the slider is moved.
450 if (is_local_media) nativeSetPosition(desired_position);
455 [onProgressChanged()](http://developer.android.com/reference/android/widget/SeekBar.OnSeekBarChangeListener.html#onProgressChanged\(android.widget.SeekBar,%20int,%20boolean\)) is
456 called every time the thumb moves, be it because the user dragged it, or
457 because we called `setProgress()` on the Seek Bar. We discard the latter
458 case with the handy `fromUser` parameter.
460 As the comment says, if this is a local media, we allow scrub seeking,
461 this is, we jump to the indicated position as soon as the thumb moves.
462 Otherwise, the seek will be performed when the thumb is released, and
463 the only thing we do here is update the textual time widget.
466 public void onStopTrackingTouch(SeekBar sb) {
467 // If this is a remote file, scrub seeking is probably not going to work smoothly enough.
468 // Therefore, perform only the seek when the slider is released.
469 if (!is_local_media) nativeSetPosition(desired_position);
470 if (is_playing_desired) nativePlay();
474 Finally, [onStopTrackingTouch()](http://developer.android.com/reference/android/widget/SeekBar.OnSeekBarChangeListener.html#onStopTrackingTouch\(android.widget.SeekBar\))
475 is called when the thumb is released. We simply perform the seek
476 operation if the file was non-local, and restore the pipeline to the
477 desired playing state.
479 This concludes the User interface part of this tutorial. Let’s review
480 now the under-the-hood C code that allows this to work.
482 ## A basic media player \[C code\]
489 #include <android/log.h>
490 #include <android/native_window.h>
491 #include <android/native_window_jni.h>
493 #include <gst/interfaces/xoverlay.h>
494 #include <gst/video/video.h>
497 GST_DEBUG_CATEGORY_STATIC (debug_category);
498 #define GST_CAT_DEFAULT debug_category
501 * These macros provide a way to store the native pointer to CustomData, which might be 32 or 64 bits, into
502 * a jlong, which is always 64 bits, without warnings.
504 #if GLIB_SIZEOF_VOID_P == 8
505 ## define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(*env)->GetLongField (env, thiz, fieldID)
506 ## define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)data)
508 ## define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(jint)(*env)->GetLongField (env, thiz, fieldID)
509 ## define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)(jint)data)
512 /* Do not allow seeks to be performed closer than this distance. It is visually useless, and will probably
513 * confuse some demuxers. */
514 #define SEEK_MIN_DELAY (500 * GST_MSECOND)
516 /* Structure to contain all our information, so we can pass it to callbacks */
517 typedef struct _CustomData {
518 jobject app; /* Application instance, used to call its methods. A global reference is kept. */
519 GstElement *pipeline; /* The running pipeline */
520 GMainContext *context; /* GLib context used to run the main loop */
521 GMainLoop *main_loop; /* GLib main loop */
522 gboolean initialized; /* To avoid informing the UI multiple times about the initialization */
523 ANativeWindow *native_window; /* The Android native window where video will be rendered */
524 GstState state; /* Current pipeline state */
525 GstState target_state; /* Desired pipeline state, to be set once buffering is complete */
526 gint64 duration; /* Cached clip duration */
527 gint64 desired_position; /* Position to seek to, once the pipeline is running */
528 GstClockTime last_seek_time; /* For seeking overflow prevention (throttling) */
529 gboolean is_live; /* Live streams do not use buffering */
534 GST_PLAY_FLAG_TEXT = (1 << 2) /* We want subtitle output */
537 /* These global variables cache values which are not changing during execution */
538 static pthread_t gst_app_thread;
539 static pthread_key_t current_jni_env;
540 static JavaVM *java_vm;
541 static jfieldID custom_data_field_id;
542 static jmethodID set_message_method_id;
543 static jmethodID set_current_position_method_id;
544 static jmethodID on_gstreamer_initialized_method_id;
545 static jmethodID on_media_size_changed_method_id;
551 /* Register this thread with the VM */
552 static JNIEnv *attach_current_thread (void) {
554 JavaVMAttachArgs args;
556 GST_DEBUG ("Attaching thread %p", g_thread_self ());
557 args.version = JNI_VERSION_1_4;
561 if ((*java_vm)->AttachCurrentThread (java_vm, &env, &args) < 0) {
562 GST_ERROR ("Failed to attach current thread");
569 /* Unregister this thread from the VM */
570 static void detach_current_thread (void *env) {
571 GST_DEBUG ("Detaching thread %p", g_thread_self ());
572 (*java_vm)->DetachCurrentThread (java_vm);
575 /* Retrieve the JNI environment for this thread */
576 static JNIEnv *get_jni_env (void) {
579 if ((env = pthread_getspecific (current_jni_env)) == NULL) {
580 env = attach_current_thread ();
581 pthread_setspecific (current_jni_env, env);
587 /* Change the content of the UI's TextView */
588 static void set_ui_message (const gchar *message, CustomData *data) {
589 JNIEnv *env = get_jni_env ();
590 GST_DEBUG ("Setting message to: %s", message);
591 jstring jmessage = (*env)->NewStringUTF(env, message);
592 (*env)->CallVoidMethod (env, data->app, set_message_method_id, jmessage);
593 if ((*env)->ExceptionCheck (env)) {
594 GST_ERROR ("Failed to call Java method");
595 (*env)->ExceptionClear (env);
597 (*env)->DeleteLocalRef (env, jmessage);
600 /* Tell the application what is the current position and clip duration */
601 static void set_current_ui_position (gint position, gint duration, CustomData *data) {
602 JNIEnv *env = get_jni_env ();
603 (*env)->CallVoidMethod (env, data->app, set_current_position_method_id, position, duration);
604 if ((*env)->ExceptionCheck (env)) {
605 GST_ERROR ("Failed to call Java method");
606 (*env)->ExceptionClear (env);
610 /* If we have pipeline and it is running, query the current position and clip duration and inform
612 static gboolean refresh_ui (CustomData *data) {
613 GstFormat fmt = GST_FORMAT_TIME;
617 /* We do not want to update anything unless we have a working pipeline in the PAUSED or PLAYING state */
618 if (!data || !data->pipeline || data->state < GST_STATE_PAUSED)
621 /* If we didn't know it yet, query the stream duration */
622 if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
623 if (!gst_element_query_duration (data->pipeline, &fmt, &data->duration)) {
624 GST_WARNING ("Could not query current duration");
628 if (gst_element_query_position (data->pipeline, &fmt, &position)) {
629 /* Java expects these values in milliseconds, and GStreamer provides nanoseconds */
630 set_current_ui_position (position / GST_MSECOND, data->duration / GST_MSECOND, data);
635 /* Forward declaration for the delayed seek callback */
636 static gboolean delayed_seek_cb (CustomData *data);
638 /* Perform seek, if we are not too close to the previous seek. Otherwise, schedule the seek for
639 * some time in the future. */
640 static void execute_seek (gint64 desired_position, CustomData *data) {
643 if (desired_position == GST_CLOCK_TIME_NONE)
646 diff = gst_util_get_timestamp () - data->last_seek_time;
648 if (GST_CLOCK_TIME_IS_VALID (data->last_seek_time) && diff < SEEK_MIN_DELAY) {
649 /* The previous seek was too close, delay this one */
650 GSource *timeout_source;
652 if (data->desired_position == GST_CLOCK_TIME_NONE) {
653 /* There was no previous seek scheduled. Setup a timer for some time in the future */
654 timeout_source = g_timeout_source_new ((SEEK_MIN_DELAY - diff) / GST_MSECOND);
655 g_source_set_callback (timeout_source, (GSourceFunc)delayed_seek_cb, data, NULL);
656 g_source_attach (timeout_source, data->context);
657 g_source_unref (timeout_source);
659 /* Update the desired seek position. If multiple requests are received before it is time
660 * to perform a seek, only the last one is remembered. */
661 data->desired_position = desired_position;
662 GST_DEBUG ("Throttling seek to %" GST_TIME_FORMAT ", will be in %" GST_TIME_FORMAT,
663 GST_TIME_ARGS (desired_position), GST_TIME_ARGS (SEEK_MIN_DELAY - diff));
665 /* Perform the seek now */
666 GST_DEBUG ("Seeking to %" GST_TIME_FORMAT, GST_TIME_ARGS (desired_position));
667 data->last_seek_time = gst_util_get_timestamp ();
668 gst_element_seek_simple (data->pipeline, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, desired_position);
669 data->desired_position = GST_CLOCK_TIME_NONE;
673 /* Delayed seek callback. This gets called by the timer setup in the above function. */
674 static gboolean delayed_seek_cb (CustomData *data) {
675 GST_DEBUG ("Doing delayed seek to %" GST_TIME_FORMAT, GST_TIME_ARGS (data->desired_position));
676 execute_seek (data->desired_position, data);
680 /* Retrieve errors from the bus and show them on the UI */
681 static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
684 gchar *message_string;
686 gst_message_parse_error (msg, &err, &debug_info);
687 message_string = g_strdup_printf ("Error received from element %s: %s", GST_OBJECT_NAME (msg->src), err->message);
688 g_clear_error (&err);
690 set_ui_message (message_string, data);
691 g_free (message_string);
692 data->target_state = GST_STATE_NULL;
693 gst_element_set_state (data->pipeline, GST_STATE_NULL);
696 /* Called when the End Of the Stream is reached. Just move to the beginning of the media and pause. */
697 static void eos_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
698 data->target_state = GST_STATE_PAUSED;
699 data->is_live = (gst_element_set_state (data->pipeline, GST_STATE_PAUSED) == GST_STATE_CHANGE_NO_PREROLL);
700 execute_seek (0, data);
703 /* Called when the duration of the media changes. Just mark it as unknown, so we re-query it in the next UI refresh. */
704 static void duration_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
705 data->duration = GST_CLOCK_TIME_NONE;
708 /* Called when buffering messages are received. We inform the UI about the current buffering level and
709 * keep the pipeline paused until 100% buffering is reached. At that point, set the desired state. */
710 static void buffering_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
716 gst_message_parse_buffering (msg, &percent);
717 if (percent < 100 && data->target_state >= GST_STATE_PAUSED) {
718 gchar * message_string = g_strdup_printf ("Buffering %d%%", percent);
719 gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
720 set_ui_message (message_string, data);
721 g_free (message_string);
722 } else if (data->target_state >= GST_STATE_PLAYING) {
723 gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
724 } else if (data->target_state >= GST_STATE_PAUSED) {
725 set_ui_message ("Buffering complete", data);
729 /* Called when the clock is lost */
730 static void clock_lost_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
731 if (data->target_state >= GST_STATE_PLAYING) {
732 gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
733 gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
737 /* Retrieve the video sink's Caps and tell the application about the media size */
738 static void check_media_size (CustomData *data) {
739 JNIEnv *env = get_jni_env ();
740 GstElement *video_sink;
741 GstPad *video_sink_pad;
747 /* Retrieve the Caps at the entrance of the video sink */
748 g_object_get (data->pipeline, "video-sink", &video_sink, NULL);
749 video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
750 caps = gst_pad_get_negotiated_caps (video_sink_pad);
752 if (gst_video_format_parse_caps(caps, &fmt, &width, &height)) {
754 if (gst_video_parse_caps_pixel_aspect_ratio (caps, &par_n, &par_d)) {
755 width = width * par_n / par_d;
757 GST_DEBUG ("Media size is %dx%d, notifying application", width, height);
759 (*env)->CallVoidMethod (env, data->app, on_media_size_changed_method_id, (jint)width, (jint)height);
760 if ((*env)->ExceptionCheck (env)) {
761 GST_ERROR ("Failed to call Java method");
762 (*env)->ExceptionClear (env);
766 gst_caps_unref(caps);
767 gst_object_unref (video_sink_pad);
768 gst_object_unref(video_sink);
771 /* Notify UI about pipeline state changes */
772 static void state_changed_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
773 GstState old_state, new_state, pending_state;
774 gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
775 /* Only pay attention to messages coming from the pipeline, not its children */
776 if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->pipeline)) {
777 data->state = new_state;
778 gchar *message = g_strdup_printf("State changed to %s", gst_element_state_get_name(new_state));
779 set_ui_message(message, data);
782 /* The Ready to Paused state change is particularly interesting: */
783 if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED) {
784 /* By now the sink already knows the media size */
785 check_media_size(data);
787 /* If there was a scheduled seek, perform it now that we have moved to the Paused state */
788 if (GST_CLOCK_TIME_IS_VALID (data->desired_position))
789 execute_seek (data->desired_position, data);
794 /* Check if all conditions are met to report GStreamer as initialized.
795 * These conditions will change depending on the application */
796 static void check_initialization_complete (CustomData *data) {
797 JNIEnv *env = get_jni_env ();
798 if (!data->initialized && data->native_window && data->main_loop) {
799 GST_DEBUG ("Initialization complete, notifying application. native_window:%p main_loop:%p", data->native_window, data->main_loop);
801 /* The main loop is running and we received a native window, inform the sink about it */
802 gst_x_overlay_set_window_handle (GST_X_OVERLAY (data->pipeline), (guintptr)data->native_window);
804 (*env)->CallVoidMethod (env, data->app, on_gstreamer_initialized_method_id);
805 if ((*env)->ExceptionCheck (env)) {
806 GST_ERROR ("Failed to call Java method");
807 (*env)->ExceptionClear (env);
809 data->initialized = TRUE;
813 /* Main method for the native code. This is executed on its own thread. */
814 static void *app_function (void *userdata) {
815 JavaVMAttachArgs args;
817 CustomData *data = (CustomData *)userdata;
818 GSource *timeout_source;
820 GError *error = NULL;
823 GST_DEBUG ("Creating pipeline in CustomData at %p", data);
825 /* Create our own GLib Main Context and make it the default one */
826 data->context = g_main_context_new ();
827 g_main_context_push_thread_default(data->context);
830 data->pipeline = gst_parse_launch("playbin", &error);
832 gchar *message = g_strdup_printf("Unable to build pipeline: %s", error->message);
833 g_clear_error (&error);
834 set_ui_message(message, data);
839 /* Disable subtitles */
840 g_object_get (data->pipeline, "flags", &flags, NULL);
841 flags &= ~GST_PLAY_FLAG_TEXT;
842 g_object_set (data->pipeline, "flags", flags, NULL);
844 /* Set the pipeline to READY, so it can already accept a window handle, if we have one */
845 data->target_state = GST_STATE_READY;
846 gst_element_set_state(data->pipeline, GST_STATE_READY);
848 /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
849 bus = gst_element_get_bus (data->pipeline);
850 bus_source = gst_bus_create_watch (bus);
851 g_source_set_callback (bus_source, (GSourceFunc) gst_bus_async_signal_func, NULL, NULL);
852 g_source_attach (bus_source, data->context);
853 g_source_unref (bus_source);
854 g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, data);
855 g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, data);
856 g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, data);
857 g_signal_connect (G_OBJECT (bus), "message::duration", (GCallback)duration_cb, data);
858 g_signal_connect (G_OBJECT (bus), "message::buffering", (GCallback)buffering_cb, data);
859 g_signal_connect (G_OBJECT (bus), "message::clock-lost", (GCallback)clock_lost_cb, data);
860 gst_object_unref (bus);
862 /* Register a function that GLib will call 4 times per second */
863 timeout_source = g_timeout_source_new (250);
864 g_source_set_callback (timeout_source, (GSourceFunc)refresh_ui, data, NULL);
865 g_source_attach (timeout_source, data->context);
866 g_source_unref (timeout_source);
868 /* Create a GLib Main Loop and set it to run */
869 GST_DEBUG ("Entering main loop... (CustomData:%p)", data);
870 data->main_loop = g_main_loop_new (data->context, FALSE);
871 check_initialization_complete (data);
872 g_main_loop_run (data->main_loop);
873 GST_DEBUG ("Exited main loop");
874 g_main_loop_unref (data->main_loop);
875 data->main_loop = NULL;
878 g_main_context_pop_thread_default(data->context);
879 g_main_context_unref (data->context);
880 data->target_state = GST_STATE_NULL;
881 gst_element_set_state (data->pipeline, GST_STATE_NULL);
882 gst_object_unref (data->pipeline);
891 /* Instruct the native code to create its internal data structure, pipeline and thread */
892 static void gst_native_init (JNIEnv* env, jobject thiz) {
893 CustomData *data = g_new0 (CustomData, 1);
894 data->desired_position = GST_CLOCK_TIME_NONE;
895 data->last_seek_time = GST_CLOCK_TIME_NONE;
896 SET_CUSTOM_DATA (env, thiz, custom_data_field_id, data);
897 GST_DEBUG_CATEGORY_INIT (debug_category, "tutorial-4", 0, "Android tutorial 4");
898 gst_debug_set_threshold_for_name("tutorial-4", GST_LEVEL_DEBUG);
899 GST_DEBUG ("Created CustomData at %p", data);
900 data->app = (*env)->NewGlobalRef (env, thiz);
901 GST_DEBUG ("Created GlobalRef for app object at %p", data->app);
902 pthread_create (&gst_app_thread, NULL, &app_function, data);
905 /* Quit the main loop, remove the native thread and free resources */
906 static void gst_native_finalize (JNIEnv* env, jobject thiz) {
907 CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
909 GST_DEBUG ("Quitting main loop...");
910 g_main_loop_quit (data->main_loop);
911 GST_DEBUG ("Waiting for thread to finish...");
912 pthread_join (gst_app_thread, NULL);
913 GST_DEBUG ("Deleting GlobalRef for app object at %p", data->app);
914 (*env)->DeleteGlobalRef (env, data->app);
915 GST_DEBUG ("Freeing CustomData at %p", data);
917 SET_CUSTOM_DATA (env, thiz, custom_data_field_id, NULL);
918 GST_DEBUG ("Done finalizing");
921 /* Set playbin's URI */
922 void gst_native_set_uri (JNIEnv* env, jobject thiz, jstring uri) {
923 CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
924 if (!data || !data->pipeline) return;
925 const jbyte *char_uri = (*env)->GetStringUTFChars (env, uri, NULL);
926 GST_DEBUG ("Setting URI to %s", char_uri);
927 if (data->target_state >= GST_STATE_READY)
928 gst_element_set_state (data->pipeline, GST_STATE_READY);
929 g_object_set(data->pipeline, "uri", char_uri, NULL);
930 (*env)->ReleaseStringUTFChars (env, uri, char_uri);
931 data->duration = GST_CLOCK_TIME_NONE;
932 data->is_live = (gst_element_set_state (data->pipeline, data->target_state) == GST_STATE_CHANGE_NO_PREROLL);
935 /* Set pipeline to PLAYING state */
936 static void gst_native_play (JNIEnv* env, jobject thiz) {
937 CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
939 GST_DEBUG ("Setting state to PLAYING");
940 data->target_state = GST_STATE_PLAYING;
941 data->is_live = (gst_element_set_state (data->pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_NO_PREROLL);
944 /* Set pipeline to PAUSED state */
945 static void gst_native_pause (JNIEnv* env, jobject thiz) {
946 CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
948 GST_DEBUG ("Setting state to PAUSED");
949 data->target_state = GST_STATE_PAUSED;
950 data->is_live = (gst_element_set_state (data->pipeline, GST_STATE_PAUSED) == GST_STATE_CHANGE_NO_PREROLL);
953 /* Instruct the pipeline to seek to a different position */
954 void gst_native_set_position (JNIEnv* env, jobject thiz, int milliseconds) {
955 CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
957 gint64 desired_position = (gint64)(milliseconds * GST_MSECOND);
958 if (data->state >= GST_STATE_PAUSED) {
959 execute_seek(desired_position, data);
961 GST_DEBUG ("Scheduling seek to %" GST_TIME_FORMAT " for later", GST_TIME_ARGS (desired_position));
962 data->desired_position = desired_position;
966 /* Static class initializer: retrieve method and field IDs */
967 static jboolean gst_native_class_init (JNIEnv* env, jclass klass) {
968 custom_data_field_id = (*env)->GetFieldID (env, klass, "native_custom_data", "J");
969 set_message_method_id = (*env)->GetMethodID (env, klass, "setMessage", "(Ljava/lang/String;)V");
970 set_current_position_method_id = (*env)->GetMethodID (env, klass, "setCurrentPosition", "(II)V");
971 on_gstreamer_initialized_method_id = (*env)->GetMethodID (env, klass, "onGStreamerInitialized", "()V");
972 on_media_size_changed_method_id = (*env)->GetMethodID (env, klass, "onMediaSizeChanged", "(II)V");
974 if (!custom_data_field_id || !set_message_method_id || !on_gstreamer_initialized_method_id ||
975 !on_media_size_changed_method_id || !set_current_position_method_id) {
976 /* We emit this message through the Android log instead of the GStreamer log because the later
977 * has not been initialized yet.
979 __android_log_print (ANDROID_LOG_ERROR, "tutorial-4", "The calling class does not implement all necessary interface methods");
985 static void gst_native_surface_init (JNIEnv *env, jobject thiz, jobject surface) {
986 CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
988 ANativeWindow *new_native_window = ANativeWindow_fromSurface(env, surface);
989 GST_DEBUG ("Received surface %p (native window %p)", surface, new_native_window);
991 if (data->native_window) {
992 ANativeWindow_release (data->native_window);
993 if (data->native_window == new_native_window) {
994 GST_DEBUG ("New native window is the same as the previous one", data->native_window);
995 if (data->pipeline) {
996 gst_x_overlay_expose(GST_X_OVERLAY (data->pipeline));
997 gst_x_overlay_expose(GST_X_OVERLAY (data->pipeline));
1001 GST_DEBUG ("Released previous native window %p", data->native_window);
1002 data->initialized = FALSE;
1005 data->native_window = new_native_window;
1007 check_initialization_complete (data);
1010 static void gst_native_surface_finalize (JNIEnv *env, jobject thiz) {
1011 CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
1013 GST_DEBUG ("Releasing Native Window %p", data->native_window);
1015 if (data->pipeline) {
1016 gst_x_overlay_set_window_handle (GST_X_OVERLAY (data->pipeline), (guintptr)NULL);
1017 gst_element_set_state (data->pipeline, GST_STATE_READY);
1020 ANativeWindow_release (data->native_window);
1021 data->native_window = NULL;
1022 data->initialized = FALSE;
1025 /* List of implemented native methods */
1026 static JNINativeMethod native_methods[] = {
1027 { "nativeInit", "()V", (void *) gst_native_init},
1028 { "nativeFinalize", "()V", (void *) gst_native_finalize},
1029 { "nativeSetUri", "(Ljava/lang/String;)V", (void *) gst_native_set_uri},
1030 { "nativePlay", "()V", (void *) gst_native_play},
1031 { "nativePause", "()V", (void *) gst_native_pause},
1032 { "nativeSetPosition", "(I)V", (void*) gst_native_set_position},
1033 { "nativeSurfaceInit", "(Ljava/lang/Object;)V", (void *) gst_native_surface_init},
1034 { "nativeSurfaceFinalize", "()V", (void *) gst_native_surface_finalize},
1035 { "nativeClassInit", "()Z", (void *) gst_native_class_init}
1038 /* Library initializer */
1039 jint JNI_OnLoad(JavaVM *vm, void *reserved) {
1044 if ((*vm)->GetEnv(vm, (void**) &env, JNI_VERSION_1_4) != JNI_OK) {
1045 __android_log_print (ANDROID_LOG_ERROR, "tutorial-4", "Could not retrieve JNIEnv");
1048 jclass klass = (*env)->FindClass (env, "com/gst_sdk_tutorials/tutorial_4/Tutorial4");
1049 (*env)->RegisterNatives (env, klass, native_methods, G_N_ELEMENTS(native_methods));
1051 pthread_key_create (¤t_jni_env, detach_current_thread);
1053 return JNI_VERSION_1_4;
1057 ### Supporting arbitrary media URIs
1059 Java code will call `gst_native_set_uri()` whenever it wants to change
1060 the playing URI (in this tutorial the URI never changes, but it could):
1063 void gst_native_set_uri (JNIEnv* env, jobject thiz, jstring uri) {
1064 CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
1065 if (!data || !data->pipeline) return;
1066 const jbyte *char_uri = (*env)->GetStringUTFChars (env, uri, NULL);
1067 GST_DEBUG ("Setting URI to %s", char_uri);
1068 if (data->target_state >= GST_STATE_READY)
1069 gst_element_set_state (data->pipeline, GST_STATE_READY);
1070 g_object_set(data->pipeline, "uri", char_uri, NULL);
1071 (*env)->ReleaseStringUTFChars (env, uri, char_uri);
1072 data->duration = GST_CLOCK_TIME_NONE;
1073 data->is_live = (gst_element_set_state (data->pipeline, data->target_state) == GST_STATE_CHANGE_NO_PREROLL);
1077 We first need to convert between the
1078 [UTF16](http://en.wikipedia.org/wiki/UTF-16) encoding used by Java and
1080 UTF8](http://en.wikipedia.org/wiki/UTF-8#Modified_UTF-8) used by
1082 [GetStringUTFChars()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/functions.html#wp17265)
1084 [ReleaseStringUTFChars()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/functions.html#wp17294).
1086 `playbin` will only care about URI changes in the READY to PAUSED state
1087 change, because the new URI might need a completely different playback
1088 pipeline (think about switching from a local Matroska file to a remote
1089 OGG file: this would require, at least, different source and demuxing
1090 elements). Thus, before passing the new URI to `playbin` we set its
1091 state to READY (if we were in PAUSED or PLAYING).
1093 `playbin`’s URI is exposed as a common GObject property, so we simply
1094 set it with `g_object_set()`.
1096 We then reset the clip duration, so it is re-queried later, and bring
1097 the pipeline to the playing state it had before. In this last step, we
1098 also take note of whether the new URI corresponds to a live source or
1099 not. Live sources must not use buffering (otherwise latency is
1100 introduced which is inacceptable for them), so we keep track of this
1101 information in the `is_live` variable.
1103 ### Reporting media size
1105 Some codecs allow the media size (width and height of the video) to
1106 change during playback. For simplicity, this tutorial assumes that they
1107 do not. Therefore, in the READY to PAUSED state change, once the Caps of
1108 the decoded media are known, we inspect them in `check_media_size()`:
1111 static void check_media_size (CustomData *data) {
1112 JNIEnv *env = get_jni_env ();
1113 GstElement *video_sink;
1114 GstPad *video_sink_pad;
1120 /* Retrieve the Caps at the entrance of the video sink */
1121 g_object_get (data->pipeline, "video-sink", &video_sink, NULL);
1122 video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
1123 caps = gst_pad_get_negotiated_caps (video_sink_pad);
1125 if (gst_video_format_parse_caps(caps, &fmt, &width, &height)) {
1127 if (gst_video_parse_caps_pixel_aspect_ratio (caps, &par_n, &par_d)) {
1128 width = width * par_n / par_d;
1130 GST_DEBUG ("Media size is %dx%d, notifying application", width, height);
1132 (*env)->CallVoidMethod (env, data->app, on_media_size_changed_method_id, (jint)width, (jint)height);
1133 if ((*env)->ExceptionCheck (env)) {
1134 GST_ERROR ("Failed to call Java method");
1135 (*env)->ExceptionClear (env);
1139 gst_caps_unref(caps);
1140 gst_object_unref (video_sink_pad);
1141 gst_object_unref(video_sink);
1145 We first retrieve the video sink element from the pipeline, using the
1146 `video-sink` property of `playbin`, and then its sink Pad. The
1147 negotiated Caps of this Pad, which we recover using
1148 `gst_pad_get_negotiated_caps()`, are the Caps of the decoded media.
1150 The helper functions `gst_video_format_parse_caps()` and
1151 `gst_video_parse_caps_pixel_aspect_ratio()` turn the Caps into
1152 manageable integers, which we pass to Java through
1153 its `onMediaSizeChanged()` callback.
1155 ### Refreshing the Seek Bar
1157 To keep the UI updated, a GLib timer is installed in the
1158 `app_function()` that fires 4 times per second (or every 250ms), right
1159 before entering the main loop:
1162 timeout_source = g_timeout_source_new (250);
1163 g_source_set_callback (timeout_source, (GSourceFunc)refresh_ui, data, NULL);
1164 g_source_attach (timeout_source, data->context);
1165 g_source_unref (timeout_source);
1168 Then, in the refresh\_ui method:
1171 static gboolean refresh_ui (CustomData *data) {
1172 GstFormat fmt = GST_FORMAT_TIME;
1173 gint64 current = -1;
1176 /* We do not want to update anything unless we have a working pipeline in the PAUSED or PLAYING state */
1177 if (!data || !data->pipeline || data->state < GST_STATE_PAUSED)
1180 /* If we didn't know it yet, query the stream duration */
1181 if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
1182 if (!gst_element_query_duration (data->pipeline, &fmt, &data->duration)) {
1183 GST_WARNING ("Could not query current duration");
1187 if (gst_element_query_position (data->pipeline, &fmt, &position)) {
1188 /* Java expects these values in milliseconds, and GStreamer provides nanoseconds */
1189 set_current_ui_position (position / GST_MSECOND, data->duration / GST_MSECOND, data);
1195 If it is unknown, the clip duration is retrieved, as explained in
1196 [](tutorials/basic/time-management.md). The current position is
1197 retrieved next, and the UI is informed of both through its
1198 `setCurrentPosition()` callback.
1200 Bear in mind that all time-related measures returned by GStreamer are in
1201 nanoseconds, whereas, for simplicity, we decided to make the UI code
1202 work in milliseconds.
1204 ### Seeking with the Seek Bar
1206 The Java UI code already takes care of most of the complexity of seeking
1207 by dragging the thumb of the Seek Bar. From C code, we just need to
1208 honor the calls to `nativeSetPosition()` and instruct the pipeline to
1209 jump to the indicated position.
1211 There are, though, a couple of caveats. Firstly, seeks are only possible
1212 when the pipeline is in the PAUSED or PLAYING state, and we might
1213 receive seek requests before that happens. Secondly, dragging the Seek
1214 Bar can generate a very high number of seek requests in a short period
1215 of time, which is visually useless and will impair responsiveness. Let’s
1216 see how to overcome these problems.
1221 `gst_native_set_position()`:
1224 void gst_native_set_position (JNIEnv* env, jobject thiz, int milliseconds) {
1225 CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
1227 gint64 desired_position = (gint64)(milliseconds * GST_MSECOND);
1228 if (data->state >= GST_STATE_PAUSED) {
1229 execute_seek(desired_position, data);
1231 GST_DEBUG ("Scheduling seek to %" GST_TIME_FORMAT " for later", GST_TIME_ARGS (desired_position));
1232 data->desired_position = desired_position;
1237 If we are already in the correct state for seeking, execute it right
1238 away; otherwise, store the desired position in the
1239 `desired_position` variable. Then, in the
1240 `state_changed_cb()` callback:
1243 if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED) {
1244 /* By now the sink already knows the media size */
1245 check_media_size(data);
1247 /* If there was a scheduled seek, perform it now that we have moved to the Paused state */
1248 if (GST_CLOCK_TIME_IS_VALID (data->desired_position))
1249 execute_seek (data->desired_position, data);
1254 Once the pipeline moves from the READY to the PAUSED state, we check if
1255 there is a pending seek operation and execute it. The
1256 `desired_position` variable is reset inside `execute_seek()`.
1258 #### Seek throttling
1260 A seek is potentially a lengthy operation. The demuxer (the element
1261 typically in charge of seeking) needs to estimate the appropriate byte
1262 offset inside the media file that corresponds to the time position to
1263 jump to. Then, it needs to start decoding from that point until the
1264 desired position is reached. If the initial estimate is accurate, this
1265 will not take long, but, on some container formats, or when indexing
1266 information is missing, it can take up to several seconds.
1268 If a demuxer is in the process of performing a seek and receives a
1269 second one, it is up to it to finish the first one, start the second one
1270 or abort both, which is a bad thing. A simple method to avoid this issue
1271 is *throttling*, which means that we will only allow one seek every half
1272 a second (for example): after performing a seek, only the last seek
1273 request received during the next 500ms is stored, and will be honored
1274 once this period elapses.
1276 To achieve this, all seek requests are routed through the
1277 `execute_seek()` method:
1280 static void execute_seek (gint64 desired_position, CustomData *data) {
1283 if (desired_position == GST_CLOCK_TIME_NONE)
1286 diff = gst_util_get_timestamp () - data->last_seek_time;
1288 if (GST_CLOCK_TIME_IS_VALID (data->last_seek_time) && diff < SEEK_MIN_DELAY) {
1289 /* The previous seek was too close, delay this one */
1290 GSource *timeout_source;
1292 if (data->desired_position == GST_CLOCK_TIME_NONE) {
1293 /* There was no previous seek scheduled. Setup a timer for some time in the future */
1294 timeout_source = g_timeout_source_new ((SEEK_MIN_DELAY - diff) / GST_MSECOND);
1295 g_source_set_callback (timeout_source, (GSourceFunc)delayed_seek_cb, data, NULL);
1296 g_source_attach (timeout_source, data->context);
1297 g_source_unref (timeout_source);
1299 /* Update the desired seek position. If multiple requests are received before it is time
1300 * to perform a seek, only the last one is remembered. */
1301 data->desired_position = desired_position;
1302 GST_DEBUG ("Throttling seek to %" GST_TIME_FORMAT ", will be in %" GST_TIME_FORMAT,
1303 GST_TIME_ARGS (desired_position), GST_TIME_ARGS (SEEK_MIN_DELAY - diff));
1305 /* Perform the seek now */
1306 GST_DEBUG ("Seeking to %" GST_TIME_FORMAT, GST_TIME_ARGS (desired_position));
1307 data->last_seek_time = gst_util_get_timestamp ();
1308 gst_element_seek_simple (data->pipeline, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, desired_position);
1309 data->desired_position = GST_CLOCK_TIME_NONE;
1314 The time at which the last seek was performed is stored in the
1315 `last_seek_time` variable. This is wall clock time, not to be confused
1316 with the stream time carried in the media time stamps, and is obtained
1317 with `gst_util_get_timestamp()`.
1319 If enough time has passed since the last seek operation, the new one is
1320 directly executed and `last_seek_time` is updated. Otherwise, the new
1321 seek is scheduled for later. If there is no previously scheduled seek, a
1322 one-shot timer is setup to trigger 500ms after the last seek operation.
1323 If another seek was already scheduled, its desired position is simply
1324 updated with the new one.
1326 The one-shot timer calls `delayed_seek_cb()`, which simply calls
1327 `execute_seek()` again.
1330 > Ideally, `execute_seek()` will now find that enough time has indeed passed since the last seek and the scheduled one will proceed. It might happen, though, that after 500ms of the previous seek, and before the timer wakes up, yet another seek comes through and is executed. `delayed_seek_cb()` needs to check for this condition to avoid performing two very close seeks, and therefore calls `execute_seek()` instead of performing it itself.
1332 > This is not a complete solution: the scheduled seek will still be executed, even though a more-recent seek has already been executed that should have cancelled it. However, it is a good tradeoff between functionality and simplicity.
1334 ### Network resilience
1336 [](tutorials/basic/streaming.md) has already
1337 shown how to adapt to the variable nature of the network bandwidth by
1338 using buffering. The same procedure is used here, by listening to the
1343 g_signal_connect (G_OBJECT (bus), "message::buffering", (GCallback)buffering_cb, data);
1346 And pausing the pipeline until buffering is complete (unless this is a
1351 static void buffering_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
1357 gst_message_parse_buffering (msg, &percent);
1358 if (percent < 100 && data->target_state >= GST_STATE_PAUSED) {
1359 gchar * message_string = g_strdup_printf ("Buffering %d%%", percent);
1360 gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
1361 set_ui_message (message_string, data);
1362 g_free (message_string);
1363 } else if (data->target_state >= GST_STATE_PLAYING) {
1364 gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
1365 } else if (data->target_state >= GST_STATE_PAUSED) {
1366 set_ui_message ("Buffering complete", data);
1371 `target_state` is the state in which we have been instructed to set the
1372 pipeline, which might be different to the current state, because
1373 buffering forces us to go to PAUSED. Once buffering is complete we set
1374 the pipeline to the `target_state`.
1376 ## A basic media player \[Android.mk\]
1378 The only line worth mentioning in the makefile
1379 is `GSTREAMER_PLUGINS`:
1384 GSTREAMER_PLUGINS := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_PLAYBACK) $(GSTREAMER_PLUGINS_CODECS) $(GSTREAMER_PLUGINS_NET) $(GSTREAMER_PLUGINS_SYS)
1387 In which all plugins required for playback are loaded, because it is not
1388 known at build time what would be needed for an unspecified URI (again,
1389 in this tutorial the URI does not change, but it will in the next one).
1393 This tutorial has shown how to embed a `playbin` pipeline into an
1394 Android application. This, effectively, turns such application into a
1395 basic media player, capable of streaming and decoding all the formats
1396 GStreamer understands. More particularly, it has shown:
1398 - How to keep the User Interface regularly updated by using a timer,
1399 querying the pipeline position and calling a UI code method.
1400 - How to implement a Seek Bar which follows the current position and
1401 transforms thumb motion into reliable seek events.
1402 - How to report the media size to adapt the display surface, by
1403 reading the sink Caps at the appropriate moment and telling the UI
1406 The next tutorial adds the missing bits to turn the application built
1407 here into an acceptable Android media player.
1409 As usual, it has been a pleasure having you here, and see you soon!
1411 [screenshot]: images/tutorials/android-media-player-screenshot.png
1412 [information]: images/icons/emoticons/information.png