# Android tutorial 3: Video
Except for [](tutorial-basic-toolkit-integration.md),
which embedded a video window on a GTK application, all tutorials so far
relied on GStreamer video sinks to create a window to display their
contents. The video sink on Android is not capable of creating its own
window, so a drawing surface always needs to be provided. This tutorial
shows:

- How to allocate a drawing surface on the Android layout and pass it
  to GStreamer
- How to keep GStreamer posted on changes to the surface
Since Android does not provide a windowing system, a GStreamer video
sink cannot create pop-up windows as it would on a desktop platform.
Fortunately, the `VideoOverlay` interface allows providing video sinks with
an already created window onto which they can draw, as we have seen in
[](tutorial-basic-toolkit-integration.md).
In this tutorial, a
[SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html)
widget (actually, a subclass of it) is placed on the main layout. When
Android informs the application that a surface has been created for this
widget, we pass it to the C code, which stores it. The
`check_initialization_complete()` method explained in the previous
tutorial is extended so that GStreamer is not considered initialized
until a main loop is running and a drawing surface has been received.
### A video surface on Android \[Java code\]
**src/org/freedesktop/gstreamer/tutorials/tutorial\_3/Tutorial3.java**
```java
package org.freedesktop.gstreamer.tutorials.tutorial_3;

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.ImageButton;
import android.widget.TextView;
import android.widget.Toast;

import org.freedesktop.gstreamer.GStreamer;

public class Tutorial3 extends Activity implements SurfaceHolder.Callback {
    private native void nativeInit();     // Initialize native code, build pipeline, etc
    private native void nativeFinalize(); // Destroy pipeline and shutdown native code
    private native void nativePlay();     // Set pipeline to PLAYING
    private native void nativePause();    // Set pipeline to PAUSED
    private static native boolean nativeClassInit(); // Initialize native class: cache Method IDs for callbacks
    private native void nativeSurfaceInit(Object surface);
    private native void nativeSurfaceFinalize();
    private long native_custom_data;      // Native code will use this to keep private data

    private boolean is_playing_desired;   // Whether the user asked to go to PLAYING

    // Called when the activity is first created.
    @Override
    public void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);

        // Initialize GStreamer and warn if it fails
        try {
            GStreamer.init(this);
        } catch (Exception e) {
            Toast.makeText(this, e.getMessage(), Toast.LENGTH_LONG).show();
            finish();
            return;
        }

        setContentView(R.layout.main);

        ImageButton play = (ImageButton) this.findViewById(R.id.button_play);
        play.setOnClickListener(new OnClickListener() {
            public void onClick(View v) {
                is_playing_desired = true;
                nativePlay();
            }
        });

        ImageButton pause = (ImageButton) this.findViewById(R.id.button_stop);
        pause.setOnClickListener(new OnClickListener() {
            public void onClick(View v) {
                is_playing_desired = false;
                nativePause();
            }
        });

        SurfaceView sv = (SurfaceView) this.findViewById(R.id.surface_video);
        SurfaceHolder sh = sv.getHolder();
        sh.addCallback(this);

        if (savedInstanceState != null) {
            is_playing_desired = savedInstanceState.getBoolean("playing");
            Log.i ("GStreamer", "Activity created. Saved state is playing:" + is_playing_desired);
        } else {
            is_playing_desired = false;
            Log.i ("GStreamer", "Activity created. There is no saved state, playing: false");
        }

        // Start with disabled buttons, until native code is initialized
        this.findViewById(R.id.button_play).setEnabled(false);
        this.findViewById(R.id.button_stop).setEnabled(false);

        nativeInit();
    }

    protected void onSaveInstanceState (Bundle outState) {
        Log.d ("GStreamer", "Saving state, playing:" + is_playing_desired);
        outState.putBoolean("playing", is_playing_desired);
    }

    protected void onDestroy() {
        nativeFinalize();
        super.onDestroy();
    }

    // Called from native code. This sets the content of the TextView from the UI thread.
    private void setMessage(final String message) {
        final TextView tv = (TextView) this.findViewById(R.id.textview_message);
        runOnUiThread (new Runnable() {
            public void run() {
                tv.setText(message);
            }
        });
    }

    // Called from native code. Native code calls this once it has created its pipeline and
    // the main loop is running, so it is ready to accept commands.
    private void onGStreamerInitialized () {
        Log.i ("GStreamer", "Gst initialized. Restoring state, playing:" + is_playing_desired);
        // Restore previous playing state
        if (is_playing_desired) {
            nativePlay();
        } else {
            nativePause();
        }

        // Re-enable buttons, now that GStreamer is initialized
        final Activity activity = this;
        runOnUiThread(new Runnable() {
            public void run() {
                activity.findViewById(R.id.button_play).setEnabled(true);
                activity.findViewById(R.id.button_stop).setEnabled(true);
            }
        });
    }

    static {
        System.loadLibrary("gstreamer_android");
        System.loadLibrary("tutorial-3");
        nativeClassInit();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
        Log.d("GStreamer", "Surface changed to format " + format + " width "
                + width + " height " + height);
        nativeSurfaceInit (holder.getSurface());
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        Log.d("GStreamer", "Surface created: " + holder.getSurface());
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d("GStreamer", "Surface destroyed");
        nativeSurfaceFinalize ();
    }
}
```
This tutorial continues where the previous one left off, adding a video
surface to the layout and changing the GStreamer pipeline to produce
video instead of audio. Only the parts of the code that are new will be
discussed.
```java
private native void nativeSurfaceInit(Object surface);
private native void nativeSurfaceFinalize();
```
Two new entry points to the C code are defined,
`nativeSurfaceInit()` and `nativeSurfaceFinalize()`, which we will call
when the video surface becomes available and when it is about to be
destroyed, respectively.
```java
SurfaceView sv = (SurfaceView) this.findViewById(R.id.surface_video);
SurfaceHolder sh = sv.getHolder();
sh.addCallback(this);
```
In `onCreate()`, we retrieve the
[SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html),
and then register ourselves to receive notifications about the surface
through the
[SurfaceHolder](http://developer.android.com/reference/android/view/SurfaceHolder.html)
interface. This is why we declared this Activity as implementing the
[SurfaceHolder.Callback](http://developer.android.com/reference/android/view/SurfaceHolder.Callback.html)
interface in the class declaration above.
```java
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width,
        int height) {
    Log.d("GStreamer", "Surface changed to format " + format + " width "
            + width + " height " + height);
    nativeSurfaceInit (holder.getSurface());
}

@Override
public void surfaceCreated(SurfaceHolder holder) {
    Log.d("GStreamer", "Surface created: " + holder.getSurface());
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    Log.d("GStreamer", "Surface destroyed");
    nativeSurfaceFinalize ();
}
```
This interface is composed of the three methods above, which get called
when the geometry of the surface changes, when the surface is created
and when it is about to be destroyed. `surfaceChanged()` always gets
called at least once, right after `surfaceCreated()`, so we will use it
to notify GStreamer about the new surface. We use
`surfaceDestroyed()` to tell GStreamer to stop using this surface.
Let’s review the C code to see what these functions do.
### A video surface on Android \[C code\]
**jni/tutorial-3.c**

```c
#include <string.h>
#include <jni.h>
#include <android/log.h>
#include <android/native_window.h>
#include <android/native_window_jni.h>
#include <gst/gst.h>
#include <gst/video/video.h>
#include <pthread.h>

GST_DEBUG_CATEGORY_STATIC (debug_category);
#define GST_CAT_DEFAULT debug_category

/*
 * These macros provide a way to store the native pointer to CustomData, which might be 32 or 64 bits, into
 * a jlong, which is always 64 bits, without warnings.
 */
#if GLIB_SIZEOF_VOID_P == 8
# define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(*env)->GetLongField (env, thiz, fieldID)
# define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)data)
#else
# define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(jint)(*env)->GetLongField (env, thiz, fieldID)
# define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)(jint)data)
#endif

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  jobject app;                  /* Application instance, used to call its methods. A global reference is kept. */
  GstElement *pipeline;         /* The running pipeline */
  GMainContext *context;        /* GLib context used to run the main loop */
  GMainLoop *main_loop;         /* GLib main loop */
  gboolean initialized;         /* To avoid informing the UI multiple times about the initialization */
  GstElement *video_sink;       /* The video sink element which receives VideoOverlay commands */
  ANativeWindow *native_window; /* The Android native window where video will be rendered */
} CustomData;

/* These global variables cache values which are not changing during execution */
static pthread_t gst_app_thread;
static pthread_key_t current_jni_env;
static JavaVM *java_vm;
static jfieldID custom_data_field_id;
static jmethodID set_message_method_id;
static jmethodID on_gstreamer_initialized_method_id;

/*
 * Private methods
 */

/* Register this thread with the VM */
static JNIEnv *attach_current_thread (void) {
  JNIEnv *env;
  JavaVMAttachArgs args;

  GST_DEBUG ("Attaching thread %p", g_thread_self ());
  args.version = JNI_VERSION_1_4;
  args.name = NULL;
  args.group = NULL;

  if ((*java_vm)->AttachCurrentThread (java_vm, &env, &args) < 0) {
    GST_ERROR ("Failed to attach current thread");
    return NULL;
  }

  return env;
}

/* Unregister this thread from the VM */
static void detach_current_thread (void *env) {
  GST_DEBUG ("Detaching thread %p", g_thread_self ());
  (*java_vm)->DetachCurrentThread (java_vm);
}

/* Retrieve the JNI environment for this thread */
static JNIEnv *get_jni_env (void) {
  JNIEnv *env;

  if ((env = pthread_getspecific (current_jni_env)) == NULL) {
    env = attach_current_thread ();
    pthread_setspecific (current_jni_env, env);
  }

  return env;
}

/* Change the content of the UI's TextView */
static void set_ui_message (const gchar *message, CustomData *data) {
  JNIEnv *env = get_jni_env ();
  GST_DEBUG ("Setting message to: %s", message);
  jstring jmessage = (*env)->NewStringUTF(env, message);
  (*env)->CallVoidMethod (env, data->app, set_message_method_id, jmessage);
  if ((*env)->ExceptionCheck (env)) {
    GST_ERROR ("Failed to call Java method");
    (*env)->ExceptionClear (env);
  }
  (*env)->DeleteLocalRef (env, jmessage);
}

/* Retrieve errors from the bus and show them on the UI */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  GError *err;
  gchar *debug_info;
  gchar *message_string;

  gst_message_parse_error (msg, &err, &debug_info);
  message_string = g_strdup_printf ("Error received from element %s: %s", GST_OBJECT_NAME (msg->src), err->message);
  g_clear_error (&err);
  g_free (debug_info);
  set_ui_message (message_string, data);
  g_free (message_string);
  gst_element_set_state (data->pipeline, GST_STATE_NULL);
}

/* Notify UI about pipeline state changes */
static void state_changed_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  GstState old_state, new_state, pending_state;
  gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
  /* Only pay attention to messages coming from the pipeline, not its children */
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->pipeline)) {
    gchar *message = g_strdup_printf("State changed to %s", gst_element_state_get_name(new_state));
    set_ui_message(message, data);
    g_free (message);
  }
}

/* Check if all conditions are met to report GStreamer as initialized.
 * These conditions will change depending on the application */
static void check_initialization_complete (CustomData *data) {
  JNIEnv *env = get_jni_env ();
  if (!data->initialized && data->native_window && data->main_loop) {
    GST_DEBUG ("Initialization complete, notifying application. native_window:%p main_loop:%p", data->native_window, data->main_loop);

    /* The main loop is running and we received a native window, inform the sink about it */
    gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->video_sink), (guintptr)data->native_window);

    (*env)->CallVoidMethod (env, data->app, on_gstreamer_initialized_method_id);
    if ((*env)->ExceptionCheck (env)) {
      GST_ERROR ("Failed to call Java method");
      (*env)->ExceptionClear (env);
    }
    data->initialized = TRUE;
  }
}

/* Main method for the native code. This is executed on its own thread. */
static void *app_function (void *userdata) {
  JavaVMAttachArgs args;
  GstBus *bus;
  CustomData *data = (CustomData *)userdata;
  GSource *bus_source;
  GError *error = NULL;

  GST_DEBUG ("Creating pipeline in CustomData at %p", data);

  /* Create our own GLib Main Context and make it the default one */
  data->context = g_main_context_new ();
  g_main_context_push_thread_default(data->context);

  /* Build pipeline */
  data->pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
  if (error) {
    gchar *message = g_strdup_printf("Unable to build pipeline: %s", error->message);
    g_clear_error (&error);
    set_ui_message(message, data);
    g_free (message);
    return NULL;
  }

  /* Set the pipeline to READY, so it can already accept a window handle, if we have one */
  gst_element_set_state(data->pipeline, GST_STATE_READY);

  data->video_sink = gst_bin_get_by_interface(GST_BIN(data->pipeline), GST_TYPE_VIDEO_OVERLAY);
  if (!data->video_sink) {
    GST_ERROR ("Could not retrieve video sink");
    return NULL;
  }

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data->pipeline);
  bus_source = gst_bus_create_watch (bus);
  g_source_set_callback (bus_source, (GSourceFunc) gst_bus_async_signal_func, NULL, NULL);
  g_source_attach (bus_source, data->context);
  g_source_unref (bus_source);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, data);
  g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, data);
  gst_object_unref (bus);

  /* Create a GLib Main Loop and set it to run */
  GST_DEBUG ("Entering main loop... (CustomData:%p)", data);
  data->main_loop = g_main_loop_new (data->context, FALSE);
  check_initialization_complete (data);
  g_main_loop_run (data->main_loop);
  GST_DEBUG ("Exited main loop");
  g_main_loop_unref (data->main_loop);
  data->main_loop = NULL;

  /* Free resources */
  g_main_context_pop_thread_default(data->context);
  g_main_context_unref (data->context);
  gst_element_set_state (data->pipeline, GST_STATE_NULL);
  gst_object_unref (data->video_sink);
  gst_object_unref (data->pipeline);

  return NULL;
}

/*
 * Java Bindings
 */

/* Instruct the native code to create its internal data structure, pipeline and thread */
static void gst_native_init (JNIEnv* env, jobject thiz) {
  CustomData *data = g_new0 (CustomData, 1);
  SET_CUSTOM_DATA (env, thiz, custom_data_field_id, data);
  GST_DEBUG_CATEGORY_INIT (debug_category, "tutorial-3", 0, "Android tutorial 3");
  gst_debug_set_threshold_for_name("tutorial-3", GST_LEVEL_DEBUG);
  GST_DEBUG ("Created CustomData at %p", data);
  data->app = (*env)->NewGlobalRef (env, thiz);
  GST_DEBUG ("Created GlobalRef for app object at %p", data->app);
  pthread_create (&gst_app_thread, NULL, &app_function, data);
}

/* Quit the main loop, remove the native thread and free resources */
static void gst_native_finalize (JNIEnv* env, jobject thiz) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  GST_DEBUG ("Quitting main loop...");
  g_main_loop_quit (data->main_loop);
  GST_DEBUG ("Waiting for thread to finish...");
  pthread_join (gst_app_thread, NULL);
  GST_DEBUG ("Deleting GlobalRef for app object at %p", data->app);
  (*env)->DeleteGlobalRef (env, data->app);
  GST_DEBUG ("Freeing CustomData at %p", data);
  g_free (data);
  SET_CUSTOM_DATA (env, thiz, custom_data_field_id, NULL);
  GST_DEBUG ("Done finalizing");
}

/* Set pipeline to PLAYING state */
static void gst_native_play (JNIEnv* env, jobject thiz) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  GST_DEBUG ("Setting state to PLAYING");
  gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
}

/* Set pipeline to PAUSED state */
static void gst_native_pause (JNIEnv* env, jobject thiz) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  GST_DEBUG ("Setting state to PAUSED");
  gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
}

/* Static class initializer: retrieve method and field IDs */
static jboolean gst_native_class_init (JNIEnv* env, jclass klass) {
  custom_data_field_id = (*env)->GetFieldID (env, klass, "native_custom_data", "J");
  set_message_method_id = (*env)->GetMethodID (env, klass, "setMessage", "(Ljava/lang/String;)V");
  on_gstreamer_initialized_method_id = (*env)->GetMethodID (env, klass, "onGStreamerInitialized", "()V");

  if (!custom_data_field_id || !set_message_method_id || !on_gstreamer_initialized_method_id) {
    /* We emit this message through the Android log instead of the GStreamer log because the latter
     * has not been initialized yet.
     */
    __android_log_print (ANDROID_LOG_ERROR, "tutorial-3", "The calling class does not implement all necessary interface methods");
    return JNI_FALSE;
  }
  return JNI_TRUE;
}

static void gst_native_surface_init (JNIEnv *env, jobject thiz, jobject surface) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  ANativeWindow *new_native_window = ANativeWindow_fromSurface(env, surface);
  GST_DEBUG ("Received surface %p (native window %p)", surface, new_native_window);

  if (data->native_window) {
    ANativeWindow_release (data->native_window);
    if (data->native_window == new_native_window) {
      GST_DEBUG ("New native window is the same as the previous one %p", data->native_window);
      if (data->video_sink) {
        gst_video_overlay_expose(GST_VIDEO_OVERLAY (data->video_sink));
        gst_video_overlay_expose(GST_VIDEO_OVERLAY (data->video_sink));
      }
      return;
    } else {
      GST_DEBUG ("Released previous native window %p", data->native_window);
      data->initialized = FALSE;
    }
  }
  data->native_window = new_native_window;

  check_initialization_complete (data);
}

static void gst_native_surface_finalize (JNIEnv *env, jobject thiz) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  GST_DEBUG ("Releasing Native Window %p", data->native_window);

  if (data->video_sink) {
    gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->video_sink), (guintptr)NULL);
    gst_element_set_state (data->pipeline, GST_STATE_READY);
  }

  ANativeWindow_release (data->native_window);
  data->native_window = NULL;
  data->initialized = FALSE;
}

/* List of implemented native methods */
static JNINativeMethod native_methods[] = {
  { "nativeInit", "()V", (void *) gst_native_init},
  { "nativeFinalize", "()V", (void *) gst_native_finalize},
  { "nativePlay", "()V", (void *) gst_native_play},
  { "nativePause", "()V", (void *) gst_native_pause},
  { "nativeSurfaceInit", "(Ljava/lang/Object;)V", (void *) gst_native_surface_init},
  { "nativeSurfaceFinalize", "()V", (void *) gst_native_surface_finalize},
  { "nativeClassInit", "()Z", (void *) gst_native_class_init}
};

/* Library initializer */
jint JNI_OnLoad(JavaVM *vm, void *reserved) {
  JNIEnv *env = NULL;

  java_vm = vm;

  if ((*vm)->GetEnv(vm, (void**) &env, JNI_VERSION_1_4) != JNI_OK) {
    __android_log_print (ANDROID_LOG_ERROR, "tutorial-3", "Could not retrieve JNIEnv");
    return 0;
  }
  jclass klass = (*env)->FindClass (env, "org/freedesktop/gstreamer/tutorials/tutorial_3/Tutorial3");
  (*env)->RegisterNatives (env, klass, native_methods, G_N_ELEMENTS(native_methods));

  pthread_key_create (&current_jni_env, detach_current_thread);

  return JNI_VERSION_1_4;
}
```
First, our `CustomData` structure is augmented to keep a pointer to the
video sink element and the native window:
```c
GstElement *video_sink; /* The video sink element which receives VideoOverlay commands */
ANativeWindow *native_window; /* The Android native window where video will be rendered */
```
The `check_initialization_complete()` method is also augmented so that
it requires a native window before considering GStreamer to be
initialized:
```c
static void check_initialization_complete (CustomData *data) {
  JNIEnv *env = get_jni_env ();
  if (!data->initialized && data->native_window && data->main_loop) {
    GST_DEBUG ("Initialization complete, notifying application. native_window:%p main_loop:%p", data->native_window, data->main_loop);

    /* The main loop is running and we received a native window, inform the sink about it */
    gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->video_sink), (guintptr)data->native_window);

    (*env)->CallVoidMethod (env, data->app, on_gstreamer_initialized_method_id);
    if ((*env)->ExceptionCheck (env)) {
      GST_ERROR ("Failed to call Java method");
      (*env)->ExceptionClear (env);
    }
    data->initialized = TRUE;
  }
}
```
Also, once the pipeline has been built and a native window has been
received, we inform the video sink of the window handle to use via the
`gst_video_overlay_set_window_handle()` method.
The GStreamer pipeline for this tutorial involves a `videotestsrc`, a
`warptv` psychedelic distorter effect (check out other cool video
effects in the `GSTREAMER_PLUGINS_EFFECTS` package), and an
`autovideosink` which will instantiate the adequate video sink for the
platform:
```c
data->pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
```
Here things start to get more interesting:
```c
/* Set the pipeline to READY, so it can already accept a window handle, if we have one */
gst_element_set_state(data->pipeline, GST_STATE_READY);

data->video_sink = gst_bin_get_by_interface(GST_BIN(data->pipeline), GST_TYPE_VIDEO_OVERLAY);
if (!data->video_sink) {
  GST_ERROR ("Could not retrieve video sink");
  return NULL;
}
```
We start by setting the pipeline to the READY state. No data flow occurs
yet, but the `autovideosink` will instantiate the actual sink so we can
ask for it immediately.
The `gst_bin_get_by_interface()` method will examine the whole pipeline
and return a pointer to an element which supports the requested
interface. We are asking for the `VideoOverlay` interface, explained in
[](tutorial-basic-toolkit-integration.md),
which controls how to perform rendering into foreign (non-GStreamer)
windows. The internal video sink instantiated by `autovideosink` is the
only element in this pipeline implementing it, so it will be returned.
Now we will implement the two native functions called by the Java code
when the drawing surface becomes available or is about to be
destroyed:
```c
static void gst_native_surface_init (JNIEnv *env, jobject thiz, jobject surface) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  ANativeWindow *new_native_window = ANativeWindow_fromSurface(env, surface);
  GST_DEBUG ("Received surface %p (native window %p)", surface, new_native_window);

  if (data->native_window) {
    ANativeWindow_release (data->native_window);
    if (data->native_window == new_native_window) {
      GST_DEBUG ("New native window is the same as the previous one %p", data->native_window);
      if (data->video_sink) {
        gst_video_overlay_expose(GST_VIDEO_OVERLAY (data->video_sink));
        gst_video_overlay_expose(GST_VIDEO_OVERLAY (data->video_sink));
      }
      return;
    } else {
      GST_DEBUG ("Released previous native window %p", data->native_window);
      data->initialized = FALSE;
    }
  }
  data->native_window = new_native_window;

  check_initialization_complete (data);
}
```
This method is responsible for providing the video sink with the window
handle coming from the Java code. We are passed a
[Surface](http://developer.android.com/reference/android/view/Surface.html)
object, and we use `ANativeWindow_fromSurface()` to obtain the
underlying native window pointer. There is no official online
documentation for the NDK, but fortunately the header files are well
commented. Native window management functions can be found in
`$(ANDROID_NDK_ROOT)/platforms/android-9/arch-arm/usr/include/android/native_window.h`
and `native_window_jni.h`.
If we had already stored a native window, the one we just received can
either be a new one, or just an update of the one we have. If the
pointers are the same, we assume the geometry of the surface has
changed, and simply instruct the video sink to redraw itself, via the
`gst_video_overlay_expose()` method. The video sink will recover the new
size from the surface itself, so we do not need to bother about it
here. We need to call `gst_video_overlay_expose()` twice because of the way
surface changes propagate down the OpenGL ES / EGL pipeline (the
only video sink available for Android in GStreamer uses OpenGL
ES): by the time we call the first expose, the surface that the sink
will pick up still contains the old size.
On the other hand, if the pointers are different, we mark GStreamer as
not being initialized. Next time we call
`check_initialization_complete()`, the video sink will be informed of
the new window handle.
We finally store the new window handle and call
`check_initialization_complete()` to inform the Java code that
everything is set up, if that is the case.
```c
static void gst_native_surface_finalize (JNIEnv *env, jobject thiz) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  GST_DEBUG ("Releasing Native Window %p", data->native_window);

  if (data->video_sink) {
    gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->video_sink), (guintptr)NULL);
    gst_element_set_state (data->pipeline, GST_STATE_READY);
  }

  ANativeWindow_release (data->native_window);
  data->native_window = NULL;
  data->initialized = FALSE;
}
```
The complementary function, `gst_native_surface_finalize()`, is called
when a surface is about to be destroyed and should not be used anymore.
Here, we simply instruct the video sink to stop using the window handle
and set the pipeline to READY so no rendering occurs. We release the
window pointer we had stored with `ANativeWindow_release()`, and mark
GStreamer as not being initialized anymore.
And this is all there is to it, regarding the main code. Only a couple
of details remain: the subclass we made for SurfaceView and the
`Android.mk` file.
### GStreamerSurfaceView, a convenient SurfaceView wrapper \[Java code\]
By default, a
[SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html) does
not have any particular size, so it expands to use all the space the
layout can give it. While this might be convenient sometimes, it does
not allow a great deal of control. In particular, when the surface does
not have the same aspect ratio as the media, the sink will add black
borders (the well-known “letterbox” or “pillarbox” effect), which is
unnecessary work (and a waste of battery).
For this reason, the
[SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html) presented
here is subclassed, overriding its
[onMeasure()](http://developer.android.com/reference/android/view/SurfaceView.html#onMeasure\(int,%20int\)) method
to report the actual media size, so the surface can adapt to any layout
while preserving the media aspect ratio.
Since in this tutorial the media size is known beforehand, it is
hardcoded in the GStreamerSurfaceView class for simplicity. The next
tutorial shows how it can be recovered at runtime and passed onto the
surface.
**src/org/freedesktop/gstreamer/tutorials/tutorial\_3/GStreamerSurfaceView.java**
```java
package org.freedesktop.gstreamer.tutorials.tutorial_3;

import android.content.Context;
import android.util.AttributeSet;
import android.util.Log;
import android.view.SurfaceView;
import android.view.View;

// A simple SurfaceView whose width and height can be set from the outside
public class GStreamerSurfaceView extends SurfaceView {
    public int media_width = 320;
    public int media_height = 240;

    // Mandatory constructors, they do not do much
    public GStreamerSurfaceView(Context context, AttributeSet attrs,
            int defStyle) {
        super(context, attrs, defStyle);
    }

    public GStreamerSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    public GStreamerSurfaceView (Context context) {
        super(context);
    }

    // Called by the layout manager to find out our size and give us some rules.
    // We will try to maximize our size, and preserve the media's aspect ratio if
    // we are given the freedom to do so.
    @Override
    protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
        int width = 0, height = 0;
        int wmode = View.MeasureSpec.getMode(widthMeasureSpec);
        int hmode = View.MeasureSpec.getMode(heightMeasureSpec);
        int wsize = View.MeasureSpec.getSize(widthMeasureSpec);
        int hsize = View.MeasureSpec.getSize(heightMeasureSpec);

        Log.i ("GStreamer", "onMeasure called with " + media_width + "x" + media_height);
        // Obey width rules
        switch (wmode) {
        case View.MeasureSpec.AT_MOST:
            if (hmode == View.MeasureSpec.EXACTLY) {
                width = Math.min(hsize * media_width / media_height, wsize);
                break;
            }
        case View.MeasureSpec.EXACTLY:
            width = wsize;
            break;
        case View.MeasureSpec.UNSPECIFIED:
            width = media_width;
        }

        // Obey height rules
        switch (hmode) {
        case View.MeasureSpec.AT_MOST:
            if (wmode == View.MeasureSpec.EXACTLY) {
                height = Math.min(wsize * media_height / media_width, hsize);
                break;
            }
        case View.MeasureSpec.EXACTLY:
            height = hsize;
            break;
        case View.MeasureSpec.UNSPECIFIED:
            height = media_height;
        }

        // Finally, calculate best size when both axis are free
        if (hmode == View.MeasureSpec.AT_MOST && wmode == View.MeasureSpec.AT_MOST) {
            int correct_height = width * media_height / media_width;
            int correct_width = height * media_width / media_height;

            if (correct_height < height)
                height = correct_height;
            else
                width = correct_width;
        }

        // Obey minimum size
        width = Math.max (getSuggestedMinimumWidth(), width);
        height = Math.max (getSuggestedMinimumHeight(), height);
        setMeasuredDimension(width, height);
    }
}
```
### A video surface on Android \[Android.mk\]
```make
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE    := tutorial-3
LOCAL_SRC_FILES := tutorial-3.c
LOCAL_SHARED_LIBRARIES := gstreamer_android
LOCAL_LDLIBS := -llog -landroid
include $(BUILD_SHARED_LIBRARY)

ifndef GSTREAMER_ROOT
ifndef GSTREAMER_ROOT_ANDROID
$(error GSTREAMER_ROOT_ANDROID is not defined!)
endif
GSTREAMER_ROOT        := $(GSTREAMER_ROOT_ANDROID)
endif
GSTREAMER_NDK_BUILD_PATH  := $(GSTREAMER_ROOT)/share/gst-android/ndk-build/
include $(GSTREAMER_NDK_BUILD_PATH)/plugins.mk
GSTREAMER_PLUGINS         := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_SYS) $(GSTREAMER_PLUGINS_EFFECTS)
GSTREAMER_EXTRA_DEPS      := gstreamer-video-1.0
include $(GSTREAMER_NDK_BUILD_PATH)/gstreamer.mk
```
Worth mentioning are the `-landroid` library, needed to allow
interaction with native windows, and the different plugin
packages: `GSTREAMER_PLUGINS_SYS` for the system-dependent video sink
and `GSTREAMER_PLUGINS_EFFECTS` for the `warptv` element. This tutorial
requires the `gstreamer-video` library to use the
`VideoOverlay` interface and the video helper methods.
This tutorial has shown:

- How to display video on Android using a
  [SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html) and
  the `VideoOverlay` interface.
- How to be aware of changes in the surface’s size using
  [SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html)’s
  callbacks.
- How to report the media size to the Android layout engine.
The following tutorial plays an actual clip and adds a few more controls
to this tutorial in order to build a simple media player.
It has been a pleasure having you here, and see you soon!