# iOS tutorial 3: Video

## Goal

![screenshot]
Except for [](sdk-basic-tutorial-toolkit-integration.md),
which embedded a video window on a GTK application, all tutorials so far
relied on GStreamer video sinks to create a window to display their
contents. The video sink on iOS is not capable of creating its own
window, so a drawing surface always needs to be provided. This tutorial
shows:

- How to allocate a drawing surface in the Xcode Interface Builder and
  use it from GStreamer
## Introduction

Since iOS does not provide a windowing system, a GStreamer video sink
cannot create pop-up windows as it would do on a Desktop platform.
Fortunately, the `VideoOverlay` interface allows providing video sinks with
an already created window onto which they can draw, as we have seen
in [](sdk-basic-tutorial-toolkit-integration.md).
In this tutorial, a `UIView` widget (actually, a subclass of it) is
placed on the main storyboard. In the `viewDidLoad` method of the
`ViewController`, we pass a pointer to this `UIView` to the instance of
the `GStreamerBackend`, so it can tell the video sink where to draw.
## The User Interface

The storyboard from the previous tutorial is expanded: a `UIView` is
added over the toolbar and pinned to all sides so it takes up all
available space (`video_container_view` outlet). Inside it, another
`UIView` is added (`video_view` outlet) which contains the actual video,
centered in its parent, with a size that adapts to the media size
(through the `video_width_constraint` and `video_height_constraint`
outlets):
**ViewController.h**

```objc
#import <UIKit/UIKit.h>
#import "GStreamerBackendDelegate.h"

@interface ViewController : UIViewController <GStreamerBackendDelegate> {
    IBOutlet UILabel *message_label;
    IBOutlet UIBarButtonItem *play_button;
    IBOutlet UIBarButtonItem *pause_button;
    IBOutlet UIView *video_view;
    IBOutlet UIView *video_container_view;
    IBOutlet NSLayoutConstraint *video_width_constraint;
    IBOutlet NSLayoutConstraint *video_height_constraint;
}

-(IBAction) play:(id)sender;
-(IBAction) pause:(id)sender;

/* From GStreamerBackendDelegate */
-(void) gstreamerInitialized;
-(void) gstreamerSetUIMessage:(NSString *)message;

@end
```
## The View Controller
The `ViewController` class manages the UI, instantiates
the `GStreamerBackend` and also performs some UI-related tasks on its
behalf:
**ViewController.m**

```objc
#import "ViewController.h"
#import "GStreamerBackend.h"
#import <UIKit/UIKit.h>

@interface ViewController () {
    GStreamerBackend *gst_backend;
    int media_width;
    int media_height;
}
@end

@implementation ViewController

/* Methods from UIViewController */

- (void)viewDidLoad
{
    [super viewDidLoad];

    play_button.enabled = FALSE;
    pause_button.enabled = FALSE;

    /* Make these constant for now, later tutorials will change them */
    media_width = 320;
    media_height = 240;

    gst_backend = [[GStreamerBackend alloc] init:self videoView:video_view];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

/* Called when the Play button is pressed */
-(IBAction) play:(id)sender
{
    [gst_backend play];
}

/* Called when the Pause button is pressed */
-(IBAction) pause:(id)sender
{
    [gst_backend pause];
}

- (void)viewDidLayoutSubviews
{
    CGFloat view_width = video_container_view.bounds.size.width;
    CGFloat view_height = video_container_view.bounds.size.height;

    CGFloat correct_height = view_width * media_height / media_width;
    CGFloat correct_width = view_height * media_width / media_height;

    if (correct_height < view_height) {
        video_height_constraint.constant = correct_height;
        video_width_constraint.constant = view_width;
    } else {
        video_width_constraint.constant = correct_width;
        video_height_constraint.constant = view_height;
    }
}

/* Methods from GstreamerBackendDelegate */

-(void) gstreamerInitialized
{
    dispatch_async(dispatch_get_main_queue(), ^{
        play_button.enabled = TRUE;
        pause_button.enabled = TRUE;
        message_label.text = @"Ready";
    });
}

-(void) gstreamerSetUIMessage:(NSString *)message
{
    dispatch_async(dispatch_get_main_queue(), ^{
        message_label.text = message;
    });
}

@end
```
We expand the class to remember the width and height of the media we are
currently playing:

```objc
@interface ViewController () {
    GStreamerBackend *gst_backend;
    int media_width;
    int media_height;
}
@end
```
In later tutorials this data is retrieved from the GStreamer pipeline,
but in this tutorial, for simplicity’s sake, the width and height of the
media are constant and initialized in `viewDidLoad`:
```objc
- (void)viewDidLoad
{
    [super viewDidLoad];

    play_button.enabled = FALSE;
    pause_button.enabled = FALSE;

    /* Make these constant for now, later tutorials will change them */
    media_width = 320;
    media_height = 240;

    gst_backend = [[GStreamerBackend alloc] init:self videoView:video_view];
}
```
As shown in the next section, the `GStreamerBackend` constructor has also been
expanded to accept another parameter: the `UIView *` where the video
sink should draw.
The rest of the `ViewController` code is the same as in the previous
tutorial, except for the code that adapts the `video_view` size to the
media size, respecting its aspect ratio:
```objc
- (void)viewDidLayoutSubviews
{
    CGFloat view_width = video_container_view.bounds.size.width;
    CGFloat view_height = video_container_view.bounds.size.height;

    CGFloat correct_height = view_width * media_height / media_width;
    CGFloat correct_width = view_height * media_width / media_height;

    if (correct_height < view_height) {
        video_height_constraint.constant = correct_height;
        video_width_constraint.constant = view_width;
    } else {
        video_width_constraint.constant = correct_width;
        video_height_constraint.constant = view_height;
    }
}
```
The `viewDidLayoutSubviews` method is called every time the main view
size has changed (for example, due to a device orientation change) and
the entire layout has been recalculated. At this point, we can access
the `bounds` property of the `video_container_view` to retrieve its new
size and change the `video_view` size accordingly.
The simple algorithm above maximizes either the width or the height of
the `video_view`, while adjusting the other axis so the aspect ratio of
the media is preserved. The goal is to provide the GStreamer video sink
with a surface of the correct proportions, so it does not need to add
black borders (*letterboxing*), which is a waste of processing power.
The final size is reported to the layout engine by changing the
`constant` field of the width and height constraints of the
`video_view`. These constraints have been created in the storyboard and
are accessible to the `ViewController` through IBOutlets, as is usually
done with other widgets.
## The GStreamer Backend
The `GStreamerBackend` class performs all GStreamer-related tasks and
offers a simplified interface to the application, which does not need to
deal with all the GStreamer details. When it needs to perform any UI
action, it does so through a delegate, which is expected to adhere to
the `GStreamerBackendDelegate` protocol:
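The protocol itself was introduced in the previous tutorial; as a reminder, it declares the two UI callbacks used below (a sketch; the comments are ours):

```objc
#import <Foundation/Foundation.h>

@protocol GStreamerBackendDelegate <NSObject>

@optional
/* Called when the GStreamer backend has finished initializing
 * and is ready to accept commands */
-(void) gstreamerInitialized;

/* Called when the GStreamer backend wants to output a message
 * to the UI */
-(void) gstreamerSetUIMessage:(NSString *)message;

@end
```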
**GStreamerBackend.m**
```objc
#import "GStreamerBackend.h"

#include <gst/gst.h>
#include <gst/video/video.h>

GST_DEBUG_CATEGORY_STATIC (debug_category);
#define GST_CAT_DEFAULT debug_category

@interface GStreamerBackend()
-(void)setUIMessage:(gchar*) message;
-(void)app_function;
-(void)check_initialization_complete;
@end

@implementation GStreamerBackend {
    id ui_delegate;        /* Class that we use to interact with the user interface */
    GstElement *pipeline;  /* The running pipeline */
    GstElement *video_sink;/* The video sink element which receives VideoOverlay commands */
    GMainContext *context; /* GLib context used to run the main loop */
    GMainLoop *main_loop;  /* GLib main loop */
    gboolean initialized;  /* To avoid informing the UI multiple times about the initialization */
    UIView *ui_video_view; /* UIView that holds the video */
}

/*
 * Interface methods
 */

-(id) init:(id) uiDelegate videoView:(UIView *)video_view
{
    if (self = [super init])
    {
        self->ui_delegate = uiDelegate;
        self->ui_video_view = video_view;

        GST_DEBUG_CATEGORY_INIT (debug_category, "tutorial-3", 0, "iOS tutorial 3");
        gst_debug_set_threshold_for_name("tutorial-3", GST_LEVEL_DEBUG);

        /* Start the bus monitoring task */
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self app_function];
        });
    }

    return self;
}

-(void) dealloc
{
    if (pipeline) {
        GST_DEBUG("Setting the pipeline to NULL");
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        pipeline = NULL;
    }
}

-(void) play
{
    if(gst_element_set_state(pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
        [self setUIMessage:"Failed to set pipeline to playing"];
    }
}

-(void) pause
{
    if(gst_element_set_state(pipeline, GST_STATE_PAUSED) == GST_STATE_CHANGE_FAILURE) {
        [self setUIMessage:"Failed to set pipeline to paused"];
    }
}

/*
 * Private methods
 */

/* Change the message on the UI through the UI delegate */
-(void)setUIMessage:(gchar*) message
{
    NSString *string = [NSString stringWithUTF8String:message];
    if(ui_delegate && [ui_delegate respondsToSelector:@selector(gstreamerSetUIMessage:)])
    {
        [ui_delegate gstreamerSetUIMessage:string];
    }
}

/* Retrieve errors from the bus and show them on the UI */
static void error_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
{
    GError *err;
    gchar *debug_info;
    gchar *message_string;

    gst_message_parse_error (msg, &err, &debug_info);
    message_string = g_strdup_printf ("Error received from element %s: %s", GST_OBJECT_NAME (msg->src), err->message);
    g_clear_error (&err);
    g_free (debug_info);
    [self setUIMessage:message_string];
    g_free (message_string);
    gst_element_set_state (self->pipeline, GST_STATE_NULL);
}

/* Notify UI about pipeline state changes */
static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
{
    GstState old_state, new_state, pending_state;
    gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
    /* Only pay attention to messages coming from the pipeline, not its children */
    if (GST_MESSAGE_SRC (msg) == GST_OBJECT (self->pipeline)) {
        gchar *message = g_strdup_printf("State changed to %s", gst_element_state_get_name(new_state));
        [self setUIMessage:message];
        g_free (message);
    }
}

/* Check if all conditions are met to report GStreamer as initialized.
 * These conditions will change depending on the application */
-(void) check_initialization_complete
{
    if (!initialized && main_loop) {
        GST_DEBUG ("Initialization complete, notifying application.");
        if (ui_delegate && [ui_delegate respondsToSelector:@selector(gstreamerInitialized)])
        {
            [ui_delegate gstreamerInitialized];
        }
        initialized = TRUE;
    }
}

/* Main method for the bus monitoring code */
-(void) app_function
{
    GstBus *bus;
    GSource *bus_source;
    GError *error = NULL;

    GST_DEBUG ("Creating pipeline");

    /* Create our own GLib Main Context and make it the default one */
    context = g_main_context_new ();
    g_main_context_push_thread_default(context);

    /* Build pipeline */
    pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
    if (error) {
        gchar *message = g_strdup_printf("Unable to build pipeline: %s", error->message);
        g_clear_error (&error);
        [self setUIMessage:message];
        g_free (message);
        return;
    }

    /* Set the pipeline to READY, so it can already accept a window handle */
    gst_element_set_state(pipeline, GST_STATE_READY);

    video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
    if (!video_sink) {
        GST_ERROR ("Could not retrieve video sink");
        return;
    }
    gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink), (guintptr) (id) ui_video_view);

    /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
    bus = gst_element_get_bus (pipeline);
    bus_source = gst_bus_create_watch (bus);
    g_source_set_callback (bus_source, (GSourceFunc) gst_bus_async_signal_func, NULL, NULL);
    g_source_attach (bus_source, context);
    g_source_unref (bus_source);
    g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, (__bridge void *)self);
    g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, (__bridge void *)self);
    gst_object_unref (bus);

    /* Create a GLib Main Loop and set it to run */
    GST_DEBUG ("Entering main loop...");
    main_loop = g_main_loop_new (context, FALSE);
    [self check_initialization_complete];
    g_main_loop_run (main_loop);
    GST_DEBUG ("Exited main loop");
    g_main_loop_unref (main_loop);
    main_loop = NULL;

    /* Free resources */
    g_main_context_pop_thread_default(context);
    g_main_context_unref (context);
    gst_element_set_state (pipeline, GST_STATE_NULL);
    gst_object_unref (pipeline);

    return;
}

@end
```
The main differences from the previous tutorial are related to the
handling of the `VideoOverlay` interface:
```objc
@implementation GStreamerBackend {
    id ui_delegate;        /* Class that we use to interact with the user interface */
    GstElement *pipeline;  /* The running pipeline */
    GstElement *video_sink;/* The video sink element which receives VideoOverlay commands */
    GMainContext *context; /* GLib context used to run the main loop */
    GMainLoop *main_loop;  /* GLib main loop */
    gboolean initialized;  /* To avoid informing the UI multiple times about the initialization */
    UIView *ui_video_view; /* UIView that holds the video */
}
```
The class is expanded to keep track of the video sink element in the
pipeline and the `UIView *` onto which rendering is to occur.
```objc
-(id) init:(id) uiDelegate videoView:(UIView *)video_view
{
    if (self = [super init])
    {
        self->ui_delegate = uiDelegate;
        self->ui_video_view = video_view;

        GST_DEBUG_CATEGORY_INIT (debug_category, "tutorial-3", 0, "iOS tutorial 3");
        gst_debug_set_threshold_for_name("tutorial-3", GST_LEVEL_DEBUG);

        /* Start the bus monitoring task */
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self app_function];
        });
    }

    return self;
}
```
The constructor accepts the `UIView *` as a new parameter, which, at
this point, is simply remembered in `ui_video_view`.
```objc
pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
```
Then, in the `app_function`, the pipeline is constructed. This time we
build a video pipeline using a simple `videotestsrc` element with a
`warptv` to add some spice. The video sink is `autovideosink`, which
chooses the appropriate sink for the platform (currently,
`glimagesink` is the only option for iOS).
```objc
/* Set the pipeline to READY, so it can already accept a window handle */
gst_element_set_state(pipeline, GST_STATE_READY);

video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
if (!video_sink) {
    GST_ERROR ("Could not retrieve video sink");
    return;
}
gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink), (guintptr) (id) ui_video_view);
```
Once the pipeline is built, we set it to READY. In this state, dataflow
has not started yet, but the caps of adjacent elements have been
verified to be compatible and their pads have been linked. Also, the
`autovideosink` has already instantiated the actual video sink, so we can
ask for it immediately.
The `gst_bin_get_by_interface()` method examines the whole pipeline
and returns a pointer to an element which supports the requested
interface. We are asking for the `VideoOverlay` interface, explained in
[](sdk-basic-tutorial-toolkit-integration.md),
which controls how to perform rendering into foreign (non-GStreamer)
windows. The internal video sink instantiated by `autovideosink` is the
only element in this pipeline implementing it, so it is the one returned.
Once we have the video sink, we inform it of the `UIView` to use for
rendering through the `gst_video_overlay_set_window_handle()` method.
One last detail remains. In order for `glimagesink` to be able to draw on the
[`UIView`](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIView_Class/UIView/UIView.html),
the
[`Layer`](http://developer.apple.com/library/ios/#documentation/GraphicsImaging/Reference/CALayer_class/Introduction/Introduction.html#//apple_ref/occ/cl/CALayer) associated
with this view must be of the
[`CAEAGLLayer`](http://developer.apple.com/library/ios/#documentation/QuartzCore/Reference/CAEAGLLayer_Class/CAEGLLayer/CAEGLLayer.html#//apple_ref/occ/cl/CAEAGLLayer) class.
To this end, we create the `EaglUIView` class, derived from
`UIView` and overriding the `layerClass` method:
**EaglUIView.m**

```objc
#import "EaglUIView.h"

#import <QuartzCore/QuartzCore.h>

@implementation EaglUIView

+ (Class) layerClass
{
    return [CAEAGLLayer class];
}

@end
```
When creating storyboards, bear in mind that the `UIView` which is to
contain the video must have `EaglUIView` as its custom class. This is
easy to set up in the Xcode Interface Builder; take a look at the
tutorial storyboard to see how it is done.
And this is it: using GStreamer to output video in an iOS application
is as simple as that.
## Conclusion

This tutorial has shown:
- How to display video on iOS using a `UIView` and
  the `VideoOverlay` interface.
- How to report the media size to the iOS layout engine through
  runtime manipulation of width and height constraints.
The following tutorial plays an actual clip and adds a few more controls
to this tutorial in order to build a simple media player.

It has been a pleasure having you here, and see you soon!
[screenshot]: images/sdk-ios-tutorial-video-screenshot.png