# Basic tutorial 3: Dynamic pipelines

## Goal

This tutorial shows the rest of the basic concepts required to use
GStreamer, which allow building the pipeline "on the fly", as
information becomes available, instead of having a monolithic pipeline
defined at the beginning of your application.

After this tutorial, you will have the necessary knowledge to start the
[Playback tutorials](tutorials/playback/index.md). The points reviewed
here will be:

  - How to attain finer control when linking elements.

  - How to be notified of interesting events so you can react in time.

  - The various states in which an element can be.

## Introduction

As you are about to see, the pipeline in this tutorial is not
completely built before it is set to the playing state. This is OK. If
we did not take further action, data would reach the end of the
pipeline and the pipeline would produce an error message and stop. But
we are going to take further action...

In this example we are opening a file which is multiplexed (or *muxed*),
that is, audio and video are stored together inside a *container* file.
The elements responsible for opening such containers are called
*demuxers*, and some examples of container formats are Matroska (MKV),
QuickTime (QT, MOV), Ogg, or Advanced Systems Format (ASF, WMV, WMA).

If a container embeds multiple streams (one video and two audio tracks,
for example), the demuxer will separate them and expose them through
different output ports. In this way, different branches can be created
in the pipeline, dealing with different types of data.

The ports through which GStreamer elements communicate with each other
are called pads (`GstPad`). There exist sink pads, through which data
enters an element, and source pads, through which data exits an element.
It follows naturally that source elements only contain source pads, sink
elements only contain sink pads, and filter elements contain both.

![](images/src-element.png) ![](images/filter-element.png) ![](images/sink-element.png)

**Figure 1**. GStreamer elements with their pads.

A demuxer contains one sink pad, through which the muxed data arrives,
and multiple source pads, one for each stream found in the container:

![](images/filter-element-multi.png)

**Figure 2**. A demuxer with two source pads.

For completeness, here you have a simplified pipeline containing a
demuxer and two branches, one for audio and one for video. This is
**NOT** the pipeline that will be built in this example:

![](images/simple-player.png)

**Figure 3**. Example pipeline with two branches.

The main complexity when dealing with demuxers is that they cannot
produce any information until they have received some data and have had
a chance to look at the container to see what is inside. That is,
demuxers start with no source pads to which other elements can link, and
thus the pipeline must necessarily terminate at them.

The solution is to build the pipeline from the source down to the
demuxer, and set it to run (play). When the demuxer has received enough
information to know about the number and kind of streams in the
container, it will start creating source pads. This is the right time
for us to finish building the pipeline and attach it to the newly added
demuxer source pads.

For simplicity, in this example, we will only link to the audio pad and
ignore the video.

## Dynamic Hello World

Copy this code into a text file named `basic-tutorial-3.c` (or find it
in the SDK installation).

**basic-tutorial-3.c**

```c
#include <gst/gst.h>

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline;
  GstElement *source;
  GstElement *convert;
  GstElement *sink;
} CustomData;

/* Handler for the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *pad, CustomData *data);

int main(int argc, char *argv[]) {
  CustomData data;
  GstBus *bus;
  GstMessage *msg;
  GstStateChangeReturn ret;
  gboolean terminate = FALSE;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  data.source = gst_element_factory_make ("uridecodebin", "source");
  data.convert = gst_element_factory_make ("audioconvert", "convert");
  data.sink = gst_element_factory_make ("autoaudiosink", "sink");

  /* Create the empty pipeline */
  data.pipeline = gst_pipeline_new ("test-pipeline");

  if (!data.pipeline || !data.source || !data.convert || !data.sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Build the pipeline. Note that we are NOT linking the source at this
   * point. We will do it later. */
  gst_bin_add_many (GST_BIN (data.pipeline), data.source, data.convert, data.sink, NULL);
  if (!gst_element_link (data.convert, data.sink)) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.source, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);

  /* Connect to the pad-added signal */
  g_signal_connect (data.source, "pad-added", G_CALLBACK (pad_added_handler), &data);

  /* Start playing */
  ret = gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }

  /* Listen to the bus */
  bus = gst_element_get_bus (data.pipeline);
  do {
    msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
        GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

    /* Parse message */
    if (msg != NULL) {
      GError *err;
      gchar *debug_info;

      switch (GST_MESSAGE_TYPE (msg)) {
        case GST_MESSAGE_ERROR:
          gst_message_parse_error (msg, &err, &debug_info);
          g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
          g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
          g_clear_error (&err);
          g_free (debug_info);
          terminate = TRUE;
          break;
        case GST_MESSAGE_EOS:
          g_print ("End-Of-Stream reached.\n");
          terminate = TRUE;
          break;
        case GST_MESSAGE_STATE_CHANGED:
          /* We are only interested in state-changed messages from the pipeline */
          if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data.pipeline)) {
            GstState old_state, new_state, pending_state;
            gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
            g_print ("Pipeline state changed from %s to %s:\n",
                gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
          }
          break;
        default:
          /* We should not reach here */
          g_printerr ("Unexpected message received.\n");
          break;
      }
      gst_message_unref (msg);
    }
  } while (!terminate);

  /* Free resources */
  gst_object_unref (bus);
  gst_element_set_state (data.pipeline, GST_STATE_NULL);
  gst_object_unref (data.pipeline);
  return 0;
}

/* This function will be called by the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *data) {
  GstPad *sink_pad = gst_element_get_static_pad (data->convert, "sink");
  GstPadLinkReturn ret;
  GstCaps *new_pad_caps = NULL;
  GstStructure *new_pad_struct = NULL;
  const gchar *new_pad_type = NULL;

  g_print ("Received new pad '%s' from '%s':\n", GST_PAD_NAME (new_pad), GST_ELEMENT_NAME (src));

  /* If our converter is already linked, we have nothing to do here */
  if (gst_pad_is_linked (sink_pad)) {
    g_print ("  We are already linked. Ignoring.\n");
    goto exit;
  }

  /* Check the new pad's type */
  new_pad_caps = gst_pad_query_caps (new_pad, NULL);
  new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
  new_pad_type = gst_structure_get_name (new_pad_struct);
  if (!g_str_has_prefix (new_pad_type, "audio/x-raw")) {
    g_print ("  It has type '%s' which is not raw audio. Ignoring.\n", new_pad_type);
    goto exit;
  }

  /* Attempt the link */
  ret = gst_pad_link (new_pad, sink_pad);
  if (GST_PAD_LINK_FAILED (ret)) {
    g_print ("  Type is '%s' but link failed.\n", new_pad_type);
  } else {
    g_print ("  Link succeeded (type '%s').\n", new_pad_type);
  }

exit:
  /* Unreference the new pad's caps, if we got them */
  if (new_pad_caps != NULL)
    gst_caps_unref (new_pad_caps);

  /* Unreference the sink pad */
  gst_object_unref (sink_pad);
}
```

> ![Information](images/icons/emoticons/information.png)
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Build), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](installing/on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
>
> ``gcc basic-tutorial-3.c -o basic-tutorial-3 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](installing/on-linux.md#InstallingonLinux-Run), [Mac OS X](installing/on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](installing/on-windows.md#InstallingonWindows-Run).
>
> This tutorial only plays audio. The media is fetched from the Internet, so it might take a few seconds to start, depending on your connection speed.
>
> Required libraries: `gstreamer-1.0`

## Walkthrough

```c
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline;
  GstElement *source;
  GstElement *convert;
  GstElement *sink;
} CustomData;
```

So far we have kept all the information we needed (pointers
to `GstElement`s, basically) as local variables. Since this tutorial
(and most real applications) involves callbacks, we will group all our
data in a structure for easier handling.

```c
/* Handler for the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *pad, CustomData *data);
```

This is a forward reference, to be used later.

```c
/* Create the elements */
data.source = gst_element_factory_make ("uridecodebin", "source");
data.convert = gst_element_factory_make ("audioconvert", "convert");
data.sink = gst_element_factory_make ("autoaudiosink", "sink");
```

We create the elements as usual. `uridecodebin` will internally
instantiate all the necessary elements (sources, demuxers and decoders)
to turn a URI into raw audio and/or video streams. It does half the work
that `playbin` does. Since it contains demuxers, its source pads are
not initially available and we will need to link to them on the fly.

`audioconvert` is useful for converting between different audio formats,
making sure that this example will work on any platform, since the
format produced by the audio decoder might not be the same as the one
the audio sink expects.

The `autoaudiosink` is the equivalent of `autovideosink` seen in the
previous tutorial, for audio. It will render the audio stream to the
audio card.

```c
if (!gst_element_link (data.convert, data.sink)) {
  g_printerr ("Elements could not be linked.\n");
  gst_object_unref (data.pipeline);
  return -1;
}
```

Here we link the converter element to the sink, but we **DO NOT** link
them with the source, since at this point it contains no source pads. We
just leave this branch (converter + sink) unlinked, until later on.

```c
/* Set the URI to play */
g_object_set (data.source, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);
```

We set the URI of the file to play via a property, just like we did in
the previous tutorial.

```c
/* Connect to the pad-added signal */
g_signal_connect (data.source, "pad-added", G_CALLBACK (pad_added_handler), &data);
```

`GSignals` are a crucial point in GStreamer. They allow you to be
notified (by means of a callback) when something interesting has
happened. Signals are identified by a name, and each `GObject` has its
own signals.

In this line, we are *attaching* to the “pad-added” signal of our source
(an `uridecodebin` element). To do so, we use `g_signal_connect()` and
provide the callback function to be used (`pad_added_handler`) and a
data pointer. GStreamer does nothing with this data pointer, it just
forwards it to the callback so we can share information with it. In this
case, we pass a pointer to the `CustomData` structure we built specially
for this purpose.

The signals that a `GstElement` generates can be found in its
documentation or using the `gst-inspect-1.0` tool as described in [Basic
tutorial 10: GStreamer tools](tutorials/basic/gstreamer-tools.md).

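If you prefer to discover signals from code rather than from the documentation, GObject can enumerate them at runtime. Here is a minimal sketch (not part of the tutorial code) that walks the type hierarchy of an element and prints every signal it supports; the hierarchy walk is needed because signals such as “pad-added” are registered on ancestor types like `GstElement`:

```c
#include <gst/gst.h>

int main (int argc, char *argv[]) {
  GstElement *element;
  GType type;

  gst_init (&argc, &argv);

  element = gst_element_factory_make ("uridecodebin", NULL);
  if (!element) {
    g_printerr ("Could not create the element.\n");
    return -1;
  }

  /* Signals can be registered on any ancestor type (e.g. "pad-added" lives
   * on GstElement), so walk the whole type hierarchy */
  for (type = G_OBJECT_TYPE (element); type != 0; type = g_type_parent (type)) {
    guint n_ids, i;
    guint *ids = g_signal_list_ids (type, &n_ids);
    for (i = 0; i < n_ids; i++)
      g_print ("%s::%s\n", g_type_name (type), g_signal_name (ids[i]));
    g_free (ids);
  }

  gst_object_unref (element);
  return 0;
}
```
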
We are now ready to go! Just set the pipeline to the PLAYING state and
start listening to the bus for interesting messages (like ERROR or EOS),
just like in the previous tutorials.

When our source element finally has enough information to start
producing data, it will create source pads, and trigger the “pad-added”
signal. At this point our callback will be called:

```c
static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *data) {
```

`src` is the `GstElement` which triggered the signal. In this example,
it can only be the `uridecodebin`, since it is the only element whose
signals we are listening to. The first parameter of a signal handler is
always the object that has triggered it.

`new_pad` is the `GstPad` that has just been added to the `src` element.
This is usually the pad to which we want to link.

`data` is the pointer we provided when attaching to the signal. In this
example, we use it to pass the `CustomData` pointer.

```c
GstPad *sink_pad = gst_element_get_static_pad (data->convert, "sink");
```

From `CustomData` we extract the converter element, and then retrieve
its sink pad using `gst_element_get_static_pad ()`. This is the pad to
which we want to link `new_pad`. In the previous tutorial we linked
element against element, and let GStreamer choose the appropriate pads.
Now we are going to link the pads directly.

```c
/* If our converter is already linked, we have nothing to do here */
if (gst_pad_is_linked (sink_pad)) {
  g_print ("  We are already linked. Ignoring.\n");
  goto exit;
}
```

`uridecodebin` can create as many pads as it sees fit, and for each one,
this callback will be called. These lines of code will prevent us from
trying to link to a new pad once we are already linked.

```c
/* Check the new pad's type */
new_pad_caps = gst_pad_query_caps (new_pad, NULL);
new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
new_pad_type = gst_structure_get_name (new_pad_struct);
if (!g_str_has_prefix (new_pad_type, "audio/x-raw")) {
  g_print ("  It has type '%s' which is not raw audio. Ignoring.\n", new_pad_type);
  goto exit;
}
```

Now we will check the type of data this new pad is going to output,
because we are only interested in pads producing audio. We have
previously created a piece of pipeline which deals with audio (an
`audioconvert` linked with an `autoaudiosink`), and we will not be able
to link it to a pad producing video, for example.

`gst_pad_query_caps()` retrieves the *capabilities* of the pad (that is,
the kind of data it supports), wrapped in a `GstCaps` structure. A pad
can offer many capabilities, and hence a `GstCaps` can contain many
`GstStructure`s, each representing a different capability.

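If you want to see every capability a pad offers, not just the first one, you can iterate over all the `GstStructure`s in the `GstCaps`. A small sketch (the `print_pad_capabilities()` helper name is ours, not part of the tutorial):

```c
/* Print every capability (GstStructure) that a pad offers */
static void print_pad_capabilities (GstPad *pad) {
  GstCaps *caps = gst_pad_query_caps (pad, NULL);
  guint i;

  for (i = 0; i < gst_caps_get_size (caps); i++) {
    GstStructure *structure = gst_caps_get_structure (caps, i);
    gchar *description = gst_structure_to_string (structure);
    g_print ("  capability %u: %s\n", i, description);
    g_free (description);
  }

  gst_caps_unref (caps);
}
```
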
Since, in this case, we know that the pad we want only has one
capability (audio), we retrieve the first `GstStructure` with
`gst_caps_get_structure()`.

Finally, with `gst_structure_get_name()` we recover the name of the
structure, which contains the main description of the format (its *media
type*, actually).

If the name is not `audio/x-raw`, this is not a decoded
audio pad, and we are not interested in it.

Otherwise, attempt the link:

```c
/* Attempt the link */
ret = gst_pad_link (new_pad, sink_pad);
if (GST_PAD_LINK_FAILED (ret)) {
  g_print ("  Type is '%s' but link failed.\n", new_pad_type);
} else {
  g_print ("  Link succeeded (type '%s').\n", new_pad_type);
}
```

`gst_pad_link()` tries to link two pads. As it was the case
with `gst_element_link()`, the link must be specified from source to
sink, and both pads must be owned by elements residing in the same bin
(or pipeline).

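As a side note, when you know the names of the pads involved, there is a convenience function that performs the lookup and the link in one call. A sketch of what the link above could look like with it (reusing the variables of `pad_added_handler`):

```c
/* Link by pad name: gst_element_link_pads() retrieves both pads and links
 * them, replacing the explicit gst_element_get_static_pad() + gst_pad_link() */
if (!gst_element_link_pads (src, GST_PAD_NAME (new_pad), data->convert, "sink")) {
  g_printerr ("Could not link the new pad to the converter.\n");
}
```
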
And we are done! When a pad of the right kind appears, it will be
linked to the rest of the audio-processing pipeline and execution will
continue until ERROR or EOS. However, we will squeeze a bit more content
from this tutorial by also introducing the concept of State.

## GStreamer States

We already talked a bit about states when we said that playback does not
start until you bring the pipeline to the PLAYING state. We will
introduce here the rest of states and their meaning. There are 4 states
in GStreamer:

| State     | Description |
|-----------|-------------|
| `NULL`    | the NULL state or initial state of an element. |
| `READY`   | the element is ready to go to PAUSED. |
| `PAUSED`  | the element is PAUSED, it is ready to accept and process data. Sink elements however only accept one buffer and then block. |
| `PLAYING` | the element is PLAYING, the clock is running and the data is flowing. |

You can only move between adjacent ones, that is, you can't go from NULL
to PLAYING, you have to go through the intermediate READY and PAUSED
states. If you set the pipeline to PLAYING, though, GStreamer will make
the intermediate transitions for you.

```c
case GST_MESSAGE_STATE_CHANGED:
  /* We are only interested in state-changed messages from the pipeline */
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data.pipeline)) {
    GstState old_state, new_state, pending_state;
    gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
    g_print ("Pipeline state changed from %s to %s:\n",
        gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
  }
  break;
```

We added this piece of code that listens to bus messages regarding state
changes and prints them on screen to help you understand the
transitions. Every element puts messages on the bus regarding its
current state, so we filter them out and only listen to messages coming
from the pipeline.

Most applications only need to worry about going to PLAYING to start
playback, then to PAUSED to perform a pause, and then back to NULL at
program exit to free all resources.

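To make the transitions visible, here is a sketch (not part of the tutorial code; `pipeline` stands for `data.pipeline`) that climbs the state ladder explicitly and waits for each asynchronous change to complete:

```c
/* Walk the state ladder by hand instead of jumping straight to PLAYING */
GstState states[] = { GST_STATE_READY, GST_STATE_PAUSED, GST_STATE_PLAYING };
guint i;

for (i = 0; i < G_N_ELEMENTS (states); i++) {
  GstState current;

  gst_element_set_state (pipeline, states[i]);

  /* State changes can be asynchronous: gst_element_get_state() blocks
   * until the transition finishes (or the timeout expires) */
  if (gst_element_get_state (pipeline, &current, NULL, GST_CLOCK_TIME_NONE)
      == GST_STATE_CHANGE_SUCCESS)
    g_print ("Now in state %s.\n", gst_element_state_get_name (current));
}
```
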
## Exercise

Dynamic pad linking has traditionally been a difficult topic for a lot
of programmers. Prove that you have achieved its mastery by
instantiating an `autovideosink` (probably with a `videoconvert` in
front) and linking it to the demuxer when the right pad appears. Hint:
You are already printing on screen the type of the video pads.

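If you get stuck, here is one possible sketch of the extra branch inside `pad_added_handler` (it assumes you extended `CustomData` with hypothetical `video_convert` and `video_sink` members, created them in `main` from `videoconvert` and `autovideosink`, added them to the pipeline and linked them together, just as done for the audio branch):

```c
/* Handle raw video pads too: link them to the (hypothetical) video branch */
if (g_str_has_prefix (new_pad_type, "video/x-raw")) {
  GstPad *video_sink_pad = gst_element_get_static_pad (data->video_convert, "sink");

  if (!gst_pad_is_linked (video_sink_pad)) {
    if (GST_PAD_LINK_FAILED (gst_pad_link (new_pad, video_sink_pad)))
      g_print ("  Type is '%s' but link failed.\n", new_pad_type);
    else
      g_print ("  Link succeeded (type '%s').\n", new_pad_type);
  }

  gst_object_unref (video_sink_pad);
}
```
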
You should now see (and hear) the same movie as in [Basic tutorial 1:
Hello world!](tutorials/basic/hello-world.md). In
that tutorial you used `playbin`, which is a handy element that
automatically takes care of all the demuxing and pad linking for you.
Most of the [Playback tutorials](tutorials/playback/index.md) are devoted
to it.

## Conclusion

In this tutorial, you learned:

  - How to be notified of events using `GSignals`
  - How to connect `GstPad`s directly instead of their parent elements.
  - The various states of a GStreamer element.

You also combined these items to build a dynamic pipeline, which was not
defined at program start, but was created as information regarding the
media became available.

You can now continue with the basic tutorials and learn about performing
seeks and time-related queries in [Basic tutorial 4: Time
management](tutorials/basic/time-management.md) or move
to the [Playback tutorials](tutorials/playback/index.md), and gain more
insight about the `playbin` element.

Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.

It has been a pleasure having you here, and see you soon!