---
title: Pipeline manipulation
---

# Pipeline manipulation

This chapter will discuss how you can manipulate your pipeline in
several ways from your application. Parts of this chapter are very
low-level, so be assured that you'll need some programming knowledge and
a good understanding of GStreamer before you start reading this.

Topics that will be discussed here include how you can insert data into
a pipeline from your application, how to read data from a pipeline, how
to manipulate the pipeline's speed, length and starting point, and how
to listen to a pipeline's data processing.
## Using probes

Probing is best envisioned as a pad listener. Technically, a probe is
nothing more than a callback that can be attached to a pad. You can
attach a probe using `gst_pad_add_probe ()`. Similarly, one can use
`gst_pad_remove_probe ()` to remove the callback again. The probe
notifies you of any activity that happens on the pad, like buffers,
events and queries. You can define what kind of notifications you are
interested in when you add the probe.
The probe can notify you of the following activity on pads:
- A buffer is pushed or pulled. You want to specify the
  `GST_PAD_PROBE_TYPE_BUFFER` flag when registering the probe. Because
  the pad can be scheduled in different ways, it is possible to also
  specify in what scheduling mode you are interested with the optional
  `GST_PAD_PROBE_TYPE_PUSH` and `GST_PAD_PROBE_TYPE_PULL` flags.

  You can use this probe to inspect, modify or drop the buffer. See
  [Data probes](#data-probes).
- A bufferlist is pushed. Use the `GST_PAD_PROBE_TYPE_BUFFER_LIST` flag
  when registering the probe.
- An event travels over a pad. Use the
  `GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM` and
  `GST_PAD_PROBE_TYPE_EVENT_UPSTREAM` flags to select downstream and
  upstream events. There is also the convenience
  `GST_PAD_PROBE_TYPE_EVENT_BOTH` to be notified of events going both
  upstream and downstream. By default, flush events do not cause a
  notification. You need to explicitly enable
  `GST_PAD_PROBE_TYPE_EVENT_FLUSH` to receive callbacks from flushing
  events. Events are always only notified in push mode.

  You can use this probe to inspect, modify or drop the event.
- A query travels over a pad. Use the
  `GST_PAD_PROBE_TYPE_QUERY_DOWNSTREAM` and
  `GST_PAD_PROBE_TYPE_QUERY_UPSTREAM` flags to select downstream and
  upstream queries. The convenience `GST_PAD_PROBE_TYPE_QUERY_BOTH` can
  also be used to select both directions. Query probes will be notified
  twice: once when the query travels upstream/downstream and once when
  the query result is returned. You can select in what stage the
  callback will be called with the `GST_PAD_PROBE_TYPE_PUSH` and
  `GST_PAD_PROBE_TYPE_PULL` flags, respectively when the query is
  performed and when the query result is returned.

  You can use this probe to inspect or modify the query. You can also
  answer the query in the probe callback by placing the result value in
  the query and returning `GST_PAD_PROBE_DROP` from the probe callback.
- In addition to notifying you of dataflow, you can also ask the probe
  to block the dataflow when the callback returns. This is called a
  blocking probe and is activated by specifying the
  `GST_PAD_PROBE_TYPE_BLOCK` flag. You can use this flag with the other
  flags to only block dataflow on selected activity. A pad becomes
  unblocked again if you remove the probe or when you return
  `GST_PAD_PROBE_REMOVE` from the callback. You can let only the
  currently blocked item pass by returning `GST_PAD_PROBE_PASS` from
  the callback; it will block again on the next item.

  Blocking probes are used to temporarily block pads because they are
  unlinked or because you are going to unlink them. If the dataflow is
  not blocked, the pipeline would go into an error state if data is
  pushed on an unlinked pad. We will see how to use blocking probes to
  partially preroll a pipeline. See also [Play a region of a media
  file](#play-a-region-of-a-media-file).
- Be notified when no activity is happening on a pad. You install this
  probe with the `GST_PAD_PROBE_TYPE_IDLE` flag. You can specify
  `GST_PAD_PROBE_TYPE_PUSH` and/or `GST_PAD_PROBE_TYPE_PULL` to only be
  notified depending on the pad scheduling mode. The IDLE probe is also
  a blocking probe in that it will not let any data pass on the pad for
  as long as the IDLE probe is installed.

  You can use idle probes to dynamically relink a pad. We will see how
  to use idle probes to replace an element in the pipeline. See also
  [Dynamically changing the
  pipeline](#dynamically-changing-the-pipeline).
### Data probes

Data probes allow you to be notified when there is data passing on a
pad. When adding the probe, specify the `GST_PAD_PROBE_TYPE_BUFFER`
and/or `GST_PAD_PROBE_TYPE_BUFFER_LIST` flag.
Data probes run in the pipeline's streaming thread context, so callbacks
should try not to block and generally should not do anything unusual,
since this could have a negative impact on pipeline performance or, in
case of bugs, cause deadlocks or crashes. More precisely, one should
usually not call any GUI-related functions from within a probe callback,
nor try to change the state of the pipeline. An application may post
custom messages on the pipeline's bus though to communicate with the
main application thread and have it do things like stop the pipeline.
In any case, most common buffer operations that elements can do in
`_chain ()` functions can be done in probe callbacks as well. The
example below gives a short impression on how to use them.
```c
#include <gst/gst.h>

static GstPadProbeReturn
cb_have_data (GstPad          *pad,
              GstPadProbeInfo *info,
              gpointer         user_data)
{
  gint x, y;
  GstMapInfo map;
  guint16 *ptr, t;
  GstBuffer *buffer;

  buffer = GST_PAD_PROBE_INFO_BUFFER (info);

  buffer = gst_buffer_make_writable (buffer);

  /* Making a buffer writable can fail (for example if it
   * cannot be copied and is used more than once)
   */
  if (buffer == NULL)
    return GST_PAD_PROBE_OK;

  /* Mapping a buffer can fail (non-writable) */
  if (gst_buffer_map (buffer, &map, GST_MAP_WRITE)) {
    ptr = (guint16 *) map.data;
    /* invert data */
    for (y = 0; y < 288; y++) {
      for (x = 0; x < 384 / 2; x++) {
        t = ptr[384 - 1 - x];
        ptr[384 - 1 - x] = ptr[x];
        ptr[x] = t;
      }
      ptr += 384;
    }
    gst_buffer_unmap (buffer, &map);
  }

  GST_PAD_PROBE_INFO_DATA (info) = buffer;

  return GST_PAD_PROBE_OK;
}

gint
main (gint   argc,
      gchar *argv[])
{
  GMainLoop *loop;
  GstElement *pipeline, *src, *sink, *filter, *csp;
  GstCaps *filtercaps;
  GstPad *pad;

  /* init GStreamer */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* build */
  pipeline = gst_pipeline_new ("my-pipeline");
  src = gst_element_factory_make ("videotestsrc", "src");
  if (src == NULL)
    g_error ("Could not create 'videotestsrc' element");

  filter = gst_element_factory_make ("capsfilter", "filter");
  g_assert (filter != NULL); /* should always exist */

  csp = gst_element_factory_make ("videoconvert", "csp");
  if (csp == NULL)
    g_error ("Could not create 'videoconvert' element");

  sink = gst_element_factory_make ("xvimagesink", "sink");
  if (sink == NULL) {
    sink = gst_element_factory_make ("ximagesink", "sink");
    if (sink == NULL)
      g_error ("Could not create neither 'xvimagesink' nor 'ximagesink' element");
  }

  gst_bin_add_many (GST_BIN (pipeline), src, filter, csp, sink, NULL);
  gst_element_link_many (src, filter, csp, sink, NULL);
  filtercaps = gst_caps_new_simple ("video/x-raw",
      "format", G_TYPE_STRING, "RGB16",
      "width", G_TYPE_INT, 384,
      "height", G_TYPE_INT, 288,
      "framerate", GST_TYPE_FRACTION, 25, 1,
      NULL);
  g_object_set (G_OBJECT (filter), "caps", filtercaps, NULL);
  gst_caps_unref (filtercaps);

  pad = gst_element_get_static_pad (src, "src");
  gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER,
      (GstPadProbeCallback) cb_have_data, NULL, NULL);
  gst_object_unref (pad);

  /* run */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* wait until it's up and running or failed */
  if (gst_element_get_state (pipeline, NULL, NULL, -1) == GST_STATE_CHANGE_FAILURE) {
    g_error ("Failed to go into PLAYING state");
  }

  g_print ("Running ...\n");
  g_main_loop_run (loop);

  /* exit */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);

  return 0;
}
```
Compare that output with the output of `gst-launch-1.0 videotestsrc !
xvimagesink`, just so you know what you're looking for.
Strictly speaking, a pad probe callback is only allowed to modify the
buffer content if the buffer is writable. Whether this is the case or
not depends a lot on the pipeline and the elements involved. Often
enough, this is the case, but sometimes it is not, and if it is not then
unexpected modification of the data or metadata can introduce bugs that
are very hard to debug and track down. You can check if a buffer is
writable with `gst_buffer_is_writable ()`. Since you can pass back a
different buffer than the one passed in, it is a good idea to make the
buffer writable in the callback function with
`gst_buffer_make_writable ()`.
Pad probes are best suited for looking at data as it passes through the
pipeline. If you need to modify data, you should write your own
GStreamer element. Base classes like `GstAudioFilter`, `GstVideoFilter`
or `GstBaseTransform` make this fairly easy.
If you just want to inspect buffers as they pass through the pipeline,
you don't even need to set up pad probes. You could also just insert an
identity element into the pipeline and connect to its "handoff" signal.
The identity element also provides a few useful debugging tools like the
"dump" property or the "last-message" property (the latter is enabled by
passing the '-v' switch to gst-launch and by setting the "silent"
property on the identity to FALSE).
### Play a region of a media file

In this example we will show you how to play back a region of a media
file. The goal is to only play the part of a file from 2 seconds to 5
seconds and then EOS.
In a first step we will set a uridecodebin element to the PAUSED state
and make sure that we block all the source pads that are created. When
all the source pads are blocked, we have data on all source pads and we
say that the uridecodebin is prerolled.

In a prerolled pipeline we can ask for the duration of the media and we
can also perform seeks. We are interested in performing a seek operation
on the pipeline to select the range of media that we are interested in.

After we configure the region we are interested in, we can link the sink
element, unblock the source pads and set the pipeline to the PLAYING
state. You will see that exactly the requested region is played by the
sink before it goes to EOS.
What follows is an example application that loosely follows this idea.
```c
#include <gst/gst.h>

static GMainLoop *loop;
static volatile gint counter;
static GstBus *bus;
static gboolean prerolled = FALSE;
static GstPad *sinkpad;

static void
dec_counter (GstElement * pipeline)
{
  if (prerolled)
    return;

  if (g_atomic_int_dec_and_test (&counter)) {
    /* all probes blocked and no-more-pads signaled, post
     * message on the bus. */
    prerolled = TRUE;

    gst_bus_post (bus, gst_message_new_application (
          GST_OBJECT_CAST (pipeline),
          gst_structure_new_empty ("ExPrerolled")));
  }
}

/* called when a source pad of uridecodebin is blocked */
static GstPadProbeReturn
cb_blocked (GstPad          *pad,
            GstPadProbeInfo *info,
            gpointer         user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);

  if (prerolled)
    return GST_PAD_PROBE_REMOVE;

  dec_counter (pipeline);

  return GST_PAD_PROBE_OK;
}

/* called when uridecodebin has a new pad */
static void
cb_pad_added (GstElement *element,
              GstPad     *pad,
              gpointer    user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);

  if (prerolled)
    return;

  g_atomic_int_inc (&counter);

  gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
      (GstPadProbeCallback) cb_blocked, pipeline, NULL);

  /* try to link to the video pad */
  gst_pad_link (pad, sinkpad);
}

/* called when uridecodebin has created all pads */
static void
cb_no_more_pads (GstElement *element,
                 gpointer    user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);

  if (prerolled)
    return;

  dec_counter (pipeline);
}

/* called when a new message is posted on the bus */
static void
cb_message (GstBus     *bus,
            GstMessage *message,
            gpointer    user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);

  switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_ERROR:
      g_print ("we received an error!\n");
      g_main_loop_quit (loop);
      break;
    case GST_MESSAGE_EOS:
      g_print ("we reached EOS\n");
      g_main_loop_quit (loop);
      break;
    case GST_MESSAGE_APPLICATION:
    {
      if (gst_message_has_name (message, "ExPrerolled")) {
        /* it's our message */
        g_print ("we are all prerolled, do seek\n");
        gst_element_seek (pipeline,
            1.0, GST_FORMAT_TIME,
            GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
            GST_SEEK_TYPE_SET, 2 * GST_SECOND,
            GST_SEEK_TYPE_SET, 5 * GST_SECOND);

        gst_element_set_state (pipeline, GST_STATE_PLAYING);
      }
      break;
    }
    default:
      break;
  }
}

gint
main (gint   argc,
      gchar *argv[])
{
  GstElement *pipeline, *src, *csp, *vs, *sink;

  /* init GStreamer */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  if (argc < 2) {
    g_print ("usage: %s <uri>", argv[0]);
    return -1;
  }

  /* build */
  pipeline = gst_pipeline_new ("my-pipeline");

  bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
  gst_bus_add_signal_watch (bus);
  g_signal_connect (bus, "message", (GCallback) cb_message,
      pipeline);

  src = gst_element_factory_make ("uridecodebin", "src");
  if (src == NULL)
    g_error ("Could not create 'uridecodebin' element");

  g_object_set (src, "uri", argv[1], NULL);

  csp = gst_element_factory_make ("videoconvert", "csp");
  if (csp == NULL)
    g_error ("Could not create 'videoconvert' element");

  vs = gst_element_factory_make ("videoscale", "vs");
  if (vs == NULL)
    g_error ("Could not create 'videoscale' element");

  sink = gst_element_factory_make ("autovideosink", "sink");
  if (sink == NULL)
    g_error ("Could not create 'autovideosink' element");

  gst_bin_add_many (GST_BIN (pipeline), src, csp, vs, sink, NULL);

  /* can't link src yet, it has no pads */
  gst_element_link_many (csp, vs, sink, NULL);

  sinkpad = gst_element_get_static_pad (csp, "sink");

  /* for each pad block that is installed, we will increment
   * the counter. for each pad block that is signaled, we
   * decrement the counter. When the counter is 0 we post
   * an app message to tell the app that all pads are
   * blocked. Start with 1 that is decremented when no-more-pads
   * is signaled to make sure that we only post the message
   * after no-more-pads */
  g_atomic_int_set (&counter, 1);

  g_signal_connect (src, "pad-added",
      (GCallback) cb_pad_added, pipeline);
  g_signal_connect (src, "no-more-pads",
      (GCallback) cb_no_more_pads, pipeline);

  gst_element_set_state (pipeline, GST_STATE_PAUSED);

  g_main_loop_run (loop);

  gst_element_set_state (pipeline, GST_STATE_NULL);

  gst_object_unref (sinkpad);
  gst_object_unref (bus);
  gst_object_unref (pipeline);
  g_main_loop_unref (loop);

  return 0;
}
```
Note that we use a custom application message to signal the main thread
that the uridecodebin is prerolled. The main thread will then issue a
flushing seek to the requested region. The flush will temporarily
unblock the pads and reblock them when new data arrives again. We detect
this second block to remove the probes. Then we set the pipeline to
PLAYING and it should play from 2 to 5 seconds, then EOS and exit the
application.
## Manually adding or removing data from/to a pipeline
Many people have expressed the wish to use their own sources to inject
data into a pipeline. Some people have also expressed the wish to grab
the output of a pipeline and take care of the actual output inside their
application. While both of these methods are strongly discouraged,
GStreamer offers support for them. *Beware! You need to know what you
are doing.* Since you don't have any support from a base class you need
to thoroughly understand state changes and synchronization. If it
doesn't work, there are a million ways to shoot yourself in the foot.
It's always better to simply write a plugin and have the base class
manage it. See the Plugin Writer's Guide for more information on this
topic. Also see the next section, which will explain how to embed
plugins statically in your application.
There are two elements that you can use for the above-mentioned
purposes: “appsrc” (an imaginary source) and “appsink” (an imaginary
sink). The same method applies to each of those elements. Here, we will
discuss how to use those elements to insert (using appsrc) or grab
(using appsink) data from a pipeline, and how to set up the negotiation
of a format.
Both appsrc and appsink provide two sets of API. One API uses standard
GObject (action) signals and properties. The same API is also available
as a regular C API. The C API is more performant but requires you to
link to the app library in order to use the elements.
### Inserting data with appsrc

First we look at some examples for appsrc, which lets you insert data
into the pipeline from the application. Appsrc has some configuration
options that define how it will operate. You should decide about the
following configurations:

- Will the appsrc operate in push or pull mode. The "stream-type"
  property can be used to control this. A stream-type of “random-access”
  will activate pull mode scheduling, while the other stream-types
  activate push mode.
- The caps of the buffers that appsrc will push out. This needs to be
  configured with the "caps" property. The caps must be set to fixed
  caps and will be used to negotiate a format downstream.
- Whether the appsrc operates in live mode or not. This can be
  configured with the "is-live" property. When operating in live mode it
  is important to configure the "min-latency" and "max-latency"
  properties in appsrc. The min-latency should be set to the amount of
  time it takes between capturing a buffer and when it is pushed inside
  appsrc. In live mode, you should timestamp the buffers with the
  pipeline running-time when the first byte of the buffer was captured
  before feeding them to appsrc. You can let appsrc do the timestamping
  with the "do-timestamp" property (but then the min-latency must be
  set to 0 because it timestamps based on the running-time when the
  buffer enters appsrc).
- The format of the SEGMENT event that appsrc will push. The format has
  implications for how the running-time of the buffers will be
  calculated, so you must be sure you understand this. For live sources
  you probably want to set the "format" property to `GST_FORMAT_TIME`.
  For non-live sources it depends on the media type that you are
  handling. If you plan to timestamp the buffers, you should probably
  use `GST_FORMAT_TIME` as the format; otherwise `GST_FORMAT_BYTES`
  might be appropriate.
- If appsrc operates in random-access mode, it is important to
  configure the "size" property of appsrc with the number of bytes in
  the stream. This will allow downstream elements to know the size of
  the media and allows them to seek to the end of the stream when
  needed.
The main way of handing data to appsrc is by using the function
`gst_app_src_push_buffer ()` or by emitting the "push-buffer" action
signal. This will put the buffer onto a queue from which appsrc will
read in its streaming thread. It is important to note that data
transport will not happen from the thread that performed the push-buffer
call.
The "max-bytes" property controls how much data can be queued in appsrc
before appsrc considers the queue full. A filled internal queue will
always signal the "enough-data" signal, which signals the application
that it should stop pushing data into appsrc. The "block" property will
cause appsrc to block the push-buffer method until free space becomes
available again.
When the internal queue is running out of data, the "need-data" signal
is emitted, which signals the application that it should start pushing
more data into appsrc.
In addition to the "need-data" and "enough-data" signals, appsrc can
emit the "seek-data" signal when the "stream-type" property is set to
"seekable" or "random-access". The signal argument will contain the new
desired position in the stream expressed in the unit set with the
"format" property. After receiving the seek-data signal, the
application should push buffers from the new position.
When the last byte is pushed into appsrc, you must call
`gst_app_src_end_of_stream ()` to make it send an EOS downstream.

These signals allow the application to operate appsrc in push and pull
mode as will be explained next.
#### Using appsrc in push mode

When appsrc is configured in push mode (stream-type is stream or
seekable), the application repeatedly calls the push-buffer method with
a new buffer. Optionally, the queue size in the appsrc can be controlled
with the enough-data and need-data signals by respectively
stopping/starting the push-buffer calls. The value of the "min-percent"
property defines how empty the internal appsrc queue needs to be before
the need-data signal will be fired. You can set this to some value > 0
to avoid completely draining the queue.
When the stream-type is set to seekable, don't forget to implement a
seek-data callback.

Use this model when implementing various network protocols or hardware
devices.
#### Using appsrc in pull mode

In the pull model, data is fed to appsrc from the need-data signal
handler. You should push exactly the amount of bytes requested in the
need-data signal. You are only allowed to push fewer bytes when you are
at the end of the stream.

Use this model for file access or other randomly accessible sources.
#### Appsrc example

This example application will generate black/white (it switches every
second) video to an Xv-window output by using appsrc as a source with
caps to force a format. We use a colorspace conversion element to make
sure that we feed the right format to your X server. We configure a
video stream with a variable framerate (0/1) and we set the timestamps
on the outgoing buffers in such a way that we play 2 frames per second.

Note how we use the pull mode method of pushing new buffers into appsrc
although appsrc is running in push mode.
```c
#include <gst/gst.h>

static GMainLoop *loop;

static void
cb_need_data (GstElement *appsrc,
              guint       unused_size,
              gpointer    user_data)
{
  static gboolean white = FALSE;
  static GstClockTime timestamp = 0;
  GstBuffer *buffer;
  guint size;
  GstFlowReturn ret;

  size = 384 * 288 * 2;

  buffer = gst_buffer_new_allocate (NULL, size, NULL);

  /* this makes the image black/white */
  gst_buffer_memset (buffer, 0, white ? 0xff : 0x0, size);

  white = !white;

  GST_BUFFER_PTS (buffer) = timestamp;
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 2);

  timestamp += GST_BUFFER_DURATION (buffer);

  g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
  gst_buffer_unref (buffer);

  if (ret != GST_FLOW_OK) {
    /* something wrong, stop pushing */
    g_main_loop_quit (loop);
  }
}

gint
main (gint   argc,
      gchar *argv[])
{
  GstElement *pipeline, *appsrc, *conv, *videosink;

  /* init GStreamer */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* setup pipeline */
  pipeline = gst_pipeline_new ("pipeline");
  appsrc = gst_element_factory_make ("appsrc", "source");
  conv = gst_element_factory_make ("videoconvert", "conv");
  videosink = gst_element_factory_make ("xvimagesink", "videosink");

  /* setup */
  g_object_set (G_OBJECT (appsrc), "caps",
      gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "RGB16",
          "width", G_TYPE_INT, 384,
          "height", G_TYPE_INT, 288,
          "framerate", GST_TYPE_FRACTION, 0, 1,
          NULL), NULL);
  gst_bin_add_many (GST_BIN (pipeline), appsrc, conv, videosink, NULL);
  gst_element_link_many (appsrc, conv, videosink, NULL);

  /* setup appsrc */
  g_object_set (G_OBJECT (appsrc),
      "stream-type", 0,
      "format", GST_FORMAT_TIME, NULL);
  g_signal_connect (appsrc, "need-data", G_CALLBACK (cb_need_data), NULL);

  /* play */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (loop);

  /* clean up */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (pipeline));
  g_main_loop_unref (loop);

  return 0;
}
```
### Grabbing data with appsink

Unlike appsrc, appsink is a little easier to use. It also supports a
pull and push based model of getting data from the pipeline.

The normal way of retrieving samples from appsink is by using the
`gst_app_sink_pull_sample ()` and `gst_app_sink_pull_preroll ()` methods
or by using the "pull-sample" and "pull-preroll" signals. These methods
block until a sample becomes available in the sink or when the sink is
shut down or reaches EOS.
Appsink will internally use a queue to collect buffers from the
streaming thread. If the application is not pulling samples fast enough,
this queue will consume a lot of memory over time. The "max-buffers"
property can be used to limit the queue size. The "drop" property
controls whether the streaming thread blocks or if older buffers are
dropped when the maximum queue size is reached. Note that blocking the
streaming thread can negatively affect real-time performance and should
be avoided.
If a blocking behaviour is not desirable, setting the "emit-signals"
property to TRUE will make appsink emit the "new-sample" and
"new-preroll" signals when a sample can be pulled without blocking.
The "caps" property on appsink can be used to control the formats that
appsink can receive. This property can contain non-fixed caps; the
format of the pulled samples can be obtained by getting the sample caps.
If one of the pull-preroll or pull-sample methods returns NULL, the
appsink is stopped or in the EOS state. You can check for the EOS state
with the "eos" property or with the `gst_app_sink_is_eos ()` method.

The "eos" signal can also be used to be informed when the EOS state is
reached, to avoid polling.
Consider configuring the following properties in the appsink:

- The "sync" property if you want to have the sink base class
  synchronize the buffer against the pipeline clock before handing you
  the sample.

- Enable Quality-of-Service with the "qos" property. If you are
  dealing with raw video frames and let the base class synchronize on
  the clock, it might be a good idea to also let the base class send
  QOS events upstream.

- The "caps" property that contains the accepted caps. Upstream
  elements will try to convert the format so that it matches the
  configured caps on appsink. You must still check the `GstSample` to
  get the actual caps of the buffer.
What follows is an example on how to capture a snapshot of a video
stream using appsink.
```c
#include <gst/gst.h>
#include <gdk-pixbuf/gdk-pixbuf.h>

#define CAPS "video/x-raw,format=RGB,width=160,pixel-aspect-ratio=1/1"

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *sink;
  gint width, height;
  GstSample *sample;
  gchar *descr;
  GError *error = NULL;
  gint64 duration, position;
  GstStateChangeReturn ret;
  gboolean res;
  GstMapInfo map;

  gst_init (&argc, &argv);

  if (argc != 2) {
    g_print ("usage: %s <uri>\n Writes snapshot.png in the current directory\n",
        argv[0]);
    exit (-1);
  }

  /* create a new pipeline */
  descr =
      g_strdup_printf ("uridecodebin uri=%s ! videoconvert ! videoscale ! "
      " appsink name=sink caps=\"" CAPS "\"", argv[1]);
  pipeline = gst_parse_launch (descr, &error);

  if (error != NULL) {
    g_print ("could not construct pipeline: %s\n", error->message);
    g_clear_error (&error);
    exit (-1);
  }

  /* get sink */
  sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");

  /* set to PAUSED to make the first frame arrive in the sink */
  ret = gst_element_set_state (pipeline, GST_STATE_PAUSED);
  switch (ret) {
    case GST_STATE_CHANGE_FAILURE:
      g_print ("failed to play the file\n");
      exit (-1);
    case GST_STATE_CHANGE_NO_PREROLL:
      /* for live sources, we need to set the pipeline to PLAYING before we can
       * receive a buffer. We don't do that yet */
      g_print ("live sources not supported yet\n");
      exit (-1);
    default:
      break;
  }
  /* This can block for up to 5 seconds. If your machine is really overloaded,
   * it might time out before the pipeline prerolled and we generate an error. A
   * better way is to run a mainloop and catch errors there. */
  ret = gst_element_get_state (pipeline, NULL, NULL, 5 * GST_SECOND);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_print ("failed to play the file\n");
    exit (-1);
  }

  /* get the duration */
  gst_element_query_duration (pipeline, GST_FORMAT_TIME, &duration);

  if (duration != -1)
    /* we have a duration, seek to 5% */
    position = duration * 5 / 100;
  else
    /* no duration, seek to 1 second, this could EOS */
    position = 1 * GST_SECOND;

  /* seek to the a position in the file. Most files have a black first frame so
   * by seeking to somewhere else we have a bigger chance of getting something
   * more interesting. An optimisation would be to detect black images and then
   * seek a little more */
  gst_element_seek_simple (pipeline, GST_FORMAT_TIME,
      GST_SEEK_FLAG_KEY_UNIT | GST_SEEK_FLAG_FLUSH, position);

  /* get the preroll buffer from appsink, this blocks until appsink really
   * prerolls */
  g_signal_emit_by_name (sink, "pull-preroll", &sample, NULL);

  /* if we have a buffer now, convert it to a pixbuf. It's possible that we
   * don't have a buffer because we went EOS right away or had an error. */
  if (sample) {
    GstBuffer *buffer;
    GstCaps *caps;
    GstStructure *s;

    /* get the snapshot buffer format now. We set the caps on the appsink so
     * that it can only be an rgb buffer. The only thing we have not specified
     * on the caps is the height, which is dependent on the pixel-aspect-ratio
     * of the source material */
    caps = gst_sample_get_caps (sample);
    if (!caps) {
      g_print ("could not get snapshot format\n");
      exit (-1);
    }
    s = gst_caps_get_structure (caps, 0);

    /* we need to get the final caps on the buffer to get the size */
    res = gst_structure_get_int (s, "width", &width);
    res |= gst_structure_get_int (s, "height", &height);
    if (!res) {
      g_print ("could not get snapshot dimension\n");
      exit (-1);
    }

    /* create pixmap from buffer and save, gstreamer video buffers have a stride
     * that is rounded up to the nearest multiple of 4 */
    buffer = gst_sample_get_buffer (sample);
    /* Mapping a buffer can fail (non-readable) */
    if (gst_buffer_map (buffer, &map, GST_MAP_READ)) {
      GdkPixbuf *pixbuf;

      pixbuf = gdk_pixbuf_new_from_data (map.data,
          GDK_COLORSPACE_RGB, FALSE, 8, width, height,
          GST_ROUND_UP_4 (width * 3), NULL, NULL);

      /* save the pixbuf */
      gdk_pixbuf_save (pixbuf, "snapshot.png", "png", &error, NULL);
      g_object_unref (pixbuf);
      gst_buffer_unmap (buffer, &map);
    }
    gst_sample_unref (sample);
  } else {
    g_print ("could not make snapshot\n");
  }

  /* cleanup and exit */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);

  exit (0);
}
```
## Forcing a format

Sometimes you'll want to set a specific format, for example a video size
and format or an audio bitsize and number of channels. You can do this
by forcing a specific `GstCaps` on the pipeline, which is possible by
using *filtered caps*. You can set filtered caps on a link by using the
“capsfilter” element in between the two elements, and specifying a
`GstCaps` as the “caps” property on this element. It will then only
allow types matching that specified capability set for negotiation. See
also [Creating capabilities for
filtering](manual-pads.md#creating-capabilities-for-filtering).
### Changing format in a PLAYING pipeline

It is also possible to dynamically change the format in a pipeline while
PLAYING. This can simply be done by changing the "caps" property on a
capsfilter. The capsfilter will send a RECONFIGURE event upstream that
will make the upstream element attempt to renegotiate a new format and
allocator. This only works if the upstream element is not using fixed
caps on the source pad.

Below is an example of how you can change the caps of a pipeline while
in the PLAYING state:
```c
#include <stdlib.h>

#include <gst/gst.h>

#define MAX_ROUND 100

int
main (int argc, char **argv)
{
  GstElement *pipe, *filter;
  GstCaps *caps;
  gint width, height;
  gint xdir, ydir;
  gint round;
  GstMessage *message;

  gst_init (&argc, &argv);

  pipe = gst_parse_launch_full ("videotestsrc ! capsfilter name=filter ! "
             "ximagesink", NULL, GST_PARSE_FLAG_NONE, NULL);
  g_assert (pipe != NULL);

  filter = gst_bin_get_by_name (GST_BIN (pipe), "filter");
  g_assert (filter);

  width = 320;
  height = 240;
  xdir = ydir = -10;

  for (round = 0; round < MAX_ROUND; round++) {
    gchar *capsstr;

    g_print ("resize to %dx%d (%d/%d)   \r", width, height, round, MAX_ROUND);

    /* we prefer our fixed width and height but allow other dimensions to pass
     * as well */
    capsstr = g_strdup_printf ("video/x-raw, width=(int)%d, height=(int)%d",
        width, height);

    caps = gst_caps_from_string (capsstr);
    g_free (capsstr);
    g_object_set (filter, "caps", caps, NULL);
    gst_caps_unref (caps);

    if (round == 0)
      gst_element_set_state (pipe, GST_STATE_PLAYING);

    width += xdir;
    if (width >= 320)
      xdir = -10;
    else if (width < 200)
      xdir = 10;

    height += ydir;
    if (height >= 240)
      ydir = -10;
    else if (height < 150)
      ydir = 10;

    message =
        gst_bus_poll (GST_ELEMENT_BUS (pipe), GST_MESSAGE_ERROR,
        50 * GST_MSECOND);
    if (message) {
      g_print ("got error           \n");

      gst_message_unref (message);
    }
  }
  g_print ("done                    \n");

  gst_object_unref (filter);
  gst_element_set_state (pipe, GST_STATE_NULL);
  gst_object_unref (pipe);

  return 0;
}
```
Note how we use `gst_bus_poll ()` with a small timeout to get messages
and also introduce a short sleep.
It is possible to set multiple caps on the capsfilter, separated by a
“;”. The capsfilter will try to renegotiate to the first possible
format from the list.
## Dynamically changing the pipeline
In this section we talk about some techniques for dynamically modifying
the pipeline. We are talking specifically about changing the pipeline
while it is in the PLAYING state without interrupting the flow.
There are some important things to consider when building dynamic
pipelines:
- When removing elements from the pipeline, make sure that there is no
  dataflow on unlinked pads, because that will cause a fatal pipeline
  error. Always block source pads (in push mode) or sink pads (in pull
  mode) before unlinking pads. See also [Changing elements in a
  pipeline](#changing-elements-in-a-pipeline).

- When adding elements to a pipeline, make sure to put the element
  into the right state, usually the same state as the parent, before
  allowing dataflow into the element. When an element is newly
  created, it is in the NULL state and will return an error when it
  receives data. See also [Changing elements in a
  pipeline](#changing-elements-in-a-pipeline).

- When adding elements to a pipeline, GStreamer will by default set
  the clock and base-time on the element to the current values of the
  pipeline. This means that the element will be able to construct the
  same pipeline running-time as the other elements in the pipeline. As
  a result, sinks will synchronize buffers like the other sinks in the
  pipeline, and sources will produce buffers with a running-time that
  matches the other sources.

- When unlinking elements from an upstream chain, always make sure to
  flush any queued data in the element by sending an EOS event down
  the element's sink pad(s) and waiting until the EOS leaves the
  element (with an event probe).

  If you do not do this, you will lose the data buffered by the
  unlinked element. This can result in simple frame loss (one or more
  video frames, several milliseconds of audio). However, if you remove
  a muxer (and in some cases an encoder or similar element) from the
  pipeline, you risk ending up with a corrupted file that cannot be
  played properly, as some relevant metadata (headers, seek/index
  tables, internal sync tags) will not be stored or updated properly.

  See also [Changing elements in a
  pipeline](#changing-elements-in-a-pipeline).

- A live source will produce buffers with a running-time of the
  current running-time in the pipeline.

  A pipeline without a live source produces buffers with a
  running-time starting from 0. Likewise, after a flushing seek, those
  pipelines reset the running-time back to 0.

  The running-time can be changed with `gst_pad_set_offset ()`. It is
  important to know the running-time of the elements in the pipeline
  in order to maintain synchronization.

- Adding elements might change the state of the pipeline. Adding a
  non-prerolled sink, for example, brings the pipeline back to the
  prerolling state. Removing a non-prerolled sink, on the other hand,
  might change the pipeline to the PAUSED or PLAYING state.

  Adding a live source cancels the preroll stage and puts the pipeline
  in the PLAYING state. Adding a live source or other live elements
  might also change the latency of the pipeline.

  Adding or removing elements from the pipeline might change the clock
  selection of the pipeline. If a newly added element provides a
  clock, it might be worth changing the pipeline clock to the new
  clock. If, on the other hand, the element that provides the clock
  for the pipeline is removed, a new clock has to be selected.

- Adding and removing elements might cause upstream or downstream
  elements to renegotiate caps and/or allocators. You don't really
  need to do anything from the application; plugins largely adapt
  themselves to the new pipeline topology in order to optimize their
  formats and allocation strategy.

  What is important is that when you add, remove or change elements in
  the pipeline, it is possible that the pipeline needs to negotiate a
  new format, and this can fail. Usually you can fix this by inserting
  the right converter elements where needed. See also [Changing
  elements in a pipeline](#changing-elements-in-a-pipeline).

GStreamer offers support for just about any dynamic pipeline
modification, but you need to know a few details before you can do this
without causing pipeline errors. In the following sections we will
demonstrate a couple of typical use-cases.
### Changing elements in a pipeline
1121 In the next example we look at the following chain of elements:
1124 - ----. .----------. .---- -
1125 element1 | | element2 | | element3
1126 src -> sink src -> sink
1127 - ----' '----------' '---- -
We want to replace element2 with element4 while the pipeline is in the
PLAYING state. Let's say that element2 is a visualization and that you
want to switch to another visualization in the pipeline.

We can't just unlink element2's sink pad from element1's source pad,
because that would leave element1's source pad unlinked and would cause
a streaming error in the pipeline when data is pushed on the source
pad. The technique is to block the dataflow from element1's source pad
before we replace element2 with element4, and then resume dataflow, as
follows:
- Block element1's source pad with a blocking pad probe. When the pad
  is blocked, the probe callback will be called.

- Inside the block callback, nothing is flowing between element1 and
  element2, and nothing will flow until unblocked.

- Unlink element1 and element2.

- Make sure data is flushed out of element2. Some elements might
  internally keep some data; you need to make sure not to lose any by
  forcing it out of element2. You can do this by pushing EOS into
  element2, like this:

  - Put an event probe on element2's source pad.

  - Send EOS to element2's sink pad. This makes sure all the data
    inside element2 is forced out.

  - Wait for the EOS event to appear on element2's source pad. When
    the EOS is received, drop it and remove the event probe.

- Unlink element2 and element3. You can now also remove element2 from
  the pipeline and set its state to NULL.

- Add element4 to the pipeline, if not already added. Link element4 to
  element3. Link element1 to element4.

- Make sure element4 is in the same state as the rest of the elements
  in the pipeline. It should be at least in the PAUSED state before it
  can receive buffers and events.

- Unblock element1's source pad probe. This will let new data into
  element4 and continue streaming.
The above algorithm works when the source pad is blocked, i.e. when
there is dataflow in the pipeline. If there is no dataflow, there is
also no point in changing the element (just yet), so this algorithm can
be used in the PAUSED state as well.
Let us show you how this works with an example. This example changes
the video effect on a simple pipeline once per second:
```c
#include <gst/gst.h>

static gchar *opt_effects = NULL;

#define DEFAULT_EFFECTS "identity,exclusion,navigationtest," \
    "agingtv,videoflip,vertigotv,gaussianblur,shagadelictv,edgetv"

static GstPad *blockpad;

static GstElement *conv_before;
static GstElement *conv_after;
static GstElement *cur_effect;
static GstElement *pipeline;

static GQueue effects = G_QUEUE_INIT;

static GstPadProbeReturn
event_probe_cb (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
{
  GMainLoop *loop = user_data;
  GstElement *next;

  if (GST_EVENT_TYPE (GST_PAD_PROBE_INFO_DATA (info)) != GST_EVENT_EOS)
    return GST_PAD_PROBE_PASS;

  gst_pad_remove_probe (pad, GST_PAD_PROBE_INFO_ID (info));

  /* push current effect back into the queue */
  g_queue_push_tail (&effects, gst_object_ref (cur_effect));
  /* take next effect from the queue */
  next = g_queue_pop_head (&effects);
  if (next == NULL) {
    GST_DEBUG_OBJECT (pad, "no more effects");
    g_main_loop_quit (loop);
    return GST_PAD_PROBE_DROP;
  }

  g_print ("Switching from '%s' to '%s'..\n", GST_OBJECT_NAME (cur_effect),
      GST_OBJECT_NAME (next));

  gst_element_set_state (cur_effect, GST_STATE_NULL);

  /* remove unlinks automatically */
  GST_DEBUG_OBJECT (pipeline, "removing %" GST_PTR_FORMAT, cur_effect);
  gst_bin_remove (GST_BIN (pipeline), cur_effect);

  GST_DEBUG_OBJECT (pipeline, "adding   %" GST_PTR_FORMAT, next);
  gst_bin_add (GST_BIN (pipeline), next);

  GST_DEBUG_OBJECT (pipeline, "linking..");
  gst_element_link_many (conv_before, next, conv_after, NULL);

  gst_element_set_state (next, GST_STATE_PLAYING);

  cur_effect = next;
  GST_DEBUG_OBJECT (pipeline, "done");

  return GST_PAD_PROBE_DROP;
}

static GstPadProbeReturn
pad_probe_cb (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
{
  GstPad *srcpad, *sinkpad;

  GST_DEBUG_OBJECT (pad, "pad is blocked now");

  /* remove the probe first */
  gst_pad_remove_probe (pad, GST_PAD_PROBE_INFO_ID (info));

  /* install new probe for EOS */
  srcpad = gst_element_get_static_pad (cur_effect, "src");
  gst_pad_add_probe (srcpad, GST_PAD_PROBE_TYPE_BLOCK |
      GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM, event_probe_cb, user_data, NULL);
  gst_object_unref (srcpad);

  /* push EOS into the element, the probe will be fired when the
   * EOS leaves the effect and it has thus drained all of its data */
  sinkpad = gst_element_get_static_pad (cur_effect, "sink");
  gst_pad_send_event (sinkpad, gst_event_new_eos ());
  gst_object_unref (sinkpad);

  return GST_PAD_PROBE_OK;
}

static gboolean
timeout_cb (gpointer user_data)
{
  gst_pad_add_probe (blockpad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
      pad_probe_cb, user_data, NULL);

  return TRUE;
}

static gboolean
bus_cb (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  GMainLoop *loop = user_data;

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR:{
      GError *err = NULL;
      gchar *dbg;

      gst_message_parse_error (msg, &err, &dbg);
      gst_object_default_error (msg->src, err, dbg);
      g_clear_error (&err);
      g_free (dbg);

      g_main_loop_quit (loop);
      break;
    }
    default:
      break;
  }
  return TRUE;
}

int
main (int argc, char **argv)
{
  GOptionEntry options[] = {
    {"effects", 'e', 0, G_OPTION_ARG_STRING, &opt_effects,
        "Effects to use (comma-separated list of element names)", NULL},
    {NULL}
  };
  GOptionContext *ctx;
  GError *err = NULL;
  GMainLoop *loop;
  GstElement *src, *q1, *q2, *effect, *filter1, *filter2, *sink;
  gchar **effect_names, **e;

  ctx = g_option_context_new ("");
  g_option_context_add_main_entries (ctx, options, NULL);
  g_option_context_add_group (ctx, gst_init_get_option_group ());
  if (!g_option_context_parse (ctx, &argc, &argv, &err)) {
    g_print ("Error initializing: %s\n", err->message);
    g_clear_error (&err);
    g_option_context_free (ctx);
    return 1;
  }
  g_option_context_free (ctx);

  if (opt_effects != NULL)
    effect_names = g_strsplit (opt_effects, ",", -1);
  else
    effect_names = g_strsplit (DEFAULT_EFFECTS, ",", -1);

  for (e = effect_names; e != NULL && *e != NULL; ++e) {
    GstElement *el;

    el = gst_element_factory_make (*e, NULL);
    if (el) {
      g_print ("Adding effect '%s'\n", *e);
      g_queue_push_tail (&effects, el);
    }
  }

  pipeline = gst_pipeline_new ("pipeline");

  src = gst_element_factory_make ("videotestsrc", NULL);
  g_object_set (src, "is-live", TRUE, NULL);

  filter1 = gst_element_factory_make ("capsfilter", NULL);
  gst_util_set_object_arg (G_OBJECT (filter1), "caps",
      "video/x-raw, width=320, height=240, "
      "format={ I420, YV12, YUY2, UYVY, AYUV, Y41B, Y42B, "
      "YVYU, Y444, v210, v216, NV12, NV21, UYVP, A420, YUV9, YVU9, IYU1 }");

  q1 = gst_element_factory_make ("queue", NULL);

  blockpad = gst_element_get_static_pad (q1, "src");

  conv_before = gst_element_factory_make ("videoconvert", NULL);

  effect = g_queue_pop_head (&effects);
  cur_effect = effect;

  conv_after = gst_element_factory_make ("videoconvert", NULL);

  q2 = gst_element_factory_make ("queue", NULL);

  filter2 = gst_element_factory_make ("capsfilter", NULL);
  gst_util_set_object_arg (G_OBJECT (filter2), "caps",
      "video/x-raw, width=320, height=240, "
      "format={ RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR }");

  sink = gst_element_factory_make ("ximagesink", NULL);

  gst_bin_add_many (GST_BIN (pipeline), src, filter1, q1, conv_before, effect,
      conv_after, q2, sink, NULL);

  gst_element_link_many (src, filter1, q1, conv_before, effect, conv_after,
      q2, sink, NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  loop = g_main_loop_new (NULL, FALSE);

  gst_bus_add_watch (GST_ELEMENT_BUS (pipeline), bus_cb, loop);

  g_timeout_add_seconds (1, timeout_cb, loop);

  g_main_loop_run (loop);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);

  return 0;
}
```
Note how we added videoconvert elements before and after the effect.
This is needed because some elements might operate in different
colorspaces than other elements; by inserting the conversion elements
you ensure that a suitable format can be negotiated at any time.