<chapter id="chapter-dataaccess">
<title>Pipeline manipulation</title>
This chapter discusses how you can manipulate your pipeline in several
ways from your application. Parts of this chapter are very
low-level, so you will need some programming knowledge
and a good understanding of &GStreamer; before you start reading.

Topics discussed here include how you can insert data into
a pipeline from your application, how to read data from a pipeline,
how to manipulate the pipeline's speed, length and starting point, and how
to listen to a pipeline's data processing.
<sect1 id="section-using-probes">
<title>Using probes</title>
Probing is best envisioned as a pad listener. Technically, a probe is
nothing more than a callback that can be attached to a pad.
You can attach a probe using <function>gst_pad_add_probe ()</function>.
Similarly, you can use
<function>gst_pad_remove_probe ()</function>
to remove the callback again. The probe notifies you of any activity
that happens on the pad, such as buffers, events and queries. You
define what kind of notifications you are interested in when you
register the probe.

The probe can notify you of the following activity on pads:
A buffer is pushed or pulled. Specify
GST_PAD_PROBE_TYPE_BUFFER when registering the probe. Because the
pad can be scheduled in different ways, it is possible to also
specify the scheduling mode you are interested in with the
optional GST_PAD_PROBE_TYPE_PUSH and GST_PAD_PROBE_TYPE_PULL flags.

You can use this probe to inspect, modify or drop the buffer.
See <xref linkend="section-data-probes"/>.
A bufferlist is pushed. Use GST_PAD_PROBE_TYPE_BUFFER_LIST
when registering the probe.
An event travels over a pad. Use the GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM
and GST_PAD_PROBE_TYPE_EVENT_UPSTREAM flags to select downstream
and upstream events. There is also a convenience
GST_PAD_PROBE_TYPE_EVENT_BOTH to be notified of events going both
upstream and downstream. By default, flush events do not cause
a notification; you need to explicitly enable GST_PAD_PROBE_TYPE_EVENT_FLUSH
to receive callbacks for flushing events. Events are only ever
notified in push mode.

You can use this probe to inspect, modify or drop the event.
A query travels over a pad. Use the GST_PAD_PROBE_TYPE_QUERY_DOWNSTREAM
and GST_PAD_PROBE_TYPE_QUERY_UPSTREAM flags to select downstream
and upstream queries. The convenience GST_PAD_PROBE_TYPE_QUERY_BOTH
can also be used to select both directions. Query probes are
notified twice: once when the query travels upstream/downstream, and
once when the query result is returned. You can select in which stage
the callback is called with GST_PAD_PROBE_TYPE_PUSH and
GST_PAD_PROBE_TYPE_PULL, respectively when the query is performed
and when the query result is returned.

You can use this probe to inspect or modify the query. You can also
answer the query in the probe callback by placing the result value
in the query and returning GST_PAD_PROBE_DROP from the callback.
In addition to notifying you of dataflow, you can also ask the
probe to block the dataflow when the callback returns. This is
called a blocking probe and is activated by specifying the
GST_PAD_PROBE_TYPE_BLOCK flag. You can combine this flag with
other flags to only block dataflow on selected activity. A pad
becomes unblocked again when you remove the probe or when you return
GST_PAD_PROBE_REMOVE from the callback. You can let only the
currently blocked item pass by returning GST_PAD_PROBE_PASS
from the callback; it will block again on the next item.

Blocking probes are used to temporarily block pads because they
are unlinked or because you are going to unlink them. If the
dataflow were not blocked, the pipeline would go into an error
state when data is pushed on an unlinked pad. We will see how
to use blocking probes to partially preroll a pipeline.
See also <xref linkend="section-preroll-probes"/>.
Be notified when no activity is happening on a pad. You install
this probe with the GST_PAD_PROBE_TYPE_IDLE flag. You can specify
GST_PAD_PROBE_TYPE_PUSH and/or GST_PAD_PROBE_TYPE_PULL to
only be notified depending on the pad scheduling mode.
The IDLE probe is also a blocking probe, in that it will not let
any data pass on the pad for as long as the IDLE probe is installed.

You can use idle probes to dynamically relink a pad. We will see
how to use idle probes to replace an element in the pipeline.
See also <xref linkend="section-dynamic-pipelines"/>.
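To make the above concrete, here is a small sketch of how an event probe
could be installed on an element's source pad and later removed again. The
element involved and the printed output are illustrative only, not part of
this chapter's examples:

```c
#include <gst/gst.h>

/* Called for each downstream event that passes on the pad. */
static GstPadProbeReturn
cb_have_event (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstEvent *event = GST_PAD_PROBE_INFO_EVENT (info);

  g_print ("got %s event\n", GST_EVENT_TYPE_NAME (event));

  /* GST_PAD_PROBE_OK lets the event continue; returning
   * GST_PAD_PROBE_DROP here would discard it instead. */
  return GST_PAD_PROBE_OK;
}

/* Install the probe on the source pad of a (hypothetical) element. */
static gulong
install_event_probe (GstElement *element)
{
  GstPad *pad;
  gulong id;

  pad = gst_element_get_static_pad (element, "src");
  id = gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM,
      cb_have_event, NULL, NULL);
  gst_object_unref (pad);

  /* keep the id around for a later gst_pad_remove_probe () */
  return id;
}
```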
<sect2 id="section-data-probes">
<title>Data probes</title>
Data probes allow you to be notified when data is passing
on a pad. When adding the probe, specify GST_PAD_PROBE_TYPE_BUFFER
and/or GST_PAD_PROBE_TYPE_BUFFER_LIST.

Data probes run in the pipeline's streaming thread context, so callbacks
should not block and should generally avoid doing anything unusual, since
this could have a negative impact on pipeline performance or, in case
of bugs, cause deadlocks or crashes. More precisely, one should usually
not call any GUI-related functions from within a probe callback, nor try
to change the state of the pipeline. An application may, however, post custom
messages on the pipeline's bus to communicate with the main
application thread and have it do things like stop the pipeline.

In any case, most common buffer operations
that elements can do in <function>_chain ()</function> functions can also
be done in probe callbacks. The example below gives a short
impression of how to use them.
<!-- example-begin probe.c -->
static GstPadProbeReturn
cb_have_data (GstPad          *pad,
              GstPadProbeInfo *info,
              gpointer         user_data)
{
  gint x, y;
  GstMapInfo map;
  guint16 *ptr, t;
  GstBuffer *buffer;

  buffer = GST_PAD_PROBE_INFO_BUFFER (info);
  buffer = gst_buffer_make_writable (buffer);

  gst_buffer_map (buffer, &map, GST_MAP_WRITE);
  ptr = (guint16 *) map.data;

  /* mirror the image: swap the pixels in each line */
  for (y = 0; y < 288; y++) {
    for (x = 0; x < 384 / 2; x++) {
      t = ptr[384 - 1 - x];
      ptr[384 - 1 - x] = ptr[x];
      ptr[x] = t;
    }
    ptr += 384;
  }
  gst_buffer_unmap (buffer, &map);

  GST_PAD_PROBE_INFO_DATA (info) = buffer;

  return GST_PAD_PROBE_OK;
}
gint
main (gint   argc,
      gchar *argv[])
{
  GMainLoop *loop;
  GstElement *pipeline, *src, *sink, *filter, *csp;
  GstCaps *filtercaps;
  GstPad *pad;

  /* init GStreamer */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* build pipeline */
  pipeline = gst_pipeline_new ("my-pipeline");
  src = gst_element_factory_make ("videotestsrc", "src");
  if (src == NULL)
    g_error ("Could not create 'videotestsrc' element");

  filter = gst_element_factory_make ("capsfilter", "filter");
  g_assert (filter != NULL); /* should always exist */

  csp = gst_element_factory_make ("videoconvert", "csp");
  if (csp == NULL)
    g_error ("Could not create 'videoconvert' element");

  sink = gst_element_factory_make ("xvimagesink", "sink");
  if (sink == NULL) {
    sink = gst_element_factory_make ("ximagesink", "sink");
    if (sink == NULL)
      g_error ("Could not create either 'xvimagesink' or 'ximagesink' element");
  }

  gst_bin_add_many (GST_BIN (pipeline), src, filter, csp, sink, NULL);
  gst_element_link_many (src, filter, csp, sink, NULL);
  filtercaps = gst_caps_new_simple ("video/x-raw",
      "format", G_TYPE_STRING, "RGB16",
      "width", G_TYPE_INT, 384,
      "height", G_TYPE_INT, 288,
      "framerate", GST_TYPE_FRACTION, 25, 1,
      NULL);
  g_object_set (G_OBJECT (filter), "caps", filtercaps, NULL);
  gst_caps_unref (filtercaps);

  pad = gst_element_get_static_pad (src, "src");
  gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER,
      (GstPadProbeCallback) cb_have_data, NULL, NULL);
  gst_object_unref (pad);

  /* run */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* wait until it's up and running or failed */
  if (gst_element_get_state (pipeline, NULL, NULL, -1) == GST_STATE_CHANGE_FAILURE) {
    g_error ("Failed to go into PLAYING state");
  }

  g_print ("Running ...\n");
  g_main_loop_run (loop);

  /* exit */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);

  return 0;
}
<!-- example-end probe.c -->
Compare that output with the output of <quote>gst-launch-1.0
videotestsrc ! xvimagesink</quote>, just so you know what you're
looking for.
Strictly speaking, a pad probe callback is only allowed to modify the
buffer content if the buffer is writable. Whether this is the case
depends a lot on the pipeline and the elements involved. Often
enough it is, but sometimes it is not, and if it is not
then unexpected modification of the data or metadata can introduce
bugs that are very hard to debug and track down. You can check whether a
buffer is writable with <function>gst_buffer_is_writable ()</function>.
Since you can pass back a different buffer than the one passed in,
it is a good idea to make the buffer writable in the callback function
with <function>gst_buffer_make_writable ()</function>.

Pad probes are best suited for looking at data as it passes through
the pipeline. If you need to modify data, you should rather write your
own GStreamer element. Base classes like GstAudioFilter, GstVideoFilter and
GstBaseTransform make this fairly easy.
If you just want to inspect buffers as they pass through the pipeline,
you don't even need to set up pad probes. You could also just insert
an identity element into the pipeline and connect to its "handoff"
signal. The identity element also provides a few useful debugging tools,
such as the "dump" property or the "last-message" property (the latter is
enabled by passing the '-v' switch to gst-launch and setting the
"silent" property on the identity to FALSE).
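As a rough sketch of this alternative (assuming a simple videotestsrc
pipeline; the element names are arbitrary), connecting to the identity
element's "handoff" signal could look like this:

```c
#include <gst/gst.h>

/* Called for every buffer that passes through the identity element. */
static void
cb_handoff (GstElement *identity, GstBuffer *buffer, gpointer user_data)
{
  g_print ("buffer of %" G_GSIZE_FORMAT " bytes, pts %" GST_TIME_FORMAT "\n",
      gst_buffer_get_size (buffer), GST_TIME_ARGS (GST_BUFFER_PTS (buffer)));
}

static GstElement *
build_pipeline (void)
{
  GstElement *pipeline, *identity;

  /* the identity is inserted purely for inspection purposes */
  pipeline = gst_parse_launch (
      "videotestsrc ! identity name=inspect signal-handoffs=true ! "
      "autovideosink", NULL);

  identity = gst_bin_get_by_name (GST_BIN (pipeline), "inspect");
  g_signal_connect (identity, "handoff", G_CALLBACK (cb_handoff), NULL);
  gst_object_unref (identity);

  return pipeline;
}
```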
<sect2 id="section-preroll-probes">
<title>Play a region of a media file</title>
In this example we will show you how to play back a region of
a media file. The goal is to play only the part of the file
from 2 seconds to 5 seconds and then EOS.

As a first step we will set a uridecodebin element to the PAUSED
state and make sure that we block all the source pads that are
created. When all the source pads are blocked, we have data on
all source pads and we say that the uridecodebin is prerolled.

In a prerolled pipeline we can ask for the duration of the media
and we can also perform seeks. We are interested in performing a
seek operation on the pipeline to select the range of media
that we want to play.

After we configure the region we are interested in, we can link
the sink element, unblock the source pads and set the pipeline to
the PLAYING state. You will see that exactly the requested
region is played by the sink before it goes to EOS.

What follows is an example application that loosely follows these steps.
<!-- example-begin blockprobe.c -->
#include <gst/gst.h>

static GMainLoop *loop;
static volatile gint counter;
static gboolean prerolled = FALSE;
static GstPad *sinkpad;

static void
dec_counter (GstElement * pipeline)
{
  if (prerolled)
    return;

  if (g_atomic_int_dec_and_test (&counter)) {
    GstBus *bus;

    /* all probes blocked and no-more-pads signaled, post
     * message on the bus. */
    prerolled = TRUE;

    bus = gst_element_get_bus (pipeline);
    gst_bus_post (bus, gst_message_new_application (
          GST_OBJECT_CAST (pipeline),
          gst_structure_new_empty ("ExPrerolled")));
    gst_object_unref (bus);
  }
}

/* called when a source pad of uridecodebin is blocked */
static GstPadProbeReturn
cb_blocked (GstPad          *pad,
            GstPadProbeInfo *info,
            gpointer         user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);

  if (prerolled)
    return GST_PAD_PROBE_REMOVE;

  dec_counter (pipeline);

  return GST_PAD_PROBE_OK;
}

/* called when uridecodebin has a new pad */
static void
cb_pad_added (GstElement *element,
              GstPad     *pad,
              gpointer    user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);

  if (prerolled)
    return;

  g_atomic_int_inc (&counter);

  gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
      (GstPadProbeCallback) cb_blocked, pipeline, NULL);

  /* try to link to the video pad */
  gst_pad_link (pad, sinkpad);
}

/* called when uridecodebin has created all pads */
static void
cb_no_more_pads (GstElement *element,
                 gpointer    user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);

  if (prerolled)
    return;

  dec_counter (pipeline);
}

/* called when a new message is posted on the bus */
static void
cb_message (GstBus     *bus,
            GstMessage *message,
            gpointer    user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);

  switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_ERROR:
      g_print ("we received an error!\n");
      g_main_loop_quit (loop);
      break;
    case GST_MESSAGE_EOS:
      g_print ("we reached EOS\n");
      g_main_loop_quit (loop);
      break;
    case GST_MESSAGE_APPLICATION:
    {
      if (gst_message_has_name (message, "ExPrerolled")) {
        /* it's our message */
        g_print ("we are all prerolled, do seek\n");
        gst_element_seek (pipeline,
            1.0, GST_FORMAT_TIME,
            GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
            GST_SEEK_TYPE_SET, 2 * GST_SECOND,
            GST_SEEK_TYPE_SET, 5 * GST_SECOND);

        gst_element_set_state (pipeline, GST_STATE_PLAYING);
      }
      break;
    }
    default:
      break;
  }
}

gint
main (gint   argc,
      gchar *argv[])
{
  GstElement *pipeline, *src, *csp, *vs, *sink;
  GstBus *bus;

  /* init GStreamer */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  if (argc < 2) {
    g_print ("usage: %s <uri>", argv[0]);
    return -1;
  }

  /* build pipeline */
  pipeline = gst_pipeline_new ("my-pipeline");

  bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
  gst_bus_add_signal_watch (bus);
  g_signal_connect (bus, "message", (GCallback) cb_message,
      pipeline);

  src = gst_element_factory_make ("uridecodebin", "src");
  if (src == NULL)
    g_error ("Could not create 'uridecodebin' element");

  g_object_set (src, "uri", argv[1], NULL);

  csp = gst_element_factory_make ("videoconvert", "csp");
  if (csp == NULL)
    g_error ("Could not create 'videoconvert' element");

  vs = gst_element_factory_make ("videoscale", "vs");
  if (vs == NULL)
    g_error ("Could not create 'videoscale' element");

  sink = gst_element_factory_make ("autovideosink", "sink");
  if (sink == NULL)
    g_error ("Could not create 'autovideosink' element");

  gst_bin_add_many (GST_BIN (pipeline), src, csp, vs, sink, NULL);

  /* can't link src yet, it has no pads */
  gst_element_link_many (csp, vs, sink, NULL);

  sinkpad = gst_element_get_static_pad (csp, "sink");

  /* for each pad block that is installed, we will increment
   * the counter. for each pad block that is signaled, we
   * decrement the counter. When the counter is 0 we post
   * an app message to tell the app that all pads are
   * blocked. Start with 1 that is decremented when no-more-pads
   * is signaled to make sure that we only post the message
   * after no-more-pads */
  g_atomic_int_set (&counter, 1);

  g_signal_connect (src, "pad-added",
      (GCallback) cb_pad_added, pipeline);
  g_signal_connect (src, "no-more-pads",
      (GCallback) cb_no_more_pads, pipeline);

  gst_element_set_state (pipeline, GST_STATE_PAUSED);

  g_main_loop_run (loop);

  gst_element_set_state (pipeline, GST_STATE_NULL);

  gst_object_unref (sinkpad);
  gst_object_unref (bus);
  gst_object_unref (pipeline);
  g_main_loop_unref (loop);

  return 0;
}
<!-- example-end blockprobe.c -->
Note that we use a custom application message to signal the
main thread that the uridecodebin is prerolled. The main thread
will then issue a flushing seek to the requested region. The
flush will temporarily unblock the pads and reblock them when
new data arrives again. We detect this second block to remove
the probes. Then we set the pipeline to PLAYING and it should
play from 2 to 5 seconds, then EOS and exit the application.
<sect1 id="section-data-spoof">
<title>Manually adding or removing data from/to a pipeline</title>
Many people have expressed the wish to use their own sources to inject
data into a pipeline. Some people have also expressed the wish to grab
the output of a pipeline and take care of the actual output inside
their application. While both of these methods are strongly
discouraged, &GStreamer; offers support for them.
<emphasis>Beware! You need to know what you are doing.</emphasis> Since
you don't have any support from a base class, you need to thoroughly
understand state changes and synchronization. If it doesn't work,
there are a million ways to shoot yourself in the foot. It's always
better to simply write a plugin and have the base class manage it.
See the Plugin Writer's Guide for more information on this topic. Also
see the next section, which explains how to embed plugins statically
in your application.
There are two elements that you can use for the above-mentioned
purposes: <quote>appsrc</quote> (an imaginary source)
and <quote>appsink</quote> (an imaginary sink). The same method applies
to each of these elements. Here we will discuss how to use them
to insert (using appsrc) or grab (using appsink) data from a
pipeline, and how to set up negotiation.

Both appsrc and appsink provide two sets of API. One API uses standard
GObject (action) signals and properties. The same API is also
available as a regular C API. The C API is more performant but
requires you to link to the app library in order to use the elements.
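As a sketch of the two API styles (you would normally pick one or the
other, not both; error handling is omitted here), pushing a buffer into
appsrc via the action signal and via the C API looks like this:

```c
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>  /* C API, requires linking against gstapp */

static GstFlowReturn
push_via_signal (GstElement *appsrc, GstBuffer *buffer)
{
  GstFlowReturn ret;

  /* GObject action-signal API: works on a plain GstElement handle;
   * the caller keeps ownership of the buffer */
  g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
  gst_buffer_unref (buffer);
  return ret;
}

static GstFlowReturn
push_via_c_api (GstElement *appsrc, GstBuffer *buffer)
{
  /* equivalent C API: faster, but needs the app library at link
   * time; note that it takes ownership of the buffer */
  return gst_app_src_push_buffer (GST_APP_SRC (appsrc), buffer);
}
```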
<sect2 id="section-spoof-appsrc">
<title>Inserting data with appsrc</title>
First we look at some examples for appsrc, which lets you insert data
into the pipeline from the application. Appsrc has some configuration
options that define how it will operate. You should decide on the
following configuration:

Will the appsrc operate in push or pull mode? The stream-type
property can be used to control this: a stream-type of
<quote>random-access</quote> will activate pull mode scheduling,
while the other stream-types activate push mode.
The caps of the buffers that appsrc will push out. This needs to
be configured with the caps property. The caps must be set to
fixed caps and will be used to negotiate a format downstream.

Whether the appsrc operates in live mode or not. This can be configured
with the is-live property. When operating in live mode it is
important to configure the min-latency and max-latency of appsrc.
The min-latency should be set to the amount of time it takes between
capturing a buffer and pushing it into appsrc.
In live mode, you should timestamp the buffers with the pipeline
running-time of when the first byte of the buffer was captured before
feeding them to appsrc. You can let appsrc do the timestamping with
the do-timestamp property (but then the min-latency must be set
to 0, because it timestamps based on the running-time when the buffer
enters appsrc).
The format of the SEGMENT event that appsrc will push. The format
has implications for how the running-time of the buffers will
be calculated, so you must be sure you understand this. For
live sources you probably want to set the format property to
GST_FORMAT_TIME. For non-live sources it depends on the media type
that you are handling. If you plan to timestamp the buffers, you
should probably use a GST_FORMAT_TIME format; otherwise
GST_FORMAT_BYTES might be appropriate.

If appsrc operates in random-access mode, it is important to configure
the size property of appsrc with the number of bytes in the stream.
This will allow downstream elements to know the size of the media and
allow them to seek to the end of the stream when needed.
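Taken together, a sketch of how a non-live appsrc producing timestamped
raw video might be configured before the pipeline is started (the caps
values here are illustrative):

```c
#include <gst/gst.h>

static void
configure_appsrc (GstElement *appsrc)
{
  GstCaps *caps;

  /* fixed caps describing the buffers we will push */
  caps = gst_caps_new_simple ("video/x-raw",
      "format", G_TYPE_STRING, "RGB16",
      "width", G_TYPE_INT, 384,
      "height", G_TYPE_INT, 288,
      "framerate", GST_TYPE_FRACTION, 25, 1, NULL);

  g_object_set (appsrc,
      "caps", caps,              /* format to negotiate downstream */
      "stream-type", 0,          /* 0 = stream, i.e. push mode */
      "is-live", FALSE,          /* not a live source */
      "format", GST_FORMAT_TIME, /* TIME segment, we timestamp buffers */
      NULL);
  gst_caps_unref (caps);
}
```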
The main way of handing data to appsrc is by using the function
<function>gst_app_src_push_buffer ()</function> or by emitting the
push-buffer action signal. This will put the buffer onto a queue from
which appsrc will read in its streaming thread. It is important
to note that data transport will not happen from the thread that
performed the push-buffer call.
The <quote>max-bytes</quote> property controls how much data can be
queued in appsrc before appsrc considers the queue full. A filled
internal queue will always emit the <quote>enough-data</quote>
signal, which tells the application that it should stop pushing
data into appsrc. The <quote>block</quote> property will cause appsrc to
block the push-buffer method until free space becomes available again.

When the internal queue is running out of data, the
<quote>need-data</quote> signal is emitted, which tells the application
that it should start pushing more data into appsrc.
In addition to the <quote>need-data</quote> and <quote>enough-data</quote>
signals, appsrc can emit the <quote>seek-data</quote> signal when the
<quote>stream-type</quote> property is set to <quote>seekable</quote>
or <quote>random-access</quote>. The signal argument will contain the
new desired position in the stream expressed in the unit set with the
<quote>format</quote> property. After receiving the seek-data signal,
the application should push buffers from the new position.

When the last byte is pushed into appsrc, you must call
<function>gst_app_src_end_of_stream ()</function> to make it send
an EOS downstream.

These signals allow the application to operate appsrc in push and
pull mode as will be explained next.
<sect3 id="section-spoof-appsrc-push">
<title>Using appsrc in push mode</title>
When appsrc is configured in push mode (stream-type is stream or
seekable), the application repeatedly calls the push-buffer method
with a new buffer. Optionally, the queue size in the appsrc can be
controlled with the enough-data and need-data signals by respectively
stopping/starting the push-buffer calls. The value of the
min-percent property defines how empty the internal appsrc queue
needs to be before the need-data signal is fired. You can set
this to some value >0 to avoid completely draining the queue.

When the stream-type is set to seekable, don't forget to implement
a seek-data callback.

Use this model when implementing various network protocols or
hardware devices.
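A sketch of this flow control, using an idle handler in the main loop to
produce buffers (the buffer contents and sizes are placeholders):

```c
#include <gst/gst.h>

static guint sourceid;          /* id of the idle handler feeding data */

/* Hypothetical producer: pushes one buffer per main loop iteration. */
static gboolean
feed_data (GstElement *appsrc)
{
  GstBuffer *buffer = gst_buffer_new_allocate (NULL, 4096, NULL);
  GstFlowReturn ret;

  g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
  gst_buffer_unref (buffer);

  return ret == GST_FLOW_OK;    /* stop feeding on error */
}

/* need-data: the internal queue is running low, start pushing again */
static void
cb_need_data (GstElement *appsrc, guint length, gpointer user_data)
{
  if (sourceid == 0)
    sourceid = g_idle_add ((GSourceFunc) feed_data, appsrc);
}

/* enough-data: the internal queue is full, stop pushing */
static void
cb_enough_data (GstElement *appsrc, gpointer user_data)
{
  if (sourceid != 0) {
    g_source_remove (sourceid);
    sourceid = 0;
  }
}
```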
<sect3 id="section-spoof-appsrc-pull">
<title>Using appsrc in pull mode</title>
In the pull model, data is fed to appsrc from the need-data signal
handler. You should push exactly the number of bytes requested in the
need-data signal. You are only allowed to push fewer bytes when you are
at the end of the stream.

Use this model for file access or other randomly accessible sources.
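A need-data handler for a hypothetical file-backed source might look like
this; note that fewer than the requested bytes are only pushed at the end
of the stream:

```c
#include <gst/gst.h>
#include <stdio.h>

static FILE *datafile;          /* hypothetical: opened by the application */

static void
cb_need_data (GstElement *appsrc, guint length, gpointer user_data)
{
  GstBuffer *buffer;
  GstMapInfo map;
  size_t nread;
  GstFlowReturn ret;

  buffer = gst_buffer_new_allocate (NULL, length, NULL);
  gst_buffer_map (buffer, &map, GST_MAP_WRITE);
  nread = fread (map.data, 1, length, datafile);
  gst_buffer_unmap (buffer, &map);

  if (nread == 0) {
    /* nothing left: signal EOS instead of pushing an empty buffer */
    gst_buffer_unref (buffer);
    g_signal_emit_by_name (appsrc, "end-of-stream", &ret);
    return;
  }

  /* pushing fewer bytes than requested is only allowed at EOS */
  gst_buffer_set_size (buffer, nread);

  g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
  gst_buffer_unref (buffer);
}
```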
<sect3 id="section-spoof-appsrc-ex">
<title>Appsrc example</title>
This example application will generate black/white (it switches
every second) video to an Xv-window output by using appsrc as a
source with caps to force a format. We use a colorspace
conversion element to make sure that we feed the right format to
your X server. We configure a video stream with a variable framerate
(0/1) and we set the timestamps on the outgoing buffers in such
a way that we play 2 frames per second.

Note how we feed new buffers into appsrc from the need-data signal
handler, pull-mode style, even though appsrc is operating in push mode.
<!-- example-begin appsrc.c -->
static GMainLoop *loop;

static void
cb_need_data (GstElement *appsrc,
              guint       unused_size,
              gpointer    user_data)
{
  static gboolean white = FALSE;
  static GstClockTime timestamp = 0;
  GstBuffer *buffer;
  guint size;
  GstFlowReturn ret;

  size = 384 * 288 * 2;

  buffer = gst_buffer_new_allocate (NULL, size, NULL);

  /* this makes the image black/white */
  gst_buffer_memset (buffer, 0, white ? 0xff : 0x0, size);

  white = !white;

  GST_BUFFER_PTS (buffer) = timestamp;
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 2);

  timestamp += GST_BUFFER_DURATION (buffer);

  g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
  gst_buffer_unref (buffer);

  if (ret != GST_FLOW_OK) {
    /* something wrong, stop pushing */
    g_main_loop_quit (loop);
  }
}
gint
main (gint   argc,
      gchar *argv[])
{
  GstElement *pipeline, *appsrc, *conv, *videosink;

  /* init GStreamer */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* setup pipeline */
  pipeline = gst_pipeline_new ("pipeline");
  appsrc = gst_element_factory_make ("appsrc", "source");
  conv = gst_element_factory_make ("videoconvert", "conv");
  videosink = gst_element_factory_make ("xvimagesink", "videosink");

  /* setup */
  g_object_set (G_OBJECT (appsrc), "caps",
      gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "RGB16",
          "width", G_TYPE_INT, 384,
          "height", G_TYPE_INT, 288,
          "framerate", GST_TYPE_FRACTION, 0, 1,
          NULL), NULL);
  gst_bin_add_many (GST_BIN (pipeline), appsrc, conv, videosink, NULL);
  gst_element_link_many (appsrc, conv, videosink, NULL);

  /* setup appsrc */
  g_object_set (G_OBJECT (appsrc),
      "stream-type", 0,
      "format", GST_FORMAT_TIME, NULL);
  g_signal_connect (appsrc, "need-data", G_CALLBACK (cb_need_data), NULL);

  /* play */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (loop);

  /* clean up */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (pipeline));
  g_main_loop_unref (loop);

  return 0;
}
<!-- example-end appsrc.c -->
<sect2 id="section-spoof-appsink">
<title>Grabbing data with appsink</title>
Unlike appsrc, appsink is a little easier to use. It also supports
a pull and push based model of getting data from the pipeline.

The normal way of retrieving samples from appsink is by using the
<function>gst_app_sink_pull_sample ()</function> and
<function>gst_app_sink_pull_preroll ()</function> methods or by using
the <quote>pull-sample</quote> and <quote>pull-preroll</quote>
signals. These methods block until a sample becomes available in the
sink or the sink is shut down or reaches EOS.
Appsink will internally use a queue to collect buffers from the
streaming thread. If the application is not pulling samples fast
enough, this queue will consume a lot of memory over time. The
<quote>max-buffers</quote> property can be used to limit the queue
size. The <quote>drop</quote> property controls whether the
streaming thread blocks or older buffers are dropped when the
maximum queue size is reached. Note that blocking the streaming thread
can negatively affect real-time performance and should be avoided.

If blocking behaviour is not desirable, setting the
<quote>emit-signals</quote> property to TRUE will make appsink emit
the <quote>new-sample</quote> and <quote>new-preroll</quote> signals
when a sample can be pulled without blocking.
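A sketch of this signal-based approach (the property values chosen here
are illustrative):

```c
#include <gst/gst.h>

/* new-sample: a sample is ready and can be pulled without blocking.
 * The callback must return a GstFlowReturn. */
static GstFlowReturn
cb_new_sample (GstElement *appsink, gpointer user_data)
{
  GstSample *sample = NULL;

  g_signal_emit_by_name (appsink, "pull-sample", &sample);
  if (sample) {
    GstBuffer *buffer = gst_sample_get_buffer (sample);

    g_print ("got sample, buffer size %" G_GSIZE_FORMAT "\n",
        gst_buffer_get_size (buffer));
    gst_sample_unref (sample);
  }
  return GST_FLOW_OK;
}

static void
setup_appsink (GstElement *appsink)
{
  /* emit-signals must be enabled explicitly, it costs some performance;
   * drop old buffers instead of blocking the streaming thread */
  g_object_set (appsink, "emit-signals", TRUE, "max-buffers", 2,
      "drop", TRUE, NULL);
  g_signal_connect (appsink, "new-sample", G_CALLBACK (cb_new_sample), NULL);
}
```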
The <quote>caps</quote> property on appsink can be used to control
the formats that appsink can receive. This property can contain
non-fixed caps; the format of the pulled samples can be obtained by
getting the sample caps.

If one of the pull-preroll or pull-sample methods returns NULL, the
appsink is stopped or in the EOS state. You can check for the EOS state
with the <quote>eos</quote> property or with the
<function>gst_app_sink_is_eos ()</function> method.

The eos signal can also be used to be informed when the EOS state is
reached, to avoid polling.
Consider configuring the following properties in the appsink:

The <quote>sync</quote> property if you want to have the sink
base class synchronize the buffer against the pipeline clock
before handing you the sample.

Enable Quality-of-Service with the <quote>qos</quote> property.
If you are dealing with raw video frames and let the base class
synchronize on the clock, it might be a good idea to also let
the base class send QOS events upstream.

The caps property that contains the accepted caps. Upstream elements
will try to convert the format so that it matches the configured
caps on appsink. You must still check the
<classname>GstSample</classname> to get the actual caps of the
buffer.
<sect3 id="section-spoof-appsink-ex">
<title>Appsink example</title>
What follows is an example of how to capture a snapshot of a video
stream using appsink.
<!-- example-begin appsink.c -->
#include <stdlib.h>

#include <gst/gst.h>
#include <gdk-pixbuf/gdk-pixbuf.h>

#define CAPS "video/x-raw,format=RGB,width=160,pixel-aspect-ratio=1/1"

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *sink;
  gint width, height;
  GstSample *sample;
  gchar *descr;
  GError *error = NULL;
  gint64 duration = -1, position;
  GstStateChangeReturn ret;
  gboolean res;
  GstMapInfo map;

  gst_init (&argc, &argv);

  if (argc != 2) {
    g_print ("usage: %s <uri>\n Writes snapshot.png in the current directory\n",
        argv[0]);
    exit (-1);
  }

  /* create a new pipeline */
  descr =
      g_strdup_printf ("uridecodebin uri=%s ! videoconvert ! videoscale ! "
      " appsink name=sink caps=\"" CAPS "\"", argv[1]);
  pipeline = gst_parse_launch (descr, &error);

  if (error != NULL) {
    g_print ("could not construct pipeline: %s\n", error->message);
    g_clear_error (&error);
    exit (-1);
  }

  /* get sink */
  sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");

  /* set to PAUSED to make the first frame arrive in the sink */
  ret = gst_element_set_state (pipeline, GST_STATE_PAUSED);
  switch (ret) {
    case GST_STATE_CHANGE_FAILURE:
      g_print ("failed to play the file\n");
      exit (-1);
    case GST_STATE_CHANGE_NO_PREROLL:
      /* for live sources, we need to set the pipeline to PLAYING before we can
       * receive a buffer. We don't do that yet */
      g_print ("live sources not supported yet\n");
      exit (-1);
    default:
      break;
  }
  /* This can block for up to 5 seconds. If your machine is really overloaded,
   * it might time out before the pipeline prerolled and we generate an error. A
   * better way is to run a mainloop and catch errors there. */
  ret = gst_element_get_state (pipeline, NULL, NULL, 5 * GST_SECOND);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_print ("failed to play the file\n");
    exit (-1);
  }

  /* get the duration */
  gst_element_query_duration (pipeline, GST_FORMAT_TIME, &duration);

  if (duration != -1)
    /* we have a duration, seek to 5% */
    position = duration * 5 / 100;
  else
    /* no duration, seek to 1 second, this could EOS */
    position = 1 * GST_SECOND;

  /* seek to a position in the file. Most files have a black first frame so
   * by seeking to somewhere else we have a bigger chance of getting something
   * more interesting. An optimisation would be to detect black images and then
   * seek a little more */
  gst_element_seek_simple (pipeline, GST_FORMAT_TIME,
      GST_SEEK_FLAG_KEY_UNIT | GST_SEEK_FLAG_FLUSH, position);

  /* get the preroll buffer from appsink, this blocks until appsink really
   * prerolls */
  g_signal_emit_by_name (sink, "pull-preroll", &sample, NULL);

  /* if we have a buffer now, convert it to a pixbuf. It's possible that we
   * don't have a buffer because we went EOS right away or had an error. */
  if (sample) {
    GstBuffer *buffer;
    GstCaps *caps;
    GstStructure *s;
    GdkPixbuf *pixbuf;

    /* get the snapshot buffer format now. We set the caps on the appsink so
     * that it can only be an rgb buffer. The only thing we have not specified
     * on the caps is the height, which is dependent on the pixel-aspect-ratio
     * of the source material */
    caps = gst_sample_get_caps (sample);
    if (!caps) {
      g_print ("could not get snapshot format\n");
      exit (-1);
    }
    s = gst_caps_get_structure (caps, 0);

    /* we need to get the final caps on the buffer to get the size */
    res = gst_structure_get_int (s, "width", &width);
    res |= gst_structure_get_int (s, "height", &height);
    if (!res) {
      g_print ("could not get snapshot dimension\n");
      exit (-1);
    }

    /* create pixmap from buffer and save, gstreamer video buffers have a stride
     * that is rounded up to the nearest multiple of 4 */
    buffer = gst_sample_get_buffer (sample);
    gst_buffer_map (buffer, &map, GST_MAP_READ);
    pixbuf = gdk_pixbuf_new_from_data (map.data,
        GDK_COLORSPACE_RGB, FALSE, 8, width, height,
        GST_ROUND_UP_4 (width * 3), NULL, NULL);

    /* save the pixbuf */
    gdk_pixbuf_save (pixbuf, "snapshot.png", "png", &error, NULL);

    gst_buffer_unmap (buffer, &map);
    gst_sample_unref (sample);
  } else {
    g_print ("could not make snapshot\n");
  }

  /* cleanup and exit */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);

  exit (0);
}
<!-- example-end appsink.c -->
<sect1 id="section-spoof-format">
<title>Forcing a format</title>
Sometimes you'll want to set a specific format, for example a video
size and format or an audio bitsize and number of channels. You can
do this by forcing a specific <classname>GstCaps</classname> on
the pipeline, which is possible by using
<emphasis>filtered caps</emphasis>. You can set filtered caps on
a link by using the <quote>capsfilter</quote> element between the
two elements and specifying a <classname>GstCaps</classname> as the
<quote>caps</quote> property on this element. It will then
only allow types matching that specified capability set for
negotiation. See also <xref linkend="section-caps-filter"/>.
<sect2 id="section-dynamic-format">
<title>Changing format in a PLAYING pipeline</title>
It is also possible to dynamically change the format in a pipeline
while PLAYING. This can simply be done by changing the caps
property on a capsfilter. The capsfilter will send a RECONFIGURE
event upstream that will make the upstream element attempt to
renegotiate a new format and allocator. This only works if
the upstream element is not using fixed caps on its source pad.
Below is an example of how you can change the caps of a pipeline
while in the PLAYING state:
<!-- example-begin dynformat.c -->
#include <stdlib.h>
#include <gst/gst.h>

#define MAX_ROUND 100

int
main (int argc, char **argv)
{
  GstElement *pipe, *filter;
  GstCaps *caps;
  GstMessage *message;
  gint width = 320, height = 240;
  gint xdir = -10, ydir = -10;
  gint round;

  gst_init (&argc, &argv);

  pipe = gst_parse_launch_full ("videotestsrc ! capsfilter name=filter ! "
      "ximagesink", NULL, GST_PARSE_FLAG_NONE, NULL);
  g_assert (pipe != NULL);

  filter = gst_bin_get_by_name (GST_BIN (pipe), "filter");
  g_assert (filter != NULL);

  for (round = 0; round < MAX_ROUND; round++) {
    gchar *capsstr;

    g_print ("resize to %dx%d (%d/%d) \r", width, height, round, MAX_ROUND);

    /* we prefer our fixed width and height but allow other dimensions to pass
     * as well */
    capsstr = g_strdup_printf ("video/x-raw, width=(int)%d, height=(int)%d",
        width, height);
    caps = gst_caps_from_string (capsstr);
    g_free (capsstr);
    g_object_set (filter, "caps", caps, NULL);
    gst_caps_unref (caps);

    if (round == 0)
      gst_element_set_state (pipe, GST_STATE_PLAYING);

    /* bounce the dimensions between the configured bounds */
    width += xdir;
    if (width >= 320)
      xdir = -10;
    else if (width < 200)
      xdir = 10;

    height += ydir;
    if (height >= 240)
      ydir = -10;
    else if (height < 150)
      ydir = 10;

    /* poll the bus with a small timeout; this also acts as a short sleep */
    message = gst_bus_poll (GST_ELEMENT_BUS (pipe), GST_MESSAGE_ERROR,
        50 * GST_MSECOND);
    if (message) {
      g_print ("got error \n");
      gst_message_unref (message);
      break;
    }
  }
  g_print ("done \n");

  gst_object_unref (filter);
  gst_element_set_state (pipe, GST_STATE_NULL);
  gst_object_unref (pipe);

  return 0;
}
<!-- example-end dynformat.c -->
Note how we use <function>gst_bus_poll ()</function> with a
small timeout to get messages; this also introduces a short
sleep between resize operations.
It is possible to set multiple caps on the capsfilter, separated
with a <quote>;</quote>. The capsfilter will try to renegotiate to the first
possible format from the list.
<sect1 id="section-dynamic-pipelines">
<title>Dynamically changing the pipeline</title>
In this section we talk about some techniques for dynamically
modifying the pipeline. We are talking specifically about changing
the pipeline while it is in the PLAYING state, without interrupting
dataflow.
There are some important things to consider when building dynamic
pipelines:
When removing elements from the pipeline, make sure that there
is no dataflow on unlinked pads, because that will cause a fatal
pipeline error. Always block source pads (in push mode) or
sink pads (in pull mode) before unlinking pads.
See also <xref linkend="section-dynamic-changing"/>.
When adding elements to a pipeline, make sure to put the element
into the right state, usually the same state as the parent, before
allowing dataflow into the element. When an element is newly created,
it is in the NULL state and will return an error when it
receives data.
See also <xref linkend="section-dynamic-changing"/>.
When adding elements to a pipeline, &GStreamer; will by default
set the clock and base-time on the element to the current values
of the pipeline. This means that the element will be able to
construct the same pipeline running-time as the other elements
in the pipeline, so that sinks synchronize buffers
like the other sinks in the pipeline and sources produce
buffers with a running-time that matches the other sources.
When unlinking elements from an upstream chain, always make sure
to flush any queued data in the element by sending an EOS event
down the element's sink pad(s) and by waiting until the EOS leaves
the element (with an event probe).
If you do not do this, you will lose the data buffered
by the unlinked element. This can result in simple frame loss
(one or more video frames, several milliseconds of audio). However,
if you remove a muxer (and in some cases an encoder or similar element)
from the pipeline, you risk producing a corrupted file that cannot be
played properly, as some relevant metadata (headers, seek/index tables,
internal sync tags) will not be stored or updated properly.
See also <xref linkend="section-dynamic-changing"/>.
A live source will produce buffers with a running-time equal to the
current running-time of the pipeline.
A pipeline without a live source produces buffers with a
running-time starting from 0. Likewise, after a flushing seek,
those pipelines reset the running-time back to 0.
The running-time can be changed with
<function>gst_pad_set_offset ()</function>. It is important to
know the running-time of the elements in the pipeline in order
to maintain synchronization.
Adding elements might change the state of the pipeline. Adding a
non-prerolled sink, for example, brings the pipeline back to the
prerolling state; removing a non-prerolled sink, on the other hand,
might allow the pipeline to move to the PAUSED or PLAYING state.
Adding a live source cancels the preroll stage and puts the pipeline
in the PLAYING state. Adding a live source or other live elements
might also change the latency of the pipeline.
Adding elements to or removing elements from the pipeline might change
the clock selection of the pipeline. If the newly added element provides a clock,
it might be worth changing the clock in the pipeline to the new
clock. If, on the other hand, the element that provides the clock
for the pipeline is removed, a new clock has to be selected.
Adding and removing elements might cause upstream or downstream
elements to renegotiate caps and/or allocators. You don't really
need to do anything from the application; plugins largely
adapt themselves to the new pipeline topology in order to optimize
their formats and allocation strategy.
What is important is that when you add, remove or change elements
in the pipeline, it is possible that the pipeline needs to
negotiate a new format, and this can fail. Usually you can fix this
by inserting the right converter elements where needed.
See also <xref linkend="section-dynamic-changing"/>.
&GStreamer; offers support for just about any dynamic pipeline
modification, but it requires you to know a few details before
you can do this without causing pipeline errors. In the following
sections we will demonstrate a couple of typical use-cases.
<sect2 id="section-dynamic-changing">
<title>Changing elements in a pipeline</title>
In the next example we look at the following chain of elements:
- ----.      .----------.      .---- -
element1     | element2 |      element3
       src -> sink      src -> sink
- ----'      '----------'      '---- -
We want to replace element2 with element4 while the pipeline is in
the PLAYING state. Let's say that element2 is a visualization and
that you want to switch to another visualization in the pipeline.
We can't just unlink element2's sink pad from element1's source
pad, because that would leave element1's source pad
unlinked and would cause a streaming error in the pipeline when
data is pushed on the source pad.
The technique is to block the dataflow from element1's source pad
before we replace element2 with element4, and then resume dataflow,
as shown in the following steps:
1. Block element1's source pad with a blocking pad probe. When the
pad is blocked, the probe callback will be called.
2. Inside the block callback nothing is flowing between element1
and element2, and nothing will flow until the pad is unblocked.
3. Unlink element1 and element2.
4. Make sure data is flushed out of element2. Some elements might
keep some data internally; you need to make sure not to lose data
by forcing it out of element2. You can do this by pushing EOS into
element2, like this:
- Put an event probe on element2's source pad.
- Send EOS to element2's sink pad. This makes sure that all the
data inside element2 is forced out.
- Wait for the EOS event to appear on element2's source pad.
When the EOS is received, drop it and remove the event
probe.
5. Unlink element2 and element3. You can now also remove element2
from the pipeline and set its state to NULL.
6. Add element4 to the pipeline, if not already added. Link element4
and element3. Link element1 and element4.
7. Make sure element4 is in the same state as the rest of the elements
in the pipeline. It should be at least in the PAUSED state before
it can receive buffers and events.
8. Unblock element1's source pad probe. This will let new data into
element4 and continue streaming.
The above algorithm works when the source pad is blocked, i.e. when
there is dataflow in the pipeline. If there is no dataflow, there is
also no point in changing the element (just yet), so this algorithm can
be used in the PAUSED state as well.
Let us show you how this works with an example. This example changes the
video effect on a simple pipeline once per second.
<!-- example-begin effectswitch.c -->
#include <gst/gst.h>

static gchar *opt_effects = NULL;

#define DEFAULT_EFFECTS "identity,exclusion,navigationtest," \
    "agingtv,videoflip,vertigotv,gaussianblur,shagadelictv,edgetv"

static GstPad *blockpad;
static GstElement *conv_before;
static GstElement *conv_after;
static GstElement *cur_effect;
static GstElement *pipeline;

static GQueue effects = G_QUEUE_INIT;

static GstPadProbeReturn
event_probe_cb (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
{
  GMainLoop *loop = user_data;
  GstElement *next;

  if (GST_EVENT_TYPE (GST_PAD_PROBE_INFO_DATA (info)) != GST_EVENT_EOS)
    return GST_PAD_PROBE_PASS;

  gst_pad_remove_probe (pad, GST_PAD_PROBE_INFO_ID (info));

  /* push current effect back into the queue */
  g_queue_push_tail (&effects, gst_object_ref (cur_effect));
  /* take next effect from the queue */
  next = g_queue_pop_head (&effects);
  if (next == NULL) {
    GST_DEBUG_OBJECT (pad, "no more effects");
    g_main_loop_quit (loop);
    return GST_PAD_PROBE_DROP;
  }

  g_print ("Switching from '%s' to '%s'..\n", GST_OBJECT_NAME (cur_effect),
      GST_OBJECT_NAME (next));

  gst_element_set_state (cur_effect, GST_STATE_NULL);

  /* remove unlinks automatically */
  GST_DEBUG_OBJECT (pipeline, "removing %" GST_PTR_FORMAT, cur_effect);
  gst_bin_remove (GST_BIN (pipeline), cur_effect);

  GST_DEBUG_OBJECT (pipeline, "adding %" GST_PTR_FORMAT, next);
  gst_bin_add (GST_BIN (pipeline), next);

  GST_DEBUG_OBJECT (pipeline, "linking..");
  gst_element_link_many (conv_before, next, conv_after, NULL);

  gst_element_set_state (next, GST_STATE_PLAYING);

  cur_effect = next;
  GST_DEBUG_OBJECT (pipeline, "done");

  return GST_PAD_PROBE_DROP;
}

static GstPadProbeReturn
pad_probe_cb (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
{
  GstPad *srcpad, *sinkpad;

  GST_DEBUG_OBJECT (pad, "pad is blocked now");

  /* remove the probe first */
  gst_pad_remove_probe (pad, GST_PAD_PROBE_INFO_ID (info));

  /* install new probe for EOS */
  srcpad = gst_element_get_static_pad (cur_effect, "src");
  gst_pad_add_probe (srcpad, GST_PAD_PROBE_TYPE_BLOCK |
      GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM, event_probe_cb, user_data, NULL);
  gst_object_unref (srcpad);

  /* push EOS into the element, the probe will be fired when the
   * EOS leaves the effect and it has thus drained all of its data */
  sinkpad = gst_element_get_static_pad (cur_effect, "sink");
  gst_pad_send_event (sinkpad, gst_event_new_eos ());
  gst_object_unref (sinkpad);

  return GST_PAD_PROBE_OK;
}

static gboolean
timeout_cb (gpointer user_data)
{
  gst_pad_add_probe (blockpad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
      pad_probe_cb, user_data, NULL);

  return TRUE;
}

static gboolean
bus_cb (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  GMainLoop *loop = user_data;

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR:{
      GError *err = NULL;
      gchar *dbg = NULL;

      gst_message_parse_error (msg, &err, &dbg);
      gst_object_default_error (msg->src, err, dbg);
      g_clear_error (&err);
      g_free (dbg);

      g_main_loop_quit (loop);
      break;
    }
    default:
      break;
  }
  return TRUE;
}

int
main (int argc, char **argv)
{
  GOptionEntry options[] = {
    {"effects", 'e', 0, G_OPTION_ARG_STRING, &opt_effects,
        "Effects to use (comma-separated list of element names)", NULL},
    {NULL}
  };
  GOptionContext *ctx;
  GError *err = NULL;
  GMainLoop *loop;
  GstElement *src, *q1, *q2, *effect, *filter1, *filter2, *sink;
  gchar **effect_names, **e;

  ctx = g_option_context_new ("");
  g_option_context_add_main_entries (ctx, options, NULL);
  g_option_context_add_group (ctx, gst_init_get_option_group ());
  if (!g_option_context_parse (ctx, &argc, &argv, &err)) {
    g_print ("Error initializing: %s\n", err->message);
    g_clear_error (&err);
    g_option_context_free (ctx);
    return 1;
  }
  g_option_context_free (ctx);

  if (opt_effects != NULL)
    effect_names = g_strsplit (opt_effects, ",", -1);
  else
    effect_names = g_strsplit (DEFAULT_EFFECTS, ",", -1);

  for (e = effect_names; e != NULL && *e != NULL; ++e) {
    GstElement *el;

    el = gst_element_factory_make (*e, NULL);
    if (el) {
      g_print ("Adding effect '%s'\n", *e);
      g_queue_push_tail (&effects, el);
    }
  }
  g_strfreev (effect_names);

  pipeline = gst_pipeline_new ("pipeline");

  src = gst_element_factory_make ("videotestsrc", NULL);
  g_object_set (src, "is-live", TRUE, NULL);

  filter1 = gst_element_factory_make ("capsfilter", NULL);
  gst_util_set_object_arg (G_OBJECT (filter1), "caps",
      "video/x-raw, width=320, height=240, "
      "format={ I420, YV12, YUY2, UYVY, AYUV, Y41B, Y42B, "
      "YVYU, Y444, v210, v216, NV12, NV21, UYVP, A420, YUV9, YVU9, IYU1 }");

  q1 = gst_element_factory_make ("queue", NULL);

  blockpad = gst_element_get_static_pad (q1, "src");

  conv_before = gst_element_factory_make ("videoconvert", NULL);

  effect = g_queue_pop_head (&effects);
  cur_effect = effect;

  conv_after = gst_element_factory_make ("videoconvert", NULL);

  q2 = gst_element_factory_make ("queue", NULL);

  filter2 = gst_element_factory_make ("capsfilter", NULL);
  gst_util_set_object_arg (G_OBJECT (filter2), "caps",
      "video/x-raw, width=320, height=240, "
      "format={ RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR }");

  sink = gst_element_factory_make ("ximagesink", NULL);

  gst_bin_add_many (GST_BIN (pipeline), src, filter1, q1, conv_before, effect,
      conv_after, q2, sink, NULL);

  gst_element_link_many (src, filter1, q1, conv_before, effect, conv_after,
      q2, sink, NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  loop = g_main_loop_new (NULL, FALSE);

  gst_bus_add_watch (GST_ELEMENT_BUS (pipeline), bus_cb, loop);

  g_timeout_add_seconds (1, timeout_cb, loop);

  g_main_loop_run (loop);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);

  return 0;
}
<!-- example-end effectswitch.c -->
Note how we added videoconvert elements before and after the effect.
This is needed because some elements might operate in different
colorspaces than other elements. By inserting the conversion elements
you ensure that the right format can be negotiated at any time.