1 <!-- ############ chapter ############# -->
3 <chapter id="chapter-intro-basics" xreflabel="Basic Concepts">
4 <title>Basic Concepts</title>
6 This chapter of the guide introduces the basic concepts of &GStreamer;.
7 Understanding these concepts will help you grok the issues involved in
8 extending &GStreamer;. Many of these concepts are explained in greater
9 detail in the &GstAppDevMan;; the basic concepts presented here serve mainly
10 to refresh your memory.
13 <!-- ############ sect1 ############# -->
15 <sect1 id="section-basics-elements" xreflabel="Elements and Plugins">
16 <title>Elements and Plugins</title>
18 Elements are at the core of &GStreamer;. In the context of plugin
19 development, an <emphasis>element</emphasis> is an object derived from the
20 <classname>GstElement</classname> class. Elements provide some sort of
21 functionality when linked with other elements: For example, a source
22 element provides data to a stream, and a filter element acts on the data
23 in a stream. Without elements, &GStreamer; is just a bunch of conceptual
24 pipe fittings with nothing to link. A large number of elements ship
25 with &GStreamer;, but extra elements can also be written.
Writing a new element is not quite enough, however: You will need
29 to encapsulate your element in a <emphasis>plugin</emphasis> to enable
30 &GStreamer; to use it. A plugin is essentially a loadable block of code,
31 usually called a shared object file or a dynamically linked library. A
32 single plugin may contain the implementation of several elements, or just
33 a single one. For simplicity, this guide concentrates primarily on plugins
34 containing one element.
37 A <emphasis>filter</emphasis> is an important type of element that
38 processes a stream of data. Producers and consumers of data are called
39 <emphasis>source</emphasis> and <emphasis>sink</emphasis> elements,
40 respectively. <emphasis>Bin</emphasis> elements contain other elements.
One type of bin is responsible for scheduling the elements that it
contains so that data flows smoothly. Another type of bin, the
<emphasis>autoplugger</emphasis>, automatically adds other
elements to the bin and links them together so that they act as a
filter between two arbitrary stream types.
48 The plugin mechanism is used everywhere in &GStreamer;, even if only the
49 standard packages are being used. A few very basic functions reside in the
50 core library, and all others are implemented in plugins. A plugin registry
51 is used to store the details of the plugins in an XML file. This way, a
52 program using &GStreamer; does not have to load all plugins to determine
which are needed. Plugins are only loaded when their provided elements are
requested.
57 See the &GstLibRef; for the current implementation details of <ulink
59 url="../gstreamer/gstelement.html"><classname>GstElement</classname></ulink>
60 and <ulink type="http"
61 url="../gstreamer/gstreamer-gstplugin.html"><classname>GstPlugin</classname></ulink>.
65 <!-- ############ sect1 ############# -->
<sect1 id="section-basics-pads" xreflabel="Pads">
<title>Pads</title>
70 <emphasis>Pads</emphasis> are used to negotiate links and data flow
71 between elements in &GStreamer;. A pad can be viewed as a
72 <quote>place</quote> or <quote>port</quote> on an element where
73 links may be made with other elements, and through which data can
74 flow to or from those elements. Pads have specific data handling
75 capabilities: A pad can restrict the type of data that flows
76 through it. Links are only allowed between two pads when the
77 allowed data types of the two pads are compatible.
80 An analogy may be helpful here. A pad is similar to a plug or jack on a
81 physical device. Consider, for example, a home theater system consisting
82 of an amplifier, a DVD player, and a (silent) video projector. Linking
83 the DVD player to the amplifier is allowed because both devices have audio
84 jacks, and linking the projector to the DVD player is allowed because
85 both devices have compatible video jacks. Links between the
86 projector and the amplifier may not be made because the projector and
87 amplifier have different types of jacks. Pads in &GStreamer; serve the
88 same purpose as the jacks in the home theater system.
91 For the most part, all data in &GStreamer; flows one way through a link
92 between elements. Data flows out of one element through one or more
93 <emphasis>source pads</emphasis>, and elements accept incoming data through
94 one or more <emphasis>sink pads</emphasis>. Source and sink elements have
95 only source and sink pads, respectively.
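The linking rule just described can be sketched in plain C. This is a conceptual model with hypothetical names, not the real <classname>GstPad</classname> API: a link requires exactly one source pad and one sink pad, and their media types must be compatible.

```c
#include <assert.h>
#include <string.h>

/* Hypothetical model of pad linking: data flows out of source pads
 * and into sink pads, so a link needs one of each, and the media
 * types on both sides must be compatible. */
typedef enum { PAD_SRC, PAD_SINK } PadDirection;

typedef struct {
  PadDirection direction;
  const char *media_type; /* e.g. "audio/x-raw-int" */
} Pad;

int pads_can_link(const Pad *a, const Pad *b) {
  if (a->direction == b->direction)
    return 0; /* need one source pad and one sink pad */
  /* a full implementation would intersect capabilities; a string
   * comparison stands in for that here */
  return strcmp(a->media_type, b->media_type) == 0;
}
```

In the home theater analogy, the direction check is the plug/jack distinction and the media-type check is the audio-jack/video-jack distinction.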
98 See the &GstLibRef; for the current implementation details of a <ulink
100 url="../gstreamer/gstreamer-gstpad.html"><classname>GstPad</classname></ulink>.
104 <!-- ############ sect1 ############# -->
106 <sect1 id="section-basics-data" xreflabel="Data, Buffers and Events">
107 <title>Data, Buffers and Events</title>
109 All streams of data in &GStreamer; are chopped up into chunks that are
110 passed from a source pad on one element to a sink pad on another element.
<emphasis>Data</emphasis> are structures used to hold these chunks of data.
115 Data contains the following important types:
An exact type indicating what type of data (control, content, ...)
this Data is.
125 A reference count indicating the number of elements currently
126 holding a reference to the buffer. When the buffer reference count
127 falls to zero, the buffer will be unlinked, and its memory will be
128 freed in some sense (see below for more details).
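The reference-counting lifecycle can be illustrated with a miniature buffer in plain C. The struct and function names below are illustrative, not the real GStreamer types; GStreamer performs this bookkeeping for you on its Data objects.

```c
#include <assert.h>
#include <stdlib.h>

/* Hypothetical miniature buffer with a reference count; the real
 * GstBuffer is far more elaborate, but the lifecycle idea is the
 * same: the last holder to unref frees the memory. */
typedef struct {
  int refcount;
  void *data;
} MiniBuffer;

MiniBuffer *mini_buffer_new(size_t size) {
  MiniBuffer *buf = malloc(sizeof *buf);
  buf->refcount = 1; /* the creator holds the first reference */
  buf->data = malloc(size);
  return buf;
}

void mini_buffer_ref(MiniBuffer *buf) { buf->refcount++; }

/* Returns 1 if this unref freed the buffer, 0 otherwise. */
int mini_buffer_unref(MiniBuffer *buf) {
  if (--buf->refcount == 0) {
    free(buf->data);
    free(buf);
    return 1;
  }
  return 0;
}
```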
There are two types of data defined: events (control) and buffers
(content).
138 Buffers may contain any sort of data that the two linked pads
139 know how to handle. Normally, a buffer contains a chunk of some sort of
140 audio or video data that flows from one element to another.
143 Buffers also contain metadata describing the buffer's contents. Some of
144 the important types of metadata are:
148 A pointer to the buffer's data.
153 An integer indicating the size of the buffer's data.
158 A timestamp indicating the preferred display timestamp of the
159 content in the buffer.
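The metadata just listed can be pictured as a small struct. The field names below are illustrative, not the real <classname>GstBuffer</classname> layout; the helper shows how a raw-audio timestamp relates the sample offset to the sample rate, assuming nanosecond units.

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

/* Hypothetical sketch of the metadata a buffer carries alongside
 * its payload; field names mirror the description above. */
typedef struct {
  uint8_t *data;      /* pointer to the buffer's payload */
  size_t   size;      /* payload size in bytes */
  uint64_t timestamp; /* preferred display time, in nanoseconds */
} BufferMeta;

/* Timestamp of a buffer that starts at sample number `offset`,
 * for raw audio at `rate` samples per second. */
uint64_t sample_to_ns(uint64_t offset, uint64_t rate) {
  return offset * 1000000000ULL / rate;
}
```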
Events contain information on the state of the stream flowing between the two
linked pads. Events will only be sent if the element explicitly supports
them; otherwise, the core will (try to) handle the events automatically. Events
are used to indicate, for example, a clock discontinuity, the end of a
media stream, or that the cache should be flushed.
173 Events may contain several of the following items:
177 A subtype indicating the type of the contained event.
182 The other contents of the event depend on the specific event type.
188 See the &GstLibRef; for the current implementation details of a <ulink
190 url="../gstreamer/gstreamer-gstdata.html"><classname>GstData</classname></ulink>, <ulink type="http"
191 url="../gstreamer/gstreamer-gstbuffer.html"><classname>GstBuffer</classname></ulink> and <ulink type="http"
192 url="../gstreamer/gstreamer-gstevent.html"><classname>GstEvent</classname></ulink>.
<sect2 id="sect2-buffers-bufferpools" xreflabel="Buffer Allocation and Buffer Pools">
197 <title>Buffer Allocation and Buffer Pools</title>
199 Buffers can be allocated using various schemes, and they may either be
200 passed on by an element or unreferenced, thus freeing the memory used by
the buffer. Buffer allocation and unreferencing are important concepts when
202 dealing with real time media processing, since memory allocation is
203 relatively slow on most systems.
206 To improve the latency in a media pipeline, many &GStreamer; elements
207 use a <emphasis>buffer pool</emphasis> to handle buffer allocation and
208 releasing. A buffer pool is a virtual representation of one or more
buffers whose data is not actually allocated by &GStreamer;
210 itself. Examples of these include hardware framebuffer memory in video
211 output elements or kernel-allocated DMA memory for video capture. The
212 huge advantage of using these buffers instead of creating our own is
213 that we do not have to copy memory from one place to another, thereby
saving a noticeable number of CPU cycles. Elements should not provide
a bufferpool merely to decrease the number of memory allocations: the kernel
will generally take care of that, and will probably do so much more
efficiently than we ever could. Using bufferpools in that way is highly
discouraged.
221 Normally in a media pipeline, most filter elements in &GStreamer; deal
222 with a buffer in place, meaning that they do not create or destroy
223 buffers. Sometimes, however, elements might need to alter the reference
224 count of a buffer, either by copying or destroying the buffer, or by
225 creating a new buffer. These topics are generally reserved for
226 non-filter elements, so they will be addressed at that point.
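In-place processing can be made concrete with a small example. The function below is a hypothetical gain stage, not a real GStreamer element: it scales 16-bit samples in the memory it was handed, creating and destroying nothing, which is exactly the pattern a typical filter follows.

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

/* Hypothetical in-place filter: scale 16-bit samples by num/den
 * directly in the caller's buffer. No buffer is copied, created,
 * or destroyed, so the buffer's reference count is untouched. */
void apply_gain_in_place(int16_t *samples, size_t n, int num, int den) {
  for (size_t i = 0; i < n; i++) {
    int32_t v = (int32_t)samples[i] * num / den;
    if (v > INT16_MAX) v = INT16_MAX; /* clamp to avoid wraparound */
    if (v < INT16_MIN) v = INT16_MIN;
    samples[i] = (int16_t)v;
  }
}
```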
231 <!-- ############ sect1 ############# -->
233 <sect1 id="section-basics-types" xreflabel="Types and Properties">
234 <title>Mimetypes and Properties</title>
236 &GStreamer; uses a type system to ensure that the data passed between
237 elements is in a recognized format. The type system is also important
238 for ensuring that the parameters required to fully specify a format match
239 up correctly when linking pads between elements. Each link that is
made between elements has a specified type and optionally a set of
properties.
244 <!-- ############ sect2 ############# -->
246 <sect2 id="sect2-types-basictypes" xreflabel="Basic Types">
247 <title>The Basic Types</title>
249 &GStreamer; already supports many basic media types. Following is a
table of a few of the basic types used for buffers in
251 &GStreamer;. The table contains the name ("mime type") and a
252 description of the type, the properties associated with the type, and
253 the meaning of each property. A full list of supported types is
254 included in <xref linkend="section-types-definitions"/>.
257 <table frame="all" id="table-basictypes" xreflabel="Table of Basic Types">
258 <title>Table of Basic Types</title>
259 <tgroup cols="6" align="left" colsep="1" rowsep="1">
263 <entry>Mime Type</entry>
264 <entry>Description</entry>
265 <entry>Property</entry>
266 <entry>Property Type</entry>
267 <entry>Property Values</entry>
268 <entry>Property Description</entry>
274 <!-- ############ type ############# -->
277 <entry morerows="1">audio/*</entry>
279 <emphasis>All audio types</emphasis>
<entry>rate</entry>
282 <entry>integer</entry>
283 <entry>greater than 0</entry>
285 The sample rate of the data, in samples (per channel) per second.
289 <entry>channels</entry>
290 <entry>integer</entry>
291 <entry>greater than 0</entry>
293 The number of channels of audio data.
297 <!-- ############ type ############# -->
300 <entry morerows="3">audio/x-raw-int</entry>
302 Unstructured and uncompressed raw integer audio data.
304 <entry>endianness</entry>
305 <entry>integer</entry>
<entry>G_BIG_ENDIAN (4321) or G_LITTLE_ENDIAN (1234)</entry>
The order of bytes in a sample. The value G_LITTLE_ENDIAN (1234)
means <quote>little-endian</quote> (byte-order is <quote>least
significant byte first</quote>). The value G_BIG_ENDIAN (4321)
means <quote>big-endian</quote> (byte order is <quote>most
significant byte first</quote>).
316 <entry>signed</entry>
317 <entry>boolean</entry>
318 <entry>TRUE or FALSE</entry>
320 Whether the values of the integer samples are signed or not.
321 Signed samples use one bit to indicate sign (negative or
322 positive) of the value. Unsigned samples are always positive.
<entry>width</entry>
327 <entry>integer</entry>
328 <entry>greater than 0</entry>
330 Number of bits allocated per sample.
<entry>depth</entry>
335 <entry>integer</entry>
336 <entry>greater than 0</entry>
338 The number of bits used per sample. This must be less than or
339 equal to the width: If the depth is less than the width, the
340 low bits are assumed to be the ones used. For example, a width
341 of 32 and a depth of 24 means that each sample is stored in a
342 32 bit word, but only the low 24 bits are actually used.
346 <!-- ############ type ############# -->
349 <entry morerows="3">audio/mpeg</entry>
351 Audio data compressed using the MPEG audio encoding scheme.
353 <entry>mpegversion</entry>
354 <entry>integer</entry>
355 <entry>1, 2 or 4</entry>
357 The MPEG-version used for encoding the data. The value 1 refers
358 to MPEG-1, -2 and -2.5 layer 1, 2 or 3. The values 2 and 4 refer
359 to the MPEG-AAC audio encoding schemes.
363 <entry>framed</entry>
364 <entry>boolean</entry>
365 <entry>0 or 1</entry>
367 A true value indicates that each buffer contains exactly one
368 frame. A false value indicates that frames and buffers do not
369 necessarily match up.
<entry>layer</entry>
374 <entry>integer</entry>
375 <entry>1, 2, or 3</entry>
377 The compression scheme layer used to compress the data
378 <emphasis>(only if mpegversion=1)</emphasis>.
382 <entry>bitrate</entry>
383 <entry>integer</entry>
384 <entry>greater than 0</entry>
386 The bitrate, in bits per second. For VBR (variable bitrate)
387 MPEG data, this is the average bitrate.
391 <!-- ############ type ############# -->
394 <entry>audio/x-vorbis</entry>
395 <entry>Vorbis audio data</entry>
400 There are currently no specific properties defined for this type.
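Two of the <classname>audio/x-raw-int</classname> rules in the table above lend themselves to a short illustration in plain C. The function names are illustrative: with width 32 and depth 24, only the low 24 bits of each word carry sample data, so a mask recovers the sample; and the GLib byte-order constants encode the byte order of a four-byte word, which a union can probe on the host machine.

```c
#include <assert.h>
#include <stdint.h>

/* With depth < width, the low `depth` bits of each word hold the
 * sample; mask the rest away. */
uint32_t extract_sample(uint32_t word, int depth) {
  uint32_t mask = (depth == 32) ? 0xFFFFFFFFu : ((1u << depth) - 1u);
  return word & mask;
}

/* Returns 1234 (G_LITTLE_ENDIAN) on a little-endian host and 4321
 * (G_BIG_ENDIAN) on a big-endian one, matching the table values. */
int host_byte_order(void) {
  union { uint32_t word; uint8_t bytes[4]; } probe = { 0x01020304u };
  return (probe.bytes[0] == 0x04) ? 1234 : 4321;
}
```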