And fix some broken bits here and there as well while at it.
# Autoplugging
-In [Your first application](manual/building/helloworld.md), you've learned to
+In [Your first application][helloworld], you've learned to
build a simple media player for Ogg/Vorbis files. By using alternative
elements, you are able to build media players for other media types,
such as Ogg/Speex, MP3 or even video formats. However, you would rather
by looking at all available elements in a system. This process is called
autoplugging, and GStreamer contains high-quality autopluggers. If
you're looking for an autoplugger, don't read any further and go to
-[Playback Components](manual/highlevel/playback-components.md). This chapter will
+[Playback Components][playback-components]. This chapter will
explain the *concept* of autoplugging and typefinding. It will explain
what systems GStreamer includes to dynamically detect the type of a
media stream, and how to generate a pipeline of decoder elements to
be used to setup a pipeline that will convert media from one mediatype
to another, for example for media decoding.
+[helloworld]: application-development/building/helloworld.md
+[playback-components]: application-development/highlevel/playback-components.md
+
## Media types as a way to identify streams
We have previously introduced the concept of capabilities as a way for
elements (or, rather, pads) to agree on a media type when streaming data
-from one element to the next (see [Capabilities of a
-pad](manual/building/pads.md#capabilities-of-a-pad)). We have explained that a
-capability is a combination of a media type and a set of properties. For
-most container formats (those are the files that you will find on your
-hard disk; Ogg, for example, is a container format), no properties are
+from one element to the next (see [Capabilities of a pad][pad-caps]). We have
+explained that a capability is a combination of a media type and a set of
+properties. For most container formats (those are the files that you will find
+on your hard disk; Ogg, for example, is a container format), no properties are
needed to describe the stream. Only a media type is needed. A full list
of media types and accompanying properties can be found in [the Plugin
-Writer's
-Guide](http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/section-types-definitions.html).
+Writer's Guide][pwg-type-defs].
An element must associate a media type to its source and sink pads when
it is loaded into the system. GStreamer knows about the different
GStreamer registry. This allows for very dynamic and extensible element
creation as we will see.
-In [Your first application](manual/building/helloworld.md), we've learned to
+In [Your first application][helloworld], we've learned to
build a music player for Ogg/Vorbis files. Let's look at the media types
associated with each pad in this pipeline. [The Hello world pipeline
with media types](#the-hello-world-pipeline-with-media-types) shows what
media type belongs to each pad in this pipeline.
-
+
Now that we have an idea how GStreamer identifies known media streams,
-we can look at methods GStreamer uses to setup pipelines for media
+we can look at methods GStreamer uses to set up pipelines for media
handling and for media type detection.
+[pad-caps]: application-development/building/pads.md#capabilities-of-a-pad
+[pwg-type-defs]: pwg/advanced/building-types.md
+
## Media stream type detection
Usually, when loading a media stream, the type of the stream is not
## Dynamically autoplugging a pipeline
-See [Playback Components](manual/highlevel/playback-components.md) for using the
+See [Playback Components][playback-components] for using the
high level object that you can use to dynamically construct pipelines.
the “capsfilter” element in between the two elements, and specifying a
`GstCaps` as “caps” property on this element. It will then only allow
types matching that specified capability set for negotiation. See also
-[Creating capabilities for
-filtering](manual/building/pads.md#creating-capabilities-for-filtering).
+[Creating capabilities for filtering][filter-caps].
+
+[filter-caps]: application-development/building/pads.md#creating-capabilities-for-filtering
### Changing format in a PLAYING pipeline
# Interfaces
-In [Using an element as a
-GObject](manual/building/elements.md#using-an-element-as-a-gobject), you have
+In [Using an element as a GObject][element-object], you have
learned how to use `GObject` properties as a simple way to do
interaction between applications and elements. This method suffices for
the simple'n'straight settings, but fails for anything more complicated
See the API references for details. Here, we will just describe the
scope and purpose of each interface.
+[element-object]: application-development/building/elements.md#using-an-element-as-a-gobject
+
## The URI interface
In all examples so far, we have only supported local files through the
Stream information can most easily be read by reading it from a
`GstPad`. This has already been discussed before in [Using capabilities
-for metadata](manual/building/pads.md#using-capabilities-for-metadata).
+for metadata](application-development/building/pads.md#using-capabilities-for-metadata).
Therefore, we will skip it here. Note that this requires access to all
pads of which you want stream information.
Tag reading is done through a bus in GStreamer, which has been discussed
-previously in [Bus](manual/building/bus.md). You can listen for
+previously in [Bus](application-development/building/bus.md). You can listen for
`GST_MESSAGE_TAG` messages and handle them as you wish.
Note, however, that the `GST_MESSAGE_TAG` message may be fired multiple
- Data buffering, for example when dealing with network streams or
when recording data from a live stream such as a video or audio
-    card. Short hickups elsewhere in the pipeline will not cause data
+    card. Short hiccups elsewhere in the pipeline will not cause data
- loss. See also [Stream
- buffering](manual/advanced/buffering.md#stream-buffering) about network
+ loss. See also [Stream buffering][stream-buffering] about network
buffering with queue2.
![Data buffering, from a networked
the pipeline), one can simply create a “queue” element and put this in
as part of the pipeline. GStreamer will take care of all threading
details internally.
+
+[stream-buffering]: application-development/advanced/buffering.md#stream-buffering
- Applications should no longer use signal handlers to be notified of
errors, end-of-stream and other similar pipeline events. Instead,
they should use the `GstBus`, which has been discussed in
- [Bus](manual/building/bus.md). The bus will take care that the messages will
+ [Bus][bus]. The bus will take care that the messages will
be delivered in the context of a main loop, which is almost
certainly the application's main thread. The big advantage of this
is that applications no longer need to be thread-aware; they don't
- need to use `g_idle_add
- ()` in the signal handler and do the actual real work in the
- idle-callback. GStreamer now does all that internally.
+ need to use `g_idle_add ()` in the signal handler and do the actual
+ real work in the idle-callback. GStreamer now does all that internally.
- Related to this, `gst_bin_iterate ()` has been removed. Pipelines
will iterate in their own thread, and applications can simply run a
`GMainLoop` (or call the mainloop of their UI toolkit, such as
- `gtk_main
- ()`).
+ `gtk_main ()`).
- State changes can be delayed (ASYNC). Due to the new fully threaded
nature of GStreamer-0.10, state changes are not always immediate, in
case in 0.10. In 0.10, queries and events can be sent to toplevel
pipelines, and the pipeline will do the dispatching internally for
you. This means less bookkeeping in your application. For a short
- code example, see [Position tracking and
- seeking](manual/advanced/queryevents.md). Related, seeking is now
- threadsafe, and your video output will show the new video position's
- frame while seeking, providing a better user experience.
+ code example, see [Position tracking and seeking][queries-and-events].
+ Related, seeking is now threadsafe, and your video output will show the new
+ video position's frame while seeking, providing a better user experience.
- The `GstThread` object has been removed. Applications can now simply
put elements in a pipeline with optionally some “queue” elements in
between for buffering, and GStreamer will take care of creating
threads internally. It is still possible to have parts of a pipeline
run in different threads than others, by using the “queue” element.
- See [Threads](manual/advanced/threads.md) for details.
+ See [Threads][threads] for details.
- Filtered caps -\> capsfilter element (the pipeline syntax for
gst-launch has not changed though).
GOption command line option API that was added to GLib 2.6.
`gst_init_get_option_group ()` is the new GOption-based equivalent
to `gst_init_get_ptop_table ()`.
+
+[bus]: application-development/building/bus.md
+[threads]: application-development/advanced/threads.md
+[queries-and-events]: application-development/advanced/queryevents.md
A bin is a container element. You can add elements to a bin. Since a bin
is an element itself, a bin can be handled in the same way as any other
-element. Therefore, the whole previous chapter
-([Elements](manual/building/elements.md)) applies to bins as well.
+element. Therefore, the whole previous chapter ([Elements][elements]) applies
+to bins as well.
+
+[elements]: application-development/building/elements.md
## What are bins
Information messages are for non-problem notifications. All those
messages contain a `GError` with the main error type and message,
and optionally a debug string. Both can be extracted using
- `gst_message_parse_error
- ()`, `_parse_warning ()` and `_parse_info ()`. Both error and debug
- strings should be freed after use.
+    `gst_message_parse_error ()`, `_parse_warning ()` and `_parse_info ()`.
+ Both error and debug strings should be freed after use.
- End-of-stream notification: this is emitted when the stream has
ended. The state of the pipeline will not change, but further media
emitted multiple times for a pipeline (e.g. once for descriptive
metadata such as artist name or song title, and another one for
stream-information, such as samplerate and bitrate). Applications
- should cache metadata internally. `gst_message_parse_tag
- ()` should be used to parse the taglist, which should be
- `gst_tag_list_unref ()`'ed when no longer needed.
+    should cache metadata internally. `gst_message_parse_tag ()` should be
+ used to parse the taglist, which should be `gst_tag_list_unref ()`'ed
+ when no longer needed.
- State-changes: emitted after a successful state change.
`gst_message_parse_state_changed ()` can be used to parse the old
- Buffering: emitted during caching of network-streams. One can
manually extract the progress (in percent) from the message by
extracting the “buffer-percent” property from the structure returned
- by `gst_message_get_structure
- ()`. See also [Buffering](manual/advanced/buffering.md).
+    by `gst_message_get_structure ()`. See also [Buffering][buffering].
- Element messages: these are special messages that are unique to
certain elements and usually represent additional features. The
from some thread into the main thread. This is particularly useful
when the application is making use of element signals (as those
signals will be emitted in the context of the streaming thread).
+
+[buffering]: application-development/advanced/buffering.md
something with it and something else comes out at the other side. For a
decoder element, for example, you'd put in encoded data, and the element
would output decoded data. In the next chapter (see [Pads and
-capabilities](manual/building/pads.md)), you will learn more about data input
+capabilities][pads]), you will learn more about data input
and output in elements, and how you can set that up in your application.
+[pads]: application-development/building/pads.md
+
### Source elements
Source elements generate data for use by a pipeline, for example reading
a useful tool to query the properties of a particular element, it will
also use property introspection to give a short explanation about the
function of the property and about the parameter types and ranges it
-supports. See [gst-inspect](manual/appendix/checklist-element.md#gst-inspect) in
+supports. See [gst-inspect][checklist-gst-inspect] in
the appendix for details about `gst-inspect`.
For more information about `GObject` properties we recommend you read
signals a specific element supports. Together, signals and properties
are the most basic way in which elements and applications interact.
+[checklist-gst-inspect]: application-development/appendix/checklist-element.md#gst-inspect
+
## More about element factories
In the previous section, we briefly introduced the
for encoders, or it can be used for autoplugging purposes for media
players. All current GStreamer-based media players and autopluggers work
this way. We'll look closer at these features as we learn about `GstPad`
-and `GstCaps` in the next chapter: [Pads and
-capabilities](manual/building/pads.md)
+and `GstCaps` in the next chapter: [Pads and capabilities][pads].
## Linking elements
existing links. Also, you cannot directly link elements that are not in
the same bin or pipeline; if you want to link elements or pads at
different hierarchy levels, you will need to use ghost pads (more about
-ghost pads later, see [Ghost pads](manual/building/pads.md#ghost-pads)).
+ghost pads later, see [Ghost pads][ghostpads]).
+
+[ghostpads]: application-development/building/pads.md#ghost-pads
## Element States
will also take care of switching messages from the pipeline's thread
into the application's own thread, by using a
[`GstBus`](http://gstreamer.freedesktop.org/data/doc/gstreamer/stable/gstreamer/html/GstBus.html).
-See [Bus](manual/building/bus.md) for details.
+See [Bus][bus] for details.
When you set a bin or pipeline to a certain target state, it will
usually propagate the state change to all elements within the bin or
-set it to the desired target state yourself using `gst_element_set_state
-()` or `gst_element_sync_state_with_parent ()`.
+set it to the desired target state yourself using `gst_element_set_state ()`
+or `gst_element_sync_state_with_parent ()`.
+[bus]: application-development/building/bus.md
+
1. The code for this example is automatically extracted from the
documentation and built under `tests/examples/manual` in the
GStreamer tarball.
components. The player will read a file specified on the command-line.
Let's get started\!
-We've learned, in [Initializing GStreamer](manual/building/init.md), that the
+We've learned, in [Initializing GStreamer][gst-init], that the
first thing to do in your application is to initialize GStreamer by
calling `gst_init ()`. Also, make sure that the application includes
`gst/gst.h` so all function names and objects are properly defined. Use
it's conveniently called “vorbisdec”. Since “oggdemux” creates dynamic
pads for each elementary stream, you'll need to set a “pad-added” event
handler on the “oggdemux” element, like you've learned in [Dynamic (or
-sometimes) pads](manual/building/pads.md#dynamic-or-sometimes-pads), to link the
+sometimes) pads][dynamic-pads], to link the
Ogg demuxer and the Vorbis decoder elements together. At last, we'll
also need an audio output element, we will use “autoaudiosink”, which
automatically detects your audio device.
The last thing left to do is to add all elements into a container
element, a `GstPipeline`, and wait until we've played the whole song.
We've previously learned how to add elements to a container bin in
-[Bins](manual/building/bins.md), and we've learned about element states in
-[Element States](manual/building/elements.md#element-states). We will also
-attach a message handler to the pipeline bus so we can retrieve errors
-and detect the end-of-stream.
+[Bins][bins], and we've learned about element states in
+[Element States][element-states]. We will also attach a message handler to
+the pipeline bus so we can retrieve errors and detect the end-of-stream.
Let's now add all the code together to get our very first audio player:

+[gst-init]: application-development/building/init.md
+[advanced]: application-development/advanced/index.md
+[dynamic-pads]: application-development/building/pads.md#dynamic-or-sometimes-pads
+[bins]: application-development/building/bins.md
+[element-states]: application-development/building/elements.md#element-states
+
## Compiling and Running helloworld.c
To compile the helloworld example, use: `gcc -Wall
very low-level but powerful. You will see later in this manual how you
can create a more powerful media player with even less effort using
higher-level interfaces. We will discuss all that in [Higher-level
-interfaces for GStreamer applications](manual/advanced/index.md). We will
+interfaces for GStreamer applications][advanced]. We will
first, however, go more in-depth into more advanced GStreamer internals.
It should be clear from the example that we can very easily replace the
# Pads and capabilities
-As we have seen in [Elements](manual/building/elements.md), the pads are the
+As we have seen in [Elements][elements], the pads are the
element's interface to the outside world. Data streams from one
element's source pad to another element's sink pad. The specific type of
media that the element can handle will be exposed by the pad's
capabilities. We will talk more on capabilities later in this chapter
(see [Capabilities of a pad](#capabilities-of-a-pad)).
+[elements]: application-development/building/elements.md
+
## Pads
A pad type is defined by two properties: its direction and its
“filtered caps” to set a specific (fixed or non-fixed) video size
that should stream between two pads. You will see an example of
filtered caps later in this manual, in [Manually adding or removing
- data from/to a
- pipeline](manual/advanced/dataaccess.md#manually-adding-or-removing-data-fromto-a-pipeline).
+ data from/to a pipeline][inserting-or-extracting-data].
You can do caps filtering by inserting a capsfilter element into
your pipeline and setting its “caps” property. Caps filters are
often placed after converter elements like audioconvert,
to convert data to a specific output format at a certain point in a
stream.
+[inserting-or-extracting-data]: application-development/advanced/dataaccess.md#manually-adding-or-removing-data-fromto-a-pipeline
+
### Using capabilities for metadata
A pad can have a set (i.e. one or more) of capabilities attached to it.
components is to integrate as closely as possible with a GStreamer
pipeline, but to hide the complexity of media type detection and several
other rather complex topics that have been discussed in [Advanced
-GStreamer concepts](manual/advanced/index.md).
+GStreamer concepts][advanced].
We currently recommend people to use either playbin (see
[Playbin](#playbin)) or decodebin (see [Decodebin](#decodebin)),
and so on. Its programming interface is more low-level than that of
playbin, though.
+[advanced]: application-development/advanced/index.md
+
## Playbin
Playbin is an element that can be created using the standard GStreamer
Uridecodebin will also automatically insert buffering elements when the
uri is a slow network source. The buffering element will post BUFFERING
messages that the application needs to handle as explained in
-[Buffering](manual/advanced/buffering.md). The following properties can be used
+[Buffering][buffering]. The following properties can be used
to configure the buffering method:
- The buffer-size property allows you to configure a maximum size in
in time for the buffer element. The time will be estimated based on
the bitrate of the network.
- - With the download property you can enable the download buffering
- method as described in [Download
- buffering](manual/advanced/buffering.md#download-buffering). Setting this
+ - With the download property you can enable the download buffering method
+ as described in [Download buffering][download-buffering]. Setting this
option to TRUE will only enable download buffering for selected
formats such as quicktime, flash video, avi and webm.
command `gst-launch-1.0 uridecodebin uri=file:///file.ogg !
! audioconvert ! audioresample ! autoaudiosink`.
+[buffering]: application-development/advanced/buffering.md
+[download-buffering]: application-development/advanced/buffering.md#download-buffering
+
## Playsink
The playsink element is a powerful sink element. It has request pads for
-GStreamer appliction development. The parts of this guide are laid out
+GStreamer application development. The parts of this guide are laid out
in the following order:
-[About GStreamer](manual/introduction/index.md) gives you an overview of
-GStreamer, it's design principles and foundations.
-
-[Building an Application](manual/building/index.md) covers the basics of
-GStreamer application programming. At the end of this part, you should
-be able to build your own audio player using GStreamer
-
-In [Advanced GStreamer concepts](manual/advanced/index.md), we will move on to
-advanced subjects which make GStreamer stand out of its competitors. We
-will discuss application-pipeline interaction using dynamic parameters
-and interfaces, we will discuss threading and threaded pipelines,
-scheduling and clocks (and synchronization). Most of those topics are
-not just there to introduce you to their API, but primarily to give a
-deeper insight in solving application programming problems with
-GStreamer and understanding their concepts.
-
-Next, in [Higher-level interfaces for GStreamer
-applications](manual/advanced/index.md), we will go into higher-level
-programming APIs for GStreamer. You don't exactly need to know all the
-details from the previous parts to understand this, but you will need to
-understand basic GStreamer concepts nevertheless. We will, amongst
-others, discuss XML, playbin and autopluggers.
-
-Finally in [Appendices](manual/appendix/index.md), you will find some random
+[About GStreamer][about] gives you an overview of GStreamer, its design
+principles and foundations.
+
+[Building an Application][app-building] covers the basics of GStreamer
+application programming. At the end of this part, you should be able to
+build your own audio player using GStreamer.
+
+In [Advanced GStreamer concepts][advanced], we will move on to advanced
+subjects which make GStreamer stand out from its competitors. We will discuss
+application-pipeline interaction using dynamic parameters and interfaces, we
+will discuss threading and threaded pipelines, scheduling and clocks (and
+synchronization). Most of those topics are not just there to introduce you to
+their API, but primarily to give a deeper insight into solving application
+programming problems with GStreamer and understanding their concepts.
+
+Next, in [Higher-level interfaces for GStreamer applications][highlevel], we
+will go into higher-level programming APIs for GStreamer. You don't exactly
+need to know all the details from the previous parts to understand this, but
+you will need to understand basic GStreamer concepts nevertheless. We will,
+amongst others, discuss playbin and autopluggers.
+
+Finally, in [Appendices][appendix], you will find some random
information on integrating with GNOME, KDE, OS X or Windows, some
debugging help and general tips to improve and simplify GStreamer
programming.
+
+[about]: application-development/introduction/index.md
+[app-building]: application-development/building/index.md
+[advanced]: application-development/advanced/index.md
+[highlevel]: application-development/highlevel/index.md
+[appendix]: application-development/appendix/index.md