*
* The chromaprint element calculates an acoustic fingerprint for an
* audio stream which can be used to identify a song and look up
- * further metadata from the <ulink url="http://acoustid.org/">Acoustid</ulink>
- * and Musicbrainz databases.
+ * further metadata from the [Acoustid](http://acoustid.org/) and Musicbrainz
+ * databases.
*
* ## Example launch line
* |[
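 * # a sketch, not from the original docs; the file URI is an assumption
 * gst-launch-1.0 -m uridecodebin uri=file:///path/to/song.ogg ! audioconvert ! chromaprint ! fakesink
 * ]|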
* @title: dfbvideosink
*
* DfbVideoSink renders video frames using the
- * <ulink url="http://www.directfb.org/">DirectFB</ulink> library.
+ * [DirectFB](http://www.directfb.org/) library.
 * Rendering can happen in two different modes:
*
 * * Standalone: this mode will take complete control of the monitor, forcing
- * <ulink url="http://www.directfb.org/">DirectFB</ulink> to fullscreen layout.
+ * DirectFB to fullscreen layout.
+ *
* This is convenient to test using the gst-launch-1.0 command line tool or
* other simple applications. It is possible to interrupt playback while
 * in this mode by pressing the Escape key.
* This mode handles navigation events for every input device supported by
- * the <ulink url="http://www.directfb.org/">DirectFB</ulink> library, it will
- * look for available video modes in the fb.modes file and try to switch
- * the framebuffer video mode to the most suitable one. Depending on
- * hardware acceleration capabilities the element will handle scaling or not.
+ * the DirectFB library. It will look for available video modes in the
+ * fb.modes file and try to switch the framebuffer video mode to the most
+ * suitable one. Depending on hardware acceleration capabilities, the element
+ * will handle scaling or not.
+ *
 * If no acceleration is available, it will clip or center the video frames
 * while respecting the original aspect ratio.
*
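 * As a hedged sketch, the standalone mode can be tried like this (assuming
 * DirectFB has access to a framebuffer):
 * |[
 * gst-launch-1.0 -v videotestsrc ! dfbvideosink
 * ]|
 *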
* #GstDfbVideoSink:surface provided by the
 * application developer. This is a more advanced usage of the element and
 * it is required in order to integrate video playback into existing
- * <ulink url="http://www.directfb.org/">DirectFB</ulink> applications.
+ * DirectFB applications.
+ *
+ * When using this mode, the element just renders to the
 * #GstDfbVideoSink:surface provided by the
 * application, which means it won't handle navigation events and won't resize
* @see_also: timidity, wildmidi
*
 * This element renders MIDI events as audio streams using
- * <ulink url="http://fluidsynth.sourceforge.net//">Fluidsynth</ulink>.
+ * [Fluidsynth](http://fluidsynth.sourceforge.net/).
 * It offers better sound quality compared to the timidity or wildmidi elements.
*
* ## Example pipeline
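 *
 * A hedged sketch (the MIDI file name and the use of midiparse are
 * assumptions):
 * |[
 * gst-launch-1.0 filesrc location=song.mid ! midiparse ! fluiddec ! audioconvert ! autoaudiosink
 * ]|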
* @title: katedec
* @see_also: oggdemux
*
- * This element decodes Kate streams
- * <ulink url="http://libkate.googlecode.com/">Kate</ulink> is a free codec
+ * This element decodes Kate streams.
+ *
+ * [Kate](http://libkate.googlecode.com/) is a free codec
 * for text-based data, such as subtitles. Any number of kate streams can be
* embedded in an Ogg stream.
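 *
 * For instance, a hedged sketch of a decoding pipeline (file name assumed):
 * |[
 * gst-launch-1.0 filesrc location=subtitles.ogg ! oggdemux ! katedec ! fakesink
 * ]|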
*
* @title: kateenc
* @see_also: oggmux
*
- * This element encodes Kate streams
- * <ulink url="http://libkate.googlecode.com/">Kate</ulink> is a free codec
- * for text based data, such as subtitles. Any number of kate streams can be
- * embedded in an Ogg stream.
+ * This element encodes Kate streams.
+ *
+ * [Kate](http://libkate.googlecode.com/) is a free codec for text-based data,
+ * such as subtitles. Any number of kate streams can be embedded in an Ogg
+ * stream.
*
 * libkate (see the URL above) is needed to build this plugin.
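 *
 * A hedged sketch of an encoding pipeline (the file names and the category
 * value are assumptions):
 * |[
 * gst-launch-1.0 filesrc location=subs.srt ! subparse ! kateenc category=SUB ! oggmux ! filesink location=subs.ogg
 * ]|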
*
* @title: tiger
* @see_also: katedec
*
- * This element decodes and renders Kate streams
- * <ulink url="http://libkate.googlecode.com/">Kate</ulink> is a free codec
- * for text based data, such as subtitles. Any number of kate streams can be
- * embedded in an Ogg stream.
+ * This element decodes and renders Kate streams.
+ *
+ * [Kate](http://libkate.googlecode.com/) is a free codec for text-based data,
+ * such as subtitles. Any number of kate streams can be embedded in an Ogg
+ * stream.
*
- * libkate (see above url) and <ulink url="http://libtiger.googlecode.com/">libtiger</ulink>
+ * libkate (see the URL above) and [libtiger](http://libtiger.googlecode.com/)
* are needed to build this element.
*
* ## Example pipeline
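 *
 * A hedged sketch that overlays a Kate stream on a Theora video (the file
 * name and the exact pad wiring are assumptions):
 * |[
 * gst-launch-1.0 filesrc location=video.ogv ! oggdemux name=demux ! queue ! theoradec ! tiger name=tiger demux. ! queue ! kateparse ! tiger. tiger. ! videoconvert ! autovideosink
 * ]|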
* @see_also: #GstAudioConvert #GstAudioResample, #GstAudioTestSrc, #GstAutoAudioSink
*
* The LADSPA (Linux Audio Developer's Simple Plugin API) element is a bridge
- * for plugins using the <ulink url="http://www.ladspa.org/">LADSPA</ulink> API.
+ * for plugins using the [LADSPA](http://www.ladspa.org/) API.
+ *
 * It scans all installed LADSPA plugins and registers them as GStreamer
 * elements. If available, it can also parse LRDF files and use the metadata
 * for element classification. The functionality you get depends on the LADSPA plugins
* a successor to LADSPA (Linux Audio Developer's Simple Plugin API).
*
* The LV2 element is a bridge for plugins using the
- * <ulink url="http://www.lv2plug.in/">LV2</ulink> API. It scans all
- * installed LV2 plugins and registers them as gstreamer elements.
+ * [LV2](http://www.lv2plug.in/) API. It scans all installed LV2 plugins and
+ * registers them as GStreamer elements.
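 *
 * A hypothetical example (the element name below is made up; actual names
 * depend on the LV2 plugins installed):
 * |[
 * # hypothetical element name
 * gst-launch-1.0 audiotestsrc ! lv2-example-amplifier ! autoaudiosink
 * ]|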
*/
#ifdef HAVE_CONFIG_H
/**
* SECTION:element-modplug
*
- * Modplug uses the <ulink url="http://modplug-xmms.sourceforge.net/">modplug</ulink>
- * library to decode tracked music in the MOD/S3M/XM/IT and related formats.
+ * Modplug uses the [modplug](http://modplug-xmms.sourceforge.net/) library to
+ * decode tracked music in the MOD/S3M/XM/IT and related formats.
*
* ## Example pipeline
*
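 * A hedged sketch (file name assumed):
 * |[
 * gst-launch-1.0 filesrc location=song.it ! modplug ! audioconvert ! autoaudiosink
 * ]|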
* @see_also: mpeg2dec
*
* This element encodes raw video into an MPEG-1/2 elementary stream using the
- * <ulink url="http://mjpeg.sourceforge.net/">mjpegtools</ulink> library.
+ * [mjpegtools](http://mjpeg.sourceforge.net/) library.
+ *
* Documentation on MPEG encoding in general can be found in the
- * <ulink url="https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776">MJPEG Howto</ulink>
+ * [MJPEG Howto](https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776)
* and on the various available parameters in the documentation
* of the mpeg2enc tool in particular, which shares options with this element.
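 *
 * A hedged sketch of a minimal encode (output file name assumed):
 * |[
 * gst-launch-1.0 videotestsrc num-buffers=100 ! mpeg2enc ! filesink location=out.m1v
 * ]|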
*
*
* This element is an audio/video multiplexer for MPEG-1/2 video streams
* and (un)compressed audio streams such as AC3, MPEG layer I/II/III.
- * It is based on the <ulink url="http://mjpeg.sourceforge.net/">mjpegtools</ulink> library.
+ * It is based on the [mjpegtools](http://mjpeg.sourceforge.net/) library.
* Documentation on creating MPEG videos in general can be found in the
- * <ulink url="https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776">MJPEG Howto</ulink>
+ * [MJPEG Howto](https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776)
 * while the man page of the mplex tool documents the properties of this
 * element, which are shared with the mplex tool.
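 *
 * A hedged sketch pairing it with mpeg2enc (output file name assumed):
 * |[
 * gst-launch-1.0 videotestsrc num-buffers=100 ! mpeg2enc ! mplex ! filesink location=out.mpg
 * ]|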
*
* @see_also: #GstOpenMptDec
*
 * openmptdec decodes module music formats, such as S3M, MOD, XM, IT.
- * It uses the <ulink url="https://lib.openmpt.org">OpenMPT library</ulink>
- * for this purpose. It can be autoplugged and therefore works with decodebin.
+ * It uses the [OpenMPT library](https://lib.openmpt.org) for this purpose.
+ * It can be autoplugged and therefore works with decodebin.
*
* ## Example launch line
*
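 * A hedged sketch (file name assumed):
 * |[
 * gst-launch-1.0 filesrc location=song.xm ! openmptdec ! audioconvert ! autoaudiosink
 * ]|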
* SECTION:element-srtsink
* @title: srtsink
*
- * srtsink is a network sink that sends <ulink url="http://www.srtalliance.org/">SRT</ulink>
+ * srtsink is a network sink that sends [SRT](http://www.srtalliance.org/)
* packets to the network.
*
 * ## Examples
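 *
 * A hedged sketch (the SRT URI is an assumption):
 * |[
 * gst-launch-1.0 -v audiotestsrc ! srtsink uri=srt://127.0.0.1:7001
 * ]|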
* SECTION:element-srtsrc
* @title: srtsrc
*
- * srtsrc is a network source that reads <ulink url="http://www.srtalliance.org/">SRT</ulink>
+ * srtsrc is a network source that reads [SRT](http://www.srtalliance.org/)
* packets from the network.
*
* ## Examples
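 *
 * A hedged sketch (the SRT URI is an assumption):
 * |[
 * gst-launch-1.0 -v srtsrc uri=srt://127.0.0.1:7001 ! fakesink
 * ]|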
* SECTION:element-voaacenc
* @title: voaacenc
*
- * AAC audio encoder based on vo-aacenc library
- * <ulink url="http://sourceforge.net/projects/opencore-amr/files/vo-aacenc/">vo-aacenc library source file</ulink>.
+ * AAC audio encoder based on the vo-aacenc library.
+ *
+ * The library sources are available from the
+ * [vo-aacenc download page](http://sourceforge.net/projects/opencore-amr/files/vo-aacenc/).
*
* ## Example launch line
* |[
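 * # a sketch; the file names are assumptions
 * gst-launch-1.0 filesrc location=in.wav ! wavparse ! audioconvert ! voaacenc ! filesink location=out.aac
 * ]|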
* @see_also: #GstAmrWbDec, #GstAmrWbParse
*
* AMR wideband encoder based on the
- * <ulink url="http://www.penguin.cz/~utx/amr">reference codec implementation</ulink>.
+ * [reference codec implementation](http://www.penguin.cz/~utx/amr).
*
* ## Example launch line
* |[
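 * # a sketch; the element name (voamrwbenc) and 16 kHz mono caps are assumptions
 * gst-launch-1.0 audiotestsrc ! audioconvert ! audioresample ! audio/x-raw,rate=16000,channels=1 ! voamrwbenc ! filesink location=out.amr
 * ]|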
*
 * The waylandsink creates its own window and renders the decoded video frames
 * into it. Set up the Wayland environment as described on the
- * <ulink url="http://wayland.freedesktop.org/building.html">Wayland</ulink> home page.
- * The current implementaion is based on weston compositor.
+ * [Wayland](http://wayland.freedesktop.org/building.html) home page.
+ *
+ * The current implementation is based on the Weston compositor.
*
* ## Example pipelines
* |[
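 * # a sketch: render a test pattern to a Wayland surface
 * gst-launch-1.0 -v videotestsrc ! waylandsink
 * ]|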
* @title: GstWebRTCDataChannel
* @see_also: #GstWebRTCRTPTransceiver
*
- * <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcsctptransport">http://w3c.github.io/webrtc-pc/#dom-rtcsctptransport</ulink>
+ * <http://w3c.github.io/webrtc-pc/#dom-rtcsctptransport>
*/
#ifdef HAVE_CONFIG_H
* @see_also: #GstWildmidiDec
*
* wildmididec decodes MIDI files.
- * It uses <ulink url="https://www.mindwerks.net/projects/wildmidi/">WildMidi</ulink>
- * for this purpose. It can be autoplugged and therefore works with decodebin.
+ *
+ * It uses [WildMidi](https://www.mindwerks.net/projects/wildmidi/) for this
+ * purpose. It can be autoplugged and therefore works with decodebin.
*
* ## Example launch line
*
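 * A hedged sketch (file name assumed):
 * |[
 * gst-launch-1.0 filesrc location=song.mid ! wildmididec ! audioconvert ! autoaudiosink
 * ]|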
* @title: GstWebRTCDTLSTransport
* @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPReceiver, #GstWebRTCICETransport
*
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcdtlstransport">https://www.w3.org/TR/webrtc/#rtcdtlstransport</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcdtlstransport>
*/
#ifdef HAVE_CONFIG_H
* @title: GstWebRTCICETransport
* @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPReceiver, #GstWebRTCDTLSTransport
*
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcicetransport">https://www.w3.org/TR/webrtc/#rtcicetransport</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcicetransport>
*/
#ifdef HAVE_CONFIG_H
* @short_description: RTCSessionDescription object
* @title: GstWebRTCSessionDescription
*
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcsessiondescription-class">https://www.w3.org/TR/webrtc/#rtcsessiondescription-class</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcsessiondescription-class>
*/
#ifdef HAVE_CONFIG_H
* @type: the #GstWebRTCSDPType of the description
* @sdp: the #GstSDPMessage of the description
*
- * See <ulink url="https://www.w3.org/TR/webrtc/#rtcsessiondescription-class">https://www.w3.org/TR/webrtc/#rtcsessiondescription-class</ulink>
+ * See <https://www.w3.org/TR/webrtc/#rtcsessiondescription-class>
*/
struct _GstWebRTCSessionDescription
{
* @title: GstWebRTCRTPReceiver
* @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPTransceiver
*
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcrtpreceiver-interface">https://www.w3.org/TR/webrtc/#rtcrtpreceiver-interface</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcrtpreceiver-interface>
*/
#ifdef HAVE_CONFIG_H
* @title: GstWebRTCRTPSender
* @see_also: #GstWebRTCRTPReceiver, #GstWebRTCRTPTransceiver
*
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcrtpsender-interface">https://www.w3.org/TR/webrtc/#rtcrtpsender-interface</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcrtpsender-interface>
*/
#ifdef HAVE_CONFIG_H
* @title: GstWebRTCRTPTransceiver
* @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPReceiver
*
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcrtptransceiver-interface">https://www.w3.org/TR/webrtc/#rtcrtptransceiver-interface</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcrtptransceiver-interface>
*/
#ifdef HAVE_CONFIG_H
* @GST_WEBRTC_ICE_GATHERING_STATE_GATHERING: gathering
* @GST_WEBRTC_ICE_GATHERING_STATE_COMPLETE: complete
*
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcicegatheringstate">http://w3c.github.io/webrtc-pc/#dom-rtcicegatheringstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcicegatheringstate>
*/
typedef enum /*< underscore_name=gst_webrtc_ice_gathering_state >*/
{
* @GST_WEBRTC_ICE_CONNECTION_STATE_DISCONNECTED: disconnected
* @GST_WEBRTC_ICE_CONNECTION_STATE_CLOSED: closed
*
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtciceconnectionstate">http://w3c.github.io/webrtc-pc/#dom-rtciceconnectionstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtciceconnectionstate>
*/
typedef enum /*< underscore_name=gst_webrtc_ice_connection_state >*/
{
* @GST_WEBRTC_SIGNALING_STATE_HAVE_LOCAL_PRANSWER: have-local-pranswer
* @GST_WEBRTC_SIGNALING_STATE_HAVE_REMOTE_PRANSWER: have-remote-pranswer
*
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcsignalingstate">http://w3c.github.io/webrtc-pc/#dom-rtcsignalingstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcsignalingstate>
*/
typedef enum /*< underscore_name=gst_webrtc_signaling_state >*/
{
* @GST_WEBRTC_PEER_CONNECTION_STATE_FAILED: failed
* @GST_WEBRTC_PEER_CONNECTION_STATE_CLOSED: closed
*
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcpeerconnectionstate">http://w3c.github.io/webrtc-pc/#dom-rtcpeerconnectionstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcpeerconnectionstate>
*/
typedef enum /*< underscore_name=gst_webrtc_peer_connection_state >*/
{
* @GST_WEBRTC_SDP_TYPE_ANSWER: answer
* @GST_WEBRTC_SDP_TYPE_ROLLBACK: rollback
*
- * See <ulink url="http://w3c.github.io/webrtc-pc/#rtcsdptype">http://w3c.github.io/webrtc-pc/#rtcsdptype</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#rtcsdptype>
*/
typedef enum /*< underscore_name=gst_webrtc_sdp_type >*/
{
 * @GST_WEBRTC_SCTP_TRANSPORT_STATE_CONNECTED: connected
 * @GST_WEBRTC_SCTP_TRANSPORT_STATE_CLOSED: closed
*
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcsctptransportstate">http://w3c.github.io/webrtc-pc/#dom-rtcsctptransportstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcsctptransportstate>
*
* Since: 1.16
*/
 * @GST_WEBRTC_PRIORITY_TYPE_MEDIUM: medium
 * @GST_WEBRTC_PRIORITY_TYPE_HIGH: high
*
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcprioritytype">http://w3c.github.io/webrtc-pc/#dom-rtcprioritytype</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcprioritytype>
*
* Since: 1.16
*/
 * @GST_WEBRTC_DATA_CHANNEL_STATE_CLOSING: closing
 * @GST_WEBRTC_DATA_CHANNEL_STATE_CLOSED: closed
*
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcdatachannelstate">http://w3c.github.io/webrtc-pc/#dom-rtcdatachannelstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcdatachannelstate>
*
* Since: 1.16
*/
*
* The accurip element calculates a CRC for an audio stream which can be used
* to match the audio stream to a database hosted on
- * <ulink url="http://accuraterip.com/">AccurateRip</ulink>. This database
- * is used to check for a CD rip accuracy.
+ * [AccurateRip](http://accuraterip.com/). This database is used to check the
+ * accuracy of a CD rip.
*
* ## Example launch line
* |[
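 * # a sketch; accurip expects CD-style audio, so the caps below are assumptions
 * gst-launch-1.0 audiotestsrc num-buffers=1000 ! audio/x-raw,format=S16LE,rate=44100,channels=2 ! accurip ! fakesink
 * ]|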
* @title: festival
*
* This element connects to a
- * <ulink url="http://www.festvox.org/festival/index.html">festival</ulink>
- * server process and uses it to synthesize speech. Festival need to run already
- * in server mode, started as `festival --server`
+ * [festival](http://www.festvox.org/festival/index.html) server process and
+ * uses it to synthesize speech. Festival needs to already be running in
+ * server mode, started as `festival --server`.
*
* ## Example pipeline
* |[
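 * # a sketch; assumes a festival server is already running on localhost
 * echo "Hello world" | gst-launch-1.0 fdsrc ! festival ! wavparse ! audioconvert ! autoaudiosink
 * ]|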
* #GstPcapParse:src-port and #GstPcapParse:dst-port to restrict which packets
* should be included.
*
- * The supported data format is the classical <ulink
- * url="https://wiki.wireshark.org/Development/LibpcapFileFormat">libpcap file
- * format</ulink>.
+ * The supported data format is the classical
+ * [libpcap file format](https://wiki.wireshark.org/Development/LibpcapFileFormat).
*
* ## Example pipelines
* |[
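 * # a sketch; the capture file name is an assumption
 * gst-launch-1.0 filesrc location=dump.pcap ! pcapparse ! fakesink dump=true
 * ]|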