- gst_element_get_request_pad() has been deprecated in favour of the
  newly-added gst_element_request_pad_simple(), which does the exact
  same thing but has a less confusing name that hopefully makes it
  clear that the function requests a new pad rather than just
  retrieving an already-existing request pad.

Development in GitLab was switched to a single git repository containing all the modules

The GStreamer multimedia framework is a set of libraries and plugins
split into a number of distinct modules which are released independently
and which have so far been developed in separate git repositories in
freedesktop.org GitLab.

In addition to these separate git repositories there was a gst-build
module that would use the Meson build system’s subproject feature to
download each individual module and then build everything in one go. It
would also provide an uninstalled development environment that made it
easy to work on GStreamer and use or test versions other than the
system-installed GStreamer version.

All of these modules have now (as of 28 September 2021) been merged into
a single git repository (“Mono repository” or “monorepo”) which should
simplify development workflows and continuous integration, especially
where changes need to be made to multiple modules at once.

This mono repository merge will primarily affect GStreamer developers
and contributors and anyone who has workflows based on the GStreamer git
repositories.

The Rust bindings and Rust plugins modules have not been merged into the
mono repository at this time because they follow a different release
cycle.

The mono repository lives in the existing GStreamer core git repository
in GitLab, in the new main branch, and all future development will
happen on this branch.

Modules will continue to be released as separate tarballs.

For more details, please see the GStreamer mono repository FAQ.

GstPlay: new high-level playback library replacing GstPlayer

- GstPlay is a new high-level playback library that replaces the older
  GstPlayer API. It is basically the same API as GstPlayer, but
  refactored to use bus messages for application notifications instead
  of GObject signals. There is still a signal adapter object for those
  who prefer signals. Since the existing GstPlayer API is already in
  use in various applications, it didn’t seem like a good idea to
  break it entirely. Instead a new API was added, and it is expected
  that this new GstPlay API will be moved to gst-plugins-base in
  future.

- The existing GstPlayer API is scheduled for deprecation and will be
  removed at some point in the future (e.g. in GStreamer 1.24), so
  application developers are urged to migrate to the new GstPlay API
  at their earliest convenience.

WebM alpha decoding

- Implement WebM alpha decoding (VP8/VP9 with alpha), which required
  support and additions in various places. This is supported both with
  software decoders and hardware-accelerated decoders.

- VP8/VP9 don’t support alpha components natively in the codec, so the
  way this is implemented in WebM is by encoding the alpha plane with
  transparency data as a separate VP8/VP9 stream. Inside the WebM
  container (a variant of Matroska) this is coded as a single video
  track with the “normal” VP8/VP9 video data making up the main video
  data and each frame of video having an encoded alpha frame attached
  to it as extra data ("BlockAdditional").

- matroskademux has been extended to extract this per-frame alpha side
  data and attach it in the form of a GstVideoCodecAlphaMeta to the
  regular video buffers. Note that this new meta is specific to this
  VP8/VP9 alpha support and can’t be used to just add alpha support to
  other codecs that don’t support it. Lastly, matroskademux also
  advertises the fact that the streams contain alpha in the caps.

- The new codecalpha plugin contains various bits of infrastructure to
  support autoplugging and debugging:

  - codecalphademux splits out the alpha stream from the metas on the
    regular VP8/VP9 buffers
  - alphacombine takes two decoded raw video streams (one alpha, one
    the regular video) and combines them into a single video stream
    with alpha
  - vp8alphadecodebin + vp9alphadecodebin are wrapper bins that use
    the regular vp8dec and vp9dec software decoders to decode regular
    and alpha streams and combine them again. To decodebin these look
    like regular decoders that just happen to support VP8/VP9 with
    alpha.
  - The V4L2 CODEC plugin has stateless VP8/VP9 decoders that can
    decode both alpha and non-alpha streams with a single decoder
    instance

- A new AV12 video format was added, which is basically NV12 with an
  alpha plane and is more convenient for many hardware-accelerated
  decoders.

- Watch Nicolas Dufresne’s LCA 2022 talk “Bringing WebM Alpha support
  to GStreamer” for all the details and a demo.

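The pieces above wire up automatically. A hypothetical gst-launch-1.0
example (the file name is a placeholder):

```shell
# decodebin autoplugs vp8alphadecodebin/vp9alphadecodebin for
# VP8/VP9-with-alpha streams demuxed by matroskademux:
gst-launch-1.0 filesrc location=transparent.webm ! decodebin ! \
    videoconvert ! autovideosink

# Or force the wrapper bin explicitly (matroskademux must be in the
# pipeline so the per-frame alpha side data gets attached):
gst-launch-1.0 filesrc location=transparent.webm ! matroskademux ! \
    vp9alphadecodebin ! videoconvert ! autovideosink
```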
RTP Header Extensions Base Class and Automatic Header Extension Handling in RTP Payloaders and Depayloaders

- RTP Header Extensions are specified in RFC 5285 and provide a way to
  add small pieces of data to RTP packets in between the RTP header
  and the RTP payload. This is often used for per-frame metadata,
  extended timestamps or other application-specific extra data. There
  are several commonly-used extensions specified in various RFCs, but
  senders are free to put any kind of data in there, as long as sender
  and receiver both know what that data is. Receivers that don’t know
  about the header extensions will just skip the extra data without
  ever looking at it. These header extensions can often be combined
  with any kind of payload format, so they may need to be supported by
  many RTP payloader and depayloader elements.

- Inserting and extracting RTP header extension data has so far been a
  bit inconvenient in GStreamer: there are functions to add and
  retrieve RTP header extension data from RTP packets, but nothing
  works automatically, even for common extensions. People would have
  to do the insertion/extraction either in custom elements
  before/after the RTP payloader/depayloader, or inside pad probes,
  which isn’t very nice.

- This release adds various pieces of new infrastructure for generic
  RTP header extension handling, as well as some implementations for
  common extensions:

  - GstRTPHeaderExtension is a new helper base class for reading and
    writing RTP header extensions. Nominally this subclasses
    GstElement, but only so these extensions are stored in the
    registry where they can be looked up by URI or name. They don’t
    have pads and don’t get added to the pipeline graph as an
    element.

  - "add-extension" and "clear-extension" action signals on RTP
    payloaders and depayloaders for manual extension management

  - The "request-extension" signal will be emitted if an extension
    is encountered that requires explicit mapping by the application

  - new "auto-header-extension" property on RTP payloaders and
    depayloaders for automatic handling of known header extensions.
    This is enabled by default. The extensions must be signalled via
    caps / SDP.

  - RTP header extension implementations:

    - rtphdrextclientaudiolevel: Client-to-Mixer Audio Level
      Indication (RFC 6464) (also see below)
    - rtphdrextcolorspace: Color Space extension, extends RTP
      packets with color space and high dynamic range (HDR)
      information
    - rtphdrexttwcc: Transport-Wide Congestion Control support

- gst_rtp_buffer_remove_extension_data() is a new helper function to
  remove an RTP header extension from an RTP buffer

- The existing gst_rtp_buffer_set_extension_data() now also supports
  shrinking the extension data in size

AppSink and AppSrc improvements

- appsink: new API to pull events out of appsink in addition to
  buffers and buffer lists.

  There was previously no way for users to receive incoming events
  from appsink properly serialised with the data flow, even if they
  are serialised events. The reason for that is that the only way to
  intercept events was via a pad probe on the appsink sink pad, but
  there is also internal queuing inside of appsink, so it’s difficult
  to ascertain the right order of everything in all cases.

  There is now a new "new-serialized-event" signal which will be
  emitted when there’s a new event pending (just like the existing
  "new-sample" signal). The "emit-signals" property must be set to
  TRUE in order to activate this (but it’s also fine to just pull from
  the application thread without using the signals).

  gst_app_sink_pull_object() and gst_app_sink_try_pull_object() can be
  used to pull out either an event or a new sample carrying a buffer
  or buffer list, whichever is next in the queue.

  EOS events will be filtered and will not be returned. EOS handling
  can be done the usual way, same as with _pull_sample().

- appsrc: allow configuration of internal queue limits in time and
  buffers, and add a leaky mode.

  There is internal queuing inside appsrc so that the application
  thread can push data into the element, which will then be picked up
  by the source element’s streaming thread and pushed into the
  pipeline from that streaming thread. This queue is unlimited by
  default, and until now it was only possible to set a maximum size
  limit in bytes. When that byte limit was reached, the pushing
  thread (application thread) would be blocked until more space
  became available.

  A limit in bytes is not particularly useful for many use cases, so
  it is now possible to also configure limits in time and buffers
  using the new "max-time" and "max-buffers" properties. Of course
  there are also matching new read-only "current-level-buffers" and
  "current-level-time" properties to query the current fill level of
  the internal queue in time and buffers.

  And as if that wasn’t enough, the internal queue can also be
  configured as leaky using the new "leaky-type" property. That way,
  when the queue is full, the application thread won’t be blocked
  when it tries to push in more data; instead either the new buffer
  or the oldest data in the queue will be dropped.

Better string serialization of nested GstCaps and GstStructures

- New string serialisation format for structs and caps that can handle
  nested structs and caps properly by using brackets to delimit nested
  items (e.g. some-struct, some-field=[nested-struct, nested=true]).
  Unlike the default format, the new variant also supports more than
  one level of nesting. For backwards-compatibility reasons the old
  format is still output by default when serialising caps and structs
  using the existing API. The new functions gst_caps_serialize() and
  gst_structure_serialize() can be used to output strings in the new
  format.

Convenience API for custom GstMetas

- New convenience API to register and create custom GstMetas:
  gst_meta_register_custom() and gst_buffer_add_custom_meta(). Such a
  custom meta is backed by a GstStructure and does not require users
  of the API to expose their GstMeta implementation as public API for
  other components to make use of it. In addition, it provides a
  simpler interface by ignoring the impl vs. api distinction that the
  regular API exposes. This new API is meant to be the meta
  counterpart to custom events and messages, and to be more convenient
  than the lower-level API when the absolute best performance isn’t a
  requirement. The reason it’s less performant than a “proper” meta is
  that a proper meta is just a C struct in the end, whereas this goes
  through the GstStructure API, which has a bit more overhead; for
  most scenarios that overhead is negligible, however. This new API is
  useful for experimentation or proprietary metas, but it also has
  some limitations: it can only be used if there’s a single producer
  of these metas, and it’s not allowed to register the same custom
  meta multiple times or from multiple places.

Additional Element Properties on Encoding Profiles

- GstEncodingProfile: The new "element-properties" and
  gst_encoding_profile_set_element_properties() API allows
  applications to set additional element properties on encoding
  profiles to configure muxers and encoders. So far the encoding
  profile template was the only place where this could be specified,
  but often what applications want to do is take a ready-made encoding
  profile shipped by GStreamer or the application and then tweak the
  settings on top of that, which is now possible with this API. Since
  applications can’t always know in advance which encoder element will
  be used in the end, it’s even possible to specify properties on a
  per-element basis.

  Encoding profiles are used in the encodebin, transcodebin and
  camerabin elements and APIs to configure output formats (containers
  and elementary streams).

Audio Level Indication Meta for RFC 6464

- New GstAudioLevelMeta containing Audio Level Indication as per RFC
  6464

- The level element has been updated to add GstAudioLevelMeta on
  buffers if the "audio-level-meta" property is set to TRUE. This can
  then in turn be picked up by RTP payloaders to signal the audio
  level to receivers through RTP header extensions (see above).

- New Client-to-Mixer Audio Level Indication (RFC 6464) RTP Header
  Extension which should be automatically created and used by RTP
  payloaders and depayloaders if their "auto-header-extension"
  property is enabled and the extension is part of the RTP caps.

Automatic packet loss, data corruption and keyframe request handling for video decoders

- The GstVideoDecoder base class has gained various new APIs to
  automatically handle packet loss and data corruption better by
  default, especially in RTP, RTSP and WebRTC streaming scenarios, and
  to give subclasses more control over how they want to handle
  missing data:

  - Video decoder subclasses can mark output frames as corrupted via
    the new GST_VIDEO_CODEC_FRAME_FLAG_CORRUPTED flag

  - A new "discard-corrupted-frames" property allows applications to
    configure decoders so that corrupted frames are directly
    discarded instead of being forwarded inside the pipeline. This
    is a replacement for the "output-corrupt" property of the FFmpeg
    decoders.

  - RTP depayloaders can now signal to decoders that data is missing
    when sending GAP events for lost packets. GAP events can be sent
    for various reasons in a GStreamer pipeline. Often they are just
    used to let downstream elements know that there isn’t a buffer
    available at the moment, so downstream elements can move on
    instead of waiting for one. They are also sent by RTP
    depayloaders when packets are missing, however, and so far a
    decoder was not able to differentiate the two cases. This has
    been remedied now: GAP events can be decorated with
    gst_event_set_gap_flags() and GST_GAP_FLAG_MISSING_DATA to let
    decoders know what happened, and decoders can then use that in
    some cases to handle missing data better.

  - The GstVideoDecoder::handle_missing_data vfunc was added to
    inform subclasses about packet loss or missing data and let them
    handle it in their own way if they like.

  - gst_video_decoder_set_needs_sync_point() lets subclasses signal
    that they need the stream to start with a sync point. If enabled,
    the base class will discard all non-sync-point frames at the
    beginning and after a flush, and will not pass them to the
    subclass. Furthermore, if the first frame is not a sync point,
    the base class will try to request a sync frame from upstream by
    sending a force-key-unit event (see next items).

  - New "automatic-request-sync-points" and
    "automatic-request-sync-point-flags" properties to automatically
    request sync points when needed, e.g. on packet loss or if the
    first frame is not a keyframe. Applications may want to enable
    this on decoders operating in e.g. RTP/WebRTC/RTSP receiver
    pipelines.

  - The new "min-force-key-unit-interval" property can be used to
    ensure there’s a minimum interval between keyframe requests to
    upstream (and/or the sender), so that the sender is not flooded
    with key unit requests.

  - gst_video_decoder_request_sync_point() allows subclasses to
    request a new sync point (e.g. if they choose to do their own
    missing-data handling). This will still honour the
    "min-force-key-unit-interval" property if set.

Improved support for custom minimal GStreamer builds

- Element registration and registration of other plugin features
  inside plugin init functions has been improved in order to
  facilitate minimal custom GStreamer builds.

- A number of new macros have been added to declare and create
  per-element and per-pluginfeature register functions in all plugins,
  and then call those from the per-plugin plugin_init functions:

  - GST_ELEMENT_REGISTER_DEFINE,
    GST_DEVICE_PROVIDER_REGISTER_DEFINE,
    GST_DYNAMIC_TYPE_REGISTER_DEFINE, GST_TYPE_FIND_REGISTER_DEFINE
    for the actual registration call with GStreamer
  - GST_ELEMENT_REGISTER, GST_DEVICE_PROVIDER_REGISTER,
    GST_DYNAMIC_TYPE_REGISTER, GST_PLUGIN_STATIC_REGISTER,
    GST_TYPE_FIND_REGISTER to call the registration function defined
    by the REGISTER_DEFINE macro
  - GST_ELEMENT_REGISTER_DECLARE,
    GST_DEVICE_PROVIDER_REGISTER_DECLARE,
    GST_DYNAMIC_TYPE_REGISTER_DECLARE,
    GST_TYPE_FIND_REGISTER_DECLARE to declare the registration
    function defined by the REGISTER_DEFINE macro
  - and various variants for advanced use cases

- This means that applications can call the per-element and
  per-pluginfeature registration functions for only the elements they
  need, instead of registering plugins as a whole with all kinds of
  elements that may not be required (e.g. encoder and decoder instead
  of just the decoder). With static linking, all unused functions and
  their dependencies will then be removed by the linker, which helps
  minimise binary size for custom builds.

- gst_init() will automatically call a gst_init_static_plugins()
  function if one exists.

- See the GStreamer static build documentation and Stéphane’s blog
  post Generate a minimal GStreamer build, tailored to your needs for
  more details.