1 GStreamer 1.20 Release Notes
3 GStreamer 1.20 has not been released yet. It is scheduled for release in
4 late January / early February 2022.
6 1.19.x is the unstable development version that is being developed in
7 the git main branch and which will eventually result in 1.20, and
8 1.19.90 is the first release candidate in that series (1.20rc1).
10 1.20 will be backwards-compatible to the stable 1.18, 1.16, 1.14, 1.12,
11 1.10, 1.8, 1.6, 1.4, 1.2 and 1.0 release series.
13 See https://gstreamer.freedesktop.org/releases/1.20/ for the latest
14 version of this document.
16 Last updated: Wednesday 26 January 2022, 01:00 UTC (log)
20 The GStreamer team is proud to announce a new major feature release in
21 the stable 1.x API series of your favourite cross-platform multimedia framework!
24 As always, this release is again packed with many new features, bug
25 fixes and other improvements.
29 - Development in GitLab was switched to a single git repository
30 containing all the modules
31 - GstPlay: new high-level playback library, replaces GstPlayer
32 - WebM Alpha decoding support
33 - Encoding profiles can now be tweaked with additional
34 application-specified element properties
35 - Compositor: multi-threaded video conversion and mixing
36 - RTP header extensions: unified support in RTP depayloader and
37 payloader base classes
38 - SMPTE 2022-1 2-D Forward Error Correction support
39 - Smart encoding (passthrough) support for VP8, VP9, H.265 in
40 encodebin and transcodebin
41 - Runtime compatibility support for libsoup2 and libsoup3 (libsoup3 support is still considered experimental)
43 - Video decoder subframe support
44 - Video decoder automatic packet-loss, data corruption, and keyframe
45 request handling for RTP / WebRTC / RTSP
46 - MP4 and Matroska muxers now support profile/level/resolution changes
47 for H264/H265 input streams (i.e. codec data changing on the fly)
48 - MP4 muxing mode that initially creates a fragmented mp4 which is
49 converted to a regular mp4 on EOS
50 - Audio support for the WebKit Port for Embedded (WPE) web page source
52 - CUDA based video color space convert and rescale elements and
53 upload/download elements
54 - NVIDIA memory:NVMM support for OpenGL glupload and gldownload
56 - Many WebRTC improvements
57 - The new VA-API plugin implementation fleshed out with more decoders
58 and new postproc elements
59 - AppSink API to retrieve events in addition to buffers and buffer lists
61 - AppSrc gained more configuration options for the internal queue
62 (leakiness, limits in buffers and time, getters to read current fill levels)
64 - Updated Rust bindings and many new Rust plugins
65 - Improved support for custom minimal GStreamer builds
66 - Support for building against FFmpeg 5.0
67 - Linux Stateless CODEC support gained MPEG2 and VP9
68 - Windows Direct3D11/DXVA decoder gained AV1 and MPEG2 support
69 - Lots of new plugins, features, performance improvements and bug fixes
72 Major new features and changes
74 Noteworthy new features and API
76 - gst_element_get_request_pad() has been deprecated in favour of the
77 newly-added gst_element_request_pad_simple() which does the exact
78 same thing but has a less confusing name that hopefully makes clear
79 that the function requests a new pad rather than just retrieving an
80 already-existing request pad.
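For illustration, a minimal sketch of the renamed call (compositor and its sink_%u template are just an example):

```c
#include <gst/gst.h>

static void
link_new_input (GstElement * compositor)
{
  /* 1.20: gst_element_request_pad_simple() replaces the confusingly named
   * gst_element_get_request_pad(); both request a brand-new pad from the
   * "sink_%u" template rather than look up an existing one. */
  GstPad *sinkpad = gst_element_request_pad_simple (compositor, "sink_%u");

  /* ... link the pad to an upstream element, and later release it ... */
  gst_element_release_request_pad (compositor, sinkpad);
  gst_object_unref (sinkpad);
}
```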
82 Development in GitLab was switched to a single git repository containing all the modules
84 The GStreamer multimedia framework is a set of libraries and plugins
85 split into a number of distinct modules which are released independently
86 and which have so far been developed in separate git repositories in
87 freedesktop.org GitLab.
89 In addition to these separate git repositories there was a gst-build
90 module that would use the Meson build system’s subproject feature to
91 download each individual module and then build everything in one go. It
92 would also provide an uninstalled development environment that made it
93 easy to work on GStreamer and use or test versions other than the
94 system-installed GStreamer version.
96 All of these modules have now (as of 28 September 2021) been merged into
97 a single git repository (“Mono repository” or “monorepo”) which should
98 simplify development workflows and continuous integration, especially
99 where changes need to be made to multiple modules at once.
101 This mono repository merge will primarily affect GStreamer developers
102 and contributors and anyone who has workflows based on the GStreamer git repositories.
105 The Rust bindings and Rust plugins modules have not been merged into the
106 mono repository at this time because they follow a different release cycle.
109 The mono repository lives in the existing GStreamer core git repository
110 in GitLab in the new main branch and all future development will happen there.
113 Modules will continue to be released as separate tarballs.
115 For more details, please see the GStreamer mono repository FAQ.
117 GstPlay: new high-level playback library replacing GstPlayer
119 - GstPlay is a new high-level playback library that replaces the older
120 GstPlayer API. It is basically the same API as GstPlayer but
121 refactored to use bus messages for application notifications instead
122 of GObject signals. There is still a signal adapter object for those
123 who prefer signals. Since the existing GstPlayer API is already in
124 use in various applications, it didn’t seem like a good idea to
125 break it entirely. Instead a new API was added, and it is expected
126 that this new GstPlay API will be moved to gst-plugins-base in a future release.
129 - The existing GstPlayer API is scheduled for deprecation and will be
130 removed at some point in the future (e.g. in GStreamer 1.24), so
131 application developers are urged to migrate to the new GstPlay API
132 at their earliest convenience.
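A minimal sketch of the message-based GstPlay workflow (assuming the gstplay-1.0 library from gst-plugins-bad; the media URI is a placeholder and error handling is omitted):

```c
#include <gst/gst.h>
#include <gst/play/play.h>

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  GstPlay *play = gst_play_new (NULL);  /* NULL: use the default video renderer */
  gst_play_set_uri (play, "file:///path/to/movie.mp4");
  gst_play_play (play);

  /* Application notifications arrive as messages on a dedicated bus instead
   * of GObject signals (a GstPlaySignalAdapter exists for those who prefer
   * signals). */
  GstBus *bus = gst_play_get_message_bus (play);
  gboolean done = FALSE;
  while (!done) {
    GstMessage *msg = gst_bus_timed_pop (bus, GST_CLOCK_TIME_NONE);
    if (msg == NULL)
      continue;
    GstPlayMessage type;
    gst_play_message_parse_type (msg, &type);
    if (type == GST_PLAY_MESSAGE_END_OF_STREAM || type == GST_PLAY_MESSAGE_ERROR)
      done = TRUE;
    gst_message_unref (msg);
  }

  gst_object_unref (bus);
  gst_object_unref (play);
  return 0;
}
```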
WebM Alpha decoding support

136 - Implement WebM alpha decoding (VP8/VP9 with alpha), which required
137 support and additions in various places. This is supported both with
138 software decoders and hardware-accelerated decoders.
140 - VP8/VP9 don’t support alpha components natively in the codec, so the
141 way this is implemented in WebM is by encoding the alpha plane with
142 transparency data as a separate VP8/VP9 stream. Inside the WebM
143 container (a variant of Matroska) this is coded as a single video
144 track with the “normal” VP8/VP9 video data making up the main video
145 data and each frame of video having an encoded alpha frame attached
146 to it as extra data ("BlockAdditional").
148 - matroskademux has been extended to extract this per-frame alpha side
149 data and attach it in the form of a GstVideoCodecAlphaMeta to the
150 regular video buffers. Note that this new meta is specific to this
151 VP8/VP9 alpha support and can’t be used to just add alpha support to
152 other codecs that don’t support it. Lastly, matroskademux also
153 advertises the fact that the streams contain alpha in the caps.
155 - The new codecalpha plugin contains various bits of infrastructure to
156 support autoplugging and debugging:
158 - codecalphademux splits out the alpha stream from the metas on
159 the regular VP8/VP9 buffers
160 - alphacombine takes two decoded raw video streams (one alpha, one
161 the regular video) and combines them into a single video stream with an alpha channel
163 - vp8alphadecodebin + vp9alphadecodebin are wrapper bins that use
164 the regular vp8dec and vp9dec software decoders to decode
165 regular and alpha streams and combine them again. To decodebin
166 these look like regular VP8/VP9 decoders.
167 - The V4L2 CODEC plugin has stateless VP8/VP9 decoders that can
168 decode both alpha and non-alpha streams with a single decoder.
171 - A new AV12 video format was added which is basically NV12 with an
172 alpha plane, which is more convenient for many hardware-accelerated decoders.
175 - Watch Nicolas Dufresne’s LCA 2022 talk “Bringing WebM Alpha support
176 to GStreamer” for all the details and a demo.
178 RTP Header Extensions Base Class and Automatic Header Extension Handling in RTP Payloaders and Depayloaders
180 - RTP Header Extensions are specified in RFC 5285 and provide a way to
181 add small pieces of data to RTP packets in between the RTP header
182 and the RTP payload. This is often used for per-frame metadata,
183 extended timestamps or other application-specific extra data. There
184 are several commonly-used extensions specified in various RFCs, but
185 senders are free to put any kind of data in there, as long as sender
186 and receiver both know what that data is. Receivers that don’t know
187 about the header extensions will just skip the extra data without
188 ever looking at it. These header extensions can often be combined
189 with any kind of payload format, so may need to be supported by many
190 RTP payloader and depayloader elements.
192 - Inserting and extracting RTP header extension data has so far been a
193 bit inconvenient in GStreamer: There are functions to add and
194 retrieve RTP header extension data from RTP packets, but nothing
195 works automatically, even for common extensions. People would have
196 to do the insertion/extraction either in custom elements
197 before/after the RTP payloader/depayloader, or inside pad probes,
198 which isn’t very nice.
200 - This release adds various pieces of new infrastructure for generic
201 RTP header extension handling, as well as some implementations for commonly-used extensions:
204 - GstRTPHeaderExtension is a new helper base class for reading and
205 writing RTP header extensions. Nominally this subclasses
206 GstElement, but only so these extensions are stored in the
207 registry where they can be looked up by URI or name. They don’t
208 have pads and don’t get added to the pipeline graph as an element.
211 - "add-extension" and "clear-extension" action signals on RTP
212 payloaders and depayloaders for manual extension management
214 - The "request-extension" signal will be emitted if an extension
215 is encountered that requires explicit mapping by the application
217 - new "auto-header-extension" property on RTP payloaders and
218 depayloaders for automatic handling of known header extensions.
219 This is enabled by default. The extensions must be signalled via the caps.
222 - RTP header extension implementations:
224 - rtphdrextclientaudiolevel: Client-to-Mixer Audio Level
225 Indication (RFC 6464) (also see below)
226 - rtphdrextcolorspace: Color Space extension, extends RTP
227 packets with color space and high dynamic range (HDR) information
229 - rtphdrexttwcc: Transport Wide Congestion Control support
231 - gst_rtp_buffer_remove_extension_data() is a new helper function to
232 remove an RTP header extension from an RTP buffer
234 - The existing gst_rtp_buffer_set_extension_data() now also supports
235 shrinking the extension data in size
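As a sketch, manually mapping the RFC 6464 audio-level extension onto an RTP payloader via the new action signal (the extension ID of 1 is an arbitrary example; with "auto-header-extension" enabled and the extension signalled in the caps this happens automatically):

```c
#include <gst/gst.h>
#include <gst/rtp/rtp.h>

static void
add_audio_level_extension (GstElement * rtp_payloader)
{
  /* Look up a GstRTPHeaderExtension implementation in the registry by URI */
  GstRTPHeaderExtension *ext = gst_rtp_header_extension_create_from_uri (
      "urn:ietf:params:rtp-hdrext:ssrc-audio-level");

  /* The extension ID must match what is signalled in the caps/SDP */
  gst_rtp_header_extension_set_id (ext, 1);

  /* New in 1.20: "add-extension" action signal on payloaders and depayloaders */
  g_signal_emit_by_name (rtp_payloader, "add-extension", ext);

  gst_object_unref (ext);
}
```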
237 AppSink and AppSrc improvements
239 - appsink: new API to pull events out of appsink in addition to
240 buffers and buffer lists.
242 There was previously no way for users to receive incoming events
243 from appsink properly serialised with the data flow, even if they
244 are serialised events. The reason for that is that the only way to
245 intercept events was via a pad probe on the appsink sink pad, but
246 there is also internal queuing inside of appsink, so it’s difficult
247 to ascertain the right order of everything in all cases.
249 There is now a new "new-serialized-event" signal which will be
250 emitted when there’s a new event pending (just like the existing
251 "new-sample" signal). The "emit-signals" property must be set to
252 TRUE in order to activate this (but it’s also fine to just pull from
253 the application thread without using the signals).
255 gst_app_sink_pull_object() and gst_app_sink_try_pull_object() can be
256 used to pull out either an event or a new sample carrying a buffer
257 or buffer list, whatever is next in the queue.
259 EOS events will be filtered and will not be returned. EOS handling
260 can be done the usual way, same as with _pull_sample().
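A rough sketch of the new object-pulling API, which hands out whatever is next in appsink’s internal queue, be it a sample or a serialised event:

```c
#include <gst/gst.h>
#include <gst/app/app.h>

static void
drain_appsink (GstAppSink * appsink)
{
  GstMiniObject *obj;

  /* Blocks until something is queued; returns NULL when appsink goes EOS
   * or is shut down */
  while ((obj = gst_app_sink_pull_object (appsink))) {
    if (GST_IS_SAMPLE (obj)) {
      GstSample *sample = GST_SAMPLE (obj);
      /* ... process the buffer or buffer list carried by the sample ... */
      gst_sample_unref (sample);
    } else if (GST_IS_EVENT (obj)) {
      GstEvent *event = GST_EVENT (obj);
      GST_INFO ("serialised event: %s", GST_EVENT_TYPE_NAME (event));
      gst_event_unref (event);
    }
  }
}
```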
262 - appsrc: allow configuration of internal queue limits in time and
263 buffers and add leaky mode.
265 There is internal queuing inside appsrc so the application thread
266 can push data into the element which will then be picked up by the
267 source element’s streaming thread and pushed into the pipeline from
268 that streaming thread. This queue is unlimited by default and until
269 now it was only possible to set a maximum size limit in bytes. When
270 that byte limit is reached, the pushing thread (application thread)
271 would be blocked until more space becomes available.
273 A limit in bytes is not particularly useful for many use cases, so
274 now it is possible to also configure limits in time and buffers
275 using the new "max-time" and "max-buffers" properties. Of course
276 there are also matching new read-only "current-level-buffers" and
277 "current-level-time" properties to query the current fill
278 level of the internal queue in time and buffers.
280 And as if that wasn’t enough the internal queue can also be
281 configured as leaky using the new "leaky-type" property. That way
282 when the queue is full the application thread won’t be blocked when
283 it tries to push in more data, but instead either the new buffer
284 will be dropped or the oldest data in the queue will be dropped.
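For example, limiting the internal queue and making it leaky could look like this sketch (the actual limit values are arbitrary):

```c
#include <gst/gst.h>
#include <gst/app/app.h>

static void
configure_appsrc_queue (GstElement * appsrc)
{
  g_object_set (appsrc,
      "max-buffers", (guint64) 100,                /* at most 100 queued buffers */
      "max-time", (guint64) (2 * GST_SECOND),      /* ... or 2 seconds of data */
      "leaky-type", GST_APP_LEAKY_TYPE_DOWNSTREAM, /* drop queued data instead of blocking */
      NULL);

  /* Matching read-only counterparts report the current fill level */
  guint64 level_time, level_buffers;
  g_object_get (appsrc,
      "current-level-time", &level_time,
      "current-level-buffers", &level_buffers, NULL);
}
```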
286 Better string serialization of nested GstCaps and GstStructures
288 - New string serialisation format for structs and caps that can handle
289 nested structs and caps properly by using brackets to delimit nested
290 items (e.g. some-struct, some-field=[nested-struct, nested=true]).
291 Unlike the default format the new variant can also support more than
292 one level of nesting. For backwards-compatibility reasons the old
293 format is still output by default when serialising caps and structs
294 using the existing API. The new functions gst_caps_serialize() and
295 gst_structure_serialize() can be used to output strings in the new format.
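A small sketch of the new serialisation calls, using a caps structure with a nested structure field:

```c
#include <gst/gst.h>

static void
print_nested_caps (void)
{
  GstStructure *nested = gst_structure_new ("nested-struct",
      "nested", G_TYPE_BOOLEAN, TRUE, NULL);
  GstCaps *caps = gst_caps_new_simple ("some/caps",
      "some-field", GST_TYPE_STRUCTURE, nested, NULL);

  /* The existing API keeps emitting the old format by default, the new
   * call emits the bracket-delimited format that supports nesting. */
  gchar *old_style = gst_caps_to_string (caps);
  gchar *new_style = gst_caps_serialize (caps, GST_SERIALIZE_FLAG_NONE);
  g_print ("old: %s\nnew: %s\n", old_style, new_style);

  g_free (old_style);
  g_free (new_style);
  gst_structure_free (nested);
  gst_caps_unref (caps);
}
```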
298 Convenience API for custom GstMetas
300 - New convenience API to register and create custom GstMetas:
301 gst_meta_register_custom() and gst_buffer_add_custom_meta(). Such
302 custom meta is backed by a GstStructure and does not require that
303 users of the API expose their GstMeta implementation as public API
304 for other components to make use of it. In addition, it provides a
305 simpler interface by ignoring the impl vs. api distinction that the
306 regular API exposes. This new API is meant to be the meta
307 counterpart to custom events and messages, and to be more convenient
308 than the lower-level API when the absolute best performance isn’t a
309 requirement. The reason it’s less performant than a “proper” meta is
310 that a proper meta is just a C struct in the end whereas this goes
311 through the GstStructure API which has a bit more overhead, which
312 for most scenarios is negligible however. This new API is useful for
313 experimentation or proprietary metas, but also has some limitations:
314 it can only be used if there’s a single producer of these metas;
315 it’s not allowed to register the same custom meta multiple times or
316 from multiple places.
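A rough sketch of registering and using such a structure-backed custom meta (the meta name and its field are made up for the example):

```c
#include <gst/gst.h>

static void
custom_meta_example (GstBuffer * buffer)
{
  /* Register once, e.g. at startup; a transform function can be supplied
   * to control what happens when buffers are copied or transformed. */
  static const gchar *tags[] = { NULL };
  gst_meta_register_custom ("my-custom-meta", tags, NULL, NULL, NULL);

  /* Producer: attach the meta and fill its backing GstStructure */
  GstCustomMeta *meta = gst_buffer_add_custom_meta (buffer, "my-custom-meta");
  gst_structure_set (gst_custom_meta_get_structure (meta),
      "frame-id", G_TYPE_UINT64, (guint64) 42, NULL);

  /* Consumer: look the meta up again by name */
  GstCustomMeta *found = gst_buffer_get_custom_meta (buffer, "my-custom-meta");
  if (found) {
    guint64 frame_id;
    gst_structure_get_uint64 (gst_custom_meta_get_structure (found),
        "frame-id", &frame_id);
  }
}
```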
318 Additional Element Properties on Encoding Profiles
320 - GstEncodingProfile: The new "element-properties" and
321 gst_encoding_profile_set_element_properties() API allows
322 applications to set additional element properties on encoding
323 profiles to configure muxers and encoders. So far the encoding
324 profile template was the only place where this could be specified,
325 but often what applications want to do is take a ready-made encoding
326 profile shipped by GStreamer or the application and then tweak the
327 settings on top of that, which is now possible with this API. Since
328 applications can’t always know in advance what encoder element will
329 be used in the end, it’s even possible to specify properties on a
332 Encoding Profiles are used in the encodebin, transcodebin and
333 camerabin elements and APIs to configure output formats (containers
334 and elementary streams).
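Tweaking a ready-made profile might then look like this sketch (the "bitrate" property and its value are just an example and assume an encoder that exposes such a property):

```c
#include <gst/pbutils/pbutils.h>

static void
tweak_profile (GstEncodingProfile * profile)
{
  /* Properties to set on the encoder picked for this profile; a mapped
   * variant also exists for specifying properties per element factory
   * when the encoder is not known in advance. */
  GstStructure *props = gst_structure_new ("element-properties",
      "bitrate", G_TYPE_UINT, 2048, NULL);

  gst_encoding_profile_set_element_properties (profile, props);
}
```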
336 Audio Level Indication Meta for RFC 6464
338 - New GstAudioLevelMeta containing Audio Level Indication as per RFC 6464.
341 - The level element has been updated to add GstAudioLevelMeta on
342 buffers if the "audio-level-meta" property is set to TRUE. This can
343 then in turn be picked up by RTP payloaders to signal the audio
344 level to receivers through RTP header extensions (see above).
346 - New Client-to-Mixer Audio Level Indication (RFC6464) RTP Header
347 Extension which should be automatically created and used by RTP
348 payloaders and depayloaders if their "auto-header-extension"
349 property is enabled and if the extension is part of the RTP caps.
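As a sketch, enabling the meta on the level element, or attaching it by hand to outgoing audio buffers (level values follow RFC 6464: 0–127 in -dBov, where 127 means silence; the helper call is assumed to match the gstaudio meta API):

```c
#include <gst/gst.h>
#include <gst/audio/audio.h>

static void
audio_level_example (GstElement * level, GstBuffer * outbuf)
{
  /* Have the level element attach GstAudioLevelMeta to passing buffers */
  g_object_set (level, "audio-level-meta", TRUE, NULL);

  /* ... or attach the meta manually; the second argument is the level in
   * -dBov, the third indicates voice activity */
  gst_buffer_add_audio_level_meta (outbuf, 42, TRUE);
}
```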
351 Automatic packet loss, data corruption and keyframe request handling for video decoders
353 - The GstVideoDecoder base class has gained various new APIs to
354 automatically handle packet loss and data corruption better by
355 default, especially in RTP, RTSP and WebRTC streaming scenarios, and
356 to give subclasses more control over how they want to handle such situations:
359 - Video decoder subclasses can mark output frames as corrupted via
360 the new GST_VIDEO_CODEC_FRAME_FLAG_CORRUPTED flag
362 - A new "discard-corrupted-frames" property allows applications to
363 configure decoders so that corrupted frames are directly
364 discarded instead of being forwarded inside the pipeline. This
365 is a replacement for the "output-corrupt" property of the FFmpeg
368 - RTP depayloaders can now signal to decoders that data is missing
369 when sending GAP events for lost packets. GAP events can be sent
370 for various reasons in a GStreamer pipeline. Often they are just
371 used to let downstream elements know that there isn’t a buffer
372 available at the moment, so downstream elements can move on
373 instead of waiting for one. They are also sent by RTP
374 depayloaders in the case that packets are missing, however, and
375 so far a decoder was not able to differentiate the two cases.
376 This has been remedied now: GAP events can be decorated with
377 gst_event_set_gap_flags() and GST_GAP_FLAG_MISSING_DATA to let
378 decoders know what happened, and decoders can then use that in
379 some cases to handle missing data better.
381 - The GstVideoDecoder::handle_missing_data vfunc was added to
382 inform subclasses about packet loss or missing data and let them
383 handle it in their own way if they like.
385 - gst_video_decoder_set_needs_sync_point() lets subclasses signal
386 that they need the stream to start with a sync point. If
387 enabled, the base class will discard all non-sync point frames
388 in the beginning and after a flush and does not pass them to the
389 subclass. Furthermore, if the first frame is not a sync point,
390 the base class will try and request a sync frame from upstream
391 by sending a force-key-unit event (see next items).
393 - New "automatic-request-sync-points" and
394 "automatic-request-sync-point-flags" properties to automatically
395 request sync points when needed, e.g. on packet loss or if the
396 first frame is not a keyframe. Applications may want to enable
397 this on decoders operating in e.g. RTP/WebRTC/RTSP receiver pipelines.
400 - The new "min-force-key-unit-interval" property can be used to
401 ensure there’s a minimal interval between keyframe requests to
402 upstream (and/or the sender) and we’re not flooding the sender
403 with key unit requests.
405 - gst_video_decoder_request_sync_point() allows subclasses to
406 request a new sync point (e.g. if they choose to do their own
407 missing data handling). This will still honour the
408 "min-force-key-unit-interval" property if set.
410 Improved support for custom minimal GStreamer builds
412 - Element registration and registration of other plugin features
413 inside plugin init functions has been improved in order to
414 facilitate minimal custom GStreamer builds.
416 - A number of new macros have been added to declare and create
417 per-element and per-pluginfeature register functions in all plugins,
418 and then call those from the per-plugin plugin_init functions:
420 - GST_ELEMENT_REGISTER_DEFINE,
421 GST_DEVICE_PROVIDER_REGISTER_DEFINE,
422 GST_DYNAMIC_TYPE_REGISTER_DEFINE, GST_TYPE_FIND_REGISTER_DEFINE
423 for the actual registration call with GStreamer
424 - GST_ELEMENT_REGISTER, GST_DEVICE_PROVIDER_REGISTER,
425 GST_DYNAMIC_TYPE_REGISTER, GST_PLUGIN_STATIC_REGISTER,
426 GST_TYPE_FIND_REGISTER to call the registration function defined
427 by the REGISTER_DEFINE macro
428 - GST_ELEMENT_REGISTER_DECLARE,
429 GST_DEVICE_PROVIDER_REGISTER_DECLARE,
430 GST_DYNAMIC_TYPE_REGISTER_DECLARE,
431 GST_TYPE_FIND_REGISTER_DECLARE to declare the registration
432 function defined by the REGISTER_DEFINE macro
433 - and various variants for advanced use cases.
435 - This means that applications can call the per-element and
436 per-pluginfeature registration functions for only the elements they
437 need instead of registering plugins as a whole with all kinds of
438 elements that may not be required (e.g. encoder and decoder instead
439 of just decoder). In the case of static linking, all unused functions and
440 their dependencies will then be removed by the linker,
441 which helps minimise binary size for custom builds.
443 - gst_init() will automatically call a gst_init_static_plugins()
444 function if one exists.
446 - See the GStreamer static build documentation and Stéphane’s blog
447 post Generate a minimal GStreamer build, tailored to your needs for more details.
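As a sketch of what the application side of a minimal static build might look like (this assumes the corresponding plugins were built statically and linked into the application; the element names are just examples):

```c
#include <gst/gst.h>

/* Declare the per-element register functions that the plugins define
 * via GST_ELEMENT_REGISTER_DEFINE */
GST_ELEMENT_REGISTER_DECLARE (videotestsrc);
GST_ELEMENT_REGISTER_DECLARE (fakesink);

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  /* Register only the elements this application actually needs; everything
   * else can then be discarded by the linker. */
  GST_ELEMENT_REGISTER (videotestsrc, NULL);
  GST_ELEMENT_REGISTER (fakesink, NULL);

  GstElement *pipeline =
      gst_parse_launch ("videotestsrc num-buffers=100 ! fakesink", NULL);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  /* ... */
  return 0;
}
```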
New elements

452 - New aesdec and aesenc elements for AES encryption and decryption in the new aes plugin.
455 - New encodebin2 element with dynamic/sometimes source pads in order
456 to support the option of doing the muxing outside of encodebin,
457 e.g. in combination with a splitmuxsink.
459 - New fakeaudiosink and videocodectestsink elements for testing and
460 debugging (see below for more details)
462 - rtpisacpay, rtpisacdepay: new RTP payloader and depayloader for iSAC audio.
465 - rtpst2022-1-fecdec, rtpst2022-1-fecenc: new elements providing SMPTE
466 2022-1 2-D Forward Error Correction. More details in Mathieu’s blog post.
469 - isac: new plugin wrapping the Internet Speech Audio Codec reference
470 encoder and decoder from the WebRTC project.
472 - asio: plugin for Steinberg ASIO (Audio Streaming Input/Output) API
474 - gssrc, gssink: add source and sink for Google Cloud Storage
476 - onnx: new plugin to apply ONNX neural network models to video
478 - openaptx: aptX and aptX-HD codecs using libopenaptx (v0.2.0)
480 - qroverlay, debugqroverlay: new elements that allow overlaying data
481 on top of video in the form of a QR code
483 - cvtracker: new OpenCV-based tracker element
485 - av1parse, vp9parse: new parsers for AV1 and VP9 video
487 - va: work on the new VA-API plugin implementation for
488 hardware-accelerated video decoding and encoding has continued at
489 pace, with various new decoders and filters having joined the previously existing vah264dec:
492 - vah265dec: VA-API H.265 decoder
493 - vavp8dec: VA-API VP8 decoder
494 - vavp9dec: VA-API VP9 decoder
495 - vaav1dec: VA-API AV1 decoder
496 - vampeg2dec: VA-API MPEG-2 decoder
497 - vadeinterlace: VA-API deinterlace filter
498 - vapostproc: VA-API postproc filter (color conversion,
499 resizing, cropping, color balance, video rotation, skin tone
500 enhancement, denoise, sharpen)
502 See Víctor’s blog post “GstVA in GStreamer 1.20” for more details
503 and what’s coming up next.
505 - vaapiav1dec: new AV1 decoder element (in gstreamer-vaapi)
507 - msdkav1dec: hardware-accelerated AV1 decoder using the Intel Media SDK.
510 - nvcodec plugin for NVIDIA NVCODEC API for hardware-accelerated video
511 encoding and decoding:
513 - cudaconvert, cudascale: new CUDA based video color space convert and rescale elements
515 - cudaupload, cudadownload: new helper elements for memory
516 transfer between CUDA and system memory spaces
517 - nvvp8sldec, nvvp9sldec: new GstCodecs-based VP8/VP9 decoders
519 - Various new hardware-accelerated elements for Windows:
521 - d3d11screencapturesrc: new desktop capture element, including a
522 GstDeviceProvider implementation to enumerate/select target
523 monitors for capture.
524 - d3d11av1dec and d3d11mpeg2dec: AV1 and MPEG-2 decoders
525 - d3d11deinterlace: deinterlacing filter
526 - d3d11compositor: video composing element
527 - see Windows section below for more details
- New Rust plugins:

531 - audiornnoise: Removes noise from an audio stream
532 - awstranscribeparse: Parses AWS audio transcripts into timed text
534 - ccdetect: Detects if valid closed captions are present in a
535 closed captions stream
536 - cea608tojson: Converts CEA-608 Closed Captions to a JSON representation
538 - cmafmux: CMAF fragmented MP4 muxer
539 - dashmp4mux: DASH fragmented MP4 muxer
540 - isofmp4mux: ISO fragmented MP4 muxer
541 - ebur128level: EBU R128 Loudness Level Measurement
542 - ffv1dec: FFV1 video decoder
543 - gtk4paintablesink: GTK4 video sink, which provides a
544 GdkPaintable that can be rendered in various widgets
545 - hlssink3: HTTP Live Streaming sink
546 - hrtfrender: Head-Related Transfer Function (HRTF) renderer
547 - hsvdetector: HSV colorspace detector
548 - hsvfilter: HSV colorspace filter
549 - jsongstenc: Wraps buffers containing any valid top-level JSON
550 structures into higher level JSON objects, and outputs those as ndjson
552 - jsongstparse: Parses ndjson as output by jsongstenc
553 - jsontovtt: converts JSON to WebVTT subtitles
554 - regex: Applies regular expression operations on text
555 - roundedcorners: Adds rounded corners to video
556 - spotifyaudiosrc: Spotify source
557 - textahead: Display upcoming text buffers ahead (e.g. for karaoke)
559 - transcriberbin: passthrough bin that transcribes raw audio to
560 closed captions using awstranscriber and puts the captions as closed caption metas on the video
562 - tttojson: Converts timed text to a JSON representation
563 - uriplaylistbin: Playlist source bin
564 - webpdec-rs: WebP image decoder with animation support
566 - New plugin codecalpha with elements to assist with WebM Alpha decoding:
569 - codecalphademux: Split stream with GstVideoCodecAlphaMeta into two streams
571 - alphacombine: Combine two raw video streams (I420 or NV12) as one
572 stream with alpha channel (A420 or AV12)
573 - vp8alphadecodebin: A bin to handle software decoding of VP8 with alpha
575 - vp9alphadecodebin: A bin to handle software decoding of VP9 with alpha
578 - New hardware accelerated elements for Linux:
580 - v4l2slmpeg2dec: Support for Linux Stateless MPEG2 decoders
581 - v4l2slvp9dec: Support for Linux Stateless VP9 decoders
582 - v4l2slvp8alphadecodebin: Support HW accelerated VP8 with alpha decoding
584 - v4l2slvp9alphadecodebin: Support HW accelerated VP9 with alpha decoding
587 New element features and additions
589 - assrender: handle more font mime types; better interaction with
590 matroskademux for embedded fonts
592 - audiobuffersplit: Add support for specifying output buffer size in
593 bytes (not just duration)
595 - audiolatency: new "samplesperbuffer" property so users can configure
596 the number of samples per buffer. The default value is 240 samples
597 which is equivalent to 5ms latency with a sample rate of 48000,
598 which might be larger than the actual buffer size of the audio capture source.
601 - audiomixer, audiointerleave, GstAudioAggregator: now keep a count of
602 samples that are dropped or processed as a statistic and can be made
603 to post QoS messages on the bus whenever samples are dropped by
604 setting the "qos-messages" property on input pads.
606 - audiomixer, compositor: improved handling of new inputs added at
607 runtime. New API was added to the GstAggregator base class to allow
608 subclasses to opt into an aggregation mode where inactive pads are
609 ignored when processing input buffers
610 (gst_aggregator_set_ignore_inactive_pads(),
611 gst_aggregator_pad_is_inactive()). An “inactive pad” in this context
612 is a pad which, in live mode, hasn’t yet received a first buffer,
613 but has been waited on at least once. What would happen usually in
614 this case is that the aggregator would wait for data on this pad
615 every time, up to the maximum configured latency. This would
616 inadvertently push mixer elements in live mode to the configured
617 latency envelope and delay processing when new inputs are added at
618 runtime until these inputs have actually produced data. This is
619 usually undesirable. With this new API, new inputs can be added
620 (requested) and configured and they won’t delay the data processing.
621 Applications can opt into this new behaviour by setting the
622 "ignore-inactive-pads" property on compositor, audiomixer or other
623 GstAudioAggregator-based elements.
625 - cccombiner: implement “scheduling” of captions. So far cccombiner’s
626 behaviour was essentially that of a funnel: it strictly looked at
627 input timestamps to associate together video and caption buffers.
628 Now it will try to smoothly schedule caption buffers in order to
629 have exactly one per output video buffer. This might involve
630 rewriting input captions, for example when the input is CDP then
631 sequence counters are rewritten, time codes are dropped and
632 potentially re-injected if the input video frame had a time code
633 meta. This can also lead to the input drifting from synchronization,
634 when there isn’t enough padding in the input stream to catch up. In
635 that case the element will start dropping old caption buffers once
636 the number of buffers in its internal queue reaches a certain limit
637 (configurable via the "max-scheduled" property). The original
638 funnel-like behaviour can be restored by setting the "scheduling" property to FALSE.
641 - ccconverter: new "cdp-mode" property to specify which sections to
642 include in CDP packets (timecode, CC data, service info). Various
643 software, including ffmpeg’s Decklink support, fails to parse CDP
644 packets that contain anything but CC data.
646 - clocksync: new "sync-to-first" property for automatic timestamp
647 offset setup: if set clocksync will set up the "ts-offset" value
648 based on the first buffer and the pipeline’s running time when the
649 first buffer arrived. The newly configured "ts-offset" in this case
650 would be the value that allows outputting the first buffer without
651 waiting on the clock. This is useful for example to feed a non-live
652 input into an already-running pipeline.
- compositor:

656 - multi-threaded input conversion and compositing. Set the
657 "max-threads" property to activate this.
658 - new "sizing-policy" property to support display aspect ratio
659 (DAR)-aware scaling. By default the image is scaled to fill the
660 configured destination rectangle without padding and without
661 keeping the aspect ratio. With sizing-policy=keep-aspect-ratio
662 the input image is scaled to fit the destination rectangle
663 specified by GstCompositorPad:{xpos, ypos, width, height}
664 properties preserving the aspect ratio. As a result, the image
665 will be centered in the destination rectangle with padding if necessary.
667 - new "zero-size-is-unscaled" property on input pads. By default
668 pad width=0 or pad height=0 mean that the stream should not be
669 scaled in that dimension. But if the "zero-size-is-unscaled"
670 property is set to FALSE a width or height of 0 is instead
671 interpreted to mean that the input image on that pad should not
672 be composited, which is useful when creating animations where an
673 input image is made smaller and smaller until it disappears.
674 - improved handling of new inputs at runtime via
675 "ignore-inactive-pads"property (see above for details)
676 - allow output format with alpha even if none of the inputs have
677 alpha (also glvideomixer and other GstVideoAggregator subclasses)
680 - dashsink: add h265 codec support and signals for allowing custom
681 playlist/fragment output
- decodebin3:

685 - improved decoder selection, especially for hardware decoders
686 - make input activation “atomic” when adding inputs dynamically
687 - better interleave handling: take into account decoder latency
- decklink:

692 - Updated DeckLink SDK to 11.2 to support DeckLink 8K Pro
694 - More accurate and stable capture timestamps: use the
695 hardware reference clock time when the frame was finished
696 being captured instead of a clock time much further down the line.
698 - Automatically detect widescreen vs. normal NTSC/PAL
- encodebin:

702 - add “smart encoding” support for H.265, VP8 and VP9 (i.e. only
703 re-encode where needed and otherwise pass through encoded video as-is)
705 - H264/H265 smart encoding improvements: respect user-specified
706 stream-format, but if not specified default to avc3/hvc1 with
707 in-band SPS/PPS/VPS signalling for more flexibility.
708 - new encodebin2 element with dynamic/sometimes source pads in
709 order to support the option of doing the muxing outside of
710 encodebin, e.g. in combination with splitmuxsink.
711 - add APIs to set element properties on encoding profiles (see above)
714 - errorignore: new "ignore-eos" property to also ignore FLOW_EOS from downstream elements
717 - giosrc: add support for growing source files: applications can
718 specify that the underlying file being read is growing by setting
719 the "is-growing" property. If set, the source won’t EOS when it
720 reaches the end of the file, but will instead start monitoring it
721 and will start reading data again whenever a change is detected. The
722 new "waiting-data" and "done-waiting-data" signals keep the
723 application informed about the current state.
725 - gtksink, gtkglsink:
727 - scroll event support: forwarded as navigation events into the pipeline
729 - "video-aspect-ratio-override" property to force a specific aspect ratio
731 - "rotate-method" property and support automatic rotation based on image orientation tags
734 - identity: new "stats" property allows applications to retrieve the
735 number of bytes and buffers that have passed through so far.
737 - interlace: add support for more formats, esp. 10-bit, 12-bit and 16-bit variants
740 - jack: new "low-latency" property for automatic latency-optimized
741 setting and "port-names" property to select ports explicitly
743 - jpegdec: support output conversion to RGB using libjpeg-turbo (for
748 - "mode" property to control whether and how detected closed
749 captions should be inserted in the list of existing close
750 caption metas on the input frame (if any): add, drop, or
752 - "ntsc-only" property to only look for captions if video has NTSC
755 - line21enc: new "remove-caption-meta" property to remove metas from output
756 buffers after encoding the captions into the video data; support for
759 - matroskademux, matroskamux: Add support for ffv1, a lossless
760 intra-frame video coding format.
762 - matroskamux: accept in-band SPS/PPS/VPS for H264 and H265
763 (i.e. stream-format avc3 and hev1) which allows on-the-fly
764 profile/level/resolution changes.
766 - matroskamux: new "cluster-timestamp-offset" property, useful for use
767 cases where the container timestamps should map to some absolute
768 wall clock time, for example.
770 - rtpsrc: add "caps" property to allow explicit setting of the caps
773 - mpegts: support SCTE-35 passthrough via new "send-scte35-events"
774 property on MPEG-TS demuxer tsdemux. When enabled, SCTE 35 sections
775 (e.g. ad placement opportunities) are forwarded as events downstream
776 where they can be picked up again by mpegtsmux. This required a
777 semantic change in the SCTE-35 section API: timestamps are now in
778 running time instead of muxer pts.
780 - tsdemux: Handle PCR-less MPEG-TS streams; more robust timestamp
781 handling in certain corner cases and for poorly muxed streams.
- mpegtsmux:

785 - More conformance improvements to make MPEG-TS analyzers happy:
786 - PCR timing accuracy: Improvements to the way mpegtsmux
787 outputs PCR observations in CBR mode, so that a PCR
788 observation is always inserted when needed, so that we never
789 miss the configured pcr-interval, as that triggers various
790 MPEG-TS analyser errors.
791 - Improved PCR/SI scheduling
792 - Don’t write PCR until PAT/PMT are output to make sure streams
793 start cleanly with a PAT/PMT.
794 - Allow overriding the automatic PMT PID selection via
795 application-supplied PMT_%d fields in the prog-map
800 - new "first-moov-then-finalise" mode for fragmented output where
801 the output will start with a self-contained moov atom for the
802 first fragment, and then produce regular fragments. Then at the
803 end when the file is finalised, the initial moov is invalidated
804 and a new moov is written covering the entire file. This way the
805 file is a “fragmented mp4” file while it is still being written
806 out, and remains playable at all times, but at the end it is
807 turned into a regular mp4 file (with former fragment headers
808 remaining as unused junk data in the file).
809 - support H.264 avc3 and H.265 hvc1 stream formats as input where
810 the codec data is signalled in-band inside the bitstream instead
811 of caps/file headers.
812 - support profile/level/resolution changes for H264/H265 input
813 streams (i.e. codec data changing on the fly). Each codec_data
814 is put into its own SampleTableEntry inside the stsd, unless the
815 input is in avc3 stream format in which case it’s written
816 in-band and not in the headers.
818 - multifilesink: new "min-keyframe-distance" property to make
819 minimum distance between keyframes in next-file=key-frame mode
820 configurable instead of hard-coding it to 10 seconds.
822 - mxfdemux has seen a big refactoring to support non-frame wrappings
823 and more accurate timestamp/seek handling for some formats
825 - msdk plugin for hardware-accelerated video encoding and decoding
826 using the Intel Media SDK:
828 - oneVPL support (Intel oneAPI Video Processing Library)
829 - AV1 decoding support
830 - H.264 decoder now supports constrained-high and progressive-high profiles
833 - more configuration options (properties):
834 "intra-refresh-type", "min-qp" , "max-qp", "p-pyramid",
837 - can output main-still-picture profile
838 - now inserts HDR SEIs (mastering display colour volume and content light level)
840 - more configuration options (properties):
841 "intra-refresh-type", "min-qp" , "max-qp", "p-pyramid",
842 "b-pyramid", "dblk-idc", "transform-skip"
843 - support for RGB 10bit format
844 - External bitrate control in encoders
845 - Video post proc element msdkvpp gained support for 12-bit pixel
846 formats P012_LE, Y212_LE and Y412_LE
848 - nvh264sldec: interlaced stream support
850 - openh264enc: support main, high, constrained-high and
851 progressive-high profiles
853 - openjpeg: support for multithreaded decoding and encoding
855 - rtspsrc: now supports IPv6 also for tunneled mode (RTSP-over-HTTP);
856 new "ignore-x-server-reply" property to ignore the
857 x-server-ip-address server header reply in case of HTTP tunneling,
858 as it is often broken.
860 - souphttpsrc: Runtime compatibility support for libsoup2 and
861 libsoup3. libsoup3 is the latest major version of libsoup, but
862 libsoup2 and libsoup3 can’t co-exist in the same process because
863 there is no namespacing or versioning for GObject types. As a
864 result, it would be awkward if the GStreamer souphttpsrc plugin
865 linked to a specific version of libsoup, because it would only work
866 with applications that use the same version of libsoup. To make this
867 work, the soup plugin now tries to determine the libsoup version
868 used by the application (and its other dependencies) at runtime on
869 systems where GStreamer is linked dynamically. libsoup3 support is
870 still considered somewhat experimental at this point.
872 - srtsrc, srtsink: add signals for the application to accept/reject incoming connections
875 - timeoverlay: new elapsed-running-time time mode which shows the
876 running time since the first running time (and each flush-stop).
878 - udpsrc: new timestamping mode to retrieve packet receive timestamps
879 from the kernel via socket control messages (SO_TIMESTAMPNS) on Linux
882 - uritranscodebin: new setup-source and element-setup signals for
883 applications to configure elements used
885 - v4l2codecs plugin gained support for 4x4 and 32x32 tile formats
886 enabling some platforms or direct renders. Important memory usage improvements were also made.
889 - v4l2slh264dec now implements the final Linux uAPI as shipped on
890 Linux 5.11 and later.
892 - valve: add "drop-mode" property and provide two new modes of
893 operation: in drop-mode=forward-sticky-events sticky events
894 (stream-start, segment, tags, caps, etc.) are forwarded downstream
895 even when dropping is enabled; drop-mode=transform-to-gap will in
896 addition also convert buffers into gap events when dropping is
897 enabled, which lets downstream elements know that time is advancing
898 and might allow for preroll in many scenarios. By default all events
899 and all buffers are dropped when dropping is enabled, which can
900 cause problems with caps negotiation not progressing or branches not
901 prerolling when dropping is enabled.
903 - videocrop: support for many more pixel formats, e.g. planar YUV
904 formats with > 8bits and GBR* video formats; can now also accept
905 video not backed by system memory as long as downstream supports the crop meta.
908 - videotestsrc: new smpte-rp-219 pattern for SMPTE75 RP-219 conformant color bars
911 - vp8enc: finish support for temporal scalability: two new properties
912 ("temporal-scalability-layer-flags",
913 "temporal-scalability-layer-sync-flags") and a unit change on the
914 "temporal-scalability-target-bitrate" property (now expects bps);
915 also make temporal scalability details available to RTP payloaders
918 - vp9enc: new properties to tweak encoder performance:
920 - "aq-mode" to configure adaptive quantization modes
921 - "frame-parallel-decoding" to configure whether to create a
922 bitstream that reduces decoding dependencies between frames
923 which allows staged parallel processing of more than one video
924 frame in the decoder. (Defaults to TRUE)
925 - "row-mt", "tile-columns" and "tile-rows" so multithreading can
926 be enabled on a per-tile basis, instead of on a per tile-column
927 basis. In combination with the new "tile-rows" property, this
928 allows the encoder to make much better use of the available CPU cores.
931 - vp9dec, vp9enc: add support for 10-bit 4:2:0 and 4:2:2 YUV, as well as 4:4:4.
934 - vp8enc, vp9enc now default to “good quality” for the deadline
935 property rather than “best quality”. Having the deadline set to best
936 quality causes the encoder to be absurdly slow, most real-life users
937 will prefer good-enough quality with better performance instead.
- wpe:

941 - implement audio support: a new sometimes source pad will be
942 created for each audio stream created by the web engine.
943 - move wpesrc to wpevideosrc and add a wrapper bin wpesrc to also support audio
945 - also handles web:// URIs now (same as cefsrc)
946 - post messages with the estimated load progress on the bus
948 - x265enc: add negative DTS support, which means timestamps are now
949 offset by 1h same as with x264enc
951 RTP Payloaders and Depayloaders
953 - rtpisacpay, rtpisacdepay: new RTP payloader and depayloader for iSAC audio.
958 - new "request-keyframe" property to make the depayloader
959 automatically request a new keyframe from the sender on packet
960 loss, consistent with the new property on rtpvp8depay.
961 - new "wait-for-keyframe" property to make depayloader wait for a
962 new keyframe at the beginning and after packet loss (only
963 effective if the depayloader outputs AUs), consistent with the
964 existing property on rtpvp8depay.
966 - rtpopuspay, rtpopusdepay: support libwebrtc-compatible multichannel
967 audio in addition to the previously supported multichannel audio
970 - rtpopuspay: add DTX (Discontinuous Transmission) support
972 - rtpvp8depay: new "request-keyframe" property to make the depayloader
973 automatically request a new keyframe from the sender on packet loss.
975 - rtpvp8pay: temporal scaling support
977 - rtpvp9depay: Improved SVC handling (aggregate all layers)
981 - rtpst2022-1-fecdec, rtpst2022-1-fecenc: new elements providing SMPTE
982 2022-1 2-D Forward Error Correction. More details in Mathieu’s blog
985 - rtpreddec: BUNDLE support
987 - rtpredenc, rtpulpfecenc: add support for Transport-wide Congestion Control (TWCC)
990 - rtpsession: new "twcc-feedback-interval" property to allow RTCP TWCC
991 reports to be scheduled on a timer instead of per marker-bit.
993 Plugin and library moves
995 - There were no plugin moves or library moves in this cycle.
999 The following elements or plugins have been removed:
1001 - The ofa audio fingerprinting plugin has been removed. The MusicIP
1002 database has been defunct for years so this plugin is likely neither
1003 useful nor used by anyone.
1005 - The mms plugin containing mmssrc has been removed. It seems unlikely
1006 anyone still needs this or that there are even any streams left out
1007 there. The MMS protocol was deprecated in 2003 (in favour of RTSP)
1008 and support for it was dropped with Microsoft Media Services 2008,
1009 and Windows Media Player apparently also does not support it any longer.
1012 Miscellaneous API additions
1016 - gst_buffer_new_memdup() is a convenience function for the
1017 widely-used gst_buffer_new_wrapped(g_memdup(data,size),size) pattern.
1020 - gst_caps_features_new_single() creates a new single GstCapsFeatures,
1021 avoiding the need to use the vararg function with NULL terminator
1024 - gst_element_type_set_skip_documentation() can be used by plugins to
1025 signal that certain elements should not be included in the GStreamer
1026 plugin documentation. This is useful for plugins where elements are
1027 registered dynamically based on hardware capabilities and/or where
1028 the available plugins and properties vary from system to system.
1029 This is used in the d3d11 plugin for example to ensure that only the
1030 list of default elements is advertised in the documentation.
1032 - gst_type_find_suggest_empty_simple() is a new convenience function
1033 for typefinders for cases where there’s only a media type and no other fields.
1036 - New API to create elements and set properties at construction time,
1037 which is not only convenient, but also allows GStreamer elements to
1038 have construct-only properties: gst_element_factory_make_full(),
1039 gst_element_factory_make_valist(),
1040 gst_element_factory_make_with_properties(),
1041 gst_element_factory_create_full(),
1042 gst_element_factory_create_valist(),
1043 gst_element_factory_create_with_properties().
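A short sketch exercising a few of these additions (property values are arbitrary examples):

```c
#include <gst/gst.h>
#include <string.h>

static void
misc_core_api_examples (void)
{
  /* Create an element and set properties, including construct-only ones,
   * in a single call */
  GstElement *src = gst_element_factory_make_full ("videotestsrc",
      "name", "testsrc", "is-live", TRUE, NULL);

  /* Copy some bytes into a new buffer without the g_memdup() boilerplate */
  const gchar *data = "hello";
  GstBuffer *buf = gst_buffer_new_memdup (data, strlen (data) + 1);

  /* Single caps feature without the NULL-terminated vararg call */
  GstCapsFeatures *feat =
      gst_caps_features_new_single (GST_CAPS_FEATURE_MEMORY_SYSTEM_MEMORY);

  gst_caps_features_free (feat);
  gst_buffer_unref (buf);
  gst_object_unref (src);
}
```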
1045 - GstSharedTaskPool: new “shared” task pool subclass with slightly
1046 different default behaviour than the existing GstTaskPool which
1047 would create unlimited number of threads for new tasks. The shared
1048 taskpool creates up to N threads (default: 1) and then distributes
1049 pending tasks to those threads round-robin style, and blocks if no
1050 thread is available. It is possible to join tasks. This can be used
1051 by plugins to implement simple multi-threaded processing and is used
1052 for the new multi-threaded video conversion and compositing done in
1053 GstVideoAggregator, videoconverter and compositor.
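A rough usage sketch (the work function is hypothetical):

```c
#include <gst/gst.h>

static void
do_work (gpointer user_data)
{
  /* a slice of work, executed on one of the pool threads */
}

static void
shared_task_pool_example (void)
{
  GstTaskPool *pool = gst_shared_task_pool_new ();
  gst_shared_task_pool_set_max_threads (GST_SHARED_TASK_POOL (pool), 4);
  gst_task_pool_prepare (pool, NULL);

  /* Tasks are distributed round-robin over the (at most 4) pool threads
   * and, unlike with the default GstTaskPool, can be joined. */
  gpointer handle = gst_task_pool_push (pool, do_work, NULL, NULL);
  gst_task_pool_join (pool, handle);

  gst_task_pool_cleanup (pool);
  gst_object_unref (pool);
}
```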
1055 Plugins Base Utils library
- GstDiscoverer:

1059 - gst_discoverer_container_info_get_tags() was added to retrieve
1060 global/container tags (vs. per-stream tags). Per-Stream tags can
1061 be retrieved via the existing
1062 gst_discoverer_stream_info_get_tags().
1063 gst_discoverer_info_get_tags(), which for many files returns a
1064 confusing mix of stream and container tags, has been deprecated
1065 in favour of the container/stream-specific functions.
1066 - gst_discoverer_stream_info_get_stream_number() returns a unique
1067 integer identifier for a given stream within the given
1068 GstDiscoverer context. (If this matches the stream number inside
1069 the container bitstream that’s by coincidence and not by design.)
1072 - gst_pb_utils_get_caps_description_flags() can be used to query
1073 whether certain caps represent a container, audio, video, image,
1074 subtitles, tags, or something else. This only works for formats known to GStreamer.
1077 - gst_pb_utils_get_file_extension_from_caps() returns a possible file
1078 extension for given caps.
1080 - gst_codec_utils_h264_get_profile_flags_level(): Parses profile,
1081 flags, and level from H264 AvcC codec_data. The format of H264 AVCC
1082 extradata/sequence_header is documented in the ITU-T H.264
1083 specification section 7.3.2.1.1 as well as in ISO/IEC 14496-15
1086 - gst_codec_utils_caps_get_mime_codec() to convert caps to a RFC 6381
1087 compatible MIME codec string. Useful for providing the codecs
1088 field inside the Content-Type HTTP header for containerized formats,
1089 such as mp4 or matroska.
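A small sketch of the new caps helpers (using container and H.264 caps as examples):

```c
#include <gst/pbutils/pbutils.h>

static void
caps_helpers_example (void)
{
  gst_pb_utils_init ();

  GstCaps *mp4 = gst_caps_from_string ("video/quicktime, variant=(string)iso");
  if (gst_pb_utils_get_caps_description_flags (mp4) &
      GST_PBUTILS_CAPS_DESCRIPTION_FLAG_CONTAINER)
    g_print ("this is a container format\n");

  gchar *ext = gst_pb_utils_get_file_extension_from_caps (mp4); /* e.g. "mp4" */

  /* RFC 6381 codec string, e.g. for a Content-Type "codecs" field */
  GstCaps *h264 = gst_caps_from_string (
      "video/x-h264, profile=(string)high, level=(string)4");
  gchar *mime_codec = gst_codec_utils_caps_get_mime_codec (h264);

  g_free (ext);
  g_free (mime_codec);
  gst_caps_unref (mp4);
  gst_caps_unref (h264);
}
```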
1091 GStreamer OpenGL integration library and plugins
1093 - glcolorconvert: added support for converting the video formats
1094 A420, AV12, BGR, BGRA, RGBP and BGRP.
1096 - Added support to GstGLBuffer for persistent buffer mappings where a
1097 Pixel Buffer Object (PBO) can be mapped by both the CPU and the GPU.
1098 This removes a memcpy() when uploading textures or vertices
1099 particularly when software decoders (e.g. libav) are direct
1100 rendering into our memory. Improves transfer performance
1101 significantly. Requires OpenGL 4.4, GL_ARB_buffer_storage or
1102 GL_EXT_buffer_storage
1104 - Added various helper functions for handling 4x4 matrices of affine
1105 transformations as used by GstVideoAffineTransformationMeta.
1107 - Add support to GstGLContext for allowing the application to control
1108 the config (EGLConfig, GLXConfig, etc) used when creating the OpenGL
1109 context. This allows the ability to choose between RGB16 or RGB10A2
1110 or RGBA8 back/front buffer configurations that were previously
1111 hardcoded. GstGLContext also supports retrieving the configuration
1112 it was created with or from an externally provided OpenGL context
1113 handle. This infrastructure is also used to create a compatible
1114 config from an application/externally provided OpenGL context in
1115 order to improve compatibility with other OpenGL frameworks and GUI
1116 toolkits. A new environment variable GST_GL_CONFIG was also added to
1117 be able to request a specific configuration from the command line.
1118 Note: different platforms will have different functionality available.
1121 - Add support for choosing between EGL and WGL at runtime when running
1122 on Windows. Previously this was a build-time switch. Allows use in
1123 e.g. Gtk applications on Windows that target EGL/ANGLE without
1124 recompiling GStreamer. gst_gl_display_new_with_type() can be used by
1125 applications to choose a specific display type to use.
1127 - Build fixes to explicitly check for Broadcom-specific libraries on
1128 older versions of the Raspberry Pi platform. The Broadcom OpenGL ES
1129 and EGL libraries have different filenames. Using the vc4 Mesa
1130 driver on the Raspberry Pi is not affected.
1132 - Added support to glupload and gldownload for transferring RGBA
1133 buffers using the memory:NVMM available on the Nvidia Tegra family
1134 of embedded devices.
1136 - Added support for choosing libOpenGL and libGLX as used in a GLVND
1137 environment on unix-based platforms. This allows using desktop
1138 OpenGL and EGL without pulling in any GLX symbols as would be
1139 required with libGL.
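For example, an application that specifically wants an EGL display could do something along these lines (the separate GST_GL_CONFIG environment variable only affects the context configuration, not the display type):

```c
#include <gst/gl/gl.h>

static GstGLDisplay *
get_egl_display (void)
{
  /* New in 1.20: pick a display type at runtime instead of relying on the
   * build-time default (useful e.g. to force EGL/ANGLE on Windows); may
   * return NULL if that display type is not available. */
  return gst_gl_display_new_with_type (GST_GL_DISPLAY_TYPE_EGL);
}
```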
Video library

1143 - New raw video formats:
1145 - AV12 (NV12 with alpha plane)
1146 - RGBP and BGRP (planar RGB formats)
1147 - ARGB64 variants with specified endianness instead of host endianness:
1149 - ARGB64_LE, ARGB64_BE
1150 - RGBA64_BE, RGBA64_LE
1151 - BGRA64_BE, BGRA64_LE
1152 - ABGR64_BE, ABGR64_LE
1154 - gst_video_orientation_from_tag() is a new convenience API to parse the
1155 image orientation from a GstTagList.
1157 - GstVideoDecoder subframe support (see below)
1159 - GstVideoCodecState now also carries some HDR metadata
1161 - Ancillary video data: implement transform functions for AFD/Bar
1162 metas, so they will be forwarded in more cases
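Tying back to gst_video_orientation_from_tag() above, a minimal sketch that feeds the parsed orientation into a videoflip element:

```c
#include <gst/video/video.h>

static void
apply_orientation_tag (GstTagList * tags, GstElement * videoflip)
{
  GstVideoOrientationMethod method;

  /* New in 1.20: parse the image-orientation tag straight into a
   * GstVideoOrientationMethod */
  if (gst_video_orientation_from_tag (tags, &method))
    g_object_set (videoflip, "video-direction", method, NULL);
}
```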
MPEG-TS library

1166 This library only handles section parsing and such, see above for
1167 changes to the actual mpegtsmux and mpegtsdemux elements.
1169 - many additions and improvements to SCTE-35 section parsing
1170 - new API for fetching extended descriptors:
1171 gst_mpegts_find_descriptor_with_extension()
1172 - add support for SIT sections (Selection Information Tables)
1173 - expose event-from-section constructor gst_event_new_mpegts_section()
1174 - parse Audio Preselection Descriptor needed for Dolby AC-4
1176 GstWebRTC library + webrtcbin
1178 - Change the way in which sink pads and transceivers are matched
1179 together to support easier usage. If a pad is created without a
1180 specific index (i.e. using sink_%u as the pad template), then an
1181 available compatible transceiver will be searched for. If a specific
1182 index is requested (i.e. sink_1) then if a transceiver for that
1183 m-line already exists, that transceiver must match the new sink pad
1184 request. If there is no transceiver available in either scenario, a
1185 new transceiver is created. If a mixture of both sink_1 and sink_%u
1186 requests results in an impossible situation, an error will be
1187 produced at pad request time or from create offer/answer.
1189 - webrtcbin now uses regular ICE nomination instead of libnice’s
1190 default of aggressive ICE nomination. Regular ICE nomination is the
1191 default recommended by various relevant standards and improves
1192 connectivity in specific network scenarios.
1194 - Add support for limiting the port range used for RTP with the
1195 addition of the min-rtp-port and max-rtp-port properties on the ICE object.
1198 - Expose the SCTP transport as a property on webrtcbin to more closely
1199 match the WebRTC specification.
1201 - Added support for taking into account the data channel transport
1202 state when determining the value of the "connection-state" property.
1203 Previous versions of the WebRTC spec did not include the data
1204 channel state when computing this value.
1206 - Add configuration for choosing the size of the underlying sockets
1207 used for transporting media data
1209 - Always advertise support for the transport-cc RTCP feedback protocol
1210 as rtpbin supports it. For full support, the configured caps (input
1211 or through codec-preferences) need to include the relevant RTP header extension.
1214 - Numerous fixes to caps and media handling to fail-fast when an
1215 incompatible situation is detected.
1217 - Improved support for attaching the required media after a remote offer.
1220 - Add support for dynamically changing the amount of FEC used for a stream.
1223 - webrtcbin now stops further SDP processing at the first error it encounters.
1226 - Completed support for either the local or the remote side closing a data channel.
1229 - Various fixes when performing BUNDLEing of the media streams in
1230 relation to RTX and FEC usage.
1232 - Add support for writing out QoS DSCP marking on outgoing packets to
1233 improve reliability in some network scenarios.
1235 - Improvements to the statistics returned by the get-stats signal
1236 including the addition of the raw statistics from the internal
1237 RTPSource, and the TWCC stats when available.
1239 - The webrtc library does not expose any objects anymore with public
1240 fields. Instead properties have been added to replace that
1241 functionality. If you are accessing such fields in your application,
1242 switch to the corresponding properties.
1244 GstCodecs and Video Parsers
1246 - Support for render delays to improve throughput across all CODECs
1247 (used with NVDEC and V4L2).
1248 - lots of improvements to parsers and the codec parsing decoder base
1249 classes (H264, H265, VP8, VP9, AV1, MPEG-2) used for various
1250 hardware-accelerated decoder APIs.
1254 - gst_allocation_params_new() allocates a GstAllocationParams struct
1255 on the heap. This should only be used by bindings (and freed via
1256 gst_allocation_params_free() then). In C code you would allocate
1257 this on the stack and only init it in place.
1259 - gst_debug_log_literal() can be used to log a string to the debug log
1260 without going through any printf format expansion and associated
1261 overhead. This is mostly useful for bindings such as the Rust
1262 bindings which may have done their own formatting already.
1264 - Provide non-inlined versions of refcounting APIs for various
1265 GStreamer mini objects, so that they can be consumed by bindings
1266 (e.g. gstreamer-sharp): gst_buffer_ref, gst_buffer_unref,
1267 gst_clear_buffer, gst_buffer_copy, gst_buffer_replace,
1268 gst_buffer_list_ref, gst_buffer_list_unref, gst_clear_buffer_list,
1269 gst_buffer_list_copy, gst_buffer_list_replace, gst_buffer_list_take,
1270 gst_caps_ref, gst_caps_unref, gst_clear_caps, gst_caps_replace,
1271 gst_caps_take, gst_context_ref, gst_context_unref, gst_context_copy,
1272 gst_context_replace, gst_event_replace, gst_event_steal,
1273 gst_event_take, gst_event_ref, gst_event_unref, gst_clear_event,
1274 gst_event_copy, gst_memory_ref, gst_memory_unref, gst_message_ref,
1275 gst_message_unref, gst_clear_message, gst_message_copy,
1276 gst_message_replace, gst_message_take, gst_promise_ref,
1277 gst_promise_unref, gst_query_ref, gst_query_unref, gst_clear_query,
1278 gst_query_copy, gst_query_replace, gst_query_take, gst_sample_ref,
1279 gst_sample_unref, gst_sample_copy, gst_tag_list_ref,
1280 gst_tag_list_unref, gst_clear_tag_list, gst_tag_list_replace,
1281 gst_tag_list_take, gst_uri_copy, gst_uri_ref, gst_uri_unref,
1284 - expose a GType for GstMiniObject
- gst_device_provider_probe() now returns non-floating device objects
1290 - gst_element_get_request_pad() has been deprecated in favour of the
1291 newly-added gst_element_request_pad_simple() which does the exact
1292 same thing but has a less confusing name that hopefully makes clear
1293 that the function request a new pad rather than just retrieves an
1294 already-existing request pad.
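
For illustration, the rename in practice (using tee, whose src_%u pads
are request pads; error handling omitted):

    GstElement *tee = gst_element_factory_make ("tee", NULL);

    /* Previously: pad = gst_element_get_request_pad (tee, "src_%u"); */
    GstPad *pad = gst_element_request_pad_simple (tee, "src_%u");

    /* ... link the pad and run the pipeline ... */

    gst_element_release_request_pad (tee, pad);
    gst_object_unref (pad);
    gst_object_unref (tee);
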
1296 - gst_discoverer_info_get_tags(), which for many files returns a
1297 confusing mix of stream and container tags, has been deprecated in
1298 favour of the container-specific and stream-specific functions,
1299 gst_discoverer_container_info_get_tags() and
1300 gst_discoverer_stream_info_get_tags().
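
A short sketch of the replacement pattern, assuming info was obtained
from a successful gst_discoverer_discover_uri() call:

    GstDiscovererStreamInfo *sinfo =
        gst_discoverer_info_get_stream_info (info);

    if (sinfo != NULL) {
      if (GST_IS_DISCOVERER_CONTAINER_INFO (sinfo)) {
        /* Tags that belong to the container itself ... */
        const GstTagList *ctags = gst_discoverer_container_info_get_tags (
            GST_DISCOVERER_CONTAINER_INFO (sinfo));

        if (ctags != NULL)
          g_print ("container tags: %d\n", gst_tag_list_n_tags (ctags));
      }
      /* ... while per-stream tags come from
       * gst_discoverer_stream_info_get_tags() on the individual streams. */
      gst_discoverer_stream_info_unref (sinfo);
    }
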
1302 - gst_video_sink_center_rect() was deprecated in favour of the more
1303 generic newly-added gst_video_center_rect().
1305 - The GST_MEMORY_FLAG_NO_SHARE flag has been deprecated, as it tends
1306 to cause problems and prevents sub-buffering. If pooling or lifetime
1307 tracking is required, memories should be allocated through a custom
1308 GstAllocator instead of relying on the lifetime of the buffers the
1309 memories were originally attached to, which is fragile anyway.
1311 - The GstPlayer high-level playback library is being replaced with the
1312 new GstPlay library (see above). GstPlayer should be considered
1313 deprecated at this point and will be marked as such in the next
1314 development cycle. Applications should be ported to GstPlay.
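
A minimal porting sketch for GstPlay (the header path and the NULL
video renderer are assumptions; see the GstPlay documentation for the
bus-based message API used for end-of-stream and error handling):

    #include <gst/play/play.h>

    int
    main (int argc, char **argv)
    {
      GstPlay *play;
      GMainLoop *loop;

      gst_init (&argc, &argv);
      loop = g_main_loop_new (NULL, FALSE);

      play = gst_play_new (NULL);   /* NULL: use the default video sink */
      gst_play_set_uri (play, "file:///path/to/media.mp4");
      gst_play_play (play);

      /* A real application would watch the bus returned by
       * gst_play_get_message_bus() and quit the loop on EOS or error. */
      g_main_loop_run (loop);

      g_object_unref (play);
      g_main_loop_unref (loop);
      return 0;
    }
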
- GStreamer Editing Services: ges_video_transition_set_border(),
  ges_video_transition_get_border(),
  ges_video_transition_set_inverted() and
  ges_video_transition_is_inverted() have been deprecated; use
  ges_timeline_element_set_children_properties() instead.
1322 Miscellaneous performance, latency and memory optimisations
1324 More video conversion fast paths
1326 - v210 ↔ I420, YV12, Y42B, UYVY and YUY2
1329 Less jitter when waiting on the system clock
- Better system clock wait accuracy, less jitter: where available,
  clock_nanosleep is used for higher accuracy for waits below 500
  microseconds, and waits below 2ms will first use the regular waiting
  system and then clock_nanosleep for the remainder. The various wait
  implementations have a latency ranging from 50 to 500+ microseconds.
  While this is not a major issue when dealing with a low number of
  waits per second (e.g. video), it does introduce non-negligible
  jitter for the synchronisation of higher packet-rate systems.
1340 Video decoder subframe support
- The GstVideoDecoder base class gained API to process input at the
  sub-frame level. That way video decoders can start decoding slices
  before they have received the input frame in its entirety (to the
  extent the codec supports this, of course). This helps with CPU
  utilisation and reduces latency.
1348 - This functionality is now being used in the OpenJPEG JPEG 2000
1349 decoder, the FFmpeg H.264 decoder (in case of NAL-aligned input) and
1350 the OpenMAX H.264/H.265 decoders (in case of NAL-aligned input).
1352 Miscellaneous other changes and enhancements
- GstDeviceMonitor no longer fails to start just because one of the
  device providers failed to start. That could happen, for example, on
  systems where the pulseaudio device provider is installed but
  pulseaudio isn’t actually running and ALSA is used for audio
  instead. In the same vein the device monitor now keeps track of
1359 which providers have been started (via the new
1360 gst_device_provider_is_started()) and only stops actually running
1361 device providers when stopping the device monitor.
1363 - On embedded systems it can be useful to create a registry that can
1364 be shared and read by multiple processes running as different users.
1365 It is now possible to set the new GST_REGISTRY_MODE environment
1366 variable to specify the file mode for the registry file, which by
1367 default is set to be only user readable/writable.
1369 - GstNetClientClock will signal lost sync in case the remote time
1370 resets (e.g. because device power cycles), by emitting the “synced”
1371 signal with synced=FALSE parameter, so applications can take action.
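
For example, an application could watch for lost sync like this
(net_clock being a GstNetClientClock it created earlier):

    static void
    on_clock_synced (GstClock * clock, gboolean synced, gpointer user_data)
    {
      if (!synced) {
        /* Remote time reset (e.g. the device was power-cycled):
         * resynchronise or restart the pipeline as appropriate. */
        g_warning ("lost synchronisation with the remote clock");
      }
    }

    g_signal_connect (net_clock, "synced",
        G_CALLBACK (on_clock_synced), NULL);
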
- gst_value_deserialize_with_pspec() allows deserialization with a
  hint for what the target GType should be. This allows, for example,
  passing arrays of flags through the command line or
  gst_util_set_object_arg(), e.g. foo="<bar,bar+baz>".
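
As an illustration, assuming "some-flags-prop" is a flags-array
property on element (the names here are made up):

    GValue val = G_VALUE_INIT;
    GParamSpec *pspec =
        g_object_class_find_property (G_OBJECT_GET_CLASS (element),
        "some-flags-prop");

    /* The pspec tells the deserialiser which GType to produce. */
    if (pspec != NULL &&
        gst_value_deserialize_with_pspec (&val, "<bar,bar+baz>", pspec)) {
      g_object_set_property (G_OBJECT (element), "some-flags-prop", &val);
      g_value_unset (&val);
    }
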
1378 - It’s now allowed to create an empty GstVideoOverlayComposition
1379 without any rectangles by passing a NULL rectangle to
1380 gst_video_overlay_composition_new(). This is useful for bindings and
1381 simplifies application code in some places.
1383 Tracing framework, debugging and testing improvements
1385 - New factories tracer to list loaded elements (and other plugin
1386 features). This can be useful to collect a list of elements needed
1387 for an application, which then in turn can be used to create a
1388 tailored minimal GStreamer build that contains just the elements
1389 needed and nothing else.
1390 - New plugin-feature-loaded tracing hook for use by tracers like the
1391 new factories tracer
- GstHarness: Add gst_harness_set_live() so that harnesses can be set
  to non-live and return is-live=false in latency queries if needed.
  The default behaviour is still to return is-live=true in latency
  queries.
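
A small sketch of how a test might use this (identity is just an
example element here):

    #include <gst/check/gstharness.h>

    static void
    test_non_live_latency (void)
    {
      GstHarness *h = gst_harness_new ("identity");

      /* Make latency queries on this harness report is-live=false. */
      gst_harness_set_live (h, FALSE);

      /* ... push buffers and run the actual checks ... */

      gst_harness_teardown (h);
    }
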
1398 - navseek: new "hold-eos" property. When enabled, the element will
1399 hold back an EOS event until the next keystroke (via navigation
1400 events). This can be used to keep a video sink showing the last
1401 frame of a video pipeline until a key is pressed instead of tearing
1402 it down immediately on EOS.
- New fakeaudiosink element: mimics an audio sink and can be used for
  testing and CI pipelines on systems where no audio system is
  installed or running. It differs from fakesink in that it only
  supports audio caps and syncs to the clock by default like a normal
  audio sink. It also implements the GstStreamVolume interface like
  most audio sinks do.
- New videocodectestsink element for video codec conformance testing:
  Calculates MD5 checksums for video frames and skips any padding
  whilst doing so. Can optionally also write back the video data with
  padding removed into a file for easy byte-by-byte comparison with
  reference data.

Tools

gst-inspect-1.0

- Can sort the list of plugins by passing --sort=name as command line
  option

gst-launch-1.0

- will now error out on top-level properties that don’t exist and
  which were silently ignored before
1428 - On Windows the high-resolution clock is enabled now, which provides
1429 better clock and timer performance on Windows (see Windows section
1430 below for more details).

gst-play-1.0

- New --start-position command line argument to start playback from
  the specified position
- Audio can be muted/unmuted in interactive mode by pressing the m
  key
1438 - On Windows the high-resolution clock is enabled now (see Windows
1439 section below for more details)
1441 gst-device-monitor-1.0
- New --include-hidden command line argument to also show “hidden”
  devices

ges-launch-1.0

1448 - New interactive mode that allows seeking and such. Can be disabled
1449 by passing the --no-interactive argument on the command line.
1450 - Option to forward tags
1451 - Allow using an existing clip to determine the rendering format (both
1452 topology and profile) via new --profile-from command line argument.
1454 GStreamer RTSP server
1456 - GstRTSPMediaFactory gained API to disable RTCP
1457 (gst_rtsp_media_factory_set_enable_rtcp(), "enable-rtcp" property).
1458 Previously RTCP was always allowed for all RTSP medias. With this
1459 change it is possible to disable RTCP completely, no matter if the
1460 client wants to do RTCP or not.
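
For instance, an RTSP server that never wants RTCP on its sessions
might set up its factory like this (the launch line is purely
illustrative):

    GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new ();

    gst_rtsp_media_factory_set_launch (factory,
        "( videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96 )");

    /* Disable RTCP for all media created by this factory, no matter
     * what the client asks for. */
    gst_rtsp_media_factory_set_enable_rtcp (factory, FALSE);
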
1462 - Make a mount point of / work correctly. While not allowed by the
1463 RTSP 2 spec, the RTSP 1 spec is silent on this and it is used in the
1464 wild. It is now possible to use / as a mount path in
1465 gst-rtsp-server, e.g. rtsp://example.com/ would work with this now.
1466 Note that query/fragment parts of the URI are not necessarily
1467 correctly handled, and behaviour will differ between various
1468 client/server implementations; so use it if you must but don’t bug
1469 us if it doesn’t work with third party clients as you’d hoped.
1471 - multithreading fixes (races, refcounting issues, deadlocks)
1473 - ONVIF audio backchannel fixes
1475 - ONVIF trick mode optimisations
1477 - rtspclientsink: new "update-sdp" signal that allows updating the SDP
1478 before sending it to the server via ANNOUNCE. This can be used to
1479 add additional metadata to the SDP, for example. The order and
1480 number of medias must not be changed, however.
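
A hedged sketch of hooking into this signal (rtspclientsink being the
element instance), assuming the callback is handed the GstSDPMessage to
adjust in place; check the rtspclientsink documentation for the exact
callback signature:

    static void
    on_update_sdp (GstElement * sink, GstSDPMessage * sdp,
        gpointer user_data)
    {
      /* Add extra metadata before the SDP is sent via ANNOUNCE; the
       * number and order of medias must not be changed. */
      gst_sdp_message_add_attribute (sdp, "tool", "my-streaming-app");
    }

    g_signal_connect (rtspclientsink, "update-sdp",
        G_CALLBACK (on_update_sdp), NULL);
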

GStreamer VAAPI

- new AV1 decoder element (vaapiav1dec)
1486 - H264 decoder: handle stereoscopic 3D video with frame packing
1487 arrangement SEI messages
1489 - H265 encoder: added Screen Content Coding extensions support
1491 - H265 decoder: gained MAIN_444_12 profile support (decoded to
1492 Y412_LE), and 4:2:2 12-bits support (decoded to Y212_LE)
1494 - vaapipostproc: gained BT2020 color standard support
1496 - vaapidecode: now generates caps templates dynamically at runtime in
1497 order to advertise actually supported caps instead of all
1498 theoretically supported caps.
- GST_VAAPI_DRM_DEVICE environment variable to force a specified DRM
  device when a DRM display is used. It is ignored when other types of
  displays are used. By default /dev/dri/renderD128 is used for DRM
  displays.

GStreamer OpenMAX IL module

- subframe support in H.264/H.265 decoders
1509 GStreamer Editing Services and NLE
- framepositioner: new "operator" property to access blending modes in
  compositors
1513 - timeline: Implement snapping to markers
1514 - smart-mixer: Add support for d3d11compositor and glvideomixer
1515 - titleclip: add "draw-shadow" child property
1516 - ges:// URI support to define a timeline from a description.
- command-line-formatter:
  - Add track management to timeline description
  - Add keyframe support
- ges-launch-1.0:
  - Add an interactive mode where we can seek etc…
  - Add option to forward tags
  - Allow using an existing clip to determine the rendering format
    (both topology and profile) via new --profile-from command line
    argument

GStreamer validate

1530 - report: Add a way to force backtraces on reports even if not a
1531 critical issue (GST_VALIDATE_ISSUE_FLAGS_FORCE_BACKTRACE)
- Add a flag to gst_validate_replace_variables_in_string() to allow
  defining how to resolve variables in structs
1534 - Add gst_validate_bin_monitor_get_scenario() to get the bin monitor
1535 scenario, which is useful for applications that use Validate
- Add an expected-values parameter to the wait, message-type=XX action,
  allowing more precise filtering of the message we are waiting for.
- Add config file support: each test can now use a config file specific
  to the media file being tested.
1541 - Add support to check properties of object properties
- scenario: Add an "action-done" signal to notify when an action is
  done
1544 - scenario: Add a "run-command" action type
1545 - scenario: Allow forcing running action on idle from scenario file
1546 - scenario: Allow iterating over arrays in foreach
1547 - scenario: Rename ‘interlaced’ action to ‘non-blocking’
1548 - scenario: Add a non-blocking flag to the wait signal
1550 GStreamer Python Bindings
1552 - Fixes for Python 3.10
1553 - Various build fixes
- At least one known breaking change caused by g-i annotation changes
  (see below)
1557 GStreamer C# Bindings
1559 - Fix GstDebugGraphDetails enum
- Updated to the latest GtkSharp
1561 - Updated to include GStreamer 1.20 API
1563 GStreamer Rust Bindings and Rust Plugins
- The GStreamer Rust bindings are released separately with a different
  release cadence that’s tied to gtk-rs, but the latest release has
  already been updated for the upcoming new GStreamer 1.20 API (v1_20
  feature).
- gst-plugins-rs, the module containing GStreamer plugins written in
  Rust, has also seen lots of activity with many new elements and
  plugins. See the New Elements section above for a list of new Rust
  elements.
1575 Build and Dependencies
1577 - Meson 0.59 or newer is required to build GStreamer now.
- The GLib requirement has been bumped to GLib 2.56 or newer (from
  GLib 2.44).
1582 - The wpe plugin now requires wpe >= 2.28 and wpebackend-fdo >= 1.8
1584 Explicit opt-in required for build of certain plugins with (A)GPL dependencies
1586 Some plugins have GPL- or AGPL-licensed dependencies and those plugins
1587 will no longer be built by default unless you have explicitly opted in
1588 to allow (A)GPL-licensed dependencies by passing -Dgpl=enabled to Meson,
1589 even if the required dependencies are available.
1591 See Building plugins with (A)GPL-licensed dependencies for more details
1592 and a non-exhaustive list of plugins affected.
1594 gst-build: replaced by mono repository
1596 See mono repository section above and the GStreamer mono repository FAQ.

Cerbero

Cerbero is a meta build system used to build GStreamer plus dependencies
1601 on platforms where dependencies are not readily available, such as
1602 Windows, Android, iOS and macOS.
1604 General Cerbero improvements
1606 - Plugin removed: libvisual
1607 - New plugins: rtpmanagerbad and rist
1609 macOS / iOS specific Cerbero improvements
1612 - macOS OS release support is now future-proof, similar to iOS
1613 - macOS Apple Silicon (ARM64) cross-compile support has been added
1614 - macOS Apple Silicon (ARM64) native support is currently experimental
1616 Windows specific Cerbero improvements
1618 - Visual Studio 2022 support has been added
- bootstrap is faster since it requires building fewer build-tools
  recipes
1621 - package is faster due to better scheduling of recipe stages and
1622 elimination of unnecessary autotools regeneration
1623 - The following plugins are no longer built on Windows:
1624 - a52dec (another decoder is still available in libav)
1628 Windows MSI installer
1632 Linux specific Cerbero improvements
- Fedora and Debian OS release support is now more future-proof
1635 - Amazon Linux 2 support has been added
1637 Android specific Cerbero improvements
1641 Platform-specific changes and improvements

macOS and iOS

- applemedia: add ProRes support to vtenc and vtdec

Windows

- On Windows the high-resolution clock is enabled now in the
1654 gst-launch-1.0 and gst-play-1.0 command line tools, which provides
1655 better clock and timer performance on Windows, at the cost of higher
1656 power consumption. By default, without the high-resolution clock
1657 enabled, the timer precision on Windows is system-dependent and may
1658 be as bad as 15ms which is not good enough for many multimedia
1659 applications. Developers may want to do the same in their Windows
1660 applications if they think it’s a good idea for their application
1661 use case, and depending on the Windows version they target. This is
1662 not done automatically by GStreamer because on older Windows
1663 versions (pre-Windows 10) this affects a global Windows setting and
1664 also there’s a power consumption vs. performance trade-off that may
1665 differ from application to application.
1667 - dxgiscreencapsrc now supports resolution changes
1669 - The wasapi2 audio plugin was rewritten and now has a higher rank
1670 than the old wasapi plugin since it has a number of additional
1671 features such as automatic stream routing, and no
  known-but-hard-to-fix issues. The plugin is now always built if the
  Windows 10 SDK is available.
- The wasapi device providers now detect and notify dynamic device
  changes
1678 - d3d11screencapturesrc: new desktop capture element, including
1679 GstDeviceProvider implementation to enumerate/select target monitors
1682 - Direct3D11/DXVA decoder now supports AV1 and MPEG2 codecs
1683 (d3d11av1dec, d3d11mpeg2dec)
1685 - VP9 decoding got more reliable and stable thanks to a newly written
1688 - Support for decoding interlaced H.264/AVC streams
1690 - Hardware-accelerated video deinterlacing (d3d11deinterlace) and
1691 video mixing (d3d11compositor)
1693 - Video mixing with the Direct3D11 API (d3d11compositor)
1695 - MediaFoundation API based hardware encoders gained the ability to
1696 receive Direct3D11 textures as an input
1698 - Seungha’s blog post “GStreamer ❤ Windows: A primer on the cool stuff
1699 you’ll find in the 1.20 release” describes many of the
1700 Windows-related improvements in more detail

Linux

- bluez: LDAC Bluetooth audio codec support in a2dpsink and avdtpsink,
  as well as an LDAC RTP payloader (rtpldacpay) and an LDAC audio
  encoder
1708 - kmssink: gained support for NV24, NV61, RGB16/BGR16 formats;
1709 auto-detect NVIDIA Tegra driver
1711 Documentation improvements
- hardware-accelerated GPU plugins no longer always list all the
  element variants for all available GPUs, since those are
  system-dependent and it’s confusing for users to see those in the
  documentation just because the GStreamer developer who generated the
  docs had multiple GPUs to play with at the time. Instead only the
  default elements are shown.
1720 Possibly Breaking and Other Noteworthy Behavioural Changes
1722 - gst_parse_launch(), gst_parse_bin_from_description() and friends
1723 will now error out when setting properties that don’t exist on
1724 top-level bins. They were silently ignored before.
1726 - The GstWebRTC library does not expose any objects anymore with
1727 public fields. Instead properties have been added to replace that
1728 functionality. If you are accessing such fields in your application,
1729 switch to the corresponding properties.
1731 - playbin and uridecodebin now emit the source-setup signal before the
1732 element is added to the bin and linked so that the source element is
1733 already configured before any scheduling query comes in, which is
1734 useful for elements such as appsrc or giostreamsrc.
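
For example, an application could configure the source element from
the signal like this (playbin being the element the signal is
connected on; the property set here is purely illustrative and only
applied when the source actually has it):

    static void
    on_source_setup (GstElement * bin, GstElement * source,
        gpointer user_data)
    {
      /* Called before the source is added and linked, so properties
       * that influence scheduling can still take effect. */
      if (g_object_class_find_property (G_OBJECT_GET_CLASS (source),
              "user-agent"))
        g_object_set (source, "user-agent", "my-app/1.0", NULL);
    }

    g_signal_connect (playbin, "source-setup",
        G_CALLBACK (on_source_setup), NULL);
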
1736 - The source element inside urisourcebin (used inside uridecodebin3
1737 which is used inside playbin3) is no longer called "source". This
1738 shouldn’t affect anyone hopefully, because there’s a "setup-source"
1739 signal to configure the source element and no one should rely on
1740 names of internal elements anyway.
1742 - The vp8enc element now expects bps (bits per second) for the
1743 "temporal-scalability-target-bitrate" property, which is consistent
1744 with the "target-bitrate" property. Since additional configuration
1745 is required with modern libvpx to make temporal scaling work anyway,
  chances are that very few people will have been using this property.
- vp8enc and vp9enc now default to “good quality” for the "deadline"
  property rather than “best quality”. Having the deadline set to best
  quality causes the encoder to be absurdly slow; most real-life users
  will want the good quality tradeoff instead.
1753 - The experimental GstTranscoder library API in gst-plugins-bad was
1754 changed from a GObject signal-based notification mechanism to a
1755 GstBus/message-based mechanism akin to GstPlayer/GstPlay.
1757 - MPEG-TS SCTE-35 API: semantic change for SCTE-35 splice commands:
1758 timestamps passed by the application should be in running time now,
  since users of the API can’t really be expected to predict the local
  timestamps inside the muxer.
1762 - The GstContext used by souphttpsrc to share the session between
1763 multiple element instances has changed. Previously it provided
1764 direct access to the internal SoupSession object, now it only
1765 provides access to an opaque, internal type. This change is
1766 necessary because SoupSession is not thread-safe at all and can’t be
1767 shared safely between arbitrary external code and souphttpsrc.
1769 - Python bindings: GObject-introspection related Annotation fixes have
1770 led to a case of a GstVideo.VideoInfo-related function signature
1771 changing in the Python bindings (possibly one or two other cases
  too). This is for a function that should never have been exposed in
  the first place though, so the bindings are being updated to throw
  an exception in that case, and the correct replacement API has been
  added in the form of an override.

Known Issues

- nothing in particular at this point (but also see the possibly
  breaking changes section above)

Contributors

Aaron Boxer, Adam Leppky, Adam Williamson, Alba Mendez, Alejandro
1785 González, Aleksandr Slobodeniuk, Alexander Vandenbulcke, Alex Ashley,
1786 Alicia Boya García, Andika Triwidada, Andoni Morales Alastruey, Andrew
1787 Wesie, Andrey Moiseev, Antonio Ospite, Antonio Rojas, Arthur Crippa
1788 Búrigo, Arun Raghavan, Ashley Brighthope, Axel Kellermann, Baek, Bastien
1789 Nocera, Bastien Reboulet, Benjamin Gaignard, Bing Song, Binh Truong,
1790 Biswapriyo Nath, Brad Hards, Brad Smith, Brady J. Garvin, Branko
1791 Subasic, Camilo Celis Guzman, Chris Bass, ChrisDuncanAnyvision, Chris
1792 White, Corentin Damman, Daniel Almeida, Daniel Knobe, Daniel Stone,
1793 david, David Fernandez, David Keijser, David Phung, Devarsh Thakkar,
1794 Dinesh Manajipet, Dmitry Samoylov, Dmitry Shusharin, Dominique Martinet,
1795 Doug Nazar, Ederson de Souza, Edward Hervey, Emmanuel Gil Peyrot,
1796 Enrique Ocaña González, Ezequiel Garcia, Fabian Orccon, Fabrice
1797 Fontaine, Fernando Jimenez Moreno, Florian Karydes, Francisco Javier
1798 Velázquez-García, François Laignel, Frederich Munch, Fredrik Pålsson,
1799 George Kiagiadakis, Georg Lippitsch, Göran Jönsson, Guido Günther,
1800 Guillaume Desmottes, Guiqin Zou, Haakon Sporsheim, Haelwenn (lanodan)
1801 Monnier, Haihao Xiang, Haihua Hu, Havard Graff, He Junyan, Helmut
1802 Januschka, Henry Wilkes, Hosang Lee, Hou Qi, Ignacio Casal Quinteiro,
1803 Igor Kovalenko, Ilya Kreymer, Imanol Fernandez, Jacek Tomaszewski, Jade
1804 Macho, Jakub Adam, Jakub Janků, Jan Alexander Steffens (heftig), Jan
1805 Schmidt, Jason Carrete, Jason Pereira, Jay Douglass, Jeongki Kim, Jérôme
1806 Laheurte, Jimmi Holst Christensen, Johan Sternerup, John Hassell, John
1807 Lindgren, John-Mark Bell, Jonathan Matthew, Jordan Petridis, Jose
1808 Quaresma, Julian Bouzas, Julien, Kai Uwe Broulik, Kasper Steensig
1809 Jensen, Kellermann Axel, Kevin Song, Khem Raj, Knut Inge Hvidsten, Knut
1810 Saastad, Kristofer Björkström, Lars Lundqvist, Lawrence Troup, Lim Siew
1811 Hoon, Lucas Stach, Ludvig Rappe, Luis Paulo Fernandes de Barros, Luke
1812 Yelavich, Mads Buvik Sandvei, Marc Leeman, Marco Felsch, Marek Vasut,
1813 Marian Cichy, Marijn Suijten, Marius Vlad, Markus Ebner, Mart Raudsepp,
1814 Matej Knopp, Mathieu Duponchelle, Matthew Waters, Matthieu De Beule,
1815 Mengkejiergeli Ba, Michael de Gans, Michael Olbrich, Michael Tretter,
1816 Michal Dzik, Miguel Paris, Mikhail Fludkov, mkba, Nazar Mokrynskyi,
1817 Nicholas Jackson, Nicola Murino, Nicolas Dufresne, Niklas Hambüchen,
1818 Nikolay Sivov, Nirbheek Chauhan, Olivier Blin, Olivier Crete, Olivier
1819 Crête, Paul Goulpié, Per Förlin, Peter Boba, P H, Philippe Normand,
1820 Philipp Zabel, Pieter Willem Jordaan, Piotrek Brzeziński, Rafał
1821 Dzięgiel, Rafostar, raghavendra, Raghavendra, Raju Babannavar, Raleigh
1822 Littles III, Randy Li, Randy Li (ayaka), Ratchanan Srirattanamet, Raul
1823 Tambre, reed.lawrence, Ricky Tang, Robert Rosengren, Robert Swain, Robin
1824 Burchell, Roman Sivriver, R S Nikhil Krishna, Ruben Gonzalez, Ruslan
1825 Khamidullin, Sanchayan Maity, Scott Moreau, Sebastian Dröge, Sergei
1826 Kovalev, Seungha Yang, Sid Sethupathi, sohwan.park, Sonny Piers, Staz M,
1827 Stefan Brüns, Stéphane Cerveau, Stephan Hesse, Stian Selnes, Stirling
1828 Westrup, Théo MAILLART, Thibault Saunier, Tim, Timo Wischer, Tim-Philipp
1829 Müller, Tim Schneider, Tobias Ronge, Tom Schoonjans, Tulio Beloqui,
1830 tyler-aicradle, U. Artie Eoff, Ung, Val Doroshchuk, VaL Doroshchuk,
1831 Víctor Manuel Jáquez Leal, Vivek R, Vivia Nikolaidou, Vivienne
1832 Watermeier, Vladimir Menshakov, Will Miller, Wim Taymans, Xabier
1833 Rodriguez Calvar, Xavier Claessens, Xℹ Ruoyao, Yacine Bandou, Yinhang
1834 Liu, youngh.lee, youngsoo.lee, yychao, Zebediah Figura, Zhang yuankun,
Zhang Yuankun, Zhao, Zhao Zhili, Aleksandar Topic, Antonio Ospite,
1836 Bastien Nocera, Benjamin Gaignard, Brad Hards, Carlos Falgueras García,
1837 Célestin Marot, Corentin Damman, Corentin Noël, Daniel Almeida, Daniel
1838 Knobe, Danny Smith, Dave Piché, Dmitry Osipenko, Fabrice Fontaine,
1839 fjmax, Florian Zwoch, Guillaume Desmottes, Haihua Hu, Heinrich Kruger,
1840 He Junyan, Jakub Adam, James Cowgill, Jan Alexander Steffens (heftig),
1841 Jean Felder, Jeongki Kim, Jiri Uncovsky, Joe Todd, Jordan Petridis,
1842 Krystian Wojtas, Marc-André Lureau, Marcin Kolny, Marc Leeman, Mark
1843 Nauwelaerts, Martin Reboredo, Mathieu Duponchelle, Matthew Waters,
1844 Mengkejiergeli Ba, Michael Gruner, Nicolas Dufresne, Nirbheek Chauhan,
1845 Olivier Crête, Philippe Normand, Rafał Dzięgiel, Ralf Sippl, Robert
1846 Mader, Sanchayan Maity, Sangchul Lee, Sebastian Dröge, Seungha Yang,
1847 Stéphane Cerveau, Teh Yule Kim, Thibault Saunier, Thomas Klausner, Timo
1848 Wischer, Tim-Philipp Müller, Tobias Reineke, Tomasz Andrzejak, Trung Do,
1849 Tyler Compton, Ung, Víctor Manuel Jáquez Leal, Vivia Nikolaidou, Wim
1850 Taymans, wngecn, Wonchul Lee, wuchang li, Xavier Claessens, Xi Ruoyao,
1851 Yoshiharu Hirose, Zhao,
1853 … and many others who have contributed bug reports, translations, sent
1854 suggestions or helped testing.
After the 1.20.0 release there will be several 1.20.x bug-fix releases
which will contain bug fixes that have been deemed suitable for a
stable branch; usually no new features or intrusive changes will be
added in a bug-fix release. The 1.20.x bug-fix releases will be made
from the git 1.20 branch, which will be a stable branch.
1866 1.20.0 is scheduled to be released around early February 2022.
1870 Our next major feature release will be 1.22, and 1.21 will be the
1871 unstable development version leading up to the stable 1.22 release. The
1872 development of 1.21/1.22 will happen in the git main branch.
1874 The plan for the 1.22 development cycle is yet to be confirmed. Assuming
1875 no major project-wide reorganisations in the 1.22 cycle we might try and
1876 aim for a release around August 2022.
1878 1.22 will be backwards-compatible to the stable 1.20, 1.18, 1.16, 1.14,
1879 1.12, 1.10, 1.8, 1.6, 1.4, 1.2 and 1.0 release series.
1881 ------------------------------------------------------------------------
1883 These release notes have been prepared by Tim-Philipp Müller with
1884 contributions from Matthew Waters, Nicolas Dufresne, Nirbheek Chauhan,
1885 Sebastian Dröge and Seungha Yang.
1887 License: CC BY-SA 4.0