===============================================================
- Subtitle overlays, hardware-accelerated decoding and playbin2
+ Subtitle overlays, hardware-accelerated decoding and playbin
===============================================================
Status: EARLY DRAFT / BRAINSTORMING
-The following text will use "playbin" synonymous with "playbin2".
-
=== 1. Background ===
Subtitles can be muxed in containers or come from an external source.
is only a small rectangle at the bottom. This wastes memory and CPU.
We could do something better by introducing a new format that only
encodes the region(s) of interest, but we don't have such a format yet, and
-are not necessarily keen to rewrite this part of the logic in playbin2
+are not necessarily keen to rewrite this part of the logic in playbin
at this point - and we can't change existing elements' behaviour, so we would
need to introduce new elements for this.
It's much better to upload the much smaller encoded data to the GPU/DSP
and then leave it there until rendered.
-Currently playbin2 only supports subtitles on top of raw decoded video.
+Currently playbin only supports subtitles on top of raw decoded video.
It will try to find a suitable overlay element from the plugin registry
based on the input subtitle caps and the rank. (It is assumed that we
will be able to convert any raw video format into any format required
by the overlay element.)
It will not render subtitles if the video sent to the sink is not
raw YUV or RGB or if conversions have been disabled by setting the
-native-video flag on playbin2.
+native-video flag on playbin.
Subtitle rendering is considered an important feature. Enabling
hardware-accelerated decoding by default should not lead to a major
to buffers seems more intuitive than sending it interleaved as
events. And buffers stored or passed around (e.g. via the
"last-buffer" property in the sink when doing screenshots via
- playbin2) always contain all the information needed.
+ playbin) always contain all the information needed.
(d) create a video/x-raw-*-delta format and use a backend-specific videomixer
in the video backend plugin. It would also add a concept that
might be generally useful (think ximagesrc capture with xdamage).
However, it would require adding foorender variants of all the
- existing overlay elements, and changing playbin2 to that new
+ existing overlay elements, and changing playbin to that new
design, which is somewhat intrusive. And given the general
nature of such a new format/API, we would need to take a lot
of care to be able to accommodate all possible use cases when
-playbin2
+playbin
--------
The purpose of this element is to decode and render the media contained in a
given uri.
- allows for configuration of the visualisation element.
- allows for enabling/disabling of visualisation, audio and video.
-* playbin2
+* playbin
- combination of one or more uridecodebin elements to read the uri and subtitle
uri.
Gapless playback
----------------
-playbin2 has an "about-to-finish" signal. The application should configure a new
+playbin has an "about-to-finish" signal. The application should configure a new
uri (and optional suburi) in the callback. When the current media finishes, this
new media will be played next.