The Phonon design is based around forming graphs using 3 basic components:

- a source component that generates raw audio/video/subtitle data, aka
  a MediaObject.
- an effect component that applies effects to raw audio/video, known as
  AudioPath/VideoPath respectively. Subtitles are routed to a VideoPath.
- output components that render audio or video, called AudioOutput and
  VideoOutput respectively.

There is also a special input object that allows for feeding raw data into the
pipeline, and specialized sinks to retrieve audio samples and video frames from
the pipeline.

A typical graph for a source that produces an audio and a video stream that
need to be played. The VideoPath and AudioPath typically contain no filters.

+-------------+     +-----------+     +-------------+
| MediaObject |--+->| VideoPath |---->| VideoOutput |
+-------------+  |  +-----------+     +-------------+
                 |
                 |  +-----------+     +-------------+
                 +->| AudioPath |---->| AudioOutput |
                    +-----------+     +-------------+

- This is very similar to a regular GStreamer playback pipeline.

A typical graph for playing and crossfading two sources:

+-------------+     +-----------+
| MediaObject |---->| AudioPath |\
+-------------+     +-----------+ \    +-------------+
                                   --->| AudioOutput |
+-------------+     +-----------+ /    +-------------+
| MediaObject |---->| AudioPath |/
+-------------+     +-----------+

- As soon as two audio paths are connected to one sink, the input signals are
  mixed before being sent to the sink. The mixing is typically done in the
  audio sink by an element such as adder.
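
(Illustration only, not part of the original notes.) A minimal sketch of the
adder-based mixing against the GStreamer 0.10-era API, with audiotestsrc and
autoaudiosink standing in for the real sources and sink:

  // Sketch: two sources mixed into one sink by the adder element.
  #include <gst/gst.h>

  int main(int argc, char *argv[])
  {
    gst_init(&argc, &argv);

    GstElement *pipeline = gst_pipeline_new("mixing");
    GstElement *src1  = gst_element_factory_make("audiotestsrc", "src1");
    GstElement *src2  = gst_element_factory_make("audiotestsrc", "src2");
    GstElement *mixer = gst_element_factory_make("adder", "mixer");
    GstElement *sink  = gst_element_factory_make("autoaudiosink", "sink");

    gst_bin_add_many(GST_BIN(pipeline), src1, src2, mixer, sink, NULL);
    // gst_element_link() requests a new sink pad on adder for each source.
    gst_element_link(src1, mixer);
    gst_element_link(src2, mixer);
    gst_element_link(mixer, sink);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    GMainLoop *loop = g_main_loop_new(NULL, FALSE);
    g_main_loop_run(loop);
    return 0;
  }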

Other types of graphs are possible too:

                    +-----------+
                   /| AudioPath |\
+-------------+   / +-----------+ \   +-------------+
| MediaObject |--+                 +->| AudioOutput |
+-------------+   \ +-----------+ /   +-------------+
                   \| AudioPath |/
                    +-----------+

- This graph sends the same data to 2 effect filter graphs and then mixes it
  into an audio output. The splitting of the graph typically happens with a
  tee element after the media object.
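
A hedged sketch of that split/mix topology (again 0.10-era, not from the
original text); identity stands in for the two effect graphs and the queue
elements are an assumption needed to decouple the branches:

  // Sketch: one source split with tee, two branches, mixed back with adder.
  #include <gst/gst.h>

  int main(int argc, char *argv[])
  {
    gst_init(&argc, &argv);

    GstElement *pipe  = gst_pipeline_new("split-mix");
    GstElement *src   = gst_element_factory_make("audiotestsrc", "src");
    GstElement *tee   = gst_element_factory_make("tee", "split");
    GstElement *q1    = gst_element_factory_make("queue", "q1");
    GstElement *q2    = gst_element_factory_make("queue", "q2");
    GstElement *fx1   = gst_element_factory_make("identity", "path1");
    GstElement *fx2   = gst_element_factory_make("identity", "path2");
    GstElement *mixer = gst_element_factory_make("adder", "mixer");
    GstElement *sink  = gst_element_factory_make("autoaudiosink", "sink");

    gst_bin_add_many(GST_BIN(pipe), src, tee, q1, fx1, q2, fx2, mixer, sink, NULL);
    gst_element_link(src, tee);
    gst_element_link_many(tee, q1, fx1, mixer, NULL);   // effect branch 1
    gst_element_link_many(tee, q2, fx2, mixer, NULL);   // effect branch 2
    gst_element_link(mixer, sink);

    gst_element_set_state(pipe, GST_STATE_PLAYING);
    g_main_loop_run(g_main_loop_new(NULL, FALSE));
    return 0;
  }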

1) do the following chains run
   - synchronized with a shared clock?

+-------------+    +-----------+    +-------------+
| MediaObject |--->| AudioPath |--->| AudioOutput |
+-------------+    +-----------+    +-------------+

+-------------+    +-----------+    +-------------+
| MediaObject |--->| VideoPath |--->| VideoOutput |
+-------------+    +-----------+    +-------------+

- There is no API to set both MediaObjects to play atomically, so it is assumed
  that playback starts and follows the rate of the global clock as soon as
  each MediaObject is set to play. This makes unconnected chains run as if
  they were in different GstPipelines.
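
(Side note, my illustration rather than part of the design.) GStreamer does
allow separate pipelines to be slaved to one clock; a minimal 0.10-era sketch
of forcing two pipelines onto the shared system clock:

  // Sketch: make two otherwise independent pipelines use one clock so their
  // playback rates stay in step. A shared clock alone does not align base
  // times; starting both at the same running time still needs coordination.
  #include <gst/gst.h>

  static void share_clock(GstElement *audio_pipeline, GstElement *video_pipeline)
  {
    GstClock *clock = gst_system_clock_obtain();

    gst_pipeline_use_clock(GST_PIPELINE(audio_pipeline), clock);
    gst_pipeline_use_clock(GST_PIPELINE(video_pipeline), clock);

    gst_object_unref(clock);
  }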

- Can signals be emitted from any thread?
- What operations are permitted from a signal handler?

- How does error reporting work?
  * an audio/video device/port is busy.
  * a fatal decoding error occurred.
  * a media type is not supported.
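
As a sketch of how those cases could surface (an assumption about the eventual
implementation, not something this document specifies): errors arrive as
GST_MESSAGE_ERROR on the pipeline bus and can be classified by GError
domain/code (0.10-era API):

  // Sketch: classify bus errors into the cases listed above.
  #include <gst/gst.h>

  static gboolean bus_cb(GstBus *bus, GstMessage *msg, gpointer user_data)
  {
    if (GST_MESSAGE_TYPE(msg) == GST_MESSAGE_ERROR) {
      GError *err = NULL;
      gchar *debug = NULL;
      gst_message_parse_error(msg, &err, &debug);

      if (err->domain == GST_RESOURCE_ERROR &&
          err->code == GST_RESOURCE_ERROR_BUSY) {
        // audio/video device or port is busy
      } else if (err->domain == GST_STREAM_ERROR &&
                 (err->code == GST_STREAM_ERROR_CODEC_NOT_FOUND ||
                  err->code == GST_STREAM_ERROR_TYPE_NOT_FOUND)) {
        // media type is not supported
      } else if (err->domain == GST_STREAM_ERROR &&
                 err->code == GST_STREAM_ERROR_DECODE) {
        // fatal decoding error
      }
      g_error_free(err);
      g_free(debug);
    }
    return TRUE;
  }

  // Installed with something like:
  //   gst_bus_add_watch(gst_pipeline_get_bus(GST_PIPELINE(pipeline)), bus_cb, NULL);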

- Setting up a KDE and Phonon build environment
- Testing, identifying test applications, building test cases
- Asking questions to Phonon maintainers/designers

These classes are essential to implement a backend and should be implemented
first.

Phonon::BackendCapabilities

Mostly exposes features found in the GStreamer registry, such as the available
decoders and effects.
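
A rough sketch of what such a registry query could look like (my illustration;
the Phonon-facing API is not shown and the filtering rule on the klass string
is an assumption):

  // Sketch: list decoder element factories from the GStreamer 0.10 registry.
  #include <gst/gst.h>
  #include <cstring>

  static void list_decoders(void)
  {
    GList *features = gst_registry_get_feature_list(gst_registry_get_default(),
                                                    GST_TYPE_ELEMENT_FACTORY);

    for (GList *l = features; l != NULL; l = l->next) {
      GstElementFactory *factory = GST_ELEMENT_FACTORY(l->data);
      const gchar *klass = gst_element_factory_get_klass(factory);

      // Decoders advertise a klass such as "Codec/Decoder/Audio".
      if (klass != NULL && std::strstr(klass, "Decoder") != NULL)
        g_print("%s (%s)\n",
                gst_plugin_feature_get_name(GST_PLUGIN_FEATURE(factory)), klass);
    }
    gst_plugin_feature_list_free(features);
  }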

Entry point for the GStreamer backend. Provides methods to create instances of
objects from our backend.

The following classes need to be implemented in order to have simple playback
capabilities from the backend.

Phonon::AudioOutput

- Wrapper around audio sinks. Also needs provision for rate and format
  conversion.
- Mixing capabilities in the case when 2 audio paths are routed to it.
  * is the volume related to the device or to the connection to the device?
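
One possible shape of that wrapper, sketched here as an assumption (element
choices are mine): a bin that mixes, converts rate/format, applies volume and
feeds the actual sink. In this shape the volume element sits after the mixer,
i.e. it belongs to the device rather than to each connection; moving it before
adder would give per-connection volume instead.

  // Sketch: mix -> convert -> resample -> volume -> sink (0.10-era elements).
  #include <gst/gst.h>

  static GstElement *make_audio_output(void)
  {
    GstElement *bin      = gst_bin_new("audio-output");
    GstElement *mixer    = gst_element_factory_make("adder", NULL);
    GstElement *convert  = gst_element_factory_make("audioconvert", NULL);
    GstElement *resample = gst_element_factory_make("audioresample", NULL);
    GstElement *volume   = gst_element_factory_make("volume", NULL);
    GstElement *sink     = gst_element_factory_make("autoaudiosink", NULL);

    gst_bin_add_many(GST_BIN(bin), mixer, convert, resample, volume, sink, NULL);
    gst_element_link_many(mixer, convert, resample, volume, sink, NULL);

    // Expose one mixer request pad as the bin's sink pad; further pads can
    // be requested and ghosted when more audio paths are connected.
    GstPad *pad = gst_element_get_request_pad(mixer, "sink%d");
    gst_element_add_pad(bin, gst_ghost_pad_new("sink", pad));
    gst_object_unref(pad);

    return bin;
  }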

Phonon::VideoWidget

- Wrapper around video sinks. Also needs provision for colorspace and size
  conversions. Extends QWidget and probably needs to hook into the XOverlay
  interface to draw into the Qt widget. Supports fullscreen mode with a switch.
- Needs mixing capabilities in the case when 2 video paths are routed to it.
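
A hedged sketch of the XOverlay hookup, assuming the 0.10-era GstXOverlay
interface, an X11 build of Qt and xvimagesink; the widget class itself is only
an illustration:

  // Sketch: let an XOverlay-capable sink draw into a QWidget's X window.
  // gst_init() is assumed to have been called already.
  #include <QWidget>
  #include <gst/gst.h>
  #include <gst/interfaces/xoverlay.h>

  class VideoWidget : public QWidget
  {
  public:
      explicit VideoWidget(QWidget *parent = 0)
          : QWidget(parent)
      {
          m_sink = gst_element_factory_make("xvimagesink", "video-sink");
          // Hand our window id to the sink so it renders inside this widget.
          gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(m_sink), winId());
      }

      GstElement *sinkElement() const { return m_sink; }

  private:
      GstElement *m_sink;
  };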

Phonon::AbstractMediaProducer

- contains stream selection
- periodically performs tick callbacks.
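
One way to realise the tick callbacks (an assumed approach, not prescribed by
this document) is a QTimer that periodically queries the pipeline position
(0.10-style query signature):

  // Sketch: emit a tick with the current position every interval.
  #include <QObject>
  #include <QTimer>
  #include <gst/gst.h>

  class TickSource : public QObject
  {
      Q_OBJECT
  public:
      TickSource(GstElement *pipeline, int intervalMs, QObject *parent = 0)
          : QObject(parent), m_pipeline(pipeline)
      {
          connect(&m_timer, SIGNAL(timeout()), this, SLOT(emitTick()));
          m_timer.start(intervalMs);
      }

  signals:
      void tick(qint64 positionMs);

  private slots:
      void emitTick()
      {
          GstFormat fmt = GST_FORMAT_TIME;
          gint64 pos = 0;
          if (gst_element_query_position(m_pipeline, &fmt, &pos))
              emit tick(pos / GST_MSECOND);
      }

  private:
      QTimer m_timer;
      GstElement *m_pipeline;
  };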

Phonon::MediaObject

- The object that decodes the media into raw audio/video/subtitle data.
  This object will use the GStreamer decodebin element to perform the
  typefinding and decoding.
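
A minimal sketch of that decodebin usage (illustrative; the audio-path target
and the lack of caps checking are simplifications): decoded pads appear
dynamically and are linked as they show up.

  // Sketch: filesrc ! decodebin, linking decoded pads when they appear.
  #include <gst/gst.h>

  static void on_pad_added(GstElement *decodebin, GstPad *new_pad, gpointer user_data)
  {
    GstElement *audio_path = GST_ELEMENT(user_data);
    GstPad *sink_pad = gst_element_get_static_pad(audio_path, "sink");

    if (!gst_pad_is_linked(sink_pad))
      gst_pad_link(new_pad, sink_pad);   // real code must inspect the caps first

    gst_object_unref(sink_pad);
  }

  static void build_media_object(GstElement *pipeline, const char *path,
                                 GstElement *audio_path)
  {
    GstElement *src = gst_element_factory_make("filesrc", "source");
    GstElement *dec = gst_element_factory_make("decodebin", "decoder");

    g_object_set(src, "location", path, NULL);
    gst_bin_add_many(GST_BIN(pipeline), src, dec, NULL);
    gst_element_link(src, dec);

    // decodebin does the typefinding and plugs decoders; decoded pads are
    // added later, which fires the element's "pad-added" signal.
    g_signal_connect(dec, "pad-added", G_CALLBACK(on_pad_added), audio_path);
  }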

Phonon::AudioPath/Phonon::VideoPath

- Simple container for audio/video effect plugins.
- Handles adding/removing of effects, making sure that the streaming is not
  interrupted and the formats are all compatible.
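
To keep streaming uninterrupted while an effect is inserted, the usual 0.10-era
technique is to block the pad upstream of the insertion point, relink, then
unblock. A simplified sketch (the surrounding path structure and the lack of
error handling are assumptions; GStreamer 1.0 would use pad probes instead):

  // Sketch: insert an effect between "before" and "after" without stopping
  // the pipeline.
  #include <gst/gst.h>

  struct InsertCtx { GstElement *before, *effect, *after, *bin; };

  static void blocked_cb(GstPad *pad, gboolean blocked, gpointer user_data)
  {
    InsertCtx *ctx = static_cast<InsertCtx *>(user_data);
    if (!blocked)
      return;                       // called again once the pad is unblocked

    gst_element_unlink(ctx->before, ctx->after);
    gst_bin_add(GST_BIN(ctx->bin), ctx->effect);
    gst_element_link_many(ctx->before, ctx->effect, ctx->after, NULL);
    gst_element_sync_state_with_parent(ctx->effect);

    gst_pad_set_blocked_async(pad, FALSE, blocked_cb, ctx);   // resume dataflow
  }

  static void insert_effect(InsertCtx *ctx)
  {
    GstPad *src = gst_element_get_static_pad(ctx->before, "src");
    gst_pad_set_blocked_async(src, TRUE, blocked_cb, ctx);
    gst_object_unref(src);
  }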

Phonon::Visualization

Connects an AudioPath to a VideoWidget and allows for selection of a
visualisation plugin.
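
For reference (illustration only), visualisation elements in GStreamer turn
audio into video frames; goom is used here merely as an example plugin:

  // Sketch: audio -> visualisation -> colorspace, to sit between an AudioPath
  // and the VideoWidget (0.10-era element names).
  #include <gst/gst.h>

  static GstElement *make_visualization_bin(void)
  {
    GstElement *bin     = gst_bin_new("visualization");
    GstElement *convert = gst_element_factory_make("audioconvert", NULL);
    GstElement *vis     = gst_element_factory_make("goom", NULL);
    GstElement *color   = gst_element_factory_make("ffmpegcolorspace", NULL);

    gst_bin_add_many(GST_BIN(bin), convert, vis, color, NULL);
    gst_element_link_many(convert, vis, color, NULL);
    // Ghost pads ("sink" towards the audio side, "src" towards the video
    // sink) would be added here.
    return bin;
  }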

Phonon::AudioEffect/Phonon::VideoEffect

Phonon::VolumeFaderEffect

Allows fade-in and fade-out with a configurable curve and time. Needs
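
One possible (assumed) way to drive such a fade is to ramp the volume
element's property from the Qt side, with QTimeLine supplying the curve:

  // Sketch: fade a GStreamer "volume" element with a QTimeLine.
  #include <QObject>
  #include <QTimeLine>
  #include <gst/gst.h>

  class VolumeFader : public QObject
  {
      Q_OBJECT
  public:
      VolumeFader(GstElement *volumeElement, QObject *parent = 0)
          : QObject(parent), m_volume(volumeElement), m_line(1000, this)
      {
          m_line.setCurveShape(QTimeLine::EaseInOutCurve);
          connect(&m_line, SIGNAL(valueChanged(qreal)), this, SLOT(apply(qreal)));
      }

      void fadeTo(double target, int durationMs)
      {
          g_object_get(m_volume, "volume", &m_start, NULL);
          m_target = target;
          m_line.stop();
          m_line.setDuration(durationMs);
          m_line.start();
      }

  private slots:
      void apply(qreal t)
      {
          g_object_set(m_volume, "volume", m_start + (m_target - m_start) * t, NULL);
      }

  private:
      GstElement *m_volume;
      QTimeLine m_line;
      double m_start, m_target;
  };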

Phonon::BrightnessControl

Controls the brightness of video.
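
A tiny sketch of one candidate implementation (assumed, not specified here):
the videobalance element already exposes a brightness property in the
-1.0 .. 1.0 range and could simply be placed in the VideoPath:

  // Sketch: brightness control via videobalance.
  #include <gst/gst.h>

  static GstElement *make_brightness_filter(double brightness)
  {
    GstElement *balance = gst_element_factory_make("videobalance", "brightness");
    g_object_set(balance, "brightness", brightness, NULL);
    return balance;
  }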

?? don't know yet where this fits in.

Synchronized audio and video capture.

Phonon::ByteStream

Feed raw data into the pipeline. Used for streaming network access.

Possibly a specialized source element connected to a decodebin.

* Phonon::ByteStream::writeData

* Phonon::ByteStream::setStreamSeekable
  - If called before starting the ByteStream, decodebin might operate in
    pull-based mode when supported. Otherwise the source is activated in
    push mode.
  - If called after starting the ByteStream, the Phonon::ByteStream::seekStream
    signal can be emitted for push-based seekable streams.

* Can the signals be emitted from a streaming thread?
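
One candidate for that specialized source element is appsrc (assuming it is
available in the installed GStreamer); a hedged sketch of pushing the bytes the
application hands us into the pipeline:

  // Sketch: feed ByteStream data into an appsrc that goes into decodebin.
  // 0.10-era buffer macros; error handling omitted.
  #include <gst/gst.h>
  #include <gst/app/gstappsrc.h>
  #include <cstring>

  static GstElement *g_appsrc;   // created once, e.g. when the ByteStream is set up

  static void setup_bytestream_source(GstElement *pipeline, GstElement *decodebin,
                                      gboolean seekable, gint64 total_size)
  {
    g_appsrc = gst_element_factory_make("appsrc", "bytestream");
    g_object_set(g_appsrc,
                 "stream-type", seekable ? GST_APP_STREAM_TYPE_SEEKABLE
                                         : GST_APP_STREAM_TYPE_STREAM,
                 "size", total_size,
                 NULL);
    gst_bin_add(GST_BIN(pipeline), g_appsrc);
    gst_element_link(g_appsrc, decodebin);

    // appsrc's "seek-data" signal is the natural hook for the
    // Phonon::ByteStream::seekStream signal mentioned above.
  }

  // Called from Phonon::ByteStream::writeData().
  static void write_data(const char *data, guint len)
  {
    GstBuffer *buf = gst_buffer_new_and_alloc(len);
    std::memcpy(GST_BUFFER_DATA(buf), data, len);

    GstFlowReturn ret;
    g_signal_emit_by_name(g_appsrc, "push-buffer", buf, &ret);
    gst_buffer_unref(buf);   // push-buffer takes its own reference
  }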

Phonon::AudioDataOutput/Phonon::VideoDataOutput

Receive raw audio/video data from the pipeline. Used to allow applications to
deal with the raw data themselves.

Possibly a specialized sink element.

* Phonon::AudioDataOutput::dataReady
  - can this be emitted from the streaming threads?

* Phonon::AudioDataOutput::endOfMedia
  - can this be emitted from the streaming threads?
  - We need to grab this EOS message synchronously from the bus.
  - should be emitted _before_ sending the last dataReady. This means we need
    to cache at least one dataReady.

* Phonon::AudioDataOutput::setDataSize
  - can this be a _suggested_ data size or does every callback need to be of
    exactly that size?
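
A sketch of the specialized-sink idea (my illustration): a fakesink with
signal-handoffs gives one callback per buffer, and that callback runs in the
streaming thread, which is exactly why the threading questions above matter:

  // Sketch: receive each raw buffer through fakesink's "handoff" signal
  // (0.10-era buffer macros; forwarding into Phonon is omitted).
  #include <gst/gst.h>

  static void on_handoff(GstElement *fakesink, GstBuffer *buffer, GstPad *pad,
                         gpointer user_data)
  {
    // Runs in the streaming thread: the data has to be copied or queued
    // before it can be delivered to the application thread as dataReady().
    guint8 *data = GST_BUFFER_DATA(buffer);
    guint size = GST_BUFFER_SIZE(buffer);
    (void) data;
    (void) size;
  }

  static GstElement *make_data_sink(void)
  {
    GstElement *sink = gst_element_factory_make("fakesink", "data-output");
    g_object_set(sink, "signal-handoffs", TRUE, "sync", TRUE, NULL);
    g_signal_connect(sink, "handoff", G_CALLBACK(on_handoff), NULL);
    return sink;
  }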