What we are trying to achieve:

* patching of the CVS checkout using our patch files, which are kept in our
  own CVS repository
* a non-srcdir build (i.e. mkdir build; cd build; ../configure; make)

* configure checks whether or not it should update ffmpeg from CVS by looking
  at the nano version number:
  - if it's 1, we're in CVS mode, and it should check ffmpeg out
  - if it's not 1, we're in prerelease or release mode, and the code should
    already be present in the tree
  FIXME: we could change this to really check out the source code if some
  required files aren't there, just in case someone checks out from CVS
  while the nano version is not 1
* patching of the checked-out copy happens at make time

Axioms under which we work:
- the dist tarball needs to include either
  - the pristine ffmpeg checkout + our patches + a patch mechanism on make, or
  - the ffmpeg checkout with the patches already applied
- configure/make is not allowed to touch files that already live in the source
  tree; if they need to be modified, they need to be copied first and cleaned
  up afterwards
- it would be very nice if, on an update of either the Tag file or the patch
  set, make would know exactly what to do with it

Some notes on how ffmpeg wrapping inside GStreamer currently works:

* gstffmpeg{dec,enc,demux,mux}.c are wrappers for specific element types from
  their ffmpeg counterparts. If you want to wrap a new type of element in
  ffmpeg (e.g. the URLProtocol things), then you'd need to write a new
  wrapper file.

* gstffmpegcolorspace.c is a wrapper for one specific function in ffmpeg:
  colorspace conversion. This works differently from the previously mentioned
  ones, and we'll come to that in the next item. If you want to wrap one
  specific function, then that, too, belongs in a new wrapper file.

* the important difference between all those is that the colorspace wrapper
  contains only one element, so there is a 1<->1 mapping. This makes for a
  fairly basic element implementation; gstffmpegcolorspace.c therefore doesn't
  differ much from other colorspace elements. The ffmpeg element types,
  however, define a whole *list* of elements (in GStreamer, each decoder etc.
  needs to be its own element). We use a set of tricks to keep the coding
  simple: codec mapping and dynamic type creation (see the sketch right after
  this item).
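
  A minimal sketch of the 1<->1 case, assuming a hypothetical
  GST_TYPE_FFMPEGCSP macro and "ffcolorspace" element name (the real type
  macro and name may differ): a single, statically defined element can simply
  be registered once, with nothing codec-specific to generate at runtime.

    #include <gst/gst.h>

    /* hypothetical type macro standing in for the colorspace element's GType */
    #define GST_TYPE_FFMPEGCSP (gst_ffmpegcsp_get_type ())
    GType gst_ffmpegcsp_get_type (void);

    static gboolean
    plugin_init (GstPlugin *plugin)
    {
      /* one element, one GType, one fixed name */
      return gst_element_register (plugin, "ffcolorspace",
          GST_RANK_NONE, GST_TYPE_FFMPEGCSP);
    }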

* ffmpeg uses CODEC_ID_* enumerations for its codecs. GStreamer uses caps,
  which consist of a mimetype and a defined set of properties. In ffmpeg,
  these properties live in an AVCodecContext struct, which contains anything
  that could configure any codec (which makes it rather messy, but oh well).
  To convert from one to the other, we use codec mapping, which is done in
  gstffmpegcodecmap.[ch]. This is the most important file in the whole
  ffmpeg wrapping process! It contains functions to go from a codec type
  (video or audio - used as the output format for decoding or the input
  format for encoding), a codec id (to identify each format) or a format id
  (a string identifying a file format - usually the file format extension)
  to a GstCaps, and the other way around. A sketch of such a mapping follows
  this item.
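
  The following is only an illustrative sketch of the kind of CodecID-to-caps
  mapping gstffmpegcodecmap.c performs; the function name is made up, the real
  file covers far more codecs, and it also fills in stream properties taken
  from the AVCodecContext (width, height, sample rate, ...).

    #include <gst/gst.h>
    #include <avcodec.h>    /* ffmpeg's CODEC_ID_* enumeration; the include
                             * path depends on how the checkout is set up */

    static GstCaps *
    example_codecid_to_caps (enum CodecID codec_id)
    {
      switch (codec_id) {
        case CODEC_ID_MPEG1VIDEO:
          return gst_caps_new_simple ("video/mpeg",
              "mpegversion", G_TYPE_INT, 1,
              "systemstream", G_TYPE_BOOLEAN, FALSE, NULL);
        case CODEC_ID_MP2:
          return gst_caps_new_simple ("audio/mpeg",
              "mpegversion", G_TYPE_INT, 1,
              "layer", G_TYPE_INT, 2, NULL);
        default:
          return NULL;      /* this codec is not (yet) in the codec map */
      }
    }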

* to define multiple elements in one source file (which all behave similarly),
  we dynamically create types for each plugin and let all of them operate on
  the same struct (GstFFMpegDec, GstFFMpegEnc, ...). The functions in
  gstffmpeg{dec,enc,demux,mux}.c called gst_ffmpeg*_register() do this.
  The magic is as follows: for each codec or format, ffmpeg has a single
  AVCodec or AV{Input,Output}Format, which are packed together in a list of
  supported codecs/formats. We simply walk through that list and, for each
  entry, check whether gstffmpegcodecmap.c knows about it. If it does, we get
  the GstCaps for each pad template that belongs to it and register a type
  for all of those together. We also store this in a caching struct that is
  later used by the base_init() function to fill in the information about
  this specific codec in the class struct of this element (pad templates and
  codec/format information). Since the actual codec information is the only
  thing that really makes each codec/format different (they all behave the
  same through the ffmpeg API), we don't really need to do anything else
  that is codec-specific, so all other functions are rather simple. A sketch
  of this registration loop follows this item.
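
  A sketch of the registration loop, under some assumptions: it walks the old
  linked list of decoders (first_avcodec/next), uses the hypothetical
  example_codecid_to_caps() helper from the sketch above in place of the real
  codec map, caches the per-codec data as GType qdata, and leaves class_init
  and instance_init out entirely. The real gst_ffmpegdec_register() differs in
  its struct contents, caching mechanism and error handling; this only shows
  the shape of the mechanism.

    #include <gst/gst.h>
    #include <avcodec.h>

    typedef struct {
      GstElement element;           /* shared instance struct for all ffdec_* */
      AVCodecContext *context;
    } GstFFMpegDec;

    typedef struct {
      GstElementClass parent_class;
      AVCodec *in_plugin;           /* the one thing that differs per element */
      GstCaps *sinkcaps, *srccaps;
    } GstFFMpegDecClass;

    /* hypothetical helper, see the codec map sketch above */
    GstCaps *example_codecid_to_caps (enum CodecID codec_id);

    static GQuark cache_quark;      /* key under which per-codec info is cached */

    static void
    gst_ffmpegdec_base_init (GstFFMpegDecClass *klass)
    {
      GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
      GstFFMpegDecClass *cached =
          g_type_get_qdata (G_TYPE_FROM_CLASS (klass), cache_quark);

      /* pull the cached codec info back out and turn it into pad templates */
      klass->in_plugin = cached->in_plugin;
      gst_element_class_add_pad_template (element_class,
          gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS,
              gst_caps_copy (cached->sinkcaps)));
      gst_element_class_add_pad_template (element_class,
          gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS,
              gst_caps_copy (cached->srccaps)));
    }

    gboolean
    gst_ffmpegdec_register (GstPlugin *plugin)
    {
      GTypeInfo typeinfo = {
        sizeof (GstFFMpegDecClass),
        (GBaseInitFunc) gst_ffmpegdec_base_init, NULL,
        NULL, NULL, NULL,           /* class_init left out of this sketch */
        sizeof (GstFFMpegDec), 0, NULL,
      };
      AVCodec *in_plugin;

      cache_quark = g_quark_from_static_string ("ffdec-cached-class-data");

      for (in_plugin = first_avcodec; in_plugin; in_plugin = in_plugin->next) {
        GstFFMpegDecClass *cached;
        GstCaps *caps;
        gchar *type_name;
        GType type;

        if (in_plugin->decode == NULL)
          continue;                 /* not a decoder */
        caps = example_codecid_to_caps (in_plugin->id);
        if (caps == NULL)
          continue;                 /* codec map doesn't know this one (yet) */

        cached = g_new0 (GstFFMpegDecClass, 1);
        cached->in_plugin = in_plugin;
        cached->sinkcaps = caps;
        cached->srccaps = gst_caps_new_simple ("video/x-raw-yuv", NULL);

        /* one dynamically created type per codec, all sharing the same structs */
        type_name = g_strdup_printf ("ffdec_%s", in_plugin->name);
        type = g_type_register_static (GST_TYPE_ELEMENT, type_name, &typeinfo, 0);
        g_type_set_qdata (type, cache_quark, cached);
        gst_element_register (plugin, type_name, GST_RANK_MARGINAL, type);
        g_free (type_name);
      }
      return TRUE;
    }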

* one particular thing that deserves a mention is how gstffmpeg{mux,demux}.c
  and gstffmpegprotocol.c interoperate. ffmpeg uses URLProtocols for data
  input and output. Now, of course, we want to use the *GStreamer* way of
  doing input and output (filesrc, ...) rather than the ffmpeg way. Therefore,
  we wrap up a GstPad as a URLProtocol and register this with ffmpeg. This is
  what gstffmpegprotocol.c does. The URL is called gstreamer://%p, where %p
  is the address of a GstPad. gstffmpeg{mux,demux}.c then open a file called
  gstreamer://%p, with %p being their source/sink pad, respectively. This
  way, we use GStreamer for data input/output through the ffmpeg API. It's
  rather ugly, but it has worked quite well so far. A sketch of the protocol
  wrapper follows this item.
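
  A sketch of the gstreamer://%p trick, assuming the old ffmpeg
  URLProtocol/URLContext API and its register_protocol() call (names and
  fields may differ between ffmpeg versions); the read/write callbacks are
  stubbed here, whereas the real gstffmpegprotocol.c pulls from and pushes to
  the GstPad at those points.

    #include <errno.h>
    #include <stdio.h>
    #include <gst/gst.h>
    #include <avformat.h>   /* URLContext/URLProtocol in old ffmpeg; the
                             * include path depends on the checkout layout */

    static int
    gst_open (URLContext *h, const char *filename, int flags)
    {
      GstPad *pad = NULL;

      (void) flags;
      /* the muxer/demuxer wrappers open "gstreamer://%p", %p being the address
       * of their sink/source pad, so we just parse the pointer back out */
      if (sscanf (filename, "gstreamer://%p", (void **) &pad) != 1 || pad == NULL)
        return -EIO;

      h->priv_data = pad;         /* remember the pad for read/write/close */
      return 0;
    }

    static int
    gst_read (URLContext *h, unsigned char *buf, int size)
    {
      GstPad *pad = h->priv_data;
      /* real code: pull data from the peer of this pad and copy it into buf */
      (void) pad; (void) buf; (void) size;
      return 0;
    }

    static int
    gst_write (URLContext *h, unsigned char *buf, int size)
    {
      GstPad *pad = h->priv_data;
      /* real code: wrap buf in a GstBuffer and push it out over the pad */
      (void) pad; (void) buf;
      return size;
    }

    static int
    gst_close (URLContext *h)
    {
      h->priv_data = NULL;
      return 0;
    }

    URLProtocol gstreamer_protocol = {
      .name      = "gstreamer",
      .url_open  = gst_open,
      .url_read  = gst_read,
      .url_write = gst_write,
      .url_close = gst_close,
    };

    /* at plugin load time: register_protocol (&gstreamer_protocol);
     * the demuxer then opens a URL built with
     * g_strdup_printf ("gstreamer://%p", pad), so all of ffmpeg's I/O for
     * that stream goes through the pad */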

* there are lots of things that still need doing. See the TODO file for more
  details.