<?xml version="1.0" encoding="UTF-8"?>
<!-- vi:set sw=2 ts=4: -->
<?xml-stylesheet href="../../extension.xsl" type="text/xsl"?>
<proposal href="proposals/WEBGL_dynamic_texture/">
  <name>WEBGL_dynamic_texture</name>
  <contact><a href="https://www.khronos.org/webgl/public-mailing-list/">WebGL
  working group</a> (public_webgl 'at' khronos.org)</contact>
  <contributor>Mark Callow, HI Corporation</contributor>
  <contributor>Acorn Pooley, while at NVIDIA</contributor>
  <contributor>Ken Russell, Google</contributor>
  <contributor>David Sheets, Ashima Arts</contributor>
  <contributor>William Hennebois, STMicroelectronics</contributor>
  <contributor>Members of the WebGL working group</contributor>
  <api version="1.0.2"/>
  <overview id="overview">
    <p>A dynamic texture is a texture whose image changes frequently. The
    source of the stream of images may be a producer outside the control of
    the WebGL application. The classic example is using a playing video to
    texture geometry. Texturing with video is currently achieved by using the
    <code>TEXTURE2D</code> target and passing an <code>HTMLVideoElement</code>
    to <code>texImage2D</code>. It is difficult, if not impossible, to
    implement video texturing with zero-copy efficiency via this API, and much
    of the behavior is underspecified.</p>
    <p>This extension provides a mechanism for streaming image frames from an
    <code>HTMLVideoElement</code>, <code>HTMLCanvasElement</code> or
    <code>HTMLImageElement</code> (one having multiple frames, such as those
    created from animated GIF, APNG and MNG files) into a WebGL texture. This
    is done via a new texture target, <code>TEXTURE_EXTERNAL_OES</code>, which
    can only be specified as being the consumer of an image stream from a new
    <code>WDTStream</code> object, which provides commands for connecting to a
    producer.</p>
    <p>There is no support for most of the functions that manipulate other
    texture targets (e.g. you cannot use <code>*[Tt]ex*Image*()</code>
    functions with <code>TEXTURE_EXTERNAL_OES</code>). Also,
    <code>TEXTURE_EXTERNAL_OES</code> targets never have more than a single
    level of detail. These restrictions enable dynamic texturing with maximum
    efficiency. They remove the need for a copy of the image data manipulable
    via the WebGL API and allow sources which have internal formats not
    otherwise supported by WebGL, such as planar or interleaved YUV data, to
    be WebGL texture target siblings.</p>
    <p>The extension extends GLSL ES with a new
    <code>samplerExternalOES</code> type and matching sampling functions that
    provide a place for an implementation to inject code for sampling non-RGB
    data, when necessary, without degrading performance for other texture
    targets. Sampling a <code>TEXTURE_EXTERNAL_OES</code> texture via a
    sampler of type <code>samplerExternalOES</code> always returns RGBA data.
    This allows the implementation to decide the most efficient format to use,
    whether it be RGB or YUV data. If the underlying format were exposed, the
    application would have to query the format in use and provide shaders to
    handle both cases.</p>
    <p><code>WDTStream</code> provides a command for <em>latching</em> an
    image frame into the consuming texture as its contents. This is equivalent
    to copying the image into the texture but, due to the restrictions
    outlined above, a copy is not necessary. Most implementations will be able
    to avoid one, so this can be much faster than using
    <code>texImage2D</code>. Latching can and should be implemented in a way
    that allows the producer to run independently of 3D rendering.</p>
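    <p>The intended acquire/draw/release cycle can be sketched as follows. No
    browser currently implements this proposal, so the <code>WDTStream</code>
    below is a minimal mock that follows the state rules described in this
    specification; only the call sequence is meaningful.</p>

```javascript
// Sketch of the latch/release cycle. makeMockStream() stands in for the
// object a real createStream() would return; it is NOT extension API.
function makeMockStream() {
  return {
    state: "STREAM_CONNECTING",
    source: null,
    pending: null,   // frame the producer has inserted but consumer not seen
    latched: null,   // frame currently latched into the consumer texture
    connectSource(source) {
      if (this.state !== "STREAM_CONNECTING") throw new Error("InvalidStateError");
      this.source = source;
      this.state = "STREAM_EMPTY";
    },
    acquireImage() {
      if (!this.source) throw new Error("InvalidStateError");
      if (this.pending === null) return false;      // timed out: nothing new
      this.latched = this.pending;                  // latch most recent frame
      this.state = "STREAM_OLD_FRAME_AVAILABLE";
      return true;
    },
    releaseImage() { this.latched = null; },        // texture becomes incomplete
  };
}

const stream = makeMockStream();
stream.connectSource({ tagName: "VIDEO" });         // e.g. an HTMLVideoElement
stream.pending = "frame-0";                         // producer inserts a frame

// Per-frame rendering loop: latch, draw, release.
if (stream.acquireImage()) {
  // ... draw calls sampling the TEXTURE_EXTERNAL_OES consumer go here ...
  stream.releaseImage();
}
```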
    <p><strong>Terminology note:</strong> throughout this specification
    <em>opaque black</em> refers to the RGBA value (0,0,0,1).</p>
    <mirrors href="https://cvs.khronos.org/svn/repos/registry/trunk/public/gles/extensions/NV/GL_NV_EGL_stream_consumer_external.txt"
             name="NV_EGL_stream_consumer_external">
      <!-- list the deviations here if there are any -->
      <p>An <code>HTMLVideoElement</code>, <code>HTMLCanvasElement</code> or
      <code>HTMLImageElement</code> is the producer of the stream of images
      being consumed by the dynamic texture rather than the unspecified
      external producer referred to in the extension.</p>
      <p>A <code>WDTStream</code> is the deliverer of the stream of images
      being consumed by the dynamic texture rather than an
      <code>EGLStream</code>.</p>
      <p>References to <code>EGLImage</code> and associated state are
      <p><code>WDTStream.connectSource</code> is used to connect a texture
      to the image stream from an HTML element instead of the command
      <code>eglStreamConsumerGLTextureNV</code> or its equivalent
      <code>eglStreamConsumerGLTextureExternalKHR</code> referenced by the
      mirrored extension.</p>
      <p><code>WDTStream.acquireImage</code> and
      <code>WDTStream.releaseImage</code> are used to latch and unlatch
      image frames instead of the commands
      <code>eglStreamConsumerAcquireNV</code> or its equivalent
      <code>eglStreamConsumerAcquireKHR</code> and
      <code>eglStreamConsumerReleaseNV</code> or its equivalent
      <code>eglStreamConsumerReleaseKHR</code> referenced by the
      mirrored extension.</p>
      <p>For ease of reading, this specification briefly describes the new
      functions and enumerants of <a
      href="https://cvs.khronos.org/svn/repos/registry/trunk/public/gles/extensions/NV/GL_NV_EGL_stream_consumer_external.txt">NV_EGL_stream_consumer_external</a>.
      Consult that extension for detailed documentation of their meaning and
      behavior. Changes to the language of that extension are given <a
      href="#differences">later</a> in this specification.</p>
      <p>The <code>createStream</code> function is available. This command
      is used for creating <code>WDTStream</code> objects for streaming
      external data to texture objects. <code>WDTStream</code> objects have
      a number of functions and attributes, the most important of which are
      described below.</p>
      <p>The functions <code>ustnow</code>,
      <code>getLastDrawingBufferPresentTime</code> and
      <code>setDrawingBufferPresentTime</code> are available. These commands
      are used for accurate timing and for specifying when the drawing buffer
      should next be presented.</p>
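      <p>As a rough illustration of these timing functions, the following
      sketch measures per-frame rendering cost in nanoseconds.
      <code>ustnow</code> is mocked here with Node's
      <code>process.hrtime.bigint()</code>, which is also a monotonic
      nanosecond clock; in a real implementation the command would come from
      the extension object.</p>

```javascript
// Mock of ustnow(): monotonic time in nanoseconds. Real code would call
// the command on the WEBGL_dynamic_texture extension object instead.
const ustnow = () => Number(process.hrtime.bigint());

const NS_PER_MS = 1e6;
const t0 = ustnow();
// ... render the scene here ...
const drawNs = ustnow() - t0;          // time spent drawing, in nanoseconds
console.log("frame cost in ms:", drawNs / NS_PER_MS);
```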
      <p>The functions <code>WDTStream.connectSource</code> and
      <code>WDTStream.disconnect()</code> are available for binding and
      unbinding the stream to <code>HTML{Canvas,Image,Video}Elements</code>,
      as is the <code>WDTStream.getSource</code> function for querying the
      current stream source.</p>
      <p>The functions <code>WDTStream.acquireImage</code> and
      <code>WDTStream.releaseImage</code> are available. These commands are
      used before 3D rendering to latch an image that will not change during
      sampling, and afterwards to unlatch the image.</p>
    <glsl extname="WEBGL_dynamic_texture">
      <alias extname="GL_NV_EGL_stream_consumer_external"/>
      <alias extname="GL_OES_EGL_image_external"/>
      <stage type="fragment"/>
      <stage type="vertex"/>
      <type name="samplerExternalOES"/>
      <function name="texture2D" type="vec4">
        <param name="sampler" type="samplerExternalOES"/>
        <param name="coord" type="vec2"/>
      </function>
      <function name="texture2DProj" type="vec4">
        <param name="sampler" type="samplerExternalOES"/>
        <param name="coord" type="vec3"/>
      </function>
      <function name="texture2DProj" type="vec4">
        <param name="sampler" type="samplerExternalOES"/>
        <param name="coord" type="vec4"/>
      </function>
    </glsl>
    <idl xml:space="preserve">
interface WEBGL_dynamic_texture {
  typedef double WDTNanoTime;

  const GLenum TEXTURE_EXTERNAL_OES             = 0x8D65;
  const GLenum SAMPLER_EXTERNAL_OES             = 0x8D66;
  const GLenum TEXTURE_BINDING_EXTERNAL_OES     = 0x8D67;
  const GLenum REQUIRED_TEXTURE_IMAGE_UNITS_OES = 0x8D68;

  WDTStream? createStream();
  WDTNanoTime getLastDrawingBufferPresentTime();
  void setDrawingBufferPresentTime(WDTNanoTime pt);
  WDTNanoTime ustnow();
}; // interface WEBGL_dynamic_texture
    </idl>
    <!-- new functions -->
    <p>On <code>WEBGL_dynamic_texture</code>:</p>
    <function name="createStream" type="WDTStream">Creates and returns a
    <code>WDTStream</code> object whose consumer is the
    <code>WebGLTexture</code> bound to the <code>TEXTURE_EXTERNAL_OES</code>
    target of the active texture unit at the time of the call.</function>
    <function name="getMinFrameDuration" type="WDTNanoTime">Returns the
    duration of the shortest frame of the currently connected dynamic source
    when the <code>playbackRate</code> of the associated
    <code>MediaController</code> is 1.0.</function>
    <!--XXX Need to add counters so app knows which "frame" the time is for.-->
    <function name="getLastDrawingBufferPresentTime" type="WDTNanoTime">Returns
    the UST of the last time the drawing buffer was presented to the screen,
    i.e., after the last return of the script to the browser.</function>
    <function name="setDrawingBufferPresentTime" type="void"><param
    name="presentTime" type="WDTNanoTime"/>Sets the UST at which the drawing
    buffer should be presented after the script returns to the
    browser.</function>
    <function name="ustnow" type="WDTNanoTime">Returns the current
    unadjusted system time (UST).</function>
    <p>On <code>WDTStream</code>:</p>
    <function name="WDTStream.connectSource" type="void"><param name="source"
    type="StreamSource"/>Connects the <code>StreamSource</code> specified by
    <em>source</em> as the producer for the stream. <code>StreamSource</code>
    can be an <code>HTMLCanvasElement</code>, <code>HTMLImageElement</code> or
    <code>HTMLVideoElement</code>.</function>
    <function name="WDTStream.getSource" type="StreamSource?">Returns the
    <code>HTML{Canvas,Image,Video}Element</code> that is connected to the
    <code>WDTStream</code> as the producer of images.</function>
    <function name="WDTStream.acquireImage" type="WDTStreamFrameInfo">Latches
    an image frame. Sampling the <code>WebGLTexture</code> that is the
    <code>WDTStream</code>'s <em>consumer</em> will return values from the
    latched image. The image data is guaranteed not to change as long as the
    image is latched. <code>acquireImage</code> returns <code>true</code> when
    an image is successfully latched, <code>false</code> otherwise.</function>
    <function name="WDTStream.releaseImage" type="void">Releases the latched
    image. Subsequent sampling of the <code>WebGLTexture</code> that was bound
    to the <code>TEXTURE_EXTERNAL_OES</code> target of the active texture unit
    when the <code>WDTStream</code> was created will return opaque
    black.</function>
    <p>The meaning and use of these tokens is exactly as described in <a
    href="https://cvs.khronos.org/svn/repos/registry/trunk/public/gles/extensions/NV/GL_NV_EGL_stream_consumer_external.txt">NV_EGL_stream_consumer_external</a>.</p>
    <function name="bindTexture" type="void"><param name="target"
    type="GLenum"/><param name="texture"
    type="WebGLTexture?"/><code>TEXTURE_EXTERNAL_OES</code> is accepted as a
    target by the <code>target</code> parameter of
    <code>bindTexture()</code>.</function>
    <function name="getActiveUniform" type="WebGLActiveInfo?"><param
    name="program" type="WebGLProgram?"/><param name="index"
    type="GLuint"/><code>SAMPLER_EXTERNAL_OES</code> can be returned in the
    <code>type</code> field of the <code>WebGLActiveInfo</code> returned by
    <code>getActiveUniform()</code>.</function>
    <function name="getParameter" type="any"><param name="pname"
    type="GLenum"/><code>TEXTURE_BINDING_EXTERNAL_OES</code> is accepted by
    the <code>pname</code> parameter of
    <code>getParameter()</code>.</function>
    <function name="getTexParameter*" type="any"><param name="target"
    type="GLenum"/><param name="pname"
    type="GLenum"/><code>REQUIRED_TEXTURE_IMAGE_UNITS_OES</code> is accepted
    as the <code>pname</code> parameter of
    <code>getTexParameter*()</code>.</function>
    <!-- Refer to the <http://www.opengl.org/registry/doc/template.txt> OpenGL
         extension template for a description of these sections. These sections
         should be eliminated for WebGL extensions simply mirroring OpenGL or
         OpenGL ES extensions. -->
    <!-- these take XHTML markup as contents -->
    <ipstatus>No known IP claims.</ipstatus>
    <typedef name="WDTNanoTime">
      <p>This type is used for nanosecond timestamps and time periods.</p>
    </typedef>
    <interface name="WDTStreamFrameInfo" noobject="true">
      <member>double frameTime;</member>
      <member>WDTNanoTime presentTime;</member>
      <p>This interface is used to obtain information about the latched
      image frame.</p>
    </interface>
    <interface name="WDTStream" noobject="true">
      <p>This interface is used to manage the image stream between the
      producer and consumer.</p>
    </interface>
    <!-- Additions to Chapters of the WebGL Specification-->
    <p>In section 4.3 <cite>Supported GLSL Constructs</cite>, replace the
    paragraph beginning <cite>A WebGL implementation must ...</cite> with the
    following paragraph:<blockquote>A WebGL implementation must only accept
    shaders which conform to The OpenGL ES Shading Language, Version 1.00 <a
    href="https://www.khronos.org/registry/webgl/specs/1.0.2/#refsGLES20GLSL">[GLES20GLSL]</a>,
    as modified by <a
    href="https://cvs.khronos.org/svn/repos/registry/trunk/public/gles/extensions/NV/GL_NV_EGL_stream_consumer_external.txt">NV_EGL_stream_consumer_external</a>,
    and which do not exceed the minimum functionality mandated in Sections 4
    and 5 of Appendix A. In particular, a shader referencing state variables
    or commands that are available in other versions of GLSL (such as those
    found in versions of OpenGL for the desktop) must not be allowed to
    load.</blockquote></p>
    <p>In section 5.14 <cite>The WebGL Context</cite>, add the following to
    the WebGLRenderingContext interface. Note that until such time as this
    extension enters core WebGL, the tokens and commands mentioned below will
    be located on the WEBGL_dynamic_texture extension interface shown
    above.</p>
    <ul>
      <li>In the list following <code>/* GetPName */</code>:<pre
      class="idl" xml:space="preserve">TEXTURE_BINDING_EXTERNAL = 0x8D67;</pre></li>
      <li>In the list following <code>/* TextureParameterName */</code>:<pre
      class="idl" xml:space="preserve">REQUIRED_TEXTURE_IMAGE_UNITS = 0x8D68;</pre></li>
      <li>In the list following <code>/* TextureTarget */</code>:<pre class="idl"
      xml:space="preserve">TEXTURE_EXTERNAL = 0x8D65;</pre></li>
      <li>In the list following <code>/* Uniform Types */</code>:<pre class="idl"
      xml:space="preserve">SAMPLER_EXTERNAL = 0x8D66;</pre></li>
      <li>In the alphabetical list of commands, add the following:<pre class="idl"
      xml:space="preserve">WDTStream? createStream();
WDTNanoTime getLastDrawingBufferPresentTime();
void setDrawingBufferPresentTime(WDTNanoTime pt);
WDTNanoTime ustnow();</pre></li>
    </ul>
    <p>In section 5.14.3 <cite>Setting and getting state</cite>, add the
    following to the table under <code>getParameter</code>.</p>
    <dt class="idl-code">
    <td>TEXTURE_BINDING_EXTERNAL</td>
    <p>In section 5.14.8 <cite>Texture objects</cite>, add the following to the
    table under <code>getTexParameter</code>.</p>
    <dt class="idl-code">
    <td>REQUIRED_TEXTURE_IMAGE_UNITS</td>
    <p>Add a new section 5.14.8.1 External textures.</p>
    <h3>5.14.8.1 External textures</h3>
    <p>External textures are texture objects which receive image data from
    outside of the GL. They enable texturing with rapidly changing image
    data, e.g., a video, at low overhead and are used in conjunction with <a
    href="#wdtstream"><code>WDTStream</code></a> objects to create
    <em>dynamic textures</em>. See <a
    href="#dynamic-textures">Dynamic Textures</a> for more information. An
    external texture object is created by binding an unused
    <code>WebGLTexture</code> to the target
    <code>TEXTURE_EXTERNAL_OES</code>. Note that only unused WebGLTextures
    or those previously used as external textures can be bound to
    <code>TEXTURE_EXTERNAL_OES</code>. Binding a <code>WebGLTexture</code>
    previously used with a different target, or binding a WebGLTexture
    previously used with TEXTURE_EXTERNAL_OES to a different target,
    generates a <code>GL_INVALID_OPERATION</code> error as documented in <a
    href="https://cvs.khronos.org/svn/repos/registry/trunk/public/gles/extensions/NV/GL_NV_EGL_stream_consumer_external.txt">GL_NV_EGL_stream_consumer_external.txt</a>.</p>
    <p>In section 5.14.10 <cite>Uniforms and attributes</cite>, add the
    following to the table under <code>getUniform</code>.</p>
    <dt class="idl-code">
    <td>samplerExternal</td>
    <p>Add a new section 5.16 Dynamic Textures.</p>
    <h3 id="dynamic-textures">5.16 Dynamic Textures</h3>
    <p>Dynamic textures are texture objects that display a stream of images
    coming from a <em>producer</em> outside the WebGL application, the
    classic example being using a playing video to texture geometry. A
    <code>WDTStream</code> object mediates between the producer and the
    <em>consumer</em>, the texture consuming the images.</p>
    <p>The command<pre class="idl" xml:space="preserve">WDTStream? createStream();</pre>creates
    a <a href="#wdtstream">WDTStream</a> object whose consumer is the
    texture object currently bound to the <code>TEXTURE_EXTERNAL_OES</code>
    target in the active texture unit. The initial <code>state</code> of the
    newly created stream will be <code>STREAM_CONNECTING</code>. If the
    texture object is already the consumer of a stream,
    <code>createStream</code> generates an <code>INVALID_OPERATION</code>
    error and returns null. When a texture object that is the consumer of a
    stream is deleted, the stream is also destroyed.</p>
    <p>In order to maintain synchronization with other tracks of an
    HTMLVideoElement's media group, most notably audio, the application must
    be able to measure how long it takes to draw the scene containing the
    dynamic texture and how long it takes the browser to compose and present
    the drawing buffer.</p>
    <p>The command <pre class="idl" xml:space="preserve">WDTNanoTime ustnow();</pre>
    returns the <em>unadjusted system time</em>, a monotonically increasing
    clock, in units of nanoseconds. The zero time of this clock is not
    important. It could start at system boot, browser start or navigation
    start.</p>
    <p>The command <pre class="idl" xml:space="preserve">WDTNanoTime getLastDrawingBufferPresentTime();</pre>
    returns the UST of the last time the composited page containing the drawing
    buffer's content was presented to the user.</p>
    <p>To ensure accurate synchronization of the textured image with other
    tracks of an HTMLVideoElement's media group, the application must be
    able to specify the <em>presentation time</em> of the drawing
    buffer.</p>
    <p>The command <pre class="idl" xml:space="preserve">void setDrawingBufferPresentTime(WDTNanoTime pt);</pre>
    tells the browser the UST at which the drawing buffer must be presented
    after the application returns to the browser. The browser must present
    the composited page containing the canvas to the user at the specified
    UST. If the specified time has already passed when control returns, the
    browser should present the drawing buffer as soon as possible. Should an
    explicit drawing-buffer present function be added to WebGL, the
    presentation time will become one of its parameters.</p>
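    <p>A sketch of how an application might schedule the next present: take
    the UST of the last present and add one source-frame duration. The
    extension entry points are assumed (and so left commented out); the
    arithmetic is the point.</p>

```javascript
// Compute the UST at which the next frame should be presented, given the
// UST of the last present and the source frame rate. Pure arithmetic;
// the extension call at the end is hypothetical and therefore commented.
const NS_PER_SECOND = 1e9;

function nextPresentTime(lastPresentUst, fps) {
  return lastPresentUst + NS_PER_SECOND / fps;    // one frame period later
}

// Example: last present at 5e9 ns, 30 fps source.
const pt = nextPresentTime(5e9, 30);
// dtx.getLastDrawingBufferPresentTime() / dtx.setDrawingBufferPresentTime(pt)
// would be used on a real implementation of the extension.
```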
    <h3>5.16.1 WDTStreamFrameInfo</h3>
    <p>The <code>WDTStreamFrameInfo</code> interface represents information
    about a frame acquired from a <code>WDTStream</code>.</p>
    <pre class="idl" xml:space="preserve">[NoInterfaceObject] interface WDTStreamFrameInfo {
  readonly attribute double frameTime;
  readonly attribute WDTNanoTime presentTime;
};</pre>
    <h4>5.16.1.1 Attributes</h4>
    <p>The following attributes are available:</p>
    <dl>
      <dt><code class="attribute-name">frameTime</code> of type
      <code>double</code></dt>
      <dd>The time of the frame relative to the start of the producer's
      MediaController timeline, in seconds. Equivalent to
      <code>currentTime</code> in an HTMLMediaElement.</dd>
      <dt><code class="attribute-name">presentTime</code> of type
      <code>WDTNanoTime</code></dt>
      <dd>The time the frame must be presented in order to sync with other
      tracks in the element's media group, particularly audio.</dd>
    </dl>
    <h3 id="wdtstream">5.16.2 WDTStream</h3>
    <p>The <code>WDTStream</code> interface represents a stream object used
    for controlling an image stream being fed to a dynamic texture.</p>
    <pre class="idl" xml:space="preserve">[NoInterfaceObject] interface WDTStream {
  typedef (HTMLCanvasElement or
           HTMLImageElement or
           HTMLVideoElement) StreamSource;

  const GLenum STREAM_CONNECTING          = 0;
  const GLenum STREAM_EMPTY               = 1;
  const GLenum STREAM_NEW_FRAME_AVAILABLE = 2;
  const GLenum STREAM_OLD_FRAME_AVAILABLE = 3;
  const GLenum STREAM_DISCONNECTED        = 4;

  readonly attribute WebGLTexture consumer;
  readonly attribute WDTStreamFrameInfo consumerFrame;
  readonly attribute WDTStreamFrameInfo producerFrame;
  readonly attribute WDTNanoTime minFrameDuration;
  readonly attribute GLenum state;

  attribute WDTNanoTime acquireTimeout;
  attribute WDTNanoTime consumerLatency;

  void connectSource(StreamSource source);
  void disconnect();
  StreamSource? getSource();
  boolean acquireImage();
  void releaseImage();
};</pre>
    <h4>5.16.2.1 Attributes</h4>
    <dl>
      <dt><code class="attribute-name">consumer</code> of type
      <code>WebGLTexture</code></dt>
      <dd>The <code>WebGLTexture</code> that was bound to the
      TEXTURE_EXTERNAL_OES target of the active texture unit at the time the
      stream was created. Sampling this texture in a shader will return
      samples from the image latched by <code>acquireImage</code>.</dd>
      <dt><code class="attribute-name">consumerFrame</code> of type
      <code>WDTStreamFrameInfo</code></dt>
      <dd>Information about the last frame latched by the consumer via
      <code>acquireImage</code>.</dd>
      <dt><code class="attribute-name">producerFrame</code> of type
      <code>WDTStreamFrameInfo</code></dt>
      <dd>Information about the frame most recently inserted into the stream
      by the producer.</dd>
      <dt><code class="attribute-name">minFrameDuration</code> of type
      <code>WDTNanoTime</code></dt>
      <dd>The minimum duration of a frame in the producer. Ideally this
      should be an attribute on HTMLVideoElement. Most video container
      formats have metadata that can be used to calculate this. It can only
      reflect the actual value once the stream is connected to a producer
      and the producer's <code>READY_STATE</code> is at least
      <code>HAVE_METADATA</code>. The initial value is
      <code>Number.MAX_VALUE</code> (i.e., effectively unbounded).
      Applications need this information to determine how complex their
      drawing can be while maintaining the video's frame rate.</dd>
      <dt><code class="attribute-name">state</code> of type
      <code>GLenum</code></dt>
      <dd>The state of the stream. Possible states are
      <code>STREAM_CONNECTING</code>, <code>STREAM_EMPTY</code>,
      <code>STREAM_NEW_FRAME_AVAILABLE</code>,
      <code>STREAM_OLD_FRAME_AVAILABLE</code> and
      <code>STREAM_DISCONNECTED</code>.</dd>
      <dt><code class="attribute-name">consumerLatency</code> of type
      <code>WDTNanoTime</code></dt>
      <dd>The time between the application latching an image from the stream
      and the drawing buffer being presented. This is the time by which the
      producer should delay playback of any synchronized tracks, such as
      audio. The initial value is an implementation-dependent constant
      value, possibly zero. This should only be changed while the video is
      paused, as producers will not be able to change the playback delay on,
      e.g., audio without glitches. It may only be possible to set this
      prior to starting playback. Implementation experience is needed.</dd>
      <dt><code class="attribute-name">acquireTimeout</code> of type
      <code>WDTNanoTime</code></dt>
      <dd>The maximum time to block in <code>acquireImage</code> waiting for
      a new frame. The initial value is 0.</dd>
    </dl>
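    <p>Applications can compare measured draw time against
    <code>minFrameDuration</code> to decide whether scene complexity must be
    reduced. A small sketch of that budget check follows; the helper name and
    the values are illustrative, not part of the extension.</p>

```javascript
// Budget check: can the scene be drawn within one producer frame period?
// minFrameDurationNs would come from stream.minFrameDuration on a real
// implementation; playbackRate scales the effective frame period.
function canKeepUp(drawTimeNs, minFrameDurationNs, playbackRate = 1.0) {
  return drawTimeNs <= minFrameDurationNs / playbackRate;
}

const FRAME_30FPS_NS = 1e9 / 30;          // ~33.3 ms per frame
canKeepUp(20e6, FRAME_30FPS_NS);          // 20 ms of drawing fits 30 fps
canKeepUp(40e6, FRAME_30FPS_NS, 2.0);     // 40 ms does not fit 30 fps at 2x
```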
    <h4>5.16.2.2 Commands</h4>
    <p>The command<pre class="idl" xml:space="preserve">void connectSource(StreamSource source);</pre>connects
    the stream to the specified <code>StreamSource</code> element. If the
    <code>StreamSource</code> is an <code>HTMLMediaElement</code>, the
    element's <code>autoplay</code> attribute is set to <code>false</code>
    to prevent playback starting before the application is ready. If
    <code>state</code> is not <code>STREAM_CONNECTING</code>, an
    <code>InvalidStateError</code> exception is thrown. After connecting,
    <code>state</code> becomes <code>STREAM_EMPTY</code>.</p>
    <p>The command<pre class="idl" xml:space="preserve">void disconnect();</pre>disconnects
    the stream from its source. Subsequent sampling of the associated
    texture will return opaque black. <code>state</code> is set to
    <code>STREAM_DISCONNECTED</code>.</p>
    <p>The command<pre class="idl" xml:space="preserve">StreamSource? getSource();</pre>returns
    the HTML element that is the producer for this stream.</p>
    <p>The command<pre class="idl" xml:space="preserve">boolean acquireImage();</pre>causes
    the <em>consumer</em> to <em>latch</em> the most recent image frame from
    the currently connected source. The rules for selecting the image to be
    latched mirror those for selecting the image drawn by the
    <code>drawImage</code> method of <a
    href="http://www.whatwg.org/specs/web-apps/current-work/multipage/the-canvas-element.html#canvasrenderingcontext2d">CanvasRenderingContext2D</a>.</p>
    <p>For HTMLVideoElements, it latches the frame of video that will be
    displayed at the <a
    href="http://www.whatwg.org/specs/web-apps/current-work/multipage/the-video-element.html#current-playback-position">current
    playback position</a> of the audio channel, as defined in the <a
    href="http://www.whatwg.org/specs/web-apps/current-work/">HTML Living
    Standard</a>, at least <em>latency</em> nanoseconds from the call
    returning, where <em>latency</em> is the <code>consumerLatency</code>
    attribute of the stream. If the element's <code>readyState</code>
    attribute is either <code>HAVE_NOTHING</code> or
    <code>HAVE_METADATA</code>, the command returns without latching
    anything and the texture remains <em>incomplete</em>. The effective size
    of the texture will be the element's <a
    href="http://www.whatwg.org/specs/web-apps/current-work/#concept-video-intrinsic-width">intrinsic
    width</a> and <a
    href="http://www.whatwg.org/specs/web-apps/current-work/#concept-video-intrinsic-height">height</a>.</p>
    <p>For animated HTMLImageElements, it will latch the first frame of the
    animation. The effective size of the texture will be the element's
    intrinsic width and height.</p>
    <p>For HTMLCanvasElements, it will latch the current content of the
    canvas as would be returned by a call to <code>toDataURL</code>.</p>
    <p><code>acquireImage</code> will block until either the timeout
    specified by <code>acquireTimeout</code> expires or <code>state</code> is
    neither <code>STREAM_EMPTY</code> nor
    <code>STREAM_OLD_FRAME_AVAILABLE</code>, whichever comes first.</p>
    <p>The model is a stream of images between the producer and the
    WebGLTexture consumer. <code>acquireImage</code> latches the most recent
    image. If the producer has not inserted any new images since the last
    call to <code>acquireImage</code>, then <code>acquireImage</code> will
    latch the same image it latched last time it was called. If the producer
    has inserted one new image since the last call, then
    <code>acquireImage</code> will latch the newly inserted image. If the
    producer has inserted more than one new image since the last call, then
    all but the most recently inserted image are discarded and
    <code>acquireImage</code> will latch the most recently inserted image.
    For <code>HTMLVideoElements</code>, the application can use the value of
    the <code>frameTime</code> attribute in the <code>consumerFrame</code>
    attribute to identify which image frame was actually latched.</p>
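    <p>The "latch the most recent image" rule above can be modelled as a
    queue that discards everything but the newest entry on acquire. A small
    stand-alone simulation of that behaviour follows; this is illustrative
    code, not extension API.</p>

```javascript
// Simulation of the frame-selection rule: the producer inserts frames,
// the consumer's acquire keeps only the most recent one and otherwise
// re-latches the previously latched frame.
function makeFrameQueue() {
  const frames = [];
  return {
    insert(frame) { frames.push(frame); },        // producer side
    acquire(lastLatched) {
      if (frames.length === 0) return lastLatched; // nothing new: same image
      const newest = frames[frames.length - 1];    // keep only the newest
      frames.length = 0;                           // older frames discarded
      return newest;
    },
  };
}

const q = makeFrameQueue();
q.insert("f1"); q.insert("f2"); q.insert("f3");
const a = q.acquire(null);   // latches "f3"; "f1" and "f2" are discarded
const b = q.acquire(a);      // no new frames, so "f3" stays latched
```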
    <p><code>acquireImage</code> returns <code>true</code> if an image has
    been acquired, and <code>false</code> if the timeout fired. It throws
    the following exceptions:</p>
    <ul>
      <li><code>InvalidStateError</code>, if no dynamic source is
      connected to the stream.</li>
    </ul>
    <p>XXX Complete after resolving issue 22. XXX</p>
    <p>The command <pre class="idl" xml:space="preserve">void releaseImage();</pre>releases
    the latched image. <code>releaseImage</code> will prevent the producer
    from re-using and/or modifying the image until all preceding WebGL
    commands that use the image as a texture have completed. If
    <code>acquireImage</code> is called twice without an intervening call to
    <code>releaseImage</code>, then <code>releaseImage</code> is implicitly
    called at the start of <code>acquireImage</code>.</p>
    <p>After successfully calling <code>releaseImage</code>, the texture
    becomes "incomplete".</p>
    <p>If <code>releaseImage</code> is called twice without a successful
    intervening call to <code>acquireImage</code>, or called with no
    previous call to <code>acquireImage</code>, then the call does nothing
    and the texture remains in the "incomplete" state. This is not an
    error.</p>
    <p>It throws the following exceptions:</p>
    <ul>
      <li><code>InvalidStateError</code>, if no dynamic source is
      connected to the stream.</li>
    </ul>
    <p>XXX Complete after resolving issue 22. XXX</p>
    <p>To sample a dynamic texture, the texture object must be bound to the
    target <code>TEXTURE_EXTERNAL_OES</code> and the sampler uniform must be
    of type <code>samplerExternal</code>. If the texture object bound to
    <code>TEXTURE_EXTERNAL_OES</code> is not bound to a dynamic source, then
    the texture is "incomplete" and the sampler will return opaque
    black.</p>
    <p><a id="differences"/>At the end of section 6 <cite>Differences between
    WebGL and OpenGL ES</cite>, add the following new sections. Note that
    differences are considered with respect to the OpenGL ES 2.0 specification
    as modified by <a
    href="https://cvs.khronos.org/svn/repos/registry/trunk/public/gles/extensions/NV/GL_NV_EGL_stream_consumer_external.txt">NV_EGL_stream_consumer_external</a>
    and <a
    href="http://www.khronos.org/registry/gles/extensions/OES/OES_EGL_image_external.txt">OES_EGL_image_external</a>.</p>
    <h3>6.25 External Texture Support</h3>
    <p>WebGL supports <em>external textures</em> but provides its own
    <code>WDTStream</code> interface instead of <code>EGLStream</code>.
    <code>WDTStream</code> connects an HTMLCanvasElement, HTMLImageElement
    or HTMLVideoElement as the producer for an external texture. Specific
    language changes follow.</p>
    <p>Section <cite>3.7.14.1 External Textures as Stream Consumers</cite>
    is replaced with the following.<blockquote>
    <p>To use a TEXTURE_EXTERNAL_OES texture as the consumer of images
    from a dynamic HTML element, bind the texture to the active texture
    unit and call <code>createStream</code> to create a
    <code>WDTStream</code>. Use the stream's <code>connectSource</code>
    command to connect the stream to the desired producer HTML element.
    The width, height, format, type, internalformat, border and image
    data of the TEXTURE_EXTERNAL_OES texture will all be determined
    based on the specified dynamic HTML element. If the element does not
    have any source, or the source is not yet loaded, the width, height
    and border will be zero and the format and internal format will be
    undefined. Once the element's source has been loaded and one (or
    more) images have been decoded, these attributes are determined
    (internally by the implementation), but they are not exposed to the
    WebGL application and there is no way to query their values.</p>
<p>The TEXTURE_EXTERNAL_OES texture remains the consumer of the
dynamic HTML element's image frames until the first of any of the
following events occurs:</p>
<ul>
  <li>The texture is associated with a different dynamic HTML
  element (with a later call to
  <code>WDTStream.connectSource</code>).</li>

  <li>The texture is deleted in a call to
  <code>deleteTexture</code>.</li>
</ul>
<p>Sampling an external texture which is not connected to a dynamic
HTML element will return opaque black. Sampling an external texture
which is connected to a dynamic HTML element will also return opaque
black unless an image frame has been latched into the texture by a
successful call to <code>WDTStream.acquireImage</code>.</p>
<!-- New Implementation-Dependent State -->
<p>XXX IGNORE THIS SAMPLE CODE. IT HAS NOT YET BEEN UPDATED TO MATCH THE
NEW SPEC TEXT. XXX</p>
<div class="example">This is a fragment shader that samples a video texture.
Note that the surrounding <code>&lt;script&gt;</code> tag is not
essential; it is merely one way to include shader text in an HTML
file.<pre xml:space="preserve">&lt;script id="fshader" type="x-shader/x-fragment"&gt;
#extension GL_OES_EGL_image_external : enable
precision mediump float;

uniform samplerExternalOES videoSampler;

varying float v_Dot;
varying vec2 v_texCoord;

void main()
{
    vec2 texCoord = vec2(v_texCoord.s, 1.0 - v_texCoord.t);
    vec4 color = texture2D(videoSampler, texCoord);
    color += vec4(0.1, 0.1, 0.1, 1);
    gl_FragColor = vec4(color.xyz * v_Dot, color.a);
}
&lt;/script&gt;</pre></div>
<div class="example">This shows fragments from an application that renders
a spinning cube textured with a live video.<pre xml:space="preserve">&lt;html&gt;
&lt;script type="text/javascript"&gt;

///////////////////////////////////////////////////////////////////////
// Create a video texture and bind a source to it.
///////////////////////////////////////////////////////////////////////

// Array of files currently loading.
var g_loadingFiles = [];

// Clears all the files currently loading.
// This is used to handle context lost events.
function clearLoadingFiles() {
  for (var ii = 0; ii &lt; g_loadingFiles.length; ++ii) {
    g_loadingFiles[ii].onload = undefined;
  }
  g_loadingFiles = [];
}

//
// createVideoTexture
//
// Load video from the passed HTMLVideoElement id, bind it to a new
// WebGLTexture object and return the WebGLTexture.
//
// Is there a constructor for an HTMLVideoElement so you can do like "new Image()"?
//
function createVideoTexture(ctx, videoId)
{
  var texture = ctx.createTexture();
  var video = document.getElementById(videoId);
  g_loadingFiles.push(video);
  video.onload = function() { doBindVideo(ctx, video, texture) };
  return texture;
}

function doBindVideo(ctx, video, texture)
{
  g_loadingFiles.splice(g_loadingFiles.indexOf(video), 1);
  ctx.bindTexture(ctx.TEXTURE_EXTERNAL_OES, texture);
  ctx.dynamicTextureSetSource(video);
  // These are the default values of these parameters so the following
  // 4 lines are not necessary.
  ctx.texParameteri(ctx.TEXTURE_EXTERNAL_OES, ctx.TEXTURE_MAG_FILTER, ctx.LINEAR);
  ctx.texParameteri(ctx.TEXTURE_EXTERNAL_OES, ctx.TEXTURE_MIN_FILTER, ctx.LINEAR);
  ctx.texParameteri(ctx.TEXTURE_EXTERNAL_OES, ctx.TEXTURE_WRAP_S, ctx.CLAMP_TO_EDGE);
  ctx.texParameteri(ctx.TEXTURE_EXTERNAL_OES, ctx.TEXTURE_WRAP_T, ctx.CLAMP_TO_EDGE);
  ctx.bindTexture(ctx.TEXTURE_EXTERNAL_OES, null);
}
///////////////////////////////////////////////////////////////////////
// Initialize the application.
///////////////////////////////////////////////////////////////////////

  // The id of the Canvas Element

  var program = simpleSetup(
      // The ids of the vertex and fragment shaders
      "vshader", "fshader",
      // The vertex attribute names used by the shaders.
      // The order they appear here corresponds to their index.
      [ "vNormal", "vColor", "vPosition"],
      // The clear color and depth values
      [ 0, 0, 0.5, 1 ], 10000);

  // Set some uniform variables for the shaders
  gl.uniform3f(gl.getUniformLocation(program, "lightDir"), 0, 0, 1);
  // Use the default texture unit 0 for the video
  gl.uniform1i(gl.getUniformLocation(program, "videoSampler"), 0);

  // Create a box. On return 'g' contains a 'box' property with
  // the BufferObjects containing the arrays for vertices,
  // normals, texture coords, and indices.

  // Create the video texture. Returns a WebGLTexture object.
  videoTexture = createVideoTexture(gl, "video");
  // Bind the video texture
  gl.bindTexture(gl.TEXTURE_EXTERNAL_OES, videoTexture);

  // Create some matrices to use later and save their locations in the shaders
  g.mvMatrix = new J3DIMatrix4();
  g.u_normalMatrixLoc = gl.getUniformLocation(program, "u_normalMatrix");
  g.normalMatrix = new J3DIMatrix4();
  g.u_modelViewProjMatrixLoc =
      gl.getUniformLocation(program, "u_modelViewProjMatrix");
  g.mvpMatrix = new J3DIMatrix4();

  // Enable all of the vertex attribute arrays.
  gl.enableVertexAttribArray(0);
  gl.enableVertexAttribArray(1);
  gl.enableVertexAttribArray(2);

  // Set up all the vertex attributes for vertices, normals and texCoords
  gl.bindBuffer(gl.ARRAY_BUFFER, g.box.vertexObject);
  gl.vertexAttribPointer(2, 3, gl.FLOAT, false, 0, 0);

  gl.bindBuffer(gl.ARRAY_BUFFER, g.box.normalObject);
  gl.vertexAttribPointer(0, 3, gl.FLOAT, false, 0, 0);

  gl.bindBuffer(gl.ARRAY_BUFFER, g.box.texCoordObject);
  gl.vertexAttribPointer(1, 2, gl.FLOAT, false, 0, 0);

  // Bind the index array
  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, g.box.indexObject);
///////////////////////////////////////////////////////////////////////
// Draw the scene.
///////////////////////////////////////////////////////////////////////

  // Make sure the canvas is sized correctly.

  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

  // Make a model/view matrix.
  g.mvMatrix.makeIdentity();
  g.mvMatrix.rotate(20, 1,0,0);
  g.mvMatrix.rotate(currentAngle, 0,1,0);

  // Construct the normal matrix from the model-view matrix and pass it in
  g.normalMatrix.load(g.mvMatrix);
  g.normalMatrix.invert();
  g.normalMatrix.transpose();
  g.normalMatrix.setUniform(gl, g.u_normalMatrixLoc, false);

  // Construct the model-view * projection matrix and pass it in
  g.mvpMatrix.load(g.perspectiveMatrix);
  g.mvpMatrix.multiply(g.mvMatrix);
  g.mvpMatrix.setUniform(gl, g.u_modelViewProjMatrixLoc, false);

  // Acquire the latest video image
  gl.dynamicTextureAcquireImage();

  gl.drawElements(gl.TRIANGLES, g.box.numIndices, gl.UNSIGNED_BYTE, 0);

  // Allow updates to the image again
  gl.dynamicTextureReleaseImage();

  // Show the framerate
  framerate.snapshot();

  currentAngle += incAngle;
  if (currentAngle &gt; 360)
    currentAngle -= 360;
&lt;/script&gt;
&lt;body onload="start()"&gt;
&lt;video id="video" src="resources/video.ogv" autoplay="true" style="visibility: hidden"&gt;
&lt;/video&gt;
&lt;canvas id="example"&gt;
If you're seeing this your web browser doesn't support the &lt;canvas&gt; element. Ouch!
&lt;/canvas&gt;
&lt;div id="framerate"&gt;&lt;/div&gt;
&lt;/body&gt;
&lt;/html&gt;</pre></div>
<p>See the <a href="DynamicTexture.html" target="_blank">complete
example</a>.</p>
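As an informal illustration of the connection and per-frame flow described in section 3.7.14.1 above, consider the following sketch. It assumes the extension's functions live on the object returned by getExtension (named ext here); that placement, and all helper names, are assumptions for illustration, not normative API.

```javascript
// Sketch only: `ext` stands for the object returned by
// gl.getExtension("WEBGL_dynamic_texture"); the placement of
// createStream on it is an assumption based on the spec text above.
function setUpVideoTexture(gl, ext, video) {
  var texture = gl.createTexture();
  // The stream's consumer is the texture bound to the active texture unit.
  gl.bindTexture(ext.TEXTURE_EXTERNAL_OES, texture);
  var stream = ext.createStream();
  // Make the HTML element the stream's producer.
  stream.connectSource(video);
  return { texture: texture, stream: stream };
}

function drawVideoFrame(gl, ext, vt, drawScene) {
  gl.bindTexture(ext.TEXTURE_EXTERNAL_OES, vt.texture);
  vt.stream.acquireImage();  // latch the most recent frame
  drawScene();               // sampling now returns the latched frame
  vt.stream.releaseImage();  // let the producer update the image again
}
```

Until acquireImage succeeds, sampling the texture returns opaque black, per the spec text above.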
<p>Statistical fingerprinting is a privacy concern: a malicious web
site may determine whether a user has visited a third-party web site by
measuring the timing of cache hits and misses of resources in the
third-party web site. Though the <code>ustnow</code> method of this
extension returns time data to a greater accuracy than before, it does
not make this privacy concern significantly worse than it already is.</p>
<p>What do applications need to be able to determine about the
incoming image stream?</p>

<p>RESOLVED. Two things:</p>
<ul>
  <li>the minimum inter-frame interval, which is needed to determine
  a rendering budget;</li>

  <li>whether a frame has been missed.</li>
</ul>
<p>Neither the minimum inter-frame interval nor the frame rate is
exposed by HTMLMediaElement. How can it be determined?</p>

<p>RESOLVED. Although there have been requests to expose the frame
rate, in connection with non-linear editing and <a
href="http://lists.whatwg.org/pipermail/whatwg-whatwg.org/2011-January/029724.html">frame-accurate
seeks to SMPTE time-code positions</a>, there has been no
resolution. Therefore the stream object interface will have to provide
a query for the minimum inter-frame interval. It can easily be derived
from the frame rate of fixed-rate videos or from information that is
commonly stored in the container metadata for variable-rate formats.
Both the <a
href="http://matroska.org/technical/specs/index.html">Matroska</a> and
<a href="http://www.webmproject.org/code/specs/container/">WebM</a>
containers provide a FrameRate item, albeit listed as "information
only." Note that there is a <a
href="https://www.w3.org/Bugs/Public/show_bug.cgi?id=22678">tracking
bug</a> for this feature at WHATWG/W3C where browser vendors can
express interest in implementing it.</p>
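For fixed-rate content the queried value follows directly from the frame rate. A sketch of the derivation (the helper name is illustrative; only the arithmetic comes from the resolution above):

```javascript
// Minimum inter-frame interval, in milliseconds, derived from a
// container-supplied frame rate such as the Matroska/WebM FrameRate item.
// For variable-rate formats the container metadata would instead have to
// supply the smallest frame duration directly.
function minInterFrameIntervalMs(framesPerSecond) {
  if (framesPerSecond > 0) {
    return 1000 / framesPerSecond;
  }
  throw new RangeError("frame rate must be positive");
}

minInterFrameIntervalMs(30);    // 33.33... ms rendering budget
minInterFrameIntervalMs(29.97); // slightly longer: ~33.37 ms
```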
<p>How can the application determine whether it has missed a
frame?</p>

<p>RESOLVED. If a frame's <code>presentTime</code> is earlier than
<code>ustnow() + consumerLatency</code> then the application will have
to drop that frame and acquire the next one.</p>
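The rule above is a one-line comparison; a sketch (all names are illustrative, with UST values in nanoseconds):

```javascript
// Drop the frame if its presentTime is earlier than ustnow() plus the
// consumer latency, i.e. if it can no longer be presented on schedule.
// Names are illustrative; times are UST values in nanoseconds.
function shouldDropFrame(presentTime, ustNow, consumerLatency) {
  return ustNow + consumerLatency > presentTime;
}

// A frame due at 50 ms, considered 30 ms early with 20 ms latency, makes it:
shouldDropFrame(50e6, 20e6, 20e6); // false
// Considered only 5 ms before it is due, it can no longer make it:
shouldDropFrame(50e6, 45e6, 20e6); // true
```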
<p>Why not use the <code>TEXTURE2D</code> target and
<code>texImage2D</code>?</p>

<p>RESOLVED. Use a new texture target and new commands. A new texture
target makes it easy to specify, implement and conformance test the
restrictions that enable a zero-copy implementation of dynamic
textures, as described in the <a href="#overview">Overview</a>. Given
that one of those restrictions is disallowing modification of the
texture data, which is normally done via <code>texImage2D</code>,
using new commands makes the usage model clearer.</p>
<p>Why not use sampler2D uniforms?</p>

<p>RESOLVED. Use a new sampler type. Many zero-copy implementations
will need special shader code when sampling YUV-format dynamic
textures. Implementations may choose to (a) re-compile at run time or
(b) inject conditional code which branches at run time according to
the format of the texture bound to TEXTURE_EXTERNAL_OES in the texture
unit to which the sampler variable is set. Without a new sampler type,
such conditional code would have to be injected for every sampler
fetch, increasing the size of the shader and slowing sampling of other
texture targets. In order to preserve the possibility of using
approach (b), a new sampler type will be used.</p>
<p>Should the API be implemented as methods on the texture object or
as commands taking a texture object as a parameter?</p>

<p>RESOLVED. Neither. The <code>WebGLTexture</code> object represents
an OpenGL texture name. No object is created until the name is bound
to a texture target. Therefore the new commands should operate on the
currently bound texture object.</p>
<p>Should dynamic textures be a new texture type or can
<code>WebGLTexture</code> be reused?</p>

<p>RESOLVED. <code>WebGLTexture</code> can be reused. As noted in the
previous issue, a <code>WebGLTexture</code> represents a texture name
and is a handle to multiple texture types. The type of texture is set
according to the target to which the name is initially bound.</p>
<p>Should this extension use direct texture access commands or should
it use <code>texParameter</code> and <code>getTexParameter</code>?</p>

<p>RESOLVED. Use the latter. There is no directly accessible texture
object to which such commands can be added. Changing the API to have
such objects is outside the scope of this extension.</p>
<p>Should we re-use <code>#extension
NV_EGL_stream_consumer_external</code>, create our own GLSL extension
name or have both this and a WebGL-specific name?</p>

<p>RESOLVED. Any of <code>WEBGL_dynamic_texture</code> or the aliases
<code>GL_NV_EGL_stream_consumer_external</code> or
<code>GL_OES_EGL_image_external</code> can be used to enable this
extension's features in the shader. This permits the same shader to be
used with both WebGL and OpenGL ES 2.0.</p>
<p>What should happen when an object of type
<code>HTMLCanvasElement</code>, <code>HTMLImageElement</code> or
<code>HTMLVideoElement</code> is passed to the existing
<code>tex*Image2D</code> commands?</p>

<p>UNRESOLVED. This behavior is outside the scope of this extension,
but handling of these objects is very underspecified in the WebGL
specification and needs to be clarified. Suggestion: for a
single-frame HTMLImageElement, set the texture image to the
HTMLImageElement; for an animated HTMLImageElement, set the texture
image to the first frame of the animation; for an HTMLCanvasElement,
set the texture image to the current canvas image that would be
returned by toDataURL; for an HTMLVideoElement, set the texture image
to the current frame. In all cases, the texture image does not change
until a subsequent call to a <code>tex*Image2D</code> command. <em>Is
this a change from the way any of these elements are handled
today?</em></p>
<p>Should <code>acquireImage</code> and <code>releaseImage</code>
generate errors if called when the stream is already in the state to
be set, or should those extra calls be ignored?</p>

<p>RESOLVED. They should not generate errors.
<code>acquireImage</code> will be defined to implicitly call
<code>releaseImage</code> if there has not been an intervening call
to <code>releaseImage</code>.</p>
<p>This API is implementable on any platform at varying levels of
efficiency. Should it therefore move directly to core rather than
being an extension?</p>

<p>RESOLVED. No, unless doing so would result in implementations
appearing sooner.</p>
<p>Should this extension support HTMLImageElement?</p>

<p>UNRESOLVED. The HTML 5 Living Standard provides virtually no rules
for handling animated HTMLImageElements and, specifically, no
definition of a current frame. In order to texture the animations from
such elements, this specification will need to provide rules. If we
are tracking the behavior of <a
href="http://www.whatwg.org/specs/web-apps/current-work/#dom-context-2d-drawimage">CanvasRenderingContext2D.drawImage</a>,
then there is no point in supporting HTMLImageElement, as that
specification says to draw the first frame of animated
<code>HTMLImageElements</code>.</p>
<p>Should this extension extend <code>HTMLMediaElement</code> with an
acquireImage/releaseImage API?</p>

<p>RESOLVED. No. The API would serve no purpose and would require
HTML{Video,Canvas,Image}Element to become aware of WebGLTexture or,
even worse, of texture binding within WebGL. No similar API was
exposed to support CanvasRenderingContext2D.drawImage; the HTMLElement
is simply passed to drawImage.</p>
<p>Should <a
href="http://dvcs.w3.org/hg/webperf/raw-file/tip/specs/HighResolutionTime/Overview.html#sec-DOMHighResTimeStamp">DOMHighResTimeStamp</a>
and <code>window.performance.now()</code> from the W3C <a
href="http://dvcs.w3.org/hg/webperf/raw-file/tip/specs/HighResolutionTime/Overview.html">High-Resolution
Time</a> draft be used for the timestamps and as UST?</p>
<p>RESOLVED. No. The specified unit is milliseconds and, although the
preferred accuracy is microseconds, the required accuracy is only
milliseconds. At millisecond accuracy it is not possible to
distinguish between 29.97 fps and 30 fps, which means sound for a
29.97 fps video will be ~3.6 seconds out of sync after 1 hour. Also,
fractional <code>double</code> values must be used to represent times
&lt; 1 ms, with the attendant issues of variable time steps as the
exponent changes. Feedback has been provided; hopefully the draft
specification will be updated.</p>
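The out-of-sync figure is simple arithmetic: each frame of a 29.97 fps video lasts slightly longer than a 30 fps schedule assumes, and the per-frame error accumulates. A sketch of the calculation (helper name is illustrative):

```javascript
// Accumulated audio/video drift, in seconds, after `seconds` of playback
// when frames are scheduled at `assumedFps` but the true rate is `trueFps`.
function avDriftSeconds(seconds, trueFps, assumedFps) {
  var frames = seconds * trueFps;
  // Each frame's true duration differs from its scheduled duration.
  var perFrameError = 1 / trueFps - 1 / assumedFps;
  return frames * perFrameError;
}

avDriftSeconds(3600, 29.97, 30); // ~3.6 seconds after one hour
```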
<p>Should UST 0 be system start-up, browser start-up or <a
href="http://www.w3.org/TR/navigation-timing/#dom-performancetiming-navigationstart">navigationStart</a>
as defined in the W3C <a
href="http://www.w3.org/TR/2012/PR-navigation-timing-20120726/">Navigation
Timing</a> proposed recommendation?</p>

<p>RESOLVED. If <code>DOMHighResTimeStamp</code> is used, then
navigationStart makes sense; otherwise it can be left to the
implementation.</p>
<p>Should UST wrap rather than increment the exponent, so as to
maintain precision?</p>

<p>UNRESOLVED. The exponent will need to be incremented after 2**53
nanoseconds (~104 days). UST could wrap to 0 after that or just keep
counting. If it keeps counting, the precision will be halved so each
tick will be 2 nanoseconds. The next precision change will occur after
a further ~104 days.</p>
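These figures follow from the IEEE 754 double format, which represents integers exactly only up to 2**53. A quick check of the arithmetic:

```javascript
// Doubles represent every integer exactly up to 2^53; beyond that,
// consecutive representable values are 2 ns apart (then 4 ns, and so on).
var NS_PER_DAY = 86400 * 1e9;

// Days until a 1 ns UST counter first loses single-nanosecond precision.
var daysToFirstStep = Math.pow(2, 53) / NS_PER_DAY; // ~104 days

// The next halving occurs at 2^54 ns, i.e. after a further 2^53 ns.
var daysToSecondStep = (Math.pow(2, 54) - Math.pow(2, 53)) / NS_PER_DAY;
```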
<p>Should <code>WDTStream.state</code> be a proper IDL enum?</p>
<p>Does the application need to be able to find out if it has missed a
potential requestAnimationFrame callback, i.e., whether it has taken
longer than the browser's natural rAF period? If so, how?</p>
<p>What are the base and units of a renderbuffer's present time on
<p><code>CanvasRenderingContext2D.drawImage</code> requires an
InvalidStateError to be thrown if either the width or height of the
source canvas is 0. Do we need to mirror this?</p>

<p>RESOLVED. Treating this situation as failing to acquire an image,
and so returning opaque black when sampled, provides more consistent
handling across StreamSource types and is more consistent with OpenGL
ES.</p>
<p>Should exceptions be used for errors on WDTStreams or should
GL-style error handling be used?</p>
<revision date="2012/07/05">
  <change>Initial revision.</change>
</revision>

<revision date="2012/07/06">
  <change>Fixed incorrect dependency and minor naming inconsistencies.
  Fixed missing parameter error and moved the location of the bindTexture
  call in the sample code.</change>
</revision>

<revision date="2012/07/20">
  <change>Significant rewrite that bases the extension on
  GL_NV_EGL_stream_consumer_external and uses semantics and
  concepts from EGLStream rather than EGLImage.</change>
</revision>

<revision date="2012/07/23">
  <change>Changed #extension aliases to match bug fixes in mirrored
  extensions.</change>
</revision>

<revision date="2012/08/30">
  <change>Updated contributors list. Reordered issues to put all texture
  object related issues together and modified them for consistency. Fixed
  small typos.</change>
</revision>

<revision date="2013/07/12">
  <change>Major revamp. Exposed stream objects and provided commands for
  measuring the time to render and present the drawing buffer.</change>
</revision>

<revision date="2014/07/15">
  <change>Added NoInterfaceObject extended attribute.</change>
</revision>

<revision date="2014/10/29">
  <change>Used updated XSL to properly show interfaces added by this
  extension under New Types.</change>