/**
* SECTION:element-taginject
*
- * Element that injects new metadata tags, but passes incomming data through
+ * Element that injects new metadata tags, but passes incoming data through
* unmodified.
*
* <refsect2>
* @see_also: goom, synaesthesia
*
* Goom2k1 is an audio visualisation element. It creates warping structures
- * based on the incomming audio signal. Goom2k1 is the older version of the
+ * based on the incoming audio signal. Goom2k1 is the older version of the
* visualisation. Also available is goom2k4, with a different look.
*
* <refsect2>
case GST_EVENT_SEGMENT:
{
/* the newsegment values are used to clip the input samples
- * and to convert the incomming timestamps to running time so
+ * and to convert the incoming timestamps to running time so
* we can do QoS */
gst_event_copy_segment (event, &monoscope->segment);
v4l2src puts a GStreamer timestamp on the video frames based on the current
running_time. The encoder encodes and passes the timestamp on. The payloader
generates an RTP timestamp using the above formula and puts it in the RTP
- packet. It also copies the incomming GStreamer timestamp on the output RTP
+ packet. It also copies the incoming GStreamer timestamp on the output RTP
packet. udpsink synchronizes on the GStreamer timestamp before pushing out the
packet.
clock-rate=(int)90000, encoding-name=(string)H263-1998" ! rtph263pdepay !
avdec_h263 ! autovideosink
-It is important that the depayloader copies the incomming GStreamer timestamp
+It is important that the depayloader copies the incoming GStreamer timestamp
directly to the depayloaded output buffer. It should never attempt to perform
any logic with the RTP timestamp; that task is for the jitterbuffer, as we will
see next.
gst_rtp_ac3_pay_reset (rtpac3pay);
}
- /* count the amount of incomming packets */
+ /* count the number of incoming packets */
NF = 0;
left = map.size;
p = map.data;
offset += payload_len;
size -= payload_len;
- /* copy incomming timestamp (if any) to outgoing buffers */
+ /* copy incoming timestamp (if any) to outgoing buffers */
GST_BUFFER_PTS (outbuf) = timestamp;
fragmented = TRUE;
rtpmp4vpay->duration = 0;
}
- /* depay incomming data and see if we need to start a new RTP
+ /* depay incoming data and see if we need to start a new RTP
* packet */
flush =
gst_rtp_mp4v_pay_depay_data (rtpmp4vpay, map.data, size, &strip, &vopi);
* depayloader or other element to create concealment data or some other logic
* to gracefully handle the missing packets.
*
- * The jitterbuffer will use the DTS (or PTS if no DTS is set) of the incomming
+ * The jitterbuffer will use the DTS (or PTS if no DTS is set) of the incoming
* buffer and the rtptime inside the RTP packet to create a PTS on the outgoing
* buffer.
*
* @percent: the buffering percent
*
* Pops the oldest buffer from the packet queue of @jbuf. The popped buffer will
- * have its timestamp adjusted with the incomming running_time and the detected
+ * have its timestamp adjusted with the incoming running_time and the detected
* clock skew.
*
* Returns: a #GstBuffer or %NULL when there was no packet in the queue.
* @src: an #RTPSource
* @pinfo: an #RTPPacketInfo
*
- * Let @src handle the incomming RTP packet described in @pinfo.
+ * Let @src handle the incoming RTP packet described in @pinfo.
*
* Returns: a #GstFlowReturn.
*/
* SECTION:element-smpte
*
* smpte can accept I420 video streams with the same width, height and
- * framerate. The two incomming buffers are blended together using an effect
+ * framerate. The two incoming buffers are blended together using an effect
* specific alpha mask.
*
 * The #GstSmpte:depth property defines the precision in bits of the mask. A