For the purposes of comparing Relay to traditional computational graph-based IRs, it
can be useful to consider Relay expressions in terms of dataflow and control fragments.
Each portion of a Relay program containing expressions that only affect the dataflow can
-be viewed as a traditional comptuation graph when writing and expressing transformations.
+be viewed as a traditional computation graph when writing and expressing transformations.
The dataflow fragment covers the set of Relay expressions that do not involve
control flow. That is, any portion of a program containing only the following
- Recursive Calls in Functions
From the point of view of a computation graph, a function is a subgraph and a function call inlines the subgraph, substituting its arguments for the free variables in the subgraph with corresponding names.
-Thus if a function's body uses only dataflow constructs
-, a call to that function is in the dataflow fragment; conversely, if the
+Thus, if a function's body uses only dataflow constructs,
+a call to that function is in the dataflow fragment; conversely, if the
function's body contains control flow, a call to that function is not part of the dataflow fragment.
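To make the split concrete, here is a sketch in the text format (both programs are illustrative, not drawn from elsewhere in this page): the first function uses only dataflow constructs, so a call to it can be inlined into a computation graph; the second contains an :code:`if` expression, so a call to it cannot.

.. code-block:: python

    fn (%x : Tensor[(10, 10), float32]) {
        add(%x, %x)
    }

    fn (%c : Tensor[(), bool], %x : Tensor[(10, 10), float32]) {
        if (%c) {
            %x
        } else {
            add(%x, %x)
        }
    }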
Variables
any Relay type as follows:
.. code-block:: python
+
    fn<t : Type>(%x : t) -> t {
        %x
    }
arguments to tensor types:
.. code-block:: python
+
    fn<s : Shape, bt : BaseType>(%x : Tensor[s, bt]) {
        %x
    }
Notice that the return type is omitted and will be inferred.
-*Note: :code:`where` syntax is not yet supported in the text format.*
+*Note: "where" syntax is not yet supported in the text format.*
A function may also be subject to one or more type relations, such as in
the following:
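As a sketch of what such a definition might look like (the :code:`Broadcast` relation and the placement of the :code:`where` clause here are illustrative assumptions, and per the note above this syntax is not yet accepted by the text parser):

.. code-block:: python

    fn (%x : Tensor[(10, 10), float32], %y : Tensor[(10, 10), float32])
           -> Tensor[(10, 10), float32]
           where Broadcast
    {
        add(%x, %y)
    }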
Relay Core Tensor Operators
===========================
This page contains the list of core tensor operator primitives pre-defined in tvm.relay.
-The core tensor operator primitives covers typical workloads in deep learning.
-They can represent workloads in front-end frameworks, and provide basic building blocks for optimization.
-Since deep learning is a fast evolving field and it is that possible to have operators that are not in here.
+The core tensor operator primitives cover typical workloads in deep learning.
+They can represent workloads in front-end frameworks and provide basic building blocks for optimization.
+Since deep learning is a fast-evolving field, it is possible that some operators are not covered here.
.. note::
Relay's Type System
===================
-We briefly introduced types while detailing Relay's expression language
-, but have not yet described its type system. Relay is
+We briefly introduced types while detailing Relay's expression language,
+but have not yet described its type system. Relay is
a statically typed and type-inferred language, allowing programs to
be fully typed while requiring just a few explicit type annotations.
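For example, using the Python API (a minimal sketch; :code:`tvm.IRModule` and :code:`InferType` are the names used in recent TVM releases), only the parameter carries an annotation and the function's return type is inferred:

.. code-block:: python

    import tvm
    from tvm import relay

    # Annotate the input; leave the return type unannotated.
    x = relay.var("x", shape=(10, 10), dtype="float32")
    f = relay.Function([x], relay.add(x, x))

    # Type inference fills in the omitted return type.
    mod = tvm.IRModule.from_expr(f)
    mod = relay.transform.InferType()(mod)
    print(mod)  # the printed IR now shows the inferred result type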
kernel_layout="OIHW",
output_padding=(0, 0),
out_dtype=""):
- """Two dimensional trnasposed convolution operator.
+ """Two dimensional transposed convolution operator.
Parameters
----------
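A usage sketch for :code:`conv2d_transpose` (shapes and attribute values are illustrative; the weight is left unannotated so its type can be inferred):

.. code-block:: python

    from tvm import relay

    data = relay.var("data", shape=(1, 8, 16, 16), dtype="float32")
    weight = relay.var("weight")  # type inferred from the call's attributes

    # Stride-2 upsampling: (16 - 1) * 2 - 2 * 1 + 3 + 1 = 32 per spatial dim.
    out = relay.nn.conv2d_transpose(data, weight,
                                    channels=4,
                                    kernel_size=(3, 3),
                                    strides=(2, 2),
                                    padding=(1, 1),
                                    output_padding=(1, 1))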
def full_like(data, fill_value):
- """Return an scalar value array with the same shape and type as the input array.
+ """Return a scalar value array with the same shape and type as the input array.
Parameters
----------
return _make.where(condition, x, y)
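Usage sketches for :code:`full_like` and :code:`where` (shapes illustrative):

.. code-block:: python

    from tvm import relay

    x = relay.var("x", shape=(3,), dtype="float32")
    y = relay.var("y", shape=(3,), dtype="float32")
    condition = relay.var("condition", shape=(3,), dtype="bool")

    ones = relay.full_like(x, relay.const(1.0))  # shape (3,), filled with 1.0
    out = relay.where(condition, x, y)           # x[i] where condition[i], else y[i]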
def broadcast_to(data, shape):
- """Return an scalar value array with the same type, broadcast to
+ """Return a scalar value array with the same type, broadcast to
the provided shape.
Parameters
return _make.broadcast_to(data, shape)
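A usage sketch (target shape illustrative):

.. code-block:: python

    from tvm import relay

    x = relay.var("x", shape=(3,), dtype="float32")
    y = relay.broadcast_to(x, (4, 3))  # replicate the length-3 vector over 4 rows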
def broadcast_to_like(data, broadcast_type):
- """Return an scalar value array with the same shape and type as the input array.
+ """Return a scalar value array with the same shape and type as the input array.
Parameters
----------
def collapse_sum_like(data, collapse_type):
- """Return an scalar value array with the same shape and type as the input array.
+ """Return a scalar value array with the same shape and type as the input array.
Parameters
----------
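:code:`broadcast_to_like` and :code:`collapse_sum_like` act as rough duals: the former expands :code:`data` to another expression's shape, the latter sums it back down to one, which is why the pair appears in gradients of broadcasting operators. A usage sketch (shapes illustrative):

.. code-block:: python

    from tvm import relay

    x = relay.var("x", shape=(3,), dtype="float32")
    template = relay.var("template", shape=(4, 3), dtype="float32")

    up = relay.broadcast_to_like(x, template)  # result shape: (4, 3)
    down = relay.collapse_sum_like(up, x)      # summed back down to shape (3,)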
def strided_slice(data, begin, end, strides=None):
- """Strided slice of an array..
+ """Strided slice of an array.
Parameters
----------
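A usage sketch (indices illustrative):

.. code-block:: python

    from tvm import relay

    x = relay.var("x", shape=(4, 5), dtype="float32")
    # Rows 0-1 and every other column: result shape (2, 3).
    y = relay.strided_slice(x, begin=[0, 0], end=[2, 5], strides=[1, 2])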