From: iliya mironov
Date: Wed, 15 Jul 2020 07:30:33 +0000 (+0300)
Subject: Added mish layer doc opset (#1149)
X-Git-Url: http://review.tizen.org/git/?a=commitdiff_plain;h=ac5217d17fcadf879e14a4d4eb9976cf7e574ae5;p=platform%2Fupstream%2Fdldt.git

Added mish layer doc opset (#1149)

* Added mish layer doc opset

* Refactoring mish spec

* Update mish spec

* Change output description of Mish layer

* Fix Mish according to review

* Refactoring Mish and GELU spec according to code review

* Update formula for ops in spec

* Refactoring spec text

* Update Mish opset

* Change Mish version from 1 to 4

* Sort opset4

Co-authored-by: Your Name
---

diff --git a/docs/ops/activation/GELU_2.md b/docs/ops/activation/GELU_2.md
index 8e11ea0..5625062 100644
--- a/docs/ops/activation/GELU_2.md
+++ b/docs/ops/activation/GELU_2.md
@@ -15,13 +15,15 @@ Gelu(x)=x*Φ(x), where Φ(x) is the Cumulative Distribution Function for Gaussia
 The following equivalent combination is recognized and fused into single Gelu op:
 
 \f[
-    Gelu(x) = 0.5*x*(1 + erf((x) / sqrt(2) )
+    Gelu(x) = 0.5*x*(1.0 + erf(x / \sqrt{2}))
 \f]
 
-Similarly, the following Gelu approximation (typical for the TensorFlow*) is recognized and fused into single Gelu op
+Similarly, the following Gelu approximation (typical for TensorFlow*) is recognized and fused into a single Gelu op:
+
 \f[
-    Gelu(x) \approx 0.5*x*(1 + tanh((sqrt(2/pi)) * (x + 0.044715 * x ^ 3))
+    Gelu(x) \approx 0.5*x*(1.0 + tanh(\sqrt{2.0/\pi} * (x + 0.044715 * x^{3})))
 \f]
+
 **Inputs**:
 
 * **1**: Multidimensional input tensor. Required.
diff --git a/docs/ops/activation/Mish_4.md b/docs/ops/activation/Mish_4.md
new file mode 100644
index 0000000..21bc372
--- /dev/null
+++ b/docs/ops/activation/Mish_4.md
@@ -0,0 +1,50 @@
+## Mish
+
+**Versioned name**: *Mish-4*
+
+**Category**: *Activation*
+
+**Short description**: *Mish* is a self-regularized non-monotonic neural activation function.
+
+**Detailed description**: *Mish* is a self-regularized non-monotonic neural activation function proposed in this [article](https://arxiv.org/abs/1908.08681).
+
+**Attributes**: operation has no attributes.
+
+**Inputs**:
+
+* **1**: Input tensor *x* of any floating-point type *T*. Required.
+
+**Outputs**:
+
+* **1**: Floating-point tensor with shape and type matching the input tensor. Required.
+
+**Types**
+
+* *T*: any floating-point type.
+
+**Mathematical Formulation**
+
+    *Mish* is applied element-wise; for each element of the input tensor,
+    the corresponding element of the output tensor is calculated with the following formula:
+    \f[
+    Mish(x) = x*tanh(ln(1.0+e^{x}))
+    \f]
+
+**Examples**
+
+```xml
+<layer ... type="Mish">
+    <input>
+        <port id="0">
+            <dim>256</dim>
+            <dim>56</dim>
+        </port>
+    </input>
+    <output>
+        <port id="1">
+            <dim>256</dim>
+            <dim>56</dim>
+        </port>
+    </output>
+</layer>
+```
\ No newline at end of file
diff --git a/docs/ops/opset4.md b/docs/ops/opset4.md
index 0e8aa05..ced7fe1 100644
--- a/docs/ops/opset4.md
+++ b/docs/ops/opset4.md
@@ -77,6 +77,7 @@ declared in `namespace opset4`.
 * [MaxPool](pooling/MaxPool_1.md)
 * [Maximum](arithmetic/Maximum_1.md)
 * [Minimum](arithmetic/Minimum_1.md)
+* [Mish](activation/Mish_4.md)
 * [Mod](arithmetic/Mod_1.md)
 * [MVN](normalization/MVN_1.md)
 * [Multiply](arithmetic/Multiply_1.md)
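
For reference, below is a minimal NumPy sketch of the element-wise formulas defined in the Mish and GELU specs above. It is not part of the commit or of the OpenVINO plugins; the helper names `mish`, `gelu_erf`, and `gelu_tanh` are illustrative, and availability of `numpy` and `scipy` is assumed.

```python
# Illustrative reference of the spec formulas above (not the plugin implementation).
import numpy as np
from scipy.special import erf


def mish(x: np.ndarray) -> np.ndarray:
    # Mish(x) = x * tanh(ln(1 + e^x)); log1p(exp(x)) follows the spec formula
    # directly (a numerically stable softplus would be used in practice).
    return x * np.tanh(np.log1p(np.exp(x)))


def gelu_erf(x: np.ndarray) -> np.ndarray:
    # Exact form from the GELU spec: 0.5 * x * (1 + erf(x / sqrt(2))).
    return 0.5 * x * (1.0 + erf(x / np.sqrt(2.0)))


def gelu_tanh(x: np.ndarray) -> np.ndarray:
    # TensorFlow-style approximation from the GELU spec:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3))).
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x ** 3)))


if __name__ == "__main__":
    x = np.linspace(-4.0, 4.0, 9)
    print("mish:      ", mish(x))
    print("gelu (erf): ", gelu_erf(x))
    print("gelu (tanh):", gelu_tanh(x))
```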