Selu specification refactoring (#5039)
* Review spec of Selu operation
* Fix path for Selu op in opset files
* Remove unnecessary line in example
* Address review comments related to wording
This commit is contained in:
parent 6f9544007f
commit d3933bd316
@@ -234,7 +234,7 @@ limitations under the License.
 <tab type="user" title="ScatterNDUpdate" url="@ref openvino_docs_ops_movement_ScatterNDUpdate_3"/>
 <tab type="user" title="ScatterUpdate-3" url="@ref openvino_docs_ops_movement_ScatterUpdate_3"/>
 <tab type="user" title="Select-1" url="@ref openvino_docs_ops_condition_Select_1"/>
-<tab type="user" title="Selu-1" url="@ref openvino_docs_ops_arithmetic_Selu_1"/>
+<tab type="user" title="Selu-1" url="@ref openvino_docs_ops_activation_Selu_1"/>
 <tab type="user" title="ShapeOf-1" url="@ref openvino_docs_ops_shape_ShapeOf_1"/>
 <tab type="user" title="ShapeOf-3" url="@ref openvino_docs_ops_shape_ShapeOf_3"/>
 <tab type="user" title="ShuffleChannels-1" url="@ref openvino_docs_ops_movement_ShuffleChannels_1"/>
71
docs/ops/activation/Selu_1.md
Normal file
@@ -0,0 +1,71 @@
## Selu <a name="Selu"></a> {#openvino_docs_ops_activation_Selu_1}

**Versioned name**: *Selu-1*

**Category**: *Activation function*

**Short description**: *Selu* is a scaled exponential linear unit element-wise activation function.

**Detailed Description**

The *Selu* operation was introduced in this [article](https://arxiv.org/abs/1706.02515) as the activation function for self-normalizing neural networks (SNNs).

*Selu* performs an element-wise activation on the given input tensor `data`, based on the following mathematical formula:

\f[
Selu(x) = \lambda \left\{\begin{array}{r}
    x \quad \mbox{if } x > 0 \\
    \alpha(e^{x} - 1) \quad \mbox{if } x \le 0
\end{array}\right.
\f]

where α and λ correspond to inputs `alpha` and `lambda`, respectively.

Another mathematical representation that may be found in other references:

\f[
Selu(x) = \lambda\cdot\big(\max(0, x) + \min(0, \alpha(e^{x}-1))\big)
\f]
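The two representations above compute the same function; the following is a minimal NumPy sketch of both (NumPy is used here for illustration only and is not part of the specification):

```python
import numpy as np

def selu_piecewise(x, alpha, lam):
    # Piecewise form: lambda * x for x > 0, lambda * alpha * (exp(x) - 1) for x <= 0
    return lam * np.where(x > 0, x, alpha * (np.exp(x) - 1))

def selu_minmax(x, alpha, lam):
    # Equivalent max/min form: for x > 0 the min() term is 0, for x <= 0 the max() term is 0
    return lam * (np.maximum(0.0, x) + np.minimum(0.0, alpha * (np.exp(x) - 1)))
```

For x > 0 the exponential term is positive, so `min(0, ...)` vanishes and only λx remains; for x ≤ 0 the `max(0, x)` term vanishes, leaving λα(eˣ − 1), which is exactly the piecewise definition.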
**Attributes**: *Selu* operation has no attributes.

**Inputs**

* **1**: `data`. A tensor of type `T` and arbitrary shape. **Required.**

* **2**: `alpha`. 1D tensor with one element of type `T`. **Required.**

* **3**: `lambda`. 1D tensor with one element of type `T`. **Required.**

**Outputs**

* **1**: The result of the element-wise *Selu* function applied to the `data` input tensor. A tensor of type `T` with the same shape as the `data` input tensor.

**Types**

* *T*: arbitrary supported floating-point type.

**Example**

```xml
<layer ... type="Selu">
    <input>
        <port id="0">
            <dim>256</dim>
            <dim>56</dim>
        </port>
        <port id="1">
            <dim>1</dim>
        </port>
        <port id="2">
            <dim>1</dim>
        </port>
    </input>
    <output>
        <port id="3">
            <dim>256</dim>
            <dim>56</dim>
        </port>
    </output>
</layer>
```
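The port layout above can be reproduced numerically; this sketch assumes NumPy, and the α and λ values are the canonical SNN constants from the referenced article, not values mandated by the example:

```python
import numpy as np

# Shapes mirror the XML example: data on port 0 is [256, 56];
# alpha (port 1) and lambda (port 2) are one-element 1D tensors.
data = np.random.randn(256, 56).astype(np.float32)
alpha = np.array([1.6732632], dtype=np.float32)  # canonical SNN constant (assumption)
lam = np.array([1.0507009], dtype=np.float32)    # canonical SNN constant (assumption)

# The one-element alpha/lambda tensors broadcast over data element-wise.
out = lam * np.where(data > 0, data, alpha * (np.exp(data) - 1))

assert out.shape == data.shape  # output port 3 has the same dims as the data input
```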
|
@@ -1,65 +0,0 @@
## Selu <a name="Selu"></a> {#openvino_docs_ops_arithmetic_Selu_1}

**Versioned name**: *Selu-1*

**Category**: Arithmetic unary operation

**Short description**: *Selu* calculates the SELU activation function (https://arxiv.org/abs/1706.02515) element-wise with given tensor.

**Detailed Description**

For each element from the input tensor calculates corresponding element in the output tensor with the following formula:

\f[
selu(x) = \lambda \left\{\begin{array}{ll}
    \alpha(e^{x} - 1) \quad \mbox{if } x \le 0 \\
    x \quad \mbox{if } x > 0
\end{array}\right.
\f]

**Attributes**:

    No attributes available.

**Inputs**

* **1**: An tensor of type T. **Required.**

* **2**: `alpha` 1D tensor with one element of type T. **Required.**

* **3**: `lambda` 1D tensor with one element of type T. **Required.**

**Outputs**

* **1**: The result of element-wise operation. A tensor of type T.

**Types**

* *T*: any supported floating point type.

**Examples**

*Example 1*

```xml
<layer ... type="Selu">
    <input>
        <port id="0">
            <dim>256</dim>
            <dim>56</dim>
        </port>
        <port id="1">
            <dim>1</dim>
        </port>
        <port id="2">
            <dim>1</dim>
        </port>
    </input>
    <output>
        <port id="3">
            <dim>256</dim>
            <dim>56</dim>
        </port>
    </output>
</layer>
```
|
@@ -93,7 +93,7 @@ declared in `namespace opset1`.
 * [Result](infrastructure/Result_1.md)
 * [ReverseSequence](movement/ReverseSequence_1.md)
 * [Select](condition/Select_1.md)
-* [Selu](arithmetic/Selu_1.md)
+* [Selu](activation/Selu_1.md)
 * [ShapeOf](shape/ShapeOf_1.md)
 * [Sigmoid](activation/Sigmoid_1.md)
 * [Sign](arithmetic/Sign_1.md)
@@ -98,7 +98,7 @@ declared in `namespace opset2`.
 * [ReverseSequence](movement/ReverseSequence_1.md)
 * [ROIPooling](detection/ROIPooling_1.md)
 * [Select](condition/Select_1.md)
-* [Selu](arithmetic/Selu_1.md)
+* [Selu](activation/Selu_1.md)
 * [ShapeOf](shape/ShapeOf_1.md)
 * [Sigmoid](activation/Sigmoid_1.md)
 * [Sign](arithmetic/Sign_1.md)
@@ -113,7 +113,7 @@ declared in `namespace opset3`.
 * [ScatterElementsUpdate](movement/ScatterElementsUpdate_3.md)
 * [ScatterUpdate](movement/ScatterUpdate_3.md)
 * [Select](condition/Select_1.md)
-* [Selu](arithmetic/Selu_1.md)
+* [Selu](activation/Selu_1.md)
 * [ShapeOf](shape/ShapeOf_3.md)
 * [ShuffleChannels](movement/ShuffleChannels_1.md)
 * [Sigmoid](activation/Sigmoid_1.md)
@@ -121,7 +121,7 @@ declared in `namespace opset4`.
 * [ScatterNDUpdate](movement/ScatterNDUpdate_3.md)
 * [ScatterUpdate](movement/ScatterUpdate_3.md)
 * [Select](condition/Select_1.md)
-* [Selu](arithmetic/Selu_1.md)
+* [Selu](activation/Selu_1.md)
 * [ShapeOf](shape/ShapeOf_3.md)
 * [ShuffleChannels](movement/ShuffleChannels_1.md)
 * [Sigmoid](activation/Sigmoid_1.md)
@@ -129,7 +129,7 @@ declared in `namespace opset5`.
 * [ScatterNDUpdate](movement/ScatterNDUpdate_3.md)
 * [ScatterUpdate](movement/ScatterUpdate_3.md)
 * [Select](condition/Select_1.md)
-* [Selu](arithmetic/Selu_1.md)
+* [Selu](activation/Selu_1.md)
 * [ShapeOf](shape/ShapeOf_3.md)
 * [ShuffleChannels](movement/ShuffleChannels_1.md)
 * [Sigmoid](activation/Sigmoid_1.md)
@@ -135,7 +135,7 @@ declared in `namespace opset6`.
 * [ScatterNDUpdate](movement/ScatterNDUpdate_3.md)
 * [ScatterUpdate](movement/ScatterUpdate_3.md)
 * [Select](condition/Select_1.md)
-* [Selu](arithmetic/Selu_1.md)
+* [Selu](activation/Selu_1.md)
 * [ShapeOf](shape/ShapeOf_3.md)
 * [ShuffleChannels](movement/ShuffleChannels_1.md)
 * [Sigmoid](activation/Sigmoid_1.md)
@@ -138,7 +138,7 @@ declared in `namespace opset7`.
 * [ScatterNDUpdate](movement/ScatterNDUpdate_3.md)
 * [ScatterUpdate](movement/ScatterUpdate_3.md)
 * [Select](condition/Select_1.md)
-* [Selu](arithmetic/Selu_1.md)
+* [Selu](activation/Selu_1.md)
 * [ShapeOf](shape/ShapeOf_3.md)
 * [ShuffleChannels](movement/ShuffleChannels_1.md)
 * [Sigmoid](activation/Sigmoid_1.md)