HSwish operation specification (#1708)

* HSwish specification init

* Update docs/ops/activation/HSwish_4.md

Co-authored-by: Michał Karzyński <4430709+postrational@users.noreply.github.com>

* Update docs/ops/opset4.md

Co-authored-by: Michał Karzyński <4430709+postrational@users.noreply.github.com>
Author: Katarzyna Mitrus
Date: 2020-08-11 08:54:08 +02:00 (committed by GitHub)
Parent: 6cccbcf28a
Commit: 0be11a462f
2 changed files with 51 additions and 0 deletions


@@ -0,0 +1,50 @@
## HSwish <a name="HSwish"></a> {#openvino_docs_ops_activation_HSwish_4}

**Versioned name**: *HSwish-4*

**Category**: *Activation*

**Short description**: HSwish takes one input tensor and produces an output tensor where the hard version of the swish function is applied to the tensor elementwise.

**Detailed description**: For each element of the input tensor, the operation calculates the corresponding element of the output tensor with the following formula:

\f[
HSwish(x) = x \frac{min(max(x + 3, 0), 6)}{6}
\f]

The HSwish operation was introduced in the following [article](https://arxiv.org/pdf/1905.02244.pdf).
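As a cross-check of the formula, here is a minimal Python sketch of the elementwise computation (illustrative only; the function name `hswish` is not part of the specification):

```python
def hswish(x: float) -> float:
    """Hard swish: x * min(max(x + 3, 0), 6) / 6, applied per element."""
    return x * min(max(x + 3.0, 0.0), 6.0) / 6.0

# Inputs <= -3 map to 0; inputs >= 3 pass through unchanged,
# since the inner clip saturates at 6 and 6/6 = 1.
for v in (-4.0, -3.0, 0.0, 1.0, 3.0, 6.0):
    print(v, hswish(v))
```

Note the two saturation regions: the clip term is 0 for `x <= -3` and 6 for `x >= 3`, so the function is identity-like for large positive inputs and zero for large negative ones, with a smooth-ish transition in between.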
**Attributes**: The operation has no attributes.

**Inputs**:

* **1**: Multidimensional input tensor of type *T*. **Required**.

**Outputs**:

* **1**: The resulting tensor of the same shape and type as the input tensor.

**Types**

* *T*: any supported floating-point type.

**Example**
```xml
<layer ... type="HSwish">
<input>
<port id="0">
<dim>256</dim>
<dim>56</dim>
</port>
</input>
<output>
<port id="1">
<dim>256</dim>
<dim>56</dim>
</port>
</output>
</layer>
```


@@ -64,6 +64,7 @@ declared in `namespace opset4`.
* [GRUCell](sequence/GRUCell_3.md)
* [GRUSequence](sequence/GRUSequence_4.md)
* [HardSigmoid](activation/HardSigmoid_1.md)
* [HSwish](activation/HSwish_4.md)
* [Interpolate](image/Interpolate_4.md)
* [Less](comparison/Less_1.md)
* [LessEqual](comparison/LessEqual_1.md)