SoftPlus specification refactoring (#5036)

* Review spec of SoftPlus operation

* Fix minor wording issues
Gabriele Galiero Casay 2021-04-01 15:09:04 +02:00 committed by GitHub
parent 10d72a6631
commit 15b36ee8d1

@@ -2,15 +2,18 @@
**Versioned name**: *SoftPlus-4*
**Category**: *Activation*
**Category**: *Activation function*
**Short description**: SoftPlus takes one input tensor and produces output tensor where the softplus function is applied to the tensor elementwise.
**Short description**: *SoftPlus* is a rectified-based element-wise activation function.
**Detailed description**: For each element from the input tensor calculates corresponding
element in the output tensor with the following formula:
**Detailed description**
*SoftPlus* operation is introduced in this [article](https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.165.6419).
*SoftPlus* performs element-wise activation function on a given input tensor, based on the following mathematical formula:
\f[
SoftPlus(x) = ln(e^{x} + 1.0)
SoftPlus(x) = \ln(1+e^{x})
\f]
**Attributes**: *SoftPlus* operation has no attributes.
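
For reference, a minimal NumPy sketch of the element-wise computation described by the formula above. This is an illustration only, not part of the committed spec; the `softplus` helper name and the use of `np.logaddexp` are assumptions made here for clarity:

```python
import numpy as np

def softplus(x: np.ndarray) -> np.ndarray:
    # ln(1 + e^x), applied element-wise.
    # np.logaddexp(0, x) computes ln(e^0 + e^x), which avoids overflow
    # for large positive x.
    return np.logaddexp(0.0, x)

# Shape and dtype are preserved, matching the spec's statement that the
# output is "a tensor of type T and the same shape as input tensor".
x = np.array([-1.0, 0.0, 1.0, 20.0], dtype=np.float32)
print(softplus(x))  # ~[0.3133, 0.6931, 1.3133, 20.0]
```
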
@@ -18,16 +21,15 @@ SoftPlus(x) = ln(e^{x} + 1.0)
**Inputs**:
* **1**: Multidimensional input tensor of type *T*. **Required**.
* **1**: A tensor of type `T` and arbitrary shape. **Required**.
**Outputs**:
* **1**: The resulting tensor of the same shape and type as input tensor.
* **1**: The result of element-wise *SoftPlus* function applied to the input tensor. A tensor of type `T` and the same shape as input tensor.
**Types**
* *T*: arbitrary supported floating point type.
* *T*: arbitrary supported floating-point type.
**Example**
@@ -46,4 +48,4 @@ SoftPlus(x) = ln(e^{x} + 1.0)
</port>
</output>
</layer>
```
```