ReLU
Versioned name: ReLU-1
Category: Activation function
Short description: ReLU performs an element-wise rectified linear unit activation on the input tensor.
Detailed description: ReLU clamps each negative element of the input tensor to zero and passes non-negative elements through unchanged.
Attributes: ReLU operation has no attributes.
Mathematical Formulation
For each element of the input tensor x, the corresponding element of the output tensor is computed with the following formula: \f[ y_{i} = max(0, x_{i}) \f]
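The formula above can be sketched with NumPy (a minimal illustration, not the reference implementation; the function name `relu` is ours):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    # Element-wise max(0, x_i); negatives become 0, non-negatives pass through.
    return np.maximum(x, 0)

y = relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0]))
# → array([0. , 0. , 0. , 1.5, 3. ])
```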
Inputs:
- 1: Multidimensional input tensor x of any supported numeric type. Required.
Outputs:
- 1: Result of the ReLU function applied to the input tensor x; a tensor with the same shape and type as the input.
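The shape- and type-preserving contract stated above can be checked with a small NumPy sketch (an illustration under the assumption that the supported numeric types include float32 and int32):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    return np.maximum(x, 0)

# The output tensor keeps the input's shape and element type.
for dtype in (np.float32, np.int32):
    x = np.arange(-3, 3, dtype=dtype).reshape(2, 3)
    y = relu(x)
    assert y.shape == x.shape
    assert y.dtype == x.dtype
```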
Example
<layer ... type="ReLU">
<input>
<port id="0">
<dim>256</dim>
<dim>56</dim>
</port>
</input>
<output>
<port id="1">
<dim>256</dim>
<dim>56</dim>
</port>
</output>
</layer>