math formula fixes 2021.2 (#3514)
* math formula fix

* ops math formula fix

Co-authored-by: Nikolay Tyukaev <ntyukaev_lo@jenkins.inn.intel.com>
@@ -1582,9 +1582,9 @@ OI, which means that Input changes the fastest, then Output.
**Mathematical Formulation**

\f[
output[:, ... ,:, i, ... , j,:, ... ,:] = input2[:, ... ,:, input1[i, ... ,j],:, ... ,:]
\f]
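The gather formula above can be sketched in NumPy; this is an illustrative sketch, not the OpenVINO kernel, and the names `data`, `indices`, and `gather` are mine (in the formula, `input1` holds the indices and `input2` the data):

```python
import numpy as np

# Illustrative sketch of the formula above (not the OpenVINO implementation):
# output[..., i, ..., j, ...] = input2[..., input1[i, ..., j], ...]
# np.take selects slices of `data` along `axis` at the given indices,
# inserting the index tensor's shape in place of that axis.
def gather(data, indices, axis):
    return np.take(data, indices, axis=axis)

data = np.arange(12).reshape(3, 4)      # plays the role of input2
indices = np.array([[0, 2], [1, 0]])    # plays the role of input1
out = gather(data, indices, axis=1)
print(out.shape)  # (3, 2, 2): axis 1 is replaced by the indices' shape
```

Note how the output rank grows: the gathered axis is replaced by all of the index tensor's axes, exactly as the ellipses in the formula suggest.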
**Inputs**
@@ -5086,7 +5086,9 @@ t \in \left ( 0, \quad tiles \right )
The output tensor is populated with values computed in the following way:
\f[
output[i1, ..., i(axis-1), j, i(axis+1), ..., iN] = top_k(input[i1, ..., i(axis-1), :, i(axis+1), ..., iN], k, sort, mode)
\f]
So for each slice `input[i1, ..., i(axis-1), :, i(axis+1), ..., iN]`, which represents a 1D array, the top_k value is computed individually. Sorting and minimum/maximum selection are controlled by the `sort` and `mode` attributes.
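The per-slice computation can be sketched with NumPy; this is an illustrative sketch only (the real op also returns indices and supports several `sort` orders), and `top_k_1d` is a name I introduce here:

```python
import numpy as np

# Illustrative sketch of the per-slice top_k described above
# (argsort-based; not the OpenVINO implementation).
def top_k_1d(x, k, mode="max"):
    # Sort ascending, flip for mode="max", then keep the first k values.
    order = np.argsort(x)
    if mode == "max":
        order = order[::-1]
    return x[order[:k]]

x = np.array([3.0, 1.0, 4.0, 1.0, 5.0])
print(top_k_1d(x, 2))          # [5. 4.]
print(top_k_1d(x, 2, "min"))   # [1. 1.]
```

Applying `top_k_1d` independently to every 1D slice along `axis` reproduces the formula for the full tensor.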
@@ -9,9 +9,9 @@
**Detailed description**: For each element of the input tensor, the corresponding element of the output tensor is calculated with the following formula:

\f[
HSwish(x) = x \frac{min(max(x + 3, 0), 6)}{6}
\f]

The HSwish operation is introduced in the following [article](https://arxiv.org/pdf/1905.02244.pdf).
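A minimal NumPy sketch of the HSwish formula (illustrative only; `hswish` is my name, not an OpenVINO API):

```python
import numpy as np

# Sketch of the formula above: HSwish(x) = x * min(max(x + 3, 0), 6) / 6
def hswish(x):
    return x * np.minimum(np.maximum(x + 3.0, 0.0), 6.0) / 6.0

x = np.array([-4.0, -3.0, 0.0, 3.0, 6.0])
print(hswish(x))  # clamps to 0 below -3 and to identity above +3
```

The hard clamp makes HSwish cheap to compute compared with the smooth Swish it approximates.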
@@ -26,9 +26,9 @@
For each element of the input tensor, the corresponding element of the output tensor is calculated with the following formula:

\f[
Mish(x) = x*tanh(ln(1.0+e^{x}))
\f]
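The Mish formula can be sketched directly in NumPy; an illustrative sketch (`mish` is my name; a production kernel would harden the softplus term for large `x`):

```python
import numpy as np

# Sketch of the formula above: Mish(x) = x * tanh(ln(1 + e^x)).
# ln(1 + e^x) is the softplus term; np.log1p(np.exp(x)) mirrors it literally.
def mish(x):
    return x * np.tanh(np.log1p(np.exp(x)))

print(mish(np.array([-1.0, 0.0, 1.0])))
```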
**Examples**
@@ -14,9 +14,9 @@
For each element of the input tensor, the corresponding element of the output tensor is calculated with the following formula:

\f[
sigmoid( x ) = \frac{1}{1+e^{-x}}
\f]
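As a quick illustrative sketch of the sigmoid formula in NumPy (not the OpenVINO kernel):

```python
import numpy as np

# Sketch of the formula above: sigmoid(x) = 1 / (1 + e^{-x})
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([0.0])))  # [0.5] -- the curve's midpoint
```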
**Inputs**:
@@ -9,9 +9,9 @@
**Detailed description**: For each element of the input tensor, the corresponding element of the output tensor is calculated with the following formula:

\f[
SoftPlus(x) = ln(e^{x} + 1.0)
\f]
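A sketch of the SoftPlus formula in NumPy (illustrative; `softplus` is my name):

```python
import numpy as np

# Sketch of the formula above: SoftPlus(x) = ln(e^x + 1).
# np.logaddexp(x, 0) computes ln(e^x + e^0) without overflowing for large x.
def softplus(x):
    return np.logaddexp(x, 0.0)

print(softplus(np.array([0.0])))  # [0.69314718], i.e. ln(2)
```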
**Attributes**: *SoftPlus* operation has no attributes.
@@ -9,9 +9,9 @@
**Detailed description**: For each element of the input tensor, the corresponding element of the output tensor is calculated with the following formula:

\f[
Swish(x) = x / (1.0 + e^{-(beta * x)})
\f]
The Swish operation is introduced in the [article](https://arxiv.org/pdf/1710.05941.pdf).
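A NumPy sketch of the Swish formula (illustrative only; the default `beta=1.0` here is my assumption, under which Swish reduces to `x * sigmoid(x)`, also known as SiLU):

```python
import numpy as np

# Sketch of the formula above: Swish(x) = x / (1 + e^{-beta * x})
def swish(x, beta=1.0):
    return x / (1.0 + np.exp(-beta * x))

print(swish(np.array([0.0, 1.0])))
```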
@@ -78,9 +78,9 @@
**Mathematical Formulation**

\f[
output_{j} = \frac{\sum_{i = 0}^{n}x_{i}}{n}
\f]
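The averaging formula can be sketched for a 1D sliding window in NumPy; this is an illustrative sketch of the per-window mean, not the full pooling op (no strides or padding, and `avg_pool_1d` is my name):

```python
import numpy as np

# Sketch of the formula above: each output value is the mean of the
# n values x_i falling inside its window.
def avg_pool_1d(x, window):
    return np.array([x[i:i + window].mean() for i in range(len(x) - window + 1)])

x = np.array([1.0, 2.0, 3.0, 4.0])
print(avg_pool_1d(x, 2))  # [1.5 2.5 3.5]
```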
**Example**
@@ -70,9 +70,9 @@
**Mathematical Formulation**

\f[
output_{j} = MAX\{ x_{0}, ... x_{i}\}
\f]
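The max formula admits the same 1D sliding-window sketch as the average above; illustrative only (no strides or padding, and `max_pool_1d` is my name):

```python
import numpy as np

# Sketch of the formula above: each output value is the maximum of the
# values x_0 ... x_i falling inside its window.
def max_pool_1d(x, window):
    return np.array([x[i:i + window].max() for i in range(len(x) - window + 1)])

x = np.array([1.0, 3.0, 2.0, 5.0])
print(max_pool_1d(x, 2))  # [3. 3. 5.]
```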
**Example**