Elu
Versioned name: Elu-1
Category: Activation function
Short description: Exponential linear unit element-wise activation function.
Detailed Description
For each element of the input tensor, Elu calculates the corresponding element of the output tensor with the following formula: \f[ elu(x) = \left\{\begin{array}{ll} \alpha (e^{x} - 1) & \quad \mbox{if } x < 0 \\ x & \quad \mbox{if } x \geq 0 \end{array}\right. \f]
Attributes
- alpha
  - Description: scale for the negative factor
  - Range of values: arbitrary floating-point number
  - Type: float
  - Default value: none
  - Required: yes
Inputs:
- 1: Input tensor x of any floating-point type. Required.
Outputs:
- 1: Result of the Elu function applied to the input tensor x: a floating-point tensor with the same shape and type as the input tensor.
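The formula above can be sketched as a reference implementation in NumPy (this is an illustrative sketch of the element-wise semantics, not the Inference Engine implementation; the function name `elu` and the `alpha` keyword default are assumptions for the example):

```python
import numpy as np

def elu(x, alpha=1.0):
    # Element-wise Elu: alpha * (e^x - 1) for x < 0, identity for x >= 0.
    # np.expm1(x) computes e^x - 1 with better precision near zero.
    x = np.asarray(x, dtype=np.float64)
    return np.where(x < 0, alpha * np.expm1(x), x)
```

For example, `elu(np.array([-1.0, 0.0, 2.0]))` leaves the non-negative elements unchanged and maps `-1.0` to `alpha * (e^{-1} - 1)`, and the output shape and dtype match the input, as the Outputs section requires.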