Mish
Versioned name: Mish-4
Category: Activation
Short description: Mish is a Self Regularized Non-Monotonic Neural Activation Function.
Detailed description: Mish is a self regularized non-monotonic neural activation function proposed in the article "Mish: A Self Regularized Non-Monotonic Neural Activation Function" (arXiv:1908.08681).
Attributes: the operation has no attributes.
Inputs:
- 1: Input tensor x of any floating point type T. Required.
Outputs:
- 1: Floating point tensor with shape and type matching the input tensor. Required.
Types
- T: any floating point type.
Mathematical Formulation
For each element of the input tensor, the operation calculates the corresponding element of the output tensor with the following formula: \f[ Mish(x) = x \cdot tanh(ln(1.0 + e^{x})) \f]
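The formula can be checked against a short reference implementation. Below is a minimal NumPy sketch (not part of the operation specification; the helper name mish is illustrative). It applies the function elementwise and evaluates the inner ln(1.0 + e^x) term via np.logaddexp for numerical stability with large inputs.

import numpy as np

def mish(x: np.ndarray) -> np.ndarray:
    # Elementwise Mish: x * tanh(ln(1.0 + e^x)).
    # np.logaddexp(0, x) computes ln(1 + e^x) (softplus) without overflow.
    return x * np.tanh(np.logaddexp(0, x))

# Output shape and type match the input tensor, as the specification requires.
x = np.linspace(-5.0, 5.0, 7, dtype=np.float32)
y = mish(x)
assert y.shape == x.shape and y.dtype == x.dtype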
Examples
<layer ... type="Mish">
    <input>
        <port id="0">
            <dim>256</dim>
            <dim>56</dim>
        </port>
    </input>
    <output>
        <port id="3">
            <dim>256</dim>
            <dim>56</dim>
        </port>
    </output>
</layer>