## GELU - Gaussian Error Linear Unit

**Versioned name**: *Gelu-2*

**Category**: *Activation*

**Short description**: *Gelu* is a Gaussian error linear unit element-wise activation function.

**Detailed description**: For details, see the [Gaussian Error Linear Units (GELUs)](https://arxiv.org/abs/1606.08415) paper.

**Attributes**: *Gelu* operation has no attributes.

**Mathematical formulation**:

\f[ Gelu(x) = x \cdot \Phi(x) \f]

where Φ(x) is the cumulative distribution function of the standard Gaussian distribution. The following equivalent combination is recognized and fused into a single *Gelu* op:

\f[ Gelu(x) = 0.5 \cdot x \cdot (1.0 + erf(x / \sqrt{2})) \f]

Similarly, the following *Gelu* approximation (typical for TensorFlow*) is recognized and fused into a single *Gelu* op:

\f[ Gelu(x) \approx 0.5 \cdot x \cdot (1.0 + tanh(\sqrt{2.0 / \pi} \cdot (x + 0.044715 \cdot x^{3}))) \f]
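As an illustration (not part of the specification), the following minimal NumPy/SciPy sketch evaluates both the exact erf-based formulation and the tanh approximation, and shows how closely they agree:

```python
import numpy as np
from scipy.special import erf

def gelu_exact(x):
    # Exact formulation: Gelu(x) = 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + erf(x / np.sqrt(2.0)))

def gelu_tanh(x):
    # Tanh-based approximation, as in TensorFlow-style subgraphs
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

x = np.linspace(-4.0, 4.0, 101)
# The two forms agree closely; the maximum gap on this range is on the order of 1e-3.
print(np.max(np.abs(gelu_exact(x) - gelu_tanh(x))))
```

Because both subgraph patterns compute the same element-wise function, a runtime that recognizes either one can replace it with a single *Gelu* op.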

**Inputs**:

* **1**: Multidimensional input tensor. Required.

**Example**

```xml
<layer ... type="Gelu">
    <input>
        <port id="0">
            <dim>1</dim>
            <dim>128</dim>
        </port>
    </input>
    <output>
        <port id="1">
            <dim>1</dim>
            <dim>128</dim>
        </port>
    </output>
</layer>
```