diff --git a/docs/IE_DG/Extensibility_DG/VPU_Kernel.md b/docs/IE_DG/Extensibility_DG/VPU_Kernel.md
index b4e9fb875ab..0b57e459da8 100644
--- a/docs/IE_DG/Extensibility_DG/VPU_Kernel.md
+++ b/docs/IE_DG/Extensibility_DG/VPU_Kernel.md
@@ -28,7 +28,9 @@ The OpenCL toolchain for the Intel® Neural Compute Stick 2 supports offline com
    * `SHAVE_MOVIASM_DIR=/deployment_tools/tools/cl_compiler/bin/`
 2. Run the compilation with the command below. You should use `--strip-binary-header` to make an OpenCL runtime-agnostic binary runnable with the Inference Engine.
 ```bash
+source /bin/setupvars.sh
 cd /deployment_tools/tools/cl_compiler/bin
+source cltools_setenv.sh
 ./clc --strip-binary-header custom_layer.cl -o custom_layer.bin
 ```