fix the link to VPU extensibility article (#11956)
parent 5008ee8090
commit 079d1d6e74
@@ -9,7 +9,7 @@
    openvino_docs_Extensibility_UG_add_openvino_ops
    openvino_docs_Extensibility_UG_Frontend_Extensions
    openvino_docs_Extensibility_UG_GPU
-   openvino_docs_IE_DG_Extensibility_DG_VPU_Kernel
+   openvino_docs_Extensibility_UG_VPU_Kernel
    openvino_docs_MO_DG_prepare_model_customize_model_optimizer_Customize_Model_Optimizer

 @endsphinxdirective
@@ -1,4 +1,4 @@
-# How to Implement Custom Layers for VPU (Intel® Neural Compute Stick 2) {#openvino_docs_IE_DG_Extensibility_DG_VPU_Kernel}
+# How to Implement Custom Layers for VPU (Intel® Neural Compute Stick 2) {#openvino_docs_Extensibility_UG_VPU_Kernel}

 To enable operations not supported by OpenVINO™ out of the box, you need a custom extension for Model Optimizer, a custom nGraph operation set, and a custom kernel for the device you will target. This page describes custom kernel support for one VPU, the Intel® Neural Compute Stick 2 device, which uses the MYRIAD device plugin.
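The context line in the second hunk mentions a custom nGraph operation set as one of the pieces needed for unsupported operations. As a minimal sketch only, assuming the OpenVINO 2022.x C++ extension API that this documentation reorganization targets, a custom operation can be declared as below; the class name CustomReLU and the opset string "custom_opset" are hypothetical placeholders, and this is the operation definition only, not the MYRIAD kernel itself.

#include <memory>
#include <openvino/op/op.hpp>

// Hypothetical custom operation; names are illustrative, not from the commit.
class CustomReLU : public ov::op::Op {
public:
    OPENVINO_OP("CustomReLU", "custom_opset");

    CustomReLU() = default;
    explicit CustomReLU(const ov::Output<ov::Node>& arg) : Op({arg}) {
        constructor_validate_and_infer_types();
    }

    void validate_and_infer_types() override {
        // Output element type and shape simply follow the single input.
        set_output_type(0, get_input_element_type(0), get_input_partial_shape(0));
    }

    std::shared_ptr<ov::Node> clone_with_new_inputs(const ov::OutputVector& new_args) const override {
        OPENVINO_ASSERT(new_args.size() == 1, "CustomReLU expects exactly one input");
        return std::make_shared<CustomReLU>(new_args.at(0));
    }

    bool visit_attributes(ov::AttributeVisitor& visitor) override {
        return true;  // no attributes to serialize
    }
};

A device-specific kernel (for example, an OpenCL kernel for the MYRIAD plugin) would still have to be registered separately, as described on the page whose anchor this commit fixes.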