Updated documentation for compile_tool (#11049)
@@ -8,6 +8,7 @@
 openvino_docs_OV_Runtime_UG_Model_Representation
 openvino_docs_OV_Runtime_UG_Infer_request
+openvino_docs_OV_Runtime_UG_Python_API_exclusives

 @endsphinxdirective
@@ -19,7 +19,6 @@
 openvino_docs_OV_UG_Performance_Hints
 openvino_docs_OV_UG_Automatic_Batching
 openvino_docs_IE_DG_network_state_intro
-openvino_docs_OV_Runtime_UG_Python_API_exclusives

 @endsphinxdirective
@@ -40,17 +40,17 @@ Devices similar to the ones we have used for benchmarking can be accessed using

 ## Features support matrix
 The table below demonstrates support of key features by OpenVINO device plugins.

-| Capability | [CPU](CPU.md) | [GPU](GPU.md) | [GNA](GNA.md) | [VPU](VPU.md) | [Arm® CPU](ARM_CPU.md) |
-| ---------- | --- | --- | --- | --- | --- |
-| [Heterogeneous execution](../hetero_execution.md) | Yes | Yes | No | ? | Yes |
-| [Multi-device execution](../multi_device.md) | Yes | Yes | Partial | ? | Yes |
-| [Automatic batching](../automatic_batching.md) | No | Yes | No | ? | No |
-| [Multi-stream execution](@ref openvino_docs_optimization_guide_dldt_optimization_guide) | Yes | Yes | No | ? | Yes |
-| [Models caching](../Model_caching_overview.md) | Yes | Partial | Yes | ? | No |
-| [Dynamic shapes](../ov_dynamic_shapes.md) | Yes | Partial | No | ? | No |
-| Import/Export | Yes | No | Yes | ? | No |
-| [Preprocessing acceleration](../preprocessing_overview.md) | Yes | Yes | No | ? | Partial |
-| [Stateful models](../network_state_intro.md) | Yes | No | Yes | ? | No |
-| [Extensibility](@ref openvino_docs_Extensibility_UG_Intro) | Yes | Yes | No | ? | No |
+| Capability | [CPU](CPU.md) | [GPU](GPU.md) | [GNA](GNA.md) | [Arm® CPU](ARM_CPU.md) |
+| ---------- | --- | --- | --- | --- |
+| [Heterogeneous execution](../hetero_execution.md) | Yes | Yes | No | Yes |
+| [Multi-device execution](../multi_device.md) | Yes | Yes | Partial | Yes |
+| [Automatic batching](../automatic_batching.md) | No | Yes | No | No |
+| [Multi-stream execution](../../optimization_guide/dldt_deployment_optimization_tput.md) | Yes | Yes | No | Yes |
+| [Models caching](../Model_caching_overview.md) | Yes | Partial | Yes | No |
+| [Dynamic shapes](../ov_dynamic_shapes.md) | Yes | Partial | No | No |
+| [Import/Export](../../../tools/compile_tool/README.md) | Yes | No | Yes | No |
+| [Preprocessing acceleration](../preprocessing_overview.md) | Yes | Yes | No | Partial |
+| [Stateful models](../network_state_intro.md) | Yes | No | Yes | No |
+| [Extensibility](@ref openvino_docs_Extensibility_UG_Intro) | Yes | Yes | No | No |

 For more details on plugin-specific feature limitations, see the corresponding plugin pages.
@@ -1,19 +1,23 @@
 # Compile Tool {#openvino_inference_engine_tools_compile_tool_README}

-Compile tool is a C++ application that enables you to compile a network for inference on a specific device and export it to a binary file.
-With the Compile Tool, you can compile a network using supported Inference Engine plugins on a machine that doesn't have the physical device connected and then transfer a generated file to any machine with the target inference device available.
+Compile tool is a C++ application that enables you to compile a model for inference on a specific device and export the compiled representation to a binary file.
+With the Compile Tool, you can compile a model using supported OpenVINO Runtime devices on a machine that doesn't have the physical device connected and then transfer the generated file to any machine with the target inference device available. See the [Features support matrix](../../docs/OV_Runtime_UG/supported_plugins/Device_Plugins.md) to understand which devices support import/export functionality.

-The tool compiles networks for the following target devices using corresponding Inference Engine plugins:
+The tool compiles networks for the following target devices using corresponding OpenVINO Runtime plugins:
 * Intel® Neural Compute Stick 2 (MYRIAD plugin)

 The tool is delivered as an executable file that can be run on both Linux* and Windows*.
 The tool is located in the `<INSTALLROOT>/tools/compile_tool` directory.

-The workflow of the Compile tool is as follows:
+## Workflow of the Compile tool

-1. First, the application reads command-line parameters and loads a network to the Inference Engine device.
-2. The application exports a blob with the compiled network and writes it to the output file.
+1. First, the application reads command-line parameters and loads a model to the OpenVINO Runtime device.
+2. Then the application exports a blob with the compiled model and writes it to the output file.

+Also, the compile_tool supports the following capabilities:
+- Embedding [layout](../../docs/OV_Runtime_UG/layout_overview.md) and precision conversions (see [Optimize Preprocessing](../../docs/OV_Runtime_UG/preprocessing_overview.md)). To compile the model with advanced preprocessing capabilities, refer to [Use Case - Integrate and Save Preprocessing Steps Into IR](../../docs/OV_Runtime_UG/preprocessing_usecase_save.md), which shows how to have all the preprocessing in the compiled blob.
+- Compiling blobs for the OpenVINO Runtime API 2.0 by default, or for the Inference Engine API with the explicit option `-ov_api_1_0`
+- Accepting device-specific options for customizing the compilation process

 ## Run the Compile Tool
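The compile-on-one-machine, run-on-another workflow described above can be sketched as a shell session. This is a sketch, not verbatim documentation: the `-m`, `-d`, and `-o` flags and all file paths are illustrative assumptions, so confirm the exact options with `compile_tool -h` for your release. The actual compilation requires an OpenVINO installation and is therefore shown only as a comment; the runnable part simulates exporting and transferring the blob.

```shell
# Hypothetical invocation on the build host (no MYRIAD device attached);
# the flags are assumptions -- confirm with `compile_tool -h`:
#   ./compile_tool -m model.xml -d MYRIAD -o model.blob

# Simulate the export and transfer with a stand-in blob file:
mkdir -p build_host target_device
printf 'compiled-blob-bytes' > build_host/model.blob

# "Transfer" the blob to the target machine (scp/rsync in practice):
cp build_host/model.blob target_device/model.blob

# The blob is opaque binary data; comparing checksums confirms an
# intact transfer before attempting to import it on the target:
src_sum=$(cksum < build_host/model.blob)
dst_sum=$(cksum < target_device/model.blob)
echo "$src_sum"
echo "$dst_sum"
```

The key point is that the blob must arrive byte-identical; any tool that preserves binary content (scp, rsync, a container image layer) works for the transfer step.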
@@ -85,5 +89,5 @@ To import a blob with the network from a generated file into your application, u
 ```cpp
 ov::Core ie;
 std::ifstream file{"model_name.blob"};
-ov::CompiledModel compiled_model = ie.import_model(file, "MYRIAD", {});
+ov::CompiledModel compiled_model = ie.import_model(file, "MYRIAD");
 ```