Introduction to Inference Engine Device Query API
This section provides a high-level description of how to query different device properties and configuration values. Refer to the Hello Query Device Sample sources and the Multi-Device Plugin guide for examples of using the Inference Engine Query API in user applications.
Using the Inference Engine Query API in Your Code
The Inference Engine Core class provides the following API to query device information, set or get different device configuration properties:
- InferenceEngine::Core::GetAvailableDevices - Provides a list of available devices. If there is more than one instance of a specific device, the devices are enumerated with a .suffix, where suffix is a unique string identifier. The device name can be passed to all methods of the InferenceEngine::Core class that work with devices, for example InferenceEngine::Core::LoadNetwork.
- InferenceEngine::Core::GetMetric - Provides information about a specific device.
- InferenceEngine::Core::GetConfig - Gets the current value of a specific configuration key.
- InferenceEngine::Core::SetConfig - Sets a new value for the configuration key.
The InferenceEngine::ExecutableNetwork class is also extended to support the Query API:
- InferenceEngine::ExecutableNetwork::GetMetric
- InferenceEngine::ExecutableNetwork::GetConfig
- InferenceEngine::ExecutableNetwork::SetConfig
Query API in the Core Class
GetAvailableDevices
InferenceEngine::Core core;
std::vector<std::string> availableDevices = core.GetAvailableDevices();
The function returns a list of available devices, for example:
MYRIAD.1.2-ma2480
MYRIAD.1.4-ma2480
FPGA.0
FPGA.1
CPU
GPU
...
Each device name can then be passed to:
- InferenceEngine::Core::LoadNetwork to load the network to a specific device.
- InferenceEngine::Core::GetMetric to get common or device-specific metrics.
- All other methods of the Core class that accept deviceName.
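For instance, an enumerated device name from the list above can be used directly in InferenceEngine::Core::LoadNetwork. A minimal sketch, assuming a network has already been read into a CNNNetwork object (the model path and the MYRIAD device suffix are illustrative):

```cpp
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;

    // Assumed: a model read earlier; the path here is a placeholder.
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");

    // Load the network onto one specific enumerated device.
    // "MYRIAD.1.2-ma2480" is taken from the example device list above.
    auto exeNetwork = core.LoadNetwork(network, "MYRIAD.1.2-ma2480");
    return 0;
}
```

The same exeNetwork object then exposes the ExecutableNetwork Query API methods described below.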
GetConfig()
The code below demonstrates how to check whether the HETERO device dumps .dot files with split graphs during the split stage:
InferenceEngine::Core core;
bool dumpDotFile = core.GetConfig("HETERO", HETERO_CONFIG_KEY(DUMP_GRAPH_DOT)).as<bool>();
For documentation about common configuration keys, refer to ie_plugin_config.hpp. Device-specific configuration keys can be found in the corresponding plugin folders.
GetMetric()
To extract device properties such as available devices, device name, supported configuration keys, and others, use the InferenceEngine::Core::GetMetric method:
InferenceEngine::Core core;
std::string cpuDeviceName = core.GetMetric("CPU", METRIC_KEY(FULL_DEVICE_NAME)).as<std::string>();
A returned value looks as follows: Intel(R) Core(TM) i7-8700 CPU @ 3.20GHz.
Note: All metrics have a specific type, which is specified during metric instantiation. The list of common device-agnostic metrics can be found in ie_plugin_config.hpp. Device-specific metrics (for example, for HDDL or MYRIAD devices) can be found in the corresponding plugin folders.
Query API in the ExecutableNetwork Class
GetMetric()
The method is used to get an executable-network-specific metric, such as METRIC_KEY(OPTIMAL_NUMBER_OF_INFER_REQUESTS):
InferenceEngine::Core core;
auto exeNetwork = core.LoadNetwork(network, "CPU");
auto nireq = exeNetwork.GetMetric(METRIC_KEY(OPTIMAL_NUMBER_OF_INFER_REQUESTS)).as<unsigned int>();
Or the current temperature of a MYRIAD device:
InferenceEngine::Core core;
auto exeNetwork = core.LoadNetwork(network, "MYRIAD");
float temperature = exeNetwork.GetMetric(METRIC_KEY(DEVICE_THERMAL)).as<float>();
GetConfig()
The method is used to get information about configuration values the executable network has been created with:
InferenceEngine::Core core;
auto exeNetwork = core.LoadNetwork(network, "CPU");
auto ncores = exeNetwork.GetConfig(PluginConfigParams::KEY_CPU_THREADS_NUM).as<std::string>();
SetConfig()
The only device that supports this method is the Multi-Device plugin.
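As a sketch of how this might look, assuming a network loaded on the MULTI device with CPU and GPU available (the priority key comes from the Multi-Device plugin's multi_device_config.hpp header; the device pair is illustrative):

```cpp
#include <inference_engine.hpp>
#include <multi-device/multi_device_config.hpp>

int main() {
    InferenceEngine::Core core;

    // Assumed: a model read earlier; the path here is a placeholder.
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");

    // Create the executable network on the Multi-Device plugin.
    auto exeNetwork = core.LoadNetwork(network, "MULTI:CPU,GPU");

    // Change the device priorities at run time: new infer requests
    // will prefer GPU over CPU from this point on.
    exeNetwork.SetConfig(
        {{ InferenceEngine::MultiDeviceConfigParams::KEY_MULTI_DEVICE_PRIORITIES,
           "GPU,CPU" }});
    return 0;
}
```

Changing the priority list this way affects scheduling of subsequent inference requests without reloading the network.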