[IE Python Sample] Update docs (#9807)

* update hello_classification readme
* update classification_async readme
* update hello_query_device readme
* Fix hello_classification launch line
* Update hello_reshape_ssd readme
* update speech sample docs
* update ngraph sample docs
* fix launch command
* refactor py ngraph imports
* Replace `network` with `model`
* update example section with openvino-dev
* Update samples/python/classification_sample_async/README.md
* Update samples/python/hello_classification/README.md
* Update samples/python/hello_reshape_ssd/README.md
* Update samples/python/ngraph_function_creation_sample/README.md
* Replace `Inference Engine` with `OpenVINO`
* fix ngraph ref
* Replace `Inference Engine` by `OpenVINO™ Runtime`
* Fix IR mentions

Co-authored-by: Vladimir Dudnik <vladimir.dudnik@intel.com>
Co-authored-by: Anastasiya Ageeva <anastasiya.ageeva@intel.com>
Co-authored-by: Andrey Zaytsev <andrey.zaytsev@intel.com>
# Hello Query Device Python* Sample {#openvino_inference_engine_ie_bridges_python_sample_hello_query_device_README}

This sample demonstrates how to show OpenVINO™ Runtime devices and print their metrics and default configuration values using the [Query Device API feature](../../../docs/OV_Runtime_UG/InferenceEngine_QueryAPI.md).

The following Python API is used in the application:

| Feature      | API                     | Description           |
| :----------- | :---------------------- | :-------------------- |
| Basic        | [openvino.runtime.Core] | Common API            |
| Query Device | [openvino.runtime.Core.available_devices], [openvino.runtime.Core.get_metric], [openvino.runtime.Core.get_config] | Get device properties |
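The APIs in the table can be combined in a few lines. The sketch below is illustrative only: `supported_config_keys` is a hypothetical helper, and it assumes a `Core`-like object with the `available_devices` property and `get_metric` method listed above.

```python
def supported_config_keys(core) -> dict:
    """Map each available device to its supported configuration keys.

    `core` is expected to be an openvino.runtime.Core instance (or any object
    exposing the same `available_devices` / `get_metric` surface).
    """
    return {
        device: list(core.get_metric(device, 'SUPPORTED_CONFIG_KEYS'))
        for device in core.available_devices
    }

# Usage (requires the openvino package):
#   from openvino.runtime import Core
#   print(supported_config_keys(Core()))
```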

| Options                    | Values |
| :------------------------- | :----- |
| Supported devices          | [All](../../../docs/OV_Runtime_UG/supported_plugins/Supported_Devices.md) |
| Other language realization | [C++](../../../samples/cpp/hello_query_device/README.md) |

## How It Works

The sample queries all available OpenVINO™ Runtime devices and prints their supported metrics and plugin configuration parameters.

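The query loop above can be sketched as follows. This is a hedged sketch, not the sample's exact code: `param_to_string` and `print_device_metrics` are illustrative names, and an `openvino.runtime.Core` instance is only needed when the function is actually called.

```python
def param_to_string(value) -> str:
    """Render a metric value the way the sample output formats it."""
    if isinstance(value, (list, tuple)):
        return ', '.join(str(v) for v in value)
    return str(value)

def print_device_metrics(core) -> None:
    """Print each device's supported metrics and their current values."""
    print('[ INFO ] Available devices:')
    for device in core.available_devices:
        print(f'[ INFO ] {device} :')
        print('[ INFO ] SUPPORTED_METRICS:')
        for name in core.get_metric(device, 'SUPPORTED_METRICS'):
            value = core.get_metric(device, name)
            print(f'[ INFO ] {name}: {param_to_string(value)}')

# Usage (requires the openvino package):
#   from openvino.runtime import Core
#   print_device_metrics(Core())
```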
## Running

The sample has no command-line parameters. To see the report, run the following command:

```
python hello_query_device.py
```

## Sample Output

The application prints all available devices with their supported metrics and default values for configuration parameters. (Some lines are not shown due to length.)

For example:

```
[ INFO ] Creating Inference Engine
[ INFO ] Available devices:
[ INFO ] CPU :
[ INFO ] SUPPORTED_METRICS:
[ INFO ] OPTIMIZATION_CAPABILITIES: FP32, FP16, INT8, BIN
[ INFO ] RANGE_FOR_ASYNC_INFER_REQUESTS: 1, 1, 1
[ INFO ] RANGE_FOR_STREAMS: 1, 8
[ INFO ] IMPORT_EXPORT_SUPPORT: True
[ INFO ]
[ INFO ] SUPPORTED_CONFIG_KEYS (default values):
[ INFO ] CACHE_DIR:
[ INFO ] CPU_BIND_THREAD: NO
[ INFO ] CPU_THREADS_NUM: 0
[ INFO ] CPU_THROUGHPUT_STREAMS: 1
[ INFO ] DUMP_EXEC_GRAPH_AS_DOT:
[ INFO ] DYN_BATCH_LIMIT: 0
[ INFO ] ENFORCE_BF16: NO
[ INFO ] EXCLUSIVE_ASYNC_REQUESTS: NO
[ INFO ] PERFORMANCE_HINT:
[ INFO ] PERFORMANCE_HINT_NUM_REQUESTS: 0
[ INFO ] PERF_COUNT: NO
[ INFO ]
[ INFO ] GNA :
[ INFO ] AVAILABLE_DEVICES: GNA_SW
[ INFO ] OPTIMAL_NUMBER_OF_INFER_REQUESTS: 1
[ INFO ] FULL_DEVICE_NAME: GNA_SW
[ INFO ] GNA_LIBRARY_FULL_VERSION: 3.0.0.1455
[ INFO ] IMPORT_EXPORT_SUPPORT: True
[ INFO ]
[ INFO ] SUPPORTED_CONFIG_KEYS (default values):
[ INFO ] EXCLUSIVE_ASYNC_REQUESTS: NO
[ INFO ] GNA_COMPACT_MODE: YES
[ INFO ] GNA_COMPILE_TARGET:
[ INFO ] GNA_DEVICE_MODE: GNA_SW_EXACT
[ INFO ] GNA_EXEC_TARGET:
[ INFO ] GNA_FIRMWARE_MODEL_IMAGE:
[ INFO ] GNA_FIRMWARE_MODEL_IMAGE_GENERATION:
[ INFO ] GNA_LIB_N_THREADS: 1
[ INFO ] GNA_PRECISION: I16
[ INFO ] GNA_PWL_MAX_ERROR_PERCENT: 1.000000
[ INFO ] GNA_PWL_UNIFORM_DESIGN: NO
[ INFO ] GNA_SCALE_FACTOR_0: 1.000000
[ INFO ] LOG_LEVEL: LOG_NONE
[ INFO ] PERF_COUNT: NO
[ INFO ] SINGLE_THREAD: YES
[ INFO ]
[ INFO ] GPU :
[ INFO ] SUPPORTED_METRICS:
[ INFO ] AVAILABLE_DEVICES: 0
[ INFO ] FULL_DEVICE_NAME: Intel(R) UHD Graphics 620 (iGPU)
[ INFO ] OPTIMIZATION_CAPABILITIES: FP32, BIN, FP16
[ INFO ] RANGE_FOR_ASYNC_INFER_REQUESTS: 1, 2, 1
[ INFO ] RANGE_FOR_STREAMS: 1, 2
[ INFO ]
[ INFO ] SUPPORTED_CONFIG_KEYS (default values):
[ INFO ] CACHE_DIR:
[ INFO ] CLDNN_ENABLE_FP16_FOR_QUANTIZED_MODELS: YES
[ INFO ] CLDNN_GRAPH_DUMPS_DIR:
[ INFO ] CLDNN_MEM_POOL: YES
[ INFO ] CLDNN_NV12_TWO_INPUTS: NO
[ INFO ] CLDNN_PLUGIN_PRIORITY: 0
[ INFO ] CLDNN_PLUGIN_THROTTLE: 0
[ INFO ] CLDNN_SOURCES_DUMPS_DIR:
[ INFO ] CONFIG_FILE:
[ INFO ] DEVICE_ID:
[ INFO ] DUMP_KERNELS: NO
[ INFO ] DYN_BATCH_ENABLED: NO
[ INFO ] EXCLUSIVE_ASYNC_REQUESTS: NO
[ INFO ] GPU_THROUGHPUT_STREAMS: 1
[ INFO ] PERF_COUNT: NO
[ INFO ] TUNING_FILE:
[ INFO ] TUNING_MODE: TUNING_DISABLED
[ INFO ]
```

## See Also

- [Using OpenVINO™ Toolkit Samples](../../../docs/OV_Runtime_UG/Samples_Overview.md)

<!-- [openvino.runtime.Core]:
[openvino.runtime.Core.available_devices]:
[openvino.runtime.Core.get_metric]:
[openvino.runtime.Core.get_config]: -->