
Benchmark Python Tool

This page demonstrates how to use the Benchmark Python Tool to estimate deep learning inference performance on supported devices.

Note: This page describes the usage of the Python implementation of the Benchmark Tool. For the C++ implementation, refer to the Benchmark C++ Tool page. The Python version is recommended for benchmarking models that will be used in Python applications, and the C++ version for models that will be used in C++ applications. Both tools have a similar command interface and backend.

For more detailed information on how this tool works, refer to the dedicated article.

Requirements

The Python benchmark_app is automatically installed when you install OpenVINO Developer Tools via PyPI. Before running benchmark_app, make sure the openvino_env virtual environment is activated, and navigate to the directory where your model is located.
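Once the environment is active, benchmark_app is invoked from the command line with the model path and target device. As a minimal sketch, the helper below assembles such a command line from Python; the `-m` (model), `-d` (device), and `-t` (duration in seconds) flags are real benchmark_app options, while the `build_benchmark_cmd` helper itself is purely illustrative:

```python
import shlex

def build_benchmark_cmd(model_path, device="CPU", time_s=None):
    """Assemble a benchmark_app command line as a single shell-safe string.

    Illustrative helper only; benchmark_app, -m, -d, and -t are the
    tool's actual CLI names, but this wrapper is not part of it.
    """
    cmd = ["benchmark_app", "-m", model_path, "-d", device]
    if time_s is not None:
        cmd += ["-t", str(time_s)]  # -t: how long to benchmark, in seconds
    return shlex.join(cmd)

print(build_benchmark_cmd("model.xml"))
print(build_benchmark_cmd("model.onnx", device="GPU", time_s=30))
```

The resulting string (e.g. `benchmark_app -m model.xml -d CPU`) is what you would run in the activated openvino_env shell.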

The benchmarking application works with models in the OpenVINO IR (model.xml and model.bin) and ONNX (model.onnx) formats. Make sure to convert your models if necessary.