DOCS shift to rst - Benchmark Samples and Tools (#16566)

This commit is contained in:
Sebastian Golebiewski
2023-03-27 18:29:05 +02:00
committed by GitHub
parent 5c5a29d095
commit 1ca94326cb
5 changed files with 796 additions and 566 deletions


# Bert Benchmark Python* Sample {#openvino_inference_engine_ie_bridges_python_sample_bert_benchmark_README}

@sphinxdirective

This sample demonstrates how to estimate performance of a Bert model using the Asynchronous Inference Request API. Unlike :doc:`demos <omz_demos>`, this sample doesn't have configurable command line arguments. Feel free to modify the sample's source code to try out different options.

The following Python API is used in the application:

+--------------------------------+---------------------------------------------------+----------------------------------------------+
| Feature                        | API                                               | Description                                  |
+================================+===================================================+==============================================+
| OpenVINO Runtime Version       | ``openvino.runtime.get_version``                  | Get OpenVINO API version.                    |
+--------------------------------+---------------------------------------------------+----------------------------------------------+
| Basic Infer Flow               | ``openvino.runtime.Core``,                        | Common API to do inference: compile a model. |
|                                | ``openvino.runtime.Core.compile_model``           |                                              |
+--------------------------------+---------------------------------------------------+----------------------------------------------+
| Asynchronous Infer             | ``openvino.runtime.AsyncInferQueue``,             | Do asynchronous inference.                   |
|                                | ``openvino.runtime.AsyncInferQueue.start_async``, |                                              |
|                                | ``openvino.runtime.AsyncInferQueue.wait_all``     |                                              |
+--------------------------------+---------------------------------------------------+----------------------------------------------+
| Model Operations               | ``openvino.runtime.CompiledModel.inputs``         | Get inputs of a model.                       |
+--------------------------------+---------------------------------------------------+----------------------------------------------+

How It Works
####################

The sample downloads a model and a tokenizer, exports the model to ONNX, reads the exported model and reshapes it to enforce dynamic input shapes, compiles the resulting model, downloads a dataset, and runs benchmarking on the dataset.

You can find a detailed description of each sample step in the :doc:`Integration Steps <openvino_docs_OV_UG_Integrate_OV_with_your_application>` section of the "Integrate OpenVINO™ Runtime with Your Application" guide.

Running
####################

Install the ``openvino`` Python package:

.. code-block:: sh

   python -m pip install openvino

Install packages from ``requirements.txt``:

.. code-block:: sh

   python -m pip install -r requirements.txt

Run the sample:

.. code-block:: sh

   python bert_benchmark.py

Sample Output
####################

The sample outputs how long it takes to process a dataset.
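That timing boils down to wrapping the benchmarking loop in a wall-clock measurement. Below is a framework-free sketch of the pattern; every name in it is hypothetical, and the real sample times OpenVINO asynchronous requests rather than a thread pool.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_infer(sample):
    # Stand-in for one asynchronous inference request on one sample.
    time.sleep(0.001)
    return len(sample)

dataset = [[0] * 8 for _ in range(100)]  # hypothetical tokenized dataset

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:  # plays the role of the job queue
    results = list(pool.map(fake_infer, dataset))
elapsed = time.perf_counter() - start
print(f"Processed {len(dataset)} samples in {elapsed:.2f} s")
```

Measuring around the whole loop, rather than per request, is what makes the number reflect throughput with requests overlapping in flight.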

See Also
####################

* :doc:`Integrate the OpenVINO™ Runtime with Your Application <openvino_docs_OV_UG_Integrate_OV_with_your_application>`
* :doc:`Using OpenVINO Samples <openvino_docs_OV_UG_Samples_Overview>`
* :doc:`Model Downloader <omz_tools_downloader>`
* :doc:`Model Optimizer <openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide>`

@endsphinxdirective