# Bert Benchmark Python* Sample {#openvino_inference_engine_ie_bridges_python_sample_bert_benchmark_README}
This sample demonstrates how to estimate performance of a Bert model using Asynchronous Inference Request API. Unlike [demos](@ref omz_demos), this sample does not have configurable command-line arguments. Feel free to modify the sample's source code to try out different options.
The following Python\* API is used in the application:
| Feature | API | Description |
| :--- | :--- | :--- |
| OpenVINO Runtime Version | [openvino.runtime.get_version] | Get OpenVINO API version |
| Basic Infer Flow | [openvino.runtime.Core], [openvino.runtime.Core.compile_model] | Common API to do inference: compile a model |
| Asynchronous Infer | [openvino.runtime.AsyncInferQueue], [openvino.runtime.AsyncInferQueue.start_async], [openvino.runtime.AsyncInferQueue.wait_all] | Do asynchronous inference |
| Model Operations | [openvino.runtime.CompiledModel.inputs] | Get inputs of a model |

## How It Works
The sample downloads a model and a tokenizer, exports the model to ONNX, reads the exported model and reshapes it to enforce dynamic input shapes, compiles the resulting model, downloads a dataset, and runs the benchmark on the dataset.
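
The asynchronous part of this flow can be pictured with a small stand-in for `openvino.runtime.AsyncInferQueue`: requests are started without waiting for the previous one to finish, and a final `wait_all()` collects every result. This is only a sketch of the pattern, not OpenVINO code; `SimpleAsyncQueue`, `fake_infer`, and the sample inputs are all hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor
from time import perf_counter

class SimpleAsyncQueue:
    """Toy analogue of AsyncInferQueue: a pool of parallel 'infer requests'."""

    def __init__(self, infer, jobs=4):
        self._infer = infer
        self._pool = ThreadPoolExecutor(max_workers=jobs)
        self._futures = []

    def start_async(self, inputs):
        # Submit work and return immediately, like AsyncInferQueue.start_async().
        self._futures.append(self._pool.submit(self._infer, inputs))

    def wait_all(self):
        # Block until every submitted request completes, like wait_all().
        return [f.result() for f in self._futures]

def fake_infer(inputs):  # placeholder for calling the compiled model
    return sum(inputs)

queue = SimpleAsyncQueue(fake_infer)
start = perf_counter()
for sample in ([1, 2], [3, 4], [5, 6]):
    queue.start_async(sample)
results = queue.wait_all()
duration = perf_counter() - start  # wall time for the whole batch
```

In the real sample the queue wraps a compiled OpenVINO model, so several infer requests can be in flight at once while the main loop keeps feeding inputs.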
You can see the explicit description of each sample step at the [Integration Steps](../../../../docs/OV_Runtime_UG/integrate_with_your_application.md) section of the "Integrate OpenVINO™ Runtime with Your Application" guide.
## Running
Install the `openvino` Python package:
```
python -m pip install openvino
```
Install packages from `requirements.txt`:
```
python -m pip install -r requirements.txt
```
Run the sample:
```
python bert_benchmark.py
```
## Sample Output
The sample outputs how long it takes to process the dataset.
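
A measured wall time in seconds can be turned into a readable report with `datetime.timedelta`. This is only a sketch of the reporting step; the 42.5-second value is made up.

```python
from datetime import timedelta

seconds = 42.5  # hypothetical measured wall time
duration = timedelta(seconds=seconds)
report = f"Dataset processed in {duration}"
print(report)
```

`timedelta` handles carrying seconds into minutes and hours, so the printed duration stays readable for long runs.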
## See Also
- [Integrate the OpenVINO™ Runtime with Your Application](../../../../docs/OV_Runtime_UG/integrate_with_your_application.md)
- [Using OpenVINO™ Toolkit Samples](../../../../docs/OV_Runtime_UG/Samples_Overview.md)
- [Model Downloader](@ref omz_tools_downloader)
- [Model Optimizer](../../../../docs/MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md)