# OpenVINO™ Inference
OpenVINO Inference is part of the OpenVINO Runtime library. The component is responsible for model inference on hardware devices and provides an API for OpenVINO Plugin development.

OpenVINO Inference follows the common coding style rules.
## Key contacts
People from the openvino-ie-maintainers group have the right to approve and merge PRs to the inference component. They can assist with any questions about the component.
## Components
OpenVINO Inference has the following structure:

- `dev_api` contains the developer API required to develop OpenVINO Plugins. To use this API, link your component against `openvino::runtime::dev`.
- `include` contains the public API. Find more information in the OpenVINO Inference API document.
- `src` contains the sources of the component.
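As a sketch of how a plugin would consume the developer API, a CMake fragment might look like the following. Only the `openvino::runtime::dev` target name comes from this document; the library and source file names are illustrative placeholders.

```cmake
# Hypothetical plugin library; "my_openvino_plugin" and "plugin.cpp"
# are placeholder names, not part of the OpenVINO source tree.
add_library(my_openvino_plugin SHARED plugin.cpp)

# Linking against openvino::runtime::dev exposes the dev_api headers
# needed for OpenVINO Plugin development.
target_link_libraries(my_openvino_plugin PRIVATE openvino::runtime::dev)
```

Using `PRIVATE` here keeps the developer API as an implementation detail of the plugin rather than propagating it to the plugin's own consumers.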
## Tests

OpenVINO Inference has unit and functional tests. Unit tests are located in `src/tests/unit/inference_engine`, and functional tests are located in `src/tests/functional/inference_engine`.