Changed structure of core components and fixed links (#13366)

* Changed structure and fixed links

* Exclude README.md from CI check

* Update README.md

Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>

* Update README.md

Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>

* Update src/README.md

Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>

* Update src/README.md

Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>

* Update src/README.md

Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>

* Fixed links

* Update src/frontends/README.md

Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>

* Update src/frontends/README.md

Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>

* Update src/plugins/README.md

Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>

Co-authored-by: Tatiana Savina <tatiana.savina@intel.com>
Ilya Churaev
2022-10-07 19:04:55 +04:00
committed by GitHub
parent f5febef8a6
commit 767944cc46
18 changed files with 92 additions and 38 deletions


@@ -8,6 +8,7 @@ trigger:
- docs/
- /**/docs/*
- /**/*.md
- README.md
pr:
branches:
@@ -19,6 +20,7 @@ pr:
- docs/
- /**/docs/*
- /**/*.md
- README.md
resources:
repositories:


@@ -8,6 +8,7 @@ trigger:
- docs/
- /**/docs/*
- /**/*.md
- README.md
pr:
branches:
@@ -19,6 +20,7 @@ pr:
- docs/
- /**/docs/*
- /**/*.md
- README.md
resources:
repositories:


@@ -8,6 +8,7 @@ trigger:
- docs/
- /**/docs/*
- /**/*.md
- README.md
pr:
branches:
@@ -19,6 +20,7 @@ pr:
- docs/
- /**/docs/*
- /**/*.md
- README.md
resources:
repositories:


@@ -8,6 +8,7 @@ trigger:
- docs/
- /**/docs/*
- /**/*.md
- README.md
pr:
branches:
@@ -19,6 +20,7 @@ pr:
- docs/
- /**/docs/*
- /**/*.md
- README.md
jobs:
- job: LinCC


@@ -8,6 +8,7 @@ trigger:
- docs/
- /**/docs/*
- /**/*.md
- README.md
pr:
branches:
@@ -19,6 +20,7 @@ pr:
- docs/
- /**/docs/*
- /**/*.md
- README.md
resources:
repositories:


@@ -8,6 +8,7 @@ trigger:
- docs/
- /**/docs/*
- /**/*.md
- README.md
pr:
branches:
@@ -19,6 +20,7 @@ pr:
- docs/
- /**/docs/*
- /**/*.md
- README.md
resources:
repositories:


@@ -8,6 +8,7 @@ trigger:
- docs/
- /**/docs/*
- /**/*.md
- README.md
pr:
branches:
@@ -19,6 +20,7 @@ pr:
- docs/
- /**/docs/*
- /**/*.md
- README.md
jobs:
- job: OpenVINO_ONNX_CI


@@ -8,6 +8,7 @@ trigger:
- docs/
- /**/docs/*
- /**/*.md
- README.md
pr:
branches:
@@ -19,6 +20,7 @@ pr:
- docs/
- /**/docs/*
- /**/*.md
- README.md
jobs:
- job: onnxruntime


@@ -8,6 +8,7 @@ trigger:
- docs/
- /**/docs/*
- /**/*.md
- README.md
pr:
branches:
@@ -19,6 +20,7 @@ pr:
- docs/
- /**/docs/*
- /**/*.md
- README.md
resources:
repositories:


@@ -8,6 +8,7 @@ trigger:
- docs/
- /**/docs/*
- /**/*.md
- README.md
pr:
branches:
@@ -19,6 +20,7 @@ pr:
- docs/
- /**/docs/*
- /**/*.md
- README.md
resources:
repositories:


@@ -8,6 +8,7 @@ trigger:
- docs/
- /**/docs/*
- /**/*.md
- README.md
pr:
branches:
@@ -19,6 +20,7 @@ pr:
- docs/
- /**/docs/*
- /**/*.md
- README.md
jobs:
- job: WinCC


@@ -40,18 +40,18 @@ source and public models in popular formats such as TensorFlow, ONNX, PaddlePadd
### Components
* [OpenVINO™ Runtime] - is a set of C++ libraries with C and Python bindings providing a common API to deliver inference solutions on the platform of your choice.
* [core](https://github.com/openvinotoolkit/openvino/tree/master/src/core) - provides the base API for model representation and modification.
* [inference](https://github.com/openvinotoolkit/openvino/tree/master/src/inference) - provides an API to infer models on device.
* [transformations](https://github.com/openvinotoolkit/openvino/tree/master/src/common/transformations) - contains the set of common transformations which are used in OpenVINO plugins.
* [low precision transformations](https://github.com/openvinotoolkit/openvino/tree/master/src/common/low_precision_transformations) - contains the set of transformations which are used in low precision models
* [bindings](https://github.com/openvinotoolkit/openvino/tree/master/src/bindings) - contains all available OpenVINO bindings which are maintained by OpenVINO team.
* [c](https://github.com/openvinotoolkit/openvino/tree/master/src/bindings/c) - provides C API for OpenVINO™ Runtime
* [python](https://github.com/openvinotoolkit/openvino/tree/master/src/bindings/python) - Python API for OpenVINO™ Runtime
* [Plugins](https://github.com/openvinotoolkit/openvino/tree/master/src/plugins) - contains OpenVINO plugins which are maintained in open-source by OpenVINO team. For more information please take a look to the [list of supported devices](#supported-hardware-matrix).
* [Frontends](https://github.com/openvinotoolkit/openvino/tree/master/src/frontends) - contains available OpenVINO frontends which allow to read model from native framework format.
* [core](./src/core) - provides the base API for model representation and modification.
* [inference](./src/inference) - provides an API to infer models on the device.
* [transformations](./src/common/transformations) - contains the set of common transformations which are used in OpenVINO plugins.
* [low precision transformations](./src/common/low_precision_transformations) - contains the set of transformations that are used in low precision models
* [bindings](./src/bindings) - contains all available OpenVINO bindings which are maintained by the OpenVINO team.
* [c](./src/bindings/c) - provides C API for OpenVINO™ Runtime
* [python](./src/bindings/python) - Python API for OpenVINO™ Runtime
* [Plugins](./src/plugins) - contains OpenVINO plugins which are maintained in open-source by OpenVINO team. For more information, take a look at the [list of supported devices](#supported-hardware-matrix).
* [Frontends](./src/frontends) - contains available OpenVINO frontends that allow reading models from the native framework format.
* [Model Optimizer] - is a cross-platform command-line tool that facilitates the transition between training and deployment environments, performs static model analysis, and adjusts deep learning models for optimal execution on end-point target devices.
* [Post-Training Optimization Tool] - is designed to accelerate the inference of deep learning models by applying special methods without model retraining or fine-tuning, for example, post-training 8-bit quantization.
* [Samples] - applications on C, C++ and Python languages which shows basic use cases of OpenVINO usages.
* [Samples] - applications in C, C++ and Python languages that show basic OpenVINO use cases.
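For orientation (this sketch is not part of the committed README), a minimal C++ example of how the core, transformations, and inference components listed above fit together; the model path and device name are placeholders:

```cpp
#include <openvino/openvino.hpp>
#include <openvino/pass/constant_folding.hpp>
#include <openvino/pass/manager.hpp>

int main() {
    ov::Core core;                                        // entry point of the OpenVINO Runtime
    auto model = core.read_model("model.xml");            // core: in-memory model representation

    ov::pass::Manager manager;                            // transformations: run a common pass
    manager.register_pass<ov::pass::ConstantFolding>();
    manager.run_passes(model);

    auto compiled = core.compile_model(model, "CPU");     // inference: hand the model to a device plugin
    auto request = compiled.create_infer_request();
    request.infer();                                      // input tensors are left default-initialized here
    return 0;
}
```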
## Supported Hardware matrix
@@ -70,7 +70,7 @@ The OpenVINO™ Runtime can infer models on different hardware devices. This sec
<tr>
<td rowspan=2>CPU</td>
<td> <a href="https://docs.openvino.ai/nightly/openvino_docs_OV_UG_supported_plugins_CPU.html#doxid-openvino-docs-o-v-u-g-supported-plugins-c-p-u">Intel CPU</a></tb>
<td><b><i><a href="https://github.com/openvinotoolkit/openvino/tree/master/src/plugins/intel_cpu">openvino_intel_cpu_plugin</a></i></b></td>
<td><b><i><a href="./src/plugins/intel_cpu">openvino_intel_cpu_plugin</a></i></b></td>
<td>Intel Xeon with Intel® Advanced Vector Extensions 2 (Intel® AVX2), Intel® Advanced Vector Extensions 512 (Intel® AVX-512), and AVX512_BF16, Intel Core Processors with Intel AVX2, Intel Atom Processors with Intel® Streaming SIMD Extensions (Intel® SSE)</td>
</tr>
<tr>
@@ -81,19 +81,19 @@ The OpenVINO™ Runtime can infer models on different hardware devices. This sec
<tr>
<td>GPU</td>
<td><a href="https://docs.openvino.ai/nightly/openvino_docs_OV_UG_supported_plugins_GPU.html#doxid-openvino-docs-o-v-u-g-supported-plugins-g-p-u">Intel GPU</a></td>
<td><b><i><a href="https://github.com/openvinotoolkit/openvino/tree/master/src/plugins/intel_gpu">openvino_intel_gpu_plugin</a></i></b></td>
<td><b><i><a href="./src/plugins/intel_gpu">openvino_intel_gpu_plugin</a></i></b></td>
<td>Intel Processor Graphics, including Intel HD Graphics and Intel Iris Graphics</td>
</tr>
<tr>
<td>GNA</td>
<td><a href="https://docs.openvino.ai/nightly/openvino_docs_OV_UG_supported_plugins_GNA.html#doxid-openvino-docs-o-v-u-g-supported-plugins-g-n-a">Intel GNA</a></td>
<td><b><i><a href="https://github.com/openvinotoolkit/openvino/tree/master/src/plugins/intel_gna">openvino_intel_gna_plugin</a></i></b></td>
<td><b><i><a href="./src/plugins/intel_gna">openvino_intel_gna_plugin</a></i></b></td>
<td>Intel Speech Enabling Developer Kit, Amazon Alexa* Premium Far-Field Developer Kit, Intel Pentium Silver J5005 Processor, Intel Pentium Silver N5000 Processor, Intel Celeron J4005 Processor, Intel Celeron J4105 Processor, Intel Celeron Processor N4100, Intel Celeron Processor N4000, Intel Core i3-8121U Processor, Intel Core i7-1065G7 Processor, Intel Core i7-1060G7 Processor, Intel Core i5-1035G4 Processor, Intel Core i5-1035G7 Processor, Intel Core i5-1035G1 Processor, Intel Core i5-1030G7 Processor, Intel Core i5-1030G4 Processor, Intel Core i3-1005G1 Processor, Intel Core i3-1000G1 Processor, Intel Core i3-1000G4 Processor</td>
</tr>
<tr>
<td>VPU</td>
<td><a href="https://docs.openvino.ai/nightly/openvino_docs_IE_DG_supported_plugins_VPU.html#doxid-openvino-docs-i-e-d-g-supported-plugins-v-p-u">Myriad plugin</a></td>
<td><b><i><a href="https://github.com/openvinotoolkit/openvino/tree/master/src/plugins/intel_myriad">openvino_intel_myriad_plugin</a></i></b></td>
<td><b><i><a href="./src/plugins/intel_myriad">openvino_intel_myriad_plugin</a></i></b></td>
<td>Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X</td>
</tr>
</tbody>
@@ -111,22 +111,22 @@ Also OpenVINO™ Toolkit contains several plugins which should simplify to load
<tbody>
<tr>
<td><a href="https://docs.openvino.ai/nightly/openvino_docs_IE_DG_supported_plugins_AUTO.html#doxid-openvino-docs-i-e-d-g-supported-plugins-a-u-t-o">Auto</a></td>
<td><b><i><a href="https://github.com/openvinotoolkit/openvino/tree/master/src/plugins/auto">openvino_auto_plugin</a></i></b></td>
<td><b><i><a href="./src/plugins/auto">openvino_auto_plugin</a></i></b></td>
<td>Auto plugin enables selecting Intel device for inference automatically</td>
</tr>
<tr>
<td><a href="https://docs.openvino.ai/nightly/openvino_docs_OV_UG_Automatic_Batching.html">Auto Batch</a></td>
<td><b><i><a href="https://github.com/openvinotoolkit/openvino/tree/master/src/plugins/auto_batch">openvino_auto_batch_plugin</a></i></b></td>
<td><b><i><a href="./src/plugins/auto_batch">openvino_auto_batch_plugin</a></i></b></td>
<td>Auto batch plugin performs on-the-fly automatic batching (i.e. grouping inference requests together) to improve device utilization, with no programming effort from the user</td>
</tr>
<tr>
<td><a href="https://docs.openvino.ai/nightly/openvino_docs_OV_UG_Hetero_execution.html#doxid-openvino-docs-o-v-u-g-hetero-execution">Hetero</a></td>
<td><b><i><a href="https://github.com/openvinotoolkit/openvino/tree/master/src/plugins/hetero">openvino_hetero_plugin</a></i></b></td>
<td><b><i><a href="./src/plugins/hetero">openvino_hetero_plugin</a></i></b></td>
<td>Heterogeneous execution enables automatic inference splitting between several devices</td>
</tr>
<tr>
<td><a href="https://docs.openvino.ai/nightly/openvino_docs_OV_UG_Running_on_multiple_devices.html#doxid-openvino-docs-o-v-u-g-running-on-multiple-devices">Multi</a></td>
<td><b><i><a href="https://github.com/openvinotoolkit/openvino/tree/master/src/plugins/auto">openvino_auto_plugin</a></i></b></td>
<td><b><i><a href="./src/plugins/auto">openvino_auto_plugin</a></i></b></td>
<td>Multi plugin enables simultaneous inference of the same model on several devices in parallel</td>
</tr>
</tbody>
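As a rough, illustrative sketch (not part of the committed file), the plugins in the tables above are selected at run time by device name through the Runtime API; the model path is a placeholder and each line assumes the corresponding device is actually available:

```cpp
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;
    auto model = core.read_model("model.xml");                    // placeholder path

    auto on_cpu  = core.compile_model(model, "CPU");              // openvino_intel_cpu_plugin
    auto on_auto = core.compile_model(model, "AUTO");             // openvino_auto_plugin picks a device
    auto batched = core.compile_model(model, "BATCH:GPU");        // openvino_auto_batch_plugin on top of GPU
    auto hetero  = core.compile_model(model, "HETERO:GPU,CPU");   // openvino_hetero_plugin splits the graph
    auto multi   = core.compile_model(model, "MULTI:CPU,GPU");    // openvino_auto_plugin, multi-device mode
    return 0;
}
```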


@@ -45,30 +45,17 @@ flowchart LR
OpenVINO Frontends allow to convert model from framework to OpenVINO representation.
* [ir](./frontends/ir/README.md)
* [onnx](./frontends/onnx)
* [paddle](./frontends/paddle)
* [tensorflow](./frontends/tensorflow)
Go to the [Frontends page](./frontends/README.md) to get more information.
## OpenVINO Plugins
Plugins provide a support of hardware device
OpenVINO Plugins provide support for hardware devices.
* [auto](./plugins/auto)
* [auto_batch](./plugins/auto_batch)
* [hetero](./plugins/hetero)
* [intel_cpu](./plugins/intel_cpu)
* [intel_gna](./plugins/intel_gna)
* [intel_gpu](./plugins/intel_gpu)
* [intel_myriad](./plugins/intel_myriad)
* [template](./plugins/template)
To get more information about supported OpenVINO Plugins, go to the [Plugins page](./plugins/README.md).
## OpenVINO Bindings
OpenVINO provides bindings for several languages:
* [c](./bindings/c)
* [python](./bindings/python)
OpenVINO provides bindings for different languages. To get the full list of supported languages, go to the [page](./bindings/README.md).
## Core developer topics

src/bindings/README.md (new file)

@@ -0,0 +1,11 @@
## OpenVINO Bindings
OpenVINO provides bindings for several languages:
* [c](./c)
* [python](./python)
## See also
* [OpenVINO™ README](../../README.md)
* [OpenVINO Core Components](../README.md)
* [Developer documentation](../../docs/dev/index.md)

src/frontends/README.md (new file)

@@ -0,0 +1,15 @@
# OpenVINO Frontends
OpenVINO Frontends allow converting models from the native framework to OpenVINO representation.
Below is the full list of supported frontends:
* [ir](./ir/README.md)
* [onnx](./onnx)
* [paddle](./paddle)
* [tensorflow](./tensorflow)
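As a small illustrative sketch (not part of the committed file), the frontend is normally picked implicitly from the model file format when reading a model through the Runtime API; the file names are placeholders:

```cpp
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;
    // Each call returns the same in-memory ov::Model representation,
    // produced by the frontend matching the file format.
    auto from_ir     = core.read_model("model.xml");      // ir (weights expected in model.bin)
    auto from_onnx   = core.read_model("model.onnx");     // onnx
    auto from_paddle = core.read_model("model.pdmodel");  // paddle
    auto from_tf     = core.read_model("model.pb");       // tensorflow
    return 0;
}
```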
## See also
* [OpenVINO™ README](../../README.md)
* [OpenVINO Core Components](../README.md)
* [Developer documentation](../../docs/dev/index.md)


@@ -31,7 +31,7 @@ OpenVINO IR Frontend contains the next components:
## Architecture
OpenVINO IR Frontend uses [pugixml](../../../thirdparty/pugixml/README.md) library to parse xml files.
OpenVINO IR Frontend uses [pugixml](https://github.com/zeux/pugixml/blob/master/README.md) library to parse xml files.
For detailed information about OpenVINO IR Frontend architecture, read the [architecture guide](./docs/architecture.md).
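The architecture notes excerpted further below add that, after pugixml parsing, each operation is created and initialized through the OpenVINO Visitor API. A hypothetical sketch (the op name and attribute are made up, not taken from the sources) of the visit_attributes hook that this mechanism relies on:

```cpp
#include <openvino/op/op.hpp>

// Hypothetical operation: the IR Frontend fills in attributes parsed from the
// XML node by calling visit_attributes() with an AttributeVisitor.
class MyOp : public ov::op::Op {
public:
    OPENVINO_OP("MyOp");

    MyOp() = default;

    bool visit_attributes(ov::AttributeVisitor& visitor) override {
        visitor.on_attribute("axis", m_axis);  // maps the "axis" XML attribute to a member
        return true;
    }

    void validate_and_infer_types() override {
        set_output_type(0, ov::element::f32, ov::PartialShape::dynamic());
    }

    std::shared_ptr<ov::Node> clone_with_new_inputs(const ov::OutputVector&) const override {
        auto copy = std::make_shared<MyOp>();
        copy->m_axis = m_axis;
        return copy;
    }

private:
    int64_t m_axis = 0;
};
```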
## Tutorials


@@ -1,6 +1,6 @@
# OpenVINO IR Frontend Architecture
OpenVINO IR Frontend uses [pugixml](../../../thirdparty/pugixml/README.md) library to parse XML files. After that, based on the version and name of the operation, the Frontend creates the supported operation and initializes it using OpenVINO Visitor API:
OpenVINO IR Frontend uses [pugixml](https://github.com/zeux/pugixml/blob/master/README.md) library to parse XML files. After that, based on the version and name of the operation, the Frontend creates the supported operation and initializes it using OpenVINO Visitor API:
```mermaid
flowchart TB
fw_model[(IR)]

src/plugins/README.md (new file)

@@ -0,0 +1,19 @@
# OpenVINO Plugins
OpenVINO Plugins provide support for hardware devices.
The list of supported plugins:
* [auto](./auto)
* [auto_batch](./auto_batch)
* [hetero](./hetero)
* [intel_cpu](./intel_cpu)
* [intel_gna](./intel_gna)
* [intel_gpu](./intel_gpu)
* [intel_myriad](./intel_myriad)
* [template](./template)
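As a quick illustrative sketch (not part of the committed file), the devices exposed by the installed plugins can be listed through the Runtime API:

```cpp
#include <iostream>
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;
    // Prints one entry per device served by an installed plugin,
    // e.g. "CPU", "GPU", "GNA", depending on the system.
    for (const auto& device : core.get_available_devices()) {
        std::cout << device << std::endl;
    }
    return 0;
}
```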
## See also
* [OpenVINO™ README](../../README.md)
* [OpenVINO Core Components](../README.md)
* [Developer documentation](../../docs/dev/index.md)