Removed ngraph mentions (#10647)

Ilya Lavrenov 2022-02-25 07:02:09 +03:00 committed by GitHub
parent ffd63f9758
commit 53d3ef8eab
10 changed files with 9 additions and 19 deletions


@@ -843,16 +843,6 @@ INPUT = "@MARKDOWN_INPUT@" \
"@OpenVINO_SOURCE_DIR@/src/common/transformations/include/" \
"@OpenVINO_SOURCE_DIR@/src/common/util/include/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/descriptor" \
"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/op/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/op/util" \
"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/opsets/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/pass/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/pattern/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/pattern/op/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/runtime/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/ngraph/type/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/openvino/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/openvino/core/" \
"@OpenVINO_SOURCE_DIR@/src/core/include/openvino/core/descriptor/" \


@@ -91,7 +91,7 @@ This section provides reference documents that guide you through developing your
With the [Model Downloader](@ref omz_tools_downloader) and [Model Optimizer](MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md) guides, you will learn to download pre-trained models and convert them for use with the OpenVINO™ toolkit. You can provide your own model or choose a public or Intel model from a broad selection provided in the [Open Model Zoo](model_zoo.md).
## Deploying Inference
-The [OpenVINO™ Runtime User Guide](OV_Runtime_UG/openvino_intro.md) explains the process of creating your own application that runs inference with the OpenVINO™ toolkit. The [API Reference](./api_references.html) defines the Inference Engine API for Python, C++, and C and the nGraph API for Python and C++. The Inference Engine API is what you'll use to create an OpenVINO™ application, while the nGraph API is available for using enhanced operations sets and other features. After writing your application, you can use the [Deployment Manager](install_guides/deployment-manager-tool.md) for deploying to target devices.
+The [OpenVINO™ Runtime User Guide](OV_Runtime_UG/openvino_intro.md) explains the process of creating your own application that runs inference with the OpenVINO™ toolkit. The [API Reference](./api_references.html) defines the OpenVINO Runtime API for Python, C++, and C. The OpenVINO Runtime API is what you'll use to create an OpenVINO™ inference application and to work with enhanced operation sets and other features. After writing your application, you can use the [Deployment Manager](install_guides/deployment-manager-tool.md) for deploying to target devices.
## Tuning for Performance
The toolkit provides a [Performance Optimization Guide](optimization_guide/dldt_optimization_guide.md) and utilities for squeezing the best performance out of your application, including [Accuracy Checker](@ref omz_tools_accuracy_checker), [Post-Training Optimization Tool](@ref pot_README), and other tools for measuring accuracy, benchmarking performance, and tuning your application.
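
To illustrate the deployment flow referenced in the Deploying Inference paragraph above, here is a minimal C++ sketch of the OpenVINO Runtime API as it stands after this change (the `model.xml` path and the `CPU` device are placeholder assumptions, and error handling is omitted):

```cpp
#include <iostream>
#include <memory>

#include <openvino/openvino.hpp>

int main() {
    // Read an IR model; the path below is a placeholder assumption.
    ov::Core core;
    std::shared_ptr<ov::Model> model = core.read_model("model.xml");

    // Compile the model for a target device ("CPU" is an assumption)
    // and create an inference request.
    ov::CompiledModel compiled = core.compile_model(model, "CPU");
    ov::InferRequest request = compiled.create_infer_request();

    // In a real application, fill the input tensor with data first,
    // e.g. via request.get_input_tensor().data<float>().
    request.infer();

    // Read back the result.
    ov::Tensor output = request.get_output_tensor();
    std::cout << "Output element count: " << output.get_size() << std::endl;
    return 0;
}
```

The Python and C bindings follow the same Core → CompiledModel → InferRequest flow, which is why the single OpenVINO Runtime API reference replaces the separate Inference Engine and nGraph references.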


@@ -3,7 +3,7 @@
This specification document describes the `opset1` operation set supported in OpenVINO.
Support for each particular operation from the list below depends on the capabilities available in an inference plugin
and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
declared in `namespace opset1`.
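
As a hedged illustration of what "operation classes declared in `namespace opset1`" means on the C++ side, the sketch below builds a trivial model directly from those classes; the tensor shape and model name are arbitrary assumptions, not taken from the specification:

```cpp
#include <iostream>
#include <memory>

#include <openvino/openvino.hpp>
#include <openvino/opsets/opset1.hpp>

int main() {
    // Each IR V10 layer such as <layer type="Relu" version="opset1"> corresponds
    // to an operation class in ov::opset1 (shapes and names here are arbitrary).
    auto param  = std::make_shared<ov::opset1::Parameter>(ov::element::f32,
                                                          ov::Shape{1, 3, 224, 224});
    auto relu   = std::make_shared<ov::opset1::Relu>(param);
    auto result = std::make_shared<ov::opset1::Result>(relu);

    // Assemble the operations into a model that the OpenVINO Runtime can compile.
    auto model = std::make_shared<ov::Model>(ov::ResultVector{result},
                                             ov::ParameterVector{param},
                                             "tiny_opset1_model");
    std::cout << model->get_friendly_name() << " has "
              << model->get_ops().size() << " operations" << std::endl;
    return 0;
}
```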


@@ -3,7 +3,7 @@
This specification document describes the `opset2` operation set supported in OpenVINO.
Support for each particular operation from the list below depends on the capabilities available in an inference plugin
and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
declared in `namespace opset2`.


@@ -3,7 +3,7 @@
This specification document describes the `opset3` operation set supported in OpenVINO.
Support for each particular operation from the list below depends on the capabilities available in an inference plugin
and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
declared in `namespace opset3`.


@@ -3,7 +3,7 @@
This specification document describes the `opset4` operation set supported in OpenVINO.
Support for each particular operation from the list below depends on the capabilities available in an inference plugin
and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
declared in `namespace opset4`.


@@ -3,7 +3,7 @@
This specification document describes the `opset5` operation set supported in OpenVINO.
Support for each particular operation from the list below depends on the capabilities available in an inference plugin
and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
declared in `namespace opset5`.


@@ -3,7 +3,7 @@
This specification document describes the `opset6` operation set supported in OpenVINO.
Support for each particular operation from the list below depends on the capabilities available in an inference plugin
and may vary among different hardware platforms and devices. Examples of operation instances are expressed as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
declared in `namespace opset6`.


@@ -3,7 +3,7 @@
This specification document describes the `opset7` operation set supported in OpenVINO™.
Support for each particular operation from the list below depends on the capabilities available in an inference plugin
and may vary among different hardware platforms and devices. Examples of operation instances are provided as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
declared in `namespace opset7`.


@@ -3,7 +3,7 @@
This specification document describes the `opset8` operation set supported in OpenVINO™.
Support for each particular operation from the list below depends on the capabilities of an inference plugin
and may vary among different hardware platforms and devices. Examples of operation instances are provided as IR V10 xml
-snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding nGraph operation classes
+snippets. Such IR is generated by the Model Optimizer. The semantics match corresponding OpenVINO operation classes
declared in `namespace opset8`.
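
To show how the `opset8` classes mentioned above relate to an IR from application code, here is a hedged C++ sketch (the `model.xml` path and the choice of `MatMul` are placeholder assumptions) that reads an IR and checks individual graph nodes against `ov::opset8` operation classes:

```cpp
#include <iostream>
#include <memory>

#include <openvino/core/type.hpp>
#include <openvino/openvino.hpp>
#include <openvino/opsets/opset8.hpp>

int main() {
    // Read an IR produced by the Model Optimizer; the path is a placeholder.
    ov::Core core;
    std::shared_ptr<ov::Model> model = core.read_model("model.xml");

    // Every node in the graph is an instance of an opset operation class;
    // ov::is_type<> tests a node against a concrete class such as ov::opset8::MatMul.
    for (const std::shared_ptr<ov::Node>& op : model->get_ops()) {
        if (ov::is_type<ov::opset8::MatMul>(op)) {
            std::cout << op->get_friendly_name() << " is a MatMul" << std::endl;
        }
    }
    return 0;
}
```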