DOCS: Quantization doc rewrites - port to master (#14374)
parent 9ba4610797
commit ece0341377
@@ -1,4 +1,4 @@
-# Optimizing Models at Training Time {#tmo_introduction}
+# Compressing Models During Training {#tmo_introduction}
 
 @sphinxdirective
 
@@ -12,30 +12,24 @@
 @endsphinxdirective
 
 ## Introduction
 
-Training-time model optimization is a way to get a more efficient and HW-friendly model when applying optimization methods with fine-tuning. It can help when the [post-training optimization](@ref pot_introduction) does not provide the desired accuracy or performance results. OpenVINO™ does not have training capabilities but it provides a Neural Network Compression Framework (NNCF) tool that can be used to integrate training-time optimizations supported by OpenVINO in the training scripts created using source frameworks, such as PyTorch or TensorFlow 2.
+Training-time model compression improves model performance by applying optimizations (such as quantization) during training. The training process minimizes the loss associated with the lower-precision optimizations, so it can maintain the model's accuracy while reducing its latency and memory footprint. Generally, training-time model optimization results in better model performance and accuracy than [post-training optimization](@ref pot_introduction), but it can require more effort to set up.
 
-To apply training-time optimization methods you need typical artefacts used to train model:
-- A floating-point model in the framework representation.
-- Training script written with framework API.
+OpenVINO provides the Neural Network Compression Framework (NNCF) for implementing compression algorithms on models to improve their performance. NNCF is a Python library that integrates into PyTorch and TensorFlow training pipelines to add training-time compression methods. To apply training-time compression with NNCF, you need:
+- A floating-point model from the PyTorch or TensorFlow framework.
+- A training pipeline set up in the PyTorch or TensorFlow framework.
 - Training and validation datasets.
 
-Figure below shows a common workflow of applying training-time optimizations with NNCF.
+Adding compression to a training pipeline only requires a few lines of code. The compression techniques are defined through a single configuration file that specifies which algorithms to use during fine-tuning.
 
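To make the "few lines of code" concrete, here is a minimal sketch of wiring NNCF quantization into a PyTorch script (an illustration, not part of the commit): it assumes `pip install nncf[torch]`, and the ResNet-18 model and input size are placeholders.

```python
import torchvision
from nncf import NNCFConfig
from nncf.torch import create_compressed_model

# A plain floating-point model from the framework (placeholder choice).
model = torchvision.models.resnet18()

# The single NNCF configuration: the model input shape plus the chosen
# compression algorithm.
nncf_config = NNCFConfig.from_dict({
    "input_info": {"sample_size": [1, 3, 224, 224]},
    "compression": {"algorithm": "quantization"},
})

# Wrap the model; NNCF inserts fake-quantization operations into it.
# (In practice you would also register an initialization data loader,
# e.g. with nncf.torch.register_default_init_args.)
compression_ctrl, compressed_model = create_compressed_model(model, nncf_config)

# `compressed_model` is then fine-tuned with the existing training loop and
# exported for deployment, e.g.:
# compression_ctrl.export_model("compressed_model.onnx")
```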
-
-
-## Optimization methods
-
-There are two methods available to improve model performance with OpenVINO™:
-
-- **8-bit uniform quantization** (or simply quantization) is a technique that allows moving from floating-point precision to 8-bit integer precision for weights and activations during the inference time. It helps to reduce the model size, memory footprint and latency as well as improve the computational efficiency using integer arithmetic. During the quantization process the model undergoes the transformation process when additional operations, that contain quantization information, are inserted into the model. However, the model continues to be the floating-point and can be fine-tuned to restore the accuracy degradation introduced by the quantization the same way as the original model. This procedure is called Quantization-aware Training (QAT). The actual transition to integer arithmetic happens at model inference.
-
-- **Structured pruning** is used to remove unnecessary or redundant groups of weights from Deep Neural Networks. In the case of Convolutional Neural Networks it usually results in **Filter Pruning** where the whole convolutional filters are being removed from the model reducing the model size and footprint as well as overall computational complexity. This process consists of two steps: 1. search and zero out redundant filters along with model fine-tuning 2. remove zero filters after the fine-tuning. Since this method changes the model structure it usually requires a long fine-tuning or even retraining depending on the pruning ratio.
-
-## Recommended workflow:
-
-Based on the complexity and ease of use of the optimization methods, we recommend the following workflow to accelerate the inference with the fine-tuning.
-
-- **Quantization-aware Training (QAT)** to get fast and accurate results with significant improvement in the inference performance. Currently, a HW-compatible (CPU, GPU, VPU) QAT for 8-bit inference is available. For details, see [Quantization-aware Training](./qat.md) documentation.
-
-- **Filter Pruning**, used to get additional speedup on top of quantization. For details, see [Filter Pruning](./filter_pruning.md) documentation.
+### NNCF Quick Start Examples
+
+See the following Jupyter Notebooks for step-by-step examples showing how to add model compression to a PyTorch or TensorFlow training pipeline with NNCF:
+
+- [Quantization Aware Training with NNCF and PyTorch](https://docs.openvino.ai/2022.2/notebooks/302-pytorch-quantization-aware-training-with-output.html)
+- [Quantization Aware Training with NNCF and TensorFlow](https://docs.openvino.ai/2022.2/notebooks/305-tensorflow-quantization-aware-training-with-output.html)
 
 ## Installation
 
-NNCF is open-sourced on [GitHub](https://github.com/openvinotoolkit/nncf) and distributed as a separate package. It is also available on PyPI. We recommend installing it to the Python* environment where the framework is installed.
+NNCF is open-sourced on [GitHub](https://github.com/openvinotoolkit/nncf) and distributed as a separate package from OpenVINO. It is also available on PyPI. Install it to the same Python environment where PyTorch or TensorFlow is installed.
 
 ### Install from PyPI
 
 To install the latest released version via pip manager, run the following command:
@@ -44,9 +38,49 @@ pip install nncf
 ```
 
 > **NOTE**: To install with specific frameworks, use the `pip install nncf[extras]` command, where `extras` is a list of possible extras, for example, `torch`, `tf`, `onnx`.
 
 To install the latest NNCF version from source, follow the instructions on [GitHub](https://github.com/openvinotoolkit/nncf#installation).
 
-> **NOTE**: NNCF does not have OpenVINO™ as an installation requirement. To deploy optimized models you should install OpenVINO™ separately.
+> **NOTE**: NNCF does not have OpenVINO as an installation requirement. To deploy optimized models, install OpenVINO separately.
 
-## See also
+## Working with NNCF
 
-- [Post-training Optimization](@ref pot_introduction)
+The figure below shows a common workflow of applying training-time compression with NNCF. The NNCF optimizations are added to the TensorFlow or PyTorch training script, and then the model undergoes fine-tuning. The optimized model can then be exported to OpenVINO IR format for accelerated performance with OpenVINO Runtime.
+
+
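As an illustration of this workflow (not from the commit), the TensorFlow side looks much the same; the Keras model below is a placeholder and `pip install nncf[tf]` is assumed.

```python
import tensorflow as tf
from nncf import NNCFConfig
from nncf.tensorflow import create_compressed_model

# Placeholder floating-point Keras model.
model = tf.keras.applications.MobileNetV2(weights=None)

nncf_config = NNCFConfig.from_dict({
    "input_info": {"sample_size": [1, 224, 224, 3]},
    "compression": {"algorithm": "quantization"},
})

# Wrap the model; NNCF inserts quantization operations into the graph.
compression_ctrl, compressed_model = create_compressed_model(model, nncf_config)

compressed_model.compile(optimizer="adam",
                         loss="sparse_categorical_crossentropy")
# compressed_model.fit(train_dataset, epochs=2)  # fine-tune as usual

# Export the fine-tuned model (e.g. a frozen graph) for conversion to
# OpenVINO IR with Model Optimizer.
compression_ctrl.export_model("compressed_model.pb")
```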
+### Training-Time Compression Methods
+
+NNCF provides several methods for improving model performance with training-time compression.
+
+#### Quantization
+
+Quantization is the process of converting the weights and activation values in a neural network from a high-precision format (such as 32-bit floating point) to a lower-precision format (such as 8-bit integer). It helps to reduce the model's memory footprint and latency. NNCF uses quantization-aware training to quantize models.
+
+Quantization-aware training inserts nodes into the neural network during training that simulate the effect of lower precision. This allows the training algorithm to consider quantization errors as part of the overall training loss that gets minimized during training. The network is then able to achieve enhanced accuracy when quantized.
+
+The officially supported method of quantization in NNCF is uniform 8-bit quantization. This means all the weights and activation functions in the neural network are converted to 8-bit values. See the [Quantization-aware Training guide](@ref qat_introduction) to learn more.
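A hedged sketch of how the extra loss term appears in a PyTorch fine-tuning loop (an illustration, not from the commit; `compression_ctrl` and `compressed_model` come from `create_compressed_model()` as in the earlier sketch, and `train_loader`, `criterion`, `optimizer`, and `epochs` are the usual training objects):

```python
for epoch in range(epochs):
    compression_ctrl.scheduler.epoch_step()  # advance the compression schedule
    for images, labels in train_loader:
        optimizer.zero_grad()
        outputs = compressed_model(images)
        # Task loss plus the term contributed by the compression algorithm,
        # so quantization error is minimized along with the task objective.
        loss = criterion(outputs, labels) + compression_ctrl.loss()
        loss.backward()
        optimizer.step()
```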
+#### Filter Pruning
+
+Filter pruning algorithms compress models by zeroing out the output filters of convolutional layers based on a filter importance criterion. During fine-tuning, the importance criterion is used to search for redundant filters that don't significantly contribute to the network's output, and those filters are zeroed out. After fine-tuning, the zeroed-out filters are removed from the network. For more information, see the [Filter Pruning](@ref filter_pruning) page.
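For illustration (not part of the commit), a filter-pruning run is selected through the same configuration file; the field names below follow NNCF's documented JSON schema, but the values are placeholders rather than recommendations:

```python
from nncf import NNCFConfig

nncf_config = NNCFConfig.from_dict({
    "input_info": {"sample_size": [1, 3, 224, 224]},
    "compression": {
        "algorithm": "filter_pruning",
        "pruning_init": 0.1,           # fraction of filters zeroed at the start
        "params": {
            "schedule": "exponential",
            "pruning_target": 0.4,     # final fraction of filters to prune
            "pruning_steps": 15,       # epochs over which pruning ramps up
        },
    },
})
```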
+#### Experimental Methods
+
+NNCF also provides state-of-the-art compression techniques that are still in experimental stages of development and are recommended only for expert developers. These include:
+
+- Mixed-precision quantization
+- Sparsity
+- Binarization
+
+To learn more about these methods, visit the [NNCF repository on GitHub](https://github.com/openvinotoolkit/nncf).
+
+### Recommended Workflow
+
+Using compression-aware training requires a training pipeline, an annotated dataset, and compute resources (such as CPUs or GPUs). If you don't already have these set up and available, it can be easier to start with post-training quantization to quickly see quantized results, and then use compression-aware training if the model isn't accurate enough. We recommend the following workflow for compressing models with NNCF:
+
+1. [Perform post-training quantization](@ref pot_introduction) on your model and then compare performance to the original model.
+2. If the accuracy is too degraded, use [Quantization-aware Training](@ref qat_introduction) to increase accuracy while still achieving faster inference time.
+3. If the quantized model is still too slow, use [Filter Pruning](@ref filter_pruning) to further improve the model's inference speed.
+
+## Additional Resources
+
+- [Quantizing Models Post-training](@ref pot_introduction)
+- [NNCF GitHub repository](https://github.com/openvinotoolkit/nncf)
+- [NNCF FAQ](https://github.com/openvinotoolkit/nncf/blob/develop/docs/FAQ.md)
+- [Quantization Aware Training with NNCF and PyTorch](https://docs.openvino.ai/2022.2/notebooks/302-pytorch-quantization-aware-training-with-output.html)
+- [Quantization Aware Training with NNCF and TensorFlow](https://docs.openvino.ai/2022.2/notebooks/305-tensorflow-quantization-aware-training-with-output.html)
@@ -10,7 +10,7 @@ and run on CPU with the OpenVINO™.
 * A representative calibration dataset representing a use case scenario, for example, 300 samples.
 
 Figure below shows the optimization workflow:
-
+
 
 To get started with the POT tool, refer to the corresponding OpenVINO™ [documentation](https://docs.openvino.ai/latest/openvino_docs_model_optimization_guide.html).
@@ -1,4 +1,4 @@
-# Optimizing Models Post-training {#pot_introduction}
+# Quantizing Models Post-training {#pot_introduction}
 
 @sphinxdirective
 
@@ -16,35 +16,50 @@
 
 @endsphinxdirective
 
+## Introduction
+
+Post-training quantization is a model compression technique where the values in a neural network are converted from a 32-bit or 16-bit format to an 8-bit integer format after the network has been fine-tuned on a training dataset. This helps to reduce the model's latency by taking advantage of computationally efficient 8-bit integer arithmetic. It also reduces the model's size and memory footprint.
+
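For reference (an aside, not part of the commit), the conversion described here is the standard uniform affine quantization mapping:

$$x_q = \mathrm{clamp}\left(\mathrm{round}\left(\frac{x}{s}\right) + z,\ 0,\ 255\right), \qquad x \approx s \cdot (x_q - z)$$

where the scale $s$ and zero-point $z$ are derived from the observed range of the floating-point values.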
-Post-training model optimization is the process of applying special methods without model retraining or fine-tuning, for example, post-training 8-bit quantization. Therefore, this process does not require a training dataset or a training pipeline in the source DL framework. To apply post-training methods in OpenVINO™, you need:
-* A floating-point precision model, FP32 or FP16, converted into the OpenVINO™ Intermediate Representation (IR) format that can be run on CPU.
-* A representative calibration dataset representing a use case scenario, for example, 300 samples.
-* In case of accuracy constraints, a validation dataset and accuracy metrics should be available.
-
-For the needs of post-training optimization, OpenVINO™ provides a **Post-training Optimization Tool (POT)** which supports the **uniform integer quantization** method. This method allows moving from floating-point precision to integer precision (for example, 8-bit) for weights and activations during the inference time. It helps to reduce the model size, memory footprint and latency, as well as improve the computational efficiency, using integer arithmetic. During the quantization process the model undergoes the transformation process when additional operations, that contain quantization information, are inserted into the model. The actual transition to integer arithmetic happens at model inference.
+Post-training quantization is easy to implement and is a quick way to boost model performance. It only requires a representative dataset, and it can be performed using the Post-training Optimization Tool (POT) in OpenVINO. POT is distributed as part of the [OpenVINO Development Tools](@ref openvino_docs_install_guides_install_dev_tools) package. To apply post-training quantization with POT, you need:
+
+* A floating-point precision model, FP32 or FP16, converted into the OpenVINO Intermediate Representation (IR) format.
+* A representative dataset (annotated or unannotated) of around 300 samples that depict typical use cases or scenarios.
+* (Optional) An annotated validation dataset that can be used for checking the model's accuracy.
-The figure below shows the optimization workflow with POT:
-
-
-POT is distributed as a part of OpenVINO™ [Development Tools](@ref openvino_docs_install_guides_install_dev_tools) package and also available on [GitHub](https://github.com/openvinotoolkit/openvino/tree/master/tools/pot).
-
-## Quantizing models with POT
-
-Depending on your needs and requirements, POT provides two main quantization methods that can be used:
-
-* [Default Quantization](@ref pot_default_quantization_usage) -- a recommended method that provides fast and accurate results in most cases. It requires only an unannotated dataset for quantization. For more details, see the [Default Quantization algorithm](@ref pot_compression_algorithms_quantization_default_README) documentation.
-
-* [Accuracy-aware Quantization](@ref pot_accuracyaware_usage) -- an advanced method that allows keeping accuracy at a predefined range, at the cost of performance improvement, when `Default Quantization` cannot guarantee it. This method requires an annotated representative dataset and may require more time for quantization. For more details, see the [Accuracy-aware Quantization algorithm](@ref accuracy_aware_README) documentation.
-
-Different hardware platforms support different integer precisions and quantization parameters. For example, 8-bit is used by CPU, GPU, VPU, and 16-bit by GNA. POT abstracts this complexity by introducing a concept of the "target device" used to set quantization settings, specific to the device.
-
-> **NOTE**: There is a special `target_device: "ANY"` which leads to portable quantized models compatible with CPU, GPU, and VPU devices. GNA-quantized models are compatible only with CPU.
-
-For benchmarking results collected for the models optimized with the POT tool, refer to the [INT8 vs FP32 Comparison on Select Networks and Platforms](@ref openvino_docs_performance_int8_vs_fp32).
+The post-training quantization algorithm takes samples from the representative dataset, inputs them into the network, and calibrates the network based on the resulting weights and activation values. Once calibration is complete, values in the network are converted to 8-bit integer format.
+
+While post-training quantization makes your model run faster and take less memory, it may cause a slight reduction in accuracy. If you performed post-training quantization on your model and find that it isn't accurate enough, try using [Quantization-aware Training](@ref qat_introduction) to increase its accuracy.
+
+### Post-Training Quantization Quick Start Examples
+
+Try out these interactive Jupyter Notebook examples to learn the POT API and see post-training quantization in action:
+
+* [Quantization of Image Classification Models with POT](https://docs.openvino.ai/2022.2/notebooks/113-image-classification-quantization-with-output.html)
+* [Object Detection Quantization with POT](https://docs.openvino.ai/2022.2/notebooks/111-detection-quantization-with-output.html)
+
+## Quantizing Models with POT
+
+The figure below shows the post-training quantization workflow with POT. In a typical workflow, a pre-trained model is converted to OpenVINO IR format using Model Optimizer and then quantized with a representative dataset using POT.
+
+
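To make the workflow concrete, here is a hedged sketch of the POT Python API for Default Quantization (an illustration, not part of the commit; the random-data loader and file paths are placeholders, and the exact `DataLoader` return convention should be checked against your POT version):

```python
import numpy as np
from openvino.tools.pot import (DataLoader, IEEngine, create_pipeline,
                                load_model, save_model)

class RandomCalibrationLoader(DataLoader):
    """Placeholder loader: 300 random NCHW samples stand in for real
    calibration images; Default Quantization needs no annotations."""
    def __len__(self):
        return 300

    def __getitem__(self, index):
        return np.random.rand(3, 224, 224).astype(np.float32)

# Load the IR model produced by Model Optimizer.
model = load_model({"model_name": "model",
                    "model": "model.xml",
                    "weights": "model.bin"})

engine = IEEngine(config={"device": "CPU"},
                  data_loader=RandomCalibrationLoader())

algorithms = [{
    "name": "DefaultQuantization",
    "params": {"target_device": "ANY",   # portable quantized model
               "preset": "performance",
               "stat_subset_size": 300},
}]

pipeline = create_pipeline(algorithms, engine)
quantized_model = pipeline.run(model)
save_model(quantized_model, save_path="./quantized")
```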
+### Post-training Quantization Methods
+
+Depending on your needs and requirements, POT provides two quantization methods: Default Quantization and Accuracy-aware Quantization.
+
+#### Default Quantization
+
+Default Quantization uses an unannotated dataset to perform quantization. It uses representative dataset items to estimate the range of activation values in a network and then quantizes the network. This method is recommended as a starting point, because it produces a fast and accurate model in most cases. To quantize your model with Default Quantization, see the [Quantizing Models](@ref pot_default_quantization_usage) page.
+
+#### Accuracy-aware Quantization
+
+Accuracy-aware Quantization is an advanced method that keeps model accuracy within a predefined range by leaving some network layers unquantized. It trades off speed against accuracy to meet user-specified requirements. This method requires an annotated dataset and may take more time to quantize the model. To quantize your model with Accuracy-aware Quantization, see the [Quantizing Models with Accuracy Control](@ref pot_accuracyaware_usage) page.
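As a follow-up sketch (an illustration, not from the commit), Accuracy-aware Quantization is selected by swapping the algorithm entry in the pipeline shown earlier; `maximal_drop` follows the POT documentation, but verify the field names against your POT version, and note that an annotated dataset and a `Metric` implementation must also be supplied to the engine (not shown):

```python
algorithms = [{
    "name": "AccuracyAwareQuantization",
    "params": {
        "target_device": "ANY",
        "stat_subset_size": 300,
        "maximal_drop": 0.01,  # allow at most a 0.01 absolute metric drop
    },
}]
```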
+### Quantization Best Practices and FAQs
+
+If you quantized your model and it isn't accurate enough, visit the [Quantization Best Practices](@ref pot_docs_BestPractices) page for tips on improving quantized performance. Sometimes, older Intel CPU generations can encounter a saturation issue when running quantized models, which can reduce accuracy; learn more on the [Saturation Issue Workaround](@ref pot_saturation_issue) page.
+
+Have more questions about post-training quantization, or encountering errors when using POT? Visit the [POT FAQ](@ref pot_docs_FrequentlyAskedQuestions) page for answers to frequently asked questions and solutions to common errors.
+
 ## Additional Resources
 
-* [Performance Benchmarks](https://docs.openvino.ai/latest/openvino_docs_performance_benchmarks_openvino.html)
-* [INT8 Quantization by Using Web-Based Interface of the DL Workbench](https://docs.openvino.ai/latest/workbench_docs_Workbench_DG_Int_8_Quantization.html)
+* [Post-training Quantization Examples](@ref pot_examples_description)
+* [Quantization Best Practices](@ref pot_docs_BestPractices)
+* [Post-training Optimization Tool FAQ](@ref pot_docs_FrequentlyAskedQuestions)
+* [Performance Benchmarks](@ref openvino_docs_performance_benchmarks_openvino)
@@ -1,3 +0,0 @@
-version https://git-lfs.github.com/spec/v1
-oid sha256:1f23982591a6acc707c2a8494ed32e8de98fd73521143c1b286b2faee3c3b516
-size 40722
tools/pot/docs/images/workflow_simple.svg | 418 ++++ (new file)
@@ -0,0 +1,418 @@
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<svg width="992" height="528" overflow="hidden" version="1.1" ...>
+  <!-- 418-line Inkscape SVG: the POT workflow diagram referenced above as
+       workflow_simple.svg. It shows labeled boxes ("Model", "Model Optimizer",
+       "OpenVINO IR model", "Optimized IR model") connected by arrows, with
+       database cylinders labeled "Data" feeding the quantization step.
+       Full path data omitted. -->
+</svg>