OpenVINO™ Toolkit - Deep Learning Deployment Toolkit repository
This toolkit allows developers to deploy pre-trained deep learning models through a high-level C++ Inference Engine API integrated with application logic.
This open source version includes two components: the Model Optimizer and the Inference Engine, as well as CPU, GPU, and heterogeneous plugins to accelerate deep learning inference on Intel® CPUs and Intel® Processor Graphics. It supports pre-trained models from the Open Model Zoo, along with 100+ open source and public models in popular formats such as Caffe*, TensorFlow*, MXNet*, and ONNX*.
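A typical deployment with the high-level C++ Inference Engine API follows a read–load–infer flow. The sketch below assumes a model already converted by the Model Optimizer to IR format; the file names (`model.xml`, `model.bin`) and the choice of the `CPU` device are illustrative placeholders, not part of this repository.

```cpp
// Minimal sketch of the C++ Inference Engine workflow described above.
// Requires the OpenVINO / DLDT Inference Engine headers and libraries.
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;

    // Read a model produced by the Model Optimizer (IR format: .xml + .bin).
    // The paths here are hypothetical placeholders.
    InferenceEngine::CNNNetwork network =
        core.ReadNetwork("model.xml", "model.bin");

    // Load the network onto a device plugin ("CPU", "GPU", or "HETERO:...").
    InferenceEngine::ExecutableNetwork exec_network =
        core.LoadNetwork(network, "CPU");

    // Create an inference request and run synchronous inference.
    InferenceEngine::InferRequest request = exec_network.CreateInferRequest();
    request.Infer();

    return 0;
}
```

After `Infer()` returns, output blobs can be retrieved from the request by tensor name and passed back into application logic.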
Repository components:
License
Deep Learning Deployment Toolkit is licensed under Apache License Version 2.0. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
Documentation
- OpenVINO™ Release Notes
- OpenVINO™ Inference Engine Build Instructions
- Get Started with Deep Learning Deployment Toolkit on Linux*
- Introduction to Deep Learning Deployment Toolkit
- Inference Engine Developer Guide
- Model Optimizer Developer Guide
How to Contribute
See CONTRIBUTING for guidelines on contributing to the code. See CONTRIBUTING_DOCS for guidelines on contributing to the documentation. Thank you!
Support
Please report questions, issues, and suggestions using:
- The `openvino` tag on StackOverflow*
- GitHub* Issues
- Forum
* Other names and brands may be claimed as the property of others.