Configuration Guide for the Intel® Distribution of OpenVINO™ toolkit 2018R5 and the Intel® Vision Accelerator Design with an Intel® Arria® 10 FPGA (IEI's Mustang-F100-A10) on Linux*
NOTES:
- For a first-time installation, use all steps.
- Use steps 1 and 2 only after receiving a new FPGA card.
- Repeat steps 2-5 when installing a new version of the Intel® Distribution of OpenVINO™ toolkit.
- Use steps 3-5 when a Neural Network topology used by an Intel® Distribution of OpenVINO™ toolkit application changes.
1. Configure and Install the Intel® Vision Accelerator Design with an Intel® Arria® 10 FPGA
- Download fpga_support_files.tgz from the Intel Registration Center. The files in this .tgz archive are required to ensure your FPGA card and the Intel® Distribution of OpenVINO™ toolkit work correctly.
- Go to the directory where you downloaded the fpga_support_files.tgz archive.
- Unpack the .tgz file:
tar -xvzf fpga_support_files.tgz
A directory named fpga_support_files is created.
- Go to the fpga_support_files directory:
cd fpga_support_files
- Source setup_env.sh to set your environment variables:
source /home/<user>/Downloads/fpga_support_files/setup_env.sh
- Configure the FPGA Driver Blacklist:
sudo mv config/blacklist-altera-cvp.conf /etc/modprobe.d
- Switch to superuser:
sudo su
- Use the setup_env.sh script from fpga_support_files.tgz to set your environment variables:
source /home/<user>/Downloads/fpga_support_files/setup_env.sh
- Change directory to Downloads/fpga_support_files/:
cd /home/<user>/Downloads/fpga_support_files/
- Run the FPGA dependencies script, which allows OpenCL to support Ubuntu* and recent kernels:
./install_openvino_fpga_dependencies.sh
- When asked, select the FPGA card, Intel® GPU, and Intel® Neural Compute Stick 2 so that the correct dependencies are installed.
- If you installed the 4.14 kernel as part of the installation script, reboot the machine and select the new kernel in the Ubuntu (GRUB) boot menu. You will also need to rerun setup_env.sh to set up your environment variables again.
- Install OpenCL™ devices. Enter Y when prompted to install:
aocl install
- Reboot the machine:
reboot
- Use the setup_env.sh script from fpga_support_files.tgz to set your environment variables:
source /home/<user>/Downloads/fpga_support_files/setup_env.sh
- Run aocl diagnose:
aocl diagnose
Your screen displays DIAGNOSTIC_PASSED.
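If you script the installation, the diagnostic step above makes a natural checkpoint. The following is a minimal sketch, not part of the toolkit: check_fpga is an illustrative helper name, and it assumes aocl is on your PATH.

```shell
# Sketch: run the FPGA diagnostics and stop a setup script on failure.
# check_fpga runs the given command and looks for DIAGNOSTIC_PASSED.
check_fpga() {
    # Capture the diagnostic output so it can be inspected and logged.
    output="$("$@" 2>&1)"
    if printf '%s\n' "$output" | grep -q 'DIAGNOSTIC_PASSED'; then
        echo "FPGA diagnostics passed"
        return 0
    fi
    echo "FPGA diagnostics failed:" >&2
    printf '%s\n' "$output" >&2
    return 1
}

# Real usage: check_fpga aocl diagnose || exit 1
```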
2. Set Up the Intel® Vision Accelerator Design with Intel® Arria® 10 FPGA for 2018R5
For the 2018R5 release, the Intel® Distribution of OpenVINO™ toolkit introduced a new board support package (BSP) a10_1150_sg1 for the Intel® Vision Accelerator Design with an Intel® Arria® 10 FPGA, which is included into the fpga_support_files.tgz archive. To program the bitstreams for the Intel® Distribution of OpenVINO™ toolkit R5, you need to program the BSP into the board using the USB blaster.
Note: These steps apply only if you update to the Intel® Distribution of OpenVINO™ toolkit R5. Otherwise, you can skip them.
- Go to the config folder of the fpga_support_files directory, where the a10_1150_sg1 folder is located:
cd /home/<user>/Downloads/fpga_support_files/config/
- Copy the a10_1150_sg1 folder to the board directory:
sudo cp -rf a10_1150_sg1 /opt/altera/aocl-pro-rte/aclrte-linux64/board/
- Convert the BSP files from DOS to UNIX:
sudo chmod +x a10_1150_sg1
find a10_1150_sg1 -type f -print0 | xargs -0 dos2unix
- Set up the USB Blaster:
  - Connect the cable between the board and the host system. Use the letter codes in the diagram below for the connection points:
  - Connect the B end of the cable to point B on the board.
  - Connect the F end of the cable to point F on the FPGA download cable.
  [Figure: USB Blaster connection points between the board (B) and the FPGA download cable (F)]
- Source the setup_env.sh script from the fpga_support_files directory to set up the environment variables:
source /home/<user>/Downloads/fpga_support_files/setup_env.sh
- Update the Intel® FPGA Download Cable rules to program the board without root permissions and to flash the initialization bitstreams so that the Intel® FPGA Download Cable can communicate with the board:
sudo cp config/51-usbblaster.rules /etc/udev/rules.d
- Load the USB rules:
sudo udevadm control --reload-rules && udevadm trigger
- Unplug and re-plug the Intel® FPGA Download Cable to enable JTAG connection.
- Run jtagconfig to ensure that your Intel FPGA Download Cable driver is ready to use:
jtagconfig
Your output is similar to:
1) USB-Blaster [1-6]
02E660DD 10AX115H1(.|E2|ES)/10AX115H2/..
- Download Intel® Quartus® Prime Software Lite Edition 17.1 and install it to the /home/<user>/intelFPGA/17.1 directory.
Note: You will need the complete Intel® Quartus® Prime Software Lite installation when you want to program boardtest_1ddr_top.aocx into the flash for permanent availability.
- Export the Intel® Quartus® Prime Software Lite environment variable:
export QUARTUS_ROOTDIR=/home/<user>/intelFPGA/17.1/quartus
- Use jtagconfig to slow the clock:
jtagconfig --setparam 1 JtagClock 6M
- (OPTIONAL) Confirm the clock is set to 6M:
jtagconfig --getparam 1 JtagClock
You should see the following:
6M
- Go to /opt/altera/aocl-pro-rte/aclrte-linux64/board/a10_1150_sg1/bringup, where boardtest_1ddr_top.aocx is located:
cd /opt/altera/aocl-pro-rte/aclrte-linux64/board/a10_1150_sg1/bringup
- Program the boardtest_1ddr_top.aocx file to the flash so that it remains available even after a power cycle:
aocl flash acl0 boardtest_1ddr_top.aocx
Note: You will need the USB Blaster for this.
- Reboot the host system.
- Check if the host system recognizes the Intel® Vision Accelerator Design with Intel® Arria® 10 FPGA board. Confirm you can detect the PCIe card:
lspci | grep -i Altera
Your output is similar to:
01:00.0 Processing accelerators: Altera Corporation Device 2494 (rev 01)
- Source the setup_env.sh script from the fpga_support_files directory to set up the environment variables:
source /home/<user>/Downloads/fpga_support_files/setup_env.sh
- Uninstall the previous BSP before installing the OpenCL drivers for the R5 BSP:
aocl uninstall /opt/altera/aocl-pro-rte/aclrte-linux64/board/<BSP_package>/
- Export and source the environment script:
export AOCL_BOARD_PACKAGE_ROOT=/opt/altera/aocl-pro-rte/aclrte-linux64/board/a10_1150_sg1
source /opt/altera/aocl-pro-rte/aclrte-linux64/init_opencl.sh
- Install OpenCL™ devices:
aocl install
- Run the diagnose command:
aocl diagnose
You should see DIAGNOSTIC_PASSED before proceeding to the next steps.
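The lspci check above can likewise be wrapped in a helper so a setup script can stop early when the card is not visible on the PCIe bus. A sketch based on the Altera Corporation entry shown above; card_detected is an illustrative name, not part of any toolkit.

```shell
# Sketch: confirm the host sees the Altera PCIe device after reboot.
# card_detected takes captured lspci output and looks for an Altera entry.
card_detected() {
    printf '%s\n' "$1" | grep -qi 'altera'
}

# Only attempt the live check when lspci is actually available.
if command -v lspci >/dev/null 2>&1; then
    if card_detected "$(lspci)"; then
        echo "Intel Arria 10 FPGA board detected on PCIe"
    else
        echo "FPGA board not found; re-seat the card and reboot" >&2
    fi
fi
```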
3. Program a Bitstream
The bitstream you program should correspond to the topology you want to deploy. In this section, you program a SqueezeNet bitstream and deploy the classification sample with a SqueezeNet model that you converted with the Model Optimizer in the earlier steps.
Important: Only use bitstreams from the installed version of the Intel® Distribution of OpenVINO™ toolkit. Bitstreams from older versions are incompatible with later versions of the toolkit. For example, you cannot use the 1-0-1_A10DK_FP16_Generic bitstream when the Intel® Distribution of OpenVINO™ toolkit supports the 2-0-1_A10DK_FP16_Generic bitstream.
Depending on how many bitstreams you selected, the Intel® Distribution of OpenVINO™ toolkit package contains a different folder for each FPGA card type:
- For the Intel® Vision Accelerator Design with Intel® Arria® 10 FPGA, the pre-trained bitstreams are in /opt/intel/openvino/bitstreams/a10_vision_design_bitstreams. This example uses a low-precision SqueezeNet bitstream for the classification sample.
- Rerun the environment setup script:
source /home/<user>/Downloads/fpga_support_files/setup_env.sh
- Change to your home directory:
cd /home/<user>
- Program the bitstream for the Intel® Vision Accelerator Design with Intel® Arria® 10 FPGA:
aocl program acl0 /opt/intel/openvino/bitstreams/a10_vision_design_bitstreams/5-0_PL1_FP11_SqueezeNet.aocx
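Because the bitstream must match the topology you deploy, it can help to keep the mapping in one place in a setup script. This is a sketch only: bitstream_for is an illustrative helper, and only the SqueezeNet path comes from this guide, so check the a10_vision_design_bitstreams folder for the file names your installation actually ships.

```shell
# Sketch: map a topology name to its pre-trained bitstream file.
# Only the SqueezeNet entry is taken from this guide; add others
# after checking the bitstream folder of your installation.
BITSTREAM_DIR=/opt/intel/openvino/bitstreams/a10_vision_design_bitstreams

bitstream_for() {
    case "$1" in
        squeezenet) echo "$BITSTREAM_DIR/5-0_PL1_FP11_SqueezeNet.aocx" ;;
        *) echo "unknown topology: $1" >&2; return 1 ;;
    esac
}

# Real usage: aocl program acl0 "$(bitstream_for squeezenet)"
```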
Optional Steps to Flash the FPGA Card
Note:
- To avoid having to reprogram the board after a power down, a bitstream will be programmed to permanent memory on the Intel® Vision Accelerator Design with Intel® Arria® 10 FPGA. This will take about 20 minutes.
- The following steps 1-5 need to be done only once for a new Intel® Arria® 10 FPGA card.
- Plug in the micro USB cable to the card and your host system.
- Run jtagconfig to ensure that the cable is properly inserted:
jtagconfig
- Use jtagconfig to slow the clock:
jtagconfig --setparam 1 JtagClock 6M
- Store the Intel® Vision Accelerator Design with Intel® Arria® 10 FPGA bitstream on the board:
aocl flash acl0 /opt/intel/openvino/bitstreams/a10_vision_design_bitstreams/5-0_PL1_FP11_SqueezeNet.aocx
Your output is similar to:
USB-BlasterII [1-14]
02E660DD 10AX115H1(.|E2|ES)/10AX115H2/..
020A40DD 5M(1270ZF324|2210Z)/EPM2210
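Before flashing, it is worth confirming programmatically that jtagconfig actually sees a cable. A sketch based on the output format shown above; cable_ready is an illustrative name, not a toolkit command.

```shell
# Sketch: check captured jtagconfig output for a detected USB-Blaster cable.
# Both "USB-Blaster" and "USB-BlasterII" entries match this pattern.
cable_ready() {
    printf '%s\n' "$1" | grep -q 'USB-Blaster'
}

# Real usage:
#   cable_ready "$(jtagconfig)" || { echo "re-seat the USB cable" >&2; exit 1; }
```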
4. Set Up a Neural Network Model for FPGA
In this section, you will create an FP16 model suitable for hardware accelerators. For more information, see the FPGA plugin section in the Inference Engine Developer Guide.
- Create a directory for the FP16 SqueezeNet Model:
mkdir /home/<user>/squeezenet1.1_FP16
- Go to /home/<user>/squeezenet1.1_FP16:
cd /home/<user>/squeezenet1.1_FP16
- Use the Model Optimizer to convert an FP16 SqueezeNet Caffe* model into an optimized Intermediate Representation (IR):
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo.py --input_model /home/<user>/openvino_models/FP32/classification/squeezenet/1.1/caffe/squeezenet1.1.caffemodel --data_type FP16 --output_dir .
- The squeezenet1.1.labels file contains the classes that ImageNet uses. This file is included so that the inference results show text instead of classification numbers. Copy squeezenet1.1.labels to your optimized model location:
cp /home/<user>/openvino_models/ir/squeezenet1.1/FP32/squeezenet1.1.labels .
- Copy a sample image to the release directory. You will use this with your optimized model:
sudo cp /opt/intel/openvino/deployment_tools/demo/car.png ~/inference_engine_samples/intel64/Release
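A successful Model Optimizer run leaves a .xml topology file and a .bin weights file in the output directory. A quick sanity check you might add to a setup script; ir_complete is an illustrative helper, not part of the toolkit.

```shell
# Sketch: confirm the Model Optimizer produced a complete IR for a model,
# i.e. both the .xml topology and the .bin weights file exist.
ir_complete() {
    dir="$1"
    name="$2"
    [ -f "$dir/$name.xml" ] && [ -f "$dir/$name.bin" ]
}

# Real usage: ir_complete /home/<user>/squeezenet1.1_FP16 squeezenet1.1
```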
5. Run a Sample Application
- Go to the samples directory:
cd /home/<user>/inference_engine_samples/intel64/Release
- Use an Inference Engine sample to run a sample application on the CPU:
./classification_sample_async -i car.png -m ~/openvino_models/ir/squeezenet1.1/FP32/squeezenet1.1.xml
Note the CPU throughput in Frames Per Second (FPS). This tells you how quickly the inference is done on the hardware. Now run the inference using the FPGA.
- Add the -d option to target the FPGA:
./classification_sample_async -i car.png -m ~/squeezenet1.1_FP16/squeezenet1.1.xml -d HETERO:FPGA,CPU
The throughput on FPGA is listed and may show a lower FPS. This is due to initialization time. To account for that, the next step increases the number of iterations to get a better sense of the speed at which the FPGA can run inference.
- Use the -ni option to increase the number of iterations. This reduces the impact of initialization:
./classification_sample_async -i car.png -m ~/squeezenet1.1_FP16/squeezenet1.1.xml -d HETERO:FPGA,CPU -ni 100
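To compare the CPU and FPGA runs automatically, you can parse the FPS figure out of the sample's output. This is a sketch only: the "Throughput: <n> FPS" line format is an assumption, so adjust the pattern to whatever your sample version actually prints.

```shell
# Sketch: extract the first FPS number from captured sample output.
# Assumes a line of the form "Throughput: 83.2 FPS" (verify against
# your sample's real output before relying on this).
extract_fps() {
    printf '%s\n' "$1" | sed -n 's/.*Throughput:[[:space:]]*\([0-9.]*\).*/\1/p' | head -n 1
}

# Real usage:
#   cpu_fps="$(extract_fps "$(./classification_sample_async -i car.png \
#       -m ~/openvino_models/ir/squeezenet1.1/FP32/squeezenet1.1.xml)")"
```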
Congratulations, you are done with the Intel® Distribution of OpenVINO™ toolkit installation for FPGA.
Additional Resources
Intel® Distribution of OpenVINO™ toolkit home page: https://software.intel.com/en-us/openvino-toolkit
Intel® Distribution of OpenVINO™ toolkit documentation: https://docs.openvinotoolkit.org/
Inference Engine FPGA plugin documentation: https://docs.openvinotoolkit.org/latest/_docs_IE_DG_supported_plugins_FPGA.html