Developing With Vitis AI API_0 - 1.2 English

Vitis AI Library User Guide (UG1354)

Document ID
UG1354
Release Date
2020-07-21
Version
1.2 English
  1. Install the cross-compilation system on the host side; refer to Installation.
  2. Download the xilinx_model_zoo_zcu102-1.2.0-1.aarch64.rpm package and copy it to the board via scp.
  3. Install the Xilinx Model Package on the target side.
    #rpm -ivh xilinx_model_zoo_zcu102-1.2.0-1.aarch64.rpm
    After the installation, the models can be found in the /usr/share/vitis_ai_library/models directory on the target side.
    Note: You do not need to install the Xilinx model package if you want to use your own model.
  4. Git clone the corresponding AI Library from https://github.com/Xilinx/Vitis-AI.
  5. Create a folder under your workspace, using classification as an example.
    $mkdir classification
  6. Create the demo_classification.cpp source file. The main flow is shown below. See ~/Vitis-AI/Vitis-AI-Library/demo/classification/demo_classification.cpp for a complete code example.
    Figure 1. Main Program Flow Chart
  7. Create a build.sh file as shown below, or copy one from the AI Library's demo and modify it.
    #!/bin/sh
    CXX=${CXX:-g++}
    $CXX -std=c++11 -O3 -I. -o demo_classification demo_classification.cpp -lopencv_core -lopencv_video -lopencv_videoio -lopencv_imgproc -lopencv_imgcodecs -lopencv_highgui -lglog -lvitis_ai_library-dpu_task -lvitis_ai_library-model_config -lvart-runner
    
  8. Cross compile the program.
    $sh -x build.sh
  9. Copy the executable program to the target board via scp.
    $scp demo_classification root@IP_OF_BOARD:~/
  10. Execute the program on the target board. Before running the program, make sure the target board has the AI Library installed, and prepare the images you want to test.
    #./demo_classification /usr/share/vitis_ai_library/models/resnet50/resnet50.elf resnet50_0 demo_classification.jpg
Note:
  • demo_classification.cpp uses user-defined pre-processing parameters as input.
  • demo_classification.cpp uses user-defined post-processing code. If you want to use the AI Library's post-processing library instead, see Using the AI Library's Post-Processing Library.