One-step Pruning - 2.0 English

Vitis AI Optimizer User Guide (UG1333)

Document ID: UG1333
Release Date: 2022-01-20
Version: 2.0 English

One-step pruning implements the EagleEye 1 algorithm. EagleEye shows that a simple yet efficient evaluation component, adaptive batch normalization, produces a strong positive correlation between the accuracy of a pruned candidate model and its accuracy after fine-tuning. This makes it possible to identify the subnetwork with the highest potential accuracy without actually fine-tuning every candidate. In short, the one-step pruning method searches for a set of subnetworks (that is, candidate pruned models) that meet the required model size, evaluates them, and selects the most promising one. The selected subnetwork is then retrained to recover accuracy.
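The evaluation component works because pruning channels invalidates the batch normalization statistics inherited from the full model; recomputing them on a handful of training batches restores a meaningful accuracy signal. The sketch below illustrates that recalibration for a single BN layer's running statistics. It is illustrative only (the function name, initial values, and momentum are assumptions, not the Vitis AI API):

```python
import statistics

def adaptive_bn_stats(batches, momentum=0.5):
    """Recalibrate one BN layer's running statistics for a pruned candidate.

    Instead of fine-tuning, adaptive batch normalization forwards a few
    training batches through the pruned model and updates the BN running
    mean/variance with the usual exponential moving average. Each `batch`
    here stands in for the layer's pre-normalization activations.
    """
    running_mean, running_var = 0.0, 1.0  # stale statistics from the full model
    for batch in batches:
        batch_mean = statistics.fmean(batch)
        batch_var = statistics.pvariance(batch)
        running_mean = (1 - momentum) * running_mean + momentum * batch_mean
        running_var = (1 - momentum) * running_var + momentum * batch_var
    return running_mean, running_var
```

After a few batches the running statistics converge toward the activation distribution of the pruned network, so a quick pass over a validation subset yields an accuracy estimate that tracks the post-fine-tuning accuracy.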

The pruning steps are as follows:

  1. Search for subnetworks that meet the required pruning ratio.
  2. Select the most promising subnetwork from the pool using the evaluation component.
  3. Fine-tune the pruned model.
Figure 1. One-step Pruning Workflow
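The first two steps above can be sketched as a search-evaluate-select loop. Everything in this sketch is illustrative, not the Vitis AI API: candidates are random per-layer pruning ratios, and `toy_evaluate` stands in for adaptive batch normalization followed by a quick accuracy measurement on a held-out set.

```python
import random

def random_candidate(num_layers, target_ratio, tol=0.05):
    """Step 1: sample per-layer pruning ratios whose mean is near the target."""
    while True:
        ratios = [random.uniform(0.0, 2 * target_ratio) for _ in range(num_layers)]
        if abs(sum(ratios) / num_layers - target_ratio) <= tol:
            return ratios

def one_step_prune(num_candidates, num_layers, target_ratio, evaluate):
    """Steps 1-2: search a pool of subnetworks, keep the best-scoring one.

    `evaluate` stands in for adaptive-BN recalibration plus a quick accuracy
    check; step 3 (fine-tuning) would then be applied to the returned winner.
    """
    pool = [random_candidate(num_layers, target_ratio)
            for _ in range(num_candidates)]
    return max(pool, key=evaluate)

# Toy stand-in score: pretend accuracy is highest when pruning is spread evenly.
def toy_evaluate(ratios):
    mean = sum(ratios) / len(ratios)
    return -sum((r - mean) ** 2 for r in ratios)

best = one_step_prune(num_candidates=20, num_layers=4,
                      target_ratio=0.5, evaluate=toy_evaluate)
```

Because only the single selected subnetwork is fine-tuned, the cost of exploring many candidates stays close to the cost of one fine-tuning run.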

Note:
  1. Bailin Li et al., EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning, arXiv:2007.02491