Tips for QAT

Vitis AI User Guide (UG1414)
Document ID: UG1414
Release Date: 2021-12-13
Version: 1.4.1 English

The following are some tips for quantization-aware training (QAT).

Dropout
Experiments show that QAT works better without dropout ops. The tool does not currently support fine-tuning with dropout, so dropout ops should be removed or disabled before running QAT. This can be done by setting is_training=False when using tf.layers, or by calling tf.keras.backend.set_learning_phase(0) when using tf.keras.layers, as shown in the sketch below.
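For example, with tf.keras.layers the learning phase can be switched to inference mode before the model is built or loaded, which makes Dropout layers behave as no-ops during QAT. The following is a minimal sketch; the model definition is hypothetical and for illustration only.

    import tensorflow as tf

    # Set the learning phase to 0 (inference) so that layers such as
    # Dropout are disabled during QAT fine-tuning.
    tf.keras.backend.set_learning_phase(0)

    # Hypothetical model for illustration: with the learning phase set
    # to 0, the Dropout layer below passes its input through unchanged.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dropout(0.5),  # inactive in inference mode
        tf.keras.layers.Dense(10),
    ])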
Hyper-param
QAT is similar to float model training/fine-tuning, so the techniques used for float model training/fine-tuning also apply to QAT. The optimizer type and the learning rate schedule are among the most important hyper-parameters to tune.
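As an illustration, QAT fine-tuning is often started from a learning rate well below the one used for the original float training, with a decaying schedule. The following is a minimal sketch using standard tf.keras APIs; the optimizer choice and all numeric values are assumptions to be tuned per model, not recommendations from this guide.

    import tensorflow as tf

    # Illustrative values only: start from a small learning rate and
    # decay it exponentially over the QAT fine-tuning run.
    lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
        initial_learning_rate=1e-5,
        decay_steps=1000,
        decay_rate=0.96,
    )

    # SGD with momentum is one common choice; Adam is another option.
    optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule,
                                        momentum=0.9)

    # Assumes `model` is the Keras model being fine-tuned with QAT.
    model.compile(
        optimizer=optimizer,
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )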