Tips for QAT

Vitis AI User Guide (UG1414)

Document ID: UG1414
Release Date: 2022-01-20
Version: 2.0 English

The following are some tips for quantization-aware training (QAT).

Dropout
Experiments show that QAT works better without dropout ops. The tool does not currently support finetuning with dropout, so dropout layers should be removed or disabled before running QAT. This can be done by setting is_training=false when using tf.layers, or by calling tf.keras.backend.set_learning_phase(0) when using tf.keras.layers, as sketched below.
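The following is a minimal sketch for tf.keras models. The helper strip_dropout and the assumption of a simple Sequential topology are illustrative only, not part of the Vitis AI API.

```python
import tensorflow as tf

# Force the global learning phase to inference (0) so that
# tf.keras.layers.Dropout layers act as identity during QAT finetuning.
tf.keras.backend.set_learning_phase(0)

# Alternative sketch: remove Dropout layers entirely before handing the
# model to the quantizer. This assumes a simple Sequential model; for
# functional models the graph would need to be rebuilt instead.
def strip_dropout(model: tf.keras.Sequential) -> tf.keras.Sequential:
    kept = [layer for layer in model.layers
            if not isinstance(layer, tf.keras.layers.Dropout)]
    return tf.keras.Sequential(kept)
```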
Hyper-param
QAT is similar to float-model training/finetuning, so the usual techniques for float-model training/finetuning also apply here. The optimizer type and the learning-rate schedule are among the most important hyperparameters to tune; a sketch follows.
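As an illustration only (the optimizer choice and all values below are assumptions, not Vitis AI recommendations), QAT finetuning is often started from a learning rate well below the one used for float training, with a decaying schedule:

```python
import tensorflow as tf

# Hypothetical settings: a small initial learning rate (e.g. a fraction of
# the float-training rate) with exponential decay is a common starting
# point for finetuning; tune these values for your own model and dataset.
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-4,
    decay_steps=1000,
    decay_rate=0.9,
)
optimizer = tf.keras.optimizers.SGD(learning_rate=schedule, momentum=0.9)

# The quantized model is then compiled and trained like a float model:
# quantized_model.compile(optimizer=optimizer, loss=..., metrics=[...])
# quantized_model.fit(train_dataset, epochs=...)
```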