(Optional) Evaluating the Quantized Model

Vitis AI User Guide (UG1414)


If you already have scripts that evaluate the float model, such as those provided for the models in the Xilinx Model Zoo, you can reuse them by replacing the float model file with the quantized model. Because the quantized model contains custom quantize layers, it must be loaded inside "quantize_scope", for example:

import tensorflow as tf
from tensorflow_model_optimization.quantization.keras import vitis_quantize

# Load inside quantize_scope so the custom quantize layers deserialize correctly.
with vitis_quantize.quantize_scope():
    quantized_model = tf.keras.models.load_model('quantized_model.h5')

After that, evaluate the quantized model just as you would the float model, for example:

quantized_model.compile(
    loss=tf.keras.losses.SparseCategoricalCrossentropy(),
    metrics=[tf.keras.metrics.SparseTopKCategoricalAccuracy()])
quantized_model.evaluate(eval_dataset)
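The eval_dataset above is whatever evaluation data your float-model script already uses. As a minimal sketch, assuming the validation images and labels have been preprocessed and saved as NumPy arrays (the file names below are placeholders, not part of the Vitis AI API), it could be built with tf.data like this:

import numpy as np
import tensorflow as tf

# Hypothetical example: wrap existing validation arrays in a tf.data pipeline.
# 'x_val.npy' / 'y_val.npy' are placeholders for your own preprocessed
# validation images and matching integer labels.
x_val = np.load('x_val.npy')
y_val = np.load('y_val.npy')
eval_dataset = tf.data.Dataset.from_tensor_slices((x_val, y_val)).batch(32)

quantized_model.evaluate(eval_dataset)

Any input format accepted by Keras evaluate() (NumPy arrays, a tf.data.Dataset, or a generator) works here, so the data pipeline from the float-model evaluation script can usually be reused unchanged.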