Next, conduct a SystemC simulation of your AI Engine application using the AI Engine SystemC simulator (aiesimulator). This simulator models the AI Engine array, global memory (DDR memory), and the network-on-chip (NoC). Invoke the AI Engine SystemC simulator with the following make command:
make sim
or
cd build
aiesimulator --pkg-dir Work --output-dir aiesimulator-output |& tee aiesimulator-output/aiesim.log
The aiesimulator executes the AI Engine application, in which the AI Engine graph is initialized, run, and terminated by the control thread expressed in the main function. By default, dut.run() runs the graph forever. In our AI Engine application, dut.run(NITER) is specified, where NITER = 1, so the graph runs for one iteration. This means that during the simulation the AI Engine receives one block of data samples through the input ports and outputs one block of data samples through the output ports. The block sizes of the data input, coefficient input, and data output ports are specified by the IN_DATA_WINSZ, IN_COEF_WINSZ, and OUT_DATA_WINSZ global variables, respectively.
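To make the control flow concrete, the main function described above can be sketched as follows. This is a minimal illustration, not the tutorial's actual source: the header name graph.h is an assumption, while dut, NITER, and the init/run/end sequence follow the standard ADF graph API referenced in the text.

```cpp
// main.cpp -- sketch of the control thread (assumed file layout).
#include <adf.h>
#include "graph.h"   // hypothetical header declaring the "dut" graph object

#define NITER 1      // run the graph for one iteration (one block of samples)

int main(void) {
    dut.init();      // load and configure the AI Engine graph
    dut.run(NITER);  // process NITER blocks; dut.run() with no argument runs forever
    dut.end();       // wait for completion and release the AI Engine tiles
    return 0;
}
```

With NITER = 1, each input port consumes one window of IN_DATA_WINSZ or IN_COEF_WINSZ samples and each output port produces one window of OUT_DATA_WINSZ samples before the graph terminates.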