AI Engine

Versal Adaptive SoC Technical Reference Manual (AM011)

Document ID: AM011
Release Date: 2023-10-05
Revision: 1.6 English

The AI Engine is a two-dimensional array of tiles, each containing a high-performance VLIW vector (SIMD) processor, integrated memory, and interconnects for streaming, configuration, and debug. At the bottom of the array, the AI Engine array interface tiles provide the logic required to connect the AI Engine to the PL, the PS, and the NoC.
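
Conceptually, each tile's vector processor works on wide SIMD registers inside software-pipelined loops. The following is a minimal sketch of such a kernel, assuming the Vitis AIE kernel API (aie_api/aie.hpp); the function name vadd_kernel, the 16-lane int16 vector width, and the buffer arguments are illustrative assumptions, not taken from this manual.

    // Hypothetical element-wise add kernel for one AI Engine tile.
    // Assumes the Vitis AIE API and a 16-lane int16 vector width.
    #include <aie_api/aie.hpp>

    void vadd_kernel(const int16 *__restrict a,
                     const int16 *__restrict b,
                     int16 *__restrict c,
                     unsigned num_samples)
    {
        // Each iteration loads, adds, and stores 16 samples; the VLIW
        // instruction slots let the loads, the add, and the store overlap.
        for (unsigned i = 0; i < num_samples; i += 16) {
            aie::vector<int16, 16> va = aie::load_v<16>(a + i);
            aie::vector<int16, 16> vb = aie::load_v<16>(b + i);
            aie::store_v(c + i, aie::add(va, vb));
        }
    }

In practice, a kernel like this is instantiated as a node in an ADF dataflow graph, and its ports are mapped onto the tile memories and streaming interconnect described above.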

There are two versions of the AI Engine:

  • Original AI Engine (AI Engine)
  • Enhanced machine-learning version (AI Engine-ML)

Multiple devices are available that scale the size of the AI Engine array and the programmable logic (PL) for many high-end applications, including AI, cloud-based workloads, high-performance networking, and more.

The number of tiles in the AI Engine and the size of the PL are listed in Versal Architecture and Product Data Sheet: Overview (DS950).

AI Engine

The Versal AI Edge and AI Core devices include the AI Engine with its compute engine tiles and array interface tiles.

For more information about the AI Engine, see AI Engines and Their Applications (WP506) and Versal Adaptive SoC AI Engine Architecture Manual (AM009).

AI Engine-ML

The AI Engine-ML version of the AI Engine increases data processing performance for machine learning inference applications. The AI Engine-ML adds native processor instruction support for the INT4 and BFLOAT16 data types. The array's memory tiles include DMAs that support 4D tensor address generation for machine learning applications.
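
To make 4D tensor address generation concrete, the sketch below models it in plain C++ as four nested wrap/stride dimensions; the Dim structure, the dimension ordering, and the example pattern are assumptions for illustration and do not represent the actual AIE-ML DMA buffer-descriptor format.

    // Illustrative model of a 4D address generator: four dimensions,
    // each with a wrap (iteration count) and a stride (element step).
    // The AIE-ML memory-tile DMA performs a comparable traversal in
    // hardware; this descriptor layout is assumed for illustration.
    #include <cstddef>
    #include <cstdint>

    struct Dim {
        uint32_t wrap;   // number of steps in this dimension
        int32_t  stride; // element offset applied per step
    };

    // Calls visit() with the linear element offset of every address
    // produced by the 4D pattern, innermost dimension (dims[0]) first.
    template <typename Visit>
    void walk_4d(const Dim dims[4], Visit visit)
    {
        for (uint32_t d3 = 0; d3 < dims[3].wrap; ++d3)
            for (uint32_t d2 = 0; d2 < dims[2].wrap; ++d2)
                for (uint32_t d1 = 0; d1 < dims[1].wrap; ++d1)
                    for (uint32_t d0 = 0; d0 < dims[0].wrap; ++d0)
                        visit(std::ptrdiff_t(d3) * dims[3].stride +
                              std::ptrdiff_t(d2) * dims[2].stride +
                              std::ptrdiff_t(d1) * dims[1].stride +
                              std::ptrdiff_t(d0) * dims[0].stride);
    }

    // Example pattern: walk 2x2 sub-tiles of an 8x8 row-major matrix,
    // the kind of tiled traversal a matrix-multiply kernel needs:
    //   const Dim pattern[4] = {{2, 1}, {2, 8}, {4, 2}, {4, 16}};

Because the DMA produces this address sequence itself, the tile processor does not spend cycles on per-element address arithmetic for tensor-shaped transfers.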

The AI Engine-ML is available in select devices. See Versal Architecture and Product Data Sheet: Overview (DS950) for the device generations and specific devices that include it.

The AI Engine-ML architecture and array are described in Versal Adaptive SoC AIE-ML Architecture Manual (AM020).