Calibration for Single-Ended IO Based Design - 1.0 English

ADC DAC Interface LogiCORE IP Product Guide (PG388)

Document ID: PG388
Release Date: 2022-05-16
Version: 1.0 English

The purpose of the calibration block is to ensure that each UI is always sampled at its center for asynchronous signals. The UI is sampled at the same frequency as the data rate; for example, if the interface speed is 1250 Mbps, the PLL clock frequency should be 1250 MHz. Each UI is therefore sampled twice: once at the center of the UI and once at the edge. The center sample is the valid data, and the edge sample is used to keep the clock in the center of the data by updating the delay line. The block diagram of the calibration block is shown in the following figure.
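The 2x-oversampling scheme described above can be sketched in a short behavioral model. This is illustrative only, not the IP's RTL; the assumption that center and edge samples alternate starting with a center sample is made for the example.

```python
# Behavioral sketch of 2x oversampling: with the PLL clock equal to the
# data rate, each UI yields two samples -- one at the UI center (valid
# data) and one at the UI edge (used only to steer the delay line).
def split_samples(samples):
    """Separate an interleaved 2x-oversampled stream into center and edge
    samples. Assumes samples[0] is a center sample and samples[1] the
    following edge sample (ordering assumed for illustration)."""
    centers = samples[0::2]  # valid data bits
    edges = samples[1::2]    # phase information for the delay line
    return centers, edges

centers, edges = split_samples([1, 1, 0, 0, 1, 1, 0, 1])
# centers -> [1, 0, 1, 0], edges -> [1, 0, 1, 1]
```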

Figure 1. Block Diagram of Calibration Implementation for Single-Ended IO Designs

The samples from the delay line are fed into the phase detector circuit, which determines whether the delay line value should be increased or decreased. For each UI, two samples are taken from each bitslice. Depending on whether the clock is early or late, the delay is incremented or decremented.
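The early/late decision can be sketched with a conventional bang-bang (Alexander-style) phase detector built from the two samples per UI; this is a common interpretation of the behavior described above, not a statement of the IP's exact circuit, and the sign convention (early means increase delay) is assumed for illustration.

```python
# Bang-bang phase detector sketch: compare the edge sample against the
# center samples on either side of it. A decision is only possible when
# the data actually transitions between the two center samples.
def phase_detect(prev_center, edge, curr_center):
    """Return +1 (clock early: increment delay), -1 (clock late:
    decrement delay), or 0 (no transition: no phase information)."""
    if prev_center == curr_center:
        return 0        # no data transition, nothing to compare against
    if edge == prev_center:
        return +1       # edge sampled before the transition: clock early
    return -1           # edge sampled after the transition: clock late
```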

Depending on the phase detector output, the delay line values are updated after a certain number of cycles. Once the respective D samples received from the PHY are at the center of the UI, that bitslice is considered locked. After the bitslice is locked, four of the eight bits provided by the PHY are passed to the RX gearbox; the four bits are selected depending on whether the data is N centered or P centered.
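The filtered delay update and the N/P bit selection above can be sketched as follows. The vote threshold, the delay-line range, and the assumption that P-centered data starts at even bit positions within the eight PHY bits are all illustrative choices, not values taken from the IP.

```python
# Sketch of the update step: accumulate early/late votes over a window
# of cycles before nudging the delay-line tap, so a single noisy sample
# cannot move the sampling point.
def update_delay(delay, votes, threshold=16, max_delay=511):
    """Apply one filtered delay-line adjustment.
    threshold and max_delay are illustrative, not the IP's values."""
    total = sum(votes)               # net early(+1)/late(-1) count
    if total >= threshold:
        delay = min(delay + 1, max_delay)
    elif total <= -threshold:
        delay = max(delay - 1, 0)
    return delay

# Sketch of selecting the four data bits from the eight PHY bits once
# the bitslice is locked, based on N/P centering.
def select_nibble(byte_bits, p_centered):
    """Assumes center/edge samples interleave and that P centering means
    the data bits sit at even positions (assumption for illustration)."""
    start = 0 if p_centered else 1
    return byte_bits[start::2]
```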