Error detection latency is the dominant component of the total error mitigation latency. It is a function of the FPGA size (frame count) and the solution clock frequency, as well as the type of error and the position of the error relative to the current position of the silicon readback process. Table: Maximum Device Scan Times at ICAP FMax illustrates full device scan times.
The error detection latency can be bounded as follows:
• Absolute minimum error detection latency is effectively zero, which occurs when an error lands just ahead of the current readback scan position.
• Average error detection latency for detection by ECC is 0.5 × Scan Time ACTUAL.
• Maximum error detection latency for detection by ECC is 1.0 × Scan Time ACTUAL.
• Absolute maximum error detection latency for detection by CRC alone is 2.0 × Scan Time ACTUAL.
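The bounds above can be sketched as a simple calculation. This is an illustrative example only; the scan time value is hypothetical and must be replaced with the actual full-device scan time for a given device and clock frequency:

```python
# Illustrative sketch of the error detection latency bounds.
# scan_time_actual_ms is a hypothetical placeholder value, not a
# figure from the device scan time table.
scan_time_actual_ms = 10.0

latency_bounds_ms = {
    # Error lands just ahead of the readback scan position.
    "minimum": 0.0,
    # Average case for detection by frame-based ECC.
    "average_ecc": 0.5 * scan_time_actual_ms,
    # Worst case for detection by frame-based ECC.
    "maximum_ecc": 1.0 * scan_time_actual_ms,
    # Absolute worst case, detection by CRC alone.
    "maximum_crc_only": 2.0 * scan_time_actual_ms,
}

for case, latency in latency_bounds_ms.items():
    print(f"{case}: {latency} ms")
```

The 2.0 × factor for CRC-only detection reflects that an error missed by the ECC on one pass is caught by the CRC at the end of the following full scan.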
The frame-based ECC method always detects single-bit, double-bit, triple-bit, and all odd-count bit errors in a frame. Most remaining error types are also detected by the frame-based ECC method; it is rare for an error to defeat the ECC and be detected by the CRC alone.