Analog-to-Digital Converter Clock Optimization: A Test Engineering Perspective

System clock optimization can be both challenging and rewarding. It may be relatively easy to design an analog-to-digital converter (ADC) encode circuit with a respectable 350 femtoseconds (fs) of jitter, but is this adequate for today's high-speed requirements? For example, when testing an AD9446-100 (a 16-bit, 100-MHz ADC) at Nyquist with a 100-MHz sample clock, 350 fs of jitter can degrade the signal-to-noise ratio (SNR) by about 3 dB. When the same device is tested in the third Nyquist zone with a 105-MHz analog input, the SNR degradation can be as much as 10 dB. To reduce the clock jitter to a more tolerable 100 fs or less, the designer needs to understand where the clock jitter is coming from, as well as how much jitter the ADC can tolerate.
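
A quick way to sanity-check figures like these is the commonly used jitter-limited-SNR relationship, SNR_jitter = -20 log10(2*pi*f_analog*t_jitter), combined root-sum-square with the converter's own noise. The short Python sketch below works through that arithmetic; the 81-dB intrinsic SNR and the two test frequencies are illustrative assumptions, not AD9446-100 datasheet values.

    import math

    def jitter_limited_snr_db(f_analog_hz, jitter_s):
        # SNR ceiling (dB) set by sample-clock jitter alone: -20*log10(2*pi*f*tj)
        return -20.0 * math.log10(2.0 * math.pi * f_analog_hz * jitter_s)

    def combined_snr_db(adc_snr_db, f_analog_hz, jitter_s):
        # Root-sum-square combination of converter noise and jitter noise, in dB
        adc_noise = 10.0 ** (-adc_snr_db / 20.0)
        jitter_noise = 2.0 * math.pi * f_analog_hz * jitter_s
        return -20.0 * math.log10(math.hypot(adc_noise, jitter_noise))

    ADC_SNR_DB = 81.0  # assumed intrinsic SNR, for illustration only
    cases = [(50e6, "Nyquist (~50-MHz input)"), (105e6, "3rd Nyquist zone (105-MHz input)")]
    for f_in, label in cases:
        for tj in (350e-15, 100e-15):
            snr = combined_snr_db(ADC_SNR_DB, f_in, tj)
            print(f"{label}, tj = {tj * 1e15:.0f} fs: SNR ~ {snr:.1f} dB "
                  f"(loss ~ {ADC_SNR_DB - snr:.1f} dB)")

Because the jitter term scales directly with analog input frequency, the same 350 fs that is nearly harmless at baseband dominates the noise budget in the higher Nyquist zones, which is why tightening the clock to 100 fs or better pays off.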


