Time-stretched analog-to-digital converters (ADCs) have offered revolutionary enhancements in the performance of electronic converters by reducing the signal bandwidth prior to digitization. An inherent limitation of the time-stretched ADC is the frequency-selective response of the optical system, which reduces the effective number of bits for ultrawideband signals. This paper proposes a solution based on spatio-temporal digital processing. The digital algorithm exploits optical phase diversity to create a flat RF frequency response, even when the system's transfer function includes deep nulls within the signal spectrum. For a $10\times$ time-stretch factor with a 10-GHz input signal, simulations show that the proposed solution increases the overall achievable signal-to-noise-and-distortion ratio to 52 dB in the presence of linear distortions. The proposed filter can also be used to mitigate the dispersion penalty in other fiber-optic applications.
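The phase-diversity principle behind the abstract can be sketched numerically. In the toy model below, two phase-diverse channels have frequency responses with nulls at complementary RF frequencies, and a maximal-ratio-style digital combiner recovers a flat overall response. The specific raised-cosine transfer functions and combiner weights are illustrative assumptions for this sketch, not the transfer functions or the algorithm derived in the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's exact model): two phase-diverse
# channels whose dispersion-penalized responses null at different RF
# frequencies, combined digitally into a flat response.
f = np.linspace(0.0, 10e9, 1001)      # RF frequency grid, 0-10 GHz
theta = np.pi * f / 10e9              # assumed dispersion-induced phase term

H1 = np.cos(theta) ** 2               # channel 1: deep null at 5 GHz
H2 = np.sin(theta) ** 2               # channel 2: nulls at 0 and 10 GHz

# Maximal-ratio-style combining: weight each channel by its own response,
# normalized by the total power. The denominator cos^4 + sin^4 >= 0.5,
# so no division by zero occurs even at either channel's null.
den = H1 ** 2 + H2 ** 2
w1, w2 = H1 / den, H2 / den

H_combined = w1 * H1 + w2 * H2        # flat (identically 1) across the band
```

Because `cos^2 + sin^2 = 1` only holds for the assumed complementary responses, a practical combiner would estimate the measured channel responses first; the sketch only shows why two channels with non-coincident nulls suffice to avoid an unrecoverable spectral null.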
© 2007 IEEE
Alireza Tarighat, Shalabh Gupta, Ali H. Sayed, and Bahram Jalali, "Two-Dimensional Spatio-Temporal Signal Processing for Dispersion Compensation in Time-Stretched ADC," J. Lightwave Technol. 25, 1580-1587 (2007)