What distinguishes engineering from other technical areas is inventiveness in solving problems even when complete understanding is lacking. Consider that steam engines were developed before thermodynamics was well understood, and that it took mathematicians to put Fourier’s method on a firm theoretical foundation. Once established, theory is used to make continued advances. Such is the case in imaging as well. Spectacles were used long before an understanding of optics was developed, but it was that understanding that led to the development of more sophisticated instruments.
Optical sensing and imaging now stands at the edge of just such a gap between new applications and new understanding. The gap represents a desire, but as yet few tools, for combining optics and electronics optimally in image and sensor processing. Examples of how optics and electronics could enhance processing by working together, rather than independently, began appearing 20 years ago, when the power of electronics was beginning to have a widespread, noticeable impact on processing. Rather than placing the entire burden of processing on the optics, as was the case 40 years ago, or entirely on the electronics, as is most typical today, the emerging trend is toward systems that balance processing between optics and electronics, referred to here as integrated computational imaging systems (ICIS).
The maturing of these ideas over the past 10 years is one of the primary reasons for this focus issue in Optics Express and follows previous attention provided by OSA’s Topical Meeting on Integrated Computational Imaging Systems held in Albuquerque in November 2001 and the Applied Optics’ Feature Issue on Integrated Analysis and Design of Analog and Digital Processing in Imaging Systems (October 2002). The eight papers that appear here provide a glimpse of the applications and thinking that are slowly defining the ICIS philosophy.
The first paper, by Mait et al., is in effect a continuation of this introduction and places developments in imaging in historical perspective. The paper indicates that, with regard to optics and electronics complementing each other, three distinct approaches have begun to appear. Wavefront encoding uses optics to preferentially mark information that is subsequently processed electronically. In multiplex imaging, the optics is used to produce redundant information, which makes it easier for the electronics to extract or enhance. The third approach is feature extraction, in which transforms implemented optically are processed electronically to produce a quantitative result. The remaining papers expand on these ideas.
Kubala et al. show how wavefront encoding can be applied to a simple case of aberration correction. In an optical system in which a single optical element is incapable of forming an image without severe aberrations, the conventional solution is to add a second element. Wavefront encoding, however, allows the second element to be replaced by electronic processing, which reduces weight and volume while maintaining system performance.
Multiplex systems for extracting spectral information are discussed in the papers by H. S. Pal and M. A. Neifeld, and Xu et al. Pal and Neifeld use multiplex imaging to extract spectral features from images. The system demonstrated by Xu and co-workers exploits the spatially random spectral variations in a photonic bandgap structure to extract spectral information.
Potuluri et al. and Gopinathan et al. apply multiplex imaging to motion detection. Their systems are essentially extensions of coded aperture imaging in which both the source and the aperture are three-dimensional. Potuluri and co-workers exploit randomness in a three-dimensional structure to encode information about source points in space. In contrast, Gopinathan and co-workers use deterministic structures with multiple detectors to provide motion detection with reduced noise.
The final paper, by A. Ashok and M. A. Neifeld, attempts to provide an understanding of some of the properties displayed in the previous papers. New imaging modalities require new tools for analysis, especially when visually pleasing images are no longer the goal. One such tool is information theory as applied to imaging systems. The paper presents analytic values of the mutual information for three different incoherent imaging systems: a conventional one, one with a cubic asphere in the pupil plane, and one with a random phase in the pupil plane. The effectiveness of random phase as a means for introducing diversity and, therefore, information is confirmed in this work.