<div><div> <h2>CHAPTER 1</h2> <p><b>PAST, PRESENT, AND FUTURE</b></p> <br>
<p>I believe that if more effort is directed into the No-Man's land between raw sensory data and the distinguishable signals which are the starting point of statistical theory, the second decade of information theory will be as rich in practical improvements in communications techniques as the first was in intellectual clarifications.</p>
<p>—D. Gabor</p> <br>
<p><b>1.1 THREE REVOLUTIONS</b></p>
<p><i>Sensing</i> is the interface between the physical and digital worlds. This text focuses on <i>computational optical sensing</i>, by which we mean the creation of digital information from electromagnetic radiation with wavelengths ranging from 200 to 20,000 nanometers (nm). Optical sensors are incorporated in imagers, spectrometers, communication transceivers, and optical information processing devices. Of these, this text focuses on imaging and spectroscopy. Imagers include microscopes, telescopes, video and still cameras, and machine vision systems. <i>Spectrometers</i> are sensor engines for molecular detection and imaging, chemical analysis, environmental monitoring, and manufacturing process control.</p>
<p>Computational sensing is revolutionizing the design and utility of optical imagers and spectrometers. In emerging applications, optical sensors are the backbone of robotics; transit control systems; security systems; medical diagnostics and genomics; and physical, chemical, and biological research. This text does not specifically consider these applications, but it does provide the reader with a solid foundation to design systems for any of them. The text focuses on</p>
<p>• The relationship between continuous object and optical field parameters and digital image data</p>
<p>• The use of coherence functions, most commonly the cross-spectral density and the power spectral density, to analyze optical systems</p>
<p>• Coding strategies in the design of computational sensors</p>
<p>• The limits of specific spectrometer and imager design strategies</p> <br>
<p>Readers active in physical, chemical, or biological research or nonoptical sensor design should find these topics helpful in understanding the limits of modern sensors. Readers seeking to become expert in optical imaging system design and development will need to supplement this text with courses in digital image processing, lens system design, and optoelectronics. Optical system design is a field of stunning complexity and beauty, and we hope that the basics of system analysis presented here will draw the reader into continuing research and study.</p>
<p>The optical sensing problem is illustrated in Fig. 1.1. The goal is to sense a remote object using signals communicated through the optical field. The sensor consists of optical elements, optoelectronic detectors, and digital processing. In some cases, we consider the remote object to be ambiently illuminated or to be self-luminous. In other cases we may consider temporally or spatially structured illumination as part of the sensor system. The system forms an image of the object consisting of a spatial map of the object radiance or density or of spatially resolved object features such as spectral density, polarization, or even chemical composition.</p>
<p>Figure 1.1 illustrates the culmination of several millennia of optical sensor system development. The history of optical sensors is punctuated by three revolutions:</p>
<p>1. <i>Optical Elements.</i> Optical instruments capable of extending natural vision emerged approximately 700 years ago. Early instruments included spectacles to correct natural vision and the camera obscura for convenient image tracing. Over several hundred years these instruments evolved into microscopes and telescopes. These systems used human vision to transduce light into images. Image storage and communication occurred through handmade copies or traces or through written descriptions.</p>
<p>2. <i>Automatic Image Recording.</i> Photochemical recording began to replace handmade images approximately 200 years ago. The first true photographic processes emerged in 1839 from Daguerre's work in France and Talbot's work in England. Each inventor worked for over a decade to perfect his process. At first, long exposure times limited photographs to static scenes. Early portraits required the subject to remain stationary for several minutes. Daguerre's famous image shot in 1838 from his laboratory overlooking the Boulevard du Temple in Paris is generally considered the first photograph of a human subject, a man standing still to have his shoes shined. Photographs of dynamic scenes emerged over succeeding decades with the development of flash photography, faster optical elements, and faster photochemistry. Consider, however, the revolutionary impact of the introduction of photography. Images recorded prior to 1839 were "retouched" by the human hand. Kings are taller and cleaner-looking than they really were. Commoners are not recorded at all. Only since 1839 can one observe true snapshots of history.</p>
<p>3. <i>Computational Imaging.</i> Electronic imaging began about 80 years ago with the development of video capture systems for television. As with early optics, the first systems enabled people to see the previously unseen, in this case images of remote places, but did not record images for posterity. True computational imaging requires three twentieth-century inventions: (a) optoelectronic signal transduction; (b) signal recording, communication, and digitization; and (c) digital signal processing. Signal transduction began with television, but the first electronic recording system, the Ampex VR-1000 magnetic tape deck, was not introduced until 1956. Digital signal processing emerged during World War II. Initial computational imaging applications emerged from <i>ra</i>dio <i>d</i>etection <i>a</i>nd <i>r</i>anging (radar). Electronic systems continued to emerge through the 1970s with the development of deep-space imaging and facsimile transmission. The period from 1950 through 1980 was also rich in the development of medical imaging based on x-ray and magnetic resonance tomography. The most important inventions for optical imaging during this period included semiconductor focal planes, microprocessors, and memories. These developments resulted in the first digital optical imaging systems by the mid-1980s. These systems have continued to evolve as computational hardware has gone from 1970s-style building-scale data centers, to 1980s-style desktop personal computers, to 1990s-style microprocessors in embedded microcameras.</p> <br>
<p>At the moment of this writing, the displacement of photochemical recording by optoelectronics is nearly complete, but the true implications of the third revolution are only just emerging.
Just as the transition from an image that one could see through a telescope to an image that one could hold in one's hand was profound, so the transition from analog photography to digital imaging is profound: it is not about making old technology better, but about creating new technology. One hopes that this text will advance the continuing process of invention and discovery.</p> <br>
<p><b>1.2 COMPUTATIONAL IMAGING</b></p>
<p>The transition from imaging by photochemistry to imaging by computer is comparable to the transition from accounting by abacus to accounting by computer. Just as computational accounting enables finance on a scale unimaginable in the paper era, computational imaging has drastically expanded the number of imaging systems, the number of images captured, and the utility of images—and yet, what has really changed? Isn't a picture recorded on film or on an electronic focal plane basically the same thing? The electronic version can be stored and recalled automatically, but the film version generally has comparable or better resolution, dynamic range, and sensitivity. How is being digital different or better?</p>
<p>In contrast to a physical object consisting of patterns on paper or film, a digital object is a mathematical entity. The digital object is independent of its physical instantiation in silicon, magnetic dipoles, or dimples on a disk. With proper care in coding and transmission, the digital object may be copied infinitely many times without loss of fidelity. A physical image, in contrast, loses resolution when copied and degrades with time. The primary difference between an analog image and a computational image is that the former is a tangible thing while the latter is an algebraic object.</p>
<p>Early applications exploited the mathematical nature of electronic images by enabling nearly instantaneous image transmission and storage, by creating images of multidimensional objects or invisible fields, and by creating automated image analysis and enhancement systems. New disciplines of <i>computer vision</i> and <i>digital image processing</i> emerged to computationally analyze and enhance image data.</p>
<p>Excellent texts and a strong literature exist in support of computer vision and digital image processing. This text focuses on the tools and methods of an emerging community at the interface between digital and physical imaging and sensing system design. Computational sensing does not replace computer vision or digital image processing. Rather, by providing a more powerful and efficient physical layer, computational sensing provides new tools and options to the digital image processing and interpretation communities.</p>
<p>The basic issue addressed by this text is that the revolutionary opportunity represented by electronic detection and digital signal processing has yet to be fully exploited in sensor system design. The only difference between analog and digital cameras in many cases is that an electronic focal plane has replaced film. The differences between conventional design and computational sensor design are delineated as follows:</p>
<p>• The goal of conventional optical systems, even current electronic cameras and spectrometers, is to create an isomorphism between the object and the image. These systems rely on analog processing by lenses or gratings to form the image. The image is digitized after analog processing.
Only modest improvements are made to the digitized image.</p>
<p>• The goal of computational sensor design, in contrast, is to jointly design analog preprocessing, analog-to-digital conversion, and digital postprocessing to optimize image quality or utility metrics.</p> <br>
<p>Computational imaging systems may not have a "focal plane" or may deliberately distort focal plane data to enhance postprocessing capacity.</p>
<p>The central question, of course, is: <i>How might computational optical sensing improve the performance and utility of optical systems?</i> The short answer to this question is <i>in every way!</i> Computational design improves conventional image metrics, the utility of images for machine vision, and the amenability of images to digital processing. Specific opportunities include the following:</p>
<p>1. <i>Image Metrics.</i> Computational sensing can improve depth of field, field of view, spatial resolution, spectral resolution, signal fidelity, sensitivity, and dynamic range. Digital systems up to the time of this writing have often compromised image quality to obtain the utility of digital signals, but over time digital images will increasingly exceed analog performance on all metrics.</p>
<p>2. <i>Multidimensional Imaging.</i> The goal of a multidimensional imaging system is to reconstruct a digital model of objects in their native embedding spaces. Conventional two-dimensional (2D) images of three-dimensional (3D) objects originate in the capacity of lens and mirror systems to form physical isomorphisms between the fields on two planes. With the development of digital processing, tomographic algorithms have been developed to transform arrays of 2D images into digital 3D object models. Integrated physical and digital design can improve on these methods by eliminating dimensional tradeoffs (such as the need to scan in time for tomographic data acquisition) and by enabling reconstruction of increasingly abstract object dimensions (space–time, space–spectrum, space–polarization, etc.).</p>
<p>3. <i>Object Analysis and Feature Detection.</i> The goal of object analysis is to abstract nonimage data from a scene. In emerging applications, sensors enable completely automated tasks, such as robotic positioning and control, biometric recognition, and human–computer interface management. Current systems emphasize heuristic analysis of images. Integrated design allows direct measurement of low-level physical primitives, such as basic object size, shape, position, polarization, and spectral radiance. Direct measurement of significant primitives can dramatically reduce the computational cost of object analysis. On a deeper level, one can consider object abstraction as measurement on generalized object basis states.</p>
<p>4. <i>Image Compression and Analysis.</i> The goal of image compression is to represent the digital model of an object as compactly as possible. One can regard the possibility of digital compression as a failure of sensor design. If it is possible to compress measured data, one might argue that too many measurements were taken. As with multidimensional imaging and object analysis, current compression algorithms assume a 2D focal model for objects. Current technology seeks a compressed linear basis or a nonlinear feature map capable of efficiently representing a picture. Integrated physical and digital design implements generalized bases and adaptive maps directly in the optical layer.
One has less freedom to implement algorithms in the physical layer than in the digital system, but early data reduction enables both simpler and lower-power acquisition platforms and more efficient data processing.</p> <p>5. <i>Sensor Array Data Fusion and Analysis.</i> Multiaperture imaging is common in biological systems but was alien to artificial imaging prior to the computational age. Modern computational systems will dramatically surpass the multiaperture capabilities of biology by fusing data from many subapertures spanning broad spectral ranges.</p> <br> <p><b>1.3 OVERVIEW</b></p> <p>An optical sensor estimates the state of a physical object by measuring the optical field. The state of the object may be encoded in a variety of optical parameters, including both spatial and spectral features or functions of these features.</p> <p>Referring again to Fig. 1.1, note that an optical sensing system includes</p> <p>1. An <i>embedding space</i> populated by target objects</p> <p>2. <i>A radiation model</i> mapping object properties onto the optical signal</p> <p>3. <i>A propagation model</i> describing the transmission of optical signals across the embedding space</p> <p>4. <i>A modulation model</i> describing the coding of optical signals by optical elements</p> <p>5. <i>A detection model</i> describing transduction of optical signals at electronic interfaces</p> <p>6. <i>An image model</i> describing the relationship of transduced and processed digital data to object parameters</p> <br> <p>Considerable analytical and physical complexity is possible in each of these system components. The radiation model may range from simple scattering or fluorescence up to sophisticated quantum mechanical field–matter interactions. As this is an optics text, we generally ignore the potential complexity of the object–field relationship and simply assume that we wish to image the field itself.</p> <p>This text considers three propagation models:</p> <p>• <i>Geometric fields</i> propagate along rays. A <i>ray</i> is a line between a point on a radiating object and a measurement sensor. In geometric analysis, light propagates in straight lines until it is reflected, refracted, or detected. Geometric fields are discussed in Chapter 2.</p> <p>• <i>Wave fields</i> propagate according to physical wave equations. Wave fields add diffractive effects to the geometric description and enable physical description of the state of the field at any point in space. After review of basic mathematical tools in Chapter 3, we analyze wave fields in Chapter 4.</p> <p>• <i>Correlation fields</i> propagate according to models derived from wave fields, but focus on transformations of optical observables rather than the generally unobservable electric fields. Correlation field analysis combines wave analysis with a simple model of the quantum process of optical detection. After reviewing detection processes in Chapter 5, we develop correlation field analysis in Chapter 6.</p> <br> <p>The progression from geometric to wave to correlation descriptions involves increasing attention to the physical details of the object-measurement mapping system. The geometric description shows how one might form isomorphic and encoded image capture devices, but cannot account for diffractive, spectral, or interferometric artifacts in these systems. The wave model describes diffraction, but cannot explain interferometry, noise, or spectroscopy. 
The correlation model accounts for these effects, but would need augmentation for the analysis of quantum coherence and nonlinear optical effects. We develop optical modulation and detection models consistent with each propagation model in the corresponding chapters. </div></div>
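<p>As a purely illustrative aside (not drawn from the text), the sensing chain enumerated in Section 1.3 can be mimicked in a few lines of Python/NumPy: a hypothetical one-dimensional object stands in for the embedding space and radiation model, a blur-and-sample matrix stands in for the modulation and detection models, and a regularized least-squares inversion stands in for the image model. All names, kernel shapes, and parameter values below are invented for illustration; this is a minimal sketch of the measure-then-invert structure, not a design from this book.</p>
<pre>
# Minimal sketch (illustrative only): linear forward model g = H f + noise,
# followed by a regularized digital inversion. All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

n = 64                                  # number of object samples
f = np.zeros(n)
f[20:28] = 1.0                          # toy "object": a bright bar

# Modulation/detection stand-in: each detector row integrates a shifted blur kernel.
kernel = np.exp(-0.5 * (np.arange(-4, 5) / 2.0) ** 2)
kernel /= kernel.sum()
H = np.array([np.roll(np.pad(kernel, (0, n - kernel.size)), i - 4) for i in range(n)])

# Detected digital data: blurred object plus additive noise.
g = H @ f + 0.01 * rng.standard_normal(n)

# Image model stand-in: Tikhonov-regularized least-squares estimate of the object.
lam = 1e-2
f_hat = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ g)

print("relative reconstruction error:", np.linalg.norm(f_hat - f) / np.linalg.norm(f))
</pre>
<p>The point of the sketch is only structural: the forward map and the digital inversion are specified together, which is the joint physical and digital design emphasized in Section 1.2. A realistic sensor would replace the toy blur matrix with the propagation, modulation, and detection models developed in the later chapters.</p>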