
Showing papers on "Image sensor published in 1999"


Patent
24 Aug 1999
TL;DR: In this article, a vehicle system is disclosed that includes a vehicle lamp assembly including a plurality of LEDs that emit white light so as to function as an illuminator light, and a controller that rapidly pulses the LEDs on and off at a rate that is imperceptible to the human eye.
Abstract: A vehicle system is disclosed that includes a vehicle lamp assembly including a plurality of LEDs that emit white light so as to function as an illuminator light. The lamp assembly also may include a plurality of LEDs that emit colored light, such as red or red-orange, so as to function as a signal light. Alternatively or additionally, the lamp assembly may include a camera of a vehicle imaging system. The lamp assembly may serve as a center high mounted stop light or as a tail light. The system also includes a controller that rapidly pulses the LEDs on and off at a rate that is imperceptible to the human eye. The pulsing intervals of the LEDs may be related to the readout intervals of the camera sensor array. In this manner, the LEDs may be pulsed on during camera readout so as to increase their intensity while the camera is capturing an image, or may be pulsed off during camera readout to prevent feedback glare from interfering with image capture by a highly sensitive image sensor array of the camera.
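The readout-synchronized pulsing can be pictured as a gating schedule over one frame: the LEDs are forced on (or off) whenever the sensor is reading out, and do the opposite elsewhere. A hypothetical sketch of that timing logic (the window tuples, mode names, and the function itself are illustrative, not from the patent):

```python
def led_schedule(readout_windows, frame_end, mode="boost"):
    """Build (start, end, led_on) segments for one frame.

    mode="boost": LEDs pulsed on during sensor readout, raising their
    apparent intensity while the camera captures an image.
    mode="anti-glare": LEDs pulsed off during readout, so feedback glare
    cannot reach a highly sensitive image sensor.
    """
    on_during_readout = (mode == "boost")
    segments, t = [], 0.0
    for start, end in readout_windows:
        if t < start:
            # Between readouts the LEDs take the opposite state.
            segments.append((t, start, not on_during_readout))
        segments.append((start, end, on_during_readout))
        t = end
    if t < frame_end:
        segments.append((t, frame_end, not on_during_readout))
    return segments
```

For one readout window from t=1 to t=2 in a 3-unit frame, "boost" mode lights the LEDs only inside the window, and "anti-glare" mode does the reverse.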

431 citations


Patent
11 Jun 1999
TL;DR: In this paper, a two-dimensional image sensor, apparatus for focusing images at different focal distances, an aiming system, a hi-low beam illumination system, and related signal processing circuits are described.
Abstract: An imaging engine (10) and signal processing devices (72) and methods are disclosed for reading various kinds of optical codes. The compact structure (54') may include a two-dimensional image sensor, apparatus for focusing images at different focal distances, an aiming system, a hi-low beam illumination system, and related signal processing circuits.

391 citations


Book
01 Jan 1999
TL;DR: This three-volume handbook covers computer vision as practiced up to and including the 1990s, from illumination, image formation, and solid-state image sensing through signal representation and feature estimation to industrial and scientific applications.
Abstract: Contents of Volume One: Preface. Contributors. B. Jahne, Introduction. Illumination and Image Formation. H. Hausecker, Radiation. H. Hausecker, Interaction of Radiation with Matter. P. Geisler, Geometric and Wave Optics. H. Hausecker, Radiometry of Imaging. H. Hausecker and B. Jahne, Illumination Sources and Techniques. Imaging Sensors. P. Seitz, Solid-State Image Sensing. U. Seger, U. Apel and B. Hofflinger, HDRC-Imagers for Natural Visual Perception. B. Schneider, P. Rieve and M. Bohm, Image Sensors in TFA (Thin Film on ASIC) Technology. S. Sedky and P. Fiorini, Poly SiGe Bolometers. B. Jahne, Hyperspectral and Color Imaging. 2-D Imaging. D. Uttenweiler and R.H.A. Fink, Dynamic Fluorescence Imaging. H. Stegmann, R. Wepf and R.R. Schroder, Electron Microscopic Image Acquisition. W. Albert and M. Pandit, Processing of Ultrasound Images in Medical Diagnosis. M.J. Buckingham, Acoustic Daylight Imaging in the Ocean. R. Massen, The Multisensorial Camera for Industrial Vision Applications. 3-D Imaging. R. Godding, Geometric Calibration of Digital Imaging Systems. R. Schwarte et al., Principles of 3-D Imaging Techniques. G. Hausler, 3-D Sensors - Potentials and Limitations. R.W. Malz, High Performance Surface Measurement. E.H.K. Stelzer, Three-Dimensional Light Microscopy. W.G. Schreiber and G. Brix, Magnetic Resonance Imaging in Medicine. A. Haase, J. Ruff and M. Rokitta, NMR Microscopy in Biological and Medical Research. Index. Color Plates. Contents of Volume Two: Preface. Contributors. B. Jahne, Introduction. Signal Representation. B. Jahne, Continuous and Digital Signals. B. Jahne, Spatial and Fourier Domain. B. Jahne, Multiresolutional Signal Representation. B. Jahne, Neighborhood Operators. B. Jahne, H. Scharr and S. Korkel, Principles of Filter Design. B. Jahne, Local Averaging. B. Jahne, Interpolation. B. Jahne, Image Warping. Feature Estimation. B. Jahne, Local Structure. T. Lindeberg, Principles for Automatic Scale Selection. T. 
Wagner, Texture Analysis. H. Hausecker and H. Spies, Motion. E.P. Simoncelli, Bayesian Multi-Scale Differential Optical Flow. J. Weickert, Nonlinear Diffusion Filtering. C. Schnorr, Variational Methods. H.A. Mallot, Stereopsis - Geometrical and Global Aspects. G. Gimel'farb, Stereo Terrain Reconstruction by Dynamic Programming. R. Klette, R. Kozera and K. Schluns, Reflectance Based Shape Recovery. P. Geisler and T. Dierig, Depth from Focus. Object Analysis, Classification, Modeling, Visualization.P. Soille, Morphological Operators. H. Hausecker and H.R. Tizhoosh, Fuzzy Image Processing. A. Meyer-Base, Neural Net Computing for Image Processing. D. Willersinn et al., Graph Theoretical Concepts for Computer Vision. R. Eils and K. Satzler, Shape Reconstruction from Volumetric Data. J. Hornegger, D. Paulus and H. Niemann, Probabilistic Modeling in Computer Vision. H. Niemann, Knowledge-Based Interpretation of Images. J. Hesser and C. Poliwoda, Visualization of Volume Data. N. Salmon, S. Lindek and E.H.K. Stelzer, Databases for Microscopes and Microscopical Images. Index. Color Plates. Contents of Volume Three: Preface. Contributors. B. Jahne, Introduction. Architecture of Computer Vision Systems..K.-H. Noffz et al., FPGA Image Processing. B. Jahne and H. Herrmann, Multimedia Architectures. A.M. Demiris, C.E. Cardenas, and H.-P. Meinzer, Customizable Medical Image Processing Systems. D. Paulus, J. Hornegger and H. Niemann, Software Engineering for Image Processing and Analysis. U. Kothe, Reusable Software in Computer Vision. P. Klausmann et al., Application-oriented Assessment of CV Algorithms. G. Hartmann, U. Buker and S. Drue, A Hybrid Neuro-AI-Architecture. B. Mertsching and S. Schmaiz, Active Vision Systems. G. Sommer, The Global Algebraic Frame of the Perception-Action Cycle. Industrial and Technical Applications. K. Singer, Market and Future Needs of Industrial Imaging. P. Soille, Applications of Morphological Operators. T. Wagner and P. 
Plankensteiner, Industrial Object Recognition. R. Koy-Oberthur, T. Munsterer, and S. Sun, Character Recognition in Industrial Production. R. Frischholz, Motion Tracking. H.A. Beyer, 3-D Image Metrology for Industrial Production. S. Karbacher, G. Hausler and H. Schonfeld, Reverse Engineering Using Optical Range Sensors. T. Scheuermann, G. Wiora and M. Graf, Topographical Maps of Microstructures. P. Soille, Processing of Digital Elevation Maps. R. Koch, 3-D Modeling of Objects from Image Sequences. N. Stein and B. Minge, 3-D Fast Full Body Scanning. B. Radig et al., 3-D Model Driven Person Detection. S. Lancer, C. Zierl and B. Radig, Single Perspective 3-D Object Recognition. T. Vetter, Flexible Models of Human Faces. Th. Hermes, C. Klauck and O. Herzog, Knowledge-Based Image Retrieval. M. Keller et al., A Tactile Vision Substitution System. B. Mertsching et al., The Neural Active Vision System NAVIS. E.D. Dickmanns and H.-J. Wunsche, Dynamic Vision for Perception and Control of Motion. Scientific Applications. P. Geisler and T. Scholz, Size Distributions of Small Particles. S. Eichkorn et al., Fluorescence Imaging of Air-Water Gas Exchange. F. Hering et al., Particle Tracking and Particle Imaging Velocimetry. H. Spies et al., Analyzing Particle Movements at Soil Interfaces. D. Schmundt and U. Schurr, Plant Leaf Growth Studies. D. Uttenweiler and R.H.A. Fink, Mathematical Modeling of Ca -Fluorescence Images. U. Schimpf, H. Hausecker and B. Jahne, Thermography for Small-Scale Air-Sea Interaction. B. Kummerlen et al., Thermography to Measure Water Relations of Plant Leaves. C. Leue, M. Wenig and U. Platt, Retrieval of Atmospheric Trace Gas Concentrations. J.L. Barron et al., Tracking "Fuzzy" Storms in Doppler Radar Images. R. Watzel et al., Detection of Dendritic Spine Synapses. C. Cremer et al., Spectral Precision Distance Confocal Microscopy. H. Bornfleth et al., 3-Dimensional Analysis of Genome Topology. Index. Color Plates.

380 citations


Journal ArticleDOI
TL;DR: Applications of the constrained and unconstrained algorithms of the MMT technique are illustrated with examples of unmixing and fusion of the multiresolution reflective and thermal bands of a real TM/LANDSAT image as well as of a simulated image of the future ASTER/EOS-AM1 sensor.
Abstract: Constrained and unconstrained algorithms of the multisensor multiresolution technique (MMT) are discussed. They can be applied to unmix low-resolution images using the information about their pixel composition from co-registered high-resolution images. This makes it possible to fuse the low- and high-resolution images for a synergetic interpretation. The constrained unmixing preserves all the available radiometric information of the low-resolution image. On the other hand, the unconstrained unmixing may be preferable in the case of noisy data. An analysis of the MMT sensitivity to sensor errors showed that the strongest requirement is the accuracy of geometric co-registration of the data; the co-registration errors should not exceed 0.1-0.2 of the low-resolution pixel size. Applications of the constrained and unconstrained algorithms are illustrated with examples of unmixing and fusion of the multiresolution reflective and thermal bands of a real TM/LANDSAT image as well as of a simulated image of the future ASTER/EOS-AM1 sensor.
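The unconstrained unmixing reduces, for each processing window, to a least-squares problem: the co-registered high-resolution classification supplies the fraction of each class inside every low-resolution pixel, and the observed low-resolution radiances are solved for per-class signals. A toy sketch under those assumptions (array names and window handling are illustrative, not the paper's implementation):

```python
import numpy as np

def unmix_window(fractions, low_res_values):
    """Unconstrained MMT-style unmixing for one window.

    fractions[i, k]   : fraction of class k inside low-res pixel i,
                        derived from the co-registered high-res image.
    low_res_values[i] : observed radiance of low-res pixel i.

    Solves fractions @ s = low_res_values in the least-squares sense
    and returns the per-class signals s.
    """
    s, *_ = np.linalg.lstsq(fractions, low_res_values, rcond=None)
    return s
```

With two pure pixels and one 50/50 mixed pixel, the recovered class signals match the pure-pixel radiances exactly.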

372 citations


Journal ArticleDOI
15 Feb 1999
TL;DR: A 640 × 512 image sensor with Nyquist-rate pixel-level ADC, implemented in a 0.35 µm CMOS technology, shows how a pixel-level ADC enables a flexible, efficient implementation of multiple sampling.
Abstract: Analysis results demonstrate that multiple sampling can achieve consistently higher signal-to-noise ratio at equal or higher dynamic range than other image sensor dynamic range enhancement schemes such as well capacity adjusting. Implementing multiple sampling, however, requires much higher readout speeds than can be achieved using a typical CMOS active pixel sensor (APS). This paper demonstrates, using a 640 × 512 CMOS image sensor with an 8-b bit-serial Nyquist-rate analog-to-digital converter (ADC) per 4 pixels, that pixel-level ADC enables a highly flexible and efficient implementation of multiple sampling to enhance dynamic range. Since pixel values are available to the ADCs at all times, the number and timing of the samples as well as the number of bits obtained from each sample can be freely selected and read out at fast SRAM speeds. By sampling at exponentially increasing exposure times, pixel values with binary floating-point resolution can be obtained. The 640 × 512 sensor is implemented in 0.35-µm CMOS technology and achieves a 10.5 × 10.5 µm pixel size at 29% fill factor. Characterization techniques and measured quantum efficiency, sensitivity, ADC transfer curve, and fixed pattern noise are presented. A scene with measured dynamic range exceeding 10000:1 is sampled nine times to obtain an image with dynamic range of 65536:1. Limits on achievable dynamic range using multiple sampling are presented.
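Sampling at exponentially increasing exposure times yields a binary floating-point style representation: the last unsaturated sample acts as a mantissa and its exposure index as an exponent. A hypothetical sketch of how such samples might be combined for one pixel (the scaling scheme is an illustration of the idea, not the paper's exact readout logic):

```python
def hdr_value(samples, full_well=255):
    """Combine multiple samples of one pixel taken at exposure times
    T, 2T, 4T, ... (sample k taken at exposure 2**k * T).

    The longest unsaturated sample has the best SNR; rescaling it by
    its exposure factor maps every pixel back to the unit-exposure
    domain, extending dynamic range at the dark end.
    """
    best = samples[0]  # shortest exposure is the fallback (may clip)
    for k, v in enumerate(samples):
        if v < full_well:            # not saturated at this exposure
            best = v / (2 ** k)      # rescale to exposure time T
    return best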

345 citations


Patent
10 Dec 1999
TL;DR: A self-contained device for capturing video imagery in response to a triggering event may include a mirror (16) and be mounted to a vehicle windshield (14) in place of a conventional rearview mirror as mentioned in this paper.
Abstract: A self-contained device (10) for capturing video imagery in response to a triggering event may include a mirror (16) and be mounted to a vehicle windshield (14) in place of a conventional rear-view mirror. The device includes a housing (18) in which the electronics and related elements of the invention are contained. These elements include one or more data sensors (40, 42), at least one of which is an image sensor. Also included are a data sensor circuit and a capture circuit. The data sensor circuit responds to the triggering event, and may include data sensors coupled to vehicle systems such as a speedometer, tachometer, brake, turn signals or the like, or other data sensors such as an accelerometer (40) or a vehicle position sensor. The triggering event may be, for example, a sudden change in acceleration indicative of an impending collision, or it may be a change in the signal provided by any such data sensor, including the image sensor. The capture circuit is coupled to the image sensor and captures a signal representing the video imagery by recording it in a digital memory, by transmitting it to a remote location, or by other suitable means. The capture circuit terminates capture of the signal in response to the data sensor circuit sensing a triggering event. The captured data thus describe circumstances leading up to the time of the triggering event. The data can be analyzed to help police, insurance or other investigative personnel understand those circumstances.

263 citations


Journal ArticleDOI
TL;DR: The magnitude of image lag is such that significant artifacts in tomographic reconstructions may result if strategies are not adopted either to reduce or correct the lag between successive projections (e.g., rapid scanning between projections or iterative correction algorithms, respectively).
Abstract: Spatial and temporal imaging characteristics of an amorphous silicon flat-panel imager (FPI) were investigated in terms relevant to the application of such devices in cone-beam computed tomography (CBCT) and other x-ray imaging modalities, including general radiography, fluoroscopy, mammography, radiotherapy portal imaging, and nondestructive testing. Specifically, issues of image lag (including the magnitude, spatial uniformity, temporal-frequency characteristics, and dependence upon exposure and frame time) and long-term image persistence (“ghosts”) were investigated. As part of the basic characterization of the FPI, pixel dark signal and noise (magnitude, temporal stability, and spatial uniformity) as well as radiation response (signal size, linearity, gain, and reciprocity) were also measured. Image lag was analyzed as a function of frame time and incident exposure. First-frame lag (i.e., the relative residual signal in the first frame following readout of an exposure) was ∼2–10%, depending upon incident exposure, and was spatially nonuniform to a slight degree across the FPI; second-, third-, and fourth-frame lag were ∼0.7%, 0.4%, and 0.3%, respectively (at 25% sensor saturation). Image lag was also analyzed in terms of the temporal-frequency-dependent transfer function derived from the radiation response, allowing a quantitative description of system components contributing to lag. Finally, the contrast of objects as a function of time following an exposure was measured in order to examine long-term image persistence (“ghosts”). Ghosts were found to persist up to 30 min or longer, depending upon the exposure and frame time. Two means of reducing the apparent contrast of ghost images were tested: (i) rapid scanning of the FPI at maximum frame rate, and (ii) flood-field exposure of the FPI; neither was entirely satisfactory. These results pose important considerations for application of FPIs in CBCT as well as other x-ray imaging modalities.
For example, in CBCT, the magnitude of image lag is such that significant artifacts in tomographic reconstructions may result if strategies are not adopted either to reduce or correct the lag between successive projections (e.g., rapid scanning between projections or iterative correction algorithms, respectively). Similarly, long-term image persistence may necessitate frequent recalibration of offset corrections.
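The frame-to-frame carryover can, in principle, be undone iteratively: each new readout has a small fraction of the previous (already corrected) frames subtracted from it. A toy Python sketch of that idea, using lag coefficients of roughly the magnitudes reported above (the recursive form and the exact coefficients are assumptions, not the authors' algorithm):

```python
def correct_lag(frames, lag=(0.06, 0.007, 0.004, 0.003)):
    """Iteratively remove residual signal carried over from earlier frames.

    lag[k] is the fraction of a frame's (corrected) signal that
    reappears k+1 readouts later; the defaults echo the order of
    magnitude reported for the a-Si FPI (first-frame lag ~2-10%,
    then ~0.7%, 0.4%, 0.3%), but are illustrative values only.
    """
    corrected = []
    for n, raw in enumerate(frames):
        residual = sum(lag[k] * corrected[n - 1 - k]
                       for k in range(len(lag)) if n - 1 - k >= 0)
        corrected.append(raw - residual)
    return corrected
```

With a single 6% lag coefficient, a frame of 100 followed by a raw reading of 6 corrects to 0: the entire second reading was carryover.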

262 citations


Patent
15 Jan 1999
TL;DR: In this paper, an improved digital camera that produces high-quality digital images without using expensive image sensors and optics is disclosed; it uses multiple image sensors with multiple lenses, one of which is responsive to all intensity information in the visible spectrum, and the resulting gray-intensity image is used to compensate for information lost in the images from the other, color-selective image sensors.
Abstract: An improved digital camera that produces digital images of high quality without using expensive image sensors and optics is disclosed. The disclosed digital cameras use multiple image sensors with multiple lenses. One of the multiple image sensors is made to be responsive to all intensity information in the visible color spectrum, and a (gray intensity) image resulting from this sensor is used to compensate for information lost in the images from the other image sensors, which are responsive to certain colors. A final color image is obtained by digital image processing circuitry that performs a pixel registration process with reference to the gray intensity image, so that a true color image with true resolution is obtained.

248 citations


Patent
07 May 1999
TL;DR: In this article, an endoscope with a complementary metal oxide semiconductor (CMOS) image sensor and an objective lens, both carried adjacent the distal end of an elongate penetrating member, is described.
Abstract: A penetrating endoscope (10) provides visualization of organ or tissue structures or foreign objects in a body. The penetrating endoscope includes an elongate penetrating member (12), a complementary metal oxide semiconductor (CMOS) image sensor (42), and an objective lens (36). The CMOS image sensor is substantially planar, and includes a plurality of pixels with a pixel signal processing circuit for generating a color image ready signal. The CMOS image sensor converts image light energy into electrical color image ready signal energy for transmission out of the body. The color image ready signal is viewed on a color image display (128). The CMOS image sensor is carried on the elongate penetrating member adjacent a distal end of the elongate penetrating member. The objective lens is also carried on the distal end of the elongated penetrating member on an optical axis, and focuses an image corresponding to an endoscope field of view at an image plane intersecting the optical axis.

231 citations


Patent
02 Feb 1999
TL;DR: In this paper, the image sensors are comprised of image sensing elements (pixels) each of which comprises a thin layer (or multiple layers) of organic semiconductor(s) sandwiched between conductive electrodes.
Abstract: Image sensors with monochromatic or multi-color response made from organic semiconductors are disclosed. The image sensors are comprised of image sensing elements (pixels), each of which comprises a thin layer (or multiple layers) of organic semiconductor(s) sandwiched between conductive electrodes. These image sensors can be integrated or hybridized with electronic or optical devices on the same substrate or on different substrates. The electrical output signals from the image sensors resulting from the input image are probed by a circuit connected to the electrodes. The spectral response of the image sensing elements can be modified and adjusted to desired spectral profiles through material selection, through device thickness adjustment and/or through optical filtering. Several approaches for achieving red, green, and blue full-color detection are disclosed. Similar approaches can be used for multiple-band detection (wavelength multiplexing) in desired response profiles and in other selected spectral ranges.

229 citations


Patent
09 Aug 1999
TL;DR: In this article, an active-pixel image sensor array is directed toward the cornea of the eye through a matrix of microlenses and is integrated with a comparator array that is interfaced to bilateral switches.
Abstract: A retinal scanning display, an active-pixel image sensor array, and an image processor track the movements of the human eye. The scanning nature of the display acts as a sequential source of eye illumination. The active-pixel image sensor array is directed toward the cornea of the eye through a matrix of microlenses. The sensor is integrated with a comparator array which is interfaced to bilateral switches. An element address encoder and latch determines the sensor element which reaches maximum intensity during the raster-scan period of the display driver. Over a display field refresh cycle, the invention maps the corneal surface to a data table by pairing sensor activations to the specular reflections from the cornea of the sequenced source lights.

Patent
08 Dec 1999
TL;DR: In this paper, a substrate includes a plurality of individual substrates integrally connected together in an array format, and an adhesive layer attaches the molded window array to the substrate.
Abstract: Image sensor packages are fabricated simultaneously to minimize the cost associated with each individual image sensor package. To fabricate the image sensor packages, windows are molded in molding compound to form a molded window array. A substrate includes a plurality of individual substrates integrally connected together in an array format. Image sensors are attached and electrically connected to corresponding individual substrates. An adhesive layer attaches the molded window array to the substrate. The substrate and attached molded window array are singulated into a plurality of individual image sensor packages.

Patent
04 Jun 1999
TL;DR: In this paper, an imaging system for use in a vehicle headlamp control system includes an opening, an image sensor, a red lens blocking red complement light between the opening and the image sensor.
Abstract: An imaging system for use in a vehicle headlamp control system includes an opening, an image sensor, a red lens blocking red complement light between the opening and the image sensor, and a red complement lens blocking red light between the opening and the image sensor. Each lens focuses light onto a different subwindow of the image sensor. The imaging system allows processing and control logic to detect the presence of headlamps on oncoming vehicles and tail lights on vehicles approached from the rear for the purpose of controlling headlamps. A light sampling lens may be used to redirect light rays from an arc spanning above the vehicle to in front of the vehicle into substantially horizontal rays. The light sampling lens is imaged by the image sensor to produce an indication of light intensity at various elevations. The processing and control logic uses the light intensity to determine whether headlamps should be turned on or off. A shutter may be used to protect elements of the imaging system from excessive light exposure.
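The two subwindows give a simple color discriminant: a white headlamp puts comparable energy through the red lens and the red-complement lens, while a red tail light appears almost exclusively in the red subwindow. A hypothetical sketch of such a classifier (the function, its name, and the ratio threshold are illustrative assumptions, not taken from the patent):

```python
def classify_source(red_sub, complement_sub, ratio=4.0):
    """Classify a detected light from its two subwindow intensities.

    red_sub        : intensity seen through the red lens (red light only)
    complement_sub : intensity seen through the red-complement lens
                     (red light blocked)

    If the red energy dominates the red-complement energy by more than
    `ratio`, call it a tail light; a roughly balanced (white) source is
    treated as a headlamp.
    """
    if red_sub == 0 and complement_sub == 0:
        return "none"
    if complement_sub == 0 or red_sub / complement_sub > ratio:
        return "tail_light"
    return "headlamp"
```

A source measuring 100 in the red subwindow but only 10 in the complement reads as a tail light; 100 versus 90 reads as a white headlamp.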

Journal ArticleDOI
TL;DR: This paper examines the possibility of a low-cost, high-resolution fingerprint sensor chip, composed of 64 × 256 sensing cells, which enables a high resolution of 600 dpi, even using a conventional 0.6 µm CMOS process.
Abstract: This paper examines the possibility of a low-cost, high-resolution fingerprint sensor chip. The test chip is composed of 64 × 256 sensing cells (chip size: 2.7 × 10.8 mm²). A new charge-sharing detection circuit is proposed, which eliminates the influence of internal parasitic capacitances. Thus, the reduced sensing-capacitor size enables a high resolution of 600 dpi, even using a conventional 0.6 µm CMOS process. The partial fingerprint images captured are synthesized into a full fingerprint image with an image-synthesis algorithm. The problems and possibilities of this image-synthesis technique are also analyzed and discussed.
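The image-synthesis step must estimate how far the finger moved between successive partial captures. One simple way to do this, offered here only as a toy stand-in for the paper's algorithm, is to pick the row shift that minimizes the squared difference in the overlap between consecutive strips:

```python
import numpy as np

def best_overlap(strip_a, strip_b, max_shift=8):
    """Find the vertical shift that best aligns two partial fingerprint
    strips: for each candidate shift s, compare the last s rows of
    strip_a against the first s rows of strip_b and keep the shift with
    the smallest mean squared difference. A toy sketch, not the paper's
    image-synthesis algorithm.
    """
    best, best_err = 0, float("inf")
    for s in range(1, max_shift + 1):
        err = float(np.mean((strip_a[-s:] - strip_b[:s]) ** 2))
        if err < best_err:
            best, best_err = s, err
    return best
```

Two strips cut from the same synthetic image with a 4-row overlap are realigned at shift 4, where the overlap error is exactly zero.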

Proceedings ArticleDOI
TL;DR: It is argued that CMOS technology scaling will make pixel-level processing increasingly popular, while interpixel analog processing is unlikely to become mainstream, even for computational sensors, because of the poor scaling of analog circuits.
Abstract: Pixel level processing promises many significant advantages including high SNR, low power, and the ability to adapt image capture and processing to different environments by processing signals during integration. However, the severe limitation on pixel size has precluded its mainstream use. In this paper we argue that CMOS technology scaling will make pixel level processing increasingly popular. Since pixel size is limited primarily by optical and light collection considerations, as CMOS technology scales, an increasing number of transistors can be integrated at the pixel. We first demonstrate that our argument is supported by the evolution of CMOS image sensors from PPS to APS. We then briefly survey existing work on analog pixel level processing and pixel level ADC. We classify analog processing into intrapixel and interpixel. Intrapixel processing is mainly used to improve sensor performance, while interpixel processing is used to perform early vision processing. We briefly describe the operation and architecture of our recently developed pixel level MCBS ADC. Finally we discuss future directions in pixel level processing. We argue that interpixel analog processing is not likely to become mainstream, even for computational sensors, due to the poor scaling of analog circuits; pixel level ADC, by contrast, is attractive since it minimizes analog processing and requires only simple and imprecise circuits to implement. We then discuss the inclusion of digital memory and interpixel digital processing in future technologies to implement programmable digital pixel sensors.

Proceedings ArticleDOI
15 Feb 1999
TL;DR: The key to the realization of high speed and flexible image processing beyond the video signal rate is to remove the bottleneck of data transfer and implement general purpose functionality.
Abstract: Conventional image processing systems use a video signal as a transmission signal between an image sensor and image processor. The video rate, however, is not fast enough for some applications such as visual feedback for robot control, automobiles, gesture recognition for human interfaces, high speed visual inspection, microscope image processing, and so on. For such applications, the video signal, which is a time-multiplexed signal of pixel data using scanning circuits, is a bottleneck of conventional image processing systems. The key to the realization of high speed and flexible image processing beyond the video signal rate is to remove the bottleneck of data transfer and implement general purpose functionality. In other words, a fully parallel architecture with generality of processing should be adopted instead of scanning circuits. This CMOS vision chip has massively parallel processing elements integrated with photodetectors. These chips have a SIMD massively parallel processing architecture with each processing element (PE) connected to a photodetector (PD) without scanning circuits.

Proceedings ArticleDOI
09 Sep 1999
TL;DR: In this article, a complete range camera system, working with the time-of-flight principle, is introduced, which uses modulated LEDs as active illumination source and a new lock-in CCD sensor as demodulator and detector.
Abstract: A complete range camera system, working with the time-of-flight principle, is introduced. This ranging system uses modulated LEDs as active illumination source and a new lock-in CCD sensor as demodulator and detector. It requires no mechanically scanning parts because every pixel of the sensor contains a lock-in amplifier, enabling both intensity and range measurement for all pixels in parallel. Two such lock-in imagers are realized in 2.0 µm CMOS/CCD technology: (1) a line sensor with 108 pixels and an optical fill factor of 100%, and (2) a 64 × 25 pixel image sensor with 20% fill factor. The basics of time-of-flight ranging are introduced with a detailed description of the elements necessary. Shot noise limitation to ranging resolution is deduced and confirmed by simulation. An optical power budget is offered giving the relation between the number of photons in a pixel depending on the light source, the observed object, and several camera parameters. With the described lab setup, non-cooperative targets can be measured over a distance of several meters with a resolution of some centimeters.
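When a lock-in pixel samples the received modulated light at four points spaced 90° apart within the modulation period, the phase (and hence the range) follows from the standard four-tap demodulation formula. A minimal sketch (the 20 MHz modulation frequency is an assumed example, not necessarily the paper's setup):

```python
import math

def tof_range(a0, a1, a2, a3, f_mod=20e6):
    """Distance from four lock-in samples taken 90 degrees apart.

    Standard four-tap demodulation: phase = atan2(a3 - a1, a0 - a2),
    and since light travels out and back,
    distance = c * phase / (4 * pi * f_mod).
    """
    c = 299792458.0  # speed of light, m/s
    phase = math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)
    return c * phase / (4 * math.pi * f_mod)
```

For samples giving a quarter-period phase shift (phase = π/2), the recovered distance is c / (8 · f_mod), about 1.87 m at 20 MHz, with the unambiguous range limited to half the modulation wavelength.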

Proceedings ArticleDOI
05 Dec 1999
TL;DR: A 3D image sensor test chip was fabricated using this wafer-bonding-based 3D integration technology, and its basic electrical characteristics were evaluated.
Abstract: A new three-dimensional (3D) integration technology based on a wafer bonding technique has been proposed for an intelligent image sensor chip with a 3D stacked structure. We have developed key technologies for such 3D integration. A 3D image sensor test chip was fabricated using this 3D integration technology, and its basic electrical characteristics were evaluated.

Patent
16 Feb 1999
TL;DR: In this paper, an active pixel sensor (APS) system is described, which includes an array of sensors arranged into lines that form rows and columns, and each sensor in the array includes a photosensor that collects electric charge when exposed to light.
Abstract: A digital imaging system, such as an active pixel sensor (APS) system, includes an array of sensors arranged into lines that form rows and columns. Each sensor in the array includes a photosensor that collects electric charge when exposed to light. Each sensor also includes a select circuit that generates an output signal indicating the amount of charge collected by the photosensor during a given time, and a reset circuit that clears collected charge from the sensor at a selected time. The APS system also includes a line decoder circuit that produces select and reset signals and delivers the signals to the select circuits over control lines. Each control line connects to two adjacent lines, e.g., rows or columns, of the array, delivering a select signal to the image sensors in one of the two lines and delivering a reset signal to the image sensors in the other line.

Patent
17 Jun 1999
TL;DR: In this paper, the authors describe a surgical endoscopic instrument for visualizing anatomical tissue and including a tubular barrel carrying an endoscope with a CMOS image sensor and a physical parameter (e.g., temperature) sensor carried on the shaft.
Abstract: A surgical endoscopic instrument for visualizing anatomical tissue and including a tubular barrel carrying an endoscope with a CMOS image sensor and a physical parameter (e.g., temperature) sensor carried on the shaft. CMOS image sensors are fabricated on a single substrate or chip, with all of the required image signal processing circuitry, using the economical CMOS process. The CMOS image sensor is preferably mounted onto the distal end of the endoscope shaft and may include a light source for illuminating the field of visualization. In an alternative embodiment, the endoscope has a fixed objective lens carried on the distal end of a tubular body and a lens train or fiber optic bundle transmits the endoscopic image proximally to a CMOS sensor located in the hand piece. In the present invention, a single semiconductor substrate or chip has a CMOS imaging sensor with, preferably, a microprocessor and the associated signal processing circuitry for generating image signals for transmission to a display and to a data logging computer, outside the body. The surgical instrument carrying the CMOS image sensor preferably includes a distally mounted display, preferably a Liquid Crystal Display (LCD), mounted on a flexible stalk for convenient repositioning during a surgical procedure.

Patent
18 Oct 1999
TL;DR: A symbol reader employs an optical element having first and second optical axes positioned to image the same portion of a color coded symbol onto two different portions of an image sensor, as mentioned in this paper. The reader includes one or more filters to remove different color portions of the light reflected from the symbol, creating color separations at the image sensor.
Abstract: A symbol reader employs an optical element having first and second optical axes positioned to image a same portion of a color coded symbol onto two different portions of an image sensor. The reader includes one or more filters to remove different color portions of the light reflected from the symbol to create color separations at the image sensor. Thus, the image sensor detects different intensities of light, corresponding to different color states. A comparator, such as a microprocessor, programmed general-purpose computer, or digital logic circuit, can determine the position and color of the various symbol elements based on image data produced by the image sensor, and decode the color coded symbol.

Patent
07 May 1999
TL;DR: In this paper, an image producing device including a digital camera and an area image sensor is disposed at an image plane for receiving the image of a subject and producing a digitized image thereof.
Abstract: An image producing device (10) including a digital camera with an area image sensor (20) disposed at an image plane for receiving the image of a subject (21) and producing a digitized image thereof. A CPU (32 a) is adapted to direct the output to a variety of storage media. A printer (30) is connected to the CPU (32 a) for producing a hard copy (39) of the digital image as shown on an image display (38 b). The integral digital camera is adjustably directable from 0° to 360° in a horizontal plane and is selectively directed at a sheet display (54) or at a subject (21) for a portrait.

Journal ArticleDOI
TL;DR: In this article, a chip architecture that integrates a fingerprint sensor and an identifier in a single chip is proposed, which realizes a wide-area sensor without a large increase of chip size and ensures high sensor sensitivity while maintaining a high image density.
Abstract: A chip architecture that integrates a fingerprint sensor and an identifier in a single chip is proposed. The fingerprint identifier is formed by an array of pixels, and each pixel contains a sensing element and a processing element. The sensing element senses capacitances formed by a finger surface to capture a fingerprint image. An identification is performed by the pixel-parallel processing of the pixels. The sensing element is built above the processing element in each pixel. The chip architecture realizes a wide-area sensor without a large increase of chip size and ensures high sensor sensitivity while maintaining a high image density. The sensing element is covered with a hard film to prevent physical and chemical degradation and surrounded by a ground wall to shield it. The wall is also exposed on the chip surface to protect against damage by electrostatic discharges from the finger contacting the chip. A 15×15 mm² single-chip fingerprint sensor/identifier LSI was fabricated using 0.5-μm standard CMOS with the sensor process. The sensor area is 10.1×13.5 mm². The sensing and identification time is 102 ms with power consumption of 8.8 mW at 3.3 V. Five hundred tests confirmed a stranger-rejection rate of more than 99% and a user-rejection rate of less than 1%.
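The capture-then-match pipeline above can be illustrated in miniature. This is a hedged sketch only: the threshold-based binarization and the agreement score are illustrative stand-ins, not the chip's actual sensing or identification algorithm.

```python
# Illustrative sketch (NOT the chip's actual algorithm): each pixel senses a
# capacitance, which is binarized into ridge/valley; identification is then a
# pixel-parallel comparison against an enrolled template.

def capture(cap_map, threshold=0.5):
    """Binarize a capacitance map: high capacitance (ridge contact) -> 1."""
    return [[1 if c > threshold else 0 for c in row] for row in cap_map]

def match_score(image, template):
    """Fraction of pixels that agree -- stands in for the on-chip
    pixel-parallel identification step."""
    total = agree = 0
    for row_i, row_t in zip(image, template):
        for a, b in zip(row_i, row_t):
            total += 1
            agree += (a == b)
    return agree / total

cap = [[0.8, 0.2], [0.3, 0.9]]
img = capture(cap)
print(img)                    # [[1, 0], [0, 1]]
print(match_score(img, img))  # 1.0
```

On the real chip this comparison happens simultaneously in every pixel's processing element, which is what keeps the identification time at 102 ms over the full sensor area.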

Patent
29 Jun 1999
TL;DR: In this article, a 3D camera and a method for recording surface structures on an object of interest by triangulation, in particular for dental purposes, are presented. But the camera provides for producing a group of light beams in order to illuminate the target via a projection optical path, an image sensor for receiving light back-scattered by the target, and a pattern projected onto the target.
Abstract: A 3-D camera and a method for recording surface structures on an object of interest by triangulation, in particular for dental purposes. The camera includes means for producing a group of light beams to illuminate the object of interest via a projection optical path, an image sensor for receiving light back-scattered by the object of interest via an observation optical path, and means in the projection optical path for producing a pattern projected onto the object of interest. To avoid ambiguities in the event of large height differences, the camera provides means in the projection optical path and/or the observation optical path for altering the triangulation angle, which is defined as the angle between the centroid beam of the projection optical path and the centroid beam of the observation optical path. The proposed method involves taking at least two 3-D measurements of the same object of interest with different triangulation angles.
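The height ambiguity that motivates the two-angle measurement can be made concrete with a small geometric sketch. The symbols and the simplified shift formula below are illustrative assumptions, not taken from the patent: with triangulation angle theta, a surface height h shifts the projected pattern laterally by roughly h·tan(theta), so a periodic pattern of period P leaves height ambiguous in steps of P/tan(theta).

```python
import math

# Hedged sketch of the triangulation geometry (symbols are illustrative).
# A periodic projected pattern repeats, so two heights that differ by one
# full period of lateral shift look identical at a single triangulation
# angle; measuring at a second angle changes the ambiguity step and
# resolves large height differences.

def pattern_shift(height, theta_deg):
    """Lateral shift of the projected pattern for a surface height."""
    return height * math.tan(math.radians(theta_deg))

def height_ambiguity(period, theta_deg):
    """Height step that produces exactly one pattern period of shift."""
    return period / math.tan(math.radians(theta_deg))

print(round(pattern_shift(2.0, 45.0), 3))     # 2.0 (units of height)
print(round(height_ambiguity(0.5, 45.0), 3))  # heights repeat every 0.5
```

Since height_ambiguity depends on theta, two measurements at different triangulation angles repeat at different height steps, and only the true height is consistent with both.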

Proceedings ArticleDOI
TL;DR: A simple model is introduced to describe the sensor output response as a function of the photogenerate signal, dark signal, and noise for sensors operation in integration mode with and without dynamic range enhancement schemes.
Abstract: Dynamic range is a critical figure of merit for image sensors. Often a sensor with higher dynamic range is regarded as higher quality than one with lower dynamic range. For CCD and CMOS sensors operating in the integration mode the sensor SNR monotonically increases with the signal. Therefore, a sensor with higher dynamic range generally produces higher quality images than one with lower dynamic range. This, however, is not necessarily the case when dynamic range enhancement schemes are used. For example, using the well capacity adjusting scheme, dynamic range is enhanced but at the expense of substantial degradation in SNR. On the other hand, using multiple sampling, dynamic range can be enhanced without degrading SNR. Therefore, even if both schemes achieve the same dynamic range, the latter can produce higher image quality than the former. The paper provides a quantitative framework for comparing SNR for image sensors with enhanced dynamic range. We introduce a simple model to describe the sensor output response as a function of the photogenerated signal, dark signal, and noise for sensors operating in integration mode with and without dynamic range enhancement schemes. We use the model to quantify and compare dynamic range and SNR for three sensor operation modes: integration with shuttering, the well capacity adjusting scheme, and multiple sampling.© (1999) COPYRIGHT SPIE--The International Society for Optical Engineering.
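A minimal version of such a sensor model can be written down directly. All parameter values below are illustrative assumptions, not from the paper: output charge saturates at the well capacity, shot noise scales with collected charge, and read noise sets the floor.

```python
import math

# Minimal sketch of an integration-mode sensor model (illustrative
# parameters, not the paper's): Q = min((i_ph + i_dc) * t, Q_well),
# with shot-noise variance equal to the collected charge (electrons)
# plus additive read noise.

Q_WELL = 60000.0   # well capacity, electrons (assumed)
READ_NOISE = 30.0  # read noise, electrons rms (assumed)
I_DARK = 10.0      # dark signal, electrons/s (assumed)
T_INT = 0.03       # integration time, s (assumed)

def snr_db(i_ph):
    """SNR in dB for a photogenerated rate i_ph (electrons/s)."""
    signal = min(i_ph * T_INT, Q_WELL - I_DARK * T_INT)
    shot_var = signal + I_DARK * T_INT   # shot-noise variance, electrons^2
    return 20 * math.log10(signal / math.sqrt(shot_var + READ_NOISE**2))

def dynamic_range_db():
    """DR = largest non-saturating signal over the noise-floor signal."""
    i_max = (Q_WELL - I_DARK * T_INT) / T_INT
    i_min = math.sqrt(READ_NOISE**2 + I_DARK * T_INT) / T_INT
    return 20 * math.log10(i_max / i_min)
```

With these numbers the model gives roughly 66 dB of dynamic range, and SNR rises monotonically with signal up to saturation, which is the baseline against which the paper compares well capacity adjusting (higher DR, degraded SNR) and multiple sampling (higher DR, SNR preserved).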

Patent
02 Oct 1999
TL;DR: In this paper, an apparatus and method of processing images, hand-drawn or written on a suitable writing surface (21A, 21B, 21C), viewed by an Image Sensor (22A, 22B, 22C) such as a video camera, and captured by an image sensing circuit (i.e. frame grabber or similar) used for acquistion of image frames by a computer.
Abstract: This application discloses an apparatus and method of processing images, hand-drawn or written on a suitable Writing Surface (21A, 21B, 21C), viewed by an Image Sensor (22A, 22B, 22C) such as a video camera, and captured by an image sensing circuit (i.e. frame grabber or similar) used for acquisition of image frames by a computer. More particularly, this invention discriminates among changes detected in these Viewed Images in order to identify and disregard non-informational, transient and/or redundant content. Removal of such content, a writer's arm for example, from the captured image facilitates isolating meaningful changes, specifically intentional new Writing and Erasures appearing on the Writing Surface. Preserving only meaningful changes on the surface promotes optimized compressed transmission of a subset of the visual data, when used in conjunction with digital computer display systems.
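One simple way to realize this kind of discrimination is a persistence test: a changed pixel counts as new Writing only if the change survives several consecutive frames, while a moving occlusion such as the writer's arm does not. This is an illustrative sketch only, not the patent's actual method.

```python
# Illustrative persistence filter (NOT the patent's actual algorithm):
# a pixel change is kept only if it is present in every one of the last
# few frames, which discards transient occlusions like a writer's arm.

PERSIST_FRAMES = 3  # assumed window length

def persistent_changes(frames):
    """frames: list of equal-length binary rows (1 = pixel differs from
    the stored board state). Returns pixels changed in every frame."""
    result = frames[0][:]
    for frame in frames[1:PERSIST_FRAMES]:
        result = [a & b for a, b in zip(result, frame)]
    return result

arm_sweep = [[1, 1, 0, 0],   # arm covers the left side, then moves away
             [0, 1, 1, 0],
             [0, 0, 1, 1]]
new_stroke = [[0, 1, 0, 0]] * 3   # real writing stays in place

print(persistent_changes(arm_sweep))   # [0, 0, 0, 0] -- transient, dropped
print(persistent_changes(new_stroke))  # [0, 1, 0, 0] -- kept as Writing
```

Only the surviving pixels would then be encoded and transmitted, which is what makes the compressed-transmission claim in the abstract work.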

Patent
12 Nov 1999
TL;DR: In this paper, an apparatus and method for acquiring an image of a patterned object such as a fingerprint including a light refracting device, a focusing lens, and a light source is presented.
Abstract: An apparatus and method for acquiring an image of a patterned object such as a fingerprint including a light refracting device, a focusing lens, and a light source. The light refracting device can, for example, be a prism and includes an imaging surface, a light receiving surface and a viewing surface. Incident light from the light source is projected through the light receiving surface and reflected off a surface other than the imaging surface. This reflected light is then projected onto the imaging surface to create an image of the patterned object from substantially all scattered light through the viewing surface. The lens is placed adjacent to the viewing surface to focus the light on an image sensor.

Patent
22 Jun 1999
TL;DR: In this paper, an apparatus and a method for correction of a deviation of an imaging sensor of a digital camera in which an image of an object or a scene is formed on an image plane of the imaging sensor to output an image signal, is disclosed.
Abstract: An apparatus and a method for correction of a deviation of an imaging sensor of a digital camera in which an image of an object or a scene is formed on an image plane of the imaging sensor to output an image signal, are disclosed. A quantity of rotation of the digital camera causing a deviation of the imaging sensor from a reference position to occur, is detected. A change of a positional angle of the imaging sensor is calculated based on the detected rotation quantity. A target vector is calculated based on the calculated positional angle change, the target vector describing a magnitude and a direction of an inverse movement of the imaging sensor needed to reach the reference position and cancel the deviation. Movement of the imaging sensor is controlled based on the calculated target vector, so that the imaging sensor is moved back to the reference position thus correcting the deviation. The calculation of the target vector and the movement of the imaging sensor are executed within an image acquisition time for a single frame of the image signal.
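The target-vector computation can be sketched with small-angle camera geometry. The focal length, sign conventions, and the f·tan(theta) shift model below are illustrative assumptions, not from the patent: a small camera rotation shifts the image on the sensor by roughly f·tan(theta), and the target vector is the inverse of that shift.

```python
import math

# Hedged sketch of the deviation-correction geometry (assumed model):
# a camera rotation of theta shifts the image on the sensor plane by
# about f * tan(theta); the target vector is the equal-and-opposite
# sensor translation that cancels the deviation.

FOCAL_LENGTH_MM = 35.0  # assumed lens focal length

def target_vector(pitch_deg, yaw_deg):
    """Sensor translation (dx, dy) in mm that cancels the rotation."""
    dx = -FOCAL_LENGTH_MM * math.tan(math.radians(yaw_deg))
    dy = -FOCAL_LENGTH_MM * math.tan(math.radians(pitch_deg))
    return dx, dy

dx, dy = target_vector(0.1, -0.05)  # small shake angles in degrees
print(round(dx, 4), round(dy, 4))   # sub-millimetre corrective motion
```

In the patented scheme this whole computation and the resulting sensor motion must complete within the acquisition time of a single frame, so the correction is transparent to the image signal.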

Patent
29 Dec 1999
TL;DR: In this article, a handheld device for capturing and communicating digital images includes a digital camera having an electronic image sensor for sensing an image and producing a digital image indicative of the sensed image, and a wireless transceiver having a transmitter operable with the digital camera for transmitting the digital image over a wireless communications link through the establishment of a data call.
Abstract: A handheld, portable device for capturing and communicating digital images includes a digital camera having an electronic image sensor for sensing an image and producing a digital image indicative of the sensed image. The device further includes a wireless transceiver having a transmitter operable with the digital camera for transmitting the digital image over a wireless communications link through the establishment of a data call. A second device having a wireless transceiver may communicate digital images over the wireless communications link through the establishment of the data call with the wireless transceiver of the first device.

Patent
23 Dec 1999
TL;DR: In this article, an imaging assembly includes a miniature electronic image sensor including an imaging substrate having a plurality of pixels and a microlens array aligned with corresponding pixels on said imaging substrate and focusing optics for focusing an optical image of a target onto the imaging substrate including at least one adaptive lens element.
Abstract: An imaging assembly includes a miniature electronic image sensor including an imaging substrate having a plurality of pixels and a microlens array aligned with corresponding pixels on said imaging substrate, and focusing optics, including at least one adaptive lens element, for focusing an optical image of a target onto the imaging substrate. The focusing optics have a first exit pupil distance defining a first field of view and the miniature electronic image sensor has a second exit pupil distance, different from the first, defining a second field of view. The adaptive lens element directs light onto said imaging substrate through said microlens array while maintaining the first field of view.