
Showing papers on "Image sensor published in 2000"


Journal ArticleDOI
Zhengyou Zhang
TL;DR: A flexible technique to easily calibrate a camera that only requires the camera to observe a planar pattern shown at a few (at least two) different orientations is proposed and advances 3D computer vision one more step from laboratory environments to real world use.
Abstract: We propose a flexible technique to easily calibrate a camera. It only requires the camera to observe a planar pattern shown at a few (at least two) different orientations. Either the camera or the planar pattern can be freely moved. The motion need not be known. Radial lens distortion is modeled. The proposed procedure consists of a closed-form solution, followed by a nonlinear refinement based on the maximum likelihood criterion. Both computer simulation and real data have been used to test the proposed technique and very good results have been obtained. Compared with classical techniques which use expensive equipment such as two or three orthogonal planes, the proposed technique is easy to use and flexible. It advances 3D computer vision one more step from laboratory environments to real world use.

13,200 citations
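The closed-form step of this calibration technique starts from plane-to-image homographies observed at each orientation. A minimal Python/NumPy sketch, using noiseless synthetic correspondences and a made-up homography `H_true`, estimates one such homography by direct linear transformation (DLT); the paper's full pipeline additionally recovers the intrinsics from several homographies and refines everything by maximum likelihood, which is omitted here:

```python
import numpy as np

def estimate_homography(pts_plane, pts_image):
    """Closed-form DLT estimate of the 3x3 homography mapping planar
    model points (X, Y) to observed image points (u, v)."""
    A = []
    for (X, Y), (u, v) in zip(pts_plane, pts_image):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    # The smallest right singular vector of A gives h (up to scale).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Synthetic check: project a 4x4 grid of plane points through a known homography.
H_true = np.array([[800.0, 5.0, 320.0],
                   [2.0, 780.0, 240.0],
                   [0.001, 0.002, 1.0]])
plane = [(x, y) for x in range(4) for y in range(4)]
image = []
for X, Y in plane:
    p = H_true @ np.array([X, Y, 1.0])
    image.append((p[0] / p[2], p[1] / p[2]))

H_est = estimate_homography(plane, image)
```

With exact correspondences the DLT recovers `H_true` up to numerical precision; with noisy ones it provides the starting point for the nonlinear refinement.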


Patent
18 Sep 2000
TL;DR: In this paper, a three-dimensional imaging system includes a two-dimensional array (230) of pixel light sensing detectors and dedicated electronics and associated processing circuitry fabricated on a common IC using CMOS fabrication techniques.
Abstract: A three-dimensional imaging system includes a two-dimensional array (230) of pixel light sensing detectors and dedicated electronics and associated processing circuitry fabricated on a common IC (210) using CMOS fabrication techniques. In one embodiment, each detector (240) has an associated high speed counter (250) that accumulates clock pulses in number directly proportional to time of flight (TOF) for a system-emitted pulse to reflect from an object point and be detected by a pixel detector focused upon that point. The TOF data provides a direct digital measure of distance from the particular pixel to a point on the object reflecting the emitted light pulse. In a second embodiment, the counters and high speed clock circuits are eliminated, and instead each pixel detector (240) is provided with a charge accumulator (600) and an electronic shutter (SI) such that each pixel detector accumulates charge, the amount of which provides a direct measure of round-trip TOF.

285 citations
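The counter embodiment digitizes round-trip time directly, so distance follows from the clock period and the speed of light. A toy Python sketch of that arithmetic (the counter frequency and count below are invented numbers, not values from the patent):

```python
# Speed of light in m/s.
C = 299_792_458.0

def tof_distance_from_counts(counts, clock_hz):
    """First embodiment: a per-pixel counter accumulates clock pulses
    over the round trip, so distance = (counts / clock_hz) * c / 2."""
    return counts / clock_hz * C / 2.0

def tof_distance_from_time(round_trip_s):
    """One-way distance from a measured round-trip time of flight."""
    return round_trip_s * C / 2.0

# A hypothetical 1 GHz counter reading 20 pulses: ~20 ns round trip, ~3 m range.
d = tof_distance_from_counts(20, 1e9)
```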


Journal ArticleDOI
TL;DR: An on-chip calibration technique that cancels the effect of threshold voltage variations in pixels is introduced to remove the fixed pattern noise (FPN) of logarithmic CMOS image sensors.
Abstract: CMOS image sensors with logarithmic response are attractive devices for applications where a high dynamic range is required. Their strong point is the high dynamic range. Their weak point is the sensitivity to pixel parameter variations introduced during fabrication. This gives rise to a considerable fixed pattern noise (FPN) that deteriorates the image quality unless pixel calibration is used. In the present work a technique to remove the FPN by employing on-chip calibration is introduced, where the effect of threshold voltage variations in pixels is cancelled. An image sensor based on an active pixel structure with five transistors has been designed, fabricated, and tested. The sensor consists of 525 × 525 pixels measuring 7.5 μm × 10 μm, and is fabricated in a 0.5 μm CMOS process. The measured dynamic range is 120 dB while the FPN is 2.5% of the output signal range.

244 citations
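The offset-cancelling idea can be illustrated numerically: if the logarithmic pixel output is a per-pixel offset plus a term in ln(I), subtracting a calibration frame captured under uniform reference illumination removes the offsets. A toy NumPy model (the gain, offset spread, and reference level are invented, not the chip's parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (4, 4)

# Per-pixel threshold-voltage offsets: the dominant FPN source modeled here.
offsets = rng.normal(0.0, 0.02, shape)

def log_pixel(irradiance):
    """Toy logarithmic pixel: V = offset + 0.06 * ln(I)."""
    return offsets + 0.06 * np.log(irradiance)

# Calibration frame under uniform reference illumination I_ref = 1.
cal = log_pixel(1.0)            # equals the offset map, since ln(1) = 0

# Corrected output: subtracting the calibration frame cancels the offsets.
scene = np.full(shape, 50.0)
corrected = log_pixel(scene) - cal
```

After correction every pixel reports the same value for the uniform scene; before correction the offset spread appears directly as FPN.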


Journal ArticleDOI
TL;DR: In this article, a modular system for time-resolved two-dimensional luminescence lifetime imaging of planar optical chemical sensors is presented, which is based on a fast, gateable charge-coupled device (CCD) camera without image intensifier and a pulsable light-emitting diode (LED) array as a light source.
Abstract: We present a modular system for time-resolved two-dimensional luminescence lifetime imaging of planar optical chemical sensors. It is based on a fast, gateable charge-coupled device (CCD) camera without image intensifier and a pulsable light-emitting diode (LED) array as a light source. Software was developed for data acquisition with a maximum of parameter variability and for background suppression. This approach allows the operation of the system even under daylight. Optical sensors showing analyte-specific changes of their luminescence decay time were tested and used for sensing pO2, pCO2, pH, and temperature. The luminophores employed are either platinum(II)-porphyrins or ruthenium(II)-polypyridyl complexes, contained in polymer films, and can be efficiently excited by blue LEDs. The decay times of the sensor films vary from 70 μs for the Pt(II)-porphyrins to several hundred nanoseconds for the Ru(II) complexes. In a typical application, 7 mm-diameter spots of the respective optical sensor films were placed at the bottom of the wells of microtiter plates. Thus, every well represents a separate calibration chamber with an integrated sensor element. Both luminescence intensity-based and time-resolved images of the sensor spots were evaluated and compared. The combination of optical sensor technology with time-resolved imaging allows a determination of the distribution of chemical or physical parameters in heterogeneous systems and is therefore a powerful tool for screening and mapping applications. Index Headings: Optical sensor films; Time-resolved imaging system; Microsecond decay time sensors.

230 citations
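With gated detection, a single-exponential decay time can be estimated from two integration windows of equal width (the rapid lifetime determination scheme). A short Python sketch; the gate timings and the τ value below are illustrative choices, not the paper's settings:

```python
import math

def lifetime_two_gate(a1, a2, gate_delay):
    """Rapid lifetime determination: for an exponential decay sampled by two
    equal-width gates whose starts are separated by gate_delay,
    A1 / A2 = exp(gate_delay / tau), so tau = gate_delay / ln(A1 / A2)."""
    return gate_delay / math.log(a1 / a2)

# Synthetic Pt(II)-porphyrin-like decay with tau = 70 microseconds.
tau = 70e-6
t1, t2, width = 10e-6, 40e-6, 20e-6

def gate_integral(t0):
    # Integral of exp(-t / tau) over the gate [t0, t0 + width].
    return tau * (math.exp(-t0 / tau) - math.exp(-(t0 + width) / tau))

a1, a2 = gate_integral(t1), gate_integral(t2)
tau_est = lifetime_two_gate(a1, a2, t2 - t1)
```

The equal gate widths cancel out in the ratio, so only the delay between the gates enters the estimate.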


Patent
29 Dec 2000
TL;DR: In this paper, a handheld mobile telephone system with a detachable camera/battery module for capturing images and a mobile telephone for communicating with a receiving unit is described, which includes a lens for focusing light from a scene to produce an image; an image sensor for capturing one or more images.
Abstract: A handheld mobile telephone system is disclosed including a detachable camera/battery module for capturing images and a mobile telephone for communicating with a receiving unit. The detachable camera/battery module includes a lens for focusing light from a scene to produce an image; an image sensor for capturing one or more images; and a converter for producing digital image signals from the at least one captured image. The detachable camera/battery module further includes a battery for supplying power to the mobile telephone system; and a first connector for detachably supplying the digital image signals and the power to the mobile telephone. The mobile telephone includes a memory for storing the digital image signals; and a processor for processing the stored digital image signals. The mobile telephone further includes a display for displaying the processed digital image signals; a second connector for interfacing with the first connector on the camera/battery module to receive the digital image signals and the power; and a radio frequency transmitter for transmitting the processed digital image signals to the receiving unit. When the camera/battery module is connected to the mobile telephone, images are captured by the camera/battery module and are transmitted to the receiving unit using the mobile telephone.

194 citations


Patent
29 Mar 2000
TL;DR: In this article, an image sensor packaging technique based on a Ball Grid Array (BGA) IC packaging technique, further referred to as image sensor ball grid array (ISBGA), is presented.
Abstract: The present invention is related to an image sensor packaging technique based on a Ball Grid Array (BGA) IC packaging technique, further referred to as image sensor ball grid array (ISBGA). A transparent cover is attached to a semiconductor substrate. Depending on the method of attaching the cover to the substrate, a hermetic or non-hermetic sealing is obtained. The obtained structure can be connected through wire-bonding or flip chip connection.

193 citations


Patent
26 May 2000
TL;DR: In this article, a variable-transmittance mask is used to generate a spatially varying light attenuation pattern across the image sensor, which can be interpolated to account for image sensor pixels that are either under or over exposed to enhance the dynamic range.
Abstract: Apparatus and methods are provided for obtaining high dynamic range images using a low dynamic range image sensor. The scene is exposed to the image sensor in a spatially varying manner. A variable-transmittance mask, which is interposed between the scene and the image sensor, imposes a spatially varying attenuation on the scene light incident on the image sensor. The mask includes light transmitting cells whose transmittance is controlled by application of suitable control signals. The mask is configured to generate a spatially varying light attenuation pattern across the image sensor. The image frame sensed by the image sensor is normalized with respect to the spatially varying light attenuation pattern. The normalized image data can be interpolated to account for image sensor pixels that are either under or over exposed to enhance the dynamic range of the image sensor.

193 citations
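The normalization step can be sketched directly: divide the captured frame by the known transmittance pattern and treat saturated pixels as invalid. The mask values, full-well level, and the crude mean-fill interpolation below are placeholder assumptions, not the patent's actual methods:

```python
import numpy as np

# Hypothetical 4x4 transmittance mask (fraction of scene light passed).
mask = np.tile(np.array([[1.0, 0.25], [0.0625, 0.5]]), (2, 2))

scene = np.full((4, 4), 2000.0)      # true radiance (arbitrary units)
FULL_WELL = 1000.0                   # the sensor saturates at this level

captured = np.minimum(scene * mask, FULL_WELL)

# Normalize by the known attenuation pattern; saturated pixels are invalid.
valid = captured < FULL_WELL
normalized = np.where(valid, captured / mask, np.nan)

# Fill invalid pixels from the mean of valid ones (crude stand-in for the
# patent's interpolation of under- or over-exposed pixels).
filled = normalized.copy()
filled[~valid] = np.nanmean(normalized)
```

Pixels behind strong attenuation capture the bright scene unsaturated, and dividing by their transmittance recovers the full radiance, which is how the low dynamic range sensor covers a high dynamic range scene.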


Journal ArticleDOI
TL;DR: A gradient-based registration algorithm is utilized to estimate the shifts between the acquired frames and then a weighted nearest-neighbor approach is used for placing the frames onto a uniform grid to form a final high-resolution image.
Abstract: Forward looking infrared (FLIR) detector arrays generally produce spatially undersampled images because the FLIR arrays cannot be made dense enough to yield a sufficiently high spatial sampling frequency. Multi-frame techniques, such as microscanning, are an effective means of reducing aliasing and increasing resolution in images produced by staring imaging systems. These techniques involve interlacing a set of image frames that have been shifted with respect to each other during acquisition. The FLIR system is mounted on a moving platform, such as an aircraft, and the vibrations associated with the platform are used to generate the shifts. Since a fixed number of image frames is required, and the shifts are random, the acquired frames will not fall on a uniformly spaced grid. Furthermore, some of the acquired frames may have nearly identical shifts, making them unusable for high-resolution image reconstruction. In this paper, we utilize a gradient-based registration algorithm to estimate the shifts between the acquired frames and then use a weighted nearest-neighbor approach for placing the frames onto a uniform grid to form a final high-resolution image. Blurring by the detector and optics of the imaging system limits the increase in image resolution when microscanning is attempted at sub-pixel movements of less than half the detector width. We resolve this difficulty by the application of the Wiener filter, designed using the modulation transfer function (MTF) of the imaging system, to the high-resolution image. Simulation and experimental results are presented to verify the effectiveness of the proposed technique. The techniques proposed herein are significantly faster than alternate techniques, and are found to be especially suitable for real-time applications.

184 citations
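The placement step can be illustrated with a toy weighted nearest-neighbor accumulator: each registered low-resolution sample lands at its shifted high-resolution location and contributes to the nearest grid pixel with a distance-based weight. The 1/(1 + d) weighting and the half-pixel shifts below are invented for illustration; the paper's gradient-based registration and Wiener restoration steps are omitted:

```python
import numpy as np

def weighted_nn_hr(frames, shifts, scale):
    """Place registered low-res frames onto a high-res grid with
    weighted nearest-neighbor accumulation."""
    h, w = frames[0].shape
    H, W = h * scale, w * scale
    acc = np.zeros((H, W))
    wgt = np.zeros((H, W))
    for frame, (dy, dx) in zip(frames, shifts):
        for i in range(h):
            for j in range(w):
                y = (i + dy) * scale          # shifted HR location
                x = (j + dx) * scale
                yi = min(int(round(y)), H - 1)
                xi = min(int(round(x)), W - 1)
                d = abs(y - yi) + abs(x - xi)  # distance to nearest HR pixel
                weight = 1.0 / (1.0 + d)
                acc[yi, xi] += weight * frame[i, j]
                wgt[yi, xi] += weight
    out = np.where(wgt > 0, acc / np.maximum(wgt, 1e-12), 0.0)
    return out, wgt

# Two 2x2 frames with a half-pixel diagonal shift fill a 4x4 grid sparsely;
# uncovered HR pixels would need interpolation or more frames.
f0 = np.ones((2, 2))
f1 = np.full((2, 2), 3.0)
hr, coverage = weighted_nn_hr([f0, f1], [(0.0, 0.0), (0.5, 0.5)], scale=2)
```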


Patent
05 Jul 2000
TL;DR: In this paper, an automatic distancing, focusing and optical imaging apparatus for optical imaging of an object is described, having at least one lens, a distancing sensor adapted to receive light rays representative of the image that travel through the lens, and an imaging sensor adapted to receive light rays representative of the image.
Abstract: An automatic distancing, focusing and optical imaging apparatus for optical imaging of an object is disclosed, having at least one lens, a distancing sensor adapted to receive light rays representative of the image that travel through the lens, an imaging sensor adapted to receive light rays representative of the image that travel through the lens, and at least one processor coupled to the distancing sensor and the imaging sensor, the processor for controlling the movement of the imaging sensor to a position for optimal imaging and for processing the image received by the imaging sensor.

184 citations


Patent
22 Jun 2000
TL;DR: In this article, a vehicular rain sensor which senses precipitation at a vehicle window is presented, and a control is used to determine the presence of precipitation via spatial filtering of the image received by the camera device.
Abstract: A vehicular rain sensor which senses precipitation at a vehicle window. The rain sensor comprises an imaging array sensor and a control. The imaging array sensor is directed at the vehicle window from inside the vehicle and comprises a camera device capable of imaging precipitation at a surface of the window. The camera is operable to image the precipitation at least in response to ambient light present at the window. The control responds to an output of the imaging array sensor in order to indicate precipitation at the surface of the window. The control determines the presence of precipitation via spatial filtering of the image received by the camera device.

169 citations
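Spatial filtering for droplet detection might look like a high-pass (Laplacian) filter followed by a threshold on local edge energy: droplets produce localized edge responses against an otherwise smooth, defocused background. The kernel, threshold, and synthetic window images below are assumptions for illustration, not the patent's actual algorithm:

```python
import numpy as np

LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]], dtype=float)

def local_edge_energy(img):
    """High-pass filter the window image; droplets show up as localized
    bright edge responses against a smooth background."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = abs(np.sum(img[i:i + 3, j:j + 3] * LAPLACIAN))
    return out

def rain_detected(img, threshold=1.0, min_count=1):
    """Indicate precipitation when enough pixels exceed the edge threshold."""
    return int(np.sum(local_edge_energy(img) > threshold)) >= min_count

# A smooth scene through a dry window vs. one bright droplet spot.
dry = np.full((8, 8), 10.0)
wet = dry.copy()
wet[4, 4] = 20.0
```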


Journal ArticleDOI
TL;DR: In this article, a multiple wavelength surface plasmon resonance apparatus for imaging applications is presented, which can be used for biosensing, e.g., for monitoring of chemical and biological reactions in real time with label-free molecules.
Abstract: A new, multiple wavelength surface plasmon resonance apparatus for imaging applications is presented. It can be used for biosensing, e.g., for monitoring of chemical and biological reactions in real time with label-free molecules. A setup with a fixed incident angle in the Kretschmann configuration with gold as the supporting metal is described, both theoretically and experimentally. Simulations of the sensor response based on independently recorded optical (ellipsometric) data of gold show that the sensitivity for three-dimensional recognition layers (bulk) increases with increasing wavelength. For two-dimensional recognition layers (adlayer) maximum sensitivity is obtained within a limited wavelength range. In this situation, the rejection of bulk disturbances, e.g., emanating from temperature variations, decreases with increasing wavelength. For imaging surface plasmon resonance the spatial resolution decreases with increasing wavelength. Hence, there is always a compromise between spatial resolution, bulk disturbance rejection, and sensitivity. Most importantly, by simultaneously using multiple wavelengths, it is possible to maintain a high sensitivity and accuracy over a large dynamic range. Furthermore, our simulations show that the sensitivity is independent of the refractive index of the prism. (C) 2000 American Institute of Physics. [S0034-6748(00)02909-9].

Proceedings ArticleDOI
18 Mar 2000
TL;DR: A method for augmented reality with a stereo vision sensor and a video see-through head-mounted display (HMD) that can synchronize the display timing between the virtual and real worlds so that the alignment error is reduced.
Abstract: In an augmented reality system, it is required to obtain the position and orientation of the user's viewpoint in order to display the composed image while maintaining a correct registration between the real and virtual worlds. All the procedures must be done in real time. This paper proposes a method for augmented reality with a stereo vision sensor and a video see-through head-mounted display (HMD). It can synchronize the display timing between the virtual and real worlds so that the alignment error is reduced. The method calculates camera parameters from three markers in image sequences captured by a pair of stereo cameras mounted on the HMD. In addition, it estimates the real-world depth from a pair of stereo images in order to generate a composed image maintaining consistent occlusions between real and virtual objects. The depth estimation region is efficiently limited by calculating the position of the virtual object by using the camera parameters. Finally, we have developed a video see-through augmented reality system which mainly consists of a pair of stereo cameras mounted on the HMD and a standard graphics workstation. The feasibility of the system has been successfully demonstrated with experiments.
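The depth estimation from the stereo pair rests on the standard rectified-stereo relation Z = fB/d. A minimal sketch with an invented focal length, baseline, and disparity (the paper's marker-based camera parameter estimation is not modeled):

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Rectified pinhole stereo: depth Z = focal * baseline / disparity."""
    return focal_px * baseline_m / disparity_px

# A 10-pixel disparity with a 500 px focal length and a 6 cm HMD camera
# baseline corresponds to a point about 3 m away.
z = stereo_depth(10.0, 500.0, 0.06)
```

Limiting depth computation to the region where the virtual object will appear, as the paper does, keeps this per-pixel calculation cheap enough for real time.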

Patent
06 Jun 2000
TL;DR: In this paper, a surveillance system utilizing a plurality of communication devices that are operative with portable telephones to permit an easily reconfigurable remote surveillance area is presented, where each of the devices can have audio and imaging transmitting functions and at least one has an audio and image receiving function.
Abstract: A surveillance system utilizing a plurality of communication devices that are operative with portable telephones to permit an easily reconfigurable remote surveillance area. The portable telephones can have audio and imaging transmitting functions and at least one has an audio and imaging receiving function. An image sensor can sense images within a predetermined area and is operatively connected to a portable telephone for providing an output signal upon detection of a predetermined amount of change in the image area. An auto-dialing section is connected to a first telephone for calling a second remote telephone automatically based on the output signal. Infrared detecting sensors can also image within the surveillance area and be utilized to activate the auto-dialing function. The portable telephones can further include an image sensor for providing an image of the surveillance area.

Patent
14 Apr 2000
TL;DR: In this article, an imaging device can be incorporated in the housing of a standard medical camera adapted for use with traditional rod lens endoscopes, which allows its use with many surgical instruments such as Jackson grasping forceps; balloon catheters; over-tube tissue separating, dissecting or fulguration devices; modified endotracheal intubation devices or trocars.
Abstract: The imaging device can be incorporated in the housing of a standard medical camera adapted for use with traditional rod lens endoscopes (42). The image sensor may be placed alone on a first circuit board (40), or timing and control circuits may be included on the first circuit board (40) containing the image sensor. One or more video processing boards (50, 60) can be stacked with respect to the first board (40), or the video processing boards may be placed in an external control box (30). The small size of the tubular structure or microendoscope (14) which houses the imaging device allows its use with many surgical instruments such as Jackson grasping forceps; stent placement catheters; balloon catheters; over-tube tissue separating, dissecting or fulguration devices; modified endotracheal intubation devices or trocars.

Journal ArticleDOI
TL;DR: From the extensive set of experiments carried out to evaluate the measurement performance, good linearity has been observed, and an overall mean value of the measurement error equal to 40 μm, with a variability of about ±35 μm, has been estimated.
Abstract: In this paper, the procedure developed to calibrate a whole-field optical profilometer and the evaluation of the measurement performance of the system are presented. The sensor is based on the projection of structured light and on active triangulation. The dependence of the measurements on the geometric parameters of the system is shown, as well as the criterion to calibrate the system. From the extensive set of experiments carried out to evaluate the measurement performance, good linearity has been observed, and an overall mean value of the measurement error equal to 40 μm, with a variability of about ±35 μm, has been estimated.

Patent
26 Dec 2000
TL;DR: In this paper, a programmable arithmetic circuit is proposed that forms multiple circuit modules for different arithmetic operations sharing certain common electronic elements to reduce the number of elements; the circuit can be integrated into an imaging sensor array, such as a CMOS active pixel sensor array, to perform arithmetic operations and analog-to-digital conversion for image processing.
Abstract: A programmable arithmetic circuit forms multiple circuit modules for different arithmetic operations that share certain common electronic elements to reduce the number of elements. Such a circuit can be integrated into an imaging sensor array, such as a CMOS active pixel sensor array, to perform arithmetic operations and analog-to-digital conversion for image processing.

Patent
13 Sep 2000
TL;DR: In this paper, a small diameter endoscope with a light transmitting path was proposed for imaging of objects or tissue within a body, where an image relay was used to couple image light between optical elements of the probe such that an image is detected by an imaging sensor at a proximal end of the device.
Abstract: The present invention relates to a small diameter imaging probe or endoscope (10), a solid state imaging device (44), and a light transmitting path (40, 30, 42) that collects light at a distal end of the probe and directs the light along the length of the probe to the imaging device. The invention also relates to a small diameter endoscope having a light transmitting path with a light absorbing layer (32) and a super clad layer (68) defining the image aperture. The invention relates to a small diameter endoscope system for imaging of objects or tissue within a body. An image relay (42) is used to couple image light between optical elements of the probe such that an image is detected by an imaging sensor (44) at a proximal end of the device.

Patent
25 Sep 2000
TL;DR: In this article, an event recorder (10) mounted in a vehicle includes one or more wave pattern detectors (200) for detection and recognition of the presence of a predetermined wave produced external the vehicle and for producing a trigger signal denoting predetermined wave presence.
Abstract: An event recorder ( 10 ) mounted in a vehicle ( 20 ) includes one or more wave pattern detectors ( 200 ) for detection and recognition of the presence of a predetermined wave produced external the vehicle ( 20 ) and for producing a trigger signal denoting predetermined wave presence. Event recorder ( 10 ) includes sensors, including image sensor ( 60 ), sound sensor ( 90 ), location sensor ( 95 ), and vehicle performance sensors, and a capture circuit for storing sensed data signals for the time period before, during and after the wave was detected. A playback circuit ( 13 ) presents the captured data. The detected wave is produced such as by the police or fire department, or by an emergency vehicle, and is typically produced for a purpose other than being detected by wave detector ( 200 ). Wave detector ( 200 ) may be illumination wave, e.g. infrared beam or flash, detector ( 230 ), a radar detector ( 214 ), a laser kB detector ( 224 ), a flashing light detector ( 240 ), or a siren detector ( 250 ).

Patent
27 Mar 2000
TL;DR: In this paper, a flip-chip image sensor package is fabricated by forming an aperture in a substrate and mounting an image sensor to the substrate, such that an active area of the image sensor is aligned with the aperture.
Abstract: A method of fabricating a flip chip image sensor package includes forming an aperture in a substrate and mounting an image sensor to the substrate. The image sensor is mounted such that an active area of the image sensor is aligned with the aperture. A bead is formed around a periphery of the image sensor. An aperture side of the aperture, the image sensor, and the bead define a pocket. The method further includes filling the pocket with a transparent liquid encapsulant and hardening the transparent liquid encapsulant. The hardened transparent liquid encapsulant serves as the window for the flip chip image sensor package.

Patent
05 Jul 2000
TL;DR: In this paper, a series of shallow cuts are made in an interior surface of a window sheet having a plurality of windows, and a window support layer is formed on an upper surface of an image sensor wafer, such that the windows are above active areas of the image sensors.
Abstract: To form an image sensor package, a series of shallow cuts are made in an interior surface of a window sheet having a plurality of windows. A window support layer is formed on an upper surface of a wafer having a plurality of image sensors. The interior surface of the window sheet is pressed into the window support layer such that the windows are above active areas of the image sensors. The shallow cuts in combination with the window support layer define cavities above bond pads of the image sensors. The window sheet is cut from an exterior surface directly opposite of the cavities above the bond pads to singulate the windows from one another. The wafer is then singulated to form a plurality of image sensor packages.

Journal ArticleDOI
TL;DR: In this paper, a 256 × 256 pixel CMOS imager is described that exhibits 120 dB dynamic range, 56 dB signal-to-noise ratio (SNR), 65% fill factor, and an effective frame rate of 50 Hz.
Abstract: In this paper, a 256 × 256 pixel CMOS imager is described that exhibits 120 dB dynamic range, 56 dB signal-to-noise ratio (SNR), 65% fill factor, and an effective frame rate of 50 Hz. This has been achieved using a unique combination of a multiexposure and a multigain linear readout. The imager has been integrated in 1 μm double-metal CMOS technology. The intended application is for driver's assistant systems, but the imager can be used for a wide range of applications requiring high dynamic range.
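A multiexposure linear readout can be combined into one high dynamic range frame by taking, per pixel, the longest (most sensitive) unsaturated exposure and rescaling it by the exposure ratio. The exposure ratios and full-scale value below are invented, and the chip's multigain path is not modeled:

```python
import numpy as np

FULL_SCALE = 1000.0
exposures = [1.0, 0.1, 0.01]    # relative exposure times, longest first

def combine_multiexposure(frames):
    """Per pixel, keep the longest exposure that is not saturated and
    rescale it by its relative exposure time."""
    out = np.full(frames[0].shape, np.nan)
    for frame, t in zip(frames, exposures):
        unsat = (frame < FULL_SCALE) & np.isnan(out)
        out[unsat] = frame[unsat] / t
    return out

# A scene spanning two decades more range than a single exposure covers.
scene = np.array([[500.0, 5000.0, 50000.0]])
frames = [np.minimum(scene * t, FULL_SCALE) for t in exposures]
hdr = combine_multiexposure(frames)
```

Each scene value is recovered exactly from whichever exposure kept it below full scale, which is how multiple linear readouts extend the dynamic range.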

Patent
05 Jul 2000
TL;DR: An image sensor package is proposed in which a window support on the upper surface of the image sensor, together with the overlying window, entirely encloses and thus protects the upper surface, including the active area and the bond pads.
Abstract: An image sensor package includes an image sensor having bond pads and an active area on an upper surface of the image sensor. The image sensor package further includes a window support on the upper surface of the image sensor. The window support entirely encloses the upper surface including the active area and the bond pads. A window is in contact with the window support, the window overlying the active area. Generally, the window support and the window entirely enclose, and thus protect, the active area of the image sensor.

Patent
23 Feb 2000
TL;DR: In this paper, an image sensor operable to vary the output spatial resolution according to a received light level while maintaining a desired signal-to-noise ratio is presented. But the authors do not consider the use of column integrators for uniform column-parallel signal summation.
Abstract: An image sensor operable to vary the output spatial resolution according to a received light level while maintaining a desired signal-to-noise ratio. Signals from neighboring pixels in a pixel patch with an adjustable size are added to increase both the image brightness and signal-to-noise ratio. One embodiment comprises a sensor array for receiving input signals, a frame memory array for temporarily storing a full frame, and an array of self-calibration column integrators for uniform column-parallel signal summation. The column integrators are capable of substantially canceling fixed pattern noise.
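Signal summation over a pixel patch trades spatial resolution for brightness and SNR: summing a k × k patch multiplies the signal by k² while independent read noise grows only by k. A NumPy sketch of the binning arithmetic (the noise level and patch size are illustrative; the patent's column integrators and FPN cancellation are not modeled):

```python
import numpy as np

def bin_pixels(frame, k):
    """Sum k x k patches of neighboring pixels: signal grows by k^2 while
    independent read noise grows by k, so SNR improves by a factor of k."""
    h, w = frame.shape
    return frame[:h - h % k, :w - w % k].reshape(h // k, k, w // k, k).sum(axis=(1, 3))

rng = np.random.default_rng(1)
signal, read_noise = 4.0, 2.0
frame = signal + rng.normal(0.0, read_noise, (64, 64))

snr_single = signal / read_noise
binned = bin_pixels(frame, 4)
# After 4x4 binning: signal * 16, noise * 4, so SNR improves roughly 4x.
snr_binned = binned.mean() / (read_noise * 4)
```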

Patent
08 Aug 2000
TL;DR: An imaging sensor module assembly adapted to be mounted to a substrate for use in electronic imaging devices is described in this paper, which includes an optical lens, and a sensor package having a sensor surface containing an optical detector portion.
Abstract: An imaging sensor module assembly adapted to be mounted to a substrate for use in electronic imaging devices. The imaging sensor includes an optical lens, and a sensor package having a sensor surface containing an optical detector portion. The sensor further includes a plurality of sensor contacts in electrical communication with the optical detector portion. A flex circuit includes a plurality of circuits terminating at respective terminals electrically coupled to a corresponding sensor contact. The module assembly further includes a lens housing assembly configured to support the optical lens, and a barrel portion adapted to fixedly couple to the sensor package. This coupling orients the lens a predetermined focal length from the sensor package such that light waves passing through the lens are focused onto the optical detector portion.

Proceedings ArticleDOI
TL;DR: This paper describes a methodology, using a camera simulator and image quality metrics, for determining the optimal pixel size, and it is shown that the optimal pixel size scales with technology, but at a slower rate than the technology itself.
Abstract: Pixel design is a key part of image sensor design. After deciding on pixel architecture, a fundamental tradeoff is made to select pixel size. A small pixel size is desirable because it results in a smaller die size and/or higher spatial resolution; a large pixel size is desirable because it results in higher dynamic range and signal-to-noise ratio. Given these two ways to improve image quality and given a set of process and imaging constraints, an optimal pixel size exists. It is difficult, however, to analytically determine the optimal pixel size, because the choice depends on many factors, including the sensor parameters, imaging optics and the human perception of image quality. This paper describes a methodology, using a camera simulator and image quality metrics, for determining the optimal pixel size. The methodology is demonstrated for APS implemented in CMOS processes down to 0.18 μm technology. For a typical 0.35 μm CMOS technology the optimal pixel size is found to be approximately 6.5 μm at a fill factor of 30%. It is shown that the optimal pixel size scales with technology, but at a slower rate than the technology itself.

Patent
02 Jun 2000
TL;DR: In this article, a digital camera (10) is disclosed for capturing digital images and organizing the captured images for subsequent transfer from the digital camera to an external device (40) that utilizes the digital images.
Abstract: A digital camera (10) is disclosed for capturing digital images and organizing the captured images for subsequent transfer from the digital camera to an external device (40) that utilizes the digital images. The digital camera includes a database having a plurality of customized profiles, wherein each customized profile contains a plurality of image utilization fields. A user selects one of the plurality of customized profiles from the database. The digital camera further includes a structure for defining a plurality of profile indices respectively corresponding to ones of the plurality of customized profiles, and an image sensor for capturing images. A profile index is associated with at least one captured image to identify the corresponding selected customized profile. The digital camera further includes a memory for receiving and storing the at least one captured image and the corresponding profile index.

Proceedings ArticleDOI
10 Jul 2000
TL;DR: Methods and results for fusion of imagery from multiple sensors to create a color night vision capability are presented, and it is demonstrated how the outputs of these multi-sensor fusion systems are used as inputs to an interactive tool for target designation, learning, and search based on a fuzzy ARTMAP neural network.
Abstract: We present methods and results for fusion of imagery from multiple sensors to create a color night vision capability. The fusion system architectures are based on biological models of the spatial and opponent-color processes in the human retina and visual cortex, implemented as shunting center-surround feed-forward neural networks. Real-time implementation of the dual-sensor fusion system combines imagery from either a low-light CCD camera or a short-wave infrared camera, with thermal long-wave infrared imagery. Results are also shown for extensions of this fusion architecture to include imagery from all three of these sensors, Visible/SWIR/LWIR, as well as a four sensor system using Visible/SWIR/MWIR/LWIR cameras. We also demonstrate how results from these multi-sensor fusion systems are used as inputs to an interactive tool for target designation, learning, and search based on a fuzzy ARTMAP neural network.
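The shunting center-surround stage underlying these architectures has a simple steady state: a contrast-normalized opponent signal bounded in (-1, 1). A toy Python sketch with two invented single-row "images" standing in for the visible and thermal bands (the full retina/cortex model of the paper is not reproduced):

```python
import numpy as np

def shunting_opponent(center, surround, decay=1.0):
    """Steady state of a shunting center-surround opponent network:
    output = (center - surround) / (decay + center + surround),
    i.e. a contrast-normalized difference bounded in (-1, 1)."""
    return (center - surround) / (decay + center + surround)

# The opponent signal highlights where one band is bright and the other dark,
# which is what gives the fused night-vision imagery its color contrast.
visible = np.array([[0.9, 0.1]])
thermal = np.array([[0.1, 0.9]])
opponent = shunting_opponent(visible, thermal)
```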

Patent
20 Nov 2000
TL;DR: In this paper, the optical beam received from the camera lens is split into two beams via a beam splitter, and each beam is applied to the corresponding color and monochrome image sensors.
Abstract: A video signal generating apparatus in which the video signal is produced using two different image sensors is described. The optical beam received from the camera lens is split into two beams by a beam splitter, and each beam is applied to a corresponding color or monochrome image sensor. Both imagers are scanned synchronously, and the corresponding output signals are digitized. The monochrome imager is scanned at a lower frame rate and generates a high-resolution luminance signal Y_H. The second imager has color filters arranged as vertical stripes; it is scanned at a higher frame rate than the first and generates a set of two low-resolution color difference signals C_R and C_B and one low-resolution luminance signal Y_L. At frame 1 both image sensors are scanned, at frames 2 and 3 only the color sensor is scanned, at frame 4 both image sensors are scanned, at frames 5 and 6 only the color sensor is scanned, at frame 7 both image sensors are scanned, and so on. The low-resolution color signals C_R and C_B from frames 1, 4, 7, etc. are digitally interpolated to the same resolution as the corresponding high-resolution luminance signal Y_H. In parallel, a set of motion signals is generated from the low-resolution luminance Y_L of every frame. The motion signals, the interpolated color difference signals, and the high-resolution luminance Y_H are mixed into a composite data stream, compressed if necessary, and formatted into a standard 20-bit-wide stream at a 1.5 gigabit-per-second data rate. Additional information such as date, time, and camera location can be added to the data stream if needed.
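The scanning schedule and chroma upsampling described above can be sketched as follows. The function names are hypothetical, and nearest-neighbour interpolation is used only as a placeholder (the patent leaves the interpolation method open):

```python
import numpy as np

def scanned_sensors(frame):
    """Which sensors are read on a given frame (frames numbered from 1).
    The color imager is scanned every frame; the monochrome
    (high-resolution luminance) imager only on frames 1, 4, 7, ..."""
    sensors = ["color"]
    if frame % 3 == 1:
        sensors.append("mono")
    return sensors

def upsample_chroma(c_lo, factor=2):
    """Interpolate a low-resolution chroma plane (C_R or C_B) up to the
    high-resolution luminance grid. Nearest-neighbour replication stands
    in for whatever interpolation filter the apparatus actually uses."""
    return np.repeat(np.repeat(c_lo, factor, axis=0), factor, axis=1)
```

Scanning the monochrome imager on only every third frame is what lets it run at a third of the color imager's frame rate while the motion signals derived from Y_L cover the intervening frames.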

Patent
26 May 2000
TL;DR: In this article, a method and apparatus for obtaining relatively high dynamic range images using a relatively low dynamic range image sensor without significant loss of resolution is presented, where an image of a scene is captured with the image sensor and stored as brightness values at respective pixel positions in a linear or two-dimensional uniform grid.
Abstract: Disclosed are a method and apparatus for obtaining relatively high dynamic range images using a relatively low dynamic range image sensor without significant loss of resolution. The image sensor has an array of light-sensing elements with different sensitivity levels in accordance with a predetermined spatially varying sensitivity pattern for the array of light-sensing elements. An image of a scene is captured with the image sensor and stored as brightness values at respective pixel positions in a linear or two-dimensional uniform grid. The brightness values of the captured image at the pixel positions are then used to estimate the brightness values at off-grid positions of a uniform off-grid array located at respective interstices of the pixel position grid. The estimated off-grid brightness values are either used directly as the pixel brightness values of a relatively high dynamic range output image or interpolated to derive resampled on-grid brightness values at the pixel positions of the pixel position grid to provide a relatively high dynamic range output image. Alternatively, the brightness values of the captured image are interpolated by an on-grid interpolation filter to derive pixel brightness values of a relatively high dynamic range output image, each pixel brightness value of the output image being derived from a corresponding plurality of the captured image brightness values. In each instance, either the captured image brightness values or the pixel brightness values of the output image may be compensated for non-linearities of the radiometric response function of the light-sensing elements of the image sensor.
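The off-grid estimation step can be sketched as below, assuming a tiled 2x2 pattern of exposure levels (the pattern and the saturation handling are illustrative assumptions; the patent only requires the sensitivity pattern to be predetermined and spatially varying):

```python
import numpy as np

def sve_offgrid_estimate(captured, sensitivities, full_scale=255.0):
    """Sketch of spatially-varying-exposure HDR reconstruction.

    `captured` is the raw image; `sensitivities` is a same-shaped array
    giving each pixel's relative sensitivity. Each off-grid point, at the
    interstice of a 2x2 pixel neighbourhood, is estimated as the mean of
    that neighbourhood's radiance estimates, ignoring saturated pixels."""
    radiance = captured / sensitivities      # linearised radiance estimate
    valid = captured < full_scale            # mask out saturated pixels
    h, w = captured.shape[0] - 1, captured.shape[1] - 1
    num = np.zeros((h, w))
    den = np.zeros((h, w))
    for dy in (0, 1):
        for dx in (0, 1):
            r = radiance[dy:dy + h, dx:dx + w]
            v = valid[dy:dy + h, dx:dx + w]
            num += np.where(v, r, 0.0)
            den += v
    return num / np.maximum(den, 1)
```

Because each off-grid neighbourhood mixes high- and low-sensitivity pixels, at least one pixel usually remains unsaturated even in bright regions, which is what extends the dynamic range without a second exposure.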

Patent
14 Jan 2000
TL;DR: In this article, a CMOS image sensor die is fabricated and packaged to allow the light sensitive area of the die to be illuminated from either the front side or the backside, or both.
Abstract: A CMOS image sensor die is fabricated and packaged to allow the light-sensitive area of the die to be illuminated from the front side, the backside, or both. The implementation is achieved using wafer-level processing that facilitates photon collection at both surfaces and permits deposition of color filter arrays (CFAs) on either surface. The silicon is thinned, and the bump contacts and interconnect lines are relocated away from the image area of the die. The die is covered with an optically transparent material to provide additional support.