Proceedings ArticleDOI

High frame rate for 3D Time-of-Flight cameras by dynamic sensor calibration

08 Apr 2011, pp. 1-8
TL;DR: This work presents a method to implicitly calibrate these asymmetries of multi-tap 3D Time-of-Flight sensors; the proposed correction of raw data supersedes the commonly used averaging technique and increases the frame rate by at least a factor of two.
Abstract: 3D Time-of-Flight cameras are able to deliver robust depth maps of dynamic scenes. The frame rate, however, is limited because today's systems utilizing two-tap sensors need to acquire the required raw images in multiple instances in order to compute one depth map. These multiple raw images allow canceling out systematic errors introduced by asymmetries in the two taps, which otherwise would distort the reconstructed depth map. This work presents a method to implicitly calibrate these asymmetries of multi-tap 3D Time-of-Flight sensors. The calibration data are gathered from arbitrary live acquisitions possibly in real-time. The proposed correction of raw data supersedes the commonly used averaging technique. Thus it is possible to compute multiple depth maps from a single set of raw images. This increases the frame rate by at least a factor of two. The method is verified using real camera data and is evaluated quantitatively.
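To make the frame-rate argument concrete, the sketch below shows how a depth map could be computed from a single pair of acquisitions of a two-tap sensor once per-tap corrections are available. It is a minimal sketch in Python; the modulation frequency, the linear gain/offset correction model and the cal dictionary are illustrative assumptions and do not reproduce the paper's actual dynamic calibration procedure.

    import numpy as np

    C = 299_792_458.0   # speed of light [m/s]
    F_MOD = 20e6        # assumed modulation frequency [Hz]

    def correct(raw, gain, offset):
        # Hypothetical per-pixel correction standing in for a calibrated
        # tap response (one gain/offset pair per tap).
        return gain * raw + offset

    def depth_from_one_set(tapA_0, tapB_0, tapA_90, tapB_90, cal):
        # Two acquisitions (0 deg and 90 deg) of a two-tap sensor. Tap B
        # integrates the 180-deg-shifted signal, so after correcting each
        # tap individually all four phase samples are available without
        # averaging a second, role-swapped set of raw images.
        I0   = correct(tapA_0,  *cal['A'])   # 0 degrees
        I180 = correct(tapB_0,  *cal['B'])   # 180 degrees
        I90  = correct(tapA_90, *cal['A'])   # 90 degrees
        I270 = correct(tapB_90, *cal['B'])   # 270 degrees
        phase = np.mod(np.arctan2(I270 - I90, I0 - I180), 2 * np.pi)
        return C * phase / (4 * np.pi * F_MOD)

Here cal would hold, e.g., {'A': (gain_A, offset_A), 'B': (gain_B, offset_B)} as estimated by a dynamic calibration; without such a correction the tap asymmetries bias the phase estimate and the two role-swapped acquisitions would have to be averaged instead.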


Citations
Journal ArticleDOI
TL;DR: An experimental study is conducted to evaluate techniques for mitigation of non-linearity, and it is found that harmonic cancellation provides a significant improvement in phase and amplitude linearity.
Abstract: Amplitude modulated continuous wave (AMCW) lidar systems commonly suffer from non-linear phase and amplitude responses due to a number of known factors such as aliasing and multipath interference. In order to produce useful range and intensity information it is necessary to remove these perturbations from the measurements. We review the known causes of non-linearity, namely aliasing, temporal variation in correlation waveform shape and mixed pixels/multipath interference. We also introduce other sources of non-linearity, including crosstalk, modulation waveform envelope decay and non-circularly symmetric noise statistics, that have been ignored in the literature. An experimental study is conducted to evaluate techniques for mitigation of non-linearity, and it is found that harmonic cancellation provides a significant improvement in phase and amplitude linearity.
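As a rough illustration of why harmonic cancellation improves phase linearity, the sketch below (a textbook construction, not the specific technique evaluated in the paper) estimates phase from N correlation samples taken at equally spaced phase offsets via the first DFT bin; with N samples, all harmonics of the correlation waveform except orders kN ± 1 cancel out of that bin.

    import numpy as np

    def phase_from_samples(samples):
        # Phase estimate from N correlation samples at equally spaced phase
        # offsets, using the first DFT bin. Increasing N suppresses more of
        # the low-order harmonics responsible for the cyclic phase error.
        samples = np.asarray(samples, dtype=float)
        n = len(samples)
        k = np.arange(n)
        bin1 = np.sum(samples * np.exp(-2j * np.pi * k / n))
        return np.angle(bin1)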

35 citations


Cites methods from "High frame rate for 3D Time-of-Flig..."

  • ...[5] have developed a method for dynamic determination of bias and gain correction coefficients, potentially enabling approaches such as Hussman’s to be implemented without bias and gain variation induced systematic errors....


Proceedings ArticleDOI
01 Dec 2011
TL;DR: This work investigates the causes of motion artifacts and proposes a method which significantly reduces this kind of error by analyzing the temporal raw data signal of individual pixels, making it possible to identify and correct affected raw data values.
Abstract: 3D Time-of-Flight (ToF) cameras are capable of acquiring dense depth maps of a scene by determining the time it takes for light to travel from a source to an object and back to the camera. Determining the depth requires multiple measurements. Current ToF systems are not able to acquire all these measurements simultaneously. If the observed scene changes during the acquisition of the data for a single depth map, the reconstructed values are erroneous. Such errors are known as motion artifacts. This work investigates the causes leading to motion artifacts and proposes a method which significantly reduces this kind of error. This is done by analyzing the temporal raw data signal of individual pixels, which makes it possible to identify and correct affected raw data values. The method is demonstrated using a commercial ToF system. The proposed algorithms can be implemented in a computationally very efficient way, so they can be applied in real-time, even on systems with limited computational resources (e.g. embedded systems).
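A per-pixel temporal consistency check of the kind described above could look like the following sketch; the simple frame-difference criterion and the threshold are assumptions for illustration, not the paper's actual detection rule.

    import numpy as np

    def flag_motion_artifacts(raw_stack, threshold):
        # raw_stack: (T, H, W) array of consecutive raw frames.
        # Flags pixels whose frame-to-frame raw value change exceeds a
        # (hypothetical) threshold, i.e. pixels whose temporal raw signal
        # is inconsistent with a static scene.
        diffs = np.abs(np.diff(raw_stack.astype(float), axis=0))
        return np.any(diffs > threshold, axis=0)   # boolean (H, W) mask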

11 citations


Cites background or methods from "High frame rate for 3D Time-of-Flig..."

  • ...The use of a multitap system in combination with the dynamic calibration method from [6] allows to employ multiple sets out of a burst (BID method)....


  • ...The depth map computed from a subset of the available acquisitions shows an increased statistical uncertainty (see [6])....


  • ...In combination with the dynamic calibration from [6] this method is able...


  • ...A very promising possibility to overcome this limitation is opened up by a dynamic calibration method presented in [6]....


  • ...The dynamic calibration method proposed in [6] facilitates to split each set of acquisitions into two subsets....


Dissertation
01 Jan 2012
TL;DR: A detailed model of signal formation is developed, that models noise statistics not included in previously reported models and is found to provide an order of magnitude improvement in ranging accuracy, albeit at the expense of ranging precision.
Abstract: This thesis presents an analysis of systematic error in full-field amplitude modulated continuous wave range-imaging systems. The primary focus is on the mixed pixel/multipath interference problem, with digressions into defocus restoration, irregular phase sampling and the systematic phase perturbations introduced by random noise. As an integral part of the thesis, a detailed model of signal formation is developed, that models noise statistics not included in previously reported models. Prior work on the mixed pixel/multipath interference problem has been limited to detection and removal of perturbed measurements or partial amelioration using spatial information, such as knowledge of the spatially variant scattering point spread function, or raytracing using an assumption of Lambertian reflection. Furthermore, prior art has only used AMCW range measurements at a single modulation frequency. In contrast, in this thesis, by taking multiple measurements at different modulation frequencies with known ratio-of-integers frequency relationships, a range of new closed-form and lookup table based inversion and bounding methods are explored. These methods include: sparse spike train deconvolution based multiple return separation, a closed-form inverse using attenuation ratios and a normalisation based lookup table method that uses a new property we term the characteristic measurement. Other approaches include a Cauchy distribution based model for backscattering sources which are range-diffuse, like fog or hair. Novel bounding methods are developed using the characteristic measurement and attenuation ratios on relative intensity, relative phase and phase perturbation. A detailed noise and performance analysis is performed of the characteristic measurement lookup table method and the bounding methods using simulated data. Experiments are performed using the University of Waikato Heterodyne range-imager, the Canesta XZ-422 and the Mesa Imaging Swissranger 4000 in order to demonstrate the performance of the lookup table method. The lookup table method is found to provide an order of magnitude improvement in ranging accuracy, albeit at the expense of ranging precision.
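The mixed pixel/multipath problem can be summarised with a generic phasor model (a standard formulation, not the thesis's specific derivation): an AMCW measurement at modulation frequency f is the complex sum of the component returns,

    \tilde{I}(f) \;=\; \sum_{j=1}^{K} a_j \, e^{\,i\,4\pi f d_j / c},

so a single-frequency measurement cannot separate the amplitude/range pairs (a_j, d_j); measurements at several modulation frequencies with known integer-ratio relationships, as used in the thesis, provide the additional equations needed to separate or bound two returns.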

10 citations

Proceedings ArticleDOI
02 May 2017
TL;DR: This work proposes combining the Kinect depth camera with an ordinary color camera to synthesize a high frame rate and low latency depth image, and exploits common CMOS camera region of interest (ROI) functionality to obtain a high frame rate image over a small ROI.
Abstract: The low frame rate and high latency of consumer depth cameras limits their use in interactive applications. We propose combining the Kinect depth camera with an ordinary color camera to synthesize a high frame rate and low latency depth image. We exploit common CMOS camera region of interest (ROI) functionality to obtain a high frame rate image over a small ROI. Motion in the ROI is computed by a fast optical flow implementation. The resulting flow field is used to extrapolate Kinect depth images to achieve high frame rate and low latency depth, and optionally predict depth to further reduce latency. Our "Hybrid HFR Depth" prototype generates useful depth images at maximum 500Hz with minimum 20ms latency. We demonstrate Hybrid HFR Depth in tracking fast moving objects, handwriting in the air, and projecting onto moving hands. Based on commonly available cameras and image processing implementations, Hybrid HFR Depth may be useful to HCI practitioners seeking to create fast, fluid depth camera-based interactions.
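The core extrapolation step can be sketched as warping the most recent depth image with the high-rate optical flow field; the nearest-neighbour forward mapping below is a simplified stand-in for the paper's pipeline (the flow estimation and ROI handling are omitted).

    import numpy as np

    def extrapolate_depth(depth, flow):
        # depth: (H, W) depth image from the slow depth camera.
        # flow:  (H, W, 2) pixel displacements estimated from the fast
        #        color camera (x displacement in [..., 0], y in [..., 1]).
        # Forward-maps each depth pixel to its displaced location; holes
        # left by the splat remain zero in this simplified version.
        h, w = depth.shape
        ys, xs = np.mgrid[0:h, 0:w]
        xt = np.clip(np.round(xs + flow[..., 0]).astype(int), 0, w - 1)
        yt = np.clip(np.round(ys + flow[..., 1]).astype(int), 0, h - 1)
        out = np.zeros_like(depth)
        out[yt, xt] = depth[ys, xs]
        return out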

8 citations


Cites methods from "High frame rate for 3D Time-of-Flig..."

  • ...[15] presented a method to implicitly calibrate multi-tap 3D Time of Flight sensors, increasing the frame rate by a factor of two....


  • ...Compared to customized hardware [9, 13, 14] and software-only approaches [3, 7, 8, 15, 17, 19], it is hybrid in nature....


Patent
31 Oct 2019
TL;DR: In this paper, a distance measurement processing apparatus is presented that includes a correction parameter calculating section configured to calculate a correction parameter for correcting the deviation of characteristics between a first tap and a second tap, using a predetermined number of detection signals which are detected, two each, for two or more types of irradiated light.
Abstract: There is provided a distance measurement processing apparatus including: a correction parameter calculating section configured to calculate a correction parameter for correcting deviation of characteristics between a first tap and a second tap using a predetermined number of detection signals which are detected two each for two or more types of irradiated light, two or more types of the irradiated light with a predetermined phase difference being radiated on an object, and electric charges generated by reflected light reflected by the object being received being sorted into the first tap and the second tap in accordance with a distance to the object; and a distance measuring section configured to obtain a depth indicating a distance to the object on the basis of the correction parameter and a predetermined number of the detection signals.
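A correction parameter of this kind could, for example, be obtained by fitting a gain and offset that map one tap's detection signals onto the other's for the same phase step; the least-squares sketch below is a hypothetical illustration, not the procedure claimed in the patent.

    import numpy as np

    def estimate_tap_correction(tapA_samples, tapB_samples):
        # Least-squares fit of (gain, offset) such that
        # tapA ≈ gain * tapB + offset over many paired detection signals.
        tapA = np.asarray(tapA_samples, dtype=float)
        tapB = np.asarray(tapB_samples, dtype=float)
        design = np.column_stack([tapB, np.ones_like(tapB)])
        gain, offset = np.linalg.lstsq(design, tapA, rcond=None)[0]
        return gain, offset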

1 citation

References
Proceedings ArticleDOI
27 Jun 2004
TL;DR: A CMOS-based time-of-flight depth sensor, based on a special CMOS pixel structure that can extract phase information from the received light pulses, is presented; it offers significant advantages, including superior accuracy, high frame rate, cost effectiveness and a drastic reduction in the processing required to construct the depth maps.
Abstract: This paper describes a CMOS-based time-of-flight depth sensor and presents some experimental data while addressing various issues arising from its use. Our system is a single-chip solution based on a special CMOS pixel structure that can extract phase information from the received light pulses. The sensor chip integrates a 64x64 pixel array with a high-speed clock generator and ADC. A unique advantage of the chip is that it can be manufactured with an ordinary CMOS process. Compared with other types of depth sensors reported in the literature, our solution offers significant advantages, including superior accuracy, high frame rate, cost effectiveness and a drastic reduction in processing required to construct the depth maps. We explain the factors that determine the resolution of our system, discuss various problems that a time-of-flight depth sensor might face, and propose practical solutions.

528 citations

13 Nov 2006

391 citations


"High frame rate for 3D Time-of-Flig..." refers methods in this paper


  • ...As derived by Lange [5], assuming a sinusoidal modulation of the light source and a rectangular window function, the correlation function is I(Θ) = a0 + a1 · cos(Θ + ϕ) , (1) where a0 and a1 are the offset and amplitude of the electrooptical signal, and ϕ is the phase shift between both signals....

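The quoted correlation model leads directly to the standard four-sample reconstruction (a textbook derivation, not an excerpt from the paper). Sampling I(Θ) at Θ_k = k·π/2 for k = 0, …, 3 gives

    I_k = a_0 + a_1 \cos\!\left(\varphi + k\,\tfrac{\pi}{2}\right), \qquad
    \varphi = \operatorname{atan2}(I_3 - I_1,\; I_0 - I_2), \qquad
    a_1 = \tfrac{1}{2}\sqrt{(I_3 - I_1)^2 + (I_0 - I_2)^2}, \qquad
    a_0 = \tfrac{1}{4}\sum_{k} I_k,

and the depth follows as d = c·φ / (4π f_mod). Any gain or offset mismatch between the taps that supply the I_k propagates into φ, which is why the calibration discussed in the main abstract matters for every sample.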

Proceedings ArticleDOI
18 Feb 2004
TL;DR: The SwissRanger 2, presented in this paper, is a 3D camera system based on the time-of-flight (TOF) principle that can achieve sub-centimeter depth resolution for a wide range of operating conditions.
Abstract: A new miniaturized camera system that is capable of 3-dimensional imaging in real-time is presented. The compact imaging device is able to entirely capture its environment in all three spatial dimensions. It reliably and simultaneously delivers intensity data as well as range information on the objects and persons in the scene. The depth measurement is based on the time-of-flight (TOF) principle. A custom solid-state image sensor allows the parallel measurement of the phase, offset and amplitude of a radio frequency (RF) modulated light field that is emitted by the system and reflected back by the camera surroundings without requiring any mechanical scanning parts. In this paper, the theoretical background of the implemented TOF principle is presented, together with the technological requirements and detailed practical implementation issues of such a distance measuring system. Furthermore, the schematic overview of the complete 3D-camera system is provided. The experimental test results are presented and discussed. The present camera system can achieve sub-centimeter depth resolution for a wide range of operating conditions. A miniaturized version of such a 3D-solid-state camera, the SwissRanger 2, is presented as an example, illustrating the possibility of manufacturing compact, robust and cost effective ranging camera products for 3D imaging in real-time.

301 citations

DOI
01 Jan 2006
TL;DR: In this paper, some of the parameters which influence the behavior and performance of the range imaging camera SwissRanger (provided by the Swiss Center for Electronics and Microtechnology, CSEM) are described.
Abstract: Range imaging is a suitable new choice for measurement and modeling in many different applications. But due to the technology's relatively recent appearance on the market, with only a few different realizations, knowledge of its capabilities is still limited. In most applications, like robotics and measurement systems, the required accuracy lies at a few millimeters. The raw data of range imaging cameras do not reach this level. Therefore, calibration of the sensor's output is needed. In this paper, some of the parameters which influence the behavior and performance of the range imaging camera SwissRanger (provided by the Swiss Center for Electronics and Microtechnology, CSEM) are described. Because of the highly systematic structure and the correlations between parameters and output data, a parameter-based calibration approach is presented. This includes a photogrammetric camera calibration and a distance system calibration with respect to reflectivity and the distance itself.

229 citations


"High frame rate for 3D Time-of-Flig..." refers background in this paper

  • ...For instance Kahlmann, Remondino, and Ingensand [4], Lindner and Kolb [6] and Rapp [9] presented methods to decrease systematic deviations of the estimated scene unknowns....


Journal Article
TL;DR: In this paper, an accurate distance calibration approach for PMD-based distance sensing is described. As the approach is a rather new and unexplored method, proper calibration techniques have not been widely investigated yet.
Abstract: A growing number of modern applications such as position determination, object recognition and collision prevention depend on accurate scene analysis. The estimation of an object's distance relative to an observer's position by image analysis or laser scan techniques is thereby still the most time-consuming and expensive part. A lower-priced and much faster alternative is distance measurement with modulated, coherent infrared light based on the Photo Mixing Detector (PMD) technique. As this approach is a rather new and unexplored method, proper calibration techniques have not been widely investigated yet. This paper describes an accurate distance calibration approach for PMD-based distance sensing.

208 citations