
Showing papers by "Stephen McLaughlin published in 2021"


Journal ArticleDOI
TL;DR: A deep network is developed that takes advantage of the multiple features that can be extracted from a camera's histogram data, providing significant image resolution enhancement and image denoising across a wide range of signal-to-noise ratios and photon levels.
Abstract: The number of applications that use depth imaging is increasing rapidly, e.g. autonomous vehicles and auto-focus assist on smartphone cameras. Light detection and ranging (LIDAR) via single-photon avalanche diode (SPAD) arrays is an emerging technology that enables the acquisition of depth images at high frame rates. However, the spatial resolution of this technology is typically low in comparison to the intensity images recorded by conventional cameras. To increase the native resolution of depth images from a SPAD camera, we develop a deep network built to take advantage of the multiple features that can be extracted from a camera's histogram data. The network is designed for a SPAD camera operating in a dual mode such that it captures alternate low-resolution depth and high-resolution intensity images at high frame rates; thus the system does not require any additional sensor to provide intensity images. The network then uses the intensity images and multiple features extracted from down-sampled histograms to guide the up-sampling of the depth. Our network provides significant image resolution enhancement and image denoising across a wide range of signal-to-noise ratios and photon levels. Additionally, we show that the network can be applied to other types of SPAD data, demonstrating the generality of the algorithm.
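
As an illustration of the intensity-guided upsampling idea described above, the PyTorch sketch below fuses features extracted from low-resolution histogram data with a high-resolution intensity guide to predict a high-resolution depth map. The layer widths, the 4x upsampling factor and the choice of four histogram features are illustrative assumptions, not the published architecture.

```python
# Minimal sketch of intensity-guided depth upsampling (illustrative only; the
# layer widths, 4x factor and feature choices are assumptions, not the
# published network).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GuidedDepthUpsampler(nn.Module):
    def __init__(self, n_hist_features=4, scale=4):
        super().__init__()
        self.scale = scale
        # Branch processing features extracted from the low-resolution histograms
        # (e.g. per-pixel depth, intensity and background estimates).
        self.depth_branch = nn.Sequential(
            nn.Conv2d(n_hist_features, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        )
        # Branch processing the high-resolution intensity guide image.
        self.guide_branch = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        )
        # Fusion head predicting the high-resolution depth map.
        self.fusion = nn.Sequential(
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 1, 3, padding=1),
        )

    def forward(self, hist_features_lr, intensity_hr):
        # hist_features_lr: (B, n_hist_features, H/scale, W/scale)
        # intensity_hr:     (B, 1, H, W)
        x = self.depth_branch(hist_features_lr)
        x = F.interpolate(x, scale_factor=self.scale,
                          mode='bilinear', align_corners=False)
        g = self.guide_branch(intensity_hr)
        return self.fusion(torch.cat([x, g], dim=1))

# Example: upsample 32x32 histogram features to a 128x128 depth map.
net = GuidedDepthUpsampler()
depth_hr = net(torch.randn(1, 4, 32, 32), torch.randn(1, 1, 128, 128))
print(depth_hr.shape)  # torch.Size([1, 1, 128, 128])
```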

30 citations


Journal ArticleDOI
TL;DR: In this article, a hierarchical Bayesian algorithm for the robust reconstruction of multispectral single-photon Lidar data in high-noise environments is presented. The algorithm exploits multi-scale information to provide robust depth and reflectivity estimates together with their uncertainties to help with decision making.
Abstract: 3D Lidar imaging can be a challenging modality when using multiple wavelengths, or when imaging in high noise environments (e.g., imaging through obscurants). This paper presents a hierarchical Bayesian algorithm for the robust reconstruction of multispectral single-photon Lidar data in such environments. The algorithm exploits multi-scale information to provide robust depth and reflectivity estimates together with their uncertainties to help with decision making. The proposed weight-based strategy allows the use of available guide information that can be obtained by using state-of-the-art learning based algorithms. The proposed Bayesian model and its estimation algorithm are validated on both synthetic and real images showing competitive results regarding the quality of the inferences and the computational complexity when compared to the state-of-the-art algorithms.
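
A toy numpy sketch of the multi-scale idea is given below: depth estimates computed from coarser (photon-richer, hence less noisy) aggregations of the histograms are combined with the fine-scale estimate through per-pixel weights. The inverse-variance weighting used here is a simplifying assumption standing in for the paper's hierarchical Bayesian model and its guide information.

```python
# Toy sketch of combining per-scale depth estimates with weights. The
# inverse-variance weighting is an illustrative assumption, not the paper's
# hierarchical Bayesian model; H and W are assumed divisible by the coarsest
# scale.
import numpy as np

def multiscale_depth(histograms, bin_width=1.0):
    """histograms: (H, W, T) photon-count histograms for one wavelength."""
    H, W, T = histograms.shape
    scales = [1, 2, 4]                                 # spatial downsampling factors
    estimates, weights = [], []
    for s in scales:
        # Aggregate histograms over s x s blocks to boost photon counts.
        h = histograms.reshape(H // s, s, W // s, s, T).sum(axis=(1, 3))
        depth = h.argmax(axis=-1) * bin_width          # peak of the aggregated histogram
        counts = h.sum(axis=-1)
        var = bin_width**2 / np.maximum(counts, 1)     # crude per-pixel uncertainty proxy
        # Upsample the coarse estimates back to the full resolution.
        estimates.append(np.kron(depth, np.ones((s, s))))
        weights.append(np.kron(1.0 / var, np.ones((s, s))))
    E, Wg = np.stack(estimates), np.stack(weights)
    return (Wg * E).sum(axis=0) / Wg.sum(axis=0)       # weighted combination across scales

hist = np.random.poisson(0.1, size=(64, 64, 100))
print(multiscale_depth(hist).shape)                    # (64, 64)
```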

20 citations


Journal ArticleDOI
TL;DR: A new similarity measure for robust depth estimation is considered, which allows us to use a simple observation model and a non-iterative estimation procedure while being robust to mis-specification of the background illumination model, and leads to a computationally attractive depth estimation procedure without significant degradation of the reconstruction performance.
Abstract: In this article, we present a new algorithm for fast, online 3D reconstruction of dynamic scenes using times of arrival of photons recorded by single-photon detector arrays. One of the main challenges in 3D imaging using single-photon lidar in practical applications is the presence of strong ambient illumination which corrupts the data and can jeopardize the detection of peaks/surfaces in the signals. This background noise not only complicates the observation model classically used for 3D reconstruction but also the estimation procedure, which then requires iterative methods. In this work, we consider a new similarity measure for robust depth estimation, which allows us to use a simple observation model and a non-iterative estimation procedure while being robust to mis-specification of the background illumination model. This choice leads to a computationally attractive depth estimation procedure without significant degradation of the reconstruction performance. This new depth estimation procedure is coupled with a spatio-temporal model to capture the natural correlation between neighboring pixels and successive frames for dynamic scene analysis. The resulting online inference process is scalable and well suited for parallel implementation. The benefits of the proposed method are demonstrated through a series of experiments conducted with simulated and real single-photon lidar videos, allowing the analysis of dynamic scenes at 325 m observed under extreme ambient illumination conditions.
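
The sketch below illustrates the kind of non-iterative, per-pixel depth estimator described above: the depth is taken as the shift maximising a similarity measure between the photon-count histogram and the instrumental response. The zero-mean circular cross-correlation used here is a generic stand-in for the paper's similarity measure; it is insensitive to a constant background offset, which is the robustness property being discussed.

```python
# Sketch of a non-iterative, per-pixel depth estimator: the depth is the
# shift maximising a similarity measure between the photon histogram and the
# (mean-subtracted) impulse response. The zero-mean cross-correlation is a
# generic stand-in for the paper's measure; it ignores a constant background.
import numpy as np

def depth_from_histogram(hist, irf, bin_width=1.0):
    """hist: (T,) photon counts; irf: (T,) instrumental response."""
    h = hist - hist.mean()                    # remove the constant background level
    g = irf - irf.mean()
    # Circular cross-correlation over all candidate shifts (via FFT).
    corr = np.fft.irfft(np.fft.rfft(h) * np.conj(np.fft.rfft(g)), n=len(h))
    return corr.argmax() * bin_width

# Simulated pixel: surface at bin 40, strong uniform ambient background.
T = 200
irf = np.roll(np.exp(-0.5 * ((np.arange(T) - T // 2) / 3.0) ** 2), -T // 2)
hist = np.random.poisson(5.0 + 20 * np.roll(irf, 40))
print(depth_from_histogram(hist, irf))        # typically 40.0
```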

11 citations


Posted Content
TL;DR: In this paper, a task-optimized adaptive sampling framework is proposed to enable fast acquisition and processing of high-dimensional single-photon LiDAR data, where the iterative sampling strategy targets the most informative regions of a scene which are defined as those minimizing parameter uncertainties.
Abstract: 3D single-photon LiDAR imaging plays an important role in numerous applications. However, long acquisition times and significant data volumes present a challenge to LiDAR imaging. This paper proposes a task-optimized adaptive sampling framework that enables fast acquisition and processing of high-dimensional single-photon LiDAR data. Given a task of interest, the iterative sampling strategy targets the most informative regions of a scene which are defined as those minimizing parameter uncertainties. The task is performed by considering a Bayesian model that is carefully built to allow fast per-pixel computations while delivering parameter estimates with quantified uncertainties. The framework is demonstrated on multispectral 3D single-photon LiDAR imaging when considering object classification and/or target detection as tasks. It is also analysed for both sequential and parallel scanning modes for different detector array sizes. Results on simulated and real data show the benefit of the proposed optimized sampling strategy when compared to fixed sampling strategies.
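
The numpy sketch below illustrates the uncertainty-driven sampling loop in its simplest form: at each round, the next acquisitions are allocated to the pixels whose current parameter estimate is most uncertain. The conjugate Gamma-Poisson model for a per-pixel photon rate is an illustrative stand-in for the paper's task-specific Bayesian model.

```python
# Toy sketch of uncertainty-driven adaptive sampling: each round targets the
# pixels whose per-pixel photon rate (with a conjugate Gamma posterior) is
# most uncertain. The Gamma-Poisson model is an illustrative stand-in for the
# paper's task-specific Bayesian model.
import numpy as np

rng = np.random.default_rng(0)
H, W = 32, 32
true_rate = rng.uniform(0.5, 10.0, size=(H, W))      # unknown per-pixel photon rate

alpha = np.ones((H, W))                               # Gamma(alpha, beta) posterior
beta = np.ones((H, W))                                # (starts at the prior)

n_rounds, batch = 20, 64                              # pixels revisited per round
for _ in range(n_rounds):
    post_var = alpha / beta**2                        # posterior variance of the rate
    # Target the most informative (most uncertain) pixels.
    idx = np.argpartition(post_var.ravel(), -batch)[-batch:]
    rows, cols = np.unravel_index(idx, (H, W))
    counts = rng.poisson(true_rate[rows, cols])       # one more dwell per chosen pixel
    alpha[rows, cols] += counts                       # conjugate Gamma-Poisson update
    beta[rows, cols] += 1

post_mean = alpha / beta
print(np.abs(post_mean - true_rate).mean())           # overall estimation error
```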

4 citations


Proceedings ArticleDOI
23 Sep 2021
TL;DR: In this paper, the problem of joint surface detection and depth estimation from single-photon Lidar (SPL) data is formulated as a single inference problem and recast as a model selection/averaging problem to avoid the intractable integrals usually involved with variable marginalization.
Abstract: This paper addresses the problem of joint surface detection and depth estimation from single-photon Lidar (SPL) data. Traditional 3D ranging methods for SPL usually perform surface detection and range estimation sequentially to alleviate the computational burden of joint detection and estimation. Adopting a Bayesian formalism, the joint detection/estimation problem is formulated as a single inference problem. To avoid the intractable integrals usually involved with variable marginalization, we consider discrete variables and the resulting problem is recast as a model selection/averaging problem. We illustrate our method for a case where the expected signal-to-background ratio (e.g., the target reflectivity and ambient illumination level) is unknown, but the proposed framework can be adapted to more complex problems where the target depth can be obtained by combining several estimators. We also demonstrate the benefits of the proposed method in providing a conservative approach to uncertainty quantification of the calculated depth estimates, which can be used for real-time analysis. The benefits of the proposed method are illustrated using synthetic and real SPL data for targets at up to 8.6 km.
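
The sketch below illustrates, for a single pixel, the discrete marginalisation described above: the evidence of a "surface present" model is obtained by summing the likelihood over a grid of candidate depths and compared with a "background only" model, while the same per-depth likelihoods give a posterior over depth. Assuming a known signal amplitude and background level is a simplification; the paper treats the case where the signal-to-background ratio is unknown.

```python
# Sketch of joint detection/estimation by discrete marginalisation for one
# pixel. Known signal amplitude and background level are simplifying
# assumptions, not the setting of the paper.
import numpy as np
from scipy.special import logsumexp
from scipy.stats import poisson

T = 100
irf = np.roll(np.exp(-0.5 * ((np.arange(T) - T // 2) / 2.0) ** 2), -T // 2)
a, b = 15.0, 2.0                                   # assumed amplitude / background
y = np.random.poisson(b + a * np.roll(irf, 37))    # surface at bin 37

# Log-likelihood of every candidate depth d (rows) for the observed histogram.
rates = b + a * np.stack([np.roll(irf, d) for d in range(T)])
loglik_d = poisson.logpmf(y, rates).sum(axis=1)

log_ev_surface = logsumexp(loglik_d) - np.log(T)   # flat prior over the T depths
log_ev_bkg = poisson.logpmf(y, b).sum()            # background-only model

# Model selection: posterior odds of a surface being present (equal priors).
print("log Bayes factor:", log_ev_surface - log_ev_bkg)

# Model averaging: posterior over depth and its mean, given a surface.
post_d = np.exp(loglik_d - logsumexp(loglik_d))
print("posterior mean depth:", (post_d * np.arange(T)).sum())  # ~ 37
```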

2 citations


Proceedings ArticleDOI
24 Jan 2021
TL;DR: In this article, an Expectation-Propagation (EP) algorithm is proposed to address the problem of joint robust linear regression and sparse anomaly detection from data corrupted by Poisson noise.
Abstract: In this paper, we propose a new Expectation-Propagation (EP) algorithm to address the problem of joint robust linear regression and sparse anomaly detection from data corrupted by Poisson noise. Adopting an approximate Bayesian approach, an EP method is derived to approximate the posterior distribution of interest. The method accounts not only for additive anomalies, but also for destructive anomalies, i.e., anomalies that can lead to observations with amplitudes lower than the expected signals. Experiments conducted with both synthetic and real data illustrate the potential benefits of the proposed EP method in joint spectral unmixing and anomaly detection in the photon-starved regime of a Lidar system.
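
The sketch below only simulates the observation model addressed in this line of work: Poisson counts whose mean combines a linear mixture with a sparse anomaly term, including "destructive" anomalies that lower the expected counts. The EP posterior approximation itself is not reproduced; all dimensions and values are illustrative.

```python
# Sketch of the observation model: Poisson counts whose mean combines a
# linear mixture A @ x with a sparse anomaly term r, where negative entries
# of r model "destructive" anomalies. Only data generation is shown here.
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_coef = 200, 5

A = rng.uniform(0.0, 1.0, size=(n_obs, n_coef))        # known mixing matrix
x = rng.uniform(0.5, 2.0, size=n_coef)                  # regression coefficients

r = np.zeros(n_obs)                                     # sparse anomalies
anom = rng.choice(n_obs, size=8, replace=False)
r[anom] = rng.uniform(-2.0, 4.0, size=8)                # some additive, some destructive

rate = np.clip(A @ x + r, 0.0, None)                    # Poisson intensity must be >= 0
y = rng.poisson(rate)                                   # photon-starved observations

print(y[:10], "anomalous indices:", np.sort(anom))
```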

1 citation


Proceedings ArticleDOI
23 Sep 2021
TL;DR: In this article, a fast pixel-wise classification algorithm for multispectral single-photon LiDAR imaging is proposed, which allows the detection of histograms containing surfaces with specific spectral signatures (i.e., specific materials) and discarding those histograms without reflective surfaces.
Abstract: Multispectral 3D LiDAR imaging plays an important role in the remote sensing community as it can provide rich spectral and depth information from targets. This paper proposes a fast pixel-wise classification algorithm for multispectral single-photon LiDAR imaging. The algorithm allows the detection of histograms containing surfaces with specific spectral signatures (i.e., specific materials) and the rejection of histograms without reflective surfaces. The proposed Bayesian model is carefully built to allow the marginalization of latent variables, leading to a tractable formulation and fast estimation of the parameters of interest, together with their uncertainties. Results on simulated and real single-photon data illustrate the robustness and good performance of this approach.
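
A minimal numpy sketch of pixel-wise spectral classification is shown below: per-wavelength photon counts are scored against a few candidate material signatures plus a background-only class, and the most likely class is kept. Known signatures and background levels are simplifying assumptions; the paper additionally marginalises latent depth and intensity variables to obtain its fast, uncertainty-aware decisions.

```python
# Sketch of pixel-wise spectral classification against known signatures plus
# a "no reflective surface" class. Known signatures and background levels are
# simplifying assumptions; all values are illustrative.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(2)
n_wl = 4                                              # number of wavelengths
signatures = np.array([                               # expected counts per class/wavelength
    [2.0, 2.0, 2.0, 2.0],                             # class 0: background only
    [12.0, 6.0, 3.0, 2.5],                            # class 1: material A + background
    [3.0, 4.0, 9.0, 11.0],                            # class 2: material B + background
])

H, W = 16, 16
true_class = rng.integers(0, 3, size=(H, W))
counts = rng.poisson(signatures[true_class])          # (H, W, n_wl) multispectral counts

# Per-pixel log-likelihood of every class, then a MAP decision (flat prior).
loglik = poisson.logpmf(counts[..., None, :], signatures).sum(axis=-1)   # (H, W, 3)
labels = loglik.argmax(axis=-1)
print("pixel-wise accuracy:", (labels == true_class).mean())
```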

1 citation


Book ChapterDOI
01 Jan 2021
TL;DR: In this article, a family of approximate Bayesian methods for joint anomaly detection and linear regression in the presence of non-Gaussian noise is presented, which aim at approximating complex distributions by more tractable models to simplify the inference process.
Abstract: This paper presents a family of approximate Bayesian methods for joint anomaly detection and linear regression in the presence of non-Gaussian noise. Robust anomaly detection using non-convex sparsity-promoting regularization terms is generally challenging, in particular when additional uncertainty measures about the estimation process are needed, e.g., posterior probabilities of anomaly presence. The problem becomes even more challenging in the presence of non-Gaussian (e.g., Poisson distributed) noise, additional constraints on the regression coefficients (e.g., positivity), and when the anomalies present complex structures (e.g., structured sparsity). Uncertainty quantification is classically addressed using Bayesian methods. Specifically, Monte Carlo methods are the preferred tools to handle complex models. Unfortunately, such simulation methods suffer from a significant computational cost and are thus not scalable for fast inference in high-dimensional problems. In this paper, we thus propose fast alternatives based on Expectation-Propagation (EP) methods, which aim at approximating complex distributions by more tractable models to simplify the inference process. The main problem addressed in this paper is linear regression and (sparse) anomaly detection in the presence of noisy measurements. The aim of this paper is to demonstrate the potential benefits and assess the performance of such EP-based methods. The results obtained illustrate that approximate methods can provide satisfactory results with a reasonable computational cost. It is important to note that the proposed methods are sufficiently generic to be used in other applications involving condition monitoring.
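
The snippet below illustrates the kind of uncertainty measure such approximate methods expose: once the posterior of each anomaly coefficient is approximated by a Gaussian (as EP does), the posterior probability that the anomaly exceeds a detection threshold reduces to a Gaussian tail probability. The threshold and posterior parameters are illustrative values, not taken from the chapter.

```python
# Sketch of an anomaly-presence probability derived from a Gaussian
# approximate posterior (as produced by EP-style methods). The threshold and
# posterior parameters are illustrative.
import numpy as np
from scipy.stats import norm

post_mean = np.array([0.05, 1.80, -0.10, -2.40, 0.30])   # approximate posterior means
post_std = np.array([0.40, 0.50, 0.35, 0.60, 0.45])      # approximate posterior std devs
threshold = 1.0                                            # anomaly declared if |r_i| > 1

# P(|r_i| > threshold) under the Gaussian approximation.
p_anomaly = norm.sf(threshold, post_mean, post_std) + norm.cdf(-threshold, post_mean, post_std)
print(np.round(p_anomaly, 3))   # high for coefficients 1 and 3, low elsewhere
```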

1 citation


Journal ArticleDOI
TL;DR: In this article, the problem of activity estimation in passive gamma emission tomography (PGET) of spent nuclear fuel was formulated within a Bayesian framework as a linear inverse problem and prior distributions were assigned to the unknown model parameters.
Abstract: In this paper, we address the problem of activity estimation in passive gamma emission tomography (PGET) of spent nuclear fuel. Two different noise models are considered and compared, namely, the isotropic Gaussian and the Poisson noise models. The problem is formulated within a Bayesian framework as a linear inverse problem and prior distributions are assigned to the unknown model parameters. In particular, a Bernoulli-truncated Gaussian prior model is considered to promote sparse pin configurations. A Markov chain Monte Carlo (MCMC) method, based on a split and augmented Gibbs sampler, is then used to sample the posterior distribution of the unknown parameters. The proposed algorithm is first validated by simulations conducted using synthetic data, generated using the nominal models. We then consider more realistic data simulated using a bespoke simulator, whose forward model is non-linear and not available analytically. In that case, the linear models used are mis-specified and we analyse their robustness for activity estimation. The results demonstrate superior performance of the proposed approach in estimating the pin activities in different assembly patterns, in addition to being able to quantify their uncertainty measures, in comparison with existing methods.
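
The sketch below only illustrates the prior and the two observation models compared in the paper: sparse, non-negative pin activities drawn from a Bernoulli-truncated-Gaussian prior, and counts generated from a linear forward model under either Gaussian or Poisson noise. The split-and-augmented Gibbs sampler itself is not reproduced, and all dimensions and values are illustrative.

```python
# Sketch of the sparse prior and the two noise models for the linear forward
# model y = A @ x. The MCMC sampler is not reproduced; values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n_pins, n_meas = 50, 400

# Bernoulli-truncated-Gaussian draw: each pin active with probability 0.6,
# active pins get a positive (truncated at zero) Gaussian activity.
active = rng.random(n_pins) < 0.6
x = np.where(active, np.abs(rng.normal(1.0, 0.3, n_pins)), 0.0)

A = rng.uniform(0.0, 0.05, size=(n_meas, n_pins))       # known system (forward) matrix
mu = A @ x                                               # noiseless detector response

y_gauss = mu + rng.normal(0.0, 0.05, n_meas)             # isotropic Gaussian noise model
t_acq = 100.0                                            # scaling to realistic count levels
y_poisson = rng.poisson(t_acq * mu)                      # Poisson (counting) noise model

print("active pins:", active.sum(), "| mean response:", mu.mean().round(3))
```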

1 citation


Posted Content
TL;DR: In this paper, a hierarchical Bayesian algorithm for the robust reconstruction of multispectral single-photon Lidar data in high-noise environments is presented. The algorithm exploits multi-scale information to provide robust depth and reflectivity estimates together with their uncertainties to help with decision making.
Abstract: 3D Lidar imaging can be a challenging modality when using multiple wavelengths, or when imaging in high noise environments (e.g., imaging through obscurants). This paper presents a hierarchical Bayesian algorithm for the robust reconstruction of multispectral single-photon Lidar data in such environments. The algorithm exploits multi-scale information to provide robust depth and reflectivity estimates together with their uncertainties to help with decision making. The proposed weight-based strategy allows the use of available guide information that can be obtained by using state-of-the-art learning based algorithms. The proposed Bayesian model and its estimation algorithm are validated on both synthetic and real images showing competitive results regarding the quality of the inferences and the computational complexity when compared to the state-of-the-art algorithms.

1 citation


Posted Content
TL;DR: In this paper, a spike-and-slab abundance prior is adopted to promote sparse mixtures and an Ising prior model is used to capture spatial correlation of the mixture support across pixels.
Abstract: This paper presents a novel Bayesian approach for hyperspectral image unmixing. The observed pixels are modeled by a linear combination of material signatures weighted by their corresponding abundances. A spike-and-slab abundance prior is adopted to promote sparse mixtures and an Ising prior model is used to capture spatial correlation of the mixture support across pixels. We approximate the posterior distribution of the abundances using the expectation-propagation (EP) method. We show that it can significantly reduce the computational complexity of the unmixing stage while providing uncertainty measures, compared to the expensive Monte Carlo strategies traditionally considered for uncertainty quantification. Moreover, many variational parameters within each EP factor can be updated in parallel, which enables the mapping of efficient algorithmic architectures onto graphics processing units (GPUs). Under the same approximate Bayesian framework, we then extend the proposed algorithm to semi-supervised unmixing, whereby the abundances are viewed as latent variables and the expectation-maximization (EM) algorithm is used to refine the endmember matrix. Experimental results on synthetic data and real hyperspectral data illustrate the benefits of the proposed framework over state-of-the-art linear unmixing methods.
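
The sketch below generates data from the linear mixing model with a spike-and-slab abundance prior described above. The Ising spatial coupling of the support across pixels is omitted for brevity, and all dimensions and values are illustrative.

```python
# Sketch of the linear mixing model with a spike-and-slab abundance prior.
# The Ising spatial coupling of the support is omitted; values are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n_bands, n_endmembers, n_pixels = 64, 5, 1000

M = np.abs(rng.normal(0.5, 0.2, size=(n_bands, n_endmembers)))   # endmember signatures

support = rng.random((n_endmembers, n_pixels)) < 0.3             # spike-and-slab support
slab = np.abs(rng.normal(0.5, 0.3, size=(n_endmembers, n_pixels)))
A = np.where(support, slab, 0.0)
# Normalise abundances to sum to one (pixels with empty support stay empty).
A = A / np.clip(A.sum(axis=0, keepdims=True), 1e-9, None)

Y = M @ A + rng.normal(0.0, 0.01, size=(n_bands, n_pixels))      # observed pixels
print(Y.shape, "average materials per pixel:", support.sum(0).mean())
```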

Posted Content
TL;DR: In this article, patch-based prior distributions are used to approximate the posterior distributions using products of multivariate Gaussian densities, imposing structural constraints on the covariance matrices of these densities allows for greater scalability and distributed computation.
Abstract: This paper presents a new Expectation Propagation (EP) framework for image restoration using patch-based prior distributions. While Monte Carlo techniques are classically used to sample from intractable posterior distributions, they can suffer from scalability issues in high-dimensional inference problems such as image restoration. To address this issue, EP is used here to approximate the posterior distributions using products of multivariate Gaussian densities. Moreover, imposing structural constraints on the covariance matrices of these densities allows for greater scalability and distributed computation. While the method is naturally suited to handle additive Gaussian observation noise, it can also be extended to non-Gaussian noise. Experiments conducted for denoising, inpainting and deconvolution problems with Gaussian and Poisson noise illustrate the potential benefits of such flexible approximate Bayesian method for uncertainty quantification in imaging problems, at a reduced computational cost compared to sampling techniques.
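
The snippet below illustrates the Gaussian building block underlying such EP schemes: for a single noisy patch with a multivariate Gaussian patch prior and additive Gaussian noise, the posterior over the clean patch is itself Gaussian and available in closed form. The EP message passing and the structured-covariance constraints that make this scale to whole images are not reproduced; the prior covariance below is an illustrative choice.

```python
# Sketch of the Gaussian building block of patch-based approximate inference:
# Gaussian patch prior + Gaussian noise gives a closed-form Gaussian posterior.
# The prior covariance and noise level below are illustrative.
import numpy as np

rng = np.random.default_rng(5)
p = 8                                   # 8x8 patches -> 64-dimensional vectors
d = p * p

# Illustrative stationary Gaussian patch prior (separable exponential kernel).
idx = np.arange(d)
dist = np.abs(idx[:, None] % p - idx[None, :] % p) + np.abs(idx[:, None] // p - idx[None, :] // p)
prior_mean = np.full(d, 0.5)
prior_cov = 0.1 * np.exp(-dist / 2.0)

sigma2 = 0.05                           # noise variance of the degradation model
clean = rng.multivariate_normal(prior_mean, prior_cov)
noisy = clean + rng.normal(0.0, np.sqrt(sigma2), d)

# Gaussian conjugacy: posterior covariance and mean of the clean patch.
prec = np.linalg.inv(prior_cov) + np.eye(d) / sigma2
post_cov = np.linalg.inv(prec)
post_mean = post_cov @ (np.linalg.solve(prior_cov, prior_mean) + noisy / sigma2)

print("noisy MSE:", np.mean((noisy - clean) ** 2).round(4),
      "| posterior-mean MSE:", np.mean((post_mean - clean) ** 2).round(4))
```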

Proceedings ArticleDOI
24 Jan 2021
TL;DR: In this article, a Bayesian approach is developed to perform the estimation of model parameters in a reduced computational time, which is achieved by transforming an EM-based algorithm recently proposed into a stochastic EM algorithm.
Abstract: This paper addresses the problem of estimating spectral and range profiles from single-photon Lidar waveforms associated with single surfaces in the presence of an unknown background. A single Lidar waveform per pixel is considered, whereby a single detector is used to acquire information simultaneously at multiple wavelengths. A novel Bayesian approach is developed to perform the estimation of the model parameters in a reduced computational time. This is achieved by transforming a recently proposed EM-based algorithm into a stochastic EM algorithm, which is computationally more attractive. The reconstruction performance and computational complexity of our approach are assessed through a series of experiments using synthetic data under different observation scenarios. The obtained results demonstrate a significant speed-up compared to the state-of-the-art method, without significant degradation of the estimation quality.
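
The sketch below illustrates the EM-to-stochastic-EM switch on a deliberately simplified problem: photon arrival times in one pixel follow a mixture of a Gaussian surface return and a uniform ambient background, and the exact E-step responsibilities are replaced by sampled photon labels before re-estimating the depth and signal fraction. This is a generic illustration of stochastic EM, not the paper's multispectral model.

```python
# Generic stochastic EM sketch: signal/background labels are sampled instead
# of computing exact responsibilities, then the depth and signal fraction are
# re-estimated from the sampled labels. Not the paper's multispectral model.
import numpy as np

rng = np.random.default_rng(6)
T, sigma = 100.0, 1.5                        # time window and known pulse width
true_depth, true_frac = 37.0, 0.4            # unknown depth / signal fraction

n = 500
is_sig = rng.random(n) < true_frac
times = np.where(is_sig, rng.normal(true_depth, sigma, n), rng.uniform(0, T, n))

# Coarse initialisation at the mode of a binned histogram.
counts, edges = np.histogram(times, bins=50, range=(0, T))
depth, frac = edges[counts.argmax()] + (T / 50) / 2, 0.5

for _ in range(100):
    # Stochastic E-step: sample a signal/background label per photon.
    p_sig = frac * np.exp(-0.5 * ((times - depth) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    p_bkg = (1 - frac) / T
    labels = rng.random(n) < p_sig / (p_sig + p_bkg)
    # M-step: re-estimate depth and signal fraction from the sampled labels.
    if labels.any():
        depth = times[labels].mean()
    frac = labels.mean()

print(round(depth, 2), round(frac, 2))        # typically near 37 and 0.4
```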

Posted Content
TL;DR: In this article, a knowledge graph driven approach for autonomic network management in software defined networks (SDNs), termed as SeaNet, is proposed, which is reprogrammed based on Mininet (a SDN emulator).
Abstract: Automatic network management driven by Artificial Intelligence technologies has been widely discussed for decades. However, current reports mainly focus on theoretical proposals and architecture designs; practical implementations on real-life networks are yet to appear. This paper presents our effort towards the implementation of a knowledge-graph-driven approach for autonomic network management in software defined networks (SDNs), termed SeaNet. Driven by the ToCo ontology, SeaNet is built on Mininet (an SDN emulator) and consists of three core components: a knowledge graph generator, a SPARQL engine, and a network management API. The knowledge graph generator represents the knowledge involved in telecommunication network management tasks as a formally represented, ontology-driven model. Expert experience and network management rules can be formalized in the knowledge graph and automatically inferred over by the SPARQL engine, while the network management API packages technology-specific details and exposes technology-independent interfaces to users. Experiments are carried out to evaluate the proposed work by comparing it with Ryu, a commercial SDN controller implemented in the same language, Python. The evaluation results show that SeaNet is considerably faster than Ryu in most circumstances and that the SeaNet code is significantly more compact. Benefiting from RDF reasoning, SeaNet achieves O(1) time complexity on different scales of the knowledge graph, whereas a traditional database achieves O(n log n) at best. With the developed network management API, SeaNet enables researchers to develop semantic-intelligent applications on their own SDNs.
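
The snippet below sketches the knowledge-graph plus SPARQL idea in isolation (it requires the rdflib package): topology facts are stored as RDF triples and a management question becomes a SPARQL query. The namespace and property names are made up for the illustration and are not the actual ToCo ontology terms or the SeaNet API.

```python
# Minimal sketch of the knowledge-graph + SPARQL idea: topology facts as RDF
# triples, management questions as SPARQL queries. The namespace and property
# names are invented for this illustration (not the ToCo ontology terms).
from rdflib import Graph, Namespace, Literal

NET = Namespace("http://example.org/net#")
g = Graph()

# A toy topology: two hosts attached to switch s1, one to switch s2.
g.add((NET.h1, NET.attachedTo, NET.s1))
g.add((NET.h2, NET.attachedTo, NET.s1))
g.add((NET.h3, NET.attachedTo, NET.s2))
g.add((NET.s1, NET.hasController, NET.c0))
g.add((NET.h1, NET.ipAddress, Literal("10.0.0.1")))
g.add((NET.h2, NET.ipAddress, Literal("10.0.0.2")))

# Management question expressed as SPARQL: which hosts sit behind switch s1?
query = """
PREFIX net: <http://example.org/net#>
SELECT ?host ?ip WHERE {
    ?host net:attachedTo net:s1 .
    OPTIONAL { ?host net:ipAddress ?ip }
}
"""
for row in g.query(query):
    print(row.host, row.ip)
```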

Proceedings ArticleDOI
09 May 2021
TL;DR: In this article, a 2.5D non-line-of-sight reconstruction of large-scale scenes using time-correlated single-photon detection and pulsed illumination along an arc at a small opening where a vertical wall edge meets a floor plane is presented.
Abstract: We demonstrate 2.5-dimensional, 180° field-of-view non-line-of-sight reconstructions of large-scale scenes using time-correlated single-photon detection and pulsed illumination along an arc at a small opening where a vertical wall edge meets a floor plane.