
Showing papers by "Stephane Carlier published in 2016"


Journal ArticleDOI
TL;DR: High segmentation accuracy and consistency substantiate the method's ability to reliably segment the lumen across pullbacks in the presence of vulnerability cues and necrotic pools, with a deterministic, finite time complexity.
Abstract: Intravascular imaging using ultrasound or optical coherence tomography (OCT) is predominantly used to supplement clinical information in interventional cardiology. OCT provides high-resolution images for detailed investigation of atherosclerosis-induced thickening of the lumen wall, which results in arterial blockage and triggers acute coronary events. However, the stochastic uncertainty of speckles limits effective visual investigation over large volumes of pullback data, and clinicians are challenged by their inability to investigate subtle variations in the lumen topology associated with plaque vulnerability and the onset of necrosis. This paper presents a lumen segmentation method using an OCT imaging physics-based graph representation of signals together with random walks image segmentation. The edge weights in the graph are assigned by incorporating OCT signal attenuation physics models. Optical backscattering maxima are tracked along each A-scan of the OCT, subsequently refined using global gray-level statistics, and used to initialize seeds for the random walks segmentation. Accuracy of lumen versus tunica segmentation has been measured on 15 in vitro and 6 in vivo pullbacks, each with 150–200 frames, using 1) Cohen's kappa coefficient $(0.9786 \pm 0.0061)$ measured with respect to a cardiologist's annotation and 2) divergence of the histograms of the segments computed with Kullback–Leibler $(5.17 \pm 2.39)$ and Bhattacharyya measures $(0.56 \pm 0.28)$ . The high segmentation accuracy and consistency substantiate the method's ability to reliably segment the lumen across pullbacks in the presence of vulnerability cues and necrotic pools, with a deterministic, finite time complexity. More generally, this paper illustrates the development of methods and a framework for tissue classification and segmentation that incorporate cues from the physics of tissue–energy interaction in imaging.
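The random-walks step described above can be sketched with scikit-image's `random_walker` on a synthetic frame. This is a hedged illustration only: the paper derives edge weights from an OCT attenuation physics model and seeds from per-A-scan backscattering maxima, whereas the sketch below uses the library's default gradient-based `beta` weighting and seeds placed from known geometry.

```python
import numpy as np
from skimage.segmentation import random_walker

# Synthetic stand-in for one Cartesian OCT frame: a bright ring
# (backscattering tissue band) around a dark lumen, with
# multiplicative gamma speckle.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[-64:64, -64:64]
radius = np.hypot(yy, xx)
frame = np.where((radius > 30) & (radius < 45), 0.8, 0.1)
frame = frame * rng.gamma(shape=4.0, scale=0.25, size=frame.shape)

# Seed labels: 1 = lumen interior, 2 = tissue; 0 = unlabeled pixels
# that the random walker must assign.
seeds = np.zeros(frame.shape, dtype=int)
seeds[radius < 15] = 1
seeds[(radius > 33) & (radius < 42)] = 2

# beta scales the gradient-based edge weights; the paper instead
# assigns weights from an OCT signal-attenuation physics model.
labels = random_walker(frame, seeds, beta=130, mode='bf')
print(labels.shape, np.unique(labels))
```

Every unlabeled pixel is assigned the label whose seeds a random walker starting there is most likely to reach first, which is what makes the method robust to speckle compared with per-pixel thresholding.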

31 citations


Journal ArticleDOI
TL;DR: A novel method for domain adaptation (DA) is introduced through an error-correcting hierarchical transfer relaxation scheme with domain alignment, feature normalization, and leaf posterior reweighting to correct for the distribution shift between the domains.
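Of the ingredients listed above, feature normalization is the simplest to illustrate. The sketch below shows a generic per-feature z-score alignment of source and target domains, a common baseline for reducing distribution shift; it is an assumption-laden stand-in, not the paper's error-correcting hierarchical transfer relaxation scheme, and the function name `zscore_align` is hypothetical.

```python
import numpy as np

def zscore_align(source, target, eps=1e-8):
    """Standardize each domain to zero mean / unit variance per feature,
    so that a classifier trained on source features sees target features
    on a comparable scale. A crude domain-alignment baseline."""
    def z(x):
        return (x - x.mean(axis=0)) / (x.std(axis=0) + eps)
    return z(source), z(target)

rng = np.random.default_rng(1)
src = rng.normal(loc=0.0, scale=1.0, size=(200, 8))
tgt = rng.normal(loc=3.0, scale=2.0, size=(200, 8))  # shifted, rescaled domain
src_n, tgt_n = zscore_align(src, tgt)
print(np.abs(src_n.mean(axis=0)).max(), np.abs(tgt_n.mean(axis=0)).max())
```

After alignment both domains have matched first and second moments per feature; higher-order distribution shift is what the more elaborate reweighting schemes target.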

29 citations


Proceedings ArticleDOI
01 Apr 2016
TL;DR: A first-of-its-kind approach to discriminating highly heterogeneous tissue in the absence of a well-studied distribution: the multiscale statistical distribution model of the data is learned using the proposed distribution-preserving (DP) autoencoder (AE)-based neural network (NN).
Abstract: Interventional cardiologists use intravascular imaging techniques like optical coherence tomography (OCT) as an adjunct to angiography for detailed diagnosis of atherosclerosis. Each tissue type is associated with a characteristic speckle intensity distribution, which forms the basis for tissue characterization (TC). Classical approaches follow statistical machine learning using a priori assumed speckle models, and are challenged by their inability to discriminate high tissue heterogeneity. As a first-of-its-kind approach, we solve this problem in the absence of a well-studied distribution by learning the multiscale statistical distribution model of the data using our proposed distribution-preserving (DP) autoencoder (AE)-based neural network (NN). The learning rule introduces a scale importance parameter associated with error backpropagation. We evaluated the performance of DPAE against prior art and plain AEs (with L2-norm and cross-entropy cost functions), obtaining LogLoss values of 0.16, 0.28, 0.22, and 0.53, respectively; DPAE achieved 93.6% average classification accuracy, and its predictions were judged to be clinically acceptable.
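One way to make the "distribution preserving" idea concrete is a reconstruction penalty on intensity histograms rather than on individual pixels: a decoder can score well under an L2 loss while flattening the speckle statistics that carry the tissue signature. The `histogram_kl` helper below is a hypothetical single-scale sketch of such a penalty, not the paper's multiscale DP loss with scale-importance weighting.

```python
import numpy as np

def histogram_kl(x, x_hat, bins=32, eps=1e-8):
    """KL divergence between the intensity histograms of an input patch
    and its reconstruction, over a fixed [0, 1] range. Small when the
    reconstruction preserves the speckle intensity distribution, large
    when it distorts it (even if per-pixel error is low)."""
    p, _ = np.histogram(x, bins=bins, range=(0.0, 1.0))
    q, _ = np.histogram(x_hat, bins=bins, range=(0.0, 1.0))
    p = p / p.sum() + eps
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(2)
x = rng.gamma(2.0, 0.15, size=(64, 64)).clip(0.0, 1.0)  # speckle-like patch
same = histogram_kl(x, x)                      # identical statistics
flat = histogram_kl(x, np.full_like(x, x.mean()))  # statistics destroyed
print(same, flat)
```

A constant-valued reconstruction sits near the input's mean, so its L2 error is modest, yet its histogram collapses to a single bin and the KL term becomes large; penalties of this flavor push the decoder to keep the distribution intact.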

11 citations