
Showing papers on "Iterative reconstruction published in 1989"


Journal ArticleDOI
TL;DR: The simplified description of the phenomena involved in MR is intended to be comprehensive and does not require foreknowledge of classical physics, quantum mechanics, fluency with mathematical formulae or an understanding of image reconstruction.
Abstract: Magnetic Resonance (MR), which has no known biological hazard, is capable of producing high resolution thin tomographic images in any plane and blocks of 3-dimensional information. It can be used to study blood flow and to gain information about the composition of important materials seen and quantified on dimensionally accurate images. The MR image is a thin tomographic slice or a true three dimensional block of data which can be reconstructed in any desired way, rather than a shadowgram of all the structures in the beam. It is the only imaging technique which can acquire data in a 3-dimensional format. CT images can be reconstructed to form a pseudo 3-D image or a hologram, but the flexibility conferred by acquiring the data as a true 3-D block gives many advantages. The spatial resolution of MR images is theoretically that of low-powered microscopy; the practical limits with the present generation of equipment are voxel sizes of one third by one third by two millimetres. The term Magnetic Resonance Imaging (MRI) is used commonly, particularly in the USA, avoiding association with the term "nuclear" and emphasizing the imaging potential of the technique. The terms Nuclear Magnetic Resonance (NMR) or Magnetic Resonance (MR) more correctly describe the most powerful diagnostic instrument yet devised. The simplified description of the phenomena involved in MR which follows is intended to be comprehensive and does not require foreknowledge of classical physics, quantum mechanics, fluency with mathematical formulae or an understanding of image reconstruction. There are many explanations of MR, some omitting the more difficult concepts. An accurate, comprehensive description is found in the textbook on MR by Gadian, Nuclear Magnetic Resonance and its Applications to Living Systems (Oxford University Press, 1982).

906 citations


Journal ArticleDOI
TL;DR: An adaptive technique for measuring and correcting the effects of patient motion during magnetic resonance image acquisition was developed and tested and shows promise for addressing the problem of respiratory motion in thoracoabdominal imaging.
Abstract: An adaptive technique for measuring and correcting the effects of patient motion during magnetic resonance image acquisition was developed and tested. A set of algorithms that can reverse the effects of object displacements and phase shifts was used. These algorithms essentially transfer the frame of reference of the image reconstruction from the static frame of the imager couch to the moving "visceral frame." An accurate record of tissue motion during image acquisition is required. To achieve this, the authors used specially encoded "navigator" echoes that are interleaved with the imaging sequence. Postprocessing of the navigator echo data provides a highly detailed record of the displacements and phase shifts that occur during imaging. Phantom studies demonstrated that the technique can directly correct image degradation caused by motion. In contrast to conventional artifact reduction techniques, such as ordered phase encoding and gradient moment nulling, this new method has a unique capacity to reduce motion unsharpness. Preliminary in vivo studies have demonstrated that the technique can markedly improve images degraded by voluntary motion and shows promise for addressing the problem of respiratory motion in thoracoabdominal imaging.
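The correction step described above can be illustrated in one dimension: a rigid displacement of the object multiplies its k-space data by a linear phase ramp, so dividing that ramp out restores the static-frame image. A minimal numpy sketch (not the authors' implementation; a known shift stands in for what a navigator echo would measure):

```python
import numpy as np

n = 64
x = np.zeros(n)
x[20:30] = 1.0                            # simple 1-D "object"
shift = 5                                 # displacement a navigator echo would report
k = np.fft.fftfreq(n)                     # normalized spatial frequencies
data = np.fft.fft(np.roll(x, shift))      # k-space data of the displaced object
corrected = data * np.exp(2j * np.pi * k * shift)   # undo the linear phase ramp
x_rec = np.fft.ifft(corrected).real
err = np.max(np.abs(x_rec - x))           # residual after correction
```

For an integer shift the correction is exact to machine precision; in practice the measured displacements and phase shifts vary during acquisition and are applied view by view.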

839 citations


Journal ArticleDOI
Paul E. Kinahan1, J.G. Rogers1
TL;DR: In this paper, the authors present an algorithm for three-dimensional image reconstruction that uses all gamma-ray coincidence events detected by a PET (positron emission tomography) volume imaging scanner.
Abstract: The authors present the results of testing an algorithm for three-dimensional image reconstruction that uses all gamma-ray coincidence events detected by a PET (positron emission tomography) volume imaging scanner. By using two iterations of an analytic filter-backprojection method, the algorithm is not constrained by the requirement of a spatially invariant detector point spread function, which limits normal analytic techniques. Removing this constraint allows the incorporation of all detected events, regardless of orientation, which improves the statistical quality of the final reconstructed image.

788 citations


Journal ArticleDOI
TL;DR: It is shown that readily obtained prior knowledge can be used to obtain good-quality imagery with reduced data and the effect of noise on the reconstruction process is considered.
Abstract: We consider the problem of reconstructing remotely obtained images from image-plane detector arrays. Although the individual detectors may be larger than the blur spot of the imaging optics, high-resolution reconstructions can be obtained by scanning or rotating the image with respect to the detector. As an alternative to matrix inversion or least-squares estimation [Appl. Opt. 26, 3615 (1987)], the method of convex projections is proposed. We show that readily obtained prior knowledge can be used to obtain good-quality imagery with reduced data. The effect of noise on the reconstruction process is considered.
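As a rough illustration of the method of convex projections (not the authors' detector model), one can alternate projections onto two convex sets: consistency with measured low-frequency Fourier data, and nonnegativity as readily obtained prior knowledge. The sizes and the measured band below are made up for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
x_true = np.clip(rng.normal(size=n), 0.0, None)   # nonnegative "scene"
X_true = np.fft.fft(x_true)
known = np.abs(np.fft.fftfreq(n)) < 0.25          # only low frequencies measured

x = np.zeros(n)
for _ in range(200):
    X = np.fft.fft(x)
    X[known] = X_true[known]          # project onto the data-consistency set
    x = np.fft.ifft(X).real
    x = np.clip(x, 0.0, None)         # project onto the nonnegativity set

recon_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

Both sets are convex and contain the true signal, so the alternating projections converge toward their intersection; the prior knowledge recovers part of the unmeasured spectrum.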

719 citations


Journal ArticleDOI
TL;DR: The problem of image reconstruction and restoration is first formulated, some of the current regularization approaches used to solve it are described, and a Bayesian interpretation of the regularization techniques is given.
Abstract: Developments in the theory of image reconstruction and restoration over the past 20 or 30 years are outlined. Particular attention is paid to common estimation structures and to practical problems not properly solved yet. The problem of image reconstruction and restoration is first formulated. Some of the current regularization approaches used to solve the problem are then described. The concepts of a priori information and compound criterion are introduced. A Bayesian interpretation of the regularization techniques is given which clarifies the role of the tuning parameters and indicates how they could be estimated. The practical aspects of computing the solution, first when the hyperparameters are known and second when they must be estimated, are then considered. Conclusions are drawn, and points that still need to be investigated are outlined.

716 citations


Journal ArticleDOI
TL;DR: In this article, the authors analyze finite impulse response (FIR) filter banks in both the z-transform and time domains, showing the alternatives between designs in the two domains and giving relations between previously known systems.
Abstract: Perfect reconstruction finite impulse-response (FIR) filter banks are analyzed both in the z-transform and time domains, showing the alternatives between designs in the two domains. Various classes of perfect reconstruction schemes are indicated, and relations between previously known systems are given. Windowed modulated filter banks with low computational complexity and perfect reconstruction are shown. New factorizations of polyphase filter matrices, leading in particular to linear-phase filters, are given. The computational complexity and the architecture of the new structures are indicated.
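The perfect reconstruction property is easy to demonstrate with the simplest two-channel FIR bank, the Haar pair (a toy stand-in for the windowed modulated banks discussed above): analysis and synthesis cancel exactly at critical sampling.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=32)                  # even-length test signal

# Analysis: orthonormal sum/difference of sample pairs, critically sampled
low = (x[0::2] + x[1::2]) / np.sqrt(2)
high = (x[0::2] - x[1::2]) / np.sqrt(2)

# Synthesis: invert the 2x2 butterfly pair by pair
y = np.empty_like(x)
y[0::2] = (low + high) / np.sqrt(2)
y[1::2] = (low - high) / np.sqrt(2)

pr_error = np.max(np.abs(y - x))         # perfect reconstruction residual
```

Longer FIR banks achieve the same cancellation through the polyphase-matrix factorizations the paper describes; the residual here is at machine precision.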

384 citations


Journal ArticleDOI
TL;DR: An approach is described that integrates the processes of feature matching, contour detection, and surface interpolation to determine the three-dimensional distance, or depth, of objects from a stereo pair of images.
Abstract: An approach is described that integrates the processes of feature matching, contour detection, and surface interpolation to determine the three-dimensional distance, or depth, of objects from a stereo pair of images. Integration is necessary to ensure that the detected surfaces are smooth. Surface interpolation takes into account detected occluding and ridge contours in the scene; interpolation is performed within regions enclosed by these contours. Planar and quadratic patches are used as local models of the surface. Occluded regions in the image are identified, and are not used for matching and interpolation. A coarse-to-fine algorithm is presented that generates a multiresolution hierarchy of surface maps, one at each level of resolution. Experimental results are given for a variety of stereo images.

366 citations


Journal ArticleDOI
TL;DR: Computer simulation studies are presented which demonstrate significantly improved reconstructed images achieved by an ART algorithm as compared to IRR methods.
Abstract: The author presents an algebraic reconstruction technique (ART) as a viable alternative in computerized tomography (CT) from limited views. Recently, algorithms of iterative reconstruction-reprojection (IRR) based on the method of convolution-backprojection have been proposed for application in limited-view CT. Reprojection was used in an iterative fashion alternating with backprojection as a means of estimating projection values within the sector of missing views. In algebraic methods of reconstruction for CT, only those projections corresponding to known data are required. Reprojection along missing views would merely serve to introduce redundant equations. Computer simulation studies are presented which demonstrate significantly improved reconstructed images achieved by an ART algorithm as compared to IRR methods.
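A minimal sketch of the ART idea on a toy dense system (not a CT geometry): Kaczmarz sweeps use only the equations corresponding to measured projections, each step projecting the estimate onto the hyperplane of one equation; for consistent data the iterates converge to the true image.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(40, 20))    # system matrix (measured rays x pixels), illustrative
x_true = rng.normal(size=20)
b = A @ x_true                   # consistent, noise-free projection data

x = np.zeros(20)
for _ in range(200):             # full sweeps over the measured rays only
    for i in range(A.shape[0]):
        a = A[i]
        x += (b[i] - a @ x) / (a @ a) * a   # Kaczmarz row projection

art_err = np.linalg.norm(x - x_true)
```

No reprojected (synthetic) rows are needed: the known equations alone determine the solution, which is the paper's argument against IRR-style reprojection.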

290 citations


Journal ArticleDOI
TL;DR: A multiresolution image representation is presented in which iterative morphological filters of many scales but identical shape serve as basis functions and is well suited for VLSI implementation.

256 citations


Journal ArticleDOI
TL;DR: The Wiener solution of a multichannel restoration scheme uses both the within-channel and between-channel correlation; hence, the restored result is a better estimate than that produced by independent channel restoration.
Abstract: The Wiener solution of a multichannel restoration scheme is presented. Using matrix diagonalization and block-Toeplitz to block-circulant approximation, the inversion of the multichannel, linear space-invariant imaging system becomes feasible by utilizing a fast iterative matrix inversion procedure. The restoration uses both the within-channel (spatial) and between-channel (spectral) correlation; hence, the restored result is a better estimate than that produced by independent channel restoration. Simulations are also presented.
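The within-channel half of the idea can be sketched as single-channel frequency-domain Wiener deconvolution (the multichannel version stacks the channel spectra into block matrices). All signals below are invented, and an oracle signal power spectrum is assumed for simplicity:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 128
x = np.zeros(n)
x[40:60] = 1.0                                   # piecewise-constant scene
h = np.zeros(n)
h[:5] = 0.2                                      # 5-tap circular blur kernel
H = np.fft.fft(h)
noise_sd = 0.01
y = np.fft.ifft(np.fft.fft(x) * H).real + rng.normal(scale=noise_sd, size=n)

Pxx = np.abs(np.fft.fft(x)) ** 2                 # (oracle) signal power spectrum
Pnn = n * noise_sd ** 2                          # white-noise power spectrum
W = np.conj(H) * Pxx / (np.abs(H) ** 2 * Pxx + Pnn)   # Wiener filter
x_hat = np.fft.ifft(W * np.fft.fft(y)).real

gain = np.linalg.norm(y - x) / np.linalg.norm(x_hat - x)   # error reduction factor
```

The circulant (FFT) diagonalization used here is exactly the block-Toeplitz to block-circulant approximation that makes the multichannel inversion feasible.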

235 citations


Journal ArticleDOI
TL;DR: It is concluded that the deterministic algorithm (graduated nonconvexity) outstrips stochastic (simulated annealing) algorithms both in computational efficiency and in problem-solving power.
Abstract: Piecewise continuous reconstruction of real-valued data can be formulated in terms of nonconvex optimization problems. Both stochastic and deterministic algorithms have been devised to solve them. The simplest such reconstruction process is the weak string. Exact solutions can be obtained for it and are used to determine the success or failure of the algorithms under precisely controlled conditions. It is concluded that the deterministic algorithm (graduated nonconvexity) outstrips stochastic (simulated annealing) algorithms both in computational efficiency and in problem-solving power.

Proceedings ArticleDOI
01 Jul 1989
TL;DR: The goals of the system are to produce high quality antialiased images at a modest average sample rate, and to refine the image progressively so that the image is available in a usable form early and is refined gradually toward the final result.
Abstract: We describe an antialiasing system for ray tracing based on adaptive progressive refinement. The goals of the system are to produce high quality antialiased images at a modest average sample rate, and to refine the image progressively so that the image is available in a usable form early and is refined gradually toward the final result. The method proceeds by adaptive stochastic sampling of the image plane, evaluation of the samples by ray tracing, and image reconstruction from the samples. Adaptive control of the sample generation process is driven by three basic goals: coverage of the image, location of features, and confidence in the values at a distinguished "pixel level" of resolution. A three-stage process of interpolation, filtering, and resampling is used to reconstruct a regular grid of display pixels. This reconstruction can be either batch or incremental.

Journal ArticleDOI
TL;DR: It is shown that for magnetic resonance (MR) images with signal-to-noise ratio (SNR) less than 2 it is advantageous to use a phase-corrected real reconstruction, rather than the more usual magnitude reconstruction.
Abstract: We show that for magnetic resonance (MR) images with signal-to-noise ratio (SNR) less than 2 it is advantageous to use a phase-corrected real reconstruction, rather than the more usual magnitude reconstruction. We discuss the results of the phase correction algorithm used to experimentally verify the result. We supplement the existing literature by presenting closed form expressions (in an MR context) for the probability distribution and first moments of the signal resulting from a magnitude reconstruction.
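The low-SNR advantage is easy to reproduce numerically: the magnitude of a complex Gaussian measurement follows a Rician distribution whose mean is biased upward, while the phase-corrected real channel is unbiased. A quick Monte Carlo check at SNR = 1 (values illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(4)
A = 1.0                                    # true signal amplitude
sigma = 1.0                                # noise level, so SNR = A/sigma = 1 (< 2)
n = 200_000
re = A + rng.normal(scale=sigma, size=n)   # phase-corrected real reconstruction
im = rng.normal(scale=sigma, size=n)       # orthogonal noise channel
mag = np.hypot(re, im)                     # magnitude reconstruction (Rician)

bias_real = re.mean() - A                  # essentially zero
bias_mag = mag.mean() - A                  # positive Rician bias
```

At this SNR the magnitude image overestimates the true signal by roughly half the noise standard deviation, which is what makes the phase-corrected real reconstruction preferable.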

Journal ArticleDOI
TL;DR: An iterative procedure is described which performs the parameter estimation and image reconstruction tasks at the same time; it is a generalization to the MRF context of a general algorithm, known as the EM algorithm, used to approximate maximum-likelihood estimates for incomplete data problems.

Journal ArticleDOI
TL;DR: Clinical use of computational techniques to permit the quantitative integration of magnetic resonance, positron emission tomography, and x-ray computed tomography imaging data sets for treatment planning has resulted in improvements in localization of treatment volumes and critical structures in the brain.
Abstract: This paper describes computational techniques to permit the quantitative integration of magnetic resonance (MR), positron emission tomography (PET), and x-ray computed tomography (CT) imaging data sets. These methods are used to incorporate unique diagnostic information provided by PET and MR imaging into CT-based treatment planning for radiotherapy of intracranial tumors and vascular malformations. Integration of information from the different imaging modalities is treated as a two-step process. The first step is to determine the set of geometric parameters relating the coordinates of two imaging data sets. No universal method for determining these parameters is appropriate because of the diversity of contemporary imaging methods and data formats. Most situations can be handled by one of the four different techniques described. These four methods make use of specific geometric objects contained in the two data sets to determine the parameters. These objects are: (a) anatomical and/or fiducial points, (b) attached line markers, (c) anatomical surfaces, and (d) outlines of anatomical structures. The second step involves using the derived transformation to transfer outlines of treatment volumes and/or anatomical structures drawn on the images of one imaging study to the images of another study, usually the treatment planning CT. Solid modelling and image processing techniques have been adapted and developed further to accomplish this task. Clinical examples and phantom studies are presented which verify the different aspects of these techniques and demonstrate the accuracy with which they can be applied. Clinical use of these techniques for treatment planning has resulted in improvements in localization of treatment volumes and critical structures in the brain. These improvements have allowed greater sparing of normal tissues and more precise delivery of energy to the desired irradiation volume. It is believed that these improvements will have a positive impact on the outcome of radiation therapy.

Journal ArticleDOI
TL;DR: The concept of a feasible image is introduced, which is a result of a reconstruction that, if it were a radiation field, could have generated the initial projection data by the Poisson process that governs radioactive decay.
Abstract: The discussion of the causes of image deterioration in the maximum-likelihood estimator (MLE) method of tomographic image reconstruction, initiated with the publication of a stopping rule for that iterative process (E. Veklerov and J. Llacer, 1987), is continued. The concept of a feasible image is introduced, which is a result of a reconstruction that, if it were a radiation field, could have generated the initial projection data by the Poisson process that governs radioactive decay. From the premise that the result of a reconstruction should be feasible, the shape and characteristics of the region of feasibility in projection space are examined. With a new rule, reconstructions from real data can be tested for feasibility. Results of the tests and reconstructed images for the Hoffman brain phantom are shown. A comparative examination of the current methods of dealing with MLE image deterioration is included.
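Feasibility is tied to the MLE iteration itself. A toy MLEM (EM for Poisson data) sketch on a made-up system matrix shows the two properties the discussion relies on: the iterates stay nonnegative, and the Poisson log-likelihood of the projection data never decreases.

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.random((30, 10)) + 0.01        # nonnegative system matrix (bins x voxels)
lam_true = 10.0 * rng.random(10)       # true emission intensities
y = rng.poisson(A @ lam_true)          # Poisson projection data

def loglik(lam):
    p = A @ lam                        # expected counts per bin
    return float(np.sum(y * np.log(p) - p))

lam = np.ones(10)                      # positive starting image
ll0 = loglik(lam)
for _ in range(200):                   # MLEM multiplicative updates
    lam *= (A.T @ (y / (A @ lam))) / A.sum(axis=0)
ll = loglik(lam)
```

Run long enough, the iteration fits the Poisson noise in the data; the stopping/feasibility rules discussed in the paper decide when the reconstruction has crossed from fitting the field to fitting the noise.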

Proceedings ArticleDOI
14 May 1989
TL;DR: An algorithm to search a tree of interpretations efficiently to determine the solution pose(s) is developed, taking into account errors in the landmark directions extracted by image processing.
Abstract: Following and extending the approach of K. Sugihara (1988), the author assumes that a mobile robot is equipped with a single camera and a map marking the positions in its environment of landmarks. The robot moves on a flat surface, acquires one image, extracts vertical edges from it, and computes the directions to visible landmarks. The problem is to determine the robot's position and orientation (pose) by establishing the correspondence between landmark directions and points in the map. This approach capitalizes on the excellent angular resolution of standard CCD cameras, while avoiding the feature-correspondence and 3D reconstruction problems. The problem is formulated as a search in a tree of interpretations (pairings of landmark directions and landmark points), and an algorithm to search the tree efficiently to determine the solution pose(s) is developed, taking into account errors in the landmark directions extracted by image processing. Quantitative results from simulations and experiments with real imagery are presented.

Proceedings ArticleDOI
25 May 1989
TL;DR: In this paper, a method based on Fourier analysis of the multi-angular data set as a whole, using special depth-dependent characteristics of the Fourier coefficients to achieve spatially-variant inverse filtering of the data in all views simultaneously, is described.
Abstract: A method is described for preprocessing projection data prior to image reconstruction in single-photon emission computed tomography. The projection data of the desired spatial distribution of emission activity is blurred by the point-response function of the collimator that is used to define the range of directions of gamma-ray photons reaching the detector. The point-response function of the collimator is not stationary, but depends on the distance from the collimator to the point. Conventional methods for deblurring collimator projection data are based on approximating the actual depth-dependent point-response function by a spatially-invariant blurring function, so that deconvolution methods can be applied independently to the data at each angle of view. The method described in this paper is based on Fourier analysis of the multi-angular data set as a whole, using special depth-dependent characteristics of the Fourier coefficients to achieve spatially-variant inverse filtering of the data in all views simultaneously. Preliminary results are presented for simulated data with a simple collimator model.

Journal ArticleDOI
TL;DR: Two iterative algorithms for synthesis of binary computer-generated holograms (CGHs) for image reconstruction are described: alternating projections onto constraint sets (POCS) and direct binary search (DBS).
Abstract: Two iterative algorithms for synthesis of binary computer-generated holograms (CGHs) for image reconstruction are described: alternating projections onto constraint sets (POCS) and direct binary search (DBS). Comparisons with conventional methods for CGH synthesis show that POCS and DBS yield substantially lower mean-squared reconstruction error and higher diffraction efficiency. The best method is DBS, but it is also the most computationally intensive. To ameliorate this disadvantage, an acceleration technique is presented for DBS. With this technique, the design of binary holograms with moderate space-bandwidth product becomes feasible. For example, the computation required to design a hologram with 256K elements is reduced by a factor of 121.
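A miniature version of direct binary search conveys the cost structure (the CGH geometry here is invented: the "reconstruction" is just the Fourier magnitude of a 1-D binary array): flip one element at a time and keep only flips that lower the reconstruction error, so the error is monotonically nonincreasing.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 32
target = np.zeros(n)
target[:8] = 1.0                          # desired reconstruction magnitude (made up)
h = rng.integers(0, 2, size=n).astype(float)   # random initial binary "hologram"

def cost(h):
    rec = np.abs(np.fft.fft(h)) / n       # toy reconstruction: Fourier magnitude
    return float(np.mean((rec - target) ** 2))

e0 = cost(h)
e = e0
for _ in range(5):                        # a few full DBS passes over all elements
    for i in range(n):
        h[i] = 1.0 - h[i]                 # trial flip
        trial = cost(h)
        if trial < e:
            e = trial                     # keep the improving flip
        else:
            h[i] = 1.0 - h[i]             # revert

assert_ready = (e, e0)
```

The expense is visible even at this scale: every trial flip re-evaluates the full reconstruction, which is why the paper's acceleration technique (updating the error incrementally per flip) matters for large space-bandwidth products.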

Journal ArticleDOI
TL;DR: A reconstruction algorithm employing sensitivity coefficients calculated from Laplace fields for single-pass image reconstruction in a manner more closely related to back-projection methods is described and images of a phantom and a human chest section produced using the algorithm are displayed.
Abstract: A number of proposed reconstruction algorithms for electrical impedance tomography have employed the concept of a sensitivity coefficient which can be used to relate the magnitude of a voltage change measured at the surface of an object to the change in impedance within the object which has given rise to it. Iterative algorithms are required where the approach to the full non-linear problem involves the formal inversion of the sensitivity coefficient matrix, but the task of matrix inversion is still not trivial even for a linearised version of the problem. An alternative approach is to use sensitivity coefficients calculated from Laplace fields for single-pass image reconstruction in a manner more closely related to back-projection methods. A reconstruction algorithm employing sensitivity coefficients in this manner is described and images of a phantom and a human chest section produced using the algorithm are displayed.

Journal ArticleDOI
TL;DR: A new technique based on object modeling and estimation is developed to achieve superresolution reconstruction from partial Fourier transform data and is robust with respect to Gaussian white noise perturbation to the measured data and with respect to systematic modeling errors.
Abstract: Many problems in physics involve imaging objects with high spatial frequency content in a limited amount of time. The limitation of available experimental data leads to the infamous problem of diffraction limited data which manifests itself by causing ringing in the image. This ringing is due to the interference phenomena in optics and is known as the Gibbs phenomenon in engineering. Present techniques to cope with this problem include filtering and regularization schemes based on minimum norm or maximum entropy constraints. In this paper, a new technique based on object modeling and estimation is developed to achieve superresolution reconstruction from partial Fourier transform data. The nonlinear parameters of the object model are obtained using the singular value decomposition (SVD)-based all-pole model framework, and the linear parameters are determined using a standard least squares estimation method. This technique is capable, in principle, of unlimited resolution and is robust with respect to Gaussian white noise perturbation to the measured data and with respect to systematic modeling errors. Reconstruction results from simulated data and real magnetic resonance data are presented to illustrate the performance of the proposed method.

Journal ArticleDOI
TL;DR: This paper clarifies the relationship between the various filters proposed by analysing and generalising the different classes of published filters and establishes the properties and characteristics of a general solution to the three-dimensional reconstruction problem.
Abstract: A number of authors have studied the problems associated with full three-dimensional reconstruction, especially in the case of positron tomography where three-dimensional reconstruction is likely to offer the greatest benefits. While most approaches follow that of filtered backprojection, the relationship between the various filters that have been proposed is far from evident. The authors clarify this relationship by analysing and generalising the different classes of published filters and establish the properties and characteristics of a general solution to the three-dimensional reconstruction problem. Some guidelines are suggested for the choice of an appropriate filter in a given situation.

Journal ArticleDOI
TL;DR: A significant amount of progress has been made and it is now possible to produce tomographic images of in vivo distributions of impedance, albeit with low spatial resolution, and future developments should improve image quality.
Abstract: There has recently been increasing interest in the possibility of producing images of electrical impedance within the human body. When an electric current is applied to the body, a voltage distribution is developed across the body surface. This distribution is in part dependent on the internal impedance distribution within the body, and it is possible to estimate this distribution from a suitable set of voltage measurements. Because of the nonlinear relationship between the impedance distribution and the voltage distribution at the surface of the body, the reconstruction problem is much more difficult than for other tomographic imaging techniques, but a significant amount of progress has been made, and it is now possible to produce tomographic images of in vivo distributions of impedance, albeit with low spatial resolution. Future developments should improve image quality.

Journal ArticleDOI
TL;DR: Comparisons of the a priori uniform and nonuniform Bayesian algorithms to the maximum-likelihood algorithm are carried out using computer-generated noise-free and Poisson randomized projections.
Abstract: A method that incorporates a priori uniform or nonuniform source distribution probabilistic information and data fluctuations of a Poisson nature is presented. The source distributions are modeled in terms of a priori source probability density functions. Maximum a posteriori probability solutions, as determined by a system of equations, are given. Iterative Bayesian imaging algorithms for the solutions are derived using an expectation maximization technique. Comparisons of the a priori uniform and nonuniform Bayesian algorithms to the maximum-likelihood algorithm are carried out using computer-generated noise-free and Poisson randomized projections. Improvement in image reconstruction from projections with the Bayesian algorithm is demonstrated. Superior results are obtained using the a priori nonuniform source distribution.

Journal ArticleDOI
TL;DR: With the help of an iterative method, diffusers can be introduced in digital holography whose reconstructions do not suffer from speckle; optically obtained speckle-free reconstructions of digital holograms of diffusely scattering objects are presented.
Abstract: Generally, speckles impair the reconstruction of digital and optical holograms of diffusely scattering objects. With the help of an iterative method, it is possible to introduce diffusers in digital holography that do not suffer from this disadvantage. Optically obtained speckle-free reconstructions of digital holograms are presented.

Journal ArticleDOI
TL;DR: An electrical impedance tomography (EIT) system has been constructed, operating at two frequencies, 40.96 and 81.92 kHz, for investigating the practicability of the dual-frequency imaging method discussed theoretically in a previous paper.
Abstract: An electrical impedance tomography (EIT) system has been constructed, operating at two frequencies, 40.96 and 81.92 kHz, for investigating the practicability of the dual-frequency imaging method discussed theoretically in a previous paper (Griffiths and Ahmed, 1987). For testing the system, a phantom with a frequency-dependent electrical conductivity was designed. The properties of the phantom can be adjusted to match the frequency dependence observed in a given type of tissue. Dual-frequency images were obtained from a phantom simulating liver and also from 200 g of porcine liver in a saline tank. Prior to image reconstruction, it was necessary to apply a correction to the data to cancel the effects of stray capacitance within the electronics.

Journal ArticleDOI
TL;DR: Methods for estimating the regional variance in emission tomography images which arise from the Poisson nature of the raw data are discussed, based on the bootstrap and jackknife methods of statistical resampling theory.
Abstract: Methods for estimating the regional variance in emission tomography images which arise from the Poisson nature of the raw data are discussed. The methods are based on the bootstrap and jackknife methods of statistical resampling theory. The bootstrap is implemented in time-of-flight PET (positron emission tomography); the same techniques can be applied to non-time-of-flight PET and SPECT (single-photon-emission computed tomography). The estimates are validated by comparing them to those obtained by repetition of emission scans, using data from a time-of-flight positron emission tomograph. Simple expressions for the accuracy of the estimates are given. The present approach is computationally feasible and can be applied to any reconstruction technique as long as the data are acquired in a raw, uncorrected form.
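The bootstrap part can be sketched directly on raw, uncorrected counts (a simple stand-in for projection data, not the tomograph pipeline): resample the events with replacement, recompute the statistic, and take the spread of the replicates as the variance estimate. For Poisson counts it should agree with the sqrt(mean/n) theory value:

```python
import numpy as np

rng = np.random.default_rng(6)
counts = rng.poisson(50.0, size=1000)     # raw, uncorrected Poisson event counts
stat = counts.mean()                      # statistic of interest

B = 500                                   # bootstrap replicates
boot = np.array([
    rng.choice(counts, size=counts.size, replace=True).mean()
    for _ in range(B)
])
boot_se = boot.std(ddof=1)                # bootstrap standard error of the statistic
theory_se = np.sqrt(stat / counts.size)   # Poisson theory: sqrt(mean / n)
```

In the paper the resampled raw data are pushed through the full reconstruction, which is what lets the same recipe work for any reconstruction technique.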

Book
30 Apr 1989
TL;DR: The geometry of projections on convex sets has been studied in signal recovery problems as discussed by the authors, with a focus on the reconstruction of the support and the object of the signal in two dimensions.
Abstract (table of contents):
1. Introduction
2. Polynomials: A Review
3. Entire Functions and Signal Recovery
4. Homometric Distributions
5. Analytic Signals and Signal Recovery from Zero Crossings
6. Signal Representation by Fourier Phase and Magnitude in One Dimension
7. Recovery of Distorted Band-Limited Signals
8. Compact Operators, Singular Value Analysis and Reproducing Kernel Hilbert Spaces
9. Kaczmarz Method, Landweber Iteration, Gerchberg-Papoulis and Regularization
10. Two Dimensional Signal Recovery Problems
11. Reconstruction Algorithms in Two Dimensions
12. Nonexpansive Maps and Signal Recovery
13. Projections on Convex Sets in Signal Recovery
14. Method of Generalized Projections and Steepest Descent
15. Closed Form Reconstruction of the Support and the Object
16. Fienup's Input-Output Algorithms and Variations on this Theme
17. Topics and Applications of Signal Recovery
A. The Geometry of Projections on Convex Sets
B. Reference Summary
C. References

Journal ArticleDOI
R. C. Wright1, Stephen J. Riederer1, Farhad Farzaneh1, Phillip J. Rossman1, Yu Liu1 
TL;DR: An experimental system is described for performing high-speed reconstruction of MR image data acquired with a GRASS sequence, with an image acquisition time of 627 ms, continuous image reconstruction at a rate of 6 images/s, and an image reconstruction time of 120 ms.
Abstract: We describe an experimental system for performing high-speed reconstruction of MR image data acquired with a GRASS sequence. System characteristics are an image acquisition time of 627 ms, continuous image reconstruction at a rate of 6 images/s, and an image reconstruction time of 120 ms. The result is a system for performing MR imaging in real time. © 1989 Academic Press, Inc.

Journal ArticleDOI
TL;DR: In a new iterative reconstruction algorithm, the minimum number of nonfeasible ray paths is used as a figure of merit to determine the optimum size of the model correction at each step; the algorithm is robust, stable, and produces very good reconstructions even for high contrast materials where standard methods tend to diverge.
Abstract: Fermat's principle shows that a definite convex set of feasible slowness models, depending only on the traveltime data, exists for the fully nonlinear traveltime inversion problem. In a new iterative reconstruction algorithm, the minimum number of nonfeasible ray paths is used as a figure of merit to determine the optimum size of the model correction at each step. The numerical results show that the new algorithm is robust, stable, and produces very good reconstructions even for high contrast materials where standard methods tend to diverge.