
Showing papers on "Iterative reconstruction published in 1982"


Journal ArticleDOI
TL;DR: Iterative algorithms for phase retrieval from intensity data are compared to gradient search methods, and it is shown that both the error-reduction algorithm for the problem of a single intensity measurement and the Gerchberg-Saxton algorithm for the problem of two intensity measurements converge.
Abstract: Iterative algorithms for phase retrieval from intensity data are compared to gradient search methods. Both the problem of phase retrieval from two intensity measurements (in electron microscopy or wave front sensing) and the problem of phase retrieval from a single intensity measurement plus a non-negativity constraint (in astronomy) are considered, with emphasis on the latter. It is shown that both the error-reduction algorithm for the problem of a single intensity measurement and the Gerchberg-Saxton algorithm for the problem of two intensity measurements converge. The error-reduction algorithm is also shown to be closely related to the steepest-descent method. Other algorithms, including the input-output algorithm and the conjugate-gradient method, are shown to converge in practice much faster than the error-reduction algorithm. Examples are shown.

5,210 citations
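The error-reduction loop the abstract describes can be sketched in a few lines of NumPy. This is an illustrative version (the function, variable names, and the support mask are mine, not the paper's code), alternating a Fourier-modulus constraint with non-negativity and support constraints:

```python
import numpy as np

def error_reduction(measured_mag, support, n_iter=100, seed=0):
    # Error-reduction phase retrieval for a single intensity measurement
    # plus non-negativity: alternate between imposing the measured
    # Fourier modulus and the object-domain constraints.
    rng = np.random.default_rng(seed)
    g = rng.random(measured_mag.shape) * support      # random initial guess
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        G = measured_mag * np.exp(1j * np.angle(G))   # keep phase, fix modulus
        g = np.fft.ifft2(G).real
        g = np.where((g > 0) & support, g, 0.0)       # non-negative, in support
    return g
```

As the abstract notes, this iteration is essentially a steepest-descent step, so its error is non-increasing but convergence can be slow; the input-output and conjugate-gradient variants are faster in practice.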


Journal ArticleDOI
TL;DR: The reconstruction algorithm is derived for parallel beam transmission computed tomography through two-dimensional structures in which diffraction of the insonifying beam must be taken into account and is applicable to diffraction tomography within either the first Born or Rytov approximations.

741 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used divergent fan beam convolutional reconstruction with a minimal complete (180 degrees plus the fan angle) data set and showed that by proper weighting of the initial data set, image quality essentially equivalent to the quality of reconstructions from 360 degrees data sets is obtained.
Abstract: The problem of using a divergent fan beam convolution reconstruction algorithm in conjunction with a minimal complete (180 degrees plus the fan angle) data set is reviewed. It is shown that by proper weighting of the initial data set, image quality essentially equivalent to the quality of reconstructions from 360 degrees data sets is obtained. The constraints on the weights are that the sum of the two weights corresponding to the same line-integral must equal one, in regions of no data the weights must equal zero, and the weights themselves as well as the gradient of the weights must be continuous over the full 360 degrees. After weighting the initial data with weights that satisfy these constraints, image reconstruction can be conveniently achieved by using the standard (hardwired if available) convolver and backprojector of the specific scanner.

555 citations
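The weighting described here is what is now commonly called Parker weighting. A sketch of a weight satisfying the three stated constraints (conjugate weights sum to one, zero where there is no data, smooth transitions), assuming a symmetric fan with half-angle `gamma_m` and source angles `beta` in [0, pi + 2*gamma_m] (conventions mine):

```python
import numpy as np

def parker_weight(beta, gamma, gamma_m):
    # Smooth short-scan weight for source angle beta and fan angle gamma
    # in [-gamma_m, gamma_m], using sin^2 ramps at both ends of the scan.
    if beta < 0 or beta > np.pi + 2 * gamma_m:
        return 0.0                        # no data outside the short scan
    if beta <= 2 * (gamma_m - gamma):     # ramp up at the start of the scan
        return np.sin(np.pi / 4 * beta / (gamma_m - gamma)) ** 2
    if beta <= np.pi - 2 * gamma:         # fully weighted middle region
        return 1.0
    # ramp down at the end of the scan
    return np.sin(np.pi / 4 * (np.pi + 2 * gamma_m - beta) / (gamma_m + gamma)) ** 2
```

A ray (beta, gamma) coincides with the conjugate ray (beta + pi + 2*gamma, -gamma), and the two weights sum to one by construction, so the weighted data can go straight into the standard convolver and backprojector.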


Journal ArticleDOI
TL;DR: In this article, the phase or magnitude information alone is not sufficient, in general, to uniquely specify a sequence, however, a large class of sequences are shown to be recoverable from their phases or magnitudes.
Abstract: This paper addresses two fundamental issues involved in the reconstruction of a multidimensional sequence from either the phase or magnitude of its Fourier transform. The first issue relates to the uniqueness of a multidimensional sequence in terms of its phase or magnitude. Although phase or magnitude information alone is not sufficient, in general, to uniquely specify a sequence, a large class of sequences is shown to be recoverable from their phase or magnitude. The second issue addressed in this paper concerns the actual reconstruction of a multidimensional sequence from its phase or magnitude. For those sequences which are uniquely specified by their phase, several practical algorithms are described which may be used to reconstruct a sequence from its phase. Several examples of phase-only reconstruction are also presented. Unfortunately, however, even for those sequences which are uniquely defined by their magnitude, a practical algorithm has yet to be developed for reconstructing a sequence from only its magnitude. Nevertheless, an iterative procedure which has been proposed is briefly discussed and evaluated.

472 citations


Journal ArticleDOI
TL;DR: A parametric implementation of cubic convolution image reconstruction is presented which is generally superior to the standard algorithm and which can be optimized to the frequency content of the image.
Abstract: Cubic convolution, which has been discussed by Rifman and McKinnon (1974), was originally developed for the reconstruction of Landsat digital images. In the present investigation, the reconstruction properties of the one-parameter family of cubic convolution interpolation functions are considered and the image degradation associated with reasonable choices of this parameter is analyzed. With the aid of an analysis in the frequency domain it is demonstrated that in an image-independent sense there is an optimal value for this parameter. The optimal value is not the standard value commonly referenced in the literature. It is also demonstrated that in an image-dependent sense, cubic convolution can be adapted to any class of images characterized by a common energy spectrum.

349 citations
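The one-parameter family in question is the cubic convolution (Keys-type) kernel. A sketch, where `a = -1` is the value commonly referenced as standard and a frequency-domain analysis like the paper's instead favors `a = -1/2`:

```python
import numpy as np

def cubic_kernel(x, a=-0.5):
    # One-parameter family of cubic convolution interpolation kernels:
    # piecewise cubic on |x| <= 1 and 1 < |x| < 2, zero elsewhere.
    x = np.abs(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    near = x <= 1
    far = (x > 1) & (x < 2)
    out[near] = (a + 2) * x[near]**3 - (a + 3) * x[near]**2 + 1
    out[far] = a * (x[far]**3 - 5 * x[far]**2 + 8 * x[far] - 4)
    return out
```

For every `a` the kernel interpolates (value 1 at 0, value 0 at the other sample points) and its integer translates sum to one, so constant images are reproduced exactly; `a` only changes how other frequencies are attenuated.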


Journal ArticleDOI
TL;DR: The results show that the method of image restoration by projection onto convex sets, by providing a convenient technique for utilizing a priori information, performs significantly better than the Gerchberg-Papoulis method.
Abstract: The image restoration theory discussed in a previous paper by Youla and Webb [1] is applied to a simulated image and the results compared with the well-known method known as the Gerchberg-Papoulis algorithm. The results show that the method of image restoration by projection onto convex sets, by providing a convenient technique for utilizing a priori information, performs significantly better than the Gerchberg-Papoulis method.

348 citations
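The projection-onto-convex-sets iteration is easy to illustrate on a toy restoration problem; the constraint sets and "measured" values below are made up for the example:

```python
import numpy as np

def pocs(projectors, x0, n_iter=50):
    # Projection onto convex sets: cyclically apply the projector of each
    # constraint set; the iterates converge into the sets' intersection.
    x = x0.copy()
    for _ in range(n_iter):
        for P in projectors:
            x = P(x)
    return x

# Toy restoration: C1 = signals agreeing with the measured samples,
# C2 = non-negative signals. Both sets are convex with trivial projectors.
known = np.zeros(8, dtype=bool); known[::2] = True
b = np.array([1.0, 2.0, 0.5, 3.0])     # made-up measurements on known samples

def P1(x):
    y = x.copy(); y[known] = b; return y

def P2(x):
    return np.clip(x, 0.0, None)

x = pocs([P1, P2], np.zeros(8))
```

The appeal the TL;DR points to is that any a priori knowledge expressible as a closed convex set (band limits, support, bounds, measured values) simply adds one more projector to the cycle.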


Journal ArticleDOI
TL;DR: In this paper, an algorithm based on linear interpolation between pixels is proposed for computing line integrals through reconstructed CT density pixels, improving CT image quality without unnecessary loss of resolution.
Abstract: It is often desired to calculate line integrals through a field of reconstructed CT density pixels for the purpose of improving CT image quality. Two algorithms widely published and discussed in the past are known to either degrade spatial resolution or generate errors in the results due to the discontinuous "square pixel" modeling of the reconstructed image. An algorithm is described, based on linear interpolation between pixels, which provides superior accuracy without unnecessary loss of resolution. It was tested on simulated data for a head section and on a narrow Gaussian density distribution. The experimental results demonstrated improved performance. The method is expected to prove useful for many types of post-reconstruction processing, including beam hardening, missing data, and noise suppression algorithms.

341 citations
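A minimal sketch of a line integral computed with interpolation between pixel centers (bilinear here; an illustration of the idea rather than the paper's exact algorithm):

```python
import numpy as np

def line_integral(img, p0, p1, step=0.1):
    # Integrate img along the segment p0 -> p1 ((row, col) coordinates in
    # pixel-center units), sampling with bilinear interpolation rather
    # than a discontinuous "square pixel" model.
    p0, p1 = np.asarray(p0, dtype=float), np.asarray(p1, dtype=float)
    length = np.linalg.norm(p1 - p0)
    n = max(int(length / step), 1)
    total = 0.0
    for k in range(n):
        y, x = p0 + (k + 0.5) / n * (p1 - p0)     # midpoint sampling
        i, j = int(np.floor(y)), int(np.floor(x))
        dy, dx = y - i, x - j
        # weights of the four surrounding pixel centers
        total += ((1 - dy) * (1 - dx) * img[i, j] + (1 - dy) * dx * img[i, j + 1]
                  + dy * (1 - dx) * img[i + 1, j] + dy * dx * img[i + 1, j + 1])
    return total * length / n
```

Because the interpolated field is continuous, the result does not jump as the ray crosses pixel boundaries, which is the source of error in the square-pixel model.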


Journal ArticleDOI
TL;DR: Methods for reconstructing the object’s support are given for objects whose support is convex and for certain objects consisting of collections of distinct points.
Abstract: The phase-retrieval problem consists of the reconstruction of an object from the modulus of its Fourier transform or, equivalently, from its autocorrelation. This paper describes a number of results relating to the reconstruction of the support of an object from the support of its autocorrelation. Methods for reconstructing the object’s support are given for objects whose support is convex and for certain objects consisting of collections of distinct points. The uniqueness of solutions is discussed. In addition, for the objects consisting of collections of points, a simple method is shown for completely reconstructing the object functions.

151 citations


Journal ArticleDOI
TL;DR: The proposed IRR algorithm enables the use of convolution-backprojection in limited angle of view and in limited field of view CT cases and the potential of this method for cardiac CT reconstruction is demonstrated using computer simulated data.
Abstract: Cardiac X-ray computed tomography (CT) has been limited due to scanning times which are considerably longer (1 s) than required to resolve the beating heart (0.1 s). The otherwise attractive convolution-backprojection algorithm is not suited for CT image reconstruction from measurements comprising an incomplete set of projection data. In this paper, an iterative reconstruction-reprojection (IRR) algorithm is proposed for limited projection data CT image reconstruction. At each iteration, the missing views are estimated based on reprojection, which is a software substitute for the scanning process. The standard fan-beam convolution-backprojection algorithm is then used for image reconstruction. The proposed IRR algorithm enables the use of convolution-backprojection in limited angle of view and in limited field of view CT cases. The potential of this method for cardiac CT reconstruction is demonstrated using computer simulated data.

115 citations
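The IRR loop can be illustrated with a toy linear-algebra analogue: a random matrix stands in for the scanner geometry and a pseudo-inverse stands in for the convolution-backprojection step (everything below is a stand-in, not the paper's fan-beam implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_rays = 16, 24
A = rng.standard_normal((n_rays, n_pix))   # stand-in projection operator
x_true = rng.random(n_pix)
p = A @ x_true                             # complete projection data
obs = np.arange(n_rays) < 20               # only the first 20 views measured

R = np.linalg.pinv(A)                      # stand-in "reconstruction" operator
x = np.zeros(n_pix)
for _ in range(300):
    p_full = p.copy()
    # reprojection: the missing views are replaced by projections of the
    # current estimate (in reality they are never measured at all)
    p_full[~obs] = A[~obs] @ x
    x = R @ p_full                         # reconstruct from the completed set
```

Each pass alternates the software substitute for scanning (reprojection) with a full-data reconstruction, which is exactly the structure that lets a standard convolution-backprojection routine be reused on limited-angle data.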


Journal ArticleDOI
TL;DR: Current state-of-the-art CT technology holds promise of exciting new clinical and research capabilities, such as quantitative analysis of regional blood flow and perfusion, simultaneous measurement of physiologic function and anatomic structure, and differential diagnosis of disease based on determination of tissue composition in any organ or region of the body.
Abstract: The trend in development of X-ray computed tomography systems over the past decade has been toward faster and faster scanners, with concomitant improvement in image quality. The preliminary results from the DSR scanner suggest the advent of two new, powerful dimensions in X-ray computed tomography: high temporal resolution and synchronous volume scanning. That is, true stop-action, full three-dimensional imaging at a high repetition rate is possible with CT scanners. Such capabilities promise to make possible new basic investigative and clinical studies of the structural-to-functional relationships of moving organ systems like the heart and lungs, and of the circulation in any organ of the body. Current state-of-the-art CT technology holds promise of exciting new clinical and research capabilities, such as quantitative analysis of regional blood flow and perfusion, simultaneous measurement of physiologic function and anatomic structure, and differential diagnosis of disease based on determination of tissue composition in any organ or region of the body, with a sensitivity and specificity not possible before.

112 citations


Journal ArticleDOI
TL;DR: This paper defines basic concepts in unified terminology and presents algorithms for a boundary detection task in multidimensional space and the performance of these algorithms is discussed with respect to theoretical maximum complexity.
Abstract: The development of image processing algorithms for time-varying imagery and computerized tomography data calls for generalization of the concepts of adjacency, connectivity, boundary, etc., to three and four-dimensional discrete spaces. This paper defines these basic concepts in unified terminology and presents algorithms for a boundary detection task in multidimensional space. The performance of these algorithms is discussed with respect to theoretical maximum complexity, and is illustrated with simulated computerized tomography data.

Journal ArticleDOI
TL;DR: A cyclically controlled method of subgradient projections (CSP) for the convex feasibility problem of solving convex inequalities is presented and a particular application to an image reconstruction problem of emission computerized tomography is mentioned.
Abstract: A cyclically controlled method of subgradient projections (CSP) for the convex feasibility problem of solving convex inequalities is presented. The features of this method make it an efficient tool in handling huge and sparse problems. A particular application to an image reconstruction problem of emission computerized tomography is mentioned.
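A minimal sketch of the cyclic subgradient projection step for a system of convex inequalities f_i(x) <= 0 (the problem below is illustrative and far smaller than the huge sparse systems the paper targets):

```python
import numpy as np

def csp(constraints, x0, sweeps=100):
    # Cyclic subgradient projections: whenever a constraint is violated,
    # step along a subgradient just far enough to zero the linearized
    # violation; satisfied constraints are skipped.
    x = x0.astype(float).copy()
    for _ in range(sweeps):
        for f, subgrad in constraints:
            v = f(x)
            if v > 0:
                g = subgrad(x)
                x = x - (v / (g @ g)) * g
    return x

# Illustrative problem: three linear inequalities a_i . x <= b_i
# (affine f_i, whose subgradient is simply a_i).
A = np.array([[1.0, 1.0], [-1.0, 2.0], [0.0, -1.0]])
b = np.array([4.0, 2.0, 0.0])
cons = [(lambda x, a=a, bi=bi: a @ x - bi, lambda x, a=a: a)
        for a, bi in zip(A, b)]
x = csp(cons, np.array([10.0, 10.0]))
```

Only one subgradient evaluation is needed per constraint per sweep, and constraints that already hold cost almost nothing, which is why the method suits huge sparse feasibility problems like emission tomography.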

Journal ArticleDOI
TL;DR: "EM" is a computer program system concerned with the processing of electron micrographs that provides facilities for two- and three-dimensional image reconstruction, correlation, filtering, etc. as well as for storage and display of image data.

Journal ArticleDOI
01 Oct 1982
TL;DR: In this article, along with a review of the basic principles and methods involved in NMR tomographic imaging, computer simulations and modeling are presented to clarify the complexity of the NMR imaging method and to provide insight into the method, especially image-formation aspects and processing.
Abstract: Nuclear Magnetic Resonance (NMR) tomographic imaging is a newly emerging, noninvasive, three-dimensional (3-D) imaging technique. Although similar to the well known X-ray Computerized Tomography (X-CT), it uses magnetic fields and RF signals to obtain anatomical information about the human body as cross-sectional images in any desired direction, and can easily discriminate between healthy and abnormal tissues. This new technique is an interdisciplinary science which encompasses the latest technologies in electrical, electronics, computers, physics, chemistry, mathematics, and medical sciences. Principles of this new technique, known as "Fourier transform nuclear magnetic resonance imaging" or simply "NMR imaging", are reviewed from the physics and engineering points of view to provide basic concepts and tools which, hopefully, will be useful for the future development of this exciting new field. Along with the review of the basic principles and methods involved in NMR tomography, computer simulations and modeling are presented to clarify the complexity of the NMR imaging method and provide an insight into the method, especially image-formation aspects and processing, the central theme of NMR tomography. In this paper, four main types of imaging methods, namely line-scan imaging, direct Fourier-transform (Kumar-Welti-Ernst method) imaging, line-integral projection reconstruction, and plane-integral projection reconstruction, as well as the possibility of relaxation time imaging, are discussed in detail. Methods of improving performance with respect to the statistical aspects of image quality and imaging times are also discussed.

Journal ArticleDOI
TL;DR: This paper is a 1-D analysis of the degradation caused by image sampling and interpolative reconstruction that includes the sample-scene phase as an explicit random parameter and provides a complete characterization of this image degradation as the sum of two terms.
Abstract: This paper is a 1-D analysis of the degradation caused by image sampling and interpolative reconstruction. The analysis includes the sample-scene phase as an explicit random parameter and provides a complete characterization of this image degradation as the sum of two terms: one term accounts for the mean effect of undersampling (aliasing) and nonideal reconstruction averaged over all sample-scene phases; the other term accounts for variations about this mean. The results of this paper have application to the design and performance analysis of image scanning, sampling, and reconstruction systems.

Journal ArticleDOI
TL;DR: A reconstruction algorithm for TOF-positron computed tomography (PCT) based on the back-projection with 1-dimensional weight and 2-dimensional filtering is presented and a formula to evaluate the variance of the reconstructed image and the optimal back- projection function are presented.
Abstract: In positron CT, the path difference of annihilation pair gamma rays can be measured by the time-of-flight (TOF) difference of the pair gamma rays. This TOF information gives rough position information along a projection line and will reduce noise propagation in the reconstruction process. A reconstruction algorithm for TOF-positron computed tomography (PCT) based on back-projection with a 1-dimensional weight and 2-dimensional filtering is presented. Also presented are a formula to evaluate the variance of the reconstructed image and the optimal back-projection function. The advantage of TOF-PCT over conventional PCT was investigated in terms of noise figure. An example of such noise figure evaluations for CsF and liquid xenon scintillators is given.
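The effect of the TOF weight can be illustrated in one dimension: each event is back-projected with a confidence weight (a Gaussian here, an assumed shape) centered on its TOF position estimate, rather than uniformly along the whole line:

```python
import numpy as np

def tof_backproject_1d(events, n, sigma):
    # Each event deposits a unit of activity along its line, weighted by
    # a Gaussian confidence profile centered on the TOF position
    # estimate, instead of being spread uniformly.
    img = np.zeros(n)
    xs = np.arange(n)
    for pos in events:
        w = np.exp(-0.5 * ((xs - pos) / sigma) ** 2)
        img += w / w.sum()                 # unit total weight per event
    return img

# synthetic data: 1000 events whose TOF estimates all point at x = 32
events = np.full(1000, 32.0)
img = tof_backproject_1d(events, 64, sigma=3.0)
```

Because each event's weight is concentrated near its true position, noise from one region propagates much less into distant pixels than in conventional back-projection, which is the noise-reduction mechanism the abstract refers to.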

Proceedings ArticleDOI
01 May 1982
TL;DR: This paper presents various conditions that are sufficient for reconstructing a discrete-time signal from samples of its short-time Fourier transform magnitude, for applications such as speech processing.
Abstract: This paper presents various conditions that are sufficient for reconstructing a discrete-time signal from samples of its short-time Fourier transform magnitude. For applications such as speech processing, these conditions place very mild restrictions on the signal as well as the analysis window of the transform. Examples of such reconstruction for speech signals are included in the paper.

Journal ArticleDOI
TL;DR: In this paper, the effects of errors in center to receiver distance, limited angle data, and limited frequency data were studied for reconstructing the distribution of refraction index using the Rytov approximation to the wave equation.
Abstract: Data required for reconstructing the distribution of refraction index using the Rytov approximation to the wave equation were simulated and used to test the fidelity of the reconstruction method proposed by Iwata and Nagata in 1975. The effects of errors in center to receiver distance, limited angle data, and limited frequency data were studied. Limited angle data can be used if multiple frequencies are used. Experimental data were obtained and used to reconstruct simple cylinders containing saline. In addition, experimental data having multiple frequencies were obtained and images reconstructed from two views ninety degrees apart. Aberrations in the images reconstructed from real data may be a result of lost terms in the wave equation signal phase due to movement of the subject or due to the use of imperfect

Journal ArticleDOI
TL;DR: Detailed model studies demonstrate that the proposed laser absorption computed tomography system is capable of providing 2-D maps of pollutant concentration at ranges and resolutions superior to that attainable from contemporary direct detection laser radars.
Abstract: Laser absorption computed tomography offers the possibility of sensitive remote atmospheric measurements of pollutants over kilometer sized areas with 2-D resolution at modest laser source powers. We present detailed model studies which demonstrate the potential of this new remote sensing technique. The tomographic reconstruction process is studied as a function of measurement signal to noise, laser power, range, and system geometry. The analysis shows that the proposed system is capable of providing 2-D maps of pollutant concentration at ranges and resolutions superior to that attainable from contemporary direct detection laser radars.

Journal ArticleDOI
TL;DR: The noise performance of an emission tomography system having time-of-flight measurements is shown in several examples to be superior for a confidence-weighted data array compared to a most likely position data array.
Abstract: The noise performance of an emission tomography system having time-of-flight measurements is shown in several examples to be superior for a confidence-weighted data array compared to a most likely position data array. The examples range from a point to a planar distribution of radioactivity, and include a crude model of the left ventricle of a heart containing radioactive palmitate.

Journal ArticleDOI
TL;DR: The present method can be applied to compute the velocity of blood flow using the dye-edge displacement and the three-dimensional distance data.
Abstract: A dye-edge tracking algorithm was used to determine the corresponding points in the two images (anterior-posterior and lateral) of the digital subtraction biplane angiography. This correspondence was used to reconstruct three-dimensional images of cerebral arteries in a dog experiment and a clinical observation. The method was tested by comparing the measured image of oblique view to the computed reconstructed image. For the present study, we have developed three new algorithms. The first algorithm is to determine the corresponding dye-edge points using the fact that the density of contrast media at the moving edge shows the same changing pattern in the two projection views. This moving pattern of dye-edge density is computed using a matching method of cross correlation for the two sequential frames' dye density. The second algorithm is for simplified perspective transformation, and the third is to identify the corresponding points using a complementary method for locating the approximate points on the small vessels. The present method can be applied to compute the velocity of blood flow using the dye-edge displacement and the three-dimensional distance data.

Journal ArticleDOI
TL;DR: In this article, the problem of finding an accurate pseudo-inverse for even a modest PET array of 8 × 8 pixels is shown to be a difficult task for a computer with 48-bit mantissa.
Abstract: This paper analyzes in detail the process of tomographic image reconstruction by pseudo-inversion of the blurring matrix of a PET imaging system. Eigenvector and eigenvalue decomposition is used as a method to evaluate the physical reasons for the ill-conditioned nature of the problem. It is shown that finding an accurate pseudo-inverse for even a modest PET array of 8 × 8 pixels is a difficult task for a computer with a 48-bit mantissa. The problem is caused by the strong ambiguity with which the detector system measures the activity at each pixel. For a problem in which imaging with a complete detector ring is not possible, and in which invariance of the point response function cannot be maintained, the pseudo-inverse method of reconstruction is, however, shown to be very useful. Advantage is taken of the fact that the activity to be measured is localized in a single plane, without over- or underlying activity. A planar camera configuration yields very well conditioned matrices that are separable for a large number of useful cases. It is even possible to define pixel sizes which are considerably smaller than the detector size and solve the problem without a substantial increase in the noise magnification factor. Recognizing that the above application is equivalent to a case of very well defined time-of-flight (TOF) measurement, the simple initial PET study is reevaluated by inclusion of TOF information.

Journal ArticleDOI
TL;DR: The derivation and results of an ART-like algorithm (SPARTAF) oriented towards prevention of streaks via optimization of a cost function based on features of streaks, subject to the constraints of the given projection data are presented.
Abstract: Streaks arise in computed tomograms for a variety of reasons, such as presence of high-contrast edges and objects, aliasing errors, patient movement, and use of very few views. The problem appears to be an inherent difficulty with all reconstruction methods, including backprojection (with convolution) and the algebraic reconstruction technique (ART). This paper presents the derivation and results of an ART-like algorithm (SPARTAF) oriented towards prevention of streaks via optimization of a cost function based on features of streaks, subject to the constraints of the given projection data. The object-dependent method employs pattern recognition of streaks and adaptive filtering during iterative reconstruction by ART. Results of experiments with a test pattern and of application of the method to reconstructive tomography from radiographic films are presented and the convergence properties demonstrated.

Journal ArticleDOI
TL;DR: A generalized mathematical model describing this technique together with computer simulations demonstrating its feasibility are presented and the implementation of this real-time reconstruction scheme in the TOFPET scanner currently under construction at the institution is discussed.
Abstract: Several researchers have proven the benefits of using time-of-flight data in Positron Emission Tomography (TOFPET). One of the characteristics of TOFPET which has not been previously explored is the ability to carry out real-time image reconstruction. The advantages of real-time reconstruction are four-fold: 1) immediate visual feedback of image build-up, 2) substantial savings of image memory, 3) faster throughput of patients through the system, and 4) region-of-interest reconstruction for further memory savings. We present a generalized mathematical model describing this technique together with computer simulations demonstrating its feasibility. We also discuss the implementation of this real-time reconstruction scheme in the TOFPET scanner currently under construction at our institution.

Journal ArticleDOI
TL;DR: A method for reconstructing a diffraction-limited image from data consisting of many short-exposure turbulence degraded images is described and a computer simulation indicates that the method may be applicable to astronomical objects as faint as approximately 11th magnitude.
Abstract: A method for reconstructing a diffraction-limited image from data consisting of many short-exposure turbulence degraded images is described. The results of a computer simulation of the method are presented. The method should prove useful for obtaining high-angular resolution images from large earth bound telescopes. The simulation indicates that the method may be applicable to astronomical objects as faint as ~11th magnitude.

Journal Article
TL;DR: A computer-based simulation method is developed to assess the relative effectiveness of attenuation compensation procedures in SPECT reconstruction and concludes that the additional expense of the iterative method is not justified under the conditions of this study.
Abstract: Attenuation of photons in single-photon emission tomography (SPECT) makes three-dimensional reconstruction of unknown radioactivity distributions a mathematically intractable problem. Approaches to approximate SPECT reconstruction range from ignoring the effect of photon attenuation to incorporating assumed attenuation coefficients into an iterative reconstruction procedure. We have developed a computer-based simulation method to assess the relative effectiveness of attenuation compensation procedures. The method was used to study four procedures for myocardial SPECT using an infarct-avid radiopharmaceutical, Tc-99m stannous pyrophosphate. Reconstructions were evaluated by two criteria: overall (sum-of-squares) accuracy, and accuracy of lesion sizing. For moderate- to high-contrast studies there were no significant differences among the reconstructions by either evaluation criterion; for low contrast ratios the iterative method produced lower sum-of-squares error. We conclude that the additional expense of the iterative method is not justified under the conditions of this study. The approach used here is a convenient tool for evaluating specific SPECT reconstruction alternatives.

Proceedings ArticleDOI
01 Nov 1982
TL;DR: In this article, a simple approximation method is presented for evaluating the variance and autocovariance function of noise for objects having given distributions of radionuclides and of attenuation coefficient.
Abstract: The magnitude and texture of statistical noise in images of positron emission computed tomography are evaluated analytically. A simple approximation method is presented for evaluating the variance and autocovariance function of noise for objects having given distributions of radionuclides and of attenuation coefficient. The comparison between the approximation method and the accurate estimation shows excellent agreements at various points in the images of uniform disc sources having constant attenuation. The dependence of noise amplification on convolution filters in image reconstruction and on the additional image processing (smoothing) is given. Considerable anisotropy of the autocovariance is observed near the periphery of the image even for a uniform disc source. This fact suggests the usefulness of spatially variant, anisotropic smoothing in some cases.

Journal ArticleDOI
TL;DR: A nonlinear scaling scheme is devised that exploits the simplicity of the binary nature of two-level images, treating them logically instead of arithmetically; a convolution-like effect is achieved without a single addition or multiplication.
Abstract: The importance of enlarging and reducing two-level images such as graphical and documentary matter by digital means continues to grow as more such images are digitally represented. A nonlinear scaling scheme is devised which exploits the simplicity of this binary nature, treating images logically instead of arithmetically; a convolution-like effect is achieved without a single addition or multiplication! This method yields high-fidelity digital scaling and meets the objectives of being fast, conducive to hardware realization, and void of special pre-encoding requirements.

Journal ArticleDOI
TL;DR: In this article, the performance of different iterative algorithms has been studied for reconstructing emission images of single gamma emitters, using a new gamma camera based upon an electronic method of collimation.
Abstract: The performance of different iterative algorithms has been studied for reconstructing emission images of single gamma emitters, using a new gamma camera based upon an electronic method of collimation. Adaptations of the ART, SIRT, and Iterative Least Squares algorithms have been studied. The algorithms have been compared via simulation with respect to the rate of convergence, sensitivity to noise, resolution, and contrast of the reconstructed image. Results have also been obtained with experimentally measured data from a point source of Tc-99m.
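ART's update, the Kaczmarz projection, can be sketched as follows; the toy system below stands in for the camera's system matrix and noiseless measurements:

```python
import numpy as np

def art(A, b, n_sweeps=500, relax=1.0):
    # ART (Kaczmarz): cyclically project the current estimate onto the
    # hyperplane a_i . x = b_i defined by each measured ray-sum.
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, b_i in zip(A, b):
            x = x + relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

# toy stand-in for the camera's system matrix and data
rng = np.random.default_rng(0)
A = rng.standard_normal((12, 6))
x_true = rng.random(6)
x = art(A, A @ x_true)
```

SIRT differs mainly in applying the corrections from all rows simultaneously (averaged) rather than one row at a time, which slows convergence but damps noise, consistent with the trade-offs the paper compares.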

Journal ArticleDOI
TL;DR: A very efficient back-projection algorithm which results in large time savings when implemented in machine code and a minor modification to this algorithm which converts it to a re- projection procedure with comparable efficiency is described.
Abstract: While the computation time for reconstructing images in C.T. is not a problem in commercial systems, there are many experimental and developmental applications where resources are limited and image reconstruction places a heavy burden on the computer system. This paper describes a very efficient back-projection algorithm which results in large time savings when implemented in machine code. Also described is a minor modification to this algorithm which converts it to a re-projection procedure with comparable efficiency.
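The kind of inner-loop saving such a back-projector exploits, replacing the per-pixel products x*cos(theta) + y*sin(theta) with a running increment, can be sketched as follows (nearest-neighbour, unfiltered, and in Python for clarity, whereas the paper's gains come from machine code):

```python
import numpy as np

def backproject(sino, thetas, n):
    # Unfiltered nearest-neighbour back-projection onto an n x n grid.
    # The inner loop advances the detector coordinate t by cos(theta)
    # per pixel instead of recomputing x*cos + y*sin for every pixel.
    img = np.zeros((n, n))
    c = (n - 1) / 2.0
    n_det = sino.shape[1]
    for proj, th in zip(sino, thetas):
        cos_t, sin_t = np.cos(th), np.sin(th)
        for i in range(n):                 # image rows (y)
            # detector coordinate of the first pixel in this row
            t = (0 - c) * cos_t + (i - c) * sin_t + (n_det - 1) / 2.0
            for j in range(n):             # image columns (x)
                k = int(round(t))          # nearest detector bin
                if 0 <= k < n_det:
                    img[i, j] += proj[k]
                t += cos_t                 # incremental update
    return img
```

Swapping the roles of image and projections in the same loop structure (reading from the image and accumulating into `proj`) gives the re-projection variant mentioned in the abstract at essentially the same cost.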