
Showing papers on "Iterative reconstruction published in 1988"


Journal ArticleDOI
TL;DR: A piecewise-smooth surface model for image data that possesses surface coherence properties is used to develop an algorithm that simultaneously segments a large class of images into regions of arbitrary shape and approximates image data with bivariate functions so that it is possible to compute a complete, noiseless image reconstruction based on the extracted functions and regions.
Abstract: The solution of the segmentation problem requires a mechanism for partitioning the image array into low-level entities based on a model of the underlying image structure. A piecewise-smooth surface model for image data that possesses surface coherence properties is used to develop an algorithm that simultaneously segments a large class of images into regions of arbitrary shape and approximates image data with bivariate functions so that it is possible to compute a complete, noiseless image reconstruction based on the extracted functions and regions. Surface curvature sign labeling provides an initial coarse image segmentation, which is refined by an iterative region-growing method based on variable-order surface fitting. Experimental results show the algorithm's performance on six range images and three intensity images.

1,151 citations
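
As a rough illustration of the variable-order surface fitting at the heart of this method (a sketch, not the authors' algorithm; all names are illustrative), the snippet below fits a bivariate polynomial to an image patch by least squares. A region-growing loop would admit neighbouring pixels while the RMS fit error stays below a noise-derived threshold, raising the polynomial order only when a lower order no longer fits.

```python
import numpy as np

def fit_bivariate_surface(patch, order=2):
    """Least-squares fit of a bivariate polynomial (up to `order`)
    to an image patch; returns coefficients and RMS fit error."""
    h, w = patch.shape
    y, x = np.mgrid[0:h, 0:w]
    x, y, z = x.ravel(), y.ravel(), patch.ravel().astype(float)
    # Design matrix with all monomials x^i * y^j, i + j <= order
    cols = [x**i * y**j for i in range(order + 1)
            for j in range(order + 1 - i)]
    A = np.stack(cols, axis=1).astype(float)
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    rms = np.sqrt(np.mean((A @ coeffs - z) ** 2))
    return coeffs, rms

# An exactly quadratic patch fits with near-zero residual.
patch = np.add.outer(np.arange(8.0) ** 2, np.arange(8.0))
coeffs, rms = fit_bivariate_surface(patch, order=2)
print(rms)  # ~0
```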


Journal ArticleDOI
TL;DR: It is shown how the sliding window technique lends itself to high-speed reconstruction, with each newly acquired echo used to quickly update the displayed image, which can result in real-time MR image acquisition and reconstruction.
Abstract: A method of magnetic resonance image acquisition and reconstruction is described in which high imaging rates and fast reconstruction times are allowed. The acquisition is a modification of the basic FLASH sequence but with a restricted number N of phase encodings. The encodings are applied sequentially, periodically, and continuously. Images are formed by sliding a window of width N encodings along the acquired data and reconstructing an image for each position of the window. In general the acquisition time per image exceeds the time between successive images, and the method thus has a temporal lag. Experimental studies were performed with a dynamic phantom using 48 phase encodings and a TR of 20 ms, for an image acquisition time of about 1 s. The image display rate in the reconstructed sequence was 12.5 images/s, and the image sequence portrayed the motion of the phantom. Additional studies were done with 24 encodings. It is shown how the sliding window technique lends itself to high-speed reconstruction, with each newly acquired echo used to quickly update the image on display. The combination of the acquisition technique described and a hardware implementation of the reconstruction algorithm can result in real-time MR image acquisition and reconstruction. © 1988 Academic Press, Inc.

374 citations
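
A minimal sketch of the sliding-window idea, assuming Cartesian k-space with N phase-encode lines acquired cyclically (function and variable names are illustrative): each new echo overwrites the oldest line in the window, and a 2-D inverse FFT yields the updated frame.

```python
import numpy as np

def sliding_window_frames(echoes, n_enc):
    """Yield one image per newly acquired echo.

    `echoes` is an iterable of (phase_encode_index, echo_line) pairs,
    acquired sequentially, periodically, and continuously; each
    `echo_line` is a complex readout vector."""
    kspace = None
    for idx, line in echoes:
        if kspace is None:
            kspace = np.zeros((n_enc, line.size), dtype=complex)
        kspace[idx] = line          # newest echo replaces the oldest
        yield np.fft.fftshift(np.abs(np.fft.ifft2(kspace)))

# Toy usage with noise data: 48 encodings acquired cyclically,
# as in the paper's phantom study.
rng = np.random.default_rng(0)
stream = ((k % 48, rng.standard_normal(64) + 0j) for k in range(96))
for frame in sliding_window_frames(stream, n_enc=48):
    pass  # display/update `frame` here
```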


Journal ArticleDOI
TL;DR: Simultaneous correction of nonuniform attenuation and detector response was implemented in single-photon-emission computed tomography (SPECT) image reconstruction and provides more-accurate quantitation and superior image quality.
Abstract: Simultaneous correction of nonuniform attenuation and detector response was implemented in single-photon-emission computed tomography (SPECT) image reconstruction. A ray-driven projector-backprojector that exactly models attenuation in the reconstructed image slice and the spatially variant detector response was developed and used in the iterative maximum-likelihood algorithm for the correction. A computer-generated heart-lung phantom was used in simulation studies to compare the simultaneous correction method with an intrinsic attenuation correction method using a smoothing filter, an intrinsic attenuation correction method using a deconvolution filter, and a modified Chang attenuation correction method using a nonuniform attenuation distribution. The results demonstrate that the present method provides more-accurate quantitation and superior image quality.

231 citations


Journal ArticleDOI
TL;DR: In this paper, Gonsalves's phase diversity method was used to correct images blurred by a misaligned segmented-aperture telescope and the final image was obtained by a Wiener-Helstrom filtering of the degraded image using the retrieved phase errors.
Abstract: A segmented-aperture telescope such as the Multiple-Mirror Telescope will suffer from phase errors unless the segments are aligned to within a small fraction of a wavelength. Such a coherent alignment of the segments is difficult to achieve in real time. An alternative is to record the images degraded by phase errors and to restore them after detection by using phase-retrieval techniques. In this paper we describe the use of Gonsalves’s phase-diversity method (which was previously used to combat atmospheric turbulence) to correct imagery blurred by a misaligned segmented-aperture telescope. Two images are recorded simultaneously: the usual degraded image in the focal plane and a second degraded image in an out-of-focus plane. An iterative gradient-search algorithm finds the phase error of the telescope that is consistent with both degraded images. We refer to this technique as the method of multiple-plane measurements with iterative reconstruction. The final image is obtained by a Wiener–Helstrom filtering of the degraded image using the retrieved phase errors. The results of reconstruction experiments performed with simulated data including the effects of noise are shown for the case of random piston phase errors on each of six segments.

185 citations
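
Once the phase errors, and hence the optical transfer function, have been retrieved, the final deblurring step is a standard Wiener-Helstrom filter. A minimal sketch (the OTF and the noise-to-signal ratio are assumed inputs):

```python
import numpy as np

def wiener_helstrom(degraded, otf, nsr=0.01):
    """Wiener-Helstrom restoration of the focal-plane image once the
    phase errors have been retrieved.  `otf` is the optical transfer
    function implied by the retrieved pupil phase (same shape as the
    image); `nsr` is an assumed noise-to-signal power ratio."""
    D = np.fft.fft2(degraded)
    F = np.conj(otf) * D / (np.abs(otf) ** 2 + nsr)
    return np.fft.ifft2(F).real
```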


Journal ArticleDOI
TL;DR: Theory shows that the spatial frequency content of beam-deflection measurements is well suited for tomographic reconstruction, and the theory for the diffraction-limited resolution for tomography is presented.
Abstract: We report 3-D imaging of density in a supersonic expansion using beam-deflection optical tomography. Quantitative high-resolution images with absolute accuracy of 3%, dynamic range of 500:1, and spatial resolution to within a factor of 1.7 of the diffraction limit were produced with a He-Ne laser and simple apparatus. Theory shows that the spatial frequency content of beam-deflection measurements is well suited for tomographic reconstruction. The theory for the diffraction-limited resolution for tomography is presented.

170 citations


Journal ArticleDOI
Yair Censor1
TL;DR: Block-iterative versions of row-action methods that exploit special objective function and constraints structure enable parallel computation when the underlying problem is appropriately decomposed, opening the door for parallel computation in image reconstruction problems of computerized tomography and in the inverse problem of radiation therapy treatment planning.
Abstract: Some row-action algorithms which exploit special objective function and constraints structure have proven advantageous for solving huge and sparse feasibility or optimization problems. Recently developed block-iterative versions of such special-purpose methods enable parallel computation when the underlying problem is appropriately decomposed. This opens the door for parallel computation in image reconstruction problems of computerized tomography and in the inverse problem of radiation therapy treatment planning, all in their fully discretized modelling approach. Since there is more than one way of deriving block-iterative versions of any row-action method, the choice has to be made with reference to the underlying real-world problem.

132 citations
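
One generic way to derive a block-iterative version of a row-action method is sketched below under simplifying assumptions (dense matrix, consistent system; all names are illustrative): the Kaczmarz updates of all rows inside a block are computed independently, which is the parallelizable part, and averaged; the blocks themselves are visited sequentially.

```python
import numpy as np

def block_iterative_kaczmarz(A, b, blocks, n_sweeps=500, relax=1.0):
    """Generic block-iterative row-action solver for A x = b.

    Rows inside each block are processed independently (this is the
    parallelizable part) and their Kaczmarz updates are averaged;
    the blocks themselves are visited sequentially."""
    x = np.zeros(A.shape[1])
    row_norms = np.einsum('ij,ij->i', A, A)
    for _ in range(n_sweeps):
        for block in blocks:
            resid = (b[block] - A[block] @ x) / row_norms[block]
            x = x + relax * (A[block].T @ resid) / len(block)
    return x

A = np.array([[2., 0., 1.], [0., 1., 1.], [1., 1., 0.], [1., 0., 2.]])
b = A @ np.array([1., 2., 3.])
print(np.round(block_iterative_kaczmarz(A, b, blocks=[[0, 1], [2, 3]]), 3))
```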


Journal ArticleDOI
TL;DR: Noncoherent optical-imaging systems are identified as potential applications for the maximum-likelihood image-restoration methods that are currently being studied for various modalities of nuclear-medicine imaging, and results of a computer simulation support the feasibility of this approach.
Abstract: Noncoherent optical-imaging systems are identified as potential applications for the maximum-likelihood image-restoration methods that are currently being studied for various modalities of nuclear-medicine imaging. An analogy between the quantum-photon measurements of such an optical system and those of a gamma camera allows for this new application. Results of a computer simulation are presented that support its feasibility. One important property revealed by this simulation is that the maximum-likelihood method demonstrates the ability to extrapolate the Fourier spectrum of a band-limited signal. This ability can be partially understood in that this algorithm, similar to some of the other spectral-extrapolation algorithms, constrains the solution to nonnegative values. This observation has implications for the potential of superresolution, the restoration of images from a defocused optical system, and three-dimensional imaging with a microscope.

129 citations
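
For Poisson-distributed photon counts with a known shift-invariant PSF, the maximum-likelihood iteration in question is commonly implemented as the EM update, known in optics as Richardson-Lucy; the sketch below is a generic version, not tied to the paper's simulation. The multiplicative form keeps the iterates nonnegative, which is the property the abstract links to spectral extrapolation.

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=50):
    """EM/ML iteration for Poisson noise with a known shift-invariant
    PSF (Richardson-Lucy).  `psf` must be zero-padded to the shape of
    `observed` and centred before use."""
    psf = psf / psf.sum()
    otf = np.fft.rfft2(np.fft.ifftshift(psf))
    conv = lambda im, tf: np.fft.irfft2(np.fft.rfft2(im) * tf,
                                        s=im.shape)
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = conv(estimate, otf)
        ratio = observed / np.maximum(blurred, 1e-12)
        # Correlation with the PSF is the adjoint of the blur.
        estimate *= conv(ratio, np.conj(otf))
    return estimate
```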


Journal ArticleDOI
TL;DR: An interpolation method is proposed for generating the intermediate contours between a start contour and a goal contour, which provides a powerful tool for reconstructing the 3D object from serial cross sections.
Abstract: An interpolation method is proposed for generating the intermediate contours between a start contour and a goal contour. Coupled with the display method for voxel-based objects, it provides a powerful tool for reconstructing the 3D object from serial cross sections. The method tries to fill in the lost information between two slices, assuming that there is a smooth change between them. This is a reasonable assumption provided that the sampling is at least twice the Nyquist rate, in which case the result of the interpolation is expected to be very close to reality. One of the major advantages of this approach is its ability to handle the branching problem. Another major advantage is that after each intermediate contour is generated and sent to the display device, there is no need to keep it in memory unless the solid model will be used for further processing. Thus, the space complexity of this algorithm is relatively low.

115 citations
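
A common way to realise this kind of intermediate-contour generation is shape-based interpolation on signed distance maps, sketched here with scipy as an illustration of the idea (not necessarily the authors' exact scheme). Branching is handled naturally, because the blended distance field may split into several components.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt as edt

def signed_distance(mask):
    """Positive inside the contour, negative outside."""
    return edt(mask) - edt(~mask)

def intermediate_contour(mask_a, mask_b, t):
    """Shape-based interpolation: blend the signed distance maps of
    the start (t=0) and goal (t=1) slices and re-threshold."""
    d = (1 - t) * signed_distance(mask_a) + t * signed_distance(mask_b)
    return d > 0

# Two offset rectangles; the t=0.5 slice lies between them.
a = np.zeros((64, 64), bool); a[20:44, 10:30] = True
b = np.zeros((64, 64), bool); b[20:44, 34:54] = True
mid = intermediate_contour(a, b, 0.5)
```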


Journal ArticleDOI
TL;DR: This paper looks at errors of reconstruction produced by non-ideal placement of the electrodes and shows that the reconstruction method is insensitive to such placement errors.
Abstract: Reconstruction of electrical impedance images using the filtered back projection method of Barber and Brown (in Information Processing in Medical Imaging, ed. S. L. Bacharach (Dordrecht: Martinus Nijhoff), pp. 106-21, 1986) makes several important assumptions about the object being imaged. These are principally that the object has a circular boundary, is two-dimensional and of impedance close to uniform, and has electrodes equally spaced on its boundary. In practice few of these assumptions are met, yet the method appears to give sensible and useful images. This paper looks at errors of reconstruction produced by nonideal placement of the electrodes and shows that the reconstruction method is insensitive to such placement errors.

109 citations


Journal ArticleDOI
TL;DR: It is shown that a special set of convex projections duplicates the result of the algebraic reconstruction technique (ART), and use of a priori information enhances the quality of the results, especially when partial data have been used, in which case ART fails.
Abstract: The method of convex projections is applied to reconstruct an image in computer tomography. This appears to be the first time that the method has been used to obtain geometry-free reconstruction from ray-sum data. It is shown that a special set of convex projections duplicates the result of the algebraic reconstruction technique (ART). The similarities and differences of these two methods are discussed. It is pointed out that use of a priori information enhances the quality of the results, especially when partial data have been used, in which case ART fails. Simulations and reconstruction of a CT image are also furnished to demonstrate the feasibility of this method.

105 citations
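
A minimal sketch of the connection: each ray-sum equation defines a hyperplane, and projecting onto these hyperplanes sequentially is exactly ART, while a-priori information such as nonnegativity is one more convex set to project onto. The matrix and relaxation values here are illustrative.

```python
import numpy as np

def art_pocs(A, b, n_sweeps=100, relax=0.5):
    """Sequential projection onto the hyperplane of each ray-sum
    equation (ART), followed by projection onto the convex set of
    nonnegative images; the a-priori constraint is what rescues the
    reconstruction when only partial data are available."""
    x = np.zeros(A.shape[1])
    row_norms = np.einsum('ij,ij->i', A, A)
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
        np.clip(x, 0, None, out=x)   # a-priori nonnegativity set
    return x
```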


Journal ArticleDOI
TL;DR: Two independent algorithms for coronal image alignment that have been successfully implemented in computer programs are reported, and a quantitative assessment of the registration of non-identical images is considered.

Journal ArticleDOI
TL;DR: A general reconstruction algorithm for magnetic resonance imaging (MRI) with gradients having arbitrary time dependence is presented and an explicit representation of the point spread function (PSF) in the weighted correlation method is given.
Abstract: A general reconstruction algorithm for magnetic resonance imaging (MRI) with gradients having arbitrary time dependence is presented. This method estimates spin density by calculating the weighted correlation of the observed free induction decay signal and the phase modulation function at each point. A theorem which states that this method can be derived from the conditions of linearity and shift invariance is presented. Since these conditions are general, most of the MRI reconstruction algorithms proposed so far are equivalent to the weighted correlation method. An explicit representation of the point spread function (PSF) in the weighted correlation method is given. By using this representation, a method to control the PSF and the static field inhomogeneity effects is studied. A correction method for the inhomogeneity is proposed, and a limitation is clarified. Some simulation results are presented.
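
A 1-D sketch of the weighted correlation method under simplifying assumptions (uniform weights, known gradient waveform; names are illustrative): the spin-density estimate at each point is the correlation of the FID with the conjugate of the phase modulation function. With a constant gradient this reduces to an inverse DFT, in line with the paper's equivalence theorem.

```python
import numpy as np

GAMMA = 2 * np.pi * 42.58e6      # 1H gyromagnetic ratio, rad s^-1 T^-1

def weighted_correlation(signal, times, grad, x, weights=None):
    """Spin-density estimate by correlating the FID with the conjugate
    phase-modulation function at each point x (1-D sketch, arbitrary
    gradient waveform)."""
    dt = times[1] - times[0]
    k = GAMMA * np.cumsum(grad) * dt            # k(t) in rad/m
    w = np.full(times.shape, dt) if weights is None else weights
    return np.exp(-1j * np.outer(x, k)) @ (w * signal)

# Toy demo: two point spins under a constant 5 mT/m gradient.
times = np.arange(256) * 10e-6
grad = np.full(times.shape, 5e-3)
k = GAMMA * np.cumsum(grad) * 10e-6
signal = np.exp(1j * np.outer(k, [-0.01, 0.02])).sum(axis=1)
x = np.linspace(-0.05, 0.05, 512)
rho = np.abs(weighted_correlation(signal, times, grad, x))  # two peaks
```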

Journal ArticleDOI
TL;DR: A robust parameter-estimation algorithm for a nonsymmetric half-plane (NSHP) autoregressive model, where the driving noise is a mixture of a Gaussian and an outlier process, and an algorithm to restore realistic images is presented.
Abstract: A robust parameter-estimation algorithm for a nonsymmetric half-plane (NSHP) autoregressive model, where the driving noise is a mixture of a Gaussian and an outlier process, is presented. The convergence of the estimation algorithm is proved. An algorithm to estimate parameters and original image intensity simultaneously from the impulse-noise-corrupted image, where the model governing the image is not available, is also presented. The robustness of the parameter estimates is demonstrated by simulation. Finally, an algorithm to restore realistic images is presented. The entire image generally does not obey a simple image model, but a small portion (e.g., 8×8) of the image is assumed to obey an NSHP model. The original image is divided into windows and the robust estimation algorithm is applied for each window. The restoration algorithm is tested by comparing it to traditional methods on several different images.

Journal ArticleDOI
TL;DR: In this paper, the maximum likelihood estimator (MLE) algorithm was applied to positron-emission tomography (PET) data obtained by the ECAT-III tomograph from a brain phantom, where the procedure for subtracting accidental coincidences from the data stream generated by this physical phantom is such that the resultant data are not Poisson distributed.
Abstract: In order to study properties of the maximum-likelihood-estimator (MLE) algorithm for image reconstruction in positron-emission tomography (PET), the algorithm is applied to data obtained by the ECAT-III tomograph from a brain phantom. The procedure for subtracting accidental coincidences from the data stream generated by this physical phantom is such that the resultant data are not Poisson distributed. This makes the present investigation different from other investigations based on computer-simulated phantoms. It is shown that the MLE algorithm is robust enough to yield comparatively good images, especially when the phantom is in the periphery of the field of view, even though the underlying assumption of the algorithm is violated. A stopping rule derived earlier and allowing the user to stop the iterative process before the images begin to deteriorate is tested. Since the rule is based on the Poisson assumption, it does not work well with the presently available data, although it is successful with computer-simulated Poisson data.

Journal ArticleDOI
TL;DR: It is demonstrated that the performance of the algorithm is superior to that of the filtered backprojection method in computational speed on realistic-size problems and is equivalent to filtered backprojection in accuracy of reconstruction.
Abstract: The notion of a linogram corresponds to the notion of a sinogram in the conventional representation of projection data for image reconstruction. In the sinogram, points which correspond to rays through a fixed point in the cross section to be reconstructed all fall on a sinusoidal curve. In the linogram, however, these points fall on a straight line. The implementation of a novel image reconstruction method using this property is discussed. The implementation is of order N² log N, where N is proportional to the number of pixels on a side of the reconstruction region. It is demonstrated that the performance of the algorithm is superior to that of the filtered backprojection method in computational speed on realistic-size problems and is equivalent to filtered backprojection in accuracy of reconstruction.

Journal ArticleDOI
TL;DR: An efficient knowledge-based multigrid reconstruction algorithm based on the ML approach is presented to overcome problems of the slow convergence rate, the large computation time, and the nonuniform correction efficiency of each iteration.
Abstract: The problem of reconstruction in positron emission tomography (PET) is basically estimating the number of photon pairs emitted from the source. Using the concept of the maximum-likelihood (ML) algorithm, the problem of reconstruction is reduced to determining an estimate of the emitter density that maximizes the probability of observing the actual detector count data over all possible emitter density distributions. A solution using this type of expectation maximization (EM) algorithm with a fixed grid size is severely handicapped by the slow convergence rate, the large computation time, and the nonuniform correction efficiency of each iteration, which makes the algorithm very sensitive to the image pattern. An efficient knowledge-based multigrid reconstruction algorithm based on the ML approach is presented to overcome these problems.
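
A minimal single-grid ML-EM sketch (the paper's knowledge-based multigrid acceleration is not reproduced); here A is a dense stand-in for the detection-probability matrix. The multigrid variant would run this on a coarse grid first and interpolate the result as the starting estimate on the next finer grid.

```python
import numpy as np

def mlem(A, counts, n_iter=50):
    """Maximum-likelihood EM for emission tomography.

    A[i, j] = probability that an emission in voxel j is detected in
    tube i; `counts` are the detector data.  Each iteration multiplies
    the current emitter-density estimate by the back-projected ratio
    of measured to predicted counts, normalised by the sensitivity."""
    sens = A.sum(axis=0)                      # detection sensitivity
    x = np.ones(A.shape[1])
    for _ in range(n_iter):
        proj = A @ x
        ratio = counts / np.maximum(proj, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```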

Journal ArticleDOI
L. Wang1, M. Goldberg1
TL;DR: A progressive image transmission scheme based on iterative transform coding structure is proposed for application in interactive image communication over low-bandwidth channels and it is shown that the average reconstruction error variance converges to zero as the number of iterative stages approaches infinity.
Abstract: A progressive image transmission scheme based on iterative transform coding structure is proposed for application in interactive image communication over low-bandwidth channels. The scheme not only provides progressive transmission, but also guarantees lossless reproduction combined with a degree of compression. The image to be transmitted undergoes an orthogonal transform, and the transform coefficients are quantized (scalar or vector) before transmission. The novelty is that the residual error array due to quantization is iteratively fed back and requantized (scalar or vector); the coded residual error information is progressively transmitted and utilized in reconstructing the successive approximations. It is shown that the average reconstruction error variance converges to zero as the number of iterative stages approaches infinity. In practice, lossless reproduction can be achieved with a small number of iterations by using an entropy coder on the final residual-error image. Computer simulation results demonstrate the effectiveness of the technique.
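
A toy version of the iterative requantisation loop, with an orthonormal DCT (via scipy) and a uniform scalar quantiser standing in for the paper's scalar/vector quantisers; the per-stage quantiser step is an illustrative choice.

```python
import numpy as np
from scipy.fft import dctn, idctn

def progressive_stages(image, step=16.0, n_stages=4):
    """Iterative transform coding: quantise the transform
    coefficients, then repeatedly requantise the residual error and
    refine the running reconstruction; the error variance shrinks at
    every stage."""
    coeffs = dctn(image, norm='ortho')        # orthogonal transform
    recon = np.zeros_like(coeffs)
    residual = coeffs
    stages = []
    for _ in range(n_stages):
        q = step * np.round(residual / step)  # transmit these indices
        recon += q
        residual = coeffs - recon             # fed back, requantised
        stages.append(idctn(recon, norm='ortho'))
        step /= 4                             # finer quantiser each stage
    return stages                             # successive approximations

img = np.outer(np.hanning(32), np.hanning(32)) * 255
approx = progressive_stages(img)
print([float(np.var(img - a)) for a in approx])   # decreasing variance
```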

Journal ArticleDOI
TL;DR: Evidence is presented that useful reconstructions can be obtained with only one or two extra tilts from highly disordered specimens, even if the objects are asymmetric.

Journal ArticleDOI
TL;DR: An algorithm is proposed that can automatically construct 3D solid objects from 2D orthographic views; the objects may be polyhedra, cylinders, partial cylinders, and their composites.

Journal ArticleDOI
TL;DR: In this article, three practical methods for digital triple-correlation processing have been developed, and each of these methods has successfully reconstructed images from computer-simulated speckle interferograms.
Abstract: The triple-correlation technique of speckle imaging has been investigated in the low-light-level regime. Three practical methods for digital triple-correlation processing have been developed. Each of these methods has successfully reconstructed images from computer-simulated speckle interferograms. In this paper we describe and compare the three algorithms.

Proceedings ArticleDOI
11 Apr 1988
TL;DR: Variable-rate encoding increases the compression efficiency by allocating greater resolution to high-detail regions and provides a perceptually more consistent image quality throughout all areas of the decoded image.
Abstract: A hierarchical approach to encoding of images using vector quantization (VQ) is described which allows VQ of large blocks with a tolerable computational complexity and permits progressive image reconstruction at bit rates as low as 0.3 bit/pixel. The technique used, multi-stage hierarchical VQ (MSHVQ), is a successive approximation approach for quantization that is based on recursive decomposition of larger image blocks into smaller subblocks. The encoder of MSHVQ consists of several stages, each of which encodes the quantization error generated from the previous stage. Digital decimation and interpolation techniques are used to convert between different vector dimensions in the higher level stages and to reduce the otherwise highly noticeable blocking effect in the reconstructed image. Variable-rate encoding increases the compression efficiency by allocating greater resolution to high-detail regions and provides a perceptually more consistent image quality throughout all areas of the decoded image. Experimental results show that MSHVQ yields a significantly improved image quality over conventional VQ of equivalent complexity.

Journal ArticleDOI
TL;DR: The author addresses the problem of reconstructing a bandlimited signal from a finite number of its unevenly spaced sampled data with a Fourier analysis and develops an interpolation scheme from the available data.
Abstract: The author addresses the problem of reconstructing a bandlimited signal from a finite number of its unevenly spaced sampled data. A Fourier analysis of the available unevenly spaced sampled data is presented. The result is utilized to develop an interpolation scheme from the available data. Conditions for accurate reconstruction are examined. Algorithms to implement the reconstruction scheme are discussed. The method's application to one-dimensional and two-dimensional reconstruction problems is shown.
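
One concrete reconstruction scheme in this spirit (a sketch, not necessarily the author's interpolation formula): fit a truncated Fourier basis of the assumed bandwidth to the uneven samples by least squares, then evaluate it on a regular grid. Accurate recovery needs roughly as many samples as retained coefficients.

```python
import numpy as np

def reconstruct_bandlimited(t, samples, bandwidth, t_out):
    """Least-squares reconstruction of a bandlimited signal from
    unevenly spaced samples, using a truncated Fourier basis with the
    given (one-sided) bandwidth in harmonics of the window length."""
    T = t_out.max() - t_out.min()
    n_max = int(np.floor(bandwidth * T))      # highest retained harmonic
    freqs = np.arange(-n_max, n_max + 1) / T
    basis = lambda tt: np.exp(2j * np.pi * np.outer(tt, freqs))
    coeffs, *_ = np.linalg.lstsq(basis(t), samples, rcond=None)
    return (basis(t_out) @ coeffs).real

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 1, 40))            # uneven sample times
f = lambda tt: np.sin(2 * np.pi * 3 * tt) + 0.5 * np.cos(2 * np.pi * 5 * tt)
t_out = np.linspace(0, 1, 200)
x_hat = reconstruct_bandlimited(t, f(t), bandwidth=6, t_out=t_out)
print(np.max(np.abs(x_hat - f(t_out))))       # small reconstruction error
```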

Journal ArticleDOI
TL;DR: In this paper, a new method for full-field automatic 3D surface reconstruction is proposed which makes use of multiple contourograms shifted in phase by object translation and is demonstrated for shadow moiré topography.
Abstract: A new method for full-field automatic 3-D surface reconstruction is proposed which makes use of multiple contourograms shifted in phase by object translation. The method is demonstrated for shadow moiré topography. It is shown that surface reconstruction can be done fast and with a resolution at least 10 times higher than the fringe distance of the measuring setup. Convexity and concavity of the surface are automatically determined. Also shown is the possibility of measuring irregular surfaces with very sudden height jumps.
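
The phase-extraction step can be sketched as standard N-step phase shifting, with the object translation supplying shifts of 2π/N between contourograms (an illustration; the paper's processing details may differ). The signed, wrapped phase is what separates convexity from concavity; unwrapping then gives height.

```python
import numpy as np

def phase_from_shifted_contourograms(frames):
    """Recover the fringe phase (proportional to surface height in
    shadow moiré) from N contourograms whose phase is shifted by
    2*pi/N between exposures, e.g. by translating the object."""
    frames = np.asarray(frames, dtype=float)
    n = len(frames)
    shifts = 2 * np.pi * np.arange(n) / n
    num = np.tensordot(np.sin(shifts), frames, axes=1)
    den = np.tensordot(np.cos(shifts), frames, axes=1)
    return np.arctan2(-num, den)   # wrapped phase; unwrap for height

# Synthetic check: a tilted surface, four pi/2-shifted contourograms.
yy, xx = np.mgrid[0:64, 0:64]
phi = 0.3 * xx
frames = [1 + 0.8 * np.cos(phi + 2 * np.pi * k / 4) for k in range(4)]
est = phase_from_shifted_contourograms(frames)   # phi wrapped to (-pi, pi]
```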

Journal ArticleDOI
TL;DR: A first generation automated 3-d particle tracking velocimeter was constructed and an example of the application of the technique to the gas-phase flow through a transparent model of an engine throttle-body assembly is presented.
Abstract: A first generation automated 3-d particle tracking velocimeter was constructed. It measures the velocity (all 3 components simultaneously) of the fluid by imaging the light scattered from nylon particles, which move with the flow. The images are captured into two SIT cameras at video rates. The cameras are at two orthogonal positions, and the computational algorithms necessary to reconstruct the 3-d velocity information from these two 2-d projections are discussed. An example of the application of the technique to the gas-phase flow (≅ 6.5 m/s) through a transparent model of an engine throttle-body assembly (3 × 3 × 3 cm test section) is presented.

Journal ArticleDOI
TL;DR: One algorithm based on the regularised Newton's method of Levenberg and Marquardt is described, and a second modified version of this algorithm which uses optimal current drive patterns is shown to give superior reconstruction in a simulation study.
Abstract: In electrical impedance tomography the reconstruction problem is a non-linear inverse problem and can only be solved by iterative methods. This paper describes two such algorithms, one based on the regularised Newton's method of Levenberg and Marquardt, and a second, modified version of this algorithm which uses optimal current drive patterns. The second algorithm is shown to give superior reconstruction in a simulation study. Some effects of errors in the knowledge of boundary shape and electrode position are also discussed.
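
A generic sketch of the regularised (Levenberg-Marquardt) Newton iteration for a nonlinear inverse problem; the two-parameter toy forward model below merely stands in for an EIT boundary-voltage solver.

```python
import numpy as np

def levenberg_marquardt(forward, jacobian, sigma0, v_meas,
                        n_iter=10, lam=1e-2):
    """Regularised Newton iteration of Levenberg-Marquardt type:
    `sigma` holds the conductivity parameters, `forward` predicts
    boundary voltages, `jacobian` returns d(voltage)/d(sigma).
    The damping term lam*I regularises the normal equations."""
    sigma = sigma0.copy()
    for _ in range(n_iter):
        r = v_meas - forward(sigma)
        J = jacobian(sigma)
        H = J.T @ J + lam * np.eye(sigma.size)
        sigma += np.linalg.solve(H, J.T @ r)
    return sigma

# Toy nonlinear problem standing in for an EIT forward model.
forward = lambda s: np.array([s[0]**2 + s[1], s[0] * s[1], s[0] + s[1]])
jacobian = lambda s: np.array([[2 * s[0], 1.0],
                               [s[1], s[0]],
                               [1.0, 1.0]])
v_meas = forward(np.array([1.5, 2.0]))
print(levenberg_marquardt(forward, jacobian, np.array([1.0, 1.0]), v_meas))
# -> approximately [1.5, 2.0]
```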

Journal ArticleDOI
TL;DR: The authors devised a way to generate in real time a cross-sectional image of an object with uniformly high resolution based on the synthetic aperture focusing technique (SAFT) and its expected performance was demonstrated.
Abstract: The authors devised a way to generate in real time a cross-sectional image of an object with uniformly high resolution based on the synthetic aperture focusing technique (SAFT). A computer simulation was conducted to study the effects of essential parameters on the resulting images. An imaging system was built that produces a cross-sectional image composed of an assembly of line images of depth direction, i.e. processed A-scan images, and displays a scroll picture on a CRT (cathode ray tube) with no interruption regardless of the object size. It takes only 3 ms from the start of transmission of the ultrasonic wave to the completion of a line image reconstruction, and the framed image on a CRT is updated at the TV rate of 1/30 s. Imaging experiments were conducted using the system, and its expected performance was demonstrated.
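
A delay-and-sum sketch of how one SAFT line image of the depth direction can be formed from stored A-scans (the sound speed, here that of steel, and the sampling rate are illustrative; the paper's real-time hardware pipeline is not modelled):

```python
import numpy as np

def saft_line_image(ascans, positions, depths, x_focus,
                    c=5900.0, fs=100e6):
    """Synthetic-aperture focusing: one focused line image of depth
    direction, built by delay-and-sum of the A-scans recorded at the
    given transducer positions (pulse-echo, sound speed c in m/s,
    sampling rate fs in Hz)."""
    line = np.zeros(depths.size)
    for a_scan, xp in zip(ascans, positions):
        # Two-way travel time from the element to each focal depth
        dist = np.hypot(depths, x_focus - xp)
        idx = np.round(2 * dist / c * fs).astype(int)
        valid = idx < a_scan.size
        line[valid] += a_scan[idx[valid]]
    return line / len(ascans)
```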

Dissertation
14 Sep 1988
TL;DR: In this thesis, large-scale upper mantle delay-time tomography is studied using two row-action methods: the conjugate gradient method (LSQR) and the Simultaneous Iterative Reconstruction Technique (SIRT).
Abstract: More than a decade ago the method of seismic delay time tomography was introduced in geophysics by Aki et al. (1974, 1977). In the 1977 paper the inverse problem is formulated of retrieving the three-dimensional seismic velocity structure of the Earth's interior from a finite number of data (delay times). Originally the method aimed at imaging the seismic structure of the lithosphere using a simple plane wave approximation for the incident seismic waves which illuminated a three-dimensional block model of the lithosphere. Concurrently, papers dealing with mantle tomography on a global scale were presented by Sengupta and Toksoz (1976) and Dziewonski et al. (1977), also using a block division of the Earth's interior. In the early days of seismic tomography the computer hardware and inversion algorithms limited the applications of delay time tomography to using only a few hundred velocity model parameters. Elegant inversion algorithms like the Singular Value Decomposition method were used which also allowed formal computation of the errors and of the spatial resolution in the tomographic mapping. However, square matrices of the size of the number of unknowns needed to be inverted and, at that time, the relatively small computer memories available imposed a severe restriction on the inversions. This changed in the middle of the 1980s when Clayton and Comer (1984) were the first to apply an iterative row-action method in large scale lower mantle tomography. Row-action methods do not perform a full matrix inversion inside the computer's memory. Instead, one matrix row at a time is processed. Since the matrix describing the tomographic problem can be stored on disk, the great advantage is that many more model parameters, O(10^5), can be used, which allows a detailed parameterization of relatively large volumes of the Earth. In the young history of seismic delay time tomography many geophysicists have been impressed by the beauty of the method, but have been much less convinced of the reliability of the results. First, the quality of the delay time data has never been thoroughly studied. Secondly, the emphasis has been on imaging the Earth's interior rather than on studying the reliability of the tomographic mapping, which is indeed difficult to assess. The subject of this thesis is large scale upper mantle delay time tomography where we use two different row-action methods to solve the inversion problem. The algorithms are the conjugate gradient method called LSQR of Paige and Saunders (1982) and a member of the family of Simultaneous Iterative Reconstruction Techniques (SIRT). The research is discussed in the framework of an application to the upper mantle beneath central and south-eastern Europe, the Mediterranean region and the Middle East. When this research started little was known about the characteristics of row-action methods in delay time tomography. Nolet (1985) showed in a synthetic experiment that the LSQR method provided a much faster convergence rate to the least squares solution than SIRT, but the behaviour of these algorithms in a real tomographic problem needed to be scrutinized more fully. Another interesting problem to investigate was whether the enormous amount of ISC (International Seismological Centre) P delay time data for seismic waves which bottom in the upper mantle were useful for tomographic inversion. In the upper mantle the ray geometry is more complex than in the lower mantle. This poses additional problems in delay time tomography of the upper mantle.
Moreover, most of the 'upper mantle data' are derived from low magnitude events which in many cases can only be located with poor accuracy. Consequently, delay times belonging to low magnitude earthquakes presumably contain large errors. Hitherto in most tomographic studies only delay time data from well determined large magnitude events were used and only observations from stations at distances larger than 30° were admitted. In this research the situation is just complementary.
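
A compact SIRT sketch for the fully discretized delay-time system A m = d, with A[i, j] the length of ray i in cell j and m the slowness perturbations (the normalisations are one common choice); scipy.sparse.linalg.lsqr(A, d) would give the Paige-Saunders LSQR solution for comparison.

```python
import numpy as np

def sirt(A, d, n_iter=100, relax=1.0):
    """Simultaneous Iterative Reconstruction Technique for A m = d.

    All rows contribute before the model is updated, so one iteration
    is a damped back projection of the residuals, normalised by the
    row sums (ray lengths) and column sums (weighted hit counts)."""
    m = np.zeros(A.shape[1])
    row_sum = np.maximum(A.sum(axis=1), 1e-12)
    col_sum = np.maximum(A.sum(axis=0), 1e-12)
    for _ in range(n_iter):
        resid = (d - A @ m) / row_sum
        m += relax * (A.T @ resid) / col_sum
    return m
```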

Journal ArticleDOI
TL;DR: A digital electronic architecture for parallel processing of the expectation maximization (EM) algorithm for positron-emission-tomography (PET) image reconstruction is proposed and EM images are shown that are produced with software simulating the proposed hardware reconstruction algorithm.
Abstract: A digital electronic architecture for parallel processing of the expectation maximization (EM) algorithm for positron-emission-tomography (PET) image reconstruction is proposed. Rapid (0.2-s) EM iterations on high-resolution (256×256) images are supported. Arrays of two VLSI chips perform forward and back projection calculations. The architecture is described, including data flow and partitioning relevant to EM and parallel processing. EM images are shown that are produced with software simulating the proposed hardware reconstruction algorithm. Projected cost of the system is estimated to be small in comparison to the cost of current PET scanners.

Journal ArticleDOI
TL;DR: Phase retrieval from experimental (laboratory) data has been successfully demonstrated and the reconstructed image compares favorably with a conventional image with the same spatial-frequency bandwidth.
Abstract: Phase retrieval from experimental (laboratory) data has been successfully demonstrated. A diffuse object was coherently illuminated and Fourier intensity data were collected by a charge-coupled device detector and a video digitizer. By using the data and an a priori triangular image support constraint, an iterative Fourier-transform algorithm was used to estimate the phase of the Fourier transform of the object. The reconstructed image compares favorably with a conventional image with the same spatial-frequency bandwidth.
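
The simplest member of the iterative Fourier-transform family is the error-reduction algorithm sketched below, which alternates between enforcing the measured Fourier magnitudes and an image-domain support-and-positivity constraint. The support here is a generic boolean mask rather than the paper's triangular one, and Fienup's hybrid input-output variant is usually more robust in practice.

```python
import numpy as np

def error_reduction(fourier_mag, support, n_iter=200, seed=0):
    """Iterative Fourier-transform (error-reduction) phase retrieval:
    keep the measured Fourier magnitudes, then re-impose the image-
    domain support and nonnegativity constraints."""
    rng = np.random.default_rng(seed)
    g = rng.random(fourier_mag.shape) * support
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        G = fourier_mag * np.exp(1j * np.angle(G))   # keep measured |F|
        g = np.fft.ifft2(G).real
        g = np.where(support & (g > 0), g, 0.0)      # support + positivity
    return g
```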

Journal ArticleDOI
TL;DR: In this paper, optical tomography is applied to the speckle photographic measurement of an asymmetric flow field with variable fluid density, and the convolution back projection algorithm is used for obtaining the 3D density distribution.
Abstract: Optical tomography is applied to the speckle photographic measurement of an asymmetric flow field with variable fluid density. The convolution back projection algorithm is used for obtaining the 3-D density distribution. Noise in the experimental data is reduced by spline smoothing. The method is verified with a steady, laminar, axisymmetric helium jet exhausting vertically into the ambient air, and then applied to a non-axisymmetric helium jet for determining the helium concentration. It is found that speckle photographic recordings are very adequate for tomographic reconstruction, because they provide a high number of data points from each projection. The influence of the limited number of projections on the reconstruction quality is particularly investigated.
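
A compact parallel-beam convolution backprojection sketch (Ram-Lak filter applied via the FFT, linear interpolation onto the image grid; the scaling constant is illustrative):

```python
import numpy as np

def convolution_backprojection(sinogram, angles_deg):
    """Convolution (filtered) back projection for parallel-beam data:
    each projection is convolved with a ramp (Ram-Lak) filter via the
    FFT and then smeared back across the image along its view angle."""
    n_det, n_views = sinogram.shape
    freqs = np.fft.fftfreq(2 * n_det)
    ramp = np.abs(freqs)                               # Ram-Lak filter
    pad = np.zeros((2 * n_det, n_views))
    pad[:n_det] = sinogram
    filtered = np.fft.ifft(np.fft.fft(pad, axis=0) *
                           ramp[:, None], axis=0).real[:n_det]
    grid = np.arange(n_det) - n_det / 2
    xx, yy = np.meshgrid(grid, grid)
    image = np.zeros((n_det, n_det))
    for k, theta in enumerate(np.deg2rad(angles_deg)):
        s = xx * np.cos(theta) + yy * np.sin(theta) + n_det / 2
        image += np.interp(s, np.arange(n_det), filtered[:, k],
                           left=0.0, right=0.0)
    return image * np.pi / len(angles_deg)
```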