
Showing papers in "IEEE Transactions on Medical Imaging in 1991"


Journal Article•DOI•
TL;DR: The authors compare the artifact introduced into the image for various convolving functions of different sizes, including the Kaiser-Bessel window and the zero-order prolate spheroidal wave function (PSWF).
Abstract: In the technique known as gridding, the data samples are weighted for sampling density and convolved with a finite kernel, then resampled on a grid preparatory to a fast Fourier transform. The authors compare the artifact introduced into the image for various convolving functions of different sizes, including the Kaiser-Bessel window and the zero-order prolate spheroidal wave function (PSWF). They also show a convolving function that improves upon the PSWF in some circumstances.
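As a concrete illustration of the pipeline this abstract describes (density weighting, convolution with a finite kernel, resampling onto a grid, FFT), here is a minimal 1-D gridding sketch. The Kaiser-Bessel width and beta and the toy sample values are illustrative choices, not parameters from the paper:

```python
import numpy as np

def kaiser_bessel(u, width, beta):
    """Kaiser-Bessel gridding kernel, nonzero for offsets |u| <= width/2."""
    u = np.asarray(u, dtype=float)
    inside = np.abs(u) <= width / 2
    arg = np.sqrt(np.clip(1.0 - (2.0 * u / width) ** 2, 0.0, None))
    return np.where(inside, np.i0(beta * arg), 0.0) / np.i0(beta)

def grid_1d(coords, data, weights, n=64, width=4, beta=8.0):
    """Convolve density-weighted non-uniform samples onto a uniform grid."""
    grid = np.zeros(n, dtype=complex)
    for k, d, w in zip(coords, data, weights):
        centre = (k + 0.5) * n                 # map k in [-0.5, 0.5) to grid units
        lo = int(np.ceil(centre - width / 2))
        hi = int(np.floor(centre + width / 2))
        for g in range(lo, hi + 1):            # spread sample over nearby grid points
            grid[g % n] += w * d * kaiser_bessel(g - centre, width, beta)
    return grid

# toy use: three non-uniform k-space samples resampled onto a 64-point grid
coords = np.array([-0.31, 0.02, 0.27])
data = np.array([1.0 + 0j, 2.0 + 0j, 0.5 + 0j])
weights = np.ones(3)                           # density-compensation weights
g = grid_1d(coords, data, weights)
image = np.fft.fftshift(np.fft.ifft(np.fft.ifftshift(g)))
```

A full implementation would also divide the image by the transform of the kernel (roll-off correction), which is omitted here for brevity.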

1,187 citations


Journal Article•DOI•
TL;DR: An overview of the Shinnar-Le Roux (SLR) algorithm is presented and it is shown how the performance of SLR pulses can be very accurately specified analytically, and how to design a pulse that produces a specified slice profile.
Abstract: An overview of the Shinnar-Le Roux (SLR) algorithm is presented. It is shown how the performance of SLR pulses can be very accurately specified analytically. This reveals how to design a pulse that produces a specified slice profile and allows the pulse designer to trade off analytically the parameters describing the pulse performance. Several examples are presented to illustrate the more important tradeoffs. These include linear-phase and minimum- and maximum-phase pulses. Linear-phase pulses can be refocused with a gradient reversal and can be used as spin-echo pulses. Minimum- and maximum-phase pulses have better slice profiles, but cannot be completely refocused.

804 citations


Journal Article•DOI•
TL;DR: For three different activity distributions in cylinder phantoms, simulation tests gave good agreement between the activity distributions reconstructed from unscattered photons and those from the corrected data.
Abstract: A new method is proposed to subtract the count of scattered photons from that acquired with a photopeak window at each pixel in each planar image of single-photon emission computed tomography (SPECT). The subtraction is carried out using two sets of data: one set is acquired with a main window centered at photopeak energy and the other is acquired with two subwindows on both sides of the main window. The scattered photons included in the main window are estimated from the counts acquired with the subwindows and then they are subtracted from the count acquired with the main window. Since the subtraction is performed at each pixel in each planar image, the proposed method has the potential to be more precise than conventional methods. For three different activity distributions in cylinder phantoms, simulation tests gave good agreement between the activity distributions reconstructed from unscattered photons and those from the corrected data.
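The per-pixel subtraction amounts to a trapezoidal estimate of the scatter under the photopeak from the two subwindow count densities. A minimal sketch follows; the window widths and count values are hypothetical, not the paper's acquisition settings:

```python
def tew_primary(c_main, c_lower, c_upper, w_main=20.0, w_sub=2.0):
    """Estimate unscattered (primary) counts in the photopeak window.

    The scatter under the photopeak is approximated by the area of a
    trapezoid whose heights are the count densities (counts per keV) in
    the two narrow subwindows flanking the main window.
    """
    scatter = (c_lower / w_sub + c_upper / w_sub) * w_main / 2.0
    return max(c_main - scatter, 0.0)          # clamp: counts cannot be negative

# e.g. 1000 counts in a 20 keV photopeak window, 15 and 5 counts in 2 keV subwindows
primary = tew_primary(1000, 15, 5)
```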

577 citations


Journal Article•DOI•
TL;DR: It is suggested that synchronous detection or demodulation can be used in MRI systems in place of magnitude detection to provide complete suppression of undesired quadrature components, to preserve polarity and phase information, and to eliminate the biases and reduction in signal-to-noise ratio (SNR) and contrast in low SNR images.
Abstract: Magnitude detection of complex images in magnetic resonance imaging (MRI) is immune to the effects of incidental phase variations, although in some applications information is lost or images are degraded. It is suggested that synchronous detection or demodulation can be used in MRI systems in place of magnitude detection to provide complete suppression of undesired quadrature components, to preserve polarity and phase information, and to eliminate the biases and reduction in signal-to-noise ratio (SNR) and contrast in low SNR images. The incidental phase variations in an image are removed through the use of a homodyne demodulation reference, which is derived from the image or the object itself. Synchronous homodyne detection has been applied to the detection of low SNR images, the reconstruction of partial k-space images, the simultaneous detection of water and lipid signals in quadrature, and the preservation of polarity in inversion-recovery images.
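A minimal 1-D sketch of homodyne detection, assuming the incidental phase is smooth enough to be estimated from a few central k-space lines. The object, the phase profile, the noise level, and the filter width are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128
x = np.linspace(-1, 1, n)
magnitude = np.exp(-x**2 / 0.1)                 # object (may carry sign in IR imaging)
phase = 0.5 * np.pi * x                         # slowly varying incidental phase
img = magnitude * np.exp(1j * phase) \
      + 0.01 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# homodyne reference: phase of a heavily low-pass-filtered copy of the image
k = np.fft.fftshift(np.fft.fft(img))
lowpass = np.zeros_like(k)
c = n // 2
lowpass[c - 8:c + 8] = k[c - 8:c + 8]           # keep only the central k-space lines
ref = np.fft.ifft(np.fft.ifftshift(lowpass))

demodulated = img * np.exp(-1j * np.angle(ref))
recovered = demodulated.real                    # polarity-preserving, noise stays additive
```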

574 citations


Journal Article•DOI•
TL;DR: It is concluded that the principal source of systematic error was the finite slice thickness, which causes blurring of boundaries.
Abstract: A noninvasive tissue current measurement technique and its use in measuring a nonuniform current density are described. This current density image is created by measuring the magnetic field arising from these currents and taking its curl. These magnetic fields are proportional to the phase component of a complex magnetic resonance image. Measurements of all three components of a quasistatic nonuniform current density in a phantom are described. Expected current density calculations from a numerical solution for the magnetic field which was created by the phantom are presented for comparison. The results of a numerical simulation of the experiment, which used this field solution and which included the effects of slice selection and sampling, are also presented. The experimental and simulated results are quantitatively compared. It is concluded that the principal source of systematic error was the finite slice thickness, which causes blurring of boundaries.

324 citations


Journal Article•DOI•
TL;DR: The authors present a method for compensating for the gray-level variation of MR images between different slices, which is primarily caused by the inhomogeneity of the RF field produced by the imaging coil.
Abstract: A single volume element (voxel) in a medical image may be composed of a mixture of multiple tissue types. The authors call voxels which contain multiple tissue classes mixels. A statistical mixel image model based on Markov random field (MRF) theory and an algorithm for the classification of mixels are presented. The authors concentrate on the classification of multichannel magnetic resonance (MR) images of the brain although the algorithm has other applications. The authors also present a method for compensating for the gray-level variation of MR images between different slices, which is primarily caused by the inhomogeneity of the RF field produced by the imaging coil.

318 citations


Journal Article•DOI•
TL;DR: A reconstruction and homogeneity correction method to correct for the zeroth order effects of inhomogeneity using prior knowledge of the inhomogeneity is introduced.
Abstract: When time-varying gradients are used for imaging, the off-resonance behavior does not just cause geometric distortion as is the case with spin-warp imaging, but changes the shape of the impulse response and causes blurring. This effect is well known for projection reconstruction and spiral k-space scanning sequences. The authors introduce a reconstruction and homogeneity correction method to correct for the zeroth order effects of inhomogeneity using prior knowledge of the inhomogeneity. In this method, the data are segmented according to collection time, reconstructed using some fast, linear algorithm, corrected for inhomogeneity, and then superimposed to yield a homogeneity corrected image. This segmented method is compared to a conjugate phase reconstruction in terms of degree of correction and execution time. The authors apply this method to in vivo images using projection-reconstruction and spiral-scan sequences.

289 citations


Journal Article•DOI•
TL;DR: A method of computing the three-dimensional (3-D) velocity field from 3-D cine computed tomography (CT) images of a beating heart using the Euler-Lagrange method is proposed, and results are presented for both simulated and clinical images.
Abstract: A method of computing the three-dimensional (3-D) velocity field from 3-D cine computed tomography (CT) images of a beating heart is proposed. Using continuum theory, the authors develop two constraints on the 3-D velocity field generated by a beating heart. With these constraints, the computation of the 3-D velocity field is formulated as an optimization problem and a solution to the optimization problem is developed using the Euler-Lagrange method. The solution is then discretized for computer implementation. The authors present the results for both simulated images and clinical cine CT images of a beating heart.

208 citations


Journal Article•DOI•
TL;DR: A computer vision technique for the acquisition and processing of 3-D images of the profile of wax dental imprints in the automation of diagnosis in orthodontics and results show that the two operators are very effective at detecting the interstices.
Abstract: The authors present a computer vision technique for the acquisition and processing of 3-D images of the profile of wax dental imprints in the automation of diagnosis in orthodontics. The acquisition of the 3-D images is based on the absorption of light by a dispersive medium and uses standard CCD (charge coupled device) cameras. The profiles of both sides of the imprint are acquired simultaneously. The 3-D image of each side of the imprint is segmented by nonlinear filtering of the 3-D data, and the interstices between the teeth are detected. Two operators are presented: one for the detection of the interstices between the teeth for incisors, canines, and premolars, and one for those between molars. A method for deciding the optimal neighborhood of application of each operator is also presented. Experimental results show that the two operators are very effective at detecting the interstices.

196 citations


Journal Article•DOI•
TL;DR: In this article, a 3-D reconstruction algorithm was developed to reconstruct data from a 16-ring PET camera (a Siemens/CTI 953B) with automatically retractable septa.
Abstract: A fully 3-D reconstruction algorithm has been developed to reconstruct data from a 16-ring PET camera (a Siemens/CTI 953B) with automatically retractable septa. The tomograph is able to acquire coincidences between any pair of detector rings and septa retraction increases the total system count rate by a factor of 7.8 (including scatter) and 4.7 (scatter subtracted) for a uniform, 20 cm diameter cylinder. The reconstruction algorithm is based on 3-D filtered backprojection, expressed in a form suitable for the multi-angle sinogram data. Sinograms which are not measured due to the truncated cylindrical geometry of the tomograph, but which are required for a spatially invariant response function, are obtained by forward projection. After filtering, the complete set of sinograms is backprojected into a 3-D volume of 128×128×31 voxels using a voxel-driven procedure. The algorithm has been validated with simulation, and tested with both phantom and clinical data from the 953B.

189 citations


Journal Article•DOI•
TL;DR: A novel method, called GSLIM (generalized spectral localization by imaging), is proposed to make possible the marriage of high-resolution proton imaging with spectroscopic imaging and localization and may achieve an optimal combination of sensitivity, quantitative accuracy, speed, and flexibility for in vivo spectroscopy.
Abstract: The problem of precise spatial localization of spectral information in magnetic resonance (MR) spectroscopic imaging is addressed. A novel method, called GSLIM (generalized spectral localization by imaging), is proposed to make possible the marriage of high-resolution proton imaging with spectroscopic imaging and localization. This method improves on the conventional Fourier series inversion method used in chemical shift imaging (CSI) and the compartmental modeling method used in SLIM by using a generalized series framework for optimal representation of the spectral function. In this way, a priori information extracted from proton imaging can be used, as in SLIM, and the robustness and data consistency of CSI are also retained. Simulation results show that GSLIM can significantly reduce spectral leakage in CSI and inhomogeneity errors in SLIM. It can also reveal compartmental inhomogeneities, and can easily be extended to handle other a priori constraints when necessary. This approach, with some further development, may achieve an optimal combination of sensitivity, quantitative accuracy, speed, and flexibility for in vivo spectroscopy.

Journal Article•DOI•
TL;DR: Reconstruction procedures that account for attenuation in forming maximum-likelihood estimates of activity distributions in positron-emission tomography are extended to include regularization constraints and accidental coincidences.
Abstract: Reconstruction procedures that account for attenuation in forming maximum-likelihood estimates of activity distributions in positron-emission tomography are extended to include regularization constraints and accidental coincidences. A mathematical model is used for these effects. The corrections are incorporated into the iterations of an expectation-maximization algorithm for numerically producing the maximum-likelihood estimate of the distribution of radioactivity within a patient. The images reconstructed with this procedure are unbiased and exhibit lower variance than those reconstructed from precorrected data.
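A toy sketch of an EM update that keeps accidental coincidences in the forward model rather than pre-subtracting them from the data, in the spirit of the abstract. The system matrix, randoms level, and iteration count are illustrative, not the authors' model:

```python
import numpy as np

def em_with_randoms(A, y, r, n_iter=2000):
    """ML-EM for emission data y ~ Poisson(A @ lam + r).

    r holds the mean accidental-coincidence rates; including it in the
    expected data (rather than subtracting it from y) keeps the estimate
    nonnegative and avoids the extra variance of precorrected data.
    """
    lam = np.ones(A.shape[1])
    sens = A.sum(axis=0)                       # sensitivity image, A^T 1
    for _ in range(n_iter):
        expected = A @ lam + r                 # forward model incl. randoms
        lam *= (A.T @ (y / expected)) / sens   # multiplicative EM update
    return lam

# toy 4-detector / 3-voxel system with attenuation folded into A
A = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.7, 0.2],
              [0.0, 0.3, 0.6],
              [0.2, 0.1, 0.5]])
truth = np.array([4.0, 1.0, 2.0])
r = np.full(4, 0.05)
y = A @ truth + r                              # noise-free data for the sketch
lam = em_with_randoms(A, y, r)
```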

Journal Article•DOI•
TL;DR: An approach to the three-dimensional reconstruction of coronary arteries is presented to show how modeling of a vascular network, together with algorithmic procedures, can lead to accurate 3-D structure and feature labeling.
Abstract: An approach to the three-dimensional reconstruction of coronary arteries is presented. The principal objective is to show how modeling of a vascular network, together with algorithmic procedures, can lead to accurate 3-D structure and feature labeling. The labeling problem is stated directly within the 3-D reconstruction framework. The reconstruction ambiguities inherent to biplane techniques are solved by means of a knowledge base, modeling of the object, and heuristic rules. Feasibility in near-real situations has been demonstrated. The critical importance of the object 3-D reference to achieving the data and modeling matching is emphasized, and a way to deal with it is pointed out. The overall system implies an incremental development in methodologies and experiments. All of them have been elaborated and tested independently, and the most appropriate ones have been selected for integration into a modular system. All the stages of the process (calibration, segmentation, reconstruction, and display) are discussed, with the main focus on modeling. Examples of automatic reconstruction from a phantom are provided.

Journal Article•DOI•
TL;DR: A method for the automatic measurement of femur length in fetal ultrasound images is presented and exploits prior knowledge of the general range of femoral size and shape by using morphological operators, which process images based on shape characteristics.
Abstract: A method for the automatic measurement of femur length in fetal ultrasound images is presented. Fetal femur length measurements are used to estimate gestational age by comparing the measurement to a typical growth chart. Using a real-time ultrasound system, sonographers currently indicate the femur endpoints on the ultrasound display station with a mouse-like device. The measurements are subjective, and have been proven to be inconsistent. The automatic approach described exploits prior knowledge of the general range of femoral size and shape by using morphological operators, which process images based on shape characteristics. Morphological operators are used first to remove the background (noise) from the image, next to refine the shape of the femur and remove spurious artifacts, and finally to produce a single pixel-wide skeleton of the femur. The skeleton endpoints are assumed to be the femur endpoints. The length of the femur is calculated as the distance between those endpoints. A comparison of the measurements obtained with the manual and with the automated techniques is included.

Journal Article•DOI•
TL;DR: An iterative reconstruction method which minimizes the effects of ill-conditioning is discussed; a regularization method which integrates prior information into the image reconstruction was developed, improving the conditioning of the information matrix in the modified Newton-Raphson algorithm.
Abstract: An iterative reconstruction method which minimizes the effects of ill-conditioning is discussed. Based on the modified Newton-Raphson algorithm, a regularization method which integrates prior information into the image reconstruction was developed. This improves the conditioning of the information matrix in the modified Newton-Raphson algorithm. Optimal current patterns were used to obtain voltages with maximal signal-to-noise ratio (SNR). A complete finite element model (FEM) was used for both the internal and the boundary electric fields. Reconstructed images from phantom data show that the use of regularization, optimal current patterns, and a complete FEM model improves image accuracy. The authors also investigated factors affecting the image quality of the iterative algorithm such as the initial guess, image iteration, and optimal current updating.
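The role of the regularization term can be sketched on a deliberately ill-conditioned linear toy problem: a Tikhonov term alpha*I stabilizes the near-singular information matrix J^T J. The Jacobian and alpha below are invented; a real EIT solver would recompute J from the FEM model at each step:

```python
import numpy as np

def regularized_newton_step(J, residual, x, alpha):
    """One modified Newton-Raphson update with Tikhonov regularization.

    The prior term alpha*I improves the conditioning of the information
    matrix J^T J, which is nearly singular for EIT-like problems.
    """
    H = J.T @ J + alpha * np.eye(J.shape[1])
    return x + np.linalg.solve(H, J.T @ residual)

# ill-conditioned toy Jacobian: second column nearly a multiple of the first
J = np.array([[1.0, 1.0001],
              [1.0, 0.9999],
              [2.0, 2.0000]])
x_true = np.array([1.0, 2.0])
v = J @ x_true                       # "measured boundary voltages"

x = np.zeros(2)
for _ in range(20):
    x = regularized_newton_step(J, v - J @ x, x, alpha=1e-6)
```

Without the alpha*I term this solve is numerically fragile; with it, the iteration steadily reduces the data residual even though J^T J is almost singular.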

Journal Article•DOI•
TL;DR: The authors show that for SPECT imaging on 64×64 image grids, the single-instruction, multiple-data (SIMD) distributed array processor containing 64² processors performs the expectation-maximization (EM) algorithm with Good's smoothing at a rate of 1 iteration/1.5 s, promising for emission tomography fully Bayesian reconstructions including regularization in clinical computation times which are on the order of 1 min/slice.
Abstract: Extending the work of A.W. McCarthy et al. (1988) and M.I. Miller and B. Roysam (1991), the authors demonstrate that a fully parallel implementation of the maximum-likelihood method for single-photon emission computed tomography (SPECT) can be accomplished in clinical time frames on massively parallel systolic array processors. The authors show that for SPECT imaging on 64×64 image grids, with 96 view angles, the single-instruction, multiple-data (SIMD) distributed array processor containing 64² processors performs the expectation-maximization (EM) algorithm with Good's smoothing at a rate of 1 iteration/1.5 s. This promises for emission tomography fully Bayesian reconstructions including regularization in clinical computation times which are on the order of 1 min/slice. The most important result of the implementations is that the scaling rules for computation times are roughly linear in the number of processors.

Journal Article•DOI•
TL;DR: It is demonstrated that a feasibility stopping criterion controls the noise in a reconstructed image, but is insensitive to quantitation errors, and that the use of an appropriate overrelaxation parameter can accelerate the convergence of the ML-based method during the iterative process without quantitative instabilities.
Abstract: Emission computerised tomography images reconstructed using a maximum likelihood-expectation maximization (ML)-based method with different reconstruction kernels and 1-200 iterations are compared to images reconstructed using filtered backprojection (FBP). ML-based reconstructions using a single pixel (SP) kernel with or without a sieve filter show no quantitative advantage over FBP except in the background where a reduction of noise is possible if the number of iterations is kept small (…).

Journal Article•DOI•
TL;DR: A low-cost PC-based system for 3-D localization of brain targets in stereotaxic imaging is presented, which relies on a method, using MR images, in which four markers are inserted in the fastenings of a Talairach stereotaxic frame during MRI examination.
Abstract: A low-cost PC-based system for 3-D localization of brain targets in stereotaxic imaging is presented. It relies on a method, using MR images, in which four markers are inserted in the fastenings of a Talairach stereotaxic frame during MRI examination. By locating these markers on the images with this system, the transformation matrices can be computed to obtain the 3-D coordinates of the center of a tumour in the stereotaxic space or in the MRI space. The system calculates the frame and arc setting parameters of a probe trajectory to the target, either for an orthogonal or a double oblique approach if needed. Simulated probe trajectory intersections with consecutive slices can be viewed in order to validate the trajectory before and during the surgical procedure. The method presents no major constraints in routine examinations. Mathematical details on the calculation of the transformation matrices are given.

Journal Article•DOI•
TL;DR: Initial experience is reported with the Scanditronix PC 2048-15B, a 15-slice positron emission tomography (PET) system using multicrystal/multi-photomultiplier modules to obtain high spatial resolution.
Abstract: Initial experience is reported with the Scanditronix PC 2048-15B, a 15-slice positron emission tomography (PET) system using multicrystal/multi-photomultiplier modules to obtain high spatial resolution. Random and scattered events are reduced using an orbiting ⁶⁸Ge rod source for transmission scans by only accepting coincidence lines which intersect the instantaneous position of the source. Scatter in the emission data is removed with a deconvolution kernel; random and dead-time corrections use the observed singles rates. The peak count rates are 11.7/20.0 kcps for the direct/cross slices at concentrations of 4.5/5.1 µCi/cc, respectively. Over the radial range 0-9 cm from the ring center, radial transverse resolution is 4.6-6.4 mm, and tangential transverse resolution is 4.6-5.1 mm using a Hanning filter. Over the same range, axial resolution varies from 6.1-6.2 mm in direct slices and from 5.4-7.1 mm in cross slices. This near-isotropic resolution allows collection of image volume data with no preferred direction for signal averaging errors.

Journal Article•DOI•
TL;DR: The noise-equivalent count-rate (NEC) performance of a neuro-positron emission tomography (PET) scanner has been determined with and without interplane septa on uniform cylindrical phantoms of differing radii and in human studies to assess the optimum count rate conditions that realize the maximum gain.
Abstract: The noise-equivalent count-rate (NEC) performance of a neuro-positron emission tomography (PET) scanner has been determined with and without interplane septa on uniform cylindrical phantoms of differing radii and in human studies to assess the optimum count rate conditions that realize the maximum gain. In the brain, the effective gain in NEC performance in three dimensions (3-D) ranges from >5 at low count rates to approximately 3.3 at 200 kcps (equivalent to 37 kcps in 2-D). The gains of the 3-D method assessed by this analysis are significant, and are shown to be highly dependent on count rate and object dimensions.
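A commonly used form of the NEC figure of merit is NEC = T²/(T + S + kR), with trues T, scatter S, and randoms R. The k = 2 factor (randoms estimated by delayed-window subtraction) and the count rates below are assumptions for illustration, not the paper's measurements:

```python
def noise_equivalent_counts(trues, scatter, randoms, k=2.0):
    """Noise-equivalent count rate: the count rate of an ideal scanner
    (no scatter, no randoms) that would give the same image SNR.

    k=2 assumes randoms are estimated from a delayed coincidence window
    and subtracted, doubling their noise contribution; k=1 would apply
    to a noiseless randoms estimate.
    """
    return trues**2 / (trues + scatter + k * randoms)

# hypothetical 2-D (septa in) vs 3-D (septa retracted) operating points
nec_2d = noise_equivalent_counts(30e3, 5e3, 3e3)
nec_3d = noise_equivalent_counts(150e3, 75e3, 60e3)
gain = nec_3d / nec_2d
```

The sketch shows why the 3-D gain shrinks at high rates: septa retraction multiplies trues, but scatter and randoms grow disproportionately in the denominator.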

Journal Article•DOI•
TL;DR: In this article, an image reconstruction method motivated by positron emission tomography (PET) is discussed and an iterative approach which requires the solution of simple quadratic equations is proposed.
Abstract: An image reconstruction method motivated by positron emission tomography (PET) is discussed. The measurements tend to be noisy and so the reconstruction method should incorporate the statistical nature of the noise. The authors set up a discrete model to represent the physical situation and arrive at a nonlinear maximum a posteriori probability (MAP) formulation of the problem. An iterative approach which requires the solution of simple quadratic equations is proposed. The authors also present a methodology which allows them to experimentally optimize an image reconstruction method for a specific medical task and to evaluate the relative efficacy of two reconstruction methods for a particular task in a manner which meets the high standards set by the methodology of statistical hypothesis testing. The new MAP algorithm is compared to a method which maximizes likelihood and with two variants of the filtered backprojection method.

Journal Article•DOI•
M.E. Brummer
TL;DR: The Sobel magnitude edge operator, used for preprocessing, proved adequate for magnetic resonance scans with positive and negative brain/cerebrospinal fluid contrast and is a three-dimensional variant of the Hough transform principle.
Abstract: A technique is presented for automatic detection of the longitudinal fissure in tomographic scans of the brain. The technique utilizes the planar nature of the fissure and is a three-dimensional variant of the Hough transform principle. Algorithmic and computational aspects of the technique are discussed. Results and performance on coronal and transaxial magnetic resonance data show that the algorithm is robust with respect to variations in image contrast in the data and to slight anatomic anomalies. A crucial resolution requirement in the data for accurate parameter estimations is a sufficient number of slices covering the whole brain. The Sobel magnitude edge operator, used for preprocessing, proved adequate for magnetic resonance scans with positive and negative brain/cerebrospinal fluid contrast.
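A minimal 3-D plane Hough transform in the spirit described: discretize the plane normal in spherical angles plus a signed distance from the origin, and let each (edge) point vote. The bin counts and the synthetic point cloud standing in for fissure edge voxels are illustrative:

```python
import numpy as np

def hough_plane(points, n_theta=18, n_phi=36, d_bins=40, d_max=20.0):
    """Vote over plane parameters (theta, phi, d): the normal direction in
    spherical angles and d = signed distance of the plane from the origin."""
    thetas = np.linspace(0, np.pi / 2, n_theta)          # polar angle of normal
    phis = np.linspace(0, 2 * np.pi, n_phi, endpoint=False)
    acc = np.zeros((n_theta, n_phi, d_bins), dtype=int)
    for i, th in enumerate(thetas):
        for j, ph in enumerate(phis):
            normal = np.array([np.sin(th) * np.cos(ph),
                               np.sin(th) * np.sin(ph),
                               np.cos(th)])
            d = points @ normal                           # signed distances
            idx = np.round((d + d_max) / (2 * d_max) * (d_bins - 1)).astype(int)
            ok = (idx >= 0) & (idx < d_bins)
            np.add.at(acc[i, j], idx[ok], 1)              # cast votes
    i, j, kk = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[i], phis[j], kk / (d_bins - 1) * 2 * d_max - d_max

# synthetic "fissure": points near the plane z = 5 with small jitter
rng = np.random.default_rng(1)
pts = np.column_stack([rng.uniform(-10, 10, 200),
                       rng.uniform(-10, 10, 200),
                       np.full(200, 5.0) + rng.normal(0, 0.05, 200)])
theta, phi, d = hough_plane(pts)
```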

Journal Article•DOI•
TL;DR: A fully automated system for detecting the endocardial and epicardial boundaries in a two-dimensional echocardiography by using fuzzy reasoning techniques is proposed, which deduces local intensity change from the knowledge of global intensity change through fuzzy reasoning.
Abstract: A fully automated system for detecting the endocardial and epicardial boundaries in a two-dimensional echocardiography by using fuzzy reasoning techniques is proposed. The image is first enhanced by applying the Laplacian-of-Gaussian edge detector. Second, the center of the left ventricle is determined automatically by analyzing the original image. Next, a search process radiated from the estimated center is performed to locate the endocardial boundary by using the zero-crossing points. After this step, the estimation of the range of radius of a possible epicardial boundary is carried out by comparing the high-level knowledge of intensity changes along all directions with the actual image intensity changes. The high-level knowledge of global intensity change in the image is acquired from experts in advance, and is represented in the form of fuzzy linguistic descriptions and relations. Knowledge of local intensity change can therefore be deduced from the knowledge of global intensity change through fuzzy reasoning.

Journal Article•DOI•
TL;DR: It is shown that the EM algorithm can be efficiently parallelized using the (modified) partition-by-box scheme with the message passing model.
Abstract: The EM algorithm for PET image reconstruction has two major drawbacks that have impeded the routine use of the EM algorithm: the long computation time due to slow convergence and a large memory required for the image, projection, and probability matrix. An attempt is made to solve these two problems by parallelizing the EM algorithm on multiprocessor systems. An efficient data and task partitioning scheme, called partition-by-box, based on the message passing model is proposed. The partition-by-box scheme and its modified version have been implemented on a message passing system, Intel iPSC/2, and a shared memory system, BBN Butterfly GP1000. The implementation results show that, for the partition-by-box scheme, a message passing system of complete binary tree interconnection with fixed connectivity of three at each node can have similar performance to that with the hypercube topology, which has a connectivity of log₂ N for N PEs. It is shown that the EM algorithm can be efficiently parallelized using the (modified) partition-by-box scheme with the message passing model.

Journal Article•DOI•
TL;DR: A new algorithm to enhance the edges and contrast of chest and breast radiographs while minimally amplifying image noise is presented.
Abstract: The authors present a new algorithm to enhance the edges and contrast of chest and breast radiographs while minimally amplifying image noise. The algorithm consists of a linear combination of an original image and two smoothed images obtained from it by using different masks and parameters, followed by the application of nonlinear contrast stretching. The result is an image which retains the high median frequency local variations (edge- and contrast-enhancing).
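The described combination (an original image plus two unsharp-mask terms from differently sized smoothing masks, followed by a clipping contrast stretch) can be sketched as follows. The mean-filter sizes, weights, and stretch percentiles are invented, not the paper's masks or parameters:

```python
import numpy as np

def smooth(img, k):
    """(2k+1) x (2k+1) mean filter via shifted sums with edge padding."""
    p = np.pad(img.astype(float), k, mode="edge")
    acc = np.zeros(img.shape, dtype=float)
    for dy in range(2 * k + 1):
        for dx in range(2 * k + 1):
            acc += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return acc / (2 * k + 1) ** 2

def enhance(img, b=0.6, c=0.3, k1=1, k2=4):
    """Original image plus two unsharp-mask terms (small and large masks),
    then a clipping percentile stretch as the nonlinear contrast step."""
    out = img + b * (img - smooth(img, k1)) + c * (img - smooth(img, k2))
    lo, hi = np.percentile(out, [1, 99])
    return np.clip((out - lo) / (hi - lo), 0.0, 1.0)

radiograph = np.zeros((32, 32))
radiograph[:, 16:] = 1.0            # toy step edge standing in for a rib boundary
enhanced = enhance(radiograph)
```

Writing the combination as img + b*(img - s1) + c*(img - s2) is algebraically the same linear combination of the original and the two smoothed images, with negative weights on the smoothed terms.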

Journal Article•DOI•
TL;DR: An algorithm that suppresses translational motion artifacts in magnetic resonance imaging (MRI) by using post processing on a standard spin-warp image by using an iterative algorithm of generalized projections is presented.
Abstract: An algorithm that suppresses translational motion artifacts in magnetic resonance imaging (MRI) by using post processing on a standard spin-warp image is presented. It is shown that translational motion causes an additional phase factor in the detected signal and that this phase error can be removed using an iterative algorithm of generalized projections. The method has been tested using computer simulations and it successfully removed most of the artifact. The algorithm converges even in the presence of severe noise.
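The phase model underlying this class of algorithm is the Fourier shift theorem: a translation multiplies k-space by a linear phase. The sketch below applies a known phase error and removes it exactly; the actual method must instead estimate the unknown phase iteratively by generalized projections. The image and shift are invented:

```python
import numpy as np

img = np.zeros((32, 32))
img[10:20, 12:22] = 1.0                               # toy object

# translational motion multiplies k-space by a linear phase (shift theorem)
ky, kx = np.meshgrid(np.fft.fftfreq(32), np.fft.fftfreq(32), indexing="ij")
shift = (3.0, -2.0)                                   # pixels moved during the scan
phase_error = np.exp(-2j * np.pi * (ky * shift[0] + kx * shift[1]))
k = np.fft.fft2(img) * phase_error                    # corrupted k-space data

corrupted = np.fft.ifft2(k).real                      # image displaced by `shift`
corrected = np.fft.ifft2(k / phase_error).real        # phase removed: artifact gone
```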

Journal Article•DOI•
TL;DR: A computerized approach to the problem of the assessment of skeletal maturity in pediatric radiology is presented and the assessed age has been compared to the estimates obtained by a radiologist using the atlas matching method as well as the chronological age.
Abstract: A computerized approach to the problem of the assessment of skeletal maturity in pediatric radiology is presented. A CR (computed radiography) hand image to be analyzed is first standardized to obtain a left hand, upright, PA view. Then the phalangeal region of interest is defined and thresholded. After the separation of the third finger, the lengths of the distal, middle, and proximal phalanx are measured automatically. Using the standard phalangeal length table, the skeletal age is estimated. The assessed age has been compared to the estimates obtained by a radiologist using the atlas matching method as well as the chronological age.

Journal Article•DOI•
TL;DR: The multigrid implementation was found to accelerate the convergence rate of high-frequency components of the image when the image possessed the local smoothness property, but in other cases it was unhelpful, and may even slow down the convergence rate.
Abstract: The numerical behavior of multigrid implementations of the Landweber, generalized Landweber, ART, and MLEM iterative image reconstruction algorithms is investigated. Comparisons between these algorithms, and with their single-grid implementations, are made on two small-scale synthetic PET systems, for phantom objects exhibiting different characteristics, and on one full-scale synthetic system, for a Shepp-Logan phantom. The authors also show analytically the effects of noise and initial condition on the generalized Landweber iteration, and note how to choose the shaping operator to filter out noise in the data, or to enhance features of interest in the reconstructed image. Original contributions include (1) numerical studies of the convergence rates of single-grid and multigrid implementations of the Landweber, generalized Landweber, ART, and MLEM iterations and (2) effects of noise and initial condition on the generalized Landweber iteration, with procedures for filtering out noise or enhancing image features.
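For reference, the plain (single-grid) Landweber iteration the abstract starts from is simply a relaxed gradient step on the least-squares data term. The tiny system below is illustrative; the relaxation rule is the standard sufficient condition 0 < relax < 2/λmax(AᵀA):

```python
import numpy as np

def landweber(A, b, n_iter=200, relax=None):
    """Landweber iteration: x <- x + relax * A^T (b - A x)."""
    if relax is None:
        relax = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/lambda_max(A^T A): convergent
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + relax * (A.T @ (b - A @ x))       # relaxed gradient step
    return x

# toy 3-measurement / 2-unknown system with a known solution
A = np.array([[2.0, 1.0],
              [1.0, 3.0],
              [0.0, 1.0]])
b = A @ np.array([1.0, -1.0])
x = landweber(A, b)
```

The generalized Landweber iteration studied in the paper inserts a shaping operator into this update to filter noise or emphasize chosen image features; the multigrid question is how fast the different spatial-frequency components of x converge.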

Journal Article•DOI•
TL;DR: A general model-based surface detector for finding the four-dimensional endocardial and epicardial left ventricular boundaries was developed and accuracy was investigated using actual patient data.
Abstract: The authors have developed a general model-based surface detector for finding the four-dimensional (three spatial dimensions plus time) endocardial and epicardial left ventricular boundaries. The model encoded left ventricular (LV) shape, smoothness, and connectivity into the compatibility coefficients of a relaxation labeling algorithm. This surface detection method was applied to gated single photon emission computed tomography (SPECT) perfusion images, tomographic radionuclide ventriculograms, and cardiac rotation magnetic resonance images. Its accuracy was investigated using actual patient data. Global left ventricular volumes correlated well, with a maximum correlation coefficient of 0.98 for magnetic resonance imaging (MRI) endocardial surfaces and a minimum of 0.88 for SPECT epicardial surfaces. The average absolute errors of edge detection were 6.4, 5.6, and 4.6 mm for tomographic radionuclide ventriculograms, gated perfusion SPECT, and magnetic resonance images, respectively.

Journal Article•DOI•
TL;DR: The authors propose a new 2-D point source scattering deconvolution method based on modeling a scattering point source function; the scattering dependence on axial and transaxial directions is reflected in the exponential fitting parameters, which are directly estimated from a limited number of measured point response functions.
Abstract: The authors propose a new 2-D point source scattering deconvolution method. The cross-plane scattering is incorporated into the algorithm by modeling a scattering point source function. In the model, the scattering dependence on axial and transaxial directions is reflected in the exponential fitting parameters, and these parameters are directly estimated from a limited number of measured point response functions. The results comparing the standard in-plane line source deconvolution to the cross-plane point source deconvolution show that for a small source the former technique overestimates the scatter fraction in the plane of the source and underestimates the scatter fraction in adjacent planes. In addition, the authors also propose a simple approximation technique for deconvolution.