Journal ArticleDOI

Penalized-likelihood image reconstruction for x-ray fluorescence computed tomography

01 Jul 2006-Optical Engineering (International Society for Optics and Photonics)-Vol. 45, Iss: 7, pp 077005
TL;DR: A penalized-likelihood image reconstruction strategy that alternates between updating the distribution of a given element and updating the attenuation map for that element's fluorescence X-rays, and that is guaranteed to increase the penalized likelihood at each iteration.
Abstract: X-ray fluorescence computed tomography (XFCT) allows for the reconstruction of the distribution of elements within a sample from measurements of fluorescence x rays produced by irradiation of the sample with monochromatic synchrotron radiation. XFCT is not a transmission tomography modality, but rather a stimulated emission tomography modality; thus correction for attenuation of the incident and fluorescence photons is essential if accurate images are to be obtained. This is challenging because the attenuation map is, in general, known only at the stimulating beam energy and not at the various fluorescence energies of interest. We make use of empirically fitted analytic expressions for x-ray attenuation coefficients to express the unknown attenuation maps as linear combinations of known quantities and the unknown elemental concentrations of interest. We then develop an iterative image reconstruction algorithm based on penalized-likelihood methods that have been developed for medical emission tomography. Studies with numerical phantoms indicate that the approach is able to produce qualitatively and quantitatively accurate reconstructed images even in the face of severe attenuation. We also apply the method to real synchrotron-acquired data and demonstrate a marked improvement in image quality relative to filtered backprojection reconstruction.
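A concrete illustration of the attenuation-map parameterization described in this abstract is sketched below: the unknown attenuation map at a fluorescence energy is written as a known background term plus a linear combination of the unknown elemental concentration images. The Bragg-Pierce-style power-law fit, its constants, and the `mu_background` term are illustrative assumptions, not the paper's actual fitted expressions.

```python
import numpy as np

def mass_atten_coeff(E_keV, Z, C=9.8e-3, p=4.0, q=3.0):
    """Illustrative Bragg-Pierce-style power law, mu/rho ~ C * Z**p / E**q (cm^2/g).
    The constants and functional form are placeholders; the paper uses its own
    empirically fitted analytic expressions."""
    return C * Z**p / E_keV**q

def attenuation_map(E_keV, mu_background, conc_maps, atomic_numbers):
    """Attenuation map (1/cm) at a fluorescence energy, written as a known
    background term plus a linear combination of the unknown elemental
    concentration images (g/cm^3)."""
    mu = np.array(mu_background, dtype=float)
    for conc, Z in zip(conc_maps, atomic_numbers):
        mu = mu + np.asarray(conc) * mass_atten_coeff(E_keV, Z)
    return mu
```

Because the fluorescence-energy attenuation maps depend linearly on the same concentration images being reconstructed, an alternating update of the element image and its attenuation map (as in the abstract) becomes possible.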
Citations
Journal ArticleDOI
TL;DR: X-ray fluorescence imaging is a powerful technique that can be used to determine elemental and chemical species distributions at a range of spatial resolutions within samples of biological tissues, and the technique is capable of determining metal and nonmetal distributions on a variety of length scales.
Abstract: From the perspective of a chemist, biology confers a rich variety of roles on a number of metal ions. It is widely agreed that a large fraction of the genomic output of living things contains metal or metalloid ions, although estimates of this fraction vary widely and depend upon which metal ions are considered.1−3 Moreover, recent reports suggest that, at least in some cases, there are many uncharacterized metalloproteins.4 With inclusion of the s-block metals such as Na, K, Mg, and Ca, the proportion likely approaches 100%; recent estimates from the protein data bank indicate that the prevalence of heavier metal ions of atomic number above 20 within proteins is around 22%,5 with Zn2+ proteins alone constituting about 11%. Living organisms have an inherent and very rich physical structure, with relevant length scales ranging from the nanometer scale for subcellular structure to hundreds of micrometers and above for tissue, organ, or organism-level organization. The ability to derive the spatial distribution of elements on this diversity of length scales is a key to understanding their function. Metals play essential and central roles in the most important and chemically challenging processes required for life, with active site structures and mechanisms that, at the time of their discovery, have usually not yet been duplicated in the chemical laboratory. Furthermore, diseases of metal dysregulation can cause disruption in the distribution of metals.6 For example, Menke’s disease and Occipital Horn Syndrome,7 and Wilson’s disease,8 involve disruption in copper uptake and excretion, respectively, through mutation in the ATP7A and ATP7B Cu transporters.9 The mechanisms of action of toxic elements such as mercury and arsenic are also of interest, as are essential nonmetal trace elements, such as selenium. Likewise, an increasing number of pharmaceuticals include metals or heavier elements; such chemotherapeutic drugs include the platinum derivatives cisplatin and carboplatin,10 some promising new ruthenium drugs,11 and arsenic trioxide, which has been used to treat promyelocytic leukemia.12 Understanding the localization, speciation, and distribution of these at various length scales is of significant interest. A wide variety of heavier elements can be probed by X-ray spectroscopic methods; these are shown graphically in Figure 1. X-ray fluorescence imaging is a powerful technique that can be used to determine elemental and chemical species distributions at a range of spatial resolutions within samples of biological tissues. Most modern applications require the use of synchrotron radiation as a tunable and high spectral brightness source of X-rays. The method uses a microfocused X-ray beam to excite X-ray fluorescence from specific elements within a sample. Because the method depends upon atomic physics, it is highly specific and enables a wide range of chemical elements to be investigated. A significant advantage over more conventional methods is the ability to measure intact biological samples without significant treatment with exogenous reagents. The technique is capable of determining metal and nonmetal distributions on a variety of length scales, with information on chemical speciation also potentially available.
Figure 2 shows examples of rapid-scan X-ray fluorescence imaging at two contrasting length scales: rapid-scan imaging13 of a section of a human brain taken from an individual suffering from multiple sclerosis and showing elemental profiles for Fe, Cu, and Zn;14 and a high-resolution image showing mercury and other elements in a section of retina from a zebrafish larva treated with methylmercury chloride.15 We will discuss both the state of the art in terms of experimental methods and some recent applications of the methods. This Review considers X-ray fluorescence imaging with incident X-ray energies in the hard X-ray regime, which we define as 2 keV and above. We review technologies for producing microfocused X-ray beams and for detecting X-ray fluorescence, as well as methods that confer chemical selectivity or three-dimensional visualization. We discuss applications in key areas with a view to providing examples of how the technique can provide information on biological systems. We also discuss synergy with other methods, which have overlapping or complementary capabilities. Our goal is to provide useful and pertinent information to encourage and enable further use of this powerful method in chemical and biochemical studies of living organisms.
Figure 1: Periodic table of the elements showing elements of biological interest that can be probed using X-ray fluorescence imaging. Elements are divided into three categories, those that are physiologically important, those that are pharmacologically active, ...

245 citations

Journal ArticleDOI
TL;DR: The potential of coupling pencil-beam tomography with X-ray diffraction to examine unidentified phases in nanomaterials and polycrystalline materials is shown, enabling a multimodal analysis of prime importance in materials science, chemistry, geology, environmental science, medical science, palaeontology and cultural heritage.
Abstract: The advent of nanosciences calls for the development of local structural probes, in particular to characterize ill-ordered or heterogeneous materials. Furthermore, because materials properties are often related to their heterogeneity and the hierarchical arrangement of their structure, different structural probes covering a wide range of scales are required. X-ray diffraction is one of the prime structural methods but suffers from a relatively poor detection limit, whereas transmission electron analysis involves destructive sample preparation. Here we show the potential of coupling pencil-beam tomography with X-ray diffraction to examine unidentified phases in nanomaterials and polycrystalline materials. The demonstration is carried out on a high-pressure pellet containing several carbon phases and on a heterogeneous powder containing chalcedony and iron pigments. The present method enables a non-invasive structural refinement with a weight sensitivity of one part per thousand. It enables the extraction of the scattering patterns of amorphous and crystalline compounds with similar atomic densities and compositions. Furthermore, such a diffraction-tomography experiment can be carried out simultaneously with X-ray fluorescence, Compton and absorption tomographies, enabling a multimodal analysis of prime importance in materials science, chemistry, geology, environmental science, medical science, palaeontology and cultural heritage.

188 citations

Journal ArticleDOI
TL;DR: Computer simulations exploring the feasibility of imaging small objects with X-ray luminescence computed tomography, such as research animals, are presented.
Abstract: X-ray luminescence computed tomography (XLCT) is proposed as a new molecular imaging modality based on the selective excitation and optical detection of X-ray-excitable phosphor nanoparticles. These nano-sized particles can be fabricated to emit near-infrared (NIR) light when excited with X-rays, and, because both X-rays and NIR photons propagate long distances in tissue, they are particularly well suited for in vivo biomedical imaging. In XLCT, tomographic images are generated by irradiating the subject using a sequence of programmed X-ray beams, while sensitive photo-detectors measure the light diffusing out of the subject. By restricting the X-ray excitation to a single, narrow beam of radiation, the origin of the optical photons can be inferred regardless of where these photons were detected, and how many times they scattered in tissue. This study presents computer simulations exploring the feasibility of imaging small objects, such as research animals, with XLCT. The accumulation of 50 nm phosphor nanoparticles in a 2-mm-diameter target can be detected and quantified with subpicomolar sensitivity using less than 1 cGy of radiation dose. Provided sufficient signal-to-noise ratio, the spatial resolution of the system can be made as high as needed by narrowing the beam aperture. In particular, 1 mm spatial resolution was achieved for a 1-mm-wide X-ray beam. By including an X-ray detector in the system, anatomical imaging is performed simultaneously with molecular imaging via standard X-ray computed tomography (CT). The molecular and anatomical images are spatially and temporally co-registered, and, if a single-pixel X-ray detector is used, they have matching spatial resolution.
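The selective-excitation argument in this abstract, that the beam position rather than the optical detection geometry localizes the emission, implies a very simple idealized forward model: each measurement is proportional to the line integral of phosphor concentration along the pencil beam. The sketch below illustrates that idea only; the rotation-based projector, the `gain` factor, and the neglect of X-ray attenuation and optical diffusion are simplifying assumptions, not the authors' simulation code.

```python
import numpy as np
from scipy.ndimage import rotate

def xlct_sinogram(conc, angles_deg, gain=1.0):
    """Forward-project a 2-D nanophosphor concentration map into an XLCT
    sinogram: one total optical measurement per (angle, beam position)."""
    sino = []
    for angle in angles_deg:
        rotated = rotate(conc, angle, reshape=False, order=1)
        sino.append(gain * rotated.sum(axis=0))  # line integral along each narrow beam
    return np.asarray(sino)
```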

159 citations


Cites methods from "Penalized-likelihood image reconstr..."

  • ...XLCT uses a selective excitation mechanism similar to X-ray fluorescence computed tomography (XFCT, [13]–[15]) to image samples containing unknown distributions of X-ray-excitable nanophosphor....

    [...]

  • ...Like XLCT, XFCT uses a selective excitation mechanism [13]–[15]....

    [...]

Journal ArticleDOI
TL;DR: Several complementary approaches to X-ray fluorescence tomography will be routinely available to the biologist in the near future; these approaches are discussed and applications of biological relevance are reviewed.

143 citations


Additional excerpts

  • ...La Rivière PJ, Vargas P, Newville M, Sutton SR: Reduced-scan schemes for X-ray fluorescence computed tomography....

    [...]

  • ...Other approaches to reconstruct in the presence of self-absorption have been developed by La Rivière and Vargas [45] and Miqueles and De Pierro [46]....

    [...]

  • ...La Rivière PJ, Billmire D, Vargas P, Rivers M, Sutton SR: Penalized-likelihood image reconstruction for X-ray fluorescence computed tomography....

    [...]

  • ...La Rivière PJ, Vargas PA: Monotonic penalized-likelihood image reconstruction for X-ray fluorescence computed tomography....

    [...]

Journal ArticleDOI
Y Kuang, Guillem Pratx, Magdalena Bazalova, B Meng, J Qian, Lei Xing
TL;DR: By exploiting the element-specific nature of the X-ray fluorescence signal, XRF computed tomography (XFCT) can image multiple elements simultaneously (multiplexing), making it a promising modality for multiplexed imaging of high-atomic-number probes.
Abstract: Simultaneous imaging of multiple probes or biomarkers represents a critical step toward high specificity molecular imaging. In this work, we propose to utilize the element-specific nature of the X-ray fluorescence (XRF) signal for imaging multiple elements simultaneously (multiplexing) using XRF computed tomography (XFCT). A 5-mm-diameter pencil beam produced by a polychromatic X-ray source (150 kV, 20 mA) was used to stimulate emission of XRF photons from 2% (weight/volume) gold (Au), gadolinium (Gd), and barium (Ba) embedded within a water phantom. The phantom was translated and rotated relative to the stationary pencil beam in a first-generation CT geometry. The X-ray energy spectrum was collected for 18 s at each position using a cadmium telluride detector. The spectra were then used to isolate the K shell XRF peak and to generate sinograms for the three elements of interest. The distribution and concentration of the three elements were reconstructed with the iterative maximum likelihood expectation maximization algorithm. The linearity between the XFCT intensity and the concentrations of elements of interest was investigated. We found that measured XRF spectra showed sharp peaks characteristic of Au, Gd, and Ba. The narrow full-width at half-maximum (FWHM) of the peaks strongly supports the potential of XFCT for multiplexed imaging of Au, Gd, and Ba (FWHM(Au Kα1) = 0.619 keV, FWHM(Au Kα2) = 1.371 keV, FWHM(Gd Kα) = 1.297 keV, FWHM(Gd Kβ) = 0.974 keV, FWHM(Ba Kα) = 0.852 keV, and FWHM(Ba Kβ) = 0.594 keV). The distribution of Au, Gd, and Ba in the water phantom was clearly identifiable in the reconstructed XRF images. Our results showed linear relationships between the XRF intensity of each tested element and their concentrations (R²(Au) = 0.944, R²(Gd) = 0.986, and R²(Ba) = 0.999), suggesting that XFCT is capable of quantitative imaging. Finally, a transmission CT image was obtained to show the potential of the approach for providing attenuation correction and morphological information. In conclusion, XFCT is a promising modality for multiplexed imaging of high atomic number probes.
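The peak-isolation and sinogram-generation step described in this abstract can be sketched as an energy-window integration around each element's K-alpha line with a simple background subtraction. This is a hedged illustration: the window width, the flat-background estimate, and the rounded line energies are assumptions, not the authors' spectral-fitting procedure.

```python
import numpy as np

# Approximate K-alpha1 line energies (keV); values rounded for illustration.
K_ALPHA_KEV = {"Au": 68.8, "Gd": 43.0, "Ba": 32.2}

def element_sinograms(spectra, energies_keV, window_keV=1.5):
    """spectra: (n_angles, n_positions, n_bins) array of measured counts.
    Returns one sinogram per element by integrating a window around its K-alpha peak."""
    sinos = {}
    for elem, e0 in K_ALPHA_KEV.items():
        in_win = np.abs(energies_keV - e0) <= window_keV / 2
        side = (np.abs(energies_keV - e0) > window_keV / 2) & (np.abs(energies_keV - e0) <= window_keV)
        peak = spectra[..., in_win].sum(axis=-1)
        background = spectra[..., side].mean(axis=-1) * in_win.sum()  # flat-background estimate
        sinos[elem] = np.clip(peak - background, 0.0, None)
    return sinos
```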

84 citations


Cites background from "Penalized-likelihood image reconstr..."

  • ...Moreover, novel image reconstruction algorithms that can process incompletely sampled data could be helpful [15]....

    [...]

References
01 Jan 1994

19,881 citations

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a more accurate general mathematical model for ET where an unknown emission density generates, and is to be reconstructed from, the number of counts n*(d) in each of D detector units d. Within the model, they gave an algorithm for determining an estimate λ̂ of λ which maximizes the probability p(n*|λ) of observing the actual detector count data n* over all possible densities λ.
Abstract: Previous models for emission tomography (ET) do not distinguish the physics of ET from that of transmission tomography. We give a more accurate general mathematical model for ET where an unknown emission density λ = λ(x, y, z) generates, and is to be reconstructed from, the number of counts n*(d) in each of D detector units d. Within the model, we give an algorithm for determining an estimate λ̂ of λ which maximizes the probability p(n*|λ) of observing the actual detector count data n* over all possible densities λ. Let independent Poisson variables n(b) with unknown means λ(b), b = 1, ···, B represent the number of unobserved emissions in each of B boxes (pixels) partitioning an object containing an emitter. Suppose each emission in box b is detected in detector unit d with probability p(b, d), d = 1, ···, D, with p(b, d) a one-step transition matrix, assumed known. We observe the total number n* = n*(d) of emissions in each detector unit d and want to estimate the unknown λ = λ(b), b = 1, ···, B. For each λ, the observed data n* has probability or likelihood p(n*|λ). The EM algorithm of mathematical statistics starts with an initial estimate λ^0 and gives the following simple iterative procedure for obtaining a new estimate λ^new from an old estimate λ^old, to obtain λ^k, k = 1, 2, ···: λ^new(b) = λ^old(b) Σ_{d=1}^{D} n*(d) p(b, d) / [Σ_{b′} λ^old(b′) p(b′, d)], b = 1, ···, B.
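For concreteness, the EM update quoted above can be transcribed directly into code. This is a sketch of the standard ML-EM iteration; as quoted, the update assumes the sensitivity Σ_d p(b, d) = 1, otherwise the right-hand side is additionally divided by that sum (the `normalize` flag below). Variable names (P, n_star, lam) are not from the paper.

```python
import numpy as np

def em_update(lam_old, P, n_star, normalize=False, eps=1e-12):
    """One EM step. P[d, b] holds p(b, d), the probability that an emission in
    pixel b is counted in detector unit d; n_star holds the measured counts."""
    expected = P @ lam_old                       # sum_b' lam_old(b') p(b', d)
    ratio = n_star / np.maximum(expected, eps)   # n*(d) / expected counts
    lam_new = lam_old * (P.T @ ratio)
    if normalize:
        lam_new /= np.maximum(P.sum(axis=0), eps)  # divide by sensitivity sum_d p(b, d)
    return lam_new

def mlem(P, n_star, n_iters=50, **kw):
    lam = np.ones(P.shape[1])                    # strictly positive initial estimate
    for _ in range(n_iters):
        lam = em_update(lam, P, n_star, **kw)
    return lam
```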

4,288 citations

Journal ArticleDOI
TL;DR: Ordered subsets EM (OS-EM) provides a restoration imposing a natural positivity condition and with close links to the EM algorithm, applicable in both single photon (SPECT) and positron emission tomography (PET).
Abstract: The authors define ordered subset processing for standard algorithms (such as expectation maximization, EM) for image restoration from projections. Ordered subsets methods group projection data into an ordered sequence of subsets (or blocks). An iteration of ordered subsets EM is defined as a single pass through all the subsets, in each subset using the current estimate to initialize application of EM with that data subset. This approach is similar in concept to block-Kaczmarz methods introduced by Eggermont et al. (1981) for iterative reconstruction. Simultaneous iterative reconstruction (SIRT) and multiplicative algebraic reconstruction (MART) techniques are well known special cases. Ordered subsets EM (OS-EM) provides a restoration imposing a natural positivity condition and with close links to the EM algorithm. OS-EM is applicable in both single photon (SPECT) and positron emission tomography (PET). In simulation studies in SPECT, the OS-EM algorithm provides an order-of-magnitude acceleration over EM, with restoration quality maintained.
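A minimal sketch of the ordered-subsets idea described above: one OS-EM iteration is a single pass through the subsets, applying the EM update with only that subset's rows of the system matrix. The interleaved-view subset selection and the variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def osem(P, n_star, views, n_subsets=8, n_iters=5, eps=1e-12):
    """P: (n_measurements, n_pixels) system matrix; n_star: measured counts;
    views: projection-view index of each measurement row."""
    lam = np.ones(P.shape[1])
    unique_views = np.unique(views)
    subsets = [np.isin(views, unique_views[s::n_subsets]) for s in range(n_subsets)]
    for _ in range(n_iters):                          # one iteration = one pass over all subsets
        for rows in subsets:
            Ps, ns = P[rows], n_star[rows]
            sens = np.maximum(Ps.sum(axis=0), eps)    # subset sensitivity image
            ratio = ns / np.maximum(Ps @ lam, eps)
            lam = lam * (Ps.T @ ratio) / sens         # EM update using this subset only
    return lam
```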

3,740 citations


"Penalized-likelihood image reconstr..." refers methods in this paper

  • ...The first is to employ an ordered subsets strategy [26], in which equally distributed subsets of projections are used to calculate the update....

    [...]

Journal Article
TL;DR: The general principles behind all EM algorithms are discussed and in detail the specific algorithms for emission and transmission tomography are derived and the specification of necessary physical features such as source and detector geometries are discussed.
Abstract: Two proposed likelihood models for emission and transmission image reconstruction accurately incorporate the Poisson nature of photon counting noise and a number of other relevant physical features. As in most algebraic schemes, the region to be reconstructed is divided into small pixels. For each pixel a concentration or attenuation coefficient must be estimated. In the maximum likelihood approach these parameters are estimated by maximizing the likelihood (probability of the observations). EM algorithms are iterative techniques for finding maximum likelihood estimates. In this paper we discuss the general principles behind all EM algorithms and derive in detail the specific algorithms for emission and transmission tomography. The virtues of the EM algorithms include (a) accurate incorporation of a good physical model, (b) automatic inclusion of non-negativity constraints on all parameters, (c) an excellent measure of the quality of a reconstruction, and (d) global convergence to a single vector of parameter estimates. We discuss the specification of necessary physical features such as source and detector geometries. Actual reconstructions are deferred to a later time.

1,921 citations

Journal ArticleDOI
TL;DR: Qualitative results suggest that the streak artifacts common to the FBP method are nearly eliminated by the PWLS+SOR method, and indicate that the proposed method for weighting the measurements is a significant factor in the improvement over FBP.
Abstract: Presents an image reconstruction method for positron-emission tomography (PET) based on a penalized, weighted least-squares (PWLS) objective. For PET measurements that are precorrected for accidental coincidences, the author argues statistically that a least-squares objective function is as appropriate, if not more so, than the popular Poisson likelihood objective. The author proposes a simple data-based method for determining the weights that accounts for attenuation and detector efficiency. A nonnegative successive over-relaxation (+SOR) algorithm converges rapidly to the global minimum of the PWLS objective. Quantitative simulation results demonstrate that the bias/variance tradeoff of the PWLS+SOR method is comparable to the maximum-likelihood expectation-maximization (ML-EM) method (but with fewer iterations), and is improved relative to the conventional filtered backprojection (FBP) method. Qualitative results suggest that the streak artifacts common to the FBP method are nearly eliminated by the PWLS+SOR method, and indicate that the proposed method for weighting the measurements is a significant factor in the improvement over FBP.
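To make the PWLS objective concrete, here is a hedged sketch of a weighted least-squares cost with a nonnegativity-constrained, over-relaxed coordinate update. A simple quadratic energy penalty (0.5 * beta * ||x||^2) stands in for the roughness penalty typically used, and the data-based weights are supplied as a vector w; this illustrates the structure of the approach rather than reproducing the author's +SOR algorithm.

```python
import numpy as np

def pwls_sor(A, y, w, beta=0.1, n_iters=20, omega=1.0):
    """Minimize 0.5*(y - A x)^T W (y - A x) + 0.5*beta*||x||^2 subject to x >= 0
    by over-relaxed coordinate descent; W = diag(w) holds the data-based weights."""
    n_meas, n_pix = A.shape
    x = np.zeros(n_pix)
    r = y - A @ x                                    # running residual y - A x
    curv = np.einsum("ij,i,ij->j", A, w, A) + beta   # per-pixel curvature of the objective
    for _ in range(n_iters):
        for j in range(n_pix):
            grad = -(A[:, j] * w) @ r + beta * x[j]  # partial derivative wrt x_j
            new = max(0.0, x[j] - omega * grad / curv[j])
            r -= A[:, j] * (new - x[j])              # keep residual consistent with update
            x[j] = new
    return x
```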

673 citations