
Showing papers in "Medical Physics in 2010"


Journal ArticleDOI
TL;DR: The task group report includes a review of the literature to identify reported clinical findings and expected outcomes for this treatment modality.
Abstract: Task Group 101 of the AAPM has prepared this report for medical physicists, clinicians, and therapists in order to outline the best practice guidelines for the external-beam radiation therapy technique referred to as stereotactic body radiation therapy (SBRT). The task group report includes a review of the literature to identify reported clinical findings and expected outcomes for this treatment modality. Information is provided for establishing an SBRT program, including protocols, equipment, resources, and QA procedures. Additionally, suggestions for developing consistent documentation for prescribing, reporting, and recording SBRT treatment delivery are provided.

1,586 citations


Journal ArticleDOI
TL;DR: The XCAT provides an important tool in imaging research to evaluate and improve imaging devices and techniques and may also provide the necessary foundation with which to optimize clinical CT applications in terms of image quality versus radiation dose.
Abstract: Purpose: The authors develop the 4D extended cardiac-torso (XCAT) phantom for multimodality imaging research. Methods: Highly detailed whole-body anatomies for the adult male and female were defined in the XCAT using nonuniform rational B-spline (NURBS) and subdivision surfaces based on segmentation of the Visible Male and Female anatomical datasets from the National Library of Medicine as well as patient datasets. Using the flexibility of these surfaces, the Visible Human anatomies were transformed to match body measurements and organ volumes for a 50th percentile (height and weight) male and female. The desired body measurements for the models were obtained using the PEOPLESIZE program that contains anthropometric dimensions categorized from the 1st to the 99th percentile for US adults. The desired organ volumes were determined from ICRP Publication 89 [ICRP, "Basic anatomical and physiological data for use in radiological protection: reference values," ICRP Publication 89 (International Commission on Radiological Protection, New York, NY, 2002)]. The male and female anatomies serve as standard templates upon which anatomical variations may be modeled in the XCAT through user-defined parameters. Parametrized models for the cardiac and respiratory motions were also incorporated into the XCAT based on high-resolution cardiac- and respiratory-gated multislice CT data. To demonstrate the usefulness of the phantom, the authors show example simulation studies in PET, SPECT, and CT using publicly available simulation packages. Results: As demonstrated in the pilot studies, the 4D XCAT (which includes thousands of anatomical structures) can produce realistic imaging data when combined with accurate models of the imaging process. With the flexibility of the NURBS surface primitives, any number of different anatomies, cardiac or respiratory motions or patterns, and spatial resolutions can be simulated to perform imaging research. Conclusions: With the ability to produce realistic, predictive 3D and 4D imaging data from populations of normal and abnormal patients under various imaging parameters, the authors conclude that the XCAT provides an important tool in imaging research to evaluate and improve imaging devices and techniques. In the field of x-ray CT, the phantom may also provide the necessary foundation with which to optimize clinical CT applications in terms of image quality versus radiation dose, an area of research that is becoming more significant with the growing use of CT.

1,054 citations


Journal ArticleDOI
TL;DR: In this paper, a generalized normalization technique for MAR is proposed that reduces metal artifacts to a minimum, even close to metal regions and even for patients with dental fillings, which cause the most severe artifacts.
Abstract: Purpose: While modern clinical CT scanners under normal circumstances produce high quality images, severe artifacts degrade the image quality and the diagnostic value if metal prostheses or other metal objects are present in the field of measurement. Standard methods for metal artifact reduction (MAR) replace those parts of the projection data that are affected by metal (the so-called metal trace or metal shadow) by interpolation. However, while sinogram interpolation methods efficiently remove metal artifacts, new artifacts are often introduced, as interpolation cannot completely recover the information from the metal trace. The purpose of this work is to introduce a generalized normalization technique for MAR, allowing for efficient reduction of metal artifacts while adding almost no new ones. The method presented is compared to a standard MAR method, as well as MAR using simple length normalization. Methods: In the first step, metal is segmented in the image domain by thresholding. A 3D forward projection identifies the metal trace in the original projections. Before interpolation, the projections are normalized based on a 3D forward projection of a prior image. This prior image is obtained, for example, by a multithreshold segmentation of the initial image. The original raw data are divided by the projection data of the prior image and, after interpolation, denormalized again. Simulations and measurements are performed to compare normalized metal artifact reduction (NMAR) to standard MAR with linear interpolation and MAR based on simple length normalization. Results: Promising results for clinical spiral cone-beam data are presented in this work. Included are patients with hip prostheses, dental fillings, and spine fixation, who were scanned at pitch values ranging from 0.9 to 3.2. Image quality is improved considerably, particularly for metal implants within bone structures or in their proximity. The improvements are evaluated by comparing profiles through images and sinograms for the different methods and by inspecting ROIs. NMAR outperforms both other methods in all cases. It reduces metal artifacts to a minimum, even close to metal regions. Even for patients with dental fillings, which cause the most severe artifacts, satisfactory results are obtained with NMAR. In contrast to other methods, NMAR prevents the usual blurring of structures close to metal implants if the metal artifacts are moderate. Conclusions: NMAR clearly outperforms the other methods for both moderate and severe artifacts. The proposed method reliably reduces metal artifacts from simulated as well as from clinical CT data. Computationally efficient and inexpensive compared to iterative methods, NMAR can be used as an additional step in any conventional sinogram inpainting-based MAR method.
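
To make the normalization step concrete, the following is a minimal sketch of the NMAR idea on a single 2D sinogram, assuming NumPy arrays for the measured raw data, the forward-projected prior, and the metal-trace mask; the published method works on 3D cone-beam data and includes further refinements not shown here.

```python
# Illustrative NMAR-style normalization, inpainting, and denormalization
# on a 2D sinogram (hypothetical helper; names and shapes are assumptions).
import numpy as np

def nmar_interpolate(sinogram, prior_sinogram, metal_trace, eps=1e-6):
    """sinogram       : measured raw data, shape (views, detector_bins)
    prior_sinogram : forward projection of the prior (multithreshold) image
    metal_trace    : boolean mask of bins affected by metal, same shape"""
    norm = sinogram / (prior_sinogram + eps)          # flatten the data
    inpainted = norm.copy()
    bins = np.arange(norm.shape[1])
    for v in range(norm.shape[0]):                    # interpolate per view
        trace = metal_trace[v]
        if trace.any() and not trace.all():
            inpainted[v, trace] = np.interp(bins[trace], bins[~trace],
                                            norm[v, ~trace])
    return inpainted * (prior_sinogram + eps)          # denormalize again
```

The key point is that the interpolation acts on the flattened (normalized) sinogram, so the inpainted values inherit the structure of the prior once denormalized.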

505 citations


Journal ArticleDOI
TL;DR: The results show that microdosimetric measurements in liquid water are needed to quantitatively assess the validity of the software implementation for the liquid water phase; the GEANT4-DNA models represent a first step in extending the GEANT4 Monte Carlo toolkit to the simulation of biological effects of ionizing radiation.
Abstract: Purpose: The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H⁰, H⁺) and (He⁰, He⁺, He²⁺), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. Methods: An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov–Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. Results: The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. Conclusions: The GEANT4-DNA physics models available in the GEANT4 toolkit have been compared in this article to available experimental data in the water vapor phase as well as to several published recommendations on the mass stopping power. These models represent a first step in the extension of the GEANT4 Monte Carlo toolkit to the simulation of biological effects of ionizing radiation.
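
As an illustration of the kind of comparison described, the sketch below computes a Kolmogorov-Smirnov distance between a simulated and a measured differential cross section tabulated on a common grid; the array names are placeholders, and the paper itself relies on a dedicated statistical toolkit rather than this simplified calculation.

```python
# Minimal Kolmogorov-Smirnov style distance between two tabulated
# differential cross sections defined on the same energy (or angle) grid.
import numpy as np

def ks_distance(dcs_model, dcs_experiment):
    """Maximum distance between the normalized cumulative distributions."""
    cdf_model = np.cumsum(dcs_model) / np.sum(dcs_model)
    cdf_exp = np.cumsum(dcs_experiment) / np.sum(dcs_experiment)
    return np.max(np.abs(cdf_model - cdf_exp))
```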

410 citations


Journal ArticleDOI
TL;DR: The authors have successfully visualized submillimeter breast vasculature to a depth of 40 mm using an illumination intensity that is 32 times less than the maximum permissible exposure according to the American National Standard for Safe Use of Lasers.
Abstract: Purpose The authors report a noninvasive technique and instrumentation for visualizing vasculature in the breast in three dimensions without using either ionizing radiation or exogenous contrast agents, such as iodine or gadolinium. Vasculature is visualized by virtue of its high hemoglobin content compared to surrounding breast parenchyma. The technique is compatible with dynamic contrast-enhanced studies. Methods Photoacoustic waves were stimulated in the breast with a pulsed laser operating at 800 nm and a mean exposure of 20 mJ/pulse over an area of approximately 20 cm². These waves were subsequently detected by a hemispherical array of piezoelectric transducers, the temporal signals from which were filtered and backprojected to form three-dimensional images with nearly uniform k-space sampling. Results Three-dimensional vascular images of a human volunteer demonstrated a clear visualization of vascular anatomy with submillimeter spatial resolution to a maximum depth of 40 mm using a 24 s image acquisition protocol. Spatial resolution was nearly isotropic and approached 250 μm over a 64 × 64 × 50 mm field of view. Conclusions The authors have successfully visualized submillimeter breast vasculature to a depth of 40 mm using an illumination intensity that is 32 times less than the maximum permissible exposure according to the American National Standard for Safe Use of Lasers. Clearly, the authors can achieve greater penetration depth in the breast by increasing the intensity and the cross-sectional area of the illumination beam. Given the 24 s image acquisition time without contrast agent, dynamic, contrast-enhanced, photoacoustic breast imaging using optically absorbing contrast agents is conceivable in the future.
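
The reconstruction step can be pictured with a rough delay-and-sum backprojection sketch; the geometry and signal arrays below are assumptions, and the authors' implementation additionally applies temporal filtering and weighting to obtain nearly uniform k-space sampling.

```python
# Delay-and-sum backprojection sketch for a hemispherical transducer array
# (placeholder geometry; no filtering or k-space weighting applied).
import numpy as np

def backproject(signals, t, transducer_xyz, voxel_xyz, c=1500.0):
    """signals: (n_transducers, n_samples); t: sample times (s);
    transducer_xyz, voxel_xyz: (n, 3) positions in metres; c: speed of sound."""
    image = np.zeros(len(voxel_xyz))
    dt = t[1] - t[0]
    for s, r in zip(signals, transducer_xyz):
        delays = np.linalg.norm(voxel_xyz - r, axis=1) / c   # time of flight
        idx = np.clip(np.round((delays - t[0]) / dt).astype(int), 0, len(t) - 1)
        image += s[idx]                                       # sum delayed samples
    return image
```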

300 citations


Journal ArticleDOI
TL;DR: For relative and absolute dosimetry of radiation therapy beams, the weak energy dependence of EBT2 makes it well suited for clinical use compared to other films.
Abstract: Purpose: Since the Gafchromic film EBT has recently been replaced by the newer model EBT2, its characterization, especially its energy dependence, has become critically important. The energy dependence of the dose response of Gafchromic EBT2 film is evaluated for a broad range of energies from different radiation sources used in radiation therapy. Methods: The beams used for this study comprised kilovoltage x rays (75, 125, and 250 kVp), ¹³⁷Cs gamma rays (662 keV), ⁶⁰Co gamma rays (1.17–1.33 MeV), megavoltage x rays (6 and 18 MV), electron beams (6 and 20 MeV), and proton beams (100 and 250 MeV). The film's response to each of the above energies was measured over the dose range of 0.4–10 Gy, which corresponds to optical densities ranging from 0.05 to 0.74 for the film reader used. Results: The energy dependence of EBT2 was found to be relatively small within measurement uncertainties (1σ = ±4.5%) for all energies and modalities. Conclusion: For relative and absolute dosimetry of radiation therapy beams, the weak energy dependence of EBT2 makes it well suited for clinical use compared to other films.

262 citations


Journal ArticleDOI
TL;DR: It is found that high quality CBCT image can be reconstructed from undersampled and potentially noisy projection data by using the proposed method, and it is demonstrated that compressed sensing outperforms the traditional algorithm when dealing with sparse, and possibly noisy, CBCT projection views.
Abstract: Purpose: This article considers the problem of reconstructing cone-beam computed tomography (CBCT) images from a set of undersampled and potentially noisy projection measurements. Methods: The authors cast the reconstruction as a compressed sensing problem based on l1 norm minimization constrained by statistically weighted least-squares of CBCT projection data. For accurate modeling, the noise characteristics of the CBCT projection data are used to determine the relative importance of each projection measurement. To solve the compressed sensing problem, the authors employ a method minimizing the total-variation norm, satisfying a prespecified level of measurement consistency using a first-order method developed by Nesterov. Results: The method converges quickly to the optimal solution without excessive memory requirements, thanks to the use of iterative forward and back-projections. The performance of the proposed algorithm is demonstrated through a series of digital and experimental phantom studies. It is found that a high quality CBCT image can be reconstructed from undersampled and potentially noisy projection data by using the proposed method. Both sparse sampling and decreasing the x-ray tube current (i.e., noisy projection data) lead to a reduction of the radiation dose in CBCT imaging. Conclusions: It is demonstrated that compressed sensing outperforms the traditional algorithm when dealing with sparse, and potentially noisy, CBCT projection views.
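
The sketch below conveys the flavor of the reconstruction as a gradient-descent, TV-penalized, statistically weighted least-squares problem. It is a simplified stand-in for the constrained Nesterov first-order scheme used in the paper, with the projector A, backprojector At, and statistical weights w supplied by the user.

```python
# TV-penalized, statistically weighted least-squares reconstruction by plain
# gradient descent (a simplified sketch, not the paper's constrained solver).
import numpy as np

def tv_grad(x, eps=1e-8):
    """Gradient of an isotropic total-variation penalty for a 2D image."""
    dx = np.diff(x, axis=0, append=x[-1:, :])
    dy = np.diff(x, axis=1, append=x[:, -1:])
    mag = np.sqrt(dx**2 + dy**2 + eps)
    div = (np.diff(dx / mag, axis=0, prepend=0)
           + np.diff(dy / mag, axis=1, prepend=0))
    return -div

def reconstruct(y, w, A, At, shape, lam=0.05, step=1e-3, n_iter=200):
    """y: projections; w: per-ray statistical weights; A/At: projector pair."""
    x = np.zeros(shape)
    for _ in range(n_iter):
        grad = At(w * (A(x) - y)) + lam * tv_grad(x)   # weighted data term + TV
        x = np.clip(x - step * grad, 0, None)          # enforce non-negativity
    return x
```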

254 citations


Journal ArticleDOI
TL;DR: A robust beam or plan is defined as one that maintained a diode percentage pass rate greater than 90% at 2%/1 mm, indicating delivery that was deemed accurate when compared to the planned dose, even under the stricter evaluation criteria.
Abstract: Purpose: To evaluate the utility of a new complexity metric, the modulation complexity score (MCS), in the treatment planning and quality assurance processes and to evaluate the relationship of the metric with deliverability. Methods: A multisite (breast, rectum, prostate, prostate bed, lung, and head and neck) and site-specific (lung) dosimetric evaluation has been completed. The MCS was calculated for each beam and the overall treatment plan. A 2D diode array (MapCHECK™, Sun Nuclear, Melbourne, FL) was used to acquire measurements for each beam. The measured and planned dose (Pinnacle3, Philips, Madison, WI) was evaluated using different percent difference and distance to agreement (DTA) criteria (3%/3 mm and 2%/1 mm), and the relationship between the dosimetric results and complexity (as measured by the MCS or simple beam parameters) was assessed. Results: For the multisite analysis (243 plans total), the mean MCS for each treatment site was breast (0.92), rectum (0.858), prostate (0.837), prostate bed (0.652), lung (0.631), and head and neck (0.356). The MCS allowed for compilation of treatment site-specific statistics, which is useful for comparing different techniques, as well as for comparison of individual treatment plans with the typical complexity levels. For the six plans selected for dosimetry, the average diode percent pass rate was 98.7% (minimum of 96%) for the 3%/3 mm evaluation criteria. The average difference in absolute dose measurement between the planned and measured dose was 1.7 cGy. The detailed lung analysis also showed excellent agreement between the measured and planned dose, as all beams had a diode percentage pass rate for the 3%/3 mm criteria of greater than 95.9%, with an average pass rate of 99.0%. The average absolute maximum dose difference for the lung plans was 0.7 cGy. There was no direct correlation between the MCS and simple beam parameters which could be used as a surrogate for complexity level (i.e., number of segments or MU). An evaluation criterion of 2%/1 mm reliably allowed for the identification of beams that are dosimetrically robust. In this study, we defined a robust beam or plan as one that maintained a diode percentage pass rate greater than 90% at 2%/1 mm, indicating delivery that was deemed accurate when compared to the planned dose, even under the stricter evaluation criteria. MCS and MU threshold criteria were determined by defining a required specificity of 1.0. An MCS threshold of 0.8 allowed for identification of robust deliverability with a sensitivity of 0.36. In contrast, MU had a lower sensitivity of 0.23 for a threshold of 50 MU. Conclusions: The MCS allows for a quantitative assessment of plan complexity, on a fixed scale, that can be applied to all treatment sites and can provide more information related to dose delivery than simple beam parameters. This could prove useful throughout the entire treatment planning and QA process.

235 citations


Journal ArticleDOI
TL;DR: A fast GPU-based algorithm is developed to reconstruct CBCT from undersampled and noisy projection data so as to lower the imaging dose considerably; the high computational efficiency of this algorithm makes the iterative CBCT reconstruction approach applicable in real clinical environments.
Abstract: Purpose: Cone-beam CT (CBCT) plays an important role in image guided radiation therapy (IGRT). However, the large radiation dose from serial CBCT scans in most IGRT procedures raises a clinical concern, especially for pediatric patients who are essentially excluded from receiving IGRT for this reason. The goal of this work is to develop a fast GPU-based algorithm to reconstruct CBCT from undersampled and noisy projection data so as to lower the imaging dose. Methods: The CBCT is reconstructed by minimizing an energy functional consisting of a data fidelity term and a total variation regularization term. The authors developed a GPU-friendly version of the forward-backward splitting algorithm to solve this model. A multigrid technique is also employed. Results: It is found that 20-40 x-ray projections are sufficient to reconstruct images with satisfactory quality for IGRT. The reconstruction time ranges from 77 to 130 s on an NVIDIA Tesla C1060 (NVIDIA, Santa Clara, CA) GPU card, depending on the number of projections used, which is estimated to be about 100 times faster than similar iterative reconstruction approaches. Moreover, phantom studies indicate that the algorithm enables the CBCT to be reconstructed under a scanning protocol with as low as 0.1 mA s/projection. Compared with the currently widely used full-fan head and neck scanning protocol of approximately 360 projections with 0.4 mA s/projection, it is estimated that an overall 36-72 times dose reduction has been achieved with this fast CBCT reconstruction algorithm. Conclusions: This work indicates that the developed GPU-based CBCT reconstruction algorithm is capable of lowering the imaging dose considerably. The high computational efficiency of this algorithm makes the iterative CBCT reconstruction approach applicable in real clinical environments.
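
A minimal sketch of the forward-backward splitting outer loop for an energy of the form data fidelity plus total variation is shown below. Here prox_tv stands for a user-supplied TV proximal step (for example, a few inner iterations of a Chambolle-type solver); this is not the authors' GPU or multigrid implementation.

```python
# One possible forward-backward splitting loop for
#   E(x) = 0.5 * ||A x - y||^2 + lam * TV(x)
# A/At are a projector/backprojector pair; prox_tv is a TV proximal operator.
def forward_backward(y, A, At, prox_tv, x0, lam=0.05, step=1e-3, n_iter=100):
    x = x0
    for _ in range(n_iter):
        x = x - step * At(A(x) - y)      # forward (gradient) step on the data term
        x = prox_tv(x, lam * step)       # backward (proximal) step on the TV term
    return x
```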

233 citations


Journal ArticleDOI
TL;DR: The results of this study support the utility of 3-D microwave tomography for imaging the distribution of normal tissues in the breast, specifically, dense fibroglandular tissue versus less dense adipose tissue, and suggest that further investigation of its use for volumetric evaluation of breast density is warranted.
Abstract: Conclusions: The results of this study support the utility of 3-D microwave tomography for imaging the distribution of normal tissues in the breast, specifically, dense fibroglandular tissue versus less dense adipose tissue, and suggest that further investigation of its use for volumetric evaluation of breast density is warranted. © 2010 American Association of Physicists in Medicine. DOI: 10.1118/1.3443569

232 citations


Journal ArticleDOI
TL;DR: Initial clinical results demonstrated that the proposed technique can be applied to diagnosis of lesions with calcifications or hemorrhages, and it bears the potential to obviate the need for confirmation with computed tomography.
Abstract: Purpose: Identification of calcifications and hemorrhages is essential for the etiological diagnosis of cerebral lesions. The purpose of this work was to develop a robust method for characterization of para- and diamagnetic intracerebral lesions based on clinical gradient-echo magnetic resonance phase data acquired at 1.5 Tesla. Methods: The magnetic susceptibility distribution of biological tissue produces a distinct magnetic field pattern, which is directly reflected in gradient-echo magnetic resonance phase images. Compared to brain parenchyma, iron-laden tissues are more paramagnetic, whereas mineralized tissues usually possess more diamagnetic susceptibilities. Magnetic resonance phase data were inverted to the underlying susceptibility distribution utilizing additional geometrical information about the lesions, which was obtained from the gradient-echo magnitude signal void corresponding to the lesions. Clinical magnetic resonance exams of three patients with multiple brain lesions (total n = 70) were processed and evaluated. For one patient, the results were validated by an additionally available computed tomography scan. Numerical simulations were conducted to evaluate the robustness of the method. Results: The obtained susceptibility maps showed impressive delineation of lesions, vessels, and potentially iron-laden tissue. Compensation of the nonlocal field perturbations was clearly discernable on the susceptibility maps. In all cases, discrimination of para- from diamagnetic lesions was achieved and the results were confirmed by the additional computed tomography. The numerical simulations demonstrated that robust determination of the total magnetic moment of lesions is possible. Thus, the proposed method is able to yield quantitative values for the minimum magnetic susceptibility of lesions. Conclusions: A method has been developed for noninvasive, semiautomatic characterization of brain lesions based on magnetic resonance imaging data. Initial clinical results demonstrated that the proposed technique can be applied to diagnosis of lesions with calcifications or hemorrhages. If confirmed by larger studies, it bears the potential to obviate the need for confirmation with computed tomography.
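
The field-to-susceptibility relation that such inversions rely on is the standard Fourier-domain dipole model, sketched below in the forward direction (B0 assumed along the last array axis). The paper's inversion additionally exploits the lesion geometry obtained from the magnitude images, which is not represented here.

```python
# Forward dipole model relating a susceptibility map to the GRE field shift
# (relative units); the DC term of the dipole kernel is undefined and set to 0.
import numpy as np

def field_from_susceptibility(chi, voxel_size=(1.0, 1.0, 1.0)):
    kx, ky, kz = np.meshgrid(
        *[np.fft.fftfreq(n, d) for n, d in zip(chi.shape, voxel_size)],
        indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(divide="ignore", invalid="ignore"):
        dipole = 1.0 / 3.0 - kz**2 / k2       # B0 along the third (z) axis
    dipole[k2 == 0] = 0.0
    return np.real(np.fft.ifftn(dipole * np.fft.fftn(chi)))
```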

Journal ArticleDOI
TL;DR: VMAT was able to provide approximately a 40% reduction in treatment time while maintaining comparable plan quality to that of HT, demonstrating that both VMAT and HT are capable of providing more uniform target doses and improved normal tissue sparing as compared with fixed field IMRT.
Abstract: Purpose: Helical tomotherapy (HT) and volumetric modulated arc therapy (VMAT) are arc-based approaches to IMRT delivery. The objective of this study is to compare VMAT to both HT and fixed field IMRT in terms of plan quality, delivery efficiency, and accuracy. Methods: Eighteen cases including six prostate, six head-and-neck, and six lung cases were selected for this study. IMRT plans were developed using direct machine parameter optimization in the Pinnacle3 treatment planning system. HT plans were developed using a Hi-Art II planning station. VMAT plans were generated using both the Pinnacle3 SmartArc IMRT module and a home-grown arc sequencing algorithm. VMAT and HT plans were delivered using Elekta's PreciseBeam VMAT linac control system (Elekta AB, Stockholm, Sweden) and a TomoTherapy Hi-Art II system (TomoTherapy Inc., Madison, WI), respectively. Treatment plan quality assurance (QA) for VMAT was performed using the IBA MatriXX system while an ion chamber and films were used for HT plan QA. Results: The results demonstrate that both VMAT and HT are capable of providing more uniform target doses and improved normal tissue sparing as compared with fixed field IMRT. In terms of delivery efficiency, VMAT plan deliveries on average took 2.2 min for prostate and lung cases and 4.6 min for head-and-neck cases. These values increased to 4.7 and 7.0 min for HT plans. Conclusions: Both VMAT and HT plans can be delivered accurately based on their own QA standards. Overall, VMAT was able to provide approximately a 40% reduction in treatment time while maintaining comparable plan quality to that of HT.

Journal ArticleDOI
TL;DR: In this article, the authors used the Monte Carlo code EGSnrc to obtain the spectra of secondary electrons from atoms of gold approximating GNPs and molecules of water under photon irradiation of a tumor loaded with GNPs.
Abstract: Purpose: An approach known as gold nanoparticle-aided radiation therapy (GNRT) is a recent development in radiation therapy which seeks to make a tumor more susceptible to radiation damage by modifying its photon interaction properties with an infusion of gold nanoparticles (GNPs). The purpose of this study was to quantify the energy deposition due to secondary electrons from GNPs on a nanometer scale and to calculate the corresponding microscopic dose enhancement factor around GNPs. Methods: The Monte Carlo code EGSnrc was modified to obtain the spectra of secondary electrons from atoms of gold approximating GNPs and molecules of water under photon irradiation of a tumor loaded with GNPs. Six different photon sources were used: ¹²⁵I, ¹⁰³Pd, ¹⁶⁹Yb, ¹⁹²Ir, 50 kVp, and 6 MV x rays. Treating the scored electron spectra as point sources within an infinite medium of water, the event-by-event Monte Carlo code NOREC was used to quantify the radial dose distribution, giving rise to gold/water electron dose point kernels and corresponding microscopic dose enhancement factors. These kernels were applied to a test case based on a scanning electron microscope image of a GNP distribution in tissue, enabling the determination of the microscopic dose enhancement at each dose point. Results: For the lower energy sources ¹²⁵I, ¹⁰³Pd, ¹⁶⁹Yb, and 50 kVp, the secondary electron fluence within a GNP-loaded tumor was increased by as much as two orders of magnitude, leading to a two orders of magnitude increase in electron energy deposition over radial distances up to 10 μm. For the test case considered, the dose was enhanced by factors ranging from 2 to 20 within 5 μm of GNPs, and by 5% as far away as 30 μm. Conclusions: This study demonstrates a remarkable microscopic dose enhancement due to GNPs and low energy photon sources. By quantifying the microscopic dose enhancement factor for a given photon source as a function of distance from GNPs, it also enables the selection of either a passive or an active tumor targeting strategy using GNPs which will maximize the radiobiological benefit from GNRT.
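
The kernel-application step can be pictured with the small sketch below, which sums a radial dose point kernel over GNP positions to obtain a microscopic dose map; the kernel table and coordinates are placeholders rather than the NOREC-derived kernels of the study.

```python
# Apply a radial dose point kernel around gold nanoparticle positions
# (illustrative only; kernel values and positions are assumptions).
import numpy as np

def microscopic_dose(point_xyz_um, gnp_xyz_um, kernel_r_um, kernel_dose):
    """Sum kernel contributions from all GNPs at each dose point (coords in μm)."""
    dose = np.zeros(len(point_xyz_um))
    for g in gnp_xyz_um:
        r = np.linalg.norm(point_xyz_um - g, axis=1)
        dose += np.interp(r, kernel_r_um, kernel_dose, right=0.0)
    return dose
```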

Journal ArticleDOI
TL;DR: In this paper, the authors developed a new analytical pulse pileup model for both peak and tail pileup effects for nonparalyzable detectors, which takes into account the bipolar shape of the pulse, the distribution function of time intervals between random events, and the input probability density function of photon energies.
Abstract: Purpose: Recently, novel CdTe photon counting x-ray detectors (PCXDs) with energy discrimination capabilities have been developed. When such detectors are operated under a high x-ray flux, however, coincident pulses distort the recorded energy spectrum. These distortions are called pulse pileup effects. It is essential to compensate for these effects on the recorded energy spectrum in order to take full advantage of the spectral information PCXDs provide. Such compensation can be achieved by incorporating a pileup model into the image reconstruction process for computed tomography, that is, as a part of the forward imaging process, and iteratively estimating either the imaged object or the line integrals using, e.g., a maximum likelihood approach. The aim of this study was to develop a new analytical pulse pileup model for both peak and tail pileup effects for nonparalyzable detectors. Methods: The model takes into account the following factors: the bipolar shape of the pulse, the distribution function of time intervals between random events, and the input probability density function of photon energies. The authors used Monte Carlo simulations to evaluate the model. Results: The recorded spectra estimated by the model were in excellent agreement with those obtained by Monte Carlo simulations for various levels of pulse pileup effects. The coefficients of variation (i.e., the root mean square difference divided by the mean of measurements) were 5.3%–10.0% for deadtime losses of 1%–50% with a polychromatic incident x-ray spectrum. Conclusions: The proposed pulse pileup model can predict the recorded spectrum with relatively good accuracy.
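
As a point of reference for the counting statistics involved, the classic nonparalyzable dead-time relation is sketched below; the paper's model goes further by including the bipolar pulse shape and the incident spectrum, which this simple relation ignores.

```python
# Nonparalyzable dead-time relation: with true rate n and dead time tau,
# the recorded rate is m = n / (1 + n * tau).
def recorded_rate(true_rate, dead_time):
    return true_rate / (1.0 + true_rate * dead_time)

def dead_time_loss(true_rate, dead_time):
    """Fraction of counts lost, e.g. spanning roughly 0.01-0.50 in the study."""
    return 1.0 - recorded_rate(true_rate, dead_time) / true_rate
```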

Journal ArticleDOI
TL;DR: The emphasis of the report is to describe the rationale for the proposed QA program and to provide example tests that can be performed, drawing from the collective experience of the task group members and the published literature.
Abstract: Helical tomotherapy is a relatively new modality with integrated treatment planning and delivery hardware for radiation therapy treatments. In view of the uniqueness of the hardware design of the helical tomotherapy unit and its implications in routine quality assurance, the Therapy Physics Committee of the American Association of Physicists in Medicine commissioned Task Group 148 to review this modality and make recommendations for quality assurance related methodologies. The specific objectives of this Task Group are: (a) To discuss quality assurance techniques, frequencies, and tolerances and (b) discuss dosimetric verification techniques applicable to this unit. This report summarizes the findings of the Task Group and aims to provide the practicing clinical medical physicist with the insight into the technology that is necessary to establish an independent and comprehensive quality assurance program for a helical tomotherapy unit. The emphasis of the report is to describe the rationale for the proposed QA program and to provide example tests that can be performed, drawing from the collective experience of the task group members and the published literature. It is expected that as technology continues to evolve, so will the test procedures that may be used in the future to perform comprehensive quality assurance for helical tomotherapy units.

Journal ArticleDOI
TL;DR: A novel unsupervised PET image segmentation technique that allows the quantification of lesions in the presence of heterogeneity of tracer uptake was developed and evaluated and could find other applications in clinical oncology such as the assessment of response to treatment.
Abstract: PURPOSE: Accurate and robust image segmentation was identified as one of the most challenging issues facing PET quantification in oncological imaging. This difficulty is compounded by the low spatial resolution and high noise characteristics of PET images. The fuzzy C-means (FCM) clustering algorithm was largely used in various medical image segmentation approaches. However, the algorithm is sensitive to both noise and intensity heterogeneity since it does not take into account spatial contextual information. METHODS: To overcome this limitation, a new fuzzy segmentation technique adapted to typical noisy and low resolution oncological PET data is proposed. PET images smoothed using a nonlinear anisotropic diffusion filter are added as a second input to the proposed FCM algorithm to incorporate spatial information (FCM-S). In addition, a methodology was developed to integrate the à trous wavelet transform in the standard FCM algorithm (FCM-SW) to allow handling of heterogeneous lesions' uptake. The algorithm was applied to the simulated data of the NCAT phantom, incorporating heterogeneous lesions in the lung, and clinical PET/CT images of 21 patients presenting with histologically proven non-small-cell lung cancer (NSCLC) and 7 patients presenting with laryngeal squamous cell carcinoma (LSCC) to assess its performance for segmenting tumors with arbitrary size, shape, and tracer uptake. For NSCLC patients, the maximal tumor diameters measured from the macroscopic examination of the surgical specimen served as the ground truth for comparison with the maximum diameter estimated by the segmentation technique, whereas for LSCC patients, the 3D macroscopic tumor volume was considered as the ground truth for comparison with the corresponding PET-based volume. The proposed algorithm was also compared to the classical FCM segmentation technique. RESULTS: There is a good correlation (R² = 0.942) between the actual maximal diameter of primary NSCLC tumors estimated using the proposed PET segmentation procedure and those measured from the macroscopic examination, and the regression line agreed well with the line of identity (slope = 1.08) for the group analysis of the clinical data. The standard FCM algorithm seems to underestimate actual maximal diameters of the clinical data, resulting in a mean error of -4.6 mm (relative error of -10.8 ± 23.1%) for all data sets. The mean error of maximal diameter estimation was reduced to 0.1 mm (0.9 ± 14.4%) using the proposed FCM-SW algorithm. Likewise, the mean relative error on the estimated volume for LSCC patients was reduced from 21.7 ± 22.0% for FCM to 8.6 ± 28.3% using the proposed FCM-SW technique. CONCLUSIONS: A novel unsupervised PET image segmentation technique that allows the quantification of lesions in the presence of heterogeneity of tracer uptake was developed and evaluated. The technique is being further refined and assessed in a clinical setting to delineate treatment volumes for the purpose of PET-guided radiation therapy treatment planning but could find other applications in clinical oncology such as the assessment of response to treatment.
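
The spirit of adding spatial information to FCM can be sketched as below, where the smoothed image (e.g., after anisotropic diffusion) enters both the memberships and the centroid update. This is a generic FCM-S style update under those assumptions, not the exact FCM-SW formulation of the paper.

```python
# Spatially constrained fuzzy C-means sketch: x is the raw image, x_bar a
# smoothed version of it; alpha weights the spatial (smoothed) term.
import numpy as np

def fcm_s(x, x_bar, n_clusters=3, m=2.0, alpha=1.0, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    x, x_bar = x.ravel(), x_bar.ravel()
    v = rng.choice(x, n_clusters)                        # initial centroids
    for _ in range(n_iter):
        d = (x[None, :] - v[:, None])**2 + alpha * (x_bar[None, :] - v[:, None])**2
        d = np.maximum(d, 1e-12)
        u = d**(-1.0 / (m - 1.0))
        u /= u.sum(axis=0, keepdims=True)                # fuzzy memberships
        um = u**m
        v = (um @ (x + alpha * x_bar)) / ((1.0 + alpha) * um.sum(axis=1))
    return u, v                                          # memberships, centroids
```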

Journal ArticleDOI
TL;DR: The importance of in vivo (EPID) dosimetry for all treatment plans as well as the ability of the method to assess the dosimetric impact of deviations found are shown.
Abstract: The potential for detrimental incidents and the ever increasing complexity of patient treatments emphasize the need for accurate dosimetric verification in radiotherapy. For this reason, all curative treatments are verified, either pretreatment or in vivo, by electronic portal imaging device (EPID) dosimetry in the Radiation Oncology Department of the Netherlands Cancer Institute-Antoni van Leeuwenhoek hospital, Amsterdam, The Netherlands. Between the clinical introduction of the method in January 2005 and August 2009, treatment plans of 4337 patients were verified. Among these plans, 17 serious errors were detected that led to intervention. Due to their origin, nine of these errors would not have been detected with pretreatment verification. The method is illustrated in detail by the case of a plan transfer error detected in a 5 × 5 Gy intensity-modulated radiotherapy (IMRT) rectum treatment. The EPID reconstructed dose at the isocenter was 6.3% below the planned value. Investigation of the plan transfer chain revealed that, due to a network transfer error, the plan was corrupted. 3D analysis of the acquired EPID data revealed serious underdosage of the planning target volume: on average 11.6%, locally up to 20%. This report shows the importance of in vivo (EPID) dosimetry for all treatment plans as well as the ability of the method to assess the dosimetric impact of deviations found.

Journal ArticleDOI
TL;DR: The effect of acquisition parameters on lesion detectability depends on signal size, and increasing the angular scan range increased detectability for all signal sizes.
Abstract: Purpose: Tomosynthesis is a promising modality for breast imaging. The appearance of the tomosynthesis reconstructed image is greatly affected by the choice of acquisition and reconstruction parameters. The purpose of this study was to investigate the limitations of tomosynthesis breast imaging due to scan parameters and quantum noise. Tomosynthesis image quality was assessed based on performance of a mathematical observer model in a signal-known exactly (SKE) detection task. Methods: SKE detectability (d′) was estimated using a prewhitening observer model. Structured breast background was simulated using filtered noise. Detectability was estimated for designer nodules ranging from 0.05 to 0.8 cm in diameter. Tomosynthesis slices were reconstructed using iterative maximum-likelihood expectation-maximization. The tomosynthesis scan angle was varied between 15° and 60°, the number of views between 11 and 41, and the total number of x-ray quanta was ∞, 6×10⁵, and 6×10⁴. Detectability in tomosynthesis was compared to that in a single projection. Results: For constant angular sampling distance, increasing the angular scan range increased detectability for all signal sizes. Large-scale signals were little affected by quantum noise or angular sampling. For small-scale signals, quantum noise and insufficient angular sampling degraded detectability. At high quantum noise levels, an angular step size of 3° or below was sufficient to avoid image degradation. At lower quantum noise levels, increased angular sampling always resulted in increased detectability. The ratio of detectability in the tomosynthesis slice to that in a single projection exhibited a peak that shifted to larger signal sizes when the angular range increased. For a given angular range, the peak shifted toward smaller signals when the number of views was increased. The ratio was greater than unity for all conditions evaluated. Conclusion: The effect of acquisition parameters on lesion detectability depends on signal size. Tomosynthesis scan angle had an effect on detectability for all signal sizes, while quantum noise and angular sampling only affected the detectability of small-scale signals.
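
Up to normalization constants, a prewhitening detectability index of the kind used here can be computed from the expected signal and the noise power spectrum as sketched below; the inputs are assumed to come from the simulated tomosynthesis slices.

```python
# Prewhitening observer detectability: d'^2 = sum_f |S(f)|^2 / NPS(f),
# with normalization constants (pixel size, FOV) omitted for brevity.
import numpy as np

def prewhitening_dprime(signal_image, nps):
    """signal_image: mean signal-present minus signal-absent image (2D);
    nps: noise power spectrum on the same DFT grid (2D, strictly positive)."""
    s = np.fft.fft2(signal_image)
    return np.sqrt(np.sum(np.abs(s)**2 / nps))
```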

Journal ArticleDOI
TL;DR: Gafchromic EBT2 can be used in clinical practice in the same way as the old EBT film; moreover, its easier handling, the result of the new enhancements, improves film behavior and expands the potential applications of radiochromic film dosimetry.
Abstract: Purpose: Radiochromic film has become an important tool to assess complex dose distributions. In particular, EBT was accepted by the scientific community as a reference two-dimensional detector. Recently, Gafchromic EBT2 has replaced the old film, providing new improvements in both accuracy and handling. Methods: This work presents a dosimetric study of the new Gafchromic EBT2 using an Epson 10000XL flatbed scanner, also comparing the results with EBT film as a reference when necessary. The most important film characteristics have been studied, such as ambient light sensitivity, the different possibilities of the three RGB color channels, postirradiation development, high dose behavior, exposure at temperatures similar to that of the human body, and dependence on orientation during the scanning process. Results: The results obtained confirm a considerably lower sensitivity to ambient light of EBT2, as well as a fast stabilization of the film within 2 h. It has also been found that the green channel shows better behavior at high dose levels up to 35 Gy, in addition to the good behavior of the red channel at doses below 10 Gy. Other features, such as temperature independence and scanning orientation dependence, have also been shown. Conclusions: Gafchromic EBT2 can be used in clinical practice in the same way as the old EBT film; moreover, its easier handling, the result of the new enhancements, improves film behavior and expands the potential applications of radiochromic film dosimetry.

Journal ArticleDOI
Jon J. Kruse
TL;DR: Deconstruction of an IMRT plan for field-by-field QA requires complex analysis methods such as the gamma function, yet the fraction of pixels passing the gamma analysis was found to be a poor predictor of dosimetric accuracy for both planar dosimeters and both sets of gamma criteria.
Abstract: Purpose: To report on the sensitivity of single field planar measurements in identifying IMRT plans with poor calculational accuracy. Methods: Three IMRT plans for head and neck cancer were subjected to extensive quality assurance. The plans were recalculated on a cylindrical phantom and between eight and 18 low gradient points were measured in each plan with an ion chamber. Every point measured in these plans agreed to within 4% of the dose predicted by the planning system and the plans were judged acceptable for clinical use. Each plan was then reoptimized with aggressive dose constraints so that the new treatment fields were more highly modulated than the ones from the original plans. Very complex fields can be calculated less accurately, and ion chamber measurements of these plans in the cylindrical phantom confirmed significant dosimetric errors: several of the measured points in each plan differed from the calculated dose by more than 4%, with a maximum single deviation of 10.6%. These three plans were judged unacceptable for clinical use. All six plans (three acceptable, three unacceptable) were then analyzed with two means of individual field planar dosimetry: portal imaging with an electronic portal imaging device (EPID) and an ion chamber array. Gamma analysis was performed on each set of planar measurements with 2%/2 mm distance to agreement (DTA) and 3%/3 mm DTA criteria to try to determine a gamma analysis threshold which would differentiate the flawed plans from the acceptable ones. Results: With the EPID and 2%/2 mm DTA criteria, between 88.2% and 92.8% of pixels from the acceptable IMRT plans passed the gamma analysis, and between 87.5% and 91.9% passed for the unacceptable IMRT plans. With the ion chamber array and 2%/2 mm DTA criteria, between 92.4% and 94.9% of points in the acceptable plans passed the gamma analysis, while 86.8% to 98.3% of the points in the unacceptable plans passed the gamma analysis. The difference between acceptable and unacceptable plans was diminished further when gamma criteria were expanded to 3%/3 mm DTA. The fraction of pixels passing the gamma analysis was found to be a poor predictor of dosimetric accuracy with both planar dosimeters, as well as with both sets of gamma criteria. Conclusions: Deconstruction of an IMRT plan for field-by-field QA requires complex analysis methods such as the gamma function. Distance to agreement, a component of the gamma function, has clinical relevance in a composite plan, but when applied to individual, highly modulated fields, it can mask important dosimetric errors. While single field planar dosimetry may comprise one facet of an effective QA protocol, gamma analysis of single field measurements is insensitive to important dosimetric inaccuracies of the overall plan.
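
For reference, a brute-force sketch of the gamma pass-rate computation for a planar dose comparison (e.g., 2%/2 mm or 3%/3 mm, with a global dose normalization and a low-dose cutoff) is given below; commercial QA software uses faster search strategies and offers other normalization options.

```python
# Gamma index pass rate for two coregistered 2D dose planes on the same grid.
import numpy as np

def gamma_pass_rate(dose_eval, dose_ref, pixel_mm, dose_crit=0.02, dta_mm=2.0,
                    threshold=0.1):
    ny, nx = dose_ref.shape
    yy, xx = np.meshgrid(np.arange(ny) * pixel_mm, np.arange(nx) * pixel_mm,
                         indexing="ij")
    norm = dose_crit * dose_ref.max()                  # global dose normalization
    gammas = []
    for iy in range(ny):
        for ix in range(nx):
            if dose_ref[iy, ix] < threshold * dose_ref.max():
                continue                               # skip low-dose region
            dist2 = ((yy - yy[iy, ix])**2 + (xx - xx[iy, ix])**2) / dta_mm**2
            ddiff2 = (dose_eval - dose_ref[iy, ix])**2 / norm**2
            gammas.append(np.sqrt(np.min(dist2 + ddiff2)))
    gammas = np.array(gammas)
    return 100.0 * np.mean(gammas <= 1.0)              # percent of points passing
```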

Journal ArticleDOI
TL;DR: It is possible to estimate patient-specific radiation dose and cancer risk from CT examinations by combining a validated Monte Carlo program with patient-specific anatomical models that are derived from the patients' clinical CT data and supplemented by transformed models of reference adults.
Abstract: Purpose: Current methods for estimating and reporting radiation dose from CT examinations are largely patient-generic; the body size and hence dose variation from patient to patient is not reflected. Furthermore, the current protocol designs rely on dose as a surrogate for the risk of cancer incidence, neglecting the strong dependence of risk on age and gender. The purpose of this study was to develop a method for estimating patient-specific radiation dose and cancer risk from CT examinations. Methods: The study included two patients (a 5-week-old female patient and a 12-year-old male patient), who underwent 64-slice CT examinations (LightSpeed VCT, GE Healthcare) of the chest, abdomen, and pelvis at our institution in 2006. For each patient, a nonuniform rational B-spline (NURBS) based full-body computer model was created based on the patient's clinical CT data. Large organs and structures inside the image volume were individually segmented and modeled. Other organs were created by transforming an existing adult male or female full-body computer model (developed from visible human data) to match the framework defined by the segmented organs, referencing the organ volume and anthropometry data in ICRP Publication 89. A Monte Carlo program previously developed and validated for dose simulation on the LightSpeed VCT scanner was used to estimate patient-specific organ dose, from which effective dose and risks of cancer incidence were derived. Patient-specific organ dose and effective dose were compared with patient-generic CT dose quantities in current clinical use: the volume-weighted CT dose index (CTDIvol) and the effective dose derived from the dose-length product (DLP). Results: The effective dose for the CT examination of the newborn patient (5.7 mSv) was higher but comparable to that for the CT examination of the teenager patient (4.9 mSv) due to the size-based clinical CT protocols at our institution, which employ lower scan techniques for smaller patients. However, the overall risk of cancer incidence attributable to the CT examination was much higher for the newborn (2.4 in 1000) than for the teenager (0.7 in 1000). For the two pediatric-aged patients in our study, CTDIvol underestimated dose to large organs in the scan coverage by 30%-48%. The effective dose derived from DLP using published conversion coefficients differed from that calculated using patient-specific organ dose values by -57% to 13%, when the tissue weighting factors of ICRP 60 were used, and by -63% to 28%, when the tissue weighting factors of ICRP 103 were used. Conclusions: It is possible to estimate patient-specific radiation dose and cancer risk from CT examinations by combining a validated Monte Carlo program with patient-specific anatomical models that are derived from the patients' clinical CT data and supplemented by transformed models of reference adults. With the construction of a large library of patient-specific computer models encompassing patients of all ages and weight percentiles, dose and risk can be estimated for any patient prior to or after a CT examination. Such information may aid in decisions for image utilization and can further guide the design and optimization of CT technologies and scan protocols.
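
The final combination of organ doses into an effective dose follows the familiar weighted sum E = Σ_T w_T H_T; a minimal sketch is shown below, with the ICRP 60 or ICRP 103 tissue weighting factors supplied by the caller rather than reproduced here.

```python
# Combine Monte Carlo organ equivalent doses into an effective dose,
# E = sum_T w_T * H_T; the tissue weighting factors are passed in (ICRP 60/103).
def effective_dose(organ_dose_mSv, tissue_weights):
    """Both arguments are dicts keyed by organ/tissue name."""
    return sum(w * organ_dose_mSv.get(organ, 0.0)
               for organ, w in tissue_weights.items())
```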

Journal ArticleDOI
TL;DR: Spatiotemporal registration can provide accurate motion estimation for 4D CT and improves the robustness to artifacts and is found most suitable to account for the sudden changes of motion at this breathing phase.
Abstract: Purpose: Four-dimensional computed tomography (4D CT) can provide patient-specific motion information for radiotherapy planning and delivery. Motion estimation in 4D CT is challenging due to the reduced image quality and the presence of artifacts. We aim to improve the robustness of deformable registration applied to respiratory-correlated imaging of the lungs, by using a global problem formulation and pursuing a restrictive parametrization for the spatiotemporal deformation model.

Journal ArticleDOI
TL;DR: Although no physics apart from the initial segmentation procedure enters the correction process, beam hardening artifacts were significantly reduced by EBHC, and the image quality for clinical CT, micro-CT, and C-arm CT was highly improved.
Abstract: Purpose: Due to x-ray beam polychromaticity and scattered radiation, attenuation measurements tend to be underestimated. Cupping and beam hardening artifacts become apparent in the reconstructed CT images. If only one material such as water, for example, is present, these artifacts can be reduced by precorrecting the raw data. Higher order beam hardening artifacts, as they result when a mixture of materials such as water and bone, or water and bone and iodine is present, require an iterative beam hardening correction where the image is segmented into different materials and those are forward projected to obtain new raw data. Typically, the forward projection must correctly model the beam polychromaticity and account for all physical effects, including the energy dependence of the assumed materials in the patient, the detector response, and others. We propose a new algorithm that does not require any knowledge about spectra or attenuation coefficients and that does not need to be calibrated. The proposed method corrects beam hardening in single energy CT data. Methods: The only a priori knowledge entering the empirical beam hardening correction (EBHC) is the segmentation of the object into different materials. Materials other than water are segmented from the original image, e.g., by using simple thresholding. Then, a (monochromatic) forward projection of these other materials is performed. The measured raw data and the forward projected material-specific raw data are monomially combined (e.g., multiplied or squared) and reconstructed to yield a set of correction volumes. These are then linearly combined and added to the original volume. The combination weights are determined to maximize the flatness of the new and corrected volume. EBHC is evaluated using data acquired with a modern cone-beam dual-source spiral CT scanner (Somatom Definition Flash, Siemens Healthcare, Forchheim, Germany), with a modern dual-source micro-CT scanner (TomoScope Synergy Twin, CT Imaging GmbH, Erlangen, Germany), and with a modern C-arm CT scanner (Axiom Artis dTA, Siemens Healthcare, Forchheim, Germany). A large variety of phantom, small animal, and patient data were used to demonstrate the data and system independence of EBHC. Results: Although no physics apart from the initial segmentation procedure enters the correction process, beam hardening artifacts were significantly reduced by EBHC. The image quality for clinical CT, micro-CT, and C-arm CT was highly improved. Only in the case of C-arm CT, where high scatter levels and calibration errors occur, was the relative improvement smaller. Conclusions: The empirical beam hardening correction is an interesting alternative to conventional iterative higher order beam hardening correction algorithms. It does not tend to over- or undercorrect the data. Apart from the segmentation step, EBHC does not require assumptions on the spectra or on the type of material involved. Potentially, it can therefore be applied to any CT image.
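
A schematic of the EBHC combination step is sketched below: correction volumes reconstructed from monomial combinations of the measured raw data p and the forward-projected material-specific raw data q are added to the uncorrected volume with weights chosen to maximize flatness. The reconstruction operator and the weight search are left abstract, and the exact set of monomials is an assumption here.

```python
# Schematic EBHC combination step (reconstruct is a placeholder FBP operator;
# the flatness-maximizing weight search is not shown).
def ebhc_combine(f_original, p, q, reconstruct, weights=(0.0, 0.0)):
    """f_original: uncorrected volume; p, q: raw data arrays of equal shape."""
    f_pq = reconstruct(p * q)        # cross-term correction volume
    f_qq = reconstruct(q * q)        # material-only second-order correction volume
    c1, c2 = weights                 # chosen to maximize flatness of the result
    return f_original + c1 * f_pq + c2 * f_qq
```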

Journal ArticleDOI
TL;DR: Results show that the use of a spatial constraint is useful to increase the robustness of the deformable model compared to a deformable surface that is driven only by an image appearance model.
Abstract: Purpose: We present a fully automatic algorithm for the segmentation of the prostate in three-dimensional magnetic resonance (MR) images. Method: Our approach requires the use of an anatomical atlas which is built by computing transformation fields mapping a set of manually segmented images to a common reference. These transformation fields are then applied to the manually segmented structures of the training set in order to get a probabilistic map on the atlas. The segmentation is then realized through a two-stage procedure. In the first stage, the processed image is registered to the probabilistic atlas. Subsequently, a probabilistic segmentation is obtained by mapping the probabilistic map of the atlas to the patient's anatomy. In the second stage, a deformable surface evolves towards the prostate boundaries by merging information coming from the probabilistic segmentation, an image feature model, and a statistical shape model. During the evolution of the surface, the probabilistic segmentation allows the introduction of a spatial constraint that prevents the deformable surface from leaking into an unlikely configuration. Results: The proposed method is evaluated on 36 exams that were manually segmented by a single expert. A median Dice similarity coefficient of 0.86 and an average surface error of 2.41 mm are achieved. Conclusion: By merging prior knowledge, the presented method achieves a robust and completely automatic segmentation of the prostate in MR images. Results show that the use of a spatial constraint is useful to increase the robustness of the deformable model compared to a deformable surface that is driven only by an image appearance model.

Journal ArticleDOI
TL;DR: Results are found to disagree with previous experimental studies, suggesting the possibility of an intrinsic energy dependence at lower photon energies; it is recommended that the effective atomic number of future films be made as close as possible to that of water and that thicker active layers are advantageous.
Abstract: Purpose: The absorbed-dose energy dependence of GAFCHROMIC EBT and EBT2 film irradiated in photon beams is studied to understand the shape of the curves and the physics behind them. Methods: The absorbed-dose energy dependence is calculated using the EGSnrc-based EGS_chamber and DOSRZnrc codes by calculating the ratio of dose to water to dose to the active film layers at photon energies ranging from 3 keV to 18 MeV. These data are compared to the mass energy-absorption coefficient ratios and the restricted stopping power ratios of water to active film materials, as well as to previous experimental results. Results: In the photon energy range of 100 keV to 18 MeV, the absorbed-dose energy dependence is found to be energy independent within ±0.6%. However, below 100 keV, the absorbed-dose energy dependence of EBT varies by approximately 10% due to changes in the mass energy-absorption coefficient ratios of water to film materials, as well as an increase in the number of electrons being created and scattered in the central surface layer of the film. Results are found to disagree with previous experimental studies, suggesting the possibility of an intrinsic energy dependence at lower photon energies. For EBT2 film, the absorbed-dose energy dependence at low photon energies varies by 50% or 10% depending on the manufacturing lot, due to changes in the ratio of mass energy-absorption coefficients of the active emulsion layers to water. Conclusions: Caution is recommended when using GAFCHROMIC EBT/EBT2 films at photon energies below 100 keV. It is recommended that the effective atomic number of future films be made as close as possible to that of water, and that thicker active layers are advantageous.

Journal ArticleDOI
TL;DR: It is verified that the new scanning delivery system can produce an accurate 3D dose distribution for the target volume in combination with the planning software.
Abstract: Purpose: A project to construct a new treatment facility, as an extension of the existing HIMAC facility, has been initiated for the further development of carbon-ion therapy at NIRS. This new treatment facility is equipped with a 3D irradiation system with pencil-beam scanning. The challenge of this project is to realize treatment of a moving target by scanning irradiation. To achieve fast rescanning within an acceptable irradiation time, the authors developed a fast scanning system. Methods: In order to verify the validity of the design and to demonstrate the performance of the fast scanning prior to use in the new treatment facility, a new scanning-irradiation system was developed and installed into the existing HIMAC physics-experiment course. The authors made strong efforts to develop (1) the fast scanning magnet and its power supply, (2) the high-speed control system, and (3) the beam monitoring. The performance of the system including 3D dose conformation was tested by using the carbon beam from the HIMAC accelerator. Results: The performance of the fast scanning system was verified by beam tests. Precision of the scanned beam position was less than ±0.5 mm. By cooperating with the planning software, the authors verified the homogeneity of the delivered field within ±3% for the 3D delivery. This system took only 20 s to deliver the physical dose of 1 Gy to a spherical target having a diameter of 60 mm with eight rescans. In this test, the average of the spot-staying time was considerably reduced to 154 μs, while the minimum staying time was 30 μs. Conclusions: As a result of this study, the authors verified that the new scanning delivery system can produce an accurate 3D dose distribution for the target volume in combination with the planning software.

Journal ArticleDOI
TL;DR: It seems likely that beam gating will be used initially to mitigate interplay effects only, but not to considerably decrease treatment planning margins, while beam tracking, based on more accurate motion monitoring systems, provides the possibility to restore target conformity as well as steep dose gradients through reduced treatment planning margins.
Abstract: Clinical outcomes of charged particle therapy are very promising. Currently, several dedicated centers that use scanning-beam technology are either close to clinical use or under construction. Since scanned-beam treatments of targets that move with respiration most likely result in marked local over- and underdosage due to the interplay of target motion and dynamic beam application, dedicated motion mitigation techniques have to be employed. To date, the motion mitigation techniques rescanning, beam gating, and beam tracking have been proposed and tested in experimental studies. Rescanning relies on repeated irradiations of the target, with the number of particles per scan reduced accordingly, to statistically average out local misdosage. Specific developments to prevent temporal correlation between beam scanning and target motion will be required to guarantee adequate averaging. For beam gating, residual target motion within the gating window has to be mitigated in order to avoid local misdosage. Possibly the most promising strategy is to increase the overlap of adjacent particle pencil beams laterally as well as longitudinally to effectively reduce the sensitivity to small residual target motion. The most conformal and potentially most precise motion mitigation technique is beam tracking, in which individual particle pencil beams are adapted laterally as well as longitudinally according to the target motion. Within the next several years, it can be anticipated that rescanning as well as beam gating will be ready for clinical use. For rescanning, treatment planning margins that incorporate the full extent of target motion, as well as motion-induced density variations in the beam paths, will result in reduced target conformity of the applied dose distributions. Due to the limited precision of motion monitoring devices, it seems likely that beam gating will be used initially to mitigate interplay effects only, but not to considerably decrease treatment planning margins. In a next step, beam gating based on more accurate motion monitoring systems could restore target conformity as well as steep dose gradients through reduced treatment planning margins. Accurate motion monitoring systems will also be required for beam tracking. Even though beam tracking has already been tested successfully in experiments, full clinical implementation requires direct feedback of the actual target position in quasi-real time to the treatment control system and can be anticipated to be several more years ahead.
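The statistical-averaging argument for rescanning can be illustrated with the toy simulation below. It assumes each rescan contributes an independent, zero-mean dose fluctuation per voxel, which is precisely the random-phase condition the review says must be engineered (real breathing is periodic, so correlation between scanning and motion has to be broken); treat it as a schematic sketch, not a model of any delivery system.

```python
# Toy illustration: residual interplay error shrinks roughly as 1/sqrt(N)
# with the number of rescans, under the independence assumption stated above.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, sigma_single = 10_000, 0.10   # 10% per-scan dose fluctuation (assumed)

for n_rescans in (1, 4, 16):
    # Per-voxel dose = mean of n_rescans noisy fractional deliveries.
    dose = 1.0 + rng.normal(0.0, sigma_single, size=(n_rescans, n_voxels)).mean(axis=0)
    print(f"{n_rescans:2d} rescans: residual dose spread ~ {dose.std():.3f}"
          f"  (1/sqrt(N) prediction: {sigma_single / np.sqrt(n_rescans):.3f})")
```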

Journal ArticleDOI
TL;DR: Normalized probabilistic atlases combined with computer-aided image analysis can automatically segment and quantify the liver and spleen to extract imaging biomarkers, and have the potential to assist the diagnosis of abdominal disorders from routine analysis of clinical data and to guide clinical management.
Abstract: Purpose: To investigate the potential of normalized probabilistic atlases and computer-aided medical image analysis to automatically segment and quantify livers and spleens for extracting imaging biomarkers (volume and height). Methods: A clinical tool was developed to segment the liver and spleen from 257 abdominal contrast-enhanced CT studies comprising 51 normal livers, 44 normal spleens, 128 splenomegaly, 59 hepatomegaly, and 23 partial hepatectomy cases. Twenty additional contrast-enhanced CT scans from a public site with manual segmentations of mainly pathological livers were used to test the method. Data were acquired on a variety of scanners from different manufacturers and at varying resolution. Probabilistic atlases of the liver and spleen were created using manually segmented data from ten noncontrast CT scans (five male and five female). The organ locations were modeled in physical space and normalized to the position of an anatomical landmark, the xiphoid. The construction and exploitation of the liver and spleen atlases enabled automated quantification of liver/spleen volumes and heights (midhepatic liver height and cephalocaudal spleen height) from abdominal CT data. The quantification was improved incrementally by a geodesic active contour, patient-specific contrast-enhancement characteristics passed to an adaptive convolution, and correction for shape and location errors. Results: The livers and spleens were robustly segmented in normal and pathological cases. For the liver, the Dice/Tanimoto volume overlaps were 96.2%/92.7%, the volume/height errors were 2.2%/2.8%, the root-mean-squared error (RMSE) was 2.3 mm, and the average surface distance (ASD) was 1.2 mm. The spleen quantification led to 95.2%/91% Dice/Tanimoto overlaps, 3.3%/1.7% volume/height errors, 1.1 mm RMSE, and 0.7 mm ASD. The correlations (R²) with clinical/manual height measurements were 0.97 and 0.93 for the spleen and liver, respectively. No statistically significant difference (p > 0.2) was found when comparing interobserver and automatic-manual volume/height errors for the liver and spleen. Conclusions: The algorithm robustly segments normal and enlarged spleens and livers, including in the presence of tumors and of large morphological changes due to partial hepatectomy. Imaging biomarkers of the liver and spleen from automated computer-assisted tools have the potential to assist the diagnosis of abdominal disorders from routine analysis of clinical data and to guide clinical management.
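The Dice and Tanimoto overlaps quoted above are standard set-overlap metrics between the automatic and manual segmentation masks. The short sketch below shows how they are computed for binary masks; the function names and toy arrays are illustrative, not the authors' code.

```python
# Overlap metrics for binary segmentation masks.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice coefficient: twice the intersection over the sum of the two masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def tanimoto(a: np.ndarray, b: np.ndarray) -> float:
    """Tanimoto (Jaccard) overlap: intersection over union."""
    a, b = a.astype(bool), b.astype(bool)
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

# Tiny example: two overlapping "organs" on a 1D grid.
auto   = np.array([0, 1, 1, 1, 1, 0])
manual = np.array([0, 0, 1, 1, 1, 1])
print(dice(auto, manual), tanimoto(auto, manual))   # 0.75, 0.6
```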

Journal ArticleDOI
TL;DR: A concise and critical review of in vivo small animal imaging is presented, focusing on currently available modalities as well as emerging imaging technologies on one side and molecularly targeted contrast agents on the other.
Abstract: The use of small animal models in basic and preclinical sciences constitutes an integral part of testing new pharmaceutical agents prior to commercial translation to clinical practice. Whole-body small animal imaging is a particularly elegant and cost-effective experimental platform for the timely validation and commercialization of novel agents from the bench to the bedside. Biomedical imaging is now listed along with genomics, proteomics, and metabolomics as an integral part of the biological and medical sciences. Miniaturized versions of clinical diagnostic modalities, including but not limited to micro-computed tomography, micro-magnetic resonance tomography, micro-single-photon-emission tomography, micro-positron-emission tomography, optical imaging, digital angiography, and ultrasound, have greatly improved our ability to longitudinally study experimental models of human disease in mice and other rodents. After an exhaustive literature search, the authors present a concise and critical review of in vivo small animal imaging, focusing on currently available modalities and emerging imaging technologies on one side and molecularly targeted contrast agents on the other. These topics are analyzed in the context of cancer angiogenesis and of innovative antiangiogenic strategies on their way to the clinic. Proposed hybrid approaches for diagnosis and targeted, site-specific therapy are highlighted to offer an intriguing glimpse of the future.

Journal ArticleDOI
TL;DR: There is considerable variation among modern MDCT scanners in both CTDIvol and organ dose values, but when organ doses are normalized by CTDIvol values, the differences across scanners become very small.
Abstract: Purpose: Monte Carlo radiation transport techniques have made it possible to accurately estimate the radiation dose to radiosensitive organs in patient models from scans performed with modern multidetector row computed tomography (MDCT) scanners. However, there is considerable variation in organ doses across scanners, even when similar acquisition conditions are used. The purpose of this study was to investigate the feasibility of a technique to estimate organ doses that would be scanner independent. This was accomplished by assessing the ability of CTDIvol measurements to account for the differences between MDCT scanners that lead to organ dose differences. Methods: Monte Carlo simulations of 64-slice MDCT scanners from each of the four major manufacturers were performed. An adult female patient model from the GSF family of voxelized phantoms was used, in which all ICRP Publication 103 radiosensitive organs were identified. A 120 kVp, full-body helical scan with a pitch of 1 was simulated for each scanner using similar scan protocols. From each simulated scan, the radiation dose to each organ was obtained on a per mAs basis (mGy/mAs). In addition, CTDIvol values were obtained from each scanner for the selected scan parameters. Then, to demonstrate the feasibility of generating organ dose estimates from scanner-independent coefficients, the simulated organ dose values from each scanner were normalized by that scanner's CTDIvol value for those acquisition conditions. Results: CTDIvol values showed considerable variation across scanners, with a coefficient of variation (CoV) of 34.1%. The simulated patient scans also demonstrated considerable differences in organ dose values, which varied by up to a factor of approximately 2 between some of the scanners. The CoV across scanners for the simulated organ doses ranged from 26.7% (for the adrenals) to 37.7% (for the thyroid), with a mean CoV of 31.5% across all organs. However, when organ doses were normalized by CTDIvol values, the differences across scanners became very small: the CoVs for the CTDIvol-normalized dose values ranged from a minimum of 2.4% (for skin tissue) to a maximum of 8.5% (for the adrenals), with a mean of 5.2%. Conclusions: This work has revealed that there is considerable variation among modern MDCT scanners in both CTDIvol and organ dose values. Because these variations are similar, CTDIvol can be used as a normalization factor with excellent results. This demonstrates the feasibility of establishing scanner-independent organ dose estimates by using CTDIvol to account for the differences between scanners.
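A minimal sketch of the normalization strategy described above is given below: organ doses simulated per mAs are divided by each scanner's CTDIvol per mAs to obtain nearly scanner-independent coefficients, whose pooled mean can later be multiplied by the CTDIvol of a specific exam to estimate organ dose. The function and variable names are assumptions for illustration, not the study's code.

```python
# Scanner-independent organ dose coefficients via CTDIvol normalization.
import numpy as np

def cov(values: np.ndarray) -> float:
    """Coefficient of variation (%) of a set of values across scanners."""
    return 100.0 * values.std(ddof=1) / values.mean()

def ctdi_normalized(organ_dose_per_mAs: np.ndarray,
                    ctdivol_per_mAs: np.ndarray) -> np.ndarray:
    """Coefficients h = organ dose / CTDIvol (mGy per mGy), one per scanner."""
    return organ_dose_per_mAs / ctdivol_per_mAs

def estimate_organ_dose(h_mean: float, ctdivol_of_exam: float) -> float:
    """Organ dose estimate (mGy) for an exam with a reported CTDIvol (mGy)."""
    return h_mean * ctdivol_of_exam

# Usage outline (no fabricated dose values): compute h per scanner with
# ctdi_normalized(), check its spread with cov() (the abstract reports ~5% on
# average), then apply estimate_organ_dose() with the mean h and the CTDIvol
# reported for a given clinical acquisition.
```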