
Showing papers in "Medical Physics in 2007"


Journal ArticleDOI
Karl Otto1
TL;DR: This work presents a novel aperture-based algorithm for treatment plan optimization where dose is delivered during a single gantry arc of up to 360 deg, similar to tomotherapy but fundamentally different in that the entire dose volume is delivered in a single source rotation.
Abstract: In this work a novel plan optimization platform is presented where treatment is delivered efficiently and accurately in a single dynamically modulated arc. Improvements in patient care achieved through image-guided positioning and plan adaptation have resulted in an increase in overall treatment times. Intensity-modulated radiation therapy (IMRT) has also increased treatment time by requiring a larger number of beam directions, increased monitor units (MU), and, in the case of tomotherapy, a slice-by-slice delivery. In order to maintain a similar level of patient throughput it will be necessary to increase the efficiency of treatment delivery. The solution proposed here is a novel aperture-based algorithm for treatment plan optimization where dose is delivered during a single gantry arc of up to 360 deg. The technique is similar to tomotherapy in that a full 360 deg of beam directions are available for optimization but is fundamentally different in that the entire dose volume is delivered in a single source rotation. The new technique is referred to as volumetric modulated arc therapy (VMAT). Multileaf collimator (MLC) leaf motion and the number of MU per degree of gantry rotation are restricted during the optimization so that gantry rotation speed, leaf translation speed, and dose rate maxima do not excessively limit the delivery efficiency. During planning, investigators model continuous gantry motion by a coarse sampling of static gantry positions and fluence maps or MLC aperture shapes. The technique presented here is unique in that gantry and MLC position sampling is progressively increased throughout the optimization. Using the full gantry range will theoretically provide increased flexibility in generating highly conformal treatment plans. In practice, the additional flexibility is somewhat negated by the additional constraints placed on the amount of MLC leaf motion between gantry samples.
A series of studies are performed that characterize the relationship between gantry and MLC sampling, dose modeling accuracy, and optimization time. Results show that gantry angle and MLC sample spacing as low as 1 deg and 0.5 cm, respectively, are desirable for accurate dose modeling. It is also shown that reducing the sample spacing dramatically reduces the ability of the optimization to arrive at a solution. The competing benefits of small and large sample spacing are mutually realized using the progressive sampling technique described here. Preliminary results show that plans generated with VMAT optimization exhibit dose distributions equivalent or superior to static gantry IMRT. Timing studies have shown that the VMAT technique is well suited for on-line verification and adaptation, with delivery times reduced to ∼1.5-3 min for a 200 cGy fraction.
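The core idea of the progressive sampling described above, starting with a coarse set of gantry control points and repeatedly inserting intermediate ones, can be sketched as follows. This is a minimal illustration only: the midpoint-interpolation rule, the data shapes, and the 1 deg stopping criterion are assumptions for demonstration, not Otto's published algorithm.

```python
# Hedged sketch: progressively refine gantry sampling by inserting midpoint
# control points (angle and interpolated MLC aperture) between neighbours.

def refine_samples(angles, apertures):
    """Double the control-point density by midpoint interpolation."""
    new_angles, new_apertures = [angles[0]], [apertures[0]]
    for i in range(1, len(angles)):
        mid_angle = 0.5 * (angles[i - 1] + angles[i])
        mid_aperture = [0.5 * (a + b)
                        for a, b in zip(apertures[i - 1], apertures[i])]
        new_angles.extend([mid_angle, angles[i]])
        new_apertures.extend([mid_aperture, apertures[i]])
    return new_angles, new_apertures

# Start with samples every 90 deg and refine until spacing reaches ~1 deg.
angles = [0.0, 90.0, 180.0, 270.0, 360.0]
apertures = [[1.0, 2.0]] * 5            # toy two-leaf "aperture" per sample
while angles[1] - angles[0] > 1.0:
    angles, apertures = refine_samples(angles, apertures)
print(round(angles[1] - angles[0], 4))  # final sample spacing in degrees
```

In a real optimizer each refinement step would be followed by further aperture optimization at the new, denser sampling; here only the refinement itself is shown.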

1,698 citations


Journal ArticleDOI
TL;DR: Enhanced image resolution and lower noise have been achieved, concurrently with the reduction of helical cone-beam artifacts, as demonstrated by phantom studies and clinical results illustrate the capabilities of the algorithm on real patient data.
Abstract: Multislice helical computed tomography scanning offers the advantages of faster acquisition and wide organ coverage for routine clinical diagnostic purposes. However, image reconstruction is faced with the challenges of three-dimensional cone-beam geometry, data completeness issues, and low dosage. Of all available reconstruction methods, statistical iterative reconstruction (IR) techniques appear particularly promising since they provide the flexibility of accurate physical noise modeling and geometric system description. In this paper, we present the application of Bayesian iterative algorithms to real 3D multislice helical data to demonstrate significant image quality improvement over conventional techniques. We also introduce a novel prior distribution designed to provide flexibility in its parameters to fine-tune image quality. Specifically, enhanced image resolution and lower noise have been achieved, concurrently with the reduction of helical cone-beam artifacts, as demonstrated by phantom studies. Clinical results also illustrate the capabilities of the algorithm on real patient data. Although computational load remains a significant challenge for practical development, superior image quality combined with advancements in computing technology make IR techniques a legitimate candidate for future clinical applications.
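The statistical IR concept summarized above, minimizing a weighted data-fit term plus a prior, can be illustrated with a toy penalized weighted least-squares problem solved by gradient descent. The 3x3 system matrix, the weights, and the simple quadratic prior are invented stand-ins for the paper's Bayesian helical-CT formulation.

```python
# Hedged toy sketch of statistical iterative reconstruction:
# minimize (Ax - y)^T W (Ax - y) / 2 + beta * ||x||^2 / 2 by gradient descent.
import numpy as np

A = np.array([[1.0, 0.2, 0.0],
              [0.2, 1.0, 0.2],
              [0.0, 0.2, 1.0]])   # toy system (forward-projection) matrix
y = np.array([1.2, 2.4, 1.2])    # noisy measurements
W = np.diag([1.0, 2.0, 1.0])     # statistical weights (inverse variances)
beta = 0.1                       # prior (regularization) strength

x = np.zeros(3)                  # initial image estimate
for _ in range(200):             # simple fixed-step gradient descent
    grad = A.T @ W @ (A @ x - y) + beta * x
    x -= 0.3 * grad
print(np.round(x, 2))
```

Production IR codes use far more sophisticated priors, ordered-subset or coordinate-descent updates, and accurate cone-beam system models; this only shows the shape of the objective being minimized.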

987 citations


Journal ArticleDOI
TL;DR: This article summarizes the present knowledge and gives an insight into the future procedures to handle the nonequilibrium radiation dosimetry problems and is anticipated that new miniature detectors with controlled perturbations and corrections will be available to meet the demand for accurate measurements.
Abstract: Advances in radiation treatment with beamlet-based intensity modulation, image-guided radiation therapy, and stereotactic radiosurgery (including specialized equipment like CyberKnife, Gamma Knife, tomotherapy, and high-resolution multileaf collimating systems) have resulted in the use of reduced treatment fields to a subcentimeter scale. Compared to traditional radiotherapy with fields ≥ 4 x 4 cm2, this can result in significant uncertainty in the accuracy of clinical dosimetry. The dosimetry of small fields is challenging due to nonequilibrium conditions created as a consequence of the secondary electron track lengths and the source size projected through the collimating system that are comparable to the treatment field size. It is further complicated by the prolonged electron tracks in the presence of low-density inhomogeneities. Also, radiation detectors introduced into such fields usually perturb the level of disequilibrium. Hence, the dosimetric accuracy previously achieved for standard radiotherapy applications is at risk for both absolute and relative dose determination. This article summarizes the present knowledge and gives an insight into the future procedures to handle the nonequilibrium radiation dosimetry problems. It is anticipated that new miniature detectors with controlled perturbations and corrections will be available to meet the demand for accurate measurements. It is also expected that the Monte Carlo techniques will increasingly be used in assessing the accuracy, verification, and calculation of dose, and will aid perturbation calculations of detectors used in small and highly conformal radiation beams.

602 citations


Journal ArticleDOI
TL;DR: The purpose of this report is to set out the salient issues associated with clinical implementation and experimental verification of MC dose algorithms, and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.
Abstract: The Monte Carlo (MC) method has been shown through many research studies to calculate accurate dose distributions for clinical radiotherapy, particularly in heterogeneous patient tissues where the effects of electron transport cannot be accurately handled with conventional, deterministic dose algorithms. Despite its proven accuracy and the potential for improved dose distributions to influence treatment outcomes, the long calculation times previously associated with MC simulation rendered this method impractical for routine clinical treatment planning. However, the development of faster codes optimized for radiotherapy calculations and improvements in computer processor technology have substantially reduced calculation times to, in some instances, within minutes on a single processor. These advances have motivated several major treatment planning system vendors to embark upon the path of MC techniques. Several commercial vendors have already released or are currently in the process of releasing MC algorithms for photon and/or electron beam treatment planning. Consequently, the accessibility and use of MC treatment planning algorithms may well become widespread in the radiotherapy community. With MC simulation, dose is computed stochastically using first principles; this method is therefore quite different from conventional dose algorithms. Issues such as statistical uncertainties, the use of variance reduction techniques, the ability to account for geometric details in the accelerator treatment head simulation, and other features, are all unique components of a MC treatment planning algorithm. Successful implementation by the clinical physicist of such a system will require an understanding of the basic principles of MC techniques.
The purpose of this report, while providing education and review on the use of MC simulation in radiotherapy planning, is to set out, for both users and developers, the salient issues associated with clinical implementation and experimental verification of MC dose algorithms. As the MC method is an emerging technology, this report is not meant to be prescriptive. Rather, it is intended as a preliminary report to review the tenets of the MC method and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.

591 citations


Journal ArticleDOI
TL;DR: This task group is charged with addressing the issue of radiation dose delivered via image guidance techniques during radiotherapy by compiling an overview of image-guidance techniques and their associated radiation dose levels to enable the design of image guidance regimens that are as effective and efficient as possible.
Abstract: Radiographic image guidance has emerged as the new paradigm for patient positioning, target localization, and external beam alignment in radiotherapy. Although widely varied in modality and method, all radiographic guidance techniques have one thing in common—they can give a significant radiation dose to the patient. As with all medical uses of ionizing radiation, the general view is that this exposure should be carefully managed. The philosophy for dose management adopted by the diagnostic imaging community is summarized by the acronym ALARA, i.e., as low as reasonably achievable. But unlike the general situation with diagnostic imaging and image-guided surgery, image-guided radiotherapy (IGRT) adds the imaging dose to an already high level of therapeutic radiation. There is furthermore an interplay between increased imaging and improved therapeutic dose conformity that suggests the possibility of optimizing rather than simply minimizing the imaging dose. For this reason, the management of imaging dose during radiotherapy is a different problem than its management during routine diagnostic or image-guided surgical procedures. The imaging dose received as part of a radiotherapy treatment has long been regarded as negligible and thus has been quantified in a fairly loose manner. On the other hand, radiation oncologists examine the therapy dose distribution in minute detail. The introduction of more intensive imaging procedures for IGRT now obligates the clinician to evaluate therapeutic and imaging doses in a more balanced manner. This task group is charged with addressing the issue of radiation dose delivered via image guidance techniques during radiotherapy. 
The group has developed this charge into three objectives: (1) Compile an overview of image-guidance techniques and their associated radiation dose levels, to provide the clinician using a particular set of image guidance techniques with enough data to estimate the total diagnostic dose for a specific treatment scenario, (2) identify ways to reduce the total imaging dose without sacrificing essential imaging information, and (3) recommend optimization strategies to trade off imaging dose with improvements in therapeutic dose delivery. The end goal is to enable the design of image guidance regimens that are as effective and efficient as possible.

513 citations


Journal ArticleDOI
TL;DR: In this article, Monte Carlo simulations were used to assess the efficiency of the computed tomography dose index (CTDI100) parameter for wide (40 mm) collimated x-ray beams.
Abstract: The computed tomography dose index (CTDI100) is typically measured using a 100 mm long pencil ion chamber with cylindrical polymethyl methacrylate (PMMA) dosimetry phantoms. While this metric was useful in the era of single slice CT scanners with collimated slice thicknesses of 10 mm or less, the efficiency of this metric in multi-slice CT scanners with wide (40 mm) collimated x-ray beams is unknown. Monte Carlo simulations were used to assess the efficiency of the CTDI100 parameter for wider beam collimations. The simulations utilized the geometry of a commercially available CT scanner, with modeled polyenergetic x-ray spectra. Dose spread functions (DSFs) were computed along the length of 12.4 mm diam rods placed at several radii in infinitely long 160 mm diam (head) and 320 mm diam (body) PMMA phantoms. The DSFs were used to compute radiation dose profiles for slice thicknesses from 1 to 400 mm. CTDI100 efficiency was defined as the fraction of the dose along a PMMA rod collected in a 100 mm length centered on the CT slice position, divided by the total dose deposited along an infinitely long PMMA rod. For a 10 mm slice thickness, a 120 kVp x-ray spectrum, and the PMMA head phantom, the efficiency of the CTDI100 was 82% and 90% for the center and peripheral holes, respectively. The corresponding efficiency values for the body phantom were 63% and 88%. These values were reduced by only 1% when a 40 mm slice thickness was studied, so the use of CTDI100 for 40 mm wide x-ray beams is no less valid than its use for 10 mm beam widths. However, these data illustrate that the efficiency of the CTDI100 measurement even with 10 mm beam widths is low and, consequently, dose computations which are derived from this metric may not be as accurate as desirable.
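The efficiency definition above, the dose-profile integral captured by a 100 mm window divided by the integral over an infinitely long rod, can be made concrete numerically. The exponential scatter-tail model and its 40 mm decay length below are illustrative assumptions, not the paper's Monte Carlo dose spread functions.

```python
# Hedged numerical sketch of CTDI100 efficiency: fraction of the full axial
# dose-profile integral that a 100 mm measurement window captures.
import math

def dose_profile(z, beam_width=10.0, scatter_len=40.0):
    """Flat primary beam over the collimated width plus exponential scatter tails."""
    half = beam_width / 2.0
    if abs(z) <= half:
        return 1.0
    return math.exp(-(abs(z) - half) / scatter_len)

def integral(lo, hi, n=20000):
    """Midpoint-rule integral of the dose profile over [lo, hi] (mm)."""
    dz = (hi - lo) / n
    return sum(dose_profile(lo + (i + 0.5) * dz) * dz for i in range(n))

# 100 mm window vs. a (practically) infinite rod, approximated by +/- 1000 mm.
efficiency = integral(-50.0, 50.0) / integral(-1000.0, 1000.0)
print(round(efficiency, 3))
```

With these invented parameters the window misses a substantial fraction of the scatter tails, which is the qualitative effect the paper quantifies with realistic phantom geometries and spectra.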

414 citations


Journal ArticleDOI
TL;DR: Initial results indicate that operator-independent, whole-breast imaging and the detection of breast masses are feasible, and future studies will focus on improved detection and differentiation of masses in support of the long-term goal of increasing the specificity of breast exams.
Abstract: Although mammography is the gold standard for breast imaging, its limitations result in a high rate of biopsies of benign lesions and a significant false negative rate for women with dense breasts. In response to this imaging performance gap we have been developing a clinical breast imaging methodology based on the principles of ultrasound tomography. The Computed Ultrasound Risk Evaluation (CURE) system has been designed with the clinical goals of whole breast, operator-independent imaging, and differentiation of breast masses. This paper describes the first clinical prototype, summarizes our initial image reconstruction techniques, and presents phantom and preliminary in vivo results. In an initial assessment of its in vivo performance, we have examined 50 women with the CURE prototype and obtained the following results. (1) Tomographic imaging of breast architecture is demonstrated in both CURE modes of reflection and transmission imaging. (2) In-plane spatial resolution of 0.5 mm in reflection and 4 mm in transmission is achieved. (3) Masses > 15 mm in size are routinely detected. (4) Reflection, sound speed, and attenuation imaging of breast masses are demonstrated. These initial results indicate that operator-independent, whole-breast imaging and the detection of breast masses are feasible. Future studies will focus on improved detection and differentiation of masses in support of our long-term goal of increasing the specificity of breast exams, thereby reducing the number of biopsies of benign masses.

363 citations


Journal ArticleDOI
TL;DR: The clinical use of OSLDs for in vivo dosimetric measurements is shown to be feasible and dose sensitivity was not dependent on temperature at the time of irradiation in the range of 10-40 degrees C.
Abstract: Optically stimulated luminescent dosimeters, OSLDs, are plastic disks infused with aluminum oxide doped with carbon (Al2O3:C). These disks are encased in a light-tight plastic holder. Crystals of Al2O3:C, when exposed to ionizing radiation, store energy that is released as luminescence (420 nm) when the OSLD is illuminated with stimulation light (540 nm). The intensity of the luminescence depends on the dose absorbed by the OSLD and the intensity of the stimulation light. OSLDs used in this work were InLight/OSL Dot dosimeters, which were read with a MicroStar reader (Landauer, Inc., Glenwood, IL). The following dosimetric properties of the OSLD were determined: after a single irradiation, repeated readings cause the signal to decrease by 0.05% per reading; the signal could be discharged by greater than 98% by illuminating the OSLD for more than 45 s with a 150 W tungsten-halogen light; after irradiation there was a transient signal that decayed with a 0.8 min halftime; after the transient signal decay the signal was stable for days; repeated irradiations and readings of an individual OSLD gave a signal with a coefficient of variation of 0.6%; the dose sensitivity of OSLDs from a batch of detectors has a coefficient of variation of 0.9%; response was linear with absorbed dose over a test range of 1-300 cGy; above 300 cGy a small supra-linear behavior occurs; there was no dose-per-pulse dependence over a 388-fold range; there was no dependence on radiation energy or mode for 6 and 15 MV x rays and 6-20 MeV electrons; for Ir-192 gamma rays the OSLD had 6% higher sensitivity; the dose sensitivity was unchanged up to an accumulated dose of 20 Gy and thereafter decreased by 4% per 10 Gy of additional accumulated dose; dose sensitivity was not dependent on the angle of incidence of radiation; the OSLD in its light-tight case has an intrinsic buildup of 0.04 g/cm2; dose sensitivity was not dependent on temperature at the time of irradiation in the range of 10-40 degrees C. The clinical use of OSLDs for in vivo dosimetric measurements is shown to be feasible.
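Two of the figures quoted above, the linear response below 300 cGy and the 4% sensitivity loss per 10 Gy of accumulated dose beyond 20 Gy, suggest a simple readout correction. The function name and the calibration factor below are hypothetical; only the 4%-per-10-Gy depletion figure comes from the abstract.

```python
# Hedged sketch of an OSLD readout correction: linear signal-to-dose
# conversion with a sensitivity correction for accumulated-dose history.

def osld_dose(counts, cal_factor=0.01, accumulated_gy=0.0):
    """Convert a reader signal to dose (cGy), correcting for sensitivity loss
    of 4% per 10 Gy of accumulated dose beyond the first 20 Gy."""
    sensitivity = 1.0
    if accumulated_gy > 20.0:
        sensitivity -= 0.04 * (accumulated_gy - 20.0) / 10.0
    return counts * cal_factor / sensitivity

fresh = osld_dose(20000)                       # detector with < 20 Gy history
aged = osld_dose(20000, accumulated_gy=30.0)   # 30 Gy history: 4% less sensitive
print(round(fresh, 1), round(aged, 1))
```

The same signal therefore maps to a slightly higher dose on a heavily used detector, which is why the accumulated-dose history must be tracked in clinical use.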

347 citations


Journal ArticleDOI
TL;DR: Computer simulations suggest that the combined use of internal and external markers allows the robot to accurately follow tumor motion even in the case of irregular breathing patterns, despite the lower acquisition frequency of the RTS.
Abstract: The Synchrony™ Respiratory Tracking System (RTS) is a treatment option of the CyberKnife robotic treatment device to irradiate extra-cranial tumors that move due to respiration. Advantages of RTS are that patients can breathe normally and that there is no loss of linac duty cycle such as with gated therapy. Tracking is based on a measured correspondence model (linear or polynomial) between internal tumor motion and external (chest/abdominal) marker motion. The radiation beam follows the tumor movement via the continuously measured external marker motion. To establish the correspondence model at the start of treatment, the 3D internal tumor position is determined at 15 discrete time points by automatic detection of implanted gold fiducials in two orthogonal x-ray images; simultaneously, the positions of the external markers are measured. During the treatment, the relationship between internal and external marker positions is continuously accounted for and is regularly checked and updated. Here we use computer simulations based on continuously and simultaneously recorded internal and external marker positions to investigate the effectiveness of tumor tracking by the RTS. The CyberKnife does not allow continuous acquisition of x-ray images to follow the moving internal markers (typical imaging frequency is once per minute). Therefore, for the simulations, we have used data for eight lung cancer patients treated with respiratory gating. All of these patients had simultaneous and continuous recordings of both internal tumor motion and external abdominal motion. The available continuous relationship between internal and external markers for these patients allowed investigation of the consequences of the lower acquisition frequency of the RTS. With the use of the RTS, simulated treatment errors due to breathing motion were substantially and consistently reduced over the treatment time for all studied patients.
A considerable part of the maximum reduction in treatment error could already be reached with a simple linear model. In the case of hysteresis, a polynomial model added some extra reduction. More frequent updating of the correspondence model resulted in slightly smaller errors only for the few recordings with a time trend that was fast relative to the current x-ray update frequency. In general, the simulations suggest that the combined use of internal and external markers allows the robot to accurately follow tumor motion even in the case of irregularities in breathing patterns.
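The correspondence-model idea above, fitting internal tumor position as a function of external marker amplitude from a handful of paired x-ray observations and then predicting from the marker alone, can be sketched with an ordinary least-squares fit. The synthetic data, the hidden linear relationship, and the use of `numpy.polyfit` are illustrative assumptions, not Accuray's proprietary model.

```python
# Hedged sketch: fit linear and polynomial correspondence models between
# external marker amplitude and internal tumor position from 15 paired points.
import numpy as np

rng = np.random.default_rng(0)
external = np.linspace(0.0, 10.0, 15)                      # 15 x-ray time points
internal = 1.5 * external + 2.0 + rng.normal(0, 0.05, 15)  # hidden linear motion

linear = np.polyfit(external, internal, 1)   # slope, intercept
poly = np.polyfit(external, internal, 2)     # quadratic, e.g. for hysteresis

# Predict tumor position from a continuously measured marker value.
predicted = np.polyval(linear, 5.0)
print(round(float(predicted), 2))
```

In the clinical system this model is re-fit as new x-ray images arrive (roughly once per minute), which is the update-frequency effect the simulations in the abstract investigate.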

257 citations


Journal ArticleDOI
TL;DR: Unlike the semiempirical models, the model proposed here requires no empirical or unphysical parameters in the differential bremsstrahlung cross section, apart from an overall normalization factor close to unity.
Abstract: A new approach to the calculation of the x-ray spectrum emerging from an x-ray tube is proposed. Theoretical results for the bremsstrahlung cross section appearing in the literature are summarized. Four different treatments of electron penetration, based on the work presented in Part I, are then used to generate bremsstrahlung spectra. These spectra are compared to experimental data at 50, 80 and 100 kVp tube potentials. The most sophisticated treatment of electron penetration was required to obtain good agreement. With this treatment, both the National Institute of Standards and Technology bremsstrahlung cross sections, based on accurate partial wave calculations, and the Bethe-Heitler cross section [H. A. Bethe and W. Heitler, Proc. R. Soc. London, Ser. A 146, 83-112 (1934)] corrected by a modified Elwert factor [G. Elwert, Ann. Phys. (Leipzig) 426, 178-208 (1939)] provided good agreement with measured data. An approximate treatment of the characteristic spectrum is suggested. The dependencies of the bremsstrahlung and characteristic outputs of an x-ray tube on tube potential are compared to experimentally derived data for 70-140 kVp potentials. Agreement is to within a few percent of the total output over the entire range. The spectral predictions of the semiempirical models of Birch and Marshall [R. Birch and M. Marshall, Phys. Med. Biol. 24, 505-513 (1979)] (IPEM Report 78) and of Tucker et al. [D. M. Tucker, G. T. Barnes, and D. P. Chakraborty, Med. Phys. 18, 211-218 (1991)] are also assessed. The predictions of Tucker et al. are very close to the model developed here. The predictions of IPEM Report 78 are similar, but consistently harder for the range of tube potentials examined (50-100 kV). Unlike the semiempirical models, the model proposed here introduces no empirical or unphysical parameters into the differential bremsstrahlung cross section, apart from an overall normalization factor which is close to unity.

253 citations


Journal ArticleDOI
TL;DR: The penetration characteristics of electron beams into x-ray targets are investigated for incident electron kinetic energies in the range 50-150 keV and the crudity of the use of the Thomson-Whiddington law to describe electron penetration and energy loss is highlighted.
Abstract: The penetration characteristics of electron beams into x-ray targets are investigated for incident electron kinetic energies in the range 50-150 keV. The frequency densities of electrons penetrating to a depth x in a target, with a fraction of initial kinetic energy, u, are calculated using Monte Carlo methods for beam energies of 50, 80, 100, 120 and 150 keV in a tungsten target. The frequency densities for 100 keV electrons in Al, Mo and Re targets are also calculated. A mixture of simple modeling with equations and interpolation from data is used to generalize the calculations in tungsten. Where possible, parameters derived from the Monte Carlo data are compared to experimental measurements. Previous electron transport approximations in the semiempirical models of other authors are discussed and related to this work. In particular, the crudity of the use of the Thomson-Whiddington law to describe electron penetration and energy loss is highlighted. The results presented here may be used towards calculating the target self-attenuation correction for bremsstrahlung photons emitted within a tungsten target.

Journal ArticleDOI
TL;DR: Two novel CAD approaches that both emphasize an intelligible decision process to predict breast biopsy outcomes from BI-RADS findings have the potential to reduce the number of unnecessary breast biopsies in clinical practice.
Abstract: Mammography is the most effective method for breast cancer screening available today. However, the low positive predictive value of breast biopsy resulting from mammogram interpretation leads to approximately 70% unnecessary biopsies with benign outcomes. To reduce the high number of unnecessary breast biopsies, a number of computer-aided diagnosis (CAD) systems have been proposed in recent years. These systems help physicians in their decision to perform a breast biopsy on a suspicious lesion seen in a mammogram or to perform a short term follow-up examination instead. We present two novel CAD approaches that both emphasize an intelligible decision process to predict breast biopsy outcomes from BI-RADS findings. An intelligible reasoning process is an important requirement for the acceptance of CAD systems by physicians. The first approach induces a global model based on decision-tree learning. The second approach is based on case-based reasoning and applies an entropic similarity measure. We have evaluated the performance of both CAD approaches on two large publicly available mammography reference databases using receiver operating characteristic (ROC) analysis, bootstrap sampling, and the ANOVA statistical significance test. Both approaches outperform the diagnosis decisions of the physicians. Hence, both systems have the potential to reduce the number of unnecessary breast biopsies in clinical practice. A comparison of the performance of the proposed decision-tree and CBR approaches with a state-of-the-art approach based on artificial neural networks (ANN) shows that the CBR approach performs slightly better than the ANN approach, which in turn results in slightly better performance than the decision-tree approach. The differences are statistically significant (p value < 0.001).
On 2100 masses extracted from the DDSM database, the CBR approach for example resulted in an area under the ROC curve of A(z) = 0.89 +/- 0.01, the decision-tree approach in A(z) = 0.87 +/- 0.01, and the ANN approach in A(z) = 0.88 +/- 0.01.
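The A(z) figures above are areas under the ROC curve, which can be computed directly from classifier scores via the Mann-Whitney rank formulation. The toy scores below are invented; the paper's bootstrap resampling and ANOVA analysis are not reproduced here.

```python
# Hedged sketch: area under the ROC curve as the probability that a randomly
# chosen malignant case outscores a randomly chosen benign one (ties count half).

def roc_auc(scores_pos, scores_neg):
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy classifier outputs (higher score = more suspicious)
malignant = [0.9, 0.8, 0.75, 0.6]
benign = [0.7, 0.5, 0.4, 0.3]
print(roc_auc(malignant, benign))  # prints 0.9375
```

On real data one would use an efficient rank-based implementation (this double loop is O(n*m)) and estimate the +/- uncertainty by bootstrap resampling of cases, as the study does.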

Journal ArticleDOI
TL;DR: Air kerma measurements were used to compare a variety of commercial and pre-commercial radiation shielding materials over mean energy ranges from 39 to 205 keV and a correlation was made of radiation attenuation, materials properties, calculated spectra and ambient dose equivalent.
Abstract: The attenuating properties of several types of lead (Pb)-based and non-Pb radiation shielding materials were studied, and a correlation was made of radiation attenuation, materials properties, calculated spectra, and ambient dose equivalent. Utilizing the well-characterized x-ray and gamma-ray beams at the National Research Council of Canada, air kerma measurements were used to compare a variety of commercial and pre-commercial radiation shielding materials over mean energies from 39 to 205 keV. The EGSnrc Monte Carlo user code cavity.cpp was extended to provide computed spectra for a variety of elements that have been used as a replacement for Pb in radiation shielding garments. Computed air kerma values were compared with experimental values and with the SRS-30 catalogue of diagnostic spectra available through the Institute of Physics and Engineering in Medicine Report 78. In addition to garment materials, measurements also included pure Pb sheets, allowing direct comparisons to the common industry standards of 0.25 and 0.5 mm "lead equivalent". The parameter "lead equivalent" is misleading, since photon attenuation properties for all materials (including Pb) vary significantly over the energy spectrum, with the largest variations occurring in the diagnostic imaging range. Furthermore, air kerma measurements are typically made to determine attenuation properties without reference to measures of biological damage such as ambient dose equivalent, which also vary significantly with air kerma over the diagnostic imaging energy range. A single material or combination cannot provide optimum shielding for all energy ranges. However, an appropriate choice of materials for a particular energy range can offer significantly improved shielding per unit mass over traditional Pb-based materials.
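The point that "lead equivalence" is energy dependent can be illustrated with narrow-beam transmission, exp(-mu*t), for two materials whose attenuation coefficients cross over with beam quality. All coefficients below are invented for illustration; real mass attenuation data vary strongly and non-monotonically across the diagnostic range (K-edges, etc.).

```python
# Hedged illustration: a material can out-shield Pb at one beam quality and
# under-shield it at another, so a single "lead equivalent" number misleads.
import math

def transmission(mu_per_mm, thickness_mm):
    """Narrow-beam (good-geometry) transmission through a uniform slab."""
    return math.exp(-mu_per_mm * thickness_mm)

# Invented linear attenuation coefficients (1/mm) at two beam qualities.
mu = {"Pb":    {"60 kVp": 5.0, "120 kVp": 1.0},
      "alloy": {"60 kVp": 6.0, "120 kVp": 0.7}}

for quality in ("60 kVp", "120 kVp"):
    t_pb = transmission(mu["Pb"][quality], 0.5)
    t_alloy = transmission(mu["alloy"][quality], 0.5)
    better = "alloy" if t_alloy < t_pb else "Pb"
    print(quality, "-> lower transmission:", better)
```

With these made-up numbers the alloy wins at the softer beam and Pb at the harder one, which is the qualitative reason the study compares materials per unit mass across the whole energy range rather than at a single quality.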

Journal ArticleDOI
TL;DR: The AAPM recommends that the consensus data sets and resultant source-specific dose-rate distributions included in this supplement be adopted by all end users for clinical treatment planning of low-energy photon-emitting brachytherapy sources.
Abstract: Since publication of the 2004 update to the American Association of Physicists in Medicine (AAPM) Task Group No. 43 Report (TG-43U1), several new low-energy photon-emitting brachytherapy sources have become available. Many of these sources have satisfied the AAPM prerequisites for routine clinical use as of January 10, 2005, and are posted on the Joint AAPM/RPC Brachytherapy Seed Registry. Consequently, the AAPM has prepared this supplement to the 2004 AAPM TG-43 update. This paper presents the AAPM-approved consensus data sets for these sources, and includes the following 125I sources: Amersham model 6733, Draximage model LS-1, Implant Sciences model 3500, IBt model 1251L, IsoAid model IAI-125A, Mentor model SL-125/SH-125, and SourceTech Medical model STM1251. The Best Medical model 2335 103Pd source is also included. While the methodology used to determine these data sets is identical to that published in the AAPM TG-43U1 report, additional information and discussion are presented here on some questions that arose since the publication of the TG-43U1 report. Specifically, details of interpolation and extrapolation methods are described further, new methodologies are recommended, and example calculations are provided. Despite these changes, additions, and clarifications, the overall methodology, the procedures for developing consensus data sets, and the dose calculation formalism largely remain the same as in the TG-43U1 report. Thus, the AAPM recommends that the consensus data sets and resultant source-specific dose-rate distributions included in this supplement be adopted by all end users for clinical treatment planning of low-energy photon-emitting brachytherapy sources. Adoption of these recommendations may result in changes to patient dose calculations, and these changes should be carefully evaluated and reviewed with the radiation oncologist prior to implementation of the current protocol.

Journal ArticleDOI
TL;DR: Out of seven different respiratory gating approaches, the variable amplitude method (4) captures the respiratory motion best while keeping a constant noise level among all respiratory phases.
Abstract: Respiratory gating is used for reducing the effects of breathing motion in a wide range of applications from radiotherapy treatment to diagnostic imaging. Different methods are feasible for respiratory gating. In this study seven gating methods were developed and tested on positron emission tomography (PET) listmode data. The results of seven patient studies were compared quantitatively with respect to motion and noise. (1) Equal and (2) variable time-based gating methods use only the time information of the breathing cycle to define respiratory gates. (3) Equal and (4) variable amplitude-based gating approaches utilize the amplitude of the respiratory signal. (5) Cycle-based amplitude gating is a combination of time and amplitude-based techniques. A baseline correction was applied to methods (3) and (4) resulting in two new approaches: Baseline corrected (6) equal and (7) variable amplitude-based gating. Listmode PET data from seven patients were acquired together with a respiratory signal. Images were reconstructed applying the seven gating methods. Two parameters were used to quantify the results: Motion was measured as the displacement of the heart due to respiration and noise was defined as the standard deviation of pixel intensities in a background region. The amplitude-based approaches (3) and (4) were superior to the time-based methods (1) and (2). The improvement in capturing the motion was more than 30% (up to 130%) in all subjects. The variable time (2) and amplitude (4) methods had a more uniform noise distribution among all respiratory gates compared to equal time (1) and amplitude (3) methods. Baseline correction did not improve the results. Out of seven different respiratory gating approaches, the variable amplitude method (4) captures the respiratory motion best while keeping a constant noise level among all respiratory phases.
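The contrast between the equal (3) and variable (4) amplitude-based methods can be sketched in a few lines. The sketch below is an illustration based on the abstract, not the authors' implementation, and the function names are invented; it bins a synthetic respiratory trace by equal-width amplitude gates versus equal-count (quantile) amplitude gates:

```python
import numpy as np

def equal_amplitude_gates(signal, n_gates):
    """Method (3): split the amplitude range into n_gates equal-width bins."""
    edges = np.linspace(signal.min(), signal.max(), n_gates + 1)
    edges[-1] += 1e-9                      # keep the maximum sample in-range
    return np.digitize(signal, edges) - 1

def variable_amplitude_gates(signal, n_gates):
    """Method (4): place amplitude bin edges at quantiles so every gate
    receives the same number of events -- equal statistics, uniform noise."""
    edges = np.quantile(signal, np.linspace(0.0, 1.0, n_gates + 1))
    edges[-1] += 1e-9
    return np.digitize(signal, edges) - 1

# Toy respiratory trace: |sin|^2 spends most of its time near the extremes,
# so equal-width amplitude bins collect very unequal event counts.
t = np.linspace(0.0, 60.0, 6000)
signal = np.abs(np.sin(2.0 * np.pi * t / 5.0)) ** 2

g_eq = equal_amplitude_gates(signal, 4)
g_var = variable_amplitude_gates(signal, 4)
print(np.bincount(g_eq))    # uneven counts per gate
print(np.bincount(g_var))   # nearly identical counts per gate
```

Equal counts per gate is exactly why method (4) keeps the noise level constant across respiratory phases: each gated image is reconstructed from the same number of events.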

Journal ArticleDOI
TL;DR: A sparseness prior regularized weighted l2 norm optimization is proposed to mitigate streaking artifacts based on the fact that most medical images are compressible and is implemented as the regularizer for its simplicity.
Abstract: Recent advances in murine cardiac studies with three-dimensional (3D) cone beam micro-CT used a retrospective gating technique. However, this sampling technique results in a limited number of projections with an irregular angular distribution due to the temporal resolution requirements and radiation dose restrictions. Both angular irregularity and undersampling complicate the reconstruction process, since they cause significant streaking artifacts. This work provides an iterative reconstruction solution to address this particular challenge. A sparseness prior regularized weighted l2 norm optimization is proposed to mitigate streaking artifacts, based on the fact that most medical images are compressible. Total variation is implemented in this work as the regularizer for its simplicity. Comparison studies are conducted on a 3D cardiac mouse phantom generated with experimental data. After optimization, the method is applied to in vivo cardiac micro-CT data.
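As a toy illustration of a sparseness-prior regularized weighted l2 norm optimization, the sketch below solves a small undersampled 1D problem by plain gradient descent with a smoothed total-variation penalty. This is a minimal stand-in constructed for illustration, not the authors' reconstruction code; the problem sizes, step size, and regularization weight are arbitrary:

```python
import numpy as np

def tv_grad(x, eps=1e-8):
    """Gradient of the smoothed 1D total-variation term sum_i sqrt(dx_i^2 + eps)."""
    d = np.diff(x)
    s = d / np.sqrt(d ** 2 + eps)
    g = np.zeros_like(x)
    g[:-1] -= s
    g[1:] += s
    return g

def reconstruct(A, b, w, lam, step=0.05, iters=3000):
    """Minimize sum_j w_j (Ax - b)_j^2 + lam * TV(x) by plain gradient descent."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        r = A @ x - b
        x -= step * (2.0 * A.T @ (w * r) + lam * tv_grad(x))
    return x

# Undersampled toy problem: 30 noisy measurements of a 60-sample
# piecewise-constant signal (sparse gradient, i.e. "compressible").
rng = np.random.default_rng(0)
x_true = np.concatenate([np.zeros(20), np.ones(20), np.zeros(20)])
A = rng.standard_normal((30, 60)) / np.sqrt(30.0)
b = A @ x_true + 0.01 * rng.standard_normal(30)
w = np.ones(30)                         # weights = inverse data variances

x_tv = reconstruct(A, b, w, lam=0.05)   # TV-regularized
x_ls = reconstruct(A, b, w, lam=0.0)    # unregularized comparison
print(np.linalg.norm(x_tv - x_true), np.linalg.norm(x_ls - x_true))
```

With half as many measurements as unknowns, the unregularized solution cannot determine the null-space component, while the TV prior recovers the piecewise-constant signal much more closely; the same mechanism suppresses streaks in undersampled CT.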

Journal ArticleDOI
TL;DR: Using a set of data that contains synchronous internal and external motion traces, a dynamic data analysis technique is developed to study the internal-external correlation, and to quantitatively estimate its underlying time behavior.
Abstract: In gated radiation therapy procedures, the lung tumor position is used directly (by implanted radiopaque markers) or indirectly (by external surrogate methods) to decrease the volume of irradiated healthy tissue. Due to a risk of pneumothorax, many clinics do not implant fiducials, and the gated treatment is primarily based on a respiratory induced external signal. The external surrogate method relies upon the assumption that the internal tumor motion is well correlated with the external respiratory induced motion, and that this correlation is constant in time. Using a set of data that contains synchronous internal and external motion traces, we have developed a dynamic data analysis technique to study the internal-external correlation, and to quantitatively estimate its underlying time behavior. The work presented here quantifies the time dependent behavior of the correlation between external respiratory signals and lung implanted fiducial motion. The corresponding amplitude mismatch is also reported for the lung patients studied. The information obtained can be used to improve the accuracy of tumor tracking. For the ten patients in this study, the SI internal-external motion is well correlated, with small time shifts and corresponding amplitude mismatches. Although the AP internal-external motion reveals larger time shifts than along the SI direction, the corresponding amplitude mismatches are below 5 mm.

Journal ArticleDOI
TL;DR: In this preliminary analysis the application of the corrected Akaike information criterion is demonstrated considering the example of determining pharmacokinetic parameters for the blood serum time activity curves of 111In-labeled anti-CD66 antibody.
Abstract: In many circumstances of data fitting one has to choose the optimal fitting function or model among several alternatives. Criteria or tests on which this decision is based are necessary and have to be well selected. In this preliminary analysis the application of the corrected Akaike information criterion is demonstrated using the example of determining pharmacokinetic parameters for the blood serum time activity curves of 111In-labeled anti-CD66 antibody. Another model selection criterion, the F-test, is used for comparison. For the investigated data the corrected Akaike information criterion has proved to be an effective and efficient approach, applicable to nested and non-nested models.
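For reference, the corrected Akaike information criterion for a least-squares fit with n points and k parameters is AICc = n ln(RSS/n) + 2k + 2k(k+1)/(n-k-1). A minimal sketch on synthetic data (not the paper's antibody data); note that conventions differ on whether k includes the estimated noise variance, which is assumed here:

```python
import numpy as np

def aicc(rss, n, k):
    """Corrected Akaike information criterion for a least-squares fit:
    AIC = n*ln(RSS/n) + 2k, plus the small-sample correction term."""
    return n * np.log(rss / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

# Synthetic data with a genuinely quadratic trend.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 25)
y = 1.0 + 0.5 * t + 0.3 * t ** 2 + 0.05 * rng.standard_normal(25)

# Compare candidate polynomial models of increasing order.
for deg in (1, 2, 3):
    coef = np.polyfit(t, y, deg)
    rss = float(np.sum((np.polyval(coef, t) - y) ** 2))
    k = deg + 2  # fitted coefficients plus the noise variance
    print(deg, aicc(rss, len(t), k))
```

The lowest AICc identifies the preferred model; unlike the F-test, the same comparison works for non-nested candidates because only each model's own RSS and parameter count enter.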

Journal ArticleDOI
TL;DR: Handbook of Radiotherapy Physics: Theory and Practice.
Abstract: Handbook of Radiotherapy Physics: Theory and Practice.

Journal ArticleDOI
TL;DR: In this work, output factors were measured for field sizes from 4 mm up to 180 mm side length with different detectors and a simple linear correction for the energy response of solid state detectors is proposed.
Abstract: A variety of detectors and procedures for the measurement of small field output factors are discussed in the current literature. Different detectors with or without corrections are recommended. Correction factors are often derived by Monte Carlo methods, where the bias due to approximations in the model is difficult to judge. Moreover, results appear to be contradictory in some cases. In this work, output factors were measured for field sizes from 4 mm up to 180 mm side length with different detectors. A simple linear correction for the energy response of solid state detectors is proposed. This led to identical values down to 8 mm field size, as long as the size of the detector is small compared with the field size. The correction was of the order of a few percent. For a shielded silicon diode it was well below 1%. A physically meaningful function is proposed in order to calculate output factors for arbitrary field sizes with high accuracy.

Journal ArticleDOI
TL;DR: In this paper, a model of respiration-induced organ motion in the thorax without the commonly adopted assumption of repeatable breath cycles is presented. But, the model is based on the motion of a volume of interest within the patient based on a reference three-dimensional (3D) image (at end expiration) and the diaphragm positions at different time points.
Abstract: The modeling of respiratory motion is important for a more accurate understanding and accounting of its effect on dose to cancers in the thorax and abdomen by radiotherapy. We have developed a model of respiration-induced organ motion in the thorax without the commonly adopted assumption of repeatable breath cycles. The model describes the motion of a volume of interest within the patient based on a reference three-dimensional (3D) image (at end expiration) and the diaphragm positions at different time points. The input data are respiration-correlated CT (RCCT) images of patients treated for non-small-cell lung cancer, consisting of 3D images, including the diaphragm positions, at ten phases of the respiratory cycle. A deformable image registration algorithm calculates the deformation field that maps each 3D image to the reference 3D image. A principal component analysis is performed to parameterize the 3D deformation field in terms of the diaphragm motion. We show that the first two principal components are adequate to accurately and completely describe the organ motion in the data of four patients. Artifacts in the RCCT images that commonly occur at the mid-respiration states are reduced in the model-generated images. Further validation of the model is demonstrated in the successful application of the parameterized 3D deformation field to RCCT data of the same patient but acquired several days later. We have developed a method for predicting respiration-induced organ motion in patients that has potential for improving the accuracy of dose calculation in radiotherapy. Possible limitations of the model are cases where the correlation between lung tumor and diaphragm position is less reliable, such as superiorly situated tumors, and interfraction changes in tumor-diaphragm correlation. The limited number of clinical cases examined suggests, but does not confirm, the model's applicability to a wide range of patients.
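The principal-component parameterization of a set of deformation fields can be illustrated in a few lines. The example below uses synthetic fields driven by two hidden modes of variation (a constructed toy, not the authors' registration output) and shows that two components capture nearly all the variance:

```python
import numpy as np

# Synthetic "deformation fields": each row is a flattened displacement field
# at one respiratory phase, driven here by two hidden modes of variation.
rng = np.random.default_rng(2)
n_phases, n_voxels = 10, 300
modes = rng.standard_normal((2, n_voxels))
diaphragm = np.sin(np.linspace(0.0, 2.0 * np.pi, n_phases, endpoint=False))
scores = np.column_stack([diaphragm, diaphragm ** 2])
fields = scores @ modes + 0.01 * rng.standard_normal((n_phases, n_voxels))

# Principal component analysis via SVD of the mean-centred fields.
mean_field = fields.mean(axis=0)
U, S, Vt = np.linalg.svd(fields - mean_field, full_matrices=False)
var_explained = S ** 2 / np.sum(S ** 2)

# Reconstruct every phase from the first two principal components only.
recon = mean_field + (U[:, :2] * S[:2]) @ Vt[:2]
print(var_explained[:3])
print(np.max(np.abs(recon - fields)))   # residual is at the noise level
```

In the paper's setting the component scores are further tied to the measured diaphragm position, so a new diaphragm reading predicts a full 3D deformation field.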

Journal ArticleDOI
TL;DR: The methods and results for predicting, measuring and correcting geometric distortions in a 3 T clinical magnetic resonance (MR) scanner for the purpose of image guidance in radiation treatment planning can be predicted negating the need for individual distortion calculation for a variety of other imaging sequences.
Abstract: The work presented herein describes our methods and results for predicting, measuring and correcting geometric distortions in a 3 T clinical magnetic resonance (MR) scanner for the purpose of image guidance in radiation treatment planning. Geometric inaccuracies due to both inhomogeneities in the background field and nonlinearities in the applied gradients were easily visualized on the MR images of a regularly structured three-dimensional (3D) grid phantom. From a computed tomography scan, the locations of just under 10 000 control points within the phantom were accurately determined in three dimensions using a MATLAB-based computer program. MR distortion was then determined by measuring the corresponding locations of the control points when the phantom was imaged using the MR scanner. Using a reversed gradient method, distortions due to gradient nonlinearities were separated from distortions due to inhomogeneities in the background B0 field. Because the various sources of machine-related distortions can be individually characterized, distortions present in other imaging sequences (for which 3D distortion cannot accurately be measured using phantom methods) can be predicted, negating the need for individual distortion calculation for a variety of other imaging sequences. Distortions were found to be primarily caused by gradient nonlinearities, and maximum image distortions were found to be less than those previously reported by other researchers at 1.5 T. Finally, the image slices were corrected for distortion in order to provide geometrically accurate phantom images.
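The reversed gradient separation rests on a sign argument: displacements caused by gradient nonlinearity are unchanged when the read-out gradient polarity is reversed, whereas B0-induced displacements change sign, so the half-sum and half-difference of the two measurements isolate the two contributions. A sketch with synthetic displacements (illustrative values, not the phantom data):

```python
import numpy as np

# Synthetic control-point displacements (mm) for the two acquisitions.
rng = np.random.default_rng(3)
d_grad_true = rng.normal(0.0, 0.5, 100)   # gradient-nonlinearity distortion
d_b0_true = rng.normal(0.0, 0.3, 100)     # B0-inhomogeneity distortion

d_forward = d_grad_true + d_b0_true       # forward read-out polarity
d_reverse = d_grad_true - d_b0_true       # B0 term flips sign on reversal

# Separate the two distortion sources.
d_grad = 0.5 * (d_forward + d_reverse)
d_b0 = 0.5 * (d_forward - d_reverse)
print(np.allclose(d_grad, d_grad_true), np.allclose(d_b0, d_b0_true))
```

Once the two sources are characterized independently, their effect on a different pulse sequence can be predicted from that sequence's gradient strengths rather than remeasured.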

Journal ArticleDOI
TL;DR: Clinical image quality and dose efficiency can be improved on scanners with bowtie filters if care is exercised when positioning patients and Automatically providing patient specific centering and scan parameter selection information can help the technologist improve workflow, achieve more consistent imagequality and reduce patient dose.
Abstract: Although x-ray intensity shaping filters (bowtie filters) have been used since the introduction of some of the earliest CT scanner models, the clinical implications for dose and noise are not well understood. To achieve the intended dose and noise advantage requires the patient to be centered in the scan field of view. In this study we explore the implications of patient centering in clinical practice. We scanned various size and shape phantoms on a GE LightSpeed VCT scanner using each available source filter with the phantom centers positioned at 0, 3, and 6 cm below the center of rotation (isocenter). Surface doses were measured along with image noise over a large image region. Regression models of surface dose and noise were generated as a function of phantom size and centering error. Methods were also developed to determine the amount of miscentering using a scout scan projection radiograph (SPR). These models were then used to retrospectively evaluate 273 adult body patients for clinical implications. When miscentered by 3 and 6 cm, the surface dose on a 32 cm CTDI phantom increased by 18% and 41% while image noise also increased by 6% and 22%. The retrospective analysis of adult body scout SPR scans shows that 46% of patients were miscentered in elevation by 20-60 mm with a mean position 23 mm below the center of rotation (isocenter). The analysis indicated a surface dose penalty of up to 140% with a mean dose penalty of 33% assuming that tube current is increased to compensate for the increased noise due to miscentering. Clinical image quality and dose efficiency can be improved on scanners with bowtie filters if care is exercised when positioning patients. Automatically providing patient specific centering and scan parameter selection information can help the technologist improve workflow, achieve more consistent image quality and reduce patient dose.

Journal ArticleDOI
TL;DR: A generalized least-squares (GLS) method is discussed here, which takes into account the variances and covariances among the individual data points and optical properties in the image into a structured weight matrix and shows improvement of GLS minimization when the noise level in the data is high.
Abstract: Diffuse optical tomography (DOT) involves estimation of tissue optical properties using noninvasive boundary measurements. The image reconstruction procedure is a nonlinear, ill-posed, and ill-determined problem, so overcoming these difficulties requires regularization of the solution. While the methods developed for solving the DOT image reconstruction procedure have a long history, there is little direct evidence on the optimal regularization method, or on a common theoretical framework for techniques that use least-squares (LS) minimization. A generalized least-squares (GLS) method is discussed here, which incorporates the variances and covariances among the individual data points and the optical properties in the image into a structured weight matrix. It is shown that most of the least-squares techniques applied in DOT can be considered as special cases of this more generalized LS approach. The performance of three minimization techniques using the same implementation scheme is compared using test problems with increasing noise level and increasing complexity within the imaging field. Techniques that use spatial-prior information as constraints can also be incorporated into the GLS formalism. It is also illustrated that inclusion of spatial priors reduces the image error by at least a factor of 2. The improvement of GLS minimization is even more apparent when the noise level in the data is high (as high as 10%), indicating that the benefits of this approach are important for reconstruction of data in a routine setting where the data variance can be known based upon the signal-to-noise properties of the instruments.
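For the linear case, the GLS estimate minimizing (y - Jx)^T W (y - Jx) has the closed form x = (J^T W J)^{-1} J^T W y, and ordinary least squares is the special case W = I. A small synthetic sketch (not the DOT implementation, which is nonlinear and adds parameter-space weighting) showing the benefit when measurement variances differ:

```python
import numpy as np

def gls(J, y, W):
    """Generalized least squares: minimize (y - Jx)^T W (y - Jx),
    with W the inverse covariance matrix of the measurements."""
    return np.linalg.solve(J.T @ W @ J, J.T @ W @ y)

# Linear forward model with two very different noise levels in the data.
rng = np.random.default_rng(4)
J = rng.standard_normal((200, 5))
x_true = np.arange(1.0, 6.0)
sigma = np.where(np.arange(200) < 100, 0.05, 1.0)   # per-point noise std
y = J @ x_true + sigma * rng.standard_normal(200)

W = np.diag(1.0 / sigma ** 2)           # weights = inverse variances
x_gls = gls(J, y, W)
x_ols = gls(J, y, np.eye(200))          # ordinary LS as the W = I case
print(np.linalg.norm(x_gls - x_true), np.linalg.norm(x_ols - x_true))
```

Down-weighting the noisy half of the data pulls the GLS estimate much closer to the truth, which mirrors the abstract's finding that the advantage of GLS grows with the data noise level.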

Journal ArticleDOI
TL;DR: A fast algorithm for a full 3D gamma evaluation at high resolution by searching for the best point of agreement according to the gamma method in the evaluated dose distribution, which can be done at a subvoxel resolution.
Abstract: The gamma-evaluation method is a tool by which dose distributions can be compared in a quantitative manner, combining dose-difference and distance-to-agreement criteria. Since its introduction, the gamma evaluation has been used in many studies and is on the verge of becoming the preferred dose distribution comparison method, particularly for intensity-modulated radiation therapy (IMRT) verification. One major disadvantage, however, is its long computation time, which especially applies to the comparison of three-dimensional (3D) dose distributions. We present a fast algorithm for a full 3D gamma evaluation at high resolution. Both the reference and evaluated dose distributions are first resampled on the same grid. For each point of the reference dose distribution, the algorithm searches for the best point of agreement according to the gamma method in the evaluated dose distribution, which can be done at a subvoxel resolution. Speed, computer memory efficiency, and high spatial resolution are achieved by searching around each reference point with increasing distance in a sphere, which has a radius of a chosen maximum search distance and is interpolated "on-the-fly" at a chosen sample step size. The smaller the sample step size and the larger the differences between the dose distributions, the longer the gamma evaluation takes. With decreasing sample step size, statistical measures of the 3D gamma distribution converge. Two clinical examples were investigated using 3% of the prescribed dose as dose-difference and 0.3 cm as distance-to-agreement criteria. For 0.2 cm grid spacing, the change in gamma indices was negligible below a sample step size of 0.02 cm. Comparing the full 3D gamma evaluation and slice-by-slice 2D gamma evaluations ("2.5D") for these clinical examples, the gamma indices improved by searching in full 3D space, with the average gamma index decreasing by at least 8%.
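The underlying gamma measure for a reference point r with dose D_r is gamma(r) = min over evaluated points e of sqrt(((e - r)/DTA)^2 + ((D_e - D_r)/dD)^2), with a point passing when gamma <= 1. A brute-force 1D sketch of this metric follows; the paper's contribution is the fast outward spherical search with on-the-fly interpolation, which this deliberately omits:

```python
import numpy as np

def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose, dta=0.3, dd=0.03):
    """Brute-force 1D gamma evaluation: for every reference point, take the
    minimum over all evaluated points of the combined distance-to-agreement
    and dose-difference measure."""
    gammas = np.empty(len(ref_pos))
    for i, (r, d) in enumerate(zip(ref_pos, ref_dose)):
        dist2 = ((eval_pos - r) / dta) ** 2
        dose2 = ((eval_dose - d) / dd) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

# Two normalized dose profiles, the evaluated one shifted by 0.5 mm.
x = np.linspace(-2.0, 2.0, 401)          # positions in cm, 0.01 cm spacing
ref = np.exp(-x ** 2)
ev = np.exp(-(x - 0.05) ** 2)
g = gamma_index(x, ref, x, ev, dta=0.3, dd=0.03)
print(g.max())                            # well below 1: all points pass
```

The 0.5 mm shift would fail a pure dose-difference test in the high-gradient region, but the distance-to-agreement term absorbs it, which is exactly the behavior the gamma method was designed for.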

Journal ArticleDOI
TL;DR: An optical flow based method for improved reconstruction of 4D CT data sets from multislice CT scans is presented and results show a relevant reduction of reconstruction artifacts by the technique.
Abstract: Respiratory motion degrades anatomic position reproducibility and leads to issues affecting image acquisition, treatment planning, and radiation delivery. Four-dimensional (4D) computed tomography (CT) image acquisition can be used to measure the impact of organ motion and to explicitly account for respiratory motion during treatment planning and radiation delivery. Modern CT scanners can only scan a limited region of the body simultaneously and patients have to be scanned in segments consisting of multiple slices. A respiratory signal (spirometer signal or surface tracking) is used to reconstruct a 4D data set by sorting the CT scans according to the couch position and signal coherence with predefined respiratory phases. But artifacts can occur if there are no acquired data segments for exactly the same respiratory state for all couch positions. These artifacts are caused by device-dependent limitations of gantry rotation, image reconstruction times and by the variability of the patient's respiratory pattern. In this paper an optical flow based method for improved reconstruction of 4D CT data sets from multislice CT scans is presented. The optical flow between scans at neighboring respiratory states is estimated by a non-linear registration method. The calculated velocity field is then used to reconstruct a 4D CT data set by interpolating data at exactly the predefined respiratory phase. Our reconstruction method is compared with the usually used reconstruction based on amplitude sorting. The procedures described were applied to reconstruct 4D CT data sets for four cancer patients and a qualitative and quantitative evaluation of the optical flow based reconstruction method was performed. Evaluation results show a relevant reduction of reconstruction artifacts by our technique. The reconstructed 4D data sets were used to quantify organ displacements and to visualize the abdominothoracic organ motion.

Journal ArticleDOI
TL;DR: BrachyDose, a recently developed EGSnrc Monte Carlo code for rapid brachytherapy dose calculations, has been benchmarked by reproducing previously published dosimetry parameters by showing good agreement with calculations made with PTRAN although there are some exceptions.
Abstract: In this study, BrachyDose, a recently developed EGSnrc Monte Carlo code for rapid brachytherapy dose calculations, has been benchmarked by reproducing previously published dosimetry parameters for three brachytherapy seeds with varied internal structure and encapsulation. Calculations are performed for two 125I seeds (Source Tech Medical Model STM1251 and Imagyn isoSTAR model 12501) and one 103Pd source (Theragenics Model 200). Voxel size effects were investigated with dose distribution calculations for three voxel sizes: 0.1 x 0.1 x 0.1 mm³, 0.5 x 0.5 x 0.5 mm³, and 1 x 1 x 1 mm³. In order to minimize the impact of voxel size effects, tabulated dosimetry data for this study consist of a combination of the three calculations: 0.1 x 0.1 x 0.1 mm³ voxels for distances in the range of 0

Journal ArticleDOI
TL;DR: This task group report is to provide guidelines for film selection, irradiation, processing, scanning, and interpretation to allow the physicist to accurately and precisely measure dose with film.
Abstract: TG-69 is a task group report of the AAPM on the use of radiographic film for dosimetry. Radiographic films have been used for radiation dosimetry since the discovery of x-rays and have become an integral part of dose verification for both routine quality assurance and for complex treatments such as soft wedges (dynamic and virtual), intensity modulated radiation therapy (IMRT), image guided radiation therapy (IGRT), and small field dosimetry like stereotactic radiosurgery. Film is convenient to use, spatially accurate, and provides a permanent record of the integrated two dimensional dose distributions. However, there are several challenges to obtaining high quality dosimetric results with film, namely, the dependence of optical density on photon energy, field size, depth, film batch sensitivity differences, film orientation, processing conditions, and scanner performance. Prior to the clinical implementation of a film dosimetry program, the film, processor, and scanner need to be tested to characterize them with respect to these variables. Also, the physicist must understand the basic characteristics of all components of film dosimetry systems. The primary mission of this task group report is to provide guidelines for film selection, irradiation, processing, scanning, and interpretation to allow the physicist to accurately and precisely measure dose with film. Additionally, we present the basic principles and characteristics of film, processors, and scanners. Procedural recommendations are made for each of the steps required for film dosimetry and guidance is given regarding expected levels of accuracy. Finally, some clinical applications of film dosimetry are discussed.

Journal ArticleDOI
TL;DR: The result shows the possibility of practical realization of moving target irradiation with pencil beam scanning, and the present status of the raster scanning system for the HIMAC new treatment facility is described.
Abstract: A project to construct a new treatment facility as an extension of the existing Heavy Ion Medical Accelerator in Chiba (HIMAC) facility has been initiated for further development of carbon-ion therapy. The greatest challenge of this project is to realize treatment of a moving target by scanning irradiation. For this purpose, we decided to combine the rescanning technique and the gated irradiation method. To determine how to avoid hot and/or cold spots by the relatively large number of rescannings within an acceptable irradiation time, we have studied the scanning strategy, scanning magnets and their control, and beam intensity dynamic control. We have designed a raster scanning system and carried out a simulation of irradiating moving targets. The result shows the possibility of practical realization of moving target irradiation with pencil beam scanning. We describe the present status of our design study of the raster scanning system for the HIMAC new treatment facility.

Journal ArticleDOI
TL;DR: Nonrigid registration using the proposed rigidity penalty term is capable of nonrigidly aligning images, while keeping user-defined structures locally rigid.
Abstract: Medical images that are to be registered for clinical application often contain both structures that deform and ones that remain rigid. Nonrigid registration algorithms that do not model properties of different tissue types may result in deformations of rigid structures. In this article a local rigidity penalty term is proposed which is included in the registration function in order to penalize the deformation of rigid objects. This term can be used for any representation of the deformation field capable of modelling locally rigid transformations. By using a B-spline representation of the deformation field, a fast algorithm can be devised. The proposed method is compared with an unconstrained nonrigid registration algorithm. It is evaluated on clinical three-dimensional CT follow-up data of the thorax and on two-dimensional DSA image sequences. The results show that nonrigid registration using the proposed rigidity penalty term is capable of nonrigidly aligning images, while keeping user-defined structures locally rigid.