
Showing papers in "Medical Physics in 2013"


Journal ArticleDOI
TL;DR: The authors offer their vision for the future of PCD-CT and PCD-XR, reviewing the current status and predicting detector technologies, imaging technologies, system technologies, and potential clinical benefits with PCDs.
Abstract: Photon counting detectors (PCDs) with energy discrimination capabilities have been developed for medical x-ray computed tomography (CT) and x-ray (XR) imaging. Using detection mechanisms that are completely different from current energy integrating detectors and measuring material information of the object being imaged, these PCDs have the potential not only to improve current CT and XR imaging, for example through dose reduction, but also to open up revolutionary novel applications such as molecular CT and XR imaging. The performance of PCDs is not flawless, however, and it seems extremely challenging to develop PCDs with close to ideal characteristics. In this paper, the authors offer their vision for the future of PCD-CT and PCD-XR with a review of the current status and predictions of (1) detector technologies, (2) imaging technologies, (3) system technologies, and (4) potential clinical benefits with PCDs.

778 citations


Journal ArticleDOI
TL;DR: The extensive research performed during the development of breast tomosynthesis is reviewed, with a focus on the research addressing the medical physics aspects of this imaging modality.
Abstract: Mammography is a very well-established imaging modality for the early detection and diagnosis of breast cancer. However, since the introduction of digital imaging to the realm of radiology, more advanced imaging methods, especially tomographic ones, have become possible. One of these methods, breast tomosynthesis, has finally been introduced to the clinic for routine everyday use, with the potential to eventually replace mammography for breast cancer screening. In this two-part paper, the extensive research performed during the development of breast tomosynthesis is reviewed, with a focus on the research addressing the medical physics aspects of this imaging modality. This first paper reviews the research performed on the issues relevant to the image acquisition process, including system design, optimization of geometry and technique, x-ray scatter, and radiation dose. The companion to this paper will review all other aspects of breast tomosynthesis imaging, including the reconstruction process.

363 citations


Journal ArticleDOI
TL;DR: The authors emphasize limitations encountered in the context of quantitative PET imaging, wherein increased intervoxel correlations due to resolution modeling can lead to significant loss of precision for small regions of interest, which can be a considerable pitfall depending on the task of interest.
Abstract: In this paper, the authors review the field of resolution modeling in positron emission tomography (PET) image reconstruction, also referred to as point-spread-function modeling. The review includes theoretical analysis of the resolution modeling framework as well as an overview of various approaches in the literature. It also discusses the potential advantages gained via this approach with reference to various metrics and tasks, including lesion detection observer studies. Furthermore, attention is paid to issues arising from this approach, including the pervasive problem of edge artifacts, along with explanations and potential remedies for this phenomenon. Finally, the authors emphasize limitations encountered in the context of quantitative PET imaging, wherein increased intervoxel correlations due to resolution modeling can lead to significant loss of precision (reproducibility) for small regions of interest, which can be a considerable pitfall depending on the task of interest.

295 citations


Journal ArticleDOI
TL;DR: The rationale for in vivo measurements is to provide an accurate and independent verification of the overall treatment procedure and enable the identification of potential errors in dose calculation, data transfer, dose delivery, patient setup, and changes in patient anatomy.
Abstract: In vivo dosimetry (IVD) is in use in external beam radiotherapy (EBRT) to detect major errors, to assess clinically relevant differences between planned and delivered dose, to record dose received by individual patients, and to fulfill legal requirements. After discussing briefly the main characteristics of the most commonly applied IVD systems, the clinical experience of IVD during EBRT will be summarized. Advancement of the traditional aspects of in vivo dosimetry as well as the development of currently available and newly emerging noninterventional technologies are required for large-scale implementation of IVD in EBRT. These new technologies include the development of electronic portal imaging devices for 2D and 3D patient dosimetry during advanced treatment techniques, such as IMRT and VMAT, and the use of IVD in proton and ion radiotherapy by measuring the decay of radiation-induced radionuclides. In the final analysis, we will show in this Vision 20/20 paper that in addition to regulatory compliance and reimbursement issues, the rationale for in vivo measurements is to provide an accurate and independent verification of the overall treatment procedure. It will enable the identification of potential errors in dose calculation, data transfer, dose delivery, patient setup, and changes in patient anatomy. It is the authors’ opinion that all treatments with curative intent should be verified through in vivo dose measurements in combination with pretreatment checks.

279 citations


Journal ArticleDOI
TL;DR: This work describes and validates a computationally efficient technique for noise map estimation directly from CT images, and an adaptive NLM filtering based on this noise map, on phantom and patient data.
Abstract: Purpose: To develop and evaluate an image-domain noise reduction method based on a modified nonlocal means (NLM) algorithm that is adaptive to the local noise level of CT images, and to implement this method in a time frame consistent with clinical workflow. Methods: A computationally efficient technique for local noise estimation directly from CT images was developed. A forward projection, based on a 2D fan-beam approximation, was used to generate the projection data, with a noise model incorporating the effects of the bowtie filter and automatic exposure control. The noise propagation from projection data to images was analytically derived. The analytical noise map was validated using repeated scans of a phantom. A 3D NLM denoising algorithm was modified to adapt its denoising strength locally based on this noise map. The performance of this adaptive NLM filter was evaluated in phantom studies in terms of in-plane and cross-plane high-contrast spatial resolution, noise power spectrum (NPS), subjective low-contrast spatial resolution using the American College of Radiology (ACR) accreditation phantom, and objective low-contrast spatial resolution using a channelized Hotelling model observer (CHO). Graphics processing unit (GPU) implementations of the noise map calculation and the adaptive NLM filtering were developed to meet the demands of clinical workflow. Adaptive NLM was piloted on lower dose scans in clinical practice. Results: The local noise level estimation matches the noise distribution determined from multiple repetitive scans of a phantom, as demonstrated by small variations in the ratio map between the analytical noise map and the one calculated from repeated scans. The phantom studies demonstrated that the adaptive NLM filter can reduce noise substantially without degrading the high-contrast spatial resolution, as illustrated by modulation transfer function and slice sensitivity profile results.
The NPS results show that adaptive NLM denoising preserves the shape and peak frequency of the noise power spectrum better than commercial smoothing kernels, and indicate that the spatial resolution at low contrast levels is not significantly degraded. Both the subjective evaluation using the ACR phantom and the objective evaluation on a low-contrast detection task using a CHO model observer demonstrate an improvement on low-contrast performance. The GPU implementation can process and transfer 300 slice images within 5 min. On patient data, the adaptive NLM algorithm provides more effective denoising of CT data throughout a volume than standard NLM, and may allow significant lowering of radiation dose. After a two week pilot study of lower dose CT urography and CT enterography exams, both GI and GU radiology groups elected to proceed with permanent implementation of adaptive NLM in their GI and GU CT practices. Conclusions: This work describes and validates a computationally efficient technique for noise map estimation directly from CT images, and an adaptive NLM filtering based on this noise map, on phantom and patient data. Both the noise map calculation and the adaptive NLM filtering can be performed in times that allow integration with clinical workflow. The adaptive NLM algorithm provides effective denoising of CT data throughout a volume, and may allow significant lowering of radiation dose.
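The core idea of this method, scaling the NLM filtering strength per voxel by an estimated noise map, can be sketched in a few lines. The following is a minimal 2D illustration under assumed parameters (`search`, `patch`, `h_factor`); the paper's implementation is 3D, GPU-accelerated, and derives its noise map analytically from a fan-beam forward projection:

```python
import numpy as np

def adaptive_nlm(image, noise_map, search=5, patch=3, h_factor=1.5):
    """Simplified 2D nonlocal means whose smoothing strength follows a
    per-pixel noise map: pixels in noisier regions get a larger filtering
    parameter h and are therefore averaged more aggressively."""
    pad = search + patch
    padded = np.pad(image, pad, mode='reflect')
    out = np.zeros_like(image, dtype=float)
    rows, cols = image.shape
    for i in range(rows):
        for j in range(cols):
            ci, cj = i + pad, j + pad
            ref = padded[ci - patch:ci + patch + 1, cj - patch:cj + patch + 1]
            # locally adapted filtering strength (h_factor is an assumption)
            h2 = (h_factor * noise_map[i, j]) ** 2 + 1e-12
            wsum, acc = 0.0, 0.0
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - patch:ni + patch + 1,
                                  nj - patch:nj + patch + 1]
                    d2 = np.mean((ref - cand) ** 2)   # patch similarity
                    w = np.exp(-d2 / h2)
                    wsum += w
                    acc += w * padded[ni, nj]
            out[i, j] = acc / wsum
    return out
```

In a flat region this reduces noise roughly in proportion to the local noise estimate, while strong edges (large patch distances `d2`) receive near-zero weights and are preserved.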

235 citations


Journal ArticleDOI
TL;DR: A review of breast tomosynthesis research is performed, with an emphasis on its medical physics aspects, including reconstruction, image processing, and analysis, as well as the advanced applications being investigated for breast tomosynthesis.
Abstract: Many important post-acquisition aspects of breast tomosynthesis imaging can impact its clinical performance. Chief among them is the reconstruction algorithm that generates the representation of the three-dimensional breast volume from the acquired projections. But even after reconstruction, additional processes, such as artifact reduction algorithms and computer-aided detection and diagnosis, among others, can also impact the performance of breast tomosynthesis in the clinical realm. In this two-part paper, a review of breast tomosynthesis research is performed, with an emphasis on its medical physics aspects. In the companion paper, the first part of this review, the research performed relevant to the image acquisition process is examined. This second part reviews the research on the post-acquisition aspects, including reconstruction, image processing, and analysis, as well as the advanced applications being investigated for breast tomosynthesis.

215 citations


Journal ArticleDOI
TL;DR: The CNR, lateral field-of-view, and penetration depth of the dedicated PAM scanning system are sufficient to image breasts as large as 1335 mL, which should accommodate up to 90% of the women in the United States.
Abstract: Purpose: To report the design and imaging methodology of a photoacoustic scanner dedicated to imaging hemoglobin distribution throughout a human breast. Methods: The authors developed a dedicated breast photoacoustic mammography (PAM) system using a spherical detector aperture based on their previous photoacoustic tomography scanner. The system uses 512 detectors with rectilinear scanning. The scan shape is a spiral pattern whose radius varies from 24 to 96 mm, thereby allowing a field of view that accommodates a wide range of breast sizes. The authors measured the contrast-to-noise ratio (CNR) using a target consisting of 1-mm dots printed on clear plastic. Each dot's absorption coefficient was approximately the same as that of a 1-mm thickness of whole blood at 756 nm, the output wavelength of the Alexandrite laser used by this imaging system. The target was immersed in varying depths of an 8% solution of stock Liposyn II-20%, which mimics the attenuation of breast tissue (1.1 cm⁻¹). The spatial resolution was measured using a 6 μm-diameter carbon fiber embedded in agar. The breasts of four healthy female volunteers, spanning a range of breast sizes from a brassiere C cup to a DD cup, were imaged using a 96-mm spiral protocol. Results: The CNR target was clearly visualized to a depth of 53 mm. Spatial resolution, estimated from the full width at half-maximum of a profile across the PAM image of a carbon fiber, was 0.42 mm. In the four human volunteers, the vasculature was well visualized throughout the breast tissue, including to the chest wall. Conclusions: The CNR, lateral field-of-view, and penetration depth of this dedicated PAM scanning system are sufficient to image breasts as large as 1335 mL, which should accommodate up to 90% of the women in the United States.
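The CNR figure of merit used for the dot-target experiment is conventionally the ROI mean difference normalized by the background standard deviation; the exact ROI definitions used by the authors are not given in the abstract, so the sketch below is a generic assumption:

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio as commonly defined for phantom targets:
    |mean(signal) - mean(background)| / std(background)."""
    s = np.asarray(signal_roi, float)
    b = np.asarray(background_roi, float)
    return abs(s.mean() - b.mean()) / b.std()
```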

208 citations


Journal ArticleDOI
TL;DR: This method provides superior resolution to deep-tissue contrast ultrasound and has the potential to be extended to provide complete vascular network imaging in the brain.
Abstract: Purpose: High-resolution vascular imaging has not been achieved in the brain due to limitations of current clinical imaging modalities. The authors present a method for transcranial ultrasound imaging of single micrometer-size bubbles within a tube phantom. Methods: Emissions from single bubbles within a tube phantom were mapped through an ex vivo human skull using a sparse hemispherical receiver array and a passive beamforming algorithm. Noninvasive phase and amplitude correction techniques were applied to compensate for the aberrating effects of the skull bone. The positions of the individual bubbles were estimated beyond the diffraction limit of ultrasound to produce a super-resolution image of the tube phantom, which was compared with microcomputed tomography (micro-CT). Results: The resulting super-resolution ultrasound image is comparable to results obtained via the micro-CT for small tissue specimen imaging. Conclusions: This method provides superior resolution to deep-tissue contrast ultrasound and has the potential to be extended to provide complete vascular network imaging in the brain.
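The key step of super-resolution imaging with single bubbles is estimating each emitter's position beyond the pixel grid of the beamformed image. A common sketch of this (not the authors' beamforming pipeline, and assuming an approximately Gaussian point-spread function) is a log-parabolic fit around the brightest pixel, which is exact for a Gaussian PSF:

```python
import numpy as np

def localize_bubble(intensity):
    """Estimate a point emitter's (row, col) position with sub-pixel
    precision by fitting a parabola to the log-intensity around the
    brightest pixel, separately along each axis."""
    iy, ix = np.unravel_index(np.argmax(intensity), intensity.shape)

    def vertex_offset(l):
        # vertex of the parabola through three log-samples at -1, 0, +1
        return 0.5 * (l[0] - l[2]) / (l[0] - 2.0 * l[1] + l[2])

    ly = np.log(intensity[iy - 1:iy + 2, ix])
    lx = np.log(intensity[iy, ix - 1:ix + 2])
    return iy + vertex_offset(ly), ix + vertex_offset(lx)
```

Accumulating many such localized positions over time is what builds a vascular map finer than the diffraction limit of the imaging wavelength.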

201 citations


Journal ArticleDOI
TL;DR: At 4° CP sampling, LT, MCSv, and LTMCS were found to be significantly correlated with VMAT dosimetric accuracy, expressed as Γ pass-rates; the influence of LT on VMAT dosimetric accuracy can be controlled by reducing CP separation.
Abstract: Purpose: To evaluate the effect of plan parameters on volumetric modulated arc therapy (VMAT) dosimetric accuracy, together with the possibility of scoring plan complexity. Methods: 142 clinical VMAT plans initially optimized using a 4° control point (CP) separation were evaluated. All plans were delivered by a 6 MV Linac to a biplanar diode array for patient-specific quality assurance (QA). Local Γ index analysis (3%, 3 mm and 2%, 2 mm) enabled the comparison between delivered and calculated dose. The following parameters were considered for each plan: average leaf travel (LT), modulation complexity score applied to VMAT (MCSv), MU value, and a multiplicative combination of LT and MCSv (LTMCS). Pearson's correlation analysis was performed between Γ passing rates and each parameter. The effects of CP angular separation on VMAT dosimetric accuracy were also analyzed by focusing on plans with high LT values. Forty out of 142 plans with LT above 350 mm were further optimized using a finer angle spacing (3° or 2°) and Γ analysis was performed. The average Γ passing rates obtained at 4° and at 3°/2° sampling were compared. A further correlation analysis between all parameters and the Γ pass-rates was performed on 142 plans, but including the newly optimized 40 plans (CP every 3° or 2°) in place of the old ones (CP every 4°). Results: A moderate significant (p < 0.05) correlation between each examined parameter and Γ passing rates was observed for the original 142 plans at 4° CP discretization. A negative correlation was found for LT with Pearson's r absolute values above 0.6, suggesting that a lower dosimetric accuracy may be expected for higher LT values when a 4° CP sampling is used. A positive correlation was observed for MCSv and LTMCS with r values above 0.5. In order to score plan complexity, threshold values of LTMCS were defined. 
The average Γ passing rates were significantly higher for the plans created using the finer CP spacing (3°/2°) compared to the plans optimized using the standard 4° spacing (Student t-test p < 0.05). The correlation between LT and passing rates was strongly diminished when plans with finer angular separations were considered, yielding Pearson's r absolute values below 0.45. Conclusions: At 4° CP sampling, LT, MCSv, and LTMCS were found to be significantly correlated with VMAT dosimetric accuracy, expressed as Γ pass-rates. These parameters were found to be possible candidates for scoring plan complexity using threshold values. A finer CP separation (3°/2°) led to a significant increase in dosimetric accuracy for plans with high leaf travel values, and to a decrease in correlation between LT and Γ passing rates. These results indicated that the influence of LT on VMAT dosimetric accuracy can be controlled by reducing CP separation. CP spacing for all plans requiring large leaf motion should not exceed 3°. The reported data were integrated to optimize our clinical workflow for plan creation, optimization, selection among rival plans, and patient-specific QA of VMAT treatments.
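The statistical core of this study is Pearson's r between a plan complexity metric and its Γ passing rate, which is simple to reproduce. A minimal sketch (the data in the usage example are hypothetical, not the paper's plan values):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two samples."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym)))
```

A strongly negative r between leaf travel and Γ pass-rate, as the authors report at 4° CP sampling, means plans with larger leaf motion tend to be delivered less accurately.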

189 citations


Journal ArticleDOI
TL;DR: IMRT and VMAT commissioning would benefit from the retirement of the 3%/3 mm passing rates as a primary metric of performance, and the adoption instead of tighter tolerances, more diligent diagnostics, and more thorough analysis.
Abstract: Purpose: This study (1) examines a variety of real-world cases where systematic errors were not detected by widely accepted methods for IMRT/VMAT dosimetric accuracy evaluation, and (2) drills down to identify failure modes and their corresponding means for detection, diagnosis, and mitigation. The primary goal of detailing these case studies is to explore different, more sensitive methods and metrics that could be used more effectively for evaluating the accuracy of dose algorithms, delivery systems, and QA devices. Methods: The authors present seven real-world case studies representing a variety of combinations of the treatment planning system (TPS), linac, delivery modality, and systematic error type. These case studies are typical of what might be used as part of an IMRT or VMAT commissioning test suite, varying in complexity. Each case study is analyzed according to TG-119 instructions for gamma passing rates and action levels for per-beam and/or composite plan dosimetric QA. Then, each case study is analyzed in depth with advanced diagnostic methods (dose profile examination, EPID-based measurements, dose difference pattern analysis, 3D measurement-guided dose reconstruction, and dose grid inspection) and more sensitive metrics (2% local normalization/2 mm DTA and estimated DVH comparisons). Results: For these case studies, the conventional 3%/3 mm gamma passing rates exceeded 99% for IMRT per-beam analyses and ranged from 93.9% to 100% for composite plan dose analysis, well above the TG-119 action levels of 90% and 88%, respectively. However, all cases had systematic errors that were detected only by using advanced diagnostic techniques and more sensitive metrics. The systematic errors caused variable but noteworthy impact, including estimated target dose coverage loss of up to 5.5% and local dose deviations up to 31.5%. Types of errors included TPS model settings, algorithm limitations, and modeling and alignment of QA phantoms in the TPS.
Most of the errors were correctable after detection and diagnosis, and the uncorrectable errors provided useful information about system limitations, which is another key element of system commissioning. Conclusions: Many forms of relevant systematic errors can go undetected when the currently prevalent metrics for IMRT/VMAT commissioning are used. If alternative methods and metrics are used instead of (or in addition to) the conventional metrics, these errors are more likely to be detected, and only once they are detected can they be properly diagnosed and rooted out of the system. Removing systematic errors should be a goal not only of commissioning by the end users but also product validation by the manufacturers. For any systematic errors that cannot be removed, detecting and quantifying them is important as it will help the physicist understand the limits of the system and work with the manufacturer on improvements. In summary, IMRT and VMAT commissioning, along with product validation, would benefit from the retirement of the 3%/3 mm passing rates as a primary metric of performance, and the adoption instead of tighter tolerances, more diligent diagnostics, and more thorough analysis.
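The distinction the authors draw between 3%/3 mm global analysis and 2%/2 mm local-normalization analysis comes down to how the dose term of the gamma index (in the Low et al. formalism) is normalized. A 1D discrete sketch, assuming evenly spaced profile points and no interpolation:

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dose_tol_pct, dta_mm, local=True):
    """1D gamma analysis: a reference point passes if, minimized over all
    measured points, sqrt((dose diff / dose tol)^2 + (distance / DTA)^2) <= 1.
    local=True normalizes the dose tolerance to each local reference dose;
    local=False normalizes to the global maximum."""
    ref = np.asarray(ref, float)
    meas = np.asarray(meas, float)
    x = np.arange(ref.size) * spacing_mm
    passed = 0
    for i, d_ref in enumerate(ref):
        norm = d_ref if local else ref.max()
        dd = max(dose_tol_pct * norm / 100.0, 1e-9)   # avoid divide-by-zero
        g2 = ((meas - d_ref) / dd) ** 2 + ((x - x[i]) / dta_mm) ** 2
        if np.sqrt(g2.min()) <= 1.0:
            passed += 1
    return 100.0 * passed / ref.size
```

Local normalization makes the test far more sensitive in low-dose regions, which is one reason the tighter 2%/2 mm local metric exposes errors that 3%/3 mm global analysis passes.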

188 citations


Journal ArticleDOI
TL;DR: It is possible to combine ultrasound, microbubbles, and chemotherapy in a clinical setting using commercially available clinical ultrasound scanners to increase the number of treatment cycles, prolonging the quality of life in patients with pancreatic adenocarcinoma compared to chemotherapy alone.
Abstract: Purpose: The purpose of this study was to investigate the ability and efficacy of inducing sonoporation in a clinical setting, using commercially available technology, to increase the patients' quality of life and extend the low Eastern Cooperative Oncology Group performance grade, thereby increasing overall survival in patients with pancreatic adenocarcinoma. Methods: Patients were treated using a customized configuration of a commercial clinical ultrasound scanner over a period of 31.5 min following standard chemotherapy treatment with gemcitabine. SonoVue® ultrasound contrast agent was injected intravascularly during the treatment with the aim of inducing sonoporation. Results: Using the authors' custom acoustic settings, patients were able to undergo an increased number of treatment cycles: from an average of 9 cycles to an average of 16 cycles, compared with a historical control group of 80 patients. In two of the five patients treated, the maximum tumor diameter was temporarily decreased to 80 ± 5% and permanently to 70 ± 5% of its original size, while the other patients showed reduced growth. The authors also explain and characterize the settings and acoustic output obtained from a commercial clinical scanner used for combined ultrasound, microbubble, and chemotherapy treatment. Conclusions: It is possible to combine ultrasound, microbubbles, and chemotherapy in a clinical setting using commercially available clinical ultrasound scanners to increase the number of treatment cycles, prolonging the quality of life in patients with pancreatic adenocarcinoma compared to chemotherapy alone.

Journal ArticleDOI
TL;DR: It is indicated that it is possible to construct high quality pseudo-CT images by converting the intensity values of a single MRI series into HUs in the male pelvis, and to use these images for accurate MRI-based prostate RTP dose calculations.
Abstract: Purpose: The lack of electron density information in magnetic resonance images (MRI) poses a major challenge for MRI-based radiotherapy treatment planning (RTP). In this study the authors convert MRI intensity values into Hounsfield units (HUs) in the male pelvis and thus enable accurate MRI-based RTP for prostate cancer patients with varying tissue anatomy and body fat contents. Methods: T1/T2*-weighted MRI intensity values and standard computed tomography (CT) image HUs in the male pelvis were analyzed using image data of 10 prostate cancer patients. The collected data were utilized to generate a dual model HU conversion technique from MRI intensity values of the single image set separately within and outside of contoured pelvic bones. Within the bone segment local MRI intensity values were converted to HUs by applying a second-order polynomial model. This model was tuned for each patient by two patient-specific adjustments: MR signal normalization to correct shifts in absolute intensity level and application of a cutoff value to accurately represent low density bony tissue HUs. For soft tissues, such as fat and muscle, located outside of the bone contours, a threshold-based segmentation method without requirements for any patient-specific adjustments was introduced to convert MRI intensity values into HUs. The dual model HU conversion technique was implemented by constructing pseudo-CT images for 10 other prostate cancer patients. The feasibility of these images for RTP was evaluated by comparing HUs in the generated pseudo-CT images with those in standard CT images, and by determining deviations in MRI-based dose distributions compared to those in CT images with 7-field intensity modulated radiation therapy (IMRT) with the anisotropic analytical algorithm and 360° volumetric-modulated arc therapy (VMAT) with the Voxel Monte Carlo algorithm.
Results: The average HU differences between the constructed pseudo-CT images and standard CT images of each test patient ranged from −2 to 5 HUs and from 22 to 78 HUs in soft and bony tissues, respectively. The average local absolute value differences were 11 HUs in soft tissues and 99 HUs in bones. The planning target volume doses (volumes 95%, 50%, 5%) in the pseudo-CT images were within 0.8% compared to those in CT images in all of the 20 treatment plans. The average deviation was 0.3%. For all the test patients, over 94% (IMRT) and 92% (VMAT) of dose points within the body (points below 10% of the maximum dose suppressed) passed the 1 mm and 1% 2D gamma index criterion. The statistical tests (t- and F-tests) showed significantly improved (p ≤ 0.05) HU and dose calculation accuracies with the soft tissue conversion method instead of homogeneous representation of these tissues in MRI-based RTP images. Conclusions: This study indicates that it is possible to construct high quality pseudo-CT images by converting the intensity values of a single MRI series into HUs in the male pelvis, and to use these images for accurate MRI-based prostate RTP dose calculations.
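The dual-model structure described above, a fitted second-order polynomial inside the bone contour and threshold-based bulk assignment outside it, can be sketched as follows. All coefficients, thresholds, and bulk HU values here are illustrative assumptions, not the paper's fitted, patient-adjusted parameters:

```python
import numpy as np

def mri_to_hu(intensity, in_bone,
              bone_poly=(2.0e-4, -1.1, 1200.0),        # a, b, c (assumed)
              soft_classes=((0.0, 250.0, -90.0),       # (lo, hi, bulk HU): fat-like
                            (250.0, np.inf, 40.0))):   # muscle-like
    """Dual-model MRI-to-HU conversion sketch: a second-order polynomial in
    normalized MR intensity within the bone segment, and threshold-based
    bulk HU assignment for soft tissues outside it."""
    intensity = np.asarray(intensity, float)
    in_bone = np.asarray(in_bone, bool)
    hu = np.zeros_like(intensity)
    a, b, c = bone_poly
    v = intensity[in_bone]
    hu[in_bone] = a * v ** 2 + b * v + c
    for lo, hi, bulk in soft_classes:
        sel = ~in_bone & (intensity >= lo) & (intensity < hi)
        hu[sel] = bulk
    return hu
```

In the paper the polynomial is additionally tuned per patient (signal normalization and a low-density cutoff); that tuning is omitted here for brevity.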

Journal ArticleDOI
TL;DR: This Vision 20/20 paper encourages improvements in BT safety through the development of IVD into an effective method of independent treatment verification.
Abstract: In vivo dosimetry (IVD) has been used in brachytherapy (BT) for decades with a number of different detectors and measurement technologies. However, IVD in BT has been subject to certain difficulties and complexities, in particular due to the challenges of the high-gradient BT dose distribution and the large range of dose and dose rate. Due to these challenges, the sensitivity and specificity toward error detection have been limited, and IVD has mainly been restricted to the detection of gross errors. Given these factors, routine use of IVD is currently limited in many departments. Although the impact of potential errors may be detrimental, since treatments are typically administered in large fractions and with high-gradient dose distributions, BT is usually delivered without independent verification of the treatment delivery. This Vision 20/20 paper encourages improvements in BT safety through the development of IVD into an effective method of independent treatment verification.

Journal ArticleDOI
TL;DR: The series of anatomically variable phantoms developed in this work, the first library of 4D computational phantoms, provides a valuable resource for investigating 3D and 4D imaging devices and the effects of anatomy and motion in imaging.
Abstract: Purpose: The authors previously developed the 4D extended cardiac-torso (XCAT) phantom for multimodality imaging research. The XCAT consisted of highly detailed whole-body models for the standard male and female adult, including the cardiac and respiratory motions. In this work, the authors extend the XCAT beyond these reference anatomies by developing a series of anatomically variable 4D XCAT adult phantoms for imaging research, the first library of 4D computational phantoms. Methods: The initial anatomy of each phantom was based on chest–abdomen–pelvis computed tomography data from normal patients obtained from the Duke University database. The major organs and structures for each phantom were segmented from the corresponding data and defined using nonuniform rational B-spline surfaces. To complete the body, the authors manually added on the head, arms, and legs using the original XCAT adult male and female anatomies. The structures were scaled to best match the age and anatomy of the patient. A multichannel large deformation diffeomorphic metric mapping algorithm was then used to calculate the transform from the template XCAT phantom (male or female) to the target patient model. The transform was applied to the template XCAT to fill in any unsegmented structures within the target phantom and to implement the 4D cardiac and respiratory models in the new anatomy. Each new phantom was refined by checking for anatomical accuracy via inspection of the models. Results: Using these methods, the authors created a series of computerized phantoms with thousands of anatomical structures and modeling cardiac and respiratory motions. The database consists of 58 (35 male and 23 female) anatomically variable phantoms in total. Like the original XCAT, these phantoms can be combined with existing simulation packages to simulate realistic imaging data. 
Each new phantom contains parameterized models for the anatomy and the cardiac and respiratory motions and can, therefore, serve as a starting point from which to create an unlimited number of 3D and 4D variations for imaging research. Conclusions: A population of phantoms that includes a range of anatomical variations representative of the public at large is needed to more closely mimic a clinical study or trial. The series of anatomically variable phantoms developed in this work provide a valuable resource for investigating 3D and 4D imaging devices and the effects of anatomy and motion in imaging. Combined with Monte Carlo simulation programs, the phantoms also provide a valuable tool to investigate patient-specific dose and image quality, and optimization for adults undergoing imaging procedures.

Journal ArticleDOI
TL;DR: Numerical simulations have shown that the acoustic pattern can be complex inside the rat head and that special care must be taken for small animal studies relating acoustic parameters to neurostimulation effects, especially at a low frequency.
Abstract: Purpose: Low-intensity focused ultrasound has been shown to stimulate the brain noninvasively and without noticeable tissue damage. Such a noninvasive and localized neurostimulation is expected to have a major impact in neuroscience in the coming years. This emerging field will require many animal experiments to fully understand the link between ultrasound and stimulation. The primary goal of this paper is to investigate transcranial ultrasonic neurostimulation at low frequency (320 kHz) on anesthetized rats for different acoustic pressures and to estimate the in situ pressure field distribution and the corresponding motor threshold, if any. The corresponding acoustic pressure distribution inside the brain, which cannot be measured in vivo, is investigated based on numerical simulations of the ultrasound propagation inside the head cavity, reproducing as closely as possible the experiments conducted in the first part, both in terms of transducer and head geometry and in terms of acoustic parameters. Methods: In this study, 37 ultrasonic neurostimulation sessions were performed in rats (N = 8) using a 320 kHz transducer. The corresponding beam profile in the entire head was simulated in order to investigate the in situ pressure and intensity levels as well as the spatial pressure distribution, using a 3D finite-difference time-domain solver based on a rat microcomputed tomography (micro-CT) scan. Results: Ultrasound pulses evoked a motor response in more than 60% of the experimental sessions. In those sessions, the stimulation was always present and repeatable, with a pressure threshold under which no motor response occurred. This average acoustic pressure threshold was found to be 0.68 ± 0.1 MPa (corresponding mechanical index MI = 1.2 and spatial peak, pulse averaged intensity Isppa = 7.5 W cm⁻²), as calibrated in free water. A slight variation was observed between the deep anesthesia stage (0.77 ± 0.04 MPa) and the light anesthesia stage (0.61 ± 0.03 MPa), assessed from the pedal reflex.
Several kinds of motor responses were observed: movements of the tail, the hind legs, the forelimbs, the eye, and even a single whisker were induced separately. Numerical simulations of an equivalent experiment with identical acoustic parameters showed that the acoustic field was spread over the whole rat brain with the presence of several secondary pressure peaks. Due to reverberations, a 1.8-fold increase (±0.4 standard deviation) of the spatial peak, temporal peak acoustic pressure (Psptp), a 3.6-fold increase (±1.8) of the spatial peak, temporal peak acoustic intensity (Isptp), and a 2.3-fold increase of the spatial peak, pulse-averaged acoustic intensity (Isppa) were found compared to simulations of the beam in free water. Applying such reverberation corrections to the experimental results would yield a higher estimate of the average acoustic pressure threshold for motor neurostimulation at 320 kHz of 1.2 ± 0.3 MPa (MI = 2.2 ± 0.5 and Isppa = 17.5 ± 7.5 W cm−2). Conclusions: Transcranial ultrasonic stimulation is pressure- and anesthesia-dependent in the rat model. Numerical simulations have shown that the acoustic pattern can be complex inside the rat head and that special care must be taken for small animal studies relating acoustic parameters to neurostimulation effects, especially at low frequency.
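The mechanical index values quoted above follow from the standard definition MI = peak rarefactional pressure (MPa) divided by the square root of the center frequency (MHz). A quick numerical check against the reported thresholds, sketched in Python:

```python
import math

def mechanical_index(p_neg_mpa: float, f_mhz: float) -> float:
    """MI = peak rarefactional pressure (MPa) / sqrt(center frequency (MHz))."""
    return p_neg_mpa / math.sqrt(f_mhz)

# Free-water threshold reported in the abstract: 0.68 MPa at 320 kHz
print(round(mechanical_index(0.68, 0.320), 1))  # → 1.2
# Reverberation-corrected estimate: 1.2 MPa
print(round(mechanical_index(1.20, 0.320), 1))  # → 2.1 (within the reported 2.2 ± 0.5)
```

Both values are consistent with the abstract within its stated uncertainty.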

Journal ArticleDOI
TL;DR: The authors have developed an easily implemented technique to measure the axial MTF and 3D NPS of clinical CT systems using an ACR phantom, and the widespread availability of the phantom along with the free software the authors have provided will enable many different institutions to immediately measure MTF and NPS values for comparison of protocols and systems.
Abstract: Purpose: To develop an easily implemented technique with free, publicly available analysis software to measure the modulation transfer function (MTF) and noise-power spectrum (NPS) of a clinical computed tomography (CT) system from images acquired using a widely available and standardized American College of Radiology (ACR) CT accreditation phantom. Methods: Images of the ACR phantom were acquired on a Siemens SOMATOM Definition Flash system using a standard adult head protocol: 120 kVp, 300 mAs, and reconstructed voxel size of 0.49 mm × 0.49 mm × 4.67 mm. The radial (axial) MTF was measured using an edge method in which the boundary of the third module of the ACR phantom, originally designed to measure uniformity and noise, was used as a circular edge. The 3D NPS was measured using images from this same module and a previously described methodology that quantifies noise magnitude and 3D noise correlation. Results: The axial MTF was radially symmetrical and had a value of 0.1 at 0.62 mm−1. The 3D NPS shape was consistent with the ramp-filter function of filtered-backprojection reconstruction algorithms and with previously reported values. The radial NPS peak value was ∼115 HU² mm³ at ∼0.25 mm−1 and dropped to 0 HU² mm³ by 0.8 mm−1. Conclusions: The authors have developed an easily implemented technique to measure the axial MTF and 3D NPS of clinical CT systems using an ACR phantom. The widespread availability of the phantom along with the free software the authors have provided will enable many different institutions to immediately measure MTF and NPS values for comparison of protocols and systems.
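The edge method described above reduces to a short pipeline: differentiate the edge-spread function (ESF) to obtain the line-spread function (LSF), then take the normalized magnitude of its Fourier transform. This is a minimal numpy illustration, not the authors' released software; the synthetic edge and the window choice are assumptions:

```python
import numpy as np

def mtf_from_edge(esf: np.ndarray, pixel_mm: float):
    """Edge method: differentiate the ESF to get the LSF, then take the
    normalized magnitude of its FFT."""
    lsf = np.gradient(esf)
    lsf = lsf * np.hanning(lsf.size)               # taper to suppress tail noise
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / mtf[0]                             # normalize to 1 at zero frequency
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm)  # spatial frequency in mm^-1
    return freqs, mtf

# Synthetic smooth edge sampled at the paper's 0.49 mm in-plane pixel size
x = np.arange(256)
esf = 0.5 * (1.0 + np.tanh((x - 128) / 3.0))
freqs, mtf = mtf_from_edge(esf, pixel_mm=0.49)
print(mtf[0])  # → 1.0; the curve decays toward zero with increasing frequency
```

A real measurement would first rebin the circular-edge pixels by distance from the phantom center to build an oversampled ESF before this step.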

Journal ArticleDOI
TL;DR: A robust optimization method that deals with the uncertainties directly during the spot weight optimization to ensure clinical target volume (CTV) coverage without using PTV provided significantly more robust dose distributions to targets and organs than PTV-based conventional optimization in H&N using IMPT.
Abstract: Purpose: Intensity-modulated proton therapy (IMPT) is highly sensitive to uncertainties in beam range and patient setup. Conventionally, these uncertainties are dealt with by using a geometrically expanded planning target volume (PTV). In this paper, the authors evaluated a robust optimization method that deals with the uncertainties directly during the spot weight optimization to ensure clinical target volume (CTV) coverage without using a PTV. The authors compared the two methods for a population of head and neck (H&N) cancer patients. Methods: Two sets of IMPT plans were generated for 14 H&N cases, one PTV-based and conventionally optimized, the other CTV-based and robustly optimized. For the PTV-based conventionally optimized plans, the uncertainties were accounted for by expanding the CTV to the PTV via margins and delivering the prescribed dose to the PTV. For the CTV-based robustly optimized plans, spot weight optimization was guided to directly reduce the discrepancy in doses under extreme setup and range uncertainties, while delivering the prescribed dose to the CTV rather than the PTV. For each of these plans, the authors calculated dose distributions under various uncertainty settings. The root-mean-square dose (RMSD) for each voxel was computed, and the area under the RMSD-volume histogram curve (AUC) was used to compare plan robustness. Data derived from the dose volume histogram for the worst-case and nominal doses were used to evaluate plan optimality. The plan evaluation metrics were then averaged over the 14 cases and compared with two-sided paired t tests. Results: CTV-based robust optimization led to more robust (i.e., smaller AUCs) plans for both targets and organs. Under the worst-case scenario and the nominal scenario, CTV-based robustly optimized plans showed better target coverage (i.e., greater D95%), improved dose homogeneity (i.e., smaller D5% − D95%), and lower or equivalent dose to organs at risk. 
Conclusions: CTV-based robust optimization provided significantly more robust dose distributions to targets and organs than PTV-based conventional optimization in H&N using IMPT. Eliminating the use of PTV and planning directly based on CTV provided better or equivalent normal tissue sparing.
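The robustness metric used in this study (per-voxel RMSD over the uncertainty scenarios, summarized by the area under the RMSD-volume histogram) can be sketched in a few lines. The toy dose arrays, scenario count, and threshold range below are assumptions for illustration only:

```python
import numpy as np

def rmsd_per_voxel(dose_scenarios: np.ndarray, dose_nominal: np.ndarray) -> np.ndarray:
    """Root-mean-square dose deviation per voxel over K uncertainty scenarios.
    dose_scenarios: (K, Nvox); dose_nominal: (Nvox,)."""
    return np.sqrt(np.mean((dose_scenarios - dose_nominal) ** 2, axis=0))

def rvh_auc(rmsd: np.ndarray, max_rmsd_gy: float = 5.0, n_bins: int = 500) -> float:
    """Area under the RMSD-volume histogram: for each threshold, the fraction
    of voxels whose RMSD exceeds it; a smaller area means a more robust plan."""
    thresholds = np.linspace(0.0, max_rmsd_gy, n_bins)
    vol_frac = np.array([(rmsd >= t).mean() for t in thresholds])
    dt = thresholds[1] - thresholds[0]
    return float(((vol_frac[1:] + vol_frac[:-1]) * 0.5 * dt).sum())  # trapezoid rule

rng = np.random.default_rng(0)
nominal = np.full(1000, 60.0)                        # Gy, toy CTV dose
robust  = nominal + rng.normal(0.0, 0.5, (8, 1000))  # tight spread over 8 scenarios
fragile = nominal + rng.normal(0.0, 2.0, (8, 1000))  # wide spread over 8 scenarios
print(rvh_auc(rmsd_per_voxel(robust, nominal)) < rvh_auc(rmsd_per_voxel(fragile, nominal)))  # → True
```

The AUC lets plans with different RMSD distributions be compared with a single scalar, which is what the paper averages over its 14 cases.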

Journal ArticleDOI
TL;DR: The first objective of this work was to determine and compare small-field output factors (OF) measured with different types of active detectors and passive dosimeters for three types of facilities: a CyberKnife® system, a dedicated medical linear accelerator (Novalis) equipped with an m3 microMLC and circular cones, and an adaptive medical linear accelerator (Clinac 2100) equipped with an additional m3 microMLC.
Abstract: Purpose: The use of small photon fields is now an established practice in stereotactic radiosurgery and radiotherapy. However, due to the lack of lateral electron equilibrium and the high dose gradients, it is difficult to accurately measure the dosimetric quantities required for the commissioning of such systems. Moreover, there is still no metrological dosimetric reference for this kind of beam today. In this context, the first objective of this work was to determine and compare small-field output factors (OF) measured with different types of active detectors and passive dosimeters for three types of facilities: a CyberKnife® system, a dedicated medical linear accelerator (Novalis) equipped with an m3 microMLC and circular cones, and an adaptive medical linear accelerator (Clinac 2100) equipped with an additional m3 microMLC. The second objective was to determine the k_{Qclin,Qmsr}^{fclin,fmsr} correction factors introduced in a recently proposed small-field dosimetry formalism for different active detectors. Methods: Small field sizes were defined either by the microMLC, down to 6 × 6 mm², or by circular cones, down to 4 mm in diameter. OF measurements were performed with several commercially available active detectors dedicated to measurements in small fields (high-resolution diodes: IBA SFD, Sun Nuclear EDGE, PTW 60016, PTW 60017; ionization chambers: PTW 31014 PinPoint chamber, PTW 31018 microLion liquid chamber, and PTW 60003 natural diamond). Two types of passive dosimeters were used: LiF microcubes and EBT2 radiochromic films. Results: Significant differences between the results obtained by the several dosimetric systems were observed, particularly for the smallest field size, for which the difference in the measured OF reaches more than 20%. 
For the passive dosimeters, an excellent agreement (better than 2%) was observed between EBT2 films and LiF microcubes for all OF measurements. Moreover, it has been shown that these passive dosimeters do not require correction factors and can therefore be used as reference dosimeters. Correction factors for the active detectors were then determined from the mean experimental OF measured by the passive dosimeters. Conclusions: Four sets of correction factors needed to apply the new small-field dosimetry formalism are provided for several active detectors. A protocol for small photon beam OF determination based on passive dosimeter measurements has recently been proposed to French radiotherapy treatment centers.

Journal ArticleDOI
TL;DR: This set of multi-institutional data can provide comparison data to others embarking on TrueBeam commissioning, ultimately improving the safety and quality of beam commissioning.
Abstract: Purpose: Latest generation linear accelerators (linacs), i.e., TrueBeam (Varian Medical Systems, Palo Alto, CA) and its stereotactic counterpart, TrueBeam STx, have several unique features, including high-dose-rate flattening-filter-free (FFF) photon modes, reengineered electron modes with new scattering foil geometries, updated imaging hardware/software, and a novel control system. An evaluation of five TrueBeam linacs at three different institutions has been performed and this work reports on the commissioning experience. Methods: Acceptance and commissioning data were analyzed for five TrueBeam linacs equipped with 120 leaf (5 mm width) MLCs at three different institutions. Dosimetric data and mechanical parameters were compared. These included measurements of photon beam profiles (6X, 6XFFF, 10X, 10XFFF, 15X), photon and electron percent depth dose (PDD) curves (6, 9, 12 MeV), relative photon output factors (Scp), electron cone factors, mechanical isocenter accuracy, MLC transmission, and dosimetric leaf gap (DLG). End-to-end testing and IMRT commissioning were also conducted. Results: Gantry/collimator isocentricity measurements were similar (0.27–0.28 mm), with overall couch/gantry/collimator values of 0.46–0.68 mm across the three institutions. Dosimetric data showed good agreement between machines. The average MLC DLGs for 6, 10, and 15 MV photons were 1.33 ± 0.23, 1.57 ± 0.24, and 1.61 ± 0.26 mm, respectively. 6XFFF and 10XFFF modes had average DLGs of 1.16 ± 0.22 and 1.44 ± 0.30 mm, respectively. MLC transmission showed minimal variation across the three institutions, with the standard deviation <0.2% for all linacs. Photon and electron PDDs were comparable for all energies. 6, 10, and 15 MV photon beam quality, %dd(10)x varied less than 0.3% for all linacs. 
Output factors (Scp) and electron cone factors agreed within 0.27%, on average; the largest variations were observed for small field sizes (1.2% coefficient of variation, 10 MV, 2 × 2 cm²) and small cone sizes (<1% coefficient of variation, 6 × 6 cm² cone), respectively. Conclusions: Overall, excellent agreement was observed in TrueBeam commissioning data. This set of multi-institutional data can provide comparison data to others embarking on TrueBeam commissioning, ultimately improving the safety and quality of beam commissioning.

Journal ArticleDOI
TL;DR: The first tomographic EI XPCi images acquired with a conventional x-ray source at dose levels below those used for preclinical small animal imaging are presented, demonstrating that phase-based imaging methods can provide superior results compared to attenuation-based modalities for weakly attenuating samples in 3D as well.
Abstract: Purpose: The edge illumination (EI) x-ray phase contrast imaging (XPCi) method has recently been further developed to perform tomographic and, thus, volumetric imaging. In this paper, the first tomographic EI XPCi images acquired with a conventional x-ray source at dose levels below those used for preclinical small animal imaging are presented. Methods: Two test objects, a biological sample and a custom-built phantom, were imaged with a laboratory-based EI XPCi setup in tomography mode. Tomographic maps that show the phase shift and attenuating properties of the object were reconstructed and analyzed in terms of signal-to-noise ratio and quantitative accuracy. Dose measurements using thermoluminescent devices were performed. Results: The obtained images demonstrate that phase-based imaging methods can provide superior results compared to attenuation-based modalities for weakly attenuating samples in 3D as well. Moreover, and most importantly, they demonstrate the feasibility of low-dose imaging. In addition, the experimental results can be considered quantitative within the constraints imposed by polychromaticity. Conclusions: The results, together with the method's dose efficiency and compatibility with conventional x-ray sources, indicate that tomographic EI XPCi can become an important tool for the routine imaging of biomedical samples.

Journal ArticleDOI
TL;DR: The quality assurance procedure for RapidArc treatment plans must include a thorough examination of where dose discrepancies occur, and professional judgment is needed when interpreting the gamma-index analysis, since even a >90% passing rate using the 2%/2 mm gamma-index criterion does not guarantee the absence of clinically significant dose deviations.
Abstract: Purpose: In this study, the effects of small systematic MLC misalignments and gravitational errors on the quality of RapidArc treatment plan delivery are investigated by means of verification measurements with two detector arrays and an evaluation of the clinical significance of the error-induced deviations. Methods: Five prostate and six head and neck plans were modified by means of three error types: (1) both MLC banks are opened in opposing directions, resulting in larger fields; (2) both MLC banks are closed, resulting in smaller fields; and (3) both MLC banks are shifted in the same direction for lateral gantry angles, to simulate the effects of gravity on the leaves. Measurements were evaluated with respect to gamma-index criteria of 3%/3 mm and 2%/2 mm. Dose in the modified plans was recalculated, and the resulting dose volume histograms for target and critical structures were compared to those of the unaltered plans. Results: The smallest introduced leaf position deviations that fail the >90% criterion for a gamma-index of 2%/2 mm are: (1) 1 mm; (2) 0.5 mm for prostate and 1.0 mm for head and neck cases; and (3) 3 mm, corresponding to the three error types, respectively. These errors would lead to significant changes in mean PTV dose and would not be detected with the more commonly used 3%/3 mm gamma-index criterion. Conclusions: A stricter gamma-index criterion (2%/2 mm) is necessary in order to detect positional errors of the MLC. Nevertheless, the quality assurance procedure for RapidArc treatment plans must include a thorough examination of where dose discrepancies occur, and professional judgment is needed when interpreting the gamma-index analysis, since even a >90% passing rate using the 2%/2 mm gamma-index criterion does not guarantee the absence of clinically significant dose deviations.
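The gamma index underlying these 2%/2 mm and 3%/3 mm criteria combines a dose-difference term and a distance-to-agreement (DTA) term and takes the minimum over the comparison profile. A simplified 1D global-gamma sketch (brute-force search; the Gaussian test profiles are assumptions):

```python
import numpy as np

def gamma_1d(ref, meas, positions, dose_tol=0.02, dist_tol_mm=2.0):
    """Simplified 1D global gamma index. dose_tol is a fraction of the
    reference maximum (0.02 for 2%); dist_tol_mm is the DTA criterion."""
    d_norm = dose_tol * ref.max()
    gammas = np.empty_like(ref, dtype=float)
    for i, (d_r, x_r) in enumerate(zip(ref, positions)):
        dose_term = (meas - d_r) / d_norm
        dist_term = (positions - x_r) / dist_tol_mm
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

x = np.arange(0, 100, 1.0)                 # mm grid
ref = np.exp(-((x - 50) / 15.0) ** 2)      # reference profile
meas = np.exp(-((x - 51) / 15.0) ** 2)     # measurement shifted by 1 mm
gammas = gamma_1d(ref, meas, x)
pass_rate = 100.0 * (gammas <= 1.0).mean()
print(pass_rate >= 90.0)  # → True: a 1 mm shift passes a 2 mm DTA criterion
```

This also illustrates the paper's point: a pure spatial shift well inside the DTA tolerance passes easily, so a high pass rate alone does not rule out clinically relevant deviations.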

Journal ArticleDOI
TL;DR: Human observer performance on a 2AFC lesion detection task in CT with a uniform background can be accurately predicted by a channelized Hotelling observer at different radiation dose levels and for both FBP and IR methods.
Abstract: Purpose: Efficient optimization of CT protocols demands a quantitative approach to predicting human observer performance on specific tasks at various scan and reconstruction settings. The goal of this work was to investigate how well a channelized Hotelling observer (CHO) can predict human observer performance on 2-alternative forced choice (2AFC) lesion-detection tasks at various dose levels and with two different reconstruction algorithms: a filtered-backprojection (FBP) and an iterative reconstruction (IR) method. Methods: A 35 × 26 cm² torso-shaped phantom filled with water was used to simulate an average-sized patient. Three rods with different diameters (small: 3 mm; medium: 5 mm; large: 9 mm) were placed in the center region of the phantom to simulate small, medium, and large lesions. The contrast relative to background was −15 HU at 120 kV. The phantom was scanned 100 times with automatic exposure control at each of 60, 120, 240, 360, and 480 quality reference mAs on a 128-slice scanner. After removing the three rods, the water phantom was again scanned 100 times to provide signal-absent background images at the exact same locations. By extracting regions of interest around the three rods and on the signal-absent images, the authors generated 21 2AFC studies. Each 2AFC study had 100 trials, with each trial consisting of a signal-present image and a signal-absent image side-by-side in randomized order. In total, 2100 trials were presented to both the model and human observers. Four medical physicists acted as human observers. For the model observer, the authors used a CHO with Gabor channels, involving six channel passbands, five orientations, and two phases, for a total of 60 channels. The performance predicted by the CHO was compared with that obtained by the four medical physicists for each 2AFC study. Results: The human and model observers were highly correlated at each dose level and lesion size for both FBP and IR. 
Pearson's product-moment correlation coefficients were 0.986 [95% confidence interval (CI): 0.958–0.996] for FBP and 0.985 (95% CI: 0.863–0.998) for IR. Bland-Altman plots showed excellent agreement for all dose levels and lesion sizes, with a mean absolute difference of 1.0% ± 1.1% for FBP and 2.1% ± 3.3% for IR. Conclusions: Human observer performance on a 2AFC lesion detection task in CT with a uniform background can be accurately predicted by a CHO model observer at different radiation dose levels and for both FBP and IR methods.
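A channelized Hotelling observer of the kind used here reduces each image to a handful of channel responses, builds a prewhitened (Hotelling) template from the channel statistics, and scores each 2AFC trial by comparing the test statistics of the two images. The following toy sketch uses simplified Gabor-like channels and white-noise backgrounds, with resubstitution training; the authors used 60 Gabor channels and real CT noise, so all parameters here are assumptions:

```python
import numpy as np

def gabor_channels(size: int, freqs, thetas, phases) -> np.ndarray:
    """Return a (size*size, nchannels) matrix of Gabor-like channel templates.
    Tying the envelope width to the frequency is a simplification."""
    half = size // 2
    y, x = np.mgrid[-half:half, -half:half].astype(float)
    chans = []
    for f in freqs:
        env = np.exp(-(x**2 + y**2) * f**2)
        for th in thetas:
            u = x * np.cos(th) + y * np.sin(th)
            for ph in phases:
                chans.append((env * np.cos(2 * np.pi * f * u + ph)).ravel())
    return np.array(chans).T

rng = np.random.default_rng(1)
size, n_trials = 32, 200
half = size // 2
yy, xx = np.mgrid[-half:half, -half:half].astype(float)
signal = 1.5 * np.exp(-(xx**2 + yy**2) / (2 * 3.0**2)).ravel()   # low-contrast blob

U = gabor_channels(size, freqs=[0.05, 0.1, 0.2],
                   thetas=[0.0, np.pi / 4, np.pi / 2], phases=[0.0, np.pi / 2])

bkg_a = rng.normal(0.0, 1.0, (n_trials, size * size))  # signal-present backgrounds
bkg_b = rng.normal(0.0, 1.0, (n_trials, size * size))  # signal-absent backgrounds
v_sig = (bkg_a + signal) @ U                           # channel outputs
v_bkg = bkg_b @ U
S = 0.5 * (np.cov(v_sig.T) + np.cov(v_bkg.T))          # mean intraclass covariance
w = np.linalg.solve(S, v_sig.mean(axis=0) - v_bkg.mean(axis=0))  # Hotelling template
pc = float(np.mean(v_sig @ w > v_bkg @ w))             # 2AFC percent correct
print(pc > 0.5)  # → True: the model observer beats chance on this easy task
```

In practice the percent correct from such a model is what gets compared against the human 2AFC scores at each dose level and lesion size.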

Journal ArticleDOI
TL;DR: The head and neck phantom is a useful credentialing tool for multi-institutional IMRT clinical trials and the most commonly represented linear accelerator-planning system combinations can all pass the phantom, though some combinations had higher passing percentages than others.
Abstract: Purpose: This study was performed to report and analyze the results of the Radiological Physics Center's head and neck intensity-modulated radiation therapy (IMRT) phantom irradiations performed by institutions seeking to be credentialed for participation in clinical trials using intensity modulated radiation therapy. Methods: The Radiological Physics Center's anthropomorphic head and neck phantom was sent to institutions seeking to participate in multi-institutional clinical trials. The phantom contained two planning target volume (PTV) structures and an organ at risk (OAR). Thermoluminescent dosimeters (TLD) and film dosimeters were embedded in the PTV. Institutions were asked to image, plan, and treat the phantom as they would treat a patient. The treatment plan had to cover at least 95% of the primary PTV with 6.6 Gy and at least 95% of the secondary PTV with 5.4 Gy, while limiting the dose to the OAR to less than 4.5 Gy. The passing criteria were ±7% for the TLD in the PTVs and a distance to agreement of 4 mm in the high dose gradient area between the PTV and the OAR. Pass rates for different delivery types, treatment planning systems (TPS), linear accelerators, and linear accelerator-planning system combinations were compared. Results: The phantom was irradiated 1139 times by 763 institutions from 2001 through 2011. 929 (81.6%) of the irradiations passed the criteria. 156 (13.7%) irradiations failed only the TLD criteria, 21 (1.8%) failed only the film criteria, and 33 (2.9%) failed both sets of criteria. Only 69% of the irradiations passed a narrowed TLD criterion of ±5%. Varian-Eclipse and TomoTherapy-HiArt combinations had the highest pass rates, ranging from 90% to 93%. Varian-Pinnacle3, Varian-XiO, Siemens-Pinnacle3, and Elekta-Pinnacle3 combinations had pass rates that ranged from 66% to 81%. Conclusions: The head and neck phantom is a useful credentialing tool for multi-institutional IMRT clinical trials. 
The most commonly represented linear accelerator-planning system combinations can all pass the phantom, though some combinations had higher passing percentages than others. Tightening the criteria would significantly reduce the number of institutions passing the credentialing criteria. Causes for failures include incorrect data entered into the TPS, inexact beam modeling, and software and hardware failures.

Journal ArticleDOI
TL;DR: The authors created CONRAD, a software framework that provides many of the tools that are required to simulate basic processes in x-ray imaging and perform image reconstruction with consideration of nonlinear physical effects and enables the medical physics community to share algorithms and develop new ideas.
Abstract: Purpose: In the community of x-ray imaging, there is a multitude of tools and applications used in scientific practice. Many of these tools are proprietary and can only be used within a certain lab. Often the same algorithm is implemented multiple times by different groups in order to enable comparison. In an effort to tackle this problem, the authors created CONRAD, a software framework that provides many of the tools required to simulate basic processes in x-ray imaging and to perform image reconstruction with consideration of nonlinear physical effects. Methods: CONRAD is a Java-based state-of-the-art software platform with extensive documentation. It is based on platform-independent technologies. Special libraries offer access to hardware acceleration such as OpenCL, and there is an easy-to-use interface for parallel processing. The software package includes different simulation tools that are able to generate up to 4D projection and volume data and the respective vector motion fields. Well-known reconstruction algorithms such as FBP, DBP, and ART are included. All algorithms in the package are referenced to a scientific source. Results: A total of 13 different phantoms and 30 processing steps had already been integrated into the platform at the time of writing. The platform comprises 74,000 nonblank lines of code, of which 19% are used for documentation. The software package is available for download at http://conrad.stanford.edu. To demonstrate the use of the package, the authors reconstructed images from two different scanners, a table-top system and a clinical C-arm system. Runtimes were evaluated using the RabbitCT platform and demonstrate state-of-the-art performance, with 2.5 s for the 256 problem size and 12.4 s for the 512 problem size. Conclusions: As a common software framework, CONRAD enables the medical physics community to share algorithms and develop new ideas. 
In particular this offers new opportunities for scientific collaboration and quantitative performance comparison between the methods of different groups.

Journal ArticleDOI
TL;DR: This Vision 20/20 paper addresses major questions related to the applicability of advanced cloud computing in medical imaging and considers security and ethical issues that accompany cloud computing.
Abstract: Over the past century technology has played a decisive role in defining, driving, and reinventing procedures, devices, and pharmaceuticals in healthcare. Cloud computing has been introduced only recently but is already one of the major topics of discussion in research and clinical settings. The provision of extensive, easily accessible, and reconfigurable resources such as virtual systems, platforms, and applications with low service cost has caught the attention of many researchers and clinicians. Healthcare researchers are moving their efforts to the cloud, because they need adequate resources to process, store, exchange, and use large quantities of medical data. This Vision 20/20 paper addresses major questions related to the applicability of advanced cloud computing in medical imaging. The paper also considers security and ethical issues that accompany cloud computing.

Journal ArticleDOI
TL;DR: The various systems can detect various errors and the sensitivity to the introduced errors depends on the plan, but there was poor correlation between the gamma evaluation pass rates of the QA procedures and the deviations observed in the dose volume histograms.
Abstract: Purpose: The purpose of the present study was to investigate the ability of commercial patient quality assurance (QA) systems to detect linear accelerator-related errors. Methods: Four measuring systems (Delta4®, OCTAVIUS®, COMPASS, and Epiqa™) designed for patient specific quality assurance for rotational radiation therapy were compared by measuring four clinical rotational intensity modulated radiation therapy plans as well as plans with introduced intentional errors. The intentional errors included increasing the number of monitor units, widening of the MLC banks, and rotation of the collimator. The measurements were analyzed using the inherent gamma evaluation with 2% and 2 mm criteria and 3% and 3 mm criteria. When applicable, the plans with intentional errors were compared with the original plans both by 3D gamma evaluation and by inspecting the dose volume histograms produced by the systems. Results: There was considerable variation in the type of errors that the various systems detected; the failure rate for the plans with errors varied between 0% and 72%. When using 2% and 2 mm criteria and 95% as a pass rate the Delta4® detected 15 of 20 errors, OCTAVIUS® detected 8 of 20 errors, COMPASS detected 8 of 20 errors, and Epiqa™ detected 20 of 20 errors. It was also found that the calibration and measuring procedure could benefit from improvements for some of the patient QA systems. Conclusions: The various systems can detect various errors and the sensitivity to the introduced errors depends on the plan. There was poor correlation between the gamma evaluation pass rates of the QA procedures and the deviations observed in the dose volume histograms.

Journal ArticleDOI
TL;DR: This installment of the Vision 20/20 series examines the current status of MRgFUS, focusing on the hurdles the technology faces before it can cross over from a research technique to a standard fixture in the clinic, and reviews current and near-term technical developments which may overcome these hurdles.
Abstract: MR-guided focused ultrasound surgery (MRgFUS) is a quickly developing technology with potential applications across a spectrum of indications traditionally within the domain of radiation oncology. Especially for applications where focal treatment is the preferred technique (for example, radiosurgery), MRgFUS has the potential to be a disruptive technology that could shift traditional patterns of care. While currently cleared in the United States for the noninvasive treatment of uterine fibroids and bone metastases, a wide range of clinical trials are currently underway, and the number of publications describing advances in MRgFUS is increasing. However, for MRgFUS to make the transition from a research curiosity to a clinical standard of care, a variety of challenges, technical, financial, clinical, and practical, must be overcome. This installment of the Vision 20/20 series examines the current status of MRgFUS, focusing on the hurdles the technology faces before it can cross over from a research technique to a standard fixture in the clinic. It then reviews current and near-term technical developments which may overcome these hurdles and allow MRgFUS to break through into clinical practice.

Journal ArticleDOI
TL;DR: 4D-CT(MRI) presents a novel approach to test the robustness of treatment plans in the circumstance of patient motion, and it is demonstrated that motion information from 4D-MRI can be used to generate realistic 4d-CT data sets on the basis of a single static 3D- CT data set.
Abstract: Purpose: Target sites affected by organ motion require a time-resolved (4D) dose calculation. Typical 4D dose calculations use 4D-CT as a basis. Unfortunately, 4D-CT images have the disadvantage of being a snapshot of the motion during acquisition and of assuming regularity of breathing. In addition, 4D-CT acquisitions involve a substantial additional dose burden to the patient, making many repeated 4D-CT acquisitions undesirable. Here the authors test the feasibility of an alternative approach to generate patient-specific 4D-CT data sets. Methods: In this approach, motion information is extracted from 4D-MRI. Simulated 4D-CT data sets [which the authors call 4D-CT(MRI)] are created by warping extracted deformation fields to a static 3D-CT data set. The use of 4D-MRI sequences for this has the advantage that no assumptions on breathing regularity are made, irregularities in breathing can be studied and, if necessary, many repeat imaging studies (and consequently simulated 4D-CT data sets) can be performed on patients and/or volunteers. The accuracy of the 4D-CT(MRI)s has been validated by 4D proton dose calculations. The authors' 4D dose algorithm takes into account displacements as well as deformations on the originating 4D-CT/4D-CT(MRI) by calculating the dose of each pencil beam based on an individual time stamp of when that pencil beam is applied. According to the corresponding displacement and density-variation maps, the position and the water-equivalent range of the dose grid points are adjusted at each time instance. Results: 4D dose distributions, using 4D-CT(MRI) data sets as input, were compared to results based on a reference conventional 4D-CT data set capturing similar motion characteristics. Almost identical 4D dose distributions could be achieved, even though scanned proton beams are very sensitive to small differences in the patient geometry. 
In addition, 4D dose calculations have been performed on the same patient, but using 4D-CT(MRI) data sets based on variable breathing patterns, to show the effect of possible irregular breathing on actively scanned proton therapy. Using a 4D-CT(MRI) that includes motion irregularities resulted in significantly different proton dose distributions. Conclusions: The authors have demonstrated that motion information from 4D-MRI can be used to generate realistic 4D-CT data sets on the basis of a single static 3D-CT data set. 4D-CT(MRI) presents a novel approach to test the robustness of treatment plans in the circumstance of patient motion.
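The core of the 4D-CT(MRI) construction, warping a static 3D-CT with a deformation field extracted from 4D-MRI, can be illustrated with a simple pull-back warp. The authors' interpolation scheme is not specified in the abstract, so the nearest-neighbor sampling and toy volume below are assumptions:

```python
import numpy as np

def warp_ct(static_ct: np.ndarray, dvf: np.ndarray) -> np.ndarray:
    """Pull-back warp of a static 3D-CT by a deformation vector field.
    dvf has shape (3, nz, ny, nx): per-voxel source displacement in voxel
    units. Nearest-neighbor sampling keeps the sketch dependency-free."""
    grid = np.indices(static_ct.shape).astype(float)
    src = np.rint(grid + dvf).astype(int)             # where each voxel samples from
    for ax, n in enumerate(static_ct.shape):
        np.clip(src[ax], 0, n - 1, out=src[ax])       # clamp at the volume border
    return static_ct[src[0], src[1], src[2]]

# Toy volume: a bright cube ("tumor") displaced 2 voxels along z by the field
ct = np.zeros((20, 20, 20))
ct[8:12, 8:12, 8:12] = 1000.0
dvf = np.zeros((3, 20, 20, 20))
dvf[0] += 2.0                                         # uniform 2-voxel z displacement
warped = warp_ct(ct, dvf)
print(warped[6, 9, 9], warped[10, 9, 9])  # → 1000.0 0.0 (cube now at z = 6..9)
```

Repeating this per breathing phase, with a different field for each phase, yields the simulated 4D-CT(MRI) sequence the dose calculation then runs on.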

Journal ArticleDOI
TL;DR: The performance of this new segmentation algorithm in delineating tumor contour and measuring tumor size illustrates its potential clinical value for assisting in noninvasive diagnosis of pulmonary nodules, therapy response assessment, and radiation treatment planning.
Abstract: Purpose: Lung lesions vary considerably in size, density, and shape, and can attach to surrounding anatomic structures such as the chest wall or mediastinum. Automatic segmentation of these lesions poses a challenge. This work communicates a new three-dimensional algorithm for the segmentation of a wide variety of lesions, ranging from tumors found in patients with advanced lung cancer to small nodules detected in lung cancer screening programs. Methods: The authors' algorithm uniquely combines the image processing techniques of marker-controlled watershed, geometric active contours, and Markov random fields (MRF). The user manually selects a region of interest encompassing the lesion on a single slice; the watershed method then generates an initial three-dimensional surface of the lesion, which is refined by the geometric active contours. MRF improves the segmentation of the ground-glass opacity portions of part-solid lesions. The algorithm was tested on an anthropomorphic thorax phantom dataset and two publicly accessible clinical lung datasets. The clinical studies included a same-day repeat CT dataset (prewalk and postwalk scans performed within 15 min) containing 32 lung lesions with one radiologist's delineated contours, and the first release of the Lung Image Database Consortium (LIDC) dataset containing 23 lung nodules with six radiologists' delineated contours. The phantom dataset contained 22 phantom nodules of known volumes that were inserted in a phantom thorax. Results: For the prewalk scans of the same-day repeat CT dataset and the LIDC dataset, the mean overlap ratios between lesion volumes generated by the computer algorithm and by the radiologist(s) were 69% and 65%, respectively. For the two repeat CT scans, the intraclass correlation coefficient (ICC) was 0.998, indicating high reliability of the algorithm. The mean relative difference for the phantom dataset was −3%. Conclusions: The performance of this new segmentation algorithm in delineating tumor contours and measuring tumor size illustrates its potential clinical value for assisting in the noninvasive diagnosis of pulmonary nodules, therapy response assessment, and radiation treatment planning.
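To make the initialization step concrete, the sketch below shows marker-controlled watershed flooding on a toy 2D grid. This is a minimal, standard-library-only illustration of the general technique, not the paper's implementation: the function name, grid, and marker placement are all hypothetical, and the real algorithm operates in 3D on CT intensities.

```python
import heapq

def marker_watershed(image, markers):
    """Marker-controlled watershed on a 2D grid (illustrative sketch).

    image:   2D list of intensities (e.g., a gradient-magnitude map,
             where high values act as ridges between regions)
    markers: 2D list; 0 = unlabeled, >0 = user-placed seed label
    Returns a label map in which every pixel has been flooded from a
    marker, with flooding ordered by increasing image value.
    """
    rows, cols = len(image), len(image[0])
    labels = [row[:] for row in markers]
    heap = []
    # Seed the priority queue with all marker pixels.
    for r in range(rows):
        for c in range(cols):
            if markers[r][c] > 0:
                heapq.heappush(heap, (image[r][c], r, c))
    # Flood outward: lowest-intensity frontier pixels are claimed first,
    # so high-intensity ridges are reached last and act as boundaries.
    while heap:
        _, r, c = heapq.heappop(heap)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and labels[nr][nc] == 0:
                labels[nr][nc] = labels[r][c]  # inherit the flooding label
                heapq.heappush(heap, (image[nr][nc], nr, nc))
    return labels

# Two basins (values 1 and 2) separated by a ridge of 9s in column 3;
# one seed per basin.
img = [[1, 1, 1, 9, 2, 2, 2] for _ in range(3)]
mk = [[0] * 7 for _ in range(3)]
mk[1][1] = 1
mk[1][5] = 2
print(marker_watershed(img, mk))  # each row: [1, 1, 1, 1, 2, 2, 2]
```

In the paper's pipeline this initial surface would then be handed to the geometric active contour stage for refinement; here the toy grid simply shows how each seed claims its own basin up to the ridge.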

Journal ArticleDOI
TL;DR: Proper assessment of CAD system performance is expected to increase the understanding of a CAD system's effectiveness and limitations, which is expected to stimulate further research and development efforts on CAD technologies, reduce problems due to improper use, and eventually improve the utility and efficacy of CAD in clinical practice.
Abstract: Computer-aided detection and diagnosis (CAD) systems are increasingly being used as an aid by clinicians for detection and interpretation of diseases. Computer-aided detection systems mark regions of an image that may reveal specific abnormalities and are used to alert clinicians to these regions during image interpretation. Computer-aided diagnosis systems provide an assessment of a disease using image-based information alone or in combination with other relevant diagnostic data and are used by clinicians as a decision support in developing their diagnoses. While CAD systems are commercially available, standardized approaches for evaluating and reporting their performance have not yet been fully formalized in the literature or in a standardization effort. This deficiency has led to difficulty in the comparison of CAD devices and in understanding how the reported performance might translate into clinical practice. To address these important issues, the American Association of Physicists in Medicine (AAPM) formed the Computer Aided Detection in Diagnostic Imaging Subcommittee (CADSC), in part, to develop recommendations on approaches for assessing CAD system performance. The purpose of this paper is to convey the opinions of the AAPM CADSC members and to stimulate the development of consensus approaches and “best practices” for evaluating CAD systems. Both the assessment of a standalone CAD system and the evaluation of the impact of CAD on end-users are discussed. It is hoped that awareness of these important evaluation elements and the CADSC recommendations will lead to further development of structured guidelines for CAD performance assessment. 
Proper assessment of CAD system performance is expected to increase the understanding of a CAD system's effectiveness and limitations, which is expected to stimulate further research and development efforts on CAD technologies, reduce problems due to improper use, and eventually improve the utility and efficacy of CAD in clinical practice.
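A common figure of merit in the standalone assessment the abstract mentions is the area under the ROC curve (AUC), which for a CAD scoring system equals the probability that a randomly chosen diseased case receives a higher score than a randomly chosen non-diseased case. A minimal sketch of the empirical (Mann-Whitney) estimate follows; the function name and example scores are illustrative and not drawn from the AAPM CADSC report.

```python
def empirical_auc(pos_scores, neg_scores):
    """Empirical AUC via the Mann-Whitney U statistic: the fraction of
    (diseased, non-diseased) case pairs that the CAD system orders
    correctly, with ties counting one half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Illustrative CAD scores: AUC of 1.0 means perfect separation,
# 0.5 means chance-level performance.
diseased = [0.9, 0.8, 0.4]
healthy = [0.3, 0.5, 0.2]
print(empirical_auc(diseased, healthy))  # 8 of 9 pairs ordered correctly
```

Standalone metrics like this complement, but do not replace, reader studies measuring the CAD system's effect on end-user performance, which is the distinction the CADSC recommendations draw.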