
Showing papers in "Physics in Medicine and Biology in 2014"


Journal ArticleDOI
TL;DR: This review can serve as a source for defining input parameters for applying or refining biophysical models and to identify endpoints where additional radiobiological data are needed in order to reduce the uncertainties in proton RBE values to clinically acceptable levels.
Abstract: Proton therapy treatments are based on a proton RBE (relative biological effectiveness) relative to high-energy photons of 1.1. The use of this generic, spatially invariant RBE within tumors and normal tissues disregards the evidence that proton RBE varies with linear energy transfer (LET), physiological and biological factors, and clinical endpoint.
Based on the available experimental data from published literature, this review analyzes relationships of RBE with dose, biological endpoint and physical properties of proton beams. The review distinguishes between endpoints relevant for tumor control probability and those potentially relevant for normal tissue complication. Numerous endpoints and experiments on sub-cellular damage and repair effects are discussed.
Despite the large amount of data, considerable uncertainties in proton RBE values remain. As an average RBE for cell survival in the center of a typical spread-out Bragg peak (SOBP), the data support a value of ~1.15 at 2 Gy/fraction. The proton RBE increases with increasing LETd and thus with depth in an SOBP from ~1.1 in the entrance region, to ~1.15 in the center, ~1.35 at the distal edge and ~1.7 in the distal fall-off (when averaged over all cell lines, which may not be clinically representative). For small modulation widths the values could be higher. Furthermore, there is a trend of an increase in RBE as (α/β)x decreases. In most cases the RBE also increases with decreasing dose, specifically for systems with low (α/β)x. Data on RBE for endpoints other than clonogenic cell survival are too diverse to allow general statements other than that the RBE is, on average, in line with a value of ~1.1.
This review can serve as a source for defining input parameters for applying or refining biophysical models and to identify endpoints where additional radiobiological data are needed in order to reduce the uncertainties to clinically acceptable levels.
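The depth trend summarized in the abstract lends itself to a simple lookup. A minimal sketch: only the RBE values are quoted from the review; the normalized depth coordinates assigned to them here are illustrative assumptions, not the review's.

```python
import numpy as np

# Average RBE values quoted in the review at landmark SOBP positions
# (entrance, center, distal edge, distal fall-off) at 2 Gy/fraction.
# The normalized depth coordinates are illustrative assumptions.
positions = np.array([0.0, 0.5, 0.9, 1.0])
rbe_values = np.array([1.1, 1.15, 1.35, 1.7])

def rbe_at(depth_frac):
    """Linearly interpolate an average RBE at a normalized SOBP depth."""
    return float(np.interp(depth_frac, positions, rbe_values))

print(rbe_at(0.5))  # -> 1.15 (center of the SOBP)
```

Weighting a physical dose distribution by such a depth-dependent factor, instead of the generic 1.1, is the kind of variable-RBE modeling the review provides input data for.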

664 citations


Journal ArticleDOI
TL;DR: The research and development performed to obtain anatomical models that meet the requirements necessary for medical implant safety assessment applications are described, including implementation of quality control procedures, re-segmentation at higher resolution, more-consistent tissue assignments, enhanced surface processing and numerous anatomical refinements.
Abstract: The Virtual Family computational whole-body anatomical human models were originally developed for electromagnetic (EM) exposure evaluations, in particular to study how absorption of radiofrequency radiation from external sources depends on anatomy. However, the models immediately garnered much broader interest and are now applied by over 300 research groups, many from medical applications research fields. In a first step, the Virtual Family was expanded to the Virtual Population to provide considerably broader population coverage with the inclusion of models of both sexes ranging in age from 5 to 84 years old. Although these models have proven to be invaluable for EM dosimetry, it became evident that significantly enhanced models are needed for reliable effectiveness and safety evaluations of diagnostic and therapeutic applications, including medical implant safety. This paper describes the research and development performed to obtain anatomical models that meet the requirements necessary for medical implant safety assessment applications. These include implementation of quality control procedures, re-segmentation at higher resolution, more-consistent tissue assignments, enhanced surface processing and numerous anatomical refinements. Several tools were developed to enhance the functionality of the models, including discretization tools, posing tools to expand the posture space covered, and multiple morphing tools, e.g., to develop pathological models or variations of existing ones. A comprehensive tissue properties database was compiled to complement the library of models. The result is a set of independent, accurate and detailed anatomical models with smooth, yet feature-rich and topologically conforming surfaces. The models are therefore suited for the creation of unstructured meshes, and the possible applications of the models are extended to a wider range of solvers and physics.
The impact of these improvements is shown for the MRI exposure of an adult woman with an orthopedic spinal implant. Future developments include the functionalization of the models for specific physical and physiological modeling tasks.

355 citations


Journal ArticleDOI
TL;DR: The potential of 3D Ultrafast Ultrasound Imaging for the 3D mapping of stiffness, tissue motion, and flow in humans in vivo is demonstrated and promises new clinical applications of ultrasound with reduced intra- and inter-observer variability.
Abstract: Very high frame rate ultrasound imaging has recently allowed for the extension of the applications of echography to new fields of study such as the functional imaging of the brain, cardiac electrophysiology, and the quantitative imaging of the intrinsic mechanical properties of tumors, to name a few, non-invasively and in real time. In this study, we present the first implementation of Ultrafast Ultrasound Imaging in 3D based on the use of either diverging or plane waves emanating from a sparse virtual array located behind the probe. It achieves high contrast and resolution while maintaining imaging rates of thousands of volumes per second. A customized portable ultrasound system was developed to sample 1024 independent channels and to drive a 32 × 32 matrix-array probe. Its ability to track in 3D transient phenomena occurring in the millisecond range within a single ultrafast acquisition was demonstrated for 3D Shear-Wave Imaging, 3D Ultrafast Doppler Imaging, and, finally, 3D Ultrafast combined Tissue and Flow Doppler Imaging. The propagation of shear waves was tracked in a phantom and used to characterize its stiffness. 3D Ultrafast Doppler was used to obtain 3D maps of Pulsed Doppler, Color Doppler, and Power Doppler quantities in a single acquisition and revealed, at thousands of volumes per second, the complex 3D flow patterns occurring in the ventricles of the human heart during an entire cardiac cycle, as well as the 3D in vivo interaction of blood flow and wall motion during the pulse wave in the carotid at the bifurcation. This study demonstrates the potential of 3D Ultrafast Ultrasound Imaging for the 3D mapping of stiffness, tissue motion, and flow in humans in vivo and promises new clinical applications of ultrasound with reduced intra- and inter-observer variability.
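The "thousands of volumes per second" figure follows directly from the acoustic round-trip time: a single diverging- or plane-wave transmit insonifies the whole volume, so the volume rate is bounded only by the time for sound to reach the maximum imaging depth and return. A quick check, assuming a standard soft-tissue sound speed:

```python
C_TISSUE = 1540.0  # speed of sound in soft tissue (m/s), standard assumption

def max_volume_rate(depth_m):
    """Upper bound on volumes per second for one transmit per volume:
    limited by the acoustic round-trip time to the maximum depth."""
    return C_TISSUE / (2.0 * depth_m)

# At 10 cm imaging depth a single-transmit scheme supports thousands
# of volumes per second.
print(round(max_volume_rate(0.10)))  # -> 7700
```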

254 citations


Journal ArticleDOI
TL;DR: This review identifies a clear progression of computational phantom complexity which can be denoted by three distinct generations, and explains an unexpected finding that the phantoms from the past 50 years followed a pattern of exponential growth.
Abstract: Radiation dose calculation using models of the human anatomy has been a subject of great interest to radiation protection, medical imaging, and radiotherapy. However, early pioneers of this field did not foresee the exponential growth of research activity as observed today. This review article walks the reader through the history of the research and development in this field of study which started some 50 years ago. This review identifies a clear progression of computational phantom complexity which can be denoted by three distinct generations. The first generation of stylized phantoms, a group of fewer than a dozen models, was initially developed in the 1960s at Oak Ridge National Laboratory to calculate internal doses from nuclear medicine procedures. Despite their anatomical simplicity, these computational phantoms were the best tools available at the time for internal/external dosimetry, image evaluation, and treatment dose evaluations. A second generation of a large number of voxelized phantoms arose rapidly in the late 1980s as a result of the increased availability of tomographic medical imaging and computers. Surprisingly, the last decade saw the emergence of the third generation of phantoms which are based on advanced geometries called boundary representation (BREP) in the form of Non-Uniform Rational B-Splines (NURBS) or polygonal meshes. This new class of phantoms now consists of over 287 models including those used for non-ionizing radiation applications. This review article aims to provide the reader with a general understanding of how the field of computational phantoms came about and the technical challenges it faced at different times. This goal is achieved by defining basic geometry modeling techniques and by analyzing selected phantoms in terms of geometrical features and dosimetric problems to be solved.
The rich historical information is summarized in four tables that are aided by highlights in the text on how some of the most well-known phantoms were developed and used in practice. Some of the information covered in this review has not been previously reported, for example, the CAM and CAF phantoms developed in the 1970s for space radiation applications. The author also clarifies confusion about 'population-average' prospective dosimetry needed for radiological protection under the current ICRP radiation protection system and 'individualized' retrospective dosimetry often performed for medical physics studies. To illustrate the impact of computational phantoms, a section of this article is devoted to examples from the author's own research group. Finally, the author explains an unexpected finding made in the course of preparing this article: the phantoms from the past 50 years have followed a pattern of exponential growth. The review ends with a brief discussion of future research needs (a supplementary file '3DPhantoms.pdf' to figure 15 is available for download that will allow a reader to interactively visualize the phantoms in 3D).

235 citations


Journal ArticleDOI
TL;DR: Experimental results show that the present PWLS-TGV method can achieve images with several noticeable gains over the original TV-based method in terms of accuracy and resolution properties.
Abstract: Sparse-view CT reconstruction algorithms via total variation (TV) optimize the data iteratively on the basis of a noise- and artifact-reducing model, resulting in significant radiation dose reduction while maintaining image quality. However, the piecewise constant assumption of TV minimization often leads to the appearance of noticeable patchy artifacts in reconstructed images. To obviate this drawback, we present a penalized weighted least-squares (PWLS) scheme to retain the image quality by incorporating the new concept of total generalized variation (TGV) regularization. We refer to the proposed scheme as 'PWLS-TGV' for simplicity. Specifically, TGV regularization utilizes higher order derivatives of the objective image, and the weighted least-squares term considers data-dependent variance estimation, both of which contribute to improving image quality with sparse-view projection measurements. Subsequently, an alternating optimization algorithm was adopted to minimize the associated objective function. To evaluate the PWLS-TGV method, both qualitative and quantitative studies were conducted by using digital and physical phantoms. Experimental results show that the present PWLS-TGV method can achieve images with several noticeable gains over the original TV-based method in terms of accuracy and resolution properties.
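The PWLS idea is a weighted data-fidelity term plus a regularization penalty. The 1D toy below uses a plain first-order TV penalty as a stand-in; the paper's TGV prior adds higher-order derivative terms precisely to avoid TV's patchy, piecewise-constant bias. All names and values here are illustrative, not the authors' implementation:

```python
import numpy as np

def pwls_objective(x, y, A, w, beta):
    """Penalized weighted least-squares objective: each measurement is
    weighted by an inverse-variance estimate w, plus a regularizer.
    A first-order TV term stands in for the paper's TGV prior."""
    residual = y - A @ x
    data_term = np.sum(w * residual**2)
    tv_term = np.sum(np.abs(np.diff(x)))  # TGV would add 2nd-order terms
    return data_term + beta * tv_term

# Tiny example: identity forward model, uniform weights, exact fit.
A = np.eye(3)
y = np.array([1.0, 2.0, 3.0])
w = np.ones(3)
x = np.array([1.0, 2.0, 3.0])
print(pwls_objective(x, y, A, w, beta=0.1))  # data term 0, TV term 2 -> 0.2
```

The alternating minimization the paper describes would update the image and the auxiliary TGV variables in turn while decreasing an objective of this general shape.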

205 citations


Journal ArticleDOI
TL;DR: This review aims to present an overview of current concepts for both scanner output metrics and for patient dosimetry and will comment on their strengths and weaknesses.
Abstract: Radiation dose in x-ray computed tomography (CT) has become a topic of high interest due to the increasing numbers of CT examinations performed worldwide. This review aims to present an overview of current concepts for both scanner output metrics and for patient dosimetry and will comment on their strengths and weaknesses. Controversial issues such as the appropriateness of the CT dose index (CTDI) are discussed in detail. A review of approaches to patient dose assessment presently in practice, of the dose levels encountered, and of options for further dose optimization is also given and discussed. Patient dose assessment remains a topic for further improvement and for international consensus. All approaches presently in use are based on Monte Carlo (MC) simulations. Estimates for effective dose are established, but they are crude and not patient-specific; organ dose estimates are rarely available. Patient- and organ-specific dose estimates can be provided with adequate accuracy and independent of CTDI phantom measurements by fast MC simulations. Such information, in particular on 3D dose distributions, is important and helpful in optimization efforts. Dose optimization has been performed very successfully in recent years and has even resulted in applications with effective dose values below 1 mSv. In general, a trend towards lower dose values based on technical innovations has to be acknowledged. Effective dose values are down to clearly below 10 mSv on average, and there are a number of applications such as cardiac and pediatric CT which are performed routinely below 1 mSv on modern equipment.
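The standard scanner-output metrics the review discusses chain together as follows; the measurement values and the effective-dose conversion coefficient k below are illustrative, nominal numbers (this is exactly the kind of crude, non-patient-specific estimate the abstract cautions about):

```python
def ctdi_w(center, periphery):
    """Weighted CTDI from 100 mm pencil-chamber readings in the standard
    phantom: one third center plus two thirds periphery (mGy)."""
    return center / 3.0 + 2.0 * periphery / 3.0

def ctdi_vol(ctdi_weighted, pitch):
    """Volume CTDI: weighted CTDI divided by the helical pitch."""
    return ctdi_weighted / pitch

def effective_dose(dlp, k):
    """Crude effective-dose estimate: dose-length product (mGy*cm)
    times a body-region conversion coefficient k (mSv per mGy*cm)."""
    return dlp * k

cw = ctdi_w(center=10.0, periphery=13.0)   # illustrative readings -> 12.0 mGy
cv = ctdi_vol(cw, pitch=1.2)               # -> 10.0 mGy
dlp = cv * 30.0                            # 30 cm scan length -> 300 mGy*cm
print(effective_dose(dlp, k=0.014))        # nominal chest-region k -> 4.2 mSv
```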

191 citations



Journal ArticleDOI
TL;DR: An experimental verification of stopping-power-ratio (SPR) prediction from dual energy CT (DECT) with potential use for dose planning in proton and ion therapy is presented.
Abstract: We present an experimental verification of stopping-power-ratio (SPR) prediction from dual energy CT (DECT) with potential use for dose planning in proton and ion therapy. The approach is based on DECT images converted to electron density relative to water ρe/ρe,w and effective atomic number Zeff. To establish a parameterization of the I-value by Zeff, 71 tabulated tissue compositions were used. For the experimental assessment of the method we scanned 20 materials (tissue surrogates, polymers, aluminum, titanium) at 80/140Sn kVp and 100/140Sn kVp (Sn: additional tin filtration) and computed the ρe/ρe,w and Zeff with a purely image based algorithm. Thereby, we found that ρe/ρe,w (Zeff) could be determined with an accuracy of 0.4% (1.7%) for the tissue surrogates with known elemental compositions. SPRs were predicted from DECT images for all 20 materials using the presented approach and were compared to measured water-equivalent path lengths (closely related to SPR). For the tissue surrogates the presented DECT approach was found to predict the experimental values within 0.6%, for aluminum and titanium within an accuracy of 1.7% and 9.4% (from 16-bit reconstructed DECT images).
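The quantities ρe/ρe,w and the I-value (parameterized by Zeff) feed the SPR through the Bethe stopping-power formula. A simplified sketch without shell or density corrections; the material values below are illustrative, not the paper's surrogates:

```python
import math

ME_C2_EV = 0.511e6   # electron rest energy (eV)
I_WATER_EV = 75.0    # mean excitation energy of water (eV), a common choice

def spr(rho_e_rel, i_ev, beta):
    """Stopping-power ratio to water from the Bethe formula: relative
    electron density times the ratio of the stopping numbers.
    A sketch of the conversion a DECT rho_e/Z_eff -> I-value
    pipeline feeds into, not the paper's implementation."""
    def stopping_number(i):
        x = 2.0 * ME_C2_EV * beta**2 / (i * (1.0 - beta**2))
        return math.log(x) - beta**2
    return rho_e_rel * stopping_number(i_ev) / stopping_number(I_WATER_EV)

# Illustrative soft tissue (rho_e_rel ~ 1.04, I ~ 70 eV) for a proton
# with beta ~ 0.57 (roughly 150 MeV): SPR ~ 1.05.
print(round(spr(1.04, 70.0, beta=0.57), 3))
```

Because the I-value enters only logarithmically, the percent-level I-value errors reported in such studies translate into much smaller SPR errors.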

175 citations


Journal ArticleDOI
TL;DR: The different instrumentation, methodological approaches and schema for inverse image reconstructions for optical tomography, including luminescence and fluorescence modalities, are summarized, and limitations and key technological advances needed for further discovery research and translation are commented on.
Abstract: Emerging fluorescence and bioluminescence tomography approaches share several features with the established emission tomographies of PET and SPECT, yet differ from them in important ways. Although both nuclear and optical imaging modalities involve counting of photons, nuclear imaging techniques collect the emitted high energy (100–511 keV) photons after radioactive decay of radionuclides, while optical techniques count low-energy (1.5–4.1 eV) photons that are scattered and absorbed by tissues, requiring models of light transport for quantitative image reconstruction. Fluorescence imaging has recently been translated into the clinic, demonstrating high sensitivity, modest tissue penetration depth, and fast, millisecond image acquisition times. As a consequence, the promise of quantitative optical tomography as a complement to small animal PET and SPECT remains high. In this review, we summarize the different instrumentation, methodological approaches and schema for inverse image reconstructions for optical tomography, including luminescence and fluorescence modalities, and comment on limitations and key technological advances needed for further discovery research and translation.
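For orientation, the quoted 1.5–4.1 eV photon energies convert to wavelengths via λ = hc/E, spanning roughly the near-infrared to near-UV:

```python
HC_EV_NM = 1239.84  # h*c in eV*nm

def wavelength_nm(energy_ev):
    """Photon wavelength from photon energy: lambda = hc / E."""
    return HC_EV_NM / energy_ev

# The 1.5-4.1 eV range corresponds to roughly 827 nm down to 302 nm.
print(round(wavelength_nm(1.5)), round(wavelength_nm(4.1)))  # -> 827 302
```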

170 citations


Journal ArticleDOI
TL;DR: In this study, shear wave velocity dispersion was measured in vivo in ten Achilles tendons parallel and perpendicular to the tendon fibre orientation, and it is shown that parallel to the fibres the shear wave velocity dispersion is not influenced by viscosity, while perpendicular to the fibres it is.
Abstract: Non-invasive evaluation of the Achilles tendon elastic properties may enhance diagnosis of tendon injury and the assessment of recovery treatments. Shear wave elastography has been shown to be a powerful tool for estimating tissue mechanical properties. However, its applicability to quantitative evaluation of tendon stiffness is limited by our understanding of the physics of shear wave propagation in such a complex medium. First, tendon tissue is transverse isotropic. Second, tendons are characterized by a marked stiffness in the 400 to 1300 kPa range (i.e. fast shear waves). Hence, the shear wavelengths are greater than the tendon thickness, leading to guided wave propagation. Thus, to better understand shear wave propagation in tendons and consequently to properly estimate their mechanical properties, a dispersion analysis is required. In this study, shear wave velocity dispersion was measured in vivo in ten Achilles tendons parallel and perpendicular to the tendon fibre orientation. By modelling the tendon as a transverse isotropic viscoelastic plate immersed in fluid it was possible to fully describe the experimental data (deviation < 1.4%). We show that parallel to the fibres the shear wave velocity dispersion is not influenced by viscosity, while perpendicular to the fibres it is. Elasticity (found to be in the range from 473 to 1537 kPa) and viscosity (found to be in the range from 1.7 to 4 Pa·s) values were retrieved from the model, in good agreement with previously reported results.
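In a bulk elastic medium the link between the reported shear moduli and shear wave speeds is c = sqrt(μ/ρ). A sketch assuming a nominal tissue density (in the thin tendon itself the measured speed is dispersive, which is exactly why the authors fit a guided-wave plate model rather than using this bulk relation directly):

```python
import math

RHO = 1000.0  # tissue density (kg/m^3), a nominal assumed value

def shear_wave_speed(mu_pa):
    """Bulk shear-wave speed in an elastic medium: c = sqrt(mu / rho)."""
    return math.sqrt(mu_pa / RHO)

# The reported tendon elasticity range of 473-1537 kPa corresponds to
# bulk speeds of roughly 22-39 m/s.
print(round(shear_wave_speed(473e3), 1), round(shear_wave_speed(1537e3), 1))
```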

167 citations


Journal ArticleDOI
TL;DR: Results show that quantitative prompt gamma-ray measurements enable knowledge of nuclear reaction cross sections to be used for precise proton range verification in the presence of tissue with an unknown composition.
Abstract: We present an experimental study of a novel method to verify the range of proton therapy beams. Differential cross sections were measured for 15 prompt gamma-ray lines from proton-nuclear interactions with ¹²C and ¹⁶O at proton energies up to 150 MeV. These cross sections were used to model discrete prompt gamma-ray emissions along proton pencil-beams. By fitting detected prompt gamma-ray counts to these models, we simultaneously determined the beam range and the oxygen and carbon concentration of the irradiated matter. The performance of the method was assessed in two phantoms with different elemental concentrations, using a small scale prototype detector. Based on five pencil-beams with different ranges delivering 5 × 10⁸ protons and without prior knowledge of the elemental composition at the measurement point, the absolute range was determined with a standard deviation of 1.0–1.4 mm. Relative range shifts at the same dose level were detected with a standard deviation of 0.3–0.5 mm. The determined oxygen and carbon concentrations also agreed well with the actual values. These results show that quantitative prompt gamma-ray measurements enable knowledge of nuclear reaction cross sections to be used for precise proton range verification in the presence of tissue with an unknown composition.

Journal ArticleDOI
TL;DR: This paper proposes a setup, based on clinical irradiation conditions, capable of determining proton range deviations within a few seconds of irradiation, thus allowing for a fast safety survey, and describes model calculations that very precisely reproduce the experimental results.
Abstract: Proton and ion beams open up new vistas for the curative treatment of tumors, but adequate technologies for monitoring the compliance of dose delivery with treatment plans in real time are still missing. Range assessment, meaning the monitoring of therapy-particle ranges in tissue during dose delivery (treatment), is a continuous challenge considered a key for tapping the full potential of particle therapies. In this context the paper introduces an unconventional concept of range assessment by prompt-gamma timing (PGT), which is based on an elementary physical effect not considered so far: therapy particles penetrating tissue move very fast, but still need a finite transit time (about 1–2 ns in the case of protons with a 5–20 cm range) from entering the patient's body until stopping in the target volume. The transit time increases with the particle range. This causes measurable effects in PGT spectra, usable for range verification. The concept was verified by proton irradiation experiments at the AGOR cyclotron, KVI-CART, University of Groningen. Based on the presented kinematical relations, we describe model calculations that very precisely reproduce the experimental results. As the clinical treatment conditions entail measurement constraints (e.g. limited treatment time), we propose a setup, based on clinical irradiation conditions, capable of determining proton range deviations within a few seconds of irradiation, thus allowing for a fast safety survey. Range variations of 2 mm are expected to be clearly detectable.
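The quoted nanosecond transit time can be checked with a short numerical integration. This sketch uses a Bragg-Kleeman range-energy fit for water (α ≈ 0.0022 cm·MeV⁻ᵖ, p ≈ 1.77 are approximate textbook values, not taken from the paper):

```python
import math

M_P = 938.272   # proton rest energy (MeV)
C = 29.9792     # speed of light (cm/ns)
ALPHA, P = 0.0022, 1.77   # Bragg-Kleeman fit for water (approximate)

def transit_time_ns(energy_mev, steps=50000):
    """Transit time of a proton slowing to rest in water. The residual
    energy at depth x follows from the Bragg-Kleeman relation,
    E(x) = ((R - x)/alpha)^(1/p), and the speed from relativistic
    kinematics, beta = sqrt(1 - (m/(m+E))^2). The midpoint rule avoids
    the integrable singularity at the end of the track."""
    r = ALPHA * energy_mev**P              # range (cm)
    dx = r / steps
    t = 0.0
    for i in range(steps):
        x = (i + 0.5) * dx
        e = ((r - x) / ALPHA) ** (1.0 / P)
        beta = math.sqrt(1.0 - (M_P / (M_P + e)) ** 2)
        t += dx / (beta * C)
    return t

# A ~160 MeV proton: range ~17.5 cm, transit time ~1.5 ns, consistent
# with the 1-2 ns figure quoted for 5-20 cm ranges.
print(round(ALPHA * 160**P, 1), round(transit_time_ns(160.0), 2))
```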

Journal ArticleDOI
TL;DR: This topical review provides an up-to-date overview of the theoretical and practical aspects of therapeutic kilovoltage x-ray beam dosimetry and provides recommendations for clinical measurements based on the literature to date.
Abstract: This topical review provides an up-to-date overview of the theoretical and practical aspects of therapeutic kilovoltage x-ray beam dosimetry. Kilovoltage x-ray beams have the property that the maximum dose occurs very close to the surface, and thus they are predominantly used in the treatment of skin cancers but also have applications for the treatment of other cancers. In addition, kilovoltage x-ray beams are used in intraoperative units, within animal irradiators and in on-board imagers on linear accelerators, and kilovoltage dosimetry is important in these applications as well. This review covers both reference and relative dosimetry of kilovoltage x-ray beams and provides recommendations for clinical measurements based on the literature to date. In particular, practical aspects of the selection of dosimeter and phantom material are reviewed to provide suitable advice for medical physicists. An overview is also presented of dosimeters other than ionization chambers which can be used for both relative and in vivo dosimetry. Finally, issues related to treatment planning and the use of Monte Carlo codes for solving radiation transport problems in kilovoltage x-ray beams are presented.

Journal ArticleDOI
TL;DR: This article briefly introduces the general physical characteristics and properties that are commonly used to describe the behaviour and performance of both discrete and imaging detectors for x-ray, nuclear medicine and ion beam imaging and dosimetry.
Abstract: The enormous advances in the understanding of human anatomy, physiology and pathology in recent decades have led to ever-improving methods of disease prevention, diagnosis and treatment. Many of these achievements have been enabled, at least in part, by advances in ionizing radiation detectors. Radiology has been transformed by the implementation of multi-slice CT and digital x-ray imaging systems, with silver halide films now largely obsolete for many applications. Nuclear medicine has benefited from more sensitive, faster and higher-resolution detectors delivering ever-higher SPECT and PET image quality. PET/MR systems have been enabled by the development of gamma ray detectors that can operate in high magnetic fields. These huge advances in imaging have enabled equally impressive steps forward in radiotherapy delivery accuracy, with 4DCT, PET and MRI routinely used in treatment planning and online image guidance provided by cone-beam CT.
The challenge of ensuring safe, accurate and precise delivery of highly complex radiation fields has also both driven and benefited from advances in radiation detectors. Detector systems have been developed for the measurement of electron, intensity-modulated and modulated arc x-ray, proton and ion beams, and around brachytherapy sources, based on a very wide range of technologies. The types of measurement performed are equally wide, encompassing commissioning and quality assurance, reference dosimetry, in vivo dosimetry and personal and environmental monitoring.
In this article, we briefly introduce the general physical characteristics and properties that are commonly used to describe the behaviour and performance of both discrete and imaging detectors. The physical principles of operation of calorimeters; ionization and charge detectors; semiconductor, luminescent, scintillating and chemical detectors; and radiochromic and radiographic films are then reviewed and their principal applications discussed.
Finally, a general discussion of the application of detectors for x-ray, nuclear medicine and ion beam imaging and dosimetry is presented.

Journal ArticleDOI
TL;DR: The current applications of GPU in major imaging-related and therapy-related problems encountered in radiotherapy are reviewed and a comparison of GPU with other platforms will also be presented.
Abstract: Recent developments in radiation therapy demand high computational power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A large number of studies have been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this paper, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of GPU in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of GPU with other platforms will also be presented.

Journal ArticleDOI
TL;DR: GNPs have the potential to enhance radiation therapy depending on the type of radiation source and the mechanism by which GNPs can lead to dose enhancements in radiation therapy differs when comparing photon and proton radiation.
Abstract: Gold nanoparticles (GNPs) have shown potential to be used as a radiosensitizer for radiation therapy. Despite extensive research activity to study GNP radiosensitization using photon beams, only a few studies have been carried out using proton beams. In this work Monte Carlo simulations were used to assess the dose enhancement of GNPs for proton therapy. The enhancement effect was compared between a clinical proton spectrum, a clinical 6 MV photon spectrum, and a kilovoltage photon source similar to those used in many radiobiology lab settings. We showed that the mechanism by which GNPs can lead to dose enhancements in radiation therapy differs when comparing photon and proton radiation. The GNP dose enhancement using protons can be up to 14 and is independent of proton energy, while the dose enhancement is highly dependent on the photon energy used. For the same amount of energy absorbed in the GNP, interactions with protons, kVp photons and MV photons produce similar doses within several nanometers of the GNP surface, and differences are below 15% for the first 10 nm. However, secondary electrons produced by kilovoltage photons have the longest range in water as compared to protons and MV photons, e.g. they cause a dose enhancement 20 times higher than the one caused by protons 10 μm away from the GNP surface. We conclude that GNPs have the potential to enhance radiation therapy depending on the type of radiation source. Proton therapy can be enhanced significantly only if the GNPs are in close proximity to the biological target.

Journal ArticleDOI
TL;DR: It is shown that with a rigorous definition of the EAN, the stoichiometric calibration method can be successfully adapted to DECT with significant accuracy improvements with respect to the literature without the need for spectrum measurements or empirical beam hardening corrections.
Abstract: The accuracy of radiotherapy dose calculation relies crucially on patient composition data. The computed tomography (CT) calibration methods based on the stoichiometric calibration of Schneider et al (1996 Phys. Med. Biol. 41 111–24) are the most reliable to determine electron density (ED) with commercial single energy CT scanners. Along with the recent developments in dual energy CT (DECT) commercial scanners, several methods were published to determine ED and the effective atomic number (EAN) for polyenergetic beams without the need for CT calibration curves. This paper intends to show that with a rigorous definition of the EAN, the stoichiometric calibration method can be successfully adapted to DECT with significant accuracy improvements with respect to the literature without the need for spectrum measurements or empirical beam hardening corrections. Using a theoretical framework of ICRP human tissue compositions and the XCOM photon cross sections database, the revised stoichiometric calibration method yields Hounsfield unit (HU) predictions to within ±1.3 HU of the theoretical HU calculated from XCOM data averaged over the spectra used (e.g., 80 kVp, 100 kVp, 140 kVp and 140/Sn kVp). A fit of mean excitation energy (I-value) data as a function of EAN is provided in order to determine the ion stopping power of human tissues from ED–EAN measurements. Analysis of the calibration phantom measurements with the Siemens SOMATOM Definition Flash dual source CT scanner shows that the present formalism yields mean absolute errors of (0.3 ± 0.4)% and (1.6 ± 2.0)% on ED and EAN, respectively. For ion therapy, the mean absolute errors for calibrated I-values and proton stopping powers (216 MeV) are (4.1 ± 2.7)% and (0.5 ± 0.4)%, respectively.
In all clinical situations studied, the uncertainties in ion ranges in water for therapeutic energies are found to be less than 1.3 mm, 0.7 mm and 0.5 mm for protons, helium and carbon ions respectively, using a generic reconstruction algorithm (filtered back projection). With a more advanced method (sinogram affirmed iterative technique), the values become 1.0 mm, 0.5 mm and 0.4 mm for protons, helium and carbon ions, respectively. These results allow one to conclude that the present adaptation of the stoichiometric calibration yields a highly accurate method for characterizing tissue with DECT for ion beam therapy and potentially for photon beam therapy.
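The paper's insistence on a "rigorous definition of the EAN" matters because the common textbook definition is an electron-fraction-weighted power mean of Z with an ad hoc exponent. The sketch below implements that simple exponent form for contrast; m = 3.1 is a conventional photon-CT choice, not the paper's spectrum-dependent definition:

```python
def effective_atomic_number(weights, z_list, a_list, m=3.1):
    """Exponent-based EAN: power mean of Z over electron fractions
    lambda_i proportional to w_i * Z_i / A_i. The exponent m is a
    modelling choice; m ~ 3 is a common photon-CT convention."""
    lam = [w * z / a for w, z, a in zip(weights, z_list, a_list)]
    total = sum(lam)
    lam = [l / total for l in lam]
    return sum(l * z**m for l, z in zip(lam, z_list)) ** (1.0 / m)

# Water: 11.19% H (Z=1, A=1.008) and 88.81% O (Z=8, A=15.999) by mass
# gives the familiar EAN of about 7.4 for this exponent.
print(round(effective_atomic_number([0.1119, 0.8881], [1, 8], [1.008, 15.999]), 1))
```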

Journal ArticleDOI
TL;DR: The results support the hypothesis that the cortex is most sensitive to fields oriented perpendicular to the cortical layers, while it is relatively insensitive to fields parallel to them, which has important implications for targeting of TMS.
Abstract: Responses elicited by transcranial magnetic stimulation (TMS) over the hand motor area depend on the position and orientation of the stimulating coil. In this work, we computationally investigate the induced electric field for multiple coil orientations and locations in order to determine which parts of the brain are affected and how the sensitivity of motor cortical activation depends on the direction of the electric field. The finite element method is used for calculating the electric field induced by TMS in two individual anatomical models of the head and brain. The orientation of the coil affects both the strength and depth of penetration of the electric field, and the field strongly depends on the direction of the sulcus, where the target neurons are located. The coil position that gives the strongest electric field in the target cortical region may deviate from the closest scalp location by a distance on the order of 1 cm. Together with previous experimental data, the results support the hypothesis that the cortex is most sensitive to fields oriented perpendicular to the cortical layers, while it is relatively insensitive to fields parallel to them. This has important implications for targeting of TMS. To determine the most effective coil position and orientation, it is essential to consider both biological (the direction of the targeted axons) and physical factors (the strength and direction of the electric field).

Journal ArticleDOI
TL;DR: For x-ray photons the light emission would be optimally suited for narrow beam stereotactic radiation therapy and surgery validation studies, for verification of dynamic intensity-modulated and volumetric modulated arc therapy treatment plans in water tanks, near monoenergetic sources and also for entrance and exit surface imaging dosimetry of both narrow and broad beams.
Abstract: Recent studies have proposed that light emitted by the Cherenkov effect may be used for a number of radiation therapy dosimetry applications. There is a correlation between the captured light and expected dose under certain conditions, yet discrepancies have also been observed and a complete examination of the theoretical differences has not been done. In this study, a fundamental comparison between the Cherenkov emission and absorbed dose was explored for x-ray photons, electrons, and protons using both a theoretical and Monte Carlo-based analysis. Based on the findings of where dose correlates with Cherenkov emission, it was concluded that for x-ray photons the light emission would be optimally suited for narrow beam stereotactic radiation therapy and surgery validation studies, for verification of dynamic intensity-modulated and volumetric modulated arc therapy treatment plans in water tanks, near monoenergetic sources (e.g., Co-60 and brachytherapy sources) and also for entrance and exit surface imaging dosimetry of both narrow and broad beams. For electron use, Cherenkov emission was found to be only suitable for surface dosimetry applications. Finally, for proton dosimetry, there exists a fundamental lack of Cherenkov emission at the Bragg peak, making the technique of little use, although post-irradiation detection of light emission from radioisotopes could prove to be useful.
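The "fundamental lack of Cherenkov emission at the Bragg peak" follows directly from the emission threshold β > 1/n. A short sketch (assuming water with refractive index n ≈ 1.33) computes the threshold kinetic energies for electrons and protons:

```python
import math

def cherenkov_threshold_mev(rest_mass_mev, n=1.33):
    """Kinetic energy (MeV) above which a charged particle emits
    Cherenkov light in a medium of refractive index n; the
    threshold condition is beta = 1/n."""
    beta = 1.0 / n
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    return (gamma - 1.0) * rest_mass_mev

# Electrons (m_e c^2 = 0.511 MeV) radiate above ~0.26 MeV in water,
# while protons (m_p c^2 = 938.3 MeV) need ~480 MeV -- far above
# the energies found near a therapeutic Bragg peak.
print(round(cherenkov_threshold_mev(0.511), 3))    # 0.264
print(round(cherenkov_threshold_mev(938.272), 1))  # 484.9
```

This is why therapeutic protons, which stop well below the ~480 MeV threshold, produce no Cherenkov light in the Bragg peak region, whereas the secondary electrons set in motion by megavoltage photon beams are largely above their ~0.26 MeV threshold.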

Journal ArticleDOI
TL;DR: It is concluded that the acquisition of prompt gamma profiles for in vivo range verification of proton beam with the developed gamma camera and a slit collimator is feasible in clinical conditions.
Abstract: In this work, we present experimental results of a prompt gamma camera for real-time proton beam range verification. The detection system features a pixelated Cerium doped lutetium based scintillation crystal, coupled to Silicon PhotoMultiplier arrays, read out by dedicated electronics. The prompt gamma camera uses a knife-edge slit collimator to produce a 1D projection of the beam path in the target on the scintillation detector. We designed the detector to provide high counting statistics and high photo-detection efficiency for prompt gamma rays of several MeV. The slit design favours the counting statistics and could be advantageous in terms of simplicity, reduced cost and limited footprint. We present the description of the realized gamma camera, as well as the results of the characterization of the camera itself in terms of imaging performance. We also present the results of experiments in which a polymethyl methacrylate phantom was irradiated with proton pencil beams in a proton therapy center. A tungsten slit collimator was used and prompt gamma rays were acquired in the 3–6 MeV energy range. The acquisitions were performed with the beam operated at 100 MeV, 160 MeV and 230 MeV, with beam currents at the nozzle exit of several nA. Measured prompt gamma profiles are consistent with the simulations and we reached a precision (2σ) in shift retrieval of 4 mm with 0.5 × 10⁸, 1.4 × 10⁸ and 3.4 × 10⁸ protons at 100, 160 and 230 MeV, respectively. We conclude that the acquisition of prompt gamma profiles for in vivo range verification of proton beam with the developed gamma camera and a slit collimator is feasible in clinical conditions. The compact design of the camera allows its integration in a proton therapy treatment room and further studies will be undertaken to validate the use of this detection system during treatment of real patients.

Journal ArticleDOI
TL;DR: It is concluded that the currently used generic range uncertainty margins in proton therapy should be redefined site specific and that complex geometries may require a field specific adjustment.
Abstract: The purpose of this study was to assess the possibility of introducing site-specific range margins to replace current generic margins in proton therapy. Further, the goal was to study the potential of reducing margins with current analytical dose calculation methods. For this purpose we investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict the range of proton fields. Dose distributions predicted by an analytical pencil-beam algorithm were compared with those obtained using Monte Carlo (MC) simulations (TOPAS). A total of 508 passively scattered treatment fields were analyzed for seven disease sites (liver, prostate, breast, medulloblastoma–spine, medulloblastoma–whole brain, lung and head and neck). Voxel-by-voxel comparisons were performed on two-dimensional distal dose surfaces calculated by pencil-beam and MC algorithms to obtain the average range differences and root mean square deviation for each field for the distal position of the 90% dose level (R90) and the 50% dose level (R50). The average dose degradation of the distal falloff region, defined as the distance between the distal positions of the 80% and 20% dose levels (R80 − R20), was also analyzed. All ranges were calculated in water-equivalent distances. Considering total range uncertainties and uncertainties from dose calculation alone, we were able to deduce site-specific estimations. For liver, prostate and whole brain fields our results demonstrate that a reduction of currently used uncertainty margins is feasible even without introducing MC dose calculations. We recommend range margins of 2.8% + 1.2 mm for liver and prostate treatments and 3.1% + 1.2 mm for whole brain treatments, respectively. On the other hand, current margins seem to be insufficient for some breast, lung and head and neck patients, at least if used generically.
If no case-specific adjustments are applied, a generic margin of 6.3% + 1.2 mm would be needed for breast, lung and head and neck treatments. We conclude that the currently used generic range uncertainty margins in proton therapy should be redefined site-specifically and that complex geometries may require a field-specific adjustment. Routine verification of treatment plans using MC simulations is recommended for patients with heterogeneous geometries.
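The margin recipes quoted above combine a relative term (a percentage of the beam range) with a fixed absolute term. A minimal helper (the function name is hypothetical, for illustration only) evaluates such a recipe for a given water-equivalent range:

```python
def range_margin_mm(range_mm, rel_pct, abs_mm):
    """Total distal range margin: a relative term (percent of the
    water-equivalent beam range) plus a fixed absolute term."""
    return range_mm * rel_pct / 100.0 + abs_mm

# E.g. a 150 mm water-equivalent prostate field with the
# recommended 2.8% + 1.2 mm recipe:
print(range_margin_mm(150.0, 2.8, 1.2))  # 5.4 (mm)
```

The same field evaluated with the generic 6.3% + 1.2 mm recipe would need 10.65 mm, which illustrates how much margin the site-specific recommendation saves for geometrically simple sites.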

Journal ArticleDOI
TL;DR: The importance of increasing linac specific QC when phantom-less methodologies, such as the use of log files, are used to reduce patient specific QC is shown.
Abstract: This work investigated the differences between multileaf collimator (MLC) positioning accuracy determined using either log files or electronic portal imaging devices (EPID) and then assessed the possibility of reducing patient specific quality control (QC) via phantom-less methodologies. In-house software was developed, and validated, to track MLC positional accuracy with the rotational and static gantry picket fence tests using an integrated electronic portal image. This software was used to monitor MLC daily performance over a 1 year period for two Varian TrueBeam linear accelerators, with the results directly compared with MLC positions determined using leaf trajectory log files. This software was validated by introducing known shifts and collimator errors. Skewness of the MLCs was found to be 0.03 ± 0.06° (mean ±1 standard deviation (SD)) and was dependent on whether the collimator was rotated manually or automatically. Trajectory log files, analysed using in-house software, showed average MLC positioning errors with a magnitude of 0.004 ± 0.003 mm (rotational) and 0.004 ± 0.011 mm (static) across two TrueBeam units over 1 year (mean ±1 SD). These ranges, as indicated by the SD, were lower than the related average MLC positioning errors of 0.000 ± 0.025 mm (rotational) and 0.000 ± 0.039 mm (static) that were obtained using the in-house EPID based software. The range of EPID measured MLC positional errors was larger due to the inherent uncertainties of the procedure. Over the duration of the study, multiple MLC positional errors were detected using the EPID based software but these same errors were not detected using the trajectory log files. This work shows the importance of increasing linac specific QC when phantom-less methodologies, such as the use of log files, are used to reduce patient specific QC. Tolerances of 0.25 mm have been created for the MLC positional errors using the EPID-based automated picket fence test. 
The software allows diagnosis of any specific leaf that needs repair and gives an indication as to the course of action that is required.
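A tolerance check of the kind described, flagging individual leaves whose EPID-measured positions deviate from planned positions by more than the 0.25 mm tolerance, can be sketched as follows (a hypothetical illustration, not the authors' software):

```python
def failing_leaves(planned_mm, measured_mm, tol_mm=0.25):
    """Return the indices of MLC leaves whose measured position
    deviates from the planned position by more than tol_mm."""
    return [i for i, (p, m) in enumerate(zip(planned_mm, measured_mm))
            if abs(m - p) > tol_mm]

# Four leaves planned at the picket position, with invented
# measured offsets; leaves 1 and 3 exceed the 0.25 mm tolerance.
planned = [0.0, 0.0, 0.0, 0.0]
measured = [0.05, -0.31, 0.10, 0.26]
print(failing_leaves(planned, measured))  # [1, 3]
```

Running such a per-leaf comparison on every picket in the daily EPID image is what allows the software to point at the specific leaf needing repair.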

Journal ArticleDOI
TL;DR: An overview of the theory of 4D image reconstruction for emission tomography is given, and maximum likelihood or maximum a posteriori (MAP) estimation of either linear or non-linear model parameters can be achieved in image space after carrying out a conventional expectation maximization update of the dynamic image series, using a Kullback-Leibler distance metric.
Abstract: An overview of the theory of 4D image reconstruction for emission tomography is given along with a review of the current state of the art, covering both positron emission tomography and single photon emission computed tomography (SPECT). By viewing 4D image reconstruction as a matter of either linear or non-linear parameter estimation for a set of spatiotemporal functions chosen to approximately represent the radiotracer distribution, the areas of so-called 'fully 4D' image reconstruction and 'direct kinetic parameter estimation' are unified within a common framework. Many choices of linear and non-linear parameterization of these functions are considered (including the important case where the parameters have direct biological meaning), along with a review of the algorithms which are able to estimate these often non-linear parameters from emission tomography data. The other crucial components to image reconstruction (the objective function, the system model and the raw data format) are also covered, but in less detail due to the relatively straightforward extension from their corresponding components in conventional 3D image reconstruction. The key unifying concept is that maximum likelihood or maximum a posteriori (MAP) estimation of either linear or non-linear model parameters can be achieved in image space after carrying out a conventional expectation maximization (EM) update of the dynamic image series, using a Kullback-Leibler distance metric (comparing the modeled image values with the EM image values), to optimize the desired parameters. For MAP, an image-space penalty for regularization purposes is required. The benefits of 4D and direct reconstruction reported in the literature are reviewed, and furthermore demonstrated with simple simulation examples. 
It is clear that the future of reconstructing dynamic or functional emission tomography images, which often exhibit high levels of spatially correlated noise, should ideally exploit these 4D approaches.
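The conventional EM update that underpins the image-space parameter estimation described above is the multiplicative MLEM step. A minimal dense-matrix sketch on a toy two-pixel problem (illustrative only; real scanners use sparse or on-the-fly system models and noisy count data):

```python
import numpy as np

def mlem_update(x, A, y, n_iter):
    """Multiplicative MLEM iterations for emission data:
    x <- x / s * A^T (y / (A x)), with sensitivity s = A^T 1."""
    sens = A.T @ np.ones(len(y))
    for _ in range(n_iter):
        x = x / sens * (A.T @ (y / (A @ x)))
    return x

# Toy two-pixel, two-measurement problem with noiseless data.
A = np.array([[1.0, 0.5],
              [0.5, 1.0]])
x_true = np.array([4.0, 2.0])
y = A @ x_true
x = mlem_update(np.ones(2), A, y, n_iter=500)
print(np.round(x, 2))  # converges to [4. 2.]
```

In the 4D framework reviewed here, each temporal frame would receive such an update, after which the spatiotemporal model parameters are fitted to the updated image series by minimizing a Kullback-Leibler distance in image space.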

Journal ArticleDOI
TL;DR: This work presents the first results with a high-resolution preclinical PET scanner based on thin monolithic scintillators and a large solid angle dedicated to rat-brain imaging and has a very compact geometry.
Abstract: A new preclinical PET system based on dSiPMs, called DigiPET, is presented. The system is based on thin monolithic scintillation crystals and exhibits superior spatial resolution at low cost compared to systems based on pixelated crystals. Current dedicated small-rodent PET scanners have a spatial resolution on the order of 1 mm. Most of them have a large footprint, requiring considerable laboratory space. For rodent brain imaging, a PET scanner with sub-millimeter resolution is desired. To achieve this, crystals with a pixel pitch down to 0.5 mm have been used. However, fine pixels are difficult to produce and will render systems expensive. In this work, we present the first results with a high-resolution preclinical PET scanner based on thin monolithic scintillators and a large solid angle. The design is dedicated to rat-brain imaging and therefore has a very compact geometry. Four detectors were placed in a square arrangement with a distance of 34.5 mm between two opposing detector modules, defining a field of view (FOV) of 32 × 32 × 32 mm3. Each detector consists of a thin monolithic LYSO crystal of 32 × 32 × 2 mm3 optically coupled to a digital silicon photomultiplier (dSiPM). Event positioning within each detector was obtained using the maximum likelihood estimation (MLE) method. To evaluate the system performance, we measured the energy resolution, coincidence resolving time (CRT), sensitivity and spatial resolution. The image quality was evaluated by imaging a hot-rod phantom filled with 18F-FDG and a rat head one hour after an 18F-FDG injection. The MLE yielded an average intrinsic spatial resolution on the detector of 0.54 mm FWHM. We obtained a CRT of 680 ps and an energy resolution of 18% FWHM at 511 keV. The sensitivity and spatial resolution obtained at the center of the FOV were 6.0 cps kBq−1 and 0.7 mm, respectively. In the reconstructed images of the hot-rod phantom, hot rods down to 0.7 mm can be discriminated.
In conclusion, a compact PET scanner was built using dSiPM technology and thin monolithic LYSO crystals. Excellent spatial resolution and acceptable sensitivity were demonstrated. Promising results were also obtained in a hot-rod phantom and in rat-brain imaging.

Journal ArticleDOI
TL;DR: Stronger tissues with higher ultimate stress, higher density, and lower water content were more resistant to histotripsy damage in comparison to weaker tissues, and a self-limiting vessel-sparing treatment strategy was developed in an attempt to preserve major vessels while fractionating the surrounding target tissue.
Abstract: Histotripsy is a non-invasive tissue ablation method capable of fractionating tissue by controlling acoustic cavitation. To determine the fractionation susceptibility of various tissues, we investigated histotripsy-induced damage on tissue phantoms and ex vivo tissues with different mechanical strengths. A histotripsy bubble cloud was formed at tissue phantom surfaces using 5-cycle-long ultrasound pulses with a peak negative pressure of 18 MPa and pulse repetition frequencies (PRFs) of 10, 100, and 1000 Hz. Results showed that significantly smaller lesions were generated in tissue phantoms of higher mechanical strength. Histotripsy was also applied to 43 different ex vivo porcine tissues with a wide range of mechanical properties. Gross morphology demonstrated that stronger tissues with higher ultimate stress, higher density and lower water content were more resistant to histotripsy damage than weaker tissues. Based on these results, a self-limiting vessel-sparing treatment strategy was developed in an attempt to preserve major vessels while fractionating the surrounding target tissue. This strategy was tested in porcine liver in vivo. After treatment, major hepatic blood vessels and bile ducts remained intact within a completely fractionated liver volume. These results identify varying susceptibilities of tissues to histotripsy therapy and provide a rational basis to optimize histotripsy parameters for treatment of specific tissues.

Journal ArticleDOI
TL;DR: A realistic head model from MRI data is constructed and a 'virtual lesion' is included in the model to simulate the presence of an idealized tumor to better understand the physical basis of TTF efficacy through retrospective analysis and to improve TTF treatment planning.
Abstract: The use of alternating electric fields has been recently proposed for the treatment of recurrent glioblastoma. In order to predict the electric field distribution in the brain during the application of such tumor treating fields (TTF), we constructed a realistic head model from MRI data and placed transducer arrays on the scalp to mimic an FDA-approved medical device. Values for the tissue dielectric properties were taken from the literature; values for the device parameters were obtained from the manufacturer. The finite element method was used to calculate the electric field distribution in the brain. We also included a 'virtual lesion' in the model to simulate the presence of an idealized tumor. The calculated electric field in the brain varied mostly between 0.5 and 2.0 V cm−1 and exceeded 1.0 V cm−1 in 60% of the total brain volume. Regions of local field enhancement occurred near interfaces between tissues with different conductivities wherever the electric field was perpendicular to those interfaces. These increases were strongest near the ventricles but were also present outside the tumor's necrotic core and in some parts of the gray matter-white matter interface. The electric field values predicted in this model brain are in reasonably good agreement with those that have been shown to reduce cancer cell proliferation in vitro. The electric field distribution is highly non-uniform and depends on tissue geometry and dielectric properties. This could explain some of the variability in treatment outcomes. The proposed modeling framework could be used to better understand the physical basis of TTF efficacy through retrospective analysis and to improve TTF treatment planning.
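The local field enhancement at tissue interfaces follows from the quasi-static boundary condition that the normal current density J = σE is continuous: for a field perpendicular to an interface between conductivities σ1 and σ2, the field jumps by E2/E1 = σ1/σ2. A short illustration (the conductivity values below are assumed, representative literature values, not those used in the study):

```python
def field_ratio_perpendicular(sigma_1, sigma_2):
    """Ratio E2/E1 of the quasi-static electric field across a
    conductivity interface, for a field normal to the interface,
    from continuity of the normal current: sigma_1*E1 = sigma_2*E2."""
    return sigma_1 / sigma_2

# Assumed representative conductivities (S/m): CSF ~1.79,
# white matter ~0.06. Crossing from CSF into white matter, the
# normal field component is enhanced by roughly a factor of 30:
print(round(field_ratio_perpendicular(1.79, 0.06), 1))
```

This is why the model predicts the strongest enhancement near the highly conductive ventricles, and why the enhancement vanishes where the field runs parallel to the interface (there the tangential E, not the normal J, is continuous).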

Journal ArticleDOI
TL;DR: The results suggest that one very useful application of the phantom library would be the construction of a pre-computed dose library for CT imaging as needed for patient dose-tracking.
Abstract: Substantial increases in pediatric and adult obesity in the US have prompted a major revision to the current UF/NCI (University of Florida/National Cancer Institute) family of hybrid computational phantoms to more accurately reflect current trends in larger body morphometry. A decision was made to construct the new library in a gridded fashion by height/weight without further reference to age-dependent weight/height percentiles as these become quickly outdated. At each height/weight combination, circumferential parameters were defined and used for phantom construction. All morphometric data for the new library were taken from the CDC NHANES survey data over the time period 1999–2006, the most recent reported survey period. A subset of the phantom library was then used in a CT organ dose sensitivity study to examine the degree to which body morphometry influences the magnitude of organ doses for patients that are underweight to morbidly obese in body size. Using primary and secondary morphometric parameters, grids containing 100 adult male height/weight bins, 93 adult female height/weight bins, 85 pediatric male height/weight bins and 73 pediatric female height/weight bins were constructed. These grids served as the blueprints for construction of a comprehensive library of patient-dependent phantoms containing 351 computational phantoms. At a given phantom standing height, normalized CT organ doses were shown to linearly decrease with increasing phantom BMI for pediatric males, while curvilinear decreases in organ dose were shown with increasing phantom BMI for adult females. These results suggest that one very useful application of the phantom library would be the construction of a pre-computed dose library for CT imaging as needed for patient dose-tracking.
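One way a pre-computed dose library on such a height/weight grid could be queried for an arbitrary patient is bilinear interpolation between the four nearest phantom grid points. A minimal sketch (the grid coordinates and dose values are invented for illustration):

```python
def bilinear(x, y, x0, x1, y0, y1, q00, q10, q01, q11):
    """Bilinearly interpolate a tabulated quantity q at (x, y) from
    its values at the four surrounding grid points: q00 at (x0, y0),
    q10 at (x1, y0), q01 at (x0, y1), q11 at (x1, y1)."""
    tx = (x - x0) / (x1 - x0)
    ty = (y - y0) / (y1 - y0)
    return (q00 * (1 - tx) * (1 - ty) + q10 * tx * (1 - ty)
            + q01 * (1 - tx) * ty + q11 * tx * ty)

# Organ dose (mGy, invented values) tabulated at phantom heights
# 160/170 cm and weights 60/80 kg, queried for a 165 cm, 70 kg
# patient midway between the grid points:
print(bilinear(165, 70, 160, 170, 60, 80, 12.0, 11.0, 10.0, 9.0))  # 10.5
```

The reported near-linear decrease of normalized organ dose with BMI at fixed height suggests that such low-order interpolation between neighboring phantoms would introduce only small errors for dose-tracking purposes.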

Journal ArticleDOI
TL;DR: Generation of pCTs using statistical regression showed the highest dosimetric agreement and seems to be the most promising candidate for MRI-only RT of the brain.
Abstract: Radiotherapy (RT) based on magnetic resonance imaging (MRI) as the only modality, so-called MRI-only RT, would remove the systematic registration error between MR and computed tomography (CT), and provide co-registered MRI for assessment of treatment response and adaptive RT. Electron densities, however, need to be assigned to the MRI images for dose calculation and patient setup based on digitally reconstructed radiographs (DRRs). Here, we investigate the geometric and dosimetric performance for a number of popular voxel-based methods to generate a so-called pseudo CT (pCT). Five patients receiving cranial irradiation, each containing a co-registered MRI and CT scan, were included. An ultra short echo time MRI sequence for bone visualization was used. Six methods were investigated for three popular types of voxel-based approaches; (1) threshold-based segmentation, (2) Bayesian segmentation and (3) statistical regression. Each approach contained two methods. Approach 1 used bulk density assignment of MRI voxels into air, soft tissue and bone based on logical masks and the transverse relaxation time T2 of the bone. Approach 2 used similar bulk density assignments with Bayesian statistics including or excluding additional spatial information. Approach 3 used a statistical regression correlating MRI voxels with their corresponding CT voxels. A similar photon and proton treatment plan was generated for a target positioned between the nasal cavity and the brainstem for all patients. The CT agreement with the pCT of each method was quantified and compared with the other methods geometrically and dosimetrically using both a number of reported metrics and introducing some novel metrics. The best geometrical agreement with CT was obtained with the statistical regression methods which performed significantly better than the threshold and Bayesian segmentation methods (excluding spatial information). 
All methods agreed significantly better with CT than a reference water MRI comparison. The mean dosimetric deviation for photons and protons compared to the CT was about 2% and was highest in the dose gradient region of the brainstem. Both the threshold-based method and the statistical regression methods showed the highest dosimetric agreement. Generation of pCTs using statistical regression seems to be the most promising candidate for MRI-only RT of the brain. Further, the total amount of the different tissues needs to be taken into account for dosimetric considerations, regardless of their correct geometrical position.
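Approach (1), bulk density assignment by thresholding, can be sketched as a classification of MRI voxels into a few bulk HU values. The thresholds and HU values below are illustrative assumptions only; the study's actual method additionally used logical masks and the bone T2 from an ultra short echo time sequence:

```python
import numpy as np

def bulk_pseudo_ct(mri, air_thresh, bone_thresh):
    """Assign bulk HU values from an MRI intensity volume:
    voxels below air_thresh -> air (-1000 HU), voxels above
    bone_thresh -> bone (+700 HU), everything else -> soft
    tissue (0 HU). All thresholds and HU values are illustrative."""
    pct = np.zeros_like(mri, dtype=float)  # soft tissue default
    pct[mri < air_thresh] = -1000.0        # air
    pct[mri > bone_thresh] = 700.0         # bone
    return pct

mri = np.array([5.0, 50.0, 300.0])
print(bulk_pseudo_ct(mri, air_thresh=10.0, bone_thresh=200.0))
# [-1000.     0.   700.]
```

The statistical regression approach (3) replaces this hard three-class assignment with a continuous voxelwise mapping from MRI intensities to CT numbers, which is what produced the best geometric agreement in the study.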

Journal ArticleDOI
TL;DR: A noticeable impact of Auger electrons emitted from the nanoparticles is found, due to the large number of electrons in atoms with high atomic numbers, in addition to the primary ionization process.
Abstract: A possible dose enhancement effect by proton or electron irradiation in the vicinity of nanoparticles consisting of different high Z atomic materials has been investigated using the track structure Monte Carlo code TRAX. In the simulations, Fe, Ag, Gd, Pt and Au nanoparticles (r = 22 and 2 nm) were irradiated with monoenergetic proton beams at energies of therapeutic interest (2, 80 and 300 MeV) and 44 keV electrons. Due to the large number of electrons in atoms with high atomic numbers, many electrons can be released in Auger cascades in addition to the primary ionization process. The potential additional nanoscopic radial dose contributions in the presence of metallic nanoparticles are assessed by comparison with liquid water and water simulated with the same density as the metallic materials. We find a noticeable impact of Auger electrons emitted from the nanoparticles. Special focus has been given to the assessment of complete sets of low-energy electron cross sections for the nanoparticle materials.

Journal ArticleDOI
TL;DR: The results of this study indicate that a commissioning process using large fields does not lead to an accurate estimation of the source size, even if a 2 × 2 cm(2) field is included, and the detector should be explicitly modelled in the calculations.
Abstract: The purpose of this study was to derive a complete set of correction and perturbation factors for output factors (OF) and dose profiles. Modern small field detectors were investigated, including a plastic scintillator (Exradin W1, SI), a liquid ionization chamber (microLion 31018, PTW), an unshielded diode (Exradin D1V, SI) and a synthetic diamond (microDiamond 60019, PTW). A Monte Carlo (MC) beam model was commissioned for use in small fields following two commissioning procedures: (1) using intermediate and moderately small fields (down to 2 × 2 cm2) and (2) using only small fields (0.5 × 0.5 cm2 – 2 × 2 cm2). In the latter case the detectors were explicitly modelled in the dose calculation. The commissioned model was used to derive the correction and perturbation factors with respect to a small point in water, as suggested by the Alfonso formalism. In the MC calculations the design of two detectors was modified in order to minimize or eliminate the corrections needed. The results of this study indicate that a commissioning process using large fields does not lead to an accurate estimation of the source size, even if a 2 × 2 cm2 field is included. Furthermore, the detector should be explicitly modelled in the calculations. On the output factors, the scintillator W1 needed the smallest correction (+0.6%), followed by the microDiamond (+1.3%). Larger corrections were observed for the microLion (+2.4%) and diode D1V (−2.4%). On the profiles, significant corrections were observed out of the field in the gradient and tail regions. The scintillator needed the smallest corrections (−4%), followed by the microDiamond (−11%), diode D1V (+13%) and microLion (−15%). The major perturbations reported were due to volume averaging and high-density materials surrounding the active volumes. These effects presented opposite trends in the OF and the profiles. By decreasing the radius of the microLion to 0.85 mm we could modify the volume averaging effect in order to achieve a discrepancy of less than 1%
for OF and 5% for profiles compared to water. Similar results were observed for the diode D1V if the radius was increased to 1 mm.
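The volume-averaging perturbation discussed above can be estimated by averaging the lateral beam profile over the detector's sensitive area. A minimal sketch, assuming a Gaussian lateral profile and a circular detector centred on the beam axis (an idealization, not the study's full MC treatment):

```python
import math

def volume_averaging_factor(radius_mm, sigma_mm):
    """Dose averaged over a circular detector of the given radius,
    centred on a Gaussian lateral profile of width sigma, divided
    by the central-axis dose. Closed form of the radial integral
    (2/R^2) * int_0^R r * exp(-r^2 / (2 sigma^2)) dr."""
    s2 = 2.0 * sigma_mm ** 2
    return s2 / radius_mm ** 2 * (1.0 - math.exp(-radius_mm ** 2 / s2))

# A detector of 0.85 mm radius in a narrow field with sigma = 3 mm
# reads about 2% below the central-axis dose:
print(round(volume_averaging_factor(0.85, 3.0), 4))  # 0.9802
```

The factor approaches 1 as the detector radius shrinks relative to sigma, which is why reducing the microLion radius mitigates the volume-averaging component of its correction.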