Author

A. Trindade

Bio: A. Trindade is an academic researcher. The author has contributed to research in topics: Data acquisition & Monte Carlo method. The author has an h-index of 9, has co-authored 10 publications, and has received 213 citations.

Papers
Journal ArticleDOI
01 Jan 2003
TL;DR: The GEANT4 Monte Carlo radiation transport toolkit as discussed by the authors provides the basic services and infrastructure required for the development of flexible simulation frameworks and applications which have found generalized use in high energy physics, nuclear physics, astrophysics and medical physics research.
Abstract: The GEANT4 Monte Carlo radiation transport toolkit provides the basic services and infrastructure required for the development of flexible simulation frameworks and applications, which have found generalized use in high energy physics, nuclear physics, astrophysics and medical physics research. GEANT4 object-oriented design provides the possibility to implement or modify any physics process in GEANT4 without changing other parts of the software. This feature makes GEANT4 open to extension of its physics modeling capabilities and to the implementation of alternative physics models. In this paper, the development of a simulation platform for performance studies and detector optimization of the Clear-PEM scanner, a high-performance positron emission mammography prototype, and the implementation of precise low energy bremsstrahlung angular generators for the GEANT4 low energy electromagnetic physics category are described.

46 citations

Proceedings ArticleDOI
19 Oct 2003
TL;DR: A series of achievements associated with Geant4-based applications in medical physics and, in particular, in radiotherapy, protontherapy, PEM, PET, MRT, metabolic therapy, IORT are presented.
Abstract: We present a series of achievements associated with Geant4-based applications in medical physics and, in particular, in radiotherapy (external beams and brachytherapy), protontherapy, PEM, PET, MRT, metabolic therapy, and IORT; projects in microdosimetry and radiobiology are beginning. The Geant4 CT-interface allows the patient anatomy to be reproduced realistically, and integration with the GRID allows the applications to run on shared, distributed computing resources. The Geant4 Medical Physics Group was born from the collaboration of Geant4 with several research and medical physics institutes in Europe.

39 citations

Journal ArticleDOI
TL;DR: The concept of Clear-PEM, the system presently developed in the frame of the Crystal Clear Collaboration at CERN, will be a dedicated scanner, offering better perspectives in terms of position resolution and detection sensitivity.
Abstract: Positron emission mammography (PEM) can offer a non-invasive method for the diagnosis of breast cancer. Metabolic images from PEM using ¹⁸F-fluoro-deoxy-glucose contain unique information not available from conventional morphologic imaging techniques like X-ray radiography. In this work, the concept of Clear-PEM, the system presently developed in the frame of the Crystal Clear Collaboration at CERN, is described. Clear-PEM will be a dedicated scanner, offering better perspectives in terms of position resolution and detection sensitivity.

32 citations

Journal ArticleDOI
TL;DR: Monte Carlo simulation results evaluating the trigger performance, as well as results of hardware simulations are presented, showing the correctness of the design and the implementation approach.
Abstract: The Clear-PEM detector system is a compact positron emission mammography scanner with about 12000 channels aiming at high sensitivity and good spatial resolution. Front-end, Trigger, and Data Acquisition electronics are crucial components of this system. The on-detector front-end is implemented as a data-driven synchronous system that identifies and selects the analog signals whose energy is above a predefined threshold. The off-detector trigger logic uses digitized front-end data streams to compute pulse amplitudes and timing. Based on this information it generates a coincidence trigger signal that is used to initiate the conditioning and transfer of the relevant data to the data acquisition computer. To minimize dead-time, the data acquisition electronics makes extensive use of pipeline processing structures and derandomizer memories with multievent capacity. The system operates at a 100-MHz clock frequency, and is capable of sustaining a data acquisition rate of 1 million events per second with an efficiency above 95%, at a total single photon background rate of 10 MHz. The basic component of the front-end system is a low-noise amplifier-multiplexer chip presently under development. The off-detector system is designed around a dual-bus crate backplane for fast intercommunication between the system boards. The trigger and data acquisition logic is implemented in large FPGAs with 4 million gates. Monte Carlo simulation results evaluating the trigger performance, as well as results of hardware simulations, are presented, showing the correctness of the design and the implementation approach.
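The trigger scheme the abstract describes — keep single-photon hits above an energy threshold, then flag pairs of hits on opposite detector heads that fall inside a narrow time window — can be sketched in software. This is a minimal illustrative model, not the Clear-PEM firmware; all names and the threshold/window values are assumptions.

```python
# Illustrative sketch of a threshold-plus-coincidence trigger.
# The energy threshold and coincidence window below are made-up
# example values, not Clear-PEM parameters.
from dataclasses import dataclass

@dataclass
class Hit:
    head: int          # detector head index (0 or 1)
    energy_kev: float  # deposited energy
    time_ns: float     # hit timestamp

ENERGY_THRESHOLD_KEV = 400.0   # illustrative single-photon cut
COINCIDENCE_WINDOW_NS = 20.0   # illustrative coincidence window

def find_coincidences(hits):
    """Return (hit_a, hit_b) pairs on opposite heads that pass the
    energy threshold and lie within the coincidence window."""
    accepted = sorted(
        (h for h in hits if h.energy_kev >= ENERGY_THRESHOLD_KEV),
        key=lambda h: h.time_ns,
    )
    pairs = []
    for a, b in zip(accepted, accepted[1:]):
        if a.head != b.head and (b.time_ns - a.time_ns) <= COINCIDENCE_WINDOW_NS:
            pairs.append((a, b))
    return pairs
```

In the real system this selection runs in FPGA pipelines at the clock rate; the sketch only captures the selection logic, not the dead-time-free pipelined implementation.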

30 citations

Journal ArticleDOI
TL;DR: In this article, a high-level C++ simulation tool was developed for data acquisition performance analysis and validated at bit-level against FPGA VHDL testbenches.
Abstract: The Clear-PEM detector is a positron emission mammography scanner based on a high-granularity avalanche photodiode readout with 12 288 channels. The front-end sub-system is instrumented with low-noise 192:2 channel amplifier-multiplexer ASICs and free-running sampling ADCs. The off-detector trigger, implemented in an FPGA-based architecture, computes the pulse amplitudes and timing required for coincidence validation from the front-end data streams. A high-level C++ simulation tool was developed for data acquisition performance analysis and validated at bit-level against FPGA VHDL testbenches. In this work, simulation studies concerning the performance of the on-line/off-line energy and time extraction algorithms and the foreseen detector energy and time resolution are presented. Time calibration and trigger efficiency are also discussed.
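One simple way to extract a pulse amplitude and timestamp from free-running ADC samples, as the energy/time extraction step above requires, is baseline subtraction plus a constant-fraction crossing on the leading edge. This is a generic textbook approach sketched under assumed parameters, not the algorithm the paper actually evaluates.

```python
# Generic amplitude/time extraction from a sampled pulse:
# baseline-subtract, take the peak, then interpolate where the
# leading edge crosses a constant fraction of the peak.
import numpy as np

def extract_amplitude_and_time(samples, clock_ns=10.0, fraction=0.5):
    """Return (amplitude, crossing_time_ns) for one digitized pulse.
    Assumes the first four samples are pre-pulse baseline."""
    s = np.asarray(samples, dtype=float)
    baseline = s[:4].mean()
    p = s - baseline
    peak = int(p.argmax())
    amplitude = p[peak]
    threshold = fraction * amplitude
    # walk back from the peak to the first sample below threshold
    i = peak
    while i > 0 and p[i - 1] >= threshold:
        i -= 1
    # linear interpolation between samples i-1 and i
    t = (i - 1 + (threshold - p[i - 1]) / (p[i] - p[i - 1])) * clock_ns
    return amplitude, t
```

The 10 ns clock period matches a 100 MHz sampling clock; the fraction and baseline-window choices are illustrative.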

16 citations


Cited by
Journal ArticleDOI
TL;DR: Geant4 as mentioned in this paper is a software toolkit for the simulation of the passage of particles through matter, it is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection.
Abstract: Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Its functionality and modeling capabilities continue to be extended, while its performance is enhanced. An overview of recent developments in diverse areas of the toolkit is presented. These include performance optimization for complex setups; improvements for the propagation in fields; new options for event biasing; and additions and improvements in geometry, physics processes and interactive capabilities.

6,063 citations

Journal ArticleDOI
TL;DR: The purpose of this report is to set out the salient issues associated with clinical implementation and experimental verification of MC dose algorithms, and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.
Abstract: The Monte Carlo (MC) method has been shown through many research studies to calculate accurate dose distributions for clinical radiotherapy, particularly in heterogeneous patient tissues where the effects of electron transport cannot be accurately handled with conventional, deterministic dose algorithms. Despite its proven accuracy and the potential for improved dose distributions to influence treatment outcomes, the long calculation times previously associated with MC simulation rendered this method impractical for routine clinical treatment planning. However, the development of faster codes optimized for radiotherapy calculations and improvements in computer processor technology have substantially reduced calculation times to, in some instances, within minutes on a single processor. These advances have motivated several major treatment planning system vendors to embark upon the path of MC techniques. Several commercial vendors have already released or are currently in the process of releasing MC algorithms for photon and/or electron beam treatment planning. Consequently, the accessibility and use of MC treatment planning algorithms may well become widespread in the radiotherapy community. With MC simulation, dose is computed stochastically using first principles; this method is therefore quite different from conventional dose algorithms. Issues such as statistical uncertainties, the use of variance reduction techniques, the ability to account for geometric details in the accelerator treatment head simulation, and other features, are all unique components of a MC treatment planning algorithm. Successful implementation by the clinical physicist of such a system will require an understanding of the basic principles of MC techniques.
The purpose of this report, while providing education and review on the use of MC simulation in radiotherapy planning, is to set out, for both users and developers, the salient issues associated with clinical implementation and experimental verification of MC dose algorithms. As the MC method is an emerging technology, this report is not meant to be prescriptive. Rather, it is intended as a preliminary report to review the tenets of the MC method and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.
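The statistical uncertainty the report highlights as a unique feature of MC dose calculation can be illustrated with a toy estimator: the dose estimate is a sample mean over particle histories, and its standard error shrinks like 1/√N. The "physics" below is a placeholder random deposit per history, purely to show the scaling.

```python
# Toy Monte Carlo dose estimate with its statistical uncertainty.
# The exponential deposit per history is an arbitrary stand-in for
# real particle transport; only the uncertainty scaling is the point.
import math
import random

def mc_dose_estimate(n_histories, seed=0):
    """Return (mean_dose, standard_error) over n_histories."""
    rng = random.Random(seed)
    deposits = [rng.expovariate(1.0) for _ in range(n_histories)]
    mean = sum(deposits) / n_histories
    var = sum((d - mean) ** 2 for d in deposits) / (n_histories - 1)
    stderr = math.sqrt(var / n_histories)   # shrinks like 1/sqrt(N)
    return mean, stderr
```

Running 100× more histories cuts the standard error by roughly 10×, which is why calculation time and statistical precision trade off directly in MC planning.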

591 citations

Journal ArticleDOI
TL;DR: Potential application of interlaced microbeams to treat tumors or to ablate nontumorous abnormalities with minimal damage to surrounding normal tissue is suggested.
Abstract: Studies have shown that x-rays delivered as arrays of parallel microplanar beams (microbeams), 25- to 90-μm thick and spaced 100–300 μm on-center, spare normal tissues including the central nervous system (CNS) and preferentially damage tumors. However, such thin microbeams can only be produced by synchrotron sources and have other practical limitations to clinical implementation. To approach this problem, we first studied CNS tolerance to much thicker beams. Three of four rats whose spinal cords were exposed transaxially to four 400-Gy, 0.68-mm microbeams, spaced 4 mm, and all four rats irradiated to their brains with large, 170-Gy arrays of such beams spaced 1.36 mm, all observed for 7 months, showed no paralysis or behavioral changes. We then used an interlacing geometry in which two such arrays at a 90° angle produced the equivalent of a contiguous beam in the target volume only. By using this approach, we produced 90-, 120-, and 150-Gy 3.4 × 3.4 × 3.4 mm³ exposures in the rat brain. MRIs performed 6 months later revealed focal damage within the target volume at the 120- and 150-Gy doses but no apparent damage elsewhere at 120 Gy. Monte Carlo calculations indicated a 30-μm dose falloff (80–20%) at the edge of the target, which is much less than the 2- to 5-mm value for conventional radiotherapy and radiosurgery. These findings strongly suggest potential application of interlaced microbeams to treat tumors or to ablate nontumorous abnormalities with minimal damage to surrounding normal tissue.

185 citations

Journal ArticleDOI
TL;DR: The proposed method is more sensitive to small variations of the electron beam diameter with respect to the conventional method used to commission Monte Carlo codes, i.e., by comparison with measured percentage depth doses (PDD) and beam profiles, for which measurements of PDD and profiles are strongly affected by the type of detector used.
Abstract: The scope of this study was to estimate total scatter factors (s_{c,p}) of the three smallest collimators of the Cyberknife radiosurgery system (5–10 mm in diameter), combining experimental measurements and Monte Carlo simulation. Two microchambers, a diode, and a diamond detector were used to collect experimental data. The treatment head and the detectors were simulated by means of a Monte Carlo code in order to calculate correction factors for the detectors and to estimate total scatter factors by means of a consistency check between measurement and simulation. Results for the three collimators were: s_{c,p}(5 mm) = 0.677 ± 0.004, s_{c,p}(7.5 mm) = 0.820 ± 0.008, s_{c,p}(10 mm) = 0.871 ± 0.008, all relative to the 60 mm collimator at 80 cm source-to-detector distance. The method also allows the full width at half maximum of the electron beam to be estimated; estimations made with different collimators and different detectors were in excellent agreement and gave a value of 2.1 mm. Correction factors to be applied to the detectors for the measurement of s_{c,p} were consistent with a prevalence of volume effect for the microchambers and the diamond and a prevalence of scattering from high-Z material for the diode detector. The proposed method is more sensitive to small variations of the electron beam diameter with respect to the conventional method used to commission Monte Carlo codes, i.e., by comparison with measured percentage depth doses (PDD) and beam profiles. This is especially important for small fields (less than 10 mm diameter), for which measurements of PDD and profiles are strongly affected by the type of detector used. Moreover, this method should allow s_{c,p} of Cyberknife systems different from the unit under investigation to be estimated without the need for further Monte Carlo calculation, provided that one of the microchambers or the diode detector of the type used in this study are employed.
The results for the diamond are applicable only to the specific detector that was investigated due to excessive variability in manufacturing.
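The arithmetic behind a detector-corrected total scatter factor is the standard small-field relation: the raw reading ratio (small field over reference field) multiplied by a Monte Carlo derived correction factor for the detector. This is the generic textbook form, sketched here with made-up readings; it is not claimed to be the paper's exact protocol.

```python
# Generic small-field total scatter factor: detector reading ratio
# times a Monte Carlo derived detector correction factor.
# The example readings and correction below are invented numbers.
def total_scatter_factor(m_small, m_ref, k_corr):
    """s_{c,p} = (M_small / M_ref) * k_corr, where k_corr accounts
    for detector perturbations (volume averaging, high-Z scatter)."""
    return (m_small / m_ref) * k_corr
```

The paper's consistency check works in this direction: the correction factor comes from simulating the detector in the beam, so different detectors with different k_corr should converge on the same s_{c,p}.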

152 citations

Journal ArticleDOI
TL;DR: PEM/PET may be an effective system for the detection and diagnosis of breast cancer, according to initial phantom-based testing of the device, which assessed spatial resolution and detection sensitivity.
Abstract: Tomographic breast imaging techniques can potentially improve detection and diagnosis of cancer in women with radiodense and/or fibrocystic breasts. We have developed a high-resolution positron emission mammography/tomography imaging and biopsy device (called PEM/PET) to detect and guide the biopsy of suspicious breast lesions. PET images are acquired to detect suspicious focal uptake of the radiotracer and guide biopsy of the area. Limited-angle PEM images could then be used to verify the biopsy needle position prior to tissue sampling. The PEM/PET scanner consists of two sets of rotating planar detector heads. Each detector consists of a 4 × 3 array of Hamamatsu H8500 flat panel position sensitive photomultipliers (PSPMTs) coupled to a 96 × 72 array of 2 × 2 × 15 mm³ LYSO detector elements (pitch = 2.1 mm). Image reconstruction is performed with a three-dimensional, ordered set expectation maximization (OSEM) algorithm parallelized to run on a multi-processor computer system. The reconstructed field of view (FOV) is 15 × 15 × 15 cm³. Initial phantom-based testing of the device is focusing upon its PET imaging capabilities. Specifically, spatial resolution and detection sensitivity were assessed. The results from these measurements yielded a spatial resolution at the center of the FOV of 2.01 ± 0.09 mm (radial), 2.04 ± 0.08 mm (tangential) and 1.84 ± 0.07 mm (axial). At a radius of 7 cm from the center of the scanner, the results were 2.11 ± 0.08 mm (radial), 2.16 ± 0.07 mm (tangential) and 1.87 ± 0.08 mm (axial). Maximum system detection sensitivity of the scanner is 488.9 kcps μCi⁻¹ ml⁻¹ (6.88%). These promising findings indicate that PEM/PET may be an effective system for the detection and diagnosis of breast cancer.
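OSEM, the reconstruction algorithm named above, is the subset-accelerated form of the MLEM update: forward-project the current image, compare with measured counts, and back-project the ratio. The sketch below implements plain MLEM (OSEM with a single subset) on a dense system matrix; the matrix and count vector are toy stand-ins for a real scanner model.

```python
# Plain MLEM iteration (OSEM with one subset):
#   x <- (x / s) * A^T (y / (A x)),  with sensitivity s = A^T 1.
# A is a toy dense system matrix; real scanners use sparse/on-the-fly
# projectors and split the data into ordered subsets for speed.
import numpy as np

def mlem(system_matrix, counts, n_iters=20):
    """Reconstruct an image from projection counts by MLEM."""
    A = np.asarray(system_matrix, dtype=float)
    y = np.asarray(counts, dtype=float)
    x = np.ones(A.shape[1])                  # flat initial image
    sens = A.T @ np.ones(A.shape[0])         # per-voxel sensitivity
    for _ in range(n_iters):
        proj = A @ x                         # forward projection
        proj[proj == 0] = 1e-12              # guard against divide-by-zero
        x = x / sens * (A.T @ (y / proj))    # multiplicative update
    return x
```

OSEM accelerates this by applying the same update with only a subset of projections per sub-iteration, cycling through the subsets, which is what makes the parallelized multi-processor implementation mentioned in the abstract practical.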

141 citations