
Showing papers in "Health Physics in 2007"



Journal ArticleDOI
TL;DR: Findings reported during the 20-y period after the Chernobyl accident regarding stress-related symptoms, effects on the developing brain, and cognitive and psychological impairments among highly exposed cleanup workers are reviewed.
Abstract: The mental health impact of Chernobyl is regarded by many experts as the largest public health problem unleashed by the accident to date. This paper reviews findings reported during the 20-y period after the accident regarding stress-related symptoms, effects on the developing brain, and cognitive and psychological impairments among highly exposed cleanup workers. With respect to stress-related symptoms, the rates of depressive, anxiety (especially post-traumatic stress symptoms), and medically unexplained physical symptoms are two to four times higher in Chernobyl-exposed populations compared to controls, although rates of diagnosable psychiatric disorders do not appear to be elevated. The symptom elevations were found as late as 11 y after the accident. Severity of symptomatology is significantly related to risk perceptions and being diagnosed with a Chernobyl-related health problem. In general, the morbidity patterns are consistent with the psychological impairments documented after other toxic events, such as the atomic bombings of Hiroshima and Nagasaki, the Three Mile Island accident, and Bhopal. With respect to the developing brain of exposed children who were in utero or very young when the accident occurred, the World Health Organization as well as American and Israeli researchers have found no significant associations of radiation exposure with cognitive impairments. Cognitive impairments in highly exposed cleanup workers have been reported by Ukrainian researchers, but these findings have not been independently confirmed. A seminal study found a significant excess death rate from suicide in cleanup workers, suggesting a sizable emotional toll. Given the magnitude and persistence of the adverse mental health effects on the general population, long-term educational and psychosocial interventions should be initiated that target primary care physicians, local researchers, and high risk populations, including participants in ongoing cohort studies.

118 citations


Journal ArticleDOI
TL;DR: Whether large-scale evacuations or sheltering in place are appropriate responses to a radiological dispersal device event is discussed and the method and technical bases used to establish hazard boundaries and the appropriate actions that apply within those areas are discussed.
Abstract: If dispersal occurs from an explosive radiological dispersal device, first responders need to know what actions they need to take to protect life and property. Many of the decisions required to minimize exposure will be made during the first hour. To help the first responder decide what countermeasures to employ, Sandia National Laboratories has established realistic hazard boundaries for acute and sub-acute effects relevant to radiological dispersal devices. These boundaries were derived from dispersal calculations based on the aerosolization behavior of devices tested in the Sandia Aerosolization Program. For 20 years, the Sandia Aerosolization Program has performed explosive and non-explosive aerosolization tests relevant to radiological dispersal devices. This paper discusses (1) the method and technical bases used to establish hazard boundaries and the appropriate actions that apply within those areas and (2) whether large-scale evacuations or sheltering in place are appropriate responses to a radiological dispersal device event.

97 citations



Journal ArticleDOI
TL;DR: The data currently available suggest that both the magnitude and patterns of thyroid cancer risk are generally consistent with those reported following external exposure, and that continued follow-up of the exposed populations should provide valuable information.
Abstract: As a result of the Chernobyl nuclear power plant accident, massive amounts of radioactive materials were released into the environment and large numbers of individuals living in Belarus, Russia, and Ukraine were exposed to radioactive iodines, primarily 131I. Iodine-131 concentrated in the thyroid gland of residents of the contaminated areas, with children and adolescents being particularly affected. In the decade after the accident, a substantial increase in thyroid cancer incidence was observed among exposed children in the three affected countries, and compelling evidence of an association between pediatric thyroid cancer incidence and radiation exposure to the thyroid gland accumulated. The data currently available suggest that both the magnitude and patterns of thyroid cancer risk are generally consistent with those reported following external exposure. Based on data from case-control studies, iodine deficiency appeared to enhance the risk of developing thyroid cancer following exposure from Chernobyl. Results from a recent large cohort study, however, did not support these findings. Data on adult exposure are limited and not entirely consistent. Similarly, information on thyroid cancer risks associated with in utero exposure is insufficient to draw conclusions. The lack of information on these two population groups indicates an important gap that needs to be filled. Twenty years after the accident, excess thyroid cancers are still occurring among persons exposed as children or adolescents, and, if external radiation can be used as a guide, we can expect an excess of radiation-associated thyroid cancers for several more decades. Since considerable uncertainties about the long-term health effects from Chernobyl remain, continued follow-up of the exposed populations should provide valuable information.

92 citations


Journal ArticleDOI
TL;DR: The updated dosimetry database, “Doses-2005,” represents a significant improvement in the determination of absorbed organ dose from external radiation and plutonium intake for the original cohort of 18,831 Mayak workers.
Abstract: The Mayak Production Association (MPA) was the first plutonium production plant in the former Soviet Union. Workers at the MPA were exposed to relatively large internal radiation intakes and external radiation exposures, particularly in the early years of plant operations. This paper describes the updated dosimetry database, Doses-2005. Doses-2005 represents a significant improvement in the determination of absorbed organ dose from external radiation and plutonium intake for the original cohort of 18,831 Mayak workers. The reconstruction of absorbed organ doses from external radiation uses: 1) archive records of measured dose and worker exposure history, 2) measured energy and directional response characteristics of historical Mayak film dosimeters, and 3) calculated dose conversion factors for Mayak Study-defined exposure scenarios using Monte Carlo techniques. The reconstruction of doses from plutonium intake uses two revised models developed from empirical data derived from bioassay and autopsy cases and/or updates from prevailing or emerging International Commission on Radiological Protection models. Other sources of potentially significant exposure to workers, such as medical diagnostic x-rays, ambient onsite external radiation, neutron radiation, intake of airborne effluent, and intake of nuclides other than plutonium, were evaluated to determine their impact on the dose estimates.

91 citations


Journal ArticleDOI
TL;DR: It is suggested that some reported “nonthermal” effects of RF energy may be thermal in nature; also that subtle thermal effects from RF energy exist but have no consequence to health or safety.
Abstract: This article reviews thermal mechanisms of interaction between radiofrequency (RF) fields and biological systems, focusing on theoretical frameworks that are of potential use in setting guidelines for human exposure to RF energy. Several classes of thermal mechanisms are reviewed that depend on the temperature increase or rate of temperature increase and the relevant dosimetric considerations associated with these mechanisms. In addition, attention is drawn to possible molecular and physiological reactions that could be induced by temperature elevations below 0.1 °C, which are normal physiological responses to heat, and to the so-called microwave auditory effect, which is a physiologically trivial effect resulting from thermally-induced acoustic stimuli. It is suggested that some reported "nonthermal" effects of RF energy may be thermal in nature; also that subtle thermal effects from RF energy exist but have no consequence to health or safety. It is proposed that future revisions of exposure guidelines make more explicit use of thermal models and empirical data on thermal effects in quantifying potential hazards of RF fields.

90 citations


Journal ArticleDOI
TL;DR: This survey measured radiofrequency fields from wireless local area networks (WLANs) using Wi-Fi technology against a background of RF fields in the environment over the frequency range 75 MHz–3 GHz to determine the levels of RF energy exposure from WLANs.
Abstract: This survey measured radiofrequency (RF) fields from wireless local area networks (WLANs) using Wi-Fi technology against a background of RF fields in the environment over the frequency range 75 MHz–3 GHz. A total of 356 measurements were conducted at 55 sites (including private residences,

89 citations


Journal ArticleDOI
TL;DR: This review is a synopsis of the findings of the subgroup that examined the radiological effects on nonhuman biota within the 30-km Exclusion Zone around Chernobyl, and the observed effects on plants, soil invertebrates, terrestrial vertebrates and fish are summarized.
Abstract: Several United Nations organizations sought to dispel the uncertainties and controversy that still exist concerning the effects of the Chernobyl accident. A Chernobyl Forum of international expertise was established to reach consensus on the environmental consequences and health effects attributable to radiation exposure arising from the accident. This review is a synopsis of the findings of the subgroup that examined the radiological effects on nonhuman biota within the 30-km Exclusion Zone. The response of biota to Chernobyl irradiation was a complex interaction among radiation dose, dose rate, temporal and spatial variation, varying radiation sensitivities of the different taxa, and indirect effects from other events. The radiation-induced effects on plants and animals within the 30-km Exclusion Zone around Chernobyl can be framed in three broad time periods relative to the accident: an intense exposure period during the first 30 d following the accident of 26 April 1986; a second phase that extended through the first year of exposure, during which time the short-lived radionuclides decayed and longer-lived radionuclides were transported to different components of the environment by physical, chemical and biological processes; and the third and continuing long-term phase of chronic exposure with dose rates <1% of the initial values. The accumulated doses and the observed effects on plants, soil invertebrates, terrestrial vertebrates and fish are summarized for each time period. Physiological and genetic effects on biota, as well as the indirect effects on wildlife of removing humans from the Chernobyl area, are placed in the context of what was known about radioecological effects prior to the accident.

82 citations


Journal ArticleDOI
TL;DR: The Chernobyl accident accounted for almost one-third of the cases of acute radiation sickness (ARS) reported worldwide; cases occurred among the plant employees and first responders but not among the evacuated populations or the general population.
Abstract: The Chernobyl accident accounted for almost one-third of the cases of acute radiation sickness (ARS) reported worldwide. Cases occurred among the plant employees and first responders but not among the evacuated populations or the general population. The diagnosis of ARS was initially considered for 237 persons based on symptoms of nausea, vomiting, and diarrhea. Ultimately, the diagnosis of ARS was confirmed in 134 persons. There were 28 short-term deaths, of which 95% occurred at whole-body doses in excess of 6.5 Gy. Underlying bone marrow failure was the main contributor to all deaths during the first 2 mo. Allogenic bone marrow transplantation was performed on 13 patients and an additional six received human fetal liver cells. All of these patients died except one individual who later was discovered to have recovered his own marrow and rejected the transplant. Two or three patients were felt to have died as a result of transplant complications. Skin doses exceeded bone marrow doses by a factor of 10-30, and at least 19 of the deaths were felt to be primarily due to infection from large-area beta burns. Internal contamination was of relatively minor importance in treatment. By the end of 2001, an additional 14 ARS survivors died from various causes. Long-term treatment has included therapy for beta burn fibrosis and skin atrophy as well as for cataracts.

74 citations


Journal ArticleDOI
TL;DR: It is concluded that the integrative properties of the synapses and neural networks of the central nervous system render cognitive function sensitive to the effects of physiologically weak electric fields, below the threshold for peripheral nerve stimulation.
Abstract: It is well understood that electric currents applied directly to the body can stimulate peripheral nerve and muscle tissue; such effects can be fatal if breathing is inhibited or ventricular fibrillation is induced. Exposure to extremely low frequency electric and magnetic fields will also induce electric fields and currents within the body, but these are almost always much lower than those that can stimulate peripheral nerve tissue. Guidance on exposure to such fields is based on the avoidance of acute effects in the central nervous system. This paper reviews the physiological processes involved in nerve cell excitability in the peripheral and central nervous system, and the experimental evidence for physiologically weak electric field effects. It is concluded that the integrative properties of the synapses and neural networks of the central nervous system render cognitive function sensitive to the effects of physiologically weak electric fields, below the threshold for peripheral nerve stimulation. However, the only direct evidence of these weak-field interactions within the central nervous system is the induction of phosphenes in humans by magnetic field exposure, that is, the perception of faint flickering light in the periphery of the visual field. Other tissues are potentially sensitive to induced electric fields through effects on voltage-gated ion channels, but the sensitivity of these ion channels is likely to be lower than those of nerve and muscle cells specialized for rapid electrical signaling. In addition, such tissues lack the integrative properties of synapses and neuronal networks that render the central nervous system potentially more vulnerable.

Journal ArticleDOI
TL;DR: A cohort of seventy-four 1991 Gulf War soldiers with known exposure to depleted uranium resulting from their involvement in friendly-fire incidents with DU munitions is being followed by the Baltimore Veterans Affairs Medical Center.
Abstract: A cohort of seventy-four 1991 Gulf War soldiers with known exposure to depleted uranium (DU) resulting from their involvement in friendly-fire incidents with DU munitions is being followed by the Baltimore Veterans Affairs Medical Center. Biennial medical surveillance visits designed to identify uranium-related changes in health have been conducted since 1993. On-going systemic exposure to DU in veterans with embedded metal fragments is indicated by elevated urine uranium (U) excretion at concentrations up to 1,000-fold higher than that seen in the normal population. Health outcome results from the subcohort of this group of veterans attending the 2005 surveillance visit were examined based on two measures of U exposure. As in previous years, current U exposure is measured by determining urine U concentration at the time of their surveillance visit. A cumulative measure of U exposure was also calculated based on each veteran's past urine U concentrations since first exposure in 1991. Using either exposure metric, results continued to show no evidence of clinically significant DU-related health effects. Urine concentrations of retinol binding protein (RBP), a biomarker of renal proximal tubule function, were not significantly different between the low vs. high U groups based on either the current or cumulative exposure metric. Continued evidence of a weak genotoxic effect from the on-going DU exposure, as measured at the HPRT (hypoxanthine-guanine phosphoribosyl transferase) locus and suggested by the fluorescent in-situ hybridization (FISH) results in peripheral blood, underscores the need for continued surveillance of this population.

Journal ArticleDOI
TL;DR: The analysis suggests that the current maximum permissible exposure limits could be beneficially revised to relax the IR limits over wavelength ranges where unusually high safety margins may unintentionally hinder applications of recently developed military and telecommunications laser systems.
Abstract: This report summarizes the results of a series of infrared (IR) laser-induced ocular damage studies conducted over the past decade. The studies examined retinal, lens, and corneal effects of laser exposures in the near-IR to far-IR transition region (wavelengths from 1.3-1.4 μm with exposure durations ranging from Q-switched to continuous wave). The corneal and retinal damage thresholds are tabulated for all pulsewidth regimes, and the wavelength dependence of the IR thresholds is discussed and contrasted to laser safety standard maximum permissible exposure limits. The analysis suggests that the current maximum permissible exposure limits could be beneficially revised to (1) relax the IR limits over wavelength ranges where unusually high safety margins may unintentionally hinder applications of recently developed military and telecommunications laser systems; (2) replace step-function discontinuities in the IR limits by continuously varying analytical functions of wavelength and pulsewidth which more closely follow the trends of the experimental retinal (for point-source laser exposures) and corneal ED50 threshold data; and (3) result in an overall simplification of the permissible exposure limits over the wavelength range from 1.2-2.6 μm. A specific proposal for amending the IR maximum permissible exposure limits over this wavelength range is presented.

Journal ArticleDOI
TL;DR: Results obtained indicate that THz exposure, in the explored electromagnetic conditions, is not able to induce either genotoxicity or alteration of cell cycle kinetics in human blood cells from healthy subjects.
Abstract: Emerging technologies are considering the possible use of Terahertz radiation in different fields ranging from telecommunications to biology and biomedicine. The study of the potential effects of Terahertz radiation on biological systems is therefore an important issue in order to safely develop a variety of applications. This paper describes a pilot study devoted to determine if Terahertz radiation could induce genotoxic effects in human peripheral blood leukocytes. For this purpose, human whole blood samples from healthy donors were exposed for 20 min to Terahertz radiation. Since, to our knowledge, this is the first study devoted to the evaluation of possible genotoxic effects of such radiation, different electromagnetic conditions were considered. In particular, the frequencies of 120 and 130 GHz were chosen: the first one was tested at a specific absorption rate (SAR) of 0.4 mW g-1, while the second one was tested at SAR levels of 0.24, 1.4, and 2 mW g-1. Chromosomal damage was evaluated by means of the cytokinesis block micronucleus technique, which also gives information on cell cycle kinetics. Moreover, human whole blood samples exposed to 130 GHz at SAR levels of 1.4 and 2 mW g-1 were also tested for primary DNA damage by applying the alkaline comet assay immediately after exposure. The results obtained indicate that THz exposure, in the explored electromagnetic conditions, is not able to induce either genotoxicity or alteration of cell cycle kinetics in human blood cells from healthy subjects.

Journal ArticleDOI
Victor K. Ivanov
TL;DR: The presented work summarizes estimated radiation risks among Chernobyl emergency workers of the Russian Federation between 1986 and 2003; no adjustments were made for recognized risk factors for cerebrovascular diseases, and all results should be considered preliminary.
Abstract: The presented work summarizes data on estimated radiation risks among Chernobyl emergency workers of the Russian Federation. In 1991-1998, the excess relative risk (ERR) of death from malignant neoplasm was statistically significant: excess relative risk per 1 Gy (ERR/Gy)=2.11 with 95% confidence interval (CI) (1.31-2.92). In 1991-2001, the ERR estimation for incident solid cancers gives a positive, but statistically insignificant value: ERR/Gy=0.34 with 95% CI (-0.39; 1.22). In 1986-2003, radiation risk for leukemia incidence was investigated. During the first 10 y after the Chernobyl accident (1986-1996) the relative risk (RR) of leukemia (excluding chronic lymphocytic leukemia) was statistically significant: RR=2.2 with 95% CI (1.3-3.8) for emergency workers with doses>0.15 Gy in comparison with less exposed workers. In 1986-2000, a statistically significant dose response was observed for incident cerebrovascular diseases: ERR/Gy=0.39, 95% CI=(0.004; 0.77). For doses>0.15 Gy a statistically significant risk of cerebrovascular diseases as a function of mean daily dose was observed: ERR per 0.1 Gy d(-1)=2.17 with 95% CI=(0.64; 3.69). Different but overlapping cohorts of Russian emergency workers were used for these estimations. No adjustments were made for recognized risk factors for cerebrovascular diseases. All results should be considered as preliminary.

Journal ArticleDOI
TL;DR: An important strength of this study is that the effects of gamma radiation, thoron, and radioactive dust, common exposures in other miner studies, can be ruled out because the source of radon was from water running through the mine.
Abstract: Radon is a well-recognized cause of lung cancer, and studies of underground miners have provided invaluable insights into the mechanisms of radon carcinogenesis. Given the dramatic decreases in occupational exposures and the latent interval between the time of exposure and the development of lung cancer, continued follow-up of these cohorts is needed to address uncertainties in risk estimates. Here, we report on the relationship between radon and lung cancer mortality in a cohort of 1,742 Newfoundland fluorspar miners between 1950 and 2001; follow-up has been extended 11 y from previous analyses. The standardized mortality ratio (SMR) was used to compare the mortality experience of the cohort to similarly aged Newfoundland males. Poisson regression methods were used to characterize the radon-lung cancer relationship with respect to: age at first exposure, attained age, time since last exposure, interactions with cigarette smoking, and exposure rate. In total, 191 lung cancers were observed among underground miners (SMR = 3.09; 95% CI = 2.66, 3.56). The ERR per WLM decreased with attained age and time since last exposure. An inverse dose-rate effect was observed, while age at first exposure was not associated with lung cancer risk. An important strength of this study is that the effects of gamma radiation, thoron, and radioactive dust, common exposures in other miner studies, can be ruled out because the source of radon was from water running through the mine. However, the results should be interpreted cautiously due to uncertainties associated with the estimation of radon exposure levels before ventilation was introduced into the mine, and the relatively small number of lung cancer deaths that precluded joint modeling of multiple risk factors.
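
The SMR reported above is the observed lung cancer count divided by the count expected from Newfoundland male reference rates. A minimal sketch of that calculation, assuming the expected count is back-calculated from the reported SMR of 3.09 (the abstract does not give it directly) and that an exact Poisson 95% confidence interval is acceptable:

```python
# Sketch of the SMR calculation described above; requires scipy.
# The expected count is back-calculated from SMR = observed / expected = 3.09,
# whereas the paper derives it from age-specific Newfoundland male rates.
from scipy.stats import chi2

observed = 191              # lung cancer deaths among underground miners
expected = observed / 3.09  # illustrative expected count (~61.8)

smr = observed / expected
lo = 0.5 * chi2.ppf(0.025, 2 * observed) / expected           # exact Poisson 95% CI
hi = 0.5 * chi2.ppf(0.975, 2 * (observed + 1)) / expected
print(f"SMR = {smr:.2f} (95% CI {lo:.2f}-{hi:.2f})")          # ~3.09 (2.67-3.56)
```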

Journal ArticleDOI
TL;DR: The investigation showed that the total number of all medical x-ray examinations performed by GPs registered a 1% decrease between 1998 and 2003, and that the sensitivities of the film-screen combinations registered a shift towards higher values, leading to a reduction of the dose delivered by a GP of the order of 20%.
Abstract: A nationwide investigation was conducted in Switzerland to establish the exposure of the population from medical x rays and update the results of the 1998 survey. Both the frequency and the dose variations were studied in order to determine the change in the collective dose. The frequency study addressed 206 general practitioners (GPs), 30 hospitals, and 10 private radiology institutes. Except for the latter, the response rate was very satisfactory. The dose study relied on the assessment of the speed class of the screen-film combinations used by the GPs as well as the results of two separate studies dedicated to fluoroscopy and CT. The investigation showed that the total number of all medical x-ray examinations performed by GPs registered a 1% decrease between 1998 and 2003, and that the sensitivities of the film-screen combinations registered a shift towards higher values, leading to a reduction of the order of 20% in the dose delivered by GPs. The study also indicated that the total number of all x-ray examinations performed in hospitals increased by 4%, with a slight (1%) increase in radiography, a significant (39%) decrease in examinations involving fluoroscopy, and a 70% increase in CT examinations. Concerning the doses, the investigation of a selection of examinations involving fluoroscopy showed a significant increase of the kerma-area product (KAP) per procedure. For CT, the study showed an increase of the dose-length product (DLP) per procedure for skull and abdomen examinations, and a decrease for chest examinations. Both changes in the frequency and the effective dose per examination led to a 20% increase in the total collective dose.
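
The collective dose tracked by this survey is the sum, over examination types, of examination frequency multiplied by the mean effective dose per examination. The sketch below illustrates that bookkeeping with purely hypothetical frequencies and doses (none of the numbers are taken from the survey):

```python
# Hypothetical illustration of collective dose S = sum_i (frequency_i x effective_dose_i).
# Values are invented for the example; only the structure mirrors the survey's approach.

exams_1998 = {"radiography": (5.0e6, 0.1), "fluoroscopy": (4.0e5, 5.0), "CT": (3.0e5, 8.0)}
exams_2003 = {"radiography": (5.05e6, 0.1), "fluoroscopy": (2.4e5, 7.0), "CT": (5.1e5, 8.5)}
# (examinations per year, mean effective dose per examination in mSv)

def collective_dose_man_sv(exams):
    return sum(n * dose for n, dose in exams.values()) / 1e3   # mSv -> man Sv

s_1998 = collective_dose_man_sv(exams_1998)
s_2003 = collective_dose_man_sv(exams_2003)
print(f"collective dose change: {100 * (s_2003 / s_1998 - 1):+.0f}%")
```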

Journal ArticleDOI
TL;DR: A case-control study nested in the cohort of French uranium miners took smoking information into account in investigating the effect of radon exposure on lung cancer risk, showing a relative risk of lung cancer related to smoking similar to that estimated from previous miners' cohorts.
Abstract: A case-control study nested in the cohort of French uranium miners took smoking information into account in investigating the effect of radon exposure on lung cancer risk. This study included 100 miners who died of lung cancer and 500 controls matched for birth period and attained age. Data about radon exposure came from the cohort study, and smoking information was retrospectively determined from a questionnaire and occupational medical records. Smoking status (never vs. ever) was reconstructed for 62 cases and 320 controls. Statistical analyses used conditional logistic regression. The effect of radon exposure on lung cancer risk was assessed with a linear excess relative risk model, and smoking was considered as a multiplicative factor. Mean cumulative radon exposures were 114.75 and 70.84 Working Level Months (WLM) among exposed cases and controls, respectively. The crude excess risk of lung cancer per 100 WLM was 0.98 (95% CI: 0.18-3.08%). When adjusted for smoking, the excess risk was 0.85 per 100 WLM (95% CI: 0.12-2.79%), which is still statistically significant. The relative risk related to smoking was equal to 3.04 (95% CI: 1.20-7.70). This analysis shows a relative risk of lung cancer related to smoking similar to that estimated from previous miners' cohorts. After adjustment for smoking, the effect of radon exposure on lung cancer risk persists, and its estimated risk coefficient is close to that found in the French cohort without smoking information.
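
The fitted model described above is a linear excess relative risk (ERR) in cumulative exposure with smoking acting as a multiplicative factor. A hedged sketch using the adjusted coefficient of 0.85 per 100 WLM and the smoking relative risk of 3.04 from the abstract (the paper's exact parameterization and baseline handling may differ):

```python
# Illustrative linear-ERR model with multiplicative smoking, as described above.
# Coefficient values are taken from the abstract; the functional form is a sketch.

BETA_PER_WLM = 0.85 / 100   # smoking-adjusted excess relative risk per WLM
RR_SMOKING = 3.04           # ever- vs. never-smokers

def lung_cancer_rr(cumulative_wlm: float, ever_smoker: bool) -> float:
    """Relative risk vs. an unexposed never-smoker."""
    radon_term = 1.0 + BETA_PER_WLM * cumulative_wlm
    return (RR_SMOKING if ever_smoker else 1.0) * radon_term

print(lung_cancer_rr(114.75, ever_smoker=False))  # ~1.98 at the mean exposure of cases
print(lung_cancer_rr(114.75, ever_smoker=True))   # ~6.0
```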

Journal ArticleDOI
TL;DR: The results of the present study show that 210Po concentrations ranged from 6.84 to 17.49 mBq per cigarette, which indicates that smokers who smoke 20 cigarettes per day inhale, on average, 79.53 ± 28.65 mBq d−1 of 210Po and 210Pb each.
Abstract: 210Po and its precursor 210Pb in cigarette smoke contribute a significant radiation dose to the lungs of smokers. In this work, the concentration of 210Po was determined in 17 of the most frequently smoked cigarette brands in Italy. Samples of tobacco, fresh filters, ash, and post-smoking filters were analyzed; 210Po was determined by alpha spectrometry after its spontaneous deposition on a silver disk. To verify the radioactive equilibrium between 210Po and 210Pb, lead was determined in one tobacco sample by counting the beta activity of its decay product 210Bi with a gas flow proportional detector after separation. The results of the present study show that 210Po concentrations ranged from 6.84 to 17.49 mBq per cigarette. Based on these results, smokers who smoke 20 cigarettes per day inhale, on average, 79.53 ± 28.65 mBq d−1 of 210Po and 210Pb each. The mean value of the annual committed effective dose for Italian smokers, calculated by applying the dose conversion factor for adults of 4.3 μSv Bq−1 for 210Po and 5.6 μSv Bq−1 for 210Pb, was estimated to be 124.8 and 162.6 μSv y−1 for 210Po and 210Pb, respectively. The lung dose from inhalation of cigarette smoke is much higher than the lung dose from inhalation of atmospheric 210Po and 210Pb.
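
The annual committed effective doses quoted above follow from multiplying the daily intake by the number of days per year and by the inhalation dose coefficient for each nuclide. A short sketch reproducing that arithmetic, assuming 365 smoking days per year:

```python
# Reproduces the committed effective dose arithmetic reported above.
# Intake (79.53 mBq/d of each nuclide for 20 cigarettes/day) and dose coefficients
# (4.3 and 5.6 uSv/Bq) are taken from the abstract; 365 smoking days/y is assumed.

daily_intake_bq = 79.53e-3
days_per_year = 365
dose_coeff_usv_per_bq = {"Po-210": 4.3, "Pb-210": 5.6}

annual_intake_bq = daily_intake_bq * days_per_year            # ~29 Bq/y of each nuclide
for nuclide, coeff in dose_coeff_usv_per_bq.items():
    print(f"{nuclide}: {annual_intake_bq * coeff:.1f} uSv/y")  # ~124.8 and ~162.6 uSv/y
```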

Journal ArticleDOI
TL;DR: The phenomenon, mechanism, power requirement, pressure amplitude, and auditory thresholds of microwave hearing are discussed in this paper and a specific emphasis is placed on human exposures to wireless communication fields and magnetic resonance imaging coils.
Abstract: The hearing of microwave pulses is a unique exception to the airborne or bone-conducted sound energy normally encountered in human auditory perception. The hearing apparatus commonly responds to airborne or bone-conducted acoustic or sound pressure waves in the audible frequency range. But the hearing of microwave pulses involves electromagnetic waves whose frequency ranges from hundreds of MHz to tens of GHz. Since electromagnetic waves (e.g., light) are seen but not heard, the report of auditory perception of microwave pulses was at once astonishing and intriguing. Moreover, it stood in sharp contrast to the responses associated with continuous-wave microwave radiation. Experimental and theoretical studies have shown that the microwave auditory phenomenon does not arise from an interaction of microwave pulses directly with the auditory nerves or neurons along the auditory neurophysiological pathways of the central nervous system. Instead, the microwave pulse, upon absorption by soft tissues in the head, launches a thermoelastic wave of acoustic pressure that travels by bone conduction to the inner ear. There, it activates the cochlear receptors via the same process involved for normal hearing. Aside from tissue heating, microwave auditory effect is the most widely accepted biological effect of microwave radiation with a known mechanism of interaction: the thermoelastic theory. The phenomenon, mechanism, power requirement, pressure amplitude, and auditory thresholds of microwave hearing are discussed in this paper. A specific emphasis is placed on human exposures to wireless communication fields and magnetic resonance imaging (MRI) coils.

Journal ArticleDOI
TL;DR: The in situ electric field is the more stable dosimetric quantity with respect to changes of the tissue conductivity of the Visible Human body model and may exceed the ICNIRP basic restriction for CNS tissue at least in a worst-case scenario.
Abstract: In situ electric fields and current densities are investigated by numerical simulations for exposure to ELF electric and magnetic fields. Computations are based on the finite-difference time-domain method (FDTD). The computational uncertainty is determined by comparison of analytical and numerical results and amounts to a worst-case expanded uncertainty (95% confidence interval) of ±9.89 dB for both dosimetric quantities (E, J). Detailed investigations based on the Visible Human body model with a resolution of 2 mm show a strong influence of the tissue boundaries on the simulation results, which is caused by the numerical method. For the tissue-specific in situ electric field and current density, changes in excess of 10 dB are observed when comparing the results with and without evaluation of the dosimetric quantities at tissue boundaries. Moderate sensitivities with respect to tissue boundaries are observed only for low-conductivity tissues when evaluating the in situ electric field, whereas this behavior is observed for high-conductivity tissues when evaluating the current density. For exposure to a 50 Hz magnetic field corresponding to the ICNIRP reference level, the simulated current density for central nervous system (CNS) tissue is in compliance with the ICNIRP guidelines. Exposure to a 50 Hz electric field may exceed the ICNIRP basic restriction for CNS tissue at least in a worst-case scenario (grounded human body, vertical electric field, tissue boundaries included for the evaluation of the current density). The in situ electric field is the more stable dosimetric quantity with respect to changes of the tissue conductivity of the Visible Human body model. The maximum conductivity sensitivity coefficient amounts to +122% for the current density, whereas the maximum sensitivity coefficient for the in situ electric field is −20%. For electric field exposure, the in situ electric field remains comparable (−6% to −4%), while the averaged current density change ranges from −57% to −16% for the tissues under investigation. Magnetic field exposure of a scaled model of a five-year-old child leads to a decrease of the dosimetric quantities (J: −74% to −45%, E: −42% to −23%) compared to the Visible Human results.

Journal ArticleDOI
TL;DR: The concentrations of all radionuclides examined at Amchitka are similar to those of other uncontaminated Northern Hemisphere sites, and are lower than those reported for fishes and birds from the Irish Sea in the vicinity of the Sellafield nuclear reprocessing facility, an area with known contamination.
Abstract: Amchitka Island (51° N lat, 179° E long) was the site of three underground nuclear tests from 1965 to 1971. There have been no substantive studies of radionuclides in marine fishes and birds in the area since the mid-1970s. In this study, levels of 60Co, 152Eu, 90Sr, 99Tc, 129I, 137Cs, and the actinides (241Am, 238Pu, 239,240Pu, 234U, 235U, 236U, and 238U) were studied in ten marine fish species (including Pacific Cod Gadus macrocephalus and Pacific Halibut Hippoglossus stenolepis) and five marine bird species (including Glaucous-winged Gulls Larus glaucescens, Tufted Puffins Fratercula cirrhata, and Common Eider Ducks Somateria mollissima) from Amchitka. The same species were collected at a reference site, Kiska Island (52° N lat, 177° E long), about 130 km west of Amchitka. Each sample was a composite of edible muscle from five or more individual fish or birds of similar size (±15%) from the same sampling station. The null hypotheses of no differences among species or between Amchitka and Kiska were tested. Most analytic results were below the minimum detectable activity (MDA), even when 1,000-g sample sizes and 72 h counting times were used. The only radionuclides detected above the MDA were 137Cs, 241Am, 239,240Pu, 234U, 235U, and 238U. There were significant differences in 137Cs as a function of species, but not location, for top predatory fishes. Of the fishes, eight of ten species had 137Cs values above the MDA for some samples; only one bird, Glaucous-winged Gull, had 137Cs values above the MDA. The highest concentrations of 137Cs were in Dolly Varden (Salvelinus malma, 0.780 Bq kg−1 wet weight) and Pacific Cod (0.602 Bq kg−1). In aggregate for the actinides, 73 of 234 (31%) fish composites were above the MDA, compared to only 3 of 98 (3%) for birds. 234U and 238U, radionuclides that are primarily natural in origin, were routinely detected in these biological samples, but there were no significant differences in mean concentrations between Amchitka and Kiska. The concentrations of all radionuclides examined at Amchitka are similar to those of other uncontaminated Northern Hemisphere sites, and are lower than those reported for fishes and birds from the Irish Sea in the vicinity of the Sellafield nuclear reprocessing facility, an area with known contamination.

Journal ArticleDOI
TL;DR: Insight is provided into the uncertainty of residential radon gas concentrations that can be incorporated into the sensitivity analyses for the risk estimates of both the North American and global pooling of residential radon studies to improve risk estimates.
Abstract: It is well known that inhalation of 222Rn and 222Rn decay products increases the risk of lung cancer. While the occurrences of high radon areas in the United States are generally known, studies examining the temporal yearly radon variation in homes across different regions are lacking. This information is essential to assess the ability of a year-long radon measurement to predict the future radon concentration in a home or reconstruct the retrospective residential radon concentration. The purpose of this study is to help fill this gap by examining the temporal variation of residential radon concentrations in homes over several years as well as to explore factors that affect the yearly temporal variability of residential radon concentrations. The coefficient of variation was used as a measure of relative variation between multiple measurements performed across homes over several years. Generalized linear model analyses were applied to investigate factors affecting the coefficient of variation. The median coefficient of variation between the first and second test period was 12%, while a median coefficient of variation of 19% was found between the first and third test period. Factors impacting the coefficients of variation were found to vary for different types of homes and by floors of a home. This study provides important insights into the uncertainty of residential radon gas concentrations that can be incorporated into the sensitivity analyses for the risk estimates of both the North American and global pooling of residential radon studies to improve risk estimates.
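
The coefficient of variation used above is the standard deviation of a home's repeated year-long measurements divided by their mean. A minimal sketch with hypothetical radon results (the study reports median CVs of 12% and 19% across homes):

```python
# Coefficient of variation between repeated year-long radon measurements in one home.
# The two measurement values below are hypothetical.
import statistics

def coefficient_of_variation(measurements):
    """CV = sample standard deviation / mean, expressed as a percentage."""
    return 100.0 * statistics.stdev(measurements) / statistics.mean(measurements)

year_1, year_2 = 148.0, 122.0   # Bq m^-3, first and second test periods (invented)
print(f"CV = {coefficient_of_variation([year_1, year_2]):.0f}%")   # ~14% for this pair
```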

Journal ArticleDOI
TL;DR: It is shown that a significant uptake of a uranyl nitrate solution through intact skin can occur within the first 6 h of exposure, and that the biokinetics of a given physicochemical form of uranium incorporated after wound contamination depend largely on the physiological evolution of the considered wound.
Abstract: Uranium uptake can occur accidentally by inhalation, ingestion, injection, or absorption through intact or wounded skin. Intact or wounded skin routes of absorption of uranium have received little attention. The aims of our work were (1) to evaluate the influence of the type of wound contam

Journal ArticleDOI
TL;DR: Results indicate that, once the soil pore volume becomes saturated to values above ∼20%, the diffusion of radon is markedly hampered, and values determined in the field were systematically lower than those assessed in the laboratory, illustrating the key role of structural differences between undisturbed and repacked soil.
Abstract: The diffusion of radon through soil is strongly affected by the degree of water saturation of the soil pores. In the present work, a laboratory technique for studying radon diffusion has been developed and applied to determine diffusion coefficients in a sandy loam containing various amounts of water, from none to saturation. The results indicate that, once the soil pore volume becomes saturated to values above approximately 20%, the diffusion of radon is markedly hampered; the bulk diffusion coefficient drops from 1.2 × 10−6 to 2 × 10−9 m2 s−1 as soil saturation increases from 20 to 90%. The effect of soil moisture was further evaluated in field experiments conducted on soil of the same matrix. Comparison between results obtained by the two methods showed that laboratory studies may provide a good indication of radon diffusion coefficients to be expected in situ. However, values determined in the field were systematically lower than those assessed in the laboratory, illustrating the key role of structural differences between undisturbed and repacked soil.
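
One way to see the practical meaning of this drop is through the radon diffusion length, L = sqrt(D/lambda), with lambda the 222Rn decay constant; this standard radon-transport relation is not used in the abstract itself, so the sketch below is only an added illustration applied to the two reported coefficients:

```python
# Radon diffusion length L = sqrt(D / lambda) for the two bulk diffusion coefficients
# reported above; the formula is a standard transport relation, not from the paper.
import math

HALF_LIFE_RN222_S = 3.82 * 86400                   # 222Rn half-life, ~3.82 d
decay_const = math.log(2) / HALF_LIFE_RN222_S      # ~2.1e-6 s^-1

for label, d in [("20% saturation", 1.2e-6), ("90% saturation", 2e-9)]:  # m^2 s^-1
    print(f"{label}: diffusion length ~{math.sqrt(d / decay_const):.2f} m")
# ~0.76 m at 20% saturation vs. ~0.03 m at 90% saturation
```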

Journal ArticleDOI
TL;DR: This analysis demonstrates that if radiation standards protect humans, then biota are also adequately protected against ionizing radiation.
Abstract: The distribution and migration of radionuclides released into the environment following the Chernobyl accident in 1986 are described. Because of the predominantly rural diet in the affected region, consumption of farm products containing radionuclides became a source of irradiation of the population. The economic and radiologic importance of countermeasures for reducing the impacts of the accident is described. A basic radioecological problem is described: the area where direct radiation contamination of biota was observed is considerably smaller than the zone where radionuclide concentrations in the food chain exceeded the permissible standards. The radiation-induced effects in biota in the affected area are described. In the long-term post-accident period, the radionuclide distribution between components of ecosystems (including humans) and the resulting doses are considered in comparison to a technologically normal situation of nuclear power plant operation. This analysis demonstrates that if radiation standards protect humans, then biota are also adequately protected against ionizing radiation.

Journal ArticleDOI
TL;DR: The cloud from the reactor spread many different radionuclides, particularly those of iodine (131I) and cesium (134Cs and 137Cs), over the majority of European countries, but the greatest contamination occurred over vast areas of Belarus, the Russian Federation and Ukraine.
Abstract: The explosions at the Chernobyl Nuclear Power Plant (CNPP) in Ukraine early in the morning of 26 April 1986 led to a considerable release of radioactive materials over a period of 10 d. The cloud from the reactor spread many different radionuclides, particularly those of iodine (131I) and cesium (134Cs and 137Cs), over the majority of European countries, but the greatest contamination occurred over vast areas of Belarus, the Russian Federation and Ukraine. As the major health effect of Chernobyl is an elevated thyroid cancer incidence in children and adolescents, much attention has been paid to the thyroid doses resulting from intakes of 131I, which were delivered within 2 mo following the accident. The thyroid doses received by the inhabitants of the contaminated areas of Belarus, Russia, and Ukraine varied over a wide range, mainly according to age, level of ground contamination, milk consumption rate, and origin of the milk that was consumed. Reported individual thyroid doses varied up to approximately 40,000 mGy, with average doses of a few to 1,000 mGy, depending on the area where people were exposed. In addition, the presence in the environment of long-lived 134Cs and 137Cs has led to a relatively homogeneous exposure of all organs and tissues of the body via external and internal irradiation, albeit at low rates. Excluding the thyroid doses, the whole-body (or effective) dose estimates for the general population accumulated during 20 y after the accident (1986-2005) range from a few millisieverts (mSv) to several hundred mSv, with an average dose of approximately 10 mSv in the contaminated areas of Belarus, Russia, and Ukraine. In other European countries, both the thyroid and the effective doses are, on average, much smaller.

Journal ArticleDOI
TL;DR: Activities of 210Pb measured from wells two years apart clearly demonstrated the continuous flux of groundwater within aquifers; from a public health perspective this is a concern, since the drinking water screening levels are 555 mBq L−1 for gross alpha and 1,850 mBq L−1 for gross beta.
Abstract: Groundwater wells from across the State of California were sampled and analyzed for 210Pb and 210Po. The separation method involved Fe(OH)3 precipitation from a 5-L groundwater sample followed by electrodeposition of 210Po on a nickel disk. The resulting solution was passed through an ion-exchange resin column for the isolation of 210Pb. De-ionized water spiked with these radionuclide standards at concentrations from 4.92 mBq L−1 to 755 mBq L−1 showed excellent accuracy and precision for the method. In the groundwater wells, the overall activity of 210Pb ranged from 3.7 mBq L−1 to 1,481 mBq L−1 and the 210Po activity ranged from 0.25 mBq L−1 to 555 mBq L−1. Of the select wells tested, 27% for 210Pb and 19% for 210Po were above the proposed maximum contamination limits for these radionuclides, which are set at 37 mBq L−1 and 26 mBq L−1, respectively. From a public health perspective this is a concern, since the drinking water screening levels for gross alpha and gross beta are set at 555 mBq L−1 and 1,850 mBq L−1, respectively. At such high screening levels 210Pb and 210Po will not be captured, and this situation was found in several of the wells studied. The occurrences of 210Pb and 210Po are not correlated within the sources; however, the polonium concentrations were always lower than the lead concentrations. Activities of 210Pb measured from wells two years apart clearly demonstrated the continuous flux of groundwater within aquifers.
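
The screening gap described above can be illustrated by comparing each nuclide's activity with its proposed maximum contaminant limit and with the gross screening level for its decay type. The well activities in the sketch are hypothetical; only the limit values come from the abstract:

```python
# Illustrates how a well can exceed the proposed 210Pb/210Po limits while staying
# below the gross alpha/beta screening levels. Well data are invented.

PROPOSED_MCL = {"Pb-210": 37.0, "Po-210": 26.0}     # mBq/L, from the abstract
GROSS_SCREEN = {"Pb-210": 1850.0, "Po-210": 555.0}  # gross beta / gross alpha, mBq/L

wells = {                                           # hypothetical results, mBq/L
    "well A": {"Pb-210": 120.0, "Po-210": 45.0},
    "well B": {"Pb-210": 15.0, "Po-210": 5.0},
}

for name, activities in wells.items():
    for nuclide, activity in activities.items():
        if PROPOSED_MCL[nuclide] < activity < GROSS_SCREEN[nuclide]:
            print(f"{name}: {nuclide} = {activity} mBq/L exceeds the proposed MCL "
                  f"but would not be flagged by gross screening")
```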

Journal ArticleDOI
TL;DR: The Department of Energy’s Russian Health Studies Program has continued to generate excitement and enthusiasm throughout its 22-year quest to assess worker and public health risks from radiation exposure resulting from nuclear weapons production activities in the former Soviet Union.
Abstract: Recognized for conducting cutting edge science in the field of radiation health effects research, the Department of Energy’s (DOE) Russian Health Studies Program has continued to generate excitement and enthusiasm throughout its 22-year quest to assess worker and public health risks from radiation exposure resulting from nuclear weapons production activities in the former Soviet Union. The three goals of the program are to:

Journal ArticleDOI
TL;DR: A comparative study of electron paramagnetic resonance dosimetry in Q- and X-bands has shown that Q-band is able to provide accurate measurements of radiation doses even below 0.5 Gy with tooth enamel samples as small as 2 mg with full resolution of the radiation-induced EPR signal from the native, background signal.
Abstract: A comparative study of electron paramagnetic resonance dosimetry in Q- and X-bands has shown that Q-band is able to provide accurate measurements of radiation doses even below 0.5 Gy with tooth enamel samples as small as 2 mg. The optimal amount of tooth enamel for dose measurements in Q-band was found to be 4 mg. This is less than 1% of the total amount of tooth enamel in one molar tooth. Such a small amount of tooth enamel can be harmlessly obtained in an emergency requiring after-the-fact radiation dose measurement. The other important advantage of Q-band is full resolution of the radiation-induced EPR signal from the native, background signal. This separation makes dose response measurements much easier in comparison to conventional X-band measurements, in which these overlapping signals necessitate special methods for doses below 0.5 Gy. The main disadvantages of Q-band measurements are a higher level of noise and lower spectral reproducibility than in X-band. The effect of these negative factors on the precision of dose measurements in Q-band could probably be reduced by improvement of sample fixation in the resonance cavity and better optimization of signal filtration to reduce high-frequency noise.