
Showing papers in "Journal of Radiological Protection in 1996"


Journal ArticleDOI
TL;DR: The new respiratory tract model, which applies explicitly to workers and to all members of the general public inhaling gases, vapour or particles, permits the evaluation of dose per unit intake or exposure as well as the interpretation of bioassay data.
Abstract: In 1984 the ICRP appointed a Task Group of its Committee 2, led by Dr W J Bair, to review the respiratory tract model which had been used for dosimetric purposes in ICRP Publication 30 and to propose an updated model. The new model was to reflect the greatly increased knowledge available concerning lung physiology, particle clearance and the biological effects of inhaled radioactive particles, as well as the increased needs of present day radiation protection. The Task Group set out to provide a respiratory tract model which would meet the following five criteria: 1. permit dose calculations for workers, and for individual members of populations of all ethnic groups; 2. be useful for predictive dose assessments, as well as for setting limits on intake; 3. take account of the influences of smoking, air pollutants and respiratory tract diseases; 4. permit estimates of dose to the respiratory tract from bioassay data; and 5. be equally applicable to radioactive gases and vapours as well as to particles. The 480 pages of the volume under review indicate the success of this massive task. In brief, the new model, which applies explicitly to workers and to all members of the general public inhaling gases, vapours or particles, permits the evaluation of dose per unit intake or exposure as well as the interpretation of bioassay data. There is a fundamental difference in approach from the old model, which computed only the average dose to the whole lung. The new model considers the respiratory tract as five regions, each of which is assumed to have a different radiosensitivity: two extrathoracic regions, the anterior nose (ET1) and the posterior nasal passages, larynx, pharynx and mouth (ET2); the bronchial region (BB); the bronchiolar region (bb); and the alveolar - interstitial region (AI). These regions differ widely in the radiation doses they may receive, and the model computes specific tissue doses.
The model readily permits the insertion of subject-specific data, such as age, activity levels, smoking habits and health status. The main features of the model and of the underlying physiology and radiobiology are presented in the first 100 or so pages of the book, the rest being taken up by eight Annexes, each authored by groups of Task Group members, which contain all the detailed arguments and information on which the report is based, together with large tabulations of valuable physiological and physical data. Specific annexes deal with anatomy and morphology, respiratory physiology, radiation effects on the respiratory tract, deposition of inhaled particles, particle clearance, reference values for regional deposition, specific absorbed fractions of photon energy, and absorbed fractions for alpha, beta and electron radiations. This is a detailed and comprehensive model of the human respiratory tract and a very complex one in relation to its predecessor. However, it is versatile and appears to meet all the criteria required of it. For some, its complexity may appear rather daunting, and it appears to require major computer programs, such as LUDEP developed by NRPB, to enable it to be used. However, this impression is misleading: if the model is clearly understood, it is still possible to make simple dose evaluations without the aid of a computer. This volume provides a wealth of information on the human respiratory tract and its physiology, and its appeal should spread far beyond the radiation protection community. It is a valuable working manual as well as a reference book; it is, therefore, a pity that the publishers have not chosen to offer it in a durable hard bound format.

463 citations


Journal ArticleDOI
TL;DR: When using weighting factors derived for children and adolescents, taking into account both the risk of somatic and hereditary effects, as well as severity, the effective dose seldom differed by more than 10% from that calculated with the ICRP weighting factors.
Abstract: Various sets of tissue weighting factors have been derived for the calculation of the effective dose to children (age groups 0 - 9 and 10 - 19 years) in order to assess the detriment caused by radiation, e.g. in diagnostic radiology. The risk estimates for children have been taken from the recent UNSCEAR 1994 Report. The paper presents four sets of weighting factors, two including and two not including the risk of hereditary effects. The effective dose is calculated for some investigations common in paediatric diagnostic radiology. Although the derived weighting factors for children and adolescents differ from those for the whole population, the effective dose value was not dramatically different. When using weighting factors derived for children and adolescents, taking into account both the risk of somatic and hereditary effects, as well as severity, the effective dose seldom differed by more than 10% from that calculated with the ICRP weighting factors. Thus, the ICRP weighting factors could also be used for children and adolescents. Age-dependent risk coefficients should, however, be used.
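The comparison turns on the standard effective-dose formula E = Σ_T w_T H_T, summed over tissues T. A minimal sketch of that calculation, using purely illustrative organ doses and weighting factors (none of these numbers are from the paper):

```python
def effective_dose(organ_doses, weights):
    """Effective dose E = sum over tissues T of w_T * H_T (mSv)."""
    return sum(weights[t] * h for t, h in organ_doses.items())

# Hypothetical equivalent doses (mSv) from a paediatric examination.
organ_doses = {"lung": 0.8, "stomach": 0.5, "gonads": 0.2, "thyroid": 0.4}

# Two weighting sets over the same tissues (illustrative values only,
# not the ICRP factors or the paper's derived child/adolescent factors).
icrp_like = {"lung": 0.35, "stomach": 0.35, "gonads": 0.20, "thyroid": 0.10}
child_like = {"lung": 0.32, "stomach": 0.36, "gonads": 0.22, "thyroid": 0.10}

e1 = effective_dose(organ_doses, icrp_like)
e2 = effective_dose(organ_doses, child_like)
rel_diff = abs(e1 - e2) / e1
print(f"E = {e1:.3f} mSv vs {e2:.3f} mSv; relative difference = {rel_diff:.1%}")
```

With weighting sets this close, the relative difference in E stays well under 10%, which is the shape of the paper's conclusion.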

40 citations


Journal ArticleDOI
TL;DR: Generalisations of the two-mutation carcinogenesis model of Moolgavkar, Venzon and Knudson, and of the model of Armitage and Doll, are fitted to the Japanese atomic bomb survivor mortality data, finding that without some extra stochastic `stage' appended the two-mutation model is perhaps not well able to describe the pattern of excess risk for solid cancers that is often seen after exposure to radiation.
Abstract: Generalisations of the two-mutation carcinogenesis model of Moolgavkar, Venzon and Knudson (to allow for an arbitrary number of mutational stages), and of the model of Armitage and Doll, are fitted to the Japanese atomic bomb survivor mortality data. Models with two or three mutations give adequate descriptions of the excess mortality of solid cancers. For leukaemia the fit of the three-mutation model is preferable to that of the two-mutation model. The optimal three-mutation leukaemia model provides a satisfactory fit only when both first and second mutation rates are radiation-affected. Examination of other epidemiological data leads to the conclusion that without some extra stochastic `stage' appended (such as might be provided by consideration of the process of development of a malignant clone from a single malignant cell) the two-mutation model is perhaps not well able to describe the pattern of excess risk for solid cancers that is often seen after exposure to radiation. The optimal three-mutation models predict low-dose population risks for a current UK population of excess cancer deaths, radiation-induced cancer deaths or 1.0 - 1.4 years of life lost. Risks for a current Japanese population are excess cancer deaths, radiation-induced cancer deaths, or 1.2 years of life lost.
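The multi-stage structure being fitted can be illustrated with the classical Armitage - Doll approximation, in which the hazard for a k-stage model scales as the product of the mutation rates times t^(k-1). A sketch under that approximation, with entirely illustrative rates and an arbitrary radiation multiplier applied to the first two stages, echoing the optimal three-mutation leukaemia model described above (the paper fits far richer generalisations to the survivor data):

```python
from math import factorial

def armitage_doll_hazard(t, mutation_rates):
    """Approximate Armitage - Doll hazard at age t for a k-stage model:
    h(t) ~ (prod_i mu_i) * t**(k-1) / (k-1)!."""
    k = len(mutation_rates)
    prod = 1.0
    for mu in mutation_rates:
        prod *= mu
    return prod * t ** (k - 1) / factorial(k - 1)

# Illustrative baseline mutation rates (per year) for three stages.
baseline = [1e-5, 1e-5, 1e-5]

# Radiation raises the first two rates; the factor 1.5 is purely illustrative.
exposed = [1.5 * baseline[0], 1.5 * baseline[1], baseline[2]]

t = 60.0  # age in years
excess = armitage_doll_hazard(t, exposed) - armitage_doll_hazard(t, baseline)
print(f"baseline hazard at {t:.0f} y: {armitage_doll_hazard(t, baseline):.3e}")
print(f"excess hazard from exposure: {excess:.3e}")
```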

37 citations


Journal ArticleDOI
TL;DR: The subject matter of this NCRP report is highly relevant since national legislation in the majority of countries throughout the world currently specifies key dose limits in terms of effective dose equivalent and will soon change to the use of effective dose (E).
Abstract: The subject matter of this NCRP report is highly relevant since national legislation in the majority of countries throughout the world currently specifies key dose limits in terms of effective dose equivalent and will soon change to the use of effective dose (E). Moreover, many in the field of personal radiation dosimetry are actively seeking advice on this subject. NCRP is an American body, so it is not surprising to find that the report is biased towards the situation in the USA. Nevertheless, because of the international nature of radiation protection, much of the content and certainly the conclusions reached are highly relevant to personal monitoring in other countries. The introduction to the report deals with the definitions of effective dose equivalent and E, and the differences are carefully explained. These particularly concern the list of organs, the way the remainder organs are treated, the weighting of absorbed dose and the tissue weighting factors. The biological effects and the radiation risks to which each quantity refers are also carefully pointed out. Because of these differences it is clearly important that there is consistency in the use of the two quantities. This is dealt with in the final section of the introduction, which emphasises that effective dose equivalent is meant for use with the radiation protection system in ICRP Publication 26, and E with that recommended in ICRP Publication 60. The second chapter deals with the use of personal monitors for workers in the USA. One of the disadvantages of both effective dose equivalent and E is that neither is measurable, and the use of the ICRU operational quantity, personal dose equivalent, to provide an estimate of either is explained and accepted as a satisfactory solution to the problem. Two major US accreditation programmes for personal dosimetry services are described, the National Voluntary Laboratory Accreditation Program (NVLAP) and the Department of Energy Laboratory Accreditation Program (DOELAP).
Both programmes are highly developed and the details provided offer a great deal of guidance to other countries wishing to set up their own accreditation systems. Of particular help is the section on the calibration of personal dosemeters; the discussions on the choice of phantom and its use are particularly valuable. The chapter concludes with a discussion of the number and types of personal dosemeter worn by workers in different work categories in the USA. Clearly personal monitoring is treated very seriously and significant effort is devoted to deciding upon and operating the most appropriate system. Sometimes a single passive dosemeter is worn; in other cases this may be supplemented by a direct reading device, but occasionally a number of dosemeters may be worn. For example, in a steam generator in a nuclear power plant up to 14 dosemeters may be placed at various positions on the body, including on top of the head. One wonders whether this is really necessary! Chapter 3 deals with the question of estimating effective dose equivalent or E in practice. In the early part of the chapter the authors discuss data published by ICRU in Report 43. These data show that if the dosemeter is worn centrally on the chest a satisfactory estimate is obtained for a number of radiation geometries, including lateral, isotropic and planar isotropic. This is not so in the extreme conditions where the dosemeter is worn on the front of the body to measure radiation received from behind, and vice versa. Consideration is given to the use of two dosemeters in conjunction with an algorithm to determine the dose received. This approach is particularly valuable where little is known about the geometry of the radiation field. The highlight of the chapter, however, is the section devoted to the approach when protective aprons are worn during diagnostic and interventional medical procedures using fluoroscopy.
This has long been a vexed question in personal dosimetry; the report makes a careful study of a number of published papers on the subject and a way forward is suggested. This involves the use of an appropriate correction factor where one dosemeter is worn outside the apron, or an algorithm where an additional dosemeter is worn beneath the apron. The final chapter deals with recommendations. There are three sets of highly useful recommendations: the first giving the working conditions in which an estimate of effective dose equivalent and E can be obtained from a single dosemeter designed to measure personal dose equivalent; the second for working conditions where two dosemeters are advisable; and the third where a protective apron is worn in some clinical work. To summarise, this is a very useful report on a subject of substantial interest and importance to those involved in personal dosimetry. The report points a way ahead which is very much in accord with the approach being adopted in other parts of the world, and hopefully it will help to provide much needed stability in radiation dosimetry.
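The apron algorithms the report surveys are generally weighted sums of the two dosemeter readings. A sketch of that general form, with placeholder coefficients that are not the values recommended in the report:

```python
def estimate_effective_dose(h_over_msv, h_under_msv, a=0.02, b=1.0):
    """Generic two-dosemeter apron algorithm: E ~ a*H(over) + b*H(under).
    The reading outside the apron is heavily discounted because it mostly
    reflects dose to shielded tissue. The coefficients a and b here are
    illustrative placeholders, not the values recommended in the report."""
    return a * h_over_msv + b * h_under_msv

# Hypothetical monthly readings for a fluoroscopist: the dosemeter worn
# outside the apron reads far more than the one worn beneath it.
e = estimate_effective_dose(h_over_msv=10.0, h_under_msv=0.1)
print(f"estimated effective dose: {e:.2f} mSv")
```

The design point is that the under-apron reading dominates the estimate while the over-apron reading contributes a small correction for the unshielded head and extremities.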

33 citations


Journal ArticleDOI
TL;DR: The results reveal that targeting future housing is a more cost-effective option than remediation of existing dwellings with radon concentrations above the Reference Level - the costs per lung cancer death averted are typically $145 000.
Abstract: Published information on the distribution of radon levels in Spanish single family dwellings is used to evaluate the cost-effectiveness of three different intervention scenarios: remediation of existing dwellings, radon proofing of all future dwellings and the targeting of areas with higher than average indoor radon concentrations. Analysis is carried out on the basis of Reference Levels for the existing housing stock and for new dwellings. Certain assumptions are made about the effectiveness and durability of the measures applied, and annualised costs are used to calculate the costs per lung cancer death averted. The results reveal that targeting future housing is a more cost-effective option than remediation of existing dwellings with radon concentrations above the Reference Level - the costs per lung cancer death averted are typically $145 000. In high-risk areas, these costs can be considerably less, depending on the percentage of dwellings expected to exceed the Reference Level and the average savings in exposure as a result of the intervention. The costs of intervention to reduce lung cancer deaths following exposure to radon compare favourably with those of other health programmes in other countries.
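The cost-effectiveness measure used throughout is straightforward: annualised intervention cost divided by lung cancer deaths averted per year. A sketch with invented numbers that merely reproduce the qualitative finding that protecting new dwellings beats remediating existing ones (they are not the paper's data):

```python
def cost_per_death_averted(annualised_cost, deaths_averted_per_year):
    """Cost-effectiveness ratio: annualised cost / lung cancer deaths averted."""
    return annualised_cost / deaths_averted_per_year

# Illustrative comparison: remediating many existing dwellings versus
# radon-proofing all new dwellings as they are built.
remediation = cost_per_death_averted(annualised_cost=2_000_000,
                                     deaths_averted_per_year=8.0)
new_build = cost_per_death_averted(annualised_cost=600_000,
                                   deaths_averted_per_year=4.0)
print(f"remediation: ${remediation:,.0f} per death averted")
print(f"new build:   ${new_build:,.0f} per death averted")
```

In high-risk areas the denominator grows (more dwellings exceed the Reference Level, so more deaths are averted per dollar), which is why the abstract reports considerably lower costs there.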

23 citations


Journal ArticleDOI
TL;DR: The European Radiation Dosimetry Group (EURADOS) has approved a report on the cosmic ray exposure of aircrew, to be published in the near future; the NCRP Commentary reviewed here covers the radiation fields, estimates of risk to aircrew and passengers, radiation protection philosophy and appropriate measurements.
Abstract: Current commercial aircraft fly at altitudes between some 9000 and 12 000 m, although Concorde reaches 18 000 m. In this document the NCRP is looking ahead to the possibility of regular flights at altitudes of 15 000 to 24 000 m. The radiation field to which aircrew and passengers are exposed during high altitude flight is complex. Data on this environment have come from cosmic ray studies and from research on high energy accelerators. Many measurements have also been reported from onboard various aircraft, and a computer program is available that calculates doses to aircrew. However, uncertainties remain and development continues. An NCRP Commentary can provide a preliminary review of a topic within a short timescale: this Commentary is both clear and concise. It covers the radiation fields, estimates of risk to aircrew and passengers, radiation protection philosophy and appropriate measurements. There are eight major recommendations for future work aimed at improving relevant knowledge, reducing uncertainties and providing information. Although the document's title uses the words `high altitude flight', most of the recommendations cover the entire altitude range 9000 to 24 000 m; only one is specific to high altitudes. There is a very limited treatment of the differences in the nature of, and the response to, solar particle radiation as distinct from galactic cosmic radiation. The section on estimates of risk to aircrew and passengers relates only to supersonic flight, whereas much of the report covers a much broader situation. For flights at 20 000 m aircrew are assumed to be exposed for 1000 h per year, whereas frequent fliers are given only 100 h per year, and a fixed equivalent dose rate is assumed at this altitude. There is no discussion of the uncertainties associated with these numbers. Indeed, aircrew are only at high altitudes for part of their working time; they have duties before take-off and after landing.
Couriers are a group of frequent fliers and their exposure times may exceed those of aircrew. It is recommended by ICRP that cosmic ray exposure of aircrew be considered part of occupational exposure. The revised EURATOM Directive on the Basic Safety Standards will address the requirements in this area. The European Radiation Dosimetry Group (EURADOS) has approved a report on the cosmic ray exposure of aircrew: it will be published in the near future. Radiation exposure during flight is an international topic and there is a clear case for international cooperation and agreement in these matters.

22 citations


Journal ArticleDOI
TL;DR: The biological assumptions underpinning the collective dose concept are reviewed and issues surrounding the application of the concept to various situations are discussed, and NCRP concludes that direct supportive evidence is limited and ultimately confidence in this assumption may derive from mechanistic biophysical studies.
Abstract: `For moderate increments above background, a linear relationship between the incremental dose and the incremental probability of a deleterious effect will be an adequate approximation' (International Commission on Radiological Protection). This, and similar statements going back over a number of years, validates the collective dose concept. Put simply, collective dose is the sum of the doses to all people in the exposed population and can be converted into the corresponding number of health effects on the basis of the linear dose - response relationship. Taking the argument further, collective dose can be linked to total detriment and, as a final step, a monetary cost can be assigned to unit collective dose representing the cost to society of the detriment. Collective doses find application in the optimisation of protection and estimates of total numbers of health effects have also been inputs to justification decisions. This all sounds intellectually satisfying and watertight until one looks deeper: is the collective dose concept applicable to large populations with very small individual doses and to populations that may exist several generations into the future? This report from the American National Council on Radiation Protection and Measurements provides insights into these issues. The biological assumptions underpinning the collective dose concept are reviewed and issues surrounding the application of the concept to various situations are discussed. The review of the biological assumptions is detailed, covering cellular and animal studies on effects of ionising radiation together with epidemiological data from exposed human populations. The conclusion reached is that it is prudent to assume a linear dose - response relationship in the low dose region. However, NCRP concludes that direct supportive evidence is limited and ultimately confidence in this assumption may derive from mechanistic biophysical studies. 
In connection with possible collective doses delivered into the far future following, say, disposal of long-lived radionuclides, the report draws attention to the uncertainties in the fertility, size and location of future populations, and their level of medical technology. The report makes a number of recommendations which include: regulatory limits should not be set in terms of collective dose; collective dose is most useful when applied to populations with known characteristics; all doses should be included in collective dose calculations as there is no conceptual basis for excluding any individual doses no matter how small; calculation of collective doses to future populations should be done with care and with recognition of the uncertainties. The report presents the ideas and issues clearly. It is of considerable interest to those who were involved in the discussions surrounding the BNFL Sellafield authorisation three years ago. However, it is not the last word on the subject as further work and ideas are required on the applicability and interpretation of collective dose estimates, particularly in the context of optimisation of protection.
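The arithmetic behind the concept is simple to state: collective dose is the sum of individual doses, and the linear dose - response assumption converts it to an expected number of health effects. A sketch, where the 0.05 per sievert coefficient is merely of the order of ICRP nominal values and is used here only for illustration:

```python
def collective_dose(individual_doses_sv):
    """Collective dose S (person-Sv): the sum of all individual doses."""
    return sum(individual_doses_sv)

def expected_health_effects(s_person_sv, risk_per_sv=0.05):
    """Linear no-threshold conversion: expected effects = S * risk coefficient.
    0.05 per Sv is of the order of the ICRP nominal fatal cancer coefficient;
    treat it as an illustrative value here."""
    return s_person_sv * risk_per_sv

# The contested regime: a very large population, each member receiving
# a very small dose (here one million people at 1 microsievert each).
doses = [1e-6] * 1_000_000
s = collective_dose(doses)
print(f"collective dose = {s:.2f} person-Sv, "
      f"expected effects = {expected_health_effects(s):.3f}")
```

The example makes the report's concern concrete: the same 1 person-Sv, and hence the same predicted detriment, arises whether one person receives 1 Sv or a million people receive 1 μSv each, and it is the latter extrapolation whose evidential support the NCRP finds limited.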

19 citations


Journal ArticleDOI
TL;DR: The high doses of alpha-particle radiation which Thorotrast delivers to the testes make the study of offspring of patients, who were administered this diagnostic contrast medium, particularly relevant to the controversy regarding childhood leukaemia in the offspring of the male Sellafield workforce.
Abstract: The high doses of alpha-particle radiation which Thorotrast delivers to the testes make the study of offspring of patients who were administered this diagnostic contrast medium particularly relevant to the controversy regarding childhood leukaemia in the offspring of the male Sellafield workforce, as it has been suggested that the alpha-particle dose from actinides may, in part, account for the observed association between paternal pre-conception external irradiation and leukaemia. The risks of childhood leukaemia and non-Hodgkin's lymphoma found among the offspring of the male Danish cohort of neurosurgical patients injected with Thorotrast for cerebral arteriography are compared with those observed in the offspring of the Sellafield (western Cumbria), Ontario and Scottish radiation workers and in the offspring of the Japanese atomic bomb survivors. Risks are compared using linear and exponential forms of a relative risk model. The pre-conception exposure risks of leukaemia, or of leukaemia and non-Hodgkin's lymphoma, in the Danish study are statistically incompatible with the corresponding risks in the children of the Sellafield workforce (P < 0.01), which is entirely due to incompatibility with the risks in the children of the Sellafield workforce born in Seascale (P < 0.01). Indeed, no cases arose in the Danish study. The statistical incompatibility between the offspring of the Danish patients and the Seascale-born offspring of the Sellafield workforce is independent of the models used and is insensitive to uncertainties in the gonadal dose estimates in the respective datasets or to uncertainties in the estimates of the background cancer rates in the Seascale and Danish populations. The risks of leukaemia and non-Hodgkin's lymphoma in the Danish dataset are statistically compatible with the risks observed in the offspring of the Sellafield workforce born elsewhere in western Cumbria and with those seen among the offspring of the Ontario or Scottish radiation workforces, as well as with those in the offspring of the Japanese atomic bomb survivors. The statistical compatibilities of the leukaemia pre-conception exposure risks in these various groups are also independent of the models used. It is most unlikely that the Seascale childhood leukaemia cases are caused by paternal exposures to alpha-particle emitters such as plutonium. In each non-Seascale dataset the risks are consistent with there being no excess hazard of leukaemia associated with paternal pre-conception exposure to ionising radiation, and it would seem that the association seen for Seascale-born children should be attributed to chance.

16 citations


Journal ArticleDOI
TL;DR: The topic dealt with in this book is mandated science which means applied science brought to bear on practical issues such as risk assessment and the large differences between the risks of alachlor as determined by the manufacturer (Monsanto) as compared to those of the Department of Health and Welfare were of particular interest.
Abstract: The topic dealt with in this book is mandated science, which means applied science brought to bear on practical issues such as risk assessment. Most readers of the Journal of Radiological Protection would likely consider risk assessment to be objective and value free, but it is just this feature that the authors of this engaging book set about to undermine. In short, the notion that mandated science provides neutral, value-free advice is often mistaken, as is illustrated by a detailed consideration of the alachlor controversy in Canada. Alachlor is a chemical herbicide manufactured by Monsanto and used to control weeds in fields planted with corn and soya beans. Prior to being made commercially available, chemicals such as alachlor require registration by Agriculture Canada. The issue of their safety is dealt with by the Canadian Department of Health and Welfare. Establishing the safety of chemicals like alachlor requires a dose - response curve for the induction of tumours in animal studies, as well as an assessment of the exposures of the workers who will be handling these chemicals during their routine work activities. In the mid-1980s, the Canadian government concluded that alachlor posed too high a risk of cancer. Monsanto appealed this decision, and the Alachlor Review Board was convened to consider the appeal during 1986 - 87. Although the Review Board concluded that alachlor was not a risky product and should be registered to permit its legal sale in Canada, this recommendation was rejected by the Minister. As a consequence of this government decision, alachlor is no longer commercially available in Canada. Of particular interest were the large differences between the risks of alachlor as determined by the manufacturer (Monsanto) and those determined by the Department of Health and Welfare.
Based on the same empirical data, the risk estimates made by the commercial company and the health authorities disagreed because of different assumptions made about the exposure of agricultural workers. The most intriguing conclusion of this study, however, is that these differences arose from the fact that the two sides held different value perspectives. Included in this category were issues such as the importance of technology, the relative importance of human health and corporate profits, their political philosophy (e.g. liberalism), as well as the nature of rationality itself. In the case of alachlor, conflicting risk estimates differed by as much as six orders of magnitude because of different assumptions made about issues such as whether the workers would be using protective clothing (gloves) and the exposure measurement methodology (patch tests versus biomonitoring). These issues may be considered to be conditionally or inherently normative, and value-laden decisions were required before any full-blown risk estimate could be made. Retreat by the assessors to simply providing a range of risks which differed by six orders of magnitude would have rendered any assessment of no practical value for any regulatory purpose. This key facet of risk assessment can be illustrated by looking at the issue of protective clothing, where the question is whether it should be assumed that the worker will wear protective gloves. Not surprisingly, Monsanto assumed workers would be wearing such gloves, an approach which minimised the risk to the company's economic freedom and financial position. Since Health and Welfare has the mandate of health protection, their choice was that workers would not wear gloves. The glove issue was not purely factual but normative; it contrasted fairness to the company (why should Monsanto be penalised if workers did not obey the instructions provided?) with health (Health and Welfare noted that gloves were in fact generally not worn by workers).
The point that the authors are making is that the risk assessment of alachlor is not merely an empirical and factual exercise. When computing risk estimates in the real world, there are usually uncertainties in the underlying science. For example, it is not clear that the induction of tumours in a rat following exposure to a chemical means that similar exposure levels will also induce tumours in humans. It is the manner of dealing with these types of scientific (and other) uncertainties that can cause real difficulties in a risk assessment exercise. Proponents of risk assessment are frequently mistaken when they make claims of objectivity. In the alachlor controversy, the scientific mindset represented by the conclusion of the Review Board was in reality the hegemony of one set of values dressed up to appear value neutral and scientifically objective. It is important to emphasise that the authors are not denying the need for good science in risk assessment, but that such an exercise involves much more than just establishing `the facts'. The implications of this study are that disagreements about issues such as nuclear power are often normative in nature and not purely factual. A corollary is that any satisfactory resolution of such debates needs to focus on the normative aspects. It is a mistake to pretend that a search for objective `facts' about the risks is likely to resolve such differences; although there may be differences of fact which require attention, these are likely to be minor in comparison to the normative issues. If the case the authors are making were to be generally accepted, then many in the risk assessment business would need to review their methodology and identify the points at which their expertise runs out and value predilections take over. The issue of risk assessment is of major significance for all who work in the field of radiological protection.
The arguments and conclusions presented in this book may be disturbing to many health physicists who believe their role in risk assessment to be neutral and purely objective. In this reviewer's opinion, the authors have made a substantive case that risk assessment per se is a much more complex business than hitherto believed. It appears that risk assessment, as well as any subsequent issue of the acceptability of the risks, impinges on the political realm, where conflicting societal demands need to be resolved. The notion that the issue of risk assessment can be resolved by invoking a neutral and objective `algorithm' to yield a rational (and therefore acceptable) answer is apparently misguided. The arguments in this book are clearly presented and the topic is of obvious importance for the health physics profession. If the authors' conclusions on this important topic were to be generally accepted, then those involved with risk assessment would face an important professional choice. Risk assessors could stay out of normative issues and offer watered-down versions emphasising uncertainties and how the risk depends on underlying assumptions; whether such assessments would be of any practical benefit is clearly problematical. Any submission of a comprehensive risk assessment, on the other hand, would need to recognise that it also expresses value commitments. Obviously such values could not be claimed to be supported by any underlying scientific expertise. Whether one agrees or disagrees with these conclusions, reading this book is an important step towards a deeper understanding of the fundamental issues unearthed by this meticulous scrutiny of the alachlor controversy.

16 citations


Journal ArticleDOI
TL;DR: A knowledge of the toxicity of plutonium is largely dependent upon animal studies where exposure to relatively large amounts, compared with those associated with known human exposure, can cause tumours in those tissues where it is retained.
Abstract: The physical and chemical properties of plutonium are related to its environmental transfer and uptake by man. Once incorporated, plutonium is avidly retained in the lungs, liver and skeleton, the relevant amounts being determined by its solubility in body fluids. A knowledge of the toxicity of plutonium is largely dependent upon animal studies where exposure to relatively large amounts, compared with those associated with known human exposure, can cause tumours in those tissues where it is retained. With one exception, epidemiological studies have not been able to demonstrate adverse health effects in humans. Precautions taken in the processing of plutonium have ensured that average intakes by workers have been consistently low. When it has been released to the environment, it has been of little ecological importance and has caused only small doses to man with no observable adverse effects. The long half-life of plutonium causes anxiety about its storage and disposal, but plutonium is not unique. It is often forgotten that very much larger amounts of permanently toxic elements such as arsenic, cadmium and lead are stored and disposed of with much less concern. Plutonium is a valuable resource and for that reason should not be treated as a waste for disposal into the environment.

15 citations


Journal ArticleDOI
TL;DR: Intended for use by TL dosimetrists, research workers and students, it includes information on glow curves, emission spectra, the mechanisms of TL production, trapping parameters associated with TL signals as well as a lot of information about the dosimetric properties of many materials when exposed to low LET radiation.
Abstract: This book will be invaluable to anyone who is interested in understanding the physics of thermoluminescence (TL). Intended for use by TL dosimetrists, research workers and students, it includes information on glow curves, emission spectra, the mechanisms of TL production, trapping parameters associated with TL signals as well as a lot of information about the dosimetric properties of many materials when exposed to low LET radiation. TL dosimetry is often used by workers who do not have much knowledge of the underlying physical mechanisms causing emission. For those who only wish to use TL dosimetry as a means of measuring radiation dose and are content to remain uninformed about crystal lattice structures, vacancy and impurity centres and the fundamental characteristics and properties of dosemeters there is no need to purchase this book. But if you are interested in exploring the possibility of using a new TL dosemeter or in understanding the links between solid state physics and the radiation dosimetry properties of any of the principal TL materials in current use - then this very informative book will be a good buy - even though it is not cheap. It includes numerous original conventional glow curves as well as isometric and contour plots of the emission spectra of each of the specimens studied. The `3D' isometric plots of emission intensity against temperature and wavelength provide the reader with a wealth of information in a user friendly format. There are chapters on Thermoluminescence and Thermoluminescent Dosimetry and well-researched chapters on the TL properties of Fluorides, Oxides, Sulphates and Borates. Each chapter has a good list of references. Less commendable features which might irk the specialist reader include the brevity of the index and the use of previously published data in which doses are quoted in R rather than Gy.

Journal ArticleDOI
TL;DR: This paper reconsiders collective dose and explores the concerns relating to the use of collective dose estimates that are made up of low levels of individual dose, or extend over long periods of time into the future and are delivered to different populations.
Abstract: Collective dose is described as a measure of the total detriment associated with a specific source or practice, and the collective doses resulting from a number of sources have been assessed. However, the interpretation and use of such estimates, for example in making decisions, has been the subject of much debate. This paper reconsiders collective dose and explores the concerns relating to the use of collective dose estimates that are made up of low levels of individual dose, or extend over long periods of time into the future and are delivered to different populations. Practical solutions are advanced to deal with long timescales with the aim of providing more robust estimates of long-term population detriment for decision making.

Journal ArticleDOI
TL;DR: The investigation indicated that a few houses in Syria require remedial action; most houses with high radon levels were found in the southern area, especially in the Damascus governorate.
Abstract: A nationwide investigation of radon levels in Syrian houses was carried out during the period 1991 - 1993. Passive radon diffusion dosemeters using polycarbonate detectors were distributed in houses all over Syria. Detectors were subjected to electrochemical etching to reveal latent tracks of alpha particles. The mean radon concentration in Syrian houses was found to be with some values several times higher. This investigation indicated that there were a few houses in Syria that require remedial action. Most houses that have high levels of radon were found in the southern area, especially in the Damascus governorate. The study also indicated that radon concentrations were higher in old houses built from mud with no tiling.


Journal ArticleDOI
TL;DR: These proceedings bring together the presentations and discussions of a meeting held in Paris in September 1994, which provided the basis for a later document entitled `The Management of Long-Lived Radioactive waste, The Environmental and Ethical Basis of Geological Disposal, a Collective Opinion of the NEA Radioactive Waste Management Committee (1995)'.
Abstract: Environmental issues are currently very fashionable in the wake of the '92 Earth Summit, Agenda 21 and the concept of sustainable development. The Radioactive Waste Management Committee of the NEA have agreed it would be helpful to policy makers if a collective opinion could be drawn up to explain the ethical and environmental aspects of long-lived radioactive waste disposal. These proceedings bring together the presentations and discussions of a meeting held in Paris in September 1994, which provided the basis for a later document entitled `The Management of Long-Lived Radioactive Waste, The Environmental and Ethical Basis of Geological Disposal, a Collective Opinion of the NEA Radioactive Waste Management Committee (1995)'. Three sessions were held over 2 days, discussing the background to current environmental policies and their implementation, ethics and the environment and a review of geological disposal strategy. A final session discussed conclusions. It was advocated early on that an incremental decision making process, which is being followed in a number of countries, is beneficial in allowing safety assessment to be separated from the implementation activities (such as siting a disposal facility) that often create public concern. The decision whether or not to close a repository would then be made on the basis of several decades' accumulated evidence and experience. Public involvement was a recurring theme throughout the presentations from the USA, Canada, Germany and France. Another recurring theme was the comparison of hazards, risk assessment and general principles with other industries, in particular toxic waste disposal. The point was made that the nuclear industry has always shown leadership in environmental matters. The question of intergenerational equity was thoroughly aired, consistent with discussions about sustainable development.
This covered the Precautionary Principle and consideration of retrievability as opposed to `final disposal' which may be challenged in some quarters on the basis that it might preclude better environmental safeguards being implemented by future generations. The group asked the question `what is the responsibility of the current generation to future generations?'. This led to a discussion of benefits to one generation and costs to another and consideration of differing societies, economies, views and values across the world. The overall conclusions of the group included a reaffirmation that deep geological disposal, incorporating the multi-barrier concept, is preferred for long-lived radioactive waste. The burden on future generations should be minimised, although an incremental decision process, which would allow a future society to remove material, if it wishes, was favoured. The group thought it was important, when making decisions, to consider what the views of host communities, the political community, the technical community and the community at large might be.

Journal ArticleDOI
TL;DR: The Radon Book was originally written in Swedish, but it has been well translated into English and adapted for an international audience and covers all aspects of the subject from the production of radon in the ground to the presence ofRadon decay products in air, with some additional information on health risks and measurement techniques.
Abstract: The problem of high radon levels in homes was recognised very early in Sweden. The first measurements were made in the 1950s and recommendations on reduction measures were published in 1982, before most countries had noticed the existence of radon. The rest of the world has since done a lot of catching up, but Sweden remains in the forefront of research and action on preventing high exposures, as illustrated in this book. The Radon Book was originally written in Swedish, but it has been well translated into English and adapted for an international audience. It covers all aspects of the subject from the production of radon in the ground to the presence of radon decay products in air, with some additional information on health risks and measurement techniques. The main part of the book, however, concentrates on remedial measures for existing buildings and preventive measures for new ones. Systematic and detailed instructions are given for identifying problems, choosing solutions and applying them. The remedial measures developed in Sweden were initially based on those used in North America, where high radon levels were at first thought to be a man-made problem caused by building houses over uranium mine tailings. No expense was spared in solving this problem until it was found that high radon levels indoors were more commonly a result of building houses over certain rock types. The realisation that radon problems were of natural origin changed attitudes dramatically. Sweden continued the research on remedial measures, in particular on the most effective and reliable method, sub-floor depressurisation. This measure is most appropriate in houses with concrete floors, and uses a small fan to extract the radon-laden air from under the floor. This solution has since spread around the world, despite initial scepticism.
It seems that in each country the building technologists start out convinced that the buildings in their country are so different from elsewhere that sub-floor depressurisation will not work, and are then surprised to find that it does. Sweden has an additional radon problem not commonly found elsewhere: from building materials. From 1929 to 1975 a lightweight concrete was made from alum shale. This often has uranium and radium contents a hundred times higher than ordinary building materials, and such buildings can be identified with a gamma ray monitor. Most of the dose, however, comes from the radon exhaled by the concrete, and this calls for different remedial measures from those used when the radon comes from the ground. The methods developed, such as ventilation and sealing, may also be required in some other countries, such as Italy, where it has been reported that volcanic tuffs used as building materials have caused high radon levels.

Jon Miles

This book review first appeared in NRPB's Radiological Protection Bulletin and is reproduced here with the permission of the publishers.

Journal ArticleDOI
TL;DR: The report in its very thorough review points out that many stages of radioactive waste management, including the disposal of low-level and medium-level waste, have actually become routine industrial procedures.
Abstract: This report is clearly written and concise, and will be of considerable assistance to decision-makers, opinion formers and interested members of the general public. It will form a useful summary for a non-specialist reader who is interested in radioactive waste management practices in the OECD area, and in the current expert consensus on the subject. The explanations of the different principles and stages of radioactive waste management for each category of waste are useful. The issues of environmental protection, safety assessments, financing, public perceptions and international cooperative effort are all concisely described. The annexes which summarise the current national radioactive waste management programmes in OECD countries are useful summaries. Radioactive waste disposal is often perceived as one of the `most important' unresolved issues in the nuclear energy cycle. This NEA/OECD report highlights the fact that a broad scientific and technical consensus exists among specialists regarding radioactive waste disposal. Radioactive wastes can be managed and disposed of safely with current techniques, provided disposal is performed in accordance with all the extant regulatory guidance. The report in its very thorough review points out that many stages of radioactive waste management, including the disposal of low-level and medium-level waste, have actually become routine industrial procedures.

Journal ArticleDOI
TL;DR: The effects of the Chernobyl nuclear accident on the psychology of the affected population have been much discussed as discussed by the authors, and the psychological and social effects associated with the post-accident situation arise from the interdependency of a number of complex factors exerting a deleterious effect on the population.
Abstract: The effects of the Chernobyl nuclear accident on the psychology of the affected population have been much discussed. The psychological dimension has been advanced as a factor explaining the emergence, from 1990 onwards, of a post-accident crisis in the main CIS countries affected. This article presents the conclusions of a series of European studies, which focused on the consequences of the Chernobyl accident. These studies show that the psychological and social effects associated with the post-accident situation arise from the interdependency of a number of complex factors exerting a deleterious effect on the population. We shall first attempt to characterise the stress phenomena observed among the population affected by the accident. Secondly, we shall present an analysis of the various factors that have contributed to the emerging psychological and social features of population reaction to the accident and in post-accident phases, while not neglecting the effects of the pre-accident situation on the target population. Thirdly, we shall devote some initial consideration to the conditions that might be conducive to better management of post-accident stress. In conclusion, we shall emphasise the need to restore confidence among the population generally.

Journal ArticleDOI
TL;DR: In situ gamma-ray spectrometers have been used to estimate the radioactivity levels in North Wales and their correlation with the depth distribution of the core samples was assessed.
Abstract: North Wales has a wide range of levels of natural radioactivity together with significant levels of artificial radionuclides on its coast, arising mainly from the Sellafield nuclear processing plant. In situ gamma-ray spectrometry offers a rapid alternative to core sampling for mapping out these radioactivity levels but requires extensive calibration and some knowledge of the depth distribution. Quantitative in situ spectrometry and analysis of core samples were performed for three selected areas of North Wales and their correlation assessed. Despite a non-exponential activity - depth distribution, agreement to within 25% was found for and to within 50% for . Complete agreement was found for within the experimental errors. Environmental dose-rate assessments were also performed using the in situ spectra and showed that was responsible for almost 50% of the external gamma-ray dose-rate at two of the sites. These dose-rate measurements were compared with those obtained from a compensated GM tube and were found to be in complete agreement within the experimental errors.

Journal ArticleDOI
TL;DR: The results of this work indicate that the effects of cooking should be considered when assessing the dose received from the intake of foodstuffs in response to most of the preparation techniques.
Abstract: Radionuclides, including and , are periodically and routinely discharged from nuclear power generation sites, and it is important to assess the radiological impact of such discharges on humans via food consumption. Foodstuffs may be cooked before being eaten and this can change their radionuclide content. The aim of this study was to examine the effects of a range of domestic food preparation techniques on the radionuclide contents of a range of food types. Radionuclide concentrations of tritium (free tritium, HTO, and organically bound tritium, OBT), and were examined in a selection of fruit and vegetables that would form part of a typical diet. The foodstuffs included blackberries, broad beans, cabbages, carrots and potatoes (at two stages of development). The preparation techniques included boiling (potatoes, carrots, broad beans), roasting (potatoes), steaming (cabbage), or stewing (blackberries). In general, the radionuclide concentrations in the crops were reduced by at least 30% after preparation using any of the cooking techniques. The concentrations of fell by at least 60%, and this radionuclide showed the greatest reduction in response to most of the preparation techniques. Roasting resulted in the greatest reductions in the levels of HTO and . The results of this work indicate that the effects of cooking should be considered when assessing the dose received from the intake of foodstuffs.
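Where such cooking losses are quantified, they can be folded into an ingestion-dose estimate as simple retention factors. A minimal sketch in Python; the intakes, dose coefficients and choice of two nuclides are invented for illustration, with only the minimum reduction factors (30% and 60%) taken from the study:

```python
def committed_dose_mSv(intakes_bq, retention_fractions, dose_coeffs_sv_per_bq):
    """Committed effective dose (mSv) from annual intakes, each intake
    scaled by the fraction of activity remaining after cooking."""
    dose_sv = sum(i * r * c for i, r, c in
                  zip(intakes_bq, retention_fractions, dose_coeffs_sv_per_bq))
    return dose_sv * 1e3  # Sv -> mSv

intakes = [500.0, 200.0]     # Bq per year, hypothetical
remaining = [0.7, 0.4]       # the study's minimum reductions: 30% and 60%
coeffs = [1.3e-8, 2.5e-8]    # Sv per Bq, illustrative values only
print(committed_dose_mSv(intakes, remaining, coeffs))
```

Assessments that ignore cooking would use retention fractions of 1.0 throughout, overestimating the ingested activity.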

Journal ArticleDOI
TL;DR: This pamphlet proposes a six-tiered hierarchical model of efficacy which can be applied to nearly all diagnostic techniques including diagnostic imaging, and makes some recommendations, namely that efficacy assessments should be used by physicians for every emerging technology.
Abstract: To the radiation expert, many medical practitioners appear casual to the point of recklessness regarding radiation exposure to their own person. However, diagnostic radiologists are in a position to make a significant contribution to radiation protection of the population by discouraging unnecessary medical radiation exposure, by applying the time honoured medical dictum `if a test will not benefit the individual patient, and will not alter the medical management of the patient, then do not do the test'. It behoves scientific and medical workers to establish that each new technological advance is `efficacious', and to re-evaluate existing technologies in the rapidly developing field of diagnostic radiology, and thus to ensure that individuals and populations are not exposed to excessive radiation as a result of inappropriate application of old, or indeed new techniques. This pamphlet defines `efficacy' as the probability of benefit to individuals in a defined population from a medical technology applied for a given medical problem under ideal conditions of use. It proposes a six-tiered hierarchical model of efficacy which can be applied to nearly all diagnostic techniques including diagnostic imaging. Level 1 is labelled `Technical Efficacy', which is generally the domain of the physicist, and concerns parameters affecting image quality, from resolution of line pairs to computer artefact analysis. Level 2 is labelled `Diagnostic Accuracy Efficacy', which is characterised by measurements such as accuracy of diagnosis (often measured as percentage of correct diagnoses in a case series with respect to a given standard of reference), sensitivity, specificity, and positive and negative predictive values. 
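The Level 2 measures named above reduce to simple ratios on a two-by-two table of test results against a reference standard. A minimal sketch in Python, with hypothetical counts:

```python
# Illustrative only: the counts are invented, not taken from the pamphlet.
def accuracy_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV from test counts."""
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# 90 true positives, 10 false negatives, 20 false positives, 180 true negatives
sens, spec, ppv, npv = accuracy_metrics(tp=90, fp=20, fn=10, tn=180)
print(sens, spec, round(ppv, 3), round(npv, 3))
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of disease in the case series, which is one reason the observer and the study population matter so much in this literature.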
The receiver operating characteristic (ROC) curve and (the area under the curve) have become an important tool in the measurement of diagnostic accuracy efficacy; the radiological literature contains many studies addressing this form of efficacy, and all acknowledge the importance of the observer as a variable. Level 3 describes `Diagnostic Thinking Efficacy', which addresses the question `how often does the investigation alter the pre-test perception of the referring physician?'. Given the subjective nature of diagnostic thinking, this is particularly difficult to analyse. However, it is suggested that there are methods which can identify when no change in perception of diagnostic probability results from information derived from a test, which indicates a lack of efficacy. Level 4 is labelled `Therapeutic Efficacy', which asks the question `how often is the intended management of a patient altered by information derived from the imaging technique?'. This may be studied retrospectively, and the early days of CT scanning provided a number of examples of how a test may influence patient management and therapeutic intervention. Level 5 is described as `Patient Outcome Efficacy'. Here the question asked is `does the investigation make any difference to the health of the patient?'. From the viewpoint of the patient, this would seem to be the most important level of analysis, particularly for tests which are dangerous, painful, widely used or expensive. Measurements of improved patient outcome can be difficult to verify, e.g. increased life expectancy, improved quality of life, avoidance of other tests or procedures. To demonstrate efficacy at levels 3, 4 and 5 in a definitive manner would best be accomplished using prospective randomised controlled trials, in which one arm was deprived of the imaging technique. 
The statistical and ethical problems associated with such trials are considerable, and the authors suggest the decision analysis approach, which uses statistical methods applied to epidemiological data, to provide an investigative alternative to the traditional randomised controlled trial. Level 6 is labelled `Societal Efficacy', which asks the question `does the test benefit society as a whole?'. At this level, the cost (borne by society as a whole including the detriment due to radiation exposure) of a given examination must be considered. Here efficacy interfaces with cost effectiveness and cost benefit analysis. Many of the most controversial current efficacy analyses are in Level 6, e.g. widespread use of mammographic screening for breast cancer. Subsequent sections of the publication concern practical aspects of the application of efficacy concepts to the emergence of new technologies, and make some recommendations, namely that efficacy assessments should be used by physicians for every emerging technology; educational programmes should enhance physicians' knowledge of the concept of efficacy; research programmes aimed at developing new imaging efficacy methodology should be supported; and that technology assessment per se should be acknowledged as a useful exercise. In general, the text identifies the concept of efficacy as complex rather than simple, and is a concise and useful attempt at simplification. Many of the ideas will be readily accepted by diagnostic radiologists; many investigators in the field of radiology use the concepts intuitively, and to some it may sound like a jargonised statement of the obvious. However, it is useful to have structured thinking formalised, and for less experienced investigators it would make excellent background reading prior to planning a clinical trial on an emerging imaging technology.
To the radiation protection scientist, the text will provide an interesting insight into the philosophy of medical research into imaging technology. It is a key feature of the hierarchical model that each level depends on the ones below. Therefore, Levels 2 - 6, concerning diagnosis, therapy and outcome, cannot function without technical efficacy. The physicist therefore supports the entire teetering tower. I am sure that scientists involved in radiation protection will find it an interesting read, although not of direct relevance to day-to-day problems. The text would also be valuable to those involved in management at local and national levels. One of the most useful sections is the glossary, which gives definitions of cost benefit analysis, cost effectiveness analysis, decision analysis process and principles, effectiveness, etc. These terms can be daunting for relatively inexperienced managers in radiology departments, who may be confronted by health service accountants fluent in such jargon. Conversely, regional managers would benefit from understanding the difficulties of evaluating new technology, which may help to explain why clinicians are often vague about why they need new and expensive pieces of equipment.

Journal ArticleDOI
TL;DR: Statistics on the prevalence of neutral-to-earth connections and measurements of net currents suggest that the best way of assessing average magnetic fields in residences remains by direct measurement over at least 24 h.
Abstract: In the majority of homes in the UK, background power-frequency magnetic fields come from currents in final distribution circuits. In these circuits, load currents produce a negligible external magnetic field. The fields in homes arise from net currents, produced when neutral currents divert out of the distribution cable through earth connections. This paper reports statistics on the prevalence of neutral-to-earth connections and measurements of net currents. Neutral-to-earth connections occur as part of protective multiple earthing, which is applied to 64% of underground circuits and 32% of domestic consumers' installations, and also occur accidentally within up to 20% (and probably substantially more) of homes. The 48 h average net current in a sample of 21 circuits was 3.6 A. Because net currents are produced by diverted neutral current, they vary as loads vary. However, neutral current is proportional not to total load but to the unbalance between the three phases, and this weakens the correlation between net currents and loads. Individual unbalanced loads can lead to disproportionately high net currents. These considerations suggest that the best way of assessing average magnetic fields in residences (which is necessary for epidemiological studies) remains by direct measurement over at least 24 h.
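For a long straight conductor, the flux density at distance r from a net current I is B = μ0I/(2πr), which gives a feel for the magnitudes involved. A short Python sketch using the paper's 48 h average net current of 3.6 A; the 5 m evaluation distance is an assumption for illustration, not a figure from the paper:

```python
import math

MU0 = 4 * math.pi * 1e-7  # permeability of free space, T m/A

def field_from_net_current(i_amps, r_metres):
    """Magnetic flux density (tesla) at distance r from a long straight
    conductor carrying net current i (long-wire approximation)."""
    return MU0 * i_amps / (2 * math.pi * r_metres)

# 48 h average net current of 3.6 A, at an assumed 5 m from the cable:
b = field_from_net_current(3.6, 5.0)
print(f"{b * 1e9:.0f} nT")  # → 144 nT
```

Because the field falls off only as 1/r for a net current (rather than the much faster fall-off of a balanced go-and-return pair), diverted neutral currents dominate residential background fields even at some distance from the cable.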

Journal ArticleDOI
TL;DR: In this article, the authors estimated the change in the average exposure of the population of England and Wales to power-frequency magnetic fields between 1949 and 1989, and concluded that the estimated increase in overall average exposure is by a factor of 4.5.
Abstract: This paper estimates the change in the average exposure of the population of England and Wales to power-frequency magnetic fields between 1949 and 1989. If magnetic fields are causally linked to disease with a linear exposure - response relationship, this quantity is related to the incidence rate of the disease. The exposure is divided into components attributable to a number of sources, principally residential background fields and fields from domestic appliances and the transmission system. The 1989 average exposures from these sources are estimated as 45 nT, 20 nT and 4.2 nT respectively. For each source, an understanding of how fields arise is combined with statistics on the use of electricity and demographic statistics to estimate the change in exposure from that source. These individual changes are then combined, weighted according to the average exposure from that source. The estimated increase in overall average exposure is by a factor of 4.5, which applies to the whole population and also to children considered separately. This increase is slightly greater than the result obtained by the simpler method of taking average domestic electricity demand per consumer, and can be treated with more confidence. There are still numerous approximations involved, some of which are identified and discussed, with the conclusion that the estimated increase is probably an underestimate.
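The weighting scheme described can be sketched as an exposure-weighted average of per-source growth factors. In the Python sketch below, the 1989 average exposures are the paper's figures, but the per-source growth factors are invented placeholders, so the result is illustrative only:

```python
def combined_factor(exposures_nT, growth_factors):
    """Exposure-weighted average of per-source growth factors."""
    total = sum(exposures_nT)
    return sum(e * g for e, g in zip(exposures_nT, growth_factors)) / total

# 1989 averages (paper): residential background, appliances, transmission
exposures = [45.0, 20.0, 4.2]
# Hypothetical per-source increases since 1949, for illustration only:
factors = [5.0, 3.5, 4.0]
print(round(combined_factor(exposures, factors), 2))
```

The weighting means that the overall factor is dominated by whichever source contributes most exposure, here the residential background component.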

Journal ArticleDOI
TL;DR: In this article, a model based on the distributed point source approximation was used to calculate the gamma-ray dose rate on the exterior surfaces of cylindrical vessels containing radioactive solutions, and the results were compared with experimentally determined results for (a pure -emitter) and (a mixed and -emitters) in acrylic containers of various wall thicknesses, as well as for single containers made from polycarbonate and polypropylene; good agreement was obtained.
Abstract: Gamma-ray dose rates on the exterior surfaces of cylindrical vessels containing radioactive solutions are calculated using a model based on the distributed point source approximation. A cylinder is subdivided into a number of annular sector segments of equal volume and the dose rate from each segment is combined to give the total dose rate at a point on the exterior surface of the cylindrical container. Calculated results for the method are compared with experimentally determined results for (a pure -emitter) and (a mixed and -emitter) in acrylic containers of various wall thicknesses, as well as for single containers made from polycarbonate and polypropylene; good agreement was obtained. Calculated results for the -ray dose rates to the skin of the fingers, for partially filled plastic syringes, are compared with other published results, for , , , , and in syringes of various diameters and wall thicknesses; good agreement was obtained. The calculations are extended to provide results for the -ray dose rate distribution along the external surfaces of partially filled syringes for and . These results are used to objectively derive guidelines for the safe handling of cylindrical vessels containing -emitting radionuclides, without the use of extra shielding. It was found that the weekly dose limit to the skin, of 10 mSv, is exceeded if the fingers are in contact with the container, over the active volume, for periods greater than about one minute. However, if handled at the rear of the syringe barrel a typical weekly work load can be managed without exceeding dose limits. It is recommended, when using syringes without syringe guards, that the fingers should never approach the active volume closer than the rear end of the syringe barrel, and that syringes should not be filled beyond 75% of their capacity.
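The segmentation scheme can be sketched as follows. This is a simplified reconstruction, not the paper's full method: the segment counts are arbitrary, and attenuation and build-up in the solution and container wall are ignored, so each equal-volume segment contributes only an inverse-square point-source term.

```python
import math

def surface_dose_rate(activity, gamma_const, radius, length, target_dist,
                      n_r=8, n_phi=16, n_z=20):
    """Dose rate at a point in the mid-height plane, target_dist from the
    cylinder axis.  The active volume is divided into n_r x n_phi x n_z
    annular sector segments of equal volume; each segment is treated as a
    point source (attenuation and build-up neglected)."""
    seg_activity = activity / (n_r * n_phi * n_z)
    total = 0.0
    for i in range(n_r):
        # equal-area radial shells give equal-volume segments
        r_mid = radius * math.sqrt((i + 0.5) / n_r)
        for j in range(n_phi):
            phi = 2.0 * math.pi * (j + 0.5) / n_phi
            x, y = r_mid * math.cos(phi), r_mid * math.sin(phi)
            for k in range(n_z):
                z = length * ((k + 0.5) / n_z - 0.5)
                d_sq = (x - target_dist) ** 2 + y ** 2 + z ** 2
                total += gamma_const * seg_activity / d_sq
    return total

# Dose rate falls quickly with distance from the active volume: compare
# contact with the container wall against a point further back.
near = surface_dose_rate(100.0, 1.0, radius=1.0, length=5.0, target_dist=1.1)
far = surface_dose_rate(100.0, 1.0, radius=1.0, length=5.0, target_dist=5.0)
print(near / far)
```

Far from the source the sum converges to the single-point-source result, a useful sanity check; the steep near-field gradient is what drives the recommendation to handle syringes at the rear of the barrel.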

Journal ArticleDOI
TL;DR: The authors intend the book to be an introduction to the field of radiation protection dosimetry, having just been published last year, it is able to take into account all the changes and developments that have come in recent ICRP and ICRU reports.
Abstract: The authors intend the book to be an introduction to the field of radiation protection dosimetry. Having just been published last year, the book is able to take into account all the changes and developments that have come in recent ICRP and ICRU reports. The book is aimed at graduate students or as a reference work for professionals in the field. I have to admit to disliking the typeface used, which is a pity as the standard of the illustrations is very good; I particularly liked the illustration of expanded and expanded and aligned fields. The book is divided into two parts; the first part deals with radiation quantities and units as defined by the ICRU and ICRP. The material is covered in a similar manner to the way it is presented in the ICRU documents, with a small amount of further explanation. All the important quantities are covered, including operational quantities, and there is a discussion of microdosimetry. Most quantities are well explained, although it is a shame that only half a page was devoted to Personal Dose Equivalent. It would have been good to see slightly more explanation of the use and measurement of this quantity and the relationship between and EDE. The book is intended as a teaching text and so care should have been taken not to include misleading information. For example, in discussing indirectly ionising radiation the authors state that photons do not cause ionisation; this is not true and could cause confusion in a student new to the field. Mislabelling the Radiation Weighting Factor as throughout one section does not help. The second part of the book deals with the levels of radiation exposure to the public and radiation workers from natural and man-made sources. The section on natural sources covers exposure of the public to extra-terrestrial and terrestrial sources, and includes a 20 page section on radon.
This is followed by information on population exposures to man-made sources, with sections covering power production, nuclear weapon testing, radionuclide production and uses, consumer products, accidents, and medical and industrial uses. Occupational radiation doses are covered for workers in the fuel cycle and power production, defence activities, industry and medicine. Most of the data in the second half of the book are gleaned from the UNSCEAR reports; however, a long table of references is given, making it a good starting point for further study. It is a pity that the risks associated with these exposures are not discussed. The only section on risk assessment is one page at the end of the first part; however, the current ICRP risk estimates are given, if not discussed. Would I recommend this book? Yes, if you do not have access to ICRU, ICRP and UNSCEAR reports and want to have the definitions of radiation quantities at your fingertips. Otherwise, at nearly £50, it could be a useful addition to a student library.

Journal ArticleDOI
TL;DR: A number of properties related to radon have been surveyed in underground railway stations in Hong Kong. The radon dose received by the night-time maintenance worker was not insignificant, that received by an average commuter was negligible, while that received by a day-time employee was between the two.
Abstract: A number of properties related to radon have been surveyed in underground railway stations in Hong Kong. These properties include the radon concentration, the total potential alpha energy concentration of radon daughters and the equilibrium factor. The mean values were different from those at dwelling sites, and the values for different railway lines were also very different, attributable to the different ventilation efficiencies and aerosol concentrations. The values for a chosen station were observed to show temporal variation in the morning, since the air-conditioning is switched off at night. From the measurements, sample calculations of the radon dose received by different categories of people were performed. It is seen that the radon dose received by the night-time maintenance worker was not insignificant, that received by an average commuter was negligible, while that received by a day-time employee was between the two.
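The dose comparison in this abstract follows the usual chain from measured radon concentration and equilibrium factor to an annual effective dose scaled by occupancy time. A minimal sketch of that arithmetic is given below; the dose coefficient, concentrations and occupancy hours are all illustrative assumptions, not figures from the paper.

```python
# Hedged sketch of radon dose arithmetic: radon concentration x equilibrium
# factor gives the equilibrium-equivalent concentration (EEC), which,
# multiplied by occupancy time and a dose coefficient, gives an annual
# effective dose. All numerical values below are illustrative assumptions.

DOSE_COEFF_MSV = 9e-6  # mSv per (Bq h m^-3) of EEC exposure (assumed value)

def annual_radon_dose(radon_bq_m3, equilibrium_factor, hours_per_year):
    """Annual effective dose in mSv for time spent in the measured atmosphere."""
    eec = radon_bq_m3 * equilibrium_factor  # equilibrium-equivalent concentration
    return eec * hours_per_year * DOSE_COEFF_MSV

# Illustrative occupancy categories mirroring the abstract's comparison:
maintenance_worker = annual_radon_dose(200, 0.4, 2000)  # night shifts, ventilation off
day_employee       = annual_radon_dose(100, 0.4, 2000)  # station staff, daytime
commuter           = annual_radon_dose(100, 0.4, 250)   # brief daily transits
```

With any physically sensible inputs the ranking reproduces the abstract's qualitative conclusion: the night-time worker, exposed to higher concentrations for long hours, receives the largest dose, and the commuter the smallest.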


Journal ArticleDOI
TL;DR: The third annual report produced by NEA on the data held on ISOE has recently been published, showing how multifarious are the influences on this quantity from various plant operations and how universal is the drive towards minimising occupational exposure.
Abstract: The Nuclear Energy Agency of the Organisation for Economic Cooperation and Development has set up an information system on occupational exposure (ISOE) received at nuclear power stations throughout the world. The third annual report produced by NEA on the data held on ISOE has recently been published. Data from 16 countries are included in the report, mainly from OECD countries, but some non-OECD countries are included through membership of IAEA, which is a co-sponsor of ISOE. The 1993 data are analysed and trends are given for the period 1969 to the end of 1993. At the end of 1993, data for 339 reactors were included in the database, which represents 78% of the operating reactors in the world. About half of these supplied data directly to ISOE, and other data for the report were obtained from annual reports produced by the utilities. The data are well presented, mainly in tabular and graphical form. The ISOE information system consists of three databases: NEA 1, NEA 2 and NEA 3. NEA 1 contains radiological data in the form of collective dose, and individual dose distributions, which can be analysed by country or type of reactor; these data are the main source of information for this report. However, only collective dose is analysed in the report and not individual dose data. NEA 2 includes information on dose control techniques and work management, whereas NEA 3 contains details of specific operations and experience gained in radiation protection procedures. All three databases are to be developed further. The total collective dose from all reactors in 1993 was 585 man Sv, which was 6% less than in 1992. This fall is a continuation of a steady reduction since the mid-1980s, there being a maximum of 913 man Sv in 1983. That peak was mainly caused by back-fitting work in PWR reactors required after the Three Mile Island incident. In recent years also, increased emphasis on dose minimisation has gradually reduced the annual collective dose. 
The collective dose per unit power generated in 1993 was 0.32 man Sv per TW h; this is the lowest value achieved in any year for utilities on the database. The report gives other analyses, such as the average outage period for refuelling: Finland was the lowest, with an average of 2 to 3 weeks, and Japan the highest, at around 20 weeks. The penultimate chapter gives brief reports from each participating country on the principal events that affected the collective dose in 1993, and shows how multifarious are the influences on this quantity from various plant operations. These national reports also demonstrate how universal is the drive towards minimising occupational exposure.
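The normalised figure quoted above is simple arithmetic: total collective dose divided by electricity generated. A small sketch follows; the 585 man Sv total, the 6% fall and the 0.32 man Sv per TW h are from the report, while the generation total and the 1992 dose are back-calculated here and therefore inferred, not reported.

```python
# Hedged sketch of the ISOE normalisation: collective dose divided by
# electricity generated gives man Sv per TW h. Reported inputs: 585 man Sv
# in 1993, a 6% fall from 1992, 0.32 man Sv per TW h. Derived values are
# inferred from these, not taken from the report.

def collective_dose_per_twh(total_man_sv, generation_twh):
    """Collective dose normalised by electricity generated (man Sv per TW h)."""
    return total_man_sv / generation_twh

generation_1993 = 585 / 0.32          # ~1828 TW h generated in 1993 (inferred)
rate_1993 = collective_dose_per_twh(585, generation_1993)

dose_1992 = 585 / (1 - 0.06)          # ~622 man Sv in 1992 (inferred from the 6% fall)
```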

Journal ArticleDOI
TL;DR: The variation in beta dose and gamma air kerma rates within a small area was investigated, and recommendations are made as to the number of replicates needed to reduce the standard error to a given fraction of the mean.
Abstract: As a by-product of the fabrication of uranium fuels, BNFL Springfields are authorised to discharge beta-emitting radionuclides, principally into the Ribble Estuary. The accumulation of these radionuclides in the estuarine sediments leads to enhanced beta dose rates not seen elsewhere in the UK. The methods used to determine the beta dose rate and to calculate organ and effective doses are presented. The variation in beta dose and gamma air kerma rates within a small area was investigated, and recommendations are made as to the number of replicates needed to reduce the standard error to a given fraction of the mean. An example of the dose to a walker on the intertidal areas of the Ribble Estuary is calculated by the detailed methods. The effective dose was low, of which 86% was derived from gamma irradiation. Doses to hands contaminated by estuarine sediments are also low.
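The replicate recommendation rests on the standard result that the standard error of the mean of n measurements is s/sqrt(n), so requiring it to be at most a fraction f of the mean gives n >= (s/(f * mean))^2. A minimal sketch of that calculation is below; the sample statistics and the 10% target are illustrative, since the abstract's actual target fraction is elided.

```python
# Hedged sketch of the replicate-count argument: the standard error of the
# mean of n measurements is s / sqrt(n), so requiring SE <= f * mean gives
# n >= (s / (f * mean))**2. The sample statistics and the 10% target below
# are illustrative assumptions, not figures from the paper.
import math

def replicates_needed(sample_sd, sample_mean, target_fraction):
    """Smallest n such that sample_sd / sqrt(n) <= target_fraction * sample_mean."""
    return math.ceil((sample_sd / (target_fraction * sample_mean)) ** 2)

# e.g. a 30% coefficient of variation and a target SE of 10% of the mean:
n = replicates_needed(sample_sd=0.3, sample_mean=1.0, target_fraction=0.1)
```

Note that the required n grows with the square of the coefficient of variation, which is why spatially variable dose rates over sediments can demand many replicates.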