
Showing papers in "Toxicological Reviews in 2004"


Journal ArticleDOI
TL;DR: Hydrogen peroxide is an oxidising agent that is used in a number of household products, including general-purpose disinfectants, chlorine-free bleaches, fabric stain removers, contact lens disinfectants and hair dyes, and it is a component of some tooth whitening products.
Abstract: Hydrogen peroxide is an oxidising agent that is used in a number of household products, including general-purpose disinfectants, chlorine-free bleaches, fabric stain removers, contact lens disinfectants and hair dyes, and it is a component of some tooth whitening products. In industry, the principal use of hydrogen peroxide is as a bleaching agent in the manufacture of paper and pulp. Hydrogen peroxide has been employed medicinally for wound irrigation and for the sterilisation of ophthalmic and endoscopic instruments. Hydrogen peroxide causes toxicity via three main mechanisms: corrosive damage, oxygen gas formation and lipid peroxidation. Concentrated hydrogen peroxide is caustic and exposure may result in local tissue damage. Ingestion of concentrated (>35%) hydrogen peroxide can also result in the generation of substantial volumes of oxygen. Where the amount of oxygen evolved exceeds its maximum solubility in blood, venous or arterial gas embolism may occur. The mechanism of CNS damage is thought to be arterial gas embolisation with subsequent brain infarction. Rapid generation of oxygen in closed body cavities can also cause mechanical distension and there is potential for the rupture of the hollow viscus secondary to oxygen liberation. In addition, intravascular foaming following absorption can seriously impede right ventricular output and produce complete loss of cardiac output. Hydrogen peroxide can also exert a direct cytotoxic effect via lipid peroxidation. Ingestion of hydrogen peroxide may cause irritation of the gastrointestinal tract with nausea, vomiting, haematemesis and foaming at the mouth; the foam may obstruct the respiratory tract or result in pulmonary aspiration. Painful gastric distension and belching may be caused by the liberation of large volumes of oxygen in the stomach. 
Blistering of the mucosae and oropharyngeal burns are common following ingestion of concentrated solutions, and laryngospasm and haemorrhagic gastritis have been reported. Sinus tachycardia, lethargy, confusion, coma, convulsions, stridor, sub-epiglottic narrowing, apnoea, cyanosis and cardiorespiratory arrest may ensue within minutes of ingestion. Oxygen gas embolism may produce multiple cerebral infarctions. Although most inhalational exposures cause little more than coughing and transient dyspnoea, inhalation of highly concentrated solutions of hydrogen peroxide can cause severe irritation and inflammation of mucous membranes, with coughing and dyspnoea. Shock, coma and convulsions may ensue and pulmonary oedema may occur up to 24-72 hours post exposure. Severe toxicity has resulted from the use of hydrogen peroxide solutions to irrigate wounds within closed body cavities or under pressure as oxygen gas embolism has resulted. Inflammation, blistering and severe skin damage may follow dermal contact. Ocular exposure to 3% solutions may cause immediate stinging, irritation, lacrimation and blurred vision, but severe injury is unlikely. Exposure to more concentrated hydrogen peroxide solutions (>10%) may result in ulceration or perforation of the cornea. Gut decontamination is not indicated following ingestion, due to the rapid decomposition of hydrogen peroxide by catalase to oxygen and water. If gastric distension is painful, a gastric tube should be passed to release gas. Early aggressive airway management is critical in patients who have ingested concentrated hydrogen peroxide, as respiratory failure and arrest appear to be the proximate cause of death. Endoscopy should be considered if there is persistent vomiting, haematemesis, significant oral burns, severe abdominal pain, dysphagia or stridor. Corticosteroids in high dosage have been recommended if laryngeal and pulmonary oedema supervene, but their value is unproven. 
Endotracheal intubation, or rarely, tracheostomy may be required for life-threatening laryngeal oedema. Contaminated skin should be washed with copious amounts of water. Skin lesions should be treated as thermal burns; surgery may be required for deep burns. In the case of eye exposure, the affected eye(s) should be irrigated immediately and thoroughly with water or 0.9% saline for at least 10-15 minutes. Instillation of a local anaesthetic may reduce discomfort and assist more thorough decontamination.
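The oxygen volumes behind the embolic and distension effects described above can be estimated from the decomposition stoichiometry (2 H2O2 → 2 H2O + O2). The sketch below is illustrative only and not from the review; the ingested volume, the 35% w/v concentration and the ideal-gas assumptions are ours:

```python
# Rough estimate of oxygen gas liberated by catalase-driven decomposition:
#   2 H2O2 -> 2 H2O + O2
# Assumptions (illustrative, not from the review): 35% w/v solution,
# complete decomposition, ideal gas at body temperature.

MW_H2O2 = 34.01   # g/mol
R = 0.08206       # L*atm/(mol*K)
T = 310.0         # K, approximately body temperature
P = 1.0           # atm

def o2_volume_ml(ml_ingested: float, concentration_w_v: float = 0.35) -> float:
    """Millilitres of O2 evolved from a dose of H2O2 solution."""
    grams_h2o2 = ml_ingested * concentration_w_v
    mol_h2o2 = grams_h2o2 / MW_H2O2
    mol_o2 = mol_h2o2 / 2.0            # 2 mol H2O2 yield 1 mol O2
    litres_o2 = mol_o2 * R * T / P
    return litres_o2 * 1000.0

# A single mouthful (~30 mL) of 35% hydrogen peroxide:
print(round(o2_volume_ml(30.0)))  # → 3927 (about 4 litres of gas)
```

Even with generous rounding, a few tens of millilitres of concentrated solution yield litres of oxygen, which is why the gas readily exceeds its solubility in blood.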

297 citations


Journal ArticleDOI
TL;DR: Despite differences, treatment of poisoning is nearly identical for BB and CCB, with some additional considerations given to specific BB.
Abstract: Calcium channel blockers (CCB) and β-blockers (BB) account for approximately 40% of cardiovascular drug exposures reported to the American Association of Poison Control Centers. However, these drugs represent >65% of deaths from cardiovascular medications. Furthermore, caring for patients poisoned with these medications can be extremely difficult. Severely poisoned patients may have profound bradycardia and hypotension that is refractory to standard medications used for circulatory support. Calcium plays a pivotal role in cardiovascular function. The flow of calcium across cell membranes is necessary for cardiac automaticity, conduction and contraction, as well as maintenance of vascular tone. Through differing mechanisms, CCB and BB interfere with calcium fluxes across cell membranes. CCB directly block calcium flow through L-type calcium channels found in the heart, vasculature and pancreas, whereas BB decrease calcium flow by modifying the channels via second messenger systems. Interruption of calcium fluxes leads to decreased intracellular calcium producing cardiovascular dysfunction that, in the most severe situations, results in cardiovascular collapse. Although CCB and BB have different mechanisms of action, their physiological and toxic effects are similar. However, differences exist between these drug classes and between drugs in each class. Diltiazem and especially verapamil tend to produce the most hypotension, bradycardia, conduction disturbances and deaths among the CCB. Nifedipine and other dihydropyridines are generally less lethal and tend to produce sinus tachycardia instead of bradycardia with fewer conduction disturbances. BB have a wider array of properties influencing their toxicity compared with CCB. BB possessing membrane stabilising activity are associated with the largest proportion of fatalities from BB overdose. Sotalol overdoses, in addition to bradycardia and hypotension, can cause torsade de pointes. 
Although BB and CCB poisoning can present in a similar fashion with hypotension and bradycardia, CCB toxicity is often associated with significant hyperglycaemia and acidosis because of complex metabolic derangements related to these medications. Despite differences, treatment of poisoning is nearly identical for BB and CCB, with some additional considerations given to specific BB. Initial management of critically ill patients consists of supporting airway, breathing and circulation. However, maintenance of adequate circulation in poisoned patients often requires a multitude of simultaneous therapies including intravenous fluids, vasopressors, calcium, glucagon, phosphodiesterase inhibitors, high-dose insulin, a relatively new therapy, and mechanical devices. This article provides a detailed review of the pharmacology, pathophysiology, clinical presentation and treatment strategies for CCB and BB overdoses.

204 citations


Journal ArticleDOI
TL;DR: The involvement of CYP enzymes in metabolism-related drug-herb interactions and the importance of gaining a mechanism-based understanding to avoid potential adverse drug reactions are highlighted, in addition to outlining other contributory factors, such as pharmacogenetics and recreational habits that may compound this important health issue.
Abstract: The metabolism of a drug can be altered by another drug or foreign chemical, and such interactions can often be clinically significant. Cytochrome P450 (CYP) enzymes, a superfamily of enzymes found mainly in the liver, are involved in the metabolism of a plethora of xenobiotics and have been shown to be involved in numerous interactions between drugs and food, herbs and other drugs. The observed induction and inhibition of CYP enzymes by natural products in the presence of a prescribed drug has (among other reasons) led to the general acceptance that natural therapies can have adverse effects, contrary to the popular beliefs in countries where there is an active practice of ethnomedicine. Herbal medicines such as St. John's wort, garlic, piperine, ginseng and ginkgo, which are freely available over the counter, have given rise to serious clinical interactions when co-administered with prescription medicines. Such adversities have spurred various pre-clinical and in vitro investigations on a series of other herbal remedies, with their clinical relevance remaining to be established. Although the presence of numerous active ingredients in herbal medicines, foods and dietary supplements complicates experimentation, the observable interactions with CYP enzymes warrant systematic studies, so that metabolism-based interactions can be predicted and avoided more readily. This article highlights the involvement of CYP enzymes in metabolism-related drug-herb interactions and the importance of gaining a mechanism-based understanding to avoid potential adverse drug reactions, in addition to outlining other contributory factors, such as pharmacogenetics and recreational habits that may compound this important health issue.

127 citations


Journal ArticleDOI
TL;DR: Glucose-6-phosphate dehydrogenase deficiency, a common but heterogeneous range of enzyme-deficient states that impairs the ability of the erythrocyte to respond to oxidant injury, is reviewed.
Abstract: The erythrocyte is a highly specialised cell with a limited metabolic repertoire. As an oxygen shuttle, it must continue to perform this essential task while exposed to a wide range of environments on each vascular circuit, and to a variety of xenobiotics across its lifetime. During this time, it must continuously ward off oxidant stress on the haeme iron, the globin chain and on other essential cellular molecules. Haemolysis, the acceleration of the normal turnover of senescent erythrocytes, follows severe and irreversible oxidant injury. A detailed understanding of the molecular mechanisms underlying oxidant injury and its reversal, and of the clinical and laboratory features of haemolysis is important to the medical toxicologist. This article will also briefly review glucose-6-phosphate dehydrogenase deficiency, a common but heterogeneous range of enzyme-deficient states that impairs the ability of the erythrocyte to respond to oxidant injury.

92 citations


Journal ArticleDOI
TL;DR: Urine alkalinization with high-flow urine output will enhance herbicide elimination and should be considered in all seriously poisoned patients; haemodialysis produces similar herbicide clearances without the need for urine pH manipulation or the administration of substantial amounts of intravenous fluid in an already compromised patient.
Abstract: Chlorophenoxy herbicides are used widely for the control of broad-leaved weeds. They exhibit a variety of mechanisms of toxicity including dose-dependent cell membrane damage, uncoupling of oxidative phosphorylation and disruption of acetyl-coenzyme A metabolism. Following ingestion, vomiting, abdominal pain, diarrhoea and, occasionally, gastrointestinal haemorrhage are early effects. Hypotension, which is common, is due predominantly to intravascular volume loss, although vasodilation and direct myocardial toxicity may also contribute. Coma, hypertonia, hyperreflexia, ataxia, nystagmus, miosis, hallucinations, convulsions, fasciculation and paralysis may then ensue. Hypoventilation is commonly secondary to CNS depression, but respiratory muscle weakness is a factor in the development of respiratory failure in some patients. Myopathic symptoms including limb muscle weakness, loss of tendon reflexes, myotonia and increased creatine kinase activity have been observed. Metabolic acidosis, rhabdomyolysis, renal failure, increased aminotransferase activities, pyrexia and hyperventilation have been reported. Substantial dermal exposure to 2,4-dichlorophenoxyacetic acid (2,4-D) has led occasionally to systemic features including mild gastrointestinal irritation and progressive mixed sensorimotor peripheral neuropathy. Mild, transient gastrointestinal and peripheral neuromuscular symptoms have occurred after occupational inhalation exposure. In addition to supportive care, urine alkalinization with high-flow urine output will enhance herbicide elimination and should be considered in all seriously poisoned patients. Haemodialysis produces similar herbicide clearances to urine alkalinization without the need for urine pH manipulation and the administration of substantial amounts of intravenous fluid in an already compromised patient.

88 citations


Journal ArticleDOI
TL;DR: The mechanisms of action of GHB are discussed, an approach to the treatment of overdose, abuse and addiction is presented, and the potential for new treatments for narcolepsy and alcoholism is outlined.
Abstract: γ-Hydroxybutyric acid (GHB) is a short-chain fatty acid that occurs naturally in mammalian brain where it is derived metabolically from γ-aminobutyric acid (GABA), the primary inhibitory neurotransmitter in the brain. GHB was synthesised over 40 years ago and its presence in the brain and a number of aspects of its biological, pharmacological and toxicological properties have been elucidated over the last 20–30 years. However, widespread interest in this compound has arisen only in the past 5–10 years, primarily as a result of the emergence of GHB as a major recreational drug and public health problem in the US. There is considerable evidence that GHB may be a neuromodulator in the brain. GHB has multiple neuronal mechanisms including activation of both the γ-aminobutyric acid type B (GABAB) receptor and a separate GHB-specific receptor. This complex GHB-GABAB receptor interaction is probably responsible for the protean pharmacological, electroencephalographic, behavioural and toxicological effects of GHB, as well as the perturbations of learning and memory associated with supra-physiological concentrations of GHB in the brain that result from the exogenous administration of this drug in the clinical context of GHB abuse, addiction and withdrawal. Investigation of the inborn error of metabolism succinic semialdehyde dehydrogenase (SSADH) deficiency and the murine model of this disorder (SSADH knockout mice), in which GHB plays a major role, may help dissect out GHB- and GABAB receptor-mediated mechanisms, in particular those operative in the molecular pathogenesis of GHB addiction and withdrawal, as well as in the absence seizures observed in GHB-treated animals.

83 citations


Journal ArticleDOI
TL;DR: Although not effective in all cases, hyperinsulinaemia/euglycaemia therapy is recommended in patients with severe CCA poisoning who present with hypotension and respond poorly to fluid, calcium salts, glucagon and catecholamine infusion.
Abstract: The inotropic effect of insulin has been long established. High-dose (0.5–1 IU/kg/hour) insulin, in combination with a glucose infusion to maintain euglycaemia (hyperinsulinaemia/euglycaemia therapy), has been proposed as a treatment for calcium channel antagonist (CCA) and β-adrenoceptor antagonist (β-blocker) poisonings. However, the basis for its beneficial effect is poorly understood. CCAs inhibit insulin secretion, resulting in hyperglycaemia and alteration of myocardial fatty acid oxidation. Similarly, blockade of β2-adrenoceptors in β-blocker poisoning results in impaired lipolysis, glycogenolysis and insulin release. Insulin administration switches cell metabolism from fatty acids to carbohydrates and restores calcium fluxes, resulting in improvement in cardiac contractility. Experimental studies in verapamil poisoning have shown that high-dose insulin significantly improved survival compared with calcium salts, epinephrine or glucagon. In several life-threatening poisonings in humans, the administration of high-dose insulin produced cardiovascular stabilisation, decreased the catecholamine vasopressor infusion rate and improved the survival rate. In a canine model of propranolol intoxication, high-dose insulin provided a sustained increase in systemic blood pressure, cardiac performance and survival rate compared with glucagon or epinephrine. In contrast, insulin had no effect on heart rate and electrical conduction in the myocardium. In another study, high-dose insulin reversed the negative inotropic effect of propranolol to 80% of control function and normalised heart rate. High-dose insulin produced a significant decrease in the left ventricular end-diastolic pressure and a significant increase in the stroke volume and cardiac output. The vasodilator effect was explained by an enhanced cardiac output leading to withdrawal of compensatory vasoconstriction. No clinical studies have yet been performed. 
Although not effective in all cases, we recommend hyperinsulinaemia/euglycaemia therapy in patients with severe CCA poisoning who present with hypotension and respond poorly to fluid, calcium salts, glucagon and catecholamine infusion. However, careful monitoring of blood glucose and serum potassium concentrations is required to avoid serious adverse effects. More clinical data are needed before this therapy can be recommended in β-blocker poisoning. There is a need for large prospective clinical trials to confirm safety and efficacy of hyperinsulinaemia/euglycaemia therapy in both CCA and β-blocker poisoning.

78 citations


Journal ArticleDOI
TL;DR: Although many GHB users will experience a mild withdrawal syndrome upon drug discontinuation, those with chronic heavy GHB use can experience severe withdrawal, and patients with fulminant GHB withdrawal require aggressive treatment with cross-tolerant sedative hypnotics, such as benzodiazepines.
Abstract: γ-Hydroxybutyrate (GHB) is an endogenous inhibitory neurotransmitter that, when administered in pharmacological doses, has sedative-hypnotic properties. It is used in anaesthesia, for the treatment of narcolepsy/cataplexy and in alcohol/opioid detoxification treatment regimens. Based on its purported anabolic effects, GHB use became established among bodybuilders. As the euphorigenic effects of GHB became publicised, attendees at dance clubs and rave parties began to use it alone or in combination with other psychoactive drugs. Following the ban of GHB in 1990, several precursor products (e.g. γ-butyrolactone, butanediol) became widely used as replacement drugs until their ultimate proscription from lawful use in 2000. GHB and its precursors, like most sedative-hypnotic agents, can induce tolerance and produce dependence. Although many GHB users will experience a mild withdrawal syndrome upon drug discontinuation, those with chronic heavy GHB use can experience severe withdrawal. This syndrome clinically resembles the withdrawal syndrome noted from alcohol and other sedative-hypnotic drugs (e.g. benzodiazepines). Distinct clinical features of GHB withdrawal are its relatively mild and brief autonomic instability with prolonged psychotic symptoms. Patients with fulminant GHB withdrawal require aggressive treatment with cross-tolerant sedative hypnotics, such as benzodiazepines.

64 citations


Journal ArticleDOI
TL;DR: This review examines the history of GHB analogue abuse, as well as the clinical presentation and management of acute intoxication and withdrawal associated with abuse of these compounds.
Abstract: γ-Hydroxybutyrate (GHB) is a GABA-active CNS depressant, commonly used as a drug of abuse. In the early 1990s, the US Drug Enforcement Administration (DEA) warned against the use of GHB and restricted its sale. This diminished availability of GHB caused a shift toward GHB analogues such as γ-butyrolactone (GBL) and 1,4-butanediol (1,4-BD) as precursors and surrogates. Both GBL and 1,4-BD are metabolically converted to GHB. Furthermore, GBL is commonly used as a starting material for chemical conversion to GHB. As such, the clinical presentation and management of GBL and 1,4-BD intoxication shares a great deal of common ground with that for GHB. This similarity exists not only for acute intoxication but also for withdrawal in those patients with a history of extended high-dose abuse. This review examines the history of GHB analogue abuse as well as the clinical presentation and management of acute intoxication and withdrawal associated with abuse of these compounds.

58 citations


Journal ArticleDOI
TL;DR: An approach to therapy is proposed, which is based on theoretical principles and evidence gleaned from currently available clinical data sets, and suggests that half the calculated loading dose, based on serum concentration, should be administered and the impact on clinical features observed; a second dose should be given in the event of recurrence of toxicity.
Abstract: Digitalis glycoside poisoning is an important clinical problem and the development of digoxin-specific antibody fragments (Fab) 30 years ago has changed clinical practice. Nevertheless, doubts still exist as to the appropriate dose and indications for therapy. This paper reviews relevant literature, describes the difficulties associated with current treatment protocols and proposes an approach to therapy, which is based on theoretical principles and evidence gleaned from currently available clinical data sets. In patients with 'acute' poisoning, serum digoxin concentrations do not equate to the total body burden, as tissue distribution will not have occurred, and the calculations for present protocols, which use serum concentrations, are therefore likely to result in too much antibody being administered. Since a therapeutic quantity of digoxin will have little effect in a normal individual, complete neutralisation of all digoxin is also unnecessary. The pharmacokinetic and dynamic logic of using a smaller initial loading dose than predicted from total body calculations is rational. It is recommended that half the calculated loading dose, either based on serum concentration or history, should be administered and the impact on clinical features observed. If a clinical response is not seen within 1-2 hours, a further similar dose should be given. In the event of a full response, patients should be monitored for 6-12 hours; a second dose should only be given in the event of recurrence of toxicity. In patients with 'chronic' digoxin poisoning, the serum digoxin concentration will reflect the total body load. However, since such patients are invariably receiving digoxin for therapeutic purposes, full neutralisation is again not indicated. In addition, tissue redistribution of digoxin from deeper stores will occur following the binding of biologically active digoxin in the circulation. 
This process will occur over a number of hours and if the total calculated dose of antibody is administered in a single bolus, significant quantities will be excreted prior to redistribution of digoxin. Pharmacokinetic logic, therefore, suggests that half the calculated loading dose, based on serum concentration, should be administered and the impact on clinical features observed; a second dose should be given in the event of recurrence of toxicity.
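The half-dose logic proposed in the review can be sketched numerically. The volume of distribution (~5.6 L/kg) and vial binding capacity (each 40 mg Fab vial binds ~0.5 mg digoxin) are standard published figures rather than values given in this abstract, so treat this as an illustrative sketch only, never a dosing tool:

```python
import math

# Illustrative sketch of the proposed protocol: give half the
# body-load-based digoxin-Fab dose, repeat only if toxicity recurs.
# Constants are standard published figures, not from this abstract.
VD_L_PER_KG = 5.6          # apparent volume of distribution of digoxin
MG_BOUND_PER_VIAL = 0.5    # digoxin bound by one 40 mg Fab vial

def fab_vials_full_neutralisation(serum_ng_ml: float, weight_kg: float) -> float:
    """Vials needed to neutralise the estimated total body load."""
    body_load_mg = serum_ng_ml * VD_L_PER_KG * weight_kg / 1000.0
    return body_load_mg / MG_BOUND_PER_VIAL

def fab_initial_dose(serum_ng_ml: float, weight_kg: float) -> int:
    """Half the calculated dose, rounded up, per the proposed approach."""
    return math.ceil(fab_vials_full_neutralisation(serum_ng_ml, weight_kg) / 2.0)

# Example: serum digoxin 4 ng/mL in a 70 kg patient.
# Full neutralisation estimate: 4 * 5.6 * 70 / 1000 / 0.5 ≈ 3.1 vials.
print(fab_initial_dose(4.0, 70.0))  # → 2 vials initially
```

The point of the half-dose is visible in the arithmetic: the full-neutralisation estimate overshoots in acute poisoning (distribution incomplete) and is unnecessary in chronic poisoning (therapeutic digoxin need not be neutralised).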

46 citations


Journal ArticleDOI
TL;DR: This review assesses the evidence regarding the effects of occupational exposure to organic solvents on colour discrimination and investigates exposure-response relationships and reversibility, and considers the current state of knowledge of the possible mechanisms underlying changes in colour vision, and the human health significance of any reported changes.
Abstract: This review assesses the evidence regarding the effects of occupational exposure to organic solvents on colour discrimination and investigates exposure-response relationships and reversibility. This review also considers the current state of knowledge of the possible mechanisms underlying changes in colour vision, and the human health significance of any reported changes. Among the commonly used organic solvents, styrene has been investigated the most thoroughly. Studies of styrene-exposed workers in Germany, Italy and Japan provide a sufficiently consistent body of evidence to support a robust conclusion that styrene does cause an impairment of colour discrimination relative to age-matched controls. Generally, the impairment of colour discrimination observed in styrene-exposed workers tends to be of the tritan (blue-yellow) type, although some cases of red-green impairment have also been found. The limited information available on exposure-response relationships indicates that the effects on colour discrimination would not be expected at 8-hour time weighted average (8 h TWA) exposures <20 ppm, although a precise threshold cannot be determined. The data on reversibility are limited and inconclusive. The results from the most rigorous study in which this aspect was investigated point to a reversibility of effects after a 4-week exposure-free period, whereas results from a study with limitations suggest a persistence of effect. The effects of toluene, tetrachloroethylene or mixed solvent exposure have also been investigated, although the information available is generally less reliable than for styrene. For toluene, it can be confidently concluded that this solvent does not have an acute effect on colour discrimination, even when exposures are relatively high (50-150 ppm 8 h TWA, and 290-360 ppm 30 minutes TWA). However, studies are inconclusive on whether long-term or repeated exposure to toluene can cause a persistent impairment of colour discrimination. 
There are few studies that have specifically investigated the effects of tetrachloroethylene on colour discrimination. Among these studies, none has examined the potential for any acute effects of this solvent vapour. A large-scale study in Japanese workers showed no effects of long-term exposure to tetrachloroethylene concentrations in the region of 12-13 ppm. However, the test methodology used was relatively insensitive to changes in colour discrimination, hence the results do not provide reassurance for an absence of subtle effects. A study in Italian dry-cleaners suggested a slight impairment of colour discrimination relative to controls, associated with relatively low exposures to tetrachloroethylene (mean 8 h TWA exposure approximately 6 ppm). The studies concerning the effects of mixed solvent exposure on colour discrimination are based on workers exposed to solvents in paints and lacquers, workers from the printing and petrochemical industries, people working in or living near to microelectronics factories and children exposed to solvents prenatally. However, these studies are subject to design limitations or methodological irregularities, such that no conclusions regarding the effects of mixed solvent exposure on colour discrimination can be drawn. Overall, the only credible evidence for an effect of solvents on colour discrimination derives from the studies on styrene. Because of limitations in the data for other solvents it is not possible to determine whether the evidence for styrene reflects a generic property of solvents. The mechanisms of styrene-induced effects on colour discrimination have not been properly investigated and can only be the subject of speculation. One conclusion that can be drawn is that pathological changes to the ocular system, such as changes to the lens, are unlikely to be involved. 
This is because there is an absence of convincing evidence for such changes from medical examinations conducted in epidemiological studies of solvent-exposed workers. Also, it seems unlikely that effects on colour discrimination are a nonspecific consequence of more generalised CNS depression, given that styrene-induced effects on colour discrimination appear to occur below the threshold for narcotic effects. The effects of styrene on colour discrimination are subtle and involve an impairment of the ability to discriminate accurately between closely related shades of the same colour rather than 'colour blindness'. There is no valid basis for using colour discrimination as a marker for other forms of solvent-induced neurotoxicity.

Journal ArticleDOI
TL;DR: There has not been an appropriately designed empirical evaluation of the diagnostic utility of the osmole gap, and its clinical utility remains hypothetical, having been theoretically extrapolated from the non-poisoned population.
Abstract: The rapid and accurate diagnosis of toxic alcohol poisoning due to methanol (methyl alcohol) [MeOH] and ethylene glycol (EG) is paramount in preventing serious adverse outcomes. The quantitative measurement of specific serum levels of these substances using gas chromatography is expensive, time consuming and generally only available at major tertiary-care facilities. Therefore, because these toxic substances are osmotically active and the measurement of serum osmolality is easily performed and more readily available, the presence of an osmole gap (OG) has been adopted as an alternative screening test. By definition, the OG is the difference between the measured serum osmolality determined using the freezing point depression (Osmm) and the calculated serum molarity (Mc), which is estimated from the known and readily measurable osmotically active substances in the serum, in particular sodium, potassium, urea, glucose and ethanol (alcohol). Thus, the OG = Osmm − Mc, and an OG above a specific threshold (the threshold of positivity) suggests the presence of unmeasured osmotically active substances, which could be indicative of a toxic exposure. The objectives of this study were to review the principles of evaluating screening tests, the theory behind the OG as a screening test and the literature upon which the adoption of the OG as a screening test has been based. This review revealed that there have been numerous equations derived and proposed for the estimation of the Mc, with the objective of developing empirical evidence of the best equation for the determination of the OG and ultimately the utility of OG as a screening test. However, the methods and statistical analysis employed have generally been inconsistent with recommended guidelines for screening test evaluation and, although many equations have been derived, they have not been appropriately validated. 
Specific evidence of the clinical utility of the OG requires that a threshold of positivity be definitively established, and the sensitivity and specificity of the OG in patients exposed to either EG or MeOH be measured. However, the majority of studies to date have only evaluated the relationship between the Osmm (mmol/kg H2O) and the Mc (mmol/L) in patients that have not been exposed to either MeOH or EG. While some studies have evaluated the relationship between the OG and serum ethanol concentration, these findings cannot be extrapolated to the use of the OG to screen for toxic alcohol exposure. This review shows that there has not been an appropriately designed empirical evaluation of the diagnostic utility of the OG and that its clinical utility remains hypothetical, having been theoretically extrapolated from the non-poisoned population.
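The OG = Osmm − Mc arithmetic can be made concrete with a toy example. The Mc formula below is one of the many published variants the review criticises; the choice of variant and the example values are ours:

```python
# Illustrative osmole-gap calculation using one commonly published
# estimate of calculated serum molarity (many variants exist):
#   Mc = 2*[Na] + [glucose] + [urea] + [ethanol]   (all in mmol/L)

def calculated_molarity(na: float, glucose: float, urea: float,
                        ethanol: float = 0.0) -> float:
    """One of many published estimates of serum molarity, Mc (mmol/L)."""
    return 2 * na + glucose + urea + ethanol

def osmole_gap(measured_osmolality: float, na: float, glucose: float,
               urea: float, ethanol: float = 0.0) -> float:
    """OG = Osmm - Mc, as defined in the review (units differ, as noted)."""
    return measured_osmolality - calculated_molarity(na, glucose, urea, ethanol)

# Example: Osmm 320 mmol/kg; Na 140, glucose 5, urea 5 (mmol/L); no ethanol.
print(osmole_gap(320, 140, 5, 5))  # → 30
```

A gap of 30 would exceed most proposed thresholds of positivity, but, as the review stresses, no such threshold has been empirically validated against confirmed MeOH or EG exposures.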

Journal ArticleDOI
TL;DR: There is considerable experimental evidence to support the hypothesis that diazepam (and other anticonvulsants) may prevent structural damage to the central nervous system as evidenced by neuropathological changes such as neuronal necrosis at autopsy.
Abstract: The main site of action of diazepam, as with other benzodiazepines, is at the GABA(A) receptor, although it has been suggested that some of the potentially beneficial actions of diazepam in nerve agent poisoning are mediated through other means. It is likely that convulsions may have long-term sequelae in the central nervous system, because of damage by anoxia and/or excitotoxicity. Numerous pharmacodynamic studies of the action of diazepam in animals experimentally poisoned with nerve agents have been undertaken. In nearly all of these, diazepam has been studied in combination with other antidotes, such as atropine and/or pyridinium oximes, sometimes in combination with pyridostigmine pretreatment. These studies show that diazepam is an efficacious anticonvulsant in nerve agent poisoning. There is considerable experimental evidence to support the hypothesis that diazepam (and other anticonvulsants) may prevent structural damage to the central nervous system, as evidenced by neuropathological changes such as neuronal necrosis at autopsy. In instances of nerve agent poisoning during terrorist use in Japan, diazepam seems to have been an effective anticonvulsant. Consequently, the use of diazepam is an important part of the treatment regimen of nerve agent poisoning, the aim being to prevent convulsions or reduce their duration. Diazepam should be given to patients poisoned with nerve agents whenever convulsions or muscle fasciculation are present. In severe poisoning, diazepam administration should be considered even before these complications occur. Diazepam is also useful as an anxiolytic in those exposed to nerve agents.

Journal ArticleDOI
TL;DR: The present clinical use of serum osmometry is erroneous because of the incorrect assumption that serum behaves as a dilute ‘ideal’ solution and that the osmotic activity of a substance depends solely on the number of solute particles.
Abstract: The present clinical use of serum osmometry is erroneous in two respects. The first, and the most important, is the incorrect assumption that serum behaves as a dilute 'ideal' solution and that the osmotic activity of a substance depends solely on the number of solute particles. The amount of variance from ideal behaviour of serum containing an exogenous substance is expressed by the osmotic coefficient (φ). We have calculated the osmotic coefficient for serum containing ethanol (alcohol) and recommend that the osmotic coefficient for serum containing other low molecular weight substances, such as methanol (methyl alcohol), isopropyl alcohol and ethylene glycol, also be calculated. This is necessary for the accurate calculation of the contribution of these substances to the serum osmolality. Secondly, the practice of subtracting the calculated serum molarity from measured serum osmolality is not valid since it represents a mathematically improper expression. The units of these two terms are different. The 'osmole gap' (OG) is typically viewed as the difference between serum osmolality determined by an osmometer and the estimated total molarity of solute in serum, obtained by directly measuring the concentrations of several substances and then substituting them into a published formula. Some authors call this sum the calculated or estimated osmolarity but, because the concentrations are measured directly and not with an osmometer, the calculated term represents molarity. The units of osmolality are mmol/kg of H2O and the units of molarity are mmol/L. Therefore, the practice of subtracting calculated serum molarity from measured serum osmolality is not mathematically sound and is an oversimplification for ease of application. This mathematical transgression necessarily adds an error to the incorrectly calculated OG. Despite this, the OG is commonly used in clinical medicine. 
Serum osmolality can be converted to molarity provided the weight percentage and the density of the solution are known; thus, we recommend that this conversion be done prior to calculation of the gap. We recommend that the gap between measured serum osmolarity and calculated serum molarity be called the 'osmolar gap'. After correcting for the non-ideality of serum and for the inconsistency of units, the standard value and reference range for this gap must be determined in an adequate number of patient populations and in a variety of clinical settings. An example of this determination, using data from a group of ethanol-poisoned patients, is given. This correction should be applied before the evaluation of the osmolar gap as a screening test for other low molecular weight substances proceeds.
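The unit conversion the authors recommend can be illustrated with a short Python sketch. The density (≈1.025 kg/L) and water mass fraction (≈0.93) used here are illustrative values typical of normal serum, not figures taken from this paper; actual use would require measured values for the specimen, along with the osmotic-coefficient correction the abstract describes.

```python
def osmolality_to_osmolarity(osmolality, density=1.025, water_fraction=0.93):
    """Convert measured serum osmolality (mmol/kg H2O) to osmolarity
    (mmol/L of solution):

        mmol/L = (mmol/kg H2O) * (kg solution / L) * (kg H2O / kg solution)

    density (kg/L) and water_fraction are illustrative values typical
    of normal serum; they must be determined for the actual specimen.
    """
    return osmolality * density * water_fraction


def osmolar_gap(measured_osmolality, calculated_molarity,
                density=1.025, water_fraction=0.93):
    """The 'osmolar gap' proposed by the authors: measured osmolarity
    minus calculated molarity, with both terms now in mmol/L, avoiding
    the unit mismatch of the classical osmole gap."""
    measured_osmolarity = osmolality_to_osmolarity(
        measured_osmolality, density, water_fraction)
    return measured_osmolarity - calculated_molarity


# Illustrative: Osmm 290 mmol/kg H2O, Mc 275 mmol/L
print(round(osmolar_gap(290.0, 275.0), 1))  # 1.4
```

Note that converting 290 mmol/kg H2O to roughly 276 mmol/L shrinks the apparent gap by about 14 mmol/L in this example, which is why the authors argue that reference ranges must be re-established after the correction.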

Journal ArticleDOI
TL;DR: Emergency toxicological analyses that could influence immediate patient management such as iron, lithium and paracetamol (acetaminophen), are relatively few in number and are remarkably similar worldwide.
Abstract: Many acutely poisoned patients are treated with no laboratory help other than general clinical chemistry and haematology. Emergency toxicological analyses (24-hour availability) that could influence immediate patient management such as iron, lithium and paracetamol (acetaminophen), are relatively few in number and are remarkably similar worldwide. These assays should be provided at hospitals with large accident and emergency departments. More complex, less frequently needed clinical toxicological assays that can often be offered on a less urgent basis are usually provided from regional or national centres because of the need to make best use of resources. Recommendations as to the assays that should be provided locally and at regional centres are available for the UK and US, and are generally applicable. Regional centres normally diversify into specialised therapeutic drug monitoring, urine screening for drugs of abuse, metals analysis and sometimes forensic work in order to widen the repertoire of tests available and to increase funding. Whatever the type and quantity of work undertaken and the instrumentation used, guidelines are now available delineating staff training, method validation, assay operation, quality control/quality assurance, and indeed virtually all other aspects of laboratory operation. These considerations notwithstanding, clinical interpretation of analytical results remains a difficult area and is the responsibility of the reporting laboratory, at least in the first instance.

Journal ArticleDOI
TL;DR: Clinical manifestations of latex allergy depend on the route of exposure and occur by direct contact either with skin or mucosa, or by inhalation, and there appears to be a positive correlation between protein content and allergenicity of gloves.
Abstract: Latex allergy continues to be an important occupational health problem as latex products are used increasingly worldwide, particularly in healthcare. Although there are few epidemiological studies on the incidence of latex allergy, there has been an increase in the number of case reports over the last 10 years and, based on skin-prick tests, estimates of the prevalence of latex allergy in healthcare workers range from 2% to 17%. The allergic health effects arise either from the latex proteins, generally causing a type I immediate hypersensitivity reaction, or from the chemicals added to latex during processing, causing a type IV delayed hypersensitivity reaction. Clinical manifestations of latex allergy depend on the route of exposure and occur by direct contact either with skin or mucosa, or by inhalation. The diagnosis of latex allergy is based on the history, skin tests, serological tests and challenge tests. Thirteen latex allergens have been identified and isolated so far from natural rubber latex. They differ in their potential to elicit immunological responses in individuals allergic to latex and thus have been designated as major or minor allergens. In latex gloves, cornstarch powder used as a donning agent carries latex proteins, thereby increasing inhalational and mucosal exposure to latex proteins. There also appears to be a positive correlation between the protein content and allergenicity of gloves. The use of powder-free, low-protein gloves is effective in reducing symptoms and markers of sensitisation. Alternatives to latex gloves, such as nitrile or vinyl gloves, are available but may be inferior with respect to manual dexterity and biological impermeability.

Journal ArticleDOI
TL;DR: Investigation of GHB and related compounds can be clinically useful in confirming the cause of coma in an overdose patient, determining its potential role in a postmortem victim, as well as evaluating its use in a drug-facilitated sexual assault victim.
Abstract: Methods for the laboratory detection of gamma-hydroxybutyrate (GHB) were published as early as the 1960s. However, wide-scale use of GHB during the 1990s led to the development of the current analytical methods to test for GHB and related compounds. Detection of GHB and related compounds can be clinically useful in confirming the cause of coma in an overdose patient, determining its potential role in a postmortem victim, and evaluating its use in a drug-facilitated sexual assault victim. The sensitivity of an analytical method must be known in order to determine its usefulness and clinical application. Most laboratory cut-off levels are based on instrument sensitivity and will not distinguish endogenous from exogenous GHB levels. Interpretation of GHB levels must draw on knowledge of endogenous GHB, the metabolism of GHB and related compounds, and postmortem generation. Owing to potential analytical limitations of the various GHB methods, it is clinically relevant to specifically request analysis for GHB, as well as for related compounds if they are also in question. Various storage conditions (collection time, type of container, use of preservatives, storage temperature) can also affect the analysis and interpretation of GHB and related compounds.

Journal ArticleDOI
Bryan Ballantyne1
TL;DR: From a knowledge of threshold and no-effect concentrations, a workplace permissible vapour exposure concentration can be developed along with industrial hygiene precautionary measures.
Abstract: Glaucopsia is a transient disturbance of vision that results from the development of corneal epithelial oedema and associated microcysts produced by exposure to the vapour of certain industrial chemicals, notably aliphatic, alicyclic and heterocyclic amines. After a latent period of a few hours of exposure, there is typically a blurring of vision, objects take on a blue-grey appearance and halos develop around bright objects. Corneal changes can be seen on biomicroscopy and the increase in corneal thickness is measurable by pachymetry. At concentrations higher than threshold values, visual acuity may be decreased, but contrast sensitivity is a better measure of visual effects. Once exposure to the causative vapour ceases, vision returns to normal in a few hours without leaving permanent ocular sequelae. The vapour concentration of the causative amine is a major factor in the development of glaucopsia, and a concentration-effect relationship is usually evident. A correlation exists between the vapour concentration, degree of corneal oedema, corneal thickness and subjective symptoms, which permits no-effect and threshold-effect concentrations to be determined. The disturbance of vision is a nuisance factor that may impair work efficiency, predispose to physical accidents and hinder the performance of coordinated tasks (e.g. driving). As a consequence, development of glaucopsia is considered a hazard and is thus an important consideration in assessing workplace safety. From a knowledge of threshold and no-effect concentrations, a workplace permissible vapour exposure concentration can be developed along with industrial hygiene precautionary measures.

Journal ArticleDOI
TL;DR: It is important that investigational drugs with the potential to produce hypersensitivity reactions be identified as early in the development process as possible and use of well understood models that have been thoroughly validated is imperative.
Abstract: Drug-induced hypersensitivity is an adverse reaction, characterised by damaging immune-mediated responses, initiated by a medicine given at therapeutic doses for prevention, diagnosis or treatment. Immune-mediated drug hypersensitivity accounts for 6-10% of adverse drug reactions, which rank between the fourth and sixth leading causes of death in the US, at a cost of $US30 billion annually (1995 value). In addition, the costs of not determining the potential of a drug to produce hypersensitivity in the pre-clinical phase of drug development can be substantial. It has been estimated that the pre-clinical phase and clinical phase I, phase II and phase III costs are approximately $US6 million, $US12 million, $US12 million and $US100 million per drug, respectively (1999 values). It is important that investigational drugs with the potential to produce hypersensitivity reactions be identified as early in the development process as possible. Some adverse reactions to drugs can be avoided if drug-drug interactions are known or if a structure-activity relationship has been established. However, these methods are inadequate. Appropriate animal models of drug-induced hypersensitivity are needed, especially because hypersensitivity has been cited as the leading reason for taking drugs off the market. It is of critical importance to be able to predict hypersensitivity reactions to drugs. Most anaphylactic reactions occur in atopic individuals. Similarly, patients who have experienced other hypersensitivity reactions are more likely to have recurrent reactions. Therefore, animal models should be considered that predispose the animal to the reaction, such as the use of appropriate adjuvants and species. Using known positive controls of varying strengths, the investigator can rank the reaction against the positive controls as standards. This approach might yield greater results in a shorter period of time than using novel models. 
For the greatest safety, use of well understood models that have been thoroughly validated is imperative.

Journal ArticleDOI
TL;DR: Bronchial hyperresponsiveness to inhaled methacholine can be measured across individual workshifts to assess work-related change, or at the end of a work period and compared with values following a period away from work.
Abstract: Inhalation of a range of agents can result in airway inflammation and/or irritation. This may result in occupational asthma or reactive airways dysfunction syndrome. Reactive airways dysfunction syndrome follows a single large exposure to a chemical agent but is now frequently embraced under the wider term of irritant-induced asthma, a term that also includes asthma due to persistent, lower dose irritant exposures. Bronchial hyperresponsiveness is a hallmark of both occupational asthma and reactive airways dysfunction syndrome, although some patients with occupational asthma may occasionally have typical clinical features without increased bronchial hyperresponsiveness. Following removal of the causal agent in occupational asthma, bronchial hyperresponsiveness generally returns towards normal over a 2-year period, although some individuals demonstrate increased bronchial hyperresponsiveness for longer. Measurement of specific bronchial hyperresponsiveness to the primary causal agent in occupational asthma is used diagnostically but not for assessing prognosis. Bronchial hyperresponsiveness to inhaled methacholine can be measured across individual workshifts to assess work-related change. It may also be measured at the end of a work period when exposure has occurred, and compared with values following a period away from work. There have been no direct, systematic comparisons of changes in methacholine responsiveness in the diagnosis of occupational asthma compared with the more frequently used serial peak flow measurements. Patients with reactive airways dysfunction syndrome classically exhibit non-specific bronchial hyperresponsiveness, which can be readily measured by evaluating responses to inhaled methacholine. Bronchial hyperresponsiveness in reactive airways dysfunction syndrome can persist for many years after initial exposure and serial changes can be used to assess recovery and subsequent disability over time.