
Showing papers by "Gian Luca Salvagno published in 2007"


Journal ArticleDOI
TL;DR: The use of EDTA for measuring cytokines, protein and peptides, and cardiac markers is described, with an outline of the protection of labile molecules provided by this anticoagulant.
Abstract: Anticoagulants are used to prevent clot formation both in vitro and in vivo. In the specific field of in vitro diagnostics, anticoagulants are commonly added to collection tubes either to maintain blood in the fluid state for hematological testing or to obtain suitable plasma for coagulation and clinical chemistry analyses. Unfortunately, no universal anticoagulant that could be used for evaluation of several laboratory parameters in a sample from a single test tube is available so far. Ethylenediamine tetraacetic acid (EDTA) is a polyprotic acid containing four carboxylic acid groups and two amine groups with lone-pair electrons that chelate calcium and several other metal ions. Calcium is necessary for a wide range of enzyme reactions of the coagulation cascade and its removal irreversibly prevents blood clotting within the collection tube. Historically, EDTA has been recommended as the anticoagulant of choice for hematological testing because it allows the best preservation of cellular components and morphology of blood cells. The remarkable expansion in laboratory test volume and complexity over recent decades has amplified the potential spectrum of applications for this anticoagulant, which can be used to stabilize blood for a variety of traditional and innovative tests. Specific data on the behavior of EDTA as an anticoagulant in hematology, including possible pitfalls, are presented. The use of EDTA for measuring cytokines, protein and peptides, and cardiac markers is described, with an outline of the protection of labile molecules provided by this anticoagulant. The use of EDTA in proteomics and in general clinical chemistry is also described in comparison with other anticoagulants and with serum samples. Finally, the possible uses of alternative anticoagulants instead of EDTA and the potential use of a universal anticoagulant are illustrated.

197 citations


Journal Article
TL;DR: The results of this case-control study demonstrate that physiological pregnancy is associated with a substantial modification of the lipid and lipoprotein metabolism from the second trimester, providing reference ranges for traditional and emerging cardiovascular risk predictors throughout the gestational period.
Abstract: Physiologic pregnancy is associated with a broad series of metabolic adaptations which may also influence the metabolism of lipids and lipoproteins. Although the modification of serum lipids and lipoproteins has been exhaustively investigated during and after pregnancy, the relative changes recorded vary widely among the different studies. A comprehensive lipid and lipoprotein profile was evaluated in 57 women with uncomplicated pregnancies at different gestational ages (20 in the first, 20 in the second, and 17 in the third trimester of pregnancy) and compared to that of 21 non-pregnant women. Conventional lipid parameters, including total cholesterol, high-density lipoprotein cholesterol and triglycerides, were evaluated on the Modular System P. Low-density lipoprotein cholesterol was quantified by the formula of Friedewald, the atherogenic index of plasma was calculated as log (triglycerides/high-density lipoprotein cholesterol), whereas lipoprotein(a) was assayed on the BN II nephelometric analyzer. We observed that all the lipid parameters tested were significantly modified by the gestational age; in particular, women in the second and third trimester displayed significantly increased total cholesterol, low-density lipoprotein cholesterol, high-density lipoprotein cholesterol, total to high-density lipoprotein cholesterol ratio, lipoprotein(a) and atherogenic index of plasma (third trimester only) when compared to either the control population or the subgroup of women in the first trimester of pregnancy. The value distributions and the relative percentage of women with undesirable or abnormal values according to the current NCEP or AHA/ACC goals were comparable between controls and women in the first trimester. However, when compared with either controls or women in the first trimester, advanced pregnancy was associated with an increased prevalence of undesirable or abnormal values for total cholesterol, low-density lipoprotein cholesterol and triglycerides in the second trimester, and total cholesterol, low-density lipoprotein cholesterol, triglycerides, total to high-density lipoprotein cholesterol ratio and lipoprotein(a) (the latter only versus non-pregnant women) in the third trimester. The results of this case-control study demonstrate that physiological pregnancy is associated with a substantial modification of the lipid and lipoprotein metabolism from the second trimester, providing reference ranges for traditional and emerging cardiovascular risk predictors throughout the gestational period.
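Both derived lipid indices named above are simple arithmetic transforms of the measured parameters. The sketch below shows the standard Friedewald estimate (assuming concentrations in mg/dL) and the atherogenic index of plasma as the base-10 logarithm of the molar triglyceride to HDL-cholesterol ratio; the numeric values are illustrative only, not data from the study.

```python
import math

def friedewald_ldl_mg_dl(total_chol, hdl_chol, triglycerides):
    """Estimate LDL cholesterol (mg/dL) with the Friedewald formula.
    Valid only for fasting samples with triglycerides below ~400 mg/dL."""
    return total_chol - hdl_chol - triglycerides / 5.0

def atherogenic_index_of_plasma(triglycerides_mmol_l, hdl_mmol_l):
    """Atherogenic index of plasma: log10 of the molar triglyceride/HDL-C ratio."""
    return math.log10(triglycerides_mmol_l / hdl_mmol_l)

# Illustrative values only.
print(friedewald_ldl_mg_dl(total_chol=220, hdl_chol=55, triglycerides=150))             # 135.0
print(round(atherogenic_index_of_plasma(triglycerides_mmol_l=1.7, hdl_mmol_l=1.4), 3))  # 0.084
```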

136 citations


Journal ArticleDOI
TL;DR: The present document, issued by the Italian Inter-society SIBioC-SIMeL-CISMEL (Society of Clinical Biochemistry and Clinical Molecular Biology-Italian Society of Laboratory Medicine-Italian Committee for Standardization of Hematological and Laboratory Methods) Study Group on Extra-analytical Variability, reviews the major causes of unsuitable specimens in clinical laboratories, providing consensus recommendations for detection and management.
Abstract: A large body of evidence attests that quality programs developed around the analytical phase of the total testing process would only produce limited improvements, since the large majority of errors encountered in clinical laboratories still prevails within extra-analytical areas of testing, especially in manually intensive preanalytical processes. Most preanalytical errors result from system flaws and insufficient audit of the operators involved in specimen collection and handling responsibilities, leading to an unacceptable number of unsuitable specimens due to misidentification, in vitro hemolysis, clotting, inappropriate volume, wrong container or contamination from infusive routes. Detection and management of unsuitable samples are necessary to overcome this variability. The present document, issued by the Italian Inter-society SIBioC-SIMeL-CISMEL (Society of Clinical Biochemistry and Clinical Molecular Biology-Italian Society of Laboratory Medicine-Italian Committee for Standardization of Hematological and Laboratory Methods) Study Group on Extra-analytical Variability, reviews the major causes of unsuitable specimens in clinical laboratories, providing consensus recommendations for detection and management.

105 citations


Journal ArticleDOI
TL;DR: It has been suggested that IMA serum concentrations in patients with extremely low or high serum albumin levels may be unreliable and lacking in clinically informative value, and adjustment of IMA values for serum albumin concentration may be a feasible option.
Abstract: The diagnostic approach to acute myocardial infarction (AMI) is still one of the most challenging and controversial medical issues. Although the measurement of cardiospecific troponins has emerged as the gold standard for management of patients with acute chest pain, it is often unsuitable for early diagnosis, as nearly 50% of patients may present with non-diagnostic concentrations (1). Ischemia-modified albumin (IMA), measured by the albumin cobalt binding (ACB) colorimetric assay, has recently been proposed for the detection of early myocardial and skeletal muscle ischemia (2–4). During ischemia, the generation of reactive oxygen species reduces the metal-binding capacity of albumin for cobalt and other transition metals (5). Accordingly, any IMA increase over a specific diagnostic threshold may be interpreted as a reliable marker of localized or systemic ischemia preceding necrosis. Epidemiological investigations on the clinical value of IMA for diagnosing myocardial ischemia have led to controversial outcomes (6, 7). Consequently, the significance and correct placing of IMA testing within diagnostic algorithms for the evaluation of patients with suspected AMI are still being debated (6). Part of this indecision has been attributed to the influence of some analytical drawbacks when assaying IMA in clinical studies (8, 9). In particular, a significant inverse association was observed between IMA and serum albumin. Accordingly, it has been suggested that IMA serum concentrations in patients with extremely low or high serum albumin levels (<20 or >55 g/L) may be unreliable and lacking in clinically informative value (6, 9). To verify whether adjustment of IMA values for serum albumin concentration may be a feasible option …
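The text above raises the idea of adjusting measured IMA for the patient's serum albumin, without specifying how. Purely as an illustration of one possible regression-based correction, and not as the method used by the authors, the sketch below rescales a measured IMA value to a chosen reference albumin concentration; the slope and reference values are placeholders.

```python
def albumin_adjusted_ima(ima_u_ml, albumin_g_l,
                         reference_albumin_g_l=44.0, slope_u_ml_per_g_l=-2.0):
    """Hypothetical albumin-adjusted IMA (illustrative only).

    Because IMA is inversely associated with serum albumin, a measured IMA value
    can be projected to a reference albumin concentration along the regression
    slope. Both the reference albumin and the slope used here are placeholders,
    not values taken from the study.
    """
    return ima_u_ml + slope_u_ml_per_g_l * (reference_albumin_g_l - albumin_g_l)

# A hypoalbuminaemic patient (30 g/L) with a raw IMA of 110 U/mL:
print(albumin_adjusted_ima(110.0, 30.0))  # 82.0 U/mL after adjustment
```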

98 citations


Journal ArticleDOI
TL;DR: Data available suggest that rFVIIa could have a potential role in refractory bleeding associated with disseminated intravascular coagulation, and large randomized trials are needed to confirm the preliminary results and to assess the safety and dosing regimens of this agent.
Abstract: Recombinant activated factor VII (rFVIIa) is a novel hemostatic agent, originally developed for the treatment of hemorrhage in hemophiliacs with inhibitors, which has been successfully used recently in an increasing number of nonhemophilic bleeding conditions. In the present systematic review we report the existing literature data on the use of this hemostatic agent in severe bleeding, unresponsive to standard treatment, associated with disseminated intravascular coagulation. A total of 99 disseminated intravascular coagulation-associated bleeding episodes treated with rFVIIa were collected from 27 published articles: in the majority of the cases, the underlying disorder complicated by disseminated intravascular coagulation was a postpartum hemorrhage, while in the remaining cases it was a cancer, trauma, sepsis or liver failure. Although limited, the data available suggest that rFVIIa could have a potential role in this clinical setting. Large randomized trials are needed, however, to confirm the preliminary results and to assess the safety and dosing regimens of this agent in refractory bleeding associated with disseminated intravascular coagulation.

48 citations


Journal ArticleDOI
TL;DR: The thrombin generation assay provides additional indications for the role of VWF in the treatment of patients with inhibitors, as the VWF-containing concentrates Fanhdi and Haemate-P, added to FVIII-deficient plasma in the presence of inhibitor, generate more thrombin than do the purified concentrates Hemofil-M and Kogenate Bayer.
Abstract: In order to describe the haemostatic role of a variation in inhibitor reactivity with different factor VIII (FVIII) concentrates, we have compared inhibitor titres against a panel of FVIII concentrates and correlated titre with the capacity to inhibit thrombin generation. Three plasma-derived concentrates were tested in vitro in mixing experiments with inhibitor plasmas from 11 patients with severe haemophilia A: Fanhdi, which contains von Willebrand factor (VWF) with a final ratio of approximately 1:1 (VWF IU per IU FVIII:C); Haemate-P, with a ratio of 2.5:1; and Hemofil-M, containing only trace amounts of VWF. In addition, the recombinant FVIII concentrate Kogenate Bayer containing no VWF was included. Inhibitor titres and the capacity to generate thrombin were measured. A statistically significant difference in measured titres was found, with the highest titres recorded against Hemofil-M. The inhibitor titres needed to inhibit 50% of maximum thrombin generation were the lowest for Kogenate Bayer and the highest, and similar, for Fanhdi and Haemate-P, with intermediate titres needed for inhibition of Hemofil-M. In this study, the thrombin generation assay provides additional indications for the role of VWF in the treatment of patients with inhibitors. The VWF-containing concentrates Fanhdi and Haemate-P, added to FVIII-deficient plasma in the presence of inhibitor, generate more thrombin than do the purified concentrates Hemofil-M and Kogenate Bayer.
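The study's central readout, the inhibitor titre at which peak thrombin generation falls to half of the uninhibited maximum, is in essence a point interpolated from a titration curve. The sketch below assumes a simple linear interpolation between the two bracketing titres; the abstract does not describe the actual curve-fitting procedure, and the numbers are illustrative.

```python
def titre_at_50pct_inhibition(titres_bu_ml, peak_thrombin_pct):
    """Interpolate the inhibitor titre at which peak thrombin generation
    drops to 50% of the uninhibited maximum.

    titres_bu_ml      -- inhibitor titres (Bethesda U/mL), in ascending order
    peak_thrombin_pct -- peak thrombin at each titre, as % of the maximum
    """
    points = list(zip(titres_bu_ml, peak_thrombin_pct))
    for (t0, p0), (t1, p1) in zip(points, points[1:]):
        if p0 >= 50.0 > p1:
            # Linear interpolation between the two bracketing points.
            return t0 + (p0 - 50.0) / (p0 - p1) * (t1 - t0)
    raise ValueError("50% inhibition is not bracketed by the supplied titres")

# Illustrative titration, not data from the study.
print(titre_at_50pct_inhibition([0.5, 1.0, 2.0, 4.0], [90.0, 70.0, 40.0, 15.0]))  # ~1.67 BU/mL
```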

44 citations


Journal ArticleDOI
TL;DR: The importance of critical value reporting is still poorly recognized in Italy and uniform or internationally accredited practices for communication and recording are not currently implemented.
Abstract: Background Critical values' reporting is an essential requisite for clinical laboratories. Local policies were investigated within an indicative cohort of Italian laboratories to monitor the situation and establish a performance benchmark. Methods A five-point questionnaire was administered to 150 laboratory specialists attending the SIMEL (Italian Society of Laboratory Medicine) National Meeting in June 2006. Results A total of 107 questionnaires (71.3%) were returned with a 100% individual question response rate. Only 55% of the participants acknowledge critical values reporting as an essential practice, 80% admit that a comprehensive list of critical values is unavailable in the laboratory and 4% do not promptly communicate critical values. The list of critical values is variable among laboratories, ranging from none to 20 analytes included. The requesting physician or his/her office staff receives the great majority (97%) of notifications by telephone for outpatients. Critical values for inpatients are notified directly by telephone (81%) and in a minority of cases by either fax or computer (19%). In the inpatient setting, the information is notified to physicians (77%), nurses (15%) or other healthcare staff in the clinic (8%). It was found that 49% of the participants adopt a standard (digital or written) policy for routine recording of notifications; in 32% of the cases the registration is left to individual attitudes, whereas in 20% of the cases the notification is not recorded. No laboratory has yet adopted a read-back verification of the complete test result by the person receiving the information. Conclusions The importance of critical value reporting is still poorly recognized in Italy and uniform or internationally accredited practices for communication and recording are not currently implemented.

42 citations


Journal ArticleDOI
TL;DR: Reliable studies on animal models indicate that the proteolytic breakdown products of apolipoprotein(a) would possess anti-angiogenic and anti-tumoral properties both in vitro and in vivo, a premise to develop novel therapeutic modalities which may efficiently suppress tumor growth and metastasis.

39 citations


Journal ArticleDOI
TL;DR: It is indicated that increased GGT activities are independently associated with a more atherogenic lipid profile in the general population.

33 citations


Journal ArticleDOI
TL;DR: A 5–10 min centrifugation at 1500 × g may be suitable for primary tubes collected for routine coagulation testing; centrifugation time was inversely associated with residual blood cell elements in plasma, especially platelets.
Abstract: Preparation of blood specimens is a major bottleneck in the laboratory throughput. Reliable strategies for reducing the time required for specimen processing without affecting quality should be acknowledged, especially for laboratories performing stat analyses. The present investigation was planned to establish a minimal suitable centrifuge time for primary samples collected for routine coagulation testing. Five sequential primary vacuum tubes containing 0.109 mol/l buffered trisodium citrate were collected from 10 volunteers and were immediately centrifuged on a conventional centrifuge at 1500 × g, at room temperature, for 1, 2, 5, 10 and 15 min, respectively. Hematological and routine coagulation testing, including prothrombin time, activated partial thromboplastin time and fibrinogen, were performed. The centrifugation time was inversely associated with residual blood cell elements in plasma, especially platelets. Statistically significant variations from the reference 15-min centrifuge specimens were observed for fibrinogen in samples centrifuged for 5 min at most and for the activated partial thromboplastin time in samples centrifuged for 2 min at most. Meaningful biases related to the desirable bias were observed for fibrinogen in samples centrifuged for 2 min at most, and for the activated partial thromboplastin time in samples centrifuged for 1 min at most. According to our experimental conditions, a 5–10 min centrifuge time at 1500 × g may be suitable for primary tubes collected for routine coagulation testing.
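The comparison described above reduces to computing, for each shortened centrifugation time, the percentage bias against the 15-minute reference and checking it against a desirable-bias specification. A minimal sketch of that check follows; the fibrinogen means and the 2% desirable-bias limit are illustrative placeholders, not the study's figures.

```python
def percent_bias(test_mean, reference_mean):
    """Percentage bias of a test condition against the reference condition."""
    return 100.0 * (test_mean - reference_mean) / reference_mean

def flag_meaningful_bias(results_by_time, reference_time, desirable_bias_pct):
    """Return the centrifugation times whose bias exceeds the desirable specification.

    results_by_time    -- {minutes: mean analyte result}
    reference_time     -- key of the reference condition (here, 15 min)
    desirable_bias_pct -- allowable bias for the analyte, in percent
    """
    ref = results_by_time[reference_time]
    return {minutes: round(percent_bias(value, ref), 1)
            for minutes, value in results_by_time.items()
            if minutes != reference_time and abs(percent_bias(value, ref)) > desirable_bias_pct}

# Illustrative fibrinogen means (g/L) by centrifugation time.
fibrinogen = {1: 3.45, 2: 3.38, 5: 3.30, 10: 3.26, 15: 3.24}
print(flag_meaningful_bias(fibrinogen, reference_time=15, desirable_bias_pct=2.0))
# {1: 6.5, 2: 4.3} -> only the 1- and 2-minute spins exceed the limit
```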

27 citations


Journal ArticleDOI
TL;DR: There is little evidence supporting the influence of different centrifuge times for primary lithium-heparin tubes with plasma separator on stat clinical chemistry testing.
Abstract: Background: Centrifugation time is a major bottleneck in laboratory specimen throughput. In most cases, a 10-minute centrifugation on a swing-out rotor centrifuge at room temperature, with a relative centrifugal force of 1,200 ± 100 g, is suggested; forces of 1,300 g or centrifugation times longer than 10 to 15 minutes may be advisable to obtain better platelet clearance. Nevertheless, there is little evidence supporting the influence of different centrifuge times for primary lithium-heparin tubes with plasma separator on stat clinical chemistry testing. Methods: Five evacuated tubes collected from 10 consecutive subjects were centrifuged on the same swing-bucket centrifuge at 1,200 g for 1, 2, 5, 10, and 15 minutes respectively and tested for clinical chemistry stat analytes. Results: Statistically significant variations from the 15-minute centrifuge reference specimen were observed for ALT, calcium, glucose, potassium, urea nitrogen, and CK-MB (1-minute centrifugation), ALT, glucose, urea nitrogen, and CK-MB (2-minute centrifugation), and glucose (5- and 10-minute centrifugation). Meaningful biases according to the analytical quality specifications for clinically allowable variance were recorded for ALT, calcium, glucose, and potassium (1-minute centrifugation), and for ALT, glucose, and potassium (2-minute centrifugation).

Journal ArticleDOI
TL;DR: Provided the tubes are filled to their nominal volume, the blood-to-anticoagulant mixture occurring during sample collection may be adequate.
Abstract: Background: Although it is recommended that primary tubes containing an additive should be mixed several times, there is no evidence that unmixed specimens will provide unreliable results in hematological testing. Methods: Three primary 3.0-mL siliconized vacuum tubes containing 5.4 mg K2 ethylene diamine tetraacetic acid (EDTA) were sequentially collected from 20 healthy volunteers. The first was not inverted, was left standing in a vertical position, and then analyzed. The second and third tubes were inverted 6 and 12 times, respectively, immediately after collection and then analyzed. Results: When compared with the reference specimens inverted 6 times, results on unmixed specimens revealed significant decreases for red blood cell count, hemoglobin, hematocrit and platelet count, whereas the mean platelet volume was significantly increased. In none of the specimens were results flagged for platelet clumping, nor did the differences exceed the acceptable limits of bias. Conclusion: Provided the tubes are filled to their nominal volume, the blood-to-anticoagulant mixture occurring during sample collection may be adequate.

Journal ArticleDOI
TL;DR: The results of the present investigation demonstrate that the actual adult reference ranges for coagulation screening tests, especially PT and APTT, cannot be applied to newborns and young infants.
Abstract: The diagnostic approach to haemostatic defects in the newborn is challenging and requires appropriate interpretation of coagulation tests according to reference values dependent on the postnatal age. This investigation was designed to study the postnatal development of the human coagulation system in newborn infants and to develop appropriate reference ranges for prothrombin time (PT), activated partial thromboplastin time (APTT) and fibrinogen (FBG) according to the day of birth and for the following postnatal period (days 1, 2, 3, 4, 5, 6, from 7 to 10 and from 11 to 44). The mean FBG value was already within the adult reference range in newborns at birth, the mean PT value fell within the adult reference range in infants aged 4 days or more, whereas the mean APTT value was still higher than the upper limit of the adult reference range in infants aged between 11 and 20 days. The prevalence of infants with pathological values according to the actual adult reference ranges was limited for FBG (from 24% to 7%), decreased from 92% at birth to 8% in infants aged 11–20 days for PT, but remained elevated throughout the observational period for APTT (from 94% to 71%). The results of the present investigation demonstrate that the actual adult reference ranges for coagulation screening tests, especially PT and APTT, cannot be applied to newborns and young infants.
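Age-stratified reference ranges of this kind are most often derived as the central 95% of values observed in each postnatal age group; the abstract does not state the method actually used, so the sketch below simply assumes a nonparametric 2.5th–97.5th percentile estimate.

```python
def nonparametric_reference_interval(values, lower_pct=2.5, upper_pct=97.5):
    """Central 95% reference interval from a sample of results (one age group)."""
    data = sorted(values)
    n = len(data)

    def percentile(p):
        # Linear-interpolation percentile on a 0-100 scale.
        k = (n - 1) * p / 100.0
        lo = int(k)
        hi = min(lo + 1, n - 1)
        return data[lo] + (data[hi] - data[lo]) * (k - lo)

    return percentile(lower_pct), percentile(upper_pct)

# Illustrative APTT values (seconds) for one postnatal age group; not study data.
aptt_day0 = [38.2, 41.5, 44.0, 45.3, 47.1, 48.8, 50.2, 52.7, 55.0, 58.4]
print(nonparametric_reference_interval(aptt_day0))  # roughly (38.9, 57.6)
```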

Journal ArticleDOI
TL;DR: The case of an asymptomatic woman, enrolled as a healthy control in a study, who presented with a substantially increased D-dimer value, is documented; it suggests that screening for occult malignancy may be considered in the presence of apparently inexplicable elevated D-dimer values.
Abstract: Fibrin formation and removal occurs continuously during the development of malignancy. Accordingly, hemostatic disorders in cancer patients are a rather frequent observation and range from asymptomatic laboratory changes to massive thromboembolism or haemorrhage. We document the case of an asymptomatic woman, who was enrolled as a healthy control in a study and presented with a substantially increased D-dimer value. After ruling out the most probable sources of D-dimer elevation, such as thrombosis, inflammation and trauma, she underwent laboratory and radiological investigations for malignancy, which were consistent with a metastatic colorectal adenocarcinoma. This case allows us to hypothesize that screening for occult malignancy in the presence of apparently inexplicable elevated D-dimer values may be taken into consideration.

Journal ArticleDOI
TL;DR: Sezione di Chimica e Microscopia Clinica, Dipartimento di Scienze Morfologico-Biomediche, Università degli Studi di Verona, Ospedale Policlinico G.B. Rossi, Verona, Italy.

Journal ArticleDOI
TL;DR: The results of this investigation demonstrate that inappropriate laboratory utilisation of this test is commonplace, especially for inpatients, and more accurate application of the current recommendations would be advisable to decrease unnecessary testing and prevent avoidable health expenditure.
Abstract: Background: Regardless of the available recommendations to perform glycated haemoglobin testing at a 2- to 3-month frequency, there is increasing evidence of an inappropriate laboratory use of this test in clinical practice. Methods: Data from our Laboratory Information System were analysed for glycated haemoglobin test orders over a 3-year period using Microsoft® Excel to calculate the order intervals and the test frequency for each patient. To assess the appropriateness of repeat testing, only data for patients who had at least two separate glycated haemoglobin test results were included in the analysis. Inappropriate test orders were defined as any order for a given patient taking place within a 29- or 89-day period following the previous order. Results: The results of our investigation demonstrate that inappropriate laboratory utilisation of this test is commonplace (26% of total repeat orders within 90 days), especially for inpatients (63.7% of inpatient repeat orders in less than 90 days). When stratifying glycated haemoglobin test results according to the >7% threshold, the frequency of inappropriate laboratory use (<90 days) was surprisingly greater among inpatients with a previous value of 7% (57.6% vs. 42.4%). The frequency of inappropriate glycated haemoglobin repeat test orders was lower among outpatients with a previous value of 7% (64.8% vs. 35.2%). Conclusions: We conclude that more accurate application of the current recommendations would be advisable to decrease unnecessary testing and prevent avoidable health expenditure.
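The appropriateness check described in the Methods amounts to sorting each patient's orders by date and flagging any repeat placed within 30 (or 90) days of the previous one. A minimal sketch of that logic follows; the order tuples and field layout are placeholders, not the structure of the Laboratory Information System export.

```python
from collections import defaultdict
from datetime import date

def flag_inappropriate_orders(orders, max_interval_days=89):
    """Flag repeat HbA1c orders placed within `max_interval_days` of the previous one.

    orders -- iterable of (patient_id, order_date) tuples
    Returns {patient_id: [(previous_date, repeat_date, interval_days), ...]}
    """
    by_patient = defaultdict(list)
    for patient_id, order_date in orders:
        by_patient[patient_id].append(order_date)

    flagged = defaultdict(list)
    for patient_id, dates in by_patient.items():
        dates.sort()
        for previous, current in zip(dates, dates[1:]):
            interval = (current - previous).days
            if interval <= max_interval_days:
                flagged[patient_id].append((previous, current, interval))
    return dict(flagged)

# Illustrative orders, not data from the study.
orders = [("A", date(2006, 1, 10)), ("A", date(2006, 2, 5)),   # 26-day repeat -> flagged
          ("A", date(2006, 6, 1)),                             # 116 days -> acceptable
          ("B", date(2006, 3, 1)), ("B", date(2006, 7, 15))]   # 136 days -> acceptable
print(flag_inappropriate_orders(orders, max_interval_days=89))
```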


Journal ArticleDOI
TL;DR: It is concluded that the analytical performance and the main technical features of the new Immulite 2000 d-dimer assay make it a suitable method for the rapid quantification of d-dimer in clinical laboratories.
Abstract: We evaluated the analytical performance of a new commercial chemiluminescent enzyme immunometric assay for d-dimer measurement on the Immulite 2000 automated analyser. The within- and between-run coefficients of variation for low, intermediate and high d-dimer concentrations ranged from 2.2% to 5.1% and from 2.9% to 6.0%, respectively. The assay was linear over a range of d-dimer concentrations between 324 and 7602 ng/ml. Comparison of samples collected in either 0.109 mol/l sodium citrate or lithium–heparin tubes by Bland–Altman plots and nonparametric regression analysis (y = 0.858x − 39, r = 0.997; P < 0.0001) revealed a minimal bias, almost completely attributable to the dilution effect of the anticoagulant. Results of 56 outpatients' citrated plasma samples analysed with the Immulite 2000 d-dimer assay were compared with those of the current reference commercial immunoassay on the Mini Vidas Immunoanalyser. Although the nonparametric regression according to the method of Passing and Bablok and the relative Spearman's correlation coefficient were satisfactory (Immulite = 1.001 × Vidas − 181; r = 0.937, P < 0.001), some random discrepancies could be observed in the Bland–Altman plot analysis. The analytical accuracy, calculated as the area under the Receiver Operating Characteristic (ROC) curve (AUC), was 0.961 (P < 0.0001). On the basis of the results of the present evaluation, we conclude that the analytical performance and the main technical features of the new Immulite 2000 d-dimer assay make it a suitable method for the rapid quantification of d-dimer in clinical laboratories.
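Agreement between the two matrices and between the two assays is summarized above with Bland–Altman analysis; the core of that analysis is simply the mean of the paired differences (bias) and its 95% limits of agreement. A minimal sketch, with illustrative paired values rather than the study's data:

```python
import statistics

def bland_altman(method_a, method_b):
    """Mean difference (bias) and 95% limits of agreement between two paired methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative paired d-dimer results (ng/ml), not the study's data.
immulite = [350, 620, 910, 1480, 2300, 4100]
vidas    = [380, 600, 980, 1410, 2450, 3900]
bias, limits = bland_altman(immulite, vidas)
print(f"bias = {bias:.0f} ng/ml, limits of agreement = {limits[0]:.0f} to {limits[1]:.0f} ng/ml")
```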

Journal ArticleDOI
TL;DR: Results of this investigation suggest that SSc patients with ACA positivity, after a primary fibrogenetic noxa, react with a more abundant release of MMP/TIMP, whereas patients with anti-Scl70 antibody show a normal response.

Journal ArticleDOI
TL;DR: K2 EDTA plasma is the most suitable specimen for BNP testing on fresh and frozen samples stored at either −20°C or −80°C for up to 1 week.
Abstract: The assessment and management of congestive heart failure relies increasingly on the measurement of B-type natriuretic peptide (BNP). However, the effective contribution of this biochemical test to clinical decision making is influenced by the reliability of the measurement, which also depends on several preanalytical issues. Since there is controversy on the influence of the matrix and the storage conditions on BNP measurement, we compared results of BNP in serum, K2 ethylene diamine tetra-acetic acid (EDTA) plasma and lithium heparin plasma fresh samples and in matching samples stored at −20 °C and −80 °C for 1 week. BNP measured on the Bayer Advia Centaur was systematically underestimated in heparin plasma (−47%) and serum (−62%) when compared to K2 EDTA plasma. According to the established 100 ng/L cutoff value, 25% and 37% of the fresh samples collected in heparin plasma or serum, respectively, were misclassified relative to the reference K2 EDTA fresh specimen. When compared to the fresh specimens, the mean and interindividual bias observed for samples stored at either −20 °C or −80 °C was, overall, modest for K2 EDTA plasma (−2%) and heparin plasma (+6% and −4%, respectively), though it appeared clinically meaningful in serum (+47% and +28%, respectively). Although we cannot rule out that other BNP assays using different antibodies may not be affected by degradation during storage to the same extent, the results of our investigation demonstrate that K2 EDTA plasma is the most suitable specimen for BNP testing on fresh and frozen samples stored at either −20 °C or −80 °C for up to 1 week.
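The misclassification figures above follow from re-labelling each sample against the 100 ng/L decision limit and comparing with the classification obtained on the K2 EDTA fresh specimen. A minimal sketch of that comparison, with illustrative paired values:

```python
def misclassification_rate(reference_values, alternative_values, cutoff=100.0):
    """Fraction of samples classified differently from the reference matrix at a cutoff."""
    discordant = sum((ref >= cutoff) != (alt >= cutoff)
                     for ref, alt in zip(reference_values, alternative_values))
    return discordant / len(reference_values)

# Illustrative BNP results (ng/L): K2 EDTA plasma (reference) versus serum; not study data.
edta  = [45, 120, 160, 250, 95, 510, 130, 80]
serum = [20,  70, 140, 160, 60, 310, 90,  55]
print(f"misclassified at 100 ng/L: {misclassification_rate(edta, serum):.0%}")  # 25%
```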

Journal ArticleDOI
TL;DR: New and efficient measures should be adopted, such as lowering the cornering speed, having heavier and safer vehicles, having barriers surrounding the track to protect both spectators and competitors better, and having innovative clothing and protective devices to defend key anatomical structures while minimising the hindrance to the rider.
Abstract: Motor racing is a dangerous sport and an inherently risky activity. The organisers of the top-class motor sports championships, Formula One and MotoGP, have agreed on a set of regulations to reduce speed and improve safety over the last 10 years. These changes include limitations in weight, fuel and engine capacity. Nevertheless, there is evidence that most of the restrictions introduced over the past 10 years have failed to slow down vehicles, since lap times have decreased almost linearly from 1995 to 2006 and drivers continue to die or to sustain serious injuries that keep them away from competition. Therefore, new and efficient measures should be adopted, such as lowering cornering speeds, having heavier and safer vehicles, having barriers surrounding the track to protect both spectators and competitors better, and having innovative clothing and protective devices to defend key anatomical structures while minimising the hindrance to the rider.

Journal ArticleDOI
TL;DR: A retrospective analysis on the database of the Laboratory Information System of the Clinical Chemistry Institute of the University Hospital of Verona on outpatients referred by the general practitioner to the laboratory for routine blood testing document a positive correlation between serum ferritin level and haemoglobin concentration, confirming the previous data from studies on patients with primary iron overload.
Abstract: Haemochromatosis, a disease characterised by an increase in iron storage which may cause pathological changes1, is divided into primary or secondary forms. While primary (or genetic) haemochromatosis, the most common autosomal recessive disorder, is due in the majority of cases to mutations in the HFE gene, secondary haemochromatosis can have several causes including ineffective erythropoiesis and multiple red blood cell transfusions1,2. A number of studies have documented that subjects carrying mutations in the haemochromatosis gene have significantly increased peripheral blood erythrocyte indices (haematocrit, haemoglobin and mean corpuscular haemoglobin concentrations), compared with wild-type individuals, thus suggesting an evolutionary sex-related protective role against iron deficiency3–7. By contrast, no studies have so far analysed the relationship between serum ferritin, a widely accepted measurement of iron storage, and haemoglobin concentration. We, therefore, performed a retrospective analysis on the database of the Laboratory Information System of the Clinical Chemistry Institute of the University Hospital of Verona on outpatients referred by the general practitioner to our laboratory for routine blood testing. Between May 2004 and May 2007, 1,907 consecutive subjects underwent routine blood tests including measurement of serum ferritin levels and inflammatory indices (C-reactive protein) and blood cell count. After the exclusion of women, duplicate tests from the same person and individuals with abnormal inflammatory indices or low serum ferritin levels, the remaining 589 subjects were divided into two groups according to serum ferritin concentration, the second comprising those with increased values (>350 μg/L, mean 658.4 ± 341.8 μg/L). The mean age and haemoglobin concentration were then compared between these two groups (Table I, characteristics of the 589 patients enrolled in the study). There were statistically significant differences in age (36.3 ± 26.1 years in the first group versus 51.6 ± 16.4 years in the second group, p < 0.0001) and haemoglobin concentration (14.4 ± 1.7 g/dL in the first group versus 15.5 ± 1.6 g/dL in the second, p < 0.0001). In conclusion, the results of our study document a positive correlation between serum ferritin level and haemoglobin concentration, thus confirming the previous data from studies on patients with primary iron overload. The different mean age between the two groups is also an interesting finding of our study and indicates that iron accumulation is an age-dependent process which occurs regardless of the underlying cause.
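The group comparison reported above (age and haemoglobin in the lower- versus higher-ferritin group) is a straightforward two-sample test. The abstract does not state which test the authors used; the sketch below simply applies Welch's t-test with illustrative values echoing the direction of the reported means, not the raw study data.

```python
from scipy import stats

# Illustrative haemoglobin values (g/dL); group 2 = subjects with ferritin >350 ug/L.
hb_group1 = [14.1, 14.6, 13.9, 14.8, 14.2, 14.7, 14.0, 14.5]
hb_group2 = [15.2, 15.8, 15.4, 15.9, 15.1, 15.7, 15.6, 15.3]

# Welch's t-test makes no equal-variance assumption between the two groups.
t_stat, p_value = stats.ttest_ind(hb_group1, hb_group2, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4g}")
```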

Journal Article
TL;DR: It is demonstrated that EDTA plasma is the most suitable sample matrix for the storage of beta-CTX at room temperature after centrifugation and that preanalytical biases in the results of this marker can decrease its clinical usefulness.
Abstract: The measurement of beta-C-telopeptides of type I collagen (beta-CTX) reflects the rate of bone resorption in a variety of metabolic bone disorders and is increasingly used to assist diagnosis and follow-up of these pathologies. Since preanalytical biases in the results of this marker can decrease its clinical usefulness, specific stability studies should be developed to prevent inconsistent laboratory results from affecting patient health and wasting economic resources. Three blood samples were simultaneously collected without venous stasis into evacuated tubes containing no additives, K2 EDTA or lithium heparin, from 23 outpatients referred to our phlebotomy service for routine laboratory testing. After centrifugation and separation of the specimens, a first aliquot was immediately analyzed, whereas the second and third aliquots were processed after 24- and 48-hour storage at room temperature (21 °C). Beta-CTX was assayed on the automated electrochemiluminescence analyzer E170. A modest and clinically irrelevant underestimation was observed in lithium heparin plasma when compared with either K2 EDTA (-7.1%; 95% C.I. -2.0 to -12.3%; p < 0.001) or serum (-7.8%; 95% C.I. -3.2 to -12.4%; p < 0.001), but not between serum and K2 EDTA (+0.8%, 95% C.I. -5.3 to +6.9%; p = 0.260). Storage at room temperature in K2 EDTA plasma introduced a modest and clinically negligible decay in immunoreactivity (-4.4% and -5.7% at 24 and 48 hours, respectively), whereas storage at room temperature in both serum (-17.6% and -28.6% at 24 and 48 hours, respectively) and lithium heparin plasma (-29.1% and -44.0% at 24 and 48 hours, respectively) was associated with a substantial decay and a larger inter-individual variability in the measurable concentration of the analyte. In conclusion, the results of our investigation demonstrate that EDTA plasma is the most suitable sample matrix for the storage of beta-CTX at room temperature after centrifugation.

Journal ArticleDOI
TL;DR: The main haemostatic changes occurring during pregnancy are described, focusing on aetiopathological aspects, the diagnostic procedures for evaluating the risk of thrombophilia and the appropriate diagnostic measures.
Abstract: The Norwegian researcher Egeberg coined the term 'thrombophilia' in 1965, after having described for the first time the association between venous thromboembolism, occurring in the absence of any specific causes, and an inherited antithrombin III defect1. The term thrombophilia is now used to define all those inherited, acquired and/or transitory conditions that predispose to the onset of arterial or venous thromboses2,3. The crucial point regarding research into thrombophilia is the evaluation of the correct functioning of haemostasis. Haemostasis is currently defined in terms of 'haemostatic balance', a term that encompasses a vast series of physiological mechanisms aimed at keeping the blood fluid while it is in the vascular system, while preventing excessive haemorrhagic loss following endothelial damage or breaks in the wall of the blood vessel. A series of events is activated as a result of a lesion in a blood vessel. These events can be divided into successive phases according to a cascade model: the blood vessel-platelet activation phase (primary haemostasis), coagulation phase (secondary haemostasis), clot dissolution (fibrinolysis), and repair of the endothelial damage. The particular characteristics of the haemostatic process are its localisation, amplification and modulation, which, in physiological conditions, are in perfect dynamic equilibrium4. These complex haemostatic functions are the result of the integrated and finely regulated action of many components, of which the main ones are the vascular endothelium, platelets, circulating and transmembrane proteins and calcium ions. The activation and amplification of the cascade and stabilisation of the coagulum are regulated by the balance between the activity of specific proteases and that of their respective allosteric and enzymatic inhibitors5. Abnormalities at any point during the coagulation cascade can cause pathological changes, characterised by haemorrhages or intravascular thromboses, depending on whether there is an insufficient or excessive response to endothelial damage6. In 1856, the Prussian pathologist Rudolf Virchow first proposed his hypothesis, which was subsequently shown to be correct and just as relevant nowadays, to explain the pathogenesis of thrombosis7. He suggested that three factors were sufficient and necessary to produce thrombosis: (i) hypercoagulability, (ii) stasis and (iii) endothelial damage. These factors, present to variable degrees in the pathogenesis of venous thrombosis, also combine to increase the risk of thromboembolism during a normal pregnancy. Indeed, a state of hypercoagulability develops during pregnancy; this is caused by both increases of certain procoagulant factors and decreased effectiveness of inhibitory systems. Furthermore, the mechanical obstruction caused by the foetus and the vasodilatory effects mediated by the altered oestrogen/progesterone ratio result in increased blood pressure and stasis in the lower limbs. Together, these phenomena are responsible for the endothelial damage and compromise primary and secondary haemostasis. This review describes the main haemostatic changes occurring during pregnancy, focusing on aetiopathological aspects, the diagnostic procedures for evaluating the risk of thrombophilia and the appropriate diagnostic measures.

Journal ArticleDOI
TL;DR: There is evidence this drug may be frequently mishandled due to inappropriate administration or monitoring, and the overall cost for the national healthcare system associated with the 1,373 measurements of inappropriate digoxin levels was estimated at approximately 15,309 euros (Canadian $23,625) per year.
Abstract: Despite the introduction of a variety of new classes of drugs for the management of heart failure, digoxin continues to have an important role in long-term outpatient management in that it can improve symptoms, quality of life, and exercise tolerance in patients with mild, moderate, or severe heart failure [1–3]. We have read with interest the article of Hallberg et al., who recently highlighted that long-term therapy with digoxin is an independent risk factor for death in patients with atrial fibrillation (AF) [4]. As for other drugs, digoxin measurement is essential, as the large interindividual variability in pharmacokinetics and the concentration-dependent pharmacokinetics can make individual dosage rather challenging, explaining the high burden of potential drug-related adverse events [5]. Although the narrow therapeutic index justifies digoxin monitoring in clinical practice, inappropriate indications or wrong sampling time may produce misleading results, which would both jeopardize the patient's health and waste substantial economic resources. Basically, inappropriate levels of digoxin may be due to testing of patients not receiving digoxin, early routine monitoring, or determination after dose adjustment before a pharmacological steady state has been achieved. A recent investigation attests that the quality of the information on requests for therapeutic drug monitoring of serum digoxin concentrations is poor across different specialties and health care settings worldwide [6], confirming the earlier perception that inappropriate prescribing is commonplace and persists over time [7–10]. Therefore, there is evidence this drug may be frequently mishandled due to inappropriate administration or monitoring. As we aimed to assess the appropriateness of serum digoxin levels for developing educational policies at our Institution, we performed a retrospective analysis of digoxin plasma-level determinations in inpatients and outpatients performed between January and December 2006. The study was conducted at the Verona University Hospital, a primary care university teaching hospital with 750 beds that serves an area with a population of nearly 270,000 inhabitants. The main outcome measure was established as the proportion of digoxin levels measured on heparin plasma on the Roche Modular System P (Roche Diagnostics GmbH, Mannheim, Germany) outside the conventional therapeutic range of 0.9 ng/mL–2.0 ng/mL (1.2–2.6 nmol/L). Plasma concentrations >2.0 ng/mL (2.6 nmol/L) were classified as toxic [11]. A total of 2,477 digoxin plasma levels were measured for adult inpatients (2,271, 91.7%) and outpatients (206, 8.3%). Inappropriate levels could be detected in nearly half of the samples tested (1,104, 44.6%), 169 of which (6.8%) corresponded to toxic concentrations, whereas 935 (37.7%) gave results under the expected therapeutic range. Overall, the percentage of samples with levels within the therapeutic range was statistically higher for inpatients than for outpatients (57.3% vs. 35.0%, p<0.001 by chi-square test analysis) (Table 1). The overall cost for our national healthcare system associated with the 1,373 measurements of inappropriate digoxin levels was estimated at approximately 15,309 euros (Canadian $23,625) per year.
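The appropriateness figures above come from classifying each measured level against the 0.9–2.0 ng/mL therapeutic range. A minimal sketch of that classification, with illustrative values rather than the study's data:

```python
def classify_digoxin_levels(levels_ng_ml, low=0.9, high=2.0):
    """Count digoxin results below, within and above the therapeutic range."""
    counts = {"subtherapeutic": 0, "therapeutic": 0, "toxic": 0}
    for level in levels_ng_ml:
        if level < low:
            counts["subtherapeutic"] += 1
        elif level <= high:
            counts["therapeutic"] += 1
        else:
            counts["toxic"] += 1
    return counts

# Illustrative plasma digoxin levels (ng/mL).
print(classify_digoxin_levels([0.4, 0.7, 1.1, 1.5, 1.9, 2.4, 0.8, 1.2]))
# {'subtherapeutic': 3, 'therapeutic': 4, 'toxic': 1}
```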



Journal ArticleDOI
TL;DR: Serum, heparin and K2 EDTA plasma may all be suitable for NT-proBNP measurement; K2 EDTA plasma showed a marginally significant underestimation when compared to serum and heparin plasma, whereas no significant difference was observed between serum and heparin plasma.

Journal ArticleDOI
TL;DR: It is demonstrated that activated partial thromboplastin time has an excellent diagnostic sensitivity and a satisfactory specificity for identifying isolated von Willebrand factor deficiencies.
Abstract: The diagnostic approach to von Willebrand factor deficiencies is challenging and requires discretionary use of laboratory resources. Although extensive preoperative testing is not recommended, the activated partial thromboplastin time may be useful, especially in selected categories of patients. To establish the diagnostic sensitivity of this test for identifying isolated von Willebrand factor deficiencies, 204 consecutive patients underwent a routine preoperative screening consisting of activated partial thromboplastin time, von Willebrand factor antigen, intrinsic pathway clotting factor activities, lupus anticoagulants and thrombin time. Thirty-seven patients were diagnosed with haemostasis disturbances other than von Willebrand factor deficiencies and were excluded from the evaluation. Isolated von Willebrand factor deficiency was diagnosed in 11 of the remaining 167 patients. A significant correlation was observed between von Willebrand factor antigen and activated partial thromboplastin time. Receiver operating characteristic curve analysis showed an area under the curve of 0.982 (95% confidence interval: 0.972-0.992; P < 0.001). At the 1.17 upper limit of the activated partial thromboplastin time, sensitivity and specificity were 100 and 85%, respectively, with negative and positive predictive values of 100 and 31%, respectively. These results demonstrate that activated partial thromboplastin time has an excellent diagnostic sensitivity and a satisfactory specificity for identifying isolated von Willebrand factor deficiencies.
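The diagnostic-performance figures quoted above (sensitivity, specificity and predictive values at the 1.17 cutoff) follow from a 2×2 classification of patients against the reference diagnosis. A minimal sketch of that calculation, with illustrative data rather than the study's:

```python
def diagnostic_performance(results, cutoff):
    """Sensitivity, specificity, PPV and NPV of a screening test at a given cutoff.

    results -- iterable of (test_value, has_condition) pairs; values above the
               cutoff are treated as positive screens.
    """
    tp = fp = tn = fn = 0
    for value, has_condition in results:
        positive = value > cutoff
        if positive and has_condition:
            tp += 1
        elif positive:
            fp += 1
        elif has_condition:
            fn += 1
        else:
            tn += 1
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn)}

# Illustrative APTT results with/without isolated VWF deficiency; not study data.
data = [(1.30, True), (1.25, True), (1.40, True),
        (1.05, False), (0.98, False), (1.22, False), (1.10, False), (1.00, False)]
print(diagnostic_performance(data, cutoff=1.17))
```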

01 Jan 2007
TL;DR: The present document aims to review the major causes of unsuitable specimens in clinical laboratories, providing consensus recommendations on detection and management.
Abstract: Recommendations for detection and management of unsuitable samples in clinical laboratories. A large body of evidence attests that quality improvement programs targeted solely at the analytical phase would not yield additional clinical and economic benefits, since the large majority of errors encountered in clinical laboratories occurs in the extra-analytical areas of testing, especially in manually intensive preanalytical processes. Most preanalytical errors result from system flaws and insufficient audit of the operators involved in sample collection and handling responsibilities, leading to an unacceptable number of unsuitable specimens due to misidentification, in vitro hemolysis, clotting, insufficient volume, wrong container and contamination. Detection and management of unsuitable samples are, therefore, necessary conditions to overcome this unwelcome variability. The present document reviews the major causes of unsuitable specimens in clinical laboratories, providing consensus recommendations on detection and management.