
Showing papers by "Sapienza University of Rome" published in 2006


Journal ArticleDOI
01 Jun 2006-BMJ
TL;DR: Selective COX 2 inhibitors are associated with a moderate increase in the risk of vascular events, as are high dose regimens of ibuprofen and diclofenac, but high dose naproxen is not associated with such an excess.
Abstract: Objective To assess the effects of selective cyclo-oxygenase-2 (COX 2) inhibitors and traditional non-steroidal anti-inflammatory drugs (NSAIDs) on the risk of vascular events. Design Meta-analysis of published and unpublished tabular data from randomised trials, with indirect estimation of the effects of traditional NSAIDs. Data sources Medline and Embase (January 1966 to April 2005); Food and Drug Administration records; and data on file from Novartis, Pfizer, and Merck. Review methods Eligible studies were randomised trials that included a comparison of a selective COX 2 inhibitor versus placebo or a selective COX 2 inhibitor versus a traditional NSAID, of at least four weeks' duration, with information on serious vascular events (defined as myocardial infarction, stroke, or vascular death). Individual investigators and manufacturers provided information on the number of patients randomised, numbers of vascular events, and the person time of follow-up for each randomised group. Results In placebo comparisons, allocation to a selective COX 2 inhibitor was associated with a 42% relative increase in the incidence of serious vascular events (1.2%/year v 0.9%/year; rate ratio 1.42, 95% confidence interval 1.13 to 1.78; P = 0.003), with no significant heterogeneity among the different selective COX 2 inhibitors. This was chiefly attributable to an increased risk of myocardial infarction (0.6%/year v 0.3%/year; 1.86, 1.33 to 2.59; P = 0.0003), with little apparent difference in other vascular outcomes. Among trials of at least one year's duration (mean 2.7 years), the rate ratio for vascular events was 1.45 (1.12 to 1.89; P = 0.005). Overall, the incidence of serious vascular events was similar between a selective COX 2 inhibitor and any traditional NSAID (1.0%/year v 0.9%/year; 1.16, 0.97 to 1.38; P = 0.1).
However, statistical heterogeneity (P = 0.001) was found between trials of a selective COX 2 inhibitor versus naproxen (1.57, 1.21 to 2.03) and of a selective COX 2 inhibitor versus non-naproxen NSAIDs (0.88, 0.69 to 1.12). The summary rate ratio for vascular events, compared with placebo, was 0.92 (0.67 to 1.26) for naproxen, 1.51 (0.96 to 2.37) for ibuprofen, and 1.63 (1.12 to 2.37) for diclofenac. Conclusions Selective COX 2 inhibitors are associated with a moderate increase in the risk of vascular events, as are high dose regimens of ibuprofen and diclofenac, but high dose naproxen is not associated with such an excess.
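Rate ratios of the kind reported above are computed from event counts and person-years of follow-up in each arm. As a hedged illustration of the standard arithmetic only — the counts below are invented, not the trial data — a rate ratio and its approximate 95% confidence interval can be sketched as:

```python
import math

def rate_ratio_ci(events_a, years_a, events_b, years_b, z=1.96):
    """Rate ratio of arm A versus arm B from event counts and person-years,
    with an approximate 95% CI computed on the log scale."""
    rr = (events_a / years_a) / (events_b / years_b)
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)  # SE of log(RR)
    return (rr,
            rr * math.exp(-z * se_log),   # lower bound
            rr * math.exp(z * se_log))    # upper bound

# Hypothetical counts, chosen only to show the calculation
rr, lo, hi = rate_ratio_ci(events_a=120, years_a=10000, events_b=90, years_b=10000)
```

The interval is built on the log scale because log(RR) is approximately normally distributed for event counts of this size.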

1,380 citations


Journal ArticleDOI
TL;DR: In this paper, teachers' self-efficacy beliefs were examined as determinants of their job satisfaction and students' academic achievement, controlling for previous levels of achievement; structural equation modeling analyses corroborated a conceptual model in which teachers' personal efficacy beliefs affected their job satisfaction and students' academic achievement.

1,296 citations


Journal ArticleDOI
TL;DR: The findings strongly suggest that students of different education levels (from school to university) are chronically sleep deprived or suffer from poor sleep quality and consequent daytime sleepiness, and that sleep loss is frequently associated with poor declarative and procedural learning in students.

1,051 citations


Journal ArticleDOI
TL;DR: These guidelines are intended to give evidence-based recommendations for the use of ONS and TF in surgical patients; it is strongly recommended not to wait until severe undernutrition has developed, but to start EN therapy early, as soon as a nutritional risk becomes apparent.

1,008 citations


Journal ArticleDOI
TL;DR: Evaluated trials provide level A evidence for the efficacy of tricyclic antidepressants, gabapentin, pregabalin and opioids, with a large number of class I trials, followed by topical lidocaine and the newer antidepressants venlafaxine and duloxetine.
Abstract: Neuropathic pain treatment remains unsatisfactory despite a substantial increase in the number of trials. This EFNS Task Force aimed at evaluating the existing evidence about the pharmacological treatment of neuropathic pain. Studies were identified using first the Cochrane Database and then Medline. Trials were classified according to the aetiological condition. All class I and II controlled trials (according to EFNS classification of evidence) were assessed, but lower-class studies were considered in conditions that had no top level studies. Only treatments feasible in an outpatient setting were evaluated. Effects on pain symptoms/signs, quality of life and comorbidities were particularly searched for. Most of the randomized controlled trials included patients with postherpetic neuralgia (PHN) and painful polyneuropathies (PPN) mainly caused by diabetes. These trials provide level A evidence for the efficacy of tricyclic antidepressants, gabapentin, pregabalin and opioids, with a large number of class I trials, followed by topical lidocaine (in PHN) and the newer antidepressants venlafaxine and duloxetine (in PPN). A small number of controlled trials were performed in central pain, trigeminal neuralgia, other peripheral neuropathic pain states and multiple-aetiology neuropathic pains. The main peripheral pain conditions respond similarly well to tricyclic antidepressants, gabapentin, and pregabalin, but some conditions, such as HIV-associated polyneuropathy, are more refractory. There are too few studies on central pain, combination therapy, and head-to-head comparisons. For future trials, we recommend assessing quality of life and pain symptoms or signs with standardized tools.

988 citations


Journal ArticleDOI
TL;DR: The largely complementary palaeobotanical and genetic data indicate that beech survived the last glacial period in multiple refuge areas and the modern genetic diversity was shaped over multiple glacial-interglacial cycles.
Abstract: Summary
• Here, palaeobotanical and genetic data for common beech (Fagus sylvatica) in Europe are used to evaluate the genetic consequences of long-term survival in refuge areas and postglacial spread.
• Four large datasets are presented, including over 400 fossil-pollen sites, 80 plant-macrofossil sites, and 450 and 600 modern beech populations for chloroplast and nuclear markers, respectively.
• The largely complementary palaeobotanical and genetic data indicate that: (i) beech survived the last glacial period in multiple refuge areas; (ii) the central European refugia were separated from the Mediterranean refugia; (iii) the Mediterranean refugia did not contribute to the colonization of central and northern Europe; (iv) some populations expanded considerably during the postglacial period, while others experienced only a limited expansion; (v) the mountain chains were not geographical barriers for beech but rather facilitated its diffusion; and (vi) the modern genetic diversity was shaped over multiple glacial–interglacial cycles.
• This scenario differs from many recent treatments of tree phylogeography in Europe that largely focus on the last ice age and the postglacial period to interpret genetic structure and argue that the southern peninsulas (Iberian, Italian and Balkan) were the main source areas for trees in central and northern Europe.

846 citations


Journal ArticleDOI
TL;DR: Patients with hematologic malignancies are currently at higher risk of IFI caused by molds than by yeasts, and the incidence of IFI is highest among patients with acute myeloid leukemia.
Abstract: BACKGROUND AND OBJECTIVES: The aim of this study was to evaluate the incidence and outcome of invasive fungal infections (IFI) in patients with hematologic malignancies. DESIGN AND METHODS: This was a retrospective cohort study of patients admitted between 1999 and 2003 to 18 hematology wards in Italy. Each participating center provided information on all patients with newly diagnosed hematologic malignancies admitted during the survey period and on all episodes of IFI experienced by these patients. RESULTS: The cohort was formed of 11,802 patients with hematologic malignancies: acute leukemia (myeloid 3012, lymphoid 1173), chronic leukemia (myeloid 596, lymphoid 1104), lymphoma (Hodgkin's 844, non-Hodgkin's 3457), or multiple myeloma (1616). There were 538 proven or probable IFI (4.6%); 373 (69%) occurred in patients with acute myeloid leukemia. Over half (346/538) were caused by molds (2.9%), in most cases Aspergillus spp. (310/346). The 192 yeast infections (1.6%) included 175 cases of candidemia. Overall and IFI-attributable mortality rates were 2% (209/11802) and 39% (209/538), respectively. The highest IFI-attributable mortality rates were associated with zygomycosis (64%) followed by fusariosis (53%), aspergillosis (42%), and candidemia (33%). INTERPRETATION AND CONCLUSIONS: Patients with hematologic malignancies are currently at higher risk of IFI caused by molds than by yeasts, and the incidence of IFI is highest among patients with acute myeloid leukemia. Aspergillus spp. are still the most common pathogens, followed by Candida spp. Other agents are rare. The attributable mortality rate for aspergillosis has dropped from 60-70% to approximately 40%. Candidemia-related mortality remains within the 30-40% range reported in the literature, although the incidence has decreased.

844 citations



Journal ArticleDOI
TL;DR: The guidelines for the use of oral nutritional supplements (ONS) and tube feeding (TF) in cancer patients were developed by an interdisciplinary expert group in accordance with officially accepted standards as discussed by the authors.

790 citations


Journal ArticleDOI
TL;DR: This article examined two aspects of personality that may influence political choice, traits and personal values, using the Five Factor Model of personality traits and the Schwartz (1992) theory of basic personal values.
Abstract: Voters' political choices have presumably come to depend more on their personal preferences and less on their social characteristics in Western democracies. We examine two aspects of personality that may influence political choice, traits and personal values, using the Five Factor Model of personality traits and the Schwartz (1992) theory of basic personal values. Data from 3044 voters for the major coalitions in the Italian national election of 2001 showed that supporters of the two coalitions differed in traits and values, largely as hypothesized. Center-left voters were higher than center-right voters in the traits of friendliness and openness and lower in energy and conscientiousness. Regarding values, center-left voters were higher than center-right voters in universalism, benevolence, and self-direction and lower in security, power, achievement, conformity, and tradition. Logistic regressions revealed that values explained substantial variance in past and future voting and in change of political choice, trumping personality traits. We discuss explanations for the primacy of values and implications for the social cognitive view of personality.

697 citations


Journal ArticleDOI
M. H. Ahn1, E. Aliu2, S. Andringa2, Shigeki Aoki3  +217 moreInstitutions (29)
TL;DR: In this article, measurements of ν_μ disappearance in K2K, the KEK to Kamioka long-baseline neutrino oscillation experiment, are presented.
Abstract: We present measurements of ν_μ disappearance in K2K, the KEK to Kamioka long-baseline neutrino oscillation experiment. One hundred and twelve beam-originated neutrino events are observed in the fiducial volume of Super-Kamiokande with an expectation of 158.1 +9.2/−8.6 events without oscillation. A distortion of the energy spectrum is also seen in 58 single-ring muon-like events with reconstructed energies. The probability that the observations are explained by the expectation for no neutrino oscillation is 0.0015% (4.3σ). In a two-flavor oscillation scenario, the allowed Δm² region at sin²2θ = 1 is between 1.9 and 3.5×10⁻³ eV² at the 90% C.L. with a best-fit value of 2.8×10⁻³ eV².
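The quoted Δm² and sin²2θ enter through the standard two-flavor survival probability, P(ν_μ→ν_μ) = 1 − sin²2θ · sin²(1.27 · Δm²[eV²] · L[km] / E[GeV]). A small sketch using the best-fit Δm², maximal mixing, and the 250 km K2K baseline; the 1.3 GeV energy here is only a representative beam energy assumed for illustration, not a fitted quantity:

```python
import math

def survival_probability(delta_m2_ev2, sin2_2theta, baseline_km, energy_gev):
    """Two-flavor nu_mu survival probability:
    P = 1 - sin^2(2 theta) * sin^2(1.27 * dm^2[eV^2] * L[km] / E[GeV])."""
    phase = 1.27 * delta_m2_ev2 * baseline_km / energy_gev
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Best-fit dm^2 = 2.8e-3 eV^2, maximal mixing, L = 250 km, E ~ 1.3 GeV (assumed)
p = survival_probability(2.8e-3, 1.0, 250.0, 1.3)
```

With these inputs roughly 40% of the ν_μ flux oscillates away, which is the order of the deficit seen (112 events against 158 expected).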

Journal ArticleDOI
TL;DR: A computational technique is proposed which combines the string method with a sampling technique to determine minimum free energy paths and captures the mechanism of transition in that it allows one to determine the committor function for the reaction and, in particular, the transition state region.
Abstract: A computational technique is proposed which combines the string method with a sampling technique to determine minimum free energy paths. The technique only requires computing the mean force and another conditional expectation locally along the string, and therefore can be applied even if the number of collective variables kept in the free energy calculation is large. This is in contrast with other free energy sampling techniques which aim at mapping the full free energy landscape and whose cost increases exponentially with the number of collective variables kept in the free energy. Provided that the number of collective variables is large enough, the new technique captures the mechanism of transition in that it allows one to determine the committor function for the reaction and, in particular, the transition state region. The new technique is illustrated on the example of alanine dipeptide, in which we compute the minimum free energy path for the isomerization transition using either two or four dihedral angles as collective variables. It is shown that the mechanism of transition can be captured using the four dihedral angles, but it cannot be captured using only two of them.
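The finite-temperature string method described here samples a mean force locally along the string; its zero-temperature ancestor conveys the core loop, which alternates a gradient step on every image with a reparameterization to equal arc length. A minimal sketch on an invented 2D double-well potential (not the alanine dipeptide system of the paper):

```python
import numpy as np

def potential_grad(points):
    # Gradient of the toy double well V(x, y) = (x^2 - 1)^2 + 2*y^2,
    # with minima at (-1, 0) and (1, 0) and a saddle at (0, 0).
    x, y = points[..., 0], points[..., 1]
    return np.stack([4.0 * x * (x ** 2 - 1.0), 4.0 * y], axis=-1)

def reparameterize(path):
    # Redistribute the images to equal arc length by linear interpolation.
    seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    s_new = np.linspace(0.0, s[-1], len(path))
    return np.stack([np.interp(s_new, s, path[:, d]) for d in range(path.shape[1])], axis=1)

def string_method(start, end, n_images=20, dt=0.01, n_steps=2000):
    path = np.linspace(start, end, n_images)     # initial straight string
    for _ in range(n_steps):
        path = path - dt * potential_grad(path)  # steepest-descent step on each image
        path = reparameterize(path)              # keep images equispaced
    return path

# A string started away from the minimum energy path relaxes onto the
# y = 0 path connecting the two wells through the saddle.
path = string_method(np.array([-1.0, 1.0]), np.array([1.0, 1.0]))
```

In the finite-temperature variant the abstract describes, `potential_grad` would be replaced by a mean force estimated by sampling conditioned on the collective variables, which is what makes the method affordable when many collective variables are kept.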


Proceedings ArticleDOI
09 Oct 2006
TL;DR: An efficient collision detection method that uses only proprioceptive robot sensors and also provides directional information for a safe robot reaction after collision is presented.
Abstract: A robot manipulator sharing its workspace with humans should be able to quickly detect collisions and safely react for limiting injuries due to physical contacts. In the absence of external sensing, relative motions between robot and human are not predictable and unexpected collisions may occur at any location along the robot arm. Based on physical quantities such as total energy and generalized momentum of the robot manipulator, we present an efficient collision detection method that uses only proprioceptive robot sensors and also provides directional information for a safe robot reaction after collision. The approach is first developed for rigid robot arms and then extended to the case of robots with elastic joints, proposing different reaction strategies. Experimental results on collisions with the DLR-III lightweight manipulator are reported.
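The generalized-momentum idea can be conveyed in one degree of freedom: for a rigid link with no gravity or friction, the residual r(t) = K·(p(t) − p(0) − ∫(τ_cmd + r) dt) converges to the external torque with time constant 1/K, so a collision shows up as a threshold crossing without any external sensor. A toy sketch — gain, time step, and collision torque are all invented here; the paper's method handles full manipulator dynamics and elastic joints:

```python
def momentum_residual(p, tau_cmd, dt, gain=50.0):
    """Residual r(t) = K * (p(t) - p(0) - integral(tau_cmd + r) dt).
    For a 1-DOF rigid link without gravity/friction, r tracks the
    external (collision) torque with time constant 1/K."""
    r = [0.0]
    integral = 0.0
    for k in range(1, len(p)):
        integral += (tau_cmd[k - 1] + r[-1]) * dt
        r.append(gain * (p[k] - p[0] - integral))
    return r

# Simulated collision: zero commanded torque, external torque of 2.0 N*m
# switched on at t = 0.5 s (numbers invented for illustration).
dt, steps = 1e-3, 1000
tau_ext = [2.0 if k * dt >= 0.5 else 0.0 for k in range(steps)]
p = [0.0]                                  # generalized momentum, unit inertia
for k in range(steps - 1):
    p.append(p[-1] + tau_ext[k] * dt)      # p_dot = tau_cmd + tau_ext, tau_cmd = 0
r = momentum_residual(p, [0.0] * steps, dt)
```

The residual stays at zero before contact and rises to the external torque within a few multiples of 1/K after it, so a simple threshold detects the collision, and in the multi-DOF case the vector of residuals points at the joint(s) loaded by the contact, which is the directional information mentioned above.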

Journal ArticleDOI
TL;DR: In this article, the authors investigate the relationship between location patterns, innovation processes and industrial clusters, and extend a transaction-costs-based classification into a knowledge-based taxonomy of clusters, along with a critical revision of the main assumptions underlying most of the existing literature on spatial clusters.

Journal ArticleDOI
TL;DR: EN by means of ONS is recommended for patients with chronic LD, in whom undernutrition is very common; TF commenced early after liver transplantation can reduce complication rates and costs and is preferable to parenteral nutrition.

Journal ArticleDOI
TL;DR: In this article, a subgroup of the custodial symmetry O(3) that protects ρ from radiative corrections can also protect the Zbb coupling, which allows one to build models of electroweak symmetry breaking, such as Higgsless, Little Higgs or 5D composite Higgs models, that are safe from corrections to Z → bb.


Journal ArticleDOI
Pietro Cortese, G. Dellacasa, Luciano Ramello, M. Sitta  +975 moreInstitutions (78)
TL;DR: The ALICE Collaboration as mentioned in this paper is a general-purpose heavy-ion experiment designed to study the physics of strongly interacting matter and the quark-gluon plasma in nucleus-nucleus collisions at the LHC.
Abstract: ALICE is a general-purpose heavy-ion experiment designed to study the physics of strongly interacting matter and the quark–gluon plasma in nucleus–nucleus collisions at the LHC. It currently involves more than 900 physicists and senior engineers, from both the nuclear and high-energy physics sectors, from over 90 institutions in about 30 countries. The ALICE detector is designed to cope with the highest particle multiplicities above those anticipated for Pb–Pb collisions (dNch/dy up to 8000) and it will be operational at the start-up of the LHC. In addition to heavy systems, the ALICE Collaboration will study collisions of lower-mass ions, which are a means of varying the energy density, and protons (both pp and pA), which primarily provide reference data for the nucleus–nucleus collisions. In addition, the pp data will allow for a number of genuine pp physics studies. The detailed design of the different detector systems has been laid down in a number of Technical Design Reports issued between mid-1998 and the end of 2004. The experiment is currently under construction and will be ready for data taking with both proton and heavy-ion beams at the start-up of the LHC. Since the comprehensive information on detector and physics performance was last published in the ALICE Technical Proposal in 1996, the detector, as well as simulation, reconstruction and analysis software have undergone significant development. The Physics Performance Report (PPR) provides an updated and comprehensive summary of the performance of the various ALICE subsystems, including updates to the Technical Design Reports, as appropriate. The PPR is divided into two volumes. Volume I, published in 2004 (CERN/LHCC 2003-049, ALICE Collaboration 2004 J. Phys. G: Nucl. Part. Phys. 30 1517–1763), contains in four chapters a short theoretical overview and an extensive reference list concerning the physics topics of interest to ALICE, the experimental conditions at the LHC, a short summary and update of the subsystem designs, and a description of the offline framework and Monte Carlo event generators.
The present volume, Volume II, contains the majority of the information relevant to the physics performance in proton–proton, proton–nucleus, and nucleus–nucleus collisions. Following an introductory overview, Chapter 5 describes the combined detector performance and the event reconstruction procedures, based on detailed simulations of the individual subsystems. Chapter 6 describes the analysis and physics reach for a representative sample of physics observables, from global event characteristics to hard processes.

Journal ArticleDOI
TL;DR: CPET indices, such as peak O2 uptake, V′O2 at lactate threshold, the slope of the ventilation–CO2 output relationship and the presence of arterial O2 desaturation, have all been shown to have power in prognostic evaluation.
Abstract: Evidence-based recommendations on the clinical use of cardiopulmonary exercise testing (CPET) in lung and heart disease are presented, with reference to the assessment of exercise intolerance, prognostic assessment and the evaluation of therapeutic interventions (e.g. drugs, supplemental oxygen, exercise training). A commonly used grading system for recommendations in evidence-based guidelines was applied, with the grade of recommendation ranging from A, the highest, to D, the lowest. For symptom-limited incremental exercise, CPET indices, such as peak O2 uptake (V′O2), V′O2 at lactate threshold, the slope of the ventilation–CO2 output relationship and the presence of arterial O2 desaturation, have all been shown to have power in prognostic evaluation. In addition, for assessment of interventions, the tolerable duration of symptom-limited high-intensity constant-load exercise often provides greater sensitivity to discriminate change than the classical incremental test. Field-testing paradigms (e.g. timed and shuttle walking tests) also prove valuable. In turn, these considerations allow the resolution of practical questions that often confront the clinician, such as: 1) "When should an evaluation of exercise intolerance be sought?"; 2) "Which particular form of test should be asked for?"; and 3) "What cluster of variables should be selected when evaluating prognosis for a particular disease or the effect of a particular intervention?"

Journal ArticleDOI
TL;DR: The article explores the interconnections between the feminization of migration, on the one hand, and ongoing change in the Southern European care regimes, on the other, in order to identify issues of efficiency, equity and sustainability raised by this new 'model' of care.
Abstract: Concern over the need to provide long-term care for an ageing population has stimulated a search for new solutions able to ensure financial viability and a better balance between demand and supply of care. There is at present a great variety of care regimes across industrial countries, with Mediterranean countries forming a distinctive cluster where management of care is overwhelmingly entrusted to the family. In some of these countries elderly care has recently attracted large flows of care migrants, ushering in a new division of labour among family carers (mainly women), female immigrants, and skilled native workers. The article explores the interconnections between the feminization of migration, on the one hand, and ongoing change in the Southern European care regimes, on the other hand. Different strands of the literature are brought together and reviewed to illustrate ongoing developments. One main objective is to identify issues of efficiency, equity and sustainability raised by this new ‘model’ of ...

Journal ArticleDOI
TL;DR: In this paper, a combination of bubble modeling and acoustic observations of rising bubbles was used to determine what fraction of the methane transported by bubbles will reach the atmosphere, and the model was validated using methane and argon bubble dissolution measurements obtained from the literature for deep, oxic, saline water with excellent results.
Abstract: There is growing concern about the transfer of methane originating from water bodies to the atmosphere. Methane from sediments can reach the atmosphere directly via bubbles or indirectly via vertical turbulent transport. This work quantifies methane gas bubble dissolution using a combination of bubble modeling and acoustic observations of rising bubbles to determine what fraction of the methane transported by bubbles will reach the atmosphere. The bubble model predicts the evolving bubble size, gas composition, and rise distance and is suitable for almost all aquatic environments. The model was validated using methane and argon bubble dissolution measurements obtained from the literature for deep, oxic, saline water with excellent results. Methane bubbles from within the hydrate stability zone (typically below ∼500 m water depth in the ocean) are believed to form an outer hydrate rim. To explain the subsequent slow dissolution, a model calibration was performed using bubble dissolution data from the literature measured within the hydrate stability zone. The calibrated model explains the impressively tall flares (>1300 m) observed in the hydrate stability zone of the Black Sea. This study suggests that only a small amount of methane reaches the surface at active seep sites in the Black Sea, and this only from very shallow water areas (<100 m). Clearly, the Black Sea and the ocean are rather effective barriers against the transfer of bubble methane to the atmosphere, although substantial amounts of methane may reach the surface in shallow lakes and reservoirs.
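The core of such a bubble model is a mass balance: as the bubble rises, methane crosses the interface at a rate proportional to surface area and to the Henry's-law solubility at the local pressure, while the radius readjusts through the gas law. A deliberately stripped-down single-bubble sketch — constant rise speed and mass-transfer coefficient, pure methane, no hydrate skin, all constants approximate or invented — rather than the paper's full model:

```python
import math

R_GAS = 8.314          # J/(mol K)
T = 288.0              # K, assumed water temperature
RHO_W_G = 1000.0 * 9.81  # water density x gravity: hydrostatic Pa per metre
P_ATM = 101325.0       # Pa
H_CP = 1.4e-5          # mol/(m^3 Pa), approximate Henry constant for CH4
K_L = 1e-4             # m/s, assumed mass-transfer coefficient
U = 0.25               # m/s, assumed constant rise speed

def surfacing_fraction(depth_m, radius_m, dt=0.05):
    """Fraction of the initial CH4 moles still inside the bubble at the surface."""
    pressure = P_ATM + RHO_W_G * depth_m
    volume = 4.0 / 3.0 * math.pi * radius_m ** 3
    n = n0 = pressure * volume / (R_GAS * T)   # ideal-gas moles at release depth
    z = depth_m
    while z > 0.0 and n > 0.0:
        pressure = P_ATM + RHO_W_G * z
        volume = n * R_GAS * T / pressure                    # bubble expands as it rises
        r = (3.0 * volume / (4.0 * math.pi)) ** (1.0 / 3.0)
        c_sat = H_CP * pressure                              # Henry's law at the wall
        n -= K_L * 4.0 * math.pi * r ** 2 * c_sat * dt       # dissolution flux
        z -= U * dt
    return max(n, 0.0) / n0
```

Even this toy version reproduces the qualitative conclusion above: the surfacing fraction falls quickly with release depth, so only shallow seeps deliver much methane directly to the atmosphere.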

Journal ArticleDOI
TL;DR: Theoretical results suggest that the reduction of the number of bonded nearest neighbors offers the possibility of generating liquid states with temperature T lower than the liquid-gas critical temperature with a vanishing occupied packing fraction (φ), a case which cannot be realized with spherically interacting particles.
Abstract: We report theoretical and numerical evaluations of the phase diagram for patchy colloidal particles of new generation. We show that the reduction of the number of bonded nearest neighbors offers the possibility of generating liquid states (i.e., states with temperature T lower than the liquid-gas critical temperature) with a vanishing occupied packing fraction (φ), a case which cannot be realized with spherically interacting particles. Theoretical results suggest that such reduction is accompanied by an increase of the region of stability of the liquid phase in the (T–φ) plane, possibly favoring the establishment of homogeneous disordered materials at small φ, i.e., stable equilibrium gels. The physico-chemical manipulation of colloidal particles is growing at an incredible pace. The large freedom in the control of the interparticle potential has made it possible to design colloidal particles which significantly extend the possibilities offered by atomic systems [1]. An impressive step further is offered by the newly developed techniques to assemble (and produce with significant yield) colloidal molecules, particles decorated on their surface by a predefined number of attractive sticky spots, i.e., particles with specifically designed shapes and interaction sites [2–5]. These new particles, thanks to the specificity of the built-in interactions, will be able not only to reproduce molecular systems on the nano and micro scale, but will also show novel collective behaviors. To guide future applications of patchy colloids, to help in designing bottom-up strategies in self-assembly [6–8], and to tackle the issue of interplay between dynamic arrest and crystallization—a hot topic related, for example, to the possibility of nucleating a colloidal diamond crystal structure for photonic applications [9]—it is crucial to be able to predict the region in the (T–φ) plane in which clustering, phase separation, or even gelation is expected.
While design and production of patchy colloids is present-day research, unexpectedly theoretical studies of the physical properties of these systems have a longer history, starting in the eighties in the context of the physics of associated liquids [10–15]. These studies, in the attempt to pin down the essential features of association, modeled molecules as hard-core particles with attractive spots on the surface, a realistic description of the recently created patchy colloidal particles. A thermodynamic perturbation theory (TPT) appropriate for these models was introduced by Wertheim [16] to describe association under the hypothesis that a sticky site on a particle cannot bind simultaneously to two (or more) sites on another particle. Such a condition can be naturally implemented in colloids, due to the relative size of the particle as compared to the range of the sticky interaction. These old studies provide a very valuable starting point for addressing the issue of the phase diagram of this new class of colloids and, in particular, of the role of the number of patches. In this Letter, we study a system of hard-sphere particles with a small number M of identical short-ranged, square-well attraction sites per particle (sticky spots), distributed on the surface with the same geometry as the recently produced patchy colloidal particles [4]. We identify the number of possible bonds per particle as the key parameter controlling the location of the critical point, as opposed to the fraction of surface covered by attractive patches. We present results of extensive numerical simulations of this model in the grand-canonical ensemble [17] to evaluate the location of the critical point of the system in the (T–φ) plane as a function of M. We complement the simulation results with the evaluation of the region of thermodynamic instability according to the Wertheim theory [16,18,19].
Both theory and simulation confirm that, on decreasing the number of sticky sites, the critical point moves toward vanishing φ.

Journal ArticleDOI
TL;DR: Risks of completed and attempted suicide were consistently lower, by approximately 80%, during treatment of bipolar and other major affective disorder patients with lithium for an average of 18 months, and these benefits were sustained in randomized as well as open clinical trials.
Abstract: Objectives: To update and extend comparisons of rates of suicides and suicide attempts among patients with major affective disorders with versus without long-term lithium treatment. Methods: Broad searching yielded 45 studies providing rates of suicidal acts during lithium treatment, including 34 also providing rates without lithium treatment. We scored study quality, tested between-study variance, and examined suicidal rates on versus off lithium by meta-analytic methods to determine risk ratios (RRs) and 95% confidence intervals (CI). Results: In 31 studies suitable for meta-analysis, involving a total of 85,229 person-years of risk-exposure, the overall risk of suicides and attempts was five times less among lithium-treated subjects than among those not treated with lithium (RR = 4.91, 95% CI 3.82–6.31, p < 0.0001). Similar effects were found with other meta-analytic methods, as well as for completed versus attempted suicide, and for bipolar versus major mood disorder patients. Studies with higher quality ratings, including randomized, controlled trials, involved shorter exposures with somewhat lesser lithium superiority. Omitting one very large study or those involving lithium-discontinuation had little effect on the results. The incidence-ratio of attempts-to-suicides increased 2.5 times with lithium-treatment, indicating reduced lethality of suicidal acts. There was no indication of bias toward reporting positive findings, nor were outcomes significantly influenced by publication-year or study size. Conclusions: Risks of completed and attempted suicide were consistently lower, by approximately 80%, during treatment of bipolar and other major affective disorder patients with lithium for an average of 18 months. These benefits were sustained in randomized as well as open clinical trials.
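A summary risk ratio such as the RR = 4.91 above is typically obtained by inverse-variance pooling of study-level log risk ratios, with each study weighted by the precision of its estimate. A hedged sketch of the fixed-effect version with invented study-level inputs, not the 31 studies analysed here:

```python
import math

def pool_fixed_effect(rrs, ci_lows, ci_highs, z=1.96):
    """Fixed-effect inverse-variance pooling of risk ratios reported with
    95% CIs; the SE of each log-RR is recovered from its interval width."""
    num = den = 0.0
    for rr_i, lo_i, hi_i in zip(rrs, ci_lows, ci_highs):
        se = (math.log(hi_i) - math.log(lo_i)) / (2.0 * z)  # back out SE from CI
        weight = 1.0 / se ** 2
        num += weight * math.log(rr_i)
        den += weight
    log_rr = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_rr),
            math.exp(log_rr - z * se_pooled),
            math.exp(log_rr + z * se_pooled))

# Invented inputs for illustration only
rr, lo, hi = pool_fixed_effect([4.0, 6.0, 5.0], [2.0, 3.0, 2.5], [8.0, 12.0, 10.0])
```

The pooled estimate lies between the study estimates, and its interval is narrower than any single study's, which is the point of combining 85,229 person-years of exposure.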

Journal ArticleDOI
TL;DR: The effects of anti-inflammatory, immunosuppressive and biological drugs during pregnancy and lactation, their effects on male and female fertility, and possible long-term effects on infants exposed to these drugs antenatally are discussed.
Abstract: Rheumatic diseases in women of childbearing years may necessitate drug treatment during a pregnancy, to control maternal disease activity and to ensure a successful pregnancy outcome. This survey is based on a consensus workshop of international experts discussing effects of anti-inflammatory, immunosuppressive and biological drugs during pregnancy and lactation. In addition, effects of these drugs on male and female fertility and possible long-term effects on infants exposed to drugs antenatally are discussed where data were available. Recommendations for drug treatment during pregnancy and lactation are given.

Journal ArticleDOI
TL;DR: Conservation plans should include an estimation of commission and omission errors in underlying species data and explicitly use this information to influence conservation planning outcomes.
Abstract: Data on the occurrence of species are widely used to inform the design of reserve networks. These data contain commission errors (when a species is mistakenly thought to be present) and omission errors (when a species is mistakenly thought to be absent), and the rates of the two types of error are inversely related. Point locality data can minimize commission errors, but those obtained from museum collections are generally sparse, suffer from substantial spatial bias and contain large omission errors. Geographic ranges generate large commission errors because they assume homogenous species distributions. Predicted distribution data make explicit inferences on species occurrence and their commission and omission errors depend on model structure, on the omission of variables that determine species distribution and on data resolution. Omission errors lead to identifying networks of areas for conservation action that are smaller than required and centred on known species occurrences, thus affecting the comprehensiveness, representativeness and efficiency of selected areas. Commission errors lead to selecting areas not relevant to conservation, thus affecting the representativeness and adequacy of reserve networks. Conservation plans should include an estimation of commission and omission errors in underlying species data and explicitly use this information to influence conservation planning outcomes.

Journal ArticleDOI
01 Jan 2006-Genetics
TL;DR: The surplus of nonsynonymous mutations is a general feature of the young branches of the phylogenetic tree, affecting also those that are found only in Africa, and a new calibration method is introduced to estimate the coalescent times of mtDNA haplogroups.
Abstract: The high mutation rate in mammalian mitochondrial DNA generates a highly divergent pool of alleles even within species that have dispersed and expanded in size recently. Phylogenetic analysis of 277 human mitochondrial genomes revealed a significant (P < 0.01) excess of rRNA and nonsynonymous base substitutions among hotspots of recurrent mutation. Most hotspots involved transitions from guanine to adenine that, with thymine-to-cytosine transitions, illustrate the asymmetric bias in codon usage at synonymous sites on the heavy-strand DNA. The mitochondrion-encoded tRNA(Thr) varied significantly more than any other tRNA gene. Threonine and valine codons were involved in 259 of the 414 amino acid replacements observed. The ratio of nonsynonymous changes from and to threonine and valine differed significantly (P = 0.003) between populations with neutral (22/58) and populations with significantly negative Tajima's D values (70/76), independent of their geographic location. In contrast to a recent suggestion that the excess of nonsilent mutations is characteristic of Arctic populations, implying a role in cold adaptation, we demonstrate that the surplus of nonsynonymous mutations is a general feature of the young branches of the phylogenetic tree, affecting also those that are found only in Africa. We introduce a new calibration method of the mutation rate of synonymous transitions to estimate the coalescent times of mtDNA haplogroups.
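The synonymous/nonsynonymous distinction that the abstract's counts rest on can be sketched in a few lines. This is a hypothetical illustration (the function name and the truncated codon table are mine, not the paper's method): a base substitution in a codon is silent exactly when the mutated codon encodes the same amino acid.

```python
# Fragment of the standard genetic code, enough for the examples below.
# (Threonine and valine families, which dominate the replacements the
# abstract reports, plus the two mutant targets used here.)
CODON_TABLE = {
    "ACU": "Thr", "ACC": "Thr", "ACA": "Thr", "ACG": "Thr",
    "GUU": "Val", "GUC": "Val", "GUA": "Val", "GUG": "Val",
    "GCU": "Ala", "AUU": "Ile",
}

def classify(codon, pos, new_base):
    """Classify a single substitution at `pos` (0-based) in `codon`."""
    mutant = codon[:pos] + new_base + codon[pos + 1:]
    same_aa = CODON_TABLE[codon] == CODON_TABLE[mutant]
    return "synonymous" if same_aa else "nonsynonymous"

# Third-position transitions within a codon family are typically silent:
print(classify("ACU", 2, "C"))   # ACU -> ACC, both Thr
# A first-position change alters the protein (Thr -> Ala):
print(classify("ACU", 0, "G"))   # ACU -> GCU
```

Counting such classifications over the branches of a phylogeny is what yields ratios like the 22/58 versus 70/76 contrast reported above.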

Journal ArticleDOI
TL;DR: An overview of the results obtained with lattice models of fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches is presented.
Abstract: Disorder and long-range interactions are two of the key components that make material failure an interesting playground for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture, in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subjected to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances ...
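The essential ingredients of such models, random failure thresholds under a quasi-statically increasing load, can be shown with the simplest relative of the fuse network: an equal-load-sharing fiber bundle. The sketch below is illustrative only (parameters and threshold distribution are my assumptions, not from the text).

```python
# Minimal fiber-bundle analogue of a random-threshold lattice model:
# N fibers share the external load F equally, so each intact fiber
# carries stress F / n_intact; a fiber fails when its stress exceeds
# its random threshold, which redistributes load onto the survivors.
import random

random.seed(0)
N = 1000
thresholds = sorted(random.random() for _ in range(N))  # uniform in [0, 1]

def max_load(thresholds):
    """Largest total load the bundle sustains before complete failure."""
    n = len(thresholds)
    # With thresholds sorted, once the k weakest fibers have failed the
    # bundle can carry per-fiber stress thresholds[k] on n - k survivors.
    return max((n - k) * thresholds[k] for k in range(n))

strength = max_load(thresholds) / N
print(f"strength per fiber: {strength:.3f}")
# For uniform thresholds the strength per fiber approaches 1/4 as N grows,
# while finite samples fluctuate around it -- the size dependence and
# sample-to-sample fluctuations mentioned in the abstract.
```

Replacing equal load sharing with redistribution to nearest neighbours on a lattice recovers the qualitative phenomenology (avalanches, crackling noise) discussed above.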

Journal Article
TL;DR: It is reported that 22 of 129 individuals with Noonan syndrome without PTPN11 or KRAS mutation have missense mutations in SOS1, which encodes a RAS-specific guanine nucleotide exchange factor, and this finding defines a new mechanism by which upregulation of the RAS pathway can profoundly change human development.
Abstract: Noonan syndrome (NS) is a developmental disorder characterized by short stature, facial dysmorphia, congenital heart defects and skeletal anomalies [1]. Increased RAS-mitogen-activated protein kinase (MAPK) signaling due to PTPN11 and KRAS mutations causes 50 percent of NS cases [2-6]. Here, we report that 22 of 129 NS patients without PTPN11 or KRAS mutation (17 percent) have missense mutations in SOS1, which encodes a RAS-specific guanine nucleotide exchange factor (GEF). SOS1 mutations cluster at residues implicated in the maintenance of SOS1 in its autoinhibited form, and ectopic expression of two NS-associated mutants induced enhanced RAS activation. The phenotype associated with SOS1 defects is distinctive, although within the NS spectrum, with a high prevalence of ectodermal abnormalities but generally normal development and linear growth. Our findings implicate for the first time gain-of-function mutations in a RAS GEF in inherited disease and define a new mechanism by which upregulation of the RAS pathway can profoundly change human development.

Proceedings Article
02 Jun 2006
TL;DR: In this article, the authors study the data complexity of answering conjunctive queries over Description Logic knowledge bases and show that the Description Logics of the DL-Lite family are the maximal logics that allow query answering over very large ABoxes.
Abstract: In this paper we study the data complexity of answering conjunctive queries over Description Logic knowledge bases constituted by an ABox and a TBox. In particular, we are interested in characterizing the FOL-reducibility and the polynomial tractability boundaries of conjunctive query answering, depending on the expressive power of the Description Logic used to specify the knowledge base. FOL-reducibility means that query answering can be reduced to evaluating queries over the database corresponding to the ABox. Since first-order queries can be expressed in SQL, the importance of FOL-reducibility is that, when query answering enjoys this property, we can take advantage of database management system (DBMS) techniques both for representing data, i.e., ABox assertions, and for answering queries via reformulation into SQL. What emerges from our complexity analysis is that the Description Logics of the DL-Lite family are the maximal logics allowing conjunctive query answering through standard database technology. In this sense, they are the first Description Logics specifically tailored for effective query answering over very large ABoxes.
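The idea of FOL-reducibility, compiling the TBox into the query so that only the ABox database is consulted, can be sketched concretely. The following toy encoding is mine, not the paper's algorithm: a query for all instances of a concept is rewritten, using TBox concept inclusions, into a UNION of SQL SELECTs over the ABox tables, with one hypothetical unary table per atomic concept.

```python
# Toy TBox: each pair (sub, sup) is a concept inclusion "sub is a sup".
TBOX = [("Professor", "Teacher"), ("AssistantProfessor", "Professor")]

def subsumed_by(concept, tbox):
    """All atomic concepts whose instances the TBox entails to be `concept`."""
    result = {concept}
    changed = True
    while changed:  # transitive closure of the inclusion axioms
        changed = False
        for sub, sup in tbox:
            if sup in result and sub not in result:
                result.add(sub)
                changed = True
    return result

def rewrite(concept, tbox):
    """Compile the query q(x) :- concept(x) into TBox-independent SQL."""
    tables = sorted(subsumed_by(concept, tbox))
    return "\nUNION\n".join(f"SELECT ind FROM {t}" for t in tables)

print(rewrite("Teacher", TBOX))
# The resulting UNION query is evaluated by a plain DBMS over the ABox
# tables; the TBox is no longer needed at query time.
```

Full DL-Lite rewriting also handles role atoms, existential restrictions and joins in conjunctive queries, but the principle is the same: the reformulated query is first-order, so a standard SQL engine suffices.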