
Showing papers by "University of Paris" published in 2016


Journal ArticleDOI
Daniel J. Klionsky1, Kotb Abdelmohsen2, Akihisa Abe3, Joynal Abedin4 +2519 more (695 institutions)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations


Journal ArticleDOI
TL;DR: Pfam is now primarily based on the UniProtKB reference proteomes, with the counts of matched sequences and species reported on the website restricted to this smaller set, and the facility to view the relationship between families within a clan has been improved by the introduction of a new tool.
Abstract: In the last two years the Pfam database (http://pfam.xfam.org) has undergone a substantial reorganisation to reduce the effort involved in making a release, thereby permitting more frequent releases. Arguably the most significant of these changes is that Pfam is now primarily based on the UniProtKB reference proteomes, with the counts of matched sequences and species reported on the website restricted to this smaller set. Building families on reference proteomes sequences brings greater stability, which decreases the amount of manual curation required to maintain them. It also reduces the number of sequences displayed on the website, whilst still providing access to many important model organisms. Matches to the full UniProtKB database are, however, still available and Pfam annotations for individual UniProtKB sequences can still be retrieved. Some Pfam entries (1.6%) which have no matches to reference proteomes remain; we are working with UniProt to see if sequences from them can be incorporated into reference proteomes. Pfam-B, the automatically-generated supplement to Pfam, has been removed. The current release (Pfam 29.0) includes 16 295 entries and 559 clans. The facility to view the relationship between families within a clan has been improved by the introduction of a new tool.

4,906 citations


Journal ArticleDOI
23 Feb 2016-JAMA
TL;DR: Clinician recognition of ARDS was associated with higher PEEP, greater use of neuromuscular blockade, and prone positioning, which indicates the potential for improvement in the management of patients with ARDS.
Abstract: IMPORTANCE Limited information exists about the epidemiology, recognition, management, and outcomes of patients with the acute respiratory distress syndrome (ARDS). OBJECTIVES To evaluate intensive ...

3,259 citations


Journal ArticleDOI
TL;DR: In this paper, the authors review recent progress, from both in situ experiments and advanced simulation techniques, in understanding the charge storage mechanism in carbon- and oxide-based supercapacitors.
Abstract: Supercapacitors are electrochemical energy storage devices that operate on the simple mechanism of adsorption of ions from an electrolyte on a high-surface-area electrode. Over the past decade, the performance of supercapacitors has greatly improved, as electrode materials have been tuned at the nanoscale and electrolytes have gained an active role, enabling more efficient storage mechanisms. In porous carbon materials with subnanometre pores, the desolvation of the ions leads to surprisingly high capacitances. Oxide materials store charge by surface redox reactions, leading to the pseudocapacitive effect. Understanding the physical mechanisms underlying charge storage in these materials is important for further development of supercapacitors. Here we review recent progress, from both in situ experiments and advanced simulation techniques, in understanding the charge storage mechanism in carbon- and oxide-based supercapacitors. We also discuss the challenges that still need to be addressed for building better supercapacitors.

1,565 citations


Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, M. R. Abernathy1 +976 more (107 institutions)
TL;DR: It is found that the final remnant's mass and spin, as determined from the low-frequency and high-frequency phases of the signal, are mutually consistent with the binary black-hole solution in general relativity.
Abstract: The LIGO detection of GW150914 provides an unprecedented opportunity to study the two-body motion of a compact-object binary in the large-velocity, highly nonlinear regime, and to witness the final merger of the binary and the excitation of uniquely relativistic modes of the gravitational field. We carry out several investigations to determine whether GW150914 is consistent with a binary black-hole merger in general relativity. We find that the final remnant’s mass and spin, as determined from the low-frequency (inspiral) and high-frequency (postinspiral) phases of the signal, are mutually consistent with the binary black-hole solution in general relativity. Furthermore, the data following the peak of GW150914 are consistent with the least-damped quasinormal mode inferred from the mass and spin of the remnant black hole. By using waveform models that allow for parametrized general-relativity violations during the inspiral and merger phases, we perform quantitative tests on the gravitational-wave phase in the dynamical regime and we determine the first empirical bounds on several high-order post-Newtonian coefficients. We constrain the graviton Compton wavelength, assuming that gravitons are dispersed in vacuum in the same way as particles with mass, obtaining a 90%-confidence lower bound of 10^13 km. In conclusion, within our statistical uncertainties, we find no evidence for violations of general relativity in the genuinely strong-field regime of gravity.

1,421 citations
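As a back-of-the-envelope illustration (not part of the paper's analysis), the quoted 90%-confidence lower bound on the graviton Compton wavelength can be converted into an upper bound on the graviton mass through lambda_g = h/(m_g c); a minimal Python sketch:

# Convert the lower bound lambda_g > 10^13 km into an upper bound on the graviton mass.
# Constants are rounded CODATA values; this is only an order-of-magnitude check.
h = 6.626e-34              # Planck constant, J s
c = 2.998e8                # speed of light, m/s
eV = 1.602e-19             # joules per electronvolt

lambda_g = 1e13 * 1e3      # 10^13 km expressed in metres
m_g_kg = h / (lambda_g * c)        # upper bound on the graviton mass, kg
m_g_eV = m_g_kg * c**2 / eV        # the same bound expressed in eV/c^2
print(f"m_g <= {m_g_eV:.1e} eV/c^2")   # roughly 1.2e-22 eV/c^2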


Journal ArticleDOI
TL;DR: This work developed REVEL (rare exome variant ensemble learner), an ensemble method for predicting the pathogenicity of missense variants on the basis of individual tools: MutPred, FATHMM, VEST, PolyPhen, SIFT, PROVEAN, MutationAssessor, LRT, GERP, SiPhy, phyloP, and phastCons.
Abstract: The vast majority of coding variants are rare, and assessment of the contribution of rare variants to complex traits is hampered by low statistical power and limited functional data. Improved methods for predicting the pathogenicity of rare coding variants are needed to facilitate the discovery of disease variants from exome sequencing studies. We developed REVEL (rare exome variant ensemble learner), an ensemble method for predicting the pathogenicity of missense variants on the basis of individual tools: MutPred, FATHMM, VEST, PolyPhen, SIFT, PROVEAN, MutationAssessor, MutationTaster, LRT, GERP, SiPhy, phyloP, and phastCons. REVEL was trained with recently discovered pathogenic and rare neutral missense variants, excluding those previously used to train its constituent tools. When applied to two independent test sets, REVEL had the best overall performance (p < 10^−12) as compared to any individual tool and seven ensemble methods: MetaSVM, MetaLR, KGGSeq, Condel, CADD, DANN, and Eigen. Importantly, REVEL also had the best performance for distinguishing pathogenic from rare neutral variants with allele frequencies

1,295 citations
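The abstract describes REVEL as an ensemble learner built on the scores of the individual tools listed above. Purely as an illustrative sketch of that pattern (the feature columns and labels below are placeholders, not REVEL's actual training data or published model):

import numpy as np
from sklearn.ensemble import RandomForestClassifier

tools = ["MutPred", "FATHMM", "VEST", "PolyPhen", "SIFT", "PROVEAN",
         "MutationAssessor", "MutationTaster", "LRT", "GERP", "SiPhy",
         "phyloP", "phastCons"]

rng = np.random.default_rng(42)
X = rng.random((1000, len(tools)))    # stand-in per-variant tool scores
y = rng.integers(0, 2, size=1000)     # 1 = pathogenic, 0 = rare neutral (placeholder labels)

meta = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
ensemble_score = meta.predict_proba(X[:5])[:, 1]   # combined pathogenicity score per variant
print(dict(zip(tools, meta.feature_importances_.round(3))))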


Journal ArticleDOI
TL;DR: Among women with early-stage breast cancer who were at high clinical risk and low genomic risk for recurrence, the receipt of no chemotherapy on the basis of the 70-gene signature led to a 5-year rate of survival without distant metastasis that was 1.5 percentage points lower than the rate with chemotherapy.
Abstract: BackgroundThe 70-gene signature test (MammaPrint) has been shown to improve prediction of clinical outcome in women with early-stage breast cancer. We sought to provide prospective evidence of the clinical utility of the addition of the 70-gene signature to standard clinical–pathological criteria in selecting patients for adjuvant chemotherapy. MethodsIn this randomized, phase 3 study, we enrolled 6693 women with early-stage breast cancer and determined their genomic risk (using the 70-gene signature) and their clinical risk (using a modified version of Adjuvant! Online). Women at low clinical and genomic risk did not receive chemotherapy, whereas those at high clinical and genomic risk did receive such therapy. In patients with discordant risk results, either the genomic risk or the clinical risk was used to determine the use of chemotherapy. The primary goal was to assess whether, among patients with high-risk clinical features and a low-risk gene-expression profile who did not receive chemotherapy, the...

1,291 citations


Journal ArticleDOI
19 Apr 2016-Test
TL;DR: The present article reviews the most recent theoretical and methodological developments for random forests, with special attention given to the selection of parameters, the resampling mechanism, and variable importance measures.
Abstract: The random forest algorithm, proposed by L. Breiman in 2001, has been extremely successful as a general-purpose classification and regression method. The approach, which combines several randomized decision trees and aggregates their predictions by averaging, has shown excellent performance in settings where the number of variables is much larger than the number of observations. Moreover, it is versatile enough to be applied to large-scale problems, is easily adapted to various ad hoc learning tasks, and returns measures of variable importance. The present article reviews the most recent theoretical and methodological developments for random forests. Emphasis is placed on the mathematical forces driving the algorithm, with special attention given to the selection of parameters, the resampling mechanism, and variable importance measures. This review is intended to provide non-experts easy access to the main ideas.

1,279 citations
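A minimal sketch of the idea summarised above: bootstrap-resample the data, grow a randomized decision tree on each sample, and aggregate the tree predictions by averaging (illustrative only; in practice sklearn.ensemble.RandomForestRegressor would be used directly):

import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(300)

trees = []
for _ in range(100):                                   # number of trees in the forest
    idx = rng.integers(0, len(X), size=len(X))         # bootstrap resample of the data
    tree = DecisionTreeRegressor(max_features="sqrt")  # random subset of variables per split
    trees.append(tree.fit(X[idx], y[idx]))

X_new = rng.uniform(-3, 3, size=(10, 5))
forest_prediction = np.mean([t.predict(X_new) for t in trees], axis=0)  # aggregate by averaging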


Journal ArticleDOI
TL;DR: An updated review of the literature and evidence on the definitions and lexicon, the limits, the natural history, the markers of progression, and the ethical consequence of detecting the disease at this asymptomatic stage of Alzheimer's disease are provided.
Abstract: During the past decade, a conceptual shift occurred in the field of Alzheimer's disease (AD) considering the disease as a continuum. Thanks to evolving biomarker research and substantial discoveries, it is now possible to identify the disease even at the preclinical stage before the occurrence of the first clinical symptoms. This preclinical stage of AD has become a major research focus as the field postulates that early intervention may offer the best chance of therapeutic success. To date, very little evidence is established on this "silent" stage of the disease. A clarification is needed about the definitions and lexicon, the limits, the natural history, the markers of progression, and the ethical consequence of detecting the disease at this asymptomatic stage. This article is aimed at addressing all the different issues by providing for each of them an updated review of the literature and evidence, with practical recommendations.

1,235 citations


Journal ArticleDOI
01 Mar 2016-Gut
TL;DR: A. muciniphila is associated with a healthier metabolic status and better clinical outcomes after CR in overweight/obese adults, and the interaction between gut microbiota ecology and A. muciniphila warrants further investigation.
Abstract: OBJECTIVE: Individuals with obesity and type 2 diabetes differ from lean and healthy individuals in their abundance of certain gut microbial species and microbial gene richness. Abundance of Akkermansia muciniphila, a mucin-degrading bacterium, has been inversely associated with body fat mass and glucose intolerance in mice, but more evidence is needed in humans. The impact of diet and weight loss on this bacterial species is unknown. Our objective was to evaluate the association between faecal A. muciniphila abundance, faecal microbiome gene richness, diet, host characteristics, and their changes after calorie restriction (CR). DESIGN: The intervention consisted of a 6-week CR period followed by a 6-week weight stabilisation diet in overweight and obese adults (N=49, including 41 women). Faecal A. muciniphila abundance, faecal microbial gene richness, diet and bioclinical parameters were measured at baseline and after CR and weight stabilisation. RESULTS: At baseline A. muciniphila was inversely related to fasting glucose, waist-to-hip ratio and subcutaneous adipocyte diameter. Subjects with higher gene richness and A. muciniphila abundance exhibited the healthiest metabolic status, particularly in fasting plasma glucose, plasma triglycerides and body fat distribution. Individuals with higher baseline A. muciniphila displayed greater improvement in insulin sensitivity markers and other clinical parameters after CR. These participants also experienced a reduction in A. muciniphila abundance, but it remained significantly higher than in individuals with lower baseline abundance. A. muciniphila was associated with microbial species known to be related to health. CONCLUSIONS: A. muciniphila is associated with a healthier metabolic status and better clinical outcomes after CR in overweight/obese adults. The interaction between gut microbiota ecology and A. muciniphila warrants further investigation. TRIAL REGISTRATION NUMBER: NCT01314690

1,224 citations


Journal ArticleDOI
University of East Anglia1, University of Oslo2, Commonwealth Scientific and Industrial Research Organisation3, University of Exeter4, Oak Ridge National Laboratory5, National Oceanic and Atmospheric Administration6, Woods Hole Research Center7, University of California, San Diego8, Karlsruhe Institute of Technology9, Cooperative Institute for Marine and Atmospheric Studies10, Centre national de la recherche scientifique11, University of Maryland, College Park12, National Institute of Water and Atmospheric Research13, Woods Hole Oceanographic Institution14, Flanders Marine Institute15, Alfred Wegener Institute for Polar and Marine Research16, Netherlands Environmental Assessment Agency17, University of Illinois at Urbana–Champaign18, Leibniz Institute of Marine Sciences19, Max Planck Society20, University of Paris21, Hobart Corporation22, University of Bern23, Oeschger Centre for Climate Change Research24, National Center for Atmospheric Research25, University of Miami26, Council of Scientific and Industrial Research27, University of Colorado Boulder28, National Institute for Environmental Studies29, Joint Institute for the Study of the Atmosphere and Ocean30, Geophysical Institute, University of Bergen31, Goddard Space Flight Center32, Montana State University33, University of New Hampshire34, Bjerknes Centre for Climate Research35, Imperial College London36, Lamont–Doherty Earth Observatory37, Auburn University38, Wageningen University and Research Centre39, VU University Amsterdam40, Met Office41
TL;DR: In this article, the authors quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community.
Abstract: . Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere – the “global carbon budget” – is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates and consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, respectively, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models. We compare the mean land and ocean fluxes and their variability to estimates from three atmospheric inverse methods for three broad latitude bands. All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. For the last decade available (2006–2015), EFF was 9.3 ± 0.5 GtC yr−1, ELUC 1.0 ± 0.5 GtC yr−1, GATM 4.5 ± 0.1 GtC yr−1, SOCEAN 2.6 ± 0.5 GtC yr−1, and SLAND 3.1 ± 0.9 GtC yr−1. For year 2015 alone, the growth in EFF was approximately zero and emissions remained at 9.9 ± 0.5 GtC yr−1, showing a slowdown in growth of these emissions compared to the average growth of 1.8 % yr−1 that took place during 2006–2015. Also, for 2015, ELUC was 1.3 ± 0.5 GtC yr−1, GATM was 6.3 ± 0.2 GtC yr−1, SOCEAN was 3.0 ± 0.5 GtC yr−1, and SLAND was 1.9 ± 0.9 GtC yr−1. GATM was higher in 2015 compared to the past decade (2006–2015), reflecting a smaller SLAND for that year. The global atmospheric CO2 concentration reached 399.4 ± 0.1 ppm averaged over 2015. For 2016, preliminary data indicate the continuation of low growth in EFF with +0.2 % (range of −1.0 to +1.8 %) based on national emissions projections for China and USA, and projections of gross domestic product corrected for recent changes in the carbon intensity of the economy for the rest of the world. In spite of the low growth of EFF in 2016, the growth rate in atmospheric CO2 concentration is expected to be relatively high because of the persistence of the smaller residual terrestrial sink (SLAND) in response to El Nino conditions of 2015–2016. From this projection of EFF and assumed constant ELUC for 2016, cumulative emissions of CO2 will reach 565 ± 55 GtC (2075 ± 205 GtCO2) for 1870–2016, about 75 % from EFF and 25 % from ELUC. 
This living data update documents changes in the methods and data sets used in this new carbon budget compared with previous publications of this data set (Le Quere et al., 2015b, a, 2014, 2013). All observations presented here can be downloaded from the Carbon Dioxide Information Analysis Center ( doi:10.3334/CDIAC/GCP_2016 ).
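Since the abstract defines the residual land sink by difference, the 2006-2015 decadal means quoted above close the budget identity E_FF + E_LUC = G_ATM + S_OCEAN + S_LAND to within rounding; a quick arithmetic check:

# 2006-2015 decadal means from the abstract, in GtC per year.
E_FF, E_LUC = 9.3, 1.0                    # fossil fuels & industry, land-use change
G_ATM, S_OCEAN, S_LAND = 4.5, 2.6, 3.1    # atmospheric growth, ocean sink, land sink

sources = E_FF + E_LUC                    # 10.3 GtC/yr emitted
sinks = G_ATM + S_OCEAN + S_LAND          # 10.2 GtC/yr accounted for
print(f"sources {sources:.1f} vs sinks {sinks:.1f} GtC/yr, residual {sources - sinks:.1f}")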

Journal ArticleDOI
TL;DR: It is shown that screening for stx, eae, espK, and espV, in association with the CRISPRO26:H11 marker is a better approach to narrow down the EHEC screening step in beef enrichments, and the number of potentially positive samples was reduced by 48.88% by means of this alternative strategy.
Abstract: Current methods for screening Enterohemorrhagic Escherichia coli (EHEC) O157 and non-O157 in beef enrichments typically rely on the molecular detection of stx, eae, and serogroup-specific wzx or wzy gene fragments. As these genetic markers can also be found in some non-EHEC strains, a number of “false positive” results are obtained. Here, we explore the suitability of five novel molecular markers, espK, espV, ureD, Z2098, and CRISPRO26:H11 as candidates for a more accurate screening of EHEC strains of greater clinical significance in industrialized countries. Of the 1739 beef enrichments tested, 180 were positive for both stx and eae genes. Ninety (50%) of these tested negative for espK, espV, ureD, and Z2098, but 12 out of these negative samples were positive for the CRISPRO26:H11 gene marker specific for a newly emerging virulent EHEC O26:H11 French clone. We show that screening for stx, eae, espK, and espV, in association with the CRISPRO26:H11 marker is a better approach to narrow down the EHEC screening step in beef enrichments. The number of potentially positive samples was reduced by 48.88% by means of this alternative strategy compared to the European and American reference methods, thus substantially improving the discriminatory power of EHEC screening systems. This approach is in line with the EFSA (European Food Safety Authority) opinion on pathogenic STEC published in 2013.
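The screening logic described above, keeping stx- and eae-positive enrichments for confirmation only if they also carry espK, espV or the CRISPRO26:H11 marker, can be sketched as a simple decision rule (the sample records below are hypothetical, and the rule is a simplification of the paper's full strategy):

def flag_for_confirmation(sample):
    # A beef enrichment goes forward to EHEC confirmation only if it is stx+ and eae+
    # and additionally positive for espK, espV or the CRISPRO26:H11 marker.
    if not (sample["stx"] and sample["eae"]):
        return False
    return sample["espK"] or sample["espV"] or sample["CRISPRO26:H11"]

samples = [
    {"stx": True, "eae": True, "espK": False, "espV": False, "CRISPRO26:H11": True},
    {"stx": True, "eae": True, "espK": False, "espV": False, "CRISPRO26:H11": False},
]
print([flag_for_confirmation(s) for s in samples])   # [True, False]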


Journal ArticleDOI
TL;DR: The atmospheric fallout of microplastics was investigated at two different urban and suburban sites, and a rough estimate indicates that between 3 and 10 tons of fibers are deposited by atmospheric fallout over the Parisian agglomeration every year.

Journal ArticleDOI
TL;DR: In this paper, a different method is proposed for analyzing experimental results and is used to reexamine experimental data taken from the literature; it appears that the generally used method is flawed and unfairly favors pseudo-second-order kinetics.
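For context (synthetic data, not from the paper), the two fitting routes at issue are direct non-linear regression of the pseudo-second-order model q(t) = k*qe^2*t / (1 + k*qe*t) and the widely used linearised t/q form; a brief sketch contrasting them:

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

def pso(t, qe, k):
    # standard pseudo-second-order adsorption kinetics model
    return k * qe**2 * t / (1 + k * qe * t)

t = np.linspace(2, 180, 25)
q = pso(t, qe=5.0, k=0.02) + 0.05 * np.random.default_rng(1).standard_normal(t.size)

# route 1: non-linear least squares on the original equation
(qe_nl, k_nl), _ = curve_fit(pso, t, q, p0=(4.0, 0.01))

# route 2: the popular linearisation t/q = 1/(k*qe^2) + t/qe
slope, intercept, *_ = linregress(t, t / q)
qe_lin, k_lin = 1 / slope, slope**2 / intercept
print(qe_nl, k_nl, qe_lin, k_lin)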

Journal ArticleDOI
TL;DR: The results support the hypothesis that rare coding variants can pinpoint causal genes within known genetic loci and illustrate that applying the approach systematically to detect new loci requires extremely large sample sizes.
Abstract: Advanced age-related macular degeneration (AMD) is the leading cause of blindness in the elderly, with limited therapeutic options. Here we report on a study of >12 million variants, including 163,714 directly genotyped, mostly rare, protein-altering variants. Analyzing 16,144 patients and 17,832 controls, we identify 52 independently associated common and rare variants (P < 5 × 10^-8) distributed across 34 loci. Although wet and dry AMD subtypes exhibit predominantly shared genetics, we identify the first genetic association signal specific to wet AMD, near MMP9 (difference P value = 4.1 × 10^-10). Very rare coding variants (frequency <0.1%) in CFH, CFI and TIMP3 suggest causal roles for these genes, as does a splice variant in SLC16A8. Our results support the hypothesis that rare coding variants can pinpoint causal genes within known genetic loci and illustrate that applying the approach systematically to detect new loci requires extremely large sample sizes.

Journal ArticleDOI
TL;DR: The COSMOS2015 catalog as mentioned in this paper contains precise photometric redshifts and stellar masses for more than half a million objects over the 2 deg^2 COSMOS field, which is highly optimized for the study of galaxy evolution and environments in the early universe.
Abstract: We present the COSMOS2015 catalog, which contains precise photometric redshifts and stellar masses for more than half a million objects over the 2 deg^2 COSMOS field. Including new YJHKs images from the UltraVISTA-DR2 survey, Y-band images from Subaru/Hyper-Suprime-Cam, and infrared data from the Spitzer Large Area Survey with the Hyper-Suprime-Cam Spitzer legacy program, this near-infrared-selected catalog is highly optimized for the study of galaxy evolution and environments in the early universe. To maximize catalog completeness for bluer objects and at higher redshifts, objects have been detected on a chi^2 sum of the YJHKs and z++ images. The catalog contains ~6 x 10^5 objects in the 1.5 deg^2 UltraVISTA-DR2 region and ~1.5 x 10^5 objects are detected in the "ultra-deep stripes" (0.62 deg^2) at Ks <= 24.7 (3 sigma, 3 arcsec, AB magnitude). Through a comparison with the zCOSMOS-bright spectroscopic redshifts, we measure a photometric redshift precision of sigma(Delta z/(1+zs)) = 0.007 and a catastrophic failure fraction of eta = 0.5%. At 3 < z < 6, using the unique database of spectroscopic redshifts in COSMOS, we find sigma(Delta z/(1+zs)) = 0.021 and eta = 13.2%. The deepest regions reach a 90% completeness limit of 10^10 M_sun to z = 4. Detailed comparisons of the color distributions, number counts, and clustering show excellent agreement with the literature in the same mass ranges. COSMOS2015 represents a unique, publicly available, valuable resource with which to investigate the evolution of galaxies within their environment back to the earliest stages of the history of the universe. The COSMOS2015 catalog is distributed via anonymous ftp and through the usual astronomical archive systems (CDS, ESO Phase 3, IRSA).
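The precision and catastrophic-failure numbers quoted above are conventionally computed from photometric versus spectroscopic redshift pairs with the normalised median absolute deviation (NMAD) and an outlier cut at |dz|/(1+z_spec) > 0.15; a minimal sketch assuming those standard definitions (synthetic redshifts, not catalogue data):

import numpy as np

def photoz_metrics(z_phot, z_spec):
    dz = (z_phot - z_spec) / (1 + z_spec)
    sigma_nmad = 1.48 * np.median(np.abs(dz - np.median(dz)))   # NMAD precision
    eta = np.mean(np.abs(dz) > 0.15)                            # catastrophic failure fraction
    return sigma_nmad, eta

rng = np.random.default_rng(0)
z_spec = rng.uniform(0.2, 1.2, 5000)
z_phot = z_spec + 0.007 * (1 + z_spec) * rng.standard_normal(5000)
print(photoz_metrics(z_phot, z_spec))    # approximately (0.007, 0.0)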

Journal ArticleDOI
TL;DR: The clinical presentation, epidemiology, pathophysiology, diagnosis, and acute management of iron deficiency anaemia, and outstanding research questions for treatment are discussed.

Journal ArticleDOI
TL;DR: The 2009 European League Against Rheumatism recommendations for the management of antineutrophil cytoplasmic antibody (ANCA)-associated vasculitis (AAV) have been updated and 15 recommendations were developed, covering general aspects, such as attaining remission.
Abstract: In this article, the 2009 European League Against Rheumatism (EULAR) recommendations for the management of antineutrophil cytoplasmic antibody (ANCA)-associated vasculitis (AAV) have been updated. The 2009 recommendations were on the management of primary small and medium vessel vasculitis. The 2015 update has been developed by an international task force representing EULAR, the European Renal Association and the European Vasculitis Society (EUVAS). The recommendations are based upon evidence from systematic literature reviews, as well as expert opinion where appropriate. The evidence presented was discussed and summarised by the experts in the course of a consensus-finding and voting process. Levels of evidence and grades of recommendations were derived and levels of agreement (strengths of recommendations) determined. In addition to the voting by the task force members, the relevance of the recommendations was assessed by an online voting survey among members of EUVAS. Fifteen recommendations were developed, covering general aspects, such as attaining remission and the need for shared decision making between clinicians and patients. More specific items relate to starting immunosuppressive therapy in combination with glucocorticoids to induce remission, followed by a period of remission maintenance; for remission induction in life-threatening or organ-threatening AAV, cyclophosphamide and rituximab are considered to have similar efficacy; plasma exchange which is recommended, where licensed, in the setting of rapidly progressive renal failure or severe diffuse pulmonary haemorrhage. These recommendations are intended for use by healthcare professionals, doctors in specialist training, medical students, pharmaceutical industries and drug regulatory organisations.

Journal ArticleDOI
TL;DR: This work reports on the observation of stable skyrmions in sputtered ultrathin Pt/Co/MgO nanostructures at room temperature and zero external magnetic field, substantiated by micromagnetic simulations and numerical models.
Abstract: Magnetic skyrmions are chiral spin structures with a whirling configuration. Their topological properties, nanometre size and the fact that they can be moved by small current densities have opened a new paradigm for the manipulation of magnetization at the nanoscale. Chiral skyrmion structures have so far been experimentally demonstrated only in bulk materials and in epitaxial ultrathin films, and under an external magnetic field or at low temperature. Here, we report on the observation of stable skyrmions in sputtered ultrathin Pt/Co/MgO nanostructures at room temperature and zero external magnetic field. We use high lateral resolution X-ray magnetic circular dichroism microscopy to image their chiral Neel internal structure, which we explain as due to the large strength of the Dzyaloshinskii–Moriya interaction as revealed by spin wave spectroscopy measurements. Our results are substantiated by micromagnetic simulations and numerical models, which allow the identification of the physical mechanisms governing the size and stability of the skyrmions.

Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Matthew Abernathy1 +984 more (116 institutions)
TL;DR: The data around the time of the event were analyzed coherently across the LIGO network using a suite of accurate waveform models that describe gravitational waves from a compact binary system in general relativity.
Abstract: On September 14, 2015, the Laser Interferometer Gravitational-wave Observatory (LIGO) detected a gravitational-wave transient (GW150914); we characterise the properties of the source and its parameters. The data around the time of the event were analysed coherently across the LIGO network using a suite of accurate waveform models that describe gravitational waves from a compact binary system in general relativity. GW150914 was produced by a nearly equal mass binary black hole of $36^{+5}_{-4} M_\odot$ and $29^{+4}_{-4} M_\odot$ (for each parameter we report the median value and the range of the 90% credible interval). The dimensionless spin magnitude of the more massive black hole is bound to be $0.7$ (at 90% probability). The luminosity distance to the source is $410^{+160}_{-180}$ Mpc, corresponding to a redshift $0.09^{+0.03}_{-0.04}$ assuming standard cosmology. The source location is constrained to an annulus section of $590$ deg$^2$, primarily in the southern hemisphere. The binary merges into a black hole of $62^{+4}_{-4} M_\odot$ and spin $0.67^{+0.05}_{-0.07}$. This black hole is significantly more massive than any other known in the stellar-mass regime.
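Taking the median masses quoted above at face value (simple subtraction of medians ignores the correlated uncertainties handled properly in the paper), roughly three solar masses were radiated away as gravitational waves:

M_sun = 1.989e30      # kg
c = 2.998e8           # m/s

m1, m2, m_final = 36.0, 29.0, 62.0        # median component and remnant masses, solar masses
m_radiated = (m1 + m2) - m_final          # about 3 solar masses
E_radiated = m_radiated * M_sun * c**2    # roughly 5e47 J
print(m_radiated, E_radiated)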

Journal ArticleDOI
Sergey Alekhin, Wolfgang Altmannshofer1, Takehiko Asaka2, Brian Batell3, Fedor Bezrukov4, Kyrylo Bondarenko5, Alexey Boyarsky5, Ki-Young Choi6, Cristóbal Corral7, Nathaniel Craig8, David Curtin9, Sacha Davidson10, Sacha Davidson11, André de Gouvêa12, Stefano Dell'Oro, Patrick deNiverville13, P. S. Bhupal Dev14, Herbi K. Dreiner15, Marco Drewes16, Shintaro Eijima17, Rouven Essig18, Anthony Fradette13, Björn Garbrecht16, Belen Gavela19, Gian F. Giudice3, Mark D. Goodsell20, Mark D. Goodsell21, Dmitry Gorbunov22, Stefania Gori1, Christophe Grojean23, Alberto Guffanti24, Thomas Hambye25, Steen Honoré Hansen24, Juan Carlos Helo7, Juan Carlos Helo26, Pilar Hernández27, Alejandro Ibarra16, Artem Ivashko5, Artem Ivashko28, Eder Izaguirre1, Joerg Jaeckel29, Yu Seon Jeong30, Felix Kahlhoefer, Yonatan Kahn31, Andrey Katz3, Andrey Katz32, Andrey Katz33, Choong Sun Kim30, Sergey Kovalenko7, Gordan Krnjaic1, Valery E. Lyubovitskij34, Valery E. Lyubovitskij35, Valery E. Lyubovitskij36, Simone Marcocci, Matthew McCullough3, David McKeen37, Guenakh Mitselmakher38, Sven Moch39, Rabindra N. Mohapatra9, David E. Morrissey40, Maksym Ovchynnikov28, Emmanuel A. Paschos, Apostolos Pilaftsis14, Maxim Pospelov13, Maxim Pospelov1, Mary Hall Reno41, Andreas Ringwald, Adam Ritz13, Leszek Roszkowski, Valery Rubakov, Oleg Ruchayskiy24, Oleg Ruchayskiy17, Ingo Schienbein42, Daniel Schmeier15, Kai Schmidt-Hoberg, Pedro Schwaller3, Goran Senjanovic43, Osamu Seto44, Mikhail Shaposhnikov17, Lesya Shchutska38, J. Shelton45, Robert Shrock18, Brian Shuve1, Michael Spannowsky46, Andrew Spray47, Florian Staub3, Daniel Stolarski3, Matt Strassler33, Vladimir Tello, Francesco Tramontano48, Anurag Tripathi, Sean Tulin49, Francesco Vissani, Martin Wolfgang Winkler15, Kathryn M. Zurek50, Kathryn M. Zurek51 
Perimeter Institute for Theoretical Physics1, Niigata University2, CERN3, University of Connecticut4, Leiden University5, Korea Astronomy and Space Science Institute6, Federico Santa María Technical University7, University of California, Santa Barbara8, University of Maryland, College Park9, University of Lyon10, Claude Bernard University Lyon 111, Northwestern University12, University of Victoria13, University of Manchester14, University of Bonn15, Technische Universität München16, École Polytechnique Fédérale de Lausanne17, Stony Brook University18, Autonomous University of Madrid19, University of Paris20, Centre national de la recherche scientifique21, Moscow Institute of Physics and Technology22, Autonomous University of Barcelona23, University of Copenhagen24, Université libre de Bruxelles25, University of La Serena26, University of Valencia27, Taras Shevchenko National University of Kyiv28, Heidelberg University29, Yonsei University30, Princeton University31, University of Geneva32, Harvard University33, Tomsk Polytechnic University34, University of Tübingen35, Tomsk State University36, University of Washington37, University of Florida38, University of Hamburg39, TRIUMF40, University of Iowa41, University of Grenoble42, International Centre for Theoretical Physics43, Hokkai Gakuen University44, University of Illinois at Urbana–Champaign45, Durham University46, University of Melbourne47, University of Naples Federico II48, York University49, Lawrence Berkeley National Laboratory50, University of California, Berkeley51
TL;DR: It is demonstrated that the SHiP experiment has a unique potential to discover new physics and can directly probe a number of solutions of beyond the standard model puzzles, such as neutrino masses, baryon asymmetry of the Universe, dark matter, and inflation.
Abstract: This paper describes the physics case for a new fixed target facility at CERN SPS. The SHiP (search for hidden particles) experiment is intended to hunt for new physics in the largely unexplored domain of very weakly interacting particles with masses below the Fermi scale, inaccessible to the LHC experiments, and to study tau neutrino physics. The same proton beam setup can be used later to look for decays of tau-leptons with lepton flavour number non-conservation, $\tau \to 3\mu $ and to search for weakly-interacting sub-GeV dark matter candidates. We discuss the evidence for physics beyond the standard model and describe interactions between new particles and four different portals—scalars, vectors, fermions or axion-like particles. We discuss motivations for different models, manifesting themselves via these interactions, and how they can be probed with the SHiP experiment and present several case studies. The prospects to search for relatively light SUSY and composite particles at SHiP are also discussed. We demonstrate that the SHiP experiment has a unique potential to discover new physics and can directly probe a number of solutions of beyond the standard model puzzles, such as neutrino masses, baryon asymmetry of the Universe, dark matter, and inflation.

Journal ArticleDOI
TL;DR: These recommendations provide stakeholders with an updated consensus on the pharmacological treatment of PsA and strategies to reach optimal outcomes in PsA, based on a combination of evidence and expert opinion.
Abstract: Background Since the publication of the European League Against Rheumatism recommendations for the pharmacological treatment of psoriatic arthritis (PsA) in 2012, new evidence and new therapeutic agents have emerged. The objective was to update these recommendations. Methods A systematic literature review was performed regarding pharmacological treatment in PsA. Subsequently, recommendations were formulated based on the evidence and the expert opinion of the 34 Task Force members. Levels of evidence and strengths of recommendations were allocated. Results The updated recommendations comprise 5 overarching principles and 10 recommendations, covering pharmacological therapies for PsA from non-steroidal anti-inflammatory drugs (NSAIDs), to conventional synthetic (csDMARD) and biological (bDMARD) disease-modifying antirheumatic drugs, whatever their mode of action, taking articular and extra-articular manifestations of PsA into account, but focusing on musculoskeletal involvement. The overarching principles address the need for shared decision-making and treatment objectives. The recommendations address csDMARDs as an initial therapy after failure of NSAIDs and local therapy for active disease, followed, if necessary, by a bDMARD or a targeted synthetic DMARD (tsDMARD). The first bDMARD would usually be a tumour necrosis factor (TNF) inhibitor. bDMARDs targeting interleukin (IL)12/23 (ustekinumab) or IL-17 pathways (secukinumab) may be used in patients for whom TNF inhibitors are inappropriate and a tsDMARD such as a phosphodiesterase 4-inhibitor (apremilast) if bDMARDs are inappropriate. If the first bDMARD strategy fails, any other bDMARD or tsDMARD may be used. Conclusions These recommendations provide stakeholders with an updated consensus on the pharmacological treatment of PsA and strategies to reach optimal outcomes in PsA, based on a combination of evidence and expert opinion.

Journal ArticleDOI
TL;DR: The MorphoLibJ library proposes a large collection of generic tools based on MM to process binary and grey-level 2D and 3D images, integrated into user-friendly plugins.
Abstract: Motivation: Mathematical morphology (MM) provides many powerful operators for processing 2D and 3D images. However, most MM plugins currently implemented for the popular ImageJ/Fiji platform are limited to the processing of 2D images. Results: The MorphoLibJ library proposes a large collection of generic tools based on MM to process binary and grey-level 2D and 3D images, integrated into user-friendly plugins. We illustrate how MorphoLibJ can facilitate the exploitation of 3D images of plant tissues. Availability and Implementation: MorphoLibJ is freely available at http://imagej.net/MorphoLibJ
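MorphoLibJ itself is a Java plugin suite driven from ImageJ/Fiji; purely as a stand-in to illustrate the kind of grey-level and binary 3D morphology it provides (this uses scipy.ndimage, not the MorphoLibJ API, and the volume is a random placeholder):

import numpy as np
from scipy import ndimage

volume = np.random.default_rng(0).random((64, 64, 32))   # placeholder 3D grey-level image
selem = np.ones((3, 3, 3), dtype=bool)                    # 3D structuring element

# grey-level opening: erosion then dilation, removing bright structures smaller than selem
opened = ndimage.grey_opening(volume, footprint=selem)

# binary morphology applies the same way to segmented masks (e.g. plant tissue labels)
mask = volume > 0.7
cleaned = ndimage.binary_opening(mask, structure=selem)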


Journal ArticleDOI
09 Sep 2016-Science
TL;DR: This work identifies six biological mechanisms that commonly shape responses to climate change yet are too often missing from current predictive models and prioritize the types of information needed to inform each of these mechanisms, and suggests proxies for data that are missing or difficult to collect.
Abstract: BACKGROUND As global climate change accelerates, one of the most urgent tasks for the coming decades is to develop accurate predictions about biological responses to guide the effective protection of biodiversity. Predictive models in biology provide a means for scientists to project changes to species and ecosystems in response to disturbances such as climate change. Most current predictive models, however, exclude important biological mechanisms such as demography, dispersal, evolution, and species interactions. These biological mechanisms have been shown to be important in mediating past and present responses to climate change. Thus, current modeling efforts do not provide sufficiently accurate predictions. Despite the many complexities involved, biologists are rapidly developing tools that include the key biological processes needed to improve predictive accuracy. The biggest obstacle to applying these more realistic models is that the data needed to inform them are almost always missing. We suggest ways to fill this growing gap between model sophistication and information to predict and prevent the most damaging aspects of climate change for life on Earth. ADVANCES On the basis of empirical and theoretical evidence, we identify six biological mechanisms that commonly shape responses to climate change yet are too often missing from current predictive models: physiology; demography, life history, and phenology; species interactions; evolutionary potential and population differentiation; dispersal, colonization, and range dynamics; and responses to environmental variation. We prioritize the types of information needed to inform each of these mechanisms and suggest proxies for data that are missing or difficult to collect. We show that even for well-studied species, we often lack critical information that would be necessary to apply more realistic, mechanistic models. Consequently, data limitations likely override the potential gains in accuracy of more realistic models. Given the enormous challenge of collecting this detailed information on millions of species around the world, we highlight practical methods that promote the greatest gains in predictive accuracy. Trait-based approaches leverage sparse data to make more general inferences about unstudied species. Targeting species with high climate sensitivity and disproportionate ecological impact can yield important insights about future ecosystem change. Adaptive modeling schemes provide a means to target the most important data while simultaneously improving predictive accuracy. OUTLOOK Strategic collections of essential biological information will allow us to build generalizable insights that inform our broader ability to anticipate species’ responses to climate change and other human-caused disturbances. By increasing accuracy and making uncertainties explicit, scientists can deliver improved projections for biodiversity under climate change together with characterizations of uncertainty to support more informed decisions by policymakers and land managers. Toward this end, a globally coordinated effort to fill data gaps in advance of the growing climate-fueled biodiversity crisis offers substantial advantages in efficiency, coverage, and accuracy. Biologists can take advantage of the lessons learned from the Intergovernmental Panel on Climate Change’s development, coordination, and integration of climate change projections. 
Climate and weather projections were greatly improved by incorporating important mechanisms and testing predictions against global weather station data. Biology can do the same. We need to adopt this meteorological approach to predicting biological responses to climate change to enhance our ability to mitigate future changes to global biodiversity and the services it provides to humans.

Journal ArticleDOI
Kyle S. Dawson1, Jean-Paul Kneib2, Will J. Percival3, Shadab Alam4 +155 more (51 institutions)
TL;DR: The Extended Baryon Oscillation Spectroscopic Survey (eBOSS) as mentioned in this paper uses four different tracers of the underlying matter density field to expand the volume covered by BOSS and map the large-scale structures over the relatively unconstrained redshift range 0.6 0.87.
Abstract: In a six-year program started in 2014 July, the Extended Baryon Oscillation Spectroscopic Survey (eBOSS) will conduct novel cosmological observations using the BOSS spectrograph at Apache Point Observatory. These observations will be conducted simultaneously with the Time Domain Spectroscopic Survey (TDSS) designed for variability studies and the Spectroscopic Identification of eROSITA Sources (SPIDERS) program designed for studies of X-ray sources. In particular, eBOSS will measure with percent-level precision the distance-redshift relation with baryon acoustic oscillations (BAO) in the clustering of matter. eBOSS will use four different tracers of the underlying matter density field to vastly expand the volume covered by BOSS and map the large-scale-structures over the relatively unconstrained redshift range 0.6 0.6 sample of BOSS galaxies. With ~195,000 new emission line galaxy redshifts, we expect BAO measurements of d_A(z) to an accuracy of 3.1% and H(z) to 4.7% at an effective redshift of z = 0.87. A sample of more than 500,000 spectroscopically confirmed quasars will provide the first BAO distance measurements over the redshift range 0.9 < z < 2.1; these new data will enhance the precision of dA(z) and H(z) at z > 2.1 by a factor of 1.44 relative to BOSS. Furthermore, eBOSS will provide improved tests of General Relativity on cosmological scales through redshift-space distortion measurements, improved tests for non-Gaussianity in the primordial density field, and new constraints on the summed mass of all neutrino species. Here, we provide an overview of the cosmological goals, spectroscopic target sample, demonstration of spectral quality from early data, and projected cosmological constraints from eBOSS.

Journal ArticleDOI
Bonnie R. Joubert1, Janine F. Felix2, Paul Yousefi3, Kelly M. Bakulski4, Allan C. Just5, Carrie V. Breton6, Sarah E. Reese1, Christina A. Markunas1, Christina A. Markunas7, Rebecca C Richmond8, Cheng-Jian Xu9, Leanne K. Küpers9, Sam S. Oh10, Cathrine Hoyo11, Olena Gruzieva12, Cilla Söderhäll12, Lucas A. Salas13, Nour Baïz14, Hongmei Zhang15, Johanna Lepeule16, Carlos Ruiz13, Symen Ligthart2, Tianyuan Wang1, Jack A. Taylor1, Liesbeth Duijts, Gemma C Sharp8, Soesma A Jankipersadsing9, Roy Miodini Nilsen17, Ahmad Vaez9, Ahmad Vaez18, M. Daniele Fallin4, Donglei Hu10, Augusto A. Litonjua19, Bernard F. Fuemmeler7, Karen Huen3, Juha Kere12, Inger Kull12, Monica Cheng Munthe-Kaas20, Ulrike Gehring21, Mariona Bustamante, Marie José Saurel-Coubizolles22, Bilal M. Quraishi15, Jie Ren6, Jörg Tost, Juan R. González13, Marjolein J. Peters2, Siri E. Håberg23, Zongli Xu1, Joyce B. J. van Meurs2, Tom R. Gaunt8, Marjan Kerkhof9, Eva Corpeleijn9, Andrew P. Feinberg24, Celeste Eng10, Andrea A. Baccarelli25, Sara E. Benjamin Neelon4, Asa Bradman3, Simon Kebede Merid12, Anna Bergström12, Zdenko Herceg26, Hector Hernandez-Vargas26, Bert Brunekreef21, Mariona Pinart, Barbara Heude27, Susan Ewart28, Jin Yao6, Nathanaël Lemonnier29, Oscar H. Franco2, Michael C. Wu30, Albert Hofman2, Albert Hofman25, Wendy L. McArdle8, Pieter van der Vlies9, Fahimeh Falahi9, Matthew W. Gillman25, Lisa F. Barcellos3, Ashok Kumar31, Ashok Kumar12, Ashok Kumar32, Magnus Wickman33, Magnus Wickman12, Stefano Guerra, Marie-Aline Charles27, John W. Holloway34, Charles Auffray29, Henning Tiemeier2, George Davey Smith8, Dirkje S. Postma9, Marie-France Hivert25, Brenda Eskenazi3, Martine Vrijheid13, Hasan Arshad34, Josep M. Antó, Abbas Dehghan2, Wilfried Karmaus15, Isabella Annesi-Maesano14, Jordi Sunyer, Akram Ghantous26, Göran Pershagen12, Nina Holland3, Susan K. Murphy7, Dawn L. DeMeo19, Esteban G. Burchard10, Christine Ladd-Acosta4, Harold Snieder9, Wenche Nystad23, Gerard H. Koppelman9, Caroline L Relton8, Vincent W. V. Jaddoe2, Allen J. Wilcox1, Erik Melén33, Erik Melén12, Stephanie J. London1 
TL;DR: This large scale meta-analysis of methylation data identified numerous loci involved in response to maternal smoking in pregnancy with persistence into later childhood and provides insights into mechanisms underlying effects of this important exposure.
Abstract: Epigenetic modifications, including DNA methylation, represent a potential mechanism for environmental impacts on human disease. Maternal smoking in pregnancy remains an important public health problem that impacts child health in a myriad of ways and has potential lifelong consequences. The mechanisms are largely unknown, but epigenetics most likely plays a role. We formed the Pregnancy And Childhood Epigenetics (PACE) consortium and meta-analyzed, across 13 cohorts (n = 6,685), the association between maternal smoking in pregnancy and newborn blood DNA methylation at over 450,000 CpG sites (CpGs) by using the Illumina 450K BeadChip. Over 6,000 CpGs were differentially methylated in relation to maternal smoking at genome-wide statistical significance (false discovery rate, 5%), including 2,965 CpGs corresponding to 2,017 genes not previously related to smoking and methylation in either newborns or adults. Several genes are relevant to diseases that can be caused by maternal smoking (e.g., orofacial clefts and asthma) or adult smoking (e.g., certain cancers). A number of differentially methylated CpGs were associated with gene expression. We observed enrichment in pathways and processes critical to development. In older children (5 cohorts, n = 3,187), 100% of CpGs gave at least nominal levels of significance, far more than expected by chance (p value < 2.2 × 10(-16)). Results were robust to different normalization methods used across studies and cell type adjustment. In this large scale meta-analysis of methylation data, we identified numerous loci involved in response to maternal smoking in pregnancy with persistence into later childhood and provide insights into mechanisms underlying effects of this important exposure.

Journal ArticleDOI
TL;DR: The existence of an optimal MAPE model and the universal consistency of Empirical Risk Minimization based on the MAPE are proved; finding the best model under the MAPE is shown to be equivalent to performing weighted Mean Absolute Error regression, and this weighting strategy is applied to kernel regression.
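The stated equivalence is easy to see numerically: for a fixed dataset, the MAPE of any predictions equals a 1/|y|-weighted Mean Absolute Error times a constant that does not depend on the model, so both criteria share the same minimiser (synthetic data below, purely illustrative):

import numpy as np

rng = np.random.default_rng(0)
y = rng.uniform(1.0, 10.0, size=500)                  # targets bounded away from zero
pred = y * (1 + 0.1 * rng.standard_normal(500))       # some candidate predictions

w = 1.0 / np.abs(y)
mape = np.mean(np.abs(y - pred) / np.abs(y))
weighted_mae = np.average(np.abs(y - pred), weights=w)    # sum(w*|err|) / sum(w)

assert np.isclose(mape, weighted_mae * np.mean(w))    # identical up to a model-independent factor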

Journal ArticleDOI
TL;DR: A group of experts convened to develop clinical, radiological and microbiological guidelines for the management of chronic pulmonary aspergillosis and concluded that long-term oral antifungal therapy is recommended for CCPA to improve overall health status and respiratory symptoms, arrest haemoptysis and prevent progression.
Abstract: Chronic pulmonary aspergillosis (CPA) is an uncommon and problematic pulmonary disease, complicating many other respiratory disorders, thought to affect ~240 000 people in Europe. The most common form of CPA is chronic cavitary pulmonary aspergillosis (CCPA), which untreated may progress to chronic fibrosing pulmonary aspergillosis. Less common manifestations include: Aspergillus nodule and single aspergilloma. All these entities are found in non-immunocompromised patients with prior or current lung disease. Subacute invasive pulmonary aspergillosis (formerly called chronic necrotising pulmonary aspergillosis) is a more rapidly progressive infection (<3 months) usually found in moderately immunocompromised patients, which should be managed as invasive aspergillosis. Few clinical guidelines have been previously proposed for either diagnosis or management of CPA. A group of experts convened to develop clinical, radiological and microbiological guidelines. The diagnosis of CPA requires a combination of characteristics: one or more cavities with or without a fungal ball present or nodules on thoracic imaging, direct evidence of Aspergillus infection (microscopy or culture from biopsy) or an immunological response to Aspergillus spp. and exclusion of alternative diagnoses, all present for at least 3 months. Aspergillus antibody (precipitins) is elevated in over 90% of patients. Surgical excision of simple aspergilloma is recommended, if technically possible, and preferably via video-assisted thoracic surgery technique. Long-term oral antifungal therapy is recommended for CCPA to improve overall health status and respiratory symptoms, arrest haemoptysis and prevent progression. Careful monitoring of azole serum concentrations, drug interactions and possible toxicities is recommended. Haemoptysis may be controlled with tranexamic acid and bronchial artery embolisation, rarely surgical resection, and may be a sign of therapeutic failure and/or antifungal resistance. Patients with single Aspergillus nodules only need antifungal therapy if not fully resected, but if multiple they may benefit from antifungal treatment, and require careful follow-up.