Showing papers by "University of Bordeaux published in 2016"
••
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes.
For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy.
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.
5,187 citations
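The abstract's central distinction, counting autophagosomes versus measuring flux, maps onto a standard quantitative readout that the full guidelines discuss: the LC3-II turnover assay, which compares LC3-II levels with and without a lysosomal inhibitor. The sketch below illustrates that idea only; the function name and all numbers are hypothetical, not taken from the paper.

```python
# Sketch of the LC3-II turnover idea behind "autophagic flux": flux is
# inferred from how much LC3-II accumulates when lysosomal degradation is
# blocked (e.g., with bafilomycin A1), not from the steady-state LC3-II
# level alone. All values are hypothetical arbitrary units.

def lc3_flux(lc3_ii_untreated, lc3_ii_inhibited):
    """Inferred flux: the LC3-II that would have been degraded."""
    return lc3_ii_inhibited - lc3_ii_untreated

# Two hypothetical conditions with the SAME steady-state LC3-II (1.0 a.u.):
high_flux = lc3_flux(lc3_ii_untreated=1.0, lc3_ii_inhibited=3.0)  # active autophagy
low_flux  = lc3_flux(lc3_ii_untreated=1.0, lc3_ii_inhibited=1.1)  # trafficking block

print(high_flux, low_flux)  # identical autophagosome levels, very different flux
```

This is exactly the abstract's point that "more autophagosomes does not necessarily equate with more autophagy": the untreated measurements are identical, yet the inferred flux differs twenty-fold.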
••
[...]
T. Prusti1, J. H. J. de Bruijne1, Anthony G. A. Brown2, Antonella Vallenari3 +621 more•Institutions (93)
TL;DR: Gaia as discussed by the authors is a cornerstone mission in the science programme of the European Space Agency (ESA). The spacecraft construction was approved in 2006, following a study in which the original interferometric concept was changed to a direct-imaging approach.
Abstract: Gaia is a cornerstone mission in the science programme of the European Space Agency (ESA). The spacecraft construction was approved in 2006, following a study in which the original interferometric concept was changed to a direct-imaging approach. Both the spacecraft and the payload were built by European industry. The involvement of the scientific community focusses on data processing for which the international Gaia Data Processing and Analysis Consortium (DPAC) was selected in 2007. Gaia was launched on 19 December 2013 and arrived at its operating point, the second Lagrange point of the Sun-Earth-Moon system, a few weeks later. The commissioning of the spacecraft and payload was completed on 19 July 2014. The nominal five-year mission started with four weeks of special, ecliptic-pole scanning and subsequently transferred into full-sky scanning mode. We recall the scientific goals of Gaia and give a description of the as-built spacecraft that is currently (mid-2016) being operated to achieve these goals. We pay special attention to the payload module, the performance of which is closely related to the scientific performance of the mission. We provide a summary of the commissioning activities and findings, followed by a description of the routine operational mode. We summarise scientific performance estimates on the basis of in-orbit operations. Several intermediate Gaia data releases are planned and the data can be retrieved from the Gaia Archive, which is available through the Gaia home page.
5,164 citations
••
TL;DR: In this article, the authors used a Bayesian hierarchical model to estimate trends in diabetes prevalence, defined as fasting plasma glucose of 7.0 mmol/L or higher, or history of diagnosis with diabetes, or use of insulin or oral hypoglycaemic drugs in 200 countries and territories in 21 regions, by sex and from 1980 to 2014.
2,782 citations
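The case definition in the TL;DR above (fasting plasma glucose of 7.0 mmol/L or higher, or a history of diagnosis, or use of insulin or oral hypoglycaemic drugs) translates directly into a classification rule. The function and argument names below are illustrative, not from the study.

```python
FPG_THRESHOLD_MMOL_L = 7.0  # fasting plasma glucose cutoff from the study's definition

def has_diabetes(fpg_mmol_l, diagnosed, on_glucose_lowering_drugs):
    """Diabetes case definition as summarised above: FPG >= 7.0 mmol/L,
    OR a history of diagnosis, OR insulin / oral hypoglycaemic drug use.
    Any one criterion suffices."""
    return (fpg_mmol_l >= FPG_THRESHOLD_MMOL_L
            or diagnosed
            or on_glucose_lowering_drugs)

print(has_diabetes(7.5, False, False))  # meets the glucose criterion alone
print(has_diabetes(5.2, True, False))   # diagnosed, so counted despite normal FPG
```

Note the OR structure: a treated, well-controlled patient with normal fasting glucose still counts as a case, which matters when comparing prevalence across countries with different treatment rates.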
••
University of Manchester1, KEK2, CERN3, Complutense University of Madrid4, SLAC National Accelerator Laboratory5, Toyama College6, Lebedev Physical Institute7, Fermilab8, University of Paris-Sud9, Lawrence Livermore National Laboratory10, National Research Nuclear University MEPhI11, Queen's University Belfast12, Korea Institute of Science and Technology Information13, Istituto Nazionale di Fisica Nucleare14, Northeastern University15, University of Seville16, National University of Cordoba17, Saint Joseph University18, Joint Institute for Nuclear Research19, Illawarra Health & Medical Research Institute20, University of Wollongong21, Hampton University22, TRIUMF23, ETH Zurich24, Centre national de la recherche scientifique25, University of Bordeaux26, University of Helsinki27, Johns Hopkins University School of Medicine28, National Technical University of Athens29, University of Notre Dame30, Ashikaga Institute of Technology31, Kobe University32, Intelligence and National Security Alliance33, University of Trieste34, University of Warwick35, University of Belgrade36, Instituto Superior Técnico37, European Space Agency38, Varian Medical Systems39, George Washington University40, Ritsumeikan University41, Ton Duc Thang University42, Université Paris-Saclay43, Idaho State University44, Naruto University of Education45
01 Nov 2016 - Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment
TL;DR: Geant4 as discussed by the authors is a software toolkit for the simulation of the passage of particles through matter, which is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection.
Abstract: Geant4 is a software toolkit for the simulation of the passage of particles through matter. It is used by a large number of experiments and projects in a variety of application domains, including high energy physics, astrophysics and space science, medical physics and radiation protection. Over the past several years, major changes have been made to the toolkit in order to accommodate the needs of these user communities, and to efficiently exploit the growth of computing power made available by advances in technology. The adaptation of Geant4 to multithreading, advances in physics, detector modeling and visualization, extensions to the toolkit, including biasing and reverse Monte Carlo, and tools for physics and release validation are discussed here.
2,260 citations
••
Anthony G. A. Brown1, Antonella Vallenari2, T. Prusti2, J. H. J. de Bruijne3 +587 more•Institutions (89)
TL;DR: The first Gaia data release, Gaia DR1 as discussed by the authors, consists of three components: a primary astrometric data set which contains the positions, parallaxes, and mean proper motions for about 2 million of the brightest stars in common with the Hipparcos and Tycho-2 catalogues.
Abstract: Context. At about 1000 days after the launch of Gaia we present the first Gaia data release, Gaia DR1, consisting of astrometry and photometry for over 1 billion sources brighter than magnitude 20.7. Aims: A summary of Gaia DR1 is presented along with illustrations of the scientific quality of the data, followed by a discussion of the limitations due to the preliminary nature of this release. Methods: The raw data collected by Gaia during the first 14 months of the mission have been processed by the Gaia Data Processing and Analysis Consortium (DPAC) and turned into an astrometric and photometric catalogue. Results: Gaia DR1 consists of three components: a primary astrometric data set which contains the positions, parallaxes, and mean proper motions for about 2 million of the brightest stars in common with the Hipparcos and Tycho-2 catalogues - a realisation of the Tycho-Gaia Astrometric Solution (TGAS) - and a secondary astrometric data set containing the positions for an additional 1.1 billion sources. The second component is the photometric data set, consisting of mean G-band magnitudes for all sources. The G-band light curves and the characteristics of 3000 Cepheid and RR Lyrae stars, observed at high cadence around the south ecliptic pole, form the third component. For the primary astrometric data set the typical uncertainty is about 0.3 mas for the positions and parallaxes, and about 1 mas yr-1 for the proper motions. A systematic component of 0.3 mas should be added to the parallax uncertainties. For the subset of 94 000 Hipparcos stars in the primary data set, the proper motions are much more precise at about 0.06 mas yr-1. For the secondary astrometric data set, the typical uncertainty of the positions is 10 mas. The median uncertainties on the mean G-band magnitudes range from the mmag level to 0.03 mag over the magnitude range 5 to 20.7.
Conclusions: Gaia DR1 is an important milestone ahead of the next Gaia data release, which will feature five-parameter astrometry for all sources. Extensive validation shows that Gaia DR1 represents a major advance in the mapping of the heavens and the availability of basic stellar data that underpin observational astrophysics. Nevertheless, the very preliminary nature of this first Gaia data release does lead to a number of important limitations to the data quality which should be carefully considered before drawing conclusions from the data.
2,174 citations
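The abstract quotes a typical random parallax uncertainty of about 0.3 mas for TGAS, plus a 0.3 mas systematic component that "should be added". A common way users combine these is in quadrature (an assumption here; the abstract does not specify the combination rule), after which distances follow from the usual inversion d[pc] = 1000 / parallax[mas]. A minimal sketch:

```python
import math

def total_parallax_error_mas(sigma_random_mas, sigma_systematic_mas=0.3):
    # Assumption: combine the quoted random and systematic components in
    # quadrature; the abstract only says the systematic term "should be
    # added" without specifying how.
    return math.hypot(sigma_random_mas, sigma_systematic_mas)

def distance_pc(parallax_mas):
    # Naive inversion, reasonable only when the fractional parallax error
    # is small; otherwise a proper probabilistic treatment is needed.
    return 1000.0 / parallax_mas

sigma = total_parallax_error_mas(0.3)  # a typical TGAS star
print(round(sigma, 2))                 # ≈ 0.42 mas combined uncertainty
print(distance_pc(10.0))               # a 10 mas parallax -> 100 pc
```

For TGAS this means the systematic term dominates the error budget for the best-measured stars, which is why the release notes caution against over-interpreting individual parallaxes.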
••
National University of Cordoba1, Max Planck Society2, VU University Amsterdam3, Macquarie University4, University of Grenoble5, University of Lyon6, Leipzig University7, Industrial University of Santander8, University of Oldenburg9, Imperial College London10, University of Montpellier11, Forschungszentrum Jülich12, University of Minnesota13, University of Western Sydney14, University of New South Wales15, Royal Botanic Gardens16, George Washington University17, Missouri Botanical Garden18, Paul Sabatier University19, Smithsonian Tropical Research Institute20, Komarov Botanical Institute21, University of Bordeaux22, Institut national de la recherche agronomique23, Florida International University24, University of Insubria25, University of Milan26, Université de Sherbrooke27, Centro Agronómico Tropical de Investigación y Enseñanza28
TL;DR: Analysis of worldwide variation in six major traits critical to growth, survival and reproduction within the largest sample of vascular plant species ever compiled found that occupancy of six-dimensional trait space is strongly concentrated, indicating coordination and trade-offs.
Abstract: The authors found that the key elements of plant form and function, analysed at global scale, are largely concentrated into a two-dimensional plane indexed by the size of whole plants and organs on the one hand, and the construction costs for photosynthetic leaf area, on the other.
1,814 citations
••
James Bentham1, Mariachiara Di Cesare2, Mariachiara Di Cesare1, Gretchen A Stevens3 +787 more•Institutions (246)
TL;DR: The height differential between the tallest and shortest populations was 19-20 cm a century ago, and has remained the same for women and increased for men a century later despite substantial changes in the ranking of countries.
Abstract: Being taller is associated with enhanced longevity, and higher education and earnings. We reanalysed 1472 population-based studies, with measurement of height on more than 18.6 million participants to estimate mean height for people born between 1896 and 1996 in 200 countries. The largest gain in adult height over the past century has occurred in South Korean women and Iranian men, who became 20.2 cm (95% credible interval 17.5–22.7) and 16.5 cm (13.3–19.7) taller, respectively. In contrast, there was little change in adult height in some sub-Saharan African countries and in South Asia over the century of analysis. The tallest people over these 100 years are men born in the Netherlands in the last quarter of the 20th century, whose average heights surpassed 182.5 cm, and the shortest were women born in Guatemala in 1896 (140.3 cm; 135.8–144.8). The height differential between the tallest and shortest populations was 19-20 cm a century ago, and has remained the same for women and increased for men a century later despite substantial changes in the ranking of countries.
1,348 citations
••
Pierre-and-Marie-Curie University1, AXA2, University of British Columbia3, VU University Medical Center4, University of Southern California5, University of Toulouse6, ICM Partners7, French Institute of Health and Medical Research8, Imperial College London9, University of Lübeck10, Sahlgrenska University Hospital11, Federal Institute for Drugs and Medical Devices12, UCL Institute of Neurology13, University of Bordeaux14, University of Geneva15, McGill University16, University of Paris17, University of Washington18, Karolinska University Hospital19, University of Eastern Finland20, University of North Texas Health Science Center21, University of California, San Francisco22, University of Melbourne23, Brown University24, Harvard University25, Brigham and Women's Hospital26, Alzheimer's Association27, Lou Ruvo Brain Institute28, Mayo Clinic29
TL;DR: An updated review of the literature and evidence on the definitions and lexicon, the limits, the natural history, the markers of progression, and the ethical consequence of detecting the disease at this asymptomatic stage of Alzheimer's disease are provided.
Abstract: During the past decade, a conceptual shift occurred in the field of Alzheimer's disease (AD) considering the disease as a continuum. Thanks to evolving biomarker research and substantial discoveries, it is now possible to identify the disease even at the preclinical stage before the occurrence of the first clinical symptoms. This preclinical stage of AD has become a major research focus as the field postulates that early intervention may offer the best chance of therapeutic success. To date, very little evidence is established on this "silent" stage of the disease. A clarification is needed about the definitions and lexicon, the limits, the natural history, the markers of progression, and the ethical consequence of detecting the disease at this asymptomatic stage. This article addresses each of these issues in turn, providing an updated review of the literature and evidence for each, along with practical recommendations.
1,235 citations
••
TL;DR: The frequency of genetic alterations, acceptable turnaround times in obtaining analysis results, and the clinical advantage provided by detection of a genetic alteration suggest that this nationwide molecular profiling of patients with advanced NSCLC provides a clinical benefit.
757 citations
••
TL;DR: The incidence of dementia has declined among participants in the Framingham Heart Study and the prevalence of most vascular risk factors and the risk of dementia associated with stroke, atrial fibrillation, or heart failure have decreased over time, but none of these trends completely explain the decrease in the incidence.
Abstract: BackgroundThe prevalence of dementia is expected to soar as the average life expectancy increases, but recent estimates suggest that the age-specific incidence of dementia is declining in high-income countries. Temporal trends are best derived through continuous monitoring of a population over a long period with the use of consistent diagnostic criteria. We describe temporal trends in the incidence of dementia over three decades among participants in the Framingham Heart Study. MethodsParticipants in the Framingham Heart Study have been under surveillance for incident dementia since 1975. In this analysis, which included 5205 persons 60 years of age or older, we used Cox proportional-hazards models adjusted for age and sex to determine the 5-year incidence of dementia during each of four epochs. We also explored the interactions between epoch and age, sex, apolipoprotein E e4 status, and educational level, and we examined the effects of these interactions, as well as the effects of vascular risk factors a...
734 citations
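The Framingham analysis reports the 5-year incidence of dementia within each of four epochs. The study itself used Cox proportional-hazards models adjusted for age and sex; the sketch below computes only the simpler crude incidence rate per person-years, with hypothetical numbers, to illustrate the quantity whose decline is being tracked.

```python
def incidence_rate_per_1000py(events, person_years):
    """Crude incidence rate per 1,000 person-years. This is NOT the
    study's adjusted Cox estimate, just the unadjusted quantity, and the
    numbers fed in below are hypothetical."""
    return 1000.0 * events / person_years

# Hypothetical epochs showing a declining trend of the kind reported:
epochs = {"epoch 1 (late 1970s)": (120, 20000),
          "epoch 4 (2010s)":      (70, 20000)}
for name, (events, py) in epochs.items():
    print(name, incidence_rate_per_1000py(events, py))  # per 1,000 person-years
```

Comparing crude rates across epochs would confound the decline with population ageing, which is precisely why the study adjusts for age and sex before comparing epochs.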
••
Max Planck Society1, University of Tübingen2, Howard Hughes Medical Institute3, Harvard University4, Broad Institute5, University College Dublin6, University of Coimbra7, University of Adelaide8, Russian Academy of Sciences9, Altai State University10, University of Pisa11, University of Bari12, University of Cantabria13, University of New Mexico14, Austrian Academy of Sciences15, Naturhistorisches Museum16, University of Vienna17, University of Ferrara18, University of Florence19, University of Siena20, Centre national de la recherche scientifique21, University of Bucharest22, California State University, Northridge23, University of Bordeaux24, University of Toulouse25, Royal Belgian Institute of Natural Sciences26, Academy of Sciences of the Czech Republic27, Masaryk University28
TL;DR: In this article, the authors analyse genome-wide data from 51 Eurasians from ~45,000-7,000 years ago and find that the proportion of Neanderthal DNA decreased from 3-6% to around 2%, consistent with natural selection against Neanderthal variants in modern humans.
Abstract: Modern humans arrived in Europe ~45,000 years ago, but little is known about their genetic composition before the start of farming ~8,500 years ago. Here we analyse genome-wide data from 51 Eurasians from ~45,000-7,000 years ago. Over this time, the proportion of Neanderthal DNA decreased from 3-6% to around 2%, consistent with natural selection against Neanderthal variants in modern humans. Whereas there is no evidence of the earliest modern humans in Europe contributing to the genetic composition of present-day Europeans, all individuals between ~37,000 and ~14,000 years ago descended from a single founder population which forms part of the ancestry of present-day Europeans. An ~35,000-year-old individual from northwest Europe represents an early branch of this founder population which was then displaced across a broad region, before reappearing in southwest Europe at the height of the last Ice Age ~19,000 years ago. During the major warming period after ~14,000 years ago, a genetic component related to present-day Near Easterners became widespread in Europe. These results document how population turnover and migration have been recurring themes of European prehistory.
••
TL;DR: This multicenter randomized study shows that CA of AF is superior to AMIO in achieving freedom from AF at long-term follow-up and reducing unplanned hospitalization and mortality in patients with heart failure and persistent AF.
Abstract: Background—Whether catheter ablation (CA) is superior to amiodarone (AMIO) for the treatment of persistent atrial fibrillation (AF) in patients with heart failure is unknown. Methods and Results—This was an open-label, randomized, parallel-group, multicenter study. Patients with persistent AF, dual-chamber implantable cardioverter defibrillator or cardiac resynchronization therapy defibrillator, New York Heart Association II to III, and left ventricular ejection fraction <40% within the past 6 months were randomly assigned (1:1 ratio) to undergo CA for AF (group 1, n=102) or receive AMIO (group 2, n=101). Recurrence of AF was the primary end point. All-cause mortality and unplanned hospitalization were the secondary end points. Patients were followed up for a minimum of 24 months. At the end of follow-up, 71 (70%; 95% confidence interval, 60%–78%) patients in group 1 were recurrence free after an average of 1.4±0.6 procedures in comparison with 34 (34%; 95% confidence interval, 25%–44%) in group 2 (log-ra...
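The quoted confidence interval for group 1, 71 of 102 patients recurrence free, i.e. 70% (95% CI 60%-78%), can be reproduced with a Wilson score interval. The abstract does not state which interval method the trial used; Wilson is one standard choice for binomial proportions and happens to match the group 1 figures here.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion (one standard
    method; the trial's exact CI method is not stated in this summary)."""
    p = successes / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (centre - half) / denom, (centre + half) / denom

lo, hi = wilson_ci(71, 102)         # group 1: 71/102 recurrence free
print(round(lo, 2), round(hi, 2))   # 0.6 0.78 — matches the quoted 60%-78%
```

Unlike the naive Wald interval, the Wilson interval stays inside [0, 1] and remains reasonable at proportions near the boundaries, which is why it is often preferred for trial proportions of this size.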
••
TL;DR: Using human embryos and human pluripotent stem cells, it is shown that the reorganization of the embryonic lineage is mediated by cellular polarization leading to cavity formation, indicating that the critical remodelling events at this stage of human development are embryo-autonomous.
Abstract: Zernicka-Goetz and colleagues report an in vitro culture system that recapitulates hallmarks of human embryo morphogenesis before gastrulation, including formation of the pro-amniotic cavity and appearance of the prospective yolk sac.
••
Macquarie University1, Agricultural Research Service2, University of Ulm3, University of Sydney4, University of Alberta5, California State University, Bakersfield6, Haverford College7, University of Tasmania8, National University of Patagonia San Juan Bosco9, Guangxi University10, Institut national de la recherche agronomique11, Blaise Pascal University12, University of Bordeaux13, International Sleep Products Association14, Duke University15, Xishuangbanna Tropical Botanical Garden16, James Cook University17, University of Idaho18, Naturalis19, University of Guelph20, University of Innsbruck21, University of Wisconsin-Madison22, University of Edinburgh23, Commonwealth Scientific and Industrial Research Organisation24, University of Trieste25, University of California, Santa Cruz26, University of Utah27, George Washington University28
TL;DR: There appears to be no persuasive explanation for the considerable number of species with both low efficiency and low safety in branch xylem, and these species represent a real challenge for understanding the evolution ofxylem.
••
TL;DR: The G4Hunter algorithm is applied to genomes of a number of species, including humans, allowing us to conclude that the number of sequences capable of forming stable quadruplexes (at least in vitro) in the human genome is significantly higher, by a factor of 2–10, than previously thought.
Abstract: Critical evidence for the biological relevance of G-quadruplexes (G4) has recently been obtained in seminal studies performed in a variety of organisms. Four-stranded G-quadruplex DNA structures are promising drug targets as these non-canonical structures appear to be involved in a number of key biological processes. Given the growing interest for G4, accurate tools to predict G-quadruplex propensity of a given DNA or RNA sequence are needed. Several algorithms such as Quadparser predict quadruplex forming propensity. However, a number of studies have established that sequences that are not detected by these tools do form G4 structures (false negatives) and that other sequences predicted to form G4 structures do not (false positives). Here we report development and testing of a radically different algorithm, G4Hunter that takes into account G-richness and G-skewness of a given sequence and gives a quadruplex propensity score as output. To validate this model, we tested it on a large dataset of 392 published sequences and experimentally evaluated quadruplex forming potential of 209 sequences using a combination of biophysical methods to assess quadruplex formation in vitro. We experimentally validated the G4Hunter algorithm on a short complete genome, that of the human mitochondria (16.6 kb), because of its relatively high GC content and GC skewness as well as the biological relevance of these quadruplexes near instability hotspots. We then applied the algorithm to genomes of a number of species, including humans, allowing us to conclude that the number of sequences capable of forming stable quadruplexes (at least in vitro) in the human genome is significantly higher, by a factor of 2-10, than previously thought.
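The scoring idea behind G4Hunter, accounting for both G-richness and G-skewness, can be sketched compactly: each base in a run of n consecutive Gs contributes +min(n, 4), each base in a run of Cs contributes -min(n, 4), other bases contribute 0, and the score is the per-base mean. This follows the published description in outline only; the real tool's sliding-window handling of long sequences and its thresholds are omitted here.

```python
def g4hunter_score(seq):
    """Mean per-base G4Hunter-style score: bases in a G run of length n
    score +min(n, 4), bases in a C run score -min(n, 4), A/T/U score 0.
    Positive scores suggest G4 propensity on the given strand; negative
    scores suggest it on the complementary strand. Window scanning of
    long sequences, as done by the real tool, is omitted."""
    seq = seq.upper()
    scores, i = [], 0
    while i < len(seq):
        base = seq[i]
        if base in "GC":
            j = i
            while j < len(seq) and seq[j] == base:
                j += 1                      # extend the homopolymer run
            run = j - i
            per_base = min(run, 4) * (1 if base == "G" else -1)
            scores.extend([per_base] * run)
            i = j
        else:
            scores.append(0)
            i += 1
    return sum(scores) / len(scores) if scores else 0.0

print(g4hunter_score("GGGG"))                              # 4.0, maximal propensity
print(round(g4hunter_score("GGGTTAGGGTTAGGGTTAGGG"), 2))   # 1.71, telomeric repeat
```

Capping run contributions at 4 keeps a single very long G tract from dominating the score, and the signed C term is what encodes skewness: a sequence with balanced G and C runs scores near zero despite being G-rich.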
••
University of Bordeaux1, Paris Diderot University2, Médecins Sans Frontières3, Pasteur Institute4, French Institute of Health and Medical Research5, Rega Institute for Medical Research6, École Normale Supérieure7, Friedrich Loeffler Institute8, United Kingdom Ministry of Defence9, Robert Koch Institute10, Public Health Agency of Sweden11, Necker-Enfants Malades Hospital12, Université de Montréal13, Public Health England14, Bernhard Nocht Institute for Tropical Medicine15, Bundeswehr Institute of Microbiology16, Cliniques Universitaires Saint-Luc17, Université catholique de Louvain18, Southampton General Hospital19, EHESP20
TL;DR: The objectives of the trial were to test the feasibility and acceptability of an emergency trial in the context of a large Ebola outbreak, and to collect data on the safety and effectiveness of favipiravir in reducing mortality and viral load in patients with EVD.
Abstract: BACKGROUND: Ebola virus disease (EVD) is a highly lethal condition for which no specific treatment has proven efficacy. In September 2014, while the Ebola outbreak was at its peak, the World Health Organization released a short list of drugs suitable for EVD research. Favipiravir, an antiviral developed for the treatment of severe influenza, was one of these. In late 2014, the conditions for starting a randomized Ebola trial were not fulfilled for two reasons. One was the perception that, given the high number of patients presenting simultaneously and the very high mortality rate of the disease, it was ethically unacceptable to allocate patients from within the same family or village to receive or not receive an experimental drug, using a randomization process impossible to understand by very sick patients. The other was that, in the context of rumors and distrust of Ebola treatment centers, using a randomized design at the outset might lead even more patients to refuse to seek care. Therefore, we chose to conduct a multicenter non-randomized trial, in which all patients would receive favipiravir along with standardized care. The objectives of the trial were to test the feasibility and acceptability of an emergency trial in the context of a large Ebola outbreak, and to collect data on the safety and effectiveness of favipiravir in reducing mortality and viral load in patients with EVD. The trial was not aimed at directly informing future guidelines on Ebola treatment but at quickly gathering standardized preliminary data to optimize the design of future studies. METHODS AND FINDINGS: Inclusion criteria were positive Ebola virus reverse transcription PCR (RT-PCR) test, age ≥ 1 y, weight ≥ 10 kg, ability to take oral drugs, and informed consent. All participants received oral favipiravir (day 0: 6,000 mg; day 1 to day 9: 2,400 mg/d).
Semi-quantitative Ebola virus RT-PCR (results expressed in "cycle threshold" [Ct]) and biochemistry tests were performed at day 0, day 2, day 4, end of symptoms, day 14, and day 30. Frozen samples were shipped to a reference biosafety level 4 laboratory for RNA viral load measurement using a quantitative reference technique (genome copies/milliliter). Outcomes were mortality, viral load evolution, and adverse events. The analysis was stratified by age and Ct value. A "target value" of mortality was defined a priori for each stratum, to guide the interpretation of interim and final analysis. Between 17 December 2014 and 8 April 2015, 126 patients were included, of whom 111 were analyzed (adults and adolescents, ≥13 y, n = 99; young children, ≤6 y, n = 12). Here we present the results obtained in the 99 adults and adolescents. Of these, 55 had a baseline Ct value ≥ 20 (Group A Ct ≥ 20), and 44 had a baseline Ct value < 20 (Group A Ct < 20). Ct values and RNA viral loads were well correlated, with Ct = 20 corresponding to RNA viral load = 7.7 log10 genome copies/ml. Mortality was 20% (95% CI 11.6%-32.4%) in Group A Ct ≥ 20 and 91% (95% CI 78.8%-91.1%) in Group A Ct < 20. Both mortality 95% CIs included the predefined target value (30% and 85%, respectively). Baseline serum creatinine was ≥110 μmol/l in 48% of patients in Group A Ct ≥ 20 (≥300 μmol/l in 14%) and in 90% of patients in Group A Ct < 20 (≥300 μmol/l in 44%). In Group A Ct ≥ 20, 17% of patients with baseline creatinine ≥110 μmol/l died, versus 97% in Group A Ct < 20. In patients who survived, the mean decrease in viral load was 0.33 log10 copies/ml per day of follow-up. RNA viral load values and mortality were not significantly different between adults starting favipiravir within <72 h of symptoms compared to others. 
Favipiravir was well tolerated. CONCLUSIONS: In the context of an outbreak at its peak, with crowded care centers, randomizing patients to receive either standard care or standard care plus an experimental drug was not felt to be appropriate. We did a non-randomized trial. This trial reaches nuanced conclusions. On the one hand, we do not conclude on the efficacy of the drug, and our conclusions on tolerance, although encouraging, are not as firm as they could have been if we had used randomization. On the other hand, we learned about how to quickly set up and run an Ebola trial, in close relationship with the community and non-governmental organizations; we integrated research into care so that it improved care; and we generated knowledge on EVD that is useful to further research. Our data illustrate the frequency of renal dysfunction and the powerful prognostic value of low Ct values. They suggest that drug trials in EVD should systematically stratify analyses by baseline Ct value, as a surrogate of viral load. They also suggest that favipiravir monotherapy merits further study in patients with medium to high viremia, but not in those with very high viremia. TRIAL REGISTRATION: ClinicalTrials.gov NCT02329054.
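The trial reports one calibration point linking the two viral-load scales it uses: Ct = 20 corresponds to 7.7 log10 genome copies/ml. Assuming an ideal RT-PCR (a doubling of signal per cycle, so one log10 per ~3.32 cycles; this slope is an assumption, since the trial's actual calibration curve is not given in this summary), Ct values can be converted to approximate log10 loads:

```python
import math

LOG10_PER_CYCLE = 1 / math.log2(10)  # ≈ 0.301: assumes ideal PCR efficiency

def log10_viral_load(ct, ct_ref=20.0, load_ref=7.7):
    """Approximate log10 RNA copies/ml from a Ct value, anchored on the
    reported point Ct = 20 <-> 7.7 log10 copies/ml. Lower Ct means higher
    load. The slope assumes perfect amplification efficiency; the trial's
    own calibration (reference BSL-4 quantification) would differ."""
    return load_ref + (ct_ref - ct) * LOG10_PER_CYCLE

print(log10_viral_load(20.0))  # 7.7 by construction (the anchor point)
```

Under this assumption the trial's Group A Ct < 20 stratum corresponds to loads above roughly 7.7 log10 copies/ml, which is the "very high viremia" group in which mortality was 91%.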
••
TL;DR: A meta-analysis of genome-wide association studies for estimated glomerular filtration rate suggests that genetic determinants of eGFR are mediated largely through direct effects within the kidney and highlight important cell types and biological pathways.
Abstract: Reduced glomerular filtration rate defines chronic kidney disease and is associated with cardiovascular and all-cause mortality. We conducted a meta-analysis of genome-wide association studies for estimated glomerular filtration rate (eGFR), combining data across 133,413 individuals with replication in up to 42,166 individuals. We identify 24 new and confirm 29 previously identified loci. Of these 53 loci, 19 associate with eGFR among individuals with diabetes. Using bioinformatics, we show that identified genes at eGFR loci are enriched for expression in kidney tissues and in pathways relevant for kidney development and transmembrane transporter activity, kidney structure, and regulation of glucose metabolism. Chromatin state mapping and DNase I hypersensitivity analyses across adult tissues demonstrate preferential mapping of associated variants to regulatory regions in kidney but not extra-renal tissues. These findings suggest that genetic determinants of eGFR are mediated largely through direct effects within the kidney and highlight important cell types and biological pathways.
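"Estimated" GFR means GFR computed from serum creatinine via a prediction equation rather than measured directly. This summary does not state which equation the consortium used, so the sketch below shows the widely used CKD-EPI 2009 creatinine equation purely as an illustration of what eGFR is; coefficients are as published, but treat this as an example, not as the study's method.

```python
def ckd_epi_2009(scr_mg_dl, age, female, black=False):
    """CKD-EPI 2009 creatinine equation, ml/min/1.73 m^2. Shown only to
    illustrate what 'estimated GFR' means; the meta-analysis summarised
    above does not state that it used this particular equation."""
    kappa = 0.7 if female else 0.9        # sex-specific creatinine scale
    alpha = -0.329 if female else -0.411  # sex-specific low-creatinine exponent
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

print(round(ckd_epi_2009(0.9, 50, female=False), 1))  # ≈ 99.2 for this example
```

Because eGFR is a deterministic function of creatinine, age, sex, and ancestry, GWAS hits for eGFR can reflect either true kidney function or creatinine metabolism, one reason the paper's kidney-tissue enrichment analyses matter.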
••
University of Manchester, University of Oxford, University of Edinburgh, Philadelphia College of Osteopathic Medicine, Oregon Health & Science University, University of New Mexico, University of Ulm, Autonomous University of Madrid, University of Milan, Queen's University Belfast, Queens College, Université de Sherbrooke, Catholic University of the Sacred Heart, University of Arkansas for Medical Sciences, Wayne State University, New York University, University of Bologna, University of Bordeaux, Umeå University, Austral University of Chile, Sapienza University of Rome, University of Texas at San Antonio, University of Glasgow, University of Pretoria, University of Helsinki, Brighton and Sussex Medical School, Imperial College London
TL;DR: Researchers and clinicians working on Alzheimer’s disease or related topics write to express their concern that one particular aspect of the disease has been neglected.
Abstract: We are researchers and clinicians working on Alzheimer’s disease (AD) or related topics, and we write to express our concern that one particular aspect of the disease has been neglected, even though ...
••
Institut national de la recherche agronomique, University of Bordeaux, Stanford University, University of California, Berkeley, Ohio State University, Museum and Institute of Zoology, Game & Wildlife Conservation Trust, Michigan State University, University of Göttingen, Technische Universität München, Swedish University of Agricultural Sciences, United States Department of Agriculture
TL;DR: In this paper, the authors present a quantitative synthesis of data collected from several cropping systems in Europe and North America, analyzing how the level and within-field spatial stability of natural pest control services relate to the simplification of the surrounding landscape.
••
TL;DR: The capability of performing high resolution international clock comparisons paves the way for a redefinition of the unit of time and an all-optical dissemination of the SI-second.
Abstract: Leveraging the unrivalled performance of optical clocks as key tools for geo-science, for astronomy and for fundamental physics beyond the standard model requires comparing the frequency of distant optical clocks faithfully. Here, we report on the comparison and agreement of two strontium optical clocks at an uncertainty of 5 × 10−17 via a newly established phase-coherent frequency link connecting Paris and Braunschweig using 1,415 km of telecom fibre. The remote comparison is limited only by the instability and uncertainty of the strontium lattice clocks themselves, with negligible contributions from the optical frequency transfer. A fractional precision of 3 × 10−17 is reached after only 1,000 s averaging time, which is already 10 times better and more than four orders of magnitude faster than any previous long-distance clock comparison. The capability of performing high resolution international clock comparisons paves the way for a redefinition of the unit of time and an all-optical dissemination of the SI-second. Comparing the frequency of two distant optical clocks will enable sensitive tests of fundamental physics. Here, the authors compare two strontium optical-lattice clocks 690 kilometres apart to a degree of accuracy that is limited only by the uncertainty of the individual clocks themselves.
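The figures quoted above (a fractional precision of 3 × 10⁻¹⁷ after 1,000 s, against a comparison uncertainty of 5 × 10⁻¹⁷) follow the τ^(−1/2) averaging law expected when a clock comparison is limited by white frequency noise. A minimal sketch of that relation, with numbers inferred from the quoted figures for illustration only (not taken from the paper):

```python
import math

def adev_white_fm(sigma_1s: float, tau_s: float) -> float:
    """Allan deviation under white frequency noise:
    sigma_y(tau) = sigma_y(1 s) / sqrt(tau)."""
    return sigma_1s / math.sqrt(tau_s)

# If the comparison reaches a fractional precision of 3e-17 after 1,000 s,
# the implied 1-s instability (assuming pure white-frequency-noise averaging):
sigma_1s = 3e-17 * math.sqrt(1000)       # ~9.5e-16 at 1 s

# Averaging time needed to reach a given target precision under the same law:
target = 5e-17                           # the quoted comparison uncertainty
tau_needed = (sigma_1s / target) ** 2    # = (3/5)^2 * 1000 = 360 s
```

The τ^(−1/2) scaling is why "four orders of magnitude faster" matters: halving the instability at 1 s cuts the required averaging time fourfold.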
••
TL;DR: Treatments directed at improving borderline motor dysfunction or reducing reflux burden to sub-normal levels have limited success in symptom improvement, and strategies focused on modulating peripheral triggering and central perception are mechanistically viable and clinically meaningful.
••
TL;DR: A general overview of the helicase/G-quadruplex field is presented and it is suggested that proteins may have evolved to remove these structures from genomic DNA.
Abstract: Guanine-rich DNA strands can fold in vitro into non-canonical DNA structures called G-quadruplexes. These structures may be very stable under physiological conditions. Evidence suggests that G-quadruplex structures may act as 'knots' within genomic DNA, and it has been hypothesized that proteins may have evolved to remove these structures. The first indication of how G-quadruplex structures could be unfolded enzymatically came in the late 1990s with reports that some well-known duplex DNA helicases resolved these structures in vitro. Since then, the number of studies reporting G-quadruplex DNA unfolding by helicase enzymes has rapidly increased. The present review aims to present a general overview of the helicase/G-quadruplex field.
••
TL;DR: In this article, the authors discuss the impact of climate change on wine production and propose adaptation strategies to continue to produce high-quality wines and to preserve their typicity according to their origin in a changing climate.
Abstract: Climate change is a major challenge in wine production. Temperatures are increasing worldwide, and most regions are exposed to water deficits more frequently. Higher temperatures trigger advanced phenology. This shifts the ripening phase to warmer periods in the summer, which will affect grape composition, in particular with respect to aroma compounds. Increased water stress reduces yields and modifies fruit composition. The frequency of extreme climatic events (hail, flooding) is likely to increase. Depending on the region and the amount of change, this may have positive or negative implications on wine quality. Adaptation strategies are needed to continue to produce high-quality wines and to preserve their typicity according to their origin in a changing climate. The choice of plant material is a valuable resource to implement these strategies. (JEL Classifications: Q13, Q54)
••
TL;DR: The management of AD must consider the clinical and pathogenic variabilities of the disease and also target flare prevention, as well as avoidance of specific and unspecific provocation factors.
Abstract: Atopic dermatitis (AD) is a clinically defined, highly pruritic, chronic inflammatory skin disease of children and adults. The diagnosis is made using evaluated clinical criteria. Disease activity is best measured with a composite score assessing both objective signs and subjective symptoms, such as SCORAD. The management of AD must consider the clinical and pathogenic variabilities of the disease and also target flare prevention. Basic therapy includes hydrating topical treatment, as well as avoidance of specific and unspecific provocation factors. Anti-inflammatory treatment of visible skin lesions is based on topical glucocorticosteroids and the topical calcineurin inhibitors tacrolimus and pimecrolimus. Topical calcineurin inhibitors are preferred in sensitive locations. Tacrolimus and mid-potent steroids are proven for proactive therapy, which is long-term intermittent anti-inflammatory therapy of the frequently relapsing skin areas. Systemic anti-inflammatory or immunosuppressive treatment is indicated for severe refractory cases. Biologicals targeting key mechanisms of the atopic immune response are promising emerging treatment options. Microbial colonization and superinfection may induce disease exacerbation and can justify additional antimicrobial treatment. Systemic antihistamines (H1R-blockers) may diminish pruritus, but do not have sufficient effect on lesions. Adjuvant therapy includes UV irradiation, preferably UVA1 or narrow-band UVB 311 nm. Dietary recommendations should be patient specific and elimination diets should only be advised in case of proven food allergy. Allergen-specific immunotherapy to aeroallergens may be useful in selected cases. Psychosomatic counselling is recommended to address stress-induced exacerbations. 'Eczema school' educational programmes have been proven to be helpful for children and adults.
••
TL;DR: It is shown that acute cannabinoid-induced memory impairment in mice requires activation of hippocampal mtCB1 receptors; these data reveal that bioenergetic processes are primary acute regulators of cognitive functions.
Abstract: Cellular activity in the brain depends on the high energetic support provided by mitochondria, the cell organelles which use energy sources to generate ATP. Acute cannabinoid intoxication induces amnesia in humans and animals, and the activation of type-1 cannabinoid receptors present at brain mitochondria membranes (mtCB1) can directly alter mitochondrial energetic activity. Although the pathological impact of chronic mitochondrial dysfunctions in the brain is well established, the involvement of acute modulation of mitochondrial activity in high brain functions, including learning and memory, is unknown. Here, we show that acute cannabinoid-induced memory impairment in mice requires activation of hippocampal mtCB1 receptors. Genetic exclusion of CB1 receptors from hippocampal mitochondria prevents cannabinoid-induced reduction of mitochondrial mobility, synaptic transmission and memory formation. mtCB1 receptors signal through intra-mitochondrial Gαi protein activation and consequent inhibition of soluble-adenylyl cyclase (sAC). The resulting inhibition of protein kinase A (PKA)-dependent phosphorylation of specific subunits of the mitochondrial electron transport system eventually leads to decreased cellular respiration. Hippocampal inhibition of sAC activity or manipulation of intra-mitochondrial PKA signalling or phosphorylation of the Complex I subunit NDUFS2 inhibit bioenergetic and amnesic effects of cannabinoids. Thus, the G protein-coupled mtCB1 receptors regulate memory processes via modulation of mitochondrial energy metabolism. By directly linking mitochondrial activity to memory formation, these data reveal that bioenergetic processes are primary acute regulators of cognitive functions.
••
TL;DR: In this paper, the authors presented a catalog of hard Fermi-LAT sources (2FHL) in the 50 GeV–2 TeV energy range and found that 86% of the sources can be associated with counterparts at other wavelengths, of which the majority (75%) are active galactic nuclei and the rest (11%) are Galactic sources.
Abstract: We present a catalog of sources detected above 50 GeV by the Fermi-Large Area Telescope (LAT) in 80 months of data. The newly delivered Pass 8 event-level analysis allows the detection and characterization of sources in the 50 GeV–2 TeV energy range. In this energy band, Fermi-LAT has detected 360 sources, which constitute the second catalog of hard Fermi-LAT sources (2FHL). The improved angular resolution enables the precise localization of point sources (~1.7 arcmin radius at 68% C.L.) and the detection and characterization of spatially extended sources. We find that 86% of the sources can be associated with counterparts at other wavelengths, of which the majority (75%) are active galactic nuclei and the rest (11%) are Galactic sources. Only 25% of the 2FHL sources have been previously detected by Cherenkov telescopes, implying that the 2FHL provides a reservoir of candidates to be followed up at very high energies. This work closes the energy gap between the observations performed at GeV energies by Fermi-LAT on orbit and the observations performed at higher energies by Cherenkov telescopes from the ground.
••
TL;DR: This study revealed genomic alterations, including actionable mutations, and mutational signatures involved in resistance to therapy in HR+/HER2− metastatic tumors as compared to primary TCGA samples.
Abstract: BACKGROUND: Major advances have been achieved in the characterization of early breast cancer (eBC) genomic profiles. Metastatic breast cancer (mBC) is associated with poor outcomes, yet limited information is available on the genomic profile of this disease. This study aims to decipher mutational profiles of mBC using next-generation sequencing.
METHODS AND FINDINGS: Whole-exome sequencing was performed on 216 tumor-blood pairs from mBC patients who underwent a biopsy in the context of the SAFIR01, SAFIR02, SHIVA, or Molecular Screening for Cancer Treatment Optimization (MOSCATO) prospective trials. Mutational profiles from 772 primary breast tumors from The Cancer Genome Atlas (TCGA) were used as a reference for comparing primary and mBC mutational profiles. Twelve genes (TP53, PIK3CA, GATA3, ESR1, MAP3K1, CDH1, AKT1, MAP2K4, RB1, PTEN, CBFB, and CDKN2A) were identified as significantly mutated in mBC (false discovery rate [FDR] < 0.1). Eight genes (ESR1, FSIP2, FRAS1, OSBPL3, EDC4, PALB2, IGFN1, and AGRN) were more frequently mutated in mBC as compared to eBC (FDR < 0.01). ESR1 was identified both as a driver and as a metastatic gene (n = 22, odds ratio = 29, 95% CI [9-155], p = 1.2e-12) and also presented with focal amplification (n = 9) for a total of 31 mBCs with either ESR1 mutation or amplification, including 27 hormone receptor positive (HR+) and HER2 negative (HER2-) mBCs (19%). HR+/HER2- mBC presented a high prevalence of mutations on genes located on the mechanistic target of rapamycin (mTOR) pathway (TSC1 and TSC2) as compared to HR+/HER2- eBC (respectively 6% and 0.7%, p = 0.0004). Other actionable genes were more frequently mutated in HR+ mBC, including ERBB4 (n = 8), NOTCH3 (n = 7), and ALK (n = 7). Analysis of mutational signatures revealed a significant increase in APOBEC-mediated mutagenesis in HR+/HER2- metastatic tumors as compared to primary TCGA samples (p < 2e-16). The main limitations of this study include the absence of bone metastases and the size of the cohort, which might not have allowed the identification of rare mutations and their effect on survival.
CONCLUSIONS: This work reports the results of the analysis of the first large-scale study on mutation profiles of mBC. This study revealed genomic alterations and mutational signatures involved in the resistance to therapies, including actionable mutations.
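The association statistics quoted above (e.g., odds ratio = 29, 95% CI [9–155] for ESR1) are standard 2×2-table quantities. A minimal sketch of how an odds ratio and its Wald confidence interval are computed; the counts below are hypothetical and do not reproduce the study's figures:

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
       a = mutated metastatic,  b = wild-type metastatic,
       c = mutated primary,     d = wild-type primary."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only:
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)
# or_ == 2.25
```

Note the CI is asymmetric around the point estimate because it is computed on the log scale, which is why intervals like [9–155] around 29 are expected.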
••
TL;DR: LSM and FibroMeterV2G fibrosis classifications help physicians estimate both fibrosis stage and patient prognosis in clinical practice, and were the two most accurate tests for the non-invasive evaluation of liver fibrosis in NAFLD.
••
TL;DR: This review discusses the synthesis and properties of thermosets and thermoplastic polymers prepared from vanillin, ferulic acid, guaiacol, syringaldehyde, or 4-hydroxybenzoic acid.
Abstract: Nowadays, the synthesis of (semi)aromatic polymers from lignin derivatives is of major interest, as aromatic compounds are key intermediates in the manufacture of polymers and lignin is the main source of aromatic biobased substrates. Phenols with a variety of chemical structures can be obtained from lignin deconstruction; among them, vanillin and ferulic acid are the main ones. Depending on the phenol substrates, different chemical modifications and polymerization pathways are developed, leading to (semi)aromatic polymers covering a wide range of thermomechanical properties. This review discusses the synthesis and properties of thermosets (vinyl ester resins, cyanate ester, epoxy, and benzoxazine resins) and thermoplastic polymers (polyesters, polyanhydrides, Schiff base polymers, polyacetals, polyoxalates, polycarbonates, acrylate polymers) prepared from vanillin, ferulic acid, guaiacol, syringaldehyde, or 4-hydroxybenzoic acid.
••
TL;DR: This work reveals an unappreciated physical dimension to lymphocyte function: cells use mechanical forces to control the activity of outgoing chemical signals, and the data indicate that CTLs coordinate perforin release and force exertion in space and time.