Showing papers by "University of California, Davis" published in 2016
••
TL;DR: In this article, the authors present a cosmological analysis based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation.
Abstract: This paper presents cosmological results based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation. Our results are in very good agreement with the 2013 analysis of the Planck nominal-mission temperature data, but with increased precision. The temperature and polarization power spectra are consistent with the standard spatially-flat 6-parameter ΛCDM cosmology with a power-law spectrum of adiabatic scalar perturbations (denoted “base ΛCDM” in this paper). From the Planck temperature data combined with Planck lensing, for this cosmology we find a Hubble constant, H0 = (67.8 ± 0.9) km s^-1 Mpc^-1, a matter density parameter Ωm = 0.308 ± 0.012, and a tilted scalar spectral index with ns = 0.968 ± 0.006, consistent with the 2013 analysis. Note that in this abstract we quote 68% confidence limits on measured parameters and 95% upper limits on other parameters. We present the first results of polarization measurements with the Low Frequency Instrument at large angular scales. Combined with the Planck temperature and lensing data, these measurements give a reionization optical depth of τ = 0.066 ± 0.016, corresponding to a reionization redshift of . These results are consistent with those from WMAP polarization measurements cleaned for dust emission using 353-GHz polarization maps from the High Frequency Instrument. We find no evidence for any departure from base ΛCDM in the neutrino sector of the theory; for example, combining Planck observations with other astrophysical data we find Neff = 3.15 ± 0.23 for the effective number of relativistic degrees of freedom, consistent with the value Neff = 3.046 of the Standard Model of particle physics. The sum of neutrino masses is constrained to ∑ mν < 0.23 eV. The spatial curvature of our Universe is found to be very close to zero, with | ΩK | < 0.005.
Adding a tensor component as a single-parameter extension to base ΛCDM we find an upper limit on the tensor-to-scalar ratio of r0.002 < 0.11, consistent with the Planck 2013 results and consistent with the B-mode polarization constraints from a joint analysis of BICEP2, Keck Array, and Planck (BKP) data. Adding the BKP B-mode data to our analysis leads to a tighter constraint of r0.002 < 0.09 and disfavours inflationary models with a V(φ) ∝ φ2 potential. The addition of Planck polarization data leads to strong constraints on deviations from a purely adiabatic spectrum of fluctuations. We find no evidence for any contribution from isocurvature perturbations or from cosmic defects. Combining Planck data with other astrophysical data, including Type Ia supernovae, the equation of state of dark energy is constrained to w = −1.006 ± 0.045, consistent with the expected value for a cosmological constant. The standard big bang nucleosynthesis predictions for the helium and deuterium abundances for the best-fit Planck base ΛCDM cosmology are in excellent agreement with observations. We also present constraints on annihilating dark matter and on possible deviations from the standard recombination history. In neither case do we find evidence for new physics. The Planck results for base ΛCDM are in good agreement with baryon acoustic oscillation data and with the JLA sample of Type Ia supernovae. However, as in the 2013 analysis, the amplitude of the fluctuation spectrum is found to be higher than inferred from some analyses of rich cluster counts and weak gravitational lensing. We show that these tensions cannot easily be resolved with simple modifications of the base ΛCDM cosmology. Apart from these tensions, the base ΛCDM cosmology provides an excellent description of the Planck CMB observations and many other astrophysical data sets.
10,728 citations
••
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macro-autophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes.
For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy.
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.
5,187 citations
••
TL;DR: These and other strategies are providing researchers and clinicians a variety of tools to probe genomes in greater depth, leading to an enhanced understanding of how genome sequence variants underlie phenotype and disease.
Abstract: Since the completion of the human genome project in 2003, extraordinary progress has been made in genome sequencing technologies, which has led to a decreased cost per megabase and an increase in the number and diversity of sequenced genomes. An astonishing complexity of genome architecture has been revealed, bringing these sequencing technologies to even greater advancements. Some approaches maximize the number of bases sequenced in the least amount of time, generating a wealth of data that can be used to understand increasingly complex phenotypes. Alternatively, other approaches now aim to sequence longer contiguous pieces of DNA, which are essential for resolving structurally complex regions. These and other strategies are providing researchers and clinicians a variety of tools to probe genomes in greater depth, leading to an enhanced understanding of how genome sequence variants underlie phenotype and disease.
3,096 citations
••
Boston University1, Cornell University2, Johns Hopkins University3, University of Miami4, University of California, San Francisco5, Texas A&M Health Science Center College of Medicine6, Centers for Disease Control and Prevention7, University of Washington8, Case Western Reserve University9, University of Pennsylvania10, Denver Health Medical Center11, University of Michigan12, University of California, Davis13, University of California, Los Angeles14, United States Department of Veterans Affairs15, Washington University in St. Louis16, Wake Forest University17, University of Utah18, Memorial Sloan Kettering Cancer Center19
TL;DR: These recommendations address the best approaches for antibiotic stewardship programs to influence the optimal use of antibiotics.
Abstract: Evidence-based guidelines for implementation and measurement of antibiotic stewardship interventions in inpatient populations including long-term care were prepared by a multidisciplinary expert panel of the Infectious Diseases Society of America and the Society for Healthcare Epidemiology of America. The panel included clinicians and investigators representing internal medicine, emergency medicine, microbiology, critical care, surgery, epidemiology, pharmacy, and adult and pediatric infectious diseases specialties. These recommendations address the best approaches for antibiotic stewardship programs to influence the optimal use of antibiotics.
1,969 citations
••
University of Texas Health Science Center at San Antonio1, University of California, Davis2, University Hospital of South Manchester NHS Foundation Trust3, Harvard University4, Tufts University5, University of Strasbourg6, University of Texas MD Anderson Cancer Center7, Johns Hopkins University8, University of Minnesota9, University of Pittsburgh10, Roswell Park Cancer Institute11, Duke University12, Cornell University13, University of Florida14, National Institutes of Health15
TL;DR: IDSA considers adherence to these guidelines to be voluntary, with the ultimate determination regarding their application to be made by the physician in the light of each patient's individual circumstances.
Abstract: It is important to realize that guidelines cannot always account for individual variation among patients. They are not intended to supplant physician judgment with respect to particular patients or special clinical situations. IDSA considers adherence to these guidelines to be voluntary, with the ultimate determination regarding their application to be made by the physician in the light of each patient's individual circumstances.
1,745 citations
••
TL;DR: Venous thromboembolism is a complex disease, involving interactions between acquired or inherited predispositions to thrombosis and VTE risk factors, including increasing patient age and obesity, hospitalization for surgery or acute illness, nursing-home confinement, active cancer, trauma or fracture, immobility or leg paresis, superficial vein thromBosis, and, in women, pregnancy and puerperium.
Abstract: Venous thromboembolism (VTE) is categorized by the U.S. Surgeon General as a major public health problem. VTE is relatively common and associated with reduced survival and substantial health-care costs, and recurs frequently. VTE is a complex (multifactorial) disease, involving interactions between acquired or inherited predispositions to thrombosis and VTE risk factors, including increasing patient age and obesity, hospitalization for surgery or acute illness, nursing-home confinement, active cancer, trauma or fracture, immobility or leg paresis, superficial vein thrombosis, and, in women, pregnancy and puerperium, oral contraception, and hormone therapy. Although independent VTE risk factors and predictors of VTE recurrence have been identified, and effective primary and secondary prophylaxis is available, the occurrence of VTE seems to be relatively constant, or even increasing.
1,548 citations
••
TL;DR: The open-source FALCON and FALcon-Unzip algorithms are introduced to assemble long-read sequencing data into highly accurate, contiguous, and correctly phased diploid genomes.
Abstract: While genome assembly projects have been successful in many haploid and inbred species, the assembly of noninbred or rearranged heterozygous genomes remains a major challenge. To address this challenge, we introduce the open-source FALCON and FALCON-Unzip algorithms (https://github.com/PacificBiosciences/FALCON/) to assemble long-read sequencing data into highly accurate, contiguous, and correctly phased diploid genomes. We generate new reference sequences for heterozygous samples including an F1 hybrid of Arabidopsis thaliana, the widely cultivated Vitis vinifera cv. Cabernet Sauvignon, and the coral fungus Clavicorona pyxidata, samples that have challenged short-read assembly approaches. The FALCON-based assemblies are substantially more contiguous and complete than alternate short- or long-read approaches. The phased diploid assembly enabled the study of haplotype structure and heterozygosities between homologous chromosomes, including the identification of widespread heterozygous structural variation within coding sequences.
1,490 citations
••
James Bentham1, Mariachiara Di Cesare1, Mariachiara Di Cesare2, Gretchen A Stevens3 +787 more•Institutions (246)
TL;DR: The height differential between the tallest and shortest populations was 19-20 cm a century ago, and has remained the same for women and increased for men a century later despite substantial changes in the ranking of countries.
Abstract: Being taller is associated with enhanced longevity, and higher education and earnings. We reanalysed 1472 population-based studies, with measurement of height on more than 18.6 million participants to estimate mean height for people born between 1896 and 1996 in 200 countries. The largest gain in adult height over the past century has occurred in South Korean women and Iranian men, who became 20.2 cm (95% credible interval 17.5–22.7) and 16.5 cm (13.3–19.7) taller, respectively. In contrast, there was little change in adult height in some sub-Saharan African countries and in South Asia over the century of analysis. The tallest people over these 100 years are men born in the Netherlands in the last quarter of the 20th century, whose average heights surpassed 182.5 cm, and the shortest were women born in Guatemala in 1896 (140.3 cm; 135.8–144.8). The height differential between the tallest and shortest populations was 19–20 cm a century ago, and has remained the same for women and increased for men a century later despite substantial changes in the ranking of countries.
1,348 citations
••
Centers for Disease Control and Prevention1, University of Utah2, University of California, San Francisco3, University of Minnesota4, Northeast Ohio Medical University5, Boston Children's Hospital6, University of Pennsylvania7, The Pew Charitable Trusts8, Brigham and Women's Hospital9, University of California, Davis10, Georgetown University Medical Center11, Harvard University12, Washington University in St. Louis13, University of Illinois at Chicago14, Pacific Lutheran University15
TL;DR: In the United States in 2010-2011, there was an estimated annual antibiotic prescription rate per 1000 population of 506, but only an estimated 353 antibiotic prescriptions were likely appropriate, supporting the need for establishing a goal for outpatient antibiotic stewardship.
Abstract: Importance The National Action Plan for Combating Antibiotic-Resistant Bacteria set a goal of reducing inappropriate outpatient antibiotic use by 50% by 2020, but the extent of inappropriate outpatient antibiotic use is unknown. Objective To estimate the rates of outpatient oral antibiotic prescribing by age and diagnosis, and the estimated portions of antibiotic use that may be inappropriate in adults and children in the United States. Design, Setting, and Participants Using the 2010-2011 National Ambulatory Medical Care Survey and National Hospital Ambulatory Medical Care Survey, annual numbers and population-adjusted rates with 95% confidence intervals of ambulatory visits with oral antibiotic prescriptions by age, region, and diagnosis in the United States were estimated. Exposures Ambulatory care visits. Main Outcomes and Measures Based on national guidelines and regional variation in prescribing, diagnosis-specific prevalence and rates of total and appropriate antibiotic prescriptions were determined. These rates were combined to calculate an estimate of the appropriate annual rate of antibiotic prescriptions per 1000 population. Results Of the 184 032 sampled visits, 12.6% of visits (95% CI, 12.0%-13.3%) resulted in antibiotic prescriptions. Sinusitis was the single diagnosis associated with the most antibiotic prescriptions per 1000 population (56 antibiotic prescriptions [95% CI, 48-64]), followed by suppurative otitis media (47 antibiotic prescriptions [95% CI, 41-54]), and pharyngitis (43 antibiotic prescriptions [95% CI, 38-49]). Collectively, acute respiratory conditions per 1000 population led to 221 antibiotic prescriptions (95% CI, 198-245) annually, but only 111 antibiotic prescriptions were estimated to be appropriate for these conditions. 
Per 1000 population, among all conditions and ages combined in 2010-2011, an estimated 506 antibiotic prescriptions (95% CI, 458-554) were written annually, and, of these, 353 antibiotic prescriptions were estimated to be appropriate antibiotic prescriptions. Conclusions and Relevance In the United States in 2010-2011, there was an estimated annual antibiotic prescription rate per 1000 population of 506, but only an estimated 353 antibiotic prescriptions were likely appropriate, supporting the need for establishing a goal for outpatient antibiotic stewardship.
1,162 citations
••
Ghent University1, Forschungszentrum Jülich2, Åbo Akademi University3, Aalto University4, Vienna University of Technology5, Duke University6, University of Grenoble7, École Polytechnique Fédérale de Lausanne8, Durham University9, International School for Advanced Studies10, Max Planck Society11, Uppsala University12, Humboldt University of Berlin13, Fritz Haber Institute of the Max Planck Society14, Technical University of Denmark15, National Institute of Standards and Technology16, University of Udine17, Université catholique de Louvain18, University of Basel19, Harvard University20, University of California, Davis21, Rutgers University22, University of York23, Wake Forest University24, Science and Technology Facilities Council25, University of Oxford26, University of Vienna27, Dresden University of Technology28, Leibniz Institute for Neurobiology29, Radboud University Nijmegen30, University of Tokyo31, Centre national de la recherche scientifique32, University of Cambridge33, Royal Holloway, University of London34, University of California, Santa Barbara35, University of Luxembourg36, Los Alamos National Laboratory37, Harbin Institute of Technology38
TL;DR: A procedure to assess the precision of DFT methods was devised and used to demonstrate reproducibility among many of the most widely used DFT codes, demonstrating that the precision of DFT implementations can be determined, even in the absence of one absolute reference code.
Abstract: The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements.
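The comparison described above rests on fitting each code's computed energy-versus-volume points to an equation of state and comparing the fitted curves; the Δ-gauge used in this line of work is based on third-order Birch-Murnaghan fits. A minimal sketch of such a fit, using synthetic E(V) data with hypothetical parameter values (the units and numbers below are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(V, E0, V0, B0, B0p):
    """Third-order Birch-Murnaghan equation of state E(V)."""
    eta = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9.0 * V0 * B0 / 16.0 * (
        (eta - 1.0) ** 3 * B0p + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta)
    )

# Synthetic E(V) points around equilibrium (hypothetical units: eV, Angstrom^3).
true = dict(E0=-5.0, V0=20.0, B0=0.6, B0p=4.5)
V = np.linspace(0.94 * true["V0"], 1.06 * true["V0"], 9)
E = birch_murnaghan(V, **true)

# Least-squares fit recovers equilibrium volume V0, bulk modulus B0,
# and its pressure derivative B0'; two codes' EOS curves can then be
# compared parameter-by-parameter or by integrating their difference.
popt, _ = curve_fit(birch_murnaghan, V, E, p0=[-4.0, 19.0, 0.5, 4.0])
E0, V0, B0, B0p = popt
print(f"V0 = {V0:.3f}, B0 = {B0:.3f}, B0' = {B0p:.3f}")
```

On noise-free synthetic data the fit recovers the generating parameters essentially exactly; with real DFT energies, the residual differences between codes become the quantity of interest.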
1,141 citations
••
TL;DR: This Review provides a comprehensive discussion of RADseq methods to aid researchers in choosing among the many different approaches and avoiding erroneous scientific conclusions from RADseq data, a problem that has plagued other genetic marker types in the past.
Abstract: High-throughput techniques based on restriction site-associated DNA sequencing (RADseq) are enabling the low-cost discovery and genotyping of thousands of genetic markers for any species, including non-model organisms, which is revolutionizing ecological, evolutionary and conservation genetics. Technical differences among these methods lead to important considerations for all steps of genomics studies, from the specific scientific questions that can be addressed, and the costs of library preparation and sequencing, to the types of bias and error inherent in the resulting data. In this Review, we provide a comprehensive discussion of RADseq methods to aid researchers in choosing among the many different approaches and avoiding erroneous scientific conclusions from RADseq data, a problem that has plagued other genetic marker types in the past.
••
TL;DR: In this article, the authors provide an overview of FDA, starting with simple statistical notions such as mean and covariance functions, then covering some core techniques, the most popular of which is functional principal component analysis (FPCA).
Abstract: With the advance of modern technology, more and more data are being recorded continuously during a time interval or intermittently at several discrete time points. These are both examples of functional data, which has become a commonly encountered type of data. Functional data analysis (FDA) encompasses the statistical methodology for such data. Broadly interpreted, FDA deals with the analysis and theory of data that are in the form of functions. This paper provides an overview of FDA, starting with simple statistical notions such as mean and covariance functions, then covering some core techniques, the most popular of which is functional principal component analysis (FPCA). FPCA is an important dimension reduction tool, and in sparse data situations it can be used to impute functional data that are sparsely observed. Other dimension reduction approaches are also discussed. In addition, we review another core technique, functional linear regression, as well as clustering and classification of functional data.
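For densely observed curves, FPCA as surveyed above reduces to an eigendecomposition of the sample covariance function, rescaled so that eigenfunctions are orthonormal in L2. A minimal numerical sketch (the simulated curves, basis functions, and all variable names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n curves on a common dense grid:
# X_i(t) = mean(t) + xi_1 * phi_1(t) + xi_2 * phi_2(t) + noise
t = np.linspace(0.0, 1.0, 101)
n = 200
mean_fn = np.sin(2 * np.pi * t)
phi1 = np.sqrt(2) * np.cos(2 * np.pi * t)      # L2-orthonormal basis functions
phi2 = np.sqrt(2) * np.sin(4 * np.pi * t)
scores = rng.normal(size=(n, 2)) * np.array([2.0, 0.5])  # variances 4 and 0.25
X = mean_fn + scores[:, [0]] * phi1 + scores[:, [1]] * phi2
X = X + rng.normal(scale=0.05, size=X.shape)

# FPCA: eigendecompose the pointwise sample covariance of centered curves.
Xc = X - X.mean(axis=0)
dt = t[1] - t[0]
cov = (Xc.T @ Xc) / (n - 1)
evals, evecs = np.linalg.eigh(cov)
order = np.argsort(evals)[::-1]                # sort eigenvalues descending
evals, evecs = evals[order], evecs[:, order]

# Rescale: eigenfunctions orthonormal in L2([0,1]), not in R^101.
eigenfunctions = evecs / np.sqrt(dt)
eigenvalues = evals * dt                       # variances of the FPC scores

# Estimated scores: L2 inner products of centered curves with eigenfunctions.
fpc_scores = Xc @ eigenfunctions * dt

explained = eigenvalues[:2].sum() / eigenvalues.sum()
print(f"first two components explain {explained:.1%} of total variance")
```

With two true components of variance 4 and 0.25 plus small noise, the leading two estimated eigenvalues dominate the spectrum, illustrating why FPCA works as a dimension reduction tool; the sparse-data case discussed in the paper instead estimates the covariance surface by smoothing before the eigendecomposition.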
••
University of Florida1, University of Göttingen2, City College of New York3, Mackenzie Presbyterian University4, University of São Paulo5, Johns Hopkins University6, National Institutes of Health7, Harvard University8, University of California, Davis9, University of Brescia10, University of Lisbon11, University of Oxford12, ETH Zurich13, Ruhr University Bochum14
TL;DR: This review covers technical aspects of tES, as well as applications like exploration of brain physiology, modelling approaches, tES in cognitive neurosciences, and interventional approaches to help the reader to appropriately design and conduct studies involving these brain stimulation techniques.
••
TL;DR: In this article, the authors present the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties.
Abstract: This paper presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (l < 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, n_s, is confirmed smaller than unity at more than 5σ from Planck alone.
We further validate the robustness of the likelihood results against specific extensions to the baseline cosmology, which are particularly sensitive to data at high multipoles. For instance, the effective number of neutrino species remains compatible with the canonical value of 3.046. For this first detailed analysis of Planck polarization spectra, we concentrate at high multipoles on the E modes, leaving the analysis of the weaker B modes to future work. At low multipoles we use temperature maps at all Planck frequencies along with a subset of polarization data. These data take advantage of Planck’s wide frequency coverage to improve the separation of CMB and foreground emission. Within the baseline ΛCDM cosmology this requires τ = 0.078 ± 0.019 for the reionization optical depth, which is significantly lower than estimates without the use of high-frequency data for explicit monitoring of dust emission. At high multipoles we detect residual systematic errors in E polarization, typically at the μK^2 level; we therefore choose to retain temperature information alone for high multipoles as the recommended baseline, in particular for testing non-minimal models. Nevertheless, the high-multipole polarization spectra from Planck are already good enough to enable a separate high-precision determination of the parameters of the ΛCDM model, showing consistency with those established independently from temperature information alone.
••
University of Pennsylvania1, Medical Research Council2, Wellcome Trust Sanger Institute3, European Bioinformatics Institute4, Medical University of Vienna5, Hospital for Sick Children6, University of California, Davis7, National Research Council8, Harvard University9, Baylor College of Medicine10, Nanjing University11, Broad Institute12, University of Strasbourg13, Children's Hospital Oakland Research Institute14, Technische Universität München15, Francis Crick Institute16
TL;DR: It is shown that human disease genes are enriched for essential genes, thus providing a dataset that facilitates the prioritization and validation of mutations identified in clinical sequencing efforts and reveals that incomplete penetrance and variable expressivity are common even on a defined genetic background.
Abstract: Approximately one-third of all mammalian genes are essential for life. Phenotypes resulting from knockouts of these genes in mice have provided tremendous insight into gene function and congenital disorders. As part of the International Mouse Phenotyping Consortium effort to generate and phenotypically characterize 5,000 knockout mouse lines, here we identify 410 lethal genes during the production of the first 1,751 unique gene knockouts. Using a standardized phenotyping platform that incorporates high-resolution 3D imaging, we identify phenotypes at multiple time points for previously uncharacterized genes and additional phenotypes for genes with previously reported mutant phenotypes. Unexpectedly, our analysis reveals that incomplete penetrance and variable expressivity are common even on a defined genetic background. In addition, we show that human disease genes are enriched for essential genes, thus providing a dataset that facilitates the prioritization and validation of mutations identified in clinical sequencing efforts.
••
TL;DR: The microbiome has an important role in human health and unravelling the interactions between the microbiota, the host and pathogenic bacteria will produce strategies for manipulating the microbiota against infectious diseases.
Abstract: The microbiome has an important role in human health. Changes in the microbiota can confer resistance to or promote infection by pathogenic bacteria. Antibiotics have a profound impact on the microbiota that alters the nutritional landscape of the gut and can lead to the expansion of pathogenic populations. Pathogenic bacteria exploit microbiota-derived sources of carbon and nitrogen as nutrients and regulatory signals to promote their own growth and virulence. By eliciting inflammation, these bacteria alter the intestinal environment and use unique systems for respiration and metal acquisition to drive their expansion. Unravelling the interactions between the microbiota, the host and pathogenic bacteria will produce strategies for manipulating the microbiota against infectious diseases.
••
Stellenbosch University1, University of Western Australia2, University of Kiel3, University of Geneva4, Free University of Berlin5, Macedonian Academy of Sciences and Arts6, Slovenian Academy of Sciences and Arts7, University of Nova Gorica8, Academy of Sciences of the Czech Republic9, University of Vienna10, University of Bayreuth11, Complutense University of Madrid12, Masaryk University13, Sapienza University of Rome14, University of Zielona Góra15, University of Münster16, University of Göttingen17, Russian Academy of Sciences18, Slovak Academy of Sciences19, Radboud University Nijmegen20, Wageningen University and Research Centre21, National Academy of Sciences of Ukraine22, University of Lisbon23, University of Vechta24, University of California, Davis25, University of Patras26
TL;DR: This paper features the first comprehensive and critical account of European syntaxa and synthesizes more than 100 yr of classification effort by European phytosociologists.
Abstract: Aims: Vegetation classification consistent with the Braun-Blanquet approach is widely used in Europe for applied vegetation science, conservation planning and land management. During the long history of syntaxonomy, many concepts and names of vegetation units have been proposed, but there has been no single classification system integrating these units. Here we (1) present a comprehensive, hierarchical, syntaxonomic system of alliances, orders and classes of Braun-Blanquet syntaxonomy for vascular plant, bryophyte and lichen, and algal communities of Europe; (2) briefly characterize the accepted syntaxonomic concepts in ecological and geographic terms; (3) link available synonyms to these accepted concepts; and (4) provide a list of diagnostic species for all classes. Location: European mainland, Greenland, Arctic archipelagos (including Iceland, Svalbard, Novaya Zemlya), Canary Islands, Madeira, Azores, Caucasus, Cyprus. Methods: We evaluated approximately 10,000 bibliographic sources to create a comprehensive list of previously proposed syntaxonomic units. These units were evaluated by experts for their floristic and ecological distinctness, clarity of geographic distribution and compliance with the nomenclature code. Accepted units were compiled into three systems of classes, orders and alliances (EuroVegChecklist, EVC) for communities dominated by vascular plants (EVC1), bryophytes and lichens (EVC2) and algae (EVC3). Results: EVC1 includes 109 classes, 300 orders and 1,108 alliances; EVC2 includes 27 classes, 53 orders and 137 alliances; and EVC3 includes 13 classes, 24 orders and 53 alliances. In total, 13,448 taxa were assigned as indicator species to classes of EVC1, 2,087 to classes of EVC2 and 368 to classes of EVC3. Accepted syntaxonomic concepts are summarized in a series of appendices, and detailed information on each is accessible through the software tool EuroVegBrowser. Conclusions: This paper features the first comprehensive and critical account of European syntaxa and synthesizes more than 100 yr of classification effort by European phytosociologists. It aims to document and stabilize the concepts and nomenclature of syntaxa for practical uses, such as calibration of habitat classification used by the European Union, standardization of terminology for environmental assessment, management and conservation of nature areas, landscape planning and education. The presented classification systems provide a baseline for future development and revision of European syntaxonomy.
••
Yasser Iturria-Medina, Roberto C. Sotero, Paule-Joanne Toussaint, José María Mateos-Pérez, and 311 more authors (60 institutions)
TL;DR: Imaging results suggest that intra-brain vascular dysregulation is an early pathological event during disease development, and that cognitive decline is noticeable from initial disease stages, with early memory deficits associated with the primary disease factors.
Abstract: Multifactorial mechanisms underlying late-onset Alzheimer's disease (LOAD) are poorly characterized from an integrative perspective. Here spatiotemporal alterations in brain amyloid-β deposition, metabolism, vascular and functional activity at rest, structural properties, cognitive integrity and peripheral protein levels are characterized in relation to LOAD progression. We analyse over 7,700 brain images and tens of plasma and cerebrospinal fluid biomarkers from the Alzheimer's Disease Neuroimaging Initiative (ADNI). Through a multifactorial data-driven analysis, we obtain dynamic LOAD-abnormality indices for all biomarkers, and a tentative temporal ordering of disease progression. Imaging results suggest that intra-brain vascular dysregulation is an early pathological event during disease development. Cognitive decline is noticeable from initial LOAD stages, suggesting early memory deficits associated with the primary disease factors. High abnormality levels are also observed for specific proteins associated with the vascular system's integrity. Although still subject to the sensitivity of the algorithms and biomarkers employed, our results might contribute to the development of preventive therapeutic interventions.
••
University of Exeter, University of Massachusetts Amherst, Purdue University, University of Washington, Agricultural Research Service, National Park Service, University of California, Davis, University of Michigan, Stanford University, University of California, Irvine, University of Southampton
TL;DR: It is found that one-sixth of the global land surface is highly vulnerable to invasion, including substantial areas in developing economies and biodiversity hotspots, and there is a clear need for proactive invasion strategies in areas with high poverty levels, high biodiversity and low historical levels of invasion.
Abstract: Invasive alien species (IAS) threaten human livelihoods and biodiversity globally. Increasing globalization facilitates IAS arrival, and environmental changes, including climate change, facilitate IAS establishment. Here we provide the first global, spatial analysis of the terrestrial threat from IAS in light of twenty-first century globalization and environmental change, and evaluate national capacities to prevent and manage species invasions. We find that one-sixth of the global land surface is highly vulnerable to invasion, including substantial areas in developing economies and biodiversity hotspots. The dominant invasion vectors differ between high-income countries (imports, particularly of plants and pets) and low-income countries (air travel). Uniting data on the causes of introduction and establishment can improve early-warning and eradication schemes. Most countries have limited capacity to act against invasions. In particular, we reveal a clear need for proactive invasion strategies in areas with high poverty levels, high biodiversity and low historical levels of invasion.
••
TL;DR: This Review describes common principles revealed by maternal immune activation (MIA) models, highlighting recent findings that strengthen their relevance for schizophrenia and autism and that are starting to reveal the molecular mechanisms underlying the effects of MIA on offspring.
Abstract: Epidemiological evidence implicates maternal infection as a risk factor for autism spectrum disorder and schizophrenia. Animal models corroborate this link and demonstrate that maternal immune activation (MIA) alone is sufficient to impart lifelong neuropathology and altered behaviors in offspring. This Review describes common principles revealed by these models, highlighting recent findings that strengthen their relevance for schizophrenia and autism and are starting to reveal the molecular mechanisms underlying the effects of MIA on offspring. The role of MIA as a primer for a much wider range of psychiatric and neurologic disorders is also discussed. Finally, the need for more research in this nascent field and the implications for identifying and developing new treatments for individuals at heightened risk for neuroimmune disorders are considered.
••
Katholieke Universiteit Leuven, University of Bologna, Royal Prince Alfred Hospital, University of California, Davis, University of Milan, Radboud University Nijmegen, Scripps Health, Curie Institute, University of Amsterdam, Medical University of Vienna, Newcastle University, University of Padua, Ludwig Maximilian University of Munich, Virginia Commonwealth University, Utrecht University, Autonomous University of Barcelona, Université de Montréal, University of Birmingham, University Health Network, Erasmus University Rotterdam, University of Gothenburg, Intercept Pharmaceuticals
TL;DR: Obeticholic acid administered with ursodiol or as monotherapy for 12 months in patients with primary biliary cholangitis resulted in decreases from baseline in alkaline phosphatase and total bilirubin levels that differed significantly from the changes observed with placebo.
Abstract: Background: Primary biliary cholangitis (formerly called primary biliary cirrhosis) can progress to cirrhosis and death despite ursodiol therapy. Alkaline phosphatase and bilirubin levels correlate with the risk of liver transplantation or death. Obeticholic acid, a farnesoid X receptor agonist, has shown potential benefit in patients with this disease. Methods: In this 12-month, double-blind, placebo-controlled, phase 3 trial, we randomly assigned 217 patients who had an inadequate response to ursodiol or who found the side effects of ursodiol unacceptable to receive obeticholic acid at a dose of 10 mg (the 10-mg group), obeticholic acid at a dose of 5 mg with adjustment to 10 mg if applicable (the 5–10-mg group), or placebo. The primary end point was an alkaline phosphatase level of less than 1.67 times the upper limit of the normal range, with a reduction of at least 15% from baseline, and a normal total bilirubin level. Results: Of 216 patients who underwent randomization and received at least one dose of obeticholic acid or placebo, 93% received ursodiol as background therapy. The primary end point occurred in more patients in the 5–10-mg group (46%) and the 10-mg group (47%) than in the placebo group (10%; P ). Conclusions: Obeticholic acid administered with ursodiol or as monotherapy for 12 months in patients with primary biliary cholangitis resulted in decreases from baseline in alkaline phosphatase and total bilirubin levels that differed significantly from the changes observed with placebo. There were more serious adverse events with obeticholic acid. (Funded by Intercept Pharmaceuticals; POISE ClinicalTrials.gov number, NCT01473524; Current Controlled Trials number, ISRCTN89514817.)
••
University of Leeds, University of California, San Diego, University of Washington, Hospital Italiano de Buenos Aires, University of Southern California, Geneva College, Toronto Western Hospital, University of Cagliari, LSU Health Sciences Center New Orleans, University College Dublin, University of Toronto, Tufts Medical Center, Cleveland Clinic, University of Iceland, University of Molise, Royal National Hospital for Rheumatic Diseases, University of Queensland, University of Pennsylvania, Johns Hopkins University School of Medicine, Chapel Allerton Hospital, Hospital for Special Surgery, University of California, Davis, University of Rochester Medical Center
TL;DR: To update the 2009 Group for Research and Assessment of Psoriasis and Psoriatic Arthritis (GRAPPA) treatment recommendations for the spectrum of manifestations affecting patients with psoriatic arthritis (PsA).
Abstract: Objective: To update the 2009 Group for Research and Assessment of Psoriasis and Psoriatic Arthritis (GRAPPA) treatment recommendations for the spectrum of manifestations affecting patients with psoriatic arthritis (PsA). Methods: GRAPPA rheumatologists, dermatologists, and PsA patients drafted overarching principles for the management of PsA, based on consensus achieved at face-to-face meetings and via online surveys. We conducted literature reviews regarding treatment for the key domains of PsA (arthritis, spondylitis, enthesitis, dactylitis, skin disease, and nail disease) and convened a new group to identify pertinent comorbidities and their effect on treatment. Finally, we drafted treatment recommendations for each of the clinical manifestations and assessed the level of agreement for the overarching principles and treatment recommendations among GRAPPA members, using an online questionnaire. Results: Six overarching principles had ≥80% agreement among both health care professionals (n = 135) and patient research partners (n = 10). We developed treatment recommendations and a schema incorporating these principles for arthritis, spondylitis, enthesitis, dactylitis, skin disease, nail disease, and comorbidities in the setting of PsA, using the Grading of Recommendations, Assessment, Development and Evaluation process. Agreement of >80% was reached for approval of the individual recommendations and the overall schema. Conclusion: We present overarching principles and updated treatment recommendations for the key manifestations of PsA, including related comorbidities, based on a literature review and consensus of GRAPPA members (rheumatologists, dermatologists, other health care providers, and patient research partners). Further updates are anticipated as the therapeutic landscape in PsA evolves.
••
Vardan Khachatryan, Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam, and 2,283 more authors (141 institutions)
TL;DR: Combined fits to CMS UE proton–proton data at √s = 7 TeV and to UE proton–antiproton data from the CDF experiment at lower √s are used to study the UE models and constrain their parameters, thereby providing improved predictions for proton–proton collisions at 13 TeV.
Abstract: New sets of parameters ("tunes") for the underlying-event (UE) modeling of the PYTHIA8, PYTHIA6 and HERWIG++ Monte Carlo event generators are constructed using different parton distribution functions. Combined fits to CMS UE data at sqrt(s) = 7 TeV and to UE data from the CDF experiment at lower sqrt(s), are used to study the UE models and constrain their parameters, providing thereby improved predictions for proton-proton collisions at 13 TeV. In addition, it is investigated whether the values of the parameters obtained from fits to UE observables are consistent with the values determined from fitting observables sensitive to double-parton scattering processes. Finally, comparisons of the UE tunes to "minimum bias" (MB) events, multijet, and Drell-Yan (q q-bar to Z / gamma* to lepton-antilepton + jets) observables at 7 and 8 TeV are presented, as well as predictions of MB and UE observables at 13 TeV.
••
Harvard University, University of Michigan, VU University Amsterdam, Boston University, Katholieke Universiteit Leuven, University of California, Davis, University of São Paulo, Universidade Nova de Lisboa, University College Hospital, Ibadan, University of Barcelona, University of Balamand, Wrocław Medical University, EHESP, The Chinese University of Hong Kong, Park Centre for Mental Health, Ulster University, University of Otago, Utrecht University, Center for Excellence in Education
TL;DR: Mental disorders are common among college students and are typically untreated; their onsets mostly occur prior to college entry, and pre-matriculation disorders are associated with college attrition.
Abstract: Background Although mental disorders are significant predictors of educational attainment throughout the entire educational career, most research on mental disorders among students has focused on the primary and secondary school years. Method The World Health Organization World Mental Health Surveys were used to examine the associations of mental disorders with college entry and attrition by comparing college students (n = 1572) and non-students in the same age range (18–22 years; n = 4178), including non-students who recently left college without graduating (n = 702) based on surveys in 21 countries (four low/lower-middle income, five upper-middle-income, one lower-middle or upper-middle at the times of two different surveys, and 11 high income). Lifetime and 12-month prevalence and age-of-onset of DSM-IV anxiety, mood, behavioral and substance disorders were assessed with the Composite International Diagnostic Interview (CIDI). Results One-fifth (20.3%) of college students had 12-month DSM-IV/CIDI disorders; 83.1% of these cases had pre-matriculation onsets. Disorders with pre-matriculation onsets were more important than those with post-matriculation onsets in predicting subsequent college attrition, with substance disorders and, among women, major depression the most important such disorders. Only 16.4% of students with 12-month disorders received any 12-month healthcare treatment for their mental disorders. Conclusions Mental disorders are common among college students, have onsets that mostly occur prior to college entry, in the case of pre-matriculation disorders are associated with college attrition, and are typically untreated. Detection and effective treatment of these disorders early in the college career might reduce attrition and improve educational and psychosocial functioning.
••
Duke University, University at Buffalo, University of Miami, Catholic University of the Sacred Heart, University of Bergen, Autonomous University of Barcelona, Northwestern University, University of Erlangen-Nuremberg, University of Tübingen, Kyushu University, University of Western Ontario, University of Oxford, University of California, Davis, Leiden University, Harvard University
TL;DR: An international formal consensus of MG experts intended to be a guide for clinicians caring for patients with MG worldwide is developed.
Abstract: Objective: To develop formal consensus-based guidance for the management of myasthenia gravis (MG). Methods: In October 2013, the Myasthenia Gravis Foundation of America appointed a Task Force to develop treatment guidance for MG, and a panel of 15 international experts was convened. The RAND/UCLA appropriateness methodology was used to develop consensus guidance statements. Definitions were developed for goals of treatment, minimal manifestations, remission, ocular MG, impending crisis, crisis, and refractory MG. An in-person panel meeting then determined 7 treatment topics to be addressed. Initial guidance statements were developed from literature summaries. Three rounds of anonymous e-mail votes were used to attain consensus on guidance statements modified on the basis of panel input. Results: Guidance statements were developed for symptomatic and immunosuppressive treatments, IV immunoglobulin and plasma exchange, management of impending and manifest myasthenic crisis, thymectomy, juvenile MG, MG associated with antibodies to muscle-specific tyrosine kinase, and MG in pregnancy. Conclusion: This is an international formal consensus of MG experts intended to be a guide for clinicians caring for patients with MG worldwide.
••
University of Georgia, University of Brasília, United States Department of Agriculture, University of California, Davis, National Center for Genome Resources, Iowa State University, Empresa Brasileira de Pesquisa Agropecuária, Texas A&M University, Texas Tech University, International Crops Research Institute for the Semi-Arid Tropics, International Potato Center, Crops Research Institute, North Carolina State University
TL;DR: The genome sequences of cultivated peanut's diploid ancestors are reported and shown to be similar to its A and B subgenomes; they are used to identify candidate disease resistance genes, to guide tetraploid transcript assemblies and to detect genetic exchange between cultivated peanut's subgenomes.
Abstract: Cultivated peanut (Arachis hypogaea) is an allotetraploid with closely related subgenomes of a total size of ∼2.7 Gb. This makes the assembly of chromosomal pseudomolecules very challenging. As a foundation to understanding the genome of cultivated peanut, we report the genome sequences of its diploid ancestors (Arachis duranensis and Arachis ipaensis). We show that these genomes are similar to cultivated peanut's A and B subgenomes and use them to identify candidate disease resistance genes, to guide tetraploid transcript assemblies and to detect genetic exchange between cultivated peanut's subgenomes. On the basis of remarkably high DNA identity of the A. ipaensis genome and the B subgenome of cultivated peanut and biogeographic evidence, we conclude that A. ipaensis may be a direct descendant of the same population that contributed the B subgenome to cultivated peanut.
••
Katholieke Universiteit Leuven, University of Texas MD Anderson Cancer Center, Johns Hopkins University, University of Texas Health Science Center at San Antonio, University of Cologne, University of Manitoba, Sheba Medical Center, Université libre de Bruxelles, University of Alabama at Birmingham, Tel Aviv University, University of Würzburg, University of Strasbourg, University of Liverpool, München Klinik Bogenhausen, Catholic University of Korea, Necker-Enfants Malades Hospital, University of Minnesota, Technion – Israel Institute of Technology, University of California, Davis, Center for Global Development
TL;DR: The results, which demonstrated non-inferiority, support the use of isavuconazole for the primary treatment of patients with invasive mould disease.
••
University of New England (Australia), Spanish National Research Council, National University of Río Negro, University of Reading, Plant & Food Research, Rutgers University, Commonwealth Scientific and Industrial Research Organisation, University of Queensland, Australian Bureau of Agricultural and Resource Economics, Lund University, Swedish University of Agricultural Sciences, University of California, Davis, University of Brasília, University of Lisbon, Naturalis, National University of Tucumán, University of Koblenz and Landau, Federal University of Ceará, United Nations, ETH Zurich, Federal University of Bahia, University of Giessen, University of Freiburg, Wageningen University and Research Centre, American Museum of Natural History, Hebrew University of Jerusalem, University of Bern, Royal Holloway, University of London, Trinity College, Dublin, Jagiellonian University, University of Agriculture, Faisalabad, Universidad de las Américas Puebla
TL;DR: It is shown that non-bee insect pollinators play a significant role in global crop production and respond differently than bees to landscape structure, probably making their crop pollination services more robust to changes in land use.
Abstract: Wild and managed bees are well documented as effective pollinators of global crops of economic importance. However, the contributions by pollinators other than bees have been little explored despite their potential to contribute to crop production and stability in the face of environmental change. Non-bee pollinators include flies, beetles, moths, butterflies, wasps, ants, birds, and bats, among others. Here we focus on non-bee insects and synthesize 39 field studies from five continents that directly measured the crop pollination services provided by non-bees, honey bees, and other bees to compare the relative contributions of these taxa. Non-bees performed 25–50% of the total number of flower visits. Although non-bees were less effective pollinators than bees per flower visit, they made more visits; thus these two factors compensated for each other, resulting in pollination services rendered by non-bees that were similar to those provided by bees. In the subset of studies that measured fruit set, fruit set increased with non-bee insect visits independently of bee visitation rates, indicating that non-bee insects provide a unique benefit that is not provided by bees. We also show that non-bee insects are not as reliant as bees on the presence of remnant natural or seminatural habitat in the surrounding landscape. These results strongly suggest that non-bee insect pollinators play a significant role in global crop production and respond differently than bees to landscape structure, probably making their crop pollination services more robust to changes in land use. Non-bee insects provide a valuable service and provide potential insurance against bee population declines.
••
Smithsonian Tropical Research Institute, Texas A&M University at Galveston, University of Florida, National University of Colombia, Florida International University, University of Nevada, Reno, Florida State University, Scripps Institution of Oceanography, United States Geological Survey, University of California, Riverside, Federal Fluminense University, Rutgers University, University of Iowa, Universidade Federal de Minas Gerais, Hamilton College, University of California, Berkeley, Texas A&M University, Natural History Museum, Woods Hole Oceanographic Institution, National Museum of Natural History, Washington and Lee University, University of California, Davis
TL;DR: An exhaustive review and reanalysis of geological, paleontological, and molecular records converge upon a cohesive narrative of gradually emerging land and constricting seaways, with formation of the Isthmus of Panama sensu stricto around 2.8 Ma.
Abstract: The formation of the Isthmus of Panama stands as one of the greatest natural events of the Cenozoic, driving profound biotic transformations on land and in the oceans. Some recent studies suggest that the Isthmus formed many millions of years earlier than the widely recognized age of approximately 3 million years ago (Ma), a result that if true would revolutionize our understanding of environmental, ecological, and evolutionary change across the Americas. To bring clarity to the question of when the Isthmus of Panama formed, we provide an exhaustive review and reanalysis of geological, paleontological, and molecular records. These independent lines of evidence converge upon a cohesive narrative of gradually emerging land and constricting seaways, with formation of the Isthmus of Panama sensu stricto around 2.8 Ma. The evidence used to support an older isthmus is inconclusive, and we caution against the uncritical acceptance of an isthmus before the Pliocene.
••
TL;DR: It is recommended that future research efforts focus more strongly on a causal understanding of why tree species classification approaches work under certain conditions or – perhaps more importantly – why they do not work in other cases, even though this might require more complex field acquisitions than those typically used in the reviewed studies.