scispace - formally typeset

Showing papers by "University of California, Davis" published in 2016


Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Monique Arnaud3, M. Ashdown4  +334 moreInstitutions (82)
TL;DR: In this article, the authors present a cosmological analysis based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation.
Abstract: This paper presents cosmological results based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation. Our results are in very good agreement with the 2013 analysis of the Planck nominal-mission temperature data, but with increased precision. The temperature and polarization power spectra are consistent with the standard spatially-flat 6-parameter ΛCDM cosmology with a power-law spectrum of adiabatic scalar perturbations (denoted “base ΛCDM” in this paper). From the Planck temperature data combined with Planck lensing, for this cosmology we find a Hubble constant, H0 = (67.8 ± 0.9) km s^-1 Mpc^-1, a matter density parameter Ωm = 0.308 ± 0.012, and a tilted scalar spectral index with ns = 0.968 ± 0.006, consistent with the 2013 analysis. Note that in this abstract we quote 68% confidence limits on measured parameters and 95% upper limits on other parameters. We present the first results of polarization measurements with the Low Frequency Instrument at large angular scales. Combined with the Planck temperature and lensing data, these measurements give a reionization optical depth of τ = 0.066 ± 0.016, corresponding to a reionization redshift of . These results are consistent with those from WMAP polarization measurements cleaned for dust emission using 353-GHz polarization maps from the High Frequency Instrument. We find no evidence for any departure from base ΛCDM in the neutrino sector of the theory; for example, combining Planck observations with other astrophysical data we find Neff = 3.15 ± 0.23 for the effective number of relativistic degrees of freedom, consistent with the value Neff = 3.046 of the Standard Model of particle physics. The sum of neutrino masses is constrained to ∑ mν < 0.23 eV. The spatial curvature of our Universe is found to be very close to zero, with | ΩK | < 0.005.
Adding a tensor component as a single-parameter extension to base ΛCDM we find an upper limit on the tensor-to-scalar ratio of r0.002 < 0.11, consistent with the Planck 2013 results and consistent with the B-mode polarization constraints from a joint analysis of BICEP2, Keck Array, and Planck (BKP) data. Adding the BKP B-mode data to our analysis leads to a tighter constraint of r0.002 < 0.09 and disfavours inflationary models with a V(φ) ∝ φ2 potential. The addition of Planck polarization data leads to strong constraints on deviations from a purely adiabatic spectrum of fluctuations. We find no evidence for any contribution from isocurvature perturbations or from cosmic defects. Combining Planck data with other astrophysical data, including Type Ia supernovae, the equation of state of dark energy is constrained to w = −1.006 ± 0.045, consistent with the expected value for a cosmological constant. The standard big bang nucleosynthesis predictions for the helium and deuterium abundances for the best-fit Planck base ΛCDM cosmology are in excellent agreement with observations. We also present constraints on annihilating dark matter and on possible deviations from the standard recombination history. In neither case do we find evidence for new physics. The Planck results for base ΛCDM are in good agreement with baryon acoustic oscillation data and with the JLA sample of Type Ia supernovae. However, as in the 2013 analysis, the amplitude of the fluctuation spectrum is found to be higher than inferred from some analyses of rich cluster counts and weak gravitational lensing. We show that these tensions cannot easily be resolved with simple modifications of the base ΛCDM cosmology. Apart from these tensions, the base ΛCDM cosmology provides an excellent description of the Planck CMB observations and many other astrophysical data sets.

10,728 citations


Journal ArticleDOI
Daniel J. Klionsky1, Kotb Abdelmohsen2, Akihisa Abe3, Joynal Abedin4  +2519 moreInstitutions (695)
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target more than one autophagy-related protein by gene knockout or RNA interference. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, implying that not all Atg proteins can be used as specific markers for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.

5,187 citations


Journal ArticleDOI
TL;DR: These and other strategies are providing researchers and clinicians a variety of tools to probe genomes in greater depth, leading to an enhanced understanding of how genome sequence variants underlie phenotype and disease.
Abstract: Since the completion of the human genome project in 2003, extraordinary progress has been made in genome sequencing technologies, which has led to a decreased cost per megabase and an increase in the number and diversity of sequenced genomes. An astonishing complexity of genome architecture has been revealed, in turn spurring further advances in these sequencing technologies. Some approaches maximize the number of bases sequenced in the least amount of time, generating a wealth of data that can be used to understand increasingly complex phenotypes. Alternatively, other approaches now aim to sequence longer contiguous pieces of DNA, which are essential for resolving structurally complex regions. These and other strategies are providing researchers and clinicians a variety of tools to probe genomes in greater depth, leading to an enhanced understanding of how genome sequence variants underlie phenotype and disease.

3,096 citations


Journal ArticleDOI
TL;DR: These recommendations address the best approaches for antibiotic stewardship programs to influence the optimal use of antibiotics.
Abstract: Evidence-based guidelines for implementation and measurement of antibiotic stewardship interventions in inpatient populations including long-term care were prepared by a multidisciplinary expert panel of the Infectious Diseases Society of America and the Society for Healthcare Epidemiology of America. The panel included clinicians and investigators representing internal medicine, emergency medicine, microbiology, critical care, surgery, epidemiology, pharmacy, and adult and pediatric infectious diseases specialties. These recommendations address the best approaches for antibiotic stewardship programs to influence the optimal use of antibiotics.

1,969 citations


Journal ArticleDOI
TL;DR: IDSA considers adherence to these guidelines to be voluntary, with the ultimate determination regarding their application to be made by the physician in the light of each patient's individual circumstances.
Abstract: It is important to realize that guidelines cannot always account for individual variation among patients. They are not intended to supplant physician judgment with respect to particular patients or special clinical situations. IDSA considers adherence to these guidelines to be voluntary, with the ultimate determination regarding their application to be made by the physician in the light of each patient's individual circumstances.

1,745 citations


Journal ArticleDOI
TL;DR: Venous thromboembolism is a complex disease, involving interactions between acquired or inherited predispositions to thrombosis and VTE risk factors, including increasing patient age and obesity, hospitalization for surgery or acute illness, nursing-home confinement, active cancer, trauma or fracture, immobility or leg paresis, superficial vein thrombosis, and, in women, pregnancy and puerperium.
Abstract: Venous thromboembolism (VTE) is categorized by the U.S. Surgeon General as a major public health problem. VTE is relatively common and associated with reduced survival and substantial health-care costs, and recurs frequently. VTE is a complex (multifactorial) disease, involving interactions between acquired or inherited predispositions to thrombosis and VTE risk factors, including increasing patient age and obesity, hospitalization for surgery or acute illness, nursing-home confinement, active cancer, trauma or fracture, immobility or leg paresis, superficial vein thrombosis, and, in women, pregnancy and puerperium, oral contraception, and hormone therapy. Although independent VTE risk factors and predictors of VTE recurrence have been identified, and effective primary and secondary prophylaxis is available, the occurrence of VTE seems to be relatively constant, or even increasing.

1,548 citations


Journal ArticleDOI
TL;DR: The open-source FALCON and FALCON-Unzip algorithms are introduced to assemble long-read sequencing data into highly accurate, contiguous, and correctly phased diploid genomes.
Abstract: While genome assembly projects have been successful in many haploid and inbred species, the assembly of noninbred or rearranged heterozygous genomes remains a major challenge. To address this challenge, we introduce the open-source FALCON and FALCON-Unzip algorithms (https://github.com/PacificBiosciences/FALCON/) to assemble long-read sequencing data into highly accurate, contiguous, and correctly phased diploid genomes. We generate new reference sequences for heterozygous samples including an F1 hybrid of Arabidopsis thaliana, the widely cultivated Vitis vinifera cv. Cabernet Sauvignon, and the coral fungus Clavicorona pyxidata, samples that have challenged short-read assembly approaches. The FALCON-based assemblies are substantially more contiguous and complete than alternate short- or long-read approaches. The phased diploid assembly enabled the study of haplotype structure and heterozygosities between homologous chromosomes, including the identification of widespread heterozygous structural variation within coding sequences.

1,490 citations


Journal ArticleDOI
26 Jul 2016-eLife
TL;DR: The height differential between the tallest and shortest populations was 19–20 cm a century ago; a century later it has remained the same for women but increased for men, despite substantial changes in the ranking of countries.
Abstract: Being taller is associated with enhanced longevity, higher education, and higher earnings. We reanalysed 1472 population-based studies, with measurement of height on more than 18.6 million participants to estimate mean height for people born between 1896 and 1996 in 200 countries. The largest gain in adult height over the past century has occurred in South Korean women and Iranian men, who became 20.2 cm (95% credible interval 17.5–22.7) and 16.5 cm (13.3–19.7) taller, respectively. In contrast, there was little change in adult height in some sub-Saharan African countries and in South Asia over the century of analysis. The tallest people over these 100 years are men born in the Netherlands in the last quarter of the 20th century, whose average heights surpassed 182.5 cm, and the shortest were women born in Guatemala in 1896 (140.3 cm; 135.8–144.8). The height differential between the tallest and shortest populations was 19–20 cm a century ago; a century later it has remained the same for women but increased for men, despite substantial changes in the ranking of countries.

1,348 citations


Journal ArticleDOI
03 May 2016-JAMA
TL;DR: In the United States in 2010-2011, there was an estimated annual antibiotic prescription rate per 1000 population of 506, but only an estimated 353 antibiotic prescriptions were likely appropriate, supporting the need for establishing a goal for outpatient antibiotic stewardship.
Abstract: Importance: The National Action Plan for Combating Antibiotic-Resistant Bacteria set a goal of reducing inappropriate outpatient antibiotic use by 50% by 2020, but the extent of inappropriate outpatient antibiotic use is unknown. Objective: To estimate the rates of outpatient oral antibiotic prescribing by age and diagnosis, and the estimated proportions of antibiotic use that may be inappropriate in adults and children in the United States. Design, Setting, and Participants: Using the 2010-2011 National Ambulatory Medical Care Survey and National Hospital Ambulatory Medical Care Survey, annual numbers and population-adjusted rates with 95% confidence intervals of ambulatory visits with oral antibiotic prescriptions by age, region, and diagnosis in the United States were estimated. Exposures: Ambulatory care visits. Main Outcomes and Measures: Based on national guidelines and regional variation in prescribing, diagnosis-specific prevalence and rates of total and appropriate antibiotic prescriptions were determined. These rates were combined to calculate an estimate of the appropriate annual rate of antibiotic prescriptions per 1000 population. Results: Of the 184 032 sampled visits, 12.6% of visits (95% CI, 12.0%-13.3%) resulted in antibiotic prescriptions. Sinusitis was the single diagnosis associated with the most antibiotic prescriptions per 1000 population (56 antibiotic prescriptions [95% CI, 48-64]), followed by suppurative otitis media (47 antibiotic prescriptions [95% CI, 41-54]), and pharyngitis (43 antibiotic prescriptions [95% CI, 38-49]). Collectively, acute respiratory conditions per 1000 population led to 221 antibiotic prescriptions (95% CI, 198-245) annually, but only 111 antibiotic prescriptions were estimated to be appropriate for these conditions.
Per 1000 population, among all conditions and ages combined in 2010-2011, an estimated 506 antibiotic prescriptions (95% CI, 458-554) were written annually, and, of these, 353 antibiotic prescriptions were estimated to be appropriate antibiotic prescriptions. Conclusions and Relevance: In the United States in 2010-2011, there was an estimated annual antibiotic prescription rate per 1000 population of 506, but only an estimated 353 antibiotic prescriptions were likely appropriate, supporting the need for establishing a goal for outpatient antibiotic stewardship.
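The arithmetic behind the stewardship goal can be sketched from the abstract's headline figures. The 50% reduction target applies to the inappropriate portion of use; everything beyond the two published rates (506 and 353 per 1000 population) is illustrative:

```python
# Headline figures from the abstract (prescriptions per 1000 population per year).
total_rate = 506        # all outpatient oral antibiotic prescriptions
appropriate_rate = 353  # prescriptions estimated to be appropriate

# Inappropriate use is the remainder, and its share of all prescribing.
inappropriate_rate = total_rate - appropriate_rate
inappropriate_share = inappropriate_rate / total_rate

# The National Action Plan goal: cut inappropriate outpatient use by 50%.
# Hypothetical overall rate if that goal were met exactly.
target_rate = appropriate_rate + 0.5 * inappropriate_rate

print(f"inappropriate: {inappropriate_rate}/1000 ({inappropriate_share:.0%} of all prescriptions)")
print(f"overall rate if the 50% goal were met: {target_rate:.1f}/1000")
```

This makes the paper's point concrete: roughly three in ten outpatient antibiotic prescriptions were estimated to be unnecessary.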

1,162 citations


Journal ArticleDOI
Kurt Lejaeghere1, Gustav Bihlmayer2, Torbjörn Björkman3, Torbjörn Björkman4, Peter Blaha5, Stefan Blügel2, Volker Blum6, Damien Caliste7, Ivano E. Castelli8, Stewart J. Clark9, Andrea Dal Corso10, Stefano de Gironcoli10, Thierry Deutsch7, J. K. Dewhurst11, Igor Di Marco12, Claudia Draxl13, Claudia Draxl14, Marcin Dulak15, Olle Eriksson12, José A. Flores-Livas11, Kevin F. Garrity16, Luigi Genovese7, Paolo Giannozzi17, Matteo Giantomassi18, Stefan Goedecker19, Xavier Gonze18, Oscar Grånäs12, Oscar Grånäs20, E. K. U. Gross11, Andris Gulans14, Andris Gulans13, Francois Gygi21, D. R. Hamann22, P. J. Hasnip23, Natalie Holzwarth24, Diana Iusan12, Dominik B. Jochym25, F. Jollet, Daniel M. Jones26, Georg Kresse27, Klaus Koepernik28, Klaus Koepernik29, Emine Kucukbenli10, Emine Kucukbenli8, Yaroslav Kvashnin12, Inka L. M. Locht30, Inka L. M. Locht12, Sven Lubeck13, Martijn Marsman27, Nicola Marzari8, Ulrike Nitzsche29, Lars Nordström12, Taisuke Ozaki31, Lorenzo Paulatto32, Chris J. Pickard33, Ward Poelmans1, Matt Probert23, Keith Refson25, Keith Refson34, Manuel Richter28, Manuel Richter29, Gian-Marco Rignanese18, Santanu Saha19, Matthias Scheffler14, Matthias Scheffler35, Martin Schlipf21, Karlheinz Schwarz5, Sangeeta Sharma11, Francesca Tavazza16, Patrik Thunström5, Alexandre Tkatchenko14, Alexandre Tkatchenko36, Marc Torrent, David Vanderbilt22, Michiel van Setten18, Veronique Van Speybroeck1, John M. Wills37, Jonathan R. Yates26, Guo-Xu Zhang38, Stefaan Cottenier1 
25 Mar 2016-Science
TL;DR: A procedure to assess the precision of DFT methods was devised and used to demonstrate reproducibility among many of the most widely used DFT codes, demonstrating that the precision of DFT implementations can be determined, even in the absence of one absolute reference code.
Abstract: The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements.
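The pairwise comparison described above is, in the published protocol, a Δ ("delta") value computed from fitted Birch-Murnaghan equations of state. A simplified sketch of the idea, using an RMS energy difference over a volume grid and two purely illustrative parabolic E(V) curves in place of real DFT output, might look like:

```python
import numpy as np

def delta_gauge(E1, E2, V):
    """RMS energy difference (per atom) of two E(V) curves sampled on grid V.
    Simplified stand-in for the Birch-Murnaghan-based Delta of the study."""
    return np.sqrt(np.mean((E1(V) - E2(V)) ** 2))

V0 = 20.0                                   # illustrative equilibrium volume, Å^3/atom
V = np.linspace(0.94 * V0, 1.06 * V0, 100)  # ±6% volume window around V0

# Two "codes" whose equations of state differ by a slightly shifted minimum.
code_a = lambda v: 0.05 * (v - V0) ** 2           # eV/atom, illustrative curvature
code_b = lambda v: 0.05 * (v - 1.001 * V0) ** 2   # minimum shifted by 0.1%

print(f"Δ = {delta_gauge(code_a, code_b, V) * 1000:.3f} meV/atom")
```

The paper's conclusion can be read in these units: recent codes and pseudopotentials agree to within a few meV/atom, comparable to differences between high-precision experiments.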

1,141 citations


Journal ArticleDOI
TL;DR: This Review provides a comprehensive discussion of RADseq methods to aid researchers in choosing among the many different approaches and avoiding erroneous scientific conclusions from RADseq data, a problem that has plagued other genetic marker types in the past.
Abstract: High-throughput techniques based on restriction site-associated DNA sequencing (RADseq) are enabling the low-cost discovery and genotyping of thousands of genetic markers for any species, including non-model organisms, which is revolutionizing ecological, evolutionary and conservation genetics. Technical differences among these methods lead to important considerations for all steps of genomics studies, from the specific scientific questions that can be addressed, and the costs of library preparation and sequencing, to the types of bias and error inherent in the resulting data. In this Review, we provide a comprehensive discussion of RADseq methods to aid researchers in choosing among the many different approaches and avoiding erroneous scientific conclusions from RADseq data, a problem that has plagued other genetic marker types in the past.

Journal ArticleDOI
TL;DR: In this article, the authors provide an overview of FDA, starting with simple statistical notions such as mean and covariance functions, then covering some core techniques, the most popular of which is functional principal component analysis (FPCA).
Abstract: With the advance of modern technology, more and more data are being recorded continuously during a time interval or intermittently at several discrete time points. These are both examples of functional data, which has become a commonly encountered type of data. Functional data analysis (FDA) encompasses the statistical methodology for such data. Broadly interpreted, FDA deals with the analysis and theory of data that are in the form of functions. This paper provides an overview of FDA, starting with simple statistical notions such as mean and covariance functions, then covering some core techniques, the most popular of which is functional principal component analysis (FPCA). FPCA is an important dimension reduction tool, and in sparse data situations it can be used to impute functional data that are sparsely observed. Other dimension reduction approaches are also discussed. In addition, we review another core technique, functional linear regression, as well as clustering and classification of functional data.
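For densely observed curves on a common grid, the FPCA pipeline the overview describes (estimate the mean function, the covariance surface, and its eigenfunctions, then project each curve onto the leading eigenfunctions) reduces to a matrix eigenproblem. A sketch with simulated data (all curves and basis functions here are illustrative; sparse designs need the smoothing-based estimators the paper discusses):

```python
import numpy as np

rng = np.random.default_rng(0)

# Each row of X is one curve observed on a shared time grid t.
t = np.linspace(0, 1, 50)
n = 200
# Simulated sample: random loadings on two basis functions plus noise.
scores_true = rng.normal(size=(n, 2)) * np.array([2.0, 0.7])
X = (scores_true[:, :1] * np.sin(2 * np.pi * t)
     + scores_true[:, 1:] * np.cos(2 * np.pi * t)
     + 0.05 * rng.normal(size=(n, t.size)))

mu = X.mean(axis=0)                       # estimated mean function
C = np.cov(X - mu, rowvar=False)          # discretized covariance surface
evals, evecs = np.linalg.eigh(C)          # eigenvectors approximate eigenfunctions
evals, evecs = evals[::-1], evecs[:, ::-1] # sort by decreasing eigenvalue

fve = evals[:2].sum() / evals.sum()       # fraction of variance explained by 2 FPCs
pc_scores = (X - mu) @ evecs[:, :2]       # FPC scores: each curve as 2 numbers

print(f"variance explained by first two components: {fve:.1%}")
```

The `pc_scores` matrix is the dimension reduction the abstract refers to: each 50-point curve is summarized by two principal component scores.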

Journal ArticleDOI
TL;DR: This review covers technical aspects of tES as well as applications such as exploration of brain physiology, modelling approaches, tES in the cognitive neurosciences, and interventional approaches, helping the reader to appropriately design and conduct studies involving these brain stimulation techniques.

Journal ArticleDOI
Nabila Aghanim1, Monique Arnaud2, M. Ashdown3, J. Aumont1  +291 moreInstitutions (73)
TL;DR: In this article, the authors present the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties.
Abstract: This paper presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (l< 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, n_s, is confirmed smaller than unity at more than 5σ from Planck alone.
We further validate the robustness of the likelihood results against specific extensions to the baseline cosmology, which are particularly sensitive to data at high multipoles. For instance, the effective number of neutrino species remains compatible with the canonical value of 3.046. For this first detailed analysis of Planck polarization spectra, we concentrate at high multipoles on the E modes, leaving the analysis of the weaker B modes to future work. At low multipoles we use temperature maps at all Planck frequencies along with a subset of polarization data. These data take advantage of Planck’s wide frequency coverage to improve the separation of CMB and foreground emission. Within the baseline ΛCDM cosmology this requires τ = 0.078 ± 0.019 for the reionization optical depth, which is significantly lower than estimates without the use of high-frequency data for explicit monitoring of dust emission. At high multipoles we detect residual systematic errors in E polarization, typically at the μK^2 level; we therefore choose to retain temperature information alone for high multipoles as the recommended baseline, in particular for testing non-minimal models. Nevertheless, the high-multipole polarization spectra from Planck are already good enough to enable a separate high-precision determination of the parameters of the ΛCDM model, showing consistency with those established independently from temperature information alone.

Journal ArticleDOI
Mary E. Dickinson, Ann M. Flenniken, Xiao Ji1, Lydia Teboul2, Michael D. Wong, Jacqueline K. White3, Terrence F. Meehan4, Wolfgang Weninger5, Henrik Westerberg2, Hibret A. Adissu6, Candice N. Baker, Lynette Bower7, James M. Brown2, L. Brianna Caddle, Francesco Chiani8, Dave Clary7, James Cleak2, Mark J. Daly9, James M. Denegre, Brendan Doe3, Mary E. Dolan, Sarah M. Edie, Helmut Fuchs, Valerie Gailus-Durner, Antonella Galli3, Alessia Gambadoro8, Juan Gallegos10, Shiying Guo11, Neil R. Horner2, Chih-Wei Hsu, Sara Johnson2, Sowmya Kalaga, Lance C. Keith, Louise Lanoue7, Thomas N. Lawson2, Monkol Lek12, Monkol Lek9, Manuel Mark13, Susan Marschall, Jeremy Mason4, Melissa L. McElwee, Susan Newbigging6, Lauryl M. J. Nutter6, Kevin A. Peterson, Ramiro Ramirez-Solis3, Douglas J. Rowland7, Edward Ryder3, Kaitlin E. Samocha9, Kaitlin E. Samocha12, John R. Seavitt10, Mohammed Selloum13, Zsombor Szoke-Kovacs2, Masaru Tamura, Amanda G. Trainor7, Ilinca Tudose4, Shigeharu Wakana, Jonathan Warren4, Olivia Wendling13, David B. West14, Leeyean Wong, Atsushi Yoshiki, Daniel G. MacArthur9, Daniel G. MacArthur12, Glauco P. Tocchini-Valentini8, Xiang Gao11, Paul Flicek4, Allan Bradley3, William C. Skarnes3, Monica J. Justice, Helen Parkinson4, Mark W. Moore, Sara Wells2, Robert E. Braun, Karen L. Svenson, Martin Hrabé de Angelis15, Yann Herault13, Timothy J. Mohun16, Ann-Marie Mallon2, R. Mark Henkelman, Steve D.M. Brown2, David J. Adams3, Kevin C K Lloyd7, Colin McKerlie6, Arthur L. Beaudet10, Maja Bucan1, Stephen A. Murray 
22 Sep 2016-Nature
TL;DR: It is shown that human disease genes are enriched for essential genes, providing a dataset that facilitates the prioritization and validation of mutations identified in clinical sequencing efforts; the analysis also reveals that incomplete penetrance and variable expressivity are common even on a defined genetic background.
Abstract: Approximately one-third of all mammalian genes are essential for life. Phenotypes resulting from knockouts of these genes in mice have provided tremendous insight into gene function and congenital disorders. As part of the International Mouse Phenotyping Consortium effort to generate and phenotypically characterize 5,000 knockout mouse lines, here we identify 410 lethal genes during the production of the first 1,751 unique gene knockouts. Using a standardized phenotyping platform that incorporates high-resolution 3D imaging, we identify phenotypes at multiple time points for previously uncharacterized genes and additional phenotypes for genes with previously reported mutant phenotypes. Unexpectedly, our analysis reveals that incomplete penetrance and variable expressivity are common even on a defined genetic background. In addition, we show that human disease genes are enriched for essential genes, thus providing a dataset that facilitates the prioritization and validation of mutations identified in clinical sequencing efforts.

Journal ArticleDOI
06 Jul 2016-Nature
TL;DR: The microbiome has an important role in human health and unravelling the interactions between the microbiota, the host and pathogenic bacteria will produce strategies for manipulating the microbiota against infectious diseases.
Abstract: The microbiome has an important role in human health. Changes in the microbiota can confer resistance to or promote infection by pathogenic bacteria. Antibiotics have a profound impact on the microbiota that alters the nutritional landscape of the gut and can lead to the expansion of pathogenic populations. Pathogenic bacteria exploit microbiota-derived sources of carbon and nitrogen as nutrients and regulatory signals to promote their own growth and virulence. By eliciting inflammation, these bacteria alter the intestinal environment and use unique systems for respiration and metal acquisition to drive their expansion. Unravelling the interactions between the microbiota, the host and pathogenic bacteria will produce strategies for manipulating the microbiota against infectious diseases.

Journal ArticleDOI
TL;DR: This paper features the first comprehensive and critical account of European syntaxa and synthesizes more than 100 yr of classification effort by European phytosociologists.
Abstract: Aims: Vegetation classification consistent with the Braun-Blanquet approach is widely used in Europe for applied vegetation science, conservation planning and land management. During the long history of syntaxonomy, many concepts and names of vegetation units have been proposed, but there has been no single classification system integrating these units. Here we (1) present a comprehensive, hierarchical, syntaxonomic system of alliances, orders and classes of Braun-Blanquet syntaxonomy for vascular plant, bryophyte and lichen, and algal communities of Europe; (2) briefly characterize in ecological and geographic terms accepted syntaxonomic concepts; (3) link available synonyms to these accepted concepts; and (4) provide a list of diagnostic species for all classes. Location: European mainland, Greenland, Arctic archipelagos (including Iceland, Svalbard, Novaya Zemlya), Canary Islands, Madeira, Azores, Caucasus, Cyprus. Methods: We evaluated approximately 10000 bibliographic sources to create a comprehensive list of previously proposed syntaxonomic units. These units were evaluated by experts for their floristic and ecological distinctness, clarity of geographic distribution and compliance with the nomenclature code. Accepted units were compiled into three systems of classes, orders and alliances (EuroVegChecklist, EVC) for communities dominated by vascular plants (EVC1), bryophytes and lichens (EVC2) and algae (EVC3). Results: EVC1 includes 109 classes, 300 orders and 1108 alliances; EVC2 includes 27 classes, 53 orders and 137 alliances, and EVC3 includes 13 classes, 24 orders and 53 alliances. In total 13448 taxa were assigned as indicator species to classes of EVC1, 2087 to classes of EVC2 and 368 to classes of EVC3. Accepted syntaxonomic concepts are summarized in a series of appendices, and detailed information on each is accessible through the software tool EuroVegBrowser.
Conclusions: This paper features the first comprehensive and critical account of European syntaxa and synthesizes more than 100 yr of classification effort by European phytosociologists. It aims to document and stabilize the concepts and nomenclature of syntaxa for practical uses, such as calibration of habitat classification used by the European Union, standardization of terminology for environmental assessment, management and conservation of nature areas, landscape planning and education. The presented classification systems provide a baseline for future development and revision of European syntaxonomy.

Journal ArticleDOI
TL;DR: Imaging results suggest that intra-brain vascular dysregulation is an early pathological event during disease development, and that cognitive decline is noticeable from the initial stages, suggesting an early memory deficit associated with the primary disease factors.
Abstract: Multifactorial mechanisms underlying late-onset Alzheimer's disease (LOAD) are poorly characterized from an integrative perspective. Here spatiotemporal alterations in brain amyloid-β deposition, metabolism, vascular, functional activity at rest, structural properties, cognitive integrity and peripheral proteins levels are characterized in relation to LOAD progression. We analyse over 7,700 brain images and tens of plasma and cerebrospinal fluid biomarkers from the Alzheimer's Disease Neuroimaging Initiative (ADNI). Through a multifactorial data-driven analysis, we obtain dynamic LOAD-abnormality indices for all biomarkers, and a tentative temporal ordering of disease progression. Imaging results suggest that intra-brain vascular dysregulation is an early pathological event during disease development. Cognitive decline is noticeable from initial LOAD stages, suggesting early memory deficit associated with the primary disease factors. High abnormality levels are also observed for specific proteins associated with the vascular system's integrity. Although still subjected to the sensitivity of the algorithms and biomarkers employed, our results might contribute to the development of preventive therapeutic interventions.

Journal ArticleDOI
TL;DR: It is found that one-sixth of the global land surface is highly vulnerable to invasion, including substantial areas in developing economies and biodiversity hotspots, and there is a clear need for proactive invasion strategies in areas with high poverty levels, high biodiversity and low historical levels of invasion.
Abstract: Invasive alien species (IAS) threaten human livelihoods and biodiversity globally. Increasing globalization facilitates IAS arrival, and environmental changes, including climate change, facilitate IAS establishment. Here we provide the first global, spatial analysis of the terrestrial threat from IAS in light of twenty-first century globalization and environmental change, and evaluate national capacities to prevent and manage species invasions. We find that one-sixth of the global land surface is highly vulnerable to invasion, including substantial areas in developing economies and biodiversity hotspots. The dominant invasion vectors differ between high-income countries (imports, particularly of plants and pets) and low-income countries (air travel). Uniting data on the causes of introduction and establishment can improve early-warning and eradication schemes. Most countries have limited capacity to act against invasions. In particular, we reveal a clear need for proactive invasion strategies in areas with high poverty levels, high biodiversity and low historical levels of invasion.

Journal ArticleDOI
19 Aug 2016 - Science
TL;DR: Common principles revealed by maternal immune activation models are described, highlighting recent findings that strengthen their relevance for schizophrenia and autism and are starting to reveal the molecular mechanisms underlying the effects of MIA on offspring.
Abstract: Epidemiological evidence implicates maternal infection as a risk factor for autism spectrum disorder and schizophrenia. Animal models corroborate this link and demonstrate that maternal immune activation (MIA) alone is sufficient to impart lifelong neuropathology and altered behaviors in offspring. This Review describes common principles revealed by these models, highlighting recent findings that strengthen their relevance for schizophrenia and autism and are starting to reveal the molecular mechanisms underlying the effects of MIA on offspring. The role of MIA as a primer for a much wider range of psychiatric and neurologic disorders is also discussed. Finally, the need for more research in this nascent field and the implications for identifying and developing new treatments for individuals at heightened risk for neuroimmune disorders are considered.

Journal ArticleDOI
TL;DR: Obeticholic acid administered with ursodiol or as monotherapy for 12 months in patients with primary biliary cholangitis resulted in decreases from baseline in alkaline phosphatase and total bilirubin levels that differed significantly from the changes observed with placebo.
Abstract: Background Primary biliary cholangitis (formerly called primary biliary cirrhosis) can progress to cirrhosis and death despite ursodiol therapy. Alkaline phosphatase and bilirubin levels correlate with the risk of liver transplantation or death. Obeticholic acid, a farnesoid X receptor agonist, has shown potential benefit in patients with this disease. Methods In this 12-month, double-blind, placebo-controlled, phase 3 trial, we randomly assigned 217 patients who had an inadequate response to ursodiol or who found the side effects of ursodiol unacceptable to receive obeticholic acid at a dose of 10 mg (the 10-mg group), obeticholic acid at a dose of 5 mg with adjustment to 10 mg if applicable (the 5-10-mg group), or placebo. The primary end point was an alkaline phosphatase level of less than 1.67 times the upper limit of the normal range, with a reduction of at least 15% from baseline, and a normal total bilirubin level. Results Of 216 patients who underwent randomization and received at least one dose of obeticholic acid or placebo, 93% received ursodiol as background therapy. The primary end point occurred in more patients in the 5-10-mg group (46%) and the 10-mg group (47%) than in the placebo group (10%). Conclusions Obeticholic acid administered with ursodiol or as monotherapy for 12 months in patients with primary biliary cholangitis resulted in decreases from baseline in alkaline phosphatase and total bilirubin levels that differed significantly from the changes observed with placebo. There were more serious adverse events with obeticholic acid. (Funded by Intercept Pharmaceuticals; POISE ClinicalTrials.gov number, NCT01473524; Current Controlled Trials number, ISRCTN89514817.).
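The composite primary endpoint described above (alkaline phosphatase below 1.67 times the upper limit of normal, a reduction of at least 15% from baseline, and a normal total bilirubin level) can be sketched as a simple check. This is an illustrative sketch only; the function name, parameter names, and example lab values below are hypothetical and are not taken from the trial.

```python
def meets_primary_endpoint(alp, alp_baseline, bilirubin, alp_uln, bili_uln):
    """Hypothetical check of the composite endpoint described in the abstract.

    alp, alp_baseline: alkaline phosphatase now and at baseline (same units);
    bilirubin: total bilirubin; alp_uln / bili_uln: upper limits of normal.
    """
    below_threshold = alp < 1.67 * alp_uln       # ALP < 1.67 x ULN
    reduced_15pct = alp <= 0.85 * alp_baseline   # reduction of at least 15% from baseline
    normal_bili = bilirubin <= bili_uln          # total bilirubin within the normal range
    return below_threshold and reduced_15pct and normal_bili

# Illustrative lab values only (not trial data):
print(meets_primary_endpoint(alp=180, alp_baseline=300, bilirubin=0.9,
                             alp_uln=120, bili_uln=1.2))  # all three criteria met
```

All three conditions must hold simultaneously, which is why the trial reports a single response rate per group rather than three separate ones.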

Journal ArticleDOI
TL;DR: To update the 2009 Group for Research and Assessment of Psoriasis and Psoriatic Arthritis (GRAPPA) treatment recommendations for the spectrum of manifestations affecting patients with psoriatic arthritis (PsA).
Abstract: Objective To update the 2009 Group for Research and Assessment of Psoriasis and Psoriatic Arthritis (GRAPPA) treatment recommendations for the spectrum of manifestations affecting patients with psoriatic arthritis (PsA). Methods GRAPPA rheumatologists, dermatologists, and PsA patients drafted overarching principles for the management of PsA, based on consensus achieved at face-to-face meetings and via online surveys. We conducted literature reviews regarding treatment for the key domains of PsA (arthritis, spondylitis, enthesitis, dactylitis, skin disease, and nail disease) and convened a new group to identify pertinent comorbidities and their effect on treatment. Finally, we drafted treatment recommendations for each of the clinical manifestations and assessed the level of agreement for the overarching principles and treatment recommendations among GRAPPA members, using an online questionnaire. Results Six overarching principles had ≥80% agreement among both health care professionals (n = 135) and patient research partners (n = 10). We developed treatment recommendations and a schema incorporating these principles for arthritis, spondylitis, enthesitis, dactylitis, skin disease, nail disease, and comorbidities in the setting of PsA, using the Grading of Recommendations, Assessment, Development and Evaluation process. Agreement of >80% was reached for approval of the individual recommendations and the overall schema. Conclusion We present overarching principles and updated treatment recommendations for the key manifestations of PsA, including related comorbidities, based on a literature review and consensus of GRAPPA members (rheumatologists, dermatologists, other health care providers, and patient research partners). Further updates are anticipated as the therapeutic landscape in PsA evolves.

Journal ArticleDOI
Vardan Khachatryan1, Albert M. Sirunyan1, Armen Tumasyan1, Wolfgang Adam  +2283 moreInstitutions (141)
TL;DR: Combined fits to CMS UE proton–proton data at 7 TeV and to UE proton–antiproton data from the CDF experiment at lower √s are used to study the UE models and constrain their parameters, thereby providing improved predictions for proton–proton collisions at 13 TeV.
Abstract: New sets of parameters ("tunes") for the underlying-event (UE) modeling of the PYTHIA8, PYTHIA6 and HERWIG++ Monte Carlo event generators are constructed using different parton distribution functions. Combined fits to CMS UE data at sqrt(s) = 7 TeV and to UE data from the CDF experiment at lower sqrt(s), are used to study the UE models and constrain their parameters, providing thereby improved predictions for proton-proton collisions at 13 TeV. In addition, it is investigated whether the values of the parameters obtained from fits to UE observables are consistent with the values determined from fitting observables sensitive to double-parton scattering processes. Finally, comparisons of the UE tunes to "minimum bias" (MB) events, multijet, and Drell-Yan (q q-bar to Z / gamma* to lepton-antilepton + jets) observables at 7 and 8 TeV are presented, as well as predictions of MB and UE observables at 13 TeV.

Journal ArticleDOI
TL;DR: Mental disorders are common among college students, mostly have onsets prior to college entry, are associated with college attrition when onset precedes matriculation, and are typically untreated.
Abstract: Background Although mental disorders are significant predictors of educational attainment throughout the entire educational career, most research on mental disorders among students has focused on the primary and secondary school years. Method The World Health Organization World Mental Health Surveys were used to examine the associations of mental disorders with college entry and attrition by comparing college students (n = 1572) and non-students in the same age range (18–22 years; n = 4178), including non-students who recently left college without graduating (n = 702), based on surveys in 21 countries (four low/lower-middle income, five upper-middle income, one lower-middle or upper-middle income at the times of two different surveys, and 11 high income). Lifetime and 12-month prevalence and age-of-onset of DSM-IV anxiety, mood, behavioral and substance disorders were assessed with the Composite International Diagnostic Interview (CIDI). Results One-fifth (20.3%) of college students had 12-month DSM-IV/CIDI disorders; 83.1% of these cases had pre-matriculation onsets. Disorders with pre-matriculation onsets were more important than those with post-matriculation onsets in predicting subsequent college attrition, with substance disorders and, among women, major depression the most important such disorders. Only 16.4% of students with 12-month disorders received any 12-month healthcare treatment for their mental disorders. Conclusions Mental disorders are common among college students, have onsets that mostly occur prior to college entry, are (in the case of pre-matriculation disorders) associated with college attrition, and are typically untreated. Detection and effective treatment of these disorders early in the college career might reduce attrition and improve educational and psychosocial functioning.

Journal ArticleDOI
TL;DR: An international formal consensus of MG experts intended to be a guide for clinicians caring for patients with MG worldwide is developed.
Abstract: Objective: To develop formal consensus-based guidance for the management of myasthenia gravis (MG). Methods: In October 2013, the Myasthenia Gravis Foundation of America appointed a Task Force to develop treatment guidance for MG, and a panel of 15 international experts was convened. The RAND/UCLA appropriateness methodology was used to develop consensus guidance statements. Definitions were developed for goals of treatment, minimal manifestations, remission, ocular MG, impending crisis, crisis, and refractory MG. An in-person panel meeting then determined 7 treatment topics to be addressed. Initial guidance statements were developed from literature summaries. Three rounds of anonymous e-mail votes were used to attain consensus on guidance statements modified on the basis of panel input. Results: Guidance statements were developed for symptomatic and immunosuppressive treatments, IV immunoglobulin and plasma exchange, management of impending and manifest myasthenic crisis, thymectomy, juvenile MG, MG associated with antibodies to muscle-specific tyrosine kinase, and MG in pregnancy. Conclusion: This is an international formal consensus of MG experts intended to be a guide for clinicians caring for patients with MG worldwide.

Journal ArticleDOI
TL;DR: The genome sequences of its diploid ancestors are reported; these genomes are shown to be similar to cultivated peanut's A and B subgenomes and are used to identify candidate disease resistance genes, to guide tetraploid transcript assemblies, and to detect genetic exchange between cultivated peanut's subgenomes.
Abstract: Cultivated peanut (Arachis hypogaea) is an allotetraploid with closely related subgenomes of a total size of ∼2.7 Gb. This makes the assembly of chromosomal pseudomolecules very challenging. As a foundation to understanding the genome of cultivated peanut, we report the genome sequences of its diploid ancestors (Arachis duranensis and Arachis ipaensis). We show that these genomes are similar to cultivated peanut's A and B subgenomes and use them to identify candidate disease resistance genes, to guide tetraploid transcript assemblies and to detect genetic exchange between cultivated peanut's subgenomes. On the basis of remarkably high DNA identity of the A. ipaensis genome and the B subgenome of cultivated peanut and biogeographic evidence, we conclude that A. ipaensis may be a direct descendant of the same population that contributed the B subgenome to cultivated peanut.

Journal ArticleDOI
TL;DR: The results support the use of isavuconazole for the primary treatment of patients with invasive mould disease; non-inferiority was shown.

Journal ArticleDOI
TL;DR: It is shown that non-bee insect pollinators play a significant role in global crop production and respond differently than bees to landscape structure, probably making their crop pollination services more robust to changes in land use.
Abstract: Wild and managed bees are well documented as effective pollinators of global crops of economic importance. However, the contributions by pollinators other than bees have been little explored despite their potential to contribute to crop production and stability in the face of environmental change. Non-bee pollinators include flies, beetles, moths, butterflies, wasps, ants, birds, and bats, among others. Here we focus on non-bee insects and synthesize 39 field studies from five continents that directly measured the crop pollination services provided by non-bees, honey bees, and other bees to compare the relative contributions of these taxa. Non-bees performed 25–50% of the total number of flower visits. Although non-bees were less effective pollinators than bees per flower visit, they made more visits; thus these two factors compensated for each other, resulting in pollination services rendered by non-bees that were similar to those provided by bees. In the subset of studies that measured fruit set, fruit set increased with non-bee insect visits independently of bee visitation rates, indicating that non-bee insects provide a unique benefit that is not provided by bees. We also show that non-bee insects are not as reliant as bees on the presence of remnant natural or seminatural habitat in the surrounding landscape. These results strongly suggest that non-bee insect pollinators play a significant role in global crop production and respond differently than bees to landscape structure, probably making their crop pollination services more robust to changes in land use. Non-bee insects provide a valuable service and provide potential insurance against bee population declines.
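The compensation effect described above amounts to simple arithmetic: the total pollination service of a taxon is roughly its number of flower visits multiplied by its per-visit effectiveness, so lower effectiveness can be offset by higher visit numbers. The values below are hypothetical, chosen only to illustrate the idea; they are not figures from the synthesized studies.

```python
# Hypothetical illustration: total service = number of visits x per-visit effectiveness.
bee_visits, bee_effectiveness = 100, 0.8        # bees: fewer visits, more effective per visit
nonbee_visits, nonbee_effectiveness = 160, 0.5  # non-bees: more visits, less effective per visit

bee_service = bee_visits * bee_effectiveness           # 80.0
nonbee_service = nonbee_visits * nonbee_effectiveness  # 80.0

# The two factors compensate, yielding a similar total pollination service.
print(bee_service, nonbee_service)
```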

Journal ArticleDOI
TL;DR: An exhaustive review and reanalysis of geological, paleontological, and molecular records converge upon a cohesive narrative of gradually emerging land and constricting seaways, with formation of the Isthmus of Panama sensu stricto around 2.8 Ma.
Abstract: The formation of the Isthmus of Panama stands as one of the greatest natural events of the Cenozoic, driving profound biotic transformations on land and in the oceans. Some recent studies suggest that the Isthmus formed many millions of years earlier than the widely recognized age of approximately 3 million years ago (Ma), a result that if true would revolutionize our understanding of environmental, ecological, and evolutionary change across the Americas. To bring clarity to the question of when the Isthmus of Panama formed, we provide an exhaustive review and reanalysis of geological, paleontological, and molecular records. These independent lines of evidence converge upon a cohesive narrative of gradually emerging land and constricting seaways, with formation of the Isthmus of Panama sensu stricto around 2.8 Ma. The evidence used to support an older isthmus is inconclusive, and we caution against the uncritical acceptance of an isthmus before the Pliocene.

Journal ArticleDOI
TL;DR: It is recommended that future research efforts focus more strongly on a causal understanding of why tree species classification approaches work under certain conditions or, perhaps even more importantly, why they do not work in other cases, as this might require more complex field acquisitions than those typically used in the reviewed studies.