Showing papers by "University of California, Santa Cruz" published in 2019
••
European Bioinformatics Institute1, University of California, Santa Cruz2, University of Lausanne3, University of Bern4, Broad Institute5, Massachusetts Institute of Technology6, Brunel University London7, Yale University8, University of Warsaw9, Ohio State University10, Pompeu Fabra University11, King's College London12
TL;DR: This work generates primary data, creates bioinformatics tools and provides analysis to support the work of expert manual gene annotators and automated gene annotation pipelines to identify and characterise gene loci to the highest standard.
Abstract: The accurate identification and description of the genes in the human and mouse genomes is a fundamental requirement for high quality analysis of data informing both genome biology and clinical genomics. Over the last 15 years, the GENCODE consortium has been producing reference quality gene annotations to provide this foundational resource. The GENCODE consortium includes both experimental and computational biology groups who work together to improve and extend the GENCODE gene annotation. Specifically, we generate primary data, create bioinformatics tools and provide analysis to support the work of expert manual gene annotators and automated gene annotation pipelines. In addition, manual and computational annotation workflows use any and all publicly available data and analysis, along with the research literature to identify and characterise gene loci to the highest standard. GENCODE gene annotations are accessible via the Ensembl and UCSC Genome Browsers, the Ensembl FTP site, Ensembl Biomart, Ensembl Perl and REST APIs as well as https://www.gencodegenes.org.
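GENCODE annotations are distributed as GTF files via the Ensembl FTP site. As a rough illustration of how such a record can be consumed programmatically, the sketch below parses the standard nine-column, tab-separated GTF layout; the parser function is an illustrative helper, not a GENCODE tool, and the sample record is abbreviated:

```python
def parse_gtf_line(line):
    """Parse one tab-separated GTF record into its nine standard fields."""
    (seqname, source, feature, start, end,
     score, strand, frame, attrs) = line.rstrip("\n").split("\t")
    # The 9th column holds semicolon-separated key "value" attribute pairs.
    attributes = {}
    for item in attrs.strip().split(";"):
        item = item.strip()
        if not item:
            continue
        key, _, value = item.partition(" ")
        attributes[key] = value.strip('"')
    return {"seqname": seqname, "source": source, "feature": feature,
            "start": int(start), "end": int(end),
            "strand": strand, "attributes": attributes}

# Abbreviated GENCODE-style gene record (attributes truncated for brevity)
record = parse_gtf_line(
    'chr1\tHAVANA\tgene\t11869\t14409\t.\t+\t.\t'
    'gene_id "ENSG00000223972"; gene_type "transcribed_unprocessed_pseudogene";'
)
print(record["attributes"]["gene_id"])  # ENSG00000223972
```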
2,095 citations
••
TL;DR: The Large Synoptic Survey Telescope (LSST) as discussed by the authors is a large, wide-field ground-based system designed to obtain repeated images covering the sky visible from Cerro Pachon in northern Chile.
Abstract: We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the solar system, exploring the transient optical sky, and mapping the Milky Way. LSST will be a large, wide-field ground-based system designed to obtain repeated images covering the sky visible from Cerro Pachon in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg^2 field of view, a 3.2-gigapixel camera, and six filters (ugrizy) covering the wavelength range 320–1050 nm. The project is in the construction phase and will begin regular survey operations by 2022. About 90% of the observing time will be devoted to a deep-wide-fast survey mode that will uniformly observe an 18,000 deg^2 region about 800 times (summed over all six bands) during the anticipated 10 yr of operations and will yield a co-added map to r ~ 27.5. These data will result in databases including about 32 trillion observations of 20 billion galaxies and a similar number of stars, and they will serve the majority of the primary science programs. The remaining 10% of the observing time will be allocated to special projects such as Very Deep and Very Fast time domain surveys, whose details are currently under discussion. We illustrate how the LSST science drivers led to these choices of system parameters, and we describe the expected data products and their characteristics.
921 citations
••
TL;DR: This chapter describes the use of the UNIX command-driven and online web versions of tRNAscan-SE, illustrating different search modes and options.
Abstract: Transfer RNAs are the largest, most complex non-coding RNA family, universal to all living organisms. tRNAscan-SE has been the de facto tool for predicting tRNA genes in whole genomes. The newly developed version 2.0 has incorporated advanced methodologies with improved probabilistic search software and a suite of new gene models, enabling better functional classification of predicted genes. This chapter describes the use of the UNIX command-driven and online web versions, illustrating different search modes and options.
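The domain-specific search modes mentioned above are selected on the tRNAscan-SE command line (-E eukaryotic, -B bacterial, -A archaeal, -G general, with -o naming the results file). As a minimal sketch, the helper below assembles such a command line without running the tool; the wrapper function, file names, and mode dictionary are illustrative, not part of the tRNAscan-SE distribution:

```python
import subprocess

# Search-mode flags documented for tRNAscan-SE 2.0; the wrapper itself is
# an illustrative helper, not part of the tool's distribution.
MODES = {"eukaryotic": "-E", "bacterial": "-B", "archaeal": "-A", "general": "-G"}

def trnascan_command(fasta_path, mode="eukaryotic", output="trna_hits.out"):
    """Build the argument list for a tRNAscan-SE run without executing it."""
    if mode not in MODES:
        raise ValueError(f"unknown search mode: {mode}")
    return ["tRNAscan-SE", MODES[mode], "-o", output, fasta_path]

cmd = trnascan_command("genome.fa", mode="archaeal")
print(" ".join(cmd))  # tRNAscan-SE -A -o trna_hits.out genome.fa
# To actually run it (requires tRNAscan-SE on PATH):
# subprocess.run(cmd, check=True)
```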
867 citations
••
TL;DR: A new tool is added that lets users interactively arrange existing graphing tracks into new groups, and a 30-way primate alignment on the human genome is created in the UCSC Genome Browser.
Abstract: The UCSC Genome Browser (https://genome.ucsc.edu) is a graphical viewer for exploring genome annotations. For almost two decades, the Browser has provided visualization tools for genetics and molecular biology and continues to add new data and features. This year, we added a new tool that lets users interactively arrange existing graphing tracks into new groups. Other software additions include new formats for chromosome interactions, a ChIP-Seq peak display for track hubs and improved support for HGVS. On the annotation side, we have added gnomAD, TCGA expression, RefSeq Functional Elements, GTEx eQTLs, CRISPR Guides, SNPedia and created a 30-way primate alignment on the human genome. Nine assemblies now have RefSeq-mapped gene models.
649 citations
••
Kavli Institute for Theoretical Physics1, Polish Academy of Sciences2, University of California, Santa Cruz3, University of California, Santa Barbara4, Princeton University5, York University6, Harvard University7, University of Amsterdam8, State University of New York at Oswego9, Northwestern University10, University of Liège11, University of Wisconsin-Madison12, Arizona State University13, University of Wisconsin–Eau Claire14, California Institute of Technology15
TL;DR: In this paper, the capabilities of the open-knowledge software instrument Modules for Experiments in Stellar Astrophysics (MESA) have been updated to improve numerical energy conservation capabilities, including during mass changes.
Abstract: We update the capabilities of the open-knowledge software instrument Modules for Experiments in Stellar Astrophysics (MESA). RSP is a new functionality in MESAstar that models the nonlinear radial stellar pulsations that characterize RR Lyrae, Cepheids, and other classes of variable stars. We significantly enhance numerical energy conservation capabilities, including during mass changes. For example, this enables calculations through the He flash that conserve energy to better than 0.001%. To improve the modeling of rotating stars in MESA, we introduce a new approach to modifying the pressure and temperature equations of stellar structure, as well as a formulation of the projection effects of gravity darkening. A new scheme for tracking convective boundaries yields reliable values of the convective core mass and allows the natural emergence of adiabatic semiconvection regions during both core hydrogen- and helium-burning phases. We quantify the parallel performance of MESA on current-generation multicore architectures and demonstrate improvements in the computational efficiency of radiative levitation. We report updates to the equation of state and nuclear reaction physics modules. We briefly discuss the current treatment of fallback in core-collapse supernova models and the thermodynamic evolution of supernova explosions. We close by discussing the new MESA Testhub software infrastructure to enhance source code development.
601 citations
••
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4 +1491 more•Institutions (239)
TL;DR: In this article, the authors present the second volume of the Future Circular Collider Conceptual Design Report, devoted to the electron-positron collider FCC-ee, and present the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched, as an international collaboration hosted by CERN. This study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could, successively, be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade, using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This will make the FCC-ee a unique precision instrument to study the heaviest known particles (Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.
526 citations
••
Institute for the Physics and Mathematics of the Universe1, University of Tokyo2, Inter-University Centre for Astronomy and Astrophysics3, Carnegie Mellon University4, Nagoya University5, Lawrence Livermore National Laboratory6, Princeton University7, University of Geneva8, Academia Sinica Institute of Astronomy and Astrophysics9, Graduate University for Advanced Studies10, University of California, Santa Cruz11, Hiroshima University12, Lawrence Berkeley National Laboratory13, University of California, Berkeley14, University of California, Riverside15, California Institute of Technology16, Harvard University17, York University18, SLAC National Accelerator Laboratory19
TL;DR: In this article, the authors measured cosmic weak lensing shear power spectra with the Subaru Hyper Suprime-Cam (HSC) survey first-year shear catalog covering 137 square degrees of the sky.
Abstract: We measure cosmic weak lensing shear power spectra with the Subaru Hyper Suprime-Cam (HSC) survey first-year shear catalog covering 137 deg^2 of the sky. Thanks to the high effective galaxy number density of ∼17 arcmin^−2, even after conservative cuts such as a magnitude cut of i < 24.5 and photometric redshift cut of 0.3 ≤ z ≤ 1.5, we obtain a high-significance measurement of the cosmic shear power spectra in four tomographic redshift bins, achieving a total signal-to-noise ratio of 16 in the multipole range 300 ≤ l ≤ 1900. We carefully account for various uncertainties in our analysis including the intrinsic alignment of galaxies, scatters and biases in photometric redshifts, residual uncertainties in the shear measurement, and modeling of the matter power spectrum. The accuracy of our power spectrum measurement method as well as our analytic model of the covariance matrix are tested against realistic mock shear catalogs. For a flat Λ cold dark matter model, we find $S_8 \equiv \sigma_8(\Omega_{\rm m}/0.3)^\alpha = 0.800^{+0.029}_{-0.028}$ for α = 0.45 ($S_8 = 0.780^{+0.030}_{-0.033}$ for α = 0.5) from our HSC tomographic cosmic shear analysis alone. In comparison with Planck cosmic microwave background constraints, our results prefer slightly lower values of S_8, although metrics such as the Bayesian evidence ratio test do not show significant evidence for discordance between these results. We study the effect of possible additional systematic errors that are unaccounted for in our fiducial cosmic shear analysis, and find that they can shift the best-fit values of S_8 by up to ∼0.6 σ in both directions. The full HSC survey data will contain several times more area, and will lead to significantly improved cosmological constraints.
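The quoted S_8 parameter is defined directly from σ_8 and Ω_m. A minimal numeric check of the definition, using illustrative fiducial values rather than the HSC posterior:

```python
def s8(sigma8, omega_m, alpha=0.45):
    """S8 = sigma8 * (Omega_m / 0.3)**alpha, the lensing amplitude parameter."""
    return sigma8 * (omega_m / 0.3) ** alpha

# Illustrative fiducial values (not the HSC best fit):
print(round(s8(0.80, 0.30), 3))         # at Omega_m = 0.3 the prefactor is 1, so S8 = sigma8
print(s8(0.80, 0.25) < s8(0.80, 0.30))  # lowering Omega_m lowers S8 at fixed sigma8
```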
510 citations
••
TL;DR: It is found that synaptic signaling of upper-layer excitatory neurons and the molecular state of microglia are preferentially affected in autism, and results show that dysregulation of specific groups of genes in cortico-cortical projection neurons correlates with clinical severity of autism.
Abstract: Despite the clinical and genetic heterogeneity of autism, bulk gene expression studies show that changes in the neocortex of autism patients converge on common genes and pathways. However, direct assessment of specific cell types in the brain affected by autism has not been feasible until recently. We used single-nucleus RNA sequencing of cortical tissue from patients with autism to identify autism-associated transcriptomic changes in specific cell types. We found that synaptic signaling of upper-layer excitatory neurons and the molecular state of microglia are preferentially affected in autism. Moreover, our results show that dysregulation of specific groups of genes in cortico-cortical projection neurons correlates with clinical severity of autism. These findings suggest that molecular changes in upper-layer cortical circuits are linked to behavioral manifestations of autism.
507 citations
••
TL;DR: UCSC Xena as mentioned in this paper is a web-based visualization tool for both public and private omics data, supported through the Xena Browser and multiple turn-key Xena Hubs, allowing researchers to view their own data securely using private Xena Hubs while simultaneously visualizing large public cancer genomics datasets, including TCGA and the GDC.
Abstract: UCSC Xena is a visual exploration resource for both public and private omics data, supported through the web-based Xena Browser and multiple turn-key Xena Hubs. This unique architecture allows researchers to view their own data securely, using private Xena Hubs, simultaneously visualizing large public cancer genomics datasets, including TCGA and the GDC. Data integration occurs only within the Xena Browser, keeping private data private. Xena supports virtually any functional genomics data, including SNVs, INDELs, large structural variants, CNV, expression, DNA methylation, ATAC-seq signals, and phenotypic annotations. Browser features include the Visual Spreadsheet, survival analyses, powerful filtering and subgrouping, statistical analyses, genomic signatures, and bookmarks. Xena differentiates itself from other genomics tools, including its predecessor, the UCSC Cancer Genomics Browser, by its ability to easily and securely view public and private data, its high performance, its broad data type support, and many unique features.
452 citations
••
University of Würzburg1, National University of Comahue2, Spanish National Research Council3, Swedish University of Agricultural Sciences4, University of Lisbon5, Universidade Federal de Goiás6, Stanford University7, Commonwealth Scientific and Industrial Research Organisation8, National University of Río Negro9, ETH Zurich10, Cornell University11, University of California, Davis12, The Nature Conservancy13, Wageningen University and Research Centre14, University of British Columbia15, Great Lakes Bioenergy Research Center16, University of California, Santa Cruz17, University of Padua18, University of New England (Australia)19, Lund University20, University of Göttingen21, University of La Rochelle22, Institut national de la recherche agronomique23, Federal University of Ceará24, University of Freiburg25, Concordia University Wisconsin26, University of Belgrade27, National University of Tucumán28, Michigan State University29, University of Brasília30, University of Greenwich31, University of Reading32, University of Wisconsin-Madison33, National Institute of Amazonian Research34, Boise State University35, University of Texas at Austin36, University of Haifa37, Kansas State University38, University of Hamburg39, Bioversity International40, University of California, Santa Barbara41, Seattle University42, University of Vienna43, University of Florida44, Centro Agronómico Tropical de Investigación y Enseñanza45, National Audubon Society46, University of Buenos Aires47, Virginia Tech48, University of Bordeaux49, University of Auckland50, University of California, Berkeley51, University College Dublin52, Trinity College, Dublin53, University of Tokyo54, Federal University of Bahia55, Lincoln University (New Zealand)56, National Institute for Environmental Studies57, International Food Policy Research Institute58, Xi'an Jiaotong-Liverpool University59
TL;DR: Using a global database from 89 studies (with 1475 locations), the relative importance of species richness, abundance, and dominance for pollination; biological pest control; and final yields in the context of ongoing land-use change is partitioned.
Abstract: Human land use threatens global biodiversity and compromises multiple ecosystem functions critical to food production. Whether crop yield-related ecosystem services can be maintained by a few dominant species or rely on high richness remains unclear. Using a global database from 89 studies (with 1475 locations), we partition the relative importance of species richness, abundance, and dominance for pollination; biological pest control; and final yields in the context of ongoing land-use change. Pollinator and enemy richness directly supported ecosystem services in addition to and independent of abundance and dominance. Up to 50% of the negative effects of landscape simplification on ecosystem services was due to richness losses of service-providing organisms, with negative consequences for crop yields. Maintaining the biodiversity of ecosystem service providers is therefore vital to sustain the flow of key agroecosystem benefits to society.
434 citations
••
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4 +1496 more•Institutions (238)
TL;DR: In this paper, the authors describe the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider in collaboration with national institutes, laboratories and universities worldwide, and enhanced by a strong participation of industrial partners.
Abstract: Particle physics has arrived at an important moment of its history. The discovery of the Higgs boson, with a mass of 125 GeV, completes the matrix of particles and interactions that has constituted the “Standard Model” for several decades. This model is a consistent and predictive theory, which has so far proven successful at describing all phenomena accessible to collider experiments. However, several experimental facts do require the extension of the Standard Model and explanations are needed for observations such as the abundance of matter over antimatter, the striking evidence for dark matter and the non-zero neutrino masses. Theoretical issues such as the hierarchy problem, and, more in general, the dynamical origin of the Higgs mechanism, do likewise point to the existence of physics beyond the Standard Model. This report contains the description of a novel research infrastructure based on a highest-energy hadron collider with a centre-of-mass collision energy of 100 TeV and an integrated luminosity at least a factor of 5 larger than that of the HL-LHC. It will extend the current energy frontier by almost an order of magnitude. The mass reach for direct discovery will reach several tens of TeV, and allow, for example, the production of new particles whose existence could be indirectly exposed by precision measurements during the preceding e+e– collider phase. This collider will also precisely measure the Higgs self-coupling and thoroughly explore the dynamics of electroweak symmetry breaking at the TeV scale, to elucidate the nature of the electroweak phase transition. WIMPs as thermal dark matter candidates will be discovered, or ruled out. As a single project, this particle collider infrastructure will serve the world-wide physics community for about 25 years and, in combination with a lepton collider (see FCC conceptual design report volume 2), will provide a research tool until the end of the 21st century.
Collision energies beyond 100 TeV can be considered when using high-temperature superconductors. The European Strategy for Particle Physics (ESPP) update 2013 stated “To stay at the forefront of particle physics, Europe needs to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the next Strategy update”. The FCC study has implemented the ESPP recommendation by developing a long-term vision for an “accelerator project in a global context”. This document describes the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider “in collaboration with national institutes, laboratories and universities worldwide”, and enhanced by a strong participation of industrial partners. A coordinated preparation effort can now build on a core consortium of already more than 135 institutes worldwide. The technology for constructing a high-energy circular hadron collider can be brought to the technology readiness level required for construction within the coming ten years through a focused R&D programme. The FCC-hh concept comprises in the baseline scenario a power-saving, low-temperature superconducting magnet system based on an evolution of the Nb3Sn technology pioneered at the HL-LHC, an energy-efficient cryogenic refrigeration infrastructure based on a neon-helium (Nelium) light gas mixture, a high-reliability and low-loss cryogen distribution infrastructure based on Invar, high-power distributed beam transfer using superconducting elements and local magnet energy recovery and re-use technologies that are already gradually introduced at other CERN accelerators.
On a longer timescale, high-temperature superconductors can be developed together with industrial partners to achieve an even more energy-efficient particle collider or to reach even higher collision energies. The re-use of the LHC and its injector chain, which also serve for a concurrently running physics programme, is an essential lever to come to an overall sustainable research infrastructure at the energy frontier. Strategic R&D for FCC-hh aims at minimising construction cost and energy consumption, while maximising the socio-economic impact. It will mitigate technology-related risks and ensure that industry can benefit from acceptable utility. Concerning the implementation, a preparatory phase of about eight years is both necessary and adequate to establish the project governance and organisation structures, to build the international machine and experiment consortia, to develop a territorial implantation plan in agreement with the host-states’ requirements, to optimise the disposal of land and underground volumes, and to prepare the civil engineering project. Such a large-scale, international fundamental research infrastructure, tightly involving industrial partners and providing training at all education levels, will be a strong motor of economic and societal development in all participating nations. The FCC study has implemented a set of actions towards a coherent vision for the world-wide high-energy and particle physics community, providing a collaborative framework for topically complementary and geographically well-balanced contributions. This conceptual design report lays the foundation for a subsequent infrastructure preparatory and technical design phase.
••
A. Abada1, Marcello Abbrescia2, Marcello Abbrescia3, Shehu S. AbdusSalam4 +1501 more•Institutions (239)
TL;DR: In this article, the physics opportunities of the Future Circular Collider (FCC) were reviewed, covering its e+e-, pp, ep and heavy ion programs, and the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions.
Abstract: We review the physics opportunities of the Future Circular Collider, covering its e+e-, pp, ep and heavy ion programmes. We describe the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions, the top quark and flavour, as well as phenomena beyond the Standard Model. We highlight the synergy and complementarity of the different colliders, which will contribute to a uniquely coherent and ambitious research programme, providing an unmatchable combination of precision and sensitivity to new physics.
••
TL;DR: Nitrogen- and ruthenium-codoped carbon nanowires are prepared as effective hydrogen evolution catalysts in which ruthenium atoms in a carbon matrix drive electrocatalysis of hydrogen evolution.
Abstract: Hydrogen evolution reaction is an important process in electrochemical energy technologies. Herein, ruthenium and nitrogen codoped carbon nanowires are prepared as effective hydrogen evolution catalysts. The catalytic performance is markedly better than that of commercial platinum catalyst, with an overpotential of only -12 mV to reach the current density of 10 mA cm-2 in 1 M KOH and -47 mV in 0.1 M KOH. Comparisons with control experiments suggest that the remarkable activity is mainly ascribed to individual ruthenium atoms embedded within the carbon matrix, with minimal contributions from ruthenium nanoparticles. Consistent results are obtained in first-principles calculations, where RuCxNy moieties are found to show a much lower hydrogen binding energy than ruthenium nanoparticles, and a lower kinetic barrier for water dissociation than platinum. Among these, RuC2N2 stands out as the most active catalytic center, where both ruthenium and adjacent carbon atoms are the possible active sites.
••
University of California, Los Angeles1, University of California, Berkeley2, Goddard Space Flight Center3, Nagoya University4, Kanazawa University5, Tohoku University6, Korea Astronomy and Space Science Institute7, The Aerospace Corporation8, University of Washington9, Dartmouth College10, Montana State University11, University of California, Santa Cruz12, National Cheng Kung University13, Academia Sinica Institute of Astronomy and Astrophysics14, University of Tokyo15, National Central University16, National Oceanic and Atmospheric Administration17, Cooperative Institute for Research in Environmental Sciences18, Johns Hopkins University Applied Physics Laboratory19, Kyushu University20, Kyoto University21, National Institute of Polar Research22, University of Colorado Boulder23, University of Iowa24, University of New Hampshire25, Southwest Research Institute26, National Center for Atmospheric Research27, Université Paris-Saclay28, Boston University29, Braunschweig University of Technology30, University of Calgary31, University of Graz32, University of Minnesota33
TL;DR: The SPEDAS development history, goals, and current implementation are reviewed, and its “modes of use” are explained with examples geared for users and its technical implementation and requirements with software developers in mind are outlined.
Abstract: With the advent of the Heliophysics/Geospace System Observatory (H/GSO), a complement of multi-spacecraft missions and ground-based observatories to study the space environment, data retrieval, analysis, and visualization of space physics data can be daunting. The Space Physics Environment Data Analysis System (SPEDAS), a grass-roots software development platform (www.spedas.org), is now officially supported by NASA Heliophysics as part of its data environment infrastructure. It serves more than a dozen space missions and ground observatories and can integrate the full complement of past and upcoming space physics missions with minimal resources, following clear, simple, and well-proven guidelines. Free, modular and configurable to the needs of individual missions, it works in both command-line (ideal for experienced users) and Graphical User Interface (GUI) mode (reducing the learning curve for first-time users). Both options have “crib-sheets,” user-command sequences in ASCII format that can facilitate record-and-repeat actions, especially for complex operations and plotting. Crib-sheets enhance scientific interactions, as users can move rapidly and accurately from exchanges of technical information on data processing to efficient discussions regarding data interpretation and science. SPEDAS can readily query and ingest all International Solar Terrestrial Physics (ISTP)-compatible products from the Space Physics Data Facility (SPDF), enabling access to a vast collection of historic and current mission data. The planned incorporation of Heliophysics Application Programmer’s Interface (HAPI) standards will facilitate data ingestion from distributed datasets that adhere to these standards. Although SPEDAS is currently Interactive Data Language (IDL)-based (and interfaces to Java-based tools such as Autoplot), efforts are underway to expand it further to work with Python (first as an interface tool and potentially later as an under-the-hood replacement). We review the SPEDAS development history, goals, and current implementation. We explain its “modes of use” with examples geared for users and outline its technical implementation and requirements with software developers in mind.
We also describe SPEDAS personnel and software management, interfaces with other organizations, resources and support structure available to the community, and future development plans.
••
TL;DR: This study generated 9.9 million aligned sequence reads for the human cell line GM12878, using thirty MinION flow cells at six institutions to identify 33,984 plausible RNA isoforms and describes strategies for assessing 3′ poly(A) tail length, base modifications and transcript haplotypes.
Abstract: High-throughput complementary DNA sequencing technologies have advanced our understanding of transcriptome complexity and regulation. However, these methods lose information contained in biological RNA because the copied reads are often short and modifications are not retained. We address these limitations using a native poly(A) RNA sequencing strategy developed by Oxford Nanopore Technologies. Our study generated 9.9 million aligned sequence reads for the human cell line GM12878, using thirty MinION flow cells at six institutions. These native RNA reads had a median length of 771 bases, and a maximum aligned length of over 21,000 bases. Mitochondrial poly(A) reads provided an internal measure of read-length quality. We combined these long nanopore reads with higher accuracy short-reads and annotated GM12878 promoter regions to identify 33,984 plausible RNA isoforms. We describe strategies for assessing 3' poly(A) tail length, base modifications and transcript haplotypes.
••
Australia Telescope National Facility1, Swinburne University of Technology2, Curtin University3, Institute for the Physics and Mathematics of the Universe4, University of California, Santa Cruz5, Pontifical Catholic University of Valparaíso6, Macquarie University7, University of Sydney8, University of Washington9, California Polytechnic State University10, University of Western Australia11, United States Naval Research Laboratory12, W.M. Keck Observatory13, University of California, Los Angeles14, Inter-University Centre for Astronomy and Astrophysics15, Australian National University16, ASTRON17
TL;DR: In this paper, the authors reported the interferometric localization of the single-pulse fast radio burst (FRB 180924) to a position 4 kiloparsecs from the center of a luminous galaxy at redshift 0.3214.
Abstract: Fast radio bursts (FRBs) are brief radio emissions from distant astronomical sources. Some are known to repeat, but most are single bursts. Nonrepeating FRB observations have had insufficient positional accuracy to localize them to an individual host galaxy. We report the interferometric localization of the single-pulse FRB 180924 to a position 4 kiloparsecs from the center of a luminous galaxy at redshift 0.3214. The burst has not been observed to repeat. The properties of the burst and its host are markedly different from those of the only other accurately localized FRB source. The integrated electron column density along the line of sight closely matches models of the intergalactic medium, indicating that some FRBs are clean probes of the baryonic component of the cosmic web.
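The "integrated electron column density" is the burst's dispersion measure (DM). A hedged sketch of the usual bookkeeping — subtracting Milky Way disk, Galactic halo, and host-galaxy terms to isolate the intergalactic contribution — with illustrative numbers, not the published FRB 180924 budget:

```python
def extragalactic_dm(dm_observed, dm_milky_way, dm_halo, dm_host):
    """Residual dispersion measure attributed to the intergalactic
    medium after subtracting Galactic and host-galaxy terms.
    All values in pc cm^-3."""
    return dm_observed - dm_milky_way - dm_halo - dm_host

# Illustrative numbers only (not the measured FRB 180924 values)
dm_igm = extragalactic_dm(dm_observed=360.0, dm_milky_way=40.0,
                          dm_halo=50.0, dm_host=50.0)
assert dm_igm == 220.0
```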
••
TL;DR: This work leveraged recent innovations that permit generating pluripotent stem cell-derived cerebral organoids from chimpanzee to identify human-specific features of cortical development and identifies 261 differentially expressed genes in human compared to both chimpanzee organoids and macaque cortex.
••
TL;DR: A map of gene expression in lesions from brains of patients with multiple sclerosis is constructed, revealing distinct lineage- and region-specific transcriptomic changes associated with selective cortical neuron damage and glial activation.
Abstract: Multiple sclerosis (MS) is a neuroinflammatory disease with a relapsing-remitting disease course at early stages, distinct lesion characteristics in cortical grey versus subcortical white matter and neurodegeneration at chronic stages. Here we used single-nucleus RNA sequencing to assess changes in expression in multiple cell lineages in MS lesions and validated the results using multiplex in situ hybridization. We found selective vulnerability and loss of excitatory CUX2-expressing projection neurons in upper-cortical layers underlying meningeal inflammation; such MS neuron populations exhibited upregulation of stress pathway genes and long non-coding RNAs. Signatures of stressed oligodendrocytes, reactive astrocytes and activated microglia mapped most strongly to the rim of MS plaques. Notably, single-nucleus RNA sequencing identified phagocytosing microglia and/or macrophages by their ingestion and perinuclear import of myelin transcripts, confirmed by functional mouse and human culture assays. Our findings indicate lineage- and region-specific transcriptomic changes associated with selective cortical neuron damage and glial activation contributing to progression of MS lesions.
••
TL;DR: A 3D printed graphene aerogel/MnO2 electrode with an MnO2 loading of 182.2 mg cm−2 achieved a record-high areal capacitance of 44.13 F cm−2 as discussed by the authors.
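As a sanity check on the headline numbers, an areal capacitance of 44.13 F cm−2 at an MnO2 loading of 182.2 mg cm−2 implies a gravimetric capacitance near 242 F g−1. A sketch of the unit conversion (the units F cm−2 and mg cm−2 are our reading of the reported values):

```python
def gravimetric_capacitance(areal_F_per_cm2, loading_mg_per_cm2):
    """Convert areal capacitance (F cm^-2) and mass loading
    (mg cm^-2) to gravimetric capacitance (F g^-1)."""
    return areal_F_per_cm2 / (loading_mg_per_cm2 / 1000.0)

c_g = gravimetric_capacitance(44.13, 182.2)
assert round(c_g, 1) == 242.2  # F per gram of MnO2
```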
••
Stanford University1, University of California, Berkeley2, California Institute of Technology3, Ames Research Center4, Los Alamos National Laboratory5, Cornell University6, Lawrence Livermore National Laboratory7, University of Arizona8, National Research Council9, National Institutes of Natural Sciences, Japan10, Princeton University11, University of Notre Dame12, University of Georgia13, Université de Montréal14, University of Grenoble15, University of Chicago16, University of California, Los Angeles17, Amherst College18, University of California, Santa Cruz19, University of Victoria20, University of Michigan21, European Southern Observatory22, University of Exeter23, Arizona State University24, United States Geological Survey25, Pennsylvania State University26, Search for extraterrestrial intelligence27, University of California, San Diego28, Stony Brook University29, University of Western Ontario30, American Museum of Natural History31, Space Telescope Science Institute32, University of Cambridge33, Johns Hopkins University34, Leiden University35
TL;DR: Nielsen et al. as discussed by the authors presented a statistical analysis of the first 300 stars observed by the Gemini Planet Imager Exoplanet Survey (GPIES) to infer the underlying distributions of substellar companions with respect to their mass, semimajor axis, and host stellar mass.
Authors: Nielsen, EL; De Rosa, RJ; Macintosh, B; Wang, JJ; Ruffio, JB; Chiang, E; Marley, MS; Saumon, D; Savransky, D; Mark Ammons, S; Bailey, VP; Barman, T; Blain, C; Bulger, J; Burrows, A; Chilcote, J; Cotten, T; Czekala, I; Doyon, R; Duchene, G; Esposito, TM; Fabrycky, D; Fitzgerald, MP; Follette, KB; Fortney, JJ; Gerard, BL; Goodsell, SJ; Graham, JR; Greenbaum, AZ; Hibon, P; Hinkley, S; Hirsch, LA; Hom, J; Hung, LW; Ilene Dawson, R; Ingraham, P; Kalas, P; Konopacky, Q; Larkin, JE; Lee, EJ; Lin, JW; Maire, J; Marchis, F; Marois, C; Metchev, S; Millar-Blanchaer, MA; Morzinski, KM; Oppenheimer, R; Palmer, D; Patience, J; Perrin, M; Poyneer, L; Pueyo, L; Rafikov, RR; Rajan, A; Rameau, J; Rantakyro, FT; Ren, B; Schneider, AC; Sivaramakrishnan, A; Song, I; Soummer, R; Tallis, M; Thomas, S; Ward-Duong, K; Wolff, S. Abstract: We present a statistical analysis of the first 300 stars observed by the Gemini Planet Imager Exoplanet Survey. This subsample includes six detected planets and three brown dwarfs; from these detections and our contrast curves we infer the underlying distributions of substellar companions with respect to their mass, semimajor axis, and host stellar mass. We uncover a strong correlation between planet occurrence rate and host star mass, with stars of M∗ > 1.5 M⊙ more likely to host planets with masses between 2 and 13 MJup and semimajor axes of 3–100 au at 99.92% confidence. We fit a double power-law model in planet mass (m) and semimajor axis (a) for planet populations around high-mass stars (M∗ > 1.5 M⊙) of the form d²N/(dm da) ∝ m^α a^β, finding α = −2.4 (+0.8) and β = −2.0 (+0.5), and an integrated occurrence rate of about 9% between 5–13 MJup and 10–100 au. A significantly lower occurrence rate is obtained for brown dwarfs around all stars, with about 0.8% of stars hosting a brown dwarf companion between 13–80 MJup and 10–100 au.
Brown dwarfs also appear to be distributed differently in mass and semimajor axis compared to giant planets; whereas giant planets follow a bottom-heavy mass distribution and favor smaller semimajor axes, brown dwarfs exhibit just the opposite behaviors. Comparing to studies of short-period giant planets from the radial velocity method, our results are consistent with a peak in occurrence of giant planets between ∼1 and 10 au. We discuss how these trends, including the preference of giant planets for high-mass host stars, point to formation of giant planets by core/pebble accretion, and formation of brown dwarfs by gravitational instability.
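The fitted double power law can be explored numerically. A sketch that integrates the unnormalized m^α a^β density over mass/semimajor-axis boxes to compare relative planet yields (the exponents come from the abstract; the normalization and bin choices are ours, not the survey's):

```python
def power_law_weight(m_lo, m_hi, a_lo, a_hi, alpha=-2.4, beta=-2.0):
    """Integral of m**alpha * a**beta over a mass (Mjup) /
    semimajor-axis (au) box; unnormalized relative weight."""
    mass_term = (m_hi**(alpha + 1) - m_lo**(alpha + 1)) / (alpha + 1)
    sma_term = (a_hi**(beta + 1) - a_lo**(beta + 1)) / (beta + 1)
    return mass_term * sma_term

# Bottom-heavy mass distribution: the 2-5 Mjup box carries more
# weight than the 5-13 Mjup box over the same range of separations.
low_mass = power_law_weight(2, 5, 10, 100)
high_mass = power_law_weight(5, 13, 10, 100)
assert low_mass > high_mass
```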
••
TL;DR: The Sloan Digital Sky Survey (SDSS) as discussed by the authors released data taken by the fourth phase of SDSS (SDSS-IV) across its first three years of operation (2014 July-2017 July).
Abstract: Twenty years have passed since first light for the Sloan Digital Sky Survey (SDSS). Here, we release data taken by the fourth phase of SDSS (SDSS-IV) across its first three years of operation (2014 July–2017 July). This is the third data release for SDSS-IV, and the 15th from SDSS (Data Release Fifteen; DR15). New data come from MaNGA—we release 4824 data cubes, as well as the first stellar spectra in the MaNGA Stellar Library (MaStar), the first set of survey-supported analysis products (e.g., stellar and gas kinematics, emission-line and other maps) from the MaNGA Data Analysis Pipeline, and a new data visualization and access tool we call "Marvin." The next data release, DR16, will include new data from both APOGEE-2 and eBOSS; those surveys release no new data here, but we document updates and corrections to their data processing pipelines. The release is cumulative; it also includes the most recent reductions and calibrations of all data taken by SDSS since first light. In this paper, we describe the location and format of the data and tools and cite technical references describing how they were obtained and processed. The SDSS website (www.sdss.org) has also been updated, providing links to data downloads, tutorials, and examples of data use. Although SDSS-IV will continue to collect astronomical data until 2020, and will be followed by SDSS-V (2020–2025), we end this paper by describing plans to ensure the sustainability of the SDSS data archive for many years beyond the collection of data.
••
TL;DR: Overall, tRNA detection sensitivity and specificity are improved for all isotypes, particularly those utilizing specialized models for selenocysteine and the three subtypes of tRNA genes encoding a CAU anticodon.
Abstract: tRNAscan-SE has been widely used for whole-genome transfer RNA gene prediction for nearly two decades. With the increased availability of new genomes, a vastly larger training set has enabled creation of nearly one hundred specialized isotype-specific models, greatly improving tRNAscan-SE’s ability to identify and classify both typical and atypical tRNAs. We employ a new multi-model annotation strategy where predicted tRNAs are scored against a full set of isotype-specific covariance models. A post-filtering feature also better identifies tRNA-derived SINEs that are abundant in many eukaryotic genomes, and provides a “high confidence” tRNA gene set which improves upon prior pseudogene prediction. These new enhancements of tRNAscan-SE will provide researchers more accurate detection and more comprehensive annotation for tRNA genes.
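The "high confidence" filtering idea reduces to thresholding covariance-model scores for predicted genes. A toy sketch (the gene names, scores, and the 50-bit cutoff are invented for illustration; this is not tRNAscan-SE's actual interface):

```python
def high_confidence_set(predictions, min_score=50.0):
    """Keep predicted tRNAs whose best isotype covariance-model
    bit score clears a cutoff; a toy stand-in for tRNAscan-SE's
    post-filtering. `predictions` maps gene name -> best score."""
    return {name for name, score in predictions.items()
            if score >= min_score}

preds = {"tRNA-Ala-AGC-1": 78.2,   # typical tRNA, high score
         "tRNA-SeC-TCA-1": 55.0,   # atypical but above cutoff
         "tRNA-derived-SINE": 31.4}  # likely SINE, filtered out
assert high_confidence_set(preds) == {"tRNA-Ala-AGC-1",
                                      "tRNA-SeC-TCA-1"}
```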
••
••
TL;DR: In this article, a search for high-mass dielectron and dimuon resonances in the mass range of 250 GeV to 6 TeV was performed at the Large Hadron Collider.
••
TL;DR: The Belle II detector as mentioned in this paper is a state-of-the-art detector for heavy flavor physics, quarkonium and exotic states, searches for dark sectors, and many other areas.
Abstract: The Belle II detector will provide a major step forward in precision heavy flavor physics, quarkonium and exotic states, searches for dark sectors, and many other areas. The sensitivity to a large number of key observables can be improved by about an order of magnitude compared to the current measurements, and up to two orders in very clean search measurements. This increase in statistical precision arises not only due to the increased luminosity, but also from improved detector efficiency and precision for many channels. Many of the most interesting observables tend to have very small theoretical uncertainties that will therefore not limit the physics reach. This book has presented many new ideas for measurements, both to elucidate the nature of current anomalies seen in flavor, and to search for new phenomena in a plethora of observables that will become accessible with the Belle II dataset. The simulation used for the studies in this book was state of the art at the time, though we are learning a lot more about the experiment during the commissioning period. The detector is in operation, and working spectacularly well.
••
TL;DR: This work shows a reversible paramagnetic-to-ferromagnetic transformation of ferrofluid droplets by the jamming of a monolayer of magnetic nanoparticles assembled at the water-oil interface that exhibit a finite coercivity and remanent magnetization.
Abstract: Solid ferromagnetic materials are rigid in shape and cannot be reconfigured. Ferrofluids, although reconfigurable, are paramagnetic at room temperature and lose their magnetization when the applied magnetic field is removed. Here, we show a reversible paramagnetic-to-ferromagnetic transformation of ferrofluid droplets by the jamming of a monolayer of magnetic nanoparticles assembled at the water-oil interface. These ferromagnetic liquid droplets exhibit a finite coercivity and remanent magnetization. They can be easily reconfigured into different shapes while preserving the magnetic properties of solid ferromagnets with classic north-south dipole interactions. Their translational and rotational motions can be actuated remotely and precisely by an external magnetic field, inspiring studies on active matter, energy-dissipative assemblies, and programmable liquid constructs.
••
TL;DR: This work proposes modeling visualization design knowledge as a collection of constraints, in conjunction with a method to learn weights for soft constraints from experimental data, which can take theoretical design knowledge and express it in a concrete, extensible, and testable form.
Abstract: There exists a gap between visualization design guidelines and their application in visualization tools. While empirical studies can provide design guidance, we lack a formal framework for representing design knowledge, integrating results across studies, and applying this knowledge in automated design tools that promote effective encodings and facilitate visual exploration. We propose modeling visualization design knowledge as a collection of constraints, in conjunction with a method to learn weights for soft constraints from experimental data. Using constraints, we can take theoretical design knowledge and express it in a concrete, extensible, and testable form: the resulting models can recommend visualization designs and can easily be augmented with additional constraints or updated weights. We implement our approach in Draco, a constraint-based system based on Answer Set Programming (ASP). We demonstrate how to construct increasingly sophisticated automated visualization design systems, including systems based on weights learned directly from the results of graphical perception experiments.
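Outside ASP, the weighted soft-constraint model reduces to a linear cost over violation counts, and the lowest-cost candidate design is recommended. A sketch with made-up constraint names and weights (Draco's real constraints and learned weights differ):

```python
def design_cost(violations, weights):
    """Total cost of a candidate visualization: sum over soft
    constraints of (learned weight) x (number of violations).
    Constraint names and weights here are illustrative only."""
    return sum(weights[c] * n for c, n in violations.items())

weights = {"continuous_on_x": 1.0, "high_cardinality_color": 4.0}
design_a = {"continuous_on_x": 0, "high_cardinality_color": 1}
design_b = {"continuous_on_x": 1, "high_cardinality_color": 0}

# The recommender would prefer the lower-cost design (design_b here)
assert design_cost(design_b, weights) < design_cost(design_a, weights)
```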
••
Simon Fraser University1, University of California, Santa Cruz2, University of Connecticut3, University of São Paulo4, University of Washington5, Strathclyde Institute of Pharmacy and Biomedical Sciences6, University of Illinois at Chicago7, University of Montana8, Wageningen University and Research Centre9, University of Hawaii at Manoa10, University of Texas Southwestern Medical Center11, Vietnam Academy of Science and Technology12
TL;DR: The Natural Products Atlas is designed as a community-supported resource to provide a central repository for known natural product structures from microorganisms and is the first comprehensive, open access resource of this type.
Abstract: Despite rapid evolution in the area of microbial natural products chemistry, there is currently no open access database containing all microbially produced natural product structures. Lack of availability of these data is preventing the implementation of new technologies in natural products science. Specifically, development of new computational strategies for compound characterization and identification are being hampered by the lack of a comprehensive database of known compounds against which to compare experimental data. The creation of an open access, community-maintained database of microbial natural product structures would enable the development of new technologies in natural products discovery and improve the interoperability of existing natural products data resources. However, these data are spread unevenly throughout the historical scientific literature, including both journal articles and international patents. These documents have no standard format, are often not digitized as machine readable text, and are not publicly available. Further, none of these documents have associated structure files (e.g., MOL, InChI, or SMILES), instead containing images of structures. This makes extraction and formatting of relevant natural products data a formidable challenge. Using a combination of manual curation and automated data mining approaches we have created a database of microbial natural products (The Natural Products Atlas, www.npatlas.org) that includes 24 594 compounds and contains referenced data for structure, compound names, source organisms, isolation references, total syntheses, and instances of structural reassignment. This database is accompanied by an interactive web portal that permits searching by structure, substructure, and physical properties. The Web site also provides mechanisms for visualizing natural products chemical space and dashboards for displaying author and discovery timeline data. 
These interactive tools offer a powerful knowledge base for natural products discovery with a central interface for structure- and property-based searching and present new viewpoints on structural diversity in natural products. The Natural Products Atlas has been developed under FAIR principles (Findable, Accessible, Interoperable, and Reusable) and is integrated with other emerging natural product databases, including the Minimum Information About a Biosynthetic Gene Cluster (MIBiG) repository, and the Global Natural Products Social Molecular Networking (GNPS) platform. It is designed as a community-supported resource to provide a central repository for known natural product structures from microorganisms and is the first comprehensive, open access resource of this type. It is expected that the Natural Products Atlas will enable the development of new natural products discovery modalities and accelerate the process of structural characterization for complex natural products libraries.
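The property-based search described above can be sketched as a filter over compound records; the field names and entries below are invented placeholders, not Natural Products Atlas data or its API:

```python
def search_by_source(records, organism):
    """Filter compound records by source organism
    (case-insensitive substring match)."""
    needle = organism.lower()
    return [r for r in records
            if needle in r["source_organism"].lower()]

# Hypothetical records mirroring the fields listed in the abstract
records = [
    {"name": "compound-A", "source_organism": "Streptomyces sp."},
    {"name": "compound-B", "source_organism": "Aspergillus niger"},
]
hits = search_by_source(records, "streptomyces")
assert [r["name"] for r in hits] == ["compound-A"]
```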
••
Deakin University1, University of Maryland Center for Environmental Science2, National Oceanic and Atmospheric Administration3, Bedford Institute of Oceanography4, Wildlife Conservation Society5, University of South Alabama6, University of Pisa7, Parks Victoria8, University of California, Santa Cruz9, Autonomous University of Carmen10, Mammal Research Institute11, BirdLife International12, International Sleep Products Association13, King Abdullah University of Science and Technology14, Duke University15, Swansea University16, National Institute of Water and Atmospheric Research17, University of Exeter18, James Cook University19, University of Miami20, Macquarie University21, Smithsonian Conservation Biology Institute22, Australian Institute of Marine Science23, International Union for Conservation of Nature and Natural Resources24, Marine Biological Association of the United Kingdom25, University of Cambridge26, University of Washington27, University of Tasmania28, Tethys Research Institute29, Oregon State University30, Natural Environment Research Council31, Suffolk University32, University of East Anglia33, Queen Mary University of London34, National Oceanography Centre, Southampton35, University of Southampton36, National Institute of Polar Research37, Chicago Zoological Society38, Florida State University39, University of Western Australia40
TL;DR: A broad range of case studies from diverse marine taxa are compiled to show how tracking data have helped inform conservation policy and management, including reductions in fisheries bycatch and vessel strikes, and the design and administration of marine protected areas and important habitats.
Abstract: There have been efforts around the globe to track individuals of many marine species and assess their movements and distribution, with the putative goal of supporting their conservation and management. Determining whether, and how, tracking data have been successfully applied to address real-world conservation issues is, however, difficult. Here, we compile a broad range of case studies from diverse marine taxa to show how tracking data have helped inform conservation policy and management, including reductions in fisheries bycatch and vessel strikes, and the design and administration of marine protected areas and important habitats. Using these examples, we highlight pathways through which the past and future investment in collecting animal tracking data might be better used to achieve tangible conservation benefits.