Journal ArticleDOI
14 Aug 2015-Vaccine
TL;DR: Overall, the results showed that multicomponent and dialogue-based interventions were most effective, and that identified strategies should be carefully tailored to the target population, their reasons for hesitancy, and the specific context.

735 citations


Journal ArticleDOI
TL;DR: The International Reference Ionosphere model (IRI-2016) as mentioned in this paper is the latest version of the IRI model and includes two new model options for the F2 peak height hmF2 and a better representation of topside ion densities at very low and high solar activities.
Abstract: The paper presents the latest version of the International Reference Ionosphere model (IRI-2016) describing the most important changes and improvements that were included with this version and discussing their impact on the IRI predictions of ionospheric parameters. IRI-2016 includes two new model options for the F2 peak height hmF2 and a better representation of topside ion densities at very low and high solar activities. In addition, a number of smaller changes were made concerning the use of solar indices and the speedup of the computer program. We also review the latest developments toward a Real-Time IRI. The goal is to progress from predicting climatology to describing the real-time weather conditions in the ionosphere.

735 citations


Posted ContentDOI
12 Jul 2017-bioRxiv
TL;DR: The integrative analysis of more than 2,600 whole cancer genomes and their matching normal tissues across 39 distinct tumour types represents the most comprehensive look at cancer whole genomes to date.
Abstract: We report the integrative analysis of more than 2,600 whole cancer genomes and their matching normal tissues across 39 distinct tumour types. By studying whole genomes we have been able to catalogue non-coding cancer driver events, study patterns of structural variation, infer tumour evolution, probe the interactions among variants in the germline genome, the tumour genome and the transcriptome, and derive an understanding of how coding and non-coding variations together contribute to driving individual patient's tumours. This work represents the most comprehensive look at cancer whole genomes to date. NOTE TO READERS: This is an incomplete draft of the marker paper for the Pan-Cancer Analysis of Whole Genomes Project, and is intended to provide the background information for a series of in-depth papers that will be posted to bioRxiv during the summer of 2017.

735 citations


Journal ArticleDOI
TL;DR: This Review provides a comprehensive overview of the gasdermin family, the mechanisms that control their activation and their role in inflammatory disorders and cancer.
Abstract: The gasdermins are a family of recently identified pore-forming effector proteins that cause membrane permeabilization and pyroptosis, a lytic pro-inflammatory type of cell death. Gasdermins contain a cytotoxic N-terminal domain and a C-terminal repressor domain connected by a flexible linker. Proteolytic cleavage between these two domains releases the intramolecular inhibition on the cytotoxic domain, allowing it to insert into cell membranes and form large oligomeric pores, which disrupts ion homeostasis and induces cell death. Gasdermin-induced pyroptosis plays a prominent role in many hereditary diseases and (auto)inflammatory disorders as well as in cancer. In this Review, we discuss recent developments in gasdermin research with a focus on mechanisms that control gasdermin activation, pore formation and functional consequences of gasdermin-induced membrane permeabilization.

735 citations


Journal ArticleDOI
TL;DR: In patients with minor ischemic stroke or high-risk TIA, those who received a combination of clopidogrel and aspirin had a lower risk of major ischemic events but a higher risk of major hemorrhage at 90 days than those who received aspirin alone.
Abstract: Background Combination antiplatelet therapy with clopidogrel and aspirin may reduce the rate of recurrent stroke during the first 3 months after a minor ischemic stroke or transient ischemic attack (TIA). A trial of combination antiplatelet therapy in a Chinese population has shown a reduction in the risk of recurrent stroke. We tested this combination in an international population. Methods In a randomized trial, we assigned patients with minor ischemic stroke or high-risk TIA to receive either clopidogrel at a loading dose of 600 mg on day 1, followed by 75 mg per day, plus aspirin (at a dose of 50 to 325 mg per day) or the same range of doses of aspirin alone. The dose of aspirin in each group was selected by the site investigator. The primary efficacy outcome in a time-to-event analysis was the risk of a composite of major ischemic events, which was defined as ischemic stroke, myocardial infarction, or death from an ischemic vascular event, at 90 days. Results A total of 4881 patients were en...

735 citations


Posted Content
Cynthia Rudin1
TL;DR: In this article, the chasm between explaining black box models and using inherently interpretable models is identified, and several key reasons are outlined why explainable black box models should be avoided in high-stakes decisions.
Abstract: Black box machine learning models are currently being used for high stakes decision-making throughout society, causing problems throughout healthcare, criminal justice, and in other domains. People have hoped that creating methods for explaining these black box models will alleviate some of these problems, but trying to \textit{explain} black box models, rather than creating models that are \textit{interpretable} in the first place, is likely to perpetuate bad practices and can potentially cause catastrophic harm to society. There is a way forward -- it is to design models that are inherently interpretable. This manuscript clarifies the chasm between explaining black boxes and using inherently interpretable models, outlines several key reasons why explainable black boxes should be avoided in high-stakes decisions, identifies challenges to interpretable machine learning, and provides several example applications where interpretable models could potentially replace black box models in criminal justice, healthcare, and computer vision.

734 citations


Journal ArticleDOI
TL;DR: FACETS is a fully integrated stand-alone pipeline that includes sequencing BAM file post-processing, joint segmentation of total- and allele-specific read counts, and integer copy number calls corrected for tumor purity, ploidy and clonal heterogeneity, with comprehensive output and integrated visualization.
Abstract: Allele-specific copy number analysis (ASCN) from next generation sequencing (NGS) data can greatly extend the utility of NGS beyond the identification of mutations to precisely annotate the genome for the detection of homozygous/heterozygous deletions, copy-neutral loss-of-heterozygosity (LOH), allele-specific gains/amplifications. In addition, as targeted gene panels are increasingly used in clinical sequencing studies for the detection of 'actionable' mutations and copy number alterations to guide treatment decisions, accurate, tumor purity-, ploidy- and clonal heterogeneity-adjusted integer copy number calls are greatly needed to more reliably interpret NGS-based cancer gene copy number data in the context of clinical sequencing. We developed FACETS, an ASCN tool and open-source software with a broad application to whole genome, whole-exome, as well as targeted panel sequencing platforms. It is a fully integrated stand-alone pipeline that includes sequencing BAM file post-processing, joint segmentation of total- and allele-specific read counts, and integer copy number calls corrected for tumor purity, ploidy and clonal heterogeneity, with comprehensive output and integrated visualization. We demonstrate the application of FACETS using The Cancer Genome Atlas (TCGA) whole-exome sequencing of lung adenocarcinoma samples. We also demonstrate its application to a clinical sequencing platform based on a targeted gene panel.
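
To make the two per-SNP signals concrete, here is a minimal Python sketch of the quantities that allele-specific copy number methods of this kind typically segment jointly; it is illustrative only (FACETS itself is distributed as an R package, and the counts below are hypothetical).

```python
import numpy as np

def ascn_signals(tumor_depth, normal_depth, tumor_alt, normal_alt):
    """Illustrative only (not the FACETS code): compute the two per-SNP
    signals that allele-specific copy number methods typically segment.

    logR  -- total copy number signal: log2 ratio of tumor to normal depth
             (in practice centered on the genome-wide median).
    logOR -- allele-specific signal: log odds ratio of alt vs. ref allele
             counts in tumor vs. normal at heterozygous SNPs.
    """
    t_d = np.asarray(tumor_depth, dtype=float)
    n_d = np.asarray(normal_depth, dtype=float)
    t_a = np.asarray(tumor_alt, dtype=float)
    n_a = np.asarray(normal_alt, dtype=float)

    logr = np.log2(t_d / n_d)
    # 0.5 is added to each count to avoid division by zero / log(0).
    logor = np.log((t_a + 0.5) / (t_d - t_a + 0.5)) - np.log((n_a + 0.5) / (n_d - n_a + 0.5))
    return logr, logor

# Hypothetical counts at three heterozygous SNPs.
logr, logor = ascn_signals([120, 130, 125], [60, 62, 61], [80, 85, 82], [30, 31, 30])
print("logR:", np.round(logr, 2), "logOR:", np.round(logor, 2))
```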

734 citations


Journal ArticleDOI
TL;DR: Estimates of mixing patterns are provided for societies for which contact data such as POLYMOD are not yet available; contact patterns were found to be highly assortative with age across all countries considered, but pronounced regional differences in age-specific contacts at home were noticeable.
Abstract: Heterogeneities in contact networks have a major effect in determining whether a pathogen can become epidemic or persist at endemic levels. Epidemic models that determine which interventions can successfully prevent an outbreak need to account for social structure and mixing patterns. Contact patterns vary across age and locations (e.g. home, work, and school), and including them as predictors in transmission dynamic models of pathogens that spread socially will improve the models' realism. Data from population-based contact diaries in eight European countries from the POLYMOD study were projected to 144 other countries using a Bayesian hierarchical model that estimated the proclivity of age-and-location-specific contact patterns for the countries, using Markov chain Monte Carlo simulation. Household level data from the Demographic and Health Surveys for nine lower-income countries and socio-demographic factors from several on-line databases for 152 countries were used to quantify similarity of countries to estimate contact patterns in the home, work, school and other locations for countries for which no contact data are available, accounting for demographic structure, household structure where known, and a variety of metrics including workforce participation and school enrolment. Contacts are highly assortative with age across all countries considered, but pronounced regional differences in the age-specific contacts at home were noticeable, with more inter-generational contacts in Asian countries than in other settings. Moreover, there were variations in contact patterns by location, with work-place contacts being least assortative. These variations led to differences in the effect of social distancing measures in an age structured epidemic model. Contacts have an important role in transmission dynamic models that use contact rates to characterize the spread of contact-transmissible diseases. This study provides estimates of mixing patterns for societies for which contact data such as POLYMOD are not yet available.
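
As a minimal illustration of how such contact matrices enter a transmission model (not the paper's Bayesian projection model), the sketch below runs a deterministic age-structured SIR system in which the force of infection on each age group is weighted by a hypothetical assortative contact matrix.

```python
import numpy as np

# Minimal age-structured SIR sketch (illustrative only). Three coarse age
# groups with a hypothetical, assortative contact matrix C[i, j] = mean daily
# contacts a person in group i has with people in group j.
C = np.array([[8.0, 3.0, 1.0],
              [3.0, 6.0, 2.0],
              [1.0, 2.0, 3.0]])
N = np.array([2.0e6, 5.0e6, 3.0e6])        # group population sizes
beta, gamma, dt = 0.05, 1.0 / 5.0, 0.1     # transmissibility, recovery rate, time step (days)

S, I, R = N - 10.0, np.full(3, 10.0), np.zeros(3)
for _ in range(int(100 / dt)):             # simulate 100 days
    # Force of infection on group i: beta * sum_j C[i, j] * I_j / N_j
    lam = beta * C @ (I / N)
    new_inf = lam * S * dt
    new_rec = gamma * I * dt
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec

print("final attack rate by age group:", np.round(R / N, 3))
```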

734 citations


Posted Content
TL;DR: This work shows how to improve semantic segmentation through the use of contextual information, specifically 'patch-patch' context between image regions and 'patch-background' context, and formulates Conditional Random Fields with CNN-based pairwise potential functions to capture semantic correlations between neighboring patches.
Abstract: Recent advances in semantic image segmentation have mostly been achieved by training deep convolutional neural networks (CNNs). We show how to improve semantic segmentation through the use of contextual information; specifically, we explore `patch-patch' context between image regions, and `patch-background' context. For learning from the patch-patch context, we formulate Conditional Random Fields (CRFs) with CNN-based pairwise potential functions to capture semantic correlations between neighboring patches. Efficient piecewise training of the proposed deep structured model is then applied to avoid repeated expensive CRF inference for back propagation. For capturing the patch-background context, we show that a network design with traditional multi-scale image input and sliding pyramid pooling is effective for improving performance. Our experimental results set new state-of-the-art performance on a number of popular semantic segmentation datasets, including NYUDv2, PASCAL VOC 2012, PASCAL-Context, and SIFT-flow. In particular, we achieve an intersection-over-union score of 78.0 on the challenging PASCAL VOC 2012 dataset.

734 citations


Journal ArticleDOI
TL;DR: This paper shows that global linear convergence can be guaranteed under the assumptions of strong convexity and Lipschitz gradient on one of the two functions, along with certain rank assumptions on A and B.
Abstract: The formulation $\min_{x,y} \; f(x)+g(y)$, subject to $Ax+By=b$, where f and g are extended-value convex functions, arises in many application areas such as signal processing, imaging and image processing, statistics, and machine learning, either naturally or after variable splitting. In many common problems, one of the two objective functions is strictly convex and has Lipschitz continuous gradient. On this kind of problem, a very effective approach is the alternating direction method of multipliers (ADM or ADMM), which solves a sequence of f/g-decoupled subproblems. However, its effectiveness has not been matched by a provably fast rate of convergence; only sublinear rates such as $O(1/k)$ and $O(1/k^2)$ were recently established in the literature, though the $O(1/k)$ rates do not require strong convexity. This paper shows that global linear convergence can be guaranteed under the assumptions of strong convexity and Lipschitz gradient on one of the two functions, along with certain rank assumptions on A and B. The result applies to various generalizations of ADM that allow the subproblems to be solved faster and less exactly in certain manners. The derived rate of convergence also provides some theoretical guidance for optimizing the ADM parameters. In addition, this paper makes meaningful extensions to the existing global convergence theory of ADM generalizations.
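
For reference, the standard (scaled-form) ADMM iterations for this formulation are reproduced below; this is the textbook scheme the convergence results refer to, not a new derivation from the paper.

```latex
% Standard scaled-form ADMM for  min_{x,y} f(x) + g(y)  s.t.  Ax + By = b,
% with penalty parameter rho > 0 and scaled dual variable u = lambda / rho.
\begin{aligned}
x^{k+1} &= \operatorname*{arg\,min}_{x} \; f(x) + \tfrac{\rho}{2}\,\bigl\|Ax + By^{k} - b + u^{k}\bigr\|_2^2,\\
y^{k+1} &= \operatorname*{arg\,min}_{y} \; g(y) + \tfrac{\rho}{2}\,\bigl\|Ax^{k+1} + By - b + u^{k}\bigr\|_2^2,\\
u^{k+1} &= u^{k} + Ax^{k+1} + By^{k+1} - b.
\end{aligned}
```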

734 citations


Journal ArticleDOI
TL;DR: A thorough literature search on the diagnostic criteria for acute cholangitis found no new, strong evidence released from 2013 to 2017 raising serious or important issues with the TG13 diagnostic criteria, and the TG13 severity grading has been validated in numerous studies.
Abstract: Although the diagnostic and severity grading criteria on the 2013 Tokyo Guidelines (TG13) are used worldwide as the primary standard for management of acute cholangitis (AC), they need to be validated through implementation and assessment in actual clinical practice. Here, we conduct a systematic review of the literature to validate the TG13 diagnostic and severity grading criteria for AC and propose TG18 criteria. While there is little evidence evaluating the TG13 criteria, they were validated through a large-scale case series study in Japan and Taiwan. Analyzing big data from this study confirmed that the diagnostic rate of AC based on the TG13 diagnostic criteria was higher than that based on the TG07 criteria, and that 30-day mortality in patients with a higher severity based on the TG13 severity grading criteria was significantly higher. Furthermore, a comparison of patients treated with early or urgent biliary drainage versus patients not treated this way showed no difference in 30-day mortality among patients with Grade I or Grade III AC, but significantly lower 30-day mortality in patients with Grade II AC who were treated with early or urgent biliary drainage. This suggests that the TG13 severity grading criteria can be used to identify Grade II patients whose prognoses may be improved through biliary drainage. The TG13 severity grading criteria may therefore be useful as an indicator for biliary drainage as well as a predictive factor when assessing the patient's prognosis. The TG13 diagnostic and severity grading criteria for AC can provide results quickly, are minimally invasive for the patients, and are inexpensive. We recommend that the TG13 criteria be adopted in the TG18 guidelines and used as standard practice in the clinical setting. Free full articles and mobile app of TG18 are available at: http://www.jshbps.jp/modules/en/index.php?content_id=47. Related clinical questions and references are also included.

Journal ArticleDOI
TL;DR: This work proposes Neural Textures, which are learned feature maps that are trained as part of the scene capture process that can be utilized to coherently re-render or manipulate existing video content in both static and dynamic environments at real-time rates.
Abstract: The modern computer graphics pipeline can synthesize images at remarkable visual quality; however, it requires well-defined, high-quality 3D content as input. In this work, we explore the use of imperfect 3D content, for instance, obtained from photo-metric reconstructions with noisy and incomplete surface geometry, while still aiming to produce photo-realistic (re-)renderings. To address this challenging problem, we introduce Deferred Neural Rendering, a new paradigm for image synthesis that combines the traditional graphics pipeline with learnable components. Specifically, we propose Neural Textures, which are learned feature maps that are trained as part of the scene capture process. Similar to traditional textures, neural textures are stored as maps on top of 3D mesh proxies; however, the high-dimensional feature maps contain significantly more information, which can be interpreted by our new deferred neural rendering pipeline. Both neural textures and deferred neural renderer are trained end-to-end, enabling us to synthesize photo-realistic images even when the original 3D content was imperfect. In contrast to traditional, black-box 2D generative neural networks, our 3D representation gives us explicit control over the generated output, and allows for a wide range of application domains. For instance, we can synthesize temporally-consistent video re-renderings of recorded 3D scenes as our representation is inherently embedded in 3D space. This way, neural textures can be utilized to coherently re-render or manipulate existing video content in both static and dynamic environments at real-time rates. We show the effectiveness of our approach in several experiments on novel view synthesis, scene editing, and facial reenactment, and compare to state-of-the-art approaches that leverage the standard graphics pipeline as well as conventional generative neural networks.

Journal ArticleDOI
TL;DR: A phase 1 trial of adoptive Treg immunotherapy to repair or replace Tregs in patients with type 1 diabetes is reported; the therapy was safe, supporting the development of a phase 2 trial to test the efficacy of the Treg therapy.
Abstract: Type 1 diabetes (T1D) is an autoimmune disease that occurs in genetically susceptible individuals. Regulatory T cells (Tregs) have been shown to be defective in the autoimmune disease setting. Thus, efforts to repair or replace Tregs in T1D may reverse autoimmunity and protect the remaining insulin-producing β cells. On the basis of this premise, a robust technique has been developed to isolate and expand Tregs from patients with T1D. The expanded Tregs retained their T cell receptor diversity and demonstrated enhanced functional activity. We report on a phase 1 trial to assess safety of Treg adoptive immunotherapy in T1D. Fourteen adult subjects with T1D, in four dosing cohorts, received ex vivo-expanded autologous CD4(+)CD127(lo/-)CD25(+) polyclonal Tregs (0.05 × 10(8) to 26 × 10(8) cells). A subset of the adoptively transferred Tregs was long-lived, with up to 25% of the peak level remaining in the circulation at 1 year after transfer. Immune studies showed transient increases in Tregs in recipients and retained a broad Treg FOXP3(+)CD4(+)CD25(hi)CD127(lo) phenotype long-term. There were no infusion reactions or cell therapy-related high-grade adverse events. C-peptide levels persisted out to 2+ years after transfer in several individuals. These results support the development of a phase 2 trial to test efficacy of the Treg therapy.

Journal ArticleDOI
TL;DR: Progress in reducing the large worldwide stillbirth burden remains slow and insufficient to meet national targets such as those of the Every Newborn Action Plan (ENAP); countries and the global community must further improve the quality and comparability of data.

Journal ArticleDOI
TL;DR: Ending the tuberculosis epidemic in high-incidence countries needs a similar approach that guarantees access to high-quality tuberculosis care and prevention for all while simultaneously addressing the social determinants of tuberculosis.

Proceedings ArticleDOI
01 Jul 2017
TL;DR: This is the first neural network architecture that is able to outperform JPEG at image compression across most bitrates on the rate-distortion curve on the Kodak dataset images, with and without the aid of entropy coding.
Abstract: This paper presents a set of full-resolution lossy image compression methods based on neural networks. Each of the architectures we describe can provide variable compression rates during deployment without requiring retraining of the network: each network need only be trained once. All of our architectures consist of a recurrent neural network (RNN)-based encoder and decoder, a binarizer, and a neural network for entropy coding. We compare RNN types (LSTM, associative LSTM) and introduce a new hybrid of GRU and ResNet. We also study one-shot versus additive reconstruction architectures and introduce a new scaled-additive framework. We compare to previous work, showing improvements of 4.3%–8.8% AUC (area under the rate-distortion curve), depending on the perceptual metric used. As far as we know, this is the first neural network architecture that is able to outperform JPEG at image compression across most bitrates on the rate-distortion curve on the Kodak dataset images, with and without the aid of entropy coding.
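
The additive-reconstruction idea can be sketched as a simple residual-coding loop; in the toy Python sketch below the encode/decode stand-in is a hypothetical quantizer, not the paper's recurrent network or binarizer. Each iteration encodes what the previous reconstruction still misses, so quality improves as more iterations (and hence more bits) are spent.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((8, 8)).astype(np.float32)        # toy "image" patch in [0, 1]

def encode_decode(residual, n_bits=2):
    """Hypothetical stand-in for one encoder/binarizer/decoder pass: quantize
    the residual to a few levels of its own dynamic range."""
    peak = np.max(np.abs(residual)) + 1e-8
    levels = 2 ** n_bits - 1
    return np.round(residual / peak * levels) / levels * peak

reconstruction = np.zeros_like(x)
for step in range(4):                            # each iteration spends more bits
    residual = x - reconstruction                # what is still missing
    reconstruction += encode_decode(residual)    # additive reconstruction
    mse = float(np.mean((x - reconstruction) ** 2))
    print(f"iteration {step + 1}: MSE = {mse:.6f}")
```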

Journal ArticleDOI
TL;DR: In this article, the authors evaluated whether unbiased clustering analysis using dense phenotypic data (phenomapping) could identify phenotypically distinct heart failure with preserved ejection fraction (HFpEF) categories.
Abstract: Background—Heart failure with preserved ejection fraction (HFpEF) is a heterogeneous clinical syndrome in need of improved phenotypic classification. We sought to evaluate whether unbiased clustering analysis using dense phenotypic data (phenomapping) could identify phenotypically distinct HFpEF categories. Methods and Results—We prospectively studied 397 patients with HFpEF and performed detailed clinical, laboratory, ECG, and echocardiographic phenotyping of the study participants. We used several statistical learning algorithms, including unbiased hierarchical cluster analysis of phenotypic data (67 continuous variables) and penalized model-based clustering, to define and characterize mutually exclusive groups making up a novel classification of HFpEF. All phenomapping analyses were performed by investigators blinded to clinical outcomes, and Cox regression was used to demonstrate the clinical validity of phenomapping. The mean age was 65±12 years; 62% were female; 39% were black; and comorbidities wer...
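
A minimal phenomapping-style sketch, assuming a standardized matrix of continuous phenotype variables and using off-the-shelf agglomerative hierarchical clustering; this is not the study's full pipeline (which also used penalized model-based clustering), and the data below are random placeholders.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Unsupervised hierarchical clustering of standardized continuous phenotype
# variables, then cutting the dendrogram into k groups.
rng = np.random.default_rng(0)
n_patients, n_vars = 397, 67                 # dimensions quoted in the abstract
X = rng.normal(size=(n_patients, n_vars))    # placeholder for the real phenotype matrix

Xz = zscore(X, axis=0)                       # put variables on a common scale
Z = linkage(Xz, method="ward")               # agglomerative (Ward) clustering
labels = fcluster(Z, t=3, criterion="maxclust")
print("patients per cluster:", np.bincount(labels)[1:])
```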

Journal ArticleDOI
TL;DR: In this article, Manthiram et al. discuss several important design considerations for high-nickel layered oxide cathodes that will be implemented in the automotive market for the coming decade.
Abstract: High-nickel layered oxide cathode materials will be at the forefront to enable longer driving-range electric vehicles at more affordable costs with lithium-based batteries. A continued push to higher energy content and less usage of costly raw materials, such as cobalt, while preserving acceptable power, lifetime and safety metrics, calls for a suite of strategic compositional, morphological and microstructural designs and efficient material production processes. In this Perspective, we discuss several important design considerations for high-nickel layered oxide cathodes that will be implemented in the automotive market for the coming decade. We outline various intrinsic restraints of maximizing their energy output and compare current/emerging development roadmaps approaching low-/zero-cobalt chemistry. Materials production is another focus, relevant to driving down costs and addressing the practical challenges of high-nickel layered oxides for demanding vehicle applications. We further assess a series of stabilization techniques on their prospects to fulfill the aggressive targets of vehicle electrification. The development of high-nickel layered oxide cathodes represents an opportunity to realize the full potential of lithium-ion batteries for electric vehicles. Manthiram and colleagues review the materials design strategies and discuss the challenges and solutions for low-cobalt, high-energy-density cathodes.

Journal ArticleDOI
TL;DR: In this article, synergies and trade-offs among the SDGs were systematically identified using official SDG indicator data for 227 countries and the most frequent SDG interactions were ranked; positive correlations between indicator pairs were found to outweigh negative ones in most countries.
Abstract: Sustainable development goals (SDGs) have set the 2030 agenda to transform our world by tackling multiple challenges humankind is facing to ensure well-being, economic prosperity, and environmental protection. In contrast to conventional development agendas focusing on a restricted set of dimensions, the SDGs provide a holistic and multidimensional view on development. Hence, interactions among the SDGs may cause diverging results. To analyze the SDG interactions we systematize the identification of synergies and trade-offs using official SDG indicator data for 227 countries. A significant positive correlation between a pair of SDG indicators is classified as a synergy while a significant negative correlation is classified as a trade-off. We rank synergies and trade-offs between SDGs pairs on global and country scales in order to identify the most frequent SDG interactions. For a given SDG, positive correlations between indicator pairs were found to outweigh the negative ones in most countries. Among SDGs the positive and negative correlations between indicator pairs allowed for the identification of particular global patterns. SDG 1 (No poverty) has synergetic relationship with most of the other goals, whereas SDG 12 (Responsible consumption and production) is the goal most commonly associated with trade-offs. The attainment of the SDG agenda will greatly depend on whether the identified synergies among the goals can be leveraged. In addition, the highlighted trade-offs, which constitute obstacles in achieving the SDGs, need to be negotiated and made structurally nonobstructive by deeper changes in the current strategies.
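
A minimal sketch of the synergy/trade-off classification described above, using hypothetical indicator data rather than the official SDG database: indicator pairs with a significant positive correlation across countries are labelled synergies, and significant negative correlations are labelled trade-offs.

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical indicator values for 227 countries (column names are made up).
rng = np.random.default_rng(1)
data = pd.DataFrame({
    "sdg1_poverty": rng.normal(size=227),
    "sdg3_health": rng.normal(size=227),
    "sdg12_consumption": rng.normal(size=227),
})

def classify_pairs(df, alpha=0.05):
    """Label each indicator pair as synergy, trade-off, or not significant."""
    rows, cols = [], df.columns
    for i in range(len(cols)):
        for j in range(i + 1, len(cols)):
            rho, p = spearmanr(df[cols[i]], df[cols[j]], nan_policy="omit")
            label = "not significant"
            if p < alpha:
                label = "synergy" if rho > 0 else "trade-off"
            rows.append((cols[i], cols[j], round(rho, 3), round(p, 3), label))
    return pd.DataFrame(rows, columns=["indicator_a", "indicator_b", "rho", "p", "class"])

print(classify_pairs(data))
```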

Book ChapterDOI
08 Sep 2018
TL;DR: In this paper, a disentangled representation for image-to-image translation is proposed, which embeds images onto two spaces: a domain-invariant content space capturing shared information across domains and a domain-specific attribute space.
Abstract: Image-to-image translation aims to learn the mapping between two visual domains. There are two main challenges for many applications: (1) the lack of aligned training pairs and (2) multiple possible outputs from a single input image. In this work, we present an approach based on disentangled representation for producing diverse outputs without paired training images. To achieve diversity, we propose to embed images onto two spaces: a domain-invariant content space capturing shared information across domains and a domain-specific attribute space. Using the disentangled features as inputs greatly reduces mode collapse. To handle unpaired training data, we introduce a novel cross-cycle consistency loss. Qualitative results show that our model can generate diverse and realistic images on a wide range of tasks. We validate the effectiveness of our approach through extensive evaluation.

Journal ArticleDOI
TL;DR: This review examines the factors that determine NP colloidal stability, the various efforts to stabilize NPs in biological media, and the methods to characterize NP colloidal stability in situ, and includes a discussion of NP interactions with cells.
Abstract: Nanomaterials are finding increasing use for biomedical applications such as imaging, diagnostics, and drug delivery. While it is well understood that nanoparticle (NP) physico-chemical properties can dictate biological responses and interactions, it has been difficult to outline a unifying framework to directly link NP properties to expected in vitro and in vivo outcomes. When introduced to complex biological media containing electrolytes, proteins, lipids, etc., nanoparticles (NPs) are subjected to a range of forces which determine their behavior in this environment. One aspect of NP behavior in biological systems that is often understated or overlooked is aggregation. NP aggregation will significantly alter in vitro behavior (dosimetry, NP uptake, cytotoxicity), as well as in vivo fate (pharmacokinetics, toxicity, biodistribution). Thus, understanding the factors driving NP colloidal stability and aggregation is paramount. Furthermore, studying biological interactions with NPs at the nanoscale level requires an interdisciplinary effort with a robust understanding of multiple characterization techniques. This review examines the factors that determine NP colloidal stability, the various efforts to stabilize NP in biological media, the methods to characterize NP colloidal stability in situ, and provides a discussion regarding NP interactions with cells.

Posted Content
TL;DR: This paper examines the marginal excess burden of all major taxes in the United States using a multisector, dynamic computational general equilibrium model, calculating the marginal welfare effects of individual income taxes, corporate taxes, payroll taxes, sales and excise taxes, and other smaller sources of revenue.
Abstract: In recent years, increasing attention has been paid by public finance economists to the marginal excess burden (MEB) per additional dollar of tax revenue. Estimates of MEBs stand in contrast to estimates of the welfare cost of taxes which are calculated by totally removing existing taxes and replacing them with equal yield lump sum taxes. Instead, an MEB estimate measures the incremental welfare costs of raising extra revenues from an already existing distorting tax. Earlier estimates of MEBs have either concentrated on particular portions of the tax system, or have employed partial equilibrium methods. Here, we examine the MEB of all major taxes in the United States, using a multisector, dynamic computational general equilibrium model. This allows us to calculate simultaneously the marginal welfare effects of individual income taxes, corporate taxes, payroll taxes, sales and excise taxes, and other smaller sources of revenue. We find that the marginal excess burden
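
One common way to write the quantity being estimated (a standard definition consistent with the description above, not a formula taken from the paper):

```latex
% Marginal excess burden of tax instrument t: the extra welfare loss (measured,
% e.g., as the drop in equivalent variation EV) per extra dollar of revenue R
% raised by a small increase in that tax.
\mathrm{MEB}(t) \;=\; \frac{-\,\partial \mathrm{EV}/\partial t}{\partial R/\partial t}.
```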

Journal ArticleDOI
01 Sep 2016-Nature
TL;DR: This work demonstrates the transfer of energy between two vibrational modes of a cryogenic optomechanical device using topological operations and shows that this transfer arises from the presence of an exceptional point in the spectrum of the device and is non-reciprocal.
Abstract: Topological operations can achieve certain goals without requiring accurate control over local operational details; for example, they have been used to control geometric phases and have been proposed as a way of controlling the state of certain systems within their degenerate subspaces. More recently, it was predicted that topological operations can be used to transfer energy between normal modes, provided that the system possesses a specific type of degeneracy known as an exceptional point. Here we demonstrate the transfer of energy between two vibrational modes of a cryogenic optomechanical device using topological operations. We show that this transfer arises from the presence of an exceptional point in the spectrum of the device. We also show that this transfer is non-reciprocal. These results open up new directions in system control; they also open up the possibility of exploring other dynamical effects related to exceptional points, including the behaviour of thermal and quantum fluctuations in their vicinity.
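
For orientation, here is a generic two-mode non-Hermitian model in which an exceptional point appears (a textbook illustration, not the specific Hamiltonian of the device described above):

```latex
% Two coupled modes with frequencies omega_{1,2}, loss rates gamma_{1,2} and
% coupling g. Writing \bar{\omega}, \bar{\gamma} for the means and
% \delta = (\omega_1-\omega_2)/2, \delta_\gamma = (\gamma_1-\gamma_2)/2 for the
% half-differences, the eigenvalues coalesce (an exceptional point) when
% g^2 + (\delta - i\delta_\gamma)^2 = 0.
H \;=\;
\begin{pmatrix}
\omega_1 - i\gamma_1 & g \\
g & \omega_2 - i\gamma_2
\end{pmatrix},
\qquad
\lambda_\pm \;=\; \bar{\omega} - i\bar{\gamma} \;\pm\; \sqrt{g^2 + \left(\delta - i\,\delta_\gamma\right)^2}.
```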

Journal ArticleDOI
TL;DR: In this paper, the implications of Planck data for models of dark energy (DE) and modified gravity (MG), beyond the cosmological constant scenario, were studied, and a range of specific models, such as k-essence, f(R) theories and coupled DE scalar field were tested.
Abstract: We study the implications of Planck data for models of dark energy (DE) and modified gravity (MG), beyond the cosmological constant scenario. We start with cases where the DE only directly affects the background evolution, considering Taylor expansions of the equation of state, principal component analysis and parameterizations related to the potential of a minimally coupled DE scalar field. When estimating the density of DE at early times, we significantly improve present constraints. We then move to general parameterizations of the DE or MG perturbations that encompass both effective field theories and the phenomenology of gravitational potentials in MG models. Lastly, we test a range of specific models, such as k-essence, f(R) theories and coupled DE. In addition to the latest Planck data, for our main analyses we use baryonic acoustic oscillations, type-Ia supernovae and local measurements of the Hubble constant. We further show the impact of measurements of the cosmological perturbations, such as redshift-space distortions and weak gravitational lensing. These additional probes are important tools for testing MG models and for breaking degeneracies that are still present in the combination of Planck and background data sets. All results that include only background parameterizations are in agreement with LCDM. When testing models that also change perturbations (even when the background is fixed to LCDM), some tensions appear in a few scenarios: the maximum one found is \sim 2 sigma for Planck TT+lowP when parameterizing observables related to the gravitational potentials with a chosen time dependence; the tension increases to at most 3 sigma when external data sets are included. It however disappears when including CMB lensing.
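
For reference, the commonly used first-order Taylor expansion of the dark-energy equation of state (the CPL form, representative of the background parameterizations mentioned above):

```latex
% First-order expansion of the dark-energy equation of state in the scale
% factor a: w_0 is the present-day value, w_a its first derivative;
% LCDM corresponds to w_0 = -1, w_a = 0.
w(a) \;=\; w_0 + w_a\,(1 - a),
\qquad
a \;=\; \frac{1}{1+z}.
```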

Journal ArticleDOI
TL;DR: A new X-ray diffraction data-analysis package is presented with a description of the algorithms and examples of its application to biological and chemical crystallography.
Abstract: The DIALS project is a collaboration between Diamond Light Source, Lawrence Berkeley National Laboratory and CCP4 to develop a new software suite for the analysis of crystallographic X-ray diffraction data, initially encompassing spot finding, indexing, refinement and integration. The design, core algorithms and structure of the software are introduced, alongside results from the analysis of data from biological and chemical crystallography experiments.

Journal ArticleDOI
05 Oct 2018-Science
TL;DR: The three-dimensional geometries of all seismically active global subduction zones are calculated and the resulting model, called Slab2, provides a uniform geometrical analysis of all currently subducting slabs.
Abstract: Subduction zones are home to the most seismically active faults on the planet. The shallow megathrust interfaces of subduction zones host Earth’s largest earthquakes and are likely the only faults capable of magnitude 9+ ruptures. Despite these facts, our knowledge of subduction zone geometry—which likely plays a key role in determining the spatial extent and ultimately the size of subduction zone earthquakes—is incomplete. We calculated the three-dimensional geometries of all seismically active global subduction zones. The resulting model, called Slab2, provides a uniform geometrical analysis of all currently subducting slabs.

Journal ArticleDOI
TL;DR: In a US national cohort of adults with schizophrenia, excess deaths from cardiovascular and respiratory diseases implicate modifiable cardiovascular risk factors, especially tobacco use, while excess deaths attributable to alcohol or other drugs highlight the threats posed by substance abuse.
Abstract: Importance Although adults with schizophrenia have a significantly increased risk of premature mortality, sample size limitations of previous research have hindered the identification of the underlying causes. Objective To describe overall and cause-specific mortality rates and standardized mortality ratios (SMRs) for adults with schizophrenia compared with the US general population. Design, Setting, and Participants We identified a national retrospective longitudinal cohort of patients with schizophrenia 20 to 64 years old in the Medicaid program (January 1, 2001, to December 31, 2007). The cohort included 1 138 853 individuals, 4 807 121 years of follow-up, and 74 003 deaths, of which 65 553 had a known cause. Main Outcomes and Measures Mortality ratios for the schizophrenia cohort standardized to the general population with respect to age, sex, race/ethnicity, and geographic region were estimated for all-cause and cause-specific mortality. Mortality rates per 100 000 person-years and the mean years of potential life lost per death were also determined. Death record information was obtained from the National Death Index. Results Adults with schizophrenia were more than 3.5 times (all-cause SMR, 3.7; 95% CI, 3.7-3.7) as likely to die in the follow-up period as were adults in the general population. Cardiovascular disease had the highest mortality rate (403.2 per 100 000 person-years) and an SMR of 3.6 (95% CI, 3.5-3.6). Among 6 selected cancers, lung cancer had the highest mortality rate (74.8 per 100 000 person-years) and an SMR of 2.4 (95% CI, 2.4-2.5). Particularly elevated SMRs were observed for chronic obstructive pulmonary disease (9.9; 95% CI, 9.6-10.2) and influenza and pneumonia (7.0; 95% CI, 6.7-7.4). Accidental deaths (119.7 per 100 000 person-years) accounted for more than twice as many deaths as suicide (52.0 per 100 000 person-years). Nonsuicidal substance-induced death, mostly from alcohol or other drugs, was also a leading cause of death (95.2 per 100 000 person-years). Conclusions and Relevance In a US national cohort of adults with schizophrenia, excess deaths from cardiovascular and respiratory diseases implicate modifiable cardiovascular risk factors, including especially tobacco use. Excess deaths directly attributable to alcohol or other drugs highlight threats posed by substance abuse. More aggressive identification and management of cardiovascular risk factors, as well as reducing tobacco use and substance abuse, should be leading priorities in the medical care of adults with schizophrenia.
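
For reference, the standardized mortality ratio reported above is defined in the usual way (a standard epidemiological definition, not a formula specific to this study):

```latex
% Standardized mortality ratio: observed deaths O in the cohort divided by the
% deaths E expected if the age-, sex-, race/ethnicity- and region-specific
% general-population mortality rates lambda_i applied to the cohort's
% person-years P_i in each stratum i.
\mathrm{SMR} \;=\; \frac{O}{E},
\qquad
E \;=\; \sum_i P_i \,\lambda_i .
```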

Journal ArticleDOI
TL;DR: Results from a planned interim analysis of an ongoing, phase 3 study of obeticholic acid for NASH show clinically significant histological improvement that is reasonably likely to predict clinical benefit.

Journal ArticleDOI
08 Feb 2018
TL;DR: It is shown that a diffusive memristor based on silver nanoparticles in a dielectric film can be used to create an artificial neuron with stochastic leaky integrate-and-fire dynamics and tunable integration time, which is determined by silver migration alone or its interaction with circuit capacitance.
Abstract: Neuromorphic computers comprised of artificial neurons and synapses could provide a more efficient approach to implementing neural network algorithms than traditional hardware. Recently, artificial neurons based on memristors have been developed, but with limited bio-realistic dynamics and no direct interaction with the artificial synapses in an integrated network. Here we show that a diffusive memristor based on silver nanoparticles in a dielectric film can be used to create an artificial neuron with stochastic leaky integrate-and-fire dynamics and tunable integration time, which is determined by silver migration alone or its interaction with circuit capacitance. We integrate these neurons with nonvolatile memristive synapses to build fully memristive artificial neural networks. With these integrated networks, we experimentally demonstrate unsupervised synaptic weight updating and pattern classification.
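
As a minimal illustration of the stochastic leaky integrate-and-fire dynamics mentioned above, the Python sketch below simulates a generic LIF neuron with hypothetical parameters; it is not a model of the diffusive memristor device itself.

```python
import numpy as np

# Generic stochastic leaky integrate-and-fire (LIF) neuron: the membrane
# variable leaks toward rest, integrates a constant drive plus noise, and
# fires (then resets) when it crosses threshold. All values are hypothetical.
rng = np.random.default_rng(42)
dt, tau = 1e-3, 20e-3                       # time step and membrane time constant (s)
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0
i_in, noise_sigma = 1.2, 0.3                # constant drive + noise amplitude

v, spikes = v_rest, []
for step in range(2000):                    # 2 s of simulated time
    noise = noise_sigma * np.sqrt(dt) * rng.normal()
    v += dt * (-(v - v_rest) + i_in) / tau + noise   # leaky integration + stochastic term
    if v >= v_thresh:                       # fire and reset
        spikes.append(step * dt)
        v = v_reset

print(f"{len(spikes)} spikes in 2 s; first few spike times: {spikes[:5]}")
```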

Journal ArticleDOI
12 Nov 2015-Nature
TL;DR: Pooled clustered regularly interspaced palindromic repeat (CRISPR)-Cas9 guide RNA libraries are developed to perform in situ saturating mutagenesis of the human and mouse enhancers, revealing critical minimal features and discrete vulnerabilities of these enhancers.
Abstract: Enhancers, critical determinants of cellular identity, are commonly recognized by correlative chromatin marks and gain-of-function potential, although only loss-of-function studies can demonstrate their requirement in the native genomic context. Previously, we identified an erythroid enhancer of human BCL11A, subject to common genetic variation associated with the fetal haemoglobin level, the mouse orthologue of which is necessary for erythroid BCL11A expression. Here we develop pooled clustered regularly interspaced palindromic repeat (CRISPR)-Cas9 guide RNA libraries to perform in situ saturating mutagenesis of the human and mouse enhancers. This approach reveals critical minimal features and discrete vulnerabilities of these enhancers. Despite conserved function of the composite enhancers, their architecture diverges. The crucial human sequences appear to be primate-specific. Through editing of primary human progenitors and mouse transgenesis, we validate the BCL11A erythroid enhancer as a target for fetal haemoglobin reinduction. The detailed enhancer map will inform therapeutic genome editing, and the screening approach described here is generally applicable to functional interrogation of non-coding genomic elements.