Journal ArticleDOI
TL;DR: Marking nodes with biopsy-confirmed metastatic disease allows for selective removal and improves pathologic evaluation for residual nodal disease after chemotherapy, as determined in patients undergoing complete axillary lymphadenectomy.
Abstract: Purpose: Placing clips in nodes with biopsy-confirmed metastasis before initiating neoadjuvant therapy allows for evaluation of response in breast cancer. Our goal was to determine if pathologic changes in clipped nodes reflect the status of the nodal basin and if targeted axillary dissection (TAD), which includes sentinel lymph node dissection (SLND) and selective localization and removal of clipped nodes, improves the false-negative rate (FNR) compared with SLND alone. Methods: A prospective study of patients with biopsy-confirmed nodal metastases with a clip placed in the sampled node was performed. After neoadjuvant therapy, patients underwent axillary surgery and the pathology of the clipped node was compared with other nodes. Patients undergoing TAD had SLND and selective removal of the clipped node using iodine-125 seed localization. The FNR was determined in patients undergoing complete axillary lymphadenectomy (ALND). Results: Of 208 patients enrolled in this study, 191 underwent ALND, with residual dise...

584 citations


Proceedings ArticleDOI
Wenjie Luo, Bin Yang, Raquel Urtasun
18 Jun 2018
TL;DR: A novel deep neural network that is able to jointly reason about 3D detection, tracking and motion forecasting given data captured by a 3D sensor is proposed, which is very efficient in terms of both memory and computation.
Abstract: In this paper we propose a novel deep neural network that is able to jointly reason about 3D detection, tracking and motion forecasting given data captured by a 3D sensor. By jointly reasoning about these tasks, our holistic approach is more robust to occlusion as well as sparse data at range. Our approach performs 3D convolutions across space and time over a bird's eye view representation of the 3D world, which is very efficient in terms of both memory and computation. Our experiments on a new very large scale dataset captured in several North American cities show that we can outperform the state-of-the-art by a large margin. Importantly, by sharing computation we can perform all tasks in as little as 30 ms.
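The efficiency claim rests on a single 3D convolution shared jointly across time and the two spatial axes of a bird's-eye-view (BEV) grid. A minimal PyTorch sketch of that operation; the tensor shape and channel counts are illustrative assumptions, not the paper's configuration:

```python
# Minimal sketch: one 3D convolution over a time-stacked bird's-eye-view
# grid, convolving jointly across time and space. Shapes and channel
# counts are illustrative assumptions.
import torch
import torch.nn as nn

# 1 sample, 2 input channels (e.g., occupancy/height), 5 past frames,
# 256 x 256 BEV cells: layout (N, C, T, H, W).
bev_sequence = torch.randn(1, 2, 5, 256, 256)

conv3d = nn.Conv3d(in_channels=2, out_channels=16,
                   kernel_size=3, padding=1)  # 3x3x3 kernel over (T, H, W)

features = conv3d(bev_sequence)
print(features.shape)  # torch.Size([1, 16, 5, 256, 256])
```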

584 citations


Journal ArticleDOI
15 Apr 2020
TL;DR: Changes in Lachnospiraceae abundances according to health and disease are discussed and how nutrients from the host diet can influence their growth and how their metabolites can, in turn, influence host physiology are analyzed.
Abstract: The complex polymicrobial composition of human gut microbiota plays a key role in health and disease. Lachnospiraceae belong to the core of the gut microbiota, colonizing the intestinal lumen from birth and increasing, in terms of species richness and relative abundance, during the host’s life. Although members of Lachnospiraceae are among the main producers of short-chain fatty acids, different taxa of Lachnospiraceae are also associated with different intra- and extraintestinal diseases. Their impact on the host physiology is often inconsistent across different studies. Here, we discuss changes in Lachnospiraceae abundances according to health and disease. With the aim of harnessing Lachnospiraceae to promote human health, we also analyze how nutrients from the host diet can influence their growth and how their metabolites can, in turn, influence host physiology.

584 citations


ReportDOI
TL;DR: For example, this paper found that government restrictions on commercial activity and voluntary social distancing, operating with powerful effects in a service-oriented economy, are the main reasons the U.S. stock market reacted so much more forcefully to COVID-19 than to previous pandemics in 1918-19, 1957-58 and 1968.
Abstract: No previous infectious disease outbreak, including the Spanish Flu, has impacted the stock market as forcefully as the COVID-19 pandemic. In fact, previous pandemics left only mild traces on the U.S. stock market. We use text-based methods to develop these points with respect to large daily stock market moves back to 1900 and with respect to overall stock market volatility back to 1985. We also evaluate potential explanations for the unprecedented stock market reaction to the COVID-19 pandemic. The evidence we amass suggests that government restrictions on commercial activity and voluntary social distancing, operating with powerful effects in a service-oriented economy, are the main reasons the U.S. stock market reacted so much more forcefully to COVID-19 than to previous pandemics in 1918-19, 1957-58 and 1968.

584 citations


Journal ArticleDOI
TL;DR: In this paper, Li et al. studied the self-enforcing heterogeneity of lithium deposition and dissolution as the cause for dendrite formation on the lithium metal anode in various liquid organic solvent based electrolytes.
Abstract: This comparative work studies the self-enforcing heterogeneity of lithium deposition and dissolution as the cause for dendrite formation on the lithium metal anode in various liquid organic solvent based electrolytes. In addition, the ongoing lithium corrosion, its rate and thus the passivating quality of the SEI are investigated in self-discharge measurements. The behavior of the lithium anode is characterized in two carbonate-based standard electrolytes, 1 M LiPF6 in EC/DEC (3 : 7) and 1 M LiPF6 in EC/DMC (1 : 1), and in two alternative electrolytes 1 M LiPF6 in TEGDME and 1 M LiTFSI in DMSO, which have been proposed in the literature as promising electrolytes for lithium metal batteries, more specifically for lithium/air batteries. As a result, electrolyte decomposition, SEI and dendrite formation at the lithium electrode as well as their mutual influences are understood in the development of overpotentials, surface resistances and lithium electrode surface morphologies in subsequent lithium deposition and dissolution processes. A general model of different stages of these processes could be elaborated.

584 citations


Journal ArticleDOI
TL;DR: The goal is to massively increase opportunities for people with MNS disorders to access services without the prospect of discrimination or impoverishment and with the hope of attaining optimal health and social outcomes.

584 citations


Posted Content
TL;DR: PixelDefend as mentioned in this paper purifies a maliciously perturbed image by moving it back towards the distribution seen in the training data, and runs the purified image through an unmodified classifier, making the method agnostic to both the classifier and the attacking method.
Abstract: Adversarial perturbations of normal images are usually imperceptible to humans, but they can seriously confuse state-of-the-art machine learning models. What makes them so special in the eyes of image classifiers? In this paper, we show empirically that adversarial examples mainly lie in the low probability regions of the training distribution, regardless of attack types and targeted models. Using statistical hypothesis testing, we find that modern neural density models are surprisingly good at detecting imperceptible image perturbations. Based on this discovery, we devised PixelDefend, a new approach that purifies a maliciously perturbed image by moving it back towards the distribution seen in the training data. The purified image is then run through an unmodified classifier, making our method agnostic to both the classifier and the attacking method. As a result, PixelDefend can be used to protect already deployed models and be combined with other model-specific defenses. Experiments show that our method greatly improves resilience across a wide variety of state-of-the-art attacking methods, increasing accuracy on the strongest attack from 63% to 84% for Fashion MNIST and from 32% to 70% for CIFAR-10.
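The purification step can be pictured as a per-pixel search, in raster order, for the most likely value within an epsilon-ball of the input. A schematic Python sketch, with a toy stand-in for the trained PixelCNN density model (both the stand-in and the epsilon value are illustrative assumptions):

```python
# Schematic sketch of PixelDefend-style purification: for each pixel,
# pick the value within an epsilon-ball of the input that maximizes the
# likelihood under a density model. The model below is a toy stand-in;
# the real method uses a trained PixelCNN.
import numpy as np

def pixel_log_prob(image, r, c, value):
    """Toy stand-in for a trained PixelCNN's per-pixel log-likelihood."""
    return -abs(int(value) - 128) / 128.0  # toy prior favoring mid-gray

def purify(image, eps=16):
    """Greedily replace each pixel with the most likely value within eps."""
    purified = image.copy()
    h, w = image.shape
    for r in range(h):                 # raster-scan order, as in PixelCNN
        for c in range(w):
            lo = max(0, int(image[r, c]) - eps)
            hi = min(255, int(image[r, c]) + eps)
            purified[r, c] = max(
                range(lo, hi + 1),
                key=lambda v: pixel_log_prob(purified, r, c, v))
    return purified

noisy = np.random.randint(0, 256, size=(8, 8)).astype(np.uint8)
clean = purify(noisy)
print(clean.dtype, clean.shape)
```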

584 citations


Journal ArticleDOI
TL;DR: If suitably designed, supramolecular gels can be recyclable and environmentally benign, while the responsive and tunable nature of the self-assembled network offers significant advantages over other materials solutions to problems caused by pollution in an environmental setting.
Abstract: This review explores supramolecular gels as materials for environmental remediation. These soft materials are formed by self-assembling low-molecular-weight building blocks, which can be programmed with molecular-scale information by simple organic synthesis. The resulting gels often have nanoscale ‘solid-like’ networks which are sample-spanning within a ‘liquid-like’ solvent phase. There is intimate contact between the solvent and the gel nanostructure, which has a very high effective surface area as a result of its dimensions. As such, these materials have the ability to bring a solid-like phase into contact with liquids in an environmental setting. Such materials can therefore remediate unwanted pollutants from the environment, including immobilisation of oil spills, removal of dyes, extraction of heavy metals or toxic anions, and the detection or removal of chemical weapons. Controlling the interactions between the gel nanofibres and pollutants can lead to selective uptake and extraction. Furthermore, if suitably designed, such materials can be recyclable and environmentally benign, while the responsive and tunable nature of the self-assembled network offers significant advantages over other materials solutions to problems caused by pollution in an environmental setting.

584 citations


Journal ArticleDOI
Thomas W. Winkler, Anne E. Justice, Mariaelisa Graff, Llilda Barata, and 435 more (106 institutions)
TL;DR: In this paper, the authors performed meta-analyses of 114 studies with genome-wide chip and/or Metabochip data by the Genetic Investigation of Anthropometric Traits (GIANT) Consortium.
Abstract: Genome-wide association studies (GWAS) have identified more than 100 genetic variants contributing to BMI, a measure of body size, or waist-to-hip ratio (adjusted for BMI, WHRadjBMI), a measure of body shape. Body size and shape change as people grow older and these changes differ substantially between men and women. To systematically screen for age- and/or sex-specific effects of genetic variants on BMI and WHRadjBMI, we performed meta-analyses of 114 studies (up to 320,485 individuals of European descent) with genome-wide chip and/or Metabochip data by the Genetic Investigation of Anthropometric Traits (GIANT) Consortium. Each study tested the association of up to ~2.8M SNPs with BMI and WHRadjBMI in four strata (men ≤50y, men >50y, women ≤50y, women >50y) and summary statistics were combined in stratum-specific meta-analyses. We then screened for variants that showed age-specific effects (G x AGE), sex-specific effects (G x SEX) or age-specific effects that differed between men and women (G x AGE x SEX). For BMI, we identified 15 loci (11 previously established for main effects, four novel) that showed significant (FDR<5%) age-specific effects, of which 11 had larger effects in younger (<50y) than in older adults (≥50y). No sex-dependent effects were identified for BMI. For WHRadjBMI, we identified 44 loci (27 previously established for main effects, 17 novel) with sex-specific effects, of which 28 showed larger effects in women than in men, five showed larger effects in men than in women, and 11 showed opposite effects between sexes. No age-dependent effects were identified for WHRadjBMI. This is the first genome-wide interaction meta-analysis to report convincing evidence of age-dependent genetic effects on BMI. In addition, we confirm the sex-specificity of genetic effects on WHRadjBMI. These results may provide further insights into the biology that underlies weight change with age or the sexual dimorphism of body shape.
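The screening step amounts to asking whether a variant's effect differs between strata (for example men vs. women, or ≤50y vs. >50y adults). A minimal sketch of such a stratum-difference test under a normal approximation; the effect sizes below are invented for illustration:

```python
# Minimal sketch of a stratum-difference (interaction) test: compare two
# stratum-specific effect estimates (beta, SE) with a two-sided Z-test.
# The numbers are invented for illustration.
from math import sqrt, erf

def stratum_difference_test(beta1, se1, beta2, se2):
    z = (beta1 - beta2) / sqrt(se1**2 + se2**2)
    p = 1 - erf(abs(z) / sqrt(2))  # two-sided p under a normal approximation
    return z, p

# e.g., a SNP whose BMI effect looks larger in younger than in older adults
z, p = stratum_difference_test(beta1=0.030, se1=0.005,
                               beta2=0.012, se2=0.005)
print(f"z = {z:.2f}, p = {p:.2e}")
```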

584 citations


Journal ArticleDOI
TL;DR: In this paper, a global analysis of the neutrino oscillation data available as of fall 2018 is presented in the framework of three massive mixed neutrinos, with the goal of determining the ranges of allowed values for the six relevant parameters.
Abstract: We present the results of a global analysis of the neutrino oscillation data available as of fall 2018 in the framework of three massive mixed neutrinos, with the goal of determining the ranges of allowed values for the six relevant parameters. We describe the complementarity and quantify the tensions among the results of the different data samples contributing to the determination of each parameter. We also show how those vary when combining our global likelihood with the chi^2 map provided by Super-Kamiokande for their atmospheric neutrino data analysis in the same framework. The best fit of the analysis is for the normal mass ordering, with inverted ordering being disfavoured with Delta_chi^2 = 4.7 (9.3) without (with) SK-atm. We find a preference for the second octant of theta_23, disfavouring the first octant with Delta_chi^2 = 4.4 (6.0) without (with) SK-atm. The best fit for the complex phase is delta_CP = 215 deg, with CP conservation being allowed at Delta_chi^2 = 1.5 (1.8). As a byproduct we quantify the correlated ranges for the laboratory observables sensitive to the absolute neutrino mass scale in beta decay, m_nu_e, and neutrino-less double beta decay, m_ee, and the total mass of the neutrinos, Sigma, which is most relevant in cosmology.

584 citations


Journal ArticleDOI
TL;DR: In this article, a circuit that pairs a flux qubit with an LC oscillator via Josephson junctions is proposed, and the energy eigenstates including the ground state are predicted to be highly entangled.
Abstract: A circuit that pairs a flux qubit with an LC oscillator via Josephson junctions pushes the coupling between light and matter to uncharted territory, with the potential for new applications in quantum technologies. The interaction between an atom and the electromagnetic field inside a cavity [1-6] has played a crucial role in developing our understanding of light–matter interaction, and is central to various quantum technologies, including lasers and many quantum computing architectures. Superconducting qubits [7,8] have allowed the realization of strong [9,10] and ultrastrong [11-13] coupling between artificial atoms and cavities. If the coupling strength g becomes as large as the atomic and cavity frequencies (Δ and ω_o, respectively), the energy eigenstates including the ground state are predicted to be highly entangled [14]. There has been an ongoing debate [15-17] over whether it is fundamentally possible to realize this regime in realistic physical systems. By inductively coupling a flux qubit and an LC oscillator via Josephson junctions, we have realized circuits with g/ω_o ranging from 0.72 to 1.34 and g/Δ ≫ 1. Using spectroscopy measurements, we have observed unconventional transition spectra that are characteristic of this new regime. Our results provide a basis for ground-state-based entangled pair generation and open a new direction of research on strongly correlated light–matter states in circuit quantum electrodynamics.

Journal ArticleDOI
TL;DR: In this article, the authors investigate the barriers that prevent SMEs from realising the benefits of the circular economy and identify several enabling factors that help SMEs adopt circular economy practices.
Abstract: Small and medium-sized enterprises (SMEs) are increasingly aware of the benefits of closing loops and improving resource efficiency, such as saving material costs, creating competitive advantages, and accessing new markets. At the same time, however, various barriers pose challenges to small businesses in their transition to a circular economy, namely a lack of financial resources and lack of technical skills. The aim of this paper is to increase knowledge and understanding about the barriers and enablers experienced by SMEs when implementing circular economy business models. Looking first at the barriers that prevent SMEs from realising the benefits of the circular economy, an investigation is carried out in the form of a literature review and an analysis of a sample of SME case studies that are featured on the GreenEcoNet EU-funded web platform. Several enabling factors that help SMEs adopt circular economy practices are then identified. The paper concludes that although various policy instruments are available to help SMEs incorporate circular economy principles into their business models, several barriers remain. The authors recommend that European and national policies strengthen their focus on greening consumer preferences, market value chains and company cultures, and support the recognition of SMEs’ green business models. This can be achieved through the creation of dedicated marketplaces and communities of practice, for example.

Journal ArticleDOI
TL;DR: A dimensionality-reduction method is developed, (Z)ero (I)nflated (F)actor (A)nalysis (ZIFA), which explicitly models the dropout characteristics, and it is shown that it improves modeling accuracy on simulated and biological data sets.
Abstract: Single-cell RNA-seq data allows insight into normal cellular function and various disease states through molecular characterization of gene expression on the single cell level. Dimensionality reduction of such high-dimensional data sets is essential for visualization and analysis, but single-cell RNA-seq data are challenging for classical dimensionality-reduction methods because of the prevalence of dropout events, which lead to zero-inflated data. Here, we develop a dimensionality-reduction method, (Z)ero (I)nflated (F)actor (A)nalysis (ZIFA), which explicitly models the dropout characteristics, and show that it improves modeling accuracy on simulated and biological data sets.
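The dropout behavior that ZIFA models can be stated simply: the lower a gene's underlying expression, the more likely its measurement collapses to zero. A minimal simulation sketch, assuming the commonly used parameterization p(dropout) = exp(-λ·μ²) with illustrative parameter values:

```python
# Minimal sketch of the zero-inflation (dropout) mechanism ZIFA models:
# the probability of reading zero decays with the latent expression level.
# The decay parameter and latent distribution are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
lam = 0.1                                             # dropout decay parameter
mu = rng.gamma(shape=2.0, scale=1.5, size=(200, 50))  # latent expression
p_dropout = np.exp(-lam * mu**2)          # higher expression -> fewer zeros
observed = np.where(rng.random(mu.shape) < p_dropout, 0.0, mu)

print(f"fraction of zeros: {(observed == 0).mean():.2f}")
```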

Journal ArticleDOI
TL;DR: A substantial improvement in survival in France for newborns born at 25 through 31 weeks' gestation was accompanied by an important reduction in severe morbidity, but survival remained rare before 25 weeks, although improvement at extremely low gestational ages may be possible.
Abstract: Up-to-date estimates of the health outcomes of preterm children are needed for assessing perinatal care, informing parents, making decisions about care, and providing evidence for clinical guidelines. To determine survival and neonatal morbidity of infants born from 22 through 34 completed weeks' gestation in France in 2011 and compare these outcomes with a comparable cohort in 1997. The EPIPAGE-2 study is a national, prospective, population-based cohort study conducted in all maternity and neonatal units in France in 2011. A total of 2205 births (stillbirths and live births) and terminations of pregnancy at 22 through 26 weeks' gestation, 3257 at 27 through 31 weeks, and 1234 at 32 through 34 weeks were studied. Cohort data were collected from January 1 through December 31, 1997, and from March 28 through December 31, 2011. Analyses for 1997 were run for the entire year and then separately for April to December; the rates for survival and morbidities did not differ. Data are therefore presented for the whole year in 1997 and the 8-month and 6-month periods in 2011. The main outcomes were survival to discharge and survival without any of the following adverse outcomes: grade III or IV intraventricular hemorrhage, cystic periventricular leukomalacia, severe bronchopulmonary dysplasia, retinopathy of prematurity (stage 3 or higher), or necrotizing enterocolitis (stages 2-3). A total of 0.7% of infants born before 24 weeks' gestation survived to discharge: 31.2% of those born at 24 weeks, 59.1% at 25 weeks, and 75.3% at 26 weeks. Survival rates were 93.6% at 27 through 31 weeks and 98.9% at 32 through 34 weeks. Infants discharged home without severe neonatal morbidity represented 0% at 23 weeks, 11.6% at 24 weeks, 30.0% at 25 weeks, 47.5% at 26 weeks, 81.3% at 27 through 31 weeks, and 96.8% at 32 through 34 weeks. Compared with 1997, the proportion of infants surviving without severe morbidity in 2011 increased by 14.4% (P < .001) at 25 through 29 weeks and 6% (P < .001) at 30 through 31 weeks but did not change appreciably for those born at less than 25 weeks. The rates of antenatal corticosteroid use, induced preterm deliveries, cesarean deliveries, and surfactant use increased significantly in all gestational-age groups, except at 22 through 23 weeks. The substantial improvement in survival in France for newborns born at 25 through 31 weeks' gestation was accompanied by an important reduction in severe morbidity, but survival remained rare before 25 weeks. Although improvement in survival at extremely low gestational age may be possible, its effect on long-term outcomes requires further studies. The long-term results of the EPIPAGE-2 study will be informative in this regard.

Journal ArticleDOI
TL;DR: Among Swedish girls and women 10 to 30 years old, quadrivalent HPV vaccination was associated with a substantially reduced risk of invasive cervical cancer at the population level.
Abstract: Background The efficacy and effectiveness of the quadrivalent human papillomavirus (HPV) vaccine in preventing high-grade cervical lesions have been shown. However, data to inform the rela...

Journal ArticleDOI
TL;DR: Graphene nanoparticle hybrids exist in two forms, as graphene–nanoparticle composites and graphene-encapsulated nanoparticles, and can be used for various bioapplications including biosensors, photothermal therapies, stem cell/tissue engineering, drug/gene delivery, and bioimaging.
Abstract: Graphene is composed of single-atom-thick sheets of sp2-bonded carbon atoms that are arranged in a perfect two-dimensional (2D) honeycomb lattice. Because of this structure, graphene is characterized by a number of unique and exceptional structural, optical, and electronic properties [1]. Specifically, these extraordinary properties include, but are not limited to, a high planar surface area that is calculated to be 2630 m2 g−1 [2], superior mechanical strength with a Young’s modulus of 1100 GPa [3], unparalleled thermal conductivity (5000 W m−1 K−1) [4], remarkable electronic properties (e.g., high carrier mobility [10 000 cm2 V−1 s−1] and capacity) [5], and alluring optical characteristics (e.g., high opacity [~97.7%] and the ability to quench fluorescence) [6]. As such, it should come as no surprise that graphene is currently, without any doubt, the most intensively studied material for a wide range of applications that include electronic, energy, and sensing outlets [1c]. Moreover, because of these unique chemical and physical properties, graphene and graphene-based nanomaterials have attracted increasing interest, and, arguably, hold the greatest promise for implementation into a wide array of bioapplications [7]. In the last several years, numerous studies have utilized graphene in bioapplications ranging from the delivery of chemotherapeutics for the treatment of cancer [8] to biosensing applications for a host of medical conditions [9] and even for the differentiation and imaging of stem cells [10]. While promising and exciting, recent reports have demonstrated that the combination of graphene with nanomaterials such as nanoparticles, thereby forming graphene–nanoparticle hybrid structures, offers a number of additional unique physicochemical properties and functions that are both highly desirable and markedly advantageous for bioapplications when compared to the use of either material alone (Figure 1) [11]. These graphene–nanoparticle hybrid structures are especially alluring because not only do they display the individual properties of the nanoparticles, which can already possess beneficial optical, electronic, magnetic, and structural properties that are unavailable in bulk materials, and of graphene, but they also exhibit additional advantageous and often synergistic properties that greatly augment their potential for bioapplications.
Figure 1 (caption): Graphene–nanoparticle hybrids exist in two forms, as graphene–nanoparticle composites and graphene-encapsulated nanoparticles, and can be used for various bioapplications including biosensors, photothermal therapies, stem cell/tissue engineering, drug/gene delivery, and bioimaging. Panel (A) reprinted with permission from ref 110, copyright 2012 Wiley; panel (B) from ref 211, copyright 2013 Elsevier; panel (C) from ref 244, copyright 2013 Wiley.

Journal ArticleDOI
TL;DR: The immunogenicity of ccRCC tumors cannot be explained by mutation load or neo-antigen load, but is highly correlated with MHC class I antigen-presenting machinery (APM) expression, and both APM and T cell levels are negatively associated with subclone number.
Abstract: Tumor-infiltrating immune cells have been linked to prognosis and response to immunotherapy; however, the levels of distinct immune cell subsets and the signals that draw them into a tumor, such as the expression of antigen presenting machinery genes, remain poorly characterized. Here, we employ a gene expression-based computational method to profile the infiltration levels of 24 immune cell populations in 19 cancer types. We compare cancer types using an immune infiltration score and a T cell infiltration score and find that clear cell renal cell carcinoma (ccRCC) is among the highest for both scores. Using immune infiltration profiles as well as transcriptomic and proteomic datasets, we characterize three groups of ccRCC tumors: T cell enriched, heterogeneously infiltrated, and non-infiltrated. We observe that the immunogenicity of ccRCC tumors cannot be explained by mutation load or neo-antigen load, but is highly correlated with MHC class I antigen-presenting machinery (APM) expression. We explore the prognostic value of distinct T cell subsets and show in two cohorts that Th17 cells and CD8+ T/Treg ratio are associated with improved survival, whereas Th2 cells and Tregs are associated with negative outcomes. Investigation of the association of immune infiltration patterns with the subclonal architecture of tumors shows that both APM and T cell levels are negatively associated with subclone number. Our analysis sheds light on the immune infiltration patterns of 19 human cancers and unravels mRNA signatures with prognostic utility and immunotherapeutic biomarker potential in ccRCC.

Posted Content
TL;DR: This paper builds upon the favorable domain shift-robust properties of deep learning methods, and develops a low-rank parameterized CNN model for end-to-end DG learning that outperforms existing DG alternatives.
Abstract: The problem of domain generalization is to learn from multiple training domains, and extract a domain-agnostic model that can then be applied to an unseen domain. Domain generalization (DG) has a clear motivation in contexts where there are target domains with distinct characteristics, yet sparse data for training. For example, recognition in sketch images, which are distinctly more abstract and rarer than photos. Nevertheless, DG methods have primarily been evaluated on photo-only benchmarks focusing on alleviating the dataset bias, where both problems of domain distinctiveness and data sparsity can be minimal. We argue that these benchmarks are overly straightforward, and show that simple deep learning baselines perform surprisingly well on them. In this paper, we make two main contributions: Firstly, we build upon the favorable domain shift-robust properties of deep learning methods, and develop a low-rank parameterized CNN model for end-to-end DG learning. Secondly, we develop a DG benchmark dataset covering photo, sketch, cartoon and painting domains. This is both more practically relevant, and harder (bigger domain shift) than existing benchmarks. The results show that our method outperforms existing DG alternatives, and our dataset provides a more significant DG challenge to drive future research.

Journal ArticleDOI
TL;DR: The current review examined the role in AD of oxidative stress, a process referring to an imbalance between antioxidants and oxidants in favour of oxidants, which can occur as a result of increased free radicals or a decrease in antioxidant defense.
Abstract: Alzheimer's disease (AD) is the most common cause of disability in individuals aged >65 years worldwide. AD is characterized by the abnormal deposition of amyloid β (Aβ) peptide, intracellular accumulation of neurofibrillary tangles of hyperphosphorylated τ protein, and dementia. The neurotoxic oligomeric Aβ peptide, which is the neuropathological diagnostic criterion of the disease, together with τ protein, mediates the neurodegeneration that is among the main causative factors. However, these phenomena are mainly initiated and enhanced by oxidative stress, a process referring to an imbalance between antioxidants and oxidants in favour of oxidants. This imbalance can occur as a result of increased free radicals or a decrease in antioxidant defense, a free radical being a species that contains one or more unpaired electrons in its outer shell. The major source of potent free radicals is the reduction of molecular oxygen in water, which initially yields the superoxide radical; this in turn produces hydrogen peroxide by the addition of an electron. The reduction of hydrogen peroxide produces highly reactive hydroxyl radicals. These species, termed reactive oxygen species (ROS), can react with lipids, proteins, nucleic acids, and other molecules and may alter their structures and functions. Tissues and organs, particularly the brain, are thus vulnerable to ROS because of their composition: the brain is largely composed of easily oxidizable lipids while featuring a high oxygen consumption rate. The current review examined the role of oxidative stress in AD.

Posted Content
Abstract: Presented on August 28, 2018 at 12:15 p.m. in the Pettit Microelectronics Research Center, Room 102 A/B.

Journal ArticleDOI
TL;DR: In this article, the authors present an unbiased forecasting model built upon a probabilistic mass-radius relation conditioned on a sample of 316 well-constrained objects, which can predict the mass (or radius) from the radius (or mass) for objects covering nine orders of magnitude in mass.
Abstract: Mass and radius are two of the most fundamental properties of an astronomical object. Increasingly, new planet discoveries are being announced with a measurement of one of these terms, but not both. This has led to a growing need to forecast the missing quantity using the other, especially when predicting the detectability of certain follow-up observations. We present an unbiased forecasting model built upon a probabilistic mass-radius relation conditioned on a sample of 316 well-constrained objects. Our publicly available code, Forecaster, accounts for observational errors, hyper-parameter uncertainties and the intrinsic dispersions observed in the calibration sample. By conditioning our model upon a sample spanning dwarf planets to late-type stars, Forecaster can predict the mass (or radius) from the radius (or mass) for objects covering nine orders of magnitude in mass. Classification is naturally performed by our model, which uses four classes we label as Terran worlds, Neptunian worlds, Jovian worlds and stars. Our classification identifies dwarf planets as merely low-mass Terrans (like the Earth), and brown dwarfs as merely high-mass Jovians (like Jupiter). We detect a transition in the mass-radius relation at $2.0_{-0.6}^{+0.7} M_\oplus$, which we associate with the divide between solid, Terran worlds and Neptunian worlds. This independent analysis adds further weight to the emerging consensus that rocky Super-Earths represent a narrower region of parameter space than originally thought. Effectively, then, the Earth is the Super-Earth we have been looking for.
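The heart of the model is a probabilistic broken power law, R ∝ M^S within each class. A minimal deterministic sketch with a transition at 2 Earth masses; the constants and exponents are illustrative assumptions, not Forecaster's published posterior values, and the real model is probabilistic:

```python
# Minimal deterministic sketch of a broken power-law mass-radius relation
# of the kind Forecaster fits probabilistically: R = C * M^S within each
# class, with a Terran/Neptunian transition near 2 Earth masses.
# Constants and exponents are illustrative assumptions.
def forecast_radius(mass_earth: float) -> float:
    if mass_earth < 2.0:                  # Terran worlds: rocky scaling
        return 1.0 * mass_earth ** 0.28
    return 0.81 * mass_earth ** 0.59      # Neptunian worlds: volatile-rich

for m in (0.5, 1.0, 2.0, 10.0):
    print(f"M = {m:5.1f} M_earth -> R ~ {forecast_radius(m):.2f} R_earth")
```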

Journal ArticleDOI
TL;DR: It seems that ST258 is a hybrid clone that was created by a large recombination event between ST11 and ST442, and incompatibility group F plasmids with blaKPC have contributed significantly to the success of ST258.
Abstract: The management of infections due to Klebsiella pneumoniae has been complicated by the emergence of antimicrobial resistance, especially to carbapenems. Resistance to carbapenems in K. pneumoniae involves multiple mechanisms, including the production of carbapenemases (e.g., KPC, NDM, VIM, OXA-48-like), as well as alterations in outer membrane permeability mediated by the loss of porins and the upregulation of efflux systems. The latter two mechanisms are often combined with high levels of other types of β-lactamases (e.g., AmpC). K. pneumoniae sequence type 258 (ST258) emerged during the early to mid-2000s as an important human pathogen and has spread extensively throughout the world. ST258 comprises two distinct lineages, namely, clades I and II, and it seems that ST258 is a hybrid clone that was created by a large recombination event between ST11 and ST442. Incompatibility group F plasmids with blaKPC have contributed significantly to the success of ST258. The optimal treatment of infections due to carbapenemase-producing K. pneumoniae remains unknown. Some newer agents show promise for treating infections due to KPC producers; however, effective options for the treatment of NDM producers remain elusive.

Posted ContentDOI
27 Mar 2018 - bioRxiv
TL;DR: TBtools is described, a Toolkit for Biologists integrating various HTS-data handling tools with a user-friendly interface that facilitates many simple, routine, but elaborate tasks on HTS data, such as bulk sequence extraction, gene set functional enrichment, and Venn diagram generation.
Abstract: Various software tools and pipelines have been developed for mining biological information from high-throughput sequencing (HTS) data, and most of them rely on a programming and command-line environment with which most biologists are unfamiliar. Bioinformatic tools with a user-friendly interface are preferred by wet-lab biologists. Here, we describe TBtools, a Toolkit for Biologists integrating various HTS-data handling tools with a user-friendly interface. It includes a large collection of functions that facilitate many simple, routine, but elaborate tasks on HTS data, such as bulk sequence extraction, gene set functional enrichment, and Venn diagram generation. TBtools can run under all operating systems with JRE 1.6 and is freely available at github.com/CJ-Chen/TBtools. Since its development, it has been used by many researchers. It will be a useful toolkit for wet-lab biologists working on all kinds of high-throughput data.

Journal ArticleDOI
TL;DR: The study is the first to show the extent to which search systems can effectively and efficiently perform (Boolean) searches with regard to precision, recall, and reproducibility, and to demonstrate why Google Scholar is inappropriate as a principal search system.
Abstract: Rigorous evidence identification is essential for systematic reviews and meta-analyses (evidence syntheses) because the sample selection of relevant studies determines a review's outcome, validity, and explanatory power. Yet, the search systems allowing access to this evidence provide varying levels of precision, recall, and reproducibility and also demand different levels of effort. To date, it remains unclear which search systems are most appropriate for evidence synthesis and why. Advice on which search engines and bibliographic databases to choose for systematic searches is limited and lacking systematic, empirical performance assessments. This study investigates and compares the systematic search qualities of 28 widely used academic search systems, including Google Scholar, PubMed, and Web of Science. A novel, query-based method tests how well users are able to interact and retrieve records with each system. The study is the first to show the extent to which search systems can effectively and efficiently perform (Boolean) searches with regard to precision, recall, and reproducibility. We found substantial differences in the performance of search systems, meaning that their usability in systematic searches varies. Indeed, only half of the search systems analyzed and only a few Open Access databases can be recommended for evidence syntheses without adding substantial caveats. In particular, our findings demonstrate why Google Scholar is inappropriate as a principal search system. We call for database owners to recognize the requirements of evidence synthesis and for academic journals to reassess quality requirements for systematic reviews. Our findings aim to support researchers in conducting better searches for better evidence synthesis.
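Precision and recall carry their standard retrieval meanings here: the share of retrieved records that are relevant, and the share of relevant records that were retrieved. A minimal sketch with invented record sets:

```python
# Minimal sketch of the standard retrieval metrics used to compare
# search systems. The record sets are invented for illustration.
def precision_recall(retrieved: set, relevant: set):
    hits = len(retrieved & relevant)          # relevant records retrieved
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

retrieved = {"rec1", "rec2", "rec3", "rec4"}  # what the system returned
relevant = {"rec2", "rec3", "rec5"}           # the known relevant set
p, r = precision_recall(retrieved, relevant)
print(f"precision = {p:.2f}, recall = {r:.2f}")  # 0.50, 0.67
```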

Journal ArticleDOI
TL;DR: The present consensus statement summarizes current strategies on diagnosis, treatment, and prevention of 2019-nCoV infection in children and is based on the Novel Coronavirus Infection Pneumonia Diagnosis and Treatment Standards (the fourth edition) and other previous diagnosis and treatment strategies for pediatric virus infections.
Abstract: Since the outbreak of 2019 novel coronavirus infection (2019-nCoV) in Wuhan City, China, by January 30, 2020, a total of 9692 confirmed cases and 15,238 suspected cases had been reported across 31 provinces or cities in China. Among the confirmed cases, 1527 were severe, 171 had recovered and been discharged home, and 213 had died. Among these cases, a total of 28 children aged from 1 month to 17 years have been reported in China. To standardize the prevention and management of 2019-nCoV infections in children, we convened an experts’ committee to formulate this experts’ consensus statement. This statement is based on the Novel Coronavirus Infection Pneumonia Diagnosis and Treatment Standards (the fourth edition) (National Health Committee) and other previous diagnosis and treatment strategies for pediatric virus infections. The present consensus statement summarizes current strategies on the diagnosis, treatment, and prevention of 2019-nCoV infection in children.

Journal ArticleDOI
TL;DR: It is demonstrated that antibodies reactive with the T cell-specific T3 antigen were insufficient to result in the activation of Jurkat cells, determined by the secretion of IL 2, demonstrating a two-stimulus requirement for gene expression in human T cells.
Abstract: The human T cell leukemia Jurkat was used as a model to examine the requirements of T cell activation. These studies demonstrated that antibodies reactive with the T cell-specific T3 antigen were insufficient to result in the activation of Jurkat cells, determined by the secretion of IL 2. IL 2 production occurred only in the presence of a second stimulus, the phorbol ester PMA. With the use of an IL 2-specific cDNA probe, the appearance of IL 2 RNA, similarly, occurred only when cells were stimulated with both anti-T3 antibodies and PMA. These results demonstrate a two-stimulus requirement for gene expression in human T cells.

Journal ArticleDOI
04 May 2016 - BMJ
TL;DR: There was a U shaped association between BMI and mortality in analyses with a greater potential for bias, including those of all participants and of current, former, or ever smokers, and in studies with a short duration of follow-up (<5 years or <10 years) or with moderate study quality scores.
Abstract: Objective: To conduct a systematic review and meta-analysis of cohort studies of body mass index (BMI) and the risk of all cause mortality, and to clarify the shape and the nadir of the dose-response curve, and the influence on the results of confounding from smoking, weight loss associated with disease, and preclinical disease. Data sources: PubMed and Embase databases searched up to 23 September 2015. Study selection: Cohort studies that reported adjusted risk estimates for at least three categories of BMI in relation to all cause mortality. Data synthesis: Summary relative risks were calculated with random effects models. Non-linear associations were explored with fractional polynomial models. Results: 230 cohort studies (207 publications) were included. The analysis of never smokers included 53 cohort studies (44 risk estimates) with >738 144 deaths and >9 976 077 participants. The analysis of all participants included 228 cohort studies (198 risk estimates) with >3 744 722 deaths among 30 233 329 participants. The summary relative risk for a 5 unit increment in BMI was 1.18 (95% confidence interval 1.15 to 1.21; I²=95%, n=44) among never smokers, 1.21 (1.18 to 1.25; I²=93%, n=25) among healthy never smokers, 1.27 (1.21 to 1.33; I²=89%, n=11) among healthy never smokers with exclusion of early follow-up, and 1.05 (1.04 to 1.07; I²=97%, n=198) among all participants. There was a J shaped dose-response relation in never smokers. Conclusion: Overweight and obesity are associated with increased risk of all cause mortality, and the nadir of the curve was observed at BMI 23-24 among never smokers, 22-23 among healthy never smokers, and 20-22 with longer durations of follow-up. The increased risk of mortality observed in underweight people could at least partly be caused by residual confounding from prediagnostic disease. Lack of exclusion of ever smokers, people with prevalent and preclinical disease, and early follow-up could bias the results towards a more U shaped association.
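The summary relative risks were calculated with random effects models; a minimal sketch of DerSimonian-Laird pooling on log relative risks (the inputs are invented, not the study's data):

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling of per-study
# log relative risks and standard errors. Inputs are invented numbers.
import math

def dersimonian_laird(log_rr, se):
    w = [1 / s**2 for s in se]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, log_rr))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)     # between-study variance
    w_star = [1 / (s**2 + tau2) for s in se]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_rr)) / sum(w_star)
    return math.exp(pooled)

rr = dersimonian_laird(
    log_rr=[math.log(1.15), math.log(1.22), math.log(1.18)],
    se=[0.02, 0.03, 0.025])
print(f"pooled RR per 5-unit BMI increment ~ {rr:.2f}")
```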

Book ChapterDOI
08 Sep 2018
TL;DR: A novel scene graph generation model called Graph R-CNN, that is both effective and efficient at detecting objects and their relations in images, is proposed and a new evaluation metric is introduced that is more holistic and realistic than existing metrics.
Abstract: We propose a novel scene graph generation model called Graph R-CNN, that is both effective and efficient at detecting objects and their relations in images. Our model contains a Relation Proposal Network (RePN) that efficiently deals with the quadratic number of potential relations between objects in an image. We also propose an attentional Graph Convolutional Network (aGCN) that effectively captures contextual information between objects and relations. Finally, we introduce a new evaluation metric that is more holistic and realistic than existing metrics. We report state-of-the-art performance on scene graph generation as evaluated using both existing and our proposed metrics.

Book
04 Mar 2019
TL;DR: The theory of macrostructures as mentioned in this paper is the result of research carried out during the previous 10 years in the domains of literary theory, text grammar, the general theory of discourse, pragmatics, and the cognitive psychology of discourse processing.
Abstract: Macrostructures are higher-level semantic or conceptual structures that organize the ‘local’ microstructures of discourse, interaction, and their cognitive processing. They are distinguished from other global structures of a more schematic nature, which we call superstructures. Originally published in 1980, the theory of macrostructures outlined in this book is the result of research carried out during the previous 10 years in the domains of literary theory, text grammar, the general theory of discourse, pragmatics, and the cognitive psychology of discourse processing. The presentation of the theory is systematic but informal and at this stage was not intended to be fully formalized.

Journal ArticleDOI
18 Nov 2016 - Science
TL;DR: In this article, the authors demonstrate an integrated platform for scalable quantum nanophotonics based on silicon-vacancy (SiV) color centers coupled to diamond nanodevices.
Abstract: Efficient interfaces between photons and quantum emitters form the basis for quantum networks and enable optical nonlinearities at the single-photon level. We demonstrate an integrated platform for scalable quantum nanophotonics based on silicon-vacancy (SiV) color centers coupled to diamond nanodevices. By placing SiV centers inside diamond photonic crystal cavities, we realize a quantum-optical switch controlled by a single color center. We control the switch using SiV metastable states and observe optical switching at the single-photon level. Raman transitions are used to realize a single-photon source with a tunable frequency and bandwidth in a diamond waveguide. By measuring intensity correlations of indistinguishable Raman photons emitted into a single waveguide, we observe a quantum interference effect resulting from the superradiant emission of two entangled SiV centers.