
Journal ArticleDOI
TL;DR: The results reveal distinct host inflammatory cytokine profiles to SARS-CoV-2 infection in patients, and highlight the association between COVID-19 pathogenesis and excessive cytokine release such as CCL2/MCP-1, CXCL10/IP-10, CCL3/MIP-1A, and CCL4/Mip1B.
Abstract: Circulating in China and 158 other countries and areas, the ongoing COVID-19 outbreak has caused devastating mortality and posed a great threat to public health. However, efforts to identify effective supportive therapeutic drugs and treatments have been hampered by our limited understanding of the host immune response to this fatal disease. To characterize the transcriptional signatures of the host inflammatory response to SARS-CoV-2 (HCoV-19) infection, we carried out transcriptome sequencing of the RNAs isolated from the bronchoalveolar lavage fluid (BALF) and peripheral blood mononuclear cell (PBMC) specimens of COVID-19 patients. Our results reveal distinct host inflammatory cytokine profiles to SARS-CoV-2 infection in patients, and highlight the association between COVID-19 pathogenesis and excessive release of cytokines such as CCL2/MCP-1, CXCL10/IP-10, CCL3/MIP-1A, and CCL4/MIP1B. Furthermore, SARS-CoV-2-induced activation of apoptosis and the P53 signalling pathway in lymphocytes may be the cause of patients' lymphopenia. The transcriptome dataset of COVID-19 patients will be a valuable resource for clinical guidance on anti-inflammatory medication and for understanding the molecular mechanisms of the host response.

918 citations


Journal ArticleDOI
TL;DR: It is found that CRISPR/Cas9 could effectively cleave the endogenous β-globin gene (HBB), however, the efficiency of homologous recombination directed repair (HDR) of HBB was low and the edited embryos were mosaic.
Abstract: Genome editing tools such as the clustered regularly interspaced short palindromic repeat (CRISPR)-associated system (Cas) have been widely used to modify genes in model systems including animal zygotes and human cells, and hold tremendous promise for both basic research and clinical applications. To date, a serious knowledge gap remains in our understanding of DNA repair mechanisms in human early embryos, and in the efficiency and potential off-target effects of using technologies such as CRISPR/Cas9 in human pre-implantation embryos. In this report, we used tripronuclear (3PN) zygotes to further investigate CRISPR/Cas9-mediated gene editing in human cells. We found that CRISPR/Cas9 could effectively cleave the endogenous β-globin gene (HBB). However, the efficiency of homologous recombination directed repair (HDR) of HBB was low and the edited embryos were mosaic. Off-target cleavage was also apparent in these 3PN zygotes as revealed by the T7E1 assay and whole-exome sequencing. Furthermore, the endogenous delta-globin gene (HBD), which is homologous to HBB, competed with exogenous donor oligos to act as the repair template, leading to untoward mutations. Our data also indicated that repair of the HBB locus in these embryos occurred preferentially through the non-crossover HDR pathway. Taken together, our work highlights the pressing need to further improve the fidelity and specificity of the CRISPR/Cas9 platform, a prerequisite for any clinical applications of CRISPR/Cas9-mediated editing.

917 citations


Proceedings ArticleDOI
30 Oct 2017
TL;DR: DeepLog, a deep neural network model utilizing Long Short-Term Memory (LSTM), is proposed to model a system log as a natural language sequence, allowing DeepLog to automatically learn log patterns from normal execution and to detect anomalies when log patterns deviate from the model trained on log data from normal execution.
Abstract: Anomaly detection is a critical step towards building a secure and trustworthy system. The primary purpose of a system log is to record system states and significant events at various critical points to help debug system failures and perform root cause analysis. Such log data is universally available in nearly all computer systems. Log data is an important and valuable resource for understanding system status and performance issues; therefore, the various system logs are naturally an excellent source of information for online monitoring and anomaly detection. We propose DeepLog, a deep neural network model utilizing Long Short-Term Memory (LSTM), to model a system log as a natural language sequence. This allows DeepLog to automatically learn log patterns from normal execution, and detect anomalies when log patterns deviate from the model trained from log data under normal execution. In addition, we demonstrate how to incrementally update the DeepLog model in an online fashion so that it can adapt to new log patterns over time. Furthermore, DeepLog constructs workflows from the underlying system log so that once an anomaly is detected, users can diagnose the detected anomaly and perform root cause analysis effectively. Extensive experimental evaluations over large log data have shown that DeepLog has outperformed other existing log-based anomaly detection methods based on traditional data mining methodologies.
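As a rough illustration of this idea (a hypothetical sketch, not the authors' implementation), the core of a DeepLog-style detector is an LSTM trained on windows of log keys taken from normal runs; at detection time an entry is flagged as anomalous when its key is not among the model's top-g predictions for that position. Names and dimensions below are assumptions for illustration only.

```python
# Sketch of an LSTM-based next-log-key predictor (DeepLog-style), hypothetical code.
import torch
import torch.nn as nn

class LogKeyLSTM(nn.Module):
    def __init__(self, num_keys, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(num_keys, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_keys)

    def forward(self, window):                  # window: (batch, window_len) of log-key ids
        h, _ = self.lstm(self.embed(window))
        return self.out(h[:, -1])               # logits over the next log key

def is_anomalous(model, window, next_key, g=9):
    """Flag an entry if its key is not among the top-g predicted candidates."""
    logits = model(window.unsqueeze(0))
    topg = logits.topk(g, dim=-1).indices[0]
    return next_key not in topg.tolist()
```

The model would be trained with a standard cross-entropy loss on normal execution sequences only, so deviations from learned patterns surface as low-probability next keys.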

917 citations


Book
22 Nov 2021
TL;DR: A reference work covering exact analytical, approximate analytical, and numerical methods for solving ordinary and partial differential equations, organized by chapter.
Abstract: (Chapter Headings) Definitions and Concepts. Transformations. Exact Analytical Methods. Exact Methods for ODEs. Exact Methods for PDEs. Approximate Analytical Methods. Numerical Methods: Concepts. Numerical Methods for ODEs. Numerical Methods for PDEs. List of Tables. List of Programs. List of Figures. Subject Index.

917 citations


Journal ArticleDOI
TL;DR: In this article, the authors reported the direct observation in TaAs of the long-sought-after Weyl nodes by performing bulk-sensitive soft X-ray angle-resolved photoemission spectroscopy measurements.
Abstract: Experiments show that TaAs is a three-dimensional topological Weyl semimetal. In 1929, H. Weyl proposed that the massless solution of the Dirac equation represents a pair of a new type of particles, the so-called Weyl fermions [1]. However, their existence in particle physics remains elusive after more than eight decades. Recently, significant advances in both topological insulators and topological semimetals have provided an alternative way to realize Weyl fermions in condensed matter, as an emergent phenomenon: when two non-degenerate bands in the three-dimensional momentum space cross in the vicinity of the Fermi energy (called Weyl nodes), the low-energy excitations behave exactly as Weyl fermions. Here we report the direct observation in TaAs of the long-sought-after Weyl nodes by performing bulk-sensitive soft X-ray angle-resolved photoemission spectroscopy measurements. The projected locations of the nodes on the (001) surface match well with the Fermi arcs, providing undisputable experimental evidence for the existence of Weyl fermionic quasiparticles in TaAs.
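For context (a textbook form, not taken from this paper), the low-energy excitations near each Weyl node are governed by the two-band Weyl Hamiltonian, whose chirality sign fixes whether the node acts as a source or a sink of Berry curvature:

H_\pm(\mathbf{k}) = \pm \hbar v_F\, \boldsymbol{\sigma} \cdot \mathbf{k}

where \boldsymbol{\sigma} are the Pauli matrices and v_F is the Fermi velocity; the Fermi arcs observed on the surface connect the projections of nodes of opposite chirality.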

917 citations


Journal ArticleDOI
16 Jun 2017-Science
TL;DR: Satellite-based distribution of entangled photon pairs to two locations separated by 1203 kilometers on Earth, through two satellite-to-ground downlinks is demonstrated, with a survival of two-photon entanglement and a violation of Bell inequality.
Abstract: Long-distance entanglement distribution is essential for both foundational tests of quantum physics and scalable quantum networks. Owing to channel loss, however, the previously achieved distance was limited to ~100 kilometers. Here we demonstrate satellite-based distribution of entangled photon pairs to two locations separated by 1203 kilometers on Earth, through two satellite-to-ground downlinks with a summed length varying from 1600 to 2400 kilometers. We observed a survival of two-photon entanglement and a violation of Bell inequality by 2.37 ± 0.09 under strict Einstein locality conditions. The obtained effective link efficiency is orders of magnitude higher than that of the direct bidirectional transmission of the two photons through telecommunication fibers.
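For reference, Bell tests of this kind are typically phrased in the CHSH form (an assumption about the specific inequality used; the numerical value is taken from the abstract), in which local hidden-variable theories bound the correlation combination

S = |E(a_1, b_1) + E(a_1, b_2) + E(a_2, b_1) - E(a_2, b_2)| \le 2

so the reported value of 2.37 ± 0.09 exceeds the local-realist bound of 2 by roughly four standard deviations ((2.37 - 2)/0.09 ≈ 4), while quantum mechanics allows values up to 2\sqrt{2} ≈ 2.83.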

917 citations


Journal ArticleDOI
22 Oct 2015-Nature
TL;DR: Biodiversity mainly stabilizes ecosystem productivity, and productivity-dependent ecosystem services, by increasing resistance to climate events; anthropogenic biodiversity loss therefore likely decreases ecosystem stability, and restoring biodiversity would increase it, mainly by changing the resistance of ecosystem productivity to climate events.
Abstract: It remains unclear whether biodiversity buffers ecosystems against climate extremes, which are becoming increasingly frequent worldwide [1]. Early results suggested that the ecosystem productivity of diverse grassland plant communities was more resistant, changing less during drought, and more resilient, recovering more quickly after drought, than that of depauperate communities [2]. However, subsequent experimental tests produced mixed results [3-13]. Here we use data from 46 experiments that manipulated grassland plant diversity to test whether biodiversity provides resistance during and resilience after climate events. We show that biodiversity increased ecosystem resistance for a broad range of climate events, including wet or dry, moderate or extreme, and brief or prolonged events. Across all studies and climate events, the productivity of low-diversity communities with one or two species changed by approximately 50% during climate events, whereas that of high-diversity communities with 16–32 species was more resistant, changing by only approximately 25%. By a year after each climate event, ecosystem productivity had often fully recovered, or overshot, normal levels of productivity in both high- and low-diversity communities, leading to no detectable dependence of ecosystem resilience on biodiversity. Our results suggest that biodiversity mainly stabilizes ecosystem productivity, and productivity-dependent ecosystem services, by increasing resistance to climate events. Anthropogenic environmental changes that drive biodiversity loss thus seem likely to decrease ecosystem stability [14], and restoration of biodiversity to increase it, mainly by changing the resistance of ecosystem productivity to climate events.
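To make the comparison concrete (my arithmetic on the figures above, using one common definition of resistance as the ratio of normal productivity to its deviation during the event; the paper's exact metric may differ):

\Omega = \bar{Y}_n / |Y_e - \bar{Y}_n|

A proportional change of ~50% gives \Omega \approx 1/0.5 = 2 for low-diversity communities, while a ~25% change gives \Omega \approx 1/0.25 = 4 for high-diversity communities, i.e. roughly twice the resistance at high diversity.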

917 citations


Proceedings ArticleDOI
25 Jan 2019
TL;DR: In this paper, the realism of state-of-the-art image manipulations, and how difficult it is to detect them, either automatically or by humans, is examined.
Abstract: The rapid progress in synthetic image generation and manipulation has now come to a point where it raises significant concerns for the implications towards society. At best, this leads to a loss of trust in digital content, but could potentially cause further harm by spreading false information or fake news. This paper examines the realism of state-of-the-art image manipulations, and how difficult it is to detect them, either automatically or by humans. To standardize the evaluation of detection methods, we propose an automated benchmark for facial manipulation detection. In particular, the benchmark is based on Deep-Fakes, Face2Face, FaceSwap and NeuralTextures as prominent representatives for facial manipulations at random compression level and size. The benchmark is publicly available and contains a hidden test set as well as a database of over 1.8 million manipulated images. This dataset is over an order of magnitude larger than comparable, publicly available, forgery datasets. Based on this data, we performed a thorough analysis of data-driven forgery detectors. We show that the use of additional domain-specific knowledge improves forgery detection to unprecedented accuracy, even in the presence of strong compression, and clearly outperforms human observers.

917 citations


Journal ArticleDOI
TL;DR: In this article, the authors review experimental and theoretical developments in the study of quantum many-body systems driven by periodic forces, as realized with ultracold atoms in optical lattices.
Abstract: The dynamics of quantum many-body systems is one of the most complex problems in physics, since it involves the time evolution of a large number of particles that interact with each other under the influence of external forces. With ultracold atoms in optical lattices, such dynamics can be studied in a controlled environment by applying a periodic force. This Colloquium covers the experimental and theoretical developments in this exciting field of physics.
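A standard example of such periodic forcing (an illustrative textbook result, not a claim about this Colloquium's specific content) is lattice shaking: sinusoidally modulating the lattice at frequency ω renormalizes the tunneling amplitude J by a Bessel function of the drive strength K,

J_{\mathrm{eff}} = J\, \mathcal{J}_0\!\left(K / \hbar\omega\right)

which vanishes at the zeros of \mathcal{J}_0 (dynamic localization) and can even change sign, effectively inverting the band structure.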

917 citations


Proceedings ArticleDOI
18 Jun 2018
TL;DR: MoCoGAN as discussed by the authors decomposes the visual signals in a video into content and motion, and learns motion and content decomposition in an unsupervised manner using both image and video discriminators.
Abstract: Visual signals in a video can be divided into content and motion. While content specifies which objects are in the video, motion describes their dynamics. Based on this prior, we propose the Motion and Content decomposed Generative Adversarial Network (MoCoGAN) framework for video generation. The proposed framework generates a video by mapping a sequence of random vectors to a sequence of video frames. Each random vector consists of a content part and a motion part. While the content part is kept fixed, the motion part is realized as a stochastic process. To learn motion and content decomposition in an unsupervised manner, we introduce a novel adversarial learning scheme utilizing both image and video discriminators. Extensive experimental results on several challenging datasets with qualitative and quantitative comparison to the state-of-the-art approaches, verify effectiveness of the proposed framework. In addition, we show that MoCoGAN allows one to generate videos with same content but different motion as well as videos with different content and same motion. Our code is available at https://github.com/sergeytulyakov/mocogan.
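A minimal sketch of the latent decomposition described above (hypothetical names and dimensions, not the released implementation): the content code is sampled once per video and repeated across frames, while a recurrent cell turns fresh noise into a per-frame motion code that evolves as a stochastic process.

```python
# Sketch of a MoCoGAN-style content/motion latent construction (illustrative only).
import torch
import torch.nn as nn

class MotionSampler(nn.Module):
    def __init__(self, noise_dim=10, motion_dim=16):
        super().__init__()
        self.rnn = nn.GRUCell(noise_dim, motion_dim)
        self.noise_dim = noise_dim
        self.motion_dim = motion_dim

    def forward(self, batch, num_frames):
        h = torch.zeros(batch, self.motion_dim)
        codes = []
        for _ in range(num_frames):
            eps = torch.randn(batch, self.noise_dim)   # fresh noise at every frame
            h = self.rnn(eps, h)                       # motion code evolves stochastically
            codes.append(h)
        return torch.stack(codes, dim=1)               # (batch, num_frames, motion_dim)

def sample_latents(batch=4, num_frames=16, content_dim=50):
    # content code: sampled once, held fixed across all frames of the video
    z_content = torch.randn(batch, 1, content_dim).expand(-1, num_frames, -1)
    z_motion = MotionSampler()(batch, num_frames)
    # per-frame latent fed to an image generator, one frame at a time
    return torch.cat([z_content, z_motion], dim=-1)
```

Keeping the content part fixed while only the motion part varies is what lets the framework generate the same content with different motion, or different content with the same motion.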

917 citations


01 Jan 2016
TL;DR: The phorbol diester receptor present in the particulate fraction of rat brain was solubilized by divalent ion chelation in the absence of detergents as mentioned in this paper.
Abstract: The phorbol diester receptor present in the particulate fraction of rat brain was solubilized by divalent ion chelation in the absence of detergents. The soluble receptor was partially purified by (NH4)2SO4 precipitation, DEAE-cellulose, and gel filtration chromatography. The purified receptor required exogenous phospholipid for activity and displayed a Kd of 7 nM for [3H]phorbol 12,13-dibutyrate. Biologically active phorbol analogs inhibited binding, whereas inactive analogs did not. The Ca2+-dependent, phospholipid-sensitive protein kinase C copurified with the phorbol receptor. Purified protein kinase C was activated directly by phorbol 12-myristate 13-acetate in the presence of phospholipid.

Book
23 Dec 2020
TL;DR: In this paper, the authors reviewed the evidence about the relationship between water quantity, water accessibility and health, including the effects of water reliability, continuity and price on water use, and provided guidance on domestic water supply to ensure beneficial health outcomes.
Abstract: Sufficient quantities of water for household use, including for drinking, food preparation and hygiene, are needed to protect public health and for well-being and prosperity. This second edition reviews the evidence about the relationships between water quantity, water accessibility and health. The effects of water reliability, continuity and price on water use, are also covered. Updated guidance, including recommended targets, is provided on domestic water supply to ensure beneficial health outcomes.

Journal ArticleDOI
03 Nov 2015-JAMA
TL;DR: Significant increases in overall prescription drug use and polypharmacy were observed in this nationally representative survey and persisted after accounting for changes in the age distribution of the population.
Abstract: Importance It is important to document patterns of prescription drug use to inform both clinical practice and research. Objective To evaluate trends in prescription drug use among adults living in the United States. Design, Setting, and Participants Temporal trends in prescription drug use were evaluated using nationally representative data from the National Health and Nutrition Examination Survey (NHANES). Participants included 37 959 noninstitutionalized US adults, aged 20 years and older. Seven NHANES cycles were included (1999-2000 to 2011-2012), and the sample size per cycle ranged from 4861 to 6212. Exposures Calendar year, as represented by continuous NHANES cycle. Main Outcomes and Measures Within each NHANES cycle, use of prescription drugs in the prior 30 days was assessed overall and by drug class. Temporal trends across cycles were evaluated. Analyses were weighted to represent the US adult population. Results Results indicate an increase in overall use of prescription drugs among US adults between 1999-2000 and 2011-2012, with an estimated 51% of US adults reporting use of any prescription drugs in 1999-2000 and an estimated 59% reporting use of any prescription drugs in 2011-2012 (difference, 8% [95% CI, 3.8%-12%]). The prevalence of polypharmacy also increased over this period. Conclusions and Relevance In this nationally representative survey, significant increases in overall prescription drug use and polypharmacy were observed. These increases persisted after accounting for changes in the age distribution of the population. The prevalence of prescription drug use increased in the majority of, but not all, drug classes.

Proceedings ArticleDOI
07 Jul 2016
TL;DR: A new learning algorithm based on the element-wise Alternating Least Squares (eALS) technique is designed, for efficiently optimizing a Matrix Factorization (MF) model with variably-weighted missing data and exploiting this efficiency to then seamlessly devise an incremental update strategy that instantly refreshes a MF model given new feedback.
Abstract: This paper contributes improvements on both the effectiveness and efficiency of Matrix Factorization (MF) methods for implicit feedback. We highlight two critical issues of existing works. First, due to the large space of unobserved feedback, most existing works resort to assign a uniform weight to the missing data to reduce computational complexity. However, such a uniform assumption is invalid in real-world settings. Second, most methods are also designed in an offline setting and fail to keep up with the dynamic nature of online data. We address the above two issues in learning MF models from implicit feedback. We first propose to weight the missing data based on item popularity, which is more effective and flexible than the uniform-weight assumption. However, such a non-uniform weighting poses efficiency challenge in learning the model. To address this, we specifically design a new learning algorithm based on the element-wise Alternating Least Squares (eALS) technique, for efficiently optimizing a MF model with variably-weighted missing data. We exploit this efficiency to then seamlessly devise an incremental update strategy that instantly refreshes a MF model given new feedback. Through comprehensive experiments on two public datasets in both offline and online protocols, we show that our implemented, open-source (https://github.com/hexiangnan/sigir16-eals) eALS consistently outperforms state-of-the-art implicit MF methods.
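A sketch of the weighted objective being optimized (notation mine; the paper's exact regularization and weighting scheme may differ in detail): observed interactions are fit with weights w_{ui}, while each missing entry is pushed toward zero with an item-dependent confidence c_i that grows with the item's popularity,

L = \sum_{(u,i) \in \mathcal{R}} w_{ui}\,(r_{ui} - \mathbf{p}_u^\top \mathbf{q}_i)^2
  + \sum_{u} \sum_{i \notin \mathcal{R}_u} c_i\,(\mathbf{p}_u^\top \mathbf{q}_i)^2
  + \lambda \Big( \sum_u \|\mathbf{p}_u\|^2 + \sum_i \|\mathbf{q}_i\|^2 \Big)

eALS minimizes L by updating one latent coordinate at a time in closed form, with caching that avoids explicitly iterating over all missing entries; the same per-coordinate updates support the incremental refresh when new feedback arrives.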

Journal ArticleDOI
TL;DR: This work provides a comprehensive framework for generalized bulk-boundary correspondence and a quantized biorthogonal polarization that is formulated directly in systems with open boundaries, including exactly solvable non-Hermitian extensions of the Su-Schrieffer-Heeger model and Chern insulators.
Abstract: Non-Hermitian systems exhibit striking exceptions from the paradigmatic bulk-boundary correspondence, including the failure of bulk Bloch band invariants in predicting boundary states and the (dis)appearance of boundary states at parameter values far from those corresponding to gap closings in periodic systems without boundaries. Here, we provide a comprehensive framework to unravel this disparity based on the notion of biorthogonal quantum mechanics: While the properties of the left and right eigenstates corresponding to boundary modes are individually decoupled from the bulk physics in non-Hermitian systems, their combined biorthogonal density penetrates the bulk precisely when phase transitions occur. This leads to generalized bulk-boundary correspondence and a quantized biorthogonal polarization that is formulated directly in systems with open boundaries. We illustrate our general insights by deriving the phase diagram for several microscopic open boundary models, including exactly solvable non-Hermitian extensions of the Su-Schrieffer-Heeger model and Chern insulators.
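As a small numerical illustration (my own construction, not the authors' code), a non-Hermitian Su-Schrieffer-Heeger chain with asymmetric intra-cell hoppings t1 ± γ/2 is a standard testbed for these effects; diagonalizing it with open boundaries exposes boundary modes whose behaviour bulk Bloch invariants fail to predict.

```python
# Non-Hermitian SSH chain under open boundary conditions (illustrative sketch).
import numpy as np

def nh_ssh_obc(n_cells=40, t1=1.0, t2=1.0, gamma=0.6):
    dim = 2 * n_cells
    H = np.zeros((dim, dim), dtype=complex)
    for j in range(n_cells):
        a, b = 2 * j, 2 * j + 1
        H[a, b] = t1 + gamma / 2        # intra-cell hopping, one direction
        H[b, a] = t1 - gamma / 2        # asymmetric reverse hopping (non-Hermiticity)
        if j < n_cells - 1:
            H[b, a + 2] = t2            # inter-cell hopping
            H[a + 2, b] = t2
    return H

energies = np.linalg.eigvals(nh_ssh_obc())
print(np.sort(np.abs(energies))[:4])    # near-zero modes signal boundary states
```

Scanning t1 (or γ) and watching when the near-zero modes appear or disappear under open boundaries is precisely the kind of open-boundary phase diagram the biorthogonal polarization is designed to capture.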

Journal ArticleDOI
TL;DR: DNA methylation-derived measures of accelerated aging are heritable traits that predict mortality independently of health status, lifestyle factors, and known genetic factors.
Abstract: Background: DNA methylation levels change with age. Recent studies have identified biomarkers of chronological age based on DNA methylation levels. It is not yet known whether DNA methylation age captures aspects of biological age. Results: Here we test whether differences between people’s chronological ages and estimated ages, DNA methylation age, predict all-cause mortality in later life. The difference between DNA methylation age and chronological age (Δage) was calculated in four longitudinal cohorts of older people. Meta-analysis of proportional hazards models from the four cohorts was used to determine the association between Δage and mortality. A 5-year higher Δage is associated with a 21% higher mortality risk, adjusting for age and sex. After further adjustments for childhood IQ, education, social class, hypertension, diabetes, cardiovascular disease, and APOE e4 status, there is a 16% increased mortality risk for those with a 5-year higher Δage. A pedigree-based heritability analysis of Δage was conducted in a separate cohort. The heritability of Δage was 0.43. Conclusions: DNA methylation-derived measures of accelerated aging are heritable traits that predict mortality independently of health status, lifestyle factors, and known genetic factors.
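Expressed per year of Δage rather than per 5 years (my arithmetic from the figures above), the reported associations correspond to

HR_{1\,\mathrm{yr}} = 1.21^{1/5} \approx 1.039 \quad (\text{age- and sex-adjusted}), \qquad 1.16^{1/5} \approx 1.030 \quad (\text{fully adjusted})

i.e. roughly a 3-4% higher mortality hazard for each additional year by which DNA methylation age exceeds chronological age.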

Journal ArticleDOI
TL;DR: An overview of recent developments in mediation analysis, that is, analyses used to assess the relative magnitude of different pathways and mechanisms by which an exposure may affect an outcome, is provided.
Abstract: This article provides an overview of recent developments in mediation analysis, that is, analyses used to assess the relative magnitude of different pathways and mechanisms by which an exposure may affect an outcome. Traditional approaches to mediation in the biomedical and social sciences are described. Attention is given to the confounding assumptions required for a causal interpretation of direct and indirect effect estimates. Methods from the causal inference literature to conduct mediation in the presence of exposure-mediator interactions, binary outcomes, binary mediators, and case-control study designs are presented. Sensitivity analysis techniques for unmeasured confounding and measurement error are introduced. Discussion is given to extensions to time-to-event outcomes and multiple mediators. Further flexible modeling strategies arising from the precise counterfactual definitions of direct and indirect effects are also described. The focus throughout is on methodology that is easily implementable...
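For orientation, the counterfactual definitions underlying this literature (standard definitions, not specific to this article) decompose a total effect into natural direct and indirect components: with Y_{a,m} the outcome under exposure a and mediator value m, and M_a the mediator under exposure a,

\mathrm{NDE} = E[\,Y_{1,M_0} - Y_{0,M_0}\,], \qquad \mathrm{NIE} = E[\,Y_{1,M_1} - Y_{1,M_0}\,], \qquad \mathrm{TE} = \mathrm{NDE} + \mathrm{NIE}

The confounding assumptions discussed in the article are what license estimating these counterfactual contrasts from observed data.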

Journal ArticleDOI
TL;DR: The incidence, prevalence, and years of life lived with disability (YLDs) from all causes of injury in every country are measured, to describe how these measures have changed between 1990 and 2016, and to estimate the proportion of TBI and SCI cases caused by different types of injury.
Abstract: Summary Background Traumatic brain injury (TBI) and spinal cord injury (SCI) are increasingly recognised as global health priorities in view of the preventability of most injuries and the complex and expensive medical care they necessitate. We aimed to measure the incidence, prevalence, and years of life lived with disability (YLDs) for TBI and SCI from all causes of injury in every country, to describe how these measures have changed between 1990 and 2016, and to estimate the proportion of TBI and SCI cases caused by different types of injury. Methods We used results from the Global Burden of Diseases, Injuries, and Risk Factors (GBD) Study 2016 to measure the global, regional, and national burden of TBI and SCI by age and sex. We measured the incidence and prevalence of all causes of injury requiring medical care in inpatient and outpatient records, literature studies, and survey data. By use of clinical record data, we estimated the proportion of each cause of injury that required medical care that would result in TBI or SCI being considered as the nature of injury. We used literature studies to establish standardised mortality ratios and applied differential equations to convert incidence to prevalence of long-term disability. Finally, we applied GBD disability weights to calculate YLDs. We used a Bayesian meta-regression tool for epidemiological modelling, used cause-specific mortality rates for non-fatal estimation, and adjusted our results for disability experienced with comorbid conditions. We also analysed results on the basis of the Socio-demographic Index, a compound measure of income per capita, education, and fertility. Findings In 2016, there were 27·08 million (95% uncertainty interval [UI] 24·30–30·30 million) new cases of TBI and 0·93 million (0·78–1·16 million) new cases of SCI, with age-standardised incidence rates of 369 (331–412) per 100 000 population for TBI and 13 (11–16) per 100 000 for SCI. In 2016, the number of prevalent cases of TBI was 55·50 million (53·40–57·62 million) and of SCI was 27·04 million (24·98–30·15 million). From 1990 to 2016, the age-standardised prevalence of TBI increased by 8·4% (95% UI 7·7 to 9·2), whereas that of SCI did not change significantly (−0·2% [–2·1 to 2·7]). Age-standardised incidence rates increased by 3·6% (1·8 to 5·5) for TBI, but did not change significantly for SCI (−3·6% [–7·4 to 4·0]). TBI caused 8·1 million (95% UI 6·0–10·4 million) YLDs and SCI caused 9·5 million (6·7–12·4 million) YLDs in 2016, corresponding to age-standardised rates of 111 (82–141) per 100 000 for TBI and 130 (90–170) per 100 000 for SCI. Falls and road injuries were the leading causes of new cases of TBI and SCI in most regions. Interpretation TBI and SCI constitute a considerable portion of the global injury burden and are caused primarily by falls and road injuries. The increase in incidence of TBI over time might continue in view of increases in population density, population ageing, and increasing use of motor vehicles, motorcycles, and bicycles. The number of individuals living with SCI is expected to increase in view of population growth, which is concerning because of the specialised care that people with SCI can require. Our study was limited by data sparsity in some regions, and it will be important to invest greater resources in collection of data for TBI and SCI to improve the accuracy of future assessments. Funding Bill & Melinda Gates Foundation.

Journal ArticleDOI
TL;DR: This review paper covers the entire pipeline of medical imaging and analysis techniques involved with COVID-19, including image acquisition, segmentation, diagnosis, and follow-up, and particularly focuses on the integration of AI with X-ray and CT, both of which are widely used in the frontline hospitals.
Abstract: The pandemic of coronavirus disease 2019 (COVID-19) is spreading all over the world. Medical imaging such as X-ray and computed tomography (CT) plays an essential role in the global fight against COVID-19, whereas the recently emerging artificial intelligence (AI) technologies further strengthen the power of the imaging tools and help medical specialists. We hereby review the rapid responses in the community of medical imaging (empowered by AI) toward COVID-19. For example, AI-empowered image acquisition can significantly help automate the scanning procedure and also reshape the workflow with minimal contact to patients, providing the best protection to the imaging technicians. Also, AI can improve work efficiency by accurate delineation of infections in X-ray and CT images, facilitating subsequent quantification. Moreover, the computer-aided platforms help radiologists make clinical decisions, i.e., for disease diagnosis, tracking, and prognosis. In this review paper, we thus cover the entire pipeline of medical imaging and analysis techniques involved with COVID-19, including image acquisition, segmentation, diagnosis, and follow-up. We particularly focus on the integration of AI with X-ray and CT, both of which are widely used in the frontline hospitals, in order to depict the latest progress of medical imaging and radiology fighting against COVID-19.

Journal ArticleDOI
TL;DR: The Adaptive Poisson-Boltzmann Solver (APBS) as mentioned in this paper was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had an impact on the study of a broad range of chemical, biological, and biomedical applications.
Abstract: The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has provided impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and take advantage of advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package, including: a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.
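The core equation solved by such software is the (nonlinear) Poisson-Boltzmann equation; in one common form (conventions for units and prefactors vary),

\nabla \cdot \big[\epsilon(\mathbf{r})\, \nabla \phi(\mathbf{r})\big] - \bar{\kappa}^2(\mathbf{r})\, \sinh\!\big(\phi(\mathbf{r})\big) = -4\pi \rho^{f}(\mathbf{r})

where ε(r) is the position-dependent dielectric coefficient, κ̄²(r) encodes ionic screening, and ρ^f(r) is the fixed biomolecular charge density; linearizing sinh φ ≈ φ yields the linearized PBE often used for large systems.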

Journal ArticleDOI
TL;DR: The ability of tumour genomic LOH, quantified with a next-generation sequencing assay, to predict response to rucaparib, an oral PARP inhibitor, was assessed in ARIEL2, an international, multicentre, two-part, phase 2, open-label study.
Abstract: Summary Background Poly(ADP-ribose) polymerase (PARP) inhibitors have activity in ovarian carcinomas with homologous recombination deficiency. Along with BRCA1 and BRCA2 (BRCA) mutations, genomic loss of heterozygosity (LOH) might also represent homologous recombination deficiency. In ARIEL2, we assessed the ability of tumour genomic LOH, quantified with a next-generation sequencing assay, to predict response to rucaparib, an oral PARP inhibitor. Methods ARIEL2 is an international, multicentre, two-part, phase 2, open-label study done at 49 hospitals and cancer centres in Australia, Canada, France, Spain, the UK, and the USA. In ARIEL2 Part 1, patients with recurrent, platinum-sensitive, high-grade ovarian carcinoma were classified into one of three predefined homologous recombination deficiency subgroups on the basis of tumour mutational analysis: BRCA mutant (deleterious germline or somatic), BRCA wild-type and LOH high (LOH high group), or BRCA wild-type and LOH low (LOH low group). We prespecified a cutoff of 14% or more genomic LOH for LOH high. Patients began treatment with oral rucaparib at 600 mg twice per day for continuous 28 day cycles until disease progression or any other reason for discontinuation. The primary endpoint was progression-free survival. All patients treated with at least one dose of rucaparib were included in the safety analyses and all treated patients who were classified were included in the primary endpoint analysis. This trial is registered with ClinicalTrials.gov, number NCT01891344. Enrolment into ARIEL2 Part 1 is complete, although an extension (Part 2) is ongoing. Findings 256 patients were screened and 206 were enrolled between Oct 30, 2013, and Dec 19, 2014. At the data cutoff date (Jan 18, 2016), 204 patients had received rucaparib, with 28 patients remaining in the study. 192 patients could be classified into one of the three predefined homologous recombination deficiency subgroups: BRCA mutant (n=40), LOH high (n=82), or LOH low (n=70). Tumours from 12 patients were established as BRCA wild-type, but could not be classified for LOH, because of insufficient neoplastic nuclei in the sample. The median duration of treatment for the 204 patients was 5·7 months (IQR 2·8–10·1). 24 patients in the BRCA mutant subgroup, 56 patients in the LOH high subgroup, and 59 patients in the LOH low subgroup had disease progression or died. Median progression-free survival after rucaparib treatment was 12·8 months (95% CI 9·0–14·7) in the BRCA mutant subgroup, 5·7 months (5·3–7·6) in the LOH high subgroup, and 5·2 months (3·6–5·5) in the LOH low subgroup. Progression-free survival was significantly longer in the BRCA mutant subgroup (hazard ratio 0·27, 95% CI 0·16–0·44) and in the LOH high subgroup than in the LOH low subgroup. Interpretation In patients with BRCA mutant or BRCA wild-type and LOH high platinum-sensitive ovarian carcinomas treated with rucaparib, progression-free survival was longer than in patients with BRCA wild-type LOH low carcinomas. Our results suggest that assessment of tumour LOH can be used to identify patients with BRCA wild-type platinum-sensitive ovarian cancers who might benefit from rucaparib. These results extend the potential usefulness of PARP inhibitors in the treatment setting beyond BRCA mutant tumours. Funding Clovis Oncology, US Department of Defense Ovarian Cancer Research Program, Stand Up To Cancer—Ovarian Cancer Research Fund Alliance—National Ovarian Cancer Coalition Dream Team Translational Research Grant, and V Foundation Translational Award.

Journal ArticleDOI
TL;DR: Among participants at high genetic risk, a favorable lifestyle was associated with a nearly 50% lower relative risk of coronary artery disease than was an unfavorable lifestyle, and across four studies involving 55,685 participants, genetic and lifestyle factors were independently associated with susceptibility to coronary artery disease.
Abstract: Background Both genetic and lifestyle factors contribute to individual-level risk of coronary artery disease. The extent to which increased genetic risk can be offset by a healthy lifestyle is unknown. Methods Using a polygenic score of DNA sequence polymorphisms, we quantified genetic risk for coronary artery disease in three prospective cohorts — 7814 participants in the Atherosclerosis Risk in Communities (ARIC) study, 21,222 in the Women’s Genome Health Study (WGHS), and 22,389 in the Malmo Diet and Cancer Study (MDCS) — and in 4260 participants in the cross-sectional BioImage Study for whom genotype and covariate data were available. We also determined adherence to a healthy lifestyle among the participants using a scoring system consisting of four factors: no current smoking, no obesity, regular physical activity, and a healthy diet. Results The relative risk of incident coronary events was 91% higher among participants at high genetic risk (top quintile of polygenic scores) than among those at low genetic risk ...
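For context, a polygenic score of this kind is typically constructed as a weighted allele count (the standard construction; the study's exact variant set and weights are not given in this extract):

\mathrm{PRS}_i = \sum_{j=1}^{M} \hat{\beta}_j\, x_{ij}

where x_{ij} ∈ {0, 1, 2} counts the risk alleles individual i carries at variant j and β̂_j is the per-allele effect estimate from genome-wide association studies; the top quintile of this score defines the high genetic-risk group compared above.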

Journal ArticleDOI
12 Feb 2015-Cell
TL;DR: Findings identify myoregulin (MLN) as an important regulator of skeletal muscle physiology and highlight the possibility that additional micropeptides are encoded in the many RNAs currently annotated as noncoding.


Journal ArticleDOI
TL;DR: A large fraction of heroin users now report that they formerly used prescription opioids nonmedically, a finding that has led to restrictions on opioid prescribing, but only a small fraction of prescription-opioid users move on to heroin use.
Abstract: A large fraction of heroin users now report that they formerly used prescription opioids nonmedically, a finding that has led to restrictions on opioid prescribing. Nevertheless, only a small fraction of prescription-opioid users move on to heroin use.

Journal ArticleDOI
TL;DR: The epidemiology provides clues on etiology, primary prevention, early detection, and possibly even therapeutic strategies for OC, including parity, oral contraceptive use, and lactation.
Abstract: Ovarian cancer (OC) is the seventh most commonly diagnosed cancer among women in the world and the tenth most common in China. Epithelial OC is the most predominant pathologic subtype, with five major histotypes that differ in origination, pathogenesis, molecular alterations, risk factors, and prognosis. Genetic susceptibility is manifested by rare inherited mutations with high to moderate penetrance. Genome-wide association studies have additionally identified 29 common susceptibility alleles for OC, including 14 subtype-specific alleles. Several reproductive and hormonal factors may lower risk, including parity, oral contraceptive use, and lactation, while others such as older age at menopause and hormone replacement therapy confer increased risks. These associations differ by histotype, especially for mucinous OC, likely reflecting differences in etiology. Endometrioid and clear cell OC share a similar, unique pattern of associations with increased risks among women with endometriosis and decreased risks associated with tubal ligation. OC risks associated with other gynecological conditions and procedures, such as hysterectomy, pelvic inflammatory disease, and polycystic ovarian syndrome, are less clear. Other possible risk factors include environmental and lifestyle factors such as asbestos and talc powder exposures, and cigarette smoking. The epidemiology provides clues on etiology, primary prevention, early detection, and possibly even therapeutic strategies.

Proceedings ArticleDOI
01 Jun 2016
TL;DR: An approach that jointly solves the tasks of detection and pose estimation: it infers the number of persons in a scene, identifies occluded body parts, and disambiguates body parts between people in close proximity of each other is proposed.
Abstract: This paper considers the task of articulated human pose estimation of multiple people in real world images. We propose an approach that jointly solves the tasks of detection and pose estimation: it infers the number of persons in a scene, identifies occluded body parts, and disambiguates body parts between people in close proximity of each other. This joint formulation is in contrast to previous strategies that address the problem by first detecting people and subsequently estimating their body pose. We propose a partitioning and labeling formulation of a set of body-part hypotheses generated with CNN-based part detectors. Our formulation, an instance of an integer linear program, implicitly performs non-maximum suppression on the set of part candidates and groups them to form configurations of body parts respecting geometric and appearance constraints. Experiments on four different datasets demonstrate state-of-the-art results for both single person and multi person pose estimation [1].

Journal ArticleDOI
TL;DR: The possibility that the dark matter comprises primordial black holes (PBHs) is considered in this paper, with particular emphasis on the currently allowed mass windows at 10^16-10^17 g, 10^20-10^24 g, and 1-...
Abstract: The possibility that the dark matter comprises primordial black holes (PBHs) is considered, with particular emphasis on the currently allowed mass windows at 10^16-10^17 g, 10^20-10^24 g and 1- ...

Journal ArticleDOI
TL;DR: In this paper, the authors consider the issue of opacity as a problem for socially consequential mechanisms of classification and ranking, such as spam filters, credit card fraud detection, search engines, news, etc.
Abstract: This article considers the issue of opacity as a problem for socially consequential mechanisms of classification and ranking, such as spam filters, credit card fraud detection, search engines, news...

Posted Content
TL;DR: This work thoroughly study three key components of SRGAN – network architecture, adversarial loss and perceptual loss, and improves each of them to derive an Enhanced SRGAN (ESRGAN), which achieves consistently better visual quality with more realistic and natural textures than SRGAN.
Abstract: The Super-Resolution Generative Adversarial Network (SRGAN) is a seminal work that is capable of generating realistic textures during single image super-resolution. However, the hallucinated details are often accompanied with unpleasant artifacts. To further enhance the visual quality, we thoroughly study three key components of SRGAN - network architecture, adversarial loss and perceptual loss, and improve each of them to derive an Enhanced SRGAN (ESRGAN). In particular, we introduce the Residual-in-Residual Dense Block (RRDB) without batch normalization as the basic network building unit. Moreover, we borrow the idea from relativistic GAN to let the discriminator predict relative realness instead of the absolute value. Finally, we improve the perceptual loss by using the features before activation, which could provide stronger supervision for brightness consistency and texture recovery. Benefiting from these improvements, the proposed ESRGAN achieves consistently better visual quality with more realistic and natural textures than SRGAN and won the first place in the PIRM2018-SR Challenge. The code is available at this https URL .
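A minimal sketch of the relativistic average discriminator idea borrowed here (illustrative code, not the released ESRGAN implementation): instead of classifying each image as real or fake in absolute terms, the discriminator scores how much more realistic a real image is than the average generated one, and vice versa.

```python
# Relativistic average GAN losses (RaGAN-style), as used in ESRGAN-like training. Sketch only.
import torch
import torch.nn.functional as F

def ra_discriminator_loss(d_real, d_fake):
    """d_real, d_fake: raw (pre-sigmoid) discriminator outputs on real / generated batches."""
    real_vs_fake = d_real - d_fake.mean()   # how much more realistic is real than the average fake?
    fake_vs_real = d_fake - d_real.mean()
    loss_real = F.binary_cross_entropy_with_logits(real_vs_fake, torch.ones_like(real_vs_fake))
    loss_fake = F.binary_cross_entropy_with_logits(fake_vs_real, torch.zeros_like(fake_vs_real))
    return (loss_real + loss_fake) / 2

def ra_generator_loss(d_real, d_fake):
    """Generator is rewarded when fakes look more realistic than the average real image."""
    real_vs_fake = d_real - d_fake.mean()
    fake_vs_real = d_fake - d_real.mean()
    loss_real = F.binary_cross_entropy_with_logits(real_vs_fake, torch.zeros_like(real_vs_fake))
    loss_fake = F.binary_cross_entropy_with_logits(fake_vs_real, torch.ones_like(fake_vs_real))
    return (loss_real + loss_fake) / 2
```

In the full training objective this adversarial term would be combined with the perceptual loss computed on pre-activation features and a pixel-wise loss, with the RRDB generator providing the network architecture.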