SciSpace

Monograph
01 Jan 2019
TL;DR: DeLanda, as mentioned in this paper, offers a fascinating look at the contemporary world.
Abstract: Manuel DeLanda is a distinguished writer, artist and philosopher. In his new book, he offers a fascinating look at the contemporary world.

1,229 citations


Journal Article
TL;DR: A deep convolutional neural field model for estimating depths from single monocular images, aiming to jointly explore the capacity of deep CNN and continuous CRF, is presented, and a deep structured learning scheme which learns the unary and pairwise potentials of continuous CRF in a unified deep CNN framework is proposed.
Abstract: In this article, we tackle the problem of depth estimation from single monocular images. Compared with depth estimation using multiple images such as stereo depth perception, depth from monocular images is much more challenging. Prior work typically focuses on exploiting geometric priors or additional sources of information, most using hand-crafted features. Recently, there is mounting evidence that features from deep convolutional neural networks (CNN) set new records for various vision applications. On the other hand, considering the continuous characteristic of the depth values, depth estimation can be naturally formulated as a continuous conditional random field (CRF) learning problem. Therefore, here we present a deep convolutional neural field model for estimating depths from single monocular images, aiming to jointly explore the capacity of deep CNN and continuous CRF. In particular, we propose a deep structured learning scheme which learns the unary and pairwise potentials of continuous CRF in a unified deep CNN framework. We then further propose an equally effective model based on fully convolutional networks and a novel superpixel pooling method, which is about 10 times faster, to speedup the patch-wise convolutions in the deep model. With this more efficient model, we are able to design deeper networks to pursue better performance. Our proposed method can be used for depth estimation of general scenes with no geometric priors nor any extra information injected. In our case, the integral of the partition function can be calculated in a closed form such that we can exactly solve the log-likelihood maximization. Moreover, solving the inference problem for predicting depths of a test image is highly efficient as closed-form solutions exist. Experiments on both indoor and outdoor scene datasets demonstrate that the proposed method outperforms state-of-the-art depth estimation approaches.
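
The continuous CRF the abstract describes combines unary terms (tying each region's depth to a CNN prediction) with pairwise terms (encouraging similar depths for neighbouring regions). A minimal sketch of such an energy function, with illustrative names and values that are not the paper's actual code:

```python
# Hedged sketch of a continuous-CRF energy for depth estimation:
# unary terms pull each superpixel's depth toward the CNN regression z_p,
# pairwise terms smooth depths across neighbouring superpixels.

def crf_energy(depths, unary_pred, neighbours, weight=0.5):
    """E(y) = sum_p (y_p - z_p)^2 + weight * sum_{(p,q)} (y_p - y_q)^2."""
    unary = sum((y - z) ** 2 for y, z in zip(depths, unary_pred))
    pairwise = sum((depths[p] - depths[q]) ** 2 for p, q in neighbours)
    return unary + weight * pairwise

# Toy example: 3 superpixels in a chain.
z = [1.0, 2.0, 4.0]          # CNN unary predictions (made up)
edges = [(0, 1), (1, 2)]     # neighbouring superpixel pairs
print(crf_energy([1.0, 2.0, 4.0], z, edges))  # energy of the unary solution
```

Because the energy is quadratic in the depths, the partition function integrates in closed form, which is what makes the exact log-likelihood maximisation mentioned in the abstract tractable.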

1,229 citations


01 Jan 2015
TL;DR: The abstract should follow the structure of the article (relevance, degree of exploration of the problem, the goal, the main results, conclusion) and characterize the theoretical and practical significance of the study results.
Abstract: Summary) The abstract should follow the structure of the article (relevance, degree of exploration of the problem, the goal, the main results, conclusion) and characterize the theoretical and practical significance of the study results. The abstract should not contain wording echoing the title, cumbersome grammatical structures and abbreviations. The text should be written in scientific style. The volume of abstracts (summaries) depends on the content of the article, but should not be less than 250 words. All abbreviations must be disclosed in the summary (in spite of the fact that they will be disclosed in the main text of the article), references to the numbers of publications from reference list should not be made. The sentences of the abstract should constitute an integral text, which can be made by use of the words “consequently”, “for example”, “as a result”. Avoid the use of unnecessary introductory phrases (eg, “the author of the article considers...”, “The article presents...” and so on.)

1,229 citations


Proceedings Article
05 Dec 2016
TL;DR: Empirical evidence shows that a new approach to domain adaptation in deep networks, which can jointly learn adaptive classifiers and transferable features from labeled data in the source domain and unlabeled data in the target domain, outperforms state-of-the-art methods on standard domain adaptation benchmarks.
Abstract: The recent success of deep neural networks relies on massive amounts of labeled data. For a target task where labeled data is unavailable, domain adaptation can transfer a learner from a different source domain. In this paper, we propose a new approach to domain adaptation in deep networks that can jointly learn adaptive classifiers and transferable features from labeled data in the source domain and unlabeled data in the target domain. We relax a shared-classifier assumption made by previous methods and assume that the source classifier and target classifier differ by a residual function. We enable classifier adaptation by plugging several layers into deep network to explicitly learn the residual function with reference to the target classifier. We fuse features of multiple layers with tensor product and embed them into reproducing kernel Hilbert spaces to match distributions for feature adaptation. The adaptation can be achieved in most feed-forward models by extending them with new residual layers and loss functions, which can be trained efficiently via back-propagation. Empirical evidence shows that the new approach outperforms state of the art methods on standard domain adaptation benchmarks.
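
The residual-classifier assumption above can be stated compactly: the target classifier equals the source classifier plus a small learned residual, f_T(x) = f_S(x) + Δf(x). A toy sketch with stand-in linear functions (all weights here are illustrative, not learned):

```python
# Sketch of the residual transfer idea: rather than sharing one classifier
# across domains, the target classifier differs from the source one by a
# learned residual function. The functions below are placeholders.

def f_source(x):
    # pretend source-domain classifier score
    return 2.0 * x + 1.0

def delta_f(x):
    # small residual correcting the domain shift (would be learned by the
    # extra residual layers described in the abstract)
    return -0.5 * x + 0.2

def f_target(x):
    # f_T(x) = f_S(x) + delta_f(x)
    return f_source(x) + delta_f(x)
```

In the paper's setting the residual is realised by plugging extra layers into the network, so the whole model stays trainable end-to-end by back-propagation.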

1,229 citations


Journal Article
TL;DR: This work considers how to optimise the handling of missing data during the planning stage of a randomised clinical trial and recommends analytical approaches which may prevent bias caused by unavoidable missing data.
Abstract: Missing data may seriously compromise inferences from randomised clinical trials, especially if missing data are not handled appropriately. The potential bias due to missing data depends on the mechanism causing the data to be missing, and the analytical methods applied to amend the missingness. Therefore, the analysis of trial data with missing values requires careful planning and attention. The authors had several meetings and discussions considering optimal ways of handling missing data to minimise the bias potential. We also searched PubMed (key words: missing data; randomi*; statistical analysis) and reference lists of known studies for papers (theoretical papers; empirical studies; simulation studies; etc.) on how to deal with missing data when analysing randomised clinical trials. Handling missing data is an important, yet difficult and complex task when analysing results of randomised clinical trials. We consider how to optimise the handling of missing data during the planning stage of a randomised clinical trial and recommend analytical approaches which may prevent bias caused by unavoidable missing data. We consider the strengths and limitations of using best-worst and worst-best sensitivity analyses, multiple imputation, and full information maximum likelihood. We also present practical flowcharts on how to deal with missing data and an overview of the steps that always need to be considered during the analysis stage of a trial. We present a practical guide and flowcharts describing when and how multiple imputation should be used to handle missing data in randomised clinical trials.
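
A best-worst / worst-best sensitivity analysis of the kind mentioned above bounds the treatment effect by imputing all missing outcomes favourably for one arm and unfavourably for the other, then swapping. A minimal sketch for a binary outcome (the data are made up for illustration):

```python
# Hedged sketch of a best-worst / worst-best sensitivity analysis for a
# binary success/failure outcome with missing participants.

def success_rate(observed, n_missing, impute_as_success):
    # observed: list of 0/1 outcomes; missing outcomes are imputed as all
    # successes or all failures depending on the scenario
    total = len(observed) + n_missing
    successes = sum(observed) + (n_missing if impute_as_success else 0)
    return successes / total

treatment = [1, 1, 0, 1]   # observed outcomes; 2 participants missing
control = [1, 0, 0, 0]     # observed outcomes; 1 participant missing

# Best case for treatment: its missing are successes, control's failures.
best = success_rate(treatment, 2, True) - success_rate(control, 1, False)
# Worst case: the reverse.
worst = success_rate(treatment, 2, False) - success_rate(control, 1, True)
print(best, worst)  # the true risk difference lies between these bounds
```

If the conclusion holds under both extremes, missing data cannot plausibly explain away the effect; if the two scenarios disagree, methods such as multiple imputation become essential.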

1,228 citations


Journal Article
TL;DR: Magnetospheric Multiscale (MMS), as mentioned in this paper, is a NASA four-spacecraft constellation mission to investigate magnetic reconnection in the boundary regions of the Earth's magnetosphere.
Abstract: Magnetospheric Multiscale (MMS), a NASA four-spacecraft constellation mission launched on March 12, 2015, will investigate magnetic reconnection in the boundary regions of the Earth's magnetosphere, particularly along its dayside boundary with the solar wind and the neutral sheet in the magnetic tail. The most important goal of MMS is to conduct a definitive experiment to determine what causes magnetic field lines to reconnect in a collisionless plasma. The significance of the MMS results will extend far beyond the Earth's magnetosphere because reconnection is known to occur in interplanetary space and in the solar corona where it is responsible for solar flares and the disconnection events known as coronal mass ejections. Active research is also being conducted on reconnection in the laboratory and specifically in magnetic-confinement fusion devices in which it is a limiting factor in achieving and maintaining electron temperatures high enough to initiate fusion. Finally, reconnection is proposed as the cause of numerous phenomena throughout the universe such as comet-tail disconnection events, magnetar flares, supernova ejections, and dynamics of neutron-star accretion disks. The MMS mission design is focused on answering specific questions about reconnection at the Earth's magnetosphere. The prime focus of the mission is on determining the kinetic processes occurring in the electron diffusion region that are responsible for reconnection and that determine how it is initiated; but the mission will also place that physics into the context of the broad spectrum of physical processes associated with reconnection. Connections to other disciplines such as solar physics, astrophysics, and laboratory plasma physics are expected to be made through theory and modeling as informed by the MMS results.

1,228 citations


Journal Article
TL;DR: In this paper, the authors examine three interdependent sets of concerns: intersectionality as a field of study that is situated within the power relations that it studies, intersectional as an analytical strategy that provides new angles of vision on social phenomena, and intersectional knowledge project as critical praxis that informs social justice projects.
Abstract: The term intersectionality references the critical insight that race, class, gender, sexuality, ethnicity, nation, ability, and age operate not as unitary, mutually exclusive entities, but rather as reciprocally constructing phenomena. Despite this general consensus, definitions of what counts as intersectionality are far from clear. In this article, I analyze intersectionality as a knowledge project whose raison d'etre lies in its attentiveness to power relations and social inequalities. I examine three interdependent sets of concerns: (a) intersectionality as a field of study that is situated within the power relations that it studies; (b) intersectionality as an analytical strategy that provides new angles of vision on social phenomena; and (c) intersectionality as critical praxis that informs social justice projects.

1,228 citations


Journal Article
TL;DR: This review outlines etiologically-linked pathologic features of Alzheimer's disease, as well as those that are inevitable findings of uncertain significance, such as granulovacuolar degeneration and Hirano bodies.
Abstract: Alzheimer’s disease is a progressive neurodegenerative disease most often associated with memory deficits and cognitive decline, although less common clinical presentations are increasingly recognized. The cardinal pathological features of the disease have been known for more than one hundred years, and today the presence of these amyloid plaques and neurofibrillary tangles are still required for a pathological diagnosis. Alzheimer’s disease is the most common cause of dementia globally. There remain no effective treatment options for the great majority of patients, and the primary causes of the disease are unknown except in a small number of familial cases driven by genetic mutations. Confounding efforts to develop effective diagnostic tools and disease-modifying therapies is the realization that Alzheimer’s disease is a mixed proteinopathy (amyloid and tau) frequently associated with other age-related processes such as cerebrovascular disease and Lewy body disease. Defining the relationships between and interdependence of various co-pathologies remains an active area of investigation. This review outlines etiologically-linked pathologic features of Alzheimer’s disease, as well as those that are inevitable findings of uncertain significance, such as granulovacuolar degeneration and Hirano bodies. Other disease processes that are frequent, but not inevitable, are also discussed, including pathologic processes that can clinically mimic Alzheimer’s disease. These include cerebrovascular disease, Lewy body disease, TDP-43 proteinopathies and argyrophilic grain disease. The purpose of this review is to provide an overview of Alzheimer’s disease pathology, its defining pathologic substrates and the related pathologies that can affect diagnosis and treatment.

1,228 citations


Journal Article
TL;DR: Among patients with early triple-negative breast cancer, the percentage with a pathological complete response was significantly higher among those who received pembrolizumab plus neoadjuvant chemotherapy than among those who received placebo plus neoadjuvant chemotherapy.
Abstract: Background Previous trials showed promising antitumor activity and an acceptable safety profile associated with pembrolizumab in patients with early triple-negative breast cancer. Whether ...

1,226 citations


Journal Article
TL;DR: This survey aims to provide researchers and practitioners new to the field as well as more advanced readers with a solid understanding of the main approaches and algorithms developed over the past two decades, with an emphasis on the most prominent and currently relevant work.
Abstract: Semi-supervised learning is the branch of machine learning concerned with using labelled as well as unlabelled data to perform certain learning tasks. Conceptually situated between supervised and unsupervised learning, it permits harnessing the large amounts of unlabelled data available in many use cases in combination with typically smaller sets of labelled data. In recent years, research in this area has followed the general trends observed in machine learning, with much attention directed at neural network-based models and generative learning. The literature on the topic has also expanded in volume and scope, now encompassing a broad spectrum of theory, algorithms and applications. However, no recent surveys exist to collect and organize this knowledge, impeding the ability of researchers and engineers alike to utilize it. Filling this void, we present an up-to-date overview of semi-supervised learning methods, covering earlier work as well as more recent advances. We focus primarily on semi-supervised classification, where the large majority of semi-supervised learning research takes place. Our survey aims to provide researchers and practitioners new to the field as well as more advanced readers with a solid understanding of the main approaches and algorithms developed over the past two decades, with an emphasis on the most prominent and currently relevant work. Furthermore, we propose a new taxonomy of semi-supervised classification algorithms, which sheds light on the different conceptual and methodological approaches for incorporating unlabelled data into the training process. Lastly, we show how the fundamental assumptions underlying most semi-supervised learning algorithms are closely connected to each other, and how they relate to the well-known semi-supervised clustering assumption.
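
One of the classic approaches such a survey covers is self-training: fit a model on the labelled data, pseudo-label the unlabelled points it is most confident about, and refit. A toy sketch using a 1-D threshold "classifier" as a stand-in for a real model (all data and the margin are illustrative):

```python
# Hedged sketch of self-training, a standard semi-supervised method.
# The 1-D midpoint-threshold classifier is a toy stand-in.

def fit_threshold(xs, ys):
    # tiny classifier: decision threshold at the midpoint of class means
    mean0 = sum(x for x, y in zip(xs, ys) if y == 0) / max(1, ys.count(0))
    mean1 = sum(x for x, y in zip(xs, ys) if y == 1) / max(1, ys.count(1))
    return (mean0 + mean1) / 2

def self_train(xs, ys, unlabelled, margin=1.0):
    t = fit_threshold(xs, ys)
    for x in unlabelled:
        if abs(x - t) >= margin:          # pseudo-label only confident points
            xs, ys = xs + [x], ys + [1 if x > t else 0]
    return fit_threshold(xs, ys)          # retrain with pseudo-labels added

# Usage: two labelled points per class, three unlabelled points; the
# ambiguous point near the boundary (2.6) is left out of retraining.
print(self_train([0.0, 1.0, 4.0, 5.0], [0, 0, 1, 1], [0.2, 4.8, 2.6]))
```

The confidence margin embodies the low-density separation assumption the survey discusses: unlabelled points near the decision boundary are the ones most likely to be pseudo-labelled incorrectly.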

1,226 citations


Journal Article
TL;DR: The use of TDF-FTC before and after sexual activity provided protection against HIV-1 infection in men who have sex with men, and the treatment was associated with increased rates of gastrointestinal and renal adverse events.
Abstract: Background Antiretroviral preexposure prophylaxis has been shown to reduce the risk of human immunodeficiency virus type 1 (HIV-1) infection in some studies, but conflicting results have been reported among studies, probably due to challenges of adherence to a daily regimen. Methods We conducted a double-blind, randomized trial of antiretroviral therapy for preexposure HIV-1 prophylaxis among men who have unprotected anal sex with men. Participants were randomly assigned to take a combination of tenofovir disoproxil fumarate (TDF) and emtricitabine (FTC) or placebo before and after sexual activity. All participants received risk-reduction counseling and condoms and were regularly tested for HIV-1 and HIV-2 and other sexually transmitted infections. Results Of the 414 participants who underwent randomization, 400 who did not have HIV infection were enrolled (199 in the TDF-FTC group and 201 in the placebo group). All participants were followed for a median of 9.3 months (interquartile range, 4.9 to 20.6). A total of 16 HIV-1 infections occurred during follow-up, 2 in the TDF-FTC group (incidence, 0.91 per 100 person-years) and 14 in the placebo group (incidence, 6.60 per 100 person-years), a relative reduction in the TDF-FTC group of 86% (95% confidence interval, 40 to 98; P=0.002). Participants took a median of 15 pills of TDF-FTC or placebo per month (P=0.57). The rates of serious adverse events were similar in the two study groups. In the TDF-FTC group, as compared with the placebo group, there were higher rates of gastrointestinal adverse events (14% vs. 5%, P=0.002) and renal adverse events (18% vs. 10%, P=0.03). Conclusions The use of TDF-FTC before and after sexual activity provided protection against HIV-1 infection in men who have sex with men. The treatment was associated with increased rates of gastrointestinal and renal adverse events. 
(Funded by the National Agency of Research on AIDS and Viral Hepatitis [ANRS] and others; ClinicalTrials.gov number, NCT01473472.)
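
The headline 86% figure follows directly from the two incidence rates quoted in the abstract, as a quick check shows:

```python
# Relative risk reduction from the reported incidence rates:
# relative reduction = 1 - (incidence_TDF-FTC / incidence_placebo).
incidence_tdf = 0.91      # HIV-1 infections per 100 person-years, TDF-FTC arm
incidence_placebo = 6.60  # per 100 person-years, placebo arm
relative_reduction = 1 - incidence_tdf / incidence_placebo
print(round(relative_reduction * 100))  # ≈ 86 (%), matching the report
```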

Journal Article
TL;DR: As adjuvant therapy for high‐risk stage III melanoma, 200 mg of pembrolizumab administered every 3 weeks for up to 1 year resulted in significantly longer recurrence‐free survival than placebo, with no new toxic effects identified.
Abstract: Background The programmed death 1 (PD-1) inhibitor pembrolizumab has been found to prolong progression-free and overall survival among patients with advanced melanoma. We conducted a phase 3 double-blind trial to evaluate pembrolizumab as adjuvant therapy in patients with resected, high-risk stage III melanoma. Methods Patients with completely resected stage III melanoma were randomly assigned (with stratification according to cancer stage and geographic region) to receive 200 mg of pembrolizumab (514 patients) or placebo (505 patients) intravenously every 3 weeks for a total of 18 doses (approximately 1 year) or until disease recurrence or unacceptable toxic effects occurred. Recurrence-free survival in the overall intention-to-treat population and in the subgroup of patients with cancer that was positive for the PD-1 ligand (PD-L1) were the primary end points. Safety was also evaluated. Results At a median follow-up of 15 months, pembrolizumab was associated with significantly longer recurrence...

Journal Article
TL;DR: It is demonstrated that the disordered regions of key RNP granule components and the full-length granule protein hnRNPA1 can phase separate in vitro, producing dynamic liquid droplets.

Journal Article
TL;DR: Analysis of genetic pathways suggested that MCD and BN2 DLBCLs rely on “chronic active” B‐cell receptor signaling that is amenable to therapeutic inhibition, and an algorithm was developed and implemented to discover genetic subtypes based on the co‐occurrence of genetic alterations.
Abstract: Background Diffuse large B-cell lymphomas (DLBCLs) are phenotypically and genetically heterogeneous. Gene-expression profiling has identified subgroups of DLBCL (activated B-cell–like [ABC], germinal-center B-cell–like [GCB], and unclassified) according to cell of origin that are associated with a differential response to chemotherapy and targeted agents. We sought to extend these findings by identifying genetic subtypes of DLBCL based on shared genomic abnormalities and to uncover therapeutic vulnerabilities based on tumor genetics. Methods We studied 574 DLBCL biopsy samples using exome and transcriptome sequencing, array-based DNA copy-number analysis, and targeted amplicon resequencing of 372 genes to identify genes with recurrent aberrations. We developed and implemented an algorithm to discover genetic subtypes based on the co-occurrence of genetic alterations. Results We identified four prominent genetic subtypes in DLBCL, termed MCD (based on the co-occurrence of MYD88L265P and CD79B muta...


Journal Article
TL;DR: The present paper analyzes in detail the potential of 5G technologies for the IoT, by considering both the technological and standardization aspects and illustrates the massive business shifts that a tight link between IoT and 5G may cause in the operator and vendors ecosystem.
Abstract: The IoT paradigm holds the promise to revolutionize the way we live and work by means of a wealth of new services, based on seamless interactions between a large amount of heterogeneous devices. After decades of conceptual inception of the IoT, in recent years a large variety of communication technologies has gradually emerged, reflecting a large diversity of application domains and of communication requirements. Such heterogeneity and fragmentation of the connectivity landscape is currently hampering the full realization of the IoT vision, by posing several complex integration challenges. In this context, the advent of 5G cellular systems, with the availability of a connectivity technology, which is at once truly ubiquitous, reliable, scalable, and cost-efficient, is considered as a potentially key driver for the yet-to emerge global IoT. In the present paper, we analyze in detail the potential of 5G technologies for the IoT, by considering both the technological and standardization aspects. We review the present-day IoT connectivity landscape, as well as the main 5G enablers for the IoT. Last but not least, we illustrate the massive business shifts that a tight link between IoT and 5G may cause in the operator and vendors ecosystem.

Journal Article
01 Mar 2016, Gut
TL;DR: A. muciniphila is associated with a healthier metabolic status and better clinical outcomes after CR in overweight/obese adults, and the interaction between gut microbiota ecology and A. muciniphila warrants further investigation.
Abstract: OBJECTIVE: Individuals with obesity and type 2 diabetes differ from lean and healthy individuals in their abundance of certain gut microbial species and in microbial gene richness. Abundance of Akkermansia muciniphila, a mucin-degrading bacterium, has been inversely associated with body fat mass and glucose intolerance in mice, but more evidence is needed in humans. The impact of diet and weight loss on this bacterial species is unknown. Our objective was to evaluate the association between faecal A. muciniphila abundance, faecal microbiome gene richness, diet, host characteristics, and their changes after calorie restriction (CR). DESIGN: The intervention consisted of a 6-week CR period followed by a 6-week weight stabilisation diet in overweight and obese adults (N=49, including 41 women). Faecal A. muciniphila abundance, faecal microbial gene richness, diet and bioclinical parameters were measured at baseline and after CR and weight stabilisation. RESULTS: At baseline, A. muciniphila was inversely related to fasting glucose, waist-to-hip ratio and subcutaneous adipocyte diameter. Subjects with higher gene richness and A. muciniphila abundance exhibited the healthiest metabolic status, particularly in fasting plasma glucose, plasma triglycerides and body fat distribution. Individuals with higher baseline A. muciniphila displayed greater improvement in insulin sensitivity markers and other clinical parameters after CR. These participants also experienced a reduction in A. muciniphila abundance, but it remained significantly higher than in individuals with lower baseline abundance. A. muciniphila was associated with microbial species known to be related to health. CONCLUSIONS: A. muciniphila is associated with a healthier metabolic status and better clinical outcomes after CR in overweight/obese adults. The interaction between gut microbiota ecology and A. muciniphila warrants further investigation. TRIAL REGISTRATION NUMBER: NCT01314690

Journal Article
University of East Anglia, University of Oslo, Commonwealth Scientific and Industrial Research Organisation, University of Exeter, Oak Ridge National Laboratory, National Oceanic and Atmospheric Administration, Woods Hole Research Center, University of California, San Diego, Karlsruhe Institute of Technology, Cooperative Institute for Marine and Atmospheric Studies, Centre national de la recherche scientifique, University of Maryland, College Park, National Institute of Water and Atmospheric Research, Woods Hole Oceanographic Institution, Flanders Marine Institute, Alfred Wegener Institute for Polar and Marine Research, Netherlands Environmental Assessment Agency, University of Illinois at Urbana–Champaign, Leibniz Institute of Marine Sciences, Max Planck Society, University of Paris, Hobart Corporation, University of Bern, Oeschger Centre for Climate Change Research, National Center for Atmospheric Research, University of Miami, Council of Scientific and Industrial Research, University of Colorado Boulder, National Institute for Environmental Studies, Joint Institute for the Study of the Atmosphere and Ocean, Geophysical Institute, University of Bergen, Montana State University, Goddard Space Flight Center, University of New Hampshire, Bjerknes Centre for Climate Research, Imperial College London, Lamont–Doherty Earth Observatory, Auburn University, Wageningen University and Research Centre, VU University Amsterdam, Met Office
TL;DR: In this article, the authors quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community.
Abstract: . Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere – the “global carbon budget” – is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates and consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, respectively, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models. We compare the mean land and ocean fluxes and their variability to estimates from three atmospheric inverse methods for three broad latitude bands. All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. 
For the last decade available (2006–2015), EFF was 9.3 ± 0.5 GtC yr−1, ELUC 1.0 ± 0.5 GtC yr−1, GATM 4.5 ± 0.1 GtC yr−1, SOCEAN 2.6 ± 0.5 GtC yr−1, and SLAND 3.1 ± 0.9 GtC yr−1. For year 2015 alone, the growth in EFF was approximately zero and emissions remained at 9.9 ± 0.5 GtC yr−1, showing a slowdown in growth of these emissions compared to the average growth of 1.8 % yr−1 that took place during 2006–2015. Also, for 2015, ELUC was 1.3 ± 0.5 GtC yr−1, GATM was 6.3 ± 0.2 GtC yr−1, SOCEAN was 3.0 ± 0.5 GtC yr−1, and SLAND was 1.9 ± 0.9 GtC yr−1. GATM was higher in 2015 compared to the past decade (2006–2015), reflecting a smaller SLAND for that year. The global atmospheric CO2 concentration reached 399.4 ± 0.1 ppm averaged over 2015. For 2016, preliminary data indicate the continuation of low growth in EFF with +0.2 % (range of −1.0 to +1.8 %) based on national emissions projections for China and USA, and projections of gross domestic product corrected for recent changes in the carbon intensity of the economy for the rest of the world. In spite of the low growth of EFF in 2016, the growth rate in atmospheric CO2 concentration is expected to be relatively high because of the persistence of the smaller residual terrestrial sink (SLAND) in response to El Niño conditions of 2015–2016. From this projection of EFF and assumed constant ELUC for 2016, cumulative emissions of CO2 will reach 565 ± 55 GtC (2075 ± 205 GtCO2) for 1870–2016, about 75 % from EFF and 25 % from ELUC. This living data update documents changes in the methods and data sets used in this new carbon budget compared with previous publications of this data set (Le Quéré et al., 2015b, a, 2014, 2013). All observations presented here can be downloaded from the Carbon Dioxide Information Analysis Center (doi:10.3334/CDIAC/GCP_2016).
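
Because SLAND is defined as the residual of the other budget terms, the quoted figures can be cross-checked directly: SLAND = EFF + ELUC − GATM − SOCEAN. Plugging in the 2015 values reproduces the reported land sink:

```python
# Budget closure check using the 2015 values quoted above (all in GtC/yr).
# SLAND is estimated as the remainder of the global carbon budget.
EFF, ELUC, GATM, SOCEAN = 9.9, 1.3, 6.3, 3.0
SLAND = EFF + ELUC - GATM - SOCEAN
print(round(SLAND, 1))  # 1.9 GtC/yr, matching the reported 1.9 ± 0.9
```

For the 2006–2015 decade the same arithmetic gives 9.3 + 1.0 − 4.5 − 2.6 = 3.2 GtC/yr, consistent with the reported 3.1 ± 0.9 to within the rounding of the individual terms.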

Proceedings Article
23 Aug 2015
TL;DR: A new Challenge 4 on Incidental Scene Text has been added to the Challenges on Born-Digital Images, Focused Scene Images and Video Text, and tasks assessing End-to-End system performance have been introduced to all Challenges.
Abstract: Results of the ICDAR 2015 Robust Reading Competition are presented. A new Challenge 4 on Incidental Scene Text has been added to the Challenges on Born-Digital Images, Focused Scene Images and Video Text. Challenge 4 is run on a newly acquired dataset of 1,670 images evaluating Text Localisation, Word Recognition and End-to-End pipelines. In addition, the dataset for Challenge 3 on Video Text has been substantially updated with more video sequences and more accurate ground truth data. Finally, tasks assessing End-to-End system performance have been introduced to all Challenges. The competition took place in the first quarter of 2015, and received a total of 44 submissions. Only the tasks newly introduced in 2015 are reported on. The datasets, the ground truth specification and the evaluation protocols are presented together with the results and a brief summary of the participating methods.
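
Text localisation benchmarks of this kind typically score a detection as correct when its overlap with a ground-truth box is high enough. A sketch of the usual intersection-over-union matching; the 0.5 threshold is the conventional choice in detection evaluation, not a claim about the exact ICDAR protocol:

```python
# Hedged sketch of IoU-based matching for text localisation scoring.
# Boxes are axis-aligned rectangles (x1, y1, x2, y2).

def iou(a, b):
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))   # overlap width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))   # overlap height
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def is_match(detection, ground_truth, thresh=0.5):
    # count a detection as correct above the IoU threshold (illustrative)
    return iou(detection, ground_truth) >= thresh

print(iou((0, 0, 2, 2), (1, 0, 3, 2)))  # partial overlap
```

Precision and recall over such matches, and their harmonic mean, are then the standard summary figures for localisation tasks.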

Journal ArticleDOI
21 Aug 2020-Science
TL;DR: Passive transfer of a neutralizing antibody (nAb) provides protection against disease in high-dose SARS-CoV-2 challenge in Syrian hamsters, as indicated by maintained weight and low lung viral titers in treated animals, suggesting a role for potent nAbs in prophylaxis, and potentially therapy, of COVID-19.
Abstract: Countermeasures to prevent and treat coronavirus disease 2019 (COVID-19) are a global health priority. We enrolled a cohort of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2)-recovered participants, developed neutralization assays to investigate antibody responses, adapted our high-throughput antibody generation pipeline to rapidly screen more than 1800 antibodies, and established an animal model to test protection. We isolated potent neutralizing antibodies (nAbs) to two epitopes on the receptor binding domain (RBD) and to distinct non-RBD epitopes on the spike (S) protein. As indicated by maintained weight and low lung viral titers in treated animals, the passive transfer of a nAb provides protection against disease in high-dose SARS-CoV-2 challenge in Syrian hamsters. The study suggests a role for nAbs in prophylaxis, and potentially therapy, of COVID-19. The nAbs also define protective epitopes to guide vaccine design.

Proceedings Article
06 Aug 2017
TL;DR: DiscoGAN, as mentioned in this paper, is a GAN-based method that learns to discover relations between different domains from unpaired data; using the discovered relations, it successfully transfers style from one domain to another while preserving key attributes such as orientation and face identity.
Abstract: While humans easily recognize relations between data from different domains without any supervision, learning to automatically discover them is in general very challenging and needs many ground-truth pairs that illustrate the relations. To avoid costly pairing, we address the task of discovering cross-domain relations when given unpaired data. We propose a method based on generative adversarial networks that learns to discover relations between different domains (DiscoGAN). Using the discovered relations, our proposed network successfully transfers style from one domain to another while preserving key attributes such as orientation and face identity.
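The unpaired training signal at the core of this approach is a reconstruction (cycle-consistency) loss: mapping A→B→A should recover the input. A toy sketch of that loss, with hand-set linear maps standing in for the learned generators (the y = 2x + 1 relation and all names are illustrative, not from the paper):

```python
# Toy illustration of DiscoGAN's reconstruction loss: two generators
# G_AB and G_BA are trained so that G_BA(G_AB(x)) ~= x. Here the
# "generators" are fixed functions, not learned networks -- this sketches
# the loss being minimised, not GAN training itself.
def g_ab(x):            # maps domain A -> B (illustrative relation y = 2x + 1)
    return 2 * x + 1

def g_ba(y):            # maps domain B -> A (the inverse relation)
    return (y - 1) / 2

def reconstruction_loss(xs):
    """Mean squared error between x and its round trip A -> B -> A."""
    return sum((g_ba(g_ab(x)) - x) ** 2 for x in xs) / len(xs)
```

When the two generators invert each other exactly, as here, the loss is zero; during training this term is added to the usual adversarial losses in both domains.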

Journal ArticleDOI
TL;DR: In this paper, a review found that prescriptions of opioid medications for chronic pain have increased dramatically, as have opioid overdoses, abuse, and other harms, along with uncertainty about long-term effectiveness.
Abstract: Prescriptions of opioid medications for chronic pain have increased dramatically, as have opioid overdoses, abuse, and other harms, and uncertainty about long-term effectiveness. This review found s

Journal ArticleDOI
TL;DR: Large-scale molecular surveys have provided novel insights into the diversity, spatial and temporal dynamics of mycorrhizal fungal communities, and network theory makes it possible to analyze interactions between plant-fungal partners as complex underground multi-species networks.
Abstract: Almost all land plants form symbiotic associations with mycorrhizal fungi. These below-ground fungi play a key role in terrestrial ecosystems as they regulate nutrient and carbon cycles, and influence soil structure and ecosystem multifunctionality. Up to 80% of plant N and P is provided by mycorrhizal fungi and many plant species depend on these symbionts for growth and survival. Estimates suggest that there are c. 50 000 fungal species that form mycorrhizal associations with c. 250 000 plant species. The development of high-throughput molecular tools has helped us to better understand the biology, evolution, and biodiversity of mycorrhizal associations. Nuclear genome assemblies and gene annotations of 33 mycorrhizal fungal species are now available providing fascinating opportunities to deepen our understanding of the mycorrhizal lifestyle, the metabolic capabilities of these plant symbionts, the molecular dialogue between symbionts, and evolutionary adaptations across a range of mycorrhizal associations. Large-scale molecular surveys have provided novel insights into the diversity, spatial and temporal dynamics of mycorrhizal fungal communities. At the ecological level, network theory makes it possible to analyze interactions between plant-fungal partners as complex underground multi-species networks. Our analysis suggests that nestedness, modularity and specificity of mycorrhizal networks vary and depend on mycorrhizal type. Mechanistic models explaining partner choice, resource exchange, and coevolution in mycorrhizal associations have been developed and are being tested. This review ends with major frontiers for further research.

Journal ArticleDOI
25 Feb 2016
TL;DR: Evaluation of modified 16S rRNA gene and internal transcribed spacer (ITS) primers for archaea/bacteria and fungi with nonaquatic samples demonstrated that two recently modified primer pairs that target taxonomically discriminatory regions of bacterial and fungal genomic DNA do not introduce new biases when used on a variety of sample types.
Abstract: Designing primers for PCR-based taxonomic surveys that amplify a broad range of phylotypes in varied community samples is a difficult challenge, and the comparability of data sets amplified with varied primers requires attention. Here, we examined the performance of modified 16S rRNA gene and internal transcribed spacer (ITS) primers for archaea/bacteria and fungi, respectively, with nonaquatic samples. We moved primer bar codes to the 5' end, allowing for a range of different 3' primer pairings, such as the 515f/926r primer pair, which amplifies variable regions 4 and 5 of the 16S rRNA gene. We additionally demonstrated that modifications to the 515f/806r (variable region 4) 16S primer pair, which improves detection of Thaumarchaeota and clade SAR11 in marine samples, do not degrade performance on taxa already amplified effectively by the original primer set. Alterations to the fungal ITS primers did result in differential but overall improved performance compared to the original primers. In both cases, the improved primers should be widely adopted for amplicon studies. IMPORTANCE We continue to uncover a wealth of information connecting microbes in important ways to human and environmental ecology. As our scientific knowledge and technical abilities improve, the tools used for microbiome surveys can be modified to improve the accuracy of our techniques, ensuring that we can continue to identify groundbreaking connections between microbes and the ecosystems they populate, from ice caps to the human body. It is important to confirm that modifications to these tools do not cause new, detrimental biases that would inhibit the field rather than continue to move it forward. We therefore demonstrated that two recently modified primer pairs that target taxonomically discriminatory regions of bacterial and fungal genomic DNA do not introduce new biases when used on a variety of sample types, from soil to human skin. This confirms the utility of these primers for maintaining currently recommended microbiome research techniques as the state of the art.

Journal ArticleDOI
TL;DR: Endoscopic ablative therapy is recommended for patients with BE and high-grade dysplasia, as well as T1a esophageal adenocarcinoma, and endoscopic surveillance intervals are attenuated, based on recent level 1 evidence.

Journal ArticleDOI
TL;DR: Genomic signatures of selection and domestication are associated with positively selected genes (PSGs) for fiber improvement in the A subgenome and for stress tolerance in the D subgenome, suggesting asymmetric evolution.
Abstract: Upland cotton is a model for polyploid crop domestication and transgenic improvement. Here we sequenced the allotetraploid Gossypium hirsutum L. acc. TM-1 genome by integrating whole-genome shotgun reads, bacterial artificial chromosome (BAC)-end sequences and genotype-by-sequencing genetic maps. We assembled and annotated 32,032 A-subgenome genes and 34,402 D-subgenome genes. Structural rearrangements, gene loss, disrupted genes and sequence divergence were more common in the A subgenome than in the D subgenome, suggesting asymmetric evolution. However, no genome-wide expression dominance was found between the subgenomes. Genomic signatures of selection and domestication are associated with positively selected genes (PSGs) for fiber improvement in the A subgenome and for stress tolerance in the D subgenome. This draft genome sequence provides a resource for engineering superior cotton lines.

Journal ArticleDOI
07 Apr 2016-Nature
TL;DR: ‘state of the art’ soil greenhouse gas research is highlighted, mitigation practices and potentials are summarized, gaps in data and understanding are identified and ways to close such gaps are suggested through new research, technology and collaboration.
Abstract: Soils are integral to the function of all terrestrial ecosystems and to food and fibre production. An overlooked aspect of soils is their potential to mitigate greenhouse gas emissions. Although proven practices exist, the implementation of soil-based greenhouse gas mitigation activities are at an early stage and accurately quantifying emissions and reductions remains a substantial challenge. Emerging research and information technology developments provide the potential for a broader inclusion of soils in greenhouse gas policies. Here we highlight 'state of the art' soil greenhouse gas research, summarize mitigation practices and potentials, identify gaps in data and understanding and suggest ways to close such gaps through new research, technology and collaboration.

Journal ArticleDOI
TL;DR: A patient with recurrent multifocal glioblastoma received chimeric antigen receptor (CAR)-engineered T cells targeting the tumor-associated antigen interleukin-13 receptor alpha 2 (IL13Rα2), and regression of all intracranial and spinal tumors was observed, along with corresponding increases in levels of cytokines and immune cells in the cerebrospinal fluid.
Abstract: A patient with recurrent multifocal glioblastoma received chimeric antigen receptor (CAR)-engineered T cells targeting the tumor-associated antigen interleukin-13 receptor alpha 2 (IL13Rα2). Multiple infusions of CAR T cells were administered over 220 days through two intracranial delivery routes - infusions into the resected tumor cavity followed by infusions into the ventricular system. Intracranial infusions of IL13Rα2-targeted CAR T cells were not associated with any toxic effects of grade 3 or higher. After CAR T-cell treatment, regression of all intracranial and spinal tumors was observed, along with corresponding increases in levels of cytokines and immune cells in the cerebrospinal fluid. This clinical response continued for 7.5 months after the initiation of CAR T-cell therapy. (Funded by Gateway for Cancer Research and others; ClinicalTrials.gov number, NCT02208362 .).

Proceedings Article
27 May 2016
TL;DR: The authors extend the space of probabilistic models using real-valued non-volume preserving transformations, a set of powerful invertible and learnable transformations, resulting in an unsupervised learning algorithm with exact log-likelihood computation, exact sampling, exact inference of latent variables, and an interpretable latent space.
Abstract: Unsupervised learning of probabilistic models is a central yet challenging problem in machine learning. Specifically, designing models with tractable learning, sampling, inference and evaluation is crucial in solving this task. We extend the space of such models using real-valued non-volume preserving (real NVP) transformations, a set of powerful invertible and learnable transformations, resulting in an unsupervised learning algorithm with exact log-likelihood computation, exact sampling, exact inference of latent variables, and an interpretable latent space. We demonstrate its ability to model natural images on four datasets through sampling, log-likelihood evaluation and latent variable manipulations.
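The exactness properties come from the coupling structure: half the input passes through unchanged and parameterises an invertible affine map of the other half, so both the inverse and the Jacobian log-determinant are available in closed form. A minimal single-coupling sketch in plain Python, with fixed functions standing in for the learned scale and translation networks (illustrative only, not the paper's architecture):

```python
import math

def s(x):  # stand-in for the learned "scale" network
    return 0.5 * x

def t(x):  # stand-in for the learned "translation" network
    return x - 1.0

def forward(x1, x2):
    """Affine coupling: x1 passes through; x2 is scaled and shifted by
    functions of x1. Returns (y1, y2, log|det J|); for this 2-D case the
    log-determinant is just s(x1)."""
    y2 = x2 * math.exp(s(x1)) + t(x1)
    return x1, y2, s(x1)

def inverse(y1, y2):
    """Exact inverse: undo the shift, then the scaling, using y1 = x1."""
    x2 = (y2 - t(y1)) * math.exp(-s(y1))
    return y1, x2
```

Because s and t are only ever evaluated (never inverted), they can be arbitrarily deep networks in the real model; stacking such couplings with alternating partitions makes every dimension get transformed while keeping sampling, inversion and log-likelihood exact.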

Posted ContentDOI
20 Jun 2016-bioRxiv
TL;DR: It is shown that it is possible to make hundreds of thousands of permutations in a few minutes, which leads to very accurate p-values and in turn allows applying standard FDR correction procedures, which are more accurate than the ones currently used.
Abstract: Gene set enrichment analysis is a widely used tool for analyzing gene expression data. However, current implementations are slow due to the large number of samples required for the analysis to have good statistical power. In this paper we present a novel algorithm that efficiently reuses one sample multiple times and thus speeds up the analysis. We show that it is possible to make hundreds of thousands of permutations in a few minutes, which leads to very accurate p-values. This, in turn, allows applying standard FDR correction procedures, which are more accurate than the ones currently used. The method is implemented in the form of an R package and is freely available at https://github.com/ctlab/fgsea.
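The underlying computation being accelerated is a permutation test: compare the enrichment statistic of the real gene set against the statistics of many random same-size sets. A minimal sketch of that idea (using a simple mean-score statistic rather than the GSEA running-sum statistic, and without fgsea's sample-reuse trick; all names are illustrative):

```python
import random

def enrichment_pvalue(scores, gene_set, n_perm=10000, seed=0):
    """Permutation p-value for a gene set given per-gene scores.

    scores: dict mapping gene name -> score; gene_set: genes of interest.
    The statistic is the mean score of the set; the p-value is the fraction
    of random same-size sets scoring at least as high (with the standard
    +1 correction so the estimate is never exactly zero)."""
    rng = random.Random(seed)
    genes = list(scores)
    stat = sum(scores[g] for g in gene_set) / len(gene_set)
    hits = 0
    for _ in range(n_perm):
        perm = rng.sample(genes, len(gene_set))
        if sum(scores[g] for g in perm) / len(gene_set) >= stat:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

The precision of such a p-value is bounded by 1/n_perm, which is why the paper's ability to run hundreds of thousands of permutations cheaply matters: accurate small p-values are exactly what FDR correction over thousands of gene sets requires.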