••
01 Jul 2017
TL;DR: In this paper, dilated residual networks (DRNs) outperform their non-dilated counterparts in image classification without increasing the model's depth or complexity, and an approach to remove gridding artifacts introduced by dilation is proposed.
Abstract: Convolutional networks for image classification progressively reduce resolution until the image is represented by tiny feature maps in which the spatial structure of the scene is no longer discernible. Such loss of spatial acuity can limit image classification accuracy and complicate the transfer of the model to downstream applications that require detailed scene understanding. These problems can be alleviated by dilation, which increases the resolution of output feature maps without reducing the receptive field of individual neurons. We show that dilated residual networks (DRNs) outperform their non-dilated counterparts in image classification without increasing the model's depth or complexity. We then study gridding artifacts introduced by dilation, develop an approach to removing these artifacts (degridding), and show that this further increases the performance of DRNs. In addition, we show that the accuracy advantage of DRNs is further magnified in downstream applications such as object localization and semantic segmentation.
1,010 citations
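The resolution/receptive-field trade-off the abstract describes can be seen in a minimal 1-D sketch (illustrative code, not the paper's implementation): a kernel of size k with dilation d covers a receptive field of (k-1)*d + 1 inputs, while the output keeps the input's length.

```python
import numpy as np

def dilated_conv1d(x, w, dilation=1):
    """'Same'-padded 1-D dilated convolution (cross-correlation).

    A size-k kernel with dilation d spans (k-1)*d + 1 input samples,
    so the receptive field grows with d while output length is preserved.
    """
    k = len(w)
    rf = (k - 1) * dilation + 1      # receptive field of one output sample
    pad = rf // 2
    xp = np.pad(x, pad)
    return np.array([
        sum(w[j] * xp[i + j * dilation] for j in range(k))
        for i in range(len(x))
    ])

x = np.arange(8, dtype=float)
w = np.array([1.0, 1.0, 1.0])
y1 = dilated_conv1d(x, w, dilation=1)   # receptive field 3
y2 = dilated_conv1d(x, w, dilation=2)   # receptive field 5, same output length
```

Both outputs have the input's length, which is exactly the property the paper exploits: later layers keep spatial resolution while neurons still see a wide context.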
••
TL;DR: Deep learning has advanced rapidly since the early 2000s and now demonstrates state-of-the-art performance in various fields; the authors review its application in bioinformatics, which has been emphasized in both academia and industry.
Abstract: In the era of big data, transformation of biomedical big data into valuable knowledge has been one of the most important challenges in bioinformatics. Deep learning has advanced rapidly since the early 2000s and now demonstrates state-of-the-art performance in various fields. Accordingly, application of deep learning in bioinformatics to gain insight from data has been emphasized in both academia and industry. Here, we review deep learning in bioinformatics, presenting examples of current research. To provide a useful and comprehensive perspective, we categorize research both by the bioinformatics domain (i.e. omics, biomedical imaging, biomedical signal processing) and deep learning architecture (i.e. deep neural networks, convolutional neural networks, recurrent neural networks, emergent architectures) and present brief descriptions of each study. Additionally, we discuss theoretical and practical issues of deep learning in bioinformatics and suggest future research directions. We believe that this review will provide valuable insights and serve as a starting point for researchers to apply deep learning approaches in their bioinformatics studies.
1,010 citations
•
TL;DR: In this paper, the 2014 Science Definition Team (SDT) of the Wide-Field Infrared Survey Telescope (WFIRST) mission presents a design reference mission (DRM) for an implementation of WFIRST using one of the 2.4-m, Hubble-quality telescopes recently made available to NASA.
Abstract: This report describes the 2014 study by the Science Definition Team (SDT) of the Wide-Field Infrared Survey Telescope (WFIRST) mission. It is a space observatory that will address the most compelling scientific problems in dark energy, exoplanets and general astrophysics using a 2.4-m telescope with a wide-field infrared instrument and an optical coronagraph. The Astro2010 Decadal Survey recommended a Wide Field Infrared Survey Telescope as its top priority for a new large space mission. As conceived by the decadal survey, WFIRST would carry out a dark energy science program, a microlensing program to determine the demographics of exoplanets, and a general observing program utilizing its ultra wide field. In October 2012, NASA chartered a Science Definition Team (SDT) to produce, in collaboration with the WFIRST Study Office at GSFC and the Program Office at JPL, a Design Reference Mission (DRM) for an implementation of WFIRST using one of the 2.4-m, Hubble-quality telescope assemblies recently made available to NASA. This DRM builds on the work of the earlier WFIRST SDT, reported by Green et al. (2012), and the previous WFIRST-2.4 DRM, reported by Spergel et al. (2013). The 2.4-m primary mirror enables a mission with greater sensitivity and higher angular resolution than the 1.3-m and 1.1-m designs considered previously, increasing both the science return of the primary surveys and the capabilities of WFIRST as a Guest Observer facility. The addition of an on-axis coronagraphic instrument to the baseline design enables imaging and spectroscopic studies of planets around nearby stars.
1,009 citations
••
Affiliations: University of Washington; California Institute of Technology; Stockholm University; University of Maryland, College Park; Humboldt University of Berlin; Goddard Space Flight Center; National Central University; Weizmann Institute of Science; Macau University of Science and Technology; Tel Aviv University; University of California, Santa Barbara; University of Michigan; Adler Planetarium; Northwestern University; Lawrence Berkeley National Laboratory; University of California, Berkeley; Soka University of America; Centre national de la recherche scientifique; Radboud University Nijmegen; University of Wisconsin–Milwaukee; Los Alamos National Laboratory
TL;DR: The Zwicky Transient Facility (ZTF) is a new optical time-domain survey that uses the Palomar 48 inch Schmidt telescope; a custom wide-field camera provides a 47 deg^2 field of view and 8 s readout time, yielding more than an order of magnitude improvement in survey speed relative to its predecessor survey.
Abstract: The Zwicky Transient Facility (ZTF) is a new optical time-domain survey that uses the Palomar 48 inch Schmidt telescope. A custom-built wide-field camera provides a 47 deg^2 field of view and 8 s readout time, yielding more than an order of magnitude improvement in survey speed relative to its predecessor survey, the Palomar Transient Factory. We describe the design and implementation of the camera and observing system. The ZTF data system at the Infrared Processing and Analysis Center provides near-real-time reduction to identify moving and varying objects. We outline the analysis pipelines, data products, and associated archive. Finally, we present on-sky performance analysis and first scientific results from commissioning and the early survey. ZTF's public alert stream will serve as a useful precursor for that of the Large Synoptic Survey Telescope.
1,009 citations
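As a rough back-of-the-envelope check on the survey-speed claim (the 30 s exposure length is an assumption for illustration; it is not stated in this abstract):

```python
fov_deg2 = 47.0       # camera field of view, from the abstract
readout_s = 8.0       # readout time, from the abstract
exposure_s = 30.0     # ASSUMED typical exposure length (not in the abstract)

# Instantaneous areal survey rate, ignoring slew and filter-change time.
fields_per_hour = 3600.0 / (exposure_s + readout_s)
survey_speed = fields_per_hour * fov_deg2   # deg^2 per hour
```

With these assumed numbers the instantaneous rate is roughly 4,500 deg^2 per hour; real scheduling overheads (slews, filter changes) lower the achieved rate.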
••
TL;DR: The toxicity associated with the nuclease-based CRISPR/Cas9 system was greatly reduced in the Target-AID complexes, and off-target effects were shown to be comparable to those of conventional CRISPR/Cas systems, with a reduced risk of indel formation.
Abstract: INTRODUCTION To combat invading pathogens, cells develop an adaptive immune response by changing their own genetic information. In vertebrates, the generation of genetic variation (somatic hypermutation) is an essential process for diversification and affinity maturation of antibodies that function to detect and sequester various foreign biomolecules. Activation-induced cytidine deaminase (AID) carries out hypermutation by modifying deoxycytidine bases in the variable region of the immunoglobulin locus that produces antibody. AID-generated deoxyuridine in DNA is mutagenic, as it can be misrecognized as deoxythymine, resulting in C-to-T mutations. CRISPR (clustered regularly interspaced short palindromic repeats)/Cas (CRISPR-associated) is a prokaryotic adaptive immune system that records and degrades invasive foreign DNA or RNA. The CRISPR/Cas system cleaves foreign DNA/RNA segments and incorporates them into the genomic region called the CRISPR array. The CRISPR array is transcribed to produce CRISPR RNA that serves as guide RNA (gRNA) for recognition of the complementary foreign DNA/RNA in a ribonucleoprotein complex with Cas proteins, which degrade the target. The CRISPR/Cas system has been repurposed as a powerful genome editing tool, because it can be programmed to cleave specific DNA sequences by providing custom gRNAs. RATIONALE Although the precise mechanism by which AID specifically mutates the immunoglobulin locus remains elusive, targeting of AID activity is facilitated by the formation of a single-stranded DNA region, such as a transcriptional RNA/DNA hybrid (R-loop). The CRISPR/Cas system can be engineered to be nuclease-inactive. The nuclease-inactive form is capable of unwinding the DNA double strand in a protospacer adjacent motif (PAM) sequence-dependent manner, so that the gRNA binds to the complementary target DNA strand and forms an R-loop.
The nuclease-deficient CRISPR/Cas system may serve as a suitable DNA-targeting module for AID to catalyze site-specific mutagenesis. RESULTS To determine whether AID activity can be specifically targeted by the CRISPR/Cas system, we combined dCas9 (a nuclease-deficient mutant of Cas9) from Streptococcus pyogenes and an AID ortholog, PmCDA1 from sea lamprey, to form a synthetic complex (Target-AID) by either engineering a fusion between the two proteins or attaching a SH3 (Src homology 3) domain to the C terminus of dCas9 and a SHL (SH3 interaction ligand) to the C terminus of PmCDA1. Both of these complexes performed highly efficient site-directed mutagenesis. The mutational spectrum was analyzed in yeast and demonstrated that point mutations were predominantly induced at cytosines within the range of three to five bases surrounding the –18 position upstream of the PAM sequence on the strand noncomplementary to the gRNA. The toxicity associated with the nuclease-based CRISPR/Cas9 system was greatly reduced in the Target-AID complexes. Combination of PmCDA1 with the nickase Cas9(D10A) mutant, which retains cleavage activity for noncomplementary single-stranded DNA, was more efficient in yeast but also induced deletions as well as point mutations in mammalian cells. Addition of the uracil DNA glycosylase inhibitor protein, which blocks the initial step of the uracil base excision repair pathway, suppressed collateral deletions and further improved targeting efficiency. Potential off-target effects were assessed by whole-genome sequencing of yeast as well as deep sequencing of mammalian cells for regions that contain mismatched target sequences. These results showed that off-target effects were comparable to those of conventional CRISPR/Cas systems, with a reduced risk of indel formation.
CONCLUSION By expanding the genome editing potential of the CRISPR/Cas9 system by deaminase-mediated hypermutation, Target-AID demonstrated a very narrow range of targeted nucleotide substitution without the use of template DNA. Nickase Cas9 and uracil DNA glycosylase inhibitor protein can be used to boost the targeting efficiency. The reduced cytotoxicity will be beneficial for use in cells that are sensitive to artificial nucleases. Use of other types of nucleotide-modifying enzymes and/or other CRISPR-related systems with different PAM requirements will expand our genome-editing repertoire and capacity.
1,009 citations
••
TL;DR: A deprescribing protocol is proposed comprising 5 steps: ascertain all drugs the patient is currently taking and the reasons for each one, and prioritize drugs for discontinuation that have the lowest benefit-harm ratio and lowest likelihood of adverse withdrawal reactions or disease rebound syndromes.
Abstract: Inappropriate polypharmacy, especially in older people, imposes a substantial burden of adverse drug events, ill health, disability, hospitalization, and even death. The single most important predictor of inappropriate prescribing and risk of adverse drug events in older patients is the number of prescribed drugs. Deprescribing is the process of tapering or stopping drugs, aimed at minimizing polypharmacy and improving patient outcomes. Evidence of efficacy for deprescribing is emerging from randomized trials and observational studies. A deprescribing protocol is proposed comprising 5 steps: (1) ascertain all drugs the patient is currently taking and the reasons for each one; (2) consider overall risk of drug-induced harm in individual patients in determining the required intensity of deprescribing intervention; (3) assess each drug in regard to its current or future benefit potential compared with current or future harm or burden potential; (4) prioritize drugs for discontinuation that have the lowest benefit-harm ratio and lowest likelihood of adverse withdrawal reactions or disease rebound syndromes; and (5) implement a discontinuation regimen and monitor patients closely for improvement in outcomes or onset of adverse effects. Whereas patient and prescriber barriers to deprescribing exist, resources and strategies are available that facilitate deliberate yet judicious deprescribing and deserve wider application.
1,009 citations
••
TL;DR: A consensus document prepared on behalf of the International Society for Heart and Lung Transplantation (ISHLT) Infectious Diseases, Pediatric, and Heart Failure and Transplantation Councils.
Abstract: Mandeep R. Mehra, MD (Chair), Charles E. Canter, MD, Margaret M. Hannan, MD, Marc J. Semigran, MD, Patricia A. Uber, PharmD, David A. Baran, MD, Lara Danziger-Isakov, MD, MPH, James K. Kirklin, MD, Richard Kirk, MD, Sudhir S. Kushwaha, MD, Lars H. Lund, MD, PhD, Luciano Potena, MD, PhD, Heather J. Ross, MD, David O. Taylor, MD, Erik A.M. Verschuuren, MD, PhD, Andreas Zuckermann, MD and on behalf of the International Society for Heart Lung Transplantation (ISHLT) Infectious Diseases, Pediatric and Heart Failure and Transplantation Councils
1,009 citations
••
TL;DR: A new view of protein folding is emerging, whereby the energy landscapes that proteins navigate during folding in vivo may differ substantially from those observed during refolding in vitro.
Abstract: Most proteins must fold into unique three-dimensional structures to perform their biological functions. In the crowded cellular environment, newly synthesized proteins are at risk of misfolding and forming toxic aggregate species. To ensure efficient folding, different classes of molecular chaperones receive the nascent protein chain emerging from the ribosome and guide it along a productive folding pathway. Because proteins are structurally dynamic, constant surveillance of the proteome by an integrated network of chaperones and protein degradation machineries is required to maintain protein homeostasis (proteostasis). The capacity of this proteostasis network declines during aging, facilitating neurodegeneration and other chronic diseases associated with protein aggregation. Understanding the proteostasis network holds the promise of identifying targets for pharmacological intervention in these pathologies.
1,009 citations
••
TL;DR: It is suggested that all potential interventions be implemented to control the emerging COVID-19 if the infection becomes uncontrollable, and that current children's RNA-virus vaccines, including influenza vaccine, be administered to uninfected people and health care workers.
Abstract: An outbreak of a novel coronavirus (COVID-19 or 2019-CoV) infection has posed significant threats to international health and the economy. In the absence of treatment for this virus, there is an urgent need to find alternative methods to control the spread of disease. Here, we have conducted an online search for all treatment options related to coronavirus infections as well as some RNA-virus infections, and we have found that general treatments, coronavirus-specific treatments, and antiviral treatments should be useful in fighting COVID-19. We suggest that the nutritional status of each infected patient should be evaluated before the administration of general treatments, and that the current children's RNA-virus vaccines, including influenza vaccine, should be administered to uninfected people and health care workers. In addition, convalescent plasma should be given to COVID-19 patients if it is available. In conclusion, we suggest that all the potential interventions be implemented to control the emerging COVID-19 if the infection is uncontrollable.
1,009 citations
•
TL;DR: BART as mentioned in this paper is a denoising autoencoder for pretraining sequence-to-sequence models, which is trained by corrupting text with an arbitrary noising function, and then learning a model to reconstruct the original text.
Abstract: We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text. It uses a standard Transformer-based neural machine translation architecture which, despite its simplicity, can be seen as generalizing BERT (due to the bidirectional encoder), GPT (with the left-to-right decoder), and many other more recent pretraining schemes. We evaluate a number of noising approaches, finding the best performance by both randomly shuffling the order of the original sentences and using a novel in-filling scheme, where spans of text are replaced with a single mask token. BART is particularly effective when fine-tuned for text generation but also works well for comprehension tasks. It matches the performance of RoBERTa with comparable training resources on GLUE and SQuAD, achieves new state-of-the-art results on a range of abstractive dialogue, question answering, and summarization tasks, with gains of up to 6 ROUGE. BART also provides a 1.1 BLEU increase over a back-translation system for machine translation, with only target language pretraining. We also report ablation experiments that replicate other pretraining schemes within the BART framework, to better measure which factors most influence end-task performance.
1,008 citations
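The two most effective corruptions the abstract names, sentence shuffling and span in-filling with a single mask token, can be sketched at the token level (a toy sketch; the helper names are illustrative, not BART's API):

```python
import random

def infill_noise(tokens, span_start, span_len, mask="<mask>"):
    """Text-infilling corruption: replace a contiguous span of tokens
    with a single mask token, so the model must also infer how many
    tokens are missing when reconstructing."""
    return tokens[:span_start] + [mask] + tokens[span_start + span_len:]

def shuffle_sentences(sentences, seed=0):
    """Sentence-permutation corruption: randomly reorder sentences."""
    rng = random.Random(seed)
    out = sentences[:]
    rng.shuffle(out)
    return out

src = ["the", "cat", "sat", "on", "the", "mat"]
noisy = infill_noise(src, span_start=1, span_len=2)
# noisy == ["the", "<mask>", "on", "the", "mat"]
```

The denoising objective would then train a sequence-to-sequence model to map `noisy` (and shuffled sentences) back to the original text.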
••
TL;DR: This work addresses the diffusion of information about the COVID-19 with a massive data analysis on Twitter, Instagram, YouTube, Reddit and Gab, and identifies information spreading from questionable sources, finding different volumes of misinformation in each platform.
Abstract: We address the diffusion of information about the COVID-19 with a massive data analysis on Twitter, Instagram, YouTube, Reddit and Gab. We analyze engagement and interest in the COVID-19 topic and provide a differential assessment on the evolution of the discourse on a global scale for each platform and their users. We fit information spreading with epidemic models characterizing the basic reproduction number R_0 for each social media platform. Moreover, we identify information spreading from questionable sources, finding different volumes of misinformation in each platform. However, information from both reliable and questionable sources does not present different spreading patterns. Finally, we provide platform-dependent numerical estimates of rumors' amplification.
•
TL;DR: In this article, a LASSO regression based channel selection and least square reconstruction is proposed to accelerate very deep convolutional neural networks, which reduces the accumulated error and enhances the compatibility with various architectures.
Abstract: In this paper, we introduce a new channel pruning method to accelerate very deep convolutional neural networks. Given a trained CNN model, we propose an iterative two-step algorithm to effectively prune each layer, by a LASSO regression based channel selection and least square reconstruction. We further generalize this algorithm to multi-layer and multi-branch cases. Our method reduces the accumulated error and enhances compatibility with various architectures. Our pruned VGG-16 achieves state-of-the-art results with a 5x speed-up along with only a 0.3% increase of error. More importantly, our method is able to accelerate modern networks like ResNet and Xception, suffering only 1.4% and 1.0% accuracy loss respectively under a 2x speed-up, which is significant. Code has been made publicly available.
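The least-squares reconstruction step can be sketched on a toy linear layer (the channel subset is taken as given here; in the paper it comes from the LASSO-based selection, which this sketch omits):

```python
import numpy as np

rng = np.random.default_rng(0)
n, c = 200, 8                       # samples, input channels of one layer
X = rng.normal(size=(n, c))         # per-channel feature responses
w = rng.normal(size=c)              # original weights (toy, single filter)
y = X @ w                           # original layer output to preserve

keep = [0, 2, 3, 5, 7]              # channels chosen by the selection step
# Least-squares reconstruction: re-fit the weights on the kept channels
# so the pruned layer best reproduces the original output.
w_new, *_ = np.linalg.lstsq(X[:, keep], y, rcond=None)
err = np.linalg.norm(X[:, keep] @ w_new - y) / np.linalg.norm(y)
```

Refitting after pruning is what keeps the accumulated error low: the remaining channels absorb as much of the removed channels' contribution as a linear fit allows.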
••
TL;DR: Combinatorial cellular indexing, a strategy for multiplex barcoding of thousands of single cells per experiment, was successfully used to investigate the genome-wide chromatin accessibility landscape in each of over 15,000 single cells, avoiding the need for compartmentalization of individual cells.
Abstract: Technical advances have enabled the collection of genome and transcriptome data sets with single-cell resolution. However, single-cell characterization of the epigenome has remained challenging. Furthermore, because cells must be physically separated before biochemical processing, conventional single-cell preparatory methods scale linearly. We applied combinatorial cellular indexing to measure chromatin accessibility in thousands of single cells per assay, circumventing the need for compartmentalization of individual cells. We report chromatin accessibility profiles from more than 15,000 single cells and use these data to cluster cells on the basis of chromatin accessibility landscapes. We identify modules of coordinately regulated chromatin accessibility at the level of single cells both between and within cell types, with a scalable method that may accelerate progress toward a human cell atlas.
••
TL;DR: In this paper, the authors demonstrate room-temperature electrical switching between stable configurations in antiferromagnetic CuMnAs thin-film devices by applied current with magnitudes of order 10^6 amperes per square centimeter.
Abstract: Antiferromagnets are hard to control by external magnetic fields because of the alternating directions of magnetic moments on individual atoms and the resulting zero net magnetization. However, relativistic quantum mechanics allows for generating current-induced internal fields whose sign alternates with the periodicity of the antiferromagnetic lattice. Using these fields, which couple strongly to the antiferromagnetic order, we demonstrate room-temperature electrical switching between stable configurations in antiferromagnetic CuMnAs thin-film devices by applied current with magnitudes of order 10^6 amperes per square centimeter. Electrical writing is combined in our solid-state memory with electrical readout and the stored magnetic state is insensitive to and produces no external magnetic field perturbations, which illustrates the unique merits of antiferromagnets for spintronics.
••
30 Oct 2017
TL;DR: MagNet, a framework for defending neural network classifiers against adversarial examples, is proposed, and it is shown empirically that MagNet is effective against the most advanced state-of-the-art attacks in blackbox and graybox scenarios without sacrificing false positive rate on normal examples.
Abstract: Deep learning has shown impressive performance on hard perceptual problems. However, researchers found deep learning systems to be vulnerable to small, specially crafted perturbations that are imperceptible to humans. Such perturbations cause deep learning systems to mis-classify adversarial examples, with potentially disastrous consequences where safety or security is crucial. Prior defenses against adversarial examples either targeted specific attacks or were shown to be ineffective. We propose MagNet, a framework for defending neural network classifiers against adversarial examples. MagNet neither modifies the protected classifier nor requires knowledge of the process for generating adversarial examples. MagNet includes one or more separate detector networks and a reformer network. The detector networks learn to differentiate between normal and adversarial examples by approximating the manifold of normal examples. Since they assume no specific process for generating adversarial examples, they generalize well. The reformer network moves adversarial examples towards the manifold of normal examples, which is effective for correctly classifying adversarial examples with small perturbation. We discuss the intrinsic difficulties in defending against whitebox attack and propose a mechanism to defend against graybox attack. Inspired by the use of randomness in cryptography, we use diversity to strengthen MagNet. We show empirically that MagNet is effective against the most advanced state-of-the-art attacks in blackbox and graybox scenarios without sacrificing false positive rate on normal examples.
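The detector idea, flagging inputs that sit far from the manifold of normal examples, can be sketched with a linear autoencoder (PCA) standing in for MagNet's learned autoencoders; the data and names here are illustrative, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)
# "Normal" data living on a 3-D linear manifold inside 10-D space.
normal = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 10))

# Fit a linear autoencoder (PCA) on normal examples only; this stands
# in for MagNet's detector autoencoder trained on clean data.
mu = normal.mean(axis=0)
_, _, Vt = np.linalg.svd(normal - mu, full_matrices=False)
P = Vt[:3]                              # top-3 principal directions

def reconstruct(x):
    return mu + (x - mu) @ P.T @ P      # project onto learned manifold

def recon_error(x):
    return np.linalg.norm(x - reconstruct(x))

# Threshold set from the normal data's worst-case reconstruction error.
threshold = max(recon_error(x) for x in normal) * 1.05
adversarial = rng.normal(size=10) * 5   # off-manifold point
is_adv = recon_error(adversarial) > threshold
```

Normal inputs reconstruct almost perfectly while off-manifold inputs do not, which is the detection criterion; the reformer in MagNet additionally replaces a borderline input with its reconstruction before classification.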
••
TL;DR: HUMAnN2 is developed, a tiered search strategy that enables fast, accurate, and species-resolved functional profiling of host-associated and environmental communities and introduces ‘contributional diversity’ to explain patterns of ecological assembly across different microbial community types.
Abstract: Functional profiles of microbial communities are typically generated using comprehensive metagenomic or metatranscriptomic sequence read searches, which are time-consuming, prone to spurious mapping, and often limited to community-level quantification. We developed HUMAnN2, a tiered search strategy that enables fast, accurate, and species-resolved functional profiling of host-associated and environmental communities. HUMAnN2 identifies a community's known species, aligns reads to their pangenomes, performs translated search on unclassified reads, and finally quantifies gene families and pathways. Relative to pure translated search, HUMAnN2 is faster and produces more accurate gene family profiles. We applied HUMAnN2 to study clinal variation in marine metabolism, ecological contribution patterns among human microbiome pathways, variation in species' genomic versus transcriptional contributions, and strain profiling. Further, we introduce 'contributional diversity' to explain patterns of ecological assembly across different microbial community types.
••
TL;DR: In this paper, the suite of developing theoretical tools on which recent progress on this problem has been based is reviewed, and it is shown that for many phenomena a more refined, non-Markovian treatment is necessary.
Abstract: An ongoing theme in quantum physics is the interaction of small quantum systems with an environment. If that environment has many degrees of freedom and is weakly coupled, it can often be reasonable to treat its decohering effect on the small system using a "memoryless," or Markovian, description. This Colloquium shows that for many phenomena a more refined, non-Markovian treatment is necessary. The suite of developing theoretical tools on which recent progress on this problem has been based is reviewed.
•
TL;DR: This paper examines six extensions to the DQN algorithm and empirically studies their combination, showing that the combination provides state-of-the-art performance on the Atari 2600 benchmark, both in terms of data efficiency and final performance.
Abstract: The deep reinforcement learning community has made several independent improvements to the DQN algorithm. However, it is unclear which of these extensions are complementary and can be fruitfully combined. This paper examines six extensions to the DQN algorithm and empirically studies their combination. Our experiments show that the combination provides state-of-the-art performance on the Atari 2600 benchmark, both in terms of data efficiency and final performance. We also provide results from a detailed ablation study that shows the contribution of each component to overall performance.
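One of the six extensions combined here, multi-step learning, replaces the one-step TD target with a truncated n-step return; a minimal sketch (function name and toy numbers are illustrative):

```python
def n_step_return(rewards, bootstrap_value, gamma=0.99, n=3):
    """Truncated n-step return:
    G_t = sum_{k=0}^{n-1} gamma^k * r_{t+k}  +  gamma^n * V(s_{t+n}),
    the multi-step target used as one of the DQN extensions in Rainbow.
    """
    g = 0.0
    for k in range(n):
        g += (gamma ** k) * rewards[k]
    return g + (gamma ** n) * bootstrap_value

g = n_step_return([1.0, 0.0, 1.0], bootstrap_value=2.0, gamma=0.5, n=3)
# 1.0 + 0.5*0.0 + 0.25*1.0 + 0.125*2.0 = 1.5
```

Propagating rewards over n steps before bootstrapping speeds credit assignment, which is part of why the combined agent is more data-efficient than one-step DQN.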
•
TL;DR: By embracing deep neural networks, this work is able to demonstrate end-to-end learning of protocols in complex environments inspired by communication riddles and multi-agent computer vision problems with partial observability.
Abstract: We consider the problem of multiple agents sensing and acting in environments with the goal of maximising their shared utility. In these environments, agents must learn communication protocols in order to share information that is needed to solve the tasks. By embracing deep neural networks, we are able to demonstrate end-to-end learning of protocols in complex environments inspired by communication riddles and multi-agent computer vision problems with partial observability. We propose two approaches for learning in these domains: Reinforced Inter-Agent Learning (RIAL) and Differentiable Inter-Agent Learning (DIAL). The former uses deep Q-learning, while the latter exploits the fact that, during learning, agents can backpropagate error derivatives through (noisy) communication channels. Hence, this approach uses centralised learning but decentralised execution. Our experiments introduce new environments for studying the learning of communication protocols and present a set of engineering innovations that are essential for success in these domains.
••
TL;DR: In this paper, the authors present a comprehensive review of the data sources and estimation methods of 30 currently available global precipitation data sets, including gauge-based, satellite-related, and reanalysis data sets.
Abstract: In this paper, we present a comprehensive review of the data sources and estimation methods of 30 currently available global precipitation data sets, including gauge-based, satellite-related, and reanalysis data sets. We analyzed the discrepancies between the data sets from daily to annual timescales and found large differences in both the magnitude and the variability of precipitation estimates. The magnitude of annual precipitation estimates over global land deviated by as much as 300 mm/yr among the products. Reanalysis data sets had a larger degree of variability than the other types of data sets. The degree of variability in precipitation estimates also varied by region. Large differences in annual and seasonal estimates were found in tropical oceans, complex mountain areas, northern Africa, and some high-latitude regions. Overall, the variability associated with extreme precipitation estimates was slightly greater at lower latitudes than at higher latitudes. The reliability of precipitation data sets is mainly limited by the number and spatial coverage of surface stations, the satellite algorithms, and the data assimilation models. The inconsistencies described limit the capability of the products for climate monitoring, attribution, and model validation.
••
TL;DR: It is shown that a defined population of progenitor cells does not coalesce in the subgranular zone during human fetal or postnatal development, and that neurogenesis in the dentate gyrus does not continue, or is extremely rare, in adult humans.
Abstract: New neurons continue to be generated in the subgranular zone of the dentate gyrus of the adult mammalian hippocampus. This process has been linked to learning and memory, stress and exercise, and is thought to be altered in neurological disease. In humans, some studies have suggested that hundreds of new neurons are added to the adult dentate gyrus every day, whereas other studies find many fewer putative new neurons. Despite these discrepancies, it is generally believed that the adult human hippocampus continues to generate new neurons. Here we show that a defined population of progenitor cells does not coalesce in the subgranular zone during human fetal or postnatal development. We also find that the number of proliferating progenitors and young neurons in the dentate gyrus declines sharply during the first year of life and only a few isolated young neurons are observed by 7 and 13 years of age. In adult patients with epilepsy and healthy adults (18-77 years; n = 17 post-mortem samples from controls; n = 12 surgical resection samples from patients with epilepsy), young neurons were not detected in the dentate gyrus. In the monkey (Macaca mulatta) hippocampus, proliferation of neurons in the subgranular zone was found in early postnatal life, but this diminished during juvenile development as neurogenesis decreased. We conclude that recruitment of young neurons to the primate hippocampus decreases rapidly during the first years of life, and that neurogenesis in the dentate gyrus does not continue, or is extremely rare, in adult humans. The early decline in hippocampal neurogenesis raises questions about how the function of the dentate gyrus differs between humans and other species in which adult hippocampal neurogenesis is preserved.
••
07 Jun 2015 TL;DR: A conditional random field model that reasons about possible groundings of scene graphs to test images and shows that the full model can be used to improve object localization compared to baseline methods and outperforms retrieval methods that use only objects or low-level image features.
Abstract: This paper develops a novel framework for semantic image retrieval based on the notion of a scene graph. Our scene graphs represent objects (“man”, “boat”), attributes of objects (“boat is white”) and relationships between objects (“man standing on boat”). We use these scene graphs as queries to retrieve semantically related images. To this end, we design a conditional random field model that reasons about possible groundings of scene graphs to test images. The likelihoods of these groundings are used as ranking scores for retrieval. We introduce a novel dataset of 5,000 human-generated scene graphs grounded to images and use this dataset to evaluate our method for image retrieval. In particular, we evaluate retrieval using full scene graphs and small scene subgraphs, and show that our method outperforms retrieval methods that use only objects or low-level image features. In addition, we show that our full model can be used to improve object localization compared to baseline methods.
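As a toy illustration of the retrieval idea above (not the paper's CRF, whose grounding likelihoods are far richer), one can rank images by the fraction of a query graph's facts their annotations satisfy; all names and data structures here are invented for the example.

```python
def score(query, annotation):
    """Fraction of the query graph's facts (objects, attributes,
    relationships) that appear in an image's annotation."""
    facts = (
        [("objects", o) for o in query["objects"]]
        + [("attributes", a) for a in query["attributes"]]
        + [("relationships", r) for r in query["relationships"]]
    )
    hits = sum(1 for key, fact in facts if fact in annotation.get(key, ()))
    return hits / len(facts)

def retrieve(query, images):
    """Return image ids ranked by descending match score."""
    return sorted(images, key=lambda i: score(query, images[i]), reverse=True)
```

For the abstract's example query ("man", "boat", "boat is white", "man standing on boat"), an image annotated with all four facts scores 1.0 and ranks above an image containing only a man.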
••
TL;DR: Human IDH mutant gliomas exhibit hypermethylation at cohesin and CCCTC-binding factor (CTCF)-binding sites, compromising binding of this methylation-sensitive insulator protein; they also manifest a CpG island methylator phenotype (G-CIMP), although the functional importance of this altered epigenetic state remains unclear.
Abstract: Gain-of-function IDH mutations are initiating events that define major clinical and prognostic classes of gliomas. Mutant IDH protein produces a new onco-metabolite, 2-hydroxyglutarate, which interferes with iron-dependent hydroxylases, including the TET family of 5'-methylcytosine hydroxylases. TET enzymes catalyse a key step in the removal of DNA methylation. IDH mutant gliomas thus manifest a CpG island methylator phenotype (G-CIMP), although the functional importance of this altered epigenetic state remains unclear. Here we show that human IDH mutant gliomas exhibit hypermethylation at cohesin and CCCTC-binding factor (CTCF)-binding sites, compromising binding of this methylation-sensitive insulator protein. Reduced CTCF binding is associated with loss of insulation between topological domains and aberrant gene activation. We specifically demonstrate that loss of CTCF at a domain boundary permits a constitutive enhancer to interact aberrantly with the receptor tyrosine kinase gene PDGFRA, a prominent glioma oncogene. Treatment of IDH mutant gliomaspheres with a demethylating agent partially restores insulator function and downregulates PDGFRA. Conversely, CRISPR-mediated disruption of the CTCF motif in IDH wild-type gliomaspheres upregulates PDGFRA and increases proliferation. Our study suggests that IDH mutations promote gliomagenesis by disrupting chromosomal topology and allowing aberrant regulatory interactions that induce oncogene expression.
••
TL;DR: The nonlinear K-profiles clustering method, a nonlinear counterpart of the K-means clustering algorithm, is introduced; a built-in statistical testing procedure ensures that genes belonging to no cluster do not impact the estimation of cluster profiles.
Abstract: With modern technologies such as microarray, deep sequencing, and liquid chromatography-mass spectrometry (LC-MS), it is possible to measure the expression levels of thousands of genes/proteins simultaneously to unravel important biological processes. A first step towards elucidating hidden patterns and understanding the massive data is the application of clustering techniques. Nonlinear relations, which have mostly gone unutilized in contrast to linear correlations, are prevalent in high-throughput data. In many cases, nonlinear relations can model the biological relationship more precisely and reflect critical patterns in the biological systems. Using the general dependency measure, Distance Based on Conditional Ordered List (DCOL), that we introduced before, we designed the nonlinear K-profiles clustering method, which can be seen as the nonlinear counterpart of the K-means clustering algorithm. The method has a built-in statistical testing procedure that ensures genes not belonging to any cluster do not impact the estimation of cluster profiles. Results from extensive simulation studies showed that K-profiles clustering not only outperformed the traditional linear K-means algorithm, but also performed significantly better than our previous General Dependency Hierarchical Clustering (GDHC) algorithm. We further analyzed a gene expression dataset, on which K-profiles clustering generated biologically meaningful results.
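The K-means-style alternation the abstract describes (assign genes to the best-matching cluster profile, recompute profiles, repeat) can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: it substitutes correlation of rank-transformed expression for the DCOL dependency measure, omits the paper's statistical test for excluding genes that belong to no cluster, and its function name and deterministic initialization are assumptions of the example.

```python
import numpy as np

def kprofiles(X, k, n_iter=20):
    """K-means-style clustering of genes (rows of X) where similarity
    to a cluster profile is correlation of rank-transformed expression,
    a simple monotone-association stand-in for DCOL."""
    n, m = X.shape
    # Rank-transform each gene so any monotone (hence some nonlinear)
    # relation becomes a linear one between rank vectors.
    ranks = X.argsort(axis=1).argsort(axis=1).astype(float)
    labels = np.arange(n) % k  # simple deterministic initialization
    for _ in range(n_iter):
        # Cluster profile: mean rank vector of member genes;
        # an empty cluster borrows one gene's ranks as a fallback.
        profiles = np.stack([
            ranks[labels == j].mean(axis=0) if np.any(labels == j)
            else ranks[j % n]
            for j in range(k)
        ])
        # Reassign each gene to the profile it correlates with best.
        sims = np.corrcoef(np.vstack([ranks, profiles]))[:n, n:]
        new_labels = sims.argmax(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels
```

On a toy expression matrix whose first genes are monotonically increasing functions of a hidden signal and whose remaining genes are decreasing ones, the two groups separate even though their pairwise relations are nonlinear.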
••
TL;DR: A general strategy to produce numerous gRNAs from a single polycistronic gene via the endogenous tRNA-processing system is developed and shown to significantly increase CRISPR/Cas9 multiplex editing capability and efficiency in plants; it is expected to have broad applications for small RNA expression and genome engineering.
Abstract: The clustered regularly interspaced short palindromic repeat (CRISPR)/CRISPR-associated protein 9 nuclease (Cas9) system is being harnessed as a powerful tool for genome engineering in basic research, molecular therapy, and crop improvement. This system uses a small guide RNA (gRNA) to direct Cas9 endonuclease to a specific DNA site; thus, its targeting capability is largely constrained by the gRNA-expressing device. In this study, we developed a general strategy to produce numerous gRNAs from a single polycistronic gene. The endogenous tRNA-processing system, which precisely cleaves both ends of the tRNA precursor, was engineered as a simple and robust platform to boost the targeting and multiplex editing capability of the CRISPR/Cas9 system. We demonstrated that synthetic genes with tandemly arrayed tRNA–gRNA architecture were efficiently and precisely processed into gRNAs with desired 5′ targeting sequences in vivo, which directed Cas9 to edit multiple chromosomal targets. Using this strategy, multiplex genome editing and chromosomal-fragment deletion were readily achieved in stable transgenic rice plants with a high efficiency (up to 100%). Because tRNA and its processing system are virtually conserved in all living organisms, this method could be broadly used to boost the targeting capability and editing efficiency of CRISPR/Cas9 toolkits.
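The tandemly arrayed tRNA–gRNA architecture described above can be sketched symbolically: the synthetic transcript alternates a tRNA unit with a spacer-plus-scaffold gRNA unit, and the endogenous tRNA-processing machinery cleaves at each tRNA boundary to release the individual gRNAs. The placeholder strings below are invented stand-ins, not the paper's actual sequences.

```python
# Symbolic stand-ins; real constructs use ~77-nt pre-tRNA sequences,
# 20-nt targeting spacers, and the invariant gRNA scaffold.
TRNA = "tRNA"
SCAFFOLD = "scaffold"

def polycistron(spacers):
    """Concatenate one tRNA-gRNA unit per targeting spacer, mirroring
    the tandem architecture of the polycistronic gene."""
    return "".join(f"[{TRNA}][{s}-{SCAFFOLD}]" for s in spacers)
```

Calling `polycistron(["spacer1", "spacer2"])` yields one tRNA unit ahead of each spacer-scaffold unit, which is what lets the tRNA-processing system excise each gRNA with its intended 5′ targeting sequence.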
••
TL;DR: The similarities and differences between these two classes of vesicle are reviewed, suggesting that, despite their considerable differences, the functions of ectosomes may be largely analogous to those of exosomes.
••
TL;DR: The Zika virus genome was detected in amniotic fluid samples of two pregnant women in Brazil whose fetuses were diagnosed with microcephaly and results suggest that the virus can cross the placental barrier.
Abstract:
Background: The incidence of microcephaly in Brazil in 2015 was 20 times higher than in previous years. Congenital microcephaly is associated with genetic factors and several causative agents. Epidemiological data suggest that microcephaly cases in Brazil might be associated with the introduction of Zika virus. We aimed to detect and sequence the Zika virus genome in amniotic fluid samples of two pregnant women in Brazil whose fetuses were diagnosed with microcephaly.
Methods: In this case study, amniotic fluid samples from two pregnant women from the state of Paraiba in Brazil whose fetuses had been diagnosed with microcephaly were obtained, on the recommendation of the Brazilian health authorities, by ultrasound-guided transabdominal amniocentesis at 28 weeks' gestation. The women had presented at 18 weeks' and 10 weeks' gestation, respectively, with clinical manifestations that could have been symptoms of Zika virus infection, including fever, myalgia, and rash. After the amniotic fluid samples were centrifuged, DNA and RNA were extracted from the purified virus particles before the viral genome was identified by quantitative reverse transcription PCR and viral metagenomic next-generation sequencing. Phylogenetic reconstruction and investigation of recombination events were done by comparing the Brazilian Zika virus genome with sequences from other Zika strains and from flaviviruses that occur in similar regions in Brazil.
Findings: We detected the Zika virus genome in the amniotic fluid of both pregnant women. The virus was not detected in their urine or serum. Tests for dengue virus, chikungunya virus, Toxoplasma gondii, rubella virus, cytomegalovirus, herpes simplex virus, HIV, Treponema pallidum, and parvovirus B19 were all negative. After sequencing of the complete genome of the Brazilian Zika virus isolated from patient 1, phylogenetic analyses showed that the virus shares 97–100% of its genomic identity with lineages isolated during an outbreak in French Polynesia in 2013, and that in both envelope and NS5 genomic regions, it clustered with sequences from North and South America, southeast Asia, and the Pacific. After assessing the possibility of recombination events between the Zika virus and other flaviviruses, we ruled out the hypothesis that the Brazilian Zika virus genome is a recombinant strain with other mosquito-borne flaviviruses.
Interpretation: These findings strengthen the putative association between Zika virus and cases of microcephaly in neonates in Brazil. Moreover, our results suggest that the virus can cross the placental barrier. As a result, Zika virus should be considered as a potential infectious agent for human fetuses. Pathogenesis studies that confirm the tropism of Zika virus for neuronal cells are warranted.
Funding: Conselho Nacional de Desenvolvimento e Pesquisa (CNPq), Fundação de Amparo à Pesquisa do Estado do Rio de Janeiro (FAPERJ).
••
University of Texas MD Anderson Cancer Center, Memorial Sloan Kettering Cancer Center, Institut Gustave Roussy, Cornell University, Northwestern University, Ohio State University, University of Miami, University of Texas Southwestern Medical Center, University of California, San Francisco, Anschutz Medical Campus, Sarah Cannon Research Institute, Harvard University, Centre Hospitalier Universitaire de Bordeaux, University of Alabama at Birmingham, Johns Hopkins University, City of Hope National Medical Center, Washington University in St. Louis, Mayo Clinic, Oregon Health & Science University, Medical University of South Carolina, Emory University, Cleveland Clinic, Agios Pharmaceuticals
TL;DR: In patients with advanced IDH1-mutated relapsed or refractory AML, ivosidenib at a dose of 500 mg daily was associated with a low frequency of grade 3 or higher treatment-related adverse events and with transfusion independence, durable remissions, and molecular remissions in some patients with complete remission.
Abstract: Background Mutations in the gene encoding isocitrate dehydrogenase 1 (IDH1) occur in 6 to 10% of patients with acute myeloid leukemia (AML). Ivosidenib (AG-120) is an oral, targeted, small-molecule inhibitor of mutant IDH1. Methods We conducted a phase 1 dose-escalation and dose-expansion study of ivosidenib monotherapy in IDH1-mutated AML. Safety and efficacy were assessed in all treated patients. The primary efficacy population included patients with relapsed or refractory AML receiving 500 mg of ivosidenib daily with at least 6 months of follow-up. Results Overall, 258 patients received ivosidenib and had safety outcomes assessed. Among patients with relapsed or refractory AML (179 patients), treatment-related adverse events of grade 3 or higher that occurred in at least 3 patients were prolongation of the QT interval (in 7.8% of the patients), the IDH differentiation syndrome (in 3.9%), anemia (in 2.2%), thrombocytopenia or a decrease in the platelet count (in 3.4%), and leukocytosis (in 1.7%...
••
TL;DR: The majority of colonic and rectal superficial lesions can be removed effectively and curatively by standard polypectomy and/or EMR, while ESGE recommends ESD as the treatment of choice for most gastric superficial neoplastic lesions.
Abstract: This Guideline is an official statement of the European Society of Gastrointestinal Endoscopy (ESGE). The Grading of Recommendations Assessment, Development, and Evaluation (GRADE) system was adopted to define the strength of recommendations and the quality of evidence.
Main recommendations:
1. ESGE recommends endoscopic en bloc resection for superficial esophageal squamous cell cancers (SCCs), excluding those with obvious submucosal involvement (strong recommendation, moderate quality evidence). Endoscopic mucosal resection (EMR) may be considered in such lesions when they are smaller than 10 mm if en bloc resection can be assured. However, ESGE recommends endoscopic submucosal dissection (ESD) as the first option, mainly to provide an en bloc resection with accurate pathology staging and to avoid missing important histological features (strong recommendation, moderate quality evidence).
2. ESGE recommends endoscopic resection with a curative intent for visible lesions in Barrett’s esophagus (strong recommendation, moderate quality evidence). ESD has not been shown to be superior to EMR for excision of mucosal cancer, and for that reason EMR should be preferred. ESD may be considered in selected cases, such as lesions larger than 15 mm, poorly lifting tumors, and lesions at risk for submucosal invasion (strong recommendation, moderate quality evidence).
3. ESGE recommends endoscopic resection for the treatment of gastric superficial neoplastic lesions that possess a very low risk of lymph node metastasis (strong recommendation, high quality evidence). EMR is an acceptable option for lesions smaller than 10–15 mm with a very low probability of advanced histology (Paris 0-IIa). However, ESGE recommends ESD as treatment of choice for most gastric superficial neoplastic lesions (strong recommendation, moderate quality evidence).
4. ESGE states that the majority of colonic and rectal superficial lesions can be effectively removed in a curative way by standard polypectomy and/or by EMR (strong recommendation, moderate quality evidence). ESD can be considered for removal of colonic and rectal lesions with high suspicion of limited submucosal invasion, based on the two main criteria of depressed morphology and irregular or nongranular surface pattern, particularly if the lesions are larger than 20 mm; ESD can also be considered for colorectal lesions that otherwise cannot be optimally and radically removed by snare-based techniques (strong recommendation, moderate quality evidence).