
Showing papers by "Stony Brook University published in 2012"


Journal ArticleDOI
Georges Aad1, T. Abajyan2, Brad Abbott3, Jalal Abdallah4  +2964 moreInstitutions (200)
TL;DR: In this article, a search for the Standard Model Higgs boson in proton-proton collisions with the ATLAS detector at the LHC is presented; the observed excess has a significance of 5.9 standard deviations, corresponding to a background fluctuation probability of 1.7×10−9.

9,282 citations


Journal ArticleDOI
TL;DR: These guidelines are presented for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process vs. those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.

4,316 citations


Journal ArticleDOI
TL;DR: An anatomically distinct clearing system in the brain that serves a lymphatic-like function is described and may have relevance for understanding or treating neurodegenerative diseases that involve the mis-accumulation of soluble proteins, such as amyloid β in Alzheimer's disease.
Abstract: Because it lacks a lymphatic circulation, the brain must clear extracellular proteins by an alternative mechanism. The cerebrospinal fluid (CSF) functions as a sink for brain extracellular solutes, but it is not clear how solutes from the brain interstitium move from the parenchyma to the CSF. We demonstrate that a substantial portion of subarachnoid CSF cycles through the brain interstitial space. On the basis of in vivo two-photon imaging of small fluorescent tracers, we showed that CSF enters the parenchyma along paravascular spaces that surround penetrating arteries and that brain interstitial fluid is cleared along paravenous drainage pathways. Animals lacking the water channel aquaporin-4 (AQP4) in astrocytes exhibit slowed CSF influx through this system and a ~70% reduction in interstitial solute clearance, suggesting that the bulk fluid flow between these anatomical influx and efflux routes is supported by astrocytic water transport. Fluorescent-tagged amyloid β, a peptide thought to be pathogenic in Alzheimer's disease, was transported along this route, and deletion of the Aqp4 gene suppressed the clearance of soluble amyloid β, suggesting that this pathway may remove amyloid β from the central nervous system. Clearance through paravenous flow may also regulate extracellular levels of proteins involved with neurodegenerative conditions, its impairment perhaps contributing to the mis-accumulation of soluble proteins.

3,368 citations


Journal ArticleDOI
TL;DR: A case of the abscopal effect is reported in a patient with melanoma treated with ipilimumab and radiotherapy, with temporal associations of tumor shrinkage with antibody responses to the cancer-testis antigen NY-ESO-1, changes in peripheral-blood immune cells, and increases in antibodies to other antigens after radiotherapy.
Abstract: The abscopal effect is a phenomenon in which local radiotherapy is associated with the regression of metastatic cancer at a distance from the irradiated site. The abscopal effect may be mediated by activation of the immune system. Ipilimumab is a monoclonal antibody that inhibits an immunologic checkpoint on T cells, cytotoxic T-lymphocyte-associated antigen 4 (CTLA-4). We report a case of the abscopal effect in a patient with melanoma treated with ipilimumab and radiotherapy. Temporal associations were noted: tumor shrinkage with antibody responses to the cancer-testis antigen NY-ESO-1, changes in peripheral-blood immune cells, and increases in antibody responses to other antigens after radiotherapy. (Funded by the National Institutes of Health and others.)

1,769 citations


Journal ArticleDOI
TL;DR: A review of the relationship between eutrophication, climate change and cyanobacterial blooms in freshwater, estuarine, and marine ecosystems can be found in this paper.

1,675 citations


Journal ArticleDOI
23 Nov 2012-Science
TL;DR: Progress is reviewed on three broad questions: What is the physical code by which an amino acid sequence dictates a protein's native structure? How can proteins fold so fast? Can we devise a computer algorithm to predict protein structures from their sequences?
Abstract: The protein-folding problem was first posed about one half-century ago. The term refers to three broad questions: (i) What is the physical code by which an amino acid sequence dictates a protein's native structure? (ii) How can proteins fold so fast? (iii) Can we devise a computer algorithm to predict protein structures from their sequences? We review progress on these problems. In a few cases, computer simulations of the physical forces in chemically detailed models have now achieved the accurate folding of small proteins. We have learned that proteins fold rapidly because random thermal motions cause conformational changes leading energetically downhill toward the native structure, a principle that is captured in funnel-shaped energy landscapes. And thanks in part to the large Protein Data Bank of known structures, predicting protein structures is now far more successful than was thought possible in the early days. What began as three questions of basic science one half-century ago has now grown into the full-fledged research field of protein physical science.

1,279 citations


Journal ArticleDOI
TL;DR: A statistical framework to describe and compare environmental niches from occurrence and spatial environmental data and shows that niche overlap can be accurately detected with the framework when variables driving the distributions are known.
Abstract: Aim Concerns over how global change will influence species distributions, in conjunction with increased emphasis on understanding niche dynamics in evolutionary and community contexts, highlight the growing need for robust methods to quantify niche differences between or within taxa. We propose a statistical framework to describe and compare environmental niches from occurrence and spatial environmental data. Location Europe, North America and South America. Methods The framework applies kernel smoothers to densities of species occurrence in gridded environmental space to calculate metrics of niche overlap and test hypotheses regarding niche conservatism. We use this framework and simulated species with pre-defined distributions and amounts of niche overlap to evaluate several ordination and species distribution modelling techniques for quantifying niche overlap. We illustrate the approach with data on two well-studied invasive species. Results We show that niche overlap can be accurately detected with the framework when variables driving the distributions are known. The method is robust to known and previously undocumented biases related to the dependence of species occurrences on the frequency of environmental conditions that occur across geographical space. The use of a kernel smoother makes the process of moving from geographical space to multivariate environmental space independent of both sampling effort and arbitrary choice of resolution in environmental space. However, the use of ordination and species distribution model techniques for selecting, combining and weighting variables on which niche overlap is calculated provides contrasting results. Main conclusions The framework meets the increasing need for robust methods to quantify niche differences. It is appropriate for studying niche differences between species, subspecies or intra-specific lineages that differ in their geographical distributions.
Alternatively, it can be used to measure the degree to which the environmental niche of a species or intra-specific lineage has changed over time.
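The framework's core computation, smoothing occurrences onto a grid in environmental space and comparing the normalized densities, can be sketched as follows. This is an illustrative reimplementation, not the authors' code: the Gaussian smoother, bandwidth `h`, grid bounds, and the two toy species are all assumptions, and Schoener's D is one common overlap metric computed on such densities.

```python
import numpy as np

def kernel_density(occ, grid_pts, h=0.5):
    """Gaussian kernel smoother: occurrence density at each grid cell.
    occ: (n, 2) occurrences in environmental space; grid_pts: (m, 2)."""
    d2 = ((grid_pts[None, :, :] - occ[:, None, :]) ** 2).sum(-1)
    z = np.exp(-d2 / (2.0 * h * h)).sum(0)
    return z / z.sum()  # normalize so densities are comparable across species

def schoener_d(occ1, occ2, grid_pts, h=0.5):
    """Schoener's D niche-overlap metric on kernel-smoothed densities; D in [0, 1]."""
    z1 = kernel_density(occ1, grid_pts, h)
    z2 = kernel_density(occ2, grid_pts, h)
    return 1.0 - 0.5 * np.abs(z1 - z2).sum()

rng = np.random.default_rng(0)
# two hypothetical species in a 2-D environmental space (e.g. temperature, precipitation)
sp_a = rng.normal([0.0, 0.0], 1.0, size=(200, 2))
sp_b = rng.normal([5.0, 5.0], 1.0, size=(200, 2))
xx, yy = np.mgrid[-4:9:60j, -4:9:60j]
grid_pts = np.column_stack([xx.ravel(), yy.ravel()])
d_same = schoener_d(sp_a, sp_a, grid_pts)   # identical niches -> D = 1
d_apart = schoener_d(sp_a, sp_b, grid_pts)  # disjoint niches -> D near 0
```

Because both densities are normalized on the same grid before comparison, the overlap value does not depend on how many occurrences were sampled for each species, which is the independence from sampling effort the abstract attributes to the kernel smoother.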

1,095 citations


Journal ArticleDOI
13 Sep 2012-Nature
TL;DR: These findings suggest that tropical protected areas are often intimately linked ecologically to their surrounding habitats, and that a failure to stem broad-scale loss and degradation of such habitats could sharply increase the likelihood of serious biodiversity declines.
Abstract: The rapid disruption of tropical forests probably imperils global biodiversity more than any other contemporary phenomenon(1-3). With deforestation advancing quickly, protected areas are increasingly becoming final refuges for threatened species and natural ecosystem processes. However, many protected areas in the tropics are themselves vulnerable to human encroachment and other environmental stresses(4-9). As pressures mount, it is vital to know whether existing reserves can sustain their biodiversity. A critical constraint in addressing this question has been that data describing a broad array of biodiversity groups have been unavailable for a sufficiently large and representative sample of reserves. Here we present a uniquely comprehensive data set on changes over the past 20 to 30 years in 31 functional groups of species and 21 potential drivers of environmental change, for 60 protected areas stratified across the world's major tropical regions. Our analysis reveals great variation in reserve 'health': about half of all reserves have been effective or performed passably, but the rest are experiencing an erosion of biodiversity that is often alarmingly widespread taxonomically and functionally. Habitat disruption, hunting and forest-product exploitation were the strongest predictors of declining reserve health. Crucially, environmental changes immediately outside reserves seemed nearly as important as those inside in determining their ecological fate, with changes inside reserves strongly mirroring those occurring around them. These findings suggest that tropical protected areas are often intimately linked ecologically to their surrounding habitats, and that a failure to stem broad-scale loss and degradation of such habitats could sharply increase the likelihood of serious biodiversity declines.

962 citations


Journal ArticleDOI
TL;DR: In this article, it is shown that the ensemble of mass and radius observations can realistically restrict the properties of dense matter and, in particular, the behavior of the nuclear symmetry energy near the nuclear saturation density.
Abstract: Neutron stars are valuable laboratories for the study of dense matter. Recent observations have uncovered both massive and low-mass neutron stars and have also set constraints on neutron star radii. The largest mass measurements are powerfully influencing the high-density equation of state because of the existence of the neutron star maximum mass. The smallest mass measurements, and the distributions of masses, have implications for the progenitors and formation mechanisms of neutron stars. The ensemble of mass and radius observations can realistically restrict the properties of dense matter and, in particular, the behavior of the nuclear symmetry energy near the nuclear saturation density. Simultaneously, various nuclear experiments are progressively restricting the ranges of parameters describing the symmetry properties of the nuclear equation of state. In addition, new theoretical studies of pure neutron matter are providing insights. These observational, experimental, and theoretical constraints of dense matter, when combined, are now revealing a remarkable convergence.

825 citations


Journal ArticleDOI
TL;DR: The Mindboggle-101 dataset is introduced, the largest and most complete set of free, publicly accessible, manually labeled human brain images, and a new cortical labeling protocol that relies on robust anatomical landmarks and minimal manual edits after initialization with automated labels is created.
Abstract: We introduce the Mindboggle-101 dataset, the largest and most complete set of free, publicly accessible, manually labeled human brain images. To manually label the macroscopic anatomy in magnetic resonance images of 101 healthy participants, we created a new cortical labeling protocol that relies on robust anatomical landmarks and minimal manual edits after initialization with automated labels. The “Desikan-Killiany-Tourville” (DKT) protocol is intended to improve the ease, consistency, and accuracy of labeling human cortical areas. Given how difficult it is to label brains, the Mindboggle-101 dataset is intended to serve as a set of brain atlases for use in labeling other brains, as a normative dataset to establish morphometric variation in a healthy population for comparison against clinical populations, and to contribute to the development, training, testing, and evaluation of automated registration and labeling algorithms. To this end, we also introduce benchmarks for the evaluation of such algorithms by comparing our manual labels with labels automatically generated by probabilistic and multi-atlas registration-based approaches. All data, related software, and updated information are available at http://www.mindboggle.info/data/.

806 citations


Proceedings ArticleDOI
10 Dec 2012
TL;DR: A principled understanding of bit-rate adaptation is presented and a suite of techniques that can systematically guide the tradeoffs between stability, fairness, and efficiency are developed, which lead to a general framework for robust video adaptation.
Abstract: Many commercial video players rely on bitrate adaptation logic to adapt the bitrate in response to changing network conditions. Past measurement studies have identified issues with today's commercial players with respect to three key metrics---efficiency, fairness, and stability---when multiple bitrate-adaptive players share a bottleneck link. Unfortunately, our current understanding of why these effects occur and how they can be mitigated is quite limited. In this paper, we present a principled understanding of bitrate adaptation and analyze several commercial players through the lens of an abstract player model. Through this framework, we identify the root causes of several undesirable interactions that arise as a consequence of overlaying the video bitrate adaptation over HTTP. Building on these insights, we develop a suite of techniques that can systematically guide the tradeoffs between stability, fairness and efficiency and thus lead to a general framework for robust video adaptation. We pick one concrete instance from this design space and show that it significantly outperforms today's commercial players on all three key metrics across a range of experimental scenarios.
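One stabilizing choice from this design space can be sketched as a harmonic-mean throughput estimate feeding a conservative bitrate pick. This is an illustrative sketch, not the players or the specific scheme evaluated in the paper; the bitrate ladder and safety factor below are assumptions.

```python
def harmonic_mean(samples_kbps):
    """Harmonic-mean throughput estimate: dominated by the smaller samples,
    so transient bandwidth spikes are damped (illustrative choice)."""
    return len(samples_kbps) / sum(1.0 / s for s in samples_kbps)

def pick_bitrate(throughput_kbps, ladder_kbps, safety=0.85):
    """Pick the highest available bitrate below a safety fraction of the
    estimate; fall back to the lowest rung when nothing fits."""
    feasible = [b for b in ladder_kbps if b <= safety * throughput_kbps]
    return max(feasible) if feasible else min(ladder_kbps)

est = harmonic_mean([1000, 1000, 4000])        # spike damped: ~1333 kbps
rate = pick_bitrate(est, [350, 600, 1000, 1500])
```

With these samples an arithmetic-mean estimate (2000 kbps) would select the 1500 kbps rung and risk rebuffering once throughput returns to 1000 kbps; the harmonic mean keeps the selection at 1000 kbps, trading a little efficiency for stability.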

Journal ArticleDOI
22 Jun 2012-Cell
TL;DR: A role for p53 in activating necrosis is uncovered and the mitochondrial p53-CypD axis is identified as an important contributor to oxidative stress-induced necrosis and implicates this axis in stroke pathology.

Journal ArticleDOI
TL;DR: In this article, an up-to-date global analysis of solar, atmospheric, reactor, and accelerator neutrino data in the framework of three-neutrino oscillations is presented.
Abstract: We present an up-to-date global analysis of solar, atmospheric, reactor, and accelerator neutrino data in the framework of three-neutrino oscillations. We provide results on the determination of θ13 from global data and discuss the dependence on the choice of reactor fluxes. We study in detail the statistical significance of a possible deviation of θ23 from maximal mixing, the determination of its octant, the ordering of the mass states, and the sensitivity to the CP violating phase, and discuss the role of various complementary data sets in those respects.
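For intuition about the quantities being fitted, the textbook two-flavor vacuum approximation (a drastic simplification of the paper's full three-flavor analysis; the parameter values below are illustrative, not the fit results) can be written as a one-line function:

```python
import math

def p_oscillation(theta, dm2_ev2, L_km, E_GeV):
    """Two-flavor appearance probability in vacuum:
    P = sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV])."""
    return math.sin(2.0 * theta) ** 2 * math.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

# at the first oscillation maximum the probability reaches sin^2(2*theta)
L_max = (math.pi / 2.0) / (1.27 * 2.4e-3)            # km, for dm2 = 2.4e-3 eV^2, E = 1 GeV
p = p_oscillation(math.pi / 4.0, 2.4e-3, L_max, 1.0)  # maximal mixing -> p = 1
```

The global fit constrains the analogous mixing angles (θ13, θ23), mass-squared splittings, their ordering, and the CP phase, using many baselines and energies at once.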

Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, S. Abdel Khalek  +3081 moreInstitutions (197)
TL;DR: A combined search for the Standard Model Higgs boson with the ATLAS experiment at the LHC using datasets corresponding to integrated luminosities from 1.04 fb(-1) to 4.9 fb(-1) of pp collisions is described in this paper.

Journal ArticleDOI
TL;DR: In this paper, the authors contrast the extremes of the correlative-process spectrum of species distribution models with respect to core assumptions, model building and selection strategies, validation, uncertainties, common errors and the questions they are most suited to answer.
Abstract: Within the field of species distribution modelling an apparent dichotomy exists between process-based and correlative approaches, where the processes are explicit in the former and implicit in the latter. However, these intuitive distinctions can become blurred when comparing species distribution modelling approaches in more detail. In this review article, we contrast the extremes of the correlative–process spectrum of species distribution models with respect to core assumptions, model building and selection strategies, validation, uncertainties, common errors and the questions they are most suited to answer. The extremes of such approaches differ clearly in many aspects, such as model building approaches, parameter estimation strategies and transferability. However, they also share strengths and weaknesses. We show that claims of one approach being intrinsically superior to the other are misguided and that they ignore the process–correlation continuum as well as the domains of questions that each approach is addressing. Nonetheless, the application of process-based approaches to species distribution modelling lags far behind more correlative (process-implicit) methods and more research is required to explore their potential benefits. Critical issues for the employment of species distribution modelling approaches are given, together with a guideline for appropriate usage. We close with challenges for future development of process-explicit species distribution models and how they may complement current approaches to study species distributions.

Journal ArticleDOI
TL;DR: The potential for increased risk of cancer recurrence with laparoscopy versus laparotomy was quantified and found to be small, providing accurate information for decision making for women with uterine cancer.
Abstract: Purpose The primary objective was to establish noninferiority of laparoscopy compared with laparotomy for recurrence after surgical staging of uterine cancer. Patients and Methods Patients with clinical stages I to IIA disease were randomly allocated (two to one) to laparoscopy (n = 1,696) versus laparotomy (n = 920) for hysterectomy, salpingo-oophorectomy, pelvic cytology, and pelvic and para-aortic lymphadenectomy. The primary study end point was noninferiority of recurrence-free interval defined as no more than a 40% increase in the risk of recurrence with laparoscopy compared with laparotomy. Results With a median follow-up time of 59 months for 2,181 patients still alive, there were 309 recurrences (210 laparoscopy; 99 laparotomy) and 350 deaths (229 laparoscopy; 121 laparotomy). The estimated hazard ratio for laparoscopy relative to laparotomy was 1.14 (90% lower bound, 0.92; 95% upper bound, 1.46), falling short of the protocol-specified definition of noninferiority. However, the actual recurrence ...

Proceedings ArticleDOI
16 Jun 2012
TL;DR: An effective method for parsing clothing in fashion photographs, an extremely challenging problem due to the large number of possible garment items, variations in configuration, garment appearance, layering, and occlusion is demonstrated.
Abstract: In this paper we demonstrate an effective method for parsing clothing in fashion photographs, an extremely challenging problem due to the large number of possible garment items, variations in configuration, garment appearance, layering, and occlusion. In addition, we provide a large novel dataset and tools for labeling garment items, to enable future research on clothing estimation. Finally, we present intriguing initial results on using clothing estimates to improve pose identification, and demonstrate a prototype application for pose-independent visual garment retrieval.

Journal ArticleDOI
TL;DR: A Malat1 loss-of-function genetic model is characterized that indicates that Malat1 is not essential for mouse pre- and postnatal development, and depletion of Malat1 does not affect global gene expression, splicing factor level and phosphorylation status, or alternative pre-mRNA splicing.

Journal ArticleDOI
Georges Aad, B. Abbott1, Jalal Abdallah2, A. A. Abdelalim3  +3013 moreInstitutions (174)
TL;DR: In this article, detailed measurements of the electron performance of the ATLAS detector at the LHC were reported, using decays of the Z, W and J/psi particles.
Abstract: Detailed measurements of the electron performance of the ATLAS detector at the LHC are reported, using decays of the Z, W and J/psi particles. Data collected in 2010 at √s = 7 TeV are used, corresponding to an integrated luminosity of almost 40 pb(-1). The inter-alignment of the inner detector and the electromagnetic calorimeter, the determination of the electron energy scale and resolution, and the performance in terms of response uniformity and linearity are discussed. The electron identification, reconstruction and trigger efficiencies, as well as the charge misidentification probability, are also presented.

Journal ArticleDOI
Daniele S. M. Alves1, Nima Arkani-Hamed, S. Arora2, Yang Bai1, Matthew Baumgart3, Joshua Berger4, Matthew R. Buckley5, Bart Butler1, Spencer Chang6, Spencer Chang7, Hsin-Chia Cheng7, Clifford Cheung8, R. Sekhar Chivukula9, Won Sang Cho10, R. Cotta1, Mariarosaria D'Alfonso11, Sonia El Hedri1, Rouven Essig12, Jared A. Evans7, Liam Fitzpatrick13, Patrick J. Fox5, Roberto Franceschini14, Ayres Freitas15, James S. Gainer16, James S. Gainer17, Yuri Gershtein2, R. N.C. Gray2, Thomas Gregoire18, Ben Gripaios19, J.F. Gunion7, Tao Han20, Andy Haas1, P. Hansson1, JoAnne L. Hewett1, Dmitry Hits2, Jay Hubisz21, Eder Izaguirre1, Jared Kaplan1, Emanuel Katz13, Can Kilic2, Hyung Do Kim22, Ryuichiro Kitano23, Sue Ann Koay11, Pyungwon Ko24, David Krohn25, Eric Kuflik26, Ian M. Lewis20, Mariangela Lisanti27, Tao Liu11, Zhen Liu20, Ran Lu26, Markus A. Luty7, Patrick Meade12, David E. Morrissey28, Stephen Mrenna5, Mihoko M. Nojiri, Takemichi Okui29, Sanjay Padhi30, Michele Papucci31, Michael Park2, Myeonghun Park32, Maxim Perelstein4, Michael E. Peskin1, Daniel J. Phalen7, Keith Rehermann33, Vikram Rentala34, Vikram Rentala35, Tuhin S. Roy36, Joshua T. Ruderman27, Veronica Sanz37, Martin Schmaltz13, S. Schnetzer2, Philip Schuster38, Pedro Schwaller39, Pedro Schwaller40, Pedro Schwaller17, Matthew D. Schwartz25, Ariel Schwartzman1, Jing Shao21, J. Shelton41, David Shih2, Jing Shu10, Daniel Silverstein1, Elizabeth H. Simmons9, Sunil Somalwar2, Michael Spannowsky6, Christian Spethmann13, Matthew J. Strassler2, Shufang Su34, Shufang Su35, Tim M. P. Tait34, Brooks Thomas42, Scott Thomas2, Natalia Toro38, Tomer Volansky8, Jay G. Wacker1, Wolfgang Waltenberger43, Itay Yavin44, Felix Yu34, Yue Zhao2, Kathryn M. Zurek26 
TL;DR: A collection of simplified models relevant to the design of new-physics searches at the Large Hadron Collider (LHC) and the characterization of their results is presented in this paper.
Abstract: This document proposes a collection of simplified models relevant to the design of new-physics searches at the Large Hadron Collider (LHC) and the characterization of their results. Both ATLAS and CMS have already presented some results in terms of simplified models, and we encourage them to continue and expand this effort, which supplements both signature-based results and benchmark model interpretations. A simplified model is defined by an effective Lagrangian describing the interactions of a small number of new particles. Simplified models can equally well be described by a small number of masses and cross-sections. These parameters are directly related to collider physics observables, making simplified models a particularly effective framework for evaluating searches and a useful starting point for characterizing positive signals of new physics. This document serves as an official summary of the results from the 'Topologies for Early LHC Searches' workshop, held at SLAC in September of 2010, the purpose of which was to develop a set of representative models that can be used to cover all relevant phase space in experimental searches. Particular emphasis is placed on searches relevant for the first ~50-500 pb(-1) of data and those motivated by supersymmetric models. This note largely summarizes material posted at http://lhcnewphysics.org/, which includes simplified model definitions, Monte Carlo material, and supporting contacts within the theory community. We also comment on future developments that may be useful as more data is gathered and analyzed by the experiments.

Proceedings ArticleDOI
16 Jun 2012
TL;DR: A complex human activity dataset depicting two person interactions, including synchronized video, depth and motion capture data is created, and techniques related to Multiple Instance Learning (MIL) are explored, finding that the MIL based classifier outperforms SVMs when the sequences extend temporally around the interaction of interest.
Abstract: Human activity recognition has potential to impact a wide range of applications from surveillance to human computer interfaces to content based video retrieval. Recently, the rapid development of inexpensive depth sensors (e.g. Microsoft Kinect) provides adequate accuracy for real-time full-body human tracking for activity recognition applications. In this paper, we create a complex human activity dataset depicting two person interactions, including synchronized video, depth and motion capture data. Moreover, we use our dataset to evaluate various features typically used for indexing and retrieval of motion capture data, in the context of real-time detection of interaction activities via Support Vector Machines (SVMs). Experimentally, we find that the geometric relational features based on distance between all pairs of joints outperforms other feature choices. For whole sequence classification, we also explore techniques related to Multiple Instance Learning (MIL) in which the sequence is represented by a bag of body-pose features. We find that the MIL based classifier outperforms SVMs when the sequences extend temporally around the interaction of interest.
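The best-performing feature in this comparison, distances between all pairs of joints, is simple to compute per frame. A minimal sketch, assuming skeletons arrive as (n_joints, 3) arrays of 3-D joint positions (the array layout and joint count are assumptions):

```python
import numpy as np

def pairwise_joint_distances(skeleton):
    """Relational pose feature: Euclidean distance between every pair of
    joints in a single frame. skeleton: (n_joints, 3) array."""
    diff = skeleton[:, None, :] - skeleton[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    i, j = np.triu_indices(skeleton.shape[0], k=1)
    return dist[i, j]  # feature vector of length n*(n-1)/2

# a 3-joint toy skeleton forming a 3-4-5 right triangle
skel = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0], [0.0, 4.0, 0.0]])
feats = pairwise_joint_distances(skel)  # -> [3.0, 4.0, 5.0]
```

A 15-joint skeleton yields a 105-dimensional distance vector per person per frame; such frame-level vectors can feed an SVM directly, or be pooled into a bag of body-pose features for the MIL formulation described above.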

Journal ArticleDOI
TL;DR: The global patterns of terrestrial Corg preservation reflect broadly different roles for passive and active margin systems in the sedimentary Corg cycle.
Abstract: Understanding the fate of terrestrial organic carbon (Corg) delivered to oceans by rivers is critical for constraining models of biogeochemical cycling and Earth surface evolution. Corg fate is dependent on both intrinsic characteristics (molecular structure, matrix) and the environmental conditions to which fluvial Corg is subjected. Three distinct patterns are evident on continental margins supplied by rivers: (a) high-energy, mobile muds with enhanced oxygen exposure and efficient metabolite exchange have very low preservation of both terrestrial and marine Corg (e.g., Amazon subaqueous delta); (b) low-energy facies with extreme accumulation have high Corg preservation (e.g., Ganges-Brahmaputra); and (c) small, mountainous river systems that sustain average accumulation rates but deliver a large fraction of low-reactivity, fossil Corg in episodic events have the highest preservation efficiencies. The global patterns of terrestrial Corg preservation reflect broadly different roles for passive and active margin systems in the sedimentary Corg cycle.

Journal ArticleDOI
06 Sep 2012-Nature
TL;DR: A niche cell–signal–receptor trio and a local circuitry mechanism that control the activation and self-renewal mode of quiescent adult neural stem cells in response to neuronal activity and experience are identified.
Abstract: Adult neurogenesis arises from neural stem cells within specialized niches. Neuronal activity and experience, presumably acting on this local niche, regulate multiple stages of adult neurogenesis, from neural progenitor proliferation to new neuron maturation, synaptic integration and survival. It is unknown whether local neuronal circuitry has a direct impact on adult neural stem cells. Here we show that, in the adult mouse hippocampus, nestin-expressing radial glia-like quiescent neural stem cells (RGLs) respond tonically to the neurotransmitter γ-aminobutyric acid (GABA) by means of γ2-subunit-containing GABAA receptors. Clonal analysis of individual RGLs revealed a rapid exit from quiescence and enhanced symmetrical self-renewal after conditional deletion of γ2. RGLs are in close proximity to terminals expressing 67-kDa glutamic acid decarboxylase (GAD67) of parvalbumin-expressing (PV+) interneurons and respond tonically to GABA released from these neurons. Functionally, optogenetic control of the activity of dentate PV+ interneurons, but not that of somatostatin-expressing or vasoactive intestinal polypeptide (VIP)-expressing interneurons, can dictate the RGL choice between quiescence and activation. Furthermore, PV+ interneuron activation restores RGL quiescence after social isolation, an experience that induces RGL activation and symmetrical division. Our study identifies a niche cell–signal–receptor trio and a local circuitry mechanism that control the activation and self-renewal mode of quiescent adult neural stem cells in response to neuronal activity and experience.

Journal ArticleDOI
20 Dec 2012-PLOS ONE
TL;DR: FastUniq is presented as a fast de novo tool for removal of duplicates in paired short reads from next-generation sequencing platforms and results in highly efficient running time, which increases linearly at an average speed of 87 million reads per 10 minutes.
Abstract: The presence of duplicates introduced by PCR amplification is a major issue in paired short reads from next-generation sequencing platforms. These duplicates might have a serious impact on research applications, such as scaffolding in whole-genome sequencing and discovering large-scale genome variations, and are usually removed. We present FastUniq as a fast de novo tool for removal of duplicates in paired short reads. FastUniq identifies duplicates by comparing sequences between read pairs and does not require complete genome sequences as prerequisites. FastUniq is capable of simultaneously handling reads with different lengths and results in highly efficient running time, which increases linearly at an average speed of 87 million reads per 10 minutes. FastUniq is freely available at http://sourceforge.net/projects/fastuniq/.
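The comparison-based idea described above (identify a duplicate when both mates of a read pair match a previously seen pair, with no reference genome needed) can be sketched in a few lines. This is a toy illustration only, not FastUniq's actual implementation (the tool is written in C, operates on FASTQ files, and sorts pairs for speed and memory efficiency); the function name is ours.

```python
def dedupe_pairs(pairs):
    """Reference-free duplicate removal for paired reads (illustrative only).

    A pair is kept only the first time its (mate1, mate2) sequence
    combination is seen; later identical pairs are treated as PCR
    duplicates and dropped.
    """
    seen = set()
    unique = []
    for r1, r2 in pairs:
        key = (r1, r2)  # compare both mates' sequences jointly
        if key not in seen:
            seen.add(key)
            unique.append((r1, r2))
    return unique
```

Because the key is the sequence pair itself, no genome alignment is required, matching the de novo character of the published tool.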

Proceedings Article
23 Apr 2012
TL;DR: A novel generation system that composes humanlike descriptions of images from computer vision detections by leveraging syntactically informed word co-occurrence statistics and automatically generating some of the most natural image descriptions to date.
Abstract: This paper introduces a novel generation system that composes humanlike descriptions of images from computer vision detections. By leveraging syntactically informed word co-occurrence statistics, the generator filters and constrains the noisy detections output from a vision system to generate syntactic trees that detail what the computer vision system sees. Results show that the generation system outperforms state-of-the-art systems, automatically generating some of the most natural image descriptions to date.
Proceedings Article
08 Jul 2012
TL;DR: This paper investigates syntactic stylometry for deception detection, adding a somewhat unconventional angle to prior literature and demonstrating that features derived from Context Free Grammar (CFG) parse trees consistently improve the detection performance over several baselines that are based only on shallow lexico-syntactic features.
Abstract: Most previous studies in computerized deception detection have relied only on shallow lexico-syntactic patterns. This paper investigates syntactic stylometry for deception detection, adding a somewhat unconventional angle to prior literature. Over four different datasets spanning from the product review to the essay domain, we demonstrate that features derived from Context Free Grammar (CFG) parse trees consistently improve the detection performance over several baselines that are based only on shallow lexico-syntactic features. Our results improve the best published result on the hotel review data (Ott et al., 2011) reaching 91.2% accuracy with 14% error reduction.
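The core feature type in such work is the unlexicalized CFG production rule (e.g. "S -> NP VP") counted over each document's parse trees. A minimal sketch under our own assumptions (a toy bracketed-tree reader and rule counter; the paper itself uses trees from a statistical parser, and the helper names here are hypothetical):

```python
from collections import Counter

def parse_tree(s):
    """Read a Penn-Treebank-style bracketed string into (label, children) tuples."""
    tokens = s.replace("(", " ( ").replace(")", " ) ").split()

    def helper(i):
        # tokens[i] is "(" ; tokens[i + 1] is the node label
        label = tokens[i + 1]
        i += 2
        children = []
        while tokens[i] != ")":
            if tokens[i] == "(":
                child, i = helper(i)
                children.append(child)
            else:
                children.append(tokens[i])  # leaf word
                i += 1
        return (label, children), i + 1

    tree, _ = helper(0)
    return tree

def production_features(tree, feats=None):
    """Count unlexicalized production rules; words are collapsed to <w>."""
    if feats is None:
        feats = Counter()
    label, children = tree
    rhs = [c[0] if isinstance(c, tuple) else "<w>" for c in children]
    feats[label + " -> " + " ".join(rhs)] += 1
    for c in children:
        if isinstance(c, tuple):
            production_features(c, feats)
    return feats
```

The resulting counts can feed any standard classifier; the point of the paper is that these deeper syntactic features add signal beyond shallow lexico-syntactic ones.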

Journal ArticleDOI
Georges Aad1, Brad Abbott2, J. Abdallah3, S. Abdel Khalek4  +3073 moreInstitutions (193)
TL;DR: In this paper, a Fourier analysis of the charged particle pair distribution in relative azimuthal angle (Delta phi = phi(a) - phi(b)) is performed to extract the coefficients v(n,n) = &lt;cos(n Delta phi)&gt;.
Abstract: Differential measurements of charged particle azimuthal anisotropy are presented for lead-lead collisions at root sNN = 2.76 TeV with the ATLAS detector at the LHC, based on an integrated luminosity of approximately 8 mu b(-1). This anisotropy is characterized via a Fourier expansion of the distribution of charged particles in azimuthal angle relative to the reaction plane, with the coefficients v(n) denoting the magnitude of the anisotropy. Significant v(2)-v(6) values are obtained as a function of transverse momentum (0.5 &lt; p(T) &lt; 20 GeV), pseudorapidity (|eta| &lt; 2.5), and centrality using an event plane method. The v(n) values for n &gt;= 3 are found to vary weakly with both eta and centrality, and their p(T) dependencies are found to follow an approximate scaling relation, v(n)(1/n)(p(T)) proportional to v(2)(1/2)(p(T)), except in the top 5% most central collisions. A Fourier analysis of the charged particle pair distribution in relative azimuthal angle (Delta phi = phi(a) - phi(b)) is performed to extract the coefficients v(n,n) = &lt;cos(n Delta phi)&gt;. For pairs of charged particles with a large pseudorapidity gap (|Delta eta| = |eta(a) - eta(b)| &gt; 2) and one particle with p(T) &lt; 3 GeV, the v(2,2)-v(6,6) values are found to factorize as v(n,n)(p(T)(a), p(T)(b)) approximate to v(n)(p(T)(a))v(n)(p(T)(b)) in central and midcentral events. Such factorization suggests that these values of v(2,2)-v(6,6) are primarily attributable to the response of the created matter to the fluctuations in the geometry of the initial state. A detailed study shows that the v(1,1)(p(T)(a), p(T)(b)) data are consistent with the combined contributions from a rapidity-even v(1) and global momentum conservation. A two-component fit is used to extract the v(1) contribution. The extracted v(1) is observed to cross zero at p(T) approximate to 1.0 GeV, reaches a maximum at 4-5 GeV with a value comparable to that for v(3), and decreases at higher p(T).
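The two-particle definition used here, v(n,n) = &lt;cos(n Delta phi)&gt;, and the factorization assumption v(n,n) = v(n)*v(n) can be illustrated with a small numeric sketch. This is a toy pair-average over azimuthal angles, not the ATLAS analysis (which uses event-plane methods, pseudorapidity gaps, and per-event weighting); the function names are ours.

```python
import math

def vnn(phis_a, phis_b, n):
    """Two-particle Fourier coefficient v(n,n) = <cos(n * (phi_a - phi_b))>,
    averaged over all pairs (including trivial same-angle pairs in this toy)."""
    total = 0.0
    count = 0
    for pa in phis_a:
        for pb in phis_b:
            total += math.cos(n * (pa - pb))
            count += 1
    return total / count

def vn_from_pairs(phis, n):
    """If v(n,n) factorizes as v(n)*v(n), the single-particle harmonic
    follows as v(n) = sqrt(v(n,n)); clamp tiny negative noise to zero."""
    c = vnn(phis, phis, n)
    return math.sqrt(max(c, 0.0))
```

An isotropic set of angles gives v(n,n) near zero, while fully aligned angles give v(n,n) = 1, matching the intuition that v(n,n) measures pairwise azimuthal correlation.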

Journal ArticleDOI
TL;DR: In this article, bifurcation analysis is used in conjunction with a constitutive model to predict the onset of strain localization, yielding predictions in qualitative agreement with the laboratory data.

Journal ArticleDOI
28 Sep 2012-Cell
TL;DR: In a late cellular differentiation process, Foxp3 defines Treg cell functionality in an "opportunistic" manner by largely exploiting the preformed enhancer network instead of establishing a new enhancer landscape.