
Showing papers by "Humboldt University of Berlin" published in 2014


Journal ArticleDOI
TL;DR: Landsat 8, a NASA and USGS collaboration, acquires global moderate-resolution measurements of the Earth's terrestrial and polar regions in the visible, near-infrared, shortwave infrared, and thermal infrared.

1,697 citations


Journal ArticleDOI
M. G. Aartsen, Markus Ackermann, Jenni Adams, Juanan Aguilar, +299 more (41 institutions)
TL;DR: Results from an analysis with a third year of data from the complete IceCube detector are consistent with the previously reported astrophysical flux in the 100 TeV-PeV range at the level of 10(-8) GeV cm-2 s-1 sr-1 per flavor and reject a purely atmospheric explanation for the combined three-year data at 5.7σ.
Abstract: A search for high-energy neutrinos interacting within the IceCube detector between 2010 and 2012 provided the first evidence for a high-energy neutrino flux of extraterrestrial origin. Results from an analysis using the same methods with a third year (2012-2013) of data from the complete IceCube detector are consistent with the previously reported astrophysical flux in the 100 TeV-PeV range at the level of 10(-8) GeV cm(-2) s(-1) sr(-1) per flavor and reject a purely atmospheric explanation for the combined three-year data at 5.7 sigma. The data are consistent with expectations for equal fluxes of all three neutrino flavors and with isotropic arrival directions, suggesting either numerous or spatially extended sources. The three-year data set, with a live time of 988 days, contains a total of 37 neutrino candidate events with deposited energies ranging from 30 to 2000 TeV. The 2000-TeV event is the highest-energy neutrino interaction ever observed.

1,183 citations


Journal ArticleDOI
30 Jan 2014-Nature
TL;DR: It is found that lncRNAs, in particular ancient ones, are in general actively regulated and may function predominantly in embryonic development, and an evolutionarily conserved co-expression network is reconstructed.
Abstract: Only a very small fraction of long noncoding RNAs (lncRNAs) are well characterized. The evolutionary history of lncRNAs can provide insights into their functionality, but the absence of lncRNA annotations in non-model organisms has precluded comparative analyses. Here we present a large-scale evolutionary study of lncRNA repertoires and expression patterns, in 11 tetrapod species. We identify approximately 11,000 primate-specific lncRNAs and 2,500 highly conserved lncRNAs, including approximately 400 genes that are likely to have originated more than 300 million years ago. We find that lncRNAs, in particular ancient ones, are in general actively regulated and may function predominantly in embryonic development. Most lncRNAs evolve rapidly in terms of sequence and expression levels, but tissue specificities are often conserved. We compared expression patterns of homologous lncRNA and protein-coding families across tetrapods to reconstruct an evolutionarily conserved co-expression network. This network suggests potential functions for lncRNAs in fundamental processes such as spermatogenesis and synaptic transmission, but also in more specific mechanisms such as placenta development through microRNA production.

855 citations


Journal ArticleDOI
01 Aug 2014-Allergy
TL;DR: These guidelines aim to provide evidence‐based recommendations for the recognition, risk factor assessment, and the management of patients who are at risk of, are experiencing, or have experienced anaphylaxis, and to prevent future episodes by developing personalized risk reduction strategies including, where possible, commencing allergen immunotherapy.
Abstract: Anaphylaxis is a clinical emergency, and all healthcare professionals should be familiar with its recognition and acute and ongoing management. These guidelines have been prepared by the European Academy of Allergy and Clinical Immunology (EAACI) Taskforce on Anaphylaxis. They aim to provide evidence-based recommendations for the recognition, risk factor assessment, and the management of patients who are at risk of, are experiencing, or have experienced anaphylaxis. While the primary audience is allergists, these guidelines are also relevant to all other healthcare professionals. The development of these guidelines has been underpinned by two systematic reviews of the literature, both on the epidemiology and on clinical management of anaphylaxis. Anaphylaxis is a potentially life-threatening condition whose clinical diagnosis is based on recognition of a constellation of presenting features. First-line treatment for anaphylaxis is intramuscular adrenaline. Useful second-line interventions may include removing the trigger where possible, calling for help, correct positioning of the patient, high-flow oxygen, intravenous fluids, inhaled short-acting bronchodilators, and nebulized adrenaline. Discharge arrangements should involve an assessment of the risk of further reactions, a management plan with an anaphylaxis emergency action plan, and, where appropriate, prescribing an adrenaline auto-injector. If an adrenaline auto-injector is prescribed, education on when and how to use the device should be provided. Specialist follow-up is essential to investigate possible triggers, to perform a comprehensive risk assessment, and to prevent future episodes by developing personalized risk reduction strategies including, where possible, commencing allergen immunotherapy. Training for the patient and all caregivers is essential. There are still many gaps in the evidence base for anaphylaxis.

827 citations


Journal ArticleDOI
TL;DR: Rhodopsins found in Eukaryotes, Bacteria, and Archaea consist of opsin apoproteins and a covalently linked retinal which is employed to absorb photons for energy conversion or the initiation of intra- or intercellular signaling.
Abstract: Organisms of all domains of life use photoreceptor proteins to sense and respond to light. The light-sensitivity of photoreceptor proteins arises from bound chromophores such as retinal in retinylidene proteins, bilin in biliproteins, and flavin in flavoproteins. Rhodopsins found in Eukaryotes, Bacteria, and Archaea consist of opsin apoproteins and a covalently linked retinal which is employed to absorb photons for energy conversion or the initiation of intra- or intercellular signaling [1]. Both functions are important for organisms to survive and to adapt to the environment. While lower organisms utilize the family of microbial rhodopsins for both purposes, animals solely use a different family of rhodopsins, a specialized subset of G-protein-coupled receptors (GPCRs) [1,2]. Animal rhodopsins, for example, are employed in visual and nonvisual phototransduction, in the maintenance of the circadian clock and as photoisomerases [3,4]. While sharing practically no sequence similarity, microbial and animal rhodopsins, also termed type-I and type-II rhodopsins, respectively, share a common architecture of seven transmembrane α-helices (TM) with the N- and C-terminus facing out- and inside of the cell, respectively (Figure 1) [1,5]. Retinal is attached by a Schiff base linkage to the ε-amino group of a lysine side chain in the middle of TM7 (Figures 1 and 2). The retinal Schiff base (RSB) is protonated (RSBH+) in most cases, and changes in protonation state are integral to the signaling or transport activity of rhodopsins.
Figure 1 caption: Topology of the retinal proteins. (A) These membrane proteins contain seven α-helices (typically denoted helix A to G in microbial opsins and TM1 to 7 in the animal opsins) spanning the lipid bilayer. The N-terminus faces the outside of the cell ...

811 citations


Journal ArticleDOI
TL;DR: The results indicate that most UES studies have been undertaken in Europe, North America, and China, at city scale, but few study findings have been implemented as land use policy.
Abstract: Although a number of comprehensive reviews have examined global ecosystem services (ES), few have focused on studies that assess urban ecosystem services (UES). Given that more than half of the world’ ...

758 citations


Journal ArticleDOI
TL;DR: This paper proposes a smart combination of small cells, joint transmission coordinated multipoint (JT CoMP), and massive MIMO to enhance the spectral efficiency with affordable complexity and shows in measurements with macro-plus-small-cell scenarios that spectral efficiency can be improved by flexible clustering and efficient user selection.
Abstract: 5G will have to support a multitude of new applications with a wide variety of requirements, including higher peak and user data rates, reduced latency, enhanced indoor coverage, increased number of devices, and so on. The expected traffic growth in 10 or more years from now can be satisfied by the combined use of more spectrum, higher spectral efficiency, and densification of cells. The focus of the present article is on advanced techniques for higher spectral efficiency and improved coverage for cell edge users. We propose a smart combination of small cells, joint transmission coordinated multipoint (JT CoMP), and massive MIMO to enhance the spectral efficiency with affordable complexity. We review recent achievements in the transition from theoretical to practical concepts and note future research directions. We show in measurements with macro-plus-small-cell scenarios that spectral efficiency can be improved by flexible clustering and efficient user selection, and that adaptive feedback compression is beneficial to reduce the overhead significantly. Moreover, we show in measurements that fast feedback reporting combined with advanced channel prediction is able to mitigate the impairment effects of JT CoMP.

731 citations


Journal ArticleDOI
Paul M. Thompson, Jason L. Stein, Sarah E. Medland, Derrek P. Hibar, +329 more (96 institutions)
TL;DR: The ENIGMA Consortium has detected factors that affect the brain that no individual site could detect on its own, and that require larger numbers of subjects than any individual neuroimaging study has currently collected.
Abstract: The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience, genetics, and medicine, ENIGMA studies have analyzed neuroimaging data from over 12,826 subjects. In addition, data from 12,171 individuals were provided by the CHARGE consortium for replication of findings, in a total of 24,997 subjects. By meta-analyzing results from many sites, ENIGMA has detected factors that affect the brain that no individual site could detect on its own, and that require larger numbers of subjects than any individual neuroimaging study has currently collected. ENIGMA's first project was a genome-wide association study identifying common variants in the genome associated with hippocampal volume or intracranial volume. Continuing work is exploring genetic associations with subcortical volumes (ENIGMA2) and white matter microstructure (ENIGMA-DTI). Working groups also focus on understanding how schizophrenia, bipolar illness, major depression and attention deficit/hyperactivity disorder (ADHD) affect the brain. We review the current progress of the ENIGMA Consortium, along with challenges and unexpected discoveries made on the way.

713 citations


Journal ArticleDOI
TL;DR: This study shows under what circumstances it is attractive to use Bayesian estimation and how to interpret the results properly, and provides guidelines on how to report on Bayesian statistics.
Abstract: Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate Bayesian methods explained in this study, in a second example a series of studies that examine the theoretical framework of dynamic interactionism are considered. In the Discussion the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided.

540 citations
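The prior-to-posterior updating that this gentle introduction describes can be made concrete with a minimal conjugate sketch. The beta-binomial model below, including the prior parameters and data, is an arbitrary illustration and not an example from the study itself:

```python
# Beta-binomial conjugate updating: a Beta(a, b) prior combined with k
# successes in n Bernoulli trials yields a Beta(a + k, b + n - k) posterior.

def posterior(a, b, k, n):
    """Return the posterior (a, b) of a Beta prior after k successes in n trials."""
    return a + k, b + n - k

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Weakly informative prior Beta(2, 2), then observe 7 successes in 10 trials.
a0, b0 = 2.0, 2.0
a1, b1 = posterior(a0, b0, k=7, n=10)

print(beta_mean(a0, b0))  # prior mean: 0.5
print(beta_mean(a1, b1))  # posterior mean, pulled toward the data: ~0.643
```

The posterior mean sits between the prior mean (0.5) and the sample proportion (0.7), illustrating how prior knowledge and data are weighted against each other, which is exactly the trade-off the article's discussion of prior specification is about.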


Journal ArticleDOI
TL;DR: Macroscopically large crystalline thin films of triazine-based graphitic carbon nitride (TGCN) were grown for the first time using an ionothermal, interfacial reaction starting from the abundant monomer dicyandiamide.
Abstract: Graphitic carbon nitride has been predicted to be structurally analogous to carbon-only graphite, yet with an inherent bandgap. We have grown, for the first time, macroscopically large crystalline thin films of triazine-based, graphitic carbon nitride (TGCN) using an ionothermal, interfacial reaction starting with the abundant monomer dicyandiamide. The films consist of stacked, two-dimensional (2D) crystals between a few and several hundreds of atomic layers in thickness. Scanning force and transmission electron microscopy show long-range, in-plane order, while optical spectroscopy, X-ray photoelectron spectroscopy, and density functional theory calculations corroborate a direct bandgap between 1.6 and 2.0 eV. Thus TGCN is of interest for electronic devices, such as field-effect transistors and light-emitting diodes.

531 citations


Journal ArticleDOI
TL;DR: “Raising public awareness about sublingual immunotherapy” is identified as a need for patients, and strategies to increase awareness of allergen immunotherapy (AIT) among patients, the medical community, all healthcare stakeholders, and public opinion are reported in detail.

Journal ArticleDOI
TL;DR: In this paper, the authors present an analysis of UGS provisioning in Berlin, Germany in order to identify distributional inequities between UGS and population which are further discussed in light of variations in user preferences associated with demographics and immigrant status.

Journal ArticleDOI
01 Dec 2014
TL;DR: The overall system architecture design decisions are presented, Stratosphere is introduced through example queries, and the internal workings of the system’s components that relate to extensibility, programming model, optimization, and query execution are examined.
Abstract: We present Stratosphere, an open-source software stack for parallel data analysis. Stratosphere brings together a unique set of features that allow the expressive, easy, and efficient programming of analytical applications at very large scale. Stratosphere's features include "in situ" data processing, a declarative query language, treatment of user-defined functions as first-class citizens, automatic program parallelization and optimization, support for iterative programs, and a scalable and efficient execution engine. Stratosphere covers a variety of "Big Data" use cases, such as data warehousing, information extraction and integration, data cleansing, graph analysis, and statistical analysis applications. In this paper, we present the overall system architecture design decisions, introduce Stratosphere through example queries, and then dive into the internal workings of the system's components that relate to extensibility, programming model, optimization, and query execution. We experimentally compare Stratosphere against popular open-source alternatives, and we conclude with a research outlook for the next years.
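Stratosphere's actual query language and operator API are not reproduced here; as a rough, language-agnostic illustration of the "user-defined functions as first-class citizens" idea the abstract highlights, the following plain-Python sketch hands UDFs to a toy map/reduce-style pipeline:

```python
# Not Stratosphere's API: a plain-Python sketch of a map/reduce-style
# programming model in which user-defined functions (UDFs) are ordinary
# values passed into the execution engine.
from collections import defaultdict

def map_phase(records, udf):
    """Apply a record-to-pairs UDF to every input record."""
    for record in records:
        yield from udf(record)

def reduce_phase(pairs, udf):
    """Group pairs by key and apply an aggregation UDF per group."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: udf(values) for key, values in groups.items()}

# The UDFs: tokenize a line into (word, 1) pairs, then sum counts per word.
tokenize = lambda line: ((word, 1) for word in line.split())
count = sum

lines = ["big data analysis", "parallel data analysis"]
result = reduce_phase(map_phase(lines, tokenize), count)
print(result)  # {'big': 1, 'data': 2, 'analysis': 2, 'parallel': 1}
```

In Stratosphere the analogous UDFs are additionally analyzed by the optimizer and parallelized automatically, which is what the paper means by treating them as first-class citizens rather than opaque black boxes.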

Journal ArticleDOI
TL;DR: The food-pics image database, a picture database comprising 568 food images and 315 non-food images along with detailed meta-data, is developed with the hope that the set will facilitate standardization and comparability across studies and advance experimental research on the determinants of eating behavior.
Abstract: Our current environment is characterized by the omnipresence of food cues. The sight and smell of real foods, but also graphical depictions of appetizing foods, can guide our eating behavior, for example, by eliciting food craving and influencing food choice. The relevance of visual food cues to human information processing has been demonstrated by a growing body of studies employing food images across the disciplines of psychology, medicine, and neuroscience. However, currently used food image sets vary considerably across laboratories, and image characteristics (contrast, brightness, etc.) and food composition (calories, macronutrients, etc.) are often unspecified. These factors might have contributed to some of the inconsistencies of this research. To remedy this, we developed food-pics, a picture database comprising 568 food images and 315 non-food images along with detailed meta-data. A total of N = 1988 individuals with large variance in age and weight from German speaking countries and North America provided normative ratings of valence, arousal, palatability, desire to eat, recognizability, and visual complexity. Furthermore, data on macronutrients (g), energy density (kcal), and physical image characteristics (color composition, contrast, brightness, size, complexity) are provided. The food-pics image database is freely available under the Creative Commons license with the hope that the set will facilitate standardization and comparability across studies and advance experimental research on the determinants of eating behavior.

Journal ArticleDOI
J. P. Lees, V. Poireau, V. Tisserand, E. Grauges, +308 more (73 institutions)
TL;DR: In this article, the authors present a search for a dark photon in the reaction e^{+}e^{-}→γA^{'}, A^{'}→e^{+}e^{-}, μ^{+}μ^{-} using BABAR data, setting 90% confidence level upper limits on the mixing strength between the photon and dark photon.
Abstract: Dark sectors charged under a new Abelian interaction have recently received much attention in the context of dark matter models. These models introduce a light new mediator, the so-called dark photon (A^{'}), connecting the dark sector to the standard model. We present a search for a dark photon in the reaction e^{+}e^{-}→γA^{'}, A^{'}→e^{+}e^{-}, μ^{+}μ^{-} using 514 fb^{-1} of data collected with the BABAR detector. We observe no statistically significant deviations from the standard model predictions, and we set 90% confidence level upper limits on the mixing strength between the photon and dark photon at the level of 10^{-4}-10^{-3} for dark photon masses in the range 0.02-10.2 GeV. We further constrain the range of the parameter space favored by interpretations of the discrepancy between the calculated and measured anomalous magnetic moment of the muon.
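The "90% confidence level upper limits" quoted above are a standard statistical construction for counting experiments. The sketch below is a deliberately simplified version (a plain Poisson upper limit solved by bisection, with illustrative numbers), not the BABAR analysis, which uses a far more elaborate fit:

```python
# Illustrative only (not the BABAR analysis): a 90% CL upper limit on a
# signal count s in a Poisson counting experiment with known background b,
# defined by P(N <= n_obs | b + s) = 0.10 and solved by bisection.
import math

def poisson_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return sum(mu**k * math.exp(-mu) / math.factorial(k) for k in range(n + 1))

def upper_limit(n_obs, b, cl=0.90, hi=1000.0):
    """Largest signal s still compatible with n_obs at confidence level cl."""
    lo_s, hi_s = 0.0, hi
    for _ in range(200):  # bisection: the CDF is monotone decreasing in s
        mid = 0.5 * (lo_s + hi_s)
        if poisson_cdf(n_obs, b + mid) > 1.0 - cl:
            lo_s = mid  # s too small: data still too likely
        else:
            hi_s = mid
    return 0.5 * (lo_s + hi_s)

# With zero background and zero observed events, the classic limit is ~2.30.
print(round(upper_limit(n_obs=0, b=0.0), 2))
```

The same logic, generalized to a fit of the dimuon or dielectron mass spectrum, underlies limit-setting in searches like this one: the reported bound is the largest signal strength the observed data cannot exclude at the stated confidence level.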

Journal ArticleDOI
Adrian John Bevan, B. Golob, Th. Mannel, S. Prell, +2061 more (171 institutions)
TL;DR: The physics of the SLAC and KEK B Factories is described in this paper, with a brief description of the detectors, BaBar and Belle, and data-taking related issues.
Abstract: This work is on the Physics of the B Factories. Part A of this book contains a brief description of the SLAC and KEK B Factories as well as their detectors, BaBar and Belle, and data taking related issues. Part B discusses tools and methods used by the experiments in order to obtain results. The results themselves can be found in Part C.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the biophysical effects of temperate land-management changes and revealed a net warming effect of similar magnitude to that driven by changing land cover, and found that potential surface cooling from increased albedo is typically offset by warming from decreased sensible heat fluxes.
Abstract: The direct effects of land-cover change on surface climate are increasingly well understood, but fewer studies have investigated the consequences of the trend towards more intensive land management practices. Now, research investigating the biophysical effects of temperate land-management changes reveals a net warming effect of similar magnitude to that driven by changing land cover. Anthropogenic changes to land cover (LCC) remain common, but continuing land scarcity promotes the widespread intensification of land management changes (LMC) to better satisfy societal demand for food, fibre, fuel and shelter [1]. The biophysical effects of LCC on surface climate are largely understood [2,3,4,5], particularly for the boreal [6] and tropical zones [7], but fewer studies have investigated the biophysical consequences of LMC; that is, anthropogenic modification without a change in land cover type. Harmonized analysis of ground measurements and remote sensing observations of both LCC and LMC revealed that, in the temperate zone, potential surface cooling from increased albedo is typically offset by warming from decreased sensible heat fluxes, with the net effect being a warming of the surface. Temperature changes from LMC and LCC were of the same magnitude, and averaged 2 K at the vegetation surface and were estimated at 1.7 K in the planetary boundary layer. Given the spatial extent of land management (42–58% of the land surface) this calls for increasing the efforts to integrate land management in Earth System Science to better take into account the human impact on the climate [8].


Journal ArticleDOI
TL;DR: In this paper, a framework has been proposed that distinguishes between the integration (land sharing) and separation (land sparing) of conservation and production to address the challenges of biodiversity conservation and commodity production.
Abstract: To address the challenges of biodiversity conservation and commodity production, a framework has been proposed that distinguishes between the integration (land sharing) and separation (land sparing) of conservation and production. Controversy has arisen around this framework partly because many scholars have focused specifically on food production rather than more encompassing notions such as land scarcity or food security. Controversy further surrounds the practical value of partial trade-off analyses, the ways in which biodiversity should be quantified, and a series of scale effects that are not readily accounted for. We see key priorities for the future in (1) addressing these issues when using the existing framework, and (2) developing alternative, holistic ways to conceptualise challenges related to food, biodiversity, and land scarcity.

Journal ArticleDOI
TL;DR: It is found in numerical simulations of artificially generated power grids that tree-like connection schemes--so-called dead ends and dead trees--strongly diminish stability, which may indicate a topological design principle for future power grids: avoid dead ends.
Abstract: The cheapest and thus widespread way to add new generators to a high-voltage power grid is by a simple tree-like connection scheme. However, it is not entirely clear how such locally cost-minimizing connection schemes affect overall system performance, in particular the stability against blackouts. Here we investigate how local patterns in the network topology influence a power grid's ability to withstand blackout-prone large perturbations. Employing basin stability, a nonlinear concept, we find in numerical simulations of artificially generated power grids that tree-like connection schemes--so-called dead ends and dead trees--strongly diminish stability. A case study of the Northern European power system confirms this result and demonstrates that the inverse is also true: repairing dead ends by addition of a few transmission lines substantially enhances stability. This may indicate a topological design principle for future power grids: avoid dead ends.
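The basin-stability measure used in this study is, at its core, a Monte Carlo estimate: draw random large perturbations, integrate the swing dynamics, and count the fraction of trajectories that return to synchrony. The sketch below does this for a single machine against an infinite bus, with illustrative parameter values rather than the paper's multi-node grid model:

```python
# Monte Carlo basin stability for a single-machine swing equation
# (illustrative parameters, not the paper's network model).
import math
import random

ALPHA, P, K = 0.1, 1.0, 8.0  # damping, injected power, coupling strength

def returns_to_sync(delta0, omega0, dt=0.01, steps=20000):
    """Euler-integrate delta'' = -ALPHA*delta' + P - K*sin(delta) and report
    whether the trajectory settles onto the stable fixed point."""
    delta_star = math.asin(P / K)  # stable operating point, omega = 0
    d, w = delta0, omega0
    for _ in range(steps):
        d += dt * w
        w += dt * (-ALPHA * w + P - K * math.sin(d))
    d_err = (d - delta_star + math.pi) % (2 * math.pi) - math.pi  # wrap phase
    return abs(d_err) < 0.1 and abs(w) < 0.1

def basin_stability(n=200, seed=42):
    """Fraction of random large perturbations that return to the fixed point."""
    rng = random.Random(seed)
    hits = sum(
        returns_to_sync(rng.uniform(-math.pi, math.pi), rng.uniform(-10.0, 10.0))
        for _ in range(n)
    )
    return hits / n

bs = basin_stability()
print(f"estimated basin stability: {bs:.2f}")  # 0 = fragile, 1 = robust
```

In the paper this per-node fraction is computed for every generator in artificially grown network topologies, which is how dead ends and dead trees are identified as stability sinks.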

Journal ArticleDOI
TL;DR: Together, these studies provide extensive and compelling evidence that the DIAMONDS taxonomy is useful for organizing major dimensions of situation characteristics.
Abstract: Taxonomies of person characteristics are well developed, whereas taxonomies of psychologically important situation characteristics are underdeveloped. A working model of situation perception implies the existence of taxonomizable dimensions of psychologically meaningful, important, and consequential situation characteristics tied to situation cues, goal affordances, and behavior. Such dimensions are developed and demonstrated in a multi-method set of 6 studies. First, the "Situational Eight DIAMONDS" dimensions Duty, Intellect, Adversity, Mating, pOsitivity, Negativity, Deception, and Sociality (Study 1) are established from the Riverside Situational Q-Sort (Sherman, Nave, & Funder, 2010, 2012, 2013; Wagerman & Funder, 2009). Second, their rater agreement (Study 2) and associations with situation cues and goal/trait affordances (Studies 3 and 4) are examined. Finally, the usefulness of these dimensions is demonstrated by examining their predictive power for behavior (Study 5), particularly vis-à-vis measures of personality and situations (Study 6). Together, we provide extensive and compelling evidence that the DIAMONDS taxonomy is useful for organizing major dimensions of situation characteristics. We discuss the DIAMONDS taxonomy in the context of previous taxonomic approaches and sketch future research directions.

Journal ArticleDOI
TL;DR: An account of the mathematical and physical foundations of criticality is provided and recent experimental studies are reviewed with the aim of identifying important next steps to be taken and connections to other fields that should be explored.
Abstract: The neural criticality hypothesis states that the brain may be poised in a critical state at a boundary between different types of dynamics. Theoretical and experimental studies show that critical systems often exhibit optimal computational properties, suggesting the possibility that criticality has been evolutionarily selected as a useful trait for our nervous system. Evidence for criticality has been found in cell cultures, brain slices, and anesthetized animals. Yet, inconsistent results were reported for recordings in awake animals and humans, and current results point to open questions about the exact nature and mechanism of criticality, as well as its functional role. Therefore, the criticality hypothesis has remained a controversial proposition. Here, we provide an account of the mathematical and physical foundations of criticality. In the light of this conceptual framework, we then review and discuss recent experimental studies with the aim of identifying important next steps to be taken and connections to other fields that should be explored.
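A standard toy formalization behind the criticality hypothesis (a common model in this literature, not code from the review itself) is a branching process whose branching parameter σ sets the regime: subcritical for σ < 1 and critical at σ = 1, where avalanche sizes become heavy-tailed. A dependency-free sketch:

```python
# Branching-process toy model of neural avalanches: each active unit triggers
# a Poisson(sigma) number of descendants in the next time step.
import math
import random

def poisson(lam, rng):
    """Poisson draw by inversion (Knuth's method); fine for small lam."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def avalanche_size(sigma, rng, cap=10_000):
    """Total number of activations started from a single seed unit."""
    active, total = 1, 1
    while active and total < cap:
        active = sum(poisson(sigma, rng) for _ in range(active))
        total += active
    return min(total, cap)

rng = random.Random(0)
sub = [avalanche_size(0.5, rng) for _ in range(2000)]   # subcritical regime
crit = [avalanche_size(1.0, rng) for _ in range(2000)]  # critical regime
print(sum(sub) / len(sub), sum(crit) / len(crit))
# Subcritical sizes stay small (theoretical mean 1/(1 - sigma) = 2), while
# critical avalanches are dominated by rare, very large events.
```

The empirical signature reviewed in the article, power-law avalanche size distributions in neural recordings, is exactly what this process produces at σ = 1 and loses as soon as σ moves away from the critical point.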

Journal ArticleDOI
Georges Aad, Brad Abbott, Jalal Abdallah, S. Abdel Khalek, +2871 more (167 institutions)
TL;DR: In this article, the authors presented the electron and photon energy calibration achieved with the ATLAS detector using about 25 fb(-1) of LHC proton-proton collision data taken at center-of-mass energies of root s = 7 and 8 TeV.
Abstract: This paper presents the electron and photon energy calibration achieved with the ATLAS detector using about 25 fb(-1) of LHC proton-proton collision data taken at centre-of-mass energies of root s = 7 and 8 TeV. The reconstruction of electron and photon energies is optimised using multivariate algorithms. The response of the calorimeter layers is equalised in data and simulation, and the longitudinal profile of the electromagnetic showers is exploited to estimate the passive material in front of the calorimeter and reoptimise the detector simulation. After all corrections, the Z resonance is used to set the absolute energy scale. For electrons from Z decays, the achieved calibration is typically accurate to 0.05% in most of the detector acceptance, rising to 0.2% in regions with large amounts of passive material. The remaining inaccuracy is less than 0.2-1% for electrons with a transverse energy of 10 GeV, and is on average 0.3% for photons. The detector resolution is determined with a relative inaccuracy of less than 10% for electrons and photons up to 60 GeV transverse energy, rising to 40% for transverse energies above 500 GeV.

Journal ArticleDOI
TL;DR: This perspective provides a focused rather than comprehensive review of the recent advances in the chemistry of biomimetic high-valent metal-oxo and metal-dioxygen complexes, which can be related to the understanding of the biological systems.
Abstract: Selective functionalization of unactivated C–H bonds, water oxidation, and dioxygen reduction are extremely important reactions in the context of finding energy carriers and conversion processes that are alternatives to the current fossil-based oil for energy. A range of metalloenzymes achieve these challenging tasks in biology by using cheap and abundant transition metals, such as iron, copper, and manganese. High-valent metal–oxo and metal–dioxygen (superoxo, peroxo, and hydroperoxo) cores act as active intermediates in many of these processes. The generation of well-described model compounds can provide vital insights into the mechanisms of such enzymatic reactions. This perspective provides a focused rather than comprehensive review of the recent advances in the chemistry of biomimetic high-valent metal–oxo and metal–dioxygen complexes, which can be related to our understanding of the biological systems.

Journal ArticleDOI
25 Apr 2014-Science
TL;DR: By inverting the charge of the selectivity filter, Wietek et al. have created a class of directly light-gated anion channels that can be used to block neuronal output in a fully reversible fashion.
Abstract: The field of optogenetics uses channelrhodopsins (ChRs) for light-induced neuronal activation. However, optimized tools for cellular inhibition at moderate light levels are lacking. We found that replacement of E90 in the central gate of ChR with positively charged residues produces chloride-conducting ChRs (ChloCs) with only negligible cation conductance. Molecular dynamics modeling unveiled that a high-affinity Cl(-)-binding site had been generated near the gate. Stabilizing the open state dramatically increased the operational light sensitivity of expressing cells (slow ChloC). In CA1 pyramidal cells, ChloCs completely inhibited action potentials triggered by depolarizing current injections or synaptic stimulation. Thus, by inverting the charge of the selectivity filter, we have created a class of directly light-gated anion channels that can be used to block neuronal output in a fully reversible fashion.

Journal ArticleDOI
Georges Aad, Brad Abbott, Jalal Abdallah, S. Abdel Khalek, +2911 more (209 institutions)
TL;DR: In this paper, a measurement of the Z/gamma* boson transverse momentum spectrum using ATLAS proton-proton collision data at a centre-of-mass energy of root s = 7TeV at the LHC is described.
Abstract: This paper describes a measurement of the Z/gamma* boson transverse momentum spectrum using ATLAS proton-proton collision data at a centre-of-mass energy of √s = 7 TeV at the LHC. The measurement is performed in the Z/gamma* -> e(+)e(-) and Z/gamma* -> mu(+)mu(-) channels, using data corresponding to an integrated luminosity of 4.7 fb(-1). Normalized differential cross sections as a function of the Z/gamma* boson transverse momentum are measured for transverse momenta up to 800 GeV. The measurement is performed inclusively for Z/gamma* rapidities up to 2.4, as well as in three rapidity bins. The channel results are combined, compared to perturbative and resummed QCD calculations and used to constrain the parton shower parameters of Monte Carlo generators.

Journal ArticleDOI
TL;DR: ProTox, a web server for the prediction of rodent oral toxicity, is presented, based on the analysis of the similarity of compounds with known median lethal doses and incorporates the identification of toxic fragments, therefore representing a novel approach in toxicity prediction.
Abstract: Animal trials are currently the major method for determining the possible toxic effects of drug candidates and cosmetics. In silico prediction methods represent an alternative approach and aim to rationalize the preclinical drug development, thus enabling the reduction of the associated time, costs and animal experiments. Here, we present ProTox, a web server for the prediction of rodent oral toxicity. The prediction method is based on the analysis of the similarity of compounds with known median lethal doses (LD50) and incorporates the identification of toxic fragments, therefore representing a novel approach in toxicity prediction. In addition, the web server includes an indication of possible toxicity targets which is based on an in-house collection of protein–ligand-based pharmacophore models (‘toxicophores’) for targets associated with adverse drug reactions. The ProTox web server is open to all users and can be accessed without registration at: http://tox.charite.de/tox. The only requirement for the prediction is the two-dimensional structure of the input compounds. All ProTox methods have been evaluated based on a diverse external validation set and displayed strong performance (sensitivity, specificity and precision of 76, 95 and 75%, respectively) and superiority over other toxicity prediction tools, indicating their possible applicability for other compound classes.
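The core idea described in the ProTox abstract — predicting an LD50 from the most similar compounds with known toxicity — can be illustrated with a minimal similarity-based nearest-neighbour sketch. This is not ProTox's actual implementation (the server additionally uses toxic-fragment detection and toxicophore models, which are omitted here); fingerprints, function names, and all LD50 values below are invented for illustration.

```python
# Illustrative sketch of similarity-based LD50 prediction, in the spirit of
# the ProTox abstract. Fingerprints are modeled as sets of integer bit
# positions; the training data and weighting scheme are hypothetical.

def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto (Jaccard) similarity between two binary fingerprints."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

def predict_ld50(query_fp: set, training_set: list, k: int = 3) -> float:
    """Predict an oral LD50 (mg/kg) as the similarity-weighted mean of the
    k most similar training compounds with known LD50 values."""
    scored = sorted(
        ((tanimoto(query_fp, fp), ld50) for fp, ld50 in training_set),
        reverse=True,
    )[:k]
    total_sim = sum(sim for sim, _ in scored)
    if total_sim == 0:
        raise ValueError("no similar compounds found")
    return sum(sim * ld50 for sim, ld50 in scored) / total_sim

# Toy training data: (fingerprint, known LD50 in mg/kg) -- invented values.
known = [
    ({1, 2, 3, 4}, 250.0),
    ({2, 3, 4, 5}, 300.0),
    ({7, 8, 9}, 5000.0),
]
print(predict_ld50({1, 2, 3, 5}, known, k=2))  # 275.0
```

In practice such similarity searches are run over real molecular fingerprints (e.g. path- or circular-based), and the nearest neighbours also supply the fragment context that methods like ProTox use to flag toxic substructures.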

Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, S. Abdel Khalek4  +2916 moreInstitutions (211)
TL;DR: In this article, a search for squarks and gluinos in final states containing high-pT jets, missing transverse momentum and no electrons or muons is presented.
Abstract: A search for squarks and gluinos in final states containing high-pT jets, missing transverse momentum and no electrons or muons is presented. The data were recorded in 2012 by the ATLAS experiment in √s = 8 TeV proton-proton collisions at the Large Hadron Collider, with a total integrated luminosity of 20.3 fb−1. Results are interpreted in a variety of simplified and specific supersymmetry-breaking models assuming that R-parity is conserved and that the lightest neutralino is the lightest supersymmetric particle. An exclusion limit at the 95% confidence level on the mass of the gluino is set at 1330 GeV for a simplified model incorporating only a gluino and the lightest neutralino. For a simplified model involving the strong production of first- and second-generation squarks, squark masses below 850 GeV (440 GeV) are excluded for a massless lightest neutralino, assuming mass degenerate (single light-flavour) squarks. In mSUGRA/CMSSM models with tan β = 30, A0 = −2m0 and μ > 0, squarks and gluinos of equal mass are excluded for masses below 1700 GeV. Additional limits are set for non-universal Higgs mass models with gaugino mediation and for simplified models involving the pair production of gluinos, each decaying to a top squark and a top quark, with the top squark decaying to a charm quark and a neutralino. These limits extend the region of supersymmetric parameter space excluded by previous searches with the ATLAS detector.

Journal ArticleDOI
TL;DR: Video game training augments GM in brain areas crucial for spatial navigation, strategic planning, working memory and motor performance going along with evidence for behavioral changes of navigation strategy, which could counteract known risk factors for mental disease.
Abstract: Video gaming is a highly pervasive activity, providing a multitude of complex cognitive and motor demands. Gaming can be seen as an intense training of several skills. The cerebral structural plasticity induced by such training, however, has not been investigated so far. Comparing a control with a video gaming training group that was trained for 2 months for at least 30 min per day with a platformer game, we found significant gray matter (GM) increase in right hippocampal formation (HC), right dorsolateral prefrontal cortex (DLPFC) and bilateral cerebellum in the training group. The HC increase correlated with changes from egocentric to allocentric navigation strategy. GM increases in HC and DLPFC correlated with participants' desire for video gaming, evidence suggesting a predictive role of desire in volume change. Video game training augments GM in brain areas crucial for spatial navigation, strategic planning, working memory and motor performance going along with evidence for behavioral changes of navigation strategy. The presented video game training could therefore be used to counteract known risk factors for mental disease such as smaller hippocampus and prefrontal cortex volume in, for example, post-traumatic stress disorder, schizophrenia and neurodegenerative disease.