
Showing papers by "University of Göttingen" published in 2014


Journal ArticleDOI
Keith A. Olive1, Kaustubh Agashe2, Claude Amsler3, Mario Antonelli  +222 moreInstitutions (107)
TL;DR: The review as discussed by the authors summarizes much of particle physics and cosmology using data from previous editions, plus 3,283 new measurements from 899 papers, including the recently discovered Higgs boson, leptons, quarks, mesons and baryons.
Abstract: The Review summarizes much of particle physics and cosmology. Using data from previous editions, plus 3,283 new measurements from 899 papers, we list, evaluate, and average measured properties of gauge bosons and the recently discovered Higgs boson, leptons, quarks, mesons, and baryons. We summarize searches for hypothetical particles such as heavy neutrinos, supersymmetric and technicolor particles, axions, dark photons, etc. All the particle properties and search limits are listed in Summary Tables. We also give numerous tables, figures, formulae, and reviews of topics such as Supersymmetry, Extra Dimensions, Particle Detectors, Probability, and Statistics. Among the 112 reviews are many that are new or heavily revised including those on: Dark Energy, Higgs Boson Physics, Electroweak Model, Neutrino Cross Section Measurements, Monte Carlo Neutrino Generators, Top Quark, Dark Matter, Dynamical Electroweak Symmetry Breaking, Accelerator Physics of Colliders, High-Energy Collider Parameters, Big Bang Nucleosynthesis, Astrophysical Constants and Cosmological Parameters.

7,337 citations


Journal ArticleDOI
Stephan Ripke1, Stephan Ripke2, Benjamin M. Neale1, Benjamin M. Neale2  +351 moreInstitutions (102)
24 Jul 2014-Nature
TL;DR: Associations at DRD2 and several genes involved in glutamatergic neurotransmission highlight molecules of known and potential therapeutic relevance to schizophrenia, and are consistent with leading pathophysiological hypotheses.
Abstract: Schizophrenia is a highly heritable disorder. Genetic risk is conferred by a large number of alleles, including common alleles of small effect that might be detected by genome-wide association studies. Here we report a multi-stage schizophrenia genome-wide association study of up to 36,989 cases and 113,075 controls. We identify 128 independent associations spanning 108 conservatively defined loci that meet genome-wide significance, 83 of which have not been previously reported. Associations were enriched among genes expressed in brain, providing biological plausibility for the findings. Many findings have the potential to provide entirely new insights into aetiology, but associations at DRD2 and several genes involved in glutamatergic neurotransmission highlight molecules of known and potential therapeutic relevance to schizophrenia, and are consistent with leading pathophysiological hypotheses. Independent of genes expressed in brain, associations were enriched among genes expressed in tissues that have important roles in immunity, providing support for the speculated link between the immune system and schizophrenia.
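A note on the threshold behind "genome-wide significance": it is conventionally a Bonferroni-style correction for roughly one million effectively independent common variants. The short sketch below shows the arithmetic; it is the standard convention, not a figure taken from this paper.

    # Conventional genome-wide significance threshold (Bonferroni-style argument):
    # a family-wise error rate of 0.05 spread over ~1,000,000 effectively
    # independent common variants (a commonly cited approximation).
    alpha_family_wise = 0.05
    effective_independent_tests = 1_000_000
    genome_wide_threshold = alpha_family_wise / effective_independent_tests
    print(f"genome-wide significance: p < {genome_wide_threshold:.0e}")  # 5e-08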

6,809 citations


Proceedings ArticleDOI
01 Jan 2014
TL;DR: DREBIN is proposed, a lightweight method for detection of Android malware that enables identifying malicious applications directly on the smartphone and outperforms several related approaches and detects 94% of the malware with few false alarms.
Abstract: Malicious applications pose a threat to the security of the Android platform. The growing amount and diversity of these applications render conventional defenses largely ineffective and thus Android smartphones often remain unprotected from novel malware. In this paper, we propose DREBIN, a lightweight method for detection of Android malware that enables identifying malicious applications directly on the smartphone. As the limited resources impede monitoring applications at run-time, DREBIN performs a broad static analysis, gathering as many features of an application as possible. These features are embedded in a joint vector space, such that typical patterns indicative for malware can be automatically identified and used for explaining the decisions of our method. In an evaluation with 123,453 applications and 5,560 malware samples DREBIN outperforms several related approaches and detects 94% of the malware with few false alarms, where the explanations provided for each detection reveal relevant properties of the detected malware. On five popular smartphones, the method requires 10 seconds for an analysis on average, rendering it suitable for checking downloaded applications directly on the device.
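To make the "joint vector space" idea above concrete, the sketch below embeds an app's extracted string features (permissions, API calls, intents) as an implicit binary vector, scores it with a linear model, and reports the highest-weighted present features as the explanation. Feature names, weights and the threshold are hypothetical placeholders; this illustrates the approach, not the DREBIN implementation or its learned model.

    # Toy DREBIN-style linear detector with per-feature explanations.
    # All feature strings and weights below are invented for illustration.
    HYPOTHETICAL_WEIGHTS = {
        "permission::SEND_SMS": 1.7,
        "api_call::sendTextMessage": 1.4,
        "permission::INTERNET": 0.2,
        "intent::BOOT_COMPLETED": 0.9,
        "api_call::getDeviceId": 0.6,
    }
    BIAS = -2.0  # hypothetical offset; scores above 0 are flagged as malicious

    def score_app(features):
        """Embed the feature set as an implicit binary vector and apply the linear model."""
        contributions = {f: HYPOTHETICAL_WEIGHTS.get(f, 0.0) for f in features}
        return sum(contributions.values()) + BIAS, contributions

    def explain(contributions, top_k=3):
        """Features contributing most to a 'malicious' decision."""
        return sorted(contributions.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

    app = {"permission::SEND_SMS", "api_call::sendTextMessage",
           "permission::INTERNET", "intent::BOOT_COMPLETED"}
    score, contributions = score_app(app)
    print("malicious" if score > 0 else "benign", round(score, 2))
    print("explanation:", explain(contributions))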

1,905 citations


Journal ArticleDOI
TL;DR: Secukinumab was effective for psoriasis in two randomized trials, validating interleukin-17A as a therapeutic target, and the rates of infection were higher with secukinumab than with placebo in both studies and were similar to those with etanercept.
Abstract: BACKGROUND: Interleukin-17A is considered to be central to the pathogenesis of psoriasis. We evaluated secukinumab, a fully human anti-interleukin-17A monoclonal antibody, in patients with moderate-to-severe plaque psoriasis. METHODS: In two phase 3, double-blind, 52-week trials, ERASURE (Efficacy of Response and Safety of Two Fixed Secukinumab Regimens in Psoriasis) and FIXTURE (Full Year Investigative Examination of Secukinumab vs. Etanercept Using Two Dosing Regimens to Determine Efficacy in Psoriasis), we randomly assigned 738 patients (in the ERASURE study) and 1306 patients (in the FIXTURE study) to subcutaneous secukinumab at a dose of 300 mg or 150 mg (administered once weekly for 5 weeks, then every 4 weeks), placebo, or (in the FIXTURE study only) etanercept at a dose of 50 mg (administered twice weekly for 12 weeks, then once weekly). The objective of each study was to show the superiority of secukinumab over placebo at week 12 with respect to the proportion of patients who had a reduction of 75% or more from baseline in the psoriasis area-and-severity index score (PASI 75) and a score of 0 (clear) or 1 (almost clear) on a 5-point modified investigator's global assessment (coprimary end points). RESULTS: The proportion of patients who met the criterion for PASI 75 at week 12 was higher with each secukinumab dose than with placebo or etanercept: in the ERASURE study, the rates were 81.6% with 300 mg of secukinumab, 71.6% with 150 mg of secukinumab, and 4.5% with placebo; in the FIXTURE study, the rates were 77.1% with 300 mg of secukinumab, 67.0% with 150 mg of secukinumab, 44.0% with etanercept, and 4.9% with placebo (P<0.001 for each secukinumab dose vs. comparators). The proportion of patients with a response of 0 or 1 on the modified investigator's global assessment at week 12 was higher with each secukinumab dose than with placebo or etanercept: in the ERASURE study, the rates were 65.3% with 300 mg of secukinumab, 51.2% with 150 mg of secukinumab, and 2.4% with placebo; in the FIXTURE study, the rates were 62.5% with 300 mg of secukinumab, 51.1% with 150 mg of secukinumab, 27.2% with etanercept, and 2.8% with placebo (P<0.001 for each secukinumab dose vs. comparators). The rates of infection were higher with secukinumab than with placebo in both studies and were similar to those with etanercept. CONCLUSIONS: Secukinumab was effective for psoriasis in two randomized trials, validating interleukin-17A as a therapeutic target. (Funded by Novartis Pharmaceuticals; ERASURE and FIXTURE ClinicalTrials.gov numbers, NCT01365455 and NCT01358578, respectively.).

1,587 citations


Journal ArticleDOI
TL;DR: There is a sufficient body of evidence to accept with level A (definite efficacy) the analgesic effect of high-frequency rTMS of the primary motor cortex (M1) contralateral to the pain and the antidepressant effect of HF-rTMS of the left dorsolateral prefrontal cortex (DLPFC).

1,554 citations


Journal ArticleDOI
TL;DR: Recent efforts have resulted in widely applicable methods for the versatile preparation of differently decorated arenes and heteroarenes, providing access to, among others, isoquinolones, 2-pyridones, isoquinolines, indoles, pyrroles, or α-pyrones.
Abstract: To improve the atom- and step-economy of organic syntheses, researchers would like to capitalize upon the chemistry of otherwise inert carbon–hydrogen (C–H) bonds. During the past decade, remarkable progress in organometallic chemistry has set the stage for the development of increasingly viable metal catalysts for C–H bond activation reactions. Among these methods, oxidative C–H bond functionalizations are particularly attractive because they avoid the use of prefunctionalized starting materials. For example, oxidative annulations that involve sequential C–H and heteroatom–H bond cleavages allow for the modular assembly of regioselectively decorated heterocycles. These structures serve as key scaffolds for natural products, functional materials, crop protecting agents, and drugs. While other researchers have devised rhodium or palladium complexes for oxidative alkyne annulations, my laboratory has focused on the application of significantly less expensive, yet highly selective ruthenium complexes.This Ac...

1,403 citations


Journal ArticleDOI
19 Dec 2014-Science
TL;DR: It is shown that roughly one-third of mainland Europe hosts at least one large carnivore species, with stable or increasing abundance in most cases in 21st-century records, and coexistence alongside humans has become possible, argue the authors.
Abstract: The conservation of large carnivores is a formidable challenge for biodiversity conservation. Using a data set on the past and current status of brown bears (Ursus arctos), Eurasian lynx (Lynx lynx), gray wolves (Canis lupus), and wolverines (Gulo gulo) in European countries, we show that roughly one-third of mainland Europe hosts at least one large carnivore species, with stable or increasing abundance in most cases in 21st-century records. The reasons for this overall conservation success include protective legislation, supportive public opinion, and a variety of practices making coexistence between large carnivores and people possible. The European situation reveals that large carnivores and people can share the same landscape.

1,290 citations


Journal ArticleDOI
TL;DR: This work provides the first quantitative support for the generality of positive heterogeneity-richness relationships across heterogeneity components, habitat types, taxa and spatial scales from landscape to global extents, and identifies specific needs for future comparative heterogeneity- richness research.
Abstract: Environmental heterogeneity is regarded as one of the most important factors governing species richness gradients. An increase in available niche space, provision of refuges and opportunities for isolation and divergent adaptation are thought to enhance species coexistence, persistence and diversification. However, the extent and generality of positive heterogeneity–richness relationships are still debated. Apart from widespread evidence supporting positive relationships, negative and hump-shaped relationships have also been reported. In a meta-analysis of 1148 data points from 192 studies worldwide, we examine the strength and direction of the relationship between spatial environmental heterogeneity and species richness of terrestrial plants and animals. We find that separate effects of heterogeneity in land cover, vegetation, climate, soil and topography are significantly positive, with vegetation and topographic heterogeneity showing particularly strong associations with species richness. The use of equal-area study units, spatial grain and spatial extent emerge as key factors influencing the strength of heterogeneity–richness relationships, highlighting the pervasive influence of spatial scale in heterogeneity–richness studies. We provide the first quantitative support for the generality of positive heterogeneity–richness relationships across heterogeneity components, habitat types, taxa and spatial scales from landscape to global extents, and identify specific needs for future comparative heterogeneity–richness research.
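For readers unfamiliar with how such effect sizes are pooled, the sketch below shows one standard recipe: convert per-study heterogeneity–richness correlations to Fisher's z, weight by inverse variance, and back-transform the pooled value. The study data are made up, and the paper's actual effect-size metric, weighting and meta-regression model may differ.

    import math

    # Hypothetical per-study data: (heterogeneity-richness correlation, sample size).
    studies = [(0.45, 30), (0.10, 120), (0.60, 25), (-0.05, 80)]

    weighted_sum = weight_total = 0.0
    for r, n in studies:
        z = math.atanh(r)        # Fisher's z transform of the correlation
        w = n - 3                # inverse of var(z) ~ 1 / (n - 3)
        weighted_sum += w * z
        weight_total += w

    pooled_r = math.tanh(weighted_sum / weight_total)   # back-transform
    print(f"pooled heterogeneity-richness correlation: {pooled_r:.3f}")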

1,161 citations


Journal ArticleDOI
TL;DR: This paper updates the earlier work by Keating et al.
Abstract: Agricultural systems models worldwide are increasingly being used to explore options and solutions for the food security, climate change adaptation and mitigation and carbon trading problem domains. APSIM (Agricultural Production Systems sIMulator) is one such model that continues to be applied and adapted to this challenging research agenda. From its inception twenty years ago, APSIM has evolved into a framework containing many of the key models required to explore changes in agricultural landscapes with capability ranging from simulation of gene expression through to multi-field farms and beyond. Keating et al. (2003) described many of the fundamental attributes of APSIM in detail. Much has changed in the last decade, and the APSIM community has been exploring novel scientific domains and utilising software developments in social media, web and mobile applications to provide simulation tools adapted to new demands. This paper updates the earlier work by Keating et al. (2003) and chronicles the changing external challenges and opportunities being placed on APSIM during the last decade. It also explores and discusses how APSIM has been evolving to a "next generation" framework with improved features and capabilities that allow its use in many diverse topics. APSIM is an agricultural modelling framework used extensively worldwide. It can simulate a wide range of agricultural systems. It begins its third decade evolving into an agro-ecosystem framework.

1,151 citations



Journal ArticleDOI
TL;DR: It is proposed that, due to limited mass transfer, high photosynthetic activity in Fe2+-rich environments forms a protective zone where Fe2+ precipitates abiotically at a non-lethal distance from the cyanobacteria.
Abstract: If O2 is available at circumneutral pH, Fe2+ is rapidly oxidized to Fe3+, which precipitates as FeO(OH). Neutrophilic iron oxidizing bacteria have evolved mechanisms to prevent self-encrustation in iron. Hitherto, no mechanism has been proposed for cyanobacteria from Fe2+-rich environments; these produce O2 but are seldom found encrusted in iron. We used two sets of illuminated reactors connected to two groundwater aquifers with different Fe2+ concentrations (0.9 μM vs. 26 μM) in the Aspo Hard Rock Laboratory (HRL), Sweden. Cyanobacterial biofilms developed in all reactors and were phylogenetically different between the reactors. Unexpectedly, cyanobacteria growing in the Fe2+-poor reactors were encrusted in iron, whereas those in the Fe2+-rich reactors were not. In-situ microsensor measurements showed that O2 concentrations and pH near the surface of the cyanobacterial biofilms from the Fe2+-rich reactors were much higher than in the overlying water. This was not the case for the biofilms growing at low Fe2+ concentrations. Measurements with enrichment cultures showed that cyanobacteria from the Fe2+-rich environment increased their photosynthesis with increasing Fe2+ concentrations, whereas those from the low Fe2+ environment were inhibited at Fe2+ > 5 μM. Modeling based on in-situ O2 and pH profiles showed that cyanobacteria from the Fe2+-rich reactor were not exposed to significant Fe2+ concentrations. We propose that, due to limited mass transfer, high photosynthetic activity in Fe2+-rich environments forms a protective zone where Fe2+ precipitates abiotically at a non-lethal distance from the cyanobacteria. This mechanism sheds new light on the possible role of cyanobacteria in precipitation of banded iron formations.
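The proposed "protective zone" relies on how strongly abiotic Fe2+ oxidation accelerates with pH and O2. Using the textbook rate law rate ≈ k[Fe2+][O2][OH-]^2 (a standard kinetic expression assumed here, not quoted in the abstract), the sketch below compares oxidation rates at hypothetical bulk-water versus biofilm-surface conditions; the numbers are placeholders, not measurements from this study.

    # Relative abiotic Fe(II) oxidation rate under the common rate law
    #   rate = k * [Fe2+] * [O2] * [OH-]**2   (assumed textbook form)
    # Only the ratio is computed, so k and [Fe2+] cancel out.
    def relative_rate(pH, o2_umol_per_l):
        oh = 10 ** (pH - 14.0)          # [OH-] in mol/L from pH
        return o2_umol_per_l * oh ** 2

    bulk = relative_rate(pH=7.0, o2_umol_per_l=50.0)      # hypothetical bulk water
    biofilm = relative_rate(pH=9.0, o2_umol_per_l=400.0)  # hypothetical biofilm surface
    print(f"Fe2+ oxidation ~{biofilm / bulk:,.0f}x faster at the biofilm surface")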

Journal ArticleDOI
Heike Rauer1, Heike Rauer2, C. Catala3, Conny Aerts4  +164 moreInstitutions (51)
TL;DR: The PLATO 2.0 mission as discussed by the authors has been selected for ESA's M3 launch opportunity (2022/24) to provide accurate key planet parameters (radius, mass, density and age) in statistical numbers.
Abstract: PLATO 2.0 has recently been selected for ESA’s M3 launch opportunity (2022/24). Providing accurate key planet parameters (radius, mass, density and age) in statistical numbers, it addresses fundamental questions such as: How do planetary systems form and evolve? Are there other systems with planets like ours, including potentially habitable planets? The PLATO 2.0 instrument consists of 34 small aperture telescopes (32 with 25 s readout cadence and 2 with 2.5 s cadence) providing a wide field-of-view (2232 deg²) and a large photometric magnitude range (4–16 mag). It focusses on bright (4–11 mag) stars in wide fields to detect and characterize planets down to Earth-size by photometric transits, whose masses can then be determined by ground-based radial-velocity follow-up measurements. Asteroseismology will be performed for these bright stars to obtain highly accurate stellar parameters, including masses and ages. The combination of bright targets and asteroseismology results in high accuracy for the bulk planet parameters: 2 %, 4–10 % and 10 % for planet radii, masses and ages, respectively. The planned baseline observing strategy includes two long pointings (2–3 years) to detect and bulk characterize planets reaching into the habitable zone (HZ) of solar-like stars and an additional step-and-stare phase to cover in total about 50 % of the sky. PLATO 2.0 will observe up to 1,000,000 stars and detect and characterize hundreds of small planets, and thousands of planets in the Neptune to gas giant regime out to the HZ. It will therefore provide the first large-scale catalogue of bulk characterized planets with accurate radii, masses, mean densities and ages. This catalogue will include terrestrial planets at intermediate orbital distances, where surface temperatures are moderate. Coverage of this parameter range with statistical numbers of bulk characterized planets is unique to PLATO 2.0. The PLATO 2.0 catalogue allows us to e.g.: - complete our knowledge of planet diversity for low-mass objects, - correlate the planet mean density-orbital distance distribution with predictions from planet formation theories, - constrain the influence of planet migration and scattering on the architecture of multiple systems, and - specify how planet and system parameters change with host star characteristics, such as type, metallicity and age. The catalogue will allow us to study planets and planetary systems at different evolutionary phases. It will further provide a census for small, low-mass planets. This will serve to identify objects which retained their primordial hydrogen atmosphere and in general the typical characteristics of planets in such low-mass, low-density range. Planets detected by PLATO 2.0 will orbit bright stars and many of them will be targets for future atmosphere spectroscopy exploring their atmosphere. Furthermore, the mission has the potential to detect exomoons, planetary rings, binary and Trojan planets. The planetary science possible with PLATO 2.0 is complemented by its impact on stellar and galactic science via asteroseismology as well as light curves of all kinds of variable stars, together with observations of stellar clusters of different ages. This will allow us to improve stellar models and study stellar activity. A large number of well-known ages from red giant stars will probe the structure and evolution of our Galaxy. Asteroseismic ages of bright stars for different phases of stellar evolution allow calibrating stellar age-rotation relationships.
Together with the results of ESA’s Gaia mission, the results of PLATO 2.0 will provide a huge legacy to planetary, stellar and galactic science.
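To see why bright targets and high photometric precision matter for the Earth-sized planets PLATO 2.0 targets, the sketch below evaluates the generic transit-depth relation depth ≈ (Rp/R*)², which is not a figure quoted in the abstract, for an Earth analogue crossing a Sun-like star.

    # Approximate transit depth of an Earth-size planet in front of a Sun-like star.
    R_SUN_KM = 695_700.0
    R_EARTH_KM = 6_371.0

    depth = (R_EARTH_KM / R_SUN_KM) ** 2   # fractional dip in stellar flux
    print(f"transit depth ~ {depth:.1e} ({depth * 1e6:.0f} ppm)")
    # ~8.4e-05, i.e. roughly 84 parts per million, which is why PLATO focuses on
    # bright stars (better photon statistics) and radial-velocity follow-up for masses.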

Journal ArticleDOI
12 May 2014-PLOS ONE
TL;DR: The ability of methods to correct the initial sampling bias varied greatly depending on bias type, bias intensity and species, but the simple systematic sampling of records consistently ranked among the best performing across the range of conditions tested, whereas other methods performed more poorly in most cases.
Abstract: MAXENT is now a common species distribution modeling (SDM) tool used by conservation practitioners for predicting the distribution of a species from a set of records and environmental predictors. However, datasets of species occurrence used to train the model are often biased in the geographical space because of unequal sampling effort across the study area. This bias may be a source of strong inaccuracy in the resulting model and could lead to incorrect predictions. Although a number of sampling bias correction methods have been proposed, there is no consensual guideline to account for it. We compared here the performance of five methods of bias correction on three datasets of species occurrence: one “virtual” derived from a land cover map, and two actual datasets for a turtle (Chrysemys picta) and a salamander (Plethodon cylindraceus). We subjected these datasets to four types of sampling biases corresponding to potential types of empirical biases. We applied five correction methods to the biased samples and compared the outputs of distribution models to unbiased datasets to assess the overall correction performance of each method. The results revealed that the ability of methods to correct the initial sampling bias varied greatly depending on bias type, bias intensity and species. However, the simple systematic sampling of records consistently ranked among the best performing across the range of conditions tested, whereas other methods performed more poorly in most cases. The strong effect of initial conditions on correction performance highlights the need for further research to develop a step-by-step guideline to account for sampling bias. However, this method seems to be the most efficient in correcting sampling bias and should be advised in most cases.
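The "systematic sampling of records" that performed best can be read as keeping at most one occurrence per cell of a regular spatial grid before model training. The sketch below shows that idea; the cell size and coordinates are placeholders, and the study's exact implementation may differ.

    # Minimal sketch of systematic (grid-based) thinning of occurrence records.
    def systematic_thin(records, cell_size_deg=0.5):
        """Keep the first record encountered in each grid cell (lon, lat in degrees)."""
        kept, seen_cells = [], set()
        for lon, lat in records:
            cell = (int(lon // cell_size_deg), int(lat // cell_size_deg))
            if cell not in seen_cells:
                seen_cells.add(cell)
                kept.append((lon, lat))
        return kept

    # Hypothetical, spatially clustered occurrences (e.g. oversampled near roads).
    occurrences = [(-77.01, 38.90), (-77.02, 38.91), (-77.03, 38.89),
                   (-80.10, 37.30), (-80.12, 37.31), (-75.50, 40.00)]
    print(systematic_thin(occurrences))   # one record kept per 0.5-degree cell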

Journal ArticleDOI
19 Feb 2014
TL;DR: The conclusion is that the gap between industry and academia is due to the relatively small functional improvement in daily situations that academic systems offer, despite the promising laboratory results, at the expense of a substantial reduction in robustness.
Abstract: Despite not recording directly from neural cells, the surface electromyogram (EMG) signal contains information on the neural drive to muscles, i.e., the spike trains of motor neurons. Using this property, myoelectric control consists of the recording of EMG signals for extracting control signals to command external devices, such as hand prostheses. In commercial control systems, the intensity of muscle activity is extracted from the EMG and used for single degrees of freedom activation (direct control). Over the past 60 years, academic research has progressed to more sophisticated approaches but, surprisingly, none of these academic achievements has been implemented in commercial systems so far. We provide an overview of both commercial and academic myoelectric control systems and we analyze their performance with respect to the characteristics of the ideal myocontroller. Classic and relatively novel academic methods are described, including techniques for simultaneous and proportional control of multiple degrees of freedom and the use of individual motor neuron spike trains for direct control. The conclusion is that the gap between industry and academia is due to the relatively small functional improvement in daily situations that academic systems offer, despite the promising laboratory results, at the expense of a substantial reduction in robustness. None of the systems so far proposed in the literature fulfills all the important criteria needed for widespread acceptance by the patients, i.e. intuitive, closed-loop, adaptive, and robust real-time (<200 ms delay) control, minimal number of recording electrodes with low sensitivity to repositioning, minimal training, limited complexity and low consumption. Nonetheless, in recent years, important efforts have been invested in matching these criteria, with relevant steps forward.
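The "direct control" scheme used in commercial systems, mapping EMG intensity to a single degree of freedom, can be sketched as an amplitude envelope (moving RMS) followed by a threshold-and-gain mapping. The signal, window length, threshold and gain below are synthetic placeholders; this illustrates the principle only and is not any commercial controller.

    import math, random

    # Synthetic surface-EMG-like signal: zero-mean noise whose amplitude follows a
    # hypothetical contraction level (weak for 0.5 s, then strong for 0.5 s at 1 kHz).
    random.seed(0)
    contraction = [0.1 if t < 500 else 0.8 for t in range(1000)]
    emg = [random.gauss(0.0, c) for c in contraction]

    def envelope(signal, window=150):
        """Moving RMS of the signal -- a common EMG 'intensity' estimate."""
        out = []
        for i in range(len(signal)):
            seg = signal[max(0, i - window + 1): i + 1]
            out.append(math.sqrt(sum(x * x for x in seg) / len(seg)))
        return out

    def direct_control(intensity, threshold=0.2, gain=1.5):
        """Map EMG intensity to a single-DOF activation clipped to [0, 1]."""
        return min(1.0, max(0.0, (intensity - threshold) * gain))

    env = envelope(emg)
    print("weak contraction   ->", round(direct_control(env[400]), 2))
    print("strong contraction ->", round(direct_control(env[900]), 2))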

Journal ArticleDOI
TL;DR: In this article, the authors summarized factors determining the plant availability of soil potassium (K), the role of K in crop yield formation and product quality, and the dependence of crop stress resistance on K nutrition.

Journal ArticleDOI
TL;DR: Both K- and r-strategists were beneficial for priming effects, with an increasing contribution of K-selected species under N limitation, which supports the microbial mining theory in terms of N limitation and confirms the stoichiometric decomposition theory.
Abstract: The increasing input of anthropogenically derived nitrogen (N) to ecosystems raises a crucial question: how does available N modify the decomposer community and thus affect the mineralization of soil organic matter (SOM)? Moreover, N input modifies the priming effect (PE), that is, the effect of fresh organics on the microbial decomposition of SOM. We studied the interactive effects of C and N on SOM mineralization (by natural 13C labelling, adding C4-sucrose or C4-maize straw to C3-soil) in relation to microbial growth kinetics and to the activities of five hydrolytic enzymes. This encompasses the groups of parameters governing two mechanisms of priming effects - microbial N mining and stoichiometric decomposition theories. In sole C treatments, positive PE was accompanied by a decrease in specific microbial growth rates, confirming a greater contribution of K-strategists to the decomposition of native SOM. Sucrose addition with N significantly accelerated mineralization of native SOM, whereas mineral N added with plant residues accelerated decomposition of plant residues. This supports the microbial mining theory in terms of N limitation. Sucrose addition with N was accompanied by accelerated microbial growth, increased activities of β-glucosidase and cellobiohydrolase, and decreased activities of xylanase and leucine amino peptidase. This indicated an increased contribution of r-strategists to the PE and to decomposition of cellulose but the decreased hemicellulolytic and proteolytic activities. Thus, the acceleration of the C cycle was primed by exogenous organic C and was controlled by N. This confirms the stoichiometric decomposition theory. Both K- and r-strategists were beneficial for priming effects, with an increasing contribution of K-selected species under N limitation. Thus, the priming phenomenon described in 'microbial N mining' theory can be ascribed to K-strategists. In contrast, 'stoichiometric decomposition' theory, that is, accelerated OM mineralization due to balanced microbial growth, is explained by domination of r-strategists.
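The "natural 13C labelling" mentioned above partitions respired CO2 between C3-derived SOM and the added C4 substrate with a two-pool isotope mixing equation. The end-member and measured δ13C values below are typical placeholders (C3 ≈ -27 ‰, C4 ≈ -12 ‰), not numbers from this study.

    # Two-pool mixing model: fraction of CO2 derived from the added C4 substrate
    # versus native C3 soil organic matter, from natural-abundance delta-13C values.
    def fraction_from_c4(delta_sample, delta_c3=-27.0, delta_c4=-12.0):
        return (delta_sample - delta_c3) / (delta_c4 - delta_c3)

    delta_respired_co2 = -18.0   # hypothetical measured delta-13C of respired CO2
    f_c4 = fraction_from_c4(delta_respired_co2)
    print(f"CO2 from C4 substrate: {f_c4:.0%}, from native C3 SOM: {1 - f_c4:.0%}")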

Journal ArticleDOI
Paul M. Thompson1, Jason L. Stein2, Sarah E. Medland3, Derrek P. Hibar1  +329 moreInstitutions (96)
TL;DR: The ENIGMA Consortium has detected factors that affect the brain that no individual site could detect on its own, and that require larger numbers of subjects than any individual neuroimaging study has currently collected.
Abstract: The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience, genetics, and medicine, ENIGMA studies have analyzed neuroimaging data from over 12,826 subjects. In addition, data from 12,171 individuals were provided by the CHARGE consortium for replication of findings, in a total of 24,997 subjects. By meta-analyzing results from many sites, ENIGMA has detected factors that affect the brain that no individual site could detect on its own, and that require larger numbers of subjects than any individual neuroimaging study has currently collected. ENIGMA's first project was a genome-wide association study identifying common variants in the genome associated with hippocampal volume or intracranial volume. Continuing work is exploring genetic associations with subcortical volumes (ENIGMA2) and white matter microstructure (ENIGMA-DTI). Working groups also focus on understanding how schizophrenia, bipolar illness, major depression and attention deficit/hyperactivity disorder (ADHD) affect the brain. We review the current progress of the ENIGMA Consortium, along with challenges and unexpected discoveries made on the way.

Journal ArticleDOI
TL;DR: It is concluded that transcranial focused ultrasound (tFUS) targeted to the human primary somatosensory cortex can be used to focally modulate human cortical function.
Abstract: Improved methods of noninvasively modulating human brain function are needed. Here we probed the influence of transcranial focused ultrasound (tFUS) targeted to the human primary somatosensory cortex (S1) on sensory-evoked brain activity and sensory discrimination abilities. The lateral and axial spatial resolution of the tFUS beam implemented were 4.9 mm and 18 mm, respectively. Electroencephalographic recordings showed that tFUS significantly attenuated the amplitudes of somatosensory evoked potentials elicited by median nerve stimulation. We also found that tFUS significantly modulated the spectral content of sensory-evoked brain oscillations. The changes produced by tFUS on sensory-evoked brain activity were abolished when the acoustic beam was focused 1 cm anterior or posterior to S1. Behavioral investigations showed that tFUS targeted to S1 enhanced performance on sensory discrimination tasks without affecting task attention or response bias. We conclude that tFUS can be used to focally modulate human cortical function.

Journal ArticleDOI
30 May 2014-Science
TL;DR: A quantitative molecular-scale image of the “average” synapse populated with realistic renditions of each of the protein components that contribute to the inner workings of neurons is presented, displaying 300,000 proteins in atomic detail.
Abstract: Synaptic vesicle recycling has long served as a model for the general mechanisms of cellular trafficking. We used an integrative approach, combining quantitative immunoblotting and mass spectrometry to determine protein numbers; electron microscopy to measure organelle numbers, sizes, and positions; and super-resolution fluorescence microscopy to localize the proteins. Using these data, we generated a three-dimensional model of an "average" synapse, displaying 300,000 proteins in atomic detail. The copy numbers of proteins involved in the same step of synaptic vesicle recycling correlated closely. In contrast, copy numbers varied over more than three orders of magnitude between steps, from about 150 copies for the endosomal fusion proteins to more than 20,000 for the exocytotic ones.

Journal ArticleDOI
03 Nov 2014-PLOS ONE
TL;DR: A meta-analysis of the agronomic and economic impacts of GM crops reveals robust evidence of GM crop benefits for farmers in developed and developing countries.
Abstract: Background Despite the rapid adoption of genetically modified (GM) crops by farmers in many countries, controversies about this technology continue. Uncertainty about GM crop impacts is one reason for widespread public suspicion. Objective We carry out a meta-analysis of the agronomic and economic impacts of GM crops to consolidate the evidence. Data Sources Original studies for inclusion were identified through keyword searches in ISI Web of Knowledge, Google Scholar, EconLit, and AgEcon Search. Study Eligibility Criteria Studies were included when they build on primary data from farm surveys or field trials anywhere in the world, and when they report impacts of GM soybean, maize, or cotton on crop yields, pesticide use, and/or farmer profits. In total, 147 original studies were included. Synthesis Methods Analysis of mean impacts and meta-regressions to examine factors that influence outcomes. Results On average, GM technology adoption has reduced chemical pesticide use by 37%, increased crop yields by 22%, and increased farmer profits by 68%. Yield gains and pesticide reductions are larger for insect-resistant crops than for herbicide-tolerant crops. Yield and profit gains are higher in developing countries than in developed countries. Limitations Several of the original studies did not report sample sizes and measures of variance. Conclusion The meta-analysis reveals robust evidence of GM crop benefits for farmers in developed and developing countries. Such evidence may help to gradually increase public trust in this technology.

Journal ArticleDOI
08 May 2014-Nature
TL;DR: Reducing the functional diversity of decomposer organisms and plant litter types slowed the cycling of litter carbon and nitrogen, and the emergence of this general mechanism and the coherence of patterns across contrasting terrestrial and aquatic ecosystems suggest that biodiversity loss has consistent consequences for litter decomposition and the Cycling of major elements on broad spatial scales.
Abstract: The decomposition of dead organic matter is a major determinant of carbon and nutrient cycling in ecosystems, and of carbon fluxes between the biosphere and the atmosphere. Decomposition is driven by a vast diversity of organisms that are structured in complex food webs. Identifying the mechanisms underlying the effects of biodiversity on decomposition is critical given the rapid loss of species worldwide and the effects of this loss on human well-being. Yet despite comprehensive syntheses of studies on how biodiversity affects litter decomposition, key questions remain, including when, where and how biodiversity has a role and whether general patterns and mechanisms occur across ecosystems and different functional types of organism. Here, in field experiments across five terrestrial and aquatic locations, ranging from the subarctic to the tropics, we show that reducing the functional diversity of decomposer organisms and plant litter types slowed the cycling of litter carbon and nitrogen. Moreover, we found evidence of nitrogen transfer from the litter of nitrogen-fixing plants to that of rapidly decomposing plants, but not between other plant functional types, highlighting that specific interactions in litter mixtures control carbon and nitrogen cycling during decomposition. The emergence of this general mechanism and the coherence of patterns across contrasting terrestrial and aquatic ecosystems suggest that biodiversity loss has consistent consequences for litter decomposition and the cycling of major elements on broad spatial scales.

Journal ArticleDOI
TL;DR: This tutorial review deals with the design, synthesis and host-guest chemistry of discrete coordination cages built according to the combination of pyridyl ligands and square-planar Pd(II) or Pt(II) cations for the realization of supramolecular self-assemblies.
Abstract: The combination of pyridyl ligands and square-planar Pd(II) or Pt(II) cations has proven to be a very reliable recipe for the realization of supramolecular self-assemblies. This tutorial review deals with the design, synthesis and host–guest chemistry of discrete coordination cages built according to this strategy. The focus is set on structures obeying the formula [PdnL2n] (n = 2–4). The most discussed ligands are bent, bis-monodentate bridges having their two donor sites pointing in the same direction. The structures of the resulting cages range from simple globules over intertwined knots to interpenetrated dimers featuring three small pockets instead of one large cavity. The cages have large openings that allow small guest molecules to enter and leave the cavities. Most structures are cationic and thus favour the uptake of anionic guests. Some examples of host–guest complexes are discussed with emphasis on coencapsulation and allosteric binding phenomena. Aside from cages in which the ligands have only a structural role, some examples of functional ligands based on photo- and redox-active backbones are presented.

Journal ArticleDOI
TL;DR: Routine rectal administration of 100 mg of diclofenac or indomethacin immediately before or after ERCP in all patients without contraindication is recommended and needle-knife fistulotomy should be the preferred precut technique in patients with a bile duct dilated down to the papilla.
Abstract: This Guideline is an official statement of the European Society of Gastrointestinal Endoscopy (ESGE). It addresses the prophylaxis of post-endoscopic retrograde cholangiopancreatography (post-ERCP) pancreatitis. Main recommendations 1 ESGE recommends routine rectal administration of 100 mg of diclofenac or indomethacin immediately before or after ERCP in all patients without contraindication. In addition to this, in the case of high risk for post-ERCP pancreatitis (PEP), the placement of a 5-Fr prophylactic pancreatic stent should be strongly considered. Sublingually administered glyceryl trinitrate or 250 µg somatostatin given in bolus injection might be considered as an option in high risk cases if nonsteroidal anti-inflammatory drugs (NSAIDs) are contraindicated and if prophylactic pancreatic stenting is not possible or successful. 2 ESGE recommends keeping the number of cannulation attempts as low as possible. 3 ESGE suggests restricting the use of a pancreatic guidewire as a backup technique for biliary cannulation to cases with repeated inadvertent cannulation of the pancreatic duct; if this method is used, deep biliary cannulation should be attempted using a guidewire rather than the contrast-assisted method and a prophylactic pancreatic stent should be placed. 4 ESGE suggests that needle-knife fistulotomy should be the preferred precut technique in patients with a bile duct dilated down to the papilla. Conventional precut and transpancreatic sphincterotomy present similar success and complication rates; if conventional precut is selected and pancreatic cannulation is easily obtained, ESGE suggests attempting to place a small-diameter (3-Fr or 5-Fr) pancreatic stent to guide the cut and leaving the pancreatic stent in place at the end of ERCP for a minimum of 12 – 24 hours. 5 ESGE does not recommend endoscopic papillary balloon dilation as an alternative to sphincterotomy in routine ERCP, but it may be advantageous in selected patients; if this technique is used, the duration of dilation should be longer than 1 minute.

Journal ArticleDOI
TL;DR: This revision of the 2005 British Association for Psychopharmacology guidelines for the evidence-based pharmacological treatment of anxiety disorders provides an update on key steps in diagnosis and clinical management, including recognition, acute treatment, longer-term treatment, combination treatment, and further approaches for patients who have not responded to first-line interventions.
Abstract: This revision of the 2005 British Association for Psychopharmacology guidelines for the evidence-based pharmacological treatment of anxiety disorders provides an update on key steps in diagnosis and clinical management, including recognition, acute treatment, longer-term treatment, combination treatment, and further approaches for patients who have not responded to first-line interventions. A consensus meeting involving international experts in anxiety disorders reviewed the main subject areas and considered the strength of supporting evidence and its clinical implications. The guidelines are based on available evidence, were constructed after extensive feedback from participants, and are presented as recommendations to aid clinical decision-making in primary, secondary and tertiary medical care. They may also serve as a source of information for patients, their carers, and medicines management and formulary committees.

Proceedings ArticleDOI
18 May 2014
TL;DR: This paper introduces a novel representation of source code called a code property graph that merges concepts of classic program analysis, namely abstract syntax trees, control flow graphs and program dependence graphs, into a joint data structure that enables it to elegantly model templates for common vulnerabilities with graph traversals that can identify buffer overflows, integer overflows, format string vulnerabilities, or memory disclosures.
Abstract: The vast majority of security breaches encountered today are a direct result of insecure code. Consequently, the protection of computer systems critically depends on the rigorous identification of vulnerabilities in software, a tedious and error-prone process requiring significant expertise. Unfortunately, a single flaw suffices to undermine the security of a system and thus the sheer amount of code to audit plays into the attacker's cards. In this paper, we present a method to effectively mine large amounts of source code for vulnerabilities. To this end, we introduce a novel representation of source code called a code property graph that merges concepts of classic program analysis, namely abstract syntax trees, control flow graphs and program dependence graphs, into a joint data structure. This comprehensive representation enables us to elegantly model templates for common vulnerabilities with graph traversals that, for instance, can identify buffer overflows, integer overflows, format string vulnerabilities, or memory disclosures. We implement our approach using a popular graph database and demonstrate its efficacy by identifying 18 previously unknown vulnerabilities in the source code of the Linux kernel.
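To make the "graph traversal" idea concrete, the toy sketch below builds a tiny property graph with labelled control-flow and data-dependence edges (using networkx purely as a stand-in; the paper used a graph database and its own traversal language) and then searches for a call to a dangerous sink that is data-dependent on attacker-controlled input with no sanitizer on the path. Node names and labels are invented for illustration.

    import networkx as nx

    # Toy code property graph: nodes are statements; edges mix control flow ("CFG")
    # and data dependences ("DDG"). Purely illustrative, not the paper's schema.
    g = nx.MultiDiGraph()
    g.add_node("read_input", kind="source")      # e.g. data read from the network
    g.add_node("len_check", kind="sanitizer")    # bounds check
    g.add_node("memcpy_call", kind="sink")       # potential buffer overflow
    g.add_node("log_call", kind="other")
    g.add_edge("read_input", "memcpy_call", label="DDG")  # tainted data reaches sink
    g.add_edge("read_input", "log_call", label="DDG")
    g.add_edge("read_input", "len_check", label="CFG")
    g.add_edge("len_check", "memcpy_call", label="CFG")

    def unsanitized_flows(graph):
        """Source-to-sink data-dependence paths that pass no sanitizer node."""
        ddg = nx.DiGraph([(u, v) for u, v, d in graph.edges(data=True) if d["label"] == "DDG"])
        sources = [n for n, d in graph.nodes(data=True) if d["kind"] == "source"]
        sinks = [n for n, d in graph.nodes(data=True) if d["kind"] == "sink"]
        findings = []
        for s in sources:
            for t in sinks:
                if s in ddg and t in ddg:
                    for path in nx.all_simple_paths(ddg, s, t):
                        if not any(graph.nodes[n]["kind"] == "sanitizer" for n in path):
                            findings.append(path)
        return findings

    print(unsanitized_flows(g))   # [['read_input', 'memcpy_call']]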

Journal ArticleDOI
TL;DR: In this paper, the authors used 14C-labeled biochar to trace its decomposition to CO2 during 8.5 years under controlled conditions and found that only about 6% of the initially added biochar was mineralized to CO2 during the 8.5 years.
Abstract: The stability and transformation of products of incomplete combustion of vegetation or fossil fuel (frequently called pyrogenic or black carbon) and of biochar in soil remain unknown, mainly because of their high recalcitrance compared to other natural substances. Therefore, direct estimations of biochar decomposition and transformation are difficult because 1) changes are too small for any relevant experimental period and 2) of methodological constraints (ambiguity of the origin of the investigated compounds). We used 14C-labeled biochar to trace its decomposition to CO2 during 8.5 years and the transformation of its chemical compounds: neutral lipids, glycolipids, phospholipids, polysaccharides and benzenepolycarboxylic acids (BPCA). 14C-labeled biochar was produced by charring 14C-labeled Lolium residues. We incubated the 14C-labeled biochar in a Haplic Luvisol and in loess for 8.5 years under controlled conditions. In total, only about 6% of the initially added biochar was mineralized to CO2 during the 8.5 years. This is probably the slowest decomposition obtained experimentally for any natural organic compound. The biochar decomposition rates estimated from 14CO2 efflux between the 5th and 8th years were 7 × 10⁻⁴ % per day. This corresponds to less than 0.3% per year under optimal conditions and is about 2.5 times slower than reported in the previous, shorter study (3.5 years). After 3.5 years of incubation, we analyzed 14C in dissolved organic matter, microbial biomass, and sequentially extracted neutral lipids, glycolipids, phospholipids, polysaccharides and BPCA. Biochar-derived C (14C) in microbial biomass ranged between 0.3 and 0.95% of the 14C input. Biochar-derived C in all lipid fractions was less than 1%. Over 3.5 years, glycolipids and phospholipids were decomposed 1.6 times faster (23% of their initial content per year) than neutral lipids (15% year⁻¹). Polysaccharides contributed ca. 17% of the 14C activity in biochar. The highest portion of 14C in the initial biochar (87%) was in BPCA, decreasing by only 7% over 3.5 years. Condensed aromatic moieties were the most stable fraction compared to all other biochar compounds, and the high portion of BPCA in biochar explains its very high stability and its contribution to long-term C sequestration in soil. Our new approach for analyzing biochar stability, combining 14C-labeled biochar with 14C determination in chemical fractions, allowed tracing of transformation products not only in released CO2 and in microbial biomass, but also evaluation of the decomposition of various biochar compounds with different chemical properties.
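As a quick consistency check on the rates quoted above, the sketch below converts the daily mineralization rate into an annual figure and, under a simple first-order decay assumption (an assumption of this sketch made for optimal laboratory conditions, not a claim of the study), into an indicative mean residence time and half-life.

    import math

    daily_rate_percent = 7e-4                      # % of biochar C mineralized per day (from the abstract)
    annual_rate_percent = daily_rate_percent * 365
    print(f"annual rate: ~{annual_rate_percent:.2f}% per year")   # ~0.26%, i.e. 'less than 0.3% per year'

    # Indicative first-order figures under the stated assumption:
    k_per_year = annual_rate_percent / 100.0
    print(f"mean residence time ~ {1 / k_per_year:.0f} years, "
          f"half-life ~ {math.log(2) / k_per_year:.0f} years")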

Journal ArticleDOI
TL;DR: Automated image-based tracking should continue to advance the field of ecology by enabling better understanding of the linkages between individual and higher-level ecological processes, via high-throughput quantitative analysis of complex ecological patterns and processes across scales, including analysis of environmental drivers.
Abstract: The behavior of individuals determines the strength and outcome of ecological interactions, which drive population, community, and ecosystem organization. Bio-logging, such as telemetry and animal-borne imaging, provides essential individual viewpoints, tracks, and life histories, but requires capture of individuals and is often impractical to scale. Recent developments in automated image-based tracking offer opportunities to remotely quantify and understand individual behavior at scales and resolutions not previously possible, providing an essential supplement to other tracking methodologies in ecology. Automated image-based tracking should continue to advance the field of ecology by enabling better understanding of the linkages between individual and higher-level ecological processes, via high-throughput quantitative analysis of complex ecological patterns and processes across scales, including analysis of environmental drivers.

Journal ArticleDOI
Adrian John Bevan1, B. Golob2, Th. Mannel3, S. Prell4  +2061 moreInstitutions (171)
TL;DR: The physics of the SLAC and KEK B Factories is described in this paper, with a brief description of the detectors, BaBar and Belle, and data-taking-related issues.
Abstract: This work is on the Physics of the B Factories. Part A of this book contains a brief description of the SLAC and KEK B Factories as well as their detectors, BaBar and Belle, and data taking related issues. Part B discusses tools and methods used by the experiments in order to obtain results. The results themselves can be found in Part C.

Journal ArticleDOI
TL;DR: This review will focus on the diverse roles that GSK-3 plays in various human cancers, in particular in solid tumors, and how this pivotal kinase interacts with multiple signaling pathways.
Abstract: The serine/threonine kinase glycogen synthase kinase-3 (GSK-3) was initially identified and studied in the regulation of glycogen synthesis. GSK-3 functions in a wide range of cellular processes. Aberrant activity of GSK-3 has been implicated in many human pathologies including: bipolar depression, Alzheimer’s disease, Parkinson’s disease, cancer, non-insulin-dependent diabetes mellitus (NIDDM) and others. In some cases, suppression of GSK-3 activity by phosphorylation by Akt and other kinases has been associated with cancer progression. In these cases, GSK-3 has tumor suppressor functions. In other cases, GSK-3 has been associated with tumor progression by stabilizing components of the beta-catenin complex. In these situations, GSK-3 has oncogenic properties. While many inhibitors to GSK-3 have been developed, their use remains controversial because of the ambiguous role of GSK-3 in cancer development. In this review, we will focus on the diverse roles that GSK-3 plays in various human cancers, in particular in solid tumors. Recently, GSK-3 has also been implicated in the generation of cancer stem cells in various cell types. We will also discuss how this pivotal kinase interacts with multiple signaling pathways such as: PI3K/PTEN/Akt/mTORC1, Ras/Raf/MEK/ERK, Wnt/beta-catenin, Hedgehog, Notch and others.

Journal ArticleDOI
TL;DR: The review summarizes progress towards panoramic fluorescence imaging and its prospects for the mouse heart, and it is expected that panoramic voltage imaging will lead to novel insights about molecular arrhythmia mechanisms through quantitative strategies and organ-representative analysis of intact mouse hearts.
Abstract: To investigate the dynamics and propensity for arrhythmias in intact transgenic hearts comprehensively, optical strategies for panoramic fluorescence imaging of action potential (AP) propagation are essential. In particular, mechanism-oriented molecular studies usually depend on transgenic mouse hearts of only a few millimeters in size. Furthermore, the temporal scales of the mouse heart remain a challenge for panoramic fluorescence imaging with heart rates ranging from 200 min⁻¹ (e.g. depressed sinus node function) to over 1200 min⁻¹ during fast arrhythmias. To meet these challenging demands, we and others developed physiologically relevant mouse models and characterized their hearts with planar AP mapping. Here, we summarize the progress towards panoramic fluorescence imaging and its prospects for the mouse heart. In general, several high-resolution cameras are synchronized and geometrically arranged for panoramic voltage mapping and the surface and blood vessel anatomy documented through image segmentation and 3D heart surface reconstruction. We expect that panoramic voltage imaging will lead to novel insights about molecular arrhythmia mechanisms through quantitative strategies and 3D analysis of intact mouse hearts.