
Showing papers by "Technical University of Dortmund" published in 2010


Journal ArticleDOI
TL;DR: The included papers present an interesting mixture of recent developments in the field as they cover fundamental research on the design of experiments, models and analysis methods as well as more applied research connected to real-life applications.
Abstract: The design and analysis of computer experiments as a relatively young research field is not only of high importance for many industrial areas but also presents new challenges and open questions for statisticians. This editorial introduces a special issue devoted to the topic. The included papers present an interesting mixture of recent developments in the field as they cover fundamental research on the design of experiments, models and analysis methods as well as more applied research connected to real-life applications.

2,583 citations


Journal ArticleDOI
TL;DR: The task-switching paradigm offers enormous possibilities to study cognitive control as well as task interference, and the current review provides an overview of recent research on both topics.
Abstract: The task-switching paradigm offers enormous possibilities to study cognitive control as well as task interference. The current review provides an overview of recent research on both topics. First, we review different experimental approaches to task switching, such as comparing mixed-task blocks with single-task blocks, predictable task-switching and task-cuing paradigms, intermittent instructions, and voluntary task selection. In the 2nd part, we discuss findings on preparatory control mechanisms in task switching and theoretical accounts of task preparation. We consider preparation processes in two-stage models, consider preparation as an all-or-none process, address the question of whether preparation is switch-specific, reflect on preparation as interaction of cue encoding and memory retrieval, and discuss the impact of verbal mediation on preparation. In the 3rd part, we turn to interference phenomena in task switching. We consider proactive interference of tasks and inhibition of recently performed tasks indicated by asymmetrical switch costs and n-2 task-repetition costs. We discuss stimulus-based interference as a result of stimulus-based response activation and stimulus-based task activation, and response-based interference because of applying bivalent rather than univalent responses, response repetition effects, and carryover of response selection and execution. In the 4th and final part, we mention possible future research fields.

1,223 citations


Book
01 Sep 2010
TL;DR: In this age of information overload, people use a variety of strategies to make choices about what to buy, how to spend their leisure time, and even whom to date as discussed by the authors.
Abstract: In this age of information overload, people use a variety of strategies to make choices about what to buy, how to spend their leisure time, and even whom to date. Recommender systems automate some of these strategies with the goal of providing affordable, personal, and high-quality recommendations. This book offers an overview of approaches to developing state-of-the-art recommender systems. The authors present current algorithmic approaches for generating personalized buying proposals, such as collaborative and content-based filtering, as well as more interactive and knowledge-based approaches. They also discuss how to measure the effectiveness of recommender systems and illustrate the methods with practical case studies. The final chapters cover emerging topics such as recommender systems in the social web and consumer buying behavior theory. Suitable for computer science researchers and students interested in getting an overview of the field, this book will also be useful for professionals looking for the right technology to build real-world recommender systems.
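The collaborative filtering approach the book surveys can be sketched in a few lines: predict a user's unknown rating as a similarity-weighted average of other users' ratings for that item. A minimal, illustrative sketch (the ratings data and function names are invented for this example, not taken from the book):

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse rating dicts (item -> rating)."""
    common = set(u) & set(v)
    num = sum(u[i] * v[i] for i in common)
    den = (math.sqrt(sum(x * x for x in u.values()))
           * math.sqrt(sum(x * x for x in v.values())))
    return num / den if den else 0.0

def predict(ratings, user, item):
    """Predict `user`'s rating for `item` as a similarity-weighted
    average over the other users who have rated that item."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        s = cosine(ratings[user], r)
        num += s * r[item]
        den += abs(s)
    return num / den if den else None

ratings = {
    "alice": {"a": 5, "b": 3},
    "bob":   {"a": 4, "b": 3, "c": 4},
    "carol": {"a": 1, "c": 2},
}
print(predict(ratings, "alice", "c"))
```

Content-based and knowledge-based approaches, also covered in the book, would replace the user-user similarity with item-feature or constraint-based reasoning.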

1,129 citations


Journal ArticleDOI
F. D. Aaron1, Halina Abramowicz2, I. Abt3, Leszek Adamczyk4  +538 moreInstitutions (69)
TL;DR: In this article, a combination of the inclusive deep inelastic cross sections measured by the H1 and ZEUS Collaborations in neutral and charged current unpolarised e(+/-)p scattering at HERA during the period 1994-2000 is presented.
Abstract: A combination is presented of the inclusive deep inelastic cross sections measured by the H1 and ZEUS Collaborations in neutral and charged current unpolarised e(+/-)p scattering at HERA during the period 1994-2000. The data span six orders of magnitude in negative four-momentum-transfer squared, Q(2), and in Bjorken x. The combination method used takes the correlations of systematic uncertainties into account, resulting in an improved accuracy. The combined data are the sole input in a NLO QCD analysis which determines a new set of parton distributions, HERAPDF1.0, with small experimental uncertainties. This set includes an estimate of the model and parametrisation uncertainties of the fit result.

624 citations


Proceedings ArticleDOI
26 Sep 2010
TL;DR: It is argued that the new ways of measuring coverage and serendipity reflect the quality impression perceived by the user in a better way than previous metrics thus leading to enhanced user satisfaction.
Abstract: When we evaluate the quality of recommender systems (RS), most approaches only focus on the predictive accuracy of these systems. Recent works suggest that beyond accuracy there is a variety of other metrics that should be considered when evaluating a RS. In this paper we focus on two crucial metrics in RS evaluation: coverage and serendipity. Based on a literature review, we first discuss both measurement methods as well as the trade-off between good coverage and serendipity. We then analyze the role of coverage and serendipity as indicators of recommendation quality, present novel ways of how they can be measured and discuss how to interpret the obtained measurements. Overall, we argue that our new ways of measuring these concepts reflect the quality impression perceived by the user in a better way than previous metrics thus leading to enhanced user satisfaction.
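The paper's own measurement methods are not reproduced here, but the two concepts can be illustrated with simple stand-ins: catalog coverage as the fraction of items that ever appear in a recommendation list, and a crude serendipity proxy counting recommendations that a trivial popularity baseline would not have made. A hedged sketch (the function names and the baseline choice are assumptions, not the paper's definitions):

```python
def catalog_coverage(rec_lists, catalog):
    """Fraction of the catalog appearing in at least one recommendation list."""
    recommended = set().union(*rec_lists) if rec_lists else set()
    return len(recommended & set(catalog)) / len(catalog)

def serendipity_proxy(rec_lists, baseline):
    """Fraction of recommendation slots not covered by an obvious baseline
    (e.g. most-popular items) -- a crude stand-in for 'unexpectedness'."""
    total = sum(len(r) for r in rec_lists)
    unexpected = sum(len(set(r) - set(baseline)) for r in rec_lists)
    return unexpected / total if total else 0.0

catalog = ["a", "b", "c", "d", "e"]
recs = [["a", "b"], ["a", "c"]]     # one list per user
print(catalog_coverage(recs, catalog))
print(serendipity_proxy(recs, ["a"]))
```

The trade-off the paper discusses shows up directly here: recommending only safe, popular items drives the serendipity proxy toward zero while also shrinking coverage.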

597 citations


Proceedings ArticleDOI
01 Oct 2010
TL;DR: In this article, the LHCb simulation application consists of two independent phases, the generation of the primary event and the tracking of particles produced in the experimental setup, and the design of the generator phase of Gauss is described: a modular structure with well defined interfaces specific to the various tasks, e.g. pp collisions, particles' decays, selections, etc.
Abstract: The LHCb simulation application, Gauss, consists of two independent phases, the generation of the primary event and the tracking of particles produced in the experimental setup. For the LHCb experimental program it is particularly important to model B meson decays: the EvtGen code developed in CLEO and BaBar has been chosen and customized for non-coherent B production as occurring in pp collisions at the LHC. The initial proton-proton collision is provided by a different generator engine, currently PYTHIA 6, for massive production of signal and generic pp collision events. Beam gas events, background events originating from proton halo, cosmics and calibration events for different detectors can be generated in addition to pp collisions. Different generator packages, as available in the physics community or specifically developed in LHCb, are used for the different purposes. Running conditions affecting the generated events, such as the size of the luminous region, the number of collisions occurring in a bunch crossing and the number of spill-over events from neighbouring bunches, are modelled via dedicated algorithms appropriately configured. The design of the generator phase of Gauss will be described: a modular structure with well defined interfaces specific to the various tasks, e.g. pp collisions, particle decays, selections, etc. has been chosen. Different implementations are available for the various tasks, allowing them to be selected and combined as most appropriate at run time, as in the case of PYTHIA 6 for pp collisions or HIJING for beam gas. The advantages of such a structure, for example allowing new generator packages to be adopted transparently, will be discussed.
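The run-time-selectable modular structure described above resembles a plug-in (strategy) design: each task exposes a common interface with interchangeable implementations chosen by configuration. A hypothetical Python sketch of the idea only (Gauss itself is a C++ application; all class and key names here are invented):

```python
class ProductionTool:
    """Common interface for a production-generator engine
    (cf. PYTHIA 6 for pp collisions, HIJING for beam gas)."""
    def generate_event(self):
        raise NotImplementedError

class PythiaLike(ProductionTool):
    def generate_event(self):
        return {"process": "pp collision", "engine": "pythia-like"}

class HijingLike(ProductionTool):
    def generate_event(self):
        return {"process": "beam gas", "engine": "hijing-like"}

# Run-time configuration selects the engine per sample type,
# analogous to choosing generators per task in Gauss.
ENGINES = {"pp": PythiaLike, "beamgas": HijingLike}

def make_generator(sample_type):
    return ENGINES[sample_type]()

event = make_generator("beamgas").generate_event()
print(event["process"])
```

Adding a new generator package then means registering one more implementation of the interface, without touching the rest of the application.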

543 citations


Journal ArticleDOI
Bernard Aubert1, Y. Karyotakis1, J. P. Lees1, V. Poireau1  +488 moreInstitutions (78)
TL;DR: In this article, the authors performed searches for lepton-flavor-violating decays of a tau lepton to a lighter mass lepton and a photon with the entire data set of (963 +/- 7) x 10(6) tau decays collected by the BABAR detector near the Y(4S), Y(3S) and Y(2S) resonances.
Abstract: Searches for lepton-flavor-violating decays of a tau lepton to a lighter-mass lepton and a photon have been performed with the entire data set of (963 +/- 7) x 10(6) tau decays collected by the BABAR detector near the Y(4S), Y(3S) and Y(2S) resonances. The searches yield no evidence of signals, and we set upper limits on the branching fractions of B(tau(+/-) -> e(+/-)gamma) < 3.3 x 10(-8) and B(tau(+/-) -> mu(+/-)gamma) < 4.4 x 10(-8) at 90% confidence level.

502 citations


Journal ArticleDOI
TL;DR: Initial cellular evaluations support the view that compound collections based on natural-product-inspired scaffolds constructed with complex stereochemistry, and decorated with assorted substituents, will be a rich source of compounds with diverse bioactivity.
Abstract: A Lewis-acid-catalysed 1,3-dipolar cycloaddition provides rapid access to a variety of substituted spirooxindoles. Initial cellular evaluations support the view that compound collections based on natural-product-inspired scaffolds constructed with complex stereochemistry, and decorated with assorted substituents, will be a rich source of compounds with diverse bioactivity.

487 citations


Journal ArticleDOI
TL;DR: This study shows that the extruded magnesium alloy LAE442 provides low corrosion rates and reacts in vivo with an acceptable host response and the in vivo corrosion rate can be further reduced by additional MgF(2) coating.

389 citations


Journal ArticleDOI
TL;DR: This Review surveys current approaches to generate DNA-protein conjugates as well as recent advances in their applications in sensing, materials science, and catalysis.
Abstract: Conjugation with artificial nucleic acids allows proteins to be modified with a synthetically accessible, robust tag. This attachment is addressable in a highly specific manner by means of molecular recognition events, such as Watson-Crick hybridization. Such DNA-protein conjugates, with their combined properties, have a broad range of applications, such as in high-performance biomedical diagnostic assays, fundamental research on molecular recognition, and the synthesis of DNA nanostructures. This Review surveys current approaches to generate DNA-protein conjugates as well as recent advances in their applications. For example, DNA-protein conjugates have been assembled into model systems for the investigation of catalytic cascade reactions and light-harvesting devices. Such hybrid conjugates are also used for the biofunctionalization of planar surfaces for micro- and nanoarrays, and for decorating inorganic nanoparticles to enable applications in sensing, materials science, and catalysis.

337 citations


Journal ArticleDOI
TL;DR: The development of multicellular organisms is controlled by differential gene expression whereby cells adopt distinct fates as mentioned in this paper, and a spatially resolved view of gene expression allows the elucidation of transcriptional networks that are linked to cellular identity and function.

Journal ArticleDOI
TL;DR: Data support the hypothesis that the major corrosion product Mg(OH)(2) from any magnesium alloy is the major origin of the observed enhanced bone growth in vivo; further studies have to evaluate whether the enhanced bone growth is mainly due to the local magnesium ion concentration or to the local alkalosis accompanying the Mg(OH)(2) dissolution.

Journal ArticleDOI
TL;DR: A high-order finite-element application, which performs the numerical simulation of seismic wave propagation resulting from earthquakes at the scale of a continent or from active seismic acquisition experiments in the oil industry, on a large cluster of NVIDIA Tesla graphics cards using the CUDA programming environment and non-blocking message passing based on MPI.

Journal ArticleDOI
TL;DR: In this paper, the vibrational C-H stretching overtone and combination bands dominate the spectra, enabling an optical characterization of the core and cladding materials and also providing information for their synthesis.

Journal ArticleDOI
TL;DR: YLoc, an interpretable web server for predicting subcellular localization that uses natural language to explain why a prediction was made and which biological property of the protein was mainly responsible for it, and estimates the reliability of its own predictions.
Abstract: Predicting subcellular localization has become a valuable alternative to time-consuming experimental methods. Major drawbacks of many of these predictors are their lack of interpretability and the fact that they do not provide an estimate of the confidence of an individual prediction. We present YLoc, an interpretable web server for predicting subcellular localization. YLoc uses natural language to explain why a prediction was made and which biological property of the protein was mainly responsible for it. In addition, YLoc estimates the reliability of its own predictions. YLoc can, thus, assist in understanding protein localization and in location engineering of proteins. The YLoc web server is available online at www.multiloc.org/YLoc.
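The idea of attaching a reliability estimate to each prediction can be illustrated with any probabilistic classifier: report a location only when the top posterior probability clears a threshold, and otherwise flag the prediction as unreliable. A generic sketch, not YLoc's actual model (the threshold value and location labels are illustrative):

```python
def predict_with_confidence(posteriors, threshold=0.8):
    """Given class posteriors (label -> probability), return
    (label, confidence) if the top class clears the threshold,
    else (None, confidence) to mark the prediction as unreliable."""
    label = max(posteriors, key=posteriors.get)
    conf = posteriors[label]
    return (label if conf >= threshold else None, conf)

# A confident call is reported; an ambiguous one is flagged.
print(predict_with_confidence({"nucleus": 0.92, "cytoplasm": 0.08}))
print(predict_with_confidence({"nucleus": 0.55, "cytoplasm": 0.45}))
```

Discarding the flagged predictions trades coverage for accuracy, which is how a user decides "what level of error is acceptable."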

Journal ArticleDOI
TL;DR: This aim is to showcase how organocatalytic reactions have overcome some of their teething problems and how they frequently offer very practical solutions to long-standing synthetic challenges.

Journal ArticleDOI
TL;DR: It is demonstrated here that DNA nanostructures can be site-specifically decorated with several different proteins by using coupling systems orthogonal to the biotin–STV system, and the general applicability of this approach for the generation of DNA superstructures that are selectively decorated with multiple different proteins is demonstrated.
Abstract: Structural DNA nanotechnology and the technique of DNA origami enable the rapid generation of a plethora of complex self-assembled nanostructures. Since DNA molecules themselves display limited chemical, optical, and electronic functionality, it is of utmost importance to devise methods to decorate DNA scaffolds with functional moieties to realize applications in sensing, catalysis, and device fabrication. Protein functionalization is particularly desirable because it allows exploitation of an almost unlimited variety of functional elements which nature has evolved over billions of years. Owing to the delicate architecture of proteins, no generally applicable method is currently available to selectively couple these components to DNA scaffolds, and thus approaches used so far are based on reversible antibody-antigen interactions, aptamer binding, nucleic acid hybridization of DNA-tagged proteins, or predominantly biotin-streptavidin (STV) interactions. We demonstrate here that DNA nanostructures can be site-specifically decorated with several different proteins by using coupling systems orthogonal to the biotin-STV system. In particular, benzylguanine (BG) and chlorohexane (CH) groups incorporated in DNA origami have been used as suicide ligands for the site-specific coupling of fusion proteins containing the self-labeling protein tags O-alkylguanine-DNA-alkyltransferase (hAGT), which is often referred to as "Snap-tag", or haloalkane dehalogenase, which is also known as "HaloTag". By using various model proteins we demonstrate the general applicability of this approach for the generation of DNA superstructures that are selectively decorated with multiple different proteins. To realize orthogonal protein immobilization on DNA origami using self-ligating protein tags, we chose the Snap-tag, developed by Johnsson and co-workers, and the commercially available HaloTag system. 
The respective small-molecule suicide tags (O-benzylguanine (BG) and 5-chlorohexane (CH)) for both self-labeling protein tags are readily available as amine-reactive N-hydroxysuccinimide (NHS) derivatives (BG-NHS and CH-NHS; Figure 1a). Complete derivatization of alkylamino-modified oligonucleotides was achieved by coupling with 30 molar equivalents of BG-NHS or CH-NHS, as indicated by electrophoretic analysis (Figure 1b). To gain access to fusion proteins bearing the complementary Snap- and Halo-protein tags, we constructed expression plasmids by genetic fusion of the genes encoding the protein of interest (POI) and Snap-tag or HaloTag (see the Supporting Information). As model POIs we chose the fluorescent proteins enhanced yellow fluorescent protein (EYFP) and mKate, and the enzymes cytochrome C peroxidase (CCP) and esterase 2 from Alicyclobacillus acidocaldarius (EST2), to which the self-labeling tags were fused at the C terminus (POI-Snap or POI-Halo, respectively). In addition, the bispecific Halo-Snap fusion protein "covalin", a chimera which specifically reacts with both BG and CH, as well as monovalent STV (mSTV), were used in this study. The fusion proteins were overexpressed and purified by conventional procedures (see the Supporting Information). The coupling of BG- and CH-modified oligonucleotides to the protein was analyzed by using covalin as the initial model to simplify the electrophoretic characterization. It is shown in Figure 1c that both BG- and CH-modified single-stranded DNA (ssDNA) oligonucleotides couple effectively to generate the corresponding DNA-covalin conjugates in nearly quantitative yields. DNA coupling of the aforementioned POI fusions, namely mKate-Snap, EST2-Snap, mKate-Halo, CCP-Halo, and EYFP-Halo, occurred in a highly specific manner (Figure 1d), and neither Snap nor Halo nor mSTV revealed cross-reactivity for the orthogonally tagged DNA oligomers. 
We then used SARSE software to aid in the design of face-shaped DNA origami to demonstrate the selective immobilization of protein on DNA nanostructures. Correct folding of M13mp18 ssDNA through the use of 236 staple strands was analyzed by atomic force microscopy (AFM; details of the sequence design as well as experimental procedures are reported in the Supporting Information). Figure 2a illustrates that the face-shaped DNA origami was obtained in high purity, and high-resolution AFM clearly revealed the proposed ears, neck, and seam features of this structure. As an initial test for protein decoration, we selected 23 staple strands, which were biotinylated to create eyes.
[*] Dr. B. Saccà, Dipl.-Chem. R. Meyer, Dipl.-Biotechnol. M. Erkelenz, M. Sc. K. Kiko, A. Arndt, Dr. H. Schroeder, Dr. K. S. Rabe, Prof. C. M. Niemeyer; Technische Universität Dortmund, Fakultät Chemie, Biologisch-Chemische Mikrostrukturtechnik, Otto-Hahn-Strasse 6, 44227 Dortmund (Germany); Fax: (+49) 231-755-7082; E-mail: christof.niemeyer@tu-dortmund.de. These authors contributed equally to this work.

Proceedings Article
16 Jan 2010
TL;DR: A new k-means clustering algorithm for data streams of points from a Euclidean space that provides a good alternative to BIRCH and StreamLS, in particular, if the number of cluster centers is large.
Abstract: We develop a new k-means clustering algorithm for data streams, which we call StreamKM++. Our algorithm computes a small weighted sample of the data stream and solves the problem on the sample using the k-means++ algorithm [1]. To compute the small sample, we propose two new techniques. First, we use a non-uniform sampling approach similar to the k-means++ seeding procedure to obtain small coresets from the data stream. This construction is rather easy to implement and, unlike other coreset constructions, its running time has only a low dependency on the dimensionality of the data. Second, we propose a new data structure which we call a coreset tree. The use of these coreset trees significantly speeds up the time necessary for the non-uniform sampling during our coreset construction. We compare our algorithm experimentally with two well-known streaming implementations (BIRCH [16] and StreamLS [4, 9]). In terms of quality (sum of squared errors), our algorithm is comparable with StreamLS and significantly better than BIRCH (up to a factor of 2). In terms of running time, our algorithm is slower than BIRCH. Comparing the running time with StreamLS, it turns out that our algorithm scales much better with increasing number of centers. We conclude that, if the first priority is the quality of the clustering, then our algorithm provides a good alternative to BIRCH and StreamLS, in particular, if the number of cluster centers is large. We also give a theoretical justification of our approach by proving that our sample set is a small coreset in low dimensional spaces.
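The non-uniform sampling the abstract refers to follows the k-means++ seeding idea: each new sample is drawn with probability proportional to its squared distance from the points already chosen, so far-away regions of the stream are likely to be represented. A minimal D²-sampling sketch of the seeding step only (this is not the paper's coreset-tree construction, which exists to speed this step up):

```python
import random

def dist2(p, q):
    """Squared Euclidean distance between two points."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def d2_sample(points, k, rng=None):
    """k-means++-style seeding: pick k points, each new one drawn with
    probability proportional to its squared distance to the nearest
    already-chosen point."""
    rng = rng or random.Random(0)
    centers = [rng.choice(points)]
    while len(centers) < k:
        # Weight every point by its distance to the closest center so far.
        weights = [min(dist2(p, c) for c in centers) for p in points]
        r = rng.uniform(0, sum(weights))
        acc = 0.0
        for p, w in zip(points, weights):
            acc += w
            if acc >= r:
                centers.append(p)
                break
    return centers

# Two well-separated clusters: D^2 sampling almost always
# picks one representative from each.
pts = [(0.0, 0.0), (0.1, 0.0), (10.0, 10.0), (10.1, 10.0)]
print(d2_sample(pts, 2))
```

Because already-covered points get near-zero weight, the sample spreads over the data, which is why it yields a good coreset with low dependence on dimensionality.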

Journal ArticleDOI
TL;DR: In this article, the history of Incremental Sheet Forming (ISF) is described with a focus on technological developments; an extensive list of patents, including Japanese patents, is provided, and the overall conclusion is that ISF has attracted worldwide attention, in particular from the automotive industry.

Journal ArticleDOI
TL;DR: The latter technique offers the promise of enabling simple, easily applicable, and robust reaction schemes, for example, by circumventing the ‘cofactor challenge’ and introducing redox power directly to the enzyme’s active sites.
Abstract: Redox enzymes have tremendous potential as catalysts for preparative organic chemistry. Their usually high selectivity, paired with their catalytic efficiency under mild reaction conditions, makes them potentially very valuable tools for synthesis. The number of interesting monooxygenases, dehydrogenases, reductases, oxidases, and peroxidases is steadily increasing and the tailoring of a given biocatalyst is more and more becoming standard technology. However, their cofactor dependency still represents a major impediment en route to true preparative applicability. Currently, three different approaches to deal with this 'cofactor challenge' are being pursued: using whole cells, biomimetic approaches comprising enzymatic cofactor regeneration systems, and 'unconventional' nonenzymatic regeneration. The latter technique offers the promise of enabling simple, easily applicable, and robust reaction schemes, for example, by circumventing the 'cofactor challenge' and introducing redox power directly to the enzyme's active sites.

BookDOI
03 Nov 2010
TL;DR: In this article, the main issues in the experimental analysis of algorithms are discussed, and the experimental cycle of algorithm development is discussed, as well as statistical distributions of algorithm performance in terms of solution quality, runtime and other measures.
Abstract: In operations research and computer science it is common practice to evaluate the performance of optimization algorithms on the basis of computational results, and the experimental approach should follow accepted principles that guarantee the reliability and reproducibility of results. However, computational experiments differ from those in other sciences, and the last decade has seen considerable methodological research devoted to understanding the particular features of such experiments and assessing the related statistical methods. This book consists of methodological contributions on different scenarios of experimental analysis. The first part overviews the main issues in the experimental analysis of algorithms, and discusses the experimental cycle of algorithm development; the second part treats the characterization by means of statistical distributions of algorithm performance in terms of solution quality, runtime and other measures; and the third part collects advanced methods from experimental design for configuring and tuning algorithms on a specific class of instances with the goal of using the least amount of experimentation. The contributor list includes leading scientists in algorithm design, statistical design, optimization and heuristics, and most chapters provide theoretical background and are enriched with case studies. This book is written for researchers and practitioners in operations research and computer science who wish to improve the experimental assessment of optimization algorithms and, consequently, their design.

Journal ArticleDOI
TL;DR: TXNRD1 and TXNIP are associated with prognosis in breast cancer, and ERBB2 appears to be one of the factors shifting the balance between these two components of the redox control system in a prognostically unfavorable manner.
Abstract: The purpose of this work was to study the prognostic influence in breast cancer of thioredoxin reductase 1 (TXNRD1) and thioredoxin interacting protein (TXNIP), key players in oxidative stress control that are currently evaluated as possible therapeutic targets. Analysis of the association of TXNRD1 and TXNIP RNA expression with the metastasis-free interval (MFI) was performed in 788 patients with node-negative breast cancer, consisting of three individual cohorts (Mainz, Rotterdam and Transbig). Correlation with metagenes and conventional clinical parameters (age, pT stage, grading, hormone and ERBB2 status) was explored. MCF-7 cells with a doxycycline-inducible expression of an oncogenic ERBB2 were used to investigate the influence of ERBB2 on TXNRD1 and TXNIP transcription. TXNRD1 was associated with worse MFI in the combined cohort (hazard ratio = 1.955; P < 0.001) as well as in all three individual cohorts. In contrast, TXNIP was associated with better prognosis (hazard ratio = 0.642; P < 0.001) and similar results were obtained in all three subcohorts. Interestingly, patients with ERBB2-status-positive tumors expressed higher levels of TXNRD1. Induction of ERBB2 in MCF-7 cells caused not only an immediate increase in TXNRD1 but also a strong decrease in TXNIP. A subsequent upregulation of TXNIP as cells undergo senescence was accompanied by a strong increase in levels of reactive oxygen species. TXNRD1 and TXNIP are associated with prognosis in breast cancer, and ERBB2 appears to be one of the factors shifting the balance between these two components of the redox control system in a prognostically unfavorable manner.

Journal ArticleDOI
TL;DR: Chemical and biological single cell analyses provide an unprecedented access to the understanding of cell-to-cell differences and basic biological concepts.

Journal ArticleDOI
TL;DR: Whereas the effect of 'strain' was stable in the four heterogenized experiments, outcomes of the four standardized experiments were highly variable, suggesting that systematic variation improves the reproducibility of animal experiments.
Abstract: [...] measures to compare between-experiment variation for the standardized and heterogenized design. Whereas strain differences were relatively consistent among heterogenized experiments, they varied considerably between standardized experiments (Fig. 1a-c). In 33 of 36 measures, between-experiment variation was lower in the heterogenized design, indicating better reproducibility. We also analyzed each experiment separately as if conducted independently in different laboratories and assessed the effect of 'strain' on each of the 36 measures using a general linear model (GLM). Based on the 2 × 2 factorial nature of the heterogenized design and cage position in the rack, we divided each replicate experiment into four 'blocks', each comprising one cage per strain (Supplementary Fig. 1), and included 'block' nested within experiment as blocking factor in the GLM (Supplementary Methods). Whereas the effect of 'strain' was stable in the four heterogenized experiments, outcomes of the four standardized experiments were highly variable (Supplementary Fig. 2), suggesting that systematic variation improves the reproducibility of animal experiments.

Journal ArticleDOI
Roel Aaij, C. Abellan Beteta1, Bernardo Adeva2, Marco Adinolfi3  +626 moreInstitutions (48)
TL;DR: In this article, the average cross-section to produce b-flavoured or anti-b-flavoured hadrons is (75.3 +/- 5.4 +/- 13.0) microbarns.

Journal ArticleDOI
Jelena Aleksić1, Louis Antonelli2, P. Antoranz3, Michael Backes4  +156 moreInstitutions (21)
TL;DR: The MAGIC Cherenkov telescope was used to observe the Perseus galaxy cluster for a total effective time of 24.4 hours during 2008 November and December as discussed by the authors, and the resulting upper limits on the gamma-ray emission above 100 GeV are in the range of 4.6-7.5 x 10(-12) cm(-2) s(-1), thereby constraining the emission produced by cosmic rays, dark matter annihilations and the central radio galaxy NGC 1275.
Abstract: The Perseus galaxy cluster was observed by the MAGIC Cherenkov telescope for a total effective time of 24.4 hr during 2008 November and December. The resulting upper limits on the gamma-ray emission above 100 GeV are in the range of 4.6-7.5 x 10(-12) cm(-2) s(-1) for spectral indices from -1.5 to -2.5, thereby constraining the emission produced by cosmic rays, dark matter annihilations, and the central radio galaxy NGC 1275. Results are compatible with cosmological cluster simulations for the cosmic-ray-induced gamma-ray emission, constraining the average cosmic ray-to-thermal pressure to < 4% for the cluster core region (< 8% for the entire cluster).

Journal ArticleDOI
TL;DR: In this paper, the authors report findings from longitudinal analyses of the German nation-wide travel survey KONTIV for the period 1976-2002, focusing on travel mode choice, subdivided by distance categories, and also taking car availability and city size into account.

Journal ArticleDOI
Georges Aad, E. Abat, Brad Abbott, Jalal Abdallah  +3208 moreInstitutions (169)
TL;DR: The first measurements from proton-proton collisions recorded with the ATLAS detector at the LHC are presented in this paper, where the charged-particle multiplicity, its dependence on transverse momentum and pseudorapidity, and the relationship between mean transverse momentum and charged-particle multiplicity are measured for events with at least one charged particle in the kinematic range.

Journal ArticleDOI
TL;DR: In this article, the authors present a mechanistic understanding of the architecture of biogeochemical interfaces in soils and of the complex interplay and interdependencies of the physical, chemical, and biological processes acting at and within these dynamic interfaces in soil.
Abstract: Soil, the “Earth’s thin skin” serves as the delicate interface between the biosphere, hydrosphere, atmosphere, and lithosphere. It is a dynamic and hierarchically organized system of various organic and inorganic constituents and organisms, the spatial structure of which defines a large, complex, and heterogeneous interface. Biogeochemical processes at soil interfaces are fundamental for the overall soil development, and they are the primary driving force for key ecosystem functions such as plant productivity and water quality. Ultimately, these processes control the fate and transport of contaminants and nutrients into the vadose zone and as such their biogeochemical cycling. The definite objective in biogeochemical-interface research is to gain a mechanistic understanding of the architecture of these biogeochemical interfaces in soils and of the complex interplay and interdependencies of the physical, chemical, and biological processes acting at and within these dynamic interfaces in soil. The major challenges are (1) to identify the factors controlling the architecture of biogeochemical interfaces, (2) to link the processes operative at the individual molecular and/or organism scale to the phenomena active at the aggregate scale in a mechanistic way, and (3) to explain the behavior of organic chemicals in soil within a general mechanistic framework. To put this in action, integration of soil physical, chemical, and biological disciplines is mandatory. Indispensably, it requires the adaption and development of characterization and probing techniques adapted from the neighboring fields of molecular biology, analytical and computational chemistry as well as materials and nano-sciences. To shape this field of fundamental soil research, the German Research Foundation (DFG) has granted the Priority Program “Biogeochemical Interfaces in Soil”, in which 22 individual research projects are involved.

Journal ArticleDOI
TL;DR: YLoc is a novel method for predicting protein subcellular localization that is able to reliably predict multiple locations and outperforms the best predictors in this area and provides a confidence estimate for the prediction.
Abstract: Motivation: Protein subcellular localization is pivotal in understanding a protein’s function. Computational prediction of subcellular localization has become a viable alternative to experimental approaches. While current machine learning-based methods yield good prediction accuracy, most of them suffer from two key problems: lack of interpretability and dealing with multiple locations. Results: We present YLoc, a novel method for predicting protein subcellular localization that addresses these issues. Due to its simple architecture, YLoc can identify the relevant features of a protein sequence contributing to its subcellular localization, e.g. localization signals or motifs relevant to protein sorting. We present several example applications where YLoc identifies the sequence features responsible for protein localization, and thus reveals not only to which location a protein is transported to, but also why it is transported there. YLoc also provides a confidence estimate for the prediction. Thus, the user can decide what level of error is acceptable for a prediction. Due to a probabilistic approach and the use of several thousands of dual-targeted proteins, YLoc is able to predict multiple locations per protein. YLoc was benchmarked using several independent datasets for protein subcellular localization and performs on par with other state-of-the-art predictors. Disregarding low-confidence predictions, YLoc can achieve prediction accuracies of over 90%. Moreover, we show that YLoc is able to reliably predict multiple locations and outperforms the best predictors in this area. Availability: www.multiloc.org/YLoc