
Showing papers by "Vienna University of Technology" published in 2015


Journal ArticleDOI
TL;DR: While the intrinsic complexity of natural product-based drug discovery necessitates highly integrated interdisciplinary approaches, the reviewed scientific developments, recent technological advances, and research trends clearly indicate that natural products will be among the most important sources of new drugs in the future.

1,760 citations


Journal ArticleDOI
TL;DR: An efficient evaluation tool for 3D medical image segmentation is proposed using 20 evaluation metrics based on a comprehensive literature review and guidelines for selecting a subset of these metrics that is suitable for the data and the segmentation task are provided.
Abstract: Medical image segmentation is an important image processing step. Comparing images to evaluate the quality of segmentation is an essential part of measuring progress in this research area. Some of the challenges in evaluating medical segmentation are: metric selection, the use in the literature of multiple definitions for certain metrics, inefficiency of the metric calculation implementations leading to difficulties with large volumes, and lack of support for fuzzy segmentation by existing metrics. First we present an overview of 20 evaluation metrics selected based on a comprehensive literature review. For fuzzy segmentation, which shows the level of membership of each voxel to multiple classes, fuzzy definitions of all metrics are provided. We present a discussion about metric properties to provide a guide for selecting evaluation metrics. Finally, we propose an efficient evaluation tool implementing the 20 selected metrics. The tool is optimized to perform efficiently in terms of speed and required memory, even when the image size is extremely large, as in the case of whole body MRI or CT volume segmentation. An implementation of this tool is available as an open source project. We propose an efficient evaluation tool for 3D medical image segmentation using 20 evaluation metrics and provide guidelines for selecting a subset of these metrics that is suitable for the data and the segmentation task.

1,561 citations
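The fuzzy metric definitions discussed in the abstract above can be illustrated with the Dice overlap. A minimal sketch, using min() as the fuzzy intersection (one common convention; the paper's exact definitions may differ, and `fuzzy_dice` is an illustrative name, not the tool's API):

```python
import numpy as np

def fuzzy_dice(a, b):
    """Fuzzy Dice overlap between two membership volumes with values in [0, 1].

    Uses the element-wise minimum as the fuzzy intersection; crisp
    segmentations (memberships in {0, 1}) are a special case.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    intersection = np.minimum(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

# Crisp example: two voxels segmented, one of them correct.
seg = np.array([1.0, 1.0, 0.0, 0.0])
ref = np.array([1.0, 0.0, 0.0, 0.0])
print(fuzzy_dice(seg, ref))  # 2*1 / (2+1) ≈ 0.667
```

The same function accepts graded memberships unchanged, which is the point of the fuzzy generalization.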


Journal ArticleDOI
TL;DR: The main knowledge gaps, the future research needs and the policy and management options that should be prioritized to tackle antibiotic resistance in the environment are discussed.
Abstract: Antibiotic resistance is a threat to human and animal health worldwide, and key measures are required to reduce the risks posed by antibiotic resistance genes that occur in the environment. These measures include the identification of critical points of control, the development of reliable surveillance and risk assessment procedures, and the implementation of technological solutions that can prevent environmental contamination with antibiotic resistant bacteria and genes. In this Opinion article, we discuss the main knowledge gaps, the future research needs and the policy and management options that should be prioritized to tackle antibiotic resistance in the environment.

1,495 citations


Journal ArticleDOI
TL;DR: Some of the driving theoretical ideas and first experimental realizations of hybrid quantum systems, and the opportunities and challenges they present, are reviewed, offering a glance at the near- and long-term perspectives of this fascinating and rapidly expanding field.
Abstract: An extensively pursued current direction of research in physics aims at the development of practical technologies that exploit the effects of quantum mechanics. As part of this ongoing effort, devices for quantum information processing, secure communication, and high-precision sensing are being implemented with diverse systems, ranging from photons, atoms, and spins to mesoscopic superconducting and nanomechanical structures. Their physical properties make some of these systems better suited than others for specific tasks; thus, photons are well suited for transmitting quantum information, weakly interacting spins can serve as long-lived quantum memories, and superconducting elements can rapidly process information encoded in their quantum states. A central goal of the envisaged quantum technologies is to develop devices that can simultaneously perform several of these tasks, namely, reliably store, process, and transmit quantum information. Hybrid quantum systems composed of different physical components with complementary functionalities may provide precisely such multitasking capabilities. This article reviews some of the driving theoretical ideas and first experimental realizations of hybrid quantum systems and the opportunities and challenges they present and offers a glance at the near- and long-term perspectives of this fascinating and rapidly expanding field.

743 citations


Journal ArticleDOI
Khachatryan, Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam, +2118 more (3 institutions)
TL;DR: In this article, the performance and strategies used in electron reconstruction and selection at CMS are presented, based on data corresponding to an integrated luminosity of 19.7 inverse femtobarns, collected in proton-proton collisions at sqrt(s) = 8 TeV at the CERN LHC.
Abstract: The performance and strategies used in electron reconstruction and selection at CMS are presented based on data corresponding to an integrated luminosity of 19.7 inverse femtobarns, collected in proton-proton collisions at sqrt(s) = 8 TeV at the CERN LHC. The paper focuses on prompt isolated electrons with transverse momenta ranging from about 5 to a few 100 GeV. A detailed description is given of the algorithms used to cluster energy in the electromagnetic calorimeter and to reconstruct electron trajectories in the tracker. The electron momentum is estimated by combining the energy measurement in the calorimeter with the momentum measurement in the tracker. Benchmark selection criteria are presented, and their performances assessed using Z, Upsilon, and J/psi decays into electron-positron pairs. The spectra of the observables relevant to electron reconstruction and selection as well as their global efficiencies are well reproduced by Monte Carlo simulations. The momentum scale is calibrated with an uncertainty smaller than 0.3%. The momentum resolution for electrons produced in Z boson decays ranges from 1.7 to 4.5%, depending on electron pseudorapidity and energy loss through bremsstrahlung in the detector material.

633 citations


Journal ArticleDOI
TL;DR: In this paper, the authors review the state-of-the-art research, current obstacles, and future needs and directions for the following four-step iterative process: (1) occupant monitoring and data collection, (2) model development, (3) model evaluation, and (4) model implementation into building simulation tools.

629 citations


Journal ArticleDOI
10 Apr 2015-Science
TL;DR: It is shown experimentally that a degenerate one-dimensional Bose gas relaxes to a state that can be described by such a generalized ensemble, and this is verified through a detailed study of correlation functions up to 10th order.
Abstract: The description of the non-equilibrium dynamics of isolated quantum many-body systems within the framework of statistical mechanics is a fundamental open question. Conventional thermodynamical ensembles fail to describe the large class of systems that exhibit nontrivial conserved quantities, and generalized ensembles have been predicted to maximize entropy in these systems. We show experimentally that a degenerate one-dimensional Bose gas relaxes to a state that can be described by such a generalized ensemble. This is verified through a detailed study of correlation functions up to 10th order. The applicability of the generalized ensemble description for isolated quantum many-body systems points to a natural emergence of classical statistical properties from the microscopic unitary quantum evolution.

541 citations
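The role of high-order correlation functions as a statistical-mechanics check, as in the abstract above, can be caricatured with a toy calculation: for Gaussian fluctuations the even moments obey Wick's theorem, ⟨x^(2n)⟩ = (2n−1)!! σ^(2n). A hedged sketch on synthetic data (this is a generic illustration, not the experimental analysis of the paper):

```python
import numpy as np
from math import prod

rng = np.random.default_rng(42)
# Synthetic stand-in for measured fluctuations with unit variance.
samples = rng.normal(0.0, 1.0, 200_000)

def double_factorial(n):
    """n!! for non-negative n (empty product is 1)."""
    return prod(range(n, 0, -2)) if n > 0 else 1

# Compare empirical even moments up to 10th order with the
# Gaussian (Wick) prediction (2n-1)!! for unit variance.
for order in range(2, 11, 2):
    measured = np.mean(samples ** order)
    wick = double_factorial(order - 1)
    print(order, measured, wick)
```

Statistical noise grows rapidly with the order, which is why verifying 10th-order correlators experimentally, as the paper does, is a demanding test.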


Journal ArticleDOI
TL;DR: Techniques essential to the functioning of an STE are described and it is argued that data emerging from these technologies are the driver for new business models, interaction paradigms and even new species.

494 citations


Journal ArticleDOI
Vardan Khachatryan, Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam, +2802 more (215 institutions)
04 Jun 2015-Nature
TL;DR: In this article, the decays of the strange B meson (B-s(0)) and the B-0 meson into two oppositely charged muons (mu(+) and mu(-)) were observed, and their branching fractions measured.
Abstract: The standard model of particle physics describes the fundamental particles and their interactions via the strong, electromagnetic and weak forces. It provides precise predictions for measurable quantities that can be tested experimentally. The probabilities, or branching fractions, of the strange B meson (B-s(0)) and the B-0 meson decaying into two oppositely charged muons (mu(+) and mu(-)) are especially interesting because of their sensitivity to theories that extend the standard model. The standard model predicts that the B-s(0)->mu(+)mu(-) and B-0 ->mu(+)mu(-) decays are very rare, with about four of the former occurring for every billion B-s(0) mesons produced, and one of the latter occurring for every ten billion B-0 mesons. A difference in the observed branching fractions with respect to the predictions of the standard model would provide a direction in which the standard model should be extended. Before the Large Hadron Collider (LHC) at CERN started operating, no evidence for either decay mode had been found. Upper limits on the branching fractions were an order of magnitude above the standard model predictions. The CMS (Compact Muon Solenoid) and LHCb (Large Hadron Collider beauty) collaborations have performed a joint analysis of the data from proton-proton collisions that they collected in 2011 at a centre-of-mass energy of seven teraelectronvolts and in 2012 at eight teraelectronvolts. Here we report the first observation of the B-s(0)->mu(+)mu(-) decay, with a statistical significance exceeding six standard deviations, and the best measurement so far of its branching fraction. Furthermore, we obtained evidence for the B-0 ->mu(+)mu(-) decay with a statistical significance of three standard deviations. Both measurements are statistically compatible with standard model predictions and allow stringent constraints to be placed on theories beyond the standard model.
The LHC experiments will resume taking data in 2015, recording proton-proton collisions at a centre-of-mass energy of 13 teraelectronvolts, which will approximately double the production rates of B-s(0) and B-0 mesons and lead to further improvements in the precision of these crucial tests of the standard model.

467 citations


Journal ArticleDOI
TL;DR: In this paper, the authors evaluated the skill of a new, merged soil moisture product (ECV_SM) that has been developed in the framework of the European Space Agency's Water Cycle Multi-mission Observation Strategy and Climate Change Initiative projects.

463 citations


Journal ArticleDOI
TL;DR: WTE plants support decisions about waste and environmental management: they can routinely and cost-effectively supply information about chemical waste composition as well as about the ratio of biogenic to fossil carbon in MSW and off-gas.

Proceedings ArticleDOI
01 Jan 2015
TL;DR: This paper presents and compares existing IoT application layer protocols as well as protocols that are utilized to connect the “things” but also end-user applications to the Internet, and argues their suitability for the IoT by considering reliability, security, and energy consumption aspects.
Abstract: It has been more than fifteen years since the term Internet of Things (IoT) was introduced. However, despite the efforts of research groups and innovative corporations, still today it is not possible to say that the IoT is upon us. This is mainly due to the fact that a unified IoT architecture has not yet been clearly defined and there is no common agreement in defining communication protocols and standards for all the IoT parts. The framework that current IoT platforms use consists mostly of technologies that partially fulfill the IoT requirements. While developers employ existing technologies to build the IoT, research groups are working on adapting protocols to the IoT in order to optimize communications. In this paper, we present and compare existing IoT application layer protocols as well as protocols that are utilized to connect the "things" but also end-user applications to the Internet. We highlight IETF's CoAP, IBM's MQTT, and HTML5's WebSocket among others, and we argue their suitability for the IoT by considering reliability, security, and energy consumption aspects. Finally, we provide our conclusions for the IoT application layer communications based on the study that we have conducted.

Journal ArticleDOI
TL;DR: In this paper, the authors present various models for the origin of the electric noise, provide a critical review of the experimental findings, and summarize the important questions that are still open in this active research area.
Abstract: How can the electric noise in the vicinity of a metallic body be measured and understood? Trapped ions, known as unique tools for metrology and quantum information processing, also constitute very sensitive probes of this electric noise for distances from micrometers to millimeters. This paper presents various models for the origin of the electric noise, provides a critical review of the experimental findings, and summarizes the important questions that are still open in this active research area.

Journal ArticleDOI
TL;DR: In this paper, the authors focus on the chronoscopy of the photoelectric effect by attosecond streaking, highlight the unresolved and open questions, and point to future directions aiming at the observation and control of electronic motion in more complex nanoscale structures and in condensed matter.
Abstract: Recent advances in the generation of well characterized sub-femtosecond laser pulses have opened up unpredicted opportunities for the real-time observation of ultrafast electronic dynamics in matter. Such attosecond chronoscopy allows a novel look at a wide range of fundamental photophysical and photochemical processes in the time domain, including Auger and autoionization processes, photoemission from atoms, molecules, and surfaces, complementing conventional energy-domain spectroscopy. Attosecond chronoscopy raises fundamental conceptual and theoretical questions, such as which novel information becomes accessible and which dynamical processes can be controlled and steered. These questions are currently a matter of lively debate, which we address in this review. We will focus on one prototypical case, the chronoscopy of the photoelectric effect by attosecond streaking. Is photoionization instantaneous or is there a finite response time of the electronic wavefunction to the photoabsorption event? Answers to this question turn out to be far more complex and multi-faceted than initially thought. They touch upon fundamental issues of time and time delay as observables in quantum theory. We review recent progress of our understanding of time-resolved photoemission from atoms, molecules, and solids. We highlight the unresolved and open questions and point to future directions aiming at the observation and control of electronic motion in more complex nanoscale structures and in condensed matter.

Journal ArticleDOI
TL;DR: In this article, an empirical troposphere delay model is presented that provides the mean values plus annual and semiannual amplitudes of pressure, temperature and its lapse rate, water vapor pressure and its decrease factor, weighted mean temperature, as well as hydrostatic and wet mapping function coefficients of the Vienna mapping function 1.
Abstract: Global pressure and temperature 2 wet (GPT2w) is an empirical troposphere delay model providing the mean values plus annual and semiannual amplitudes of pressure, temperature and its lapse rate, water vapor pressure and its decrease factor, weighted mean temperature, as well as hydrostatic and wet mapping function coefficients of the Vienna mapping function 1. All climatological parameters have been derived consistently from monthly mean pressure level data of ERA-Interim fields (European Centre for Medium-Range Weather Forecasts Re-Analysis) with a horizontal resolution of 1°, and the model is suitable to calculate slant hydrostatic and wet delays down to 3° elevation at sites in the vicinity of the earth surface using the date and approximate station coordinates as input. The wet delay estimation builds upon gridded values of the water vapor pressure, the weighted mean temperature, and the water vapor decrease factor, with the latter being tuned to ray-traced zenith wet delays. Comparisons with zenith delays at 341 globally distributed global navigation satellite systems stations show that the mean bias over all stations is below 1 mm and the mean standard deviation is about 3.6 cm. The GPT2w model with the gridded input file is provided at http://ggosatm.hg.tuwien.ac.at/DELAY/SOURCE/GPT2w/.
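The "mean value plus annual and semiannual amplitudes" parameterization that GPT2w applies to each gridded quantity can be sketched as a simple harmonic expansion in day of year. The coefficient names below are illustrative placeholders, not the model's grid-file format:

```python
import math

def seasonal_value(a0, a1, b1, a2, b2, doy):
    """Mean plus annual and semiannual harmonics, evaluated at day of
    year `doy` — the functional form used for each GPT2w parameter.
    Coefficient names (a0, a1, b1, a2, b2) are illustrative only.
    """
    w = 2.0 * math.pi * doy / 365.25
    return (a0
            + a1 * math.cos(w) + b1 * math.sin(w)          # annual term
            + a2 * math.cos(2 * w) + b2 * math.sin(2 * w)) # semiannual term

# Example: a parameter with mean 1000, annual amplitude 10, no semiannual term.
print(seasonal_value(1000.0, 10.0, 0.0, 0.0, 0.0, doy=1.0))
```

In the real model the coefficients come from the 1-degree ERA-Interim-derived grid file and are interpolated to the station location before this seasonal evaluation.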

Journal ArticleDOI
TL;DR: In this article, the authors developed a new approach whereby the mutual interactions and continuous feedbacks between floods and societies are explicitly accounted for and showed an application of this approach by using a socio-hydrological model to simulate the behavior of two main prototypes of societies.
Abstract: In flood risk assessment, there remains a lack of analytical frameworks capturing the dynamics emerging from two-way feedbacks between physical and social processes, such as adaptation and levee effect. The former, "adaptation effect", relates to the observation that the occurrence of more frequent flooding is often associated with decreasing vulnerability. The latter, "levee effect", relates to the observation that the non-occurrence of frequent flooding (possibly caused by flood protection structures, e.g. levees) is often associated with increasing vulnerability. As current analytical frameworks do not capture these dynamics, projections of future flood risk are not realistic. In this paper, we develop a new approach whereby the mutual interactions and continuous feedbacks between floods and societies are explicitly accounted for. Moreover, we show an application of this approach by using a socio-hydrological model to simulate the behavior of two main prototypes of societies: green societies, which cope with flooding by resettling out of flood-prone areas; and technological societies, which deal with flooding also by building levees or dikes. This application shows that the proposed approach is able to capture and explain the aforementioned dynamics (i.e. adaptation and levee effect) and therefore contribute to a better understanding of changes in flood risk, within an iterative process of theory development and empirical research.
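The two feedbacks named above (adaptation effect and levee effect) can be caricatured in a few lines of simulation: vulnerability drops after a flood as society adapts, and creeps back up during flood-free years as memory fades. The dynamics and parameter names below are purely illustrative, not the paper's socio-hydrological model:

```python
import random

random.seed(1)

def simulate(years=100, p_flood=0.1, learn=0.5, forget=0.02):
    """Toy vulnerability trajectory.

    - adaptation effect: a flood multiplies vulnerability by (1 - learn)
    - levee effect: each flood-free year adds `forget`, capped at 1.0
    All names and values are illustrative assumptions.
    """
    v = 0.5  # initial vulnerability (arbitrary)
    history = []
    for _ in range(years):
        if random.random() < p_flood:
            v *= (1.0 - learn)        # flood occurs: society adapts
        else:
            v = min(1.0, v + forget)  # quiet year: vulnerability creeps up
        history.append(v)
    return history

trace = simulate()
print(min(trace), max(trace))
```

Even this caricature reproduces the qualitative pattern the paper formalizes: long protected, flood-free stretches drive vulnerability upward, so the eventual rare flood hits a less prepared society.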

Journal ArticleDOI
TL;DR: Cultivation conditions for a recombinant P. pastoris Δoch1 strain allowing high productivity and product purity are determined, and the effects of the 3 process parameters temperature, pH and dissolved oxygen concentration on cell physiology, cell morphology, cell lysis and productivity are investigated in a multivariate manner.
Abstract: Pichia pastoris is a prominent host for recombinant protein production, amongst other things due to its capability of glycosylation. However, N-linked glycans on recombinant proteins get hypermannosylated, causing problems in subsequent unit operations and medical applications. Hypermannosylation is triggered by an α-1,6-mannosyltransferase called OCH1. In a recent study, we knocked out OCH1 in a recombinant P. pastoris CBS7435 MutS strain (Δoch1) expressing the biopharmaceutically relevant enzyme horseradish peroxidase. We characterized the strain in the controlled environment of a bioreactor in dynamic batch cultivations and identified the strain to be physiologically impaired. We faced cell cluster formation, cell lysis and uncontrollable foam formation. In the present study, we investigated the effects of the 3 process parameters temperature, pH and dissolved oxygen concentration on 1) cell physiology, 2) cell morphology, 3) cell lysis, 4) productivity and 5) product purity of the recombinant Δoch1 strain in a multivariate manner. Cultivation at 30°C resulted in low specific methanol uptake during adaptation and the risk of methanol accumulation during cultivation. Cell cluster formation was a function of the C-source rather than process parameters and went along with cell lysis. In terms of productivity and product purity a temperature of 20°C was highly beneficial. In summary, we determined cultivation conditions for a recombinant P. pastoris Δoch1 strain allowing high productivity and product purity.

Journal ArticleDOI
TL;DR: The first direct search for lepton-flavour-violating decays of the recently discovered Higgs boson (H) is described in this paper. The search is performed in the H→μτ_e and H→μτ_h channels, where τ_e and τ_h are tau leptons reconstructed in the electronic and hadronic decay channels, respectively.

Journal ArticleDOI
TL;DR: In this paper, the results of discussions inside the CIRP Collaborative Working Group (CWG) on Learning Factories are presented, enabling a lively exchange on the topic of learning factories for future-oriented research and education in manufacturing.

Journal ArticleDOI
TL;DR: In this paper, the performance of the CMS detector for photon reconstruction and identification in proton-proton collisions at a centre-of-mass energy of 8 TeV at the CERN LHC is described.
Abstract: A description is provided of the performance of the CMS detector for photon reconstruction and identification in proton-proton collisions at a centre-of-mass energy of 8 TeV at the CERN LHC. Details are given on the reconstruction of photons from energy deposits in the electromagnetic calorimeter (ECAL) and the extraction of photon energy estimates. The reconstruction of electron tracks from photons that convert to electrons in the CMS tracker is also described, as is the optimization of the photon energy reconstruction and its accurate modelling in simulation, in the analysis of the Higgs boson decay into two photons. In the barrel section of the ECAL, an energy resolution of about 1% is achieved for unconverted or late-converting photons from H→γγ decays. Different photon identification methods are discussed and their corresponding selection efficiencies in data are compared with those found in simulated events.

Journal ArticleDOI
TL;DR: This review will focus on major contributions in the field of cascade reactions over the last three years.

Journal ArticleDOI
27 Apr 2015-ACS Nano
TL;DR: By careful selection of the reagents and optimizing reaction conditions, a high density of covalently grafted molecules is obtained, a result that is demonstrated in an unprecedented way by scanning tunneling microscopy (STM) under ambient conditions.
Abstract: We shine light on the covalent modification of graphite and graphene substrates using diazonium chemistry under ambient conditions. We report on the nature of the chemical modification of these graphitic substrates, the relation between molecular structure and film morphology, and the impact of the covalent modification on the properties of the substrates, as revealed by local microscopy and spectroscopy techniques and electrochemistry. By careful selection of the reagents and optimizing reaction conditions, a high density of covalently grafted molecules is obtained, a result that is demonstrated in an unprecedented way by scanning tunneling microscopy (STM) under ambient conditions. With nanomanipulation, i.e., nanoshaving using STM, surface structuring and functionalization at the nanoscale is achieved. This manipulation leads to the removal of the covalently anchored molecules, regenerating pristine sp2 hybridized graphene or graphite patches, as proven by space-resolved Raman microscopy and molecular ...

Journal ArticleDOI
TL;DR: A comprehensive dataset detailing the bacterioplankton diversity along the midstream of the Danube River and its tributaries is presented, revealing that bacterial richness and evenness gradually declined downriver in both the free‐living and particle‐associated bacterial communities.
Abstract: The bacterioplankton diversity in large rivers has thus far been under-sampled despite the importance of streams and rivers as components of continental landscapes. Here, we present a comprehensive dataset detailing the bacterioplankton diversity along the midstream of the Danube River and its tributaries. Using 16S rRNA-gene amplicon sequencing, our analysis revealed that bacterial richness and evenness gradually declined downriver in both the free-living and particle-associated bacterial communities. These shifts were also supported by beta diversity analysis, where the effects of tributaries were negligible in regards to the overall variation. In addition, the river was largely dominated by bacteria that are commonly observed in freshwaters. Dominated by the acI lineage, the freshwater SAR11 (LD12) and the Polynucleobacter group, typical freshwater taxa increased in proportion downriver and were accompanied by a decrease in soil and groundwater-affiliated bacteria. Based on views of the meta-community and River Continuum Concept, we interpret the observed taxonomic patterns and accompanying changes in alpha and beta diversity with the intention of laying the foundation for a unified concept for river bacterioplankton diversity.
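Richness and evenness trends of the kind reported above are typically tracked with alpha-diversity indices. A generic sketch using the Shannon index and Pielou evenness (standard measures for community count data; this is not the paper's exact pipeline, and the example counts are invented):

```python
import math

def shannon_diversity(counts):
    """Return (H', J'): Shannon diversity and Pielou evenness.

    H' = -sum(p_i * ln p_i) over taxa with nonzero counts;
    J' = H' / ln(richness), i.e. H' normalized by its maximum.
    """
    total = sum(counts)
    h = -sum((c / total) * math.log(c / total) for c in counts if c > 0)
    richness = sum(1 for c in counts if c > 0)
    evenness = h / math.log(richness) if richness > 1 else 0.0
    return h, evenness

# Invented OTU counts: an even upstream community vs. a
# downstream community dominated by a single taxon.
upstream = [30, 25, 20, 15, 10]
downstream = [80, 10, 5, 3, 2]
print(shannon_diversity(upstream))
print(shannon_diversity(downstream))
```

A downriver decline in both values, computed per sample along the transect, is the kind of signal the study reports for free-living and particle-associated fractions.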

Journal ArticleDOI
TL;DR: There is some evidence that implementing clinical and/or quality dashboards that provide immediate access to information for clinicians can improve adherence to quality guidelines and may help improve patient outcomes.

Journal ArticleDOI
TL;DR: A novel efficient algorithm for computing the exact Hausdorff distance that has efficient performance for large point set sizes as well as for large grid size; performs equally for sparse and dense point sets; and is general without restrictions on the characteristics of the point set.
Abstract: The Hausdorff distance (HD) between two point sets is a commonly used dissimilarity measure for comparing point sets and image segmentations. Especially when very large point sets are compared using the HD, for example when evaluating magnetic resonance volume segmentations, or when the underlying applications are based on time critical tasks, like motion detection, then the computational complexity of HD algorithms becomes an important issue. In this paper we propose a novel efficient algorithm for computing the exact Hausdorff distance. In a runtime analysis, the proposed algorithm is demonstrated to have nearly-linear complexity. Furthermore, it has efficient performance for large point set sizes as well as for large grid size; performs equally for sparse and dense point sets; and finally it is general without restrictions on the characteristics of the point set. The proposed algorithm is tested against the HD algorithm of the widely used national library of medicine insight segmentation and registration toolkit (ITK) using magnetic resonance volumes with extremely large size. The proposed algorithm outperforms the ITK HD algorithm both in speed and memory required. In an experiment using trajectories from a road network, the proposed algorithm significantly outperforms an HD algorithm based on R-Trees.
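For contrast with the paper's nearly-linear algorithm, the textbook brute-force Hausdorff distance costs O(|A||B|) distance evaluations. A minimal NumPy sketch of that baseline definition (not the proposed algorithm):

```python
import numpy as np

def hausdorff(a, b):
    """Exact Hausdorff distance between point sets a and b (n x d arrays),
    computed by brute force: HD = max(h(A, B), h(B, A)), where
    h(A, B) = max over a in A of min over b in B of ||a - b||.
    """
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    directed_ab = d.min(axis=1).max()  # farthest point of A from B
    directed_ba = d.min(axis=0).max()  # farthest point of B from A
    return max(directed_ab, directed_ba)

a = np.array([[0.0, 0.0], [1.0, 0.0]])
b = np.array([[0.0, 0.0], [4.0, 0.0]])
print(hausdorff(a, b))  # (4,0) lies 3 away from its nearest point in A -> 3.0
```

The quadratic pairwise-distance matrix is exactly what makes this baseline infeasible for whole-volume MR segmentations, motivating the early-breaking, nearly-linear algorithm the paper proposes.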

Journal ArticleDOI
Vardan Khachatryan, Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam, +2353 more (181 institutions)
TL;DR: In this paper, a search for a heavy Higgs boson in the H to WW and H to ZZ decay channels is reported, based upon proton-proton collision data samples corresponding to an integrated luminosity of up to 5.1 inverse femtobarns at sqrt(s) = 7 TeV and up to 19.7 inverse femtobarns at sqrt(s) = 8 TeV, recorded by the CMS experiment at the CERN LHC.
Abstract: A search for a heavy Higgs boson in the H to WW and H to ZZ decay channels is reported. The search is based upon proton-proton collision data samples corresponding to an integrated luminosity of up to 5.1 inverse femtobarns at sqrt(s)=7 TeV and up to 19.7 inverse femtobarns at sqrt(s)=8 TeV, recorded by the CMS experiment at the CERN LHC. Several final states of the H to WW and H to ZZ decays are analyzed. The combined upper limits at the 95% confidence level on the product of the cross section and branching fraction exclude a Higgs boson with standard model-like couplings and decays in the range 145 < m[H] < 1000 GeV. We also interpret the results in the context of an electroweak singlet extension of the standard model.

Journal ArticleDOI
TL;DR: In this paper, the authors present the latest knowledge on approaches to recover phosphorus from municipal wastewater and related waste flows with a specific focus on the existing well-developed wastewater management infrastructure, available in significant parts of Europe (e.g., secondary treated effluent, digester supernatant, sewage sludge, sludge ash).
Abstract: Over the past years, numerous technologies have been developed to recover phosphorus (P) from waste streams to repair currently broken nutrient cycles. These developments were largely triggered by environmental considerations (sustainability, resource efficiency), concerns regarding the finite and geopolitically concentrated deposits of raw phosphate ore, and phosphate price increases. Municipal wastewater is a promising and viable source to recover P in larger quantities, to re-establish a circular economy and therefore increase net use efficiency. This work compiles the latest knowledge on approaches to recover P from municipal wastewater and related waste flows with a specific focus on the existing well-developed wastewater management infrastructure, available in significant parts of Europe (e.g., secondary treated effluent, digester supernatant, sewage sludge, sewage sludge ash). About 50 technologies were identified at various levels of development (industrial-, full-, pilot- and laboratory scale). The current selection of P recovery processes is broad and ranges from simple precipitation of dissolved P to complex multi-step approaches, and only a few of these displayed potential for full-scale implementation. They are discussed with regard to their technical principles, process parameters, recovery efficiency, resource demand, possible effects on wastewater treatment, waste flows, and fate of pollutants. We also evaluated them with respect to their rates of P removal from wastewater and their access points of P recovery. For selected technologies, material flow models are presented to facilitate the understanding of even complex processes. This work serves as a basis for future integrated comparative assessments of selected recovery approaches according to technical, environmental and economic criteria.

Journal ArticleDOI
TL;DR: In this paper, the authors employ the link between the local polarization of strongly confined light and its direction of propagation to realize low-loss non-reciprocal transmission through a silica nanofiber at the single-photon level.
Abstract: The realization of nanophotonic optical isolators with high optical isolation even at ultralow light levels and low optical losses is an open problem. Here, we employ the link between the local polarization of strongly confined light and its direction of propagation to realize low-loss nonreciprocal transmission through a silica nanofiber at the single-photon level. The direction of the resulting optical isolator is controlled by the spin state of cold atoms. We perform our experiment in two qualitatively different regimes, i.e., with an ensemble of cold atoms where each atom is weakly coupled to the waveguide and with a single atom strongly coupled to the waveguide mode. In both cases, we observe simultaneously high isolation and high forward transmission. The isolator concept constitutes a nanoscale quantum optical analog of microwave ferrite resonance isolators, can be implemented with all kinds of optical waveguides and emitters, and might enable novel integrated optical devices for fiber-based classical and quantum networks.

Journal ArticleDOI
TL;DR: Laser ablation-inductively coupled plasma mass spectrometry (LA-ICP-MS) is a widely accepted method for direct sampling of solid materials for trace elemental analysis as mentioned in this paper.
Abstract: Laser ablation–inductively coupled plasma–mass spectrometry (LA-ICP-MS) is a widely accepted method for direct sampling of solid materials for trace elemental analysis. The number of reported applications is high and the application range is broad; besides geochemistry, LA-ICP-MS is mostly used in environmental chemistry and the life sciences. This review focuses on the application of LA-ICP-MS for quantification of trace elements in environmental, biological, and medical samples. The fundamental problems of LA-ICP-MS, such as sample-dependent ablation behavior and elemental fractionation, can be even more pronounced in environmental and life science applications as a result of the large variety of sample types and conditions. Besides variations in composition, the range of available sample states is highly diverse, including powders (e.g., soil samples, fly ash), hard tissues (e.g., bones, teeth), soft tissues (e.g., plants, tissue thin-cuts), or liquid samples (e.g., whole blood). Within this article, quantification approaches that have been proposed in the past are critically discussed and compared regarding the results obtained in the applications described. Although a large variety of sample types is discussed within this article, the quantification approaches used are similar for many analytical questions and have only been adapted to the specific questions. Nevertheless, none of them has proven to be a universally applicable method.

Journal ArticleDOI
TL;DR: This work presents the analytical calculation of entanglement entropy for a class of two-dimensional field theories governed by the symmetries of the Galilean conformal algebra, thus providing a rare example of such an exact computation.
Abstract: We present the analytical calculation of entanglement entropy for a class of two-dimensional field theories governed by the symmetries of the Galilean conformal algebra, thus providing a rare example of such an exact computation. These field theories are the putative holographic duals to theories of gravity in three-dimensional asymptotically flat spacetimes. We provide a check of our field theory answers by an analysis of geodesics. We also exploit the Chern-Simons formulation of three-dimensional gravity and adapt recent proposals of calculating entanglement entropy by Wilson lines in this context to find an independent confirmation of our results from holography.